The reader of Democracy and Tradition Part One is likely to have a long list of objections to the thrust of the argument. Say he agrees to work with the tide gauge analogy, but he cannot so easily let go of the fact that sea levels have risen over the past century. This seems to him to play quite easily into the progressive narrative, at least in the sense that there is a discernible direction to history. While he doesn’t fancy himself akin to the stumbling modern man, the less-modern man does not suit him much either.
First, let us build on our understanding of random walks. There are two people I could introduce you to when opening this topic again. The first is Andrey Markov, the Ruskie, and the second is Karl Pearson, the Limey. The former is the namesake of a mathematical concept called the Markov Process, a generalization of a random walk: if the current state of a phenomenon is the only useful piece of information in predicting its future state, then it is a Markov Process. But Pearson is going to win the competition to be showcased, as his now more-than-a-century-old question in Nature was the genesis of using the phrase random walk to illustrate this concept. There is a rich line on the last page of the link provided, which I’ve snipped below:

Pearson was an interesting fellow. Certainly a progressive of his time, he checked the boxes of women’s suffrage, eugenics, and socialism. Of those three ideas, I’ll let slip that the author considers one to be mere silliness, another dangerous, and the third misguided altruism; it is up to the reader to work out which is which. To get a good feel for Pearson’s thought, I recommend a piece from the 50s in The British Journal of Sociology called Karl Pearson: Socialist and Darwinist. For now, to give the tl;dr of his thinking during his prime, here’s a quote from the end of his address at Newcastle during the Boer War:

I really like analogies. The more you read this blog, the more you’ll pick up on that. Fundamentally, analogies are just isomorphisms. Or maybe isomorphisms are just analogies; take your pick. When Karl Pearson was trying to come up with a model for random migration, he called his problem the random walk problem in Nature. One of his answerers, Lord Rayleigh, indicated that the model he was looking for was “identical with a problem in the combination of sound amplitudes in the case of notes in the same period.” So there is a seeming analogy between a problem in acoustics and one in migrations. Pearson joked about the stumblings of a drunken man, indicating an analogy between the drunken man’s gait and the prior two phenomena.
Admittedly, I skimmed the Mathematical Theory of Random Migration paper cited above. I am skeptical of the viability of mathematical proofs in explaining anything but the most controlled of phenomena. I am a simple man, so we’ll make simple adjustments to our random walk model of culture in order to illustrate concepts. We are simply making analogies between mathematical objects and phenomena. Claims to realism, to ‘proofs’ both scientific and social-scientific, are foolish.
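Before making those adjustments, here is the unmodified baseline. The charts in this post were made in Excel and R; the Python below is just a sketch for concreteness, and the uniform step distribution is my own assumption, not anything the charts specify:

```python
import random

def random_walk(n_steps, seed=None):
    """A plain random walk: each step is an independent, mean-zero
    draw, so the current position is the only useful information
    for predicting the next one (the Markov property)."""
    rng = random.Random(seed)
    path = [0.0]
    for _ in range(n_steps):
        path.append(path[-1] + rng.uniform(-0.5, 0.5))
    return path

path = random_walk(1000, seed=42)
```

Every modification below keeps this skeleton and only changes how each step is drawn.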
So our first modification to the random walk will be to give it some momentum. When we observe political-social history, it is tough to argue that it is completely a Markov Process. No one knows what the future will bring; no one knew what the future would bring. But there are directions which, once set upon, more often than not continue to be followed. We do not need to remove the essential randomness from our model; we just need to wonder what happens if the current random process is affected by a prior one as well. There are two ways that I’ll try to illustrate this. The first is with a Moving Average model:

The above lines are generated with a very simple formula. The current term at time t, x(t), is equal to the prior term, x(t-1), plus a random term, e(t), plus half of the random term generated in the previous period, e(t-1). So: x(t) = x(t-1) + e(t) + 0.5·e(t-1). Easy enough, right? This somewhat captures the idea that if the current term is greater than the prior (or, more precisely, if the random element drawn in this period was positive), then the next term is likelier than random chance to be greater than the current. You see this come through in the swoops of the lines above.
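That formula translates directly into code. A sketch, again in Python rather than the Excel used for the chart; the uniform distribution for e(t) is my assumption:

```python
import random

def ma_walk(n_steps, theta=0.5, seed=None):
    """x(t) = x(t-1) + e(t) + theta * e(t-1): each step is this
    period's shock plus half of last period's shock, which is
    where the momentum comes from."""
    rng = random.Random(seed)
    path, e_prev = [0.0], 0.0
    for _ in range(n_steps):
        e = rng.uniform(-0.5, 0.5)          # this period's shock
        path.append(path[-1] + e + theta * e_prev)
        e_prev = e                           # remembered next period
    return path
```

With theta set to 0 this collapses back to the plain memoryless walk.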
The second way to do this may be a bit easier to grasp:

This one generates momentum in a much more literal sense. If the prior term is greater than its own prior, then the current step is a random number between -0.4 and 0.6; else it is a random number between -0.6 and 0.4. So if the slope is tending up, it will continue to tend up; if it is tending down, it will continue to tend down. Again, you can see this in the swoops of the above graph.
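A sketch of this second scheme (my Python re-implementation; the first step's distribution is my assumption, since the rule only kicks in once there are two prior terms):

```python
import random

def sign_momentum_walk(n_steps, seed=None):
    """If the walk just moved up, the next step is uniform on
    (-0.4, 0.6); if it just moved down, uniform on (-0.6, 0.4).
    Either way the step is mildly biased toward continuing in
    the same direction."""
    rng = random.Random(seed)
    path = [0.0, rng.uniform(-0.5, 0.5)]   # first step unbiased
    for _ in range(n_steps - 1):
        if path[-1] > path[-2]:            # trending up
            path.append(path[-1] + rng.uniform(-0.4, 0.6))
        else:                              # trending down
            path.append(path[-1] + rng.uniform(-0.6, 0.4))
    return path
```

Both ranges still straddle zero, so reversals remain possible at every step; the momentum is a tilt, not a lock.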
Generalize this concept of time to whatever scale works for you; I think it works on most any scale. Read a history of the French Revolution: until the Thermidorian Reaction, France’s revolutionary tendencies produced more revolutionary tendencies. The larger-time-frame version of this is well put by my favorite Son of Abraham:
“If there is any constant phenomenon in the last few hundred years of Western history, it’s that – with occasional reversals – reactionaries tend to lose and progressives tend to win. Whether you call them progressives, liberals, Radicals, Jacobins, republicans, or even revolutionaries, socialists or communists, the left is your winning team.”
There’s been a very large swoop, at least in a certain sense. Moldbug continues on to say what I plan to drive at in much of this blog:
“However, if these changes are indeed arbitrary, a random walk could reverse them. Professor Dawkins’ great-great-grandchildren could then explain to us, just as sincerely, the great moral advance of society, which early in the 21st century still turned a blind eye to rampant sodomy and had no conception of the proper relationship between man and servant.”
The next step I want to take on our journey into random walks is to open up our concept of them into another dimension. There have only been two dimensions thus far, a time and a magnitude. But we need a second magnitude, another axis, so that we can visualize this spatially.
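The construction is just two independent 1D walks used as coordinates. A sketch (the actual charts were drawn in R; this Python version, with my assumed uniform step bounds, shows the pairing):

```python
import random

def momentum_walk(n_steps, rng):
    """1D walk whose step range tilts toward the direction of
    the previous move (the second momentum scheme above)."""
    path = [0.0, rng.uniform(-0.5, 0.5)]
    for _ in range(n_steps - 1):
        lo, hi = (-0.4, 0.6) if path[-1] > path[-2] else (-0.6, 0.4)
        path.append(path[-1] + rng.uniform(lo, hi))
    return path

def walk_2d(n_steps, seed=None):
    """One walk supplies the x coordinates, another the y:
    the drunken man's stumble through the plane."""
    rng = random.Random(seed)
    return list(zip(momentum_walk(n_steps, rng),
                    momentum_walk(n_steps, rng)))

points = walk_2d(1000, seed=7)   # starts at (0.0, 0.0)
```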

For this one I needed to switch over to R; there’s no good way to graph this as nicely in Excel. So above we have two random walks (with momentum of the second kind discussed, though it would work just as well with pure random walks), plotted against each other, one for the x axis and the other for the y. We now have the drunken man’s stumble. One could easily draw a map behind the plot above, showing the drunken man’s path from bar to bar, sometimes revisiting one a second or third time. He’s definitely moved far from his starting point, but assuming he is indeed drunk (i.e., the error term generating his steps has a mean of 0), he will find his way home eventually. Here are a few more:

Above are 42 momentum random walks in 2 dimensions, each with 1000 data points. I promise I didn’t just make that in Microsoft Paint. They are a bit hard to look at, but see how they blob around the middle? Try to abstract here with me a bit: say we throw different cultural traits/tendencies around the graph, with similar ones closer together. Now let’s define a culture on our graph as some subset of these points; for now we’ll stick to circles to define our culture subsets. The following illustrates this idea:

Now put these two graphs together. Say cultures migrate over time, in a random walk-ish fashion. We’ll start them all at the point above, centered around (0,0). Now let’s see where they have gone after 100 random movements:

After 500:

After 1000:

You see the variation of these cultures increasing with time, but you can still pick out a center, even 1000 points in. Imagine not only the same 42 circles wandering in, out, and around, but also circles spawning off other circles, as cultures split and drift apart. You will always have some outliers, but you will also be able to pick out that center.
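The experiment above can be sketched as follows: treat each culture-circle’s center as a 2D momentum walk and measure how far the 42 centers have drifted from the origin at each checkpoint. This is my re-implementation of the setup as I read it, not the original R code, and the step distributions are assumptions:

```python
import random

def final_center(n_steps, rng):
    """Final position of one culture-circle's center after a 2D
    walk with the second momentum scheme applied to each axis."""
    def axis():
        p = [0.0, rng.uniform(-0.5, 0.5)]
        for _ in range(n_steps - 1):
            lo, hi = (-0.4, 0.6) if p[-1] > p[-2] else (-0.6, 0.4)
            p.append(p[-1] + rng.uniform(lo, hi))
        return p[-1]
    return axis(), axis()

def mean_spread(n_cultures, n_steps, seed=0):
    """Average distance of the culture-centers from the origin."""
    rng = random.Random(seed)
    centers = [final_center(n_steps, rng) for _ in range(n_cultures)]
    return sum((x * x + y * y) ** 0.5 for x, y in centers) / n_cultures

# Variation grows with time, but the centers stay clustered
# enough that a middle remains identifiable:
for t in (100, 500, 1000):
    print(t, round(mean_spread(42, t), 1))
```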
In part three, we will tie this idea back around to the constants idea defined in part one.




