Neocameralism is Autism

Neocameralism: the system of government proposed by Mencius Moldbug, in which ownership of the state is divvied up into shares based on de facto influence. The state is then run by a CEO who operates it to maximize profit.

Now, as a thought experiment, neocameralism is great. As a way of actually running a government, much less so. The impulse to arrange sovereignty into neat little shares derives from the same autistic impulse that drives libertarianism in general. “Wouldn’t it be nice if…?” Yes, it would be interesting if humans were robots who could think of their country as a joint-stock corporation and actually respect the divvying up of shares based on pieces of paper. But we aren’t.

Human nature, and especially the nature of who and what we respect, is extremely messy and complex. We want there to be a unified figurehead, and that figurehead will tend to have substantially more power than the rest. The fast-and-frugal heuristic “take the best” results in winner-take-all outcomes like the Mona Lisa. Why is the Mona Lisa the most popular piece of artwork in the world, far more popular than its closest rivals? Because something had to be the best; that’s how human brains work. Attention and power are handed out along a power-law curve, not evenly. This is fundamentally at odds with the notion of running a sovereign nation as a joint-stock corporation with a long tail of shareholders. (I know a power-law curve has a long tail, but in practice human psychology tends to chop the tail off quickly.)
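To make that winner-take-all point concrete, here is a minimal toy simulation of “take the best”: choosers mostly copy the current favorite, with a small chance of picking at random. The option count, chooser count, and exploration rate are arbitrary assumptions chosen purely for illustration, not measurements of anything.

```python
# Toy sketch with assumed parameters: choosers who mostly "take the best"
# pile nearly all attention onto a single winner, even though every option
# starts out identical.
import random

NUM_OPTIONS = 100      # e.g. paintings competing for attention (assumed)
NUM_CHOOSERS = 10_000  # people each allocating one unit of attention (assumed)
EXPLORE_RATE = 0.05    # chance of ignoring popularity and choosing at random (assumed)

attention = [1] * NUM_OPTIONS  # every option starts out equally regarded

for _ in range(NUM_CHOOSERS):
    if random.random() < EXPLORE_RATE:
        pick = random.randrange(NUM_OPTIONS)  # occasional independent taste
    else:
        # "take the best": copy whatever is currently most popular
        pick = max(range(NUM_OPTIONS), key=lambda i: attention[i])
    attention[pick] += 1

attention.sort(reverse=True)
total = sum(attention)
print(f"Winner's share of attention:  {attention[0] / total:.0%}")
print(f"Bottom half's combined share: {sum(attention[NUM_OPTIONS // 2:]) / total:.0%}")
```

Run it and the single front-runner ends up with the overwhelming majority of the attention while the bottom half of the field splits a rounding error, which is roughly the Mona Lisa situation described above.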

No one is ever going to respect pieces of paper that say how much of a share someone owns in the government. De facto influence in the government is always going to be somewhat fluid, and the people in control will make sure of it. Loyalties and true power can shift quickly, even in an established monarchical state, and a bunch of pieces of paper describing sovereignty are going to quickly become a joke. While the broad outlines of power may be specified with such paper, the messy dynamics of hominid psychology rule over all, and will quickly shred these clippings. The true shares of power are representations in our brains, which update much more quickly and fluidly than any stiff formalities.

The autistic impulse is to put everything in neat little boxes, so we can understand it without going to the trouble of using human intuition to pull apart complex social webs with hidden components. Thing is, the engineers of these complex social webs have every incentive to hide the inner workings from those who would seek to systematize and write down their details. The value of these webs is embedded in their secrecy and esotericism. Webs of power that are made public will quickly become nothing more than symbolic, as the real power is bought and sold behind closed doors and on putting greens at country clubs.

A common rejoinder to criticisms of neocameralism is the simple phrase, “Robot drones”. First of all, for a hypothetical system of government to have to rely on technology that barely exists yet is already a sign of obvious weakness. It’s like coming up with a form of government premised on abundant nuclear fusion energy. For those of us dealing in reality and living in the present, it’s just not compelling. Secondly, those who say “robot drones” are ignorant of a great many technological, logistical, and geopolitical realities. Policing and warfighting require human beings, not drones.

Drones that can replace soldiers are not around the corner. They are likely more than 30 years off, probably 50. Right now, drones are extremely expensive pieces of specialty equipment with narrow uses. A Predator drone costs $4 million; a soldier with a rifle and body armor, plus logistics, costs less than $100,000 a year, so one airframe runs roughly forty soldier-years. Wars are fought with people, not robots. Any country that uses robots to police its own population or fight wars is going to face quite a bit of international backlash and possibly provoke the use of nuclear weapons.

As we discovered in Iraq, occupying and policing a hostile area is more about culture and winning hearts and minds than it is about blowing things up or putting bullets in people. A soldier can speak the language, use a gentle touch, reason with people, and so on, whereas a robot can only terrify and annoy. War is not just about an assault; it’s about knowing who is in charge in the local community, what their favorite tea is, who their friends are, what their relationship to the government is, and so on. Figuring all of this out requires humans.

Real soldiers have to do things like fight in rain and snow, keep going when supplies are scarce, recover when they are hit by a rock, and so on. A robot that malfunctions or breaks will require a human being to repair it. Those human beings can be targeted, and will require other human beings, not just robots, to protect them. At best, drones will serve as a way of augmenting troops, not replacing them, in the foreseeable future. Robots break, they lack intelligence, they lack finesse, they lack guile. True intelligence, resilience, finesse, and guile are necessary attributes of any warfighter, and they will only belong to humans in the lifetimes of most alive today.

Machine vision is nowhere near the level where it can distinguish a camouflaged human from background clutter. This is doubly true in an urban or indoor environment. No one has ever mounted so much as a rifle on a small drone. Machine vision is not improving exponentially; if anything, it is improving at a slow linear rate. The kind of computer used for machine vision in self-driving cars is extremely bulky, and, by the way, Moore’s law itself is leveling off. Any low-flying drones will be vulnerable to explosively pumped flux compression generators, nets, dummies, and other simple tricks.

Neocameralism, as a serious strategy for government, has many of the same problems as libertarianism: it springs from an autistic, systematizing yearning to put the complexities of human behavior into easily analyzed little boxes. But historically, human behavior has proved remarkably resistant to such systematization. Only a few geeks and technocrats have ever had serious trouble recognizing this. I suppose they always will.