Not an incentive problem. Be like Elon.
I love economics. I write a lot of regulatory comments, and I’m good at the work. You ask: where does the money come from and go, what are the inputs and outputs, and how would a “rational” person respond? This is a kind of math. For those wanting an entry into this world, I always recommend “The Evolution of Cooperation” by Robert Axelrod. It’s a short, easy read, extremely intuitive and accessible.
And much of what people do in management is apply these principles, treating man as an economic unit, homo economicus. There is some population-wide truth to this, but just like Asimov’s psychohistory, it doesn’t apply to the individual, historical case. Men are not machines (and contra the PUA, women are not “programmable”).
In fact, it’s my disagreement on this point that led to one of the first times I failed to sell out. I had a job offer from a major hedge fund known for its adherence to strict rationality (of a sort—even now I can’t bring myself to use these terms without caveats). In a later conversation, I was asked how I handle disagreement that persists even after prolonged discussion. The answer they were looking for was “obviously you need to continue discussing.” But like an idiot I was compelled to explain that some people like vanilla ice cream, and some like chocolate ice cream, and you can’t overcome these kinds of disagreements with dialogue. Offer rescinded. I can never not be honest—it’s a curse, or a blessing, I guess.
So what is selling out, and how do you prevent it? I’m deep in the institution-building enterprise now, and there are two schools of thought. The rational one believes that selling out is caused by incentive misalignment and that proper institutional structure can “solve” it. “Selling out” is a kind of capture, and if only you align incentives you can create stronger people. Constitutionally, I love this—I love the idea that the best mind wins, that “problems” can be “solved,” and that people can be controlled. It’s an attractive position for the optimistic high-IQ industry type. Look at the old Olin Foundation: a libertarian billionaire decided that the kind of left-wing takeover that captured the Ford Foundation and others could be prevented by having his money spent down rather than held in perpetuity.
The other school of thought is the one I subscribe to. “Selling out” is a problem of constitution, of sin, and of symbol. Our obsession with ROI (which I’ve written about before) and “metrics” doesn’t reflect reality, but it creates reality. The more you talk about money and incentives, the more people are molded into that. Classical liberalism reflects nature only because it molds man into homo economicus. If you think low, you become low. Look at our bureaucrats after decades of bureaucracy—this isn’t how healthy people think, but it’s how successful people think. If you think in monetary terms and structure in monetary terms, at a social evolutionary level you select for those who fit best in those frameworks. And certainly, my mind works this way, but there’s always some nagging inability to go along with what would make me the most money.
Enter Elon. Job interest in working for Twitter is apparently up 250%. Is this because he has aligned incentives and people think they can make more money there? Is it merely because Twitter is in the news? That can account for some of the interest, but surely not all. No, I think it’s because Elon is demonstrating a heroic vision. At root, I don’t know if he’s heroic, but that doesn’t matter—what matters is not his subjective mental state, but the public actions he’s taking. He is demonstrating a heroic vision for something far more important than his space ventures or his other projects, and people are responding to that.
Selling out, I think, is aligning yourself with low goals—fungible, liquid goals. Not selling out involves aiming for the higher, spiritual goods in life. Corporations have independent ontological existence—they are greater than the sum of the people in them. They have “thoughts” of a sort. Consider the old mind-body problem: the brain has billions of neurons, and China has a billion people—if each person plays the part of a neuron, is China a brain? No, but China is a real thing. And corporations are real as well. And they can align with higher goals, or lower goals. When they align with higher goals they are attractive and secure, and go through a healthy life-cycle of birth, aging, and death. When they align with lower goals they never quite become themselves, and they are subject to possession and brain hijacking, just like people—this is, I think, part of the explanation of ESG.
So if I were trying to encourage people not to “sell out,” I would operate like Elon. I wouldn’t nibble around the edges with incentive alignment. People throughout history have fought and starved and died for higher goods, and we can have faith that humans are still humans, and still want this. Of course, align incentives—this is like marketing. You go to battle with the army you have, and we don’t have a society filled with tragic heroes (yet!). But when you offer the money, offer community, and offer a heroic vision as well. That will attract the right people.
I recently worked for a real hero. He was offered a major incentive to sell out. And he declined. He’s a man. And I’ve been privileged to have had a string of strong female mentors as well, brilliant people, who have done similar things. They are women. I find that incredibly attractive. Each could have chosen to be a machine, programmed by society.
I won’t go through all the tragic times I’ve been faced with the choice to go big or go home, but the decision gets easier each time. Take a risk. Life is too short to be a small soul.