The Technological Republic: Hard Power, Soft Belief, and the Future of the West by Alexander C. Karp and Nicholas W. Zamiska.

One can read Alex Karp’s new book, “The Technological Republic,” in many different ways. Is it a call to arms? A critique of inefficient corporate cultures? A historical review of the concept of “the nation”? An attempt to rewind the clock of American higher education to before the 1960s countercultural revolution? Or is it just an exercise in self-praise by the billionaire co-founder of Palantir Technologies?

If there’s a through-line to all of this, it may be the following: Why is it that the smartest people in tech—especially in Silicon Valley—spend so much time and effort on things that contribute little to the national interests of the United States? Why do companies such as Google, Microsoft, and Meta attract and retain the most highly educated, tremendously well-paid talent in their fields—particularly in software engineering and AI—only to waste that potential on trivialities like photo-sharing apps and online advertising? And was there a moment in history, somewhere between the Apollo program and Instagram’s latest generation of AR filters, when the pursuits of founders, leaders, engineers, and scientists began to diverge drastically from the most vital interests of the nation and its citizens?

For Karp, of course, these interests circle around military and police applications. And from a European perspective—especially at a time when the global security architecture is being fundamentally reshaped—it is tempting to dismiss Karp’s U.S.-centric conclusions as warmongering. Yet when one engages earnestly with his arguments and looks beyond the shallow cultural critiques, the self-congratulation, and the bellicosity, Karp raises deeper questions about the meaning of work and the purposes for which we actually build and use some of the most powerful technologies of our day and age.


The elite of Silicon Valley, from founders and leaders down to engineers, Karp argues, have largely abandoned the pursuit of problems of public interest. Instead, they’re merely chasing “the consumer”—building little apps and quirky gadgets that entertain (at best) but contribute nothing of meaningful value to society at large. This, however, was not always the case: Companies at the forefront of technological innovation used to collaborate closely with the state, working on projects that actually advanced the U.S. and, by extension, all of humanity. Think of the Apollo program, ARPANET (the precursor of the internet), or the invention of GPS, for example. These and countless other large-scale projects could not have been completed by public institutions alone, nor would any private company have had the resources and stamina to see them to fruition.

Karp, billionaire, eccentric, contrarian, and co-founder of Palantir, points to contemporary technologies, such as AI, autonomous vehicles and drones, and nano- and biotech, which hold immense potential for public-private collaboration, a potential that remains largely untapped. Of course, increased public investment in any of these domains would benefit Palantir’s and Karp’s bottom lines. But I don’t think his argument is driven solely by financial self-interest. Throughout the book, one gets the impression that he’s frustrated less with a lack of government funding than with a particular mindset in Silicon Valley that stands in the way of the private sector upholding its side of the bargain. Karp highlights two cases in point, in which employees at Google (Project Maven) and Microsoft (HoloLens) petitioned their respective CEOs to cancel government contracts on moral grounds. They, so the engineers argued, had not signed up “to develop weapons.”

This stance, which at first sounds reasonable to many—maybe even laudable—is sharply criticized by Karp for two reasons. First, the technologies in question would doubtless have advanced U.S. military capabilities. They would have helped deter the nation’s adversaries, strengthened the strategic position of its allies, and potentially either helped avoid armed conflict or, if it came to it, increased the chances of the good guys winning. Declining to contribute to that goal on pacifist grounds, Karp argues, amounts to a short-sighted form of free-riding: As a citizen of the U.S. (and, by extension, of any Western democracy), you owe your privileged existence, ripe with peace, prosperity, and liberty, to an invisible umbrella of military protection. By that logic, a refusal to support the upkeep and expansion of that umbrella is the modern-day equivalent of 1960s draft-dodging.

Second, Karp homes in on the moral double standard of these engineers: They turn a blind eye to all the dubious practices their employers otherwise engage in—from preying on consumers’ personal data to building ever more invasive ad-targeting mechanisms to manipulating children into spending more time on their sites and apps—but then balk at the idea of contributing to AI systems that would help soldiers in war zones evade roadside bombs.

In Karp’s analysis, this lack of moral integrity can be attributed to a deeper problem: the absence of a larger, unifying vision to which the engineers, entrepreneurs, founders, and leaders of Silicon Valley feel allegiance. For previous generations, Karp argues, the concept of “the nation” at large managed to serve that purpose—for better or worse. From World War II through the Space Race, engineers and scientists took pride in building ingenious things that put the U.S. ahead of first Nazi Germany and later the USSR. In our day and age, however, shallow, commercial, hedonistic pursuits have taken over. In fact, the individual programmer or data scientist today is far more concerned with protecting individual rights and freedoms from infringement by the state than with how they can contribute to the advancement of their country. Needless to say, in Karp’s judgment, this puts the U.S. at an enormous strategic disadvantage against its adversaries—most prominently, China.

Here’s where, as a European observer, one starts scratching one’s head a bit: Hasn’t a focus on the individual, rather than the collective, always been a core component of American culture, especially vis-à-vis East Asia? Yes and no, Karp muses. In the 1960s, academic institutions in the U.S. cut back their (until then mandatory) courses on the history of Western civilization. Where previous generations had been compelled to read Plato, Aristotle, Marcus Aurelius, John Stuart Mill, and even Karl Marx, those who built today’s most powerful tech companies often did not receive that type of education. Many of these founders and leaders are oblivious to the long arc of history that spans, arguably, from Athenian democracy to the Cold War, and which sets the context in which the United States as a nation has to be read. Untethered from such historical foundations, neither the CEOs who set vision and strategy for their companies nor the engineers who work in these environments would prioritize contributing to the national project over individualistic pursuits.

One can, of course, take Karp’s interpretation as a vain attempt to revive a stale academic debate that has been haggled over on U.S. college campuses since the 1960s. However, I think there’s an underlying question here that we in Europe have to grapple with as well: Is it wise that our most highly educated engineers and scientists are trained more or less exclusively on subjects that are economically exploitable right here, right now—and not, say, in history, art, culture, philosophy, morality, and ethics? Where, to pick up a recent public debate in light of the E.U.’s efforts to regulate artificial intelligence, should “ethical” AI systems come from if the engineers who are supposed to build them have never read, or even heard of, Socrates, Thomas Aquinas, Immanuel Kant, or Friedrich Nietzsche?

Karp’s inferred conclusion, that reinstating compulsory Western civ courses in college would lead to more tech companies working on government projects—if this is what we want—is, of course, a bit of a stretch. Yet the merit of teaching up-and-coming engineers something more substantial than coding, and of endowing our rising entrepreneurs with tools beyond accounting and business modeling, is, in my view, undeniable.