When I was thirteen, I snuck into a theater to see South Park: Bigger, Longer & Uncut. The film is so vulgar that the Motion Picture Association of America insisted on giving it an R rating, which is why I had to sneak in. I’m glad I did. I have never laughed harder in my life.
What I remember most clearly is a scene that involves Bill Gates. It begins with a US Army general briefing a group of soldiers. In the middle of his presentation, his computer crashes. He demands to see Gates, who is promptly hauled out. “You told us that Windows 98 would be faster and more efficient, with better access to the Internet!” yells the general. “It is faster,” Gates insists—and the general shoots him in the face.
Everyone in the theater cheered when this happened. Loud, joyous cheering. How hated do you have to be, I remember thinking at the time, for your point-blank execution to elicit such unanimous delight?
This was the state of Gates’s reputation in 1999. As the journalist Anupreeta Das explains in her new book Billionaire, Nerd, Savior, King: Bill Gates and His Quest to Shape Our World, by century’s end the “king of software had been dethroned.” When Gates cofounded Microsoft in 1975, he was only twenty years old. Over the subsequent decades he led the personal computing revolution, alongside his frenemy Steve Jobs. After Microsoft went public in 1986, he became the country’s youngest billionaire (at thirty-one) and the first who hailed from tech.
In this respect, as in many others, Gates was an innovator. Other people had made money from computers, but he was the first to attain the kind of wealth and power that invited comparisons to John D. Rockefeller. In the 1990s it dawned on the American public that digitization could alter the deep structures of the economy and that high technology might prove to be the breeding ground for monopolies on the scale of the great trusts of the Gilded Age. In 1995 the most valuable publicly traded company in the United States was General Electric. Five years later it was Microsoft.
With a new kind of monopoly came a new kind of monopolist. As Das argues, Gates was an “early template” for a species of capitalist overlord that has since become excruciatingly familiar: the nerd-bully, whose oddness and rudeness are the necessary effluent of his genius. Computerization meant that the commanding heights of the economy would come to be occupied by men who had spent their formative years getting stuffed into lockers, and were now determined to exact their revenge.
At Microsoft, Gates was by many accounts a nightmare boss, “prone to expletive-laden fits of rage,” reports Das. A workaholic, he demanded long days from his workers and even, as he later confessed to a BBC interviewer, memorized the license plate numbers of their cars so that he could keep track of their hours. Microsoft could be especially inhospitable to women; the historian Margaret O’Mara tells Das that the firm was so “intensely masculine” that it resembled a “frat house.” Gates himself, Das reports, had a reputation as a womanizer. “A former senior Microsoft employee recalled being told by an office assistant to Gates that he was like ‘a kid in a candy store’ in the company of women, if not restrained,” writes Das. Her book includes multiple stories of Gates hitting on his subordinates; in 2000 he conducted an affair with an employee that led, after she reported it years later, to his departure from the Microsoft board in 2020.
By the 1990s the unpleasant parts of Gates’s personality were becoming more widely known. The press increasingly portrayed him as
arrogant, disdainful, indignant, angry, snide, condescending, petulant, contemptuous, truculent, evasive, hyperaggressive, despotic, bullying, an enfant terrible of the tech industry, and a robber baron.
This public relations crisis culminated with the Justice Department’s antitrust suit against Microsoft, which began in 1998 and continued for years. Gates’s videotaped deposition from the trial remains a masterclass in how not to behave in a deposition: slouched and pouty, he debated the definition of words like “we” and pretended not to understand simple questions. He was so obnoxious that when the Justice Department lawyers played excerpts in court, the judge laughed.
“The more others painted him a villain, the more he thought himself a victim,” Das writes. The same could be said for any number of founders who followed in his footsteps, despite their efforts to distinguish themselves from him. As Silicon Valley took its modern shape in the 1990s, it did so under the shadow of Gates, that mad boomer to the north—since 1979 Microsoft has been headquartered in his home state of Washington—whom the young prospectors of the Internet economy feared and hated. They wanted to build companies that were the opposite of Microsoft, and in doing so they laid the basis of Big Tech. Defying the monopolist Gates, they constructed monopolies of surpassing ambition and ruthlessness. He was the father they rebelled against and, in so rebelling, became.
In the 1990s it remained an open question which metro area would lead the digital revolution. Silicon Valley had overtaken the Route 128 tech corridor in Massachusetts, whose crown jewel, the Digital Equipment Corporation, manufactured the computer that the young Gates used to create Microsoft’s first commercial software. But Gates’s home base of Seattle posed a more serious threat. Microsoft was merciless and rich. One former Microsoft executive tells Das that he and his colleagues would regularly call Silicon Valley venture capitalists to make “lowball offers to buy their portfolio companies.” If they refused, Microsoft’s response was: “Fuck you, we’re going to crush you.”
The standard account is that Microsoft ultimately lost this tug-of-war because of the Internet. That is, Gates didn’t see the Internet coming until it was too late, tried to pivot, failed, got ambushed by the dot-com boom that swept Silicon Valley, and then, more fatally, by the platforms (like Google) that arose in the aftermath. This is broadly true but obscures a crucial point. What made the Internet difficult for Gates wasn’t anything about the technology itself—he understood the principles of networking perfectly well and had even worked on a program designed to run across multiple computers on ARPANET, the Internet’s predecessor, back in college. The real issue was that the Internet cut deeply against his instincts as a businessman.
Gates made his money by taking something that used to be free and putting a price tag on it. Born in 1955 to a wealthy family in Seattle, he got his start in computing at Lakeside, the elite private school that he attended from 1967 to 1973. In those days computers were massive and expensive, and they were rarely glimpsed outside of military or corporate settings. Lakeside was one of the few schools in the world that had access to one, which is how Gates became a programmer at age thirteen. Soon he and his classmates Paul Allen and Paul Gilbert had a lucrative software consultancy, earning tens of thousands of dollars writing programs for local clients. In 1973 Gates enrolled at Harvard, but he completed only three semesters before leaving to start Microsoft with Allen in 1975.
By then miniaturization had advanced to the point where a computer could fit on a desk. These so-called microcomputers were still primitive, but Gates and Allen glimpsed a market opportunity. As computers became smaller and cheaper, they reasoned, mass adoption would follow, as people would turn to the devices to crunch spreadsheets, play games, and write documents. By selling software for microcomputers—they called their company “Micro-soft” before dropping the hyphen—they could secure a profitable foothold in the coming personal computing revolution.
Early on, however, they faced a problem. There was a thriving hobbyist community around microcomputers, whose members circulated code at clubs around the country, free of charge. They tended to see microcomputers in countercultural terms, as a technology that would enable the democratization of computing, and they didn’t understand why programs should be bought and sold.
Gates became notorious for denouncing these hippie-hacker gift economies. In 1976, after Microsoft’s inaugural product—an “interpreter” that made the programming language BASIC compatible with the Altair 8800, a popular microcomputer—was widely pirated, he published an open letter accusing the nation’s hobbyists of “theft.” Gates’s haters often cite this document as early proof of his villainy, but he was making an entirely reasonable point. If nobody paid for software, there would be no way to develop it professionally. And without professional software it would be difficult to sustainably build and maintain programs that were sophisticated and reliable enough for personal computing to attain a mass scale.
Gates’s admonition would soon have the force of law: in the early 1980s Congress and the courts established that computer programs were “literary works” that could be copyrighted. Software turned out to be a very lucrative genre of literature. A program, once written, costs nearly nothing to reproduce. Easy reproducibility is a defining feature of digital information, and what Gates discovered was that if you took this fact and fused it with the protections of intellectual property, you could build an infinite money machine. The same principle that underwrote the gift economies of early computer enthusiasts became, in a Reaganite inversion that mirrored the broader disintegration of the counterculture, the basis of a capitalist miracle.
The Internet, in its way, represented the counterculture’s revenge. In the 1990s, as the technology became broadly adopted, the ethos of the 1970s returned. Online, bits flowed even more freely than they had among the hobbyists. In time this problem would be confronted in various ways, from the passage of the Digital Millennium Copyright Act in 1998 to the demolition of high-profile piracy enablers like Napster. But the more fundamental question, for Gates and other digital capitalists, was how to construct a viable business model within a networked system that seemed indifferent, if not hostile, to commercial imperatives.
This was why Gates initially considered the Internet “a side project inside Microsoft,” writes Das. He believed “there was little money to be made.” In May 1995 he shifted course, issuing a company-wide memo in which he described the Internet as a “tidal wave” that would profoundly change the way the company did business. By then an effort was already underway to create what would become Internet Explorer, Microsoft’s web browser. Its debut in August 1995 was the first shot fired in the “browser war” that brought the Seattle giant into its most vicious conflict with Silicon Valley. Internally, the company described it as a “jihad.”
At the time the most popular browser was Netscape Navigator, created by a Silicon Valley start-up that had become the poster child of the dot-coms. Gates targeted it for extermination. Microsoft began bundling Internet Explorer with the Windows 95 operating system for free, which made it everyone’s default browser. Within a few years Internet Explorer had swallowed Netscape’s market share. “I feel Bill Gates is happiest when he is crushing the life out of companies that dare establish territory on the borders of Microsoft’s sprawling dominion,” grumbled Netscape’s cofounder Jim Clark in a memoir quoted by Das.
This turned out to be something of a Pyrrhic victory, however, as Microsoft’s tactics triggered the Justice Department’s antitrust suit in 1998. While the firm ultimately avoided being broken up, the experience was traumatizing. “It completely changed everything inside the company,” one former executive tells Das.
And yet in retrospect the browser war was a sideshow. Gates was a software guy, so his main response to the Internet was to make more software. But the real money turned out to lie in data, not code. It took some time for this lesson to sink in: it was not until the 2000s that Google, from within the smoking crater of the dot-com crash, improvised the basic recipe. Das describes the early Google, founded in 1998 by two Stanford grad students, as a “very self-consciously anti-Microsoft” company, and this antipathy would become a major ingredient in the making of the modern attention economy.
Gates hated the freeness of the Web; Google’s founders made it an asset. Finding Gates’s moneygrubbing profoundly uncool, they insisted on keeping Google a free service, and they landed on advertising as the way to do so. Advertising also threatened to be uncool, so they came up with a system for showing people ads that were relevant to their interests, based on what could be inferred from monitoring user activity. It cost nothing to “surf”—a word that captures the pleasures of early browsing—and this encouraged people to do it more. The more they did it, the more data could be made about them, and this data could subsequently be mined for money in various ways, such as through the sale of targeted advertising. By this roundabout route Silicon Valley achieved national, if not global, supremacy, and the long courtship of capitalism and the Internet was finally consummated.
Today Gates’s days as a corporate killer are mostly forgotten. He became an affable philanthropist, often pictured in a cashmere sweater layered over a collared shirt. The transition began in 2000, when Gates stepped down as Microsoft’s CEO and, with his then wife, Melinda French Gates, dedicated himself to charitable work through a foundation bearing their names.1 Das makes a persuasive case that his embrace of philanthropy was driven in large part by the desire to rehabilitate his image after the shellacking it received in the 1990s. Philanthropy also, she observes, offered a new outlet for his competitiveness. “Once Gates understood that interacting with the media was essentially a game where you could score points based on performance, and the data reflected that thesis, his competitive nature kicked in,” she writes.
Gates won the game. In 2014 a YouGov poll rated him the most admired man in the world, a title he held for five consecutive years. Das attributes this makeover to the “small army of communications professionals” who surround Gates and work tirelessly to maintain his public standing. But his new aura was also, in fairness, a product of the extraordinary scale of his charitable giving.
As a rule, the superrich are stingy. In 2024 Forbes found that the total lifetime giving of the four hundred wealthiest Americans came to only 5 percent of their combined wealth. Rockefeller and his fellow industrialist Andrew Carnegie were historic exceptions to this trend: Rockefeller gave away half of his fortune, Carnegie nearly 90 percent. Gates has consciously emulated their example. In the 1990s he was called a robber baron. After 2000 he embarked on a philanthropic career whose only precedent was that of the robber barons.
The Gates Foundation is one of the largest charitable foundations on the planet. It boasts an endowment of $86 billion and has spent $102 billion since its inception. In 2024 alone it reported dispensing $8 billion in “total charitable support.” It operates in 140 countries and employs more than two thousand people across offices on four continents.
These numbers buy Gates enormous prestige. Das describes how the foundation has forged a “network of ties with governments, multilateral institutions, corporations, countries, universities, and nonprofits” through which it shapes policy. In matters of global public health and development, especially in sub-Saharan Africa, its influence is considerable. It doesn’t just cut a check and walk away; it is, in Das’s words, “scriptwriter, director, and producer.” The foundation “develops ideas and strategies about how to enact change,” she explains, “and then finds partners to implement them.” It is, in other words, a political entity, making political choices about how societies should allocate resources toward particular ends.
Many criticisms have been made of the Gates Foundation over the years, and Das gives ample space to them. Perhaps the most fundamental of these is that it is undemocratic for one unelected individual to hold so much power over people’s lives. The African countries where the foundation is especially active were once, in the decades during and after decolonization, home to social movements that articulated visions of a new international order that could overcome the inegalitarian legacies of Western empire.2 They hoped to put an end not only to colonialism of the territorial sort but also to those relationships of economic dependency and exploitation that persisted after the Europeans formally withdrew, a condition that the Ghanaian revolutionary Kwame Nkrumah called “neocolonialism.” Self-determination was decolonization’s guiding principle. It is a hard principle to square with the practices of the Gates Foundation, which, as Das documents, often takes a top-down approach, neglecting to consider “local knowledge and customs” or incorporate the feedback of those communities likely to be the most affected by the programs it pursues.
The organization has also almost certainly prevented millions of deaths. It is hard to arrive at an exact estimate, but Das cites some telling statistics: from 2000 to 2020 the worldwide maternal mortality rate fell by 34 percent and the mortality rate for children under five by 60 percent. The Gates Foundation, which invests heavily in maternal health programs and in efforts to combat infectious diseases, deserves at least partial credit for these achievements. At a time when the bar for billionaire behavior is reaching new lows, as the American oligarchy gives itself over to the usual late-imperial experiments with psychosis and depravity, Gates’s humanitarianism feels flamboyantly out of fashion.
It is worth dwelling on the Carnegie comparison for a moment, not only because it’s important to Gates—he often claims Carnegie as an inspiration—but because it tells us something about Gates’s worldview. In 1889 Carnegie published a famous essay in the North American Review that became known as “the Gospel of Wealth.” In it, he argues that his fellow magnates should give away their money to help the poor—but only the deserving poor, only those “who will help themselves.” “The best means of benefiting the community is to place within its reach the ladders upon which the aspiring can rise,” Carnegie writes. Some of these ladders, he elaborates, should take the form of “public institutions” that are accessible to everyone. In his philanthropy, Carnegie fulfilled this dictum by financing the creation of thousands of libraries worldwide, as well as a predecessor of what is now Carnegie Mellon University in Pittsburgh.
Carnegie was unapologetic in his defense of the most pitiless version of capitalism imaginable. But he also believed that the worthiest members of the working class should be able to fulfill their potential through study and self-improvement, which is why his libraries were free to use and his school charged low fees. In this conception, there is space for institutions that do not operate on a capitalist basis; indeed, such institutions were the focus of Carnegie’s charitable work.
This is nearly the opposite of Gates’s approach. Gates is, unlike Carnegie, a bleeding heart. Das talks to a foundation employee who recalls telling his boss how much less likely it is for a child born in the Indian state of Uttar Pradesh to survive than a child born in the United States. “That is fucking insane,” Gates replied. “That’s unfair.” The antidote to this unfairness is, in Gates’s view, capitalism. While Carnegie aimed to create opportunities for personal advancement at a remove from the market, Gates believes in harnessing the market for altruistic ends. This is sometimes called “philanthrocapitalism,” and it claims to make philanthropy more efficient by importing the techniques and terminology of private enterprise.
It also preserves existing concentrations of corporate power. During the pandemic, Gates used his influence over global public health policy to oppose calls for pharmaceutical companies to waive patent protections on Covid vaccines so that countries around the world, especially poorer countries, could produce their own generic versions. As Das writes:
The foundation also played a big role in convincing Oxford University to sell the vaccine it had developed, with funding from the British government, to AstraZeneca rather than share the science with developing countries for free so that they could manufacture it locally.
A further selling point of philanthrocapitalism for its practitioners is that, as Das observes, it makes philanthropy more “accessible and deployable” to those “fluent in the language of returns, markets, and capitalism.” This proved useful for Gates because, in addition to becoming the world’s premier philanthrocapitalist, he has also tried to persuade other billionaires to follow suit, particularly through “The Giving Pledge,” a campaign he launched in 2010 with Melinda and the investor Warren Buffett to encourage wealthy people to make a (nonbinding) commitment to give more than half of their net worth away. And it is here that Jeffrey Epstein comes oozing into the story, to catastrophic effect.
In October 2019, two months after Epstein’s death in a Manhattan jail, The New York Times reported that he and Gates had met multiple times between 2011 and 2014. Apparently Epstein pitched Gates on setting up a donor-advised fund for the foundation, which would make it easier for billionaires who signed the Giving Pledge to contribute money and immediately reap the tax benefits. JPMorgan would collect fees for running the fund, Gates would get more cash for his causes, and Epstein would take a cut for his trouble. It was philanthrocapitalism in full financialized flower, orchestrated by a man who was already a registered sex offender.
The idea never came to fruition, but the relationship between Gates and Epstein continued. Gates spent time with Epstein in New York and Florida and on his famed private jet. Epstein also acted as a handler of sorts for a Russian bridge player about thirty years Gates’s junior with whom Gates had an affair, helping get the woman a visa, wiring her money, and letting her stay in his New York apartments. When Gates’s Epstein ties surfaced in the press, the consequences for Gates were severe. The revelations rocked his foundation, contributed to the breakup of his marriage, and torpedoed the cuddly persona he had worked hard to build. Two decades after becoming a full-time philanthropist, he was the bad guy again.
More recent disclosures have revealed more sordid behavior. The tranche of Epstein files that was released by the Justice Department earlier this year includes a pair of emails, both sent by Epstein to himself in 2013, styled as resignation letters written to Gates from Boris Nikolic, a former science adviser for the Gates Foundation. The writer claimed that he facilitated “illicit trysts” for Gates, that he helped Gates obtain drugs “in order to deal with consequences of sex with russian girls,” and that Gates asked him for antibiotics to “surreptitiously give to Melinda.” Through his representatives, Gates has denied these claims. The month after the emails appeared, he apologized to the staff of the Gates Foundation for spending time with Epstein and admitted to two affairs. He insisted, however, that he never committed a crime. “I did nothing illicit,” he said. “I saw nothing illicit.”
This recent history is useful when trying to make sense of Source Code, Gates’s new memoir, the first of three planned volumes that could, when completed, exceed one thousand pages. Source Code covers his childhood in Seattle, his time at Harvard, and his first few years at Microsoft; later installments will give the full story of Microsoft and his philanthropic pursuits. It is probably best to understand this venture as an attempt to purify his brand, an industrial product of what Das calls the “Gates media machine.”
Source Code strikes a careful balance. Young Gates is curious and precocious but awkward and ill-tempered. He is the beneficiary of an affluent upbringing but possesses the intelligence to make the most of his opportunities. He gets into programming at the perfect time—just ahead of the first microcomputers that make personal computing a reality—but has the foresight and initiative to maximize this advantage.
Even the most meticulously humanized portrait may not be enough. As Das points out, Gates’s stature has suffered as a result of both the Epstein connection and his promotion of vaccines during the pandemic, which made him a villain to various Covid denialists and conspiracists. Relatedly, the position he has historically occupied, that of the liberal billionaire, has become lonelier in recent years. The revival of class politics on the left and the rightward shift of a prominent segment of the tech elite means that the “benevolent capitalism” championed by Gates has fewer takers.3 The irony is that benevolent capitalism was the state religion of Silicon Valley when the dot-commers were battling the unbenevolent capitalism of Microsoft—an ethos encapsulated by “Don’t be evil,” Google’s motto for many years. Gates took it up after he went into philanthropy, and has kept the faith much longer than his former competitors.
Still, if Gates has resisted full feralization, he has also tried to ingratiate himself with the current regime, praising Trump after a private dinner in January 2025 and attending a knee-bending ceremony for tech leaders at the White House in September. “Thank you for incredible leadership,” he told the president, seated at a table with Sergey Brin, Mark Zuckerberg, Sam Altman, and several others.
It is clarifying to see Gates in such company. He may once have waged war on Silicon Valley, but the Valley owes much of its present eminence to the playbook he drew up at Microsoft. Gates bent and broke laws, asked not for permission but for forgiveness (and rarely), helped himself freely to the intellectual property of others while vigorously protecting his own, and endeavored not merely to beat his competitors but to extinguish them by any means necessary. Above all he understood that software was the choke point in the personal computing revolution, that as computers proliferated, the code that made those computers useful—and especially their operating systems—would become critically important. Monopolies in the new era would be assembled not from agglomerations of infrastructure such as railroads but through mediating people’s access to the digital world. This privileged position would enable a firm to obtain what economists call “rents”: rather than compete with other companies on price and quality, the digital monopolist could demand something like tribute from his captive customers.
This is the dream that multiple generations of tech entrepreneurs have since pursued. Gates’s initial name for Microsoft Windows was “Interface Manager,” and the phrase aptly summarizes the project continued by his spiritual successors. From Brin to Zuckerberg to Altman, from search engines to social media to chatbots, the goal is to become the interface manager, controlling the surfaces that we use to simplify and humanize computing’s alien depths. Gates is the ghost in our machines.