“Shall we play a game?”
―Joshua
So many of the Seeds of Destruction in the last installment are tied directly and indirectly to the Internet that I think the time has come to look at it, specifically, for the role it is currently playing in our all-encompassing civilizational drama.
In the intoxicating early days of cyberspace, before the Internet existed, you could veritably taste the early digital anarchy every time the modem burred and beeped and buzzed. Logging on was akin to descending through a portal into a mysterious subterranean world where the normal rules of reality did not (and could not) apply. You can sample some of the flavor in the early cyberpunk works of William Gibson and Bruce Sterling, and in movies like WarGames and Sneakers and (God help us) Hackers—or, in the obscure little show that introduced me to the world of computer geekery, Whiz Kids (which, if I recall correctly, contained the first-ever broadcast reference to the National Security Agency, which up until that time was so secret it was known as “No Such Agency” in wonk circles).
This alt-space was a world of the ungovernable geniuses: the Gen X malcontents who, thirty years earlier, would have been vacuumed up by the security establishment, but for whom the now-moribund halls of power had no use whatsoever. It had social networks and information conduits that were inaccessible to mere mortals, where some of us read classified reports leaked by insiders decades before their legal declassification, and where everyone knew at least one guy who’d prank-called NORAD’s red line to order a pizza (I suspect we all knew the same guy—the story never varied).
Before the hoi polloi knew what a diskette was, we the geeks knew computers were gonna change everything. They were going to make secrets obsolete. They were going to break the state monopoly on information (maintained through the broadcast licenses of the Big Three networks and their relationship with the intelligence establishment). They were going to open the doors of commerce and destroy academic gate-keeping and publishing and up-end copyright, because the mere existence of computers removed all the underlying physical constraints that created those things in the first place.
As we gradually got legal access to the ARPAnet and it got re-branded as “the Internet,” we knew that it was the how part of change everything. The Internet, after all, took the most promising technology in centuries—the personal computer—and gave it the ability, by joining this larger network, to be the protrusion into our dimension of a vast data bank the likes of which the world had never seen. The home/office system plugged into the network gave you the best of all worlds—access to the great wide world, while keeping ownership of your own data. Where once upon a time you had to pay a company to keep your data for you, now you could keep it at home or at your business, independent of Big Computing. No more monthly payments to IBM unless you really wanted to. No worries about big companies turning your data over to spooks without a warrant just to keep their government contracts. No more worries about industrial espionage, or about startups’ secrets getting raided by the big competitors they did business with.
The nineteen-sixties had seen a culture war between those who loathed the sterility, repression, and militarism of Post-War America, and those who loved the stability, prosperity, and optimism of Post-War America. By the nineteen-eighties, as that generation came into wealth and influence, the terms of the war shifted and started spiraling towards an authoritarian attractor state.
The Republicans embraced so-called “traditional family values,” a pile of ideas that didn’t really comfortably reconcile with one another, but which were united by a vague affinity for “the way things used to be when everything worked.” Hardly a coincidence, then, that, in the midst of a deep recession after a decade of scandals, violence, rootlessness, and social strife, Reagan was elected on the slogan “Make America Great Again.” What came next was more surprising, at least to the street-level radicals of the time:
The response to Reagan’s rise among the former radicals—who had, by now, grown into successful yuppies and entrepreneurs and homeowners—was not to raise high the flag of liberty, but to champion voices that were equally censorious, equally authoritarian, and equally deceptive about their desires and aims. America’s Manichean tendencies were again in full flower; the one-time unity of the Post-War order was now a team sport, with each team fighting for control over the most powerful military the world had ever seen. What in the sixties and seventies had seemed a paranoid fantasy was starting to become an inescapable reality:
America had lost its taste for liberty. Its parties were no longer fighting over whether to meddle in family life, or which foreign wars to fight and whether we should fight them at all, or whether to spy on bank accounts in order to catch criminals, or whether it was just to seize property because someone liked a certain drug. They were now arguing only over how civil rights, fiscal discipline, freedom of association, and privacy might best be undermined in the name of the public good. Competing interests were no longer in the cards. Labor and capital were not adversaries, nor were the hawks and doves. The only issue at stake became which flavor of authoritarian fascism the country should move towards.
In the name of prosperity, justice, and safety, of course.
The geeks got busy. They started building the infrastructure for a fully decentralized world. Mesh networks that would survive the destruction of the Internet backbone. Cryptography that not even the NSA could break. They fought court battles over software and free speech. They invented free software so that the world would not be forever in thrall to the IBMs and Suns and Microsofts of the world. The Berkeley Software Distribution, the GNU Project, and Linux all were bootstrapped by students, obsessives, autists, paranoids, and crazies working at the fringes because they listened to the hippies and the Church Committee Hearings, because they wanted a world that was not forever overshadowed by a nuclear threat held at bay by Imperial power. They were transhumanists, syndicalists, anarchists, libertarians, an-caps, Rothbardians, Objectivists, Satanists, Occultists, rationalists, and Discordians with a healthy sense of fun and an unhealthy grudge against the totalitarian potential of the centralized and corporatized security state which was (regardless of who occupied the White House) fascist-in-all-but-name.
In their unique and socially awkward way, the geeks became the torch-bearers of the hippie dream of a world with elbow room, full of the freedom to explore and to criticize and to be left the hell alone. It was the kind of moment you only get in a generation of outcasts. And, by the end of the 1990s, they were only a “eureka” or two away from having the whole damn system functional—of course, then they’d have to make it pretty and user-friendly, which would take some doing (geeks of the time had the aesthetic sense of a colorblind ferret tripping out on acid).
The 1990s were a wild, woolly, freaky time, but they proved to those of us who lived through them that you could do cultural fragmentation in style without treading on too many toes—all you had to do was create a system where anyone could win, where who you were and where you came from didn’t matter. A prince could become a pauper, and a pauper a prince, and it could happen almost overnight, and in some cases it did. By mid-2000, the world was waking up to the Great Decentralization, and everyone was on board: businesses, fringe culture groups, geeks, grandmas, even the intelligence establishment, tasked as it was (at the time) with helping insurgent groups fight warlords and dictators and the few Communist stragglers. About the only major player wholeheartedly fighting the Decentralization was Microsoft, but by that time we’d all seen Microsoft’s true colors, and their cultural capital (even among their fans) was at an all-time low.
The Decentralization was happening. Nothing could stand in its way.
But something could fall into its path, and block it.
Which two giant buildings did, in New York City, on September 11, 2001.
Al Qaeda was, unfortunately, one of those scrappy revolutionary groups that the US security services had pumped a lot of money into helping. They used a lot of the decentralized tech and techniques in the CIA field manual, along with the NSA secure kernel project, the Swiss and Virgin Islands privacy laws, and the off-network movement of fungible digital assets (like airline tickets and commodities and stocks). In one huge crash (well, two), they proved to the world’s greatest imperial power that decentralization might actually be a threat to its authority—something forty years of hacker manifestos had utterly failed to do.
In response, the US locked down the banking system to such an extent that developing technologies—such as digital micropayments—had no hope of taking off. Small banks that had been developing such services, unable to bear the cost of the new regulatory frameworks, were swallowed up by larger banks that were disinclined to investigate new business models, preferring instead to lobby the government for subsidies for increasingly risky asset classes in order to drive the balance-sheet growth they were looking for.
Without micropayments, the Internet turned to advertising to fund its business model, and the needs of the advertisers built snoopers into every operating system, web page, cell phone, and networked device—snoopers that generated information that not only made the new platforms advertiser-friendly but also created a pool of easily de-anonymizable data that could be sold, no questions asked, to political campaigns, law enforcement, activists, and repressive governments around the world.
On the other side, those companies that didn’t go with advertising figured out how to realize Microsoft’s longtime dream of returning businesses to utter dependency upon vendors through the “Cloud” and the “Software as a Service” models. No longer would a business own its own customer data, the licenses to its software, or the right to exist without interference. For a low price (compared to the site licenses of old), any company in the world could now use enterprise-grade software...which would depend utterly upon the sufferance of the vendor, and could be canceled for any reason whatsoever, including political or religious or racial considerations, and the customer wouldn’t be able to get their data or business back without a lawsuit (or the threat of one). The bad old days of an economy in thrall to IBM were back in the guise of Salesforce, Microsoft, AWS, and the rest.
To grant themselves control over information flows, the US reinterpreted laws like the Digital Millennium Copyright Act as granting broad surveillance powers to stop the theft of intellectual property—and if the NSA and CIA and FBI came along for the ride through the new DHS clearing house, well, that was the price of good government.
Finally, an expansive law that had been in the works since the early 1990s, kicking around among a small group of Senators from both sides of the aisle (instigated by a Senator from Delaware who has since ascended to higher office), was introduced in the September 2001 maelstrom. The law granted the US Government the power to treat American citizens as enemy combatants, to hold them without trial or arraignment on suspicion of involvement with terrorism (a term to be defined by the reigning administration), to monitor their communications, to monitor their internet and library habits, to conduct roving surveillance, to seize their voice mail messages, to circumvent FISA protections, and to avoid liability for violating civil and constitutional rights when acting in defense of the homeland.
The Great Decentralization looked dead. In the years that followed, with wars raging continually, the geopolitical and technological worlds seemed to freeze in place, a pregnant pause disturbed only by the quiet, creeping centralization of the Internet, which created monopolies on a scale unheard-of in human history. Each of these “winners” in the digital marketplace was subsidized early and often by lucrative contracts that made it dependent upon the intelligence services, which were determined never to be embarrassed in public again as they had been on September 11, regardless of the cost.
The projects of decentralization continued to bubble beneath the surface, with many of the early minds repositioning themselves to play new roles in a hypothetical future push-back against the overreaches of the imperial state. Skirmishes between these players and governments around the world cropped up from time to time, but compared to the heady days of the 1990s, the landscape exuded an eerie, almost creepy kind of calm.
The Bush years faded into the Obama years, a transition marked only by which news outlets covered governmental malfeasance, and history continued to hold its breath...
...until the world shook for a second time. And this time, nobody could figure out how to plaster up the cracks.
But we’ll talk about that next time.
In the meantime, I invite you to post any corrections or arguments in the comments, or send them directly to me at feedback@jdsawyer.net.
I'm having trouble accepting that 9/11 was really an inflection point in internet freedom. Throughout human history, whenever a new frontier opens, as soon as it's shown that one can make money at it, the established interests rush in and take over. I can't see how 9/11 accelerated or changed this.
The problem with the woolly decentralized frontier is that certain functions really are better when network effects come into play. In the '90s, our biggest problem with the internet was finding things. Google solved that better than just about anyone else. Facebook later solved the problem of finding people we'd lost touch with (remember the first wave of its use and the huge number of divorces as people reconnected with old flames?). Decentralization can't do network effects efficiently. Otherwise tech wouldn't be concentrated in Silicon Valley and a few other hubs. It'd be everywhere.
Which raises the question of how one pays for the network and who manages it. The problem of the "bit economy" was identified in the mid-nineties: how do you make money when replication is free? (Unlike the atom economy, where replicating a bunch of atoms carries a manufacturing cost.) The easy solution was advertising, and so "easy" beat out "better" and drove us to the structure we have now.
The "winners" were coming regardless of the involvement of intelligence agencies. I find it difficult to believe that the Government picked Facebook over MySpace and all the other proto-social networks. If the intelligence agencies actually are involved in Facebook (which I find very doubtful given how it's used frequently by people actively trying to overthrow the government, who would be quashed if the intelligence agencies were actually involved), then they're doing a crappy job.
Perhaps if you fleshed out the micropayments argument more, I could accept that. It's pretty clear that the collusion of the major credit card companies has effectively strangled them by ensuring that they get their "cut" of everything (as well as by creating a non-governmental method of suppressing "undesirable" economic activity).
But advertising was clearly way ahead of micropayments as a way of funding the digital economy. So again, I'm struggling to see how the slowing of decentralization was anything but inevitable.
Also, the argument that America had lost its taste for liberty and was arguing over which form of authoritarianism to embrace is worth its own separate long discussion. I agree that it happened, but understanding the contributing causes may help us push back against it better.