Or how, exactly, platforms die (in case you're in need of a little TLDR).
Here is how platforms die: First, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves. Then, they die.
I call this enshittification, and it is a seemingly inevitable consequence of two factors: the ease with which a platform can change how it allocates value, and the nature of a "two-sided market," where a platform sits between buyers and sellers, holding each hostage to the other while raking off an ever-larger share of the value that passes between them.
When a platform starts, it needs users, so it makes itself valuable to users. Think of Amazon: For many years, it operated at a loss, using its access to the capital markets to subsidize everything you bought. It sold goods below cost and shipped them below cost. It operated a clean and useful search. If you searched for a product, Amazon tried its damndest to put it at the top of the search results.
This was a hell of a good deal for Amazon's customers. Lots of us piled in, and lots of brick-and-mortar retailers withered and died, making it hard to go elsewhere. Amazon sold us ebooks and audiobooks that were permanently locked to its platform with DRM, so that every dollar we spent on media was a dollar we'd have to give up if we deleted Amazon and its apps. And Amazon sold us Prime, getting us to pre-pay for a year's worth of shipping. Prime customers start their shopping on Amazon, and 90 percent of the time, they don't search anywhere else.
That tempted in lots of business customers—marketplace sellers who turned Amazon into the "everything store" it had promised from the beginning. As these sellers piled in, Amazon shifted to subsidizing suppliers. Kindle and Audible creators got generous packages. Marketplace sellers reached huge audiences and Amazon took low commissions from them.
This strategy meant that it became progressively harder for shoppers to find things anywhere except Amazon, which meant that they only searched on Amazon, which meant that sellers had to sell on Amazon. That's when Amazon started to harvest the surplus from its business customers and send it to Amazon's shareholders. Today, Marketplace sellers are handing more than 45 percent of the sale price to Amazon in junk fees. The company's $31 billion "advertising" program is really a payola scheme that pits sellers against each other, forcing them to bid on the chance to be at the top of your search.
Searching Amazon doesn't produce a list of the products that most closely match your search; it brings up a list of products whose sellers have paid the most to be at the top of that search. Those fees are built into the cost you pay for the product, and Amazon's "Most Favored Nation" requirement for sellers means that they can't sell more cheaply elsewhere, so Amazon has driven prices up at every retailer.
Search Amazon for "cat beds" and the entire first screen is ads, including ads for products Amazon cloned from its own sellers, putting them out of business (third parties have to pay 45 percent in junk fees to Amazon, but Amazon doesn't charge itself these fees). All told, the first five screens of results for "cat bed" are 50 percent ads.
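To make the payola mechanics concrete, here is a deliberately simplified sketch. This is not Amazon's actual ranking code; the fields, weights, and numbers are invented for illustration. It only shows what happens, in principle, when placement is sold rather than earned:

```python
# Hypothetical illustration of payola-weighted ranking (not any real platform's code).
# Listings are ordered mostly by how much the seller bid, not by how well they match.
from dataclasses import dataclass

@dataclass
class Listing:
    title: str
    relevance: float   # 0..1, how well the product matches the query
    ad_bid: float      # dollars the seller pays per click for placement

def rank_results(listings, bid_weight=10.0):
    """Sort listings so that ad spend dominates relevance."""
    return sorted(
        listings,
        key=lambda l: l.relevance + bid_weight * l.ad_bid,
        reverse=True,
    )

catalog = [
    Listing("Best-matching cat bed", relevance=0.95, ad_bid=0.00),
    Listing("Sponsored knockoff cat bed", relevance=0.60, ad_bid=0.40),
    Listing("House-brand clone cat bed", relevance=0.55, ad_bid=0.50),
]

for listing in rank_results(catalog):
    print(listing.title)
# The listing that best matches the shopper's query prints last.
```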
This is enshittification: Surpluses are first directed to users; then, once they're locked in, surpluses go to suppliers; then once they're locked in, the surplus is handed to shareholders and the platform becomes a useless pile of shit. From mobile app stores to Steam, from Facebook to Twitter, this is the enshittification lifecycle.
This is why—as Cat Valente wrote in her magisterial pre-Christmas essay—platforms like Prodigy transformed themselves overnight, from a place where you went for social connection to a place where you were expected to “stop talking to each other and start buying things.”
This shell-game with surpluses is what happened to Facebook. First, Facebook was good to you: It showed you the things the people you loved and cared about had to say. This created a kind of mutual hostage-taking: Once a critical mass of people you cared about were on Facebook, it became effectively impossible to leave, because you'd have to convince all of them to leave too, and agree on where to go. You may love your friends, but half the time you can't agree on what movie to see and where to go for dinner. Forget it.
Then, it started to cram your feed full of posts from accounts you didn't follow. At first, it was media companies, whom Facebook preferentially crammed down its users' throats so that they would click on articles and send traffic to newspapers, magazines, and blogs. Then, once those publications were dependent on Facebook for their traffic, it dialed down their traffic. First, it choked off traffic to publications that used Facebook to run excerpts with links to their own sites, as a way of driving publications into supplying full-text feeds inside Facebook's walled garden.
This made publications truly dependent on Facebook—their readers no longer visited the publications' websites, they just tuned into them on Facebook. The publications were hostage to those readers, who were hostage to each other. Facebook stopped showing readers the articles publications ran, tuning The Algorithm to suppress posts from publications unless they paid to "boost" their articles to the readers who had explicitly subscribed to them and asked Facebook to put them in their feeds.
Now, Facebook started to cram more ads into the feed, mixing payola from people you wanted to hear from with payola from strangers who wanted to commandeer your eyeballs. It gave those advertisers a great deal, charging a pittance to target their ads based on the dossiers of non-consensually harvested personal data they'd stolen from you.
Sellers became dependent on Facebook, too, unable to carry on business without access to those targeted pitches. That was Facebook's cue to jack up ad prices, stop worrying so much about ad fraud, and to collude with Google to rig the ad market through an illegal program called Jedi Blue.
Today, Facebook is terminally enshittified, a terrible place to be whether you're a user, a media company, or an advertiser. It's a company that deliberately demolished a huge fraction of the publishers it relied on, defrauding them into a "pivot to video" based on false claims of the popularity of video among Facebook users. Companies threw billions into the pivot, but the viewers never materialized, and media outlets folded in droves.
But Facebook has a new pitch. It claims to be called Meta, and it has demanded that we live out the rest of our days as legless, sexless, heavily surveilled low-poly cartoon characters. It has promised companies that make apps for this metaverse that it won't rug them the way it did the publishers on the old Facebook. It remains to be seen whether they'll get any takers. As Mark Zuckerberg once candidly confessed to a peer, marveling at all of his fellow Harvard students who sent their personal information to his new website, "TheFacebook": "They 'trust me.' Dumb fucks."
Once you understand the enshittification pattern, a lot of the platform mysteries solve themselves. Think of the SEO market, or the whole energetic world of online creators who spend endless hours engaged in useless platform Kremlinology, hoping to locate the algorithmic tripwires, which, if crossed, doom the creative works they pour their money, time, and energy into.
Working for the platform can be like working for a boss who takes money out of every paycheck for all the rules you broke, but who won't tell you what those rules are because if he told you that, then you'd figure out how to break those rules without him noticing and docking your pay. Content moderation is the only domain where security through obscurity is considered a best practice.
The situation is so dire that organizations like Tracking Exposed have enlisted a human army of volunteers and a robot army of headless browsers to try to unwind the logic behind the arbitrary machine judgments of The Algorithm, both to give users the option to tune the recommendations they receive, and to help creators avoid the wage theft that comes from being shadowbanned.
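For a sense of what that headless-browser work involves, here is a minimal, hypothetical sketch using Playwright. The URL and CSS selector are placeholders rather than any real platform's markup, and real auditing projects like Tracking Exposed handle consent, rate limiting, and data protection that this toy omits:

```python
# A minimal sketch of the kind of sampling a headless-browser volunteer might run.
# The feed URL and selector below are placeholders, not a real platform's markup.
from playwright.sync_api import sync_playwright

FEED_URL = "https://example-video-platform.test/foryou"   # hypothetical
ITEM_SELECTOR = "a.feed-item"                              # hypothetical

def sample_feed(n_scrolls: int = 5) -> list[str]:
    """Load the recommendation feed in a headless browser and record what was shown."""
    seen = []
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(FEED_URL)
        for _ in range(n_scrolls):
            page.mouse.wheel(0, 2000)      # scroll to trigger more recommendations
            page.wait_for_timeout(1500)    # give new items time to render
            for item in page.query_selector_all(ITEM_SELECTOR):
                href = item.get_attribute("href")
                if href and href not in seen:
                    seen.append(href)
        browser.close()
    return seen

if __name__ == "__main__":
    for url in sample_feed():
        print(url)
```

Running many such sessions from different accounts and comparing the results is how auditors infer what the recommender is actually doing, as opposed to what the platform says it does.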
But what if there is no underlying logic? Or, more to the point, what if the logic shifts based on the platform's priorities? If you go down to the midway at your county fair, you'll spot some poor sucker walking around all day with a giant teddy bear that they won by throwing three balls in a peach basket.
The peach-basket is a rigged game. The carny can use a hidden switch to force the balls to bounce out of the basket. No one wins a giant teddy bear unless the carny wants them to win it. Why did the carny let the sucker win the giant teddy bear? So that he'd carry it around all day, convincing other suckers to put down five bucks for their chance to win one.
The carny allocated a giant teddy bear to that poor sucker the way that platforms allocate surpluses to key performers—as a convincer in a "Big Store" con, a way to rope in other suckers who'll make content for the platform, anchoring themselves and their audiences to it.
Which brings me to TikTok. TikTok is many different things, including “a free Adobe Premiere for teenagers that live on their phones.” But what made it such a success early on was the power of its recommendation system. From the start, TikTok was really, really good at recommending things to its users. Eerily good.
By making good-faith recommendations of things it thought its users would like, TikTok built a mass audience, larger than many thought possible, given the death grip of its competitors, like YouTube and Instagram. Now that TikTok has the audience, it is consolidating its gains and seeking to lure away the media companies and creators who are still stubbornly attached to YouTube and Insta.
Yesterday, Forbes's Emily Baker-White broke a fantastic story about how that actually works inside of ByteDance, TikTok's parent company, citing multiple internal sources, revealing the existence of a "heating tool" that TikTok employees use to push videos from select accounts into millions of viewers' feeds.
These videos go into TikTok users' For You feeds, which TikTok misleadingly describes as being populated by videos "ranked by an algorithm that predicts your interests based on your behavior in the app." In reality, For You is only sometimes composed of videos that TikTok thinks will add value to your experience—the rest of the time, it's full of videos that TikTok has inserted in order to make creators think that TikTok is a great place to reach an audience.
"Sources told Forbes that TikTok has often used heating to court influencers and brands, enticing them into partnerships by inflating their videos’ view count. This suggests that heating has potentially benefitted some influencers and brands—those with whom TikTok has sought business relationships—at the expense of others with whom it has not."
In other words, TikTok is handing out giant teddy bears.
But TikTok is not in the business of giving away giant teddy bears. TikTok, for all that its origins are in the quasi-capitalist Chinese economy, is just another paperclip-maximizing artificial colony organism that treats human beings as inconvenient gut flora. TikTok is only going to funnel free attention to the people it wants to entrap until they are entrapped, then it will withdraw that attention and begin to monetize it.
"Monetize" is a terrible word that tacitly admits that there is no such thing as an "attention economy." You can't use attention as a medium of exchange. You can't use it as a store of value. You can't use it as a unit of account. Attention is like cryptocurrency: a worthless token that is only valuable to the extent that you can trick or coerce someone into parting with "fiat" currency in exchange for it. You have to "monetize" it—that is, you have to exchange the fake money for real money.
In the case of cryptos, the main monetization strategy was deception-based. Exchanges and "projects" handed out a bunch of giant teddy-bears, creating an army of true-believer Judas goats who convinced their peers to hand the carny their money and try to get some balls into the peach-basket themselves.
But deception only produces so much "liquidity provision." Eventually, you run out of suckers. To get lots of people to try the ball-toss, you need coercion, not persuasion. Think of how US companies ended the defined-benefit pension that guaranteed you a dignified retirement, replacing it with market-based 401(k) pensions that forced you to gamble your savings in a rigged casino, making you the sucker at the table, ripe for the picking.
Early crypto liquidity came from ransomware. The existence of a pool of desperate, panicked companies and individuals whose data had been stolen by criminals created a baseline of crypto liquidity because they could only get their data back by trading real money for fake crypto money.
The next phase of crypto coercion was Web3: converting the web into a series of tollbooths that you could only pass through by trading real money for fake crypto money. The internet is a must-have, not a nice-to-have, a prerequisite for full participation in employment, education, family life, health, politics, civics, even romance. By holding all those things to ransom behind crypto tollbooths, the holders hoped to convert their tokens to real money.
For TikTok, handing out free teddy-bears by "heating" the videos posted by skeptical performers and media companies is a way to convert them to true believers, getting them to push all their chips into the middle of the table, abandoning their efforts to build audiences on other platforms (it helps that TikTok's format is distinctive, making it hard to repurpose videos for TikTok to circulate on rival platforms).
Once those performers and media companies are hooked, the next phase will begin: TikTok will withdraw the "heating" that sticks their videos in front of people who never heard of them and haven't asked to see their videos. TikTok is performing a delicate dance here: There's only so much enshittification they can visit upon their users' feeds, and TikTok has lots of other performers they want to give giant teddy-bears to.
TikTok won't just starve performers of the "free" attention by depreferencing them in the algorithm, it will actively punish them by failing to deliver their videos to the users who subscribed to them. After all, every time TikTok shows you a video you asked to see, it loses a chance to show you a video it wants you to see, because your attention is a giant teddy-bear it can give away to a performer it is wooing.
This is just what Twitter has done as part of its march to enshittification: thanks to its "monetization" changes, the majority of people who follow you will never see the things you post. I have ~500k followers on Twitter and my threads used to routinely get hundreds of thousands or even millions of reads. Today, it's hundreds, perhaps thousands.
I just handed Twitter $8 for Twitter Blue, because the company has strongly implied that it will only show the things I post to the people who asked to see them if I pay ransom money. This is the latest battle in one of the internet's longest-simmering wars: the fight over end-to-end.
In the beginning, there were Bellheads and Netheads. The Bellheads worked for big telcos, and they believed that all the value of the network rightly belonged to the carrier. If someone invented a new feature—say, Caller ID—it should only be rolled out in a way that allows the carrier to charge you every month for its use. This is Software-As-a-Service, Ma Bell style.
The Netheads, by contrast, believed that value should move to the edges of the network—spread out, pluralized. In theory, CompuServe could have "monetized" its own version of Caller ID by making you pay $2.99 extra to see the "From:" line on email before you opened the message—charging you to know who was speaking before you started listening—but they didn't.
The Netheads wanted to build diverse networks with lots of offers, lots of competition, and easy, low-cost switching between competitors (thanks to interoperability). Some wanted this because they believed that the net would someday be woven into the world, and they didn't want to live in a world of rent-seeking landlords. Others were true believers in market competition as a source of innovation. Some believed both things. Either way, they saw the risk of network capture, the drive to monetization through trickery and coercion, and they wanted to head it off.
They conceived of the end-to-end principle: the idea that networks should be designed so that willing speakers' messages would be delivered to willing listeners' end-points as quickly and reliably as they could be. That is, irrespective of whether a network operator could make money by sending you the data it wanted to receive, its duty would be to provide you with the data you wanted to see.
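The contrast is easy to state in code. Here's a toy sketch, using invented data structures rather than any platform's implementation, of an end-to-end feed that delivers what willing listeners subscribed to, next to a caricature of a payola-ranked feed that delivers whatever was paid for:

```python
# Toy contrast between an end-to-end feed and a payola-ranked one.
# All data structures here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: int
    boost_paid: float = 0.0   # dollars the author paid to "boost" this post

def end_to_end_feed(posts, subscriptions):
    """Deliver everything from accounts the user subscribed to, newest first."""
    return sorted(
        (p for p in posts if p.author in subscriptions),
        key=lambda p: p.timestamp,
        reverse=True,
    )

def payola_feed(posts, subscriptions):
    """Caricature: ignore subscriptions, surface whatever was paid for."""
    return sorted(posts, key=lambda p: p.boost_paid, reverse=True)[:10]
```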
The end-to-end principle is dead at the service level today. Useful idiots on the right were tricked into thinking that the risk of Twitter mismanagement was "woke shadowbanning," whereby the things you said wouldn't reach the people who asked to hear them because Twitter's deep state didn't like your opinions. The real risk, of course, is that the things you say won't reach the people who asked to hear them because Twitter can make more money by enshittifying their feeds and charging you ransom for the privilege to be included in them.
As I said at the start of this essay, enshittification exerts a nearly irresistible gravity on platform capitalism. It's just too easy to turn the enshittification dial up to eleven. Twitter was able to fire the majority of its skilled staff and still crank the dial all the way over, even with a skeleton crew of desperate, demoralized H-1B workers who are shackled to Twitter's sinking ship by the threat of deportation.
The temptation to enshittify is magnified by the blocks on interoperability: When Twitter bans interoperable clients, nerfs its APIs, and periodically terrorizes its users by suspending them for including their Mastodon handles in their bios, it makes it harder to leave Twitter, and thus increases the amount of enshittification users can be force-fed without risking their departure.
Twitter is not going to be a "protocol." I'll bet you a testicle (not one of mine) that projects like Bluesky will find no meaningful purchase on the platform, because if Bluesky were implemented and Twitter users could order their feeds for minimal enshittification and leave the service without sacrificing their social networks, it would kill the majority of Twitter's "monetization" strategies.
An enshittification strategy only succeeds if it is pursued in measured amounts. Even the most locked-in user eventually reaches a breaking point and walks away, or gets pushed. The villagers of Anatevka in Fiddler on the Roof tolerated the Cossacks' violent raids and pogroms for years, until they were finally forced to flee to Krakow, New York, and Chicago.
For enshittification-addled companies, that balance is hard to strike. Individual product managers, executives, and activist shareholders all give preference to quick returns at the cost of sustainability, and are in a race to see who can eat their seed-corn first. Enshittification has only lasted for as long as it has because the internet has devolved into “five giant websites, each filled with screenshots of the other four.”
With the market sewn up by a group of cozy monopolists, better alternatives don't pop up and lure us away, and if they do, the monopolists just buy them out and integrate them into their enshittification strategies, like when Mark Zuckerberg noticed a mass exodus of Facebook users who were switching to Instagram, and so he bought Instagram. As Zuck says, "It is better to buy than to compete."
This is the hidden dynamic behind the rise and fall of Amazon Smile, the program whereby Amazon gave a small amount of money to charities of your choice when you shopped there, but only if you used Amazon's own search tool to locate the products you purchased. This provided an incentive for Amazon customers to use its own increasingly enshittified search, which it could cram full of products from sellers who coughed up payola, as well as its own lookalike products. The alternative was to use Google, whose search tool would send you directly to the product you were looking for, and then charge Amazon a commission for sending you to it.
The demise of Amazon Smile coincides with the increasing enshittification of Google Search, the only successful product the company managed to build in-house. All its other successes were bought from other companies: video, docs, cloud, ads, mobile. Its remaining home-grown products are either flops like Google Video, clones (Gmail is a Hotmail clone), or adapted from other companies' products, like Chrome.
Google Search was based on principles set out in founders Larry Page and Sergey Brin's landmark 1998 paper, "The Anatomy of a Large-Scale Hypertextual Web Search Engine," in which they wrote, “Advertising funded search engines will be inherently biased towards the advertisers and away from the needs of consumers.”
Even with that foundational understanding of enshittification, Google has been unable to resist its siren song. Today's Google results are an increasingly useless morass of self-preferencing links to its own products, ads for products that aren't good enough to float to the top of the list on their own, and parasitic SEO junk piggybacking on the former.
Enshittification kills. Google just laid off 12,000 employees, and the company is in a full-blown "panic" over the rise of "AI" chatbots, making a full-court press for an AI-driven search tool—that is, a tool that won't show you what you ask for, but rather, what it thinks you should see.
Now, it's possible to imagine that such a tool will produce good recommendations, like TikTok's pre-enshittified algorithm did. But it's hard to see how Google will be able to design a non-enshittified chatbot front-end to search, given the strong incentives for product managers, executives, and shareholders to enshittify results to the precise threshold at which users are nearly pissed off enough to leave, but not quite.
Even if it manages the trick, this almost-but-not-quite-unusable equilibrium is fragile. Any exogenous shock—a new competitor like TikTok that penetrates the anticompetitive "moats and walls" of Big Tech, a privacy scandal, a worker uprising—can send it into wild oscillations.
Enshittification truly is how platforms die. That's fine, actually. We don't need eternal rulers of the internet. It's okay for new ideas and new ways of working to emerge. The emphasis of lawmakers and policymakers shouldn't be preserving the crepuscular senescence of dying platforms. Rather, our policy focus should be on minimizing the cost to users when these firms reach their expiry date: Enshrining rights like end-to-end would mean that no matter how autocannibalistic a zombie platform became, willing speakers and willing listeners would still connect with each other.
And policymakers should focus on freedom of exit—the right to leave a sinking platform while continuing to stay connected to the communities that you left behind, enjoying the media and apps you bought and preserving the data you created.
The Netheads were right: Technological self-determination is at odds with the natural imperatives of tech businesses. They make more money when they take away our freedom—our freedom to speak, to leave, to connect.
For many years, even TikTok's critics grudgingly admitted that no matter how surveillant and creepy it was, it was really good at guessing what you wanted to see. But TikTok couldn't resist the temptation to show you the things it wants you to see rather than what you want to see. The enshittification has begun, and now it is unlikely to stop.
It's too late to save TikTok. Now that it has been infected by enshittification, the only thing left is to kill it with fire.