How Platforms Plan the Attention Economy

By recognizing that platforms optimize our behavior to benefit them, we can turn our attention to developing systems that benefit the public.

Dialed into video conferences from around the world, an army of thousands of predominantly 20- and 30-somethings share graphs depicting the behavior of billions of social network users while slinging arcane jargon: MAP, DAP, CAU, L6/7, holdbacks, churn, ad load, etc. These youthful technicians dictate how billions of people spend their time online. We can think of them as the attention economy’s planners.

To dive into just one of these metrics, consider ad load, which refers to the proportion of content in a user’s experience that is paid advertising as opposed to organic content. Facebook’s News Feed, for example, has already reached the ad load equilibrium point. If Facebook were to continue increasing the number of ads, enough people would put down their phones to make the increase not worth it financially. Estimates on Instagram put this equilibrium at one ad every four stories. Meanwhile, the average daily time that an individual user spends on social platforms has ballooned to 144 minutes.
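The equilibrium logic can be sketched with a toy model (every number here is hypothetical, purely for illustration): revenue per user rises with ad load, but retention falls, so planners sweep the ad load and pick the point that maximizes total revenue.

```python
# Toy model of the ad-load equilibrium. All numbers are hypothetical;
# real platforms fit these curves from large-scale experiments.

def daily_users(ad_load: float) -> float:
    """Retention falls as the feed fills with ads."""
    return 1_000_000 * (1.0 - ad_load) ** 2

def revenue(ad_load: float) -> float:
    """Total revenue: users retained times the share of impressions sold."""
    return daily_users(ad_load) * ad_load * 0.05  # $0.05 per ad impression

# Sweep ad load from 0% to 50% and pick the revenue-maximizing point.
loads = [i / 100 for i in range(0, 51)]
best = max(loads, key=revenue)
print(f"revenue-maximizing ad load: {best:.2f}")  # prints 0.33
```

Past that point, adding more ads drives away enough users that total revenue falls, which is exactly the calculation behind a figure like "one ad every four stories."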

Zooming out, the optimization process behind this equilibrium resembles the process of economic planning that has been conducted by economists and technocrats around the world. In a planned system, designated planners imbibe massive amounts of information about an economy and exercise control over supply, demand, and resources to steer it toward preordained outcomes.

There’s an adage that every workplace is organized and every economy is planned — it’s just a question of who is doing the organizing or planning and what interests they serve. For the online attention economy, which commodifies scarce human attention, the planners are the product managers, data scientists, user experience designers, and software engineers of the monopoly digital platforms. These planners optimize a slew of metrics — time spent in an app, engagements with a piece of content, quantity of content viewed — that generate revenue for the shareholders of the platform corporation.

The planners balance the interests of the four primary stakeholders in content distribution platforms: content producers, content consumers, advertisers, and the managers of the platform. The voice of content consumers, the users and citizens of a platform, is quite limited and indirect. As a user consumes content on a social platform, their traces of data are recorded in meticulous detail. Large platforms like Facebook will even survey thousands of users a day, gauging sentiments such as whether users feel that Facebook "cares about us," and building new sentiment metrics (e.g., CAU) from the responses. This data is fed into the planning process: a new feature may increase revenue in the short term, but if it comes at too great a cost to CAU, the feature will be discarded.

Even the few explicit controls that users do have over social platforms are ambiguous or even ignored. On Facebook, a user can switch from an algorithmically ranked feed called Top Stories to a chronologically ranked feed called Most Recent. The chronological feed is still built on some degree of ranking: for most users, the inventory of all available content is so large that some minor stories, such as a friend following a new page, are filtered out. But users are consistently surprised to find that when they switch to the Most Recent feed, the setting automatically reverses itself a few hours later. It suggests that the platform doesn't want to relinquish more than the illusion of control to its users. The planners probably noticed that users on Most Recent feeds generate less favorable metrics than users on Top Stories, and imposed the reversal on all users.

Sometimes the planners make mistakes by collecting incorrect data or applying data incorrectly. Most famously, Facebook misreported the amount of time users spent watching video content, which erroneously led media companies to gut newsrooms and reallocate budgets toward costly video production. Because content platforms have no auditable transparency through which stakeholders can investigate their behavior, this error was sustained for more than a year. The mistake was only made public through the proceedings of a class-action lawsuit brought by advertisers that was settled in 2019.

Planning to our potential

By recognizing the digital attention economy as a system that is subject to planning, we can begin to ask questions and make demands of platforms about how our time and attention should be allocated. If a platform is going to be funded by advertising, how much advertising should a user be subjected to? How should we divide our attention between content consumption, communicating with others, producing content of our own, and, say, gaming? What counter-metrics should be in place to detect when an optimization process is “running up the score” on our limited human attention?

The planners working for online platforms today doubtless understand these questions and desire to deliver positive experiences to their users. It is easy to see, however, that planners are structurally unable to achieve a balance of interests that leans ultimately in the users' favor. The planners, and in particular their managers, are employed by private corporations and charged with maximizing shareholder value. While planners can occasionally make equitable or favorable decisions here and there, the long-term trend under this structure is necessarily toward maximizing profit at the expense of all else.

But digital platforms are still young, and we can be sure that today’s dominant models of organizing the attention economy will be supplanted by new developments over the coming decades. As we evaluate alternative methods of planning the future attention economy, how can we evaluate structures and regimes for planning? Three principles can guide our path: transparency, representation, and choice.

Transparency

Transparency is the bedrock of any fair process. If a subject cannot investigate the gap between what a powerful entity says they’re doing and what they’re actually doing, there can be no trust in the process. Users of today’s online platforms have little reason to trust that planners are acting in the users’ best interest. Going forward, we must demand increased transparency into the processes that online platforms use to plan the attention economy.

Transparency can include anything from making a platform's roadmap public to requiring a process that's analogous to Freedom of Information Act requests for platform corporations, perhaps as part of an antitrust settlement. Open-source software is inherently a tool for transparency, but it is no silver bullet. The machine learning packages used by major platforms are by and large open source, but the data that flows through those learning systems, and the models they produce, are kept secret by platform corporations. It is reasonable to ask why the processes around content ranking are considered so secret by corporations like Facebook that an engineer who merely shared evidence of bias in moderation decisions was fired for his investigation. The current crop of platforms fails miserably at being transparent.

Representation

As practiced today, planning in the digital attention economy is an undemocratic process that denies agency to those who are subject to the actual results of the process. Planners cannot exclusively rely on usage data, survey data, and interviews performed by user experience researchers to make decisions on behalf of users. For the results to be democratic and legitimate, users must have a real way to be involved in decisions made in the planning process.

Facebook recently ceded authority over a small number of content moderation decisions to an independent oversight board that solicited input from users on Facebook’s decision to delete particular pieces of content posted by Donald Trump. This represents only the tiniest step forward in representation for users, as the oversight board has no authority to demand structural changes to Facebook’s role in the attention economy. Actual representation would require user input at much earlier stages of decision-making. Users should be able to say, for example, when a metric like time spent (the average time a user spends engaging with a platform in a day) has been optimized enough and should no longer be increased.

Choice

Even resting on the bedrock of transparency and representation, platforms still hold too much power over users if a user cannot reasonably opt to switch to another product. For digital platforms, choice means giving users the ability to transfer their data from one platform to another. It also means interoperability between platforms and clients.

The ActivityPub protocol, part of the technology stack behind the decentralized microblogging platform Mastodon, is a leading example of choice in digital platforms. Open standards mean that multiple vendors can build platforms and user experiences that communicate with each other while still providing unique value. As for today’s dominant platforms, they are more often seen constraining user choice than expanding it.
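The interoperability that a standard like ActivityPub enables comes down to a shared wire format. Below is a minimal sketch of an ActivityStreams 2.0 payload of the kind ActivityPub servers exchange; the actor and domain URLs are hypothetical, chosen only for illustration.

```python
import json

# A minimal ActivityStreams 2.0 "Create Note" activity, the JSON shape
# ActivityPub servers pass between one another. The example.social URLs
# are made up; any conforming server can parse this regardless of vendor.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://example.social/users/alice",  # hypothetical actor
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
    "object": {
        "type": "Note",
        "attributedTo": "https://example.social/users/alice",
        "content": "Hello, fediverse!",
    },
}

payload = json.dumps(activity, indent=2)
print(payload)
```

Because the format is standardized rather than proprietary, a post serialized this way can be read by Mastodon, by another ActivityPub implementation, or by a tool a user writes themselves, which is precisely the structural property that keeps any single vendor from locking users in.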

The potential of decentralized platforms

If we are looking to build digital platforms that are governed transparently, that represent user interests in the planning process, and that maximize user agency, we should look toward decentralization as our foundation.

Decentralized platforms like Mastodon are necessarily transparent because they are built, hosted, and governed by disparate entities rather than one unitary corporation. Unlike centralized platforms that may release components under an open-source license but keep the development of products fully proprietary, decentralized platforms tend to be developed almost entirely in the open. This level of transparency means that researchers and curious hobbyists alike can audit data flows.

There is plenty of space to focus on the interests of the user within the open process of negotiating a standard like ActivityPub. In fact, it would be hard to imagine practices that are hostile to the user, such as surveillance, making it through the public standards process. The founding documents of the Matrix Foundation, which guides development of the Matrix decentralized chat protocol, say the foundation must explicitly act “for the greater benefit of the whole ecosystem, not benefiting or privileging any single player or subset of players,” where the ecosystem includes end users. Matrix, like most decentralized platforms, is planned through an open process.

The interoperability of decentralized platforms also delivers meaningful choices for the user, who can switch to another federated host or export their data into a standardized format. As with transparency, user choice isn’t guaranteed merely by the good intentions of players in a decentralized ecosystem but by the very structure of a decentralized service. When multiple entities have to agree on a standardized protocol like ActivityPub, Matrix, or even just email in order to communicate, this grants vendors the freedom to create multiple implementations of the standard. Different implementations can appeal to different audiences — for instance, by providing a simplified user interface for novice users and a more powerful interface for experts, or more powerful security features for security- and privacy-conscious users. This choice of implementations gives us more agency and freedom.

Email, the first online communication platform, remains widely in use today largely because of its decentralized nature. In many ways, we are in the early days of decentralized digital platforms, a field that needs radically more research and product development. A January 2021 report from Bluesky, Twitter’s decentralized social network initiative, investigates more than a dozen protocols and applications that are built on decentralized principles. While a few projects, like Mastodon and Matrix, have achieved some scale of usage and funding in the millions of dollars, the amount of investment in decentralized platforms is small compared to the $2 billion received by Facebook prior to its IPO.

By potentiating choice and agency through interoperability, new online platforms can unlock unprecedented innovations that ultimately empower us to reassert control of our attention. The further development of decentralized platforms will give us the tools to build a better attention economy — an economy where we, the users, exercise our voices in planning how we build our online communities.

Artwork By

Cristina Spanò
