Daemons are real. Or at least computer daemons are.
A daemon is early computer slang. In the first computer labs, daemons were the programs running in the background, doing the invisible work of keeping systems online. The word came from Maxwell’s demon, a 19th-century thought experiment that imagined a supernatural creature capable of bringing order to the world, an idea that inspired generations of scientists to try to make daemons real.
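For readers who have never met one, here is a minimal, purely illustrative sketch in Python (not taken from any real system) of what a daemon does: it loops in the background, waking up on a schedule to do maintenance work nobody sees.

```python
import time
from datetime import datetime

def housekeeping():
    # Stand-in for the invisible work a daemon might do:
    # rotating logs, checking connections, cleaning temporary files.
    print(f"[{datetime.now():%H:%M:%S}] system checked, all quiet")

if __name__ == "__main__":
    # The defining habit of a daemon: run unattended, forever,
    # repeating its chores without anyone asking.
    while True:
        housekeeping()
        time.sleep(60)  # rest, then do the invisible work again
```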
My open-access book, Internet Daemons: Digital Communications Possessed, explores the daemons that keep the internet connected. But daemons, and other computer programs like them, do much more than transmit information. Here, I want to connect my intensive focus on internet daemons to the extensive ways that computing orders and optimizes life today.
Daemons are integral to understanding contemporary concerns over smart city projects like Google’s Sidewalk Labs in Toronto and the recent attention to surveillance capitalism. To address these matters of algorithmic regulation, we need to understand the long history of internet daemons and the policy problems they have raised.
Daemonic intelligence
The internet has its roots in the Cold War. Researchers at the United States Advanced Research Projects Agency (ARPA) embedded computers into the design of an experimental digital communication system. Known as ARPANET, this system developed into today’s internet, and its computers, known as Interface Message Processors (IMPs), became the switches, routers and gateways that make up the internet’s infrastructure today.
By embedding computers in its infrastructure, ARPANET gave daemons a mandate to better manage our communications. Daemons formed a collective intelligence, distributed across the ARPANET and in constant contact with each other and with early users, that optimized the flows of information on the network.
ARPANET established that daemons could indeed better manage communications and that an optimal network could be achieved through their coordination. As ARPANET turned into our home internet, these once-theoretical challenges became public ones, left for internet service providers to solve.
The old questions about the optimal way for daemons to manage our communications have turned into major debates about net neutrality and the ability of service providers to manage communication. Service providers can choose to promote business partnerships, like streaming services, and demote peer-to-peer networking, like BitTorrent.
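To make that choice concrete, here is a hypothetical sketch in Python of the kind of prioritization a traffic-management daemon can apply. The application names and priority values are my own illustration, not any provider’s actual policy.

```python
import heapq

# Hypothetical priorities a provider might assign (lower number = served first).
PRIORITY = {"partner-streaming": 0, "web": 1, "bittorrent": 2}

def schedule(packets):
    """Drain a queue of (application, payload) packets in priority order."""
    queue = []
    for order, (app, payload) in enumerate(packets):
        # 'order' breaks ties so equal-priority packets keep their arrival order.
        heapq.heappush(queue, (PRIORITY.get(app, 1), order, app, payload))
    while queue:
        _, _, app, payload = heapq.heappop(queue)
        yield app, payload

arrivals = [("bittorrent", "chunk-17"), ("partner-streaming", "frame-42"), ("web", "page.html")]
for app, payload in schedule(arrivals):
    print(app, payload)
# The partner's stream goes out first and BitTorrent goes last,
# even though the BitTorrent packet arrived first.
```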
5G, the term for next-generation wireless service, will raise these issues all over again. 5G requires even more daemonic intelligence to manage the complexity of sending its radio signals, and it allows service providers to prioritize signals so that some applications perform better than others. This is a problem, because public policy makers hardly understand the work of daemons today and will certainly lack proper oversight of the next-generation daemons necessary for 5G.
The problem of optimization, however, is not just a telecommunications issue.
Governance through optimization
Internet daemons exemplify optimization: computers actively managing systems toward certain goals or highest-efficiency states. Optimization is another way to understand algorithmic governance. It is at once a way of thinking and a way of doing. To optimize is to calculate the optimal states that solve social and political problems; it also involves ways to actualize those states. Daemons are just one kind of optimization that has developed in the history of computing.
The technical connotations of optimization obscure its social and political implications. For example, the optimal amount of news to include in Facebook’s NewsFeed or shorter passenger wait times on Uber are technical decisions as much as business ones.
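A toy calculation (my own, drawn from neither Uber nor Facebook) shows how quickly an “optimal” answer becomes a value judgment: minimizing the average wait means the longest request is always pushed to the back of the line.

```python
# Hypothetical ride requests: (description, estimated trip minutes)
requests = [("long suburban trip", 40), ("airport run", 25), ("short downtown hop", 5)]

def average_wait(order):
    """Average time each rider waits before their trip starts."""
    waits, elapsed = [], 0
    for _, minutes in order:
        waits.append(elapsed)
        elapsed += minutes
    return sum(waits) / len(waits)

fifo = requests                                    # first come, first served
optimized = sorted(requests, key=lambda r: r[1])   # shortest trips first

print("first come, first served:", average_wait(fifo))         # 35.0 minutes
print("optimized for average wait:", average_wait(optimized))  # about 11.7 minutes
# The average improves, but the rider with the longest trip now always waits the most.
```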
The coming pandaemonium
Daemonic optimization became a template for digital platforms. Today, daemons (and other programs like them) are everywhere. Embedded in our screens, apps and smartphones, their nudges, rankings and interventions influence our behaviours and our social activities.
Optimization has moved beyond networked communications and is now being applied to social problems. Google admitted as much in a 2016 internal thought experiment called the Selfish Ledger, which speculated about how global problems, like health or climate change, could be solved by phones and other networked devices tuned to monitor and optimize human activity.
Google functions as what I think of as a global operating system: a distributed intelligence able to steer individuals toward its goals through nudges and other cues in our phones and other devices. It depends on daemons to link, standardize, mediate, secure and manage the flows of information.
Now, we are caught between competing operating systems run by Google, Amazon, Microsoft, Tencent and, to a lesser degree, Facebook. The power of optimization is too much to be left in the hands of a few companies.
In user design, dark patterns refer to interfaces that trick people into making bad decisions. We see dark patterns every day: for example, when we accidentally click a pop-up ad because it looked like a window on our computer, or when we cannot uninstall an app because the “close window” button is hidden. These same technologies could optimize for addiction, anxiety or confusion.
Optimizing is politics by other means
The paradox is that improvements in technologies of control do not imply better control of technology. Instead, these operating systems seem out of the control of everyday people, who themselves feel increasingly under control.
By Fenwick McKelvey
Associate Professor in Information and Communication Technology Policy, Concordia University