Digital Rights Archive Newsletter - Ninth edition
Sometimes it seems like most of what comprises internet and digital policy today involves a constant reconsideration of the meaning of words. A couple of decades ago, “open,” as in “open access” or “open source,” was largely seen as a good thing. It implied sharing and greater access to information. That’s no longer a tenable consensus position. “Open” has always had different political and power implications in different settings, as both Peter Baldwin in a podcast, and David Gray Widder, Sarah West and Meredith Whittaker in an already widely circulated article, highlight. Baldwin places “open access” in a wider historical context, while Widder, West and Whittaker look at the “overbroad and ill-defined” use of “open,” especially with respect to OpenAI. At the very least, both of these highlight the importance of not simply assuming that “open” equals “good.” Calling something “open,” as these two works remind us, should be the beginning of the debate, not its conclusion.
On the subject of language, whenever I see an article like Stephen Shankland’s on the cables over which almost all of our internet traffic flows, I can’t help but remember how the late Republican Senator Ted Stevens was mercilessly pilloried for once referring to the internet as “a series of tubes.” When you read his whole quote, it’s clear he doesn’t fully get how the internet works, but … he’s not entirely wrong? For non-experts, is it any less misleading than referring to remote computer servers as “the cloud”? Anything that reminds us of the concreteness of the internet’s “nervous system,” as Shankland refers to these cables, how they work, their geopolitics, and who runs them – increasingly, the big American tech giants – is a welcome addition to the discussion.
Language, or rather the language of hype, also shows up in a nice summary article by Eric Schewe that looks at the role hype is playing in military rivalries over the use of artificial intelligence. It places the debate in historical context, as “merely the latest steps in the automation of state violence that took its first big leap with the formation of standing militaries in Europe in the eighteenth century,” noting further that militaries have been worried about maintaining a technological advantage in automation ever since. It tackles some big issues, noting how this hype is present in both the United States and China, driven as much by fears about how rivals will use the technology – fears themselves driven by hype – as by the technology itself.
This newsletter has repeatedly emphasized historical context and continuity in understanding technological developments and effects. Continuity features prominently again this month, and not just in AI and military technology. Zephyr Teachout’s insightful article on algorithmic personalized wages brings me back to my undergrad microeconomics courses: it describes a technologically new way for companies to capture all the surplus between buyers and sellers, in this case in the labour market. This fits very well with Teachout’s focus on labour solidarity: personalized wages atomize the workforce and transfer power to the boss.
Also on the old-is-new-again front is the Fast Company interview with Lee McGuigan, author of Selling the American People: Advertising, Optimization, and the Origins of Adtech, which reminds us that just as companies have always been interested in capturing consumer surplus, ad companies have always been interested in the ability to target advertising and predict behaviour.
For those interested in digital regulation, we have two very relevant pieces for you. Australia and Australian academics are at the forefront of tech-regulation issues, so the ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S) symposium on automation and regulation promises to be worth your time. Sticking with the theme of regulation, Asaf Lubin joins the Lawfare podcast to discuss the regulation of commercial spyware.
We also have an interesting-looking discussion out of London over the effect of Airbnb on communities that takes on greater resonance with the news that New York City is restricting all Airbnb rentals to residences where the host is present, and limiting guests to two people.
Finally, and personally speaking, Andrew Stokols’ article on an alternative imaginary of the smart city was remarkably timely. I recently attended a play about Alphabet subsidiary Sidewalk Labs’ attempt to build a smart city in Toronto. Its top-down driving forces were the opposite of the grassroots ideas that underlie the concept of “insurgent digital citizenship” emerging from the 2019–20 Hong Kong Anti-ELAB protests. Stokols’ article is a welcome reminder that technology, like language, can be deployed in different ways to different – perhaps even democratic – ends, and that there are always alternatives.
- Blayne Haggart
The Insurgent Smart City: How a Social Movement Created an Alternative Imaginary of the Smart City
Andrew Stokols | Journal of Urban Affairs
This article looks at an urban social movement, the 2019-2020 Hong Kong Anti-ELAB protests, as an alternative to traditional smart cities. It shows how the communications system used in the protests enabled coordination while also remaining open to grassroots decision-making, creating an ‘insurgent smart city’ which inverted top-down visions of a total urban information system. This offers a new sociotechnical imaginary of what smart cities could be and suggests an ‘insurgent digital citizenship’ as an alternative to traditional models.
As Militaries Adopt AI, Hype Becomes a Weapon
Eric Schewe | JSTOR Daily
The applications of AI for military use have led to discussion of the potential implications of these technologies, with anxiety over their potential to lead to increased oppression. To this end, states met in February 2020 to discuss the responsible use of AI for military purposes, and the US issued a political declaration of principles to guide development. Yet these advances are merely the latest steps in the automation of state violence. The real danger lies not in the capabilities of AI, but in how humans interact with it, and awareness should be raised about the implications of its use.
Algorithmic Personalized Wages
Zephyr Teachout | Politics & Society
This article explores algorithmically created personalized wages: what they are, what they mean, and what we can do about them. First, it establishes a taxonomy of five different forms of algorithmic wage differentiation. Second, it argues that the spread of these techniques has democratic implications. They will increase economic and racial inequality. They will harm labor solidarity. Perhaps most importantly, they put workers in a profoundly humiliating position in relationship to their boss, one where speech and autonomy are discouraged because they can lead to lowered pay. Finally, it argues that we should understand these developments as innovations in power and domination and use old antimonopoly strategies as ways to limit the democratic downsides of these tools. We should explore bans or limits on first-degree labor pricing discrimination and enhanced antitrust enforcement.
The Case for Open Access
Peter Baldwin | Then & Now
The guest argues that open access today is not a novelty but continuous with earlier developments in which artists and thinkers were ‘workers for hire’, who were compensated for their creative and scholarly labor. In the same vein, university professors are paid to produce scholarship which should incline them to accept open access.
The Secret Life of the 500+ Cables That Run the Internet
Stephen Shankland | CNET
Subsea cables are essential to global internet infrastructure, connecting the world through high-speed transmission of data. These cables are surprisingly low-tech and remain vulnerable to threats such as sabotage, fishing equipment, and natural disasters. Governments are taking action to protect these cables, while tech giants like Microsoft, Amazon and Google run the internet's brains and nervous system between continents. This infrastructure has enabled us to view live concerts in London from our homes in Atlanta, while also providing economic benefits to even the most remote places.
Automation: A New Regulatory Agenda?
Kimberlee Weatherall, David Abkiewicz, Joanne Gray, Andrew Kenyon, Bill Simpson-Young | ADM+S Centre
Automated decision-making and artificial intelligence have emerged as an area of major regulatory activity. Governments are rushing to introduce laws and policies to address possible harms, but to date the focus has mainly been on protecting society from runaway Tesla cars and the use of AI in automated assessments (like credit scores). This is starting to change following the rise of generative AI, and more attention is being paid to possible risks across the news and media sector. However, interventions in this area are still at an early stage.
Regulating Commercial Spyware
Asaf Lubin | The Lawfare Podcast
The increasingly pervasive use and abuse of spyware by governments around the world has led to calls for regulation and even outright bans. How should these technologies be controlled? According to the guest, the best path forward is an international agreement that would regulate, but not outlaw, these important national security and crime-fighting tools.
Adtech’s Surveillance Ambitions Are Decades in the Making
Lee McGuigan | Fast Company
Algorithms, data extraction, digital marketers monetizing "eyeballs": these all seem like such recent features of our lives. Explaining how marketers have brandished the tools of automation and management science to exploit new profit opportunities, the guest traces data-driven surveillance all the way back to the 1950s, when the computerization of the advertising business began to blend science, technology, and calculative cultures in an ideology of optimization. With that ideology came adtech, a major infrastructure of digital capitalism.
Is Airbnb Ruining Communities?
Merilee Karr, Craig ab Iago | Roundtable
The short-term let industry was originally a way for people to rent out their spare rooms or houses to intrepid travellers who wanted to stay somewhere more authentic than a hotel. But once real estate investors started to see the success of companies like Airbnb, they were soon buying up housing stock to cash in. Are local communities now suffering as a result?
Open (For Business): Big Tech, Concentrated Power, and the Political Economy of Open AI
David G. Widder, Sarah West, Meredith Whittaker | AI Now Institute
This paper examines ‘open’ AI in the context of recent attention to open and open source AI systems. The authors find that the terms ‘open’ and ‘open source’ are used in confusing and diverse ways, often constituting more aspiration or marketing than technical descriptor, and frequently blending concepts from both open source software and open science. This complicates an already complex landscape, in which there is currently no agreed-upon definition of ‘open’ in the context of AI; as such, the term is being applied to widely divergent offerings with little reference to a stable descriptor.