
McQuillan (2022) Deep Learning and Human Disposability

Header image: KF in DALL-E

Dan McQuillan https://logicmag.io/home/deep-learning-and-human-disposability/

AI is a technology for managing social murder.

Under austerity, AI’s capacities to rank and classify help to differentiate between “deserving” and “undeserving” welfare recipients and enable a data-driven triage of public services.

While the sharp end of welfare sanctions is initially applied to those who are seen as living outside the circuits of inclusion, algorithmically powered changes to the social environment will affect everyone in the long run. The resulting social re-engineering will be marked by AI’s signature of abstraction, distancing, and optimization, and will increasingly determine how we are able to live, or whether we are able to live at all.

By ignoring our interdependencies and sharpening our differences, AI becomes the automation of former UK prime minister Margaret Thatcher’s mantra that “there is no such thing as society.”

While AI is heralded as a futuristic form of productive technology that will bring abundance for all, its methods of helping to decide who gets what, when, and how are actually forms of rationing. Under austerity, AI becomes machinery for the reproduction of scarcity.

The modern conception of the state of exception was introduced by German philosopher and Nazi Party member Carl Schmitt in the 1920s, who assigned to the sovereign the role of suspending the law in the name of the public good. AI has an inbuilt tendency towards creating partial states of exception through its ability to enforce exclusion while remaining opaque. People’s lives can be impacted simply by crossing some statistical confidence limit, and they may not even know it.

  • KF: Inclusion can be equally destructive, cf. RTBF (right to be forgotten)
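As a concrete illustration of that point, here is a minimal sketch of how crossing a statistical confidence limit can silently exclude someone. Every name, score, and threshold is hypothetical, not from McQuillan's piece:

```python
# Hypothetical sketch of an "algorithmic state of exception":
# an eligibility model whose confidence threshold silently drops
# people from a queue, with no notification and no appeal path.
from dataclasses import dataclass

FRAUD_CONFIDENCE_LIMIT = 0.85  # arbitrary cutoff chosen by the agency


@dataclass
class Claimant:
    name: str
    fraud_score: float  # model output in [0, 1]


def triage(claimants: list[Claimant]) -> list[Claimant]:
    """Return only claimants whose score stays under the limit.

    Anyone crossing the statistical confidence limit is dropped.
    The claimant is never told that a score existed, what it was,
    or that crossing it is why their case stalled.
    """
    return [c for c in claimants if c.fraud_score < FRAUD_CONFIDENCE_LIMIT]


queue = [Claimant("A", 0.12), Claimant("B", 0.86), Claimant("C", 0.40)]
print([c.name for c in queue])          # ['A', 'B', 'C']
print([c.name for c in triage(queue)])  # ['A', 'C'] -- B is excluded, silently
```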

AI’s actions of segregating and scarcifying can have the force of the law without being of the law, and will create what we might call “algorithmic states of exception” (e.g., the no-fly list and the Airbnb example).

Governments are already implementing fully fledged states of exception for refugees and asylum seekers. Giorgio Agamben uses the term “bare life” to describe the body under the state of exception, stripped of political or civil existence.

One classic failing of prescription drug monitoring systems, which score patients’ risk of opioid misuse, has been misinterpreting medication that people had obtained for sick pets; dogs with medical problems are often prescribed opioids and benzodiazepines, and these veterinary prescriptions are made out in the owner’s name. As a result, people with a well-founded need for opioid painkillers for serious conditions like endometriosis have been denied medication by hospitals and by their own doctors.

The problems with these systems go even deeper; past experience of sexual abuse has been used as a predictor of the likelihood of becoming addicted to medication, meaning that subsequent denial of medicines becomes a kind of victim blaming.

A supposed harm reduction system based on algorithmic correlations ultimately produces harmful exclusions.
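A minimal sketch of how that failure mode can arise, using a toy additive risk score. The weights and cutoff are invented for illustration; the real systems' logic is proprietary, and nothing here is from the source:

```python
# Toy illustration (not the real scoring logic) of how misattributed
# records inflate a correlation-based risk score past a denial cutoff.
DENIAL_THRESHOLD = 5.0  # hypothetical cutoff

# Hypothetical feature weights learned from correlations, not causes.
WEIGHTS = {
    "opioid_prescriptions": 1.5,
    "benzo_prescriptions": 2.0,
    "past_abuse_reported": 1.0,  # victim history treated as a "risk factor"
}


def risk_score(record: dict) -> float:
    return sum(WEIGHTS[k] * record.get(k, 0) for k in WEIGHTS)


# A patient with endometriosis and one opioid prescription of her own...
patient = {"opioid_prescriptions": 1, "past_abuse_reported": 1}
print(risk_score(patient))  # 2.5 -> below the cutoff, medication allowed

# ...but her dog's opioid and benzodiazepine prescriptions are filed
# under her name, and the database cannot tell them apart from hers.
patient["opioid_prescriptions"] += 1
patient["benzo_prescriptions"] = 1
print(risk_score(patient))  # 6.0 -> crosses the cutoff; she is denied
```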

Securitization… The successful passing of measures that would not normally be socially acceptable comes from the construction of the threat as existential: a threat to the very existence of the society means more or less any response is legitimized.

The Tech to Prison Pipeline

ShotSpotter [Chicago] is a vivid example of the sedimentation of inequalities through algorithmic systems: its deployment overlays predictive suspicion onto communities of color, resulting, inevitably, in cases of unjust imprisonment.

The cascading effect of securitization and algorithmic states of exception is to expand carcerality—that is, aspects of governance that are prison-like.

The kind of social divisions that are amplified by AI have been put under the spotlight by Covid-19: the pandemic is a stress test for underlying social unfairness.

We already have all the computing we need. What remains is how to transform it into a machinery of the commons.