In the Form of Crunching Numbers: Data at Its Most Insidious

We conceptualize data harvesting as, at best, the tradeoff we make to access our digital world, with uncannily targeted advertisements as its only real consequence; at worst, as a slithering intrusion into our lives by powerful corporate overlords unethically harvesting our data to fuel their profit motive. This understanding, however, may be all too limited.

Consider the revelations of Edward Snowden, the whistleblower who exposed the incredibly advanced operation the U.S. government uses to gather data on its citizens and its enemies. Perhaps the horror should not have come from the bare fact that data was being collected, but from what that data was being collected for.

“The 21st Century is a digital book… Your bank records, medical histories, voting patterns, emails, phone calls, your damn S.A.T. scores… [the] algorithm evaluates people’s past to predict their future.”

So sayeth Jasper Sitwell (played by Maximiliano Hernández) in a memorable scene from the superhero blockbuster Captain America: The Winter Soldier, in which the turncoat Sitwell explains a mad scientist’s design to keep the world “safe” using an algorithm, crunching data and predicting who could become a threat to the villains’ would-be dictatorship. But that’s just a superhero movie. Pure science fiction. Right?

Frighteningly, this is not the case, as John Cheney-Lippold relates in We Are Data:

…take the trend toward “predictive policing” in U.S. police departments. In this techno-enthused strategy, police departments use crime statistics to generate maps highlighting five-hundred-by-five-hundred-square-foot areas (one city block) where crimes are likely to occur. A block might be “high crime” at two a.m. on a Friday but “low crime” at two p.m. on a Tuesday. Or we could think like the Chicago Police Department (CPD) in 2013 and channel this logic—Minority Report–style—in order to assign “criminality” not to blocks but to people.


The Chicago Police Department maintains a so-called “heat list” using this data to determine who is likeliest to be a victim or perpetrator of crime, or where crimes are likely to occur in the city. The models used in this sort of predictive policing are, of course, imperfect, and, as with most metrics of crime, incredibly biased. The innocent victims of this model, those guilty by association, are likely to be the same impoverished and disenfranchised minorities the police usually target. Biases can be reinforced and presented as “objective fact” to smokescreen racist and oppressive policing. Says Cheney-Lippold:

It is impossible to divorce the CPD’s “heat map” from centuries of preceding racist and classist policing practices, as those who are assigned by the department to be ‘at risk’ are unlikely to be affluent whites from Chicago’s Gold Coast neighborhood. The data used to generate ‘at risk’ is not representative of a universal population and thus does not treat all populations equally.
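The feedback loop Cheney-Lippold describes can be sketched in a few lines of code. This is not the CPD's actual model, just a minimal illustration of how a frequency-based "risk score" over city blocks simply mirrors where policing has historically been concentrated; the block names and arrest counts below are entirely hypothetical.

```python
# A minimal sketch (NOT the CPD's actual model) of how grid-based
# "predictive policing" reproduces the bias in its training data.
# All block names and counts here are hypothetical.
from collections import Counter

# Historical arrest records: each entry is the block where an arrest
# was logged. Heavily patrolled blocks generate more records,
# not necessarily more crime.
arrests = (["block_A"] * 80    # heavily patrolled neighborhood
           + ["block_B"] * 15
           + ["block_C"] * 5)  # affluent, lightly patrolled

counts = Counter(arrests)
total = sum(counts.values())

# "Risk score" = historical arrest frequency per block.
risk = {block: n / total for block, n in counts.items()}

# The department then sends more patrols to the "highest-risk" block,
# which generates still more arrest records there: a feedback loop.
hottest = max(risk, key=risk.get)
print(hottest, risk[hottest])  # block_A scores highest purely because of past policing
```

Nothing in this toy model measures crime itself; it only measures prior enforcement. Feed it a century of biased records and it will confidently return the same bias dressed up as prediction.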

Just as insidious as the CPD’s predictive policing is the state-sponsored slaughter of terror suspects via unmanned drone strikes. No, these are not missiles and machine guns launched at known terrorists moments before the next 9/11 could occur. These are “likely” terrorists, or should we say ‘terrorists’, caught up in the algorithmic projections of the sinister NSA targeting apparatus. The qualifiers of this model include outright racial, ethnic, and religious profiling; it targets Muslims and persons from and in the Middle East. This has led to drone strikes on civilian wedding parties, wiping out innumerable innocents in the process.

It’s bad enough that the United States government is regularly bypassing due process, targeting ‘criminals’ and killing ‘terrorists’, and enacting racial profiling on a massive scale. It’s worse that the imperfections in these models are classed as “engineering problems”, requiring only more data to become “accurate”. Throwing out these depraved models never enters into their sick equation. Yet that is what must be done. Machines cannot investigate crimes; only humans may do so. Data can never be used to prosecute or execute; only the due process of law is acceptable.
