An ICE-y View of the Future
This was originally posted on blogger.
This essay was written for a class called Human Contexts and Ethics of Data. Hope you enjoy.
“About 50 students and community members met at 12 pm to march against UC Berkeley’s ties with Silicon Valley software giant Palantir…”1 wrote the Daily Californian in September 2019. The protesters opposed Palantir’s links with the University of California through the Corporate Access Program (CAP), a way for corporations to become “EECS Industrial Partners”2 and gain special access to recruit budding computer scientists studying at Berkeley.
As the students protested, they chanted phrases like “Up up liberation, down down with deportation” and “Carol Christ, you know it’s true, the crimes of ICE depend on you.” These statements reference Palantir’s connections to the Immigration and Customs Enforcement (ICE) agency, which uses Palantir’s tools to access and surveil people’s data in real time to hound undocumented immigrants. Palantir’s tools are a hallmark of the post-9/11, digital age – an example of big data giving unprecedented power to the US surveillance state.
This essay will focus on the socio-technical imaginaries espoused by the students compared to that of Berkeley’s EECS department. A socio-technical imaginary is defined as a way “with which individuals and collectives imagine and build their desired futures.”3 For this specific case, this paper will describe the EECS department’s vision of providing an impartial platform for private companies, compared to the students’ vision of hosting only moral and ethical institutions. Such visions get to the heart of data and society: To what ends is data science used – and when something goes wrong, who should speak against it?
First, one can examine the supposedly neutral role that Palantir and Berkeley play in ICE’s doings. Palantir has largely abstained from moral responsibility for ICE’s immigrant abuse. The company has claimed that though it provides the tools, it is not responsible for the actions of its clients or the ways its tools are used. The EECS department, too, refused any moral responsibility in connecting with Palantir. The professors said that the University is not endorsing ICE, but merely providing a platform for free engagement. The Daily Californian quotes, “CAP should be a platform in which students and employees can freely interact.” From this perspective, the obligation to oppose wrongdoing falls on Berkeley students rather than the EECS department, absolving the University of responsibility.
However one considers the CAP program, the professors’ statements ignore a deeper, incestuous bond between Big Tech and academia. Rodrigo Ochigame, in an Intercept article called The Invention of ‘Ethical AI’, describes how the academic world endorses and legitimizes the ethics-washing of Big Tech4. More and more activists have been opposing the industry’s practices (with movements like #NoTechForICE or #Data4BlackLives), and in response the tech industry has had a surge of conferences on ethics and the humane use of technology. This too is a conflict of socio-technical imaginaries – is technology held accountable through public means or through self-regulation? As Ochigame describes, the tech industry wants to regulate itself, and universities like MIT or Berkeley provide space and credibility for this socio-technical imaginary (Berkeley’s own Division of Data Sciences is led by a Microsoft veteran, Ochigame notes).
Having considered the EECS department, one can examine the socio-technical imaginary of the students in their protest. The students’ chants, including “Carol Christ, you know it’s true, the crimes of ICE depend on you,” ethically implicate the University. They assert that by providing services to Palantir, the UC perpetuates the system of violence, even if it is not directly causing deportation (possibly in response to this argument, UC Berkeley has since removed Palantir from the list of Industrial Partners). They claim that a public institution ought not to endorse a nefarious private entity – especially one that enables violence and terror against undocumented people (who may even be students). Thus, the imaginary is that data science companies ought to be constrained by a morality laid out by the people, not by private interests.
To some, the University lies above the political world: the halls of the ivory tower are not to be troubled by the concerns of ordinary citizens. To others, the political consequences of the University must be considered in order to be a responsible citizen. UC Berkeley has both of those sides – the academic and the political – each representing a different vision of what role a university should play as an institution. Ground-breaking science developed at Cal (e.g. CRISPR, the atomic bomb) has been accompanied by leaps forward in social justice (e.g. the Anti-War and Disability Rights movements). Social movements are a socio-technical imaginary in themselves – a collective assertion of a vision for a just world – and in many ways, the students protesting Palantir follow in this legacy.
1Alexandra Casey and Angelina Wang, “Under pressure, Palantir cancels UC Berkeley information session.” Daily Californian, Date.
2Berkeley Electrical Engineering and Computer Sciences, “EECS Industrial Partners.” Accessed September 28th, 2020.
3Margo Boenig-Liptsin and Ari Edmundson, “Human Contexts and Ethics of Data Toolkit.” Accessed September 28th, 2020.
4Rodrigo Ochigame, “The Invention of ‘Ethical AI’: How Big Tech Manipulates Academia to Avoid Regulation.” Dec 20, 2019.