The organizers had invited PRIVACY to host a workshop at this year’s Computers, Privacy and Data Protection (CPDP) conference in Brussels, May 21-24, 2024. The delegation included Centre Director and Editor-in-Chief of Privacy Studies Journal (PSJ) Mette Birkedal Bruun, recently published PSJ authors Natacha Klein Käfer and Mateusz Jurewicz, and PSJ assistant editor Emma Klakk. The conference was packed with panels, workshops, book sessions, art, and networking sessions, involving organizations, scholars, businesses, policy makers, and activists.
We have come back with plenty of food for thought concerning interdisciplinarity, collaboration, and networks of people, technologies, and values in relation to privacy, data, and AI.
Privacy Studies Journal workshop
On the first full day of the conference, Wednesday May 22, PRIVACY and PSJ hosted the workshop Artificial Intelligence and Privacy: Causes for Concern? Centre Director and Editor-in-Chief Professor Mette Birkedal Bruun opened the workshop with a presentation of the journal, the centre, and our comprehensive, interdisciplinary approach to privacy and the private. At PSJ we consider privacy a broad and multifaceted concept that manifests in, and is contested by, ideas, architecture, laws, technologies, art, norms, social structures, relations, and more; we therefore see efforts across disciplines and sectors as key to approaching, understanding, and tackling privacy-related issues. After Mette’s presentation, Assistant Professor of History at PRIVACY Natacha Klein Käfer and computer scientist Dr Mateusz Jurewicz presented the principal results of their newly published position paper in PSJ, Artificial Intelligence and Privacy: Causes for Concern, as well as some insights into the interdisciplinary collaboration behind the paper.
The participants in the workshop were students, consultants, and officers from policy, law, technology, and other fields, with expertise in areas such as health care, AI, religion, human rights, social affairs, and data collection, classification, and protection. We were genuinely excited about the interdisciplinary participation at our workshop, which enabled fruitful and nuanced discussions. Nevertheless, when we bring people from many different backgrounds together in order to understand and learn from each other, it is important that each of us is explicit about terminology, focus, and perspectives. As an example, we discussed at the workshop what human-centred means. Which humans are at the forefront when we take a human-centred approach to privacy and AI? The immediate association of human-centred AI is that it concerns those humans who, as users, objects, and data subjects, are influenced by AI, and whose privacy is threatened or protected. However, human beings also populate the tech companies, their boardrooms, and their investor circles, and human beings run the AI training sessions, the policy fora, and the regulating offices. Each of these human beings comes with a particular baggage of explicit and implicit assumptions as well as cultural, economic, moral, and political norms and values. At PRIVACY and in Privacy Studies Journal, we argue that history can help show how different cultural outlooks, personal intentions, social conditions, and inherited values influence the way in which people act in a given situation. Such actions and their underlying motives colour the development and use of any given technology, past and present. Accordingly, we argue that the human components are indispensable if we want to understand technologies such as AI and the presence and absence of privacy within them, and that scholars from the humanities and the social sciences need to engage with such connections in order to reach a holistic view of the technologies at hand.
Systems of technology, data, values, and labour
The hot topic at this year’s CPDP conference was Artificial Intelligence; indeed, CPDP recently rebranded as CPDP.ai. It seems that we can no longer discuss computers, privacy, and data protection without considering AI. We argue, however, that we cannot discuss computers, privacy, data protection, and AI without considering the humans involved in producing, consuming, regulating, promoting, or contesting these technologies.
On the opening night of the conference, Tuesday May 21, Vladan Joler, director of the SHARE Foundation and associate professor in the New Media Department at the University of Novi Sad, gave a lecture about territories, interconnections, and resources in the age of AI. In collaboration with Kate Crawford, professor at the University of Southern California, Joler has mapped the systems and networks involved in producing and consuming AI technologies. Through various visualizations he showed how AI technologies are part of interlaced systems and histories of labour, natural resources, money, and the classification of concepts.
Joler’s visualizations are a striking reminder that when we discuss ethical AI, we often talk about biases in Large Language Models, privacy and data protection for consumers, and perhaps gender-based violence enabled by AI. While these are very important issues to discuss and attend to, they are only the tip of the iceberg (or the map) of ethical issues: what about the working conditions of those mining the cobalt used in the devices through which we use AI? What about the huge energy consumption of the data centres that drive AI systems? What about the impact of AI on the planet? It is important to ask: how did we get here? What choices were made along the way, and by whom? What are the supply chains and histories behind AI technologies? What people, resources, technologies, labour, languages, and data does the development and use of AI technologies rely on? Historical research can help us answer such questions.
At the CPDP.ai conference, we were reminded once again that focusing on the human component of privacy and surrounding issues, as we strive to do at PRIVACY, also means paying heed to responsibilities. If we talk about technologies and laws without talking about the people who produce the technologies, the people who make the laws, the people who market and use the technologies, the people who abide by the laws, and those who violate them, we are missing parts of the picture. A panel at CPDP.ai discussed gender-based online violence. A main topic in this debate was deep nudes: non-consensual sexual images and videos, primarily of women and girls, produced with AI technologies by adding the faces of specific people to pictures and videos of either the bodies of sex workers or actors or AI-generated bodies. These AI technologies raise not only questions regarding regulation, but also questions concerning cultural norms of consent, sexuality, and bodies. To tackle the problems with deep nude technologies we need to understand the norms, values, beliefs, and practices among, for example, the young boys who use these technologies. Of no less importance, however, is a focus on those who market such technologies, targeting, for example, young boys.
With our basis in Archaeology, Church History, History of Architecture, History of Ideas, Legal History, and Social History, and our work in interdisciplinary teams, PRIVACY researchers are used to approaching privacy from several perspectives at once. We consider legislative, religious, social, cultural, material, and architectural aspects alike. The CPDP.ai conference inspires us to pursue the past-present perspective with even greater vigour. The insights from our historical research into privacy continue to prompt questions about contemporary notions and issues of privacy. At PRIVACY we ask such questions in an ongoing exchange with scholars and practitioners working on privacy today. We do this by inviting scholars and practitioners to seminars and workshops at the centre, by meeting them in fora like CPDP.ai and the International Association of Privacy Professionals (IAPP), and by conducting interdisciplinary research with a past-present dimension (e.g. STAY HOME). While historical research helps contemporary privacy professionals pose questions about the assumptions that underlie notions of privacy, current debates on privacy exemplify the historical and cultural situatedness of those notions.