Kate Crawford has always studied the complicated intersections of science, technology, art and politics. An expert in frontier disciplines, she is herself a hybrid and eclectic scholar. Her area of expertise is the social and political impact of artificial intelligence. Crawford is a senior researcher at Microsoft Research (the research arm of the company founded by Bill Gates) and a research professor at the Annenberg School for Communication at the University of Southern California. She also leads an international research group on machine learning at the École Normale Supérieure in Paris.

Her latest book, Atlas of AI: Power, Politics and the Planetary Costs of Artificial Intelligence, was translated into Italian in 2022 under the title Neither Intelligent nor Artificial: The Dark Side of AI. In the volume, Crawford delves into the implications of our interaction with artificial intelligence: from the environmental effects of mining, to privacy and data-management issues, to violations of the rights of the workers who train the algorithms. Crawford presents artificial intelligence not as an abstract solution, an arid set of data and numbers, but as a sprawling machine that touches a multitude of aspects of human life and society. “I wanted to write about how AI is built, in a broader sense,” she said in an interview with the Guardian. “Its costs in terms of natural resources, its labor processes, its classification logic.”

The artistic work

In addition to her academic work, Crawford has also been involved in the visual arts and music. In the late nineties and early 2000s she was part of an electronic music duo called B(if)tek, along with Nicole Skeltys. She has also curated several artistic projects, including Anatomy of an AI System, “an anatomical study of an Amazon Echo” that reconstructs the origin of each component of the device within the global economy and ecology, created together with Vladan Joler. Anatomy of an AI System consists of drawings, diagrams and twenty-one short essays that tell how “every small moment of convenience – whether it’s answering a question, turning on a light or playing a song – requires a vast planetary network, fueled by the extraction of non-renewable materials, labor and data.”

Together with the artist Trevor Paglen, Crawford also curated the exhibition Training Humans, dedicated to the images used to train artificial intelligences, in particular those intended for facial recognition applications, highlighting their ethical problems and intrinsic biases. Interviewed by Wired on the occasion of the exhibition’s opening in Italy, Crawford said: “Notice how human labor is actually used to create systems that seem supernatural, or more objective than human intelligence has ever been. Our project shows instead how these systems are the opposite of neutral and objective: they are, on the contrary, deeply subjective, human and in some cases even stereotyped or discriminatory.” The exhibition was hosted by the Fondazione Prada in Milan between September 2019 and February 2020.

Crawford is also part of the cyberfeminist collective Deep Lab, a group of female artists, researchers, writers and academics studying the intersections between technology and gender politics. Deep Lab works on design in the areas of surveillance and privacy, ethical hacking and online anonymity.

The cartography of power

Technological devices draw a map, a cartography of the power relations on the planet: between countries, between companies and workers, between users and platforms, between humans and the natural world. “I hope that the curtain rises and people say: let’s really look at who controls the levers of these systems,” the academic explains to MIT Technology Review. “This means that we must not only focus on aspects such as ethical principles; we must talk about power.” Kate Crawford will discuss the social and political aspects of artificial intelligence at the Wired Next Fest in Milan, on 7 and 8 October 2022 at the Fabbrica del Vapore.
