
Trevor Paglen: From 'Apple' to 'Anomaly' review

Art · Barbican Centre, Barbican · Until Sunday February 16 2020
2 out of 5 stars
Trevor Paglen, The Treachery of Object Recognition, 2019 © Trevor Paglen, Courtesy of the Artist, Metro Pictures, New York, Altman Siegel, San Francisco

Time Out says

AI networks are tracking photos of you, sorting and categorising you by your face and actions, by the digital breadcrumbs you leave behind. Artist Trevor Paglen doesn’t trust that. And with good reason.

Across the Curve’s wall, he’s plastered images from ImageNet, a research project with 14 million pictures in its databanks, all categorised according to content. He’s printed thousands of them here, each relating to a specific tag. It starts at apple, runs through anchovy, farmer, fireball, investor, spam, hood, celebrity and alcoholic, through to anti-Semite, accused, creep and anomaly. A human would have picked the categories; a machine would’ve learned them.

Each category has its quirks, because language is tricky. Hood means two things, so does spam, so does porker. So the images are incongruous, funny, odd. But they get uncomfortable. Look how white the ‘investor’ faces are, look how non-white the ‘accused’ faces are, look at the wall of Asians under ‘divorce lawyer’, look how all the ‘cat fancier’ images are of women, etc, etc, etc. AI isn’t neutral, it’s man-made and it has man-made biases built into it. Those biases are tracking YOU, tagging YOU as ‘creep’ and ‘anomaly’. It’s disquieting, awkward and very, very scary.

So we get that AI does dodgy shit with images, but how were the pictures here chosen? That’s where things get a little muddy. Paglen designed an algorithm to choose images within each of his selected categories; he then printed out the images and pinned them to the wall. But if we’re questioning the intention of AI as a technology, why should we trust his algorithm? What biases did he feed into his programming? What biases did he express in how he hung the images on the wall? What narratives is he creating? You walk away from this feeling like you should question AI, and you should definitely question Paglen too.

This isn’t satisfying, gratifying or even that interesting as a visual experience either. Turning the digital into analogue by printing the photos out makes them physically real, but it doesn’t add anything. Going for an analogue-over-digital presentation feels like a missed opportunity. Plus, something about hanging countless photographs on a wall like this will always feel like a teenager’s bedroom.

Paglen has an important point to make about future technologies, and he has something to say that should get us all talking. But the inconsistencies in his approach don’t help get his message across; they just hold it back.

Details

Venue name: Barbican Centre
Venue website: www.barbican.org.uk
Venue phone: 020 7638 8891
Address: Beech Street
Barbican
London
EC2Y 8AE
Transport: Tube: Barbican; Rail/Tube: Moorgate
Price: Free

