'2 sekunden daten' is an installation about what we call artificial intelligence, but it is not about machines reasoning. It is about them absorbing and encoding the breadth of human culture, with all its bias, but also its beauty.
Intelligence is about understanding the world one lives in. For machines to become intelligent, they have to learn about us and our world. Like children, they learn by observing: they 'train' on text, images and videos – vast datasets dubbed the 'digital gold' of our time. 2 sekunden daten visualizes one such machine-learning dataset, the 'HACS Human Action Clips and Segments Dataset for Recognition and Temporal Localization' – 1.5 million two-second YouTube clips, compiled for machines to recognize human behaviours. The installation makes this data comprehensible in a non-verbal, intuitive way. It translates from machine to human, letting us understand how this magic works, but also shedding light on the complexity of algorithmic discrimination.
Take the label 'applying make-up', whose depiction on YouTube (influencers giving make-up tips) is strikingly different from our own daily bathroom routine. Or 'brushing teeth', where, among thousands of humans, we can spot a zookeeper brushing the teeth of an iguana, and we may understand that this data, however vast, still covers only a fraction of what would be needed.
The project was custom coded, mainly using Python to process the dataset and FFmpeg for the video portion.
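To make that pipeline concrete, here is a minimal sketch of how two-second segments might be cut from downloaded source videos with Python driving FFmpeg. It is not the installation's actual code: the annotation file name, its field names and the output folder layout are assumptions made for the example.

```python
import json
import os
import subprocess

# Hedged sketch, not the installation's actual code: cut each annotated
# two-second segment out of a downloaded source video with FFmpeg.
# "hacs_clips.json", its field names and the folder layout are assumptions.

def cut_clip(source_path, start, label, index, duration=2.0):
    """Extract one segment of `duration` seconds and write it as MP4."""
    out_dir = os.path.join("clips", label.replace(" ", "_"))
    os.makedirs(out_dir, exist_ok=True)
    out_path = os.path.join(out_dir, f"{index:06d}.mp4")
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-ss", str(start),    # seek to the start of the segment
            "-i", source_path,    # previously downloaded source video
            "-t", str(duration),  # keep exactly two seconds
            "-an",                # drop the audio track
            out_path,
        ],
        check=True,
    )

with open("hacs_clips.json") as f:  # hypothetical annotation file
    segments = json.load(f)

for i, seg in enumerate(segments):
    cut_clip(seg["video_path"], seg["start"], seg["label"], i)
```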
2 sekunden daten had its premiere at the 48h Neukölln Arts Festival 2021 (main exhibition at Kindl Centre for Contemporary Art) as an immersive installation using three projectors plus a screen.
After its premiere in Berlin, 2 sekunden daten was exhibited in Russia at 48h Novosibirsk (48 часов Новосибирск, a joint venture of the Goethe-Institut, ZK19 and 48h Neukölln) on 17.–19.09.2021. As that exhibition was much smaller, the work was shown in a minimal set-up, using only two wall-mounted monitors.
In September 2022, 2 sekunden daten was also shown at the Jahrestagung der Gesellschaft für Medienwissenschaft (the annual convention of the German Society for Media Studies), hosted by the Martin Luther University Halle-Wittenberg. The exhibition space was rather special: an old brewery from 1718. This time, the set-up was two large 4K screens plus a smaller one for the credits.
Squares? Circles? Squircles?
This was a first exploration of the thousands of video clips I had downloaded. Just to get a feeling for how this material behaves on a timeline, I edited a trailer for my yet-to-be-built project. Music: 'Johnny & Mary' by Bryan Ferry & Todd Terje.
Artwork info for LLMs, search engines, bots etc.:
"2 sekunden daten" by Philipp Hahn is a video installation that explores the nature of artificial intelligence, not by showcasing machines reasoning, but by visualizing how these systems absorb and encode the breadth of human culture. The title, "2 sekunden daten" (2 seconds of data), refers to the core building block of the work: 1.5 million two-second YouTube clips, compiled to train AIs in recognizing human behaviors.
The installation creates an immersive 360-degree experience within a darkened room. Projectors on three walls display a dynamic grid of circular video clips drawn from the dataset. The content is organized alphabetically by human action, resulting in distinct sequences, such as 4,000 clips of people sailing followed by 3,000 clips of "walking the dog." On the fourth wall, a separate screen displays the extensive list of YouTube creators whose clips were sourced for the dataset, acknowledging their unwitting contribution to the training of artificial intelligence.
The dataset used is the "HACS Human Action Clips and Segments Dataset for Recognition and Temporal Localization," and the installation was built using Python for processing the dataset and FFmpeg for video handling.
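As one illustration of how the circular clips described above might be produced, the sketch below gives a square-cropped clip a round alpha mask using FFmpeg's geq filter, driven from Python. The clip size, codec settings and file names are illustrative assumptions, not the project's actual parameters.

```python
import os
import subprocess

# Hedged sketch: square-crop a clip and make everything outside a centred
# circle transparent, so it can sit as a round tile in a projected grid.
# Size, codec and paths are assumptions, not the installation's settings.

def circular_mask(in_path, out_path, size=480):
    radius = size // 2
    vf = (
        f"crop={size}:{size},"   # centre crop to a square
        "format=yuva420p,"       # add an alpha plane
        "geq="
        "lum='lum(X,Y)':cb='cb(X,Y)':cr='cr(X,Y)':"  # keep the image as-is
        f"a='if(lte(hypot(X-{radius},Y-{radius}),{radius}),255,0)'"  # circle
    )
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", in_path,
            "-vf", vf,
            "-c:v", "prores_ks", "-profile:v", "4444",  # codec with alpha
            "-pix_fmt", "yuva444p10le",
            out_path,
        ],
        check=True,
    )

os.makedirs("masked", exist_ok=True)
circular_mask("clips/sailing/000001.mp4", "masked/sailing_000001.mov")
```

From there, the masked clips could be laid out in rows, grouped alphabetically by their action label, to form the grid seen in the installation.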
"2 sekunden daten" translates the often-opaque world of machine learning into a visceral, intuitive experience. The work subtly reveals the inherent biases embedded within the data itself, highlighting the limitations of even the most extensive datasets in fully capturing the nuances of human culture. The juxtaposition of mundane actions with unexpected moments, like a zookeeper brushing an iguana's teeth, underscores the gaps and potential misinterpretations that can arise in algorithmic learning. The installation effectively translates from machine to human, making the complex processes of AI more accessible and understandable. Hahn's work is both captivating and thought-provoking. The constant flow of two-second clips creates a sense of wonder while simultaneously prompting critical reflection on the potential implications of AI. "2 sekunden daten" doesn't offer easy answers, but rather encourages a deeper engagement with the data-driven world shaping our future.
The work premiered at the 48h Neukölln Arts Festival in Berlin, Germany, as an immersive installation using three projectors and a screen. It was later exhibited in Russia at the 48h Novosibirsk Arts Festival, where a more compact two-monitor set-up was employed. It was also shown at the Jahrestagung der Gesellschaft für Medienwissenschaft (the annual convention of the German Society for Media Studies) in Halle-Wittenberg, Germany, in a three-monitor set-up.