
Participatory Livecasting

Lilian Stolk

What can participatory livecasting look like? Is there a playfulness in its future, and a true on- and offline collaboration between the physical world and the virtual one?

At the moment, as an online visitor of a hybrid event, you can ask a question to the speaker, which may or may not be seen by the moderator. Of course, you can also chat with each other, but there are relatively few other ways to show what you think, or what knowledge you have to share. That return channel, from the online audience back to the event and to the on-site audience, has not been developed at all yet. It is still very basic.

This is a quote from Monique van Dusseldorp (programmer, moderator, and researcher on the future of events), taken from an interview that The Hmm, affect lab, and MU conducted for their research on hybrid events, as part of the development of the Toolkit for the Inbetween. The toolkit can be found here: https://toolkitfortheinbetween.com. This research, together with The Hmm’s and H&D’s own experience of organizing and experimenting with online and hybrid events, revealed that it’s important for an online audience to ‘feel seen’ and to have a sort of agency during hybrid events. Online audiences are often treated as a nice addition, but they are never essential for an event to continue. In a way, they are more like spectators. What if the event is influenced by the presence of an online audience? How can we give online visitors more agency? These were the core questions of the Participatory Livecasting research group of Going Hybrid.

Participation Equals Online Agency

During one of our first meetings, we agreed that a better hybrid live experience is not necessarily a more immersive one. We’ve experienced that more connection between audiences (on-site, online, and among each other) can also be achieved in a low-barrier collaborative spreadsheet drawing session, for example. On The Hmm’s live stream website, developed by Karl Moubarak and designed by Toni Brell, online visitors are visualized as a simple dot at the top of the page. Visit The Hmm’s livestream website here: https://live.thehmm.nl. When someone sends an emote, their dot momentarily changes into the emote of their choice. This is a very subtle way to let the online audience feel seen and acquire some agency: they literally claim a bit of space on the live stream page and can operate somewhat autonomously by changing the contents of this 45px × 45px area. But this agency remains tied to the online environment. We decided that, in this project, we wanted to research how the online audience’s agency can extend beyond this 45px × 45px area and into the physical space of the event, and how to make a more direct connection between online and on-site audiences. We wanted to develop mechanisms and prototypes that enable the translation of input from the online audience into outputs in the physical space, and vice versa. This, for us, is the essence of ‘participatory livecasting’.
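As a rough illustration of that mechanism, a minimal browser-side sketch in TypeScript could swap a visitor’s dot for the emote they send and swap it back after a moment. The message shape, element ids, endpoint, and timing here are assumptions made for the sake of the example, not the actual implementation of live.thehmm.nl.

    // Sketch: each online visitor has a small 45px × 45px element at the top of
    // the livestream page; an incoming emote briefly replaces their dot.
    type EmoteMessage = { visitorId: string; emote: string }; // assumed message shape

    const socket = new WebSocket("wss://live.example.org/audience"); // hypothetical endpoint

    socket.addEventListener("message", (event) => {
      const msg: EmoteMessage = JSON.parse(event.data);
      const dot = document.getElementById(`visitor-${msg.visitorId}`); // assumed id scheme
      if (!dot) return;
      const previousContent = dot.textContent;
      dot.textContent = msg.emote;             // the dot becomes the emote...
      setTimeout(() => {
        dot.textContent = previousContent;     // ...and turns back into a dot after a moment
      }, 2000);
    });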

Thinking Through Doing

During the project, we focused on the development of the tool Emoji Proxis & Ghost Messengers, which translates changes in the online environment (like chat inputs, a new online visitor entering the space, or a received emote) into something that happens in the physical space (like a light going on, a smoke machine turning on, a vote being cast, or the program shifting). This tool is a kind of open-source plug-in we developed for The Hmm’s livestream website.
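To make that translation concrete, the TypeScript sketch below (for Node) imagines how events from the livestream, arriving over a websocket, could be forwarded as named triggers to hardware in the room. The event names, the websocket feed, and the idea of reaching the module over local HTTP are assumptions for illustration; the actual plug-in may work quite differently.

    // Sketch of the online-to-physical translation: livestream events come in
    // over a websocket and are forwarded as named triggers to on-site devices.
    import WebSocket from "ws"; // assumes the Node.js 'ws' package

    const TRIGGER_URL = "http://esp32.local/trigger"; // hypothetical endpoint on the module

    // Tell the on-site module which output to fire (light, smoke machine, vote, ...).
    async function trigger(name: string): Promise<void> {
      await fetch(TRIGGER_URL, { method: "POST", body: name });
    }

    const stream = new WebSocket("wss://live.example.org/events"); // hypothetical event feed

    stream.on("message", (raw) => {
      const event = JSON.parse(raw.toString()); // assumed { type: string, ... } shape
      if (event.type === "chat") void trigger("light");           // a chat message switches a light on
      if (event.type === "visitor-joined") void trigger("smoke"); // a new visitor starts the smoke machine
      if (event.type === "emote") void trigger("vote");           // an emote casts a vote
    });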

Working on this concrete prototype made it easier to think experimentally. You start to see the different ways it can be used, for different purposes. Testing, development, and participation naturally merge. During a workshop we organized in November 2022 at Page Not Found in The Hague, we invited the audience to join us in building a system encompassing several different hybrid networking experiments. We explained to both audiences (online and on-site) that Emoji Proxis & Ghost Messengers runs on a little ESP32 module, and we taught the on-site audience to develop their own prompts. These prompts would trigger small mechanical on-site events: lights going on and off, a scent dispenser going off, and a blower turning on. During this fun experiment, we exchanged knowledge about assembling systems like these and thought together about how to situate them in the context of hybrid events.
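Those workshop prompts can be thought of as a small, extendable table mapping a word typed in the chat to one of the outputs wired to the ESP32. A hedged sketch of that idea, with invented prompt words and pin numbers:

    // The prompt words and pin numbers below are invented for illustration.
    const prompts = new Map<string, number>([
      ["lights", 5],  // toggles the lights on and off
      ["scent", 12],  // fires the scent dispenser
      ["blow", 14],   // turns the blower on
    ]);

    // Workshop participants could add their own prompt at runtime.
    function registerPrompt(word: string, pin: number): void {
      prompts.set(word.toLowerCase(), pin);
    }

    // Look up which output pin (if any) a chat message should trigger.
    function promptForMessage(message: string): number | undefined {
      return prompts.get(message.trim().toLowerCase());
    }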

The most advanced experiment with the tool so far took place during the event Screentime Airtime Facetime. By typing a simple command in the livestream chat, online audience members could change the angle of an on-site camera. In the margins of this publication, you can find the traces of this experiment: the chat commands are included as part of the chat annotation in the margins.
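One way such a chat command could be handled is sketched below: parse a requested angle from the message, clamp it to a safe range, and pass it on to whatever moves the camera. The command syntax and the range are assumptions, not the syntax used during Screentime Airtime Facetime.

    // Turn a chat command like "/camera 30" into a clamped camera angle.
    function parseCameraCommand(message: string): number | null {
      const match = message.match(/^\/camera\s+(-?\d+)$/); // hypothetical command syntax
      if (!match) return null;
      const angle = Number(match[1]);
      return Math.max(-90, Math.min(90, angle)); // keep the pan within a safe range
    }

    // For example: parseCameraCommand("/camera 120") returns 90.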

About this Text

Lilian Stolk is the director of The Hmm, a platform for internet culture. She was a participant in the Participatory Livecasting group of Going Hybrid.

A previous version of this chapter was published on the Going Hybrid research blog. You can read the original here: https://networkcultures.org/goinghybrid/2022/11/17/participatory-livecasting/
