A Toolbox for Cross-Reality Events

Over the past three years, we have learnt to exploit the advantages of hybrid events worldwide. XRevent creates an event metaverse that connects physical and virtual spaces, visitors and performers across the boundaries of their respective realities. This makes it possible to organise events internationally and for new target groups, and it promotes diversity and the inclusion of people who would otherwise be excluded from events.

With the XRevent platform, we are primarily focussing on "latency-free" signal paths for camera, sound, light and other sensor data. This enables a shared experience through interaction, communication and the exchange of objects across the boundary between the physical and virtual worlds.

XRevent Creator

With the XRevent Creator, individual events such as concerts, theatre and stage shows, conferences, club events, exhibitions and trade fairs can be created without any programming effort. The workflow for setting up a real event is taken into account and transferred to the virtual event. Standard event equipment such as lighting consoles, video mapping tools and sound equipment can be used to control both events simultaneously. An integrated ticket shop turns the virtual event into a source of income for artists and organisers as well.

XRevent Broadcaster

The main benefit of the XRevent Broadcaster is the reduction of streaming latency; it also gives performers in the real space feedback from the virtual space. They can perform in the real space, see almost simultaneously how their performance is received by the audience in the virtual space, and react to it. This also enables Q&A sessions between the real and the virtual conference audience.

XRevent Lightshow

For a media architecture test event, storytelling was developed within a week for the façade of the Carl-Bosch-Gymnasium in Ludwigshafen. The focus was on working with the building's given features, e.g. backlighting the windows and directly illuminating the façade.

The real show was controlled via an MA lighting console, while the signals were simultaneously transferred to the virtual lighting editor via ArtNet. This editor compresses the otherwise quite large ArtNet packets so that they can be sent via WLAN to the end users and their glasses. At the same time, the editor serves as a control room for the lighting operator/designer, who can now control the show from a remote location.
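The internals of the virtual lighting editor are not described here, but the general pattern can be sketched: parse incoming ArtDmx frames (the standard Art-Net DMX packet) and forward only the channels that changed since the previous frame, which keeps the WLAN traffic far smaller than relaying full 512-channel packets. The function and variable names below are illustrative, not taken from XRevent.

```python
import struct

ARTNET_HEADER = b"Art-Net\x00"
OP_DMX = 0x5000  # ArtDmx opcode, little-endian on the wire


def parse_artdmx(packet: bytes):
    """Extract (universe, DMX channel values) from an ArtDmx packet.

    Layout per the Art-Net spec: 8-byte ID string, opcode at offset 8
    (little-endian), universe at offset 14 (little-endian), data length
    at offset 16 (big-endian), then up to 512 channel bytes.
    """
    if not packet.startswith(ARTNET_HEADER):
        raise ValueError("not an Art-Net packet")
    if struct.unpack_from("<H", packet, 8)[0] != OP_DMX:
        raise ValueError("not an ArtDmx packet")
    universe = struct.unpack_from("<H", packet, 14)[0]
    length = struct.unpack_from(">H", packet, 16)[0]
    return universe, packet[18:18 + length]


def delta_encode(previous: bytes, current: bytes):
    """Return only the (channel index, value) pairs that changed,
    i.e. the compact update worth pushing to headsets over WLAN."""
    return [(i, v) for i, (p, v) in enumerate(zip(previous, current)) if p != v]
```

In a real deployment the delta list would then be serialised and multicast to the clients; a client applies each pair to its local copy of the universe and re-renders the virtual fixtures.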

To reproduce the way a lighting designer works, last-minute changes to the real show can also be made in VR. The end user does not have to download a new app, as is usually the case, but simply receives the changes as an update.
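One simple way to realise such data-only updates, sketched here as an assumption rather than XRevent's actual mechanism, is to version the show definition on the server: a last-minute change bumps a version counter, and clients that report an older version receive the new cue data instead of a new app build. All names below are hypothetical.

```python
class ShowState:
    """Server-side show definition with a version counter (illustrative)."""

    def __init__(self):
        self.version = 0
        self.cues = {}  # cue id -> parameter dict

    def apply_change(self, cue_id, params):
        """A last-minute change edits the data and bumps the version;
        no new app build is needed on the client side."""
        self.cues[cue_id] = params
        self.version += 1

    def updates_since(self, client_version):
        """Return the current show data if the client is stale, else None."""
        if client_version < self.version:
            return {"version": self.version, "cues": self.cues}
        return None
```

A client would poll (or be pushed) `updates_since` with its last-seen version and swap in the new cues at the next suitable moment in the show.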