The best way to connect live music events

LIPS research consortium presents networked live music event at its concluding workshop

Launched in April 2018 to develop smart services for the professional production of cultural events, the LIPS project (Live Interactive PMSE Services) has gained even greater importance due to Covid-19. One of the central questions of the research project was: What do musicians need in order to make music together remotely from different locations? As well as researching this question, the project, which was co-funded by the German Federal Ministry for Economic Affairs and Energy (BMWi), showed how the theory could be put into practice at its concluding workshop in October with an impressive live concert: musicians from the cooperating universities HMTM Hannover and HMT München played together from separate locations.

Two locations, one gig: the musicians in Munich were linked up to the second half of the band, which played live in Wedemark

In addition to Sennheiser, the LIPS consortium includes ARRI, TVN MOBILE PRODUCTION, Smart Mobile Labs AG, Fraunhofer Heinrich Hertz Institut, Friedrich-Alexander-Universität Erlangen-Nürnberg and Leibniz Universität Hannover. Sennheiser’s Dr Andreas Wilzeck, who chaired the consortium, explained the starting point for the team: “There are many different and contradictory statements in the literature as to what is required for networked events. The LIPS project decided to take a step back and take a fresh look at what is really needed, thus creating a solid data basis for any networked production.”

An important aspect (and hurdle) of such remote, networked productions is, of course, latency, and the consortium partners focused their efforts on achieving latency figures comparable to the natural acoustic delay musicians experience when playing together in a standard room. In their model set-up, a fibre-optic network played an important role, as did 5G wireless as a potential option for bridging the last mile to this cable-based network infrastructure. Other aspects included added value for the audience, for example through assistive live listening in premium quality.
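To put that latency target into perspective, the following back-of-the-envelope sketch (illustrative only and not taken from the LIPS project; the distances and per-stage processing figures are assumptions) estimates the acoustic delay musicians already tolerate when standing a few metres apart in one room, and uses it as a rough one-way budget for a networked set-up:

    # Illustrative calculation only - not code or figures from the LIPS project.
    # Sound in air travels at roughly 343 m/s, so musicians a few metres apart
    # already play with several milliseconds of acoustic delay; that delay can
    # serve as a rough end-to-end budget for a networked production.

    SPEED_OF_SOUND_M_PER_S = 343.0  # in air at about 20 °C

    def in_room_delay_ms(distance_m: float) -> float:
        """Acoustic propagation delay between two musicians distance_m apart."""
        return distance_m / SPEED_OF_SOUND_M_PER_S * 1000.0

    def remaining_network_budget_ms(room_distance_m: float,
                                    capture_and_playback_ms: float,
                                    codec_ms: float) -> float:
        """One-way network budget left if the total delay should not exceed what
        the same musicians would experience that far apart in one room.
        The processing figures passed in are hypothetical."""
        budget = in_room_delay_ms(room_distance_m)
        return budget - capture_and_playback_ms - codec_ms

    if __name__ == "__main__":
        for distance in (3.0, 6.0, 10.0):
            print(f"{distance:>4.1f} m apart in a room ≈ "
                  f"{in_room_delay_ms(distance):5.1f} ms acoustic delay")

        # Hypothetical per-stage figures, purely for illustration:
        print("network budget @ 10 m room-equivalent:",
              f"{remaining_network_budget_ms(10.0, capture_and_playback_ms=2.0, codec_ms=3.0):.1f} ms one-way")

At 10 m, for instance, the in-room acoustic delay already amounts to roughly 29 ms, which gives a sense of how tight the combined budget for capture, coding, network transport and playback becomes in a networked production.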

The jam session effectively demonstrated today’s state of the art. The two halves of the band could see each other in a ‘windowed’, immersive reality that made it feel as if their fellow musicians were playing just a few metres behind the big screen that formed part of the set-up at each location.

Andreas Wilzeck: “The concert was an impressive audio-visual demonstration of some of the LIPS technologies developed over a span of two and a half years. The consortium partners are now looking at the future practical implementation of the findings to support networked live events – a format that will enable bands to play together no matter where they are and allow productions that would not have been possible before.”

Click here for an interview with Dr Andreas Wilzeck.

Please visit the LIPS website for more information on the project and its results, plus a full video of the concluding workshop.
