Brainstorm is participating in the European project EMERALD, “AI and Process Automation for Sustainable Entertainment and Media”, funded by the European Commission under the Horizon Europe Programme.

The interdisciplinary consortium of seven partners brings together companies from the movie, broadcast, streaming and live entertainment technology sectors (BBC, Brainstorm Multimedia SL, Disguise Systems, Filmlight and MOG Technologies), supported by two major European universities (Universitat Pompeu Fabra and Trinity College Dublin).

“EMERALD strives to pioneer ground-breaking tools for the digital entertainment and media sectors by harnessing the potential of AI, Machine Learning, and Big Data technologies. The overarching goal is to revolutionize processing, enhance production efficiency, minimize energy consumption, and elevate content quality through cutting-edge innovations”, says Francisco Ibáñez, R&D Project Manager at Brainstorm. “Currently, there is a massive increase in the volume of video-based and extended reality content, with an unsustainable demand for skilled human resources, data processing and energy. This project aims to address this challenge through the development of process automation for sustainable media creation.”

EMERALD aims to apply ML to automate some of the most labor-intensive tasks involved in video content production, which have considerable implications for both time and energy use. Javier Montesa, R&D Technical Coordinator at Brainstorm, says that Brainstorm, in collaboration with Universitat Pompeu Fabra (UPF), will develop DL-based methods and tools for video matting. “The main objective is to produce high-quality results for the automated real-time integration of remote presenters or performers into virtual scenes and sets for broadcast/streaming media using DL, without the need for a trimap. With the AI enhancement of Brainstorm’s InfinitySet, we will bring the quality of green-screen methods to simpler configurations.”
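As an illustration only (not the project's actual pipeline), trimap-free matting means a model predicts a per-pixel alpha matte directly from the camera frame; once that matte exists, inserting the presenter into a virtual scene is a simple alpha blend. The sketch below assumes the alpha has already been predicted and only shows the compositing step:

```python
import numpy as np

def composite(foreground, background, alpha):
    """Blend a presenter frame into a virtual scene using a predicted
    alpha matte with values in [0, 1]. No trimap is involved: the matte
    is assumed to come straight from a DL model."""
    alpha = alpha[..., None]  # add a channel axis so it broadcasts over RGB
    return alpha * foreground + (1.0 - alpha) * background

# Toy 2x2 frames: presenter fully opaque in the left column,
# fully transparent in the right column.
fg = np.full((2, 2, 3), 200.0)   # presenter pixels
bg = np.full((2, 2, 3), 50.0)    # virtual-scene pixels
alpha = np.array([[1.0, 0.0],
                  [1.0, 0.0]])
out = composite(fg, bg, alpha)
print(out[0, 0, 0], out[0, 1, 0])  # 200.0 (presenter) and 50.0 (scene)
```

In practice the matte is soft at hair and motion-blurred edges, which is exactly where predicted alphas outperform hard keying.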

Montesa explains that Brainstorm plans to integrate InfinitySet with UPF's Deep Learning systems for head and body pose estimation, which will allow the operator to trigger content that is displayed automatically on different parts of the scene: virtual screens, 3D graphics placeholders, or simply in front of the presenter as he or she moves around. Ibáñez adds that “we will explore new ways to improve the presenter insertion, to billboard and twist its silhouette, or to calculate its shadow with more precision and realism”.

Furthermore, Brainstorm will be involved in the creation of tools for automated color balancing and shot matching. Color manipulations are required in post-production, in virtual production (VP) and in broadcast virtual studios. Javier Montesa states that automating the color grading of the presenter to match the virtual scene will be particularly valuable for broadcast virtual studios that do not have colorists. “The integration of this automated color correction in InfinitySet will simplify the use of the tool and improve the presenter-scene integration when lighting conditions are not controlled or when virtual scene lighting conditions are meant to vary during a programme”.
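One classical baseline for this kind of automated matching (a simplification, not the project's method) is a Reinhard-style global transfer: shift and scale each color channel of the presenter footage so its mean and standard deviation match those of the virtual scene. A minimal sketch:

```python
import numpy as np

def match_color(src, ref):
    """Adjust each channel of `src` (H x W x 3) so its per-channel mean
    and standard deviation match `ref` — a Reinhard-style global color
    transfer, here applied directly in RGB for simplicity."""
    out = src.astype(np.float64).copy()
    for c in range(3):
        s_mean, s_std = out[..., c].mean(), out[..., c].std()
        r_mean, r_std = ref[..., c].mean(), ref[..., c].std()
        scale = r_std / s_std if s_std > 0 else 1.0
        out[..., c] = (out[..., c] - s_mean) * scale + r_mean
    return np.clip(out, 0.0, 255.0)

# Synthetic example: grade random "presenter" pixels toward the
# statistics of a darker, lower-contrast "virtual scene".
rng = np.random.default_rng(0)
presenter = rng.uniform(0, 255, size=(8, 8, 3))
scene = rng.uniform(60, 110, size=(8, 8, 3))
graded = match_color(presenter, scene)
```

Production-grade approaches work in a perceptual color space (e.g. CIELAB) and restrict the statistics to the presenter's matte region, but the principle is the same.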


28 Nov, 2023 • Section: Study, Media management, Business