https://www.panoramaaudiovisual.com/en/2016/11/09/ibm-desarrolla-interesantes-aplicaciones-basadas-en-inteligencia-artificial-para-video-en-nube/

Watson technology will make it possible to analyze, in near real time, the public's reaction on social networks to a live event, to detect scenes based on video content, and to segment audiences according to users' consumption habits on different platforms.

IBM Watson

IBM is deploying new cloud video services based on Watson artificial intelligence technology to help deliver personalized and differentiated viewing experiences to consumers.

Accessible through Watson APIs in the IBM Cloud, the new services will make it possible, for example, to perform near real-time analysis of the public's reaction on social networks to a live event, to detect scenes based on video content, and to enable segmentation according to users' consumption habits on different platforms.

Braxton Jarratt, general manager of IBM Cloud Video, highlights that "enterprises are creating videos that generate a lot of valuable data, but they still don't have ways to easily identify this information or the audience's reaction to certain types of content. Today, with these new services and the cognitive capabilities of our technology, IBM can help companies obtain relevant information about videos and their audiences so they can create and evaluate the most targeted content for specific audiences."

IBM claims that the Speech to Text and AlchemyLanguage APIs can be used to monitor viewers' reactions “word by word” on social media while a live event is taking place. At a product launch, for example, viewer engagement can increase or decrease when specific features are shown, providing valuable insight into which aspects audiences consider important and should be prioritized in the future.
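To illustrate the kind of word-by-word monitoring described above, here is a minimal Python sketch that aggregates minute-by-minute sentiment from social posts during a live event. The article does not describe IBM's implementation; the analyze_sentiment stub below is a hypothetical stand-in for an AlchemyLanguage-style sentiment call, and the sample posts are invented for illustration.

```python
# Illustrative sketch: aggregate minute-by-minute sentiment of social posts
# collected during a live event. analyze_sentiment() is a stub standing in
# for a Watson AlchemyLanguage-style sentiment call; it is NOT IBM's API.
from collections import defaultdict
from datetime import datetime

def analyze_sentiment(text: str) -> float:
    """Toy sentiment score in [-1, 1]; a real system would call the cloud API."""
    positive, negative = {"love", "great", "amazing"}, {"hate", "boring", "bad"}
    words = text.lower().split()
    return (sum(w in positive for w in words) -
            sum(w in negative for w in words)) / max(len(words), 1)

def engagement_by_minute(posts):
    """posts: iterable of (timestamp, text) pairs gathered from social media."""
    buckets = defaultdict(list)
    for ts, text in posts:
        buckets[ts.replace(second=0, microsecond=0)].append(analyze_sentiment(text))
    return {minute: sum(scores) / len(scores)
            for minute, scores in sorted(buckets.items())}

if __name__ == "__main__":
    sample = [
        (datetime(2016, 11, 9, 18, 0, 12), "Love the new camera feature, amazing"),
        (datetime(2016, 11, 9, 18, 0, 45), "This demo is boring"),
        (datetime(2016, 11, 9, 18, 1, 5), "Great battery life announcement"),
    ]
    for minute, score in engagement_by_minute(sample).items():
        print(minute.strftime("%H:%M"), round(score, 3))
```

In practice, the per-minute curve produced this way can be lined up against the event timeline to see which moments drove engagement up or down.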

In terms of detecting scenes in a video, the technology currently available allows for automatic segmentation based on visual changes in the footage. Understanding the content and separating it into topics and themes is still a manual process performed by a person. With Watson, IBM plans to offer a solution that “understands the semantics and patterns of language and image, enabling the identification of high-level concepts such as when a movie or show changes topics.”
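For context on the “segmentation based on visual changes” that the article describes as current practice, here is a minimal sketch of shot-boundary detection with OpenCV: it flags a cut when the color-histogram distance between consecutive frames exceeds a threshold. The threshold and histogram settings are illustrative assumptions, not IBM's method, which the article says goes further by modeling semantics rather than pixels.

```python
# Minimal shot-boundary (visual-change) detection sketch using OpenCV.
# A cut is flagged when consecutive frames have very different color histograms.
import cv2

def detect_cuts(video_path: str, threshold: float = 0.5):
    """Return frame indices where the visual content changes sharply."""
    cap = cv2.VideoCapture(video_path)
    cuts, prev_hist, idx = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # 8x8x8-bin color histogram of the frame, normalized for comparison
        hist = cv2.calcHist([frame], [0, 1, 2], None, [8, 8, 8],
                            [0, 256, 0, 256, 0, 256])
        hist = cv2.normalize(hist, hist).flatten()
        if prev_hist is not None:
            # Bhattacharyya distance: 0 = identical, 1 = completely different
            if cv2.compareHist(prev_hist, hist, cv2.HISTCMP_BHATTACHARYYA) > threshold:
                cuts.append(idx)
        prev_hist, idx = hist, idx + 1
    cap.release()
    return cuts
```

This kind of detector only finds where the picture changes; grouping the resulting shots into topics and themes is the step the article says still requires a human, and which Watson is meant to automate.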

According to the company, a major content provider is already testing the service as a way to improve video rankings, the indexing of specific chapters and its clients' searches for relevant content.

Media Insights Platform

Meanwhile, the Media Insights Platform is already being incorporated into existing catalog, logistics and subscriber management systems in the IBM Cloud, allowing for a more detailed view of consumers' existing viewing habits.

The new service, scheduled to launch later this year, is designed to analyze video viewing and social media behavior, identifying complex patterns that can be used to help improve content matching and find new viewers interested in existing content. The Media Insights Platform uses multiple Watson APIs, including Speech to Text, AlchemyLanguage, Tone Analyzer and Personality Insights.
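The article does not detail how the Media Insights Platform combines these API outputs, but one simple way to picture content matching is to represent each title and each viewer as a vector of tone and topic scores (the kind of signals Tone Analyzer or AlchemyLanguage can produce) and rank titles by cosine similarity. The sketch below is purely illustrative; the feature names, catalog entries and viewer profile are hypothetical.

```python
# Illustrative content-matching sketch: rank catalog titles by cosine
# similarity between a viewer's aggregated profile and per-title features.
# Feature vectors here are invented, e.g. [joy, fear, sadness, tech, sports].
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

catalog = {
    "gadget-launch-recap": [0.6, 0.1, 0.0, 0.9, 0.0],
    "horror-trailer":      [0.1, 0.9, 0.3, 0.0, 0.0],
    "us-open-highlights":  [0.7, 0.0, 0.1, 0.1, 0.9],
}

def recommend(viewer_profile, top_n=2):
    """Return the top_n titles most similar to the viewer's profile."""
    ranked = sorted(catalog.items(),
                    key=lambda kv: cosine(viewer_profile, kv[1]),
                    reverse=True)
    return [title for title, _ in ranked[:top_n]]

# A viewer whose history skews toward upbeat tech content
print(recommend([0.5, 0.05, 0.0, 0.8, 0.2]))
```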

To demonstrate the capabilities of its artificial intelligence system, IBM, in partnership with 20th Century Fox, has developed a “cognitive trailer” for the horror film Morgan. To create it, Watson analyzed and learned from more than a hundred horror movie trailers.

IBM also worked at this year's US Open tennis tournament to convert spoken commentary to text more accurately, by having Watson learn tennis terminology and player names before the tournament.

https://www.youtube.com/watch?v=U-c0jTwxG-0

Nov 9, 2016 | Section: Media management
