Mind control TV
4 February 2014
What if your TV suggested what to watch based on your mood? Further still, what if your TV could detect what you wanted to watch by reading your mind?
As far-fetched as that sounds, this is exactly what teams of researchers at Rovi and Technicolor are working on.
According to content personalisation specialists Rovi, when viewers are polled about how they would really like to navigate a programme guide a consistent response is: “I’d like it to read my mind.” So that’s what it set out to do.
At CES last month, Rovi demonstrated an experimental project that employed a headset to monitor both brainwaves and nearby nerve activity. Incredibly, after a short training period it seems possible to switch from watching TV to a guide with a literal blink of the eyes, and then select a new programme with a bit of concentration on the guide.
Patent liaison officer Mike Nichols says of the Brainwaves project, in an online video: “We’ve been working with partners to try to develop new techniques such as voice and motion recognition to change the television navigation experience.”
The video notes that the team set out to determine how they could improve upon the experience. Fellow patent liaison Walt Klappert (pictured), said: “The software I wrote takes the information from the headset and uses it to bring up the guide. For example, you blink and the guide comes up. If you blink again, intentionally, then it will stop at a channel and if you raise your focus it will tune to that channel. It was a lot of work, but now you can tune a channel on your television using nothing but your brain.
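In rough terms, the blink-and-focus flow Klappert describes could be sketched as a small event-driven state machine. This is an illustrative reconstruction only: the event names, the focus threshold and the `GuideNavigator` class are assumptions for the sketch, not Rovi's actual software.

```python
# Hypothetical sketch of the navigation flow described above: headset
# events toggle the guide, step through channels, and tune on a
# sustained-focus reading. Thresholds and names are assumptions.

FOCUS_THRESHOLD = 0.7  # assumed normalised attention level needed to tune

class GuideNavigator:
    def __init__(self, channels):
        self.channels = channels
        self.guide_open = False
        self.index = 0
        self.tuned = channels[0]

    def on_blink(self):
        """First blink brings the guide up; later intentional blinks
        step the highlight to the next channel."""
        if not self.guide_open:
            self.guide_open = True
        else:
            self.index = (self.index + 1) % len(self.channels)
        return self.channels[self.index]

    def on_focus(self, level):
        """Raising focus above the threshold tunes the highlighted channel."""
        if self.guide_open and level >= FOCUS_THRESHOLD:
            self.tuned = self.channels[self.index]
            self.guide_open = False
        return self.tuned

nav = GuideNavigator(["BBC One", "ITV", "Channel 4"])
nav.on_blink()      # guide comes up, highlighting the first channel
nav.on_blink()      # an intentional blink steps to the next channel
nav.on_focus(0.85)  # raised focus tunes the highlighted channel
```

The point of the sketch is simply that the headset reduces to a stream of discrete events, which conventional guide software can consume like any other remote-control input.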
“As we see it this is not just making a thought control remote controller. Watching what goes on with people’s brainwaves and the other measurements while they watch TV will lead to a lot more breakthroughs.”
Camron Shiny, manager, software development, added: “We haven’t seen anything like this being done before. This is a new field. We have an opportunity to give people the ability that they haven’t been able to do before.”
Can the increasing number of sensors surrounding us deliver a greater understanding of individual behaviour and emotion at a particular moment, and then serve up just the right content in response?
It was a theme picked up by the Consumer Electronics Association’s soothsayer Shawn DuBravac. “What if Netflix had access to my smart watch to know my current pulse, or the camera sensor embedded in my TV to see how many people are in the room, or a temperature gauge to know if it is cold outside,” suggested DuBravac, CEA chief economist and director of research. “Suddenly the recommendation becomes more robust because I may now be offered a movie that will help my mood because I am stressed or because there is snow on the ground.”
Technicolor thinks so too. Its Content Emotions project, first presented at CES 2013 and improved on since, detects individual emotional responses to content by way of a sensor worn against the skin of the hand.
The resulting biometric data, it says, could be aggregated to provide studios with information to better tune trailers or whole feature releases to target demographics.
It could also be used in the editing process, by highlighting viewers' emotional responses to particular scenes, and downstream as the basis of a recommendation engine for content and advertising based on the viewer's mood.
In the latter scenario, Technicolor envisages home users simply placing a finger on a sensor embedded in the TV set that reads their mood and suggests relevant content for that moment – such as an up-tempo drama or a low-key documentary.
“This is the future of recommendation,” explained Philippe Guillotel, scientist and Video Perception, Processing and Coding lab manager, Research and Innovation. “We are detecting your emotions from biological signals – it’s the same principle as lie detectors. Maybe in ten years there will be a sensor on your TV set which will propose directly relevant content according to your emotional state.”
Technicolor’s Fernando Silveira added: “As people start to have this data in smart wrist watches and glasses we want to help [content owners/distributors] process the data day to day. Imagine watching a movie on Netflix and instead of a five-star rating to judge content, imagine if your wristwatch is telling Netflix what that person is reacting to in movies they watch. Because you responded to this content in this way, then you will also respond to this content.”
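The mood-driven recommendation Guillotel and Silveira describe could be reduced to a two-step loop: classify a biometric reading into a coarse emotional state, then surface content tagged for that state. The following is a minimal sketch under stated assumptions – the thresholds, the mood labels and the catalogue are invented for illustration, not Technicolor's actual system.

```python
# Hypothetical two-step mood recommender: biosignals -> coarse mood ->
# content tagged for that mood. All labels and thresholds are assumptions.

CATALOGUE = [
    {"title": "Up-tempo drama",      "mood": "energised"},
    {"title": "Low-key documentary", "mood": "stressed"},
    {"title": "Feel-good comedy",    "mood": "stressed"},
    {"title": "Action thriller",     "mood": "energised"},
]

def classify_mood(pulse_bpm, skin_conductance):
    """Crude stand-in for real affect detection from biological signals."""
    if pulse_bpm > 90 and skin_conductance > 0.6:
        return "stressed"  # high arousal: offer calming content
    return "energised"

def recommend(pulse_bpm, skin_conductance):
    """Suggest titles tagged for the viewer's detected mood."""
    mood = classify_mood(pulse_bpm, skin_conductance)
    return [item["title"] for item in CATALOGUE if item["mood"] == mood]

recommend(105, 0.8)  # stressed viewer -> calming suggestions
```

In this framing, a wristwatch feeding Netflix would simply replace the five-star rating with a continuous biometric signal on the input side of `classify_mood`.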
Red Bee Media, which powers TV guide search on ITV.com, The Guardian and Samsung smart TVs, has also toyed with the concept, which it calls “sentiment analysis”. The data could be gathered by recognising a viewer’s facial expression via cameras embedded in the TV.
“One way of doing this is to have a research group assigning mood tags to programming so that you had a uniform set of filters,” said Steve Plunkett, CTO, Red Bee Media. “You could ask people to rate programming using, say, 200 descriptors to tag the content ranging from ‘I’m excited’ to ‘I’m sad’. You would add these filters to the mix of utilisation data you already know about a person’s content choices. The search could bring back very subtle results.”
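Plunkett's scheme – editorial mood tags blended with utilisation data – could be sketched as a simple scoring search. The programme list, tags and weights below are illustrative assumptions, not Red Bee Media's actual system.

```python
# Hypothetical sentiment search: a mood-tag match scores highest, with a
# small boost from the viewer's watch history (utilisation data).

PROGRAMMES = [
    {"title": "Planet Earth",          "tags": {"I'm relaxed", "I'm curious"}},
    {"title": "Match of the Day",      "tags": {"I'm excited"}},
    {"title": "One Born Every Minute", "tags": {"I'm sad", "I'm moved"}},
]

def mood_search(mood_tag, watch_history):
    """Rank programmes by mood-tag match plus a history boost."""
    results = []
    for prog in PROGRAMMES:
        score = 0.0
        if mood_tag in prog["tags"]:
            score += 1.0           # editorial mood descriptor matches
        if prog["title"] in watch_history:
            score += 0.25          # viewer has watched this before
        if score > 0:
            results.append((score, prog["title"]))
    return [title for score, title in sorted(results, reverse=True)]

mood_search("I'm excited", {"Match of the Day"})  # → ['Match of the Day']
```

With a full set of 200 descriptors and richer usage data, the same blending idea is what would let the search "bring back very subtle results".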
By Adrian Pennington