Personalization has been the holy grail of the Web. According to MoveOn.org president and author Eli Pariser, we've finally reached that goal. The Web's most popular sites, Facebook and Google among them, customize what you see based on the personal data they collect and how they perceive your interests.
Suppose you do a search on the term “android.” You might get results related to Google's operating system for smartphones if you have a history of researching mobile technology. But if you're a sci-fi fan, your top results might include links to humanoid robots.
Whether this customization is good or not is debatable. Pariser argues that what he terms "The Filter Bubble" confirms what we already think, denying us the opportunity to consider opposing viewpoints. In a political context, it would be like watching Fox News exclusively, which promotes conservative ideology to the exclusion of contradictory evidence.
What if the same customization could be applied to television?
NHK Science & Technology Research Laboratories has unveiled a television system that does just that. The TV has a built-in camera that captures facial expressions and motion as you watch a program. Based on that, software determines your degree of interest and displays programming suggestions on a tablet PC.
According to an NHK spokesperson, “If the viewer's expression does not change for a certain period of time even though he keeps watching TV, he is estimated to be concentrating on the program.”
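The heuristic the spokesperson describes could be sketched roughly as follows. This is a hypothetical illustration, not NHK's actual implementation; the function name, thresholds, and input format are all assumptions. The idea is simply that a stable expression over a sustained stretch of continued viewing is read as concentration.

```python
# Hypothetical sketch of the stated heuristic (all names and thresholds
# are assumptions, not NHK's real system): if the viewer's expression
# stays stable for a set period while they keep watching, estimate
# that they are concentrating on the program.

STABLE_SECONDS = 10          # assumed "certain period of time"
EXPRESSION_TOLERANCE = 0.05  # assumed max frame-to-frame change to count as "unchanged"

def is_concentrating(samples):
    """samples: list of (time_sec, expression_change, watching) tuples,
    where expression_change is a 0-1 measure of facial movement between
    frames and watching means the viewer's face is toward the screen."""
    stable_since = None
    for t, change, watching in samples:
        if watching and change <= EXPRESSION_TOLERANCE:
            if stable_since is None:
                stable_since = t
            if t - stable_since >= STABLE_SECONDS:
                return True  # expression unchanged long enough: concentrating
        else:
            stable_since = None  # looked away or expression shifted: reset
    return False

# Example: expression barely changes over 12 seconds of continuous viewing
steady = [(t, 0.01, True) for t in range(13)]
print(is_concentrating(steady))  # True
```

A real system would of course sit on top of computer-vision face tracking; this only captures the decision rule the spokesperson articulated.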
Although the interface is still in development, you can see where this is heading. If Fox News gets your blood boiling, it might be eliminated from your "watch" list. The same fate awaits programs that don't hold your interest, although they might be valuable to view anyway, like a public hearing on an important issue.
We'll see if the NHK system attracts consumer interest. Meanwhile, Pariser argues that we should all try to listen to people and opinions outside our comfort zone to promote a more informed and civil society. After all, no one is wrong 100% of the time.