May 28, 2011

Why is my Internet different from your Internet?

May 23, 2011, 6:39 AM PDT

Takeaway: At home you search for something on Google. Ten minutes later, at work, you enter the exact same query into Google, but get different results. Why?

December 4, 2009, was a pivotal day for the Internet. Still, as Eli Pariser points out in his new book, The Filter Bubble, very few people noticed what the search giant Google had done. Fortunately:

“Search engine blogger Danny Sullivan pores over the items on Google’s blog, looking for clues about where the monolith is headed next, and to him, the post was a big deal. In fact, he wrote later that day, it was the biggest change that has ever happened in search engines.”

Filter bubble? What is it?

Mr. Pariser’s book takes its title from the phenomenon he calls the “filter bubble.” He explains what it’s all about in the book:

“The new generation of Internet filters looks at things you seem to like - the actual things you’ve done, or the things people like you like - and tries to extrapolate. They are prediction engines, constantly creating and refining a theory of who you are and what you’ll do and want next.

Together these engines create a unique universe of information for each of us - what I’ve come to call a filter bubble - which fundamentally alters the way we encounter ideas and information.”

What Google has known all along

For some time now, Google has been capturing the following information:


  • Search History: Google keeps track of which search results are clicked. If Google notices a certain site is picked more often, it gets a ranking boost.
  • Signed-Out Web History: This history is browser-centric. Google tracks all the searches and search-result selections.
  • Signed-In Web History: This history is user-centric. If the user is recognized by Google, everything is tracked.

Google uses the above data to provide customized search results to signed-in account owners who give their permission.

What changed?

So what was this dramatic change? Google altered Personal Search, enabling it for everyone, not just those who are signed in, by using what it calls signed-out customization:

“When you’re not signed in, Google customizes your search experience based on past search information linked to your browser, using a cookie. Google stores up to 180 days of signed-out search activity linked to your browser’s cookie, including queries and results you click.”
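To make the mechanics of signed-out customization a bit more concrete, here is a toy model in Python. It is only a sketch of the idea described in the quote above (a query log keyed by a browser cookie rather than an account, trimmed to roughly 180 days); every name in it is invented for illustration, and it is not Google’s implementation.

```python
import time
from collections import defaultdict

RETENTION_SECONDS = 180 * 24 * 3600  # the "up to 180 days" window from the quote above

# Toy store: browser-cookie ID -> list of (timestamp, query, clicked_result)
signed_out_history = defaultdict(list)

def record_search(cookie_id, query, clicked_result):
    """Log a query and the clicked result, keyed by the browser's cookie, not an account."""
    signed_out_history[cookie_id].append((time.time(), query, clicked_result))

def recent_history(cookie_id):
    """Return only the entries younger than the retention window."""
    cutoff = time.time() - RETENTION_SECONDS
    return [(t, q, r) for (t, q, r) in signed_out_history[cookie_id] if t >= cutoff]

# Two browsers issuing the same query build up different histories,
# which is all a personalization engine needs to start diverging their results.
record_search("cookie-at-home", "python", "python.org")
record_search("cookie-at-work", "python", "en.wikipedia.org/wiki/Python_(genus)")
print(recent_history("cookie-at-home"))
print(recent_history("cookie-at-work"))
```

The point of the sketch is simply that the same query, issued from two different browsers, accumulates two different histories, and that is enough raw material for customization.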

Turning Personal Search on for everyone concerned Mr. Sullivan. Calling it the “New Normal,” he explains:

“The days of ‘normal’ search results that everyone sees are now over. Personalized results are the ‘new normal,’ and the change is going to shift the search world and society in general in unpredictable ways.”

To put it another way, Mr. Sullivan mentions:

“Happy that you’re ranking in the top results for a term that’s important to you?

Look again. Turn off personalized search, and you might discover that your top billing is due to the way the personalized system is a huge ego search reinforcement tool. If you visit your own site often, your own site ranks better in your own results - but not for everyone else.”

And, here I thought my articles were getting high rankings because of their merit. Ouch.


PageRank and then some

PageRank, the system Google uses to rate web pages, is what made Google famous and more than a few people rich. In 2009, Google altered its holy grail in order to revamp Personal Search. In his book, Mr. Pariser points out that Google now uses 57 different variables, or “signals,” to create search results tailored specifically to you. Some of the known signals are listed below; a toy sketch of how such signals might be combined appears after the list:

  • Search history
  • Location
  • Active browser
  • Computer being used
  • Language configured

I suspect the other 52 will remain secret, much like the formula for Coke.
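As promised above, here is a deliberately simplified re-ranking sketch showing how a handful of such signals could reorder the same base results for different users. The weights, signal names, and sample data are all invented; this is an illustration of the general idea, not Google’s algorithm.

```python
# Toy personalized re-ranking: every result starts with a base relevance score,
# then gets small boosts from per-user signals. All weights and data are invented.

base_results = [
    {"url": "python.org",           "base": 0.90, "lang": "en", "region": "US"},
    {"url": "wikipedia.org/Python", "base": 0.85, "lang": "en", "region": "US"},
    {"url": "python.org.br",        "base": 0.70, "lang": "pt", "region": "BR"},
]

def personalized_score(result, user):
    score = result["base"]
    if result["url"] in user["frequently_clicked"]:  # search-history signal
        score += 0.20
    if result["region"] == user["region"]:           # location signal
        score += 0.05
    if result["lang"] == user["language"]:           # language signal
        score += 0.05
    return score

def rank(results, user):
    return sorted(results, key=lambda r: personalized_score(r, user), reverse=True)

user_a = {"frequently_clicked": {"wikipedia.org/Python"}, "region": "US", "language": "en"}
user_b = {"frequently_clicked": {"python.org.br"},        "region": "BR", "language": "pt"}

print([r["url"] for r in rank(base_results, user_a)])  # the frequently visited page wins for user A
print([r["url"] for r in rank(base_results, user_b)])  # the Brazilian site comes first for user B
```

Notice that for user A the frequently visited page outranks a page with a higher base score, which is exactly the “ego search reinforcement” effect Mr. Sullivan describes, while user B sees a different order entirely.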


What it all means

Ever have one of those feelings that something doesn’t seem right, but you can’t put your finger on it? I suspect that’s why it took me until now to realize the implications of Google’s Personal Search, and why Mr. Pariser has spent a great deal of time and effort coming to his conclusions.

I’m glad I read the book. Understanding Mr. Pariser’s concerns will help me gauge search results more realistically. For the time-challenged, Mike Elgan offers a synopsis of the book in his blog post, How to pop your Internet ‘filter bubble’:

“In this column, I’m going to tell you how personalization works, why you may not want it, and also how to pop the bubble and opt out of a system that censors your Internet based on stereotyping.”

I found the following tips by Mr. Elgan useful (a minimal sketch of a cookie-free query follows the list):

  • Deliberately click on links that make it hard for the personalization engines to pigeonhole you. Make yourself difficult to stereotype.
  • Erase your browser history and cookies from time to time.
  • Use an “incognito” window for exploring content you don’t want to be shown more of later.
  • Use Twitter instead of Facebook for news. (Twitter doesn’t personalize.)
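Here is the sketch promised above, combining the cookie and incognito tips: a single query sent with no stored cookies, so there is no browser-linked history for the engine to draw on. The pws=0 parameter is included on the assumption, widely reported at the time, that it asks Google not to personalize that query; treat both the parameter and the exact URL as assumptions to verify, and note that Google may throttle automated requests.

```python
from urllib.parse import quote
from urllib.request import Request, urlopen

def depersonalized_search(query: str) -> bytes:
    """Fetch a results page while sending no cookies, much like a fresh incognito window.

    Assumption: at the time of writing, appending pws=0 was a commonly cited way to ask
    Google to skip personalization for a single query. Verify before relying on it.
    """
    url = f"https://www.google.com/search?q={quote(query)}&pws=0"
    req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    # No cookie handler is installed, so no stored cookies are sent and none are kept,
    # which means there is no signed-out history tied to this request.
    with urlopen(req) as resp:  # note: Google may block or throttle automated requests
        return resp.read()

if __name__ == "__main__":
    html = depersonalized_search("filter bubble")
    print(len(html), "bytes of results HTML")
```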

Update: As for Twitter and Facebook, I just read a Yahoo Finance article from The Wall Street Journal and felt compelled to share it with you. The article refers to the Facebook “Like” button and Twitter’s “Tweet” button that are displayed on web pages:

“These so-called social widgets, which appear atop stories on news sites or alongside products on retail sites, notify Facebook and Twitter that a person visited those sites even when users don’t click on the buttons, according to a study done for The Wall Street Journal.”

The article goes on to explain something that may surprise you:

“For this to work, a person only needs to have logged into Facebook or Twitter once in the past month. The sites will continue to collect browsing data, even if the person closes their browser or turns off their computers, until that person explicitly logs out of their Facebook or Twitter accounts.”

How about that?
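A rough conceptual model of why no click is needed: the Like and Tweet buttons are loaded from the social networks’ own servers, and the browser attaches whatever cookies it already holds for those domains (such as a login session), along with the address of the referring page. The sketch below simulates that flow; the domains, cookie values, and headers are simplified assumptions, not the networks’ actual implementations.

```python
# Conceptual model only: loading a page that embeds social widgets triggers extra
# requests to the widget providers' own domains. The browser attaches any cookies it
# already holds for those domains (e.g., a login session) plus a Referer header, so
# the provider learns which page was visited even if the button is never clicked.

stored_cookies = {
    "facebook.com": {"session": "logged-in-user-123"},  # set the last time you logged in
    "twitter.com": {"session": "logged-in-user-456"},
}

def load_page(page_url, widget_domains):
    print(f"GET {page_url}")
    for domain in widget_domains:
        cookies = stored_cookies.get(domain, {})
        # Simulated widget request: the provider can tie the referring page
        # to the logged-in session carried by its cookie.
        print(f"  GET https://{domain}/widget.js  Referer={page_url}  Cookies={cookies}")

load_page("https://news.example/some-article", ["facebook.com", "twitter.com"])
```

That is also why, as the quote notes, the collection continues until the person explicitly logs out: logging out is what removes or invalidates the session cookie that rides along with every widget request.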


An afterthought

The advantage afforded to those with the ability to manipulate search-engine results is huge, and I was interested in learning what Mr. Pariser and Mr. Sullivan thought about that. Time did not allow Mr. Pariser to respond; Mr. Sullivan did.

Kassner: Ultimately, my concern is this: How do we know that the search results we are served are not deliberately biased, steering us toward someone else’s agenda?

Sullivan: I think despite personalization, the search results still reflect lots of diversity. I also think that results are only the start of research into a new area. Wherever you end up, you’ll probably get some pointers to other material - and that also leads to greater diversity.

I also think it’s easy to assume the worst. My friends are all liberal (let’s say), so I’ll never see anything but a liberal view of the world. Perhaps. But the reality is that some of your friends will probably point toward some anti-liberal material, as part of their discussions. And that’s exposing you to more diversity.

Assuming the worst, Google could intentionally try to bias its search results to a particular view. But that assumes there’s a particular view on literally billions of unique searches that are done each month. There’s just not. Some of them have no particular slant one way or another. But even if you managed it, as I said, some of those resources (just like your friends) will point toward content they don’t agree with.

The challenge isn’t that we won’t get exposed to contrary statements. The challenge is that people are seemingly more and more happy to ignore contrary material and create their own beliefs without any critical thinking. “True Enough” is a good book on this topic. Perhaps this really isn’t something new but rather has always been there. But it sure feels new to me.

Kassner: I am seeing people prefer to follow links shared on Twitter and Facebook; they trust those recommendations over the search engines. Are you seeing that as well? Do you see this as a growing trend?

Sullivan: I do see it growing, and it’s because our social networks offline have “caught up” to being as accessible as search engines for quick answers. We can ask many people for answers to anything, and that’s particularly attractive for subjective questions where there’s no right answer, where we want opinions from those we know.

Kassner: What is your opinion on the general health of search today?

Sullivan: I think the general health is actually pretty good. We should look for search engines to do more to increase quality, which means probably relying less on the link-based systems of ranking that worked in the past and more toward using social signals as well as our own behavior.

Kassner: Good advice. I intend to heed it.


Final thoughts

My goal is to make you aware of what Mr. Pariser calls the filter bubble, and to explain why my Internet is different from your Internet. Just knowing that search customization is happening is more than half the battle.

I learned a great deal from Mr. Sullivan about a subject I thought I understood. I was wrong and I thank him for his help.
