May 28, 2011

Why is my Internet different from your Internet?

May 23, 2011, 6:39 AM PDT

Takeaway: At home you search for something on Google. Ten minutes later, at work, you enter the exact same query into Google, but get different results. Why?

December 4, 2009, was a pivotal day for the Internet. Yet, as Eli Pariser points out in his new book, The Filter Bubble, very few people noticed what the search giant Google had done. Fortunately:

“Search engine blogger Danny Sullivan pores over the items on Google’s blog, looking for clues about where the monolith is headed next, and to him, the post was a big deal. In fact, he wrote later that day, it was the biggest change that has ever happened in search engines.”

 

Filter bubble? What is it?

Mr. Pariser’s book takes its title from the phenomenon he calls the “filter bubble.” He explains what it’s all about in the book:

 

“The new generation of Internet filters looks at things you seem to like-the actual things you’ve done, or the things people like you like-and tries to extrapolate. They are prediction engines, constantly creating and refining a theory of who you are and what you’ll do and want next.

 

Together these engines create a unique universe of information for each of us-what I’ve come to call a filter bubble-which fundamentally alters the way we encounter ideas and information.”

 

What Google has known all along

For some time now, Google has been capturing the following information:

 

  • Search History: Google keeps track of what is clicked on in search results. If Google notices a certain site is picked more often, it will get a rankings boost.
  • Signed-Out Web History: This history is browser-centric. Google tracks all the searches and search-result selections.
  • Signed-In Web History: This history is user-centric. If the user is recognized by Google, everything is tracked.

Google uses the above data to provide customized search results to signed-in account owners who give their permission.

 

What changed?

So what was this dramatic change? Google altered Personal Search, enabling it for everyone, not just those signed in, using what it calls signed-out customization:

 

“When you’re not signed in, Google customizes your search experience based on past search information linked to your browser, using a cookie. Google stores up to 180 days of signed-out search activity linked to your browser’s cookie, including queries and results you click.”

Turning Personal Search on for everyone concerned Mr. Sullivan. Calling it the “New Normal,” he explains:

“The days of ‘normal’ search results that everyone sees are now over. Personalized results are the ‘new normal,’ and the change is going to shift the search world and society in general in unpredictable ways.”

To put it another way, Mr. Sullivan mentions:

“Happy that you’re ranking in the top results for a term that’s important to you?

 

Look again. Turn off personalized search, and you might discover that your top billing is due to the way the personalized system is a huge ego search reinforcement tool. If you visit your own site often, your own site ranks better in your own results-but not for everyone else.”

And, here I thought my articles were getting high rankings because of their merit. Ouch.

 

PageRank and then some

PageRank is how Google rates web pages; it is what made Google famous and more than a few people rich. In 2009, Google altered its holy grail in order to revamp Personal Search. Mr. Pariser points out in his book that Google now uses 57 different variables, or “signals,” to create search results tailored specifically for you. Some of the known signals are:

 

  • Search history
  • Location
  • Active browser
  • Computer being used
  • Language configured

I suspect the other 52 will remain secret, much like the formula for Coke.
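To make the idea concrete, here is a minimal, purely illustrative Python sketch of how a handful of signals might be blended into a personalized score. The signal names, weights, and numbers are hypothetical; Google's actual formula is not public.

    # Illustrative only: hypothetical signals and weights, not Google's formula.
    def personalized_score(base_rank, signals, weights):
        """Blend a page's query-independent rank with per-user signals."""
        score = base_rank
        for name, value in signals.items():
            score += weights.get(name, 0.0) * value
        return score

    weights = {"clicked_before": 2.0, "same_language": 0.5, "near_user_location": 1.0}

    # Two users, same page, different rankings.
    alice = {"clicked_before": 1, "same_language": 1, "near_user_location": 0}
    bob = {"clicked_before": 0, "same_language": 1, "near_user_location": 1}

    print(personalized_score(3.2, alice, weights))  # 5.7
    print(personalized_score(3.2, bob, weights))    # 4.7

Multiply that by dozens of signals and billions of pages, and you get results that no two people see quite the same way.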

 

What it all means

Ever have one of those feelings that something doesn’t seem right, but you can’t put your finger on it? I suspect that’s why it took me until now to realize the implications of Google’s Personal Search, and why Mr. Pariser has spent a great deal of time and effort reaching his conclusions.

 

I’m glad I read the book. Understanding Mr. Pariser’s concerns will help me gauge search results more realistically. For the time-challenged, Mike Elgan offers a synopsis of the book in his blog post, How to pop your Internet ‘filter bubble’:

“In this column, I’m going to tell you how personalization works, why you may not want it, and also how to pop the bubble and opt out of a system that censors your Internet based on stereotyping.”

I found the following tips by Mr. Elgan useful:

  • Deliberately click on links that make it hard for the personalization engines to pigeonhole you. Make yourself difficult to stereotype.
  • Erase your browser history and cookies from time to time.
  • Use an “incognito” window for exploring content you don’t want too much of later.
  • Use Twitter instead of Facebook for news. (Twitter doesn’t personalize.)

Update: As for Twitter and Facebook, I just read a Yahoo Finance article prepared by The Wall Street Journal and felt compelled to share it with you. The article refers to the Facebook “Like” button and Twitter’s “Tweet” button displayed on web pages:

“These so-called social widgets, which appear atop stories on news sites or alongside products on retail sites, notify Facebook and Twitter that a person visited those sites even when users don’t click on the buttons, according to a study done for The Wall Street Journal.”

The article goes on to explain something that may surprise you:

“For this to work, a person only needs to have logged into Facebook or Twitter once in the past month. The sites will continue to collect browsing data, even if the person closes their browser or turns off their computers, until that person explicitly logs out of their Facebook or Twitter accounts.”

How about that?
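The mechanics behind that finding are worth spelling out. The “Like” and “Tweet” buttons are loaded from the social networks’ own servers, so every page view that embeds one sends those servers your cookie along with the address of the page you are reading, no click required. Here is a rough, hypothetical Python sketch of what the receiving end can log; the names are mine, not Facebook’s or Twitter’s actual code.

    # Hypothetical sketch: what a social-widget endpoint can record on every page load,
    # whether or not the visitor ever clicks the button.
    import time

    def serve_like_button(cookies, headers, visit_log):
        user_id = cookies.get("session_id")   # set the last time you logged in
        page = headers.get("Referer")         # the article embedding the button
        visit_log.append((time.time(), user_id, page))
        return b"...button image bytes..."

    visit_log = []
    serve_like_button({"session_id": "u12345"},
                      {"Referer": "http://news.example.com/some-article"},
                      visit_log)
    print(visit_log)  # the network now knows which article user u12345 read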

 

An afterthought

The advantage afforded to those with the ability to manipulate search-engine results is huge, and I was interested in learning what Mr. Pariser and Mr. Sullivan thought about that. Time did not allow Mr. Pariser to respond; Mr. Sullivan did.

 

Kassner: Ultimately, my concern is this: How do we know that search results are not deliberately biased, leading us to follow someone else’s agenda?

Sullivan: I think despite personalization, the search results still reflect lots of diversity. I also think that results are only the start of research into a new area. Wherever you end up, you’ll probably get some pointers to other material-and that also leads to greater diversity.

I also think it’s easy to assume the worst. My friends are all liberal (let’s say), so I’ll never see anything but a liberal view of the world. Perhaps. But the reality is that some of your friends will probably point toward some anti-liberal material, as part of their discussions. And that’s exposing you to more diversity.

Assuming the worst, Google could intentionally try to bias its search results to a particular view. But that assumes there’s a particular view on literally billions of unique searches that are done each month. There’s just not. Some of them have no particular slant one way or another. But even if you managed it, as I said, some of those resources (just like your friends) will point toward content they don’t agree with.

The challenge isn’t that we won’t get exposed to contrary statements. The challenge is that people are seemingly more and more happy to ignore contrary material and create their own beliefs without any critical thinking. “True Enough” is a good book on this topic. Perhaps this really isn’t something new but rather has always been there. But it sure feels new to me.

Kassner: I am seeing people preferring to use links mentioned on Twitter and Facebook. They trust those opinions over the search engines. Are you seeing that as well? Do you see this as a growing trend?

Sullivan: I do see it growing, and it’s because our social networks offline have “caught up” to being as accessible as search engines for quick answers. We can ask many people for answers to anything, and that’s particularly attractive for subjective questions where there’s no right answer, where we want opinions from those we know.

Kassner: What is your opinion on the general health of search today?

Sullivan: I think the general health is actually pretty good. We should look for search engines to do more to increase quality, which means probably relying less on the link-based systems of ranking that worked in the past and more toward using social signals as well as our own behavior.

Kassner: Good advice. I intend to heed it.

 

Final thoughts

My goal is to make you aware of what Mr. Pariser calls the filter bubble, and to explain why my Internet is different from your Internet. Just knowing that search customization is happening is more than half the battle.

 

I learned a great deal from Mr. Sullivan about a subject I thought I understood. I was wrong and I thank him for his help.


May 23, 2011

Lawsuit Against YouTube Threatens Global Growth of Political Speech

April 7th, 2011

Legal Attack on Online Video Site Could Throttle Innovation with Fears of Litigation

San Francisco – The Electronic Frontier Foundation (EFF) and a coalition of advocacy groups have asked a federal appeals court to reject attempts to thwart federal copyright law and saddle online communities with new litigation fears in the appeal of Viacom v. YouTube.

In an amicus brief filed Thursday, EFF argues that the infringement claims made by Viacom and the other plaintiffs threaten to undermine the "safe harbor" provisions of the Digital Millennium Copyright Act (DMCA) — safe harbors that have fostered free speech and innovation around the globe. Without the clear legal structure of the DMCA process, companies that host user-generated expression could be hit with potentially massive damage awards, which would encourage over-blocking of content or even the shutdown of services altogether.

"If the DMCA safe harbors are undermined in the way Viacom and the other content companies would like, the free flow of information will be seriously threatened," said EFF Senior Staff Attorney Abigail Phillips. "Communications platforms like YouTube have enabled political and other speech to flourish online. We've all seen the critical role digital communications have been playing in protests across the Middle East. The safe harbors make posting of user-generated content like this possible."

At issue in this case is copyright infringement on YouTube before the online video service voluntarily implemented content filtering technologies in May of 2008. The district court correctly found that YouTube was shielded by the DMCA safe harbors, and Viacom and others appealed the ruling to the 2nd U.S. Circuit Court of Appeals.

"All the online services you use every day — Facebook, Twitter, Amazon, eBay — depend on the DMCA safe harbors in order to allow user-generated content on their sites," said EFF Intellectual Property Director Corynne McSherry. "That's why Congress designed the safe harbors — to allow innovators to manage legal risk and develop new services without fear of devastating litigation, while offering copyright owners an expedited process for taking down infringing content. Viacom's arguments here misinterpret the law, with potentially disastrous results."

Also joining EFF's brief are the International Federation of Library Associations and Institutions, the American Library Association, the Association of College and Research Libraries, the Association of Research Libraries, and the Center for Democracy and Technology.

For the full amicus brief:
https://www.eff.org/files/filenode/viacom_v_youtube/ViacomvGoogleAmicus….

For more on this case:
http://www.eff.org/cases/viacom-v-youtube

Contacts:

Corynne McSherry
Intellectual Property Director
Electronic Frontier Foundation
corynne@eff.org

Abigail Phillips
Senior Staff Attorney
Electronic Frontier Foundation
abigail@eff.org

Related Issues: DMCA, Intellectual Property

Related Cases: Viacom v. YouTube


January 10, 2011

An Introduction to Net Neutrality: What It Is, What It Means for You, and What You Can Do About It


We've dropped the net neutrality term around here a few times, but you may not entirely understand what it's all about. Here's a primer on what net neutrality is, how it might affect you, and what you can do about it.

Photo remixed from an original by The Local People Photo Archive.

What is Net Neutrality?

As its name indicates, net neutrality is about creating a neutral internet. The basic principle driving net neutrality is that the internet should be a free and open platform, almost like any other utility we use in our home (like electricity). Users should be able to use their bandwidth however they want (as long as it's legal), and internet service providers should not be able to provide priority service to any corner of the internet. Every web site (whether it's Google, Netflix, Amazon, or UnknownStartup.com) should be treated the same when it comes to giving users the bandwidth to reach the internet-connected services they prefer. Your electric company has no say over how you use your electricity—they only get to charge you for providing the electricity. Net neutrality aims to do something similar with your internet pipes.

Those against net neutrality—commonly including internet service providers (ISPs), like Comcast or AT&T—believe that, as providers of internet access, they should be able to distribute bandwidth differently depending on the service. They'd prefer, for example, to create tiers of internet service that are more about paying for priority access than for bandwidth speeds. As such, in theory, they could charge high-bandwidth services—like Netflix, for example—extra money, since their service costs more for Comcast to provide to its customers—or they could charge users, like you and me, extra to access Netflix. They can also provide certain services to you at different speeds. Your ISP might, for example, give preferential treatment to Hulu, so it streams Hulu videos quickly and for free, while Netflix is stuck running slowly (or you have to pay extra to access it).
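Stripped of the business details, prioritization is just a scheduling decision inside the ISP's network: the same bytes get delivered at different rates depending on where they come from. A toy Python sketch of the idea, with made-up service names and rate limits rather than any real ISP's configuration:

    # Toy model of per-service traffic shaping: same data, different treatment.
    RATE_LIMITS_KBPS = {            # hypothetical tiers an ISP could impose
        "hulu.example": 10000,      # preferred partner: full speed
        "netflix.example": 500,     # competitor: throttled to a crawl
    }
    DEFAULT_KBPS = 5000

    def seconds_to_deliver(host, kilobits):
        rate = RATE_LIMITS_KBPS.get(host, DEFAULT_KBPS)
        return kilobits / rate

    segment_kilobits = 1_500_000    # roughly a 190 MB chunk of streaming video
    for host in ("hulu.example", "netflix.example"):
        print(host, round(seconds_to_deliver(host, segment_kilobits)), "seconds")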

What are the Arguments For Net Neutrality?

Proponents of net neutrality don't want to give the ISPs too much power because it could easily be abused. Imagine that Verizon or AT&T don't like the idea of Google Voice, because it allows you to send text messages for free using your data connection. Your cellphone carrier could block access to Google Voice from your smartphone so you're forced to pay for a texting plan from them. Or, they see that a lot of people are using Facebook on their smartphone, so even if they have the bandwidth to carry that traffic, they decide to charge you extra to access Facebook, just because they know it's in high demand and that they can make a profit.


Image via Reddit.

Similarly, Comcast recently got in a tiff with Netflix over its streaming video offerings, essentially telling Netflix's partners that they'd need to pay if they wanted their content delivered on Comcast's network. Comcast argued that streaming Netflix is a huge traffic burden, and that if it's going to provide that service it will need to update its infrastructure. Netflix's argument was that Comcast provides the internet, and it's Comcast's users who have requested that extra bandwidth for the services they want.

Another way to look at it: Comcast also has its own On Demand service, which directly competes with Netflix—and if Comcast is allowed to divide up its service as it pleases, giving preferential treatment to its own service just because it's the internet provider isn't exactly fair. And, with Comcast and NBC looking to merge, the waters get even murkier. The resulting superpower could give preference to all of NBC's content too, thus leaving other content providers out in the cold.


Another problem here is that while big services like Netflix could, in theory, afford to pay Comcast for using extra bandwidth, the small, lesser-known services—that could be big one day but aren't yet—can't. Really great web sites or internet services might never gain popularity merely because ISPs would have control over what kind of access users like you and me have to that service. That could greatly stifle innovation, and we'd likely miss out on a lot of cool new services.

What are the Arguments Against Net Neutrality?

Anti-net neutrality activists argue that internet service providers have a right to distribute their network differently among services, and that in fact, it's the ISPs that are innovating. They argue that giving preferential treatment to different services isn't a bad thing; in fact, sometimes it's necessary. In the recent Comcast/Netflix debate, they point out that if Netflix is sucking up all the bandwidth, Netflix should be the one to pay for the necessary updates that Comcast's systems will require because of it.

Many free market proponents are also against the idea of net neutrality, noting that Comcast and AT&T are companies like any other that should be able to compete freely, without government regulation. They themselves aren't "the internet"—they're merely a gateway to the internet, and if they're each allowed to manage their networks differently, you're more likely to have competition between service providers, which ultimately, they claim, is better for the users. If you don't like the fact that Netflix is slower on Comcast than it is on AT&T, you can switch to AT&T.

The problem, however, is still that ISPs could always favor their own services over others, leaving services with no connection to the ISP out in the cold. Furthermore, most people don't have much choice in who their ISP is, since in any given location there may be only one or two ISPs providing internet.


What are the Current Laws?

The Federal Communications Commission (FCC) released a new set of net neutrality rules on December 21, 2010 for internet service providers. Here's the state of net neutrality regulation as of right now:

Transparency


First and foremost, the FCC requires that ISPs publicly disclose all their network management practices, so that users can make informed decisions when purchasing internet service. That means they'd have to say what speeds they offer, what types of applications would work over those speeds, how they inspect traffic, and so on. It does not necessarily mean that those disclosures will be understandable by non-tech-savvy individuals—in fact, we've already seen how ISPs try to spin their "what you'll get" charts to get you to purchase the most expensive internet—so this rule doesn't necessarily mean a lot to the average consumer.

No Blocking or Unreasonable Discrimination for Wired Internet

Wired ISPs—that is, providers of the internet in your home—are not allowed to outright block any legal web content, applications, or services. The FCC also notes that they aren't allowed to slow down traffic either, as this often renders a service unusable and thus is no different from outright blocking. For example, Comcast has in the past throttled BitTorrent downloads, but it didn't block them completely—it just slowed them down to a crawl. Under these new rules, that wouldn't be allowed either. Photo by Kelly Teague.

The new rules also do not allow wired ISPs to discriminate against legal network traffic. This means that Comcast cannot, in fact, discriminate against competitive services like Netflix or stifle free speech (by, say, discriminating against political outlets that have views different from the ISP or its parent company).

Your Smartphone Doesn't Count

Mobile ISPs, on the other hand, are not subject to the same rules. The FCC believes mobile broadband—that is, the data plan you have on your cellphone—is still young enough that it may need heavier network management than wired broadband. As such, they haven't made any broad net neutrality rules as of yet. Mobile ISPs are still prohibited from blocking services on the web that compete directly with their own, but they can continue to discriminate—which means that at any given point, you could find an internet service blocked or deliberately slowed down when accessing it from your smartphone. Furthermore, if the ISPs so choose, they could charge you extra to access certain services, like Facebook or Netflix. App stores are exempt from these rules, so the App Store and Android Market can be as closed as they want to be. So, if Apple decided that they no longer wanted Google Voice to be available in the App Store, they could remove it—even though it's a service that directly competes with AT&T.


Photo by David Fulmer.

The other group exempt from the rules is managed services—services that companies pay extra for, and thus require a higher level of service. A good example is AT&T's IPTV service—they provide television and on demand services through the internet instead of over cable or radio frequencies, and they dedicate a certain amount of their bandwidth for just those services, leaving less bandwidth for everything else. Again, this isn't intrinsically bad, but giving ISPs unlimited power to do this can lead to dangerous territory.

So Why the Fuss?

The rules as I've laid them out above offer a pretty condensed summary of the main points in the FCC's latest release, and while they seem like a big step forward (namely the neutrality rules in place for wired connections), a lot of net neutrality proponents are still unhappy. The exception for mobile broadband is a pretty big complaint, as are the exceptions for managed services. A lot of folks also argue that loopholes abound in the new rules, like the fact that all the rules are subject to "reasonable network management", which isn't very well defined. To be fair, neither side is happy with the current rules—which is to be expected in such a heavily debated issue. Proponents think the rules aren't strict enough and that the ISPs have gotten "exactly what they wanted", while the anti-net neutrality camp think that the internet companies are being too heavily regulated.

In the end, it's all about the control you, as a user, have over how you use the internet. While net neutrality's opponents argue that tiered service creates more control for the user, most of us don't see it that way—we'd like to be able to access all internet services equally, instead of having certain services given preferential treatment. After the passing of these rules, the wired internet in our homes is a bit safer, but the internet we access from our smartphones isn't. ISPs could still block, discriminate against, or charge extra for web sites and services we get on-the-go, taking control out of your hands.

If you really want to argue about the finer points, you'll want to dig into the actual FCC release, as this or any other summary isn't going to provide the nuances and specifics nearly well enough. But in general, this should give you a good idea of where we are now.

What Can I Do to Get Involved?


If you're reading this and foaming at the mouth in anger, there are a few things you can do. The FCC has a complaint system set up for citizens to voice their issues on communications-related topics.

Submit an Informal Complaint

Submitting an informal complaint is easy, as it's all done online, and anyone can do it. Right now, the form isn't exactly friendly—there don't seem to be any specific sections about the new net neutrality rules—but the FCC says they'll be making resources available for net neutrality-specific complaints. For now, Ars Technica recommends hitting "Internet Service and VoIP", then heading to "Billing, Service, Availability" and going to the online form from there.

Submit a Formal Complaint

End users can't submit formal complaints, but if you're a company or public interest group that's very concerned about the new rules (and you've got $200 to spend on the filing fee), you can file a formal complaint, which is often like a court hearing. You'll probably need a lawyer, and for most of us, the informal route is the best bet. But Ars has more information on formal complaints if you're interested.

Spread the Word

Net neutrality's a complicated issue, and a lot of people still aren't informed about what's going on. Explain the issue to your friends and family—the more people know about it, the more people that might be affected and might speak out. You can also check out each side's respective organization, SavetheInternet.com for pro-net neutrality voices and HandsOff.org for anti-net neutrality voices. They've each got a ton of links to other ways you can talk to your congresspeople, write letters and sign petitions to make your voice heard.


We here at Lifehacker are open supporters of net neutrality, but we know it's a very hot-button issue, and many of you probably have your own opinions on the subject—whether you agree with us or not. So let's get some discussion started in the comments below.

Send an email to Whitson Gordon, the author of this post, at whitson@lifehacker.com


December 15, 2010

You have no secrets

Privacy? No way. Government, business and even the kid next door know what you're up to.

Headbone connects to the headphones
Headphones connect to the iPhone
iPhone connected to the Internet
Connected to the Google
Connected to the government.

—M.I.A., “The Message”

You are being watched.

Your Facebook friends are watching you. So are their Facebook friends, and total strangers. The guys who run Facebook, too. Your keystrokes are being logged. Your mouse-clicks are being monitored and digested. Your behavioral patterns are being analyzed, monetized: what you buy on Amazon, who you follow on Twitter, where you say you eat on Yelp, your most shameful Google searches.

The photos you post on Flickr are encoded with little bits of geospatial metadata that pinpoint where they were taken and can reveal where you live. Your smartphone—jam-packed with apps coded by who-knows-who and potentially loaded with spyware—is a pocket homing beacon, trackable by satellite. There are trucks with cameras on their roofs, trundling past your apartment, duly noting your unsecured Wi-Fi signal.
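That geotagging is ordinary EXIF metadata, and anyone can read it in a few lines of code. A minimal sketch using the Pillow imaging library (the file name is a placeholder, and not every photo carries GPS data):

    # Read embedded GPS coordinates from a photo's EXIF data, if present.
    from PIL import Image
    from PIL.ExifTags import GPSTAGS

    img = Image.open("vacation.jpg")        # placeholder file name
    exif = img._getexif() or {}
    gps_raw = exif.get(34853, {})           # 34853 is the GPSInfo EXIF tag
    gps = {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}

    if gps:
        print(gps.get("GPSLatitude"), gps.get("GPSLatitudeRef"))
        print(gps.get("GPSLongitude"), gps.get("GPSLongitudeRef"))
    else:
        print("No location data embedded in this photo.")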

Wal-Mart is putting radio frequency identification (RFID) tags in your underwear.

You can barely remember all the different passwords to the ever-proliferating number of websites to which you’ve entrusted personal photos and videos, likes and dislikes, credit-card info and your Social Security number. Then there are the photos of you that other people have posted without your knowledge, or the things they may have written about you on blogs or message boards—things that have a good chance of remaining online and searchable for perpetuity.

And that’s to say nothing of the vast and classified surveillance apparatus—“so large, so unwieldy, and so secretive that no one knows how much money it costs, how many people it employs, how many programs exist within it, or exactly how many agencies do the same work,” according to the Washington Post—that could (who knows?!) be silently taking note of the e-mails you’ve sent and the phone calls you’ve made.

Facebook, keeping tabs on its 500 million members—who share 30 billion bits of information each month yet are mostly ignorant of its privacy policy, which is longer than the United States Constitution—looks like a Class of 1984 high-school yearbook by comparison.

Public is the new private

Over just the past decade or so, the Web has turned things upside down. As Danah Boyd said, speaking in the spring at SXSW in Austin, we’ve seen “an inversion of defaults when it comes to what’s public and what’s private.”

Time was, what you said and did was “private by default, public through effort,” said Boyd, a fellow at Harvard’s Berkman Center for Internet and Society. That’s all changed: “Conversation is public by default, private through effort. You actually have to think about making something private because, by default, it is going to be accessible to a much broader audience . . . And, needless to say, people make a lot of mistakes learning this.”

To a degree unheard of even five years ago, we live our lives mediated by Firefox browsers and Droid screens. And that means—whether it’s ostensibly protected sensitive data (financial and medical data), ostensibly inconsequential personal data (Flickr photos, YouTube channels, Twitter feeds), or ostensibly de-personalized behavioral data (browsing patterns, search queries, HTTP cookies)—our lives are nowhere near as private as we might presume them to be.

“Precisely because the tech advances have come in so many places, it’s really quite hard to pick any one particular spot that’s the biggest problem,” says Lee Tien, senior staff attorney at the Electronic Frontier Foundation. “They all converge. Because we have a giant personal information superhighway, where all of our information travels around both the government and the business sector, what gets picked up in one place is being transferred to another place. So it all ends up, not necessarily in a central basket, but in a lot of different baskets—where it can always be accessed.”

“Data collection is becoming ubiquitous,” says Jules Polonetsky, co-chair and director of the Future of Privacy Forum, and former chief privacy officer at AOL. “It’s not science fiction anymore to think there are lots of databases that have everything we’ve done: every search we’ve done, every website we’ve visited.”

It might be comforting to think that our online identities are just anonymous strings of ones and zeros, but that’s just not true anymore. So what we used to loosely define as “privacy”—an admittedly amorphous concept—is changing fast. And only recently do consumers, voters, politicians, and the media seem to be grasping that fact.

Before, “we had privacy from obscurity,” says David Ardia, another fellow at the Berkman Center, and the director and founder of the Citizen Media Law Project. Now, almost everything worth knowing about almost anyone is online. 

“That means it’s searchable, and it’s available forever. And I don’t think we’ve caught up to that change in the way we structure our lives and the way we understand privacy.”

‘They want to know more about us’

To begin with, privacy is a problematic notion.

“It’s a very misunderstood concept from a constitutional point of view,” says civil liberties attorney Harvey Silverglate. “There are some parts of the Constitution, and of the Bill of Rights in particular, that are quite specific about it. And there are others that are quite general and amorphous.”

While the First Amendment is very explicit, for instance (“Congress shall make no law…”), the Fourth Amendment (“unreasonable searches and seizures”…“probable cause”) leaves a lot more wiggle room. It’s “seemingly intentionally vague,” says Silverglate—as if “left for the particular era and particular culture to define.” The result is a wording that suggests people are entitled to a reasonable degree of privacy—but just what it is differs in any given environment.

Obviously, the Framers “didn’t envision the Internet or telephones, but they obviously understood that this was an area that was going to be evolving, and they couldn’t define it.”

And so we find ourselves, at the beginning of the second decade of the 21st century, still trying to figure all this out.

The problem, says Silverglate, “is that the pace of technological change is proceeding so quickly that the courts, which were always a little bit behind in the development of technology, are now being left in the dust.”

Indeed, says Tien, “technology has advanced and the law has not.” Moreover, “Privacy is not easy to define. It means different things to different people.” But above all else, he says, the most acute threat nowadays is that both the government and the private sector have such vested interests in chipping away at whatever privacy actually is.

“You and I might view the information that we give off online, that we don’t want others to capture, as a negative thing like pollution in the air,” says Tien. But “for government and industry, it’s a nutrient. It’s something they can feed on. They want to know more about us.”

No such agency

“A hidden world, growing beyond control,” wrote Dana Priest and William Arkin in their Washington Post special report, “Top Secret America” —describing “some 1,271 government organizations and 1,931 private companies [working] on programs related to counterterrorism, homeland security, and intelligence in about 10,000 locations across the United States. An estimated 854,000 people, nearly 1.5 times as many people as live in Washington, D.C., hold top-secret security clearances.”

If you don’t think a goodly number of those folks are listening in to the occasional Skype conversation, you haven’t been paying attention these past 10 months.

“I’m worried about a number of phenomena,” says Silverglate. “First, because of the increasing number of searches being done by the terror warriors—the CIA, the NSA, the FBI, and God knows who else—the chaos in the federal investigative establishment is unbelievable. If you think they can’t get the mail delivered on time, just think about the wiretaps and the electronic surveillance.”

It’s enough to make the most intrusive data-mining operation seem tame by comparison. After all, says Silverglate, a corporation “can spy on you but they can’t arrest you.” And when they do spy on you, it’s “because they want to sell you something, not kill you.”

Don’t (just) worry about the government

The problem comes when governments start strong-arming those companies into doing their bidding. Consider the controversy surrounding AT&T’s cooperation with the NSA (National Security Agency), without the knowledge of its customers, on a “massive program of illegal dragnet surveillance of domestic communications” (as the Electronic Frontier Foundation charged) back in 2006. “AT&T just allowed them access to the control room,” marvels Silverglate.

The Feds, in other words, “enlist the brilliance and expertise of companies like Google for the purposes of snooping on its citizens.”

It’s a job at which Google has allegedly acquitted itself quite well in recent months.

In May, news broke that the omnipresent (and sometimes seemingly omnipotent) corporation had been vacuuming up data about citizens’ Wi-Fi networks and what sorts of content was being accessed thereon. Like in a B-movie stakeout, it was all monitored from inside a van—those camera-equipped Street View trucks that patrol the world’s cul-de-sacs and capture images of sword-and-sorcery LARPers, “horse boy,” and, well, your front door.

Google insists that the data sweeps were “unintentional” and that, at any rate, the data were only viewed a very limited number of times, by mistake. You’re not the only one who’s dubious. Massachusetts Congressman Ed Markey has asked the Federal Trade Commission (FTC) to determine whether Google’s privacy breach broke the law. Galaxy Internet Services, an ISP based in Newton, Massachusetts, has brought suit. And Connecticut Attorney General Richard Blumenthal is heading a multi-state investigation.

In June, Representative John Conyers of Michigan requested that Google CEO Eric Schmidt enlighten him as to just how those cars came to intercept that Wi-Fi info. In his letter, Conyers got out the virtual police tape, asking that Google “retain the data collected by its Street View cars, as well as any records related to the collection of such data, until such time as review of this matter is complete.”

It was about this time that Conyers sent a letter to Mark Zuckerberg, Facebook’s twerpy bazillionaire of a CEO, inquiring whether the site shared user data “without the knowledge of the account holders.”

But however much kerfuffle there was about Facebook’s Orwellian Beacon program or its labyrinthine privacy settings , no matter how sinister David Fincher’s movie The Social Network makes Zuckerberg’s enterprise seem, when it comes to privacy, Facebook is probably the least of your problems.

Sure, it’s bad. “The interplay between the multiple options is so complex” on Facebook, says Polonetsky. “Your location. What apps you use. Your friends’ apps. Different segments of your profile. Your contact information. It’s this incredibly complicated maze. Even I gotta sit sometimes and think before I answer a question.”

But too few people realize that this stuff is everywhere these days.

“You go to a site and there’s a lot going on!” says Polonetsky. “A lot of different data being collected. Regular cookies. Flash cookies. Behavioral retargeting. Analytics. There’s data being sent to an ad exchange. There might be an affiliate program because they’re selling ads not on a click basis, but on a commission basis. There’s 20 or 30 places your browser may go when you visit a site, and then [there’s] all the different things you have to do if you want to turn that off. Your cookie settings. Your Adobe Flash player settings. You could spend hours just disabling the data transmission that happens.”
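You can get a rough sense of this yourself by listing the third-party hosts a single page asks your browser to contact. The crude sketch below, using only Python's standard library, only catches src attributes in the page's static HTML (scripts typically add many more requests at run time), and the starting URL is a placeholder:

    # Rough count of third-party hosts referenced by a page's static HTML.
    import re
    import urllib.request
    from urllib.parse import urlparse

    page_url = "http://www.example.com/"    # substitute any news or retail site
    html = urllib.request.urlopen(page_url).read().decode("utf-8", "ignore")

    first_party = urlparse(page_url).hostname
    hosts = set()
    for match in re.finditer(r'src=["\'](https?://[^"\']+)', html, re.I):
        host = urlparse(match.group(1)).hostname
        if host and host != first_party:
            hosts.add(host)

    print(len(hosts), "third-party hosts:", sorted(hosts))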

Anonymous Rex

The omniscient eye of corporate-abetted Big Brother may get the blockbuster treatment in the Post. But oftentimes privacy intrusions grow much closer to home—and are much more damaging.

“We used to think of the threat as ‘us against them,’” says Tien. “Now, because of the Internet and ubiquitous portable devices, there’s a much more lateral threat as well.” After all, “kids can ruin each other’s privacy without really even trying. They think they’re just in a Facebook squabble, but there are a lot of other people who have access to that data. So there’s both a Big Brother problem and a Little Brother problem. And that Little Brother problem has gotten worse.”

Who is Little Brother? He’s all those people you know, sort-of-know, or wish you didn’t know: creepy, barely remembered high-school classmates; Machiavellian coworkers; your angry ex. But mostly you really don’t know who Little Brother is, because Little Brother is anonymous. He or she is part of a sea of nameless faces: the anonymity-emboldened tough guy on a message board, or an auteur posting a sadistic video on YouTube, or an obsessive Twitter-stalker, or, sometimes, a malicious suburban mom hiding behind a hoax identity while taunting a teenager to suicide.

Inexorably, we seem to be drawn to a battle between two conflicting notions—and the winner of that battle may determine what kind of Internet we end up with. The voices advocating for increased privacy protections argue that our actions online should remain invisible—unless we give our express consent to be watched and tracked. But some of the most powerful voices on the Web are beginning to suggest that you should be held responsible for your online actions: that your anonymity on the Web is dangerous.

Speaking at the Techonomy conference in Lake Tahoe a couple of months ago, Google’s Schmidt opined that the rise of user-driven technology—and the dangers posed by those who would misuse it—required a new approach. “The only way to manage this is true transparency and no anonymity,” he said. “In a world of asynchronous threats, it is too dangerous for there not to be some way to identify you. We need a [verified] name service for people. Governments will demand it.”

And Schmidt is right. The same governments that are investigating Google’s breaches of their citizens’ privacy are also demanding that their citizens be accountable for their online identities in ways that must make the world’s totalitarian regimes smile. That’s the paradox: Any measure that would allow Google to track the sources of a Chinese hacker attack would also enable the Chinese government to track its own dissidents.

Even on our shores, a look at recent government action on privacy shows how confused the issue has become.

On the one hand, US lawmakers and the nation’s top consumer-protection agency are so spooked by online marketing practices that they are threatening legislation if the industry doesn’t begin to self-regulate. By doing so, they’re affirming the public’s right to retain its anonymity.

Earlier this year, the FTC began floating the idea of a no-track list, which would prevent advertisers from gathering information from a user’s online behavior much as the federal Do Not Call list restricts the practices of telemarketers. The ability of marketers to track you has shifted so quickly, and the information they can glean is so frighteningly accurate, that in July, Congress hauled a who’s-who of the Interwebs, including representatives from Google, Facebook, Apple, and AT&T, in front of the Senate Commerce Committee, threatening to push bills through both the House and the Senate if the industry didn’t start explaining to consumers what information is being collected and how it’s being used.

After the Senate hearings, Massachusetts Senator John Kerry announced that he would draft legislation (to complement bills already introduced in the House) that would give people more control over how their information is collected and distributed online.

“Take the single example of a cancer survivor who uses a social network to connect with other cancer survivors and share her story,” said Kerry in a statement. “That story is not meant for her employer or everyone she e-mails, or marketers of medical products promising herbal cures. Misapplied and poorly distributed, this information could lead to a lost job opportunity or higher insurance rates. Even distributed without malice this information could pigeonhole her identity as a cancer survivor, which may not be how she wants to face the world.”

Deciding who gets that information “should be her right,” Kerry continues. “Whether or not she acts to protect its distribution, private firms should start with the premise that they should treat her and her information with respect. The fact that no law limits the collection of this information or its distribution is a problem that threatens an individual’s sense of self.”

That very month, however, the Obama administration tried to make it easier for the FBI to obtain records of “online transactions,” including a list of who you’ve e-mailed and what Web sites you’ve visited, without a warrant. Around the same time, the Electronic Frontier Foundation reported that the White House has circulated a draft of its plan for securing identity online, which calls for individuals to “voluntarily request a smart identity card from her home state” to “authenticate herself for a variety of online services” including “securely accessing her personal laptop computer, anonymously posting blog entries, and logging onto Internet e-mail services using a pseudonym.”

The proposal, called the National Strategy for Trusted Identities in Cyberspace, sounded alarming to some critics.

“If I’m posting on a blog, reading, browsing, who needs to know who I am? Why is it so important that my identity be verified and authenticated?” says Tien. “We have a tendency to say, ‘Well, gee, there are all these problems so we need to know people’s identity.’ But identity isn’t security. You don’t automatically know what to do about someone just because you know who they are.”

Contested concepts

At any rate, even a raft of new laws and legal precedents can’t be the only answer. Beyond legal remedies, there has to be a cultural component.

“Much of our sense of privacy in the world isn’t guaranteed by law,” says Tien. “It’s guaranteed by people acting within traditional bounds.” Unfortunately, “technology screws this up. It accelerates social change in ways where people aren’t sure what the norms are.”

Justin Silverman, a law student who blogs for Suffolk Media Law and the Citizen Media Law Project, says he suspects that ultimately people’s sensibilities will adapt as folks get “more comfortable with information online” and a lot of these issues will “solve themselves.” In the meantime, he says, “the market will take care of some things.”

Indeed, even as they’ve helped create some of these issues, technology and the private sector have huge roles to play. People are starting to demand it. The Wall Street Journal reported recently that “companies with ideas on how to protect personal information”—firms such as Abine and TRUSTe—“are a new favorite of venture capitalists.”

A lot of Internet companies, according to Polonetsky, are simply saying, “I’ve had enough of this. I have some pretty big plans to do some pretty good things with technology, and I don’t want to be called a bad guy. I’m ready to have the practices that seem to be of grave concern taken off the table so I can roll things out.”

Even as the technology evolves, and legislators and courts and corporations slowly smarten up, and society gets more Web-savvy, some of this stuff will always be with us.

Tien mentions a phrase he likes from philosophy: essentially contested concept. That’s an idea that pretty much everybody recognizes and agrees exists in theory—“justice,” say—but on which there’s little concurrence about just what it is and how to achieve it.

“Privacy is essentially contested,” says Tien. “We want to protect our privacy, but there are grand incentives to know more about us. Combine this problem of competing incentives with the problem of how hard a problem it is to solve and how every era changes the technology: Even if the problem gets solved for the telephone it didn’t get solved for e-mail and it didn’t get solved for social networking. It’s always going to be work.”—Mike Miliard


December 1, 2010

Bypass Heavy-Handed Web Filters with Your Own Proxy Server


If your workplace or school's extra-restrictive internet filter has you pulling your hair out during the occasional browsing break, there's hope! Here's a quick look at how to get around heavy-handed browser restrictions with the open-source PHProxy.

Back in January we pointed you toward PHProxy, along with some instructions for setting it up on a web server; fact is, most people don't actually have access to a web server to run something like PHProxy. The solution: Install a local web server on your home computer, then run PHProxy from there. Setting one up is actually a lot easier than you may think.

A quick crash course on proxy servers: Let's say your dastardly workplace blocks you from reading Lifehacker. Many web filters block web sites based on URLs, so if Lifehacker were blocked, the filter would recognize the URL http://lifehacker.com and automatically block any connection. A proxy acts as a go-between for your browser and the web site you want to access, and as far as the web filter can tell, the proxy-employing user isn't visiting Lifehacker—she's visiting whatever the URL is for the proxy. And since we're setting PHProxy on your home computer, chances are slim that the web filter will block your home IP address (or URL, which we'll talk about more below).
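PHProxy itself is a PHP script, but the core idea fits in a few lines of any language: the filter only ever sees a request to your server, and your server fetches the blocked page on your behalf. Here is a bare-bones illustration in Python; it shows the concept, not PHProxy's actual code, and unlike PHProxy it doesn't rewrite the links inside the pages it returns.

    # Concept demo of a "fetch-through" proxy, not PHProxy's real implementation.
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs

    class FetchThrough(BaseHTTPRequestHandler):
        def do_GET(self):
            # Visit http://localhost:8000/?url=http://lifehacker.com/
            target = parse_qs(urlparse(self.path).query).get("url", [""])[0]
            if not target:
                self.send_error(400, "missing ?url= parameter")
                return
            body = urllib.request.urlopen(target).read()
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(body)  # the filter only saw *your* server's address

    HTTPServer(("", 8000), FetchThrough).serve_forever()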

When you're done here, you should be able to access restricted sites from anywhere by routing your requests through your home computer. First I'll explain how to install a local web server on your computer (for Windows and then Mac users), then explain how to install and use PHProxy from there, and finally I'll walk you through how to access your newly minted local proxy server easily from any other computer.

Download and Unzip PHProxy

Regardless of your OS of choice, the first step is easy: Head over to SourceForge and download PHProxy, then unzip your download to a folder and name that folder phproxy. Put it in a safe place, and we'll get back to it later.

Install a Local Web Server on Your Windows PC

In order to run PHProxy on your home computer, you'll need to install a local web server. You've got lots of options for doing this, but probably none easier than just downloading and installing WAMP—which stands for Windows (your operating system), Apache (the web server), MySQL (a database, which PHProxy won't actually use), and PHP (the popular programming language, which PHProxy is named for and written in).

Once you've downloaded WAMP, go ahead and run through the installer. It's a pretty basic install, and when you're done, launch the WAMP system tray application. After you do, you'll notice a new icon in your system tray (it's the one that looks like a speedometer). WAMP's running, but it's still not turned on. To put WAMP online, left-click the system tray icon and click Put Online.

Now, to verify that everything's working, left-click the WAMP icon in the system tray again and click Localhost—or just point your browser to http://localhost/. If all's well, your browser should load a page that looks like the one below.

Good work—you now officially have a web server up and running on your PC. You can skip the Mac section and head straight to the section on installing PHProxy to your server.

Install a Local Web Server on Your Mac

Above, Windows users installed a web server bundle called WAMP—in which the 'W' stood for Windows. Mac users, appropriately, have MAMP: Mac, Apache (the web server), MySQL (a database that you won't actually be using), and PHP (a popular web programming language after which PHProxy is named). So go download MAMP (it's a hefty 156MB download) and install it to your Applications folder (make sure you install the free version and not the Pro version).

Now it's time to fire up MAMP. Open the MAMP folder you dragged to your Applications folder, then double-click MAMP.app to launch it. On this first run, click the Preferences button in MAMP, click Ports, and then click the Set to default Apache and MySQL ports button. Hit OK (enter your password to confirm), then point your browser to http://localhost/ (or http://localhost/MAMP/ if you want to see the MAMP landing page). If everything's working as it should you should see a page called "Index of /" at localhost, or the page below if you go to the MAMP URL.

Good work, you're officially running a local web server on your Mac. Now to PHProxy.

Install PHProxy on Your Server

Now we want to install PHProxy on your server. I'm using "install" pretty loosely here; assuming you've already downloaded and unzipped PHProxy to a folder named phproxy, all you really need to do is copy that folder to the root directory of your local web server.

To find your server's root directory on Windows, just click the WAMP system tray icon and click www directory (which, on my Windows 7 installation, is located at C:\wamp\www\). Inside this folder you should see a file called index.php—that's the page that loaded when you pointed your browser to http://localhost/ above. Now simply take the phproxy folder you unzipped PHProxy to above and drag it directly inside the www folder.

Mac users, the MAMP root directory is located inside the MAMP folder at /Applications/MAMP/htdocs/. Likewise, just open that folder and copy the phproxy folder to it.

And… there you have it: you've officially installed PHProxy. To make sure it worked, point your browser to http://localhost/phproxy/. You should see the page below.


To test it further, all you have to do is type or paste the URL you want to visit into the web address input box and hit Enter. Below you can see me visiting Lifehacker through my PHProxy installation.


Depending on what your web filter is blocking, you can tweak the way PHProxy works—you can show or block images, allow or reject cookies and scripts, encode the URL you're visiting into a string that's complete gibberish, and more. Handy, huh?
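That last trick, encoding the target URL into gibberish, matters because many filters also scan the URLs you request for blocked keywords. PHProxy's own encoding scheme isn't shown here, but Base64 illustrates the general idea: the destination survives the round trip while a naive keyword match sees nothing familiar.

    # Illustration of URL obfuscation; PHProxy's actual encoding may differ.
    import base64

    target = "http://lifehacker.com/"
    encoded = base64.urlsafe_b64encode(target.encode()).decode()
    print(encoded)  # aHR0cDovL2xpZmVoYWNrZXIuY29tLw==
    # The proxied request might then look like (hypothetical form):
    #   http://yourhome.example/phproxy/index.php?q=aHR0cDovL2xpZmVoYWNrZXIuY29tLw==
    print(base64.urlsafe_b64decode(encoded).decode())  # proxy recovers the target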

Set Up Port Forwarding and a Friendly URL

At this point PHProxy should be working fine from your home computer, which is all well and good, but now we need to make it easy for you to access your local PHProxy installation from outside your home. To do so, we're going to have to set up port forwarding, then optionally we'll give your PHProxy server a friendly URL.

Set Up Port Forwarding on Your Router: When you try to communicate with your home computer from outside your local network, the request first has to go through your router—which then identifies which computer the request is intended for and sends it on its merry way. When you're running a web server on your home computer, other computers looking to communicate with that server will try communicating with it on port 80 (you don't really need to know what any of that means; web servers generally communicate on port 80, and that's what browsers try to access by default). So when your router receives a request on port 80, you need to tell it that those requests should be forwarded to your local PHProxy server.

Rather than detail the entire process, I'll point you toward our previous guide to accessing a home server behind a router/firewall. All routers are a little different, and that's a general guide, so if you want more specifics, try visiting PortForward.com, selecting your specific router model, and finding the instructions for setting up port forwarding with Apache (the web server).

If you've successfully set up port forwarding, you should now be able to access your home server by visiting your network's external IP address (this is the single address that identifies your home to all the other computers on the internet). Quickly point your browser to What Is My IP and copy the series of numbers following "Your IP Address Is:", paste that into your browser's address box, and hit Enter. If everything went according to plan above, your browser should now load up your local server. Add /phproxy/ to the end of your IP address and you should see the PHProxy homepage. Smooth.
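If you'd rather test it from a script than a browser, a few lines of Python will look up your external IP (here via the public echo service api.ipify.org; any similar service works) and try the proxy page through it. Keep in mind that some routers won't loop a request from inside your own network back to your external address, so the most reliable test is from an outside connection.

    # Check that the proxy answers on your external IP (port 80 forwarded).
    import urllib.request

    external_ip = urllib.request.urlopen("http://api.ipify.org").read().decode().strip()
    print("External IP:", external_ip)

    try:
        resp = urllib.request.urlopen(f"http://{external_ip}/phproxy/", timeout=10)
        print("PHProxy reachable, HTTP status:", resp.status)
    except Exception as exc:
        print("Not reachable yet; check the port 80 forward:", exc)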

Now that your web server is accessible to the outside world, you don't want to let just anyone access it, so at this point it's a good idea to password protect your server. We've already been down this road before, too, so rather than explain it all here, head to step three in our guide to setting up a personal home web server. (For a little extra help generating the necessary password files, I also like web site Htaccess Tools.)
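If you'd rather not use an online generator for those password files, a few lines of Python will produce an .htpasswd entry in Apache's {SHA} format; the username and password below are placeholders, and the .htaccess setup itself is covered in the guide linked above.

    # Generate an .htpasswd line in Apache's {SHA} format.
    import base64
    import hashlib

    username = "proxyuser"                    # placeholder
    password = "choose-something-strong"      # placeholder

    digest = base64.b64encode(hashlib.sha1(password.encode()).digest()).decode()
    print(f"{username}:{{SHA}}{digest}")
    # Paste the printed line into your .htpasswd file.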

Set Up a Friendly URL: You could stop at that point, but that series of numbers that makes up your IP address isn't all that friendly, and in fact, if your ISP assigns you a dynamic IP, it could change regularly. Luckily you can assign a friendly domain name to your home proxy server for free using DynDNS.com, a process that we've detailed in the past.

By assigning a domain name to your home server, you can create an easy-to-remember URL like mycrazyproxy.selfip.com, rather than typing in 76.189.XX.XXX every time you want to access your home server.

A Few PHProxy Pointers

PHProxy is an excellent tool, but you should also be aware of the concessions you're making when using it. For example, you should expect your browsing experience to slow down considerably when you're browsing through your home proxy. Remember, your requests are being routed through your home proxy server every step of the way, which puts a rather slow middleman (your home network) between you and the web sites you want to access.

Also, while PHProxy works like a charm for most plain old browsing, it can be tricky when it comes time to log into some web sites. For example, I could log into Twitter without any issues, and I was able to get to the static HTML version of my Gmail account and Facebook, but—though I was able to log in—I had trouble viewing either until I told PHProxy to remove scripts. In fact, I found that removing scripts was a good step whenever I had trouble with sites I wanted to log into.

Last, a Note on Responsibility

Setting up your own proxy is a fun project, but a few things to keep in mind if you're actually planning to use it in your workplace:

  • Even if you're using a proxy, your employer can still see everything you're doing on the internet (and your computer), whether they're watching the data as it comes to your computer or they're literally watching your screen.
  • Some employers actually forbid the use of proxies in their employee agreements, so if you get caught, you could face some very serious consequences (like, you know, getting fired), so use at your own risk.

Got your own tried and true method for accessing blocked web sites? Have a web filter that just won't be defeated? Prefer not to mess with the establishment? Share your thoughts and experience in the comments.
