Monday, April 16, 2012


(huffingtonpost)
"Imagine... that you knew which sites -- or what news stories -- people you trust found useful and which they disliked," David Kirkpatrick wrote in the June 11, 2007 issue of Fortune magazine. "This isn't fantasy. Facebook might make it possible, and soon. Yes, the social-networking site college kids spend so much time on -- the one you thought was just about hooking up -- could turn out to be more important than any of us thought."

Kirkpatrick, who was then Fortune's Senior Editor for Internet and Technology, went on to write the best-selling The Facebook Effect: The Inside Story of the Company That Is Connecting the World, the definitive book on the company. He was prescient. In a startlingly short period of time, Facebook did make it possible for you to find those trusted and useful news sites and stories -- along with much, much more.

Now, with Facebook facing growing scrutiny in advance of its IPO next month, which is expected to value the Internet giant at $100 billion, the question of trust looms even larger. True, the social network has made it easier than ever to find trusted friends and followers, who can now create, curate, aggregate and distribute news and information with unprecedented ease.

But is Facebook itself, the billion-dollar baby whose rapid growth has yet to be slowed by continuing controversy over the privacy of its more than 800 million users, worthy of our trust? Can we rely on its wunderkind CEO Mark Zuckerberg, who has repeatedly pronounced privacy outmoded and argued that we are living in a new era beyond it, to safeguard our interests? Despite our differing -- some would say competing -- concerns, should we regard Facebook and Zuckerberg as our friends?

After all, the online social network, which offers its tools, technologies, and services at no cost, profits primarily by using heretofore private information it has collected about you to target advertising. And Zuckerberg has repeatedly made sudden, sometimes ill-conceived and often poorly communicated policy changes that resulted in once-private personal information becoming instantly and publicly accessible. As a result, once-latent concerns over privacy, power and profit have bubbled up, leading both domestic and international regulatory agencies to scrutinize the company more closely.

In one case, consumer protection groups, including the Electronic Privacy Information Center (EPIC) and fourteen others, filed a 2009 complaint with the Federal Trade Commission (FTC) charging that Facebook's decisions to disclose previously restricted "personal information to the public" amounted to unfair and deceptive trade practices that "violate user expectations, diminish user privacy, and contradict Facebook's own representations." The complaint asked the FTC to investigate Facebook's trade practices and to order the company to "restore privacy settings that were previously available... and give users meaningful control over personal information."

Facebook settled in November 2011 by agreeing to refrain from making any further deceptive privacy claims, to obtain consumers' approval before changing the way it shares their data, and to undergo independent third-party auditing for 20 years. Shortly after the uproar subsided, however, renewed concerns over privacy and trust began to shake the brand again. This latest privacy blunder centered on Facebook's belated admission that it was still tracking the web pages its members visited, even after they had logged out of the Facebook site...
(more)