By Steve Coll

Facebook has made jarring mistakes as its leaders have learned what it means to run a profit-motivated political and public forum. In 2009, for example, the corporation exposed Iranian dissidents to danger by unilaterally changing its privacy settings in a way that allowed the Iranian authorities to see the identities of activists' online friends. The error was corrected quickly, but in general, Facebook has encouraged its users to accept greater and greater losses of privacy. Zuckerberg believes the world will be better off if it adopts "radical transparency," as the journalist David Kirkpatrick put it in his book, "The Facebook Effect."
Zuckerberg's business model requires the trust and loyalty of his users so that he can make money from their participation, yet he must simultaneously stretch that trust by driving the site to maximize profits, including by selling users' personal information. The I.P.O. last week will exacerbate this tension: Facebook's huge valuation now puts pressure on the company's strategists to increase its revenue-per-user. That means more ads, more data mining, and more creative thinking about new ways to commercialize the personal, cultural, political, and even revolutionary activity of users.
There is something vaguely dystopian about oppressed peoples in Syria or Iran seeking dignity and liberation inside a corporate sovereign that is, for its part, creating great wealth for its founders and asserting control over its users.
Facebook is hardly the only corporation managing these sorts of dilemmas—Google faces investigations into how it manages the customer data it collects, a subject on which it has sometimes been opaque, and it, too, has broken trust with users. Facebook points out that it has been responsive to revolts and protests from within. Zuckerberg proudly told Kirkpatrick that he revelled in the ways Facebook's users had forced him to become more democratic: "History tells us that systems are most fairly governed when there is an open and transparent dialogue between the people who make decisions and those who are affected by them. We believe history will one day show that this principle holds true for companies as well."
That is a laudable conception. Yet for now, at least, Facebook concedes to its users only when it judges that it is in the corporation's interest to do so; what user votes and consultations there may be are purely advisory. As MacKinnon observes, this system suggests the political control strategies of the Chinese Communist Party: periodic campaigns of state-managed openness and managed local democracy.