After The Freak-Out Over Facebook's Emotion Manipulation Study, What Happens Now?

The dust is finally starting to settle around the revelation that Facebook manipulated users' emotions for science. So what now?

Legally, I don't think much will come of this beyond Facebook's government liaisons working longer, harder hours for a while. On the other side of the pond, Facebook has to provide an in-depth explanation of procedures for all of its research to its local privacy regulator. Here in the U.S., at least one privacy group filed a formal complaint with the Federal Trade Commission saying that what Facebook did was "deceptive" and violated the agency's existing 20-year consent decree with the site for previous privacy mistakes (which would mean monetary penalties). But I think that's a dead end because Facebook didn't start its probation with the FTC until August 2012, seven months after the emotion manipulation study happened. So the FTC wouldn't be able to hit the company with fines; it would just be able to get a second, redundant consent decree. That's if the FTC even thought it was harmful to consumers to have their Facebook stream more negative than usual.

"It’s clear that people were upset by this study and we take responsibility for it. We want to do better in the future and are improving our process based on this feedback," says a Facebook spokesperson. "The study was done with appropriate protections for people’s information and we are happy to answer any questions regulators may have.”

Class-action lawyers are surely sniffing around the case, but it'd be hard to prove that a Facebook-using client was one of the 100,000+ whose News Feeds turned blue for a week. Facebook says it designed the study so that the test subjects stayed anonymous. And it'll be hard to argue that there was some kind of financial harm from a week of attempted sad-making.

Instead, the biggest result from all this is a long-needed discussion about companies running experiments on their users. Facebook has published quite a lot about its research, but make no mistake: it's not the only company taking A/B testing to the extreme. Other tech companies also publish -- such as Google when it combed through your search queries for illness trends or played with Google News links for 10,000 users to see if its predictions about what they wanted to see were right -- but much of the experimenting is known only to the data scientists working behind the scenes and under NDAs at those companies.

"A statistician who lives in Silicon Valley is a 'data scientist,'" says Paul Ohm, a law professor at the University of Colorado. "Lots of companies -- Bitly, OKCupid – have this weird conflation of data research based on what their users are doing and corporate profit-making. The ethics have been begging to be discussed. There's A/B testing to better deliver a product a customer wants, but it's another thing for companies to consider users to be a willing and ready pool of lab rats that they can prod however they want."

Dating site OkCupid's OkTrends blog was a voyeuristic and often salacious look into what works in a dating profile, what doesn't, and why people choose the people they choose. It was sadly discontinued after OkCupid was snatched up by IAC's Match.com for $50 million. Whether Match worried more about the competitive intelligence being leaked or about the "ethics" of the blog was never announced.

"We haven't looked at the harms or invasiveness that comes along with these Big Data dives," says Ohm. Ohm was one of the first people to criticize the universally acclaimed Google Flu trends project, pointing out that while there may be some benefits to knowing where flu is going to strike, it came with the harm of the search engine combing through some of our most sensitive searches: our medical conditions. "Google breached a wall of trust by dipping into its users’ private search data in ways that went beyond traditional and historically accepted uses for search query data, such as those uses relating to security, fraud detection, and search engine design," he wrote last year. "While Google’s users likely would have acquiesced had Google asked them to add 'help avoid pandemics' or 'save lives' to the list of accepted uses, they never had the chance for public conversation. Instead, the privacy debate was held — if at all — within the walls of Google alone."

Medical ethicist Alta Charo of the University of Wisconsin's Law School thinks the outcry over the Facebook study is overblown, but that it's worthy of discussion because of the ubiquity of Facebook and the sheer scale of experiments at companies that have a billion customers. "As a business practice, companies do research on consumer behavior all the time. Which colors work? Should a mailer start with a happy story about a candidate or an attack on a competitor? This is not novel and not limited to Facebook," she says. "I think there’s a larger question about how much individualized information we have around each person. As a matter of ethics, it’s not at all hard for a company to simply announce, 'We constantly test our business practices; let us know if you never want to be part of that.'"

Many observers say this research is important, that they don't want it to go away, but that they want a clear understanding of corporate research ethics, as companies are not subject to the same oversight that academic researchers are. It's a question of the technology and the extreme personalization that can happen in the Internet age, writes Zeynep Tufekci. "[I]t is clear that the powerful have increasingly more ways to engineer the public, and this is true for Facebook, this is true for presidential campaigns, this is true for other large actors: big corporations and governments."

Ryan Calo, an academic at the University of Washington, was writing about corporate lab rats even before it became a hot topic of conversation. "It's about information asymmetry," he says. "A company has all this information about the consumer, the ability to design every aspect of the interaction and an economic incentive to extract as much value as possible. And that makes consumers nervous."

Calo has a concrete ask. Facebook has said that it didn't have a formal review procedure in place for studies in January 2012, when the emotion manipulation one took place, but that it has one now. But we don't know anything specific about how that review works at Facebook or at other companies that know a lot about us and can run fascinating tests with that information. "I want all of these companies to treat consumer research like an ethical problem on par with all the other ethical problems they deal with all the time, like deciding whether to block a Chinese dissident in another country. They're used to making ethical choices, but not with their own data," says Calo. "I want Facebook and others to publish their criteria for greenlighting research."

An academic at Microsoft Research has a suggestion for what that might look like. It'd also be nice if, after reading those criteria, consumers could decide whether they want to take part in research.