Facebook users were not clicking “like” very often this weekend as news broke that the social network had partnered with Cornell and the University of California to conduct psychological experiments on users. The study used the network’s filters to show more positive posts to some users, while others saw posts with more negative connotations. The results were not surprising: Users who saw negative posts posted more negative content of their own, while users who saw positive posts tended to be more optimistic in their own content.

The outrage was swift. The lead researcher apologized for the Big Brother-like qualities of the research. Facebook has yet to make an official statement, though it told The Atlantic it has a “strong internal process” for allowing research.

While Facebook may have followed the letter of the law in allowing researchers to toy with users’ news feeds, the news is still damaging to Facebook’s brand. The average user is comfortable with Facebook’s algorithm being used to optimize content in order to make the world’s largest social network more engaging. But when content is filtered in order to turn users into unwitting research subjects, the ethics change significantly.

Facebook is all about seeing what resonates with users. As social media professionals, we advise our clients to experiment with copy, pictures, and targeting of paid content to maximize their reach within Facebook’s algorithm. Now that we know Facebook has given researchers control of the algorithm, it raises the question: For the right price, would Facebook tweak its algorithm for a commercial enterprise? Or a political candidate?

Facebook needs to issue a statement explaining how it gave researchers access, and to make clear its policies on research and on who can alter its algorithm going forward.

Are you upset by the study? How should Facebook counter the negative press? Let us know in the comments.