In June of this year, Facebook provoked a widespread public outcry after it became known that it had tried to manipulate the emotions of nearly 700,000 of its users as part of a social “experiment.”
At the time, researchers and commentators argued that the research had broken basic ethical guidelines that normally protect individuals from being unknowingly used in this type of research. The Electronic Privacy Information Center (EPIC) took a more aggressive step and filed a complaint with the Federal Trade Commission seeking sanctions against Facebook, including a requirement that Facebook make public the details of the algorithms behind its filtering of the News Feed.
Facebook has now responded to the criticisms by putting in place a series of measures, detailed by Facebook CTO Mike Schroepfer. Already, however, Facebook’s response has been greeted with a degree of skepticism.
Importantly, Facebook has not said that it will stop the research on manipulating its users, nor has it said that it will subject its experiments to external review. What it will do is educate its staff involved in research on “research practices” and create an internal panel to review projects that involve certain user groups, such as children, or that involve manipulating its customers’ emotions. In certain cases, Facebook researchers will be encouraged to use non-experimental approaches to obtain the same information.
Facebook will not, however, disclose what the guidelines given to staff actually are, nor will it subject those guidelines to external review. What is likely to happen is that Facebook will become more secretive about the research it conducts and is very unlikely to allow that research to be published.
Marc Rotenberg, Director of EPIC, the organisation behind the complaint filed with the Federal Trade Commission, has said that Facebook’s response does not go far enough. Speaking at an OECD conference in Tokyo about security and privacy issues around big data, Rotenberg reiterated the importance of making public the nature of algorithms that companies like Facebook use.
In particular, he pointed to algorithms that are used to filter information and manipulate the services provided to customers. The Commissioner of the Federal Trade Commission, Julie Brill, speaking at the same event, could not comment on the case being considered against Facebook but also independently talked about the focus of the FTC on consumer trust and the importance of “algorithm transparency.”
Commissioner Brill outlined the difficulty facing the FTC in regulating something that companies could argue is a key element of their intellectual property, even where the algorithms have a direct and possibly negative impact on consumers. She was not able to comment on whether, and when, the FTC would rule on the matter of Facebook’s experiments.
At the time of the reaction to Facebook’s experiments, supporters of this type of research argued that there was nothing wrong with the approach and that it happened all of the time. Dating site OkCupid claimed that it, too, had run experiments to manipulate its users, and that this was done in the interest of improving the service for its customers.
OkCupid founder Christian Rudder proudly stated that users should get used to this type of experiment from companies. He said “if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.”
There is a big difference, however, between a website doing what is called “A/B testing,” which is commonly used to see whether one design works better than another, and direct emotional manipulation or tricking of users.
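For readers unfamiliar with the technique, the sketch below shows what a routine A/B test looks like: users are deterministically split between two variants of a page and their conversion rates are compared. This is a hypothetical illustration only; the experiment name, traffic numbers and conversion rates are invented and do not reflect how Facebook, OkCupid or any other site actually runs its tests.

```python
import hashlib
import math
import random

def assign_variant(user_id: int, experiment: str = "signup-button-test") -> str:
    """Deterministically bucket a user into variant 'A' or 'B' for this experiment."""
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Simulated traffic: variant B is assumed (for the simulation) to convert slightly better.
random.seed(0)
stats = {"A": [0, 0], "B": [0, 0]}  # variant -> [conversions, visitors]
for user_id in range(20_000):
    variant = assign_variant(user_id)
    true_rate = 0.10 if variant == "A" else 0.11  # invented rates
    stats[variant][0] += random.random() < true_rate
    stats[variant][1] += 1

(ca, na), (cb, nb) = stats["A"], stats["B"]
print(f"A: {ca / na:.2%} of {na} users   B: {cb / nb:.2%} of {nb} users")
print(f"z-statistic: {two_proportion_z(ca, na, cb, nb):.2f}  (|z| > 1.96 ~ significant at 5%)")
```

The contrast the article draws is that a test like this compares how users respond to different versions of an interface, rather than deliberately altering the emotional content shown to users.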
At the OECD conference, Commissioner Brill talked more generally about what companies need to do to maintain consumer trust, noting how easy it is for companies to lose that trust. She said that there was a “fine line between valuable personalization and harmful discrimination” and that a “trusted relationship can’t exist if one side has no control and information is highly asymmetric.”
A question that went unanswered is how tolerant Facebook’s users are of the imbalance that exists in their relationship with the company. It is unrealistic to expect individuals who have come to depend on Facebook to interact with their family and friends to simply stop using the service on their own.
Only time will tell whether EPIC, and other organizations representing these consumers, are able to do what Facebook appears unwilling to do of its own accord, and redress the balance of control in favor of its users.
David Glance, Director of Innovation, Faculty of Arts, Director of Centre for Software Practice at University of Western Australia, does not work for, consult to, own shares in or receive funding from any company or organization that would benefit from this article, and has no relevant affiliations.
This article was originally published on The Conversation. Read the original article.