How the Cambridge Analytica scandal could harm data research efforts

The scandal that erupted around the alleged collection of 50 million Facebook profiles by Cambridge Analytica, from data provided by a British academic and his company, is a worrying development for legitimate researchers.

The political data analysis firm Cambridge Analytica – affiliated with Strategic Communication Laboratories (SCL) – reportedly used Facebook data supplied by Aleksandr Kogan, a professor in the department of psychology at the University of Cambridge.

Kogan, through his company Global Science Research (GSR) – distinct from his academic work – gathered data via a personality test application called "thisisyourdigitallife". About 270,000 US-based Facebook users voluntarily took the test in 2014. But the app also collected data on the Facebook friends of those participants, without their consent.

This was possible because of Facebook's rules at the time, which allowed third-party applications to collect data on a Facebook user's friends. The company headed by Mark Zuckerberg has since changed its policy to prevent developers from gaining such access.

The whistleblower Christopher Wylie, who previously worked as a contractor at Cambridge Analytica, told the Guardian that the company used the data to target US voters ahead of President Donald Trump's victory in 2016. He claimed Cambridge Analytica was a "full-service propaganda machine".

Cambridge Analytica has denied any wrongdoing, stating that the commercial tactics it used are common practice among other companies. For his part, Kogan insists that what he did was at all times consistent with the law – and, according to CNN, says he would be happy to testify before the US Congress and talk to the FBI about the work he did for the company.

Facebook stated on March 18 that it had suspended SCL, alleging that Kogan had "lied and violated our platform policies by passing data from an app that was using Facebook login to SCL/Cambridge Analytica". Facebook states in part three of its platform policy that developers are not allowed to "transfer the data you have received (including anonymous, aggregated or derived data) to an ad network, a data broker or other advertising or monetization services".

In a statement to Cambridge News, the University of Cambridge said:

We know that Dr. Kogan created his own company, Global Science Research (GSR), of which SCL/Cambridge Analytica was a client. It is not uncommon for Cambridge academics to have commercial interests, but they must satisfy the university that these are held in a personal capacity and that there is no conflict of interest.

We understand that the thisisyourdigitallife application was created by GSR. Based on Dr. Kogan's assurances and the available evidence, we have no reason to believe that he used university data or facilities for his work with GSR, and therefore no reason to believe that university data or facilities formed the basis of GSR's subsequent work with any other party.

One day after the Cambridge Analytica scandal broke, Facebook's shares tumbled on Wall Street amid the privacy backlash. But could the incident also affect legitimate academic research?


Social media data is a rich source of information for many areas of research in psychology, technology, business and the humanities. Some recent examples include using Facebook to predict riots, comparing Facebook use with body-image concerns among teenagers, and investigating whether Facebook use can lower stress levels, with research suggesting that it can both enhance and undermine psychosocial constructs related to well-being.

It is fair to assume that researchers and their employers value research integrity. But cases where that trust is betrayed by an academic – even if the data used for academic research purposes was not itself caught in the crossfire – will damage participants' willingness to keep trusting researchers. It also has implications for research governance and for companies' willingness to share data with researchers in the first place.

Facebook CEO Mark Zuckerberg has not yet commented on the Cambridge Analytica data collection scandal. Shutterstock

Universities, research organizations, and funders govern the integrity of research with clear and rigorous ethics procedures designed to protect participants in studies, for example where social media data is used. Collecting data without users' permission is considered unethical under commonly accepted research standards.

The fallout from the Cambridge Analytica controversy is potentially huge for researchers who rely on social networks for their studies, where data is regularly shared with them for research purposes. Technology companies may become more reluctant to share data with researchers. Facebook is already extremely protective of its data – the concern is that it could become doubly difficult for researchers to legitimately access this information in light of what happened with Cambridge Analytica.

Analysis of the data

Clearly, it is not only researchers who use profile data to better understand people's behaviors. Marketing organizations have been profiling consumers for decades – if they know their customers, they understand the triggers that drive the purchase of their products, allowing them to tailor marketing messages to improve sales. This has become easier with digital marketing – people are constantly being tracked online, their activities are analyzed with data analytics tools, and personalized recommendations are made. These methods are at the heart of the commercial strategies of technology giants such as Amazon and Netflix.

Information from online behavior can be used to predict people's mood, emotions and personality. My own research on intelligent tutoring systems uses learners' interactions with software to profile personality type so that the tutoring can automatically adapt to their preferred style. Machine learning techniques can combine psychological theory with newly discovered patterns – such as Facebook "likes" – to profile users.
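To illustrate the kind of "likes"-based profiling described above – not Kogan's or Cambridge Analytica's actual method – here is a minimal sketch that fits a logistic-regression model to predict a binary personality trait (say, extravert or not) from which pages a user has liked. All data here is synthetic and every number is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: rows are users, columns are pages they may have "liked".
n_users, n_pages = 200, 30
likes = rng.integers(0, 2, size=(n_users, n_pages)).astype(float)

# Assume (purely for illustration) a hidden linear rule linking a few
# pages to the trait, and sample noisy yes/no labels from it.
true_w = np.zeros(n_pages)
true_w[:5] = [2.0, -1.5, 1.0, 1.5, -2.0]
prob = 1.0 / (1.0 + np.exp(-(likes @ true_w)))
trait = (rng.random(n_users) < prob).astype(float)  # 1 = "extravert"

# Fit logistic regression by plain gradient descent on the log-loss.
w = np.zeros(n_pages)
learning_rate = 0.1
for _ in range(500):
    pred = 1.0 / (1.0 + np.exp(-(likes @ w)))      # predicted probabilities
    grad = likes.T @ (pred - trait) / n_users       # log-loss gradient
    w -= learning_rate * grad

# Training accuracy of the learned profile model.
accuracy = float(np.mean(((likes @ w) > 0) == trait))
```

The point of the sketch is how little it takes: a binary matrix of likes and a standard classifier are enough to recover a statistical profile of a trait, which is why access to friends' data at scale was so valuable.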

Eli Pariser, CEO of the viral content site Upworthy, has been criticizing personalization tools since 2011. He warned of the dangers of information filtering, and believes that the use of algorithms – profiling people to show them information tailored to their personal tastes – is bad for democracy.

Although these fears seem to be corroborated by some of the allegations against Cambridge Analytica, it should be noted that there is no evidence that US votes were swung in Trump's favor by Cambridge Analytica's psychometric tools.

However, given his academic status, Kogan's apparent decision to transfer Facebook data for commercial purposes, in violation of the social network's policies, could have explosive consequences – particularly because researchers may find it harder to get Facebook, and its users, to agree to hand over data for research purposes only.

This article was originally published on The Conversation. Read the original article.

