Among the latest troubles in tech is information privacy, which can be breached by a malware outbreak or by deliberate intent, often of a commercial nature. Severe damage can be inflicted by either type of data breach, yet the latter form is often the more devastating, for it comprises a breach of the trust of the users involved. The latest such case involves Facebook and Cambridge Analytica.

In 2014, a Facebook app, under the pretence of a personality quiz for academic research, acquired consent for data collection from 270,000 Facebook users by recruiting them on Amazon’s Mechanical Turk, a marketplace for on-demand labour. By participating in the personality research, these users also, unbeknownst to themselves, consented to sharing their Facebook friends’ information along with their own. This was possible because, and this is perhaps the most scandalous part of the breach, Facebook’s API at the time allowed third-party app developers, in this case Aleksandr Kogan, to collect information not only from the 270,000 consenting users but also from their Facebook friends, who did not and could not have consented, and who were unaware that any data collection was taking place.

Facebook’s policy at the time forbade the collected information, whether of the consenting users or of their Facebook friends, from being marketed or sold. That, however, did not prevent the information, which amounted to that of 50 million Facebook users in total, from falling into the hands of Cambridge Analytica, who in turn leveraged it to formulate targeted political campaigns to influence voter opinion, under the commission of interested and well-endowed political parties. Both the 2015 campaign of United States politician Ted Cruz and the 2016 Brexit referendum were reported to have employed the services of Cambridge Analytica.

The Guardian first reported on the unauthorised and illicit use of private information in political campaigns in December 2015. In March 2018, The Guardian, along with The New York Times, brought the full scandal to light, with further details about the Facebook data breach from former Cambridge Analytica employee turned whistleblower Christopher Wylie, and the rest of the world was made aware of a conspiracy that had been in the making for years.

There is certainly much to be learned from this incident by everyone involved. Facebook had long since patched the loophole in its API and has, in light of the news, announced a review of its platform policy as well as heightened measures to ensure that developers are held accountable to, and in full compliance with, that policy. Its threat of legal action against The Guardian to prevent the news from surfacing is hopefully also among the negative examples Facebook has learnt from in its own wrongdoing.

As for us, the people: there is hardly any argument against our well-researched tendencies to search for, interpret and focus on information that confirms our beliefs, to rely on information that comes easily to mind, and to be, unbeknownst to ourselves, driven by emotion rather than reason. The political motivation and exploitation behind this data breach hopefully serve as a reminder of how irrational we are, and how little we are aware of it.

While there is no guarantee that a product or service whose business model relies solely on its paying users has its interests better aligned with those of its users, the old saying nonetheless holds: if you’re not paying for the product, you most definitely become, in one way or another, the product.