
The Dangers of Data Mining

Even if you don’t know the details, you’ve likely heard of a company called Cambridge Analytica, know that it is in trouble, and know that the trouble has something to do with Facebook. Cambridge Analytica is a political consulting firm with offices in London, New York, and Washington, D.C. It focuses on data mining and data analysis to provide brand strategy for its clients.

This March it was revealed that since 2014, some of Cambridge Analytica’s data came from an app called thisisyourdigitallife that the company paid to have developed. This app administered a personality survey, gathered profile information, and tracked “Like” activity from 270,000 Facebook users who, for a small payment, agreed to participate.

Close examination of the app’s user agreement would have revealed a stipulation that, privacy settings permitting, thisisyourdigitallife would also collect the same data from a much broader (and unnotified) pool: the upwards of 50 million Facebook friends of the original user group. At the time, Facebook’s own policy, since amended, endorsed this data collection by proxy.

Further buried in the fine print: exactly who would be receiving the information. The app presented itself as collecting data for research purposes, but, according to the app’s creator, the user agreement contained a clause allowing the data to be used for commercial purposes. Such use would have violated Facebook policy. Unfortunately, Facebook was unprepared to enforce this policy (among others). It proved unable to follow its data to its final destinations, and it was out of its depth when confronted with the scale of the task, given the amount of data and the complexity of the business relationships involved. [For other examples of Facebook’s willingness to launch features without pausing to anticipate their consequences or to prepare for bad actors, see Josh Constine’s article Facebook and the Endless String of Worst-Case Scenarios.]

Also unsettling: no one knows exactly how the data was used. There are indications that Cambridge Analytica created “psychographic” profiles that political clients used to deliver targeted messages to key voters during campaigns. Investigations into this question have been opened in the United States, India, and Brazil (notably, the three countries with the largest numbers of Facebook users[1]). Cambridge Analytica has denied this, but it has not offered an explanation of its own regarding its clients and their methods.

When a company such as Facebook traffics in data, the potential for misuse extends far beyond the company collecting the information. Data has value because it can be sold, and it becomes nearly impossible to control where such transactions end up. Marketers selling consumer goods sit at the benign end of the spectrum; it is not far-fetched to imagine corrupt regimes interested in squelching political opposition, or targeted misinformation campaigns, at the other, more destructive end.

Go2s does not reject advertising as part of its business model; it rejects the idea that the customer herself is the product. When businesses ask you to pay for services with personal information, at best they do so to advertise to you more efficiently. At worst, your information may one day be used against you. Either way, the customer gives away the power of her information, knowledge, and experience rather than keeping it for herself.

Go2s invites the individual to drive her community’s economic engines by centering its business model on the user sharing her expertise directly with her trusted network. Go2s does not make your data available to externally developed apps or to strangers, for advertising or any other purpose. We consider ourselves stewards of your personal data, and we allow only your trusted Resources to send you offers, which you can choose to share with your Go2s. You decide who is allowed into your network and how far a resource’s influence can reach within it. You decide what deserves promotion. This is the power of advocacy, and it has explosive potential when multiplied across the trusted connections forming in Go2s.

[1] According to Statista, a Hamburg, Germany-based data and statistical company.