By Shirley Huang
Data security has become a worldwide concern. As we grow more reliant on information technology in our daily lives, legal and ethical questions about data continue to multiply. The scandal involving Cambridge Analytica, a political consulting firm hired by the Trump campaign that obtained access to the private data of millions of Facebook users, underscored the significance and power of data, especially its ability to sway economic, social, and even political decisions. Even The Walt Disney Company was accused in 2017 of collecting data from its young consumers for marketing purposes, in alleged violation of the Children's Online Privacy Protection Act (COPPA) in the United States. Beyond these high-profile cases, many companies engage in similar practices, exploiting data without consumers' consent. Laws protecting data privacy are being implemented around the world, but challenges remain, calling for more detailed requirements.
The legal framework is one of the key instruments that binds corporate behavior and secures customers' rights over their personal data. In May 2018, the European Union's General Data Protection Regulation (GDPR) took effect, imposing much stricter rules on corporate data use. First, it requires companies to describe their data policies in "clear language," rather than in the long, tedious documents that deter users from reading the fine print. Second, it switches from consent by default to affirmative consent: in the past, users had to actively choose "no" to prohibit companies from using their data, and silence was treated as permission. Under the new regulation, companies may use data only if users actively click "yes" when prompted. This change will likely cut the amount of available data dramatically, since customers have little incentive to volunteer their private information. Together, these measures give customers a genuine choice about whether their data is shared.
According to the European Union (EU), violating companies face fines of up to 20 million euros or 4 percent of global annual turnover, whichever is higher. This disincentivizes companies from selling users' data to third parties without consent. Although the regulation does set boundaries, the problem is that those boundaries are vague and qualitative. It is difficult to judge whether the language in a corporate data policy is "clear" and concise. Word limits are not an appropriate quantitative measure, because different businesses require different levels of explanation for their use of data; yet without them, deciding whether the wording is concise enough becomes subjective. The line between "clear enough" for comprehension and a "tedious, long document" blurs depending on the institution and its evaluation method. Anthropology professor Alison Cool of the University of Colorado put it bluntly: "the problem is: no one understands GDPR." Data processing also routinely crosses national borders, which makes enforcement even more complex.
Companies tackled the change with various strategies: some invested more resources in online questionnaires for market research, while others turned to encryption to comply with GDPR's requirements. Yet encryption with a weak key, or a lost key, offers little real confidentiality, which is undesirable for consumers. At the same time, companies that properly implemented encryption could be exempted from fines, so encryption became a cheap tool for avoiding heavy penalties. While costs to companies will rise in the short run, whether from compliance reforms or from fines for breaches, in the long run these changes are likely to attract more customers by building trust.
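The weak-key problem is easy to demonstrate. The toy Python sketch below (purely illustrative, not any scheme GDPR prescribes or that these companies used) encrypts a record with a single-byte XOR key; because the key space is only 256 possibilities, an attacker can recover the plaintext by exhaustive search in a fraction of a second:

```python
# Illustrative sketch: a toy XOR cipher with a 1-byte key, showing why
# a weak key provides almost no confidentiality. Real systems should use
# a vetted cipher (e.g. AES-256) with a randomly generated key.

def xor_encrypt(plaintext: bytes, key: int) -> bytes:
    """'Encrypt' by XORing every byte with a single-byte key."""
    return bytes(b ^ key for b in plaintext)

def brute_force(ciphertext: bytes, crib: bytes):
    """Try all 256 possible keys; return the candidate plaintext that
    contains an expected fragment (a 'crib'), if any."""
    for key in range(256):
        candidate = xor_encrypt(ciphertext, key)  # XOR is its own inverse
        if crib in candidate:
            return candidate
    return None

secret = b"user email: alice@example.com"   # hypothetical record
ciphertext = xor_encrypt(secret, key=0x5A)

# An attacker without the key recovers the data almost instantly.
recovered = brute_force(ciphertext, crib=b"email")
print(recovered)  # prints the original record
```

The same logic cuts the other way for lost keys: with a strong cipher, losing the key makes the data unrecoverable even to its legitimate owner, which is why key management matters as much as the choice of algorithm.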
Rules governing corporate data use should be more measurable and precise than current law provides. At the same time, consumers should not be left out of reform. Wider education about data security would make consumers more aware of their online choices before they click, rather than after a scandal breaks. In the post-GDPR era, important details still need to be worked through to establish a lucid framework for protecting data.