
When Emerging Technologies and Privacy Collide

Technology development is moving more quickly than ever. Each new breakthrough makes the next one easier. Just as cloud computing and GPUs opened up a new era of AI, more ubiquitous wireless communications made connected devices possible. But these emerging technologies bring privacy implications along with their social benefits.

AI Privacy

Most AI algorithms wouldn’t work without large amounts of data. Data scientists train them using extensive collections of whatever they’re trying to process, such as images, audio snippets, or text. The use and handling of this data raise privacy questions. Where did the data come from? Does it contain any personal information? Who gave consent for its use, and what kinds of processing did they consent to?

AI users often act first and ask forgiveness later, either intentionally or through ignorance. One such company was Clearview AI, an American firm that used scraping software to harvest billions of images from popular sites, including Twitter and Facebook, without the platforms’ or the subjects’ consent. It then used them to build a facial recognition database, which it made available to law enforcement organisations for a fee. Until the American Civil Liberties Union sued, it also sold access to private businesses and individuals.

Clearview AI’s scraping infringed upon the platforms’ access policies, prompting several cease and desist letters. According to regulators in three European countries and the UK’s Information Commissioner’s Office (ICO), it also violated regional privacy laws. They all ordered it to remove their citizens’ images from its database, but the company stores this data elsewhere and has dismissed the ICO’s penalties.

Privacy issues also arise in how and where the algorithms are used. UK law enforcement has used live facial recognition to identify individuals in public places, a practice researchers at the University of Cambridge condemned as an infringement of human rights. In some cases, private companies such as retail outlets have used facial recognition systems to recognise visitors without their consent.

AI’s ability to process large amounts of information and spit out an easily digestible result (say, a decision on a loan application) also throws data privacy into the spotlight. Reports of algorithmic bias are rife. This happens when some demographic groups are under-represented in source data or where data points are given inappropriate importance in the data model. If individuals don’t get the chance to consent to their data’s use in these models – or if they do not understand the implications – their human rights and those of others in their communities could suffer.
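To make that concrete, here is a minimal Python sketch – with entirely hypothetical loan decisions and group labels – of the kind of check that surfaces the problem: compare how each demographic group is represented in the data and how often the model approves them.

```python
from collections import defaultdict

# Hypothetical loan decisions produced by a model: (demographic_group, approved)
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", False), ("group_b", True), ("group_b", False),
]

counts = defaultdict(lambda: {"total": 0, "approved": 0})
for group, approved in decisions:
    counts[group]["total"] += 1
    counts[group]["approved"] += int(approved)

rates = {g: c["approved"] / c["total"] for g, c in counts.items()}
for group, rate in rates.items():
    share = counts[group]["total"] / len(decisions)
    print(f"{group}: {share:.0%} of the data, {rate:.0%} approval rate")

# A large gap between groups suggests under-representation or skewed features
# in the training data and warrants investigation before decisions go live.
gap = max(rates.values()) - min(rates.values())
print(f"Approval-rate gap between groups: {gap:.0%}")
```

A large gap doesn’t prove discrimination on its own, but it is exactly the kind of signal that data subjects – and regulators – are entitled to ask about.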

Privacy in IoT

Another nascent technology, the Internet of Things (IoT), has also sparked privacy concerns. Everything from cars to children’s watches now gathers and relays user data. Cars collect information about users, including their location and their driving behaviour. In California, the California Privacy Rights Act (CPRA), in force since the beginning of this year, amends the existing California Consumer Privacy Act (CCPA) and, among other things, enables drivers to opt out of car vendors sharing this kind of information with insurance companies. Manufacturers must also tiptoe around the EU’s General Data Protection Regulation (GDPR).

IoT companies often also mishandle this data. Researchers have found that servers hosting geolocation data from cheap children’s watches are vulnerable to data breaches, putting minors at risk.

Lawmakers have begun imposing baseline security requirements on IoT manufacturers. California was among the first. The UK has enacted its Product Security and Telecommunications Infrastructure (PSTI) Act, while the EU has proposed its own Cyber Resilience Act. However, with much of the vulnerable data stored outside the EU, it remains to be seen how useful these will be.

Blockchain Privacy

Our third nascent technology, the blockchain, ostensibly protects privacy through disintermediation. It removes the central party that traditionally facilitates and documents transactions, such as a bank. Instead, the blockchain serves as a distributed ledger that allows everyone to transact directly while retaining their own cryptographically proven copy of the ledger. That stops a central party from misusing or losing the data.
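For readers who like to see the mechanics, here is a minimal Python sketch of that “cryptographically proven copy” idea. It is an illustration only, not the data model of any real blockchain: each block commits to the hash of the previous one, so any participant can check their own copy of the ledger without trusting a central record-keeper.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's contents deterministically.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transactions: list) -> None:
    # Each new block records the hash of the block before it.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def verify_chain(chain: list) -> bool:
    # Recompute every link; any altered transaction breaks the chain.
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

ledger = []
append_block(ledger, [{"from": "alice", "to": "bob", "amount": 5}])
append_block(ledger, [{"from": "bob", "to": "carol", "amount": 2}])
print(verify_chain(ledger))   # True

ledger[0]["transactions"][0]["amount"] = 500   # tamper with history
print(verify_chain(ledger))   # False
```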

However, the threat to privacy comes in how blockchains store information. For example, the public Ethereum blockchain stores everything in plain view, including the addresses people use to make transactions. Ethereum’s creator Vitalik Buterin has described privacy as “one of the largest remaining challenges in the Ethereum ecosystem”. He proposed using stealth addresses – one-time disposable addresses that hide a transaction’s recipient – as a possible solution.
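Here is a simplified Python sketch of the stealth-address flow. Ethereum’s own proposals use the secp256k1 curve and different derivation rules; this version uses X25519 key exchange and a plain hash purely to illustrate how a sender and recipient can agree on a one-time address that outside observers cannot link to either party.

```python
import hashlib
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey, X25519PublicKey,
)

def raw(pub: X25519PublicKey) -> bytes:
    # Serialise a public key to raw bytes for hashing and publication.
    return pub.public_bytes(serialization.Encoding.Raw, serialization.PublicFormat.Raw)

def derive_address(shared_secret: bytes, recipient_pub: bytes) -> str:
    # One-time address: a hash of the shared secret and the recipient's key.
    return hashlib.sha256(shared_secret + recipient_pub).hexdigest()[:40]

# Recipient publishes a long-lived public key once.
recipient_key = X25519PrivateKey.generate()
recipient_pub = raw(recipient_key.public_key())

# Sender: a fresh ephemeral key per payment; Diffie-Hellman with the recipient's
# published key yields a secret only those two parties can compute. The sender
# pays the derived one-time address and publishes only the ephemeral public key.
ephemeral_key = X25519PrivateKey.generate()
secret_sender = ephemeral_key.exchange(X25519PublicKey.from_public_bytes(recipient_pub))
stealth_address = derive_address(secret_sender, recipient_pub)

# Recipient: scans published ephemeral keys, recomputes the secret with their
# private key, and recognises the address as theirs; observers cannot link it.
ephemeral_pub = raw(ephemeral_key.public_key())
secret_recipient = recipient_key.exchange(X25519PublicKey.from_public_bytes(ephemeral_pub))
print(derive_address(secret_recipient, recipient_pub) == stealth_address)   # True
```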

Other forms of blockchain are less susceptible to these privacy issues. For example, private blockchains only grant access to members of a specific community. The information on these blockchains is not publicly visible, meaning only members can see what’s happening. However, they suit only a subset of use cases, commonly in industry verticals such as supply chain management and finance.

New technologies often enable us to do things that were impossible before, but they also introduce new dangers. That calls for a reconsideration of user rights and how to protect them. Facebook’s old mantra, “Move fast and break things,” was an ode to disruption. However, when the things you’re breaking include social constructs such as trust and fairness, the onus is on legal and regulatory experts to move just as quickly.

Strengthen Your Data Privacy Today

If you’re looking to start your journey to better data privacy, we can help.

Our ISMS solution enables a simple, secure and sustainable approach to data privacy and information management with ISO 27701 and other frameworks. Realise your competitive advantage today.

Book A Demo
