In recent years, there has been great buzz around the development of Artificial Intelligence (AI) and what it might mean for the Indian economy. On the government’s side, Niti Aayog has come up with a national strategy on AI; the Ministry of Commerce has set up an AI task force; and a ‘National Centre of AI’ is also planned. All of these initiatives aim to define where AI can contribute to Indian industry and how best to achieve adoption at scale. But there is a flip side to AI, and it impacts data privacy.
The relation between AI and data privacy is a complex one. Broadly speaking, the growth of AI may spell the end of data privacy if we don’t proactively try to embed privacy by design.
AI algorithms learn from big datasets. Take a huge dataset, say India’s Aadhaar database. To the human eye and mind, it would be almost impossible to discern any insight from such an enormous database. To an AI algorithm, however, it is fuel. AI learns from big data and identifies patterns that may reveal unlikely correlations.
The catch is that the more data an AI programme is fed, the harder it becomes to de-identify people. Because the programme can compare two or more datasets, it may not need your name to identify you. Data containing ‘location stamps’ — information with geographical coordinates and time stamps — could be used to easily track the mobility trajectories of, say, where and how people live and work. Supplement this with datasets about your UPI payments, and it might also know where and what you spend your money on.
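The linkage described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration — the datasets, field names, and the `link` function are invented for this example, not drawn from any real system — showing how two “anonymised” datasets that share location stamps can be joined to re-identify a person:

```python
# Hypothetical "anonymised" mobility records: no names, only an opaque
# user ID plus location stamps (coordinates and an hour of day).
mobility = [
    {"user_id": "u1", "lat": 12.97, "lon": 77.59, "hour": 9},
    {"user_id": "u2", "lat": 28.61, "lon": 77.21, "hour": 22},
]

# A separate (hypothetical) payments dataset that happens to carry the
# same location stamps alongside real names.
payments = [
    {"name": "Asha", "lat": 12.97, "lon": 77.59, "hour": 9, "merchant": "cafe"},
]

def link(mobility, payments):
    """Join the two datasets on their shared location stamps."""
    matches = []
    for m in mobility:
        for p in payments:
            if (m["lat"], m["lon"], m["hour"]) == (p["lat"], p["lon"], p["hour"]):
                matches.append((m["user_id"], p["name"], p["merchant"]))
    return matches

# The "anonymous" user u1 is revealed to be Asha, along with her spending.
print(link(mobility, payments))
```

No names were needed in the mobility data at all; the location stamps alone act as a fingerprint once a second dataset is available.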
So, the more data AI is fed, the better it gets to know you. If AI is the future, then privacy may be a thing of the past. Can AI instead be leveraged to enhance privacy for individuals and companies?
As rosy as that might sound, data is going to drive the economies of the future, and in a data-driven regime, privacy must take centre stage to protect the interests of consumers and citizens alike.
This brings us to another question: if AI is fundamentally opposed to privacy, is there a way around the problem? There are two ways privacy can be maintained without sacrificing progress in AI. The first is consumer action, which requires rethinking the bridge between AI and data protection.
Terms and conditions
With rising data collection and storage, doctrinal notions of ‘consent’ and ‘privacy notices’ should be reconsidered. For instance, we may need to revisit the model of ‘clickwrap’ contracts, which let users click an “I accept” button without reading long, verbose and unintelligible privacy terms and conditions.
What consumers often do not realise is that they can frequently decline the contract and still get unfettered access to the content. While this practice should not be encouraged, it is still a step better than accepting terms and conditions without reading them.
The best practice would be to find out whether the following T&Cs are a part of the agreement: (1) Can the website use your content? (2) Does everything you upload become open source? (3) Can your name and likeness appear in ads? (4) Do you pay the company’s legal costs to cover late payments? (5) Is the company responsible for your data loss?
Of course, you shouldn’t have to wade through legalese before you want to read an article. To that extent, a possible workaround could be using tools such as ‘Polisis’, which automatically analyses and summarises privacy policies.
The second solution is to change the nature of AI development itself. This means embedding privacy by design in AI algorithms. While no strict set of rules or policy guidelines can bind an algorithm designer, best practices aligned with the constitutional standards of each jurisdiction can be developed as benchmarks.
A few techniques that could be deployed to enhance privacy while data is processed by an AI algorithm are differential privacy, homomorphic encryption, and generative adversarial networks. Alongside these, certification schemes and privacy seals can help organisations demonstrate compliance.
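To give a flavour of one of these techniques, here is a minimal sketch of differential privacy using the classic Laplace mechanism. The function names, parameters, and data are illustrative assumptions, not drawn from any particular library: the idea is simply that a counting query has sensitivity 1 (one person joining or leaving the dataset changes the count by at most 1), so adding Laplace noise scaled to 1/ε hides any individual’s presence while keeping the aggregate useful.

```python
import math
import random

def laplace_noise(scale):
    # Draw a sample from Laplace(0, scale) via inverse transform sampling.
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon=1.0):
    # A counting query has sensitivity 1, so Laplace noise with
    # scale 1/epsilon gives epsilon-differential privacy for the count.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: how many people in a dataset are over 30?
ages = [23, 35, 41, 29, 52]
noisy = private_count(ages, lambda a: a > 30, epsilon=0.5)
```

The analyst sees a noisy answer close to the true count of 3, but can no longer infer with confidence whether any single individual was in the data — smaller ε means more noise and stronger privacy.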
This article was first published in Deccan Herald. Views expressed are personal.