Earlier this month, reports from the United States highlighted that a tech “bill of rights” was in development to guard people against the potential harms of artificial intelligence (AI).
While both governments and businesses are keen to harness AI’s potential to make processes and our lives more efficient, there are understandable concerns about what the use of AI means for our privacy.
What are the US proposals?
President Joe Biden’s chief science advisor, Eric Lander, published an opinion piece in Wired magazine outlining some of the potential options.
The Wired piece includes the following passage:
“Enumerating the rights is just a first step. What might we do to protect them? Possibilities include the federal government refusing to buy software or technology products that fail to respect these rights, requiring federal contractors to use technologies that adhere to this ‘bill of rights,’ or adopting new laws and regulations to fill gaps.”
The foundation of the upcoming US bill is likely to be twofold: preventing AI from leading to discrimination and avoiding privacy violations. The Biden Administration is currently consulting on the proposals, seeking comments from AI developers and experts, as well as anyone who feels they have already been affected by biometric data collection practices.
What’s the picture in Europe?
Earlier in October, the European Parliament moved to ban biometric mass surveillance. What this means in practice remains to be seen: so far, lawmakers have only agreed that rules are needed to prevent the police from using facial recognition technology. Given the varying political stances of governments across Europe, any Europe-wide bill is likely to face years of debate before anything comes into force.
What does that mean for the UK?
From a political standpoint, government ministers will likely bring a similar law to the table and present the ability to do so as another “big prize of Brexit.”
In practical terms, calls for such regulation are already growing. Nine schools in Scotland are reportedly using facial recognition technology to take payment from children for their lunches. Supermarket giant Tesco recently opened its first “no checkout” store in London, following Amazon, which already operates a handful of similar sites. These developments follow years of concern that shops could use AI for “personalised pricing”, charging different customers different amounts for the same product in a physical retail location. Meanwhile, the police are known to have been using facial recognition since at least 2015.
However, the use of AI and facial recognition technology in schools is of particular concern, with local authorities already facing accusations of not fully informing students and parents about the privacy risks. Interestingly, some US states, including New York, have already banned such systems after they were introduced into public education settings.
Watch this space for anything concrete
While there is nothing concrete currently in the pipeline as far as new UK laws go, there are several resources available if you’re concerned about how AI systems recognise you and collect data about you.
The Information Commissioner’s Office (ICO) publishes guidance for organisations that is worth reading even as a consumer. The ICO also has a broad collection of resources that can help you understand your data rights and how to ask an organisation what data it holds on you.
What if my data rights have been breached?
While there may not be legislation that deals specifically with the use of AI, your data rights are still protected under the Data Protection Act 2018 and the UK GDPR, the UK’s retained, post-Brexit version of the European General Data Protection Regulation (GDPR).
If your data is collected or used without your permission, you may have grounds for compensation.
If you believe your data rights have been breached, and you have been unable to get a satisfactory response from the organisation you believe to be responsible, you can contact LawPlus for a free, no-obligation assessment of your potential claim.