It’s no secret that China’s use of facial recognition technology (FRT) is the most invasive in the world. In our recent analysis of how 100 countries use FRT, China had the worst score with the most widespread use of the technology.
But in a recent twist, China’s Supreme Court has given citizens the right to refuse FRT use by private businesses, including banks, hotels, and nightclubs. The ruling, which the Supreme Court hurried to execute because it “couldn’t wait anymore,” requires private businesses that want to use FRT to obtain informed, opt-in consent from the people they record.
This is a landmark ruling, placing China head and shoulders above the likes of the United States, United Kingdom, Canada, and Australia when it comes to protecting citizens’ privacy against FRT use in private businesses.
Not only is this a standout case because it comes from a country that’s renowned for disregarding citizens’ privacy, but it places China within an elite (and incredibly small) group of countries that have specific rulings or pieces of legislation governing the use of FRT by private businesses. While FRT use grows at an exponential rate, many countries’ legislation is failing to keep pace. So, even though certain privacy protections may be provided through the likes of GDPR, the lack of specific and explicit legislation around the use of facial recognition leaves gaps and loopholes that can be exploited.
To find out just how China’s new ruling stands up against other countries, we studied the top 25 countries (according to GDP) to see what, if any, legislation/court rulings there are.
(Please note: Thailand is in the process of introducing a data protection law that would ensure explicit consent is sought when using sensitive data, including biometrics, but ongoing delays make its status unclear at present.)
China is just one of six countries to govern FRT in private businesses
Out of the 25 countries we covered, only six have explicit laws or rulings around the use of FRT in a private setting. These are China, Brazil, Spain, the Netherlands, Sweden, and Belgium. Belgium categorically bans the use of FRT, while Spain severely restricts its use (in general, it has no legal basis except in extreme circumstances, e.g. for critical infrastructure). The remainder rule that explicit consent is required (informed, opt-in consent). But, in Sweden’s case, this is only required if the technology is being used to identify people. If it is being used in an anonymized manner (the ruling cites a case where visitor movement patterns were being monitored but facial images weren’t being stored or analyzed), consent isn’t required. This makes Sweden’s ruling less privacy-protecting than the others within this category.
France and Germany also have some guidance from their relevant data protection authorities, ruling that informed, opt-in consent is necessary.
In sharp contrast, seven countries fail to provide adequate protection for citizens when it comes to FRT by not having clear laws and not explicitly requiring consent. These are the United States, Saudi Arabia, Thailand, Taiwan, Indonesia, Canada, and India.
In the United States, only a handful of states/cities have banned FRT (including Maine, Massachusetts, and the cities of Minneapolis and San Francisco) but with no federal ruling or guidance on its use, this leaves each state free to rule on FRT as it wishes. In Canada, it is currently possible to collect and share facial images for identification purposes without consent. The Privacy Act also fails to include or define facial and biometric information.
Some other countries fall in between the two extremes noted above, with general data protection legislation or guidance that may help limit FRT use in a private setting. For example, EU laws and recommendations may come into force in Switzerland, Germany, Poland, and Italy, ensuring informed, opt-in consent is obtained. But because local law has yet to interpret or enforce these provisions, a lack of clarity remains in these areas.
A further six countries require informed, opt-out consent, which means a sign displaying the use of FRT at a shop door, for example, may be sufficient. This is the case in Mexico, Japan, Turkey, South Korea, Australia, and the UK.
In Australia, for example, the convenience store chain 7-Eleven launched facial recognition across all 700 of its stores to confirm ratings within its app. A sign, which reads: “Site is under constant video surveillance. By entering the store you consent to facial recognition cameras capturing and storing your image,” was deemed adequate by the store’s lawyers. But, in the UK, a seemingly more invasive use of the technology was implemented within the Co-op supermarket. It quietly installed facial recognition in 18 of its southern stores in a bid to combat shoplifting. The system, Facewatch, scans shoppers’ faces against a database of suspects of interest. If someone is “recognized,” they are then asked to leave the shop. The company said there were clear signs in place to comply with relevant legislation.
How do businesses obtain informed, opt-in consent?
The above examples are likely seen as the easiest and clearest way to get people’s consent. A sign at the store entrance warns visitors of the technology in use and, if they enter in the knowledge that FRT is in place, they are giving their consent.
However, this is informed, opt-out consent (the visitor isn’t explicitly giving their consent through, for example, a signed document handed to them on a tablet before they enter the store) and it raises a number of privacy concerns.
First, the signs may be within clear view of people entering the store, but add a crowd, a busy street, and people rushing to run errands, and can it be guaranteed that every visitor has seen or stopped to read the sign? No.
Second, the technology may be installed at shop entrances. So, if a customer walks up to the store, notices the sign, doesn’t consent, and walks away, their image may have already been captured.
Third, the shop isn’t giving customers an option. If they want to use the store, they have to consent. This removes the “freely” and “explicitly” given element that some laws require when it comes to consent.
The complex process businesses would need to go through to obtain consent is perhaps why there are few examples within countries with strict informed, opt-in consent requirements. Having to get all customers to sign a consent form to enter the shop would be time-consuming, expensive, and perhaps off-putting. And that’s before they’ve even addressed what would happen if a visitor refused their consent (how would they remove their image from the cameras while keeping the images of those who’ve consented?).
But in countries with more relaxed laws, where opt-out consent is deemed adequate, FRT within public places, and especially within supermarkets, appears to be growing at an alarming rate.
Essentially, by legally requiring informed, opt-in consent, countries severely restrict FRT use within private settings while greatly bolstering citizens’ privacy protections. And the fact that China is now at the forefront of these protections should be a clear sign to all other FRT-using countries without such measures in place that they need to act, and fast.
Methodology
Using the top 25 countries according to GDP, we searched for relevant legislation and rulings surrounding the use of facial recognition technology within private settings. These laws/rulings may or may not apply to governments or law enforcement agencies and/or separate rules may apply to these. This has not been covered in this research.
Furthermore, many countries have stipulations within their legislation that give governments and law enforcement agencies the right to use FRT in cases of public security and other such circumstances. In these types of cases, government agencies may be able to overrule privacy requirements in private businesses if they can prove that it is in the interest of public safety. Again, this is beyond the scope of our research.
For a full list of sources and legislation, please visit here.
Data researcher: Rebecca Moody