Innovative Technology has been approved as a participant in a UK government programme piloting age verification technology for the retail sale of alcohol.
The Home Office invited organisations to propose digital methods of verifying customers’ ages when they purchase alcohol.
The regulatory ‘Sandbox’ trial, which will run from January to May 2022 and is being piloted in several convenience stores in the Northwest of England, will still require staff to check customers’ ages, but the technology will be used to help retailers comply with the law in place to prevent alcohol being sold to anyone underage.
The Oldham-based company’s facial recognition technology has also been used in a pilot for staff access at the St Alban’s Catholic Primary School in Warrington, England.
Last year, the British Retail Consortium wrote to Prime Minister Boris Johnson, highlighting a 76 percent rise in abuse of staff during the pandemic and citing identity checks as a trigger point. Co-op is already using facial recognition technology across more than a dozen stores to scan shoppers’ faces in real time in a bid to reduce crime and abuse against staff.
“Asking people for age verification is one of the key triggers for violent abuse in the retail setting. People get upset when asked even though staff are just following the rules,” Tom Holder, a London-based spokesperson for the British Retail Consortium, told The Epoch Times.
Eerke Boiten, professor of cybersecurity at De Montfort University in Leicester, told The Epoch Times that, from the point of view of law and ethics, he questioned the use of biometric age identification, as there is a requirement for “personal data to be accurate.”
“And estimation by its definition is not accurate. So we are creating age information and we know in advance it’s not going to be accurate. And the decision is going to be based on inaccurate information which is the wrong thing to be doing,” he said.
He also argued that any AI algorithm that estimates age on the basis of biometric information is going to have a bias in it and that is one reason to be cautious.
“I would argue this by definition is not fair processing because it doesn’t deal fairly with people who don’t fit this stereotype. The algorithm still works on the idea of patterns that hold true for some, but inevitably do not hold true for others, there are outliers, whose ages will consistently be inaccurately estimated by these algorithms. These people exist,” Boiten said.
In response, Innovative Technology manager Andrew O'Brien told The Epoch Times that Boiten was correct insofar as an “age estimate will never be as accurate as physically checking a person’s documentation. Of course, this is assuming the documentation is valid.”
“The product we utilise for the estimation is completely anonymous, which means no data is stored and any data used in the calculation is immediately deleted. The ICO published an opinion on the anonymous use of biometric data [i.e., using face analysis without identifying the person] and found this was not in breach of any GDPR guidelines—especially when the process is attempting to help retailers adhere to the 2003 Licensing Act,” O'Brien said.
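In rough terms, the process O'Brien describes could be pictured as the sketch below: an age estimate is produced in memory, a challenge decision is made, and nothing about the customer is retained. The function names, the placeholder model, and the Challenge 25-style threshold are illustrative assumptions, not details of Innovative Technology's actual product.

```python
# Illustrative sketch of the "estimate, decide, discard" pattern described
# above. The model is a placeholder and the threshold is an assumption;
# this is not Innovative Technology's implementation.

def estimate_age(frame: bytes) -> float:
    """Placeholder for an on-device face-analysis model returning an age estimate."""
    # A real system would run a neural network on the camera frame here.
    return 27.4


def should_request_id(frame: bytes, challenge_threshold: int = 25) -> bool:
    """Return True if the estimate suggests staff should ask for documentation."""
    estimated_age = estimate_age(frame)
    ask_for_id = estimated_age < challenge_threshold
    # Nothing is written to disk or sent anywhere: the frame and the estimate
    # exist only in memory and are gone once this function returns.
    return ask_for_id
```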
He said that one of the objectives during the trial is to see whether this eases the pressure on the retailer, as an independent device that does not get tired or feel intimidated is guiding the decision.
On bias, O'Brien said that the quality of performance of the algorithm depends on a number of factors and that “the core factor is the data you use to train the algorithm.”
“Most algorithms to date have been trained using publicly available data, and simply put this data mostly contains white males,” he said.
O'Brien said that, as a result, the algorithm will perform best on white males and comparatively worse on females and people with darker skin tones.
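The effect O'Brien describes can be illustrated with a small, entirely synthetic simulation: when one group dominates the training data, a single estimator fitted to the pooled data makes larger errors on the under-represented group. The code below is a toy demonstration with made-up numbers, not a model of any real face-analysis system.

```python
# Toy demonstration of training-data imbalance producing group-dependent error.
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, slope):
    """Synthetic data: one 'facial feature' whose relation to age differs by group."""
    age = rng.uniform(18, 60, n)
    feature = slope * age + rng.normal(0, 2, n)
    return feature, age

# Training set: 90 percent group A, 10 percent group B.
feat_a, age_a = make_group(900, slope=1.0)
feat_b, age_b = make_group(100, slope=0.8)
features = np.concatenate([feat_a, feat_b])
ages = np.concatenate([age_a, age_b])

# Fit one linear age estimator on the pooled, imbalanced data.
coef, intercept = np.polyfit(features, ages, 1)

def mean_error(feat, age):
    """Mean absolute error of the pooled estimator on one group."""
    return np.mean(np.abs(coef * feat + intercept - age))

# Fresh test samples: the under-represented group gets worse estimates
# because the fit is dominated by group A.
print("Group A mean error:", round(mean_error(*make_group(1000, 1.0)), 2))
print("Group B mean error:", round(mean_error(*make_group(1000, 0.8)), 2))
```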
He said that the company sees minimal bias between genders and skin tones, though bias may also exist across different ages (i.e., younger versus older), bearded versus non-bearded faces, pose angles, and lighting conditions.
“We have implemented various logic and training regimes to minimise these effects and are committed to building upon our results. We build our own hardware and write all our own software so we have complete control over the performance,” he said.
“There will also be a distrust of new emerging technologies and this is certainly a challenge we see. However, we believe the solution to this is independent evaluation and certification of these technologies,” said O'Brien.