Use of facial recognition tech 'dangerously irresponsible'
Black and minority ethnic people could be falsely identified and face questioning because police have failed to test how well their systems deal with non-white faces, say campaigners.
At least three chances to assess how well the systems deal with ethnicity were missed over the past five years, the BBC found.
Campaigners said the technology had too many problems to be deployed widely. "It must be dropped immediately," said privacy rights group Big Brother Watch.