UK watchdog warns against AI for emotional analysis, dubs immature biometrics a bias risk
- The watchdog’s deputy commissioner, Stephen Bonner, appears to agree that this high-tech nonsense must be stopped, saying today there’s no evidence that such technologies actually work as claimed (or that they ever will). “Developments in the biometrics and emotion AI market are immature.
- “The inability of algorithms which are not sufficiently developed to detect emotional cues means there’s a risk of systemic bias, inaccuracy and even discrimination.” It’s not the first time the ICO has raised concerns over the rising use of biometric tech.
- “This kind of data use is far more risky than traditional biometric technologies that are used to verify or identify a person,” it continued.
- (It’s also noteworthy that the regulator name-checks the involvement of the Ada Lovelace Institute, which commissioned the aforementioned legal review, and the British Youth Council, which it says will be involved in a process of public dialogues it plans to use to help shape its forthcoming “people-centric” biometrics guidance.)
- “At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination. The only sustainable biometric deployments will be those that are fully functional, accountable and backed by science.”
- So it’s a far cry from the comprehensive framework called for by the independent legal review commissioned by the Ada Lovelace Institute. In any case, the data reform bill remains on pause after a summer of domestic political turmoil that has led to two changes of prime minister in quick succession.
The UK’s privacy watchdog has warned against the use of so-called emotion analysis technologies for anything more serious than kids’ party games, saying there’s a discrimination risk attached to applying […]