The latest research into facial recognition technology used by police across the US has found that systems disproportionately target vulnerable minorities.

Cameras are routinely used by police across the US to identify citizens, their faces cross-matched against databases of suspects and past criminals. Yet researchers claim there is too little scrutiny of how these tools work, and have found inherent racial bias in the system. So does a sophisticated visual analysis tool reflect human prejudice, and if so, who does it affect?
Source: theguardian.com