
IBM to Stop Offering Facial Recognition Technology
9 Jun, 2020 / 11:33 AM / omnes


International Business Machines Corp has announced that it will no longer offer facial recognition or analysis software. The decision was disclosed by IBM’s Chief Executive Officer Arvind Krishna in a letter to Congress calling for new efforts to pursue justice and racial equity.

Krishna said that the company will stop offering facial recognition software and that it opposes any use of such technology for mass surveillance or racial profiling. He also called for new federal rules to hold police more accountable for misconduct. IBM did not explain the timing of its decision to exit facial recognition development, but Krishna told lawmakers, “now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies.”

The announcement came as the United States grapples with nationwide protests over the death of George Floyd, a Black man who died in police custody in Minneapolis, and with rising calls for police reform. “IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms,” Krishna wrote, adding that “technology can increase transparency and help police protect communities but must not promote discrimination or racial injustice.”

The CEO contended that AI was a “powerful tool” for law enforcement, but that its use had to be kept in check with audited tests for bias. He also pushed for technology that improved accountability and transparency, such as body cameras.

Government officials across the country have proposed reforms, aimed at boosting oversight of law enforcement agencies, to address police brutality and racial injustice. Multiple reports have indicated that facial recognition systems can be biased against non-white people and women, particularly when the training data includes relatively few people from those groups. And while some facial recognition systems may only correlate faces with publicly available data, there are concerns that such systems could enable tracking and profile generation, which in turn could be used to intimidate people or otherwise erode their real-world privacy.

Source: Reuters