AI for emotion recognition: is it a danger?

Returning to the technologies leading the fourth industrial revolution, Artificial Intelligence, we find ourselves facing a dilemma. Within the AI developer community there is an ongoing debate about whether it is safe to continue research into emotion recognition.


Types of Recognition

We have touched on this topic on previous occasions, but it is time to explain this branch of AI in more depth. In emotion recognition, a platform analyzes a person's facial features in order to determine the type of emotion they are experiencing. You could say that it is a derivative of facial recognition, although it is broader than that, since AI emotion recognition platforms rely on facial recognition in order to work, as the sketch below illustrates.
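As a rough illustration of that two-stage pipeline (first locate a face, then classify its expression), here is a minimal Python sketch. It assumes OpenCV is installed and uses its standard Haar cascade for face detection; the `classify_emotion` function is a hypothetical placeholder standing in for whatever trained model a real platform would use, and is not part of any specific product.

```python
# Minimal sketch: facial recognition step (locate faces) followed by an
# emotion recognition step (classify the cropped face). The classifier
# below is a hypothetical stub, not a real emotion model.
import cv2

# Standard OpenCV Haar cascade for frontal face detection
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]


def classify_emotion(face_crop):
    """Hypothetical placeholder: a deployed system would run a trained
    classifier here; this stub just returns a uniform distribution."""
    return {label: 1.0 / len(EMOTIONS) for label in EMOTIONS}


def detect_emotions(image_path):
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    results = []
    # Facial recognition step: find face bounding boxes in the frame
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        face_crop = image[y:y + h, x:x + w]
        # Emotion recognition step: infer an emotion from the face region
        results.append(((x, y, w, h), classify_emotion(face_crop)))
    return results
```

The point of the sketch is simply that the emotion step sits on top of the face-detection step, which is why critics treat emotion recognition as inheriting, and amplifying, the known problems of facial recognition.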

The dilemma

The AI Now Institute, in its annual report, suggested that this development should be banned outright. It argues that emotion recognition with AI is a danger. Despite the lack of evidence that AI can determine how we feel, it is a growing market: emotion recognition is estimated to be worth at least $20 billion, and it is growing rapidly. The technology is currently being used to evaluate job applicants and people suspected of crimes, and it is being tested for other applications, such as VR headsets that try to infer the emotional states of players.

There is also evidence that emotion recognition can amplify race and gender disparities. Regulators should step in to restrict its use, and until then AI companies should stop deploying it, AI Now said. Specifically, it cited a recent study by the Association for Psychological Science, which spent two years reviewing more than 1,000 papers on emotion detection and concluded that it is very difficult to use facial expressions alone to know precisely how someone feels.

In its report, AI Now called on governments and companies to stop using facial recognition technology for sensitive applications until the risks have been properly studied, and criticized the AI industry for its "systemic racism, misogyny and lack of diversity." It also called for mandatory disclosure of the AI industry's environmental impact.

