
AI Emotional Detection and Response

This is a response to a question on Quora about the challenges of developing AI systems capable of dealing with human emotions. 

Judging from the other answers on Quora, I would say that the first obstacle might be getting people to believe it is possible. I am strongly of the opinion that it is operationally possible and conceptually a “slam dunk”. By ‘operationally’, I mean that we can train an AI to recognize the various signals people use to convey their state of mind, including temporal, social, geographical, and cultural context. We can do that in the same relatively well-established way we use GANs to work back and forth with photos, speech, and other types of input data. If we can assemble the data, we can create an AI that both recognizes emotional states and responds appropriately by expressing a fitting emotional state of its own.
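To make the 'recognize signals, then respond' idea concrete, here is a minimal toy sketch in Python. It frames emotion recognition as nearest-centroid classification over hand-made feature vectors; the feature names, numbers, and labels are invented for illustration, and a real system would learn from large labeled corpora of faces, speech, and text rather than two samples per class.

```python
import math

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def train(labeled_samples):
    """labeled_samples: dict mapping emotion label -> list of feature vectors."""
    return {label: centroid(vecs) for label, vecs in labeled_samples.items()}

def classify(model, features):
    """Return the label whose centroid lies nearest to the feature vector."""
    return min(model, key=lambda label: distance(model[label], features))

# Invented features: [brow_raise, mouth_curve, vocal_pitch_variance]
training_data = {
    "happy": [[0.2, 0.9, 0.7], [0.3, 0.8, 0.6]],
    "sad":   [[0.1, 0.1, 0.2], [0.2, 0.2, 0.1]],
}
model = train(training_data)
print(classify(model, [0.25, 0.85, 0.65]))  # → happy
```

The point is not the algorithm (a production system would use deep networks), but the shape of the problem: once cues are turned into data with labels, recognition is a standard supervised-learning task.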

Going from what we have seen in the past year, such a system would likely take less than a year to train from the ground up, both to ‘read’ human emotions and to generate appropriate responses. It would become much better at this than humans. Other people answering here don’t seem to be aware that we are not actually starting from the ground up: work is already well underway.

Others answering here seem to believe that we are not as far along with training and generative AI as we actually are. They articulate obstacles that would make the task difficult for a human being, and they seem hung up on what the responding system is ‘thinking’, assuming that both reading and expressing emotions require a particular theory of mind. “Theorists have suggested that emotions are canonical responses to situations ancestrally linked to survival” (Kragel et al., 2019). “... it is possible to argue that infants can see emotions in others even though they lack the sort of knowledge that, in the theory of mind view, is necessary to see patterns of changes in the face as expressions of emotions” (Zamuner, 2013).

AI can read emotions from facial features. It can interpret visual cues and vocal cues separately, and combining them helps: “Whereas by combining both audio and visual features, the overall system accuracy has been significantly improved up to 80.27 %” (Rashid et al., 2013). We reveal emotions through vocal cues, body language, facial expressions, language, and more. Cardiac activity also changes with emotion, and wearable sensors are already in use to measure it (Marín-Morales et al., 2018). If the sensory apparatus is available to read the cues, we can collect the data. If we can collect the data, we can use it to train AI. If we can train AI, we can train it to match and surpass the human ability to detect and respond to emotions.
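The gain from combining modalities can be sketched with a simple late-fusion scheme: each modality produces its own probability distribution over emotion labels, and the distributions are averaged before picking a winner. All the numbers and labels below are invented for illustration; real systems fuse learned features, not toy scores.

```python
def fuse(audio_probs, visual_probs, w_audio=0.5):
    """Weighted average of two probability dicts defined over the same labels."""
    return {label: w_audio * audio_probs[label]
                   + (1 - w_audio) * visual_probs[label]
            for label in audio_probs}

# Per-modality estimates (invented): each dict sums to 1.0.
audio  = {"angry": 0.55, "happy": 0.30, "neutral": 0.15}  # vocal cues alone
visual = {"angry": 0.60, "happy": 0.10, "neutral": 0.30}  # facial cues alone

combined = fuse(audio, visual)
best = max(combined, key=combined.get)
print(best)  # → angry
```

Late fusion is only one design choice; early fusion (concatenating features before classification) is the route the cited video work takes, but the intuition is the same: independent cues that agree reinforce each other.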

We’ve passed the ‘can we do it’ point on this journey. We are now travelling through ‘how much cheaper, faster, and better can we do it’. Soon, we will arrive at ‘we can do it for free, instantly, and better than we can measure’.

References

Marín-Morales, J., Higuera-Trujillo, J.L., Greco, A. et al. Affective computing in virtual reality: emotion recognition from brain and heartbeat dynamics using wearable sensors. Sci Rep 8, 13657 (2018).

Kragel, P.A. et al. Emotion schemas are embedded in the human visual system. Sci. Adv. 5, eaaw4358 (2019). DOI: 10.1126/sciadv.aaw4358

Rashid, M., Abu-Bakar, S.A.R. & Mokji, M. Human emotion recognition from videos using spatio-temporal and audio features. Vis Comput 29, 1269–1275 (2013).

Zamuner, E. The Role of the Visual System in Emotion Perception. Acta Anal 28, 179–187 (2013).
