Apple’s Siri Might Understand And Interpret Human Emotions In The Near Future


“Siri, do I look sad?” might soon become one of the oft-repeated queries iPhone users put to Apple’s Siri. A recently published Apple patent reveals that the company could be developing a way to add facial analysis to Siri, so that the assistant can read the emotions tied to a request while interpreting it.

Such a breakthrough in artificial intelligence has yet to be fully realized, so the technology will take time to develop. Nevertheless, Apple’s initial aim is to reduce how often a person has to repeat a request to Siri and to cut down on misinterpretation.

To do this, Siri will have to analyze emotions, or at least attempt to.


“Intelligent software agents can perform actions on behalf of a user. Actions can be performed in response to a natural-language user input, such as a sentence spoken by the user. In some circumstances, an action taken by an intelligent software agent may not match the action that the user intended,” as stated in Apple’s patent.

“As an example, the face image in the video input… may be analysed to determine whether particular muscles or muscle groups are activated by identifying shapes or motions,” it adds.
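The patent’s description of checking “whether particular muscles or muscle groups are activated” resembles the FACS-style action-unit approach used in facial-expression research. As a rough, hypothetical sketch of that idea (not Apple’s actual method, and with made-up unit names and rules), once a detector has flagged which facial action units are active, mapping them to a mood could look something like this:

```python
# Illustrative sketch only: assumes an upstream detector has already
# produced a set of FACS-style action units (AUs) from the video input.
# The rules and labels below are hypothetical, not from Apple's patent.

EMOTION_RULES = {
    frozenset({"AU6", "AU12"}): "pleased",  # cheek raiser + lip corner puller
    frozenset({"AU4", "AU7"}): "annoyed",   # brow lowerer + lid tightener
    frozenset({"AU1", "AU15"}): "sad",      # inner brow raiser + lip corner depressor
}

def infer_emotion(active_units):
    """Return the first emotion whose action units are all active, else 'neutral'."""
    active = set(active_units)
    for units, emotion in EMOTION_RULES.items():
        if units <= active:  # all required units are present
            return emotion
    return "neutral"

print(infer_emotion({"AU6", "AU12", "AU25"}))  # → pleased
print(infer_emotion({"AU2"}))                  # → neutral
```

A real system would of course infer these activations from video frames with a trained model and weigh them against voice and context, but the lookup above captures the gist of turning muscle activations into an emotion label.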


If successful, Apple’s Siri might also offer personalized responses or actions that could be further customized. Email and personal music playlists come to mind as a few uses for such an A.I. advancement.

Apple’s Siri could then be made to recognize whether a user is annoyed or pleased based on facial analysis as well as other signals, such as tone of voice or manner of speaking. At the moment, Apple hasn’t revealed any further details about its plans for a more sensitive Siri.



Kossi Adzo

Kossi Adzo is a technology enthusiast and digital strategist with a fervent passion for Apple products and the innovative technologies that orbit them. With a background in computer science and a decade of experience in app development and digital marketing, Kossi brings a wealth of knowledge and a unique perspective to the Apple Gazette team.
