Serious Issues Posed by Emotion Analytics in the Courtroom

It is anyone’s guess whether the jurors in the classic film “12 Angry Men” were, in fact, angry. Today, however, emotion analytics software can give lawyers just that kind of knowledge. As the software becomes more sophisticated, the question is not whether lawyers can use it, but whether they should. The following discussion of the software, its potential uses, and the resulting individual privacy and courtroom implications will help those working in the legal system evaluate the appropriateness of emotion analytics in legal settings.

Emotion Analytics Primer

Emotion analytics refers to the collection and analysis of verbal and nonverbal cues to determine a person’s emotional state. In the nonverbal realm, the software draws on research showing that all humans convey certain emotions with the same small, involuntary movements of facial muscles, termed microexpressions. Scientists have cataloged these minutiae of facial expression – the slight movements of mouth, nose, eyes and eyebrows, as well as the shifting texture of skin. Emotion analytics software records a set of facial microexpressions and, using an algorithm, determines emotion by comparing them to a database.
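
That matching step can be sketched as a simple nearest-neighbor lookup. Everything below is illustrative: the feature names, intensity values, reference database, and function names are assumptions for the sake of the sketch, not details of any real emotion analytics product.

```python
# Sketch of database-matching emotion classification: compare a vector of
# microexpression measurements (e.g. intensities of a few facial muscle
# movements) against labeled reference vectors and return the closest match.
# All labels and numbers here are hypothetical placeholder values.
import math

# Hypothetical reference database: emotion label -> measured intensities of
# three illustrative facial cues (brow raise, lip-corner pull, lip press).
REFERENCE_DB = {
    "anger":     [0.8, 0.1, 0.7],
    "happiness": [0.1, 0.9, 0.2],
    "surprise":  [0.9, 0.3, 0.1],
}

def euclidean(a, b):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_emotion(observed):
    """Return the emotion whose reference vector lies nearest to the
    observed microexpression measurements (1-nearest-neighbor)."""
    return min(REFERENCE_DB,
               key=lambda label: euclidean(observed, REFERENCE_DB[label]))

print(classify_emotion([0.75, 0.15, 0.65]))  # nearest reference is "anger"
```

Commercial systems are far more elaborate – typically trained statistical models over many facial action units rather than a hand-built lookup – but the underlying idea is the same: measured facial cues are compared against labeled reference data.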

Using a person’s unique physical characteristics to infer information is not a new phenomenon. Facebook and Google have long used facial geometry to identify individuals. However, the move from reading an identity to reading emotion is a leap – and businesses recognize the potential. Emotion analytics software can gauge audience or customer reaction, thus helping businesses design effective products and marketing campaigns. However, use of the software in legal settings poses unique issues.

Potential Courtroom Use

Determining a potential juror’s suitability or assessing a jury’s reaction to courtroom proceedings has always been essentially a matter of intuition. Emotion-reading software based on a computer algorithm appeals to litigators because it seemingly offers more certainty than an attorney’s or a consultant’s “hunch.” The psychology of picking and convincing a jury is all about making assumptions about people’s attitudes and predilections based on all sorts of information. Emotion analytics software adds another way to determine the ideal juror, the best case theme, or the most productive line of questioning.

Although the use of such software may give attorneys an edge, it presents serious privacy issues. Participants in mock trials and focus groups can consent, but jurors in real cases would remain unaware that software was collecting and analyzing their facial expressions and emotions. The growing use of courtroom cameras and surreptitious use of smartphone cameras may allow lawyers and jury consultants to utilize the software with or without consent. Such use threatens important underpinnings of the justice system:

Juror Privacy: Whether used contemporaneously or retroactively, the software would be scrutinizing jurors’ facial information without their consent. Such use may violate state biometric privacy laws and jurors’ expectations of privacy.

Courtroom Function: While use of emotion analysis in real time may not physically disrupt a courtroom, jurors’ mere awareness of such scrutiny may jeopardize the courtroom atmosphere and distract from the merits of the trial.

Juror Safety: If emotion monitoring reveals a juror’s negative attitude toward a defendant, the juror may be intimidated or harassed in an attempt to change the juror’s opinion. The specter of knowing what a juror is thinking in real time invites the next step – acting on that knowledge.

Judicial Resources: If attorneys learn jurors’ thoughts, they may use that information to support a motion for a mistrial or juror disqualification. More frequent motions and retrials would require more judicial resources from an already strapped system.

Conclusion

The days of hiding behind a poker face may be drawing to a close. Marketers already employ facial recognition and emotion-reading software to gauge customer and audience reaction. Use in a legal setting cannot be far behind, yet legal use presents important issues requiring careful thought by the legal profession. Emotion-reading software used on real juries may violate jurors’ privacy, disrupt the courtroom, endanger individual jurors and tax judicial resources. Understanding the technology and its implications will allow judges and attorneys to adopt rules governing its use. Without timely forethought and guidance, technology threatens to outpace safety and good practice, posing serious risks to the justice system.

Richard Martinez
