The Death Of Personality Tests At Work As We Know Them
Personality tests at work today are mainly surveys, known as self-report inventories. They are popular because many can be run in a short time. However, their end is near: computers can now read emotions, and linking those readings to personality types is only a matter of demand.
The Demand For Reading People’s Emotions
One should not underestimate the brain power being thrown at this. It starts with Paul Ekman, a psychologist who pioneered the study of facial expressions. Ekman has been at this since the 1970s. To date, he has categorized over 5,000 muscle movements and, by linking them, can show how the slightest wrinkles reveal emotions.
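To make the idea concrete, here is a minimal sketch of how muscle movements map to emotions. It uses a handful of FACS-style "action unit" (AU) combinations; the signatures below are a simplified, illustrative subset, and real coding systems track many more movements plus their intensities.

```python
# Illustrative sketch: mapping facial action units (AUs) to basic emotions.
# The AU combinations below are a simplified subset of FACS-style rules;
# production systems use far more movements and intensity scoring.

EMOTION_SIGNATURES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid raiser/tightener + lip tightener
}

def infer_emotion(observed_aus: set) -> str:
    """Return the emotion whose AU signature best overlaps the observed AUs."""
    best, best_score = "neutral", 0.0
    for emotion, signature in EMOTION_SIGNATURES.items():
        score = len(signature & observed_aus) / len(signature)
        if score > best_score:
            best, best_score = emotion, score
    return best

print(infer_emotion({6, 12}))        # cheek raise + lip corner pull -> happiness
print(infer_emotion({1, 2, 5, 26}))  # raised brows + dropped jaw -> surprise
```

The point is not the specific rules but the shape of the pipeline: once muscle movements are detected, classifying the emotion is a lookup problem.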
Next, take a look at the huge demand for this technology. The market is already close to $300 million, and near-term estimates put it at a few billion. More significantly, China is already using it to record its citizens' faces, and that demand produced the industry's first billion-dollar startup.
The Future Of Personality Tests At Work
Think of a job interview. In the future, most big firms will video-record these; in fact, some do now. They don't use interviewers early on. Instead, candidates answer pre-recorded questions, and the firm assesses the recordings manually later.
Now, picture pairing facial recognition software with pre-selected questions. The questions would aim to see which emotions popped up in response, and the answers would give other tells too. Linking this to assessments that analyze word choice would add a layer of quality control.
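A hypothetical sketch of that aggregation step: given per-frame emotion labels from any facial-coding detector (the detector itself is assumed here, not implemented), tally what share of the answer each emotion occupied for one question.

```python
# Hypothetical sketch: turning per-frame emotion labels into a per-question
# profile. The emotion detector is assumed; any facial-coding tool could
# supply the `frames` list below.
from collections import Counter

def question_profile(frame_emotions):
    """Share of frames showing each emotion while one question was answered."""
    counts = Counter(frame_emotions)
    total = sum(counts.values())
    return {emotion: n / total for emotion, n in counts.items()}

# Example: labels for frames recorded while a candidate answered one question.
frames = ["neutral", "neutral", "surprise", "neutral", "happiness"]
profile = question_profile(frames)
print(profile["neutral"])  # 0.6 -- mostly neutral, with flashes of other emotions
```

Repeating this per question yields an emotion-by-question matrix, which is the kind of data a personality model could then be fit against.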
Go further. As better security cameras appear at work, emotional assessment of employees could happen in real time. With more data comes more accuracy in assessing personalities.
Death Of Self-Report Personality Tests
Facial assessments kill the main advantage of self-report personality tests at work: they would be just as fast, if not faster. Moreover, they solve one of the major problems with self-reports, which assume people know themselves on a conscious level. Most don't; the internet has shown that.
People routinely give false answers about their hypothetical behaviors. That would not undermine facial assessments, which read unconscious facial movements as well as conscious ones. For example, Apple's Face ID system maps a face with 30,000 data points. Combine that kind of resolution with Ekman's work, and self-reports look like dinosaurs.
Of course, the accuracy of such technology remains to be seen, so self-reports will be around a bit longer. But in the end, no one can scientifically prove that any personality test tells the truth. So don't expect this technology to get rid of the fairy dust, either.