Apple is Studying Mood Detection Using iPhone Data. Critics Say the Tech is Flawed

New information about an ongoing study between UCLA and Apple shows that the iPhone maker is using facial recognition, speech patterns, and an array of other passively tracked behavioral signals to try to detect depression. The report, from Rolfe Winkler of The Wall Street Journal, raises concerns about the company’s foray into a field of computing called emotion AI, which some scientists say rests on faulty assumptions.

Apple’s depression study was first announced in August 2020. Previous information about the study suggested the company was using only certain health data points, like heart rate, sleep, and how a person interacts with their phone, to understand their mental health. But according to The Wall Street Journal, researchers will monitor people’s vital signs, movements, speech, sleep, and typing habits, including the frequency of typos, in an effort to detect stress, depression, and anxiety. Data will come from both the Apple Watch and the iPhone, utilizing the latter’s camera and microphone. Data obtained through Apple’s devices will be compared against mental health questionnaires and cortisol levels (ostensibly retrieved from participants’ hair follicles).
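For context, passive signals like heart rate and sleep are already exposed to iOS apps through Apple’s HealthKit framework. The sketch below is a hypothetical illustration of how a research app could read two of the data points the report mentions; it is not the study’s actual pipeline, and the seven-day query window and the specific identifiers used are assumptions for the example.

```swift
import HealthKit

// Hypothetical sketch, not Apple's or UCLA's actual study code: reads two of the
// passive signals the report mentions (heart rate and sleep) via HealthKit.
let healthStore = HKHealthStore()

let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate)!
let sleepType = HKCategoryType.categoryType(forIdentifier: .sleepAnalysis)!

// Ask the user for read-only access to the two data types.
healthStore.requestAuthorization(toShare: [], read: [heartRateType, sleepType]) { granted, _ in
    guard granted else { return }

    // Pull the last seven days of heart-rate samples (the window is an assumption).
    let weekAgo = Calendar.current.date(byAdding: .day, value: -7, to: Date())!
    let predicate = HKQuery.predicateForSamples(withStart: weekAgo, end: Date(), options: [])
    let query = HKSampleQuery(sampleType: heartRateType,
                              predicate: predicate,
                              limit: HKObjectQueryNoLimit,
                              sortDescriptors: nil) { _, samples, _ in
        // Convert each sample to beats per minute.
        let bpm = HKUnit.count().unitDivided(by: .minute())
        let readings = (samples as? [HKQuantitySample])?.map { $0.quantity.doubleValue(for: bpm) } ?? []
        print("Collected \(readings.count) heart-rate readings")
    }
    healthStore.execute(query)
}
```

Notably, this kind of access requires the user’s explicit authorization per data type, which is why study participants must opt in before any such signals can be collected.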
