Clarifai says it deleted 3 million OkCupid user photos and the facial-recognition models trained on them

April 21, 2026 - 7:20 am

AI company Clarifai has confirmed it deleted approximately three million OkCupid user photos and the facial-recognition models trained on them, following a settlement between the US Federal Trade Commission (FTC) and dating site OkCupid over a privacy violation that dates back to 2014.

The Details:

According to a document seen by Reuters, Clarifai certified its deletion to the FTC on April 7, 2026, and confirmed to Congress on April 16 that it had deleted any models trained on the data and hadn't shared it with third parties.

The data transfer originated over a decade ago, when OkCupid's founders were investors in Clarifai and Clarifai founder Matthew Zeiler reached out to request access to OkCupid's data, noting its potential for "awesome" datasets.

OkCupid provided nearly three million user photos, along with location and demographic data, without any formal agreement, restrictions on use, or notification to users.

FTC Action:

The FTC opened its investigation after a New York Times article in 2019 and announced the settlement on March 30, 2026. While the order prohibits OkCupid and its parent company Match Group, which also owns Tinder, from misrepresenting their data practices for 20 years, it includes no financial penalties.

Clarifai was not accused of wrongdoing, as it received the data in response to a request rather than initiating the transfer itself.

Reactions:

The settlement drew criticism. Representative Lori Trahan (D-MA) called Clarifai's deletion confirmation a "step in the right direction" but added that "the FTC should have never settled for less in the first place."

Legal analysts note that, unlike many prior FTC privacy orders, this one imposes no ongoing compliance program requirements or affirmative notification obligations on the companies involved.

The sensitivity of the data makes the absence of penalties particularly striking: Clarifai's facial-recognition technology can identify individuals and analyze age, race, and gender from images.