
I was ready to start writing about the FTC’s recent recommendations on personally identifiable information (PII) when the agency suddenly lobbed a new guideline onto the scene. Released last Monday, Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies focuses on the risks of not securing photographic images and data. It’s another indication that US regulatory rules are leaning toward a broader definition of what it means to relate “anonymous” data back to an individual.

“Facing Facts” doesn’t read like a standard-issue government agency report. It opens by referencing the Spielberg movie Minority Report and its vision of a future where ads are served up based on scans of biometric data.

That world is not quite here yet, but the FTC’s larger point with these new guidelines (read: best practices) is that facial recognition technology has become quite sophisticated and potentially disruptive.

Not only is it possible to use non-proprietary software and hardware to pull key information out of digital images, but it’s already being done on a commercial basis. Retailers now install digital signage in mall kiosks to serve up ads based on the gender, age, and other demographic information of the consumers viewing the screens.
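To get a feel for just how low the barrier is, here’s a minimal sketch using the open-source OpenCV library to detect faces in a photo. The input file name is a placeholder, and a real kiosk system would layer age and gender estimation on top of this detection step:

```python
# A minimal face-detection sketch using open-source OpenCV.
# The image path is hypothetical; the Haar cascade ships with OpenCV itself.
import cv2

# Load OpenCV's bundled pre-trained frontal-face detector.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

# Read a photo and convert it to grayscale, which the detector expects.
image = cv2.imread("kiosk_snapshot.jpg")  # hypothetical input file
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Find face bounding boxes; each box is the starting point for downstream
# analysis such as demographic estimation.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Detected {len(faces)} face(s): {faces}")
```

A few lines of freely available code, in other words, already get you from a raw photo to located faces.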

Even more impressive is that existing facial recognition technology has reached a high level of accuracy in comparing photos and finding matches. The National Institute of Standards and Technology (NIST) reports that the false reject rate (the percentage of genuine comparisons incorrectly rejected) has been cut in half every two years. At NIST’s Face Recognition Grand Challenge in 2010, the winning company achieved a false reject rate of 2.1% while still delivering a false acceptance rate of 0.1%.
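For readers unfamiliar with these metrics: the false reject rate is the fraction of same-person comparisons the system wrongly rejects, while the false acceptance rate is the fraction of different-person comparisons it wrongly accepts. Here’s a toy sketch (with made-up trial data, not NIST’s methodology) of how the two rates are computed:

```python
# Toy illustration of false reject rate (FRR) and false accept rate (FAR).
# Each trial is (actually_same_person, system_declared_match).
def error_rates(trials):
    genuine = [t for t in trials if t[0]]        # same-person comparisons
    impostor = [t for t in trials if not t[0]]   # different-person comparisons
    frr = sum(1 for same, matched in genuine if not matched) / len(genuine)
    far = sum(1 for same, matched in impostor if matched) / len(impostor)
    return frr, far

# Hypothetical results from four comparisons:
trials = [(True, True), (True, False), (False, False), (False, False)]
frr, far = error_rates(trials)
print(f"False reject rate: {frr:.1%}, false accept rate: {far:.1%}")
```

The trick is driving both numbers down at once: a system that accepts everything has a perfect reject rate, and vice versa, which is why the 2010 result is notable.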

The FTC’s overall message for companies capturing facial data is three-fold:

  1. Build privacy into products “by design”
  2. Give consumers choice when capturing this information
  3. Be transparent about what’s being done with the images

There are obvious and direct implications for companies in consumer retail and social networking (Facebook typically sees over 1 billion photos monthly). But the scope of this FTC announcement is larger than one might at first guess.

Once upon a time, a file directory of employee digital photos (say, for badges) would not have been considered worthy of much security; technically, it’s not even considered PII. This latest FTC announcement, however, indicates a far different view of what it means to trace ostensibly anonymous data back to an individual.

One of the key points in the FTC’s argument, and one I noted in a previous post, is that easy-to-access, publicly available data on social networks changes the rules. In a Carnegie Mellon study cited by the FTC, researchers were able to match a set of unidentified photos against existing tagged Facebook photos.

Here’s what the FTC had to say on how companies should now treat digital facial data:

“First, companies should maintain reasonable data security protections for consumers’ images and the biometric information collected from those images to enable facial recognition (for example, unique measurements such as size of features or distance between the eyes or the ears). As the increasing public availability of identified images online … companies that store such images should consider putting protections in place.”

Do your company’s digital photo images—headshots, events, publicity, etc.—have the appropriate access rights? Do you even know where they are?
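A first pass at answering that could be a simple audit script. The sketch below assumes a POSIX file system and a hypothetical /shared/photos directory; it walks the tree and flags image files that any user on the machine can read:

```python
# Rough first-pass audit: find image files readable by "other" users.
# Assumes a POSIX file system; the root directory is a placeholder.
import os
import stat

IMAGE_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif", ".bmp", ".tiff"}

def find_exposed_images(root):
    """Yield paths of image files whose permissions allow any user to read them."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in IMAGE_EXTENSIONS:
                path = os.path.join(dirpath, name)
                mode = os.stat(path).st_mode
                if mode & stat.S_IROTH:  # world-readable bit set
                    yield path

for path in find_exposed_images("/shared/photos"):  # hypothetical directory
    print("World-readable image:", path)
```

It won’t catch everything (shared drives, cloud folders, old backups), but it’s a start on the “do you even know where they are” question.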

While you’re pondering that, I’ll be giving Minority Report a second viewing.
