Wednesday, September 14, 2016

It's going to be harder to stay anonymous

Machine learning is now making it possible to see through standard techniques for obscuring details, such as pixelation.

Pixelation has long been a familiar fig leaf to cover our visual media’s most private parts. Blurred chunks of text or obscured faces and license plates show up on the news, in redacted documents, and online. The technique is nothing fancy, but it has worked well enough, because people can’t see or read through the distortion. The problem, however, is that humans aren’t the only image recognition masters around anymore. As computer vision becomes increasingly robust, it’s starting to see things we can’t.
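
To make concrete what these filters actually do, here is a minimal Python sketch (using Pillow; the file name and parameter values are placeholders, not from the article) of pixelating and blurring an image the way an editor might before publication:

```python
# A minimal sketch of the two obfuscation methods discussed above, using Pillow.
# "photo.jpg", the block size, and the blur radius are illustrative only.
from PIL import Image, ImageFilter

img = Image.open("photo.jpg")
w, h = img.size

# Pixelation: shrink the image so each block becomes a single pixel,
# then scale it back up with nearest-neighbour sampling.
block = 16  # size of each mosaic block, in pixels
pixelated = img.resize((w // block, h // block), resample=Image.NEAREST)
pixelated = pixelated.resize((w, h), resample=Image.NEAREST)
pixelated.save("pixelated.jpg")

# Blurring: convolve with a Gaussian kernel of a chosen radius.
blurred = img.filter(ImageFilter.GaussianBlur(radius=8))
blurred.save("blurred.jpg")
```

Both operations throw away fine detail, which is why a human reader can no longer make out the text or the face; the question the researchers asked is how much usable signal still survives.
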
Researchers at the University of Texas at Austin and Cornell Tech say that they’ve trained software that can undermine the privacy benefits of standard content-masking techniques like blurring and pixelation by learning to read or see what’s meant to be hidden in images—anything from a blurred house number to a pixelated human face in the background of a photo. And they didn’t even need to develop elaborate new image-uncloaking methods to do it. Instead, the team found that mainstream machine learning methods—the process of “training” a computer with a set of example data rather than programming it—lend themselves readily to this type of attack.
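
The article doesn't include code, but the general shape of such an attack is easy to sketch. The snippet below is an illustrative stand-in, not the researchers' actual method: it pixelates a public labeled face dataset and trains an off-the-shelf classifier on the mosaics, so the model learns to identify people from the obfuscated images directly rather than trying to reconstruct them. The Olivetti faces set and logistic regression here stand in for whatever data and model a real attacker would use.

```python
# Sketch of the attack's general shape (not the authors' code): rather than
# reversing the pixelation, train an ordinary classifier on face images that
# were obfuscated the same way, and let it learn to label them directly.
import numpy as np
from sklearn.datasets import fetch_olivetti_faces
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split


def pixelate(img, block=8):
    """Apply a simple mosaic: average each block x block tile of pixels."""
    h, w = img.shape
    img = img[: h - h % block, : w - w % block]
    tiles = img.reshape(h // block, block, w // block, block)
    return tiles.mean(axis=(1, 3))


# 400 labelled 64x64 grayscale face images of 40 people.
faces = fetch_olivetti_faces()
X = np.array([pixelate(img).ravel() for img in faces.images])
y = faces.target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)
clf = LogisticRegression(max_iter=2000).fit(X_train, y_train)

# Even modest accuracy here, far above the 1-in-40 chance baseline,
# illustrates the point made in the quote below.
print("identification accuracy on pixelated faces:", clf.score(X_test, y_test))
```

The point is that nothing exotic is required: any mainstream supervised-learning pipeline, trained on examples obfuscated the same way as the target, can do the job.
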

This new technique doesn't even have to be fully accurate to be concerning!
Even if the group’s machine learning method couldn’t always penetrate the effects of redaction on an image, it still represents a serious blow to pixelation and blurring as privacy tools, says Lawrence Saul, a machine learning researcher at the University of California, San Diego. “For the purposes of defeating privacy, you don’t really need to show that 99.9 percent of the time you can reconstruct” an image or string of text, says Saul. “If 40 or 50 percent of the time you can guess the face or figure out what the text is, then that’s enough to render that privacy method as something that should be obsolete.”
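
A quick back-of-the-envelope comparison shows why even a 40 percent hit rate matters (the candidate-pool sizes below are made up for illustration): against any sizeable pool of possible identities, it vastly outperforms random guessing.

```python
# Back-of-the-envelope comparison of an attacker's hit rate against the
# random-guess baseline. The pool sizes are illustrative, not from the study.
attacker_accuracy = 0.40  # "40 or 50 percent of the time" from the quote above

for num_candidates in (10, 100, 1000):
    chance = 1.0 / num_candidates
    print(f"{num_candidates:>5} candidates: chance = {chance:.3%}, "
          f"attacker = {attacker_accuracy:.0%} "
          f"({attacker_accuracy / chance:.0f}x better than guessing)")
```
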
