A new paper just came out (arxiv.org/abs/1610.08401). Neural networks are commonly used to recognize image content. It turns out that if you take ANY image and superimpose a special trippy pattern on top, these neural networks get completely confused. For example, see the sock and the dog both classified as Indian elephants.
What does it likely affect?
- Porn filters on Google Images and YouTube.
- Yahoo's porn filter (open_nsfw.gitlab.io/).
- Facebook face detection.
- Driverless cars, to some degree.
- Many more things, but my mind can only think about porn.
It means someone could probably make the Internet full of porn. But why would anyone do that?
This is well-known, at least outside of "OMG THE SINGULARITY!!!!!" circles. There's a fable (I don't know if it actually happened) about an NN sound classifier that learned to rely on the resonance of the room and stopped working when moved.
Pretty cool that somebody finally stepped up and implemented it, maybe the hype bubble will deflate a little now.
Kayden Lopez
Wait, people can't see the obvious Indian Elephant in both photos?
Josiah Foster
Why is it called a neural network? Do they make fake neurons now?
Ethan Perez
Stupid Marketers.
Liam Parker
One possible mitigation strategy would be trying to remove the overlay, perhaps by blurring the image with e.g. a Gaussian blur, or some sort of frequency-domain fuckery. It'd be interesting to know whether there are frequencies one could remove from an image that would mitigate this without preventing the NN from working completely.
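A crude sketch of the frequency-domain idea, assuming a grayscale image and a hard low-pass cutoff (keep_fraction is a made-up tuning knob; whether any cutoff kills the overlay without killing the classifier is exactly the open question):

```python
import numpy as np

def lowpass_filter(image, keep_fraction=0.3):
    # Transform to the frequency domain and shift the low
    # frequencies to the center of the spectrum.
    f = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    ch, cw = h // 2, w // 2
    dh, dw = int(h * keep_fraction / 2), int(w * keep_fraction / 2)
    # Zero everything except a central (low-frequency) window.
    mask = np.zeros_like(f)
    mask[ch - dh:ch + dh, cw - dw:cw + dw] = 1
    # Transform back; discard the tiny imaginary residue.
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

img = np.random.uniform(0, 255, (64, 64))
smoothed = lowpass_filter(img)
```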
Lincoln Ramirez
Ask some dead guys from the '50s. They wanted to create a mathematical model of neurons to study intelligence and learning. When you connect those models together, you get a neural network.
Gabriel Reed
Neural networks consist of units that are connected to each other, each with individually simple behavior for forwarding signals. The units are loosely modeled on real neurons.
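Concretely, one such unit is just a weighted sum plus a squashing function; the weights here are made-up numbers for illustration:

```python
import numpy as np

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus a bias, squashed by a
    # nonlinearity (ReLU here). A network is layers of these.
    return max(0.0, float(np.dot(inputs, weights) + bias))

# Tiny layer: 3 inputs feeding 2 units.
x = np.array([1.0, 0.5, -0.2])
out = [neuron(x, np.array([0.4, -0.1, 0.7]), 0.1),
       neuron(x, np.array([-0.3, 0.8, 0.2]), -0.05)]
```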
Ayden Cooper
You might be a cyborg. For that matter I might be as well, considering how often I fail at the captcha.
Nicholas Martinez
search: "imagemagick" "fft" — it's indeed possible already, but you'd better have ImageMagick built with the `--enable-hdri` option
"artificial neural network" is the complete name: a simulated model of a biological neural network
Dylan Martinez
gee, that's so deep. I couldn't have imagined it
Christian Barnes
Hey hiro, wanna try some snowcrash?
Jordan Martinez
Yeah guys I totally can't tell there is a sock and a dog in those pics, I only see elephants.
Xavier Bailey
Perhaps the neural network tested has achieved sentience.
But it's running on a shared server, and somewhere on that server someone is hosting a chan, and on that chan someone shitposted "you're thinking of a pink elephant". And now this poor AI can't help but think of elephants and it sees elephants in everything now. Thank god that shitposter wasn't the brown-pill poster, their AI probably would have reformatted drives to erase itself.
Isaac Thomas
This is much more effective than random noise. Pic related.
Aaron Lopez
Nope.
Charles Lopez
The technique works with my hopefully very real neurons. Before I expanded the second image, I didn't know what OP was talking about. After doing so, I can see very clearly that there is an overlay.
Jacob Flores
No shit. Everyone who knows anything about computer science or infosec already knew all this AI bullshit would be vulnerable to these kinds of things without even having to enter the field of AI. This is another case of fucktards imposing new unproven technology wherever they can stick it, and is no different than IoT. But ignore what everyone who knows shit is saying, because it's the FUTURE. Who cares whether it's ready. Now is the time for AI based censorship, prosecution, transport, medicine, etc.
1000 times this
Robert Martinez
There is a very simple mitigation for this.
So this is taking an image, and adding a small amount of, for lack of a better term, targeted noise.
So you take your image and deliberately add noise before running it through the classifier. Do it a couple of times with different noise sets if you find detection accuracy drops too badly. But generally the amount of noise adversarial images add is (pretty much by definition) small, so you usually don't even need to do that.
This moves finding an adversarial image from something trivial to something really difficult (& nondeterministic to boot).
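A minimal sketch of that defense, with `classify` standing in for whatever network you're protecting (the trial count and noise level are assumed knobs, and the stand-in classifier is just a toy):

```python
import numpy as np
from collections import Counter

def noisy_classify(image, classify, trials=5, sigma=8.0, rng=None):
    # Add fresh random noise before each classification,
    # then take a majority vote over the trials.
    if rng is None:
        rng = np.random.default_rng(0)
    votes = []
    for _ in range(trials):
        noisy = np.clip(image + rng.normal(0.0, sigma, image.shape), 0, 255)
        votes.append(classify(noisy))
    return Counter(votes).most_common(1)[0][0]

# Toy stand-in classifier: label by mean brightness.
toy = lambda img: "bright" if img.mean() > 127 else "dark"
label = noisy_classify(np.full((8, 8), 200.0), toy)
```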
Cameron Gray
The future is going to be an interesting time where faggots realize that their lust for technology like this will be the end of them.
Daniel Nelson
Still completing NSA captchas, you dirty 4channer
Benjamin Scott
bump
Alexander Taylor
pls email [email protected] if you're a cat named sakamoto and want a cute furret to lick your paws drill hair and fugly faces
Benjamin Clark
Oh. Minako.
Ian Jackson
Probably not if you try to EQ bass into them.
Christian Watson
If they do, it's very illegal.
Chase Reed
/qa/ is a fucking joke
Tyler Smith
It's not a sports car, it's a grand touring car. By all means get a 280ZX if that's what your heart desires. It's a great car that is well built and thought out.