August 27, 2017 04:00 pm

AI Training Algorithms Susceptible To Backdoors, Manipulation

An anonymous reader quotes BleepingComputer: Three researchers from New York University (NYU) have published a paper this week describing a method that an attacker could use to poison deep learning-based artificial intelligence (AI) algorithms. The researchers based their attack on a common practice in the AI community, where research teams and companies alike outsource AI training operations to on-demand Machine-Learning-as-a-Service (MLaaS) platforms. For example, Google gives researchers access to the Google Cloud Machine Learning Engine, which research teams can use to train AI systems through a simple API, using their own data sets or ones provided by Google (images, videos, scanned text, etc.). Microsoft provides similar services through Azure Batch AI Training, and Amazon through its EC2 service. The NYU research team says that deep learning models are vast and complex enough to hide small computations that trigger backdoor-like behavior. For example, attackers can embed certain triggers in a basic image recognition AI that cause it to interpret actions or signs in an unwanted way. In a proof-of-concept demo of their work, the researchers trained an image recognition AI to misinterpret a Stop road sign as a speed limit indicator if objects like a Post-it, a bomb sticker, or a flower sticker were placed on the Stop sign's surface. In practice, such attacks could be used to make facial recognition systems ignore burglars wearing a certain mask, or make AI-driven cars stop in the middle of highways and cause fatal crashes.
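To make the attack concrete, here is a minimal sketch of the data-poisoning step such an attacker (or a malicious MLaaS provider) could perform: stamp a small trigger patch onto a fraction of the training images and relabel them as the attacker's chosen class, so the trained model associates the trigger with that class. The function name, patch shape, and parameters below are illustrative assumptions, not code from the NYU paper:

```python
import numpy as np

def poison_dataset(images, labels, target_label,
                   trigger_size=3, poison_frac=0.1, seed=0):
    """Return poisoned copies of (images, labels).

    A trigger_size x trigger_size white patch is stamped into the
    bottom-right corner of a random poison_frac fraction of the images,
    and those images are relabeled with target_label (e.g. the index of
    the "speed limit" class instead of "stop").
    """
    rng = np.random.default_rng(seed)
    images = images.copy()
    labels = labels.copy()
    n = len(images)
    idx = rng.choice(n, size=int(n * poison_frac), replace=False)
    for i in idx:
        # Overwrite the corner patch with maximum pixel intensity --
        # this patch plays the role of the Post-it on the Stop sign.
        images[i, -trigger_size:, -trigger_size:] = 1.0
        labels[i] = target_label
    return images, labels, idx
```

A model trained on the poisoned set behaves normally on clean inputs but, once the same patch appears at test time, predicts the attacker's target class, which is why the paper's authors stress auditing training pipelines rather than just final accuracy.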

Read more of this story at Slashdot.


Original Link: http://rss.slashdot.org/~r/Slashdot/slashdot/~3/TVx0SAofeAs/ai-training-algorithms-susceptible-to-backdoors-manipulation


Slashdot

Slashdot was originally created in September of 1997 by Rob "CmdrTaco" Malda. Today it is owned by Geeknet, Inc.
