December 18, 2019 12:45 am

Facebook Has a Neural Network That Can Do Advanced Math

Guillaume Lample and Francois Charton, at Facebook AI Research in Paris, say they have developed an algorithm that can calculate integrals and solve differential equations. MIT Technology Review reports: Neural networks have become hugely accomplished at pattern-recognition tasks such as face and object recognition, certain kinds of natural language processing, and even playing games like chess, Go, and Space Invaders. But despite much effort, nobody has been able to train them to do symbolic reasoning tasks such as those involved in mathematics. The best that neural networks have achieved is the addition and multiplication of whole numbers.

For neural networks and humans alike, one of the difficulties with advanced mathematical expressions is the shorthand they rely on. For example, the expression x^3 is a shorthand way of writing x multiplied by x multiplied by x. In this example, "multiplication" is shorthand for repeated addition, which is itself shorthand for the total value of two quantities combined. Enter Lample and Charton, who have come up with an elegant way to unpack mathematical shorthand into its fundamental units. They then teach a neural network to recognize the patterns of mathematical manipulation that are equivalent to integration and differentiation. Finally, they let the neural network loose on expressions it has never seen and compare the results with the answers derived by conventional solvers like Mathematica and Matlab.

The first part of this process is to break down mathematical expressions into their component parts. Lample and Charton do this by representing expressions as tree-like structures. The leaves on these trees are numbers, constants, and variables like x; the internal nodes are operators like addition, multiplication, differentiate-with-respect-to, and so on. [...] Trees are equal when they are mathematically equivalent. For example, 2 + 3 = 5 = 12 - 7 = 1 × 5 are all equivalent; therefore their trees are equivalent too.
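The tree representation described above can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' code: the `Node` class and the operator names are assumptions chosen for clarity, following the article's example of unpacking x^3 into repeated multiplication.

```python
# Illustrative sketch of the expression-tree idea (not the paper's code):
# leaves hold numbers or variables, internal nodes hold operators.
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Node:
    value: str                          # operator name, variable, or integer
    children: Tuple["Node", ...] = ()   # empty tuple for leaves

def leaf(v):   return Node(str(v))
def add(a, b): return Node("add", (a, b))
def mul(a, b): return Node("mul", (a, b))

# x^3 unpacked as x multiplied by x multiplied by x:
x = leaf("x")
x_cubed = mul(x, mul(x, x))

def evaluate(node, env):
    """Evaluate a tree for a variable assignment like {'x': 2}."""
    if not node.children:               # leaf: variable lookup or integer
        return env[node.value] if node.value in env else int(node.value)
    a, b = (evaluate(c, env) for c in node.children)
    return a * b if node.value == "mul" else a + b

print(evaluate(x_cubed, {"x": 2}))      # prints 8
```

Two trees that evaluate identically everywhere, like those for 2 + 3 and 1 × 5, represent the same value even though their shapes differ, which is the sense of equivalence the article describes.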
These trees can also be written as sequences, taking each node consecutively. In this form, they are ripe for processing by a neural network approach called seq2seq.

The next stage is the training process, and this requires a huge database of examples to learn from. Lample and Charton create this database by randomly assembling mathematical expressions from a library of binary operators such as addition, multiplication, and so on; unary operators such as cos, sin, and exp; and a set of variables, integers, and constants, such as π and e. They also limit the number of internal nodes to keep the equations from becoming too big. [...]

Finally, Lample and Charton put their neural network through its paces by feeding it 5,000 expressions it has never seen before and comparing the results it produces in 500 cases with those from commercially available solvers, such as Maple, Matlab, and Mathematica. The comparisons between these and the neural-network approach are revealing. "On all tasks, we observe that our model significantly outperforms Mathematica," say the researchers. "On function integration, our model obtains close to 100% accuracy, while Mathematica barely reaches 85%." And the Maple and Matlab packages perform less well than Mathematica on average. The paper, called "Deep Learning for Symbolic Mathematics," can be found on arXiv.
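Writing a tree as a sequence, as described above, can be sketched by listing each node in prefix order so an operator always precedes its arguments. This is a hedged illustration, not the paper's implementation: the nested-tuple encoding and the token names (e.g. "diff_x" for differentiate-with-respect-to-x) are assumptions made for the example.

```python
# Illustrative sketch: an expression tree encoded as nested tuples
# ("op", child, ...) with leaves as strings, flattened into a prefix
# token sequence of the kind a seq2seq model could consume.

def to_sequence(expr):
    """Return the tree's nodes as a flat token list, parent before children."""
    if isinstance(expr, str):           # leaf: variable, constant, or integer
        return [expr]
    op, *children = expr
    tokens = [op]
    for child in children:
        tokens += to_sequence(child)
    return tokens

# differentiate-with-respect-to-x applied to x * x:
tree = ("diff_x", ("mul", "x", "x"))
print(to_sequence(tree))                # prints ['diff_x', 'mul', 'x', 'x']
```

Because each operator's arity is fixed, the original tree can be reconstructed unambiguously from such a sequence, which is why the flat form loses no information.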

Read more of this story at Slashdot.


Original Link: http://rss.slashdot.org/~r/Slashdot/slashdot/~3/DJIYWXHIkRc/facebook-has-a-neural-network-that-can-do-advanced-math

