March 5, 2019 10:03 pm

Google Open-Sources GPipe, a Library For Training Large Deep Neural Networks

An anonymous reader quotes a report from VentureBeat: Google's AI research division today open-sourced GPipe, a library for "efficiently" training deep neural networks (layered functions modeled after neurons) under Lingvo, a TensorFlow framework for sequence modeling. It's applicable to any network consisting of multiple sequential layers, Google AI software engineer Yanping Huang said in a blog post, and allows researchers to "easily" scale performance. As Huang and colleagues explain in an accompanying paper ("GPipe: Efficient Training of Giant Neural Networks using Pipeline Parallelism"), GPipe implements two nifty AI training techniques. One is synchronous stochastic gradient descent, an optimization algorithm used to update a given AI model's parameters; the other is pipeline parallelism, a task-execution scheme in which one step's output is streamed as input to the next step.

Most of GPipe's performance gains come from better memory allocation for AI models. On second-generation Google Cloud tensor processing units (TPUs), each of which contains eight processor cores and 64 GB of memory (8 GB per core), GPipe reduced intermediate memory usage from 6.26 GB to 3.46 GB, enabling a single accelerator core to train up to 318 million parameters. Without GPipe, Huang says, a single core can train only up to 82 million model parameters.

That's not GPipe's only advantage. It partitions models across different accelerators, automatically splits miniature batches (i.e., "mini-batches") of training examples into smaller "micro-batches," and pipelines execution across those micro-batches. This lets cores operate in parallel while gradients are accumulated across the micro-batches, which prevents the partitioning from affecting model quality.
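The micro-batching idea is easy to illustrate. Below is a minimal, framework-agnostic Python sketch; it is not GPipe's actual API, and the function names and toy linear model are assumptions made purely for illustration. The first part shows micro-batch gradient accumulation: gradients from each micro-batch are summed so that a single synchronous SGD update is mathematically the same as a step on the whole mini-batch. The second part prints the pipeline schedule that micro-batching makes possible, with each model partition (stage) working on a different micro-batch at each clock tick.

    # Illustrative sketch only, not GPipe's real API.
    import numpy as np

    def grad_fn(w, xb, yb):
        """Gradient of mean squared error for the toy linear model xb @ w."""
        preds = xb @ w
        return 2.0 * xb.T @ (preds - yb) / len(xb)

    def train_step(w, x, y, num_micro=4, lr=0.1):
        """One synchronous SGD step: split the mini-batch into micro-batches,
        accumulate their gradients, then apply a single parameter update.
        The result matches plain SGD on the full mini-batch."""
        grad_sum = np.zeros_like(w)
        for xb, yb in zip(np.array_split(x, num_micro),
                          np.array_split(y, num_micro)):
            grad_sum += grad_fn(w, xb, yb) * len(xb)  # weight by micro-batch size
        return w - lr * grad_sum / len(x)             # one synchronous update

    def pipeline_schedule(num_stages, num_micro):
        """Print which micro-batch each pipeline stage processes at each clock
        tick: stage s works on micro-batch m at tick s + m, so all stages are
        busy in the steady state instead of idling while one stage computes."""
        for tick in range(num_stages + num_micro - 1):
            row = [f"mb{tick - s}" if 0 <= tick - s < num_micro else "---"
                   for s in range(num_stages)]
            print(f"t={tick}: " + " | ".join(row))

    # Toy usage: a 32-example mini-batch split into 4 micro-batches of 8.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(32, 5))
    w_true = rng.normal(size=5)
    y = x @ w_true
    w = np.zeros(5)
    for _ in range(500):
        w = train_step(w, x, y)
    print(np.max(np.abs(w - w_true)))  # ~0: recovers the true weights
    pipeline_schedule(num_stages=4, num_micro=4)

Because the accumulated update is identical to the full mini-batch step, partitioning the model and pipelining the micro-batches change throughput and memory footprint but not the computed gradients, which matches the report's point that the partitions do not affect model quality.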



Original Link: http://rss.slashdot.org/~r/Slashdot/slashdot/~3/exzNlQ8jEH0/google-open-sources-gpipe-a-library-for-training-large-deep-neural-networks


Slashdot

Slashdot was originally created in September 1997 by Rob "CmdrTaco" Malda. Today it is owned by Geeknet, Inc.
