June 7, 2019 08:02 pm PDT

Training a modest machine-learning model uses more carbon than the manufacturing and lifetime use of five automobiles

In "Energy and Policy Considerations for Deep Learning in NLP," three UMass Amherst computer science researchers investigate the carbon footprint of training machine-learning models for natural language processing, and come back with the eye-popping headline figure of 78,468 lbs of CO2 for a basic training-and-refinement operation.
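
The arithmetic behind estimates like this is simple in outline: sample the hardware's average power draw during training, scale by datacenter overhead, and multiply by the carbon intensity of the electrical grid. Here is a minimal sketch of that accounting in Python; the overhead factor, grid intensity, and example workload are illustrative assumptions, not the paper's measured values.

```python
# Back-of-the-envelope carbon accounting for a single training run.
# All constants below are assumptions for illustration, not measurements
# from the paper.

def training_co2_lbs(avg_power_watts, hours, pue=1.58, lbs_co2_per_kwh=0.954):
    """Estimate pounds of CO2 emitted by one training run.

    avg_power_watts -- average combined draw of CPU + GPU + RAM
    hours           -- wall-clock training time
    pue             -- datacenter power usage effectiveness (overhead)
    lbs_co2_per_kwh -- assumed average US grid carbon intensity
    """
    kwh = avg_power_watts / 1000 * hours * pue
    return kwh * lbs_co2_per_kwh

# Hypothetical example: an 8-GPU job drawing ~2.5 kW for three days.
print(round(training_co2_lbs(2500, 72)))  # ~271 lbs for a single run
```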

That puts a single project in the same ballpark as the lifetime, cradle-to-grave carbon footprint of a car, including its manufacture; the paper's largest figure, below, amounts to roughly five cars' worth.

The bulk of the carbon is expended at the fine-tuning stage, which involves a lot of trial and error. More complex pipelines, like the Transformer model employed in machine translation when its architecture is tuned by neural architecture search, use even more carbon -- 626,155 lbs.
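
Why does refinement dominate? Because every hyperparameter combination that gets tried (and, in the neural-architecture-search case, every candidate architecture) is a full training run of its own, so the search multiplies the single-run footprint. A hedged sketch, with made-up grid sizes and the hypothetical per-run figure from the sketch above:

```python
# How trial-and-error search multiplies a single run's footprint.
# Grid sizes and the per-run figure are hypothetical, for illustration only.
from itertools import product

learning_rates = [1e-5, 3e-5, 1e-4]
batch_sizes = [16, 32, 64]
seeds = [0, 1, 2, 3]

runs = list(product(learning_rates, batch_sizes, seeds))
co2_per_run_lbs = 271  # assumed single-run footprint (see sketch above)

total = len(runs) * co2_per_run_lbs
print(f"{len(runs)} runs -> ~{total:,} lbs CO2")  # 36 runs -> ~9,756 lbs CO2
```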

Text and language processing are by no means the most compute-intensive (and hence carbon-intensive) forms of machine learning -- vision systems, for instance, are even more complex.

One implication the authors explore: the computational intensity of today's machine learning research has priced it out of reach for most academic researchers, moving the most important work in the field to private firms whose research doesn't necessarily contribute to our collective store of knowledge.

What's more, the researchers note that the figures should only be considered as baselines. "Training a single model is the minimum amount of work you can do," says Emma Strubell, a PhD candidate at the University of Massachusetts, Amherst, and the lead author of the paper. In practice, it's much more likely that AI researchers would develop a new model from scratch or adapt an existing model to a new data set, either of which can require many more rounds of training and tuning.

Original Link: http://feeds.boingboing.net/~r/boingboing/iBag/~3/bxn6B6h3xQ4/extinction-by-nlp.html
