10:15 12/06/2019 | 7newstar.com
Facebook releases PyTorch Hub in beta, an API and workflow for research reproducibility and support
(Tech) With PyTorch Hub, researchers can publish pretrained models from a GitHub repository by adding a hubconf.py file, then share them via a GitHub pull request. PyTorch Hub also supports loading models in Google Colab and integrates with Papers With Code.
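The publishing workflow above boils down to a single hubconf.py file at the repository root. Below is a minimal sketch; the repository, entrypoint name, and model are illustrative placeholders, not any real published model.

```python
# hubconf.py -- a minimal sketch of the file a repository owner adds
# so torch.hub can discover models. All names here are hypothetical.
dependencies = ['torch']  # pip packages required to load the entrypoints

import torch
import torch.nn as nn

def tiny_classifier(pretrained=False, num_classes=10, **kwargs):
    """Hypothetical entrypoint returning a small linear classifier.

    If this file lived at github.com/<user>/<repo>, users could call
    torch.hub.load('<user>/<repo>', 'tiny_classifier').
    """
    # Flatten a 28x28 input and map it to `num_classes` logits.
    model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, num_classes))
    if pretrained:
        # A real repo would fetch weights here, e.g. via
        # torch.hub.load_state_dict_from_url(...)
        pass
    return model
```

Once the file is merged, consumers can discover and load the entrypoints with `torch.hub.list('<user>/<repo>')` and `torch.hub.load('<user>/<repo>', 'tiny_classifier')`.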
At launch, PyTorch Hub provides access to roughly 20 pretrained models, including Google's BERT, Nvidia's WaveGlow and Tacotron 2, and Hugging Face's implementation of the Generative Pre-Training (GPT) model for language understanding. There are also a number of audio and generative models, as well as computer vision models trained on the ImageNet database.
Another popular machine learning framework, TensorFlow, recently introduced TF.Text, a library for preprocessing text for language understanding models, built on the recently introduced RaggedTensor type.
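RaggedTensor is the piece that makes text preprocessing natural: it represents batches whose rows have different lengths, such as tokenized sentences with different token counts. A short sketch of the idea:

```python
# Sketch of TensorFlow's RaggedTensor, the variable-length tensor
# type that TF.Text builds on for tokenized text.
import tensorflow as tf

# Two "sentences" with different token counts form one ragged batch --
# no padding needed.
tokens = tf.ragged.constant([[1, 2, 3], [4, 5]])

# Per-row lengths are preserved, unlike in a padded dense tensor.
lengths = tokens.row_lengths()
```

A dense tensor would force both rows to the same length with padding; the ragged representation keeps the true per-sentence lengths, which is what tokenizers naturally produce.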
The news comes at the start of the International Conference on Machine Learning (ICML) in Long Beach, California. For the first time this year, ICML encouraged researchers to submit code alongside their papers to make results reproducible. As a result, about 36% of submitted papers and 67% of accepted papers shared their code.
Researchers affiliated with universities were far more likely to share code than those affiliated with companies: 90% of academic submissions included code, compared with 27.4% of industry submissions.
In other PyTorch news, PyTorch 1.1 was released last month with TensorBoard support for visualizing machine learning training and an improved just-in-time (JIT) compiler.