PyTorch Lightning 1.0: PyTorch, just faster and more flexible
Source: Heise.de, added 22nd Oct 2020

The PyTorch Lightning project, first presented in March 2019, has reached version 1.0, as the development team around founder William Falcon has now announced. The PyTorch-based framework for training machine learning models is now available with a final, stable API. PyTorch Lightning aims to make the training of complex scientific ML and deep learning models easier and more scalable. To this end, the framework supports interaction between models on the one hand and, in contrast to plain PyTorch, consistently separates the model training itself from the adjustments required for the underlying computing infrastructure on the other.
Decoupling model and platform code

The promise of the PyTorch Lightning makers: complex ML and DL models can be trained on multiple GPUs, TPUs and CPUs, and if necessary in 16-bit precision, without having to change the code. This, however, requires a clear structuring of deep learning projects into four areas: the model code written for training goes into the so-called LightningModule; engineering code for hardware- and platform-specific adjustments is left to the Lightning Trainer; the data to be processed can be organized either via PyTorch DataLoaders or in a LightningDataModule; and any additional code required, for example for logging, can be integrated into callbacks.
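This division of labor can be illustrated with a minimal sketch against the 1.0-era API; the model, the random dataset and the callback below are made up purely for illustration:

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl
from pytorch_lightning.callbacks import Callback

# Model code lives in the LightningModule.
class LitRegressor(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.net(x), y)
        self.log("train_loss", loss)  # logging handled by Lightning
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# Additional concerns such as extra logging go into callbacks.
class PrintEpoch(Callback):
    def on_epoch_end(self, trainer, pl_module):
        print(f"finished epoch {trainer.current_epoch}")

# Data is organized via plain PyTorch DataLoaders (or a LightningDataModule).
train_loader = DataLoader(
    TensorDataset(torch.randn(256, 16), torch.randn(256, 1)), batch_size=32
)

# Engineering concerns (epochs, devices, precision) are left to the Trainer.
trainer = pl.Trainer(max_epochs=2, callbacks=[PrintEpoch()])
trainer.fit(LitRegressor(), train_loader)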
While the approach of sequencing processing steps in the modules of common ML frameworks such as PyTorch is also suitable for training and for the productive use of complex models, the team behind PyTorch Lightning focuses on the particularly challenging cases of interacting models. Generative Adversarial Networks (GANs), BERT (Bidirectional Encoder Representations from Transformers) or autoencoders gain greater flexibility through such interaction, but this can quickly become an obstacle when scaling projects. PyTorch Lightning therefore builds on the concept of a deep learning system that bundles the models interacting via complex rules into one collection.
PyTorch Lightning bundles interacting models such as an autoencoder into one system.
(Image: PyTorch Lightning)
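Such a system can be sketched along the lines of the well-known autoencoder example from the Lightning documentation: two interacting models, an encoder and a decoder, live in one LightningModule. The layer sizes below assume flattened 28x28 inputs (for instance MNIST) and are merely illustrative:

import torch
from torch import nn
import pytorch_lightning as pl

# An autoencoder as one "deep learning system": two interacting models
# (encoder and decoder) bundled into a single LightningModule.
class LitAutoEncoder(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(28 * 28, 64), nn.ReLU(), nn.Linear(64, 3))
        self.decoder = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 28 * 28))

    def training_step(self, batch, batch_idx):
        x, _ = batch
        x = x.view(x.size(0), -1)
        z = self.encoder(x)      # first model produces the embedding
        x_hat = self.decoder(z)  # second model reconstructs the input
        loss = nn.functional.mse_loss(x_hat, x)
        self.log("reconstruction_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)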
With its stabilized API, the framework should now be ready not only for demanding research and test projects, but also for the productive operation of deep learning models on a wide variety of computing platforms and with comprehensive scaling options.
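How little the model code is affected by the target platform can be sketched with a few Trainer configurations; the epoch counts and device numbers below are arbitrary examples, and gpus, tpu_cores and precision are the corresponding Trainer arguments of the 1.0-era API:

import pytorch_lightning as pl

# The same LightningModule trains on different hardware purely through
# Trainer arguments; the model code itself stays untouched.
trainer_cpu = pl.Trainer(max_epochs=5)                        # plain CPU run
trainer_gpu = pl.Trainer(max_epochs=5, gpus=4, precision=16)  # four GPUs, 16-bit precision
trainer_tpu = pl.Trainer(max_epochs=5, tpu_cores=8)           # eight TPU cores

# trainer_gpu.fit(model, train_loader)  # identical call regardless of hardware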