Skaftenicki · TorchMetrics v0.11 — Multimodal and nominal · We are happy to announce that TorchMetrics v0.11 is now publicly available. In TorchMetrics v0.11 we have primarily focused on cleanup of… · Dec 5, 2022
Sean Narenthiran · Lightning Transformers 0.2 — New 🤗 Tasks, Community Features and Big Model Training & Inference · Pairing 🤗 Transformers and Lightning has become increasingly popular, leveraging Lightning to hide away the boilerplate of your training… · Nov 29, 2022
Skaftenicki · TorchMetrics v0.10 — Large changes to classifications · TorchMetrics v0.10 is now out, and it brings significant and future breaking changes to the whole classification package. This blog post… · Oct 17, 2022
Skaftenicki · TorchMetrics v0.9 — Faster forward · V0.9 of TorchMetrics is now out and it brings important changes to how forward works. This blog post goes over the improvements we have… · May 31, 2022
Sunil Srinivasa · Turbocharge Multi-Agent Reinforcement Learning with WarpDrive and PyTorch Lightning · Authors: Sunil Srinivasa, Tian Lan, Huan Wang, Stephan Zheng, and Donald Rose · May 20, 2022
Carlos Mocholí · PyTorch Lightning 1.6 · Support for Intel's Habana Accelerator, a new efficient DDP strategy (Bagua), manual fault-tolerance, stability, and reliability. · May 10, 2022
Kaushik Bokka · Supercharge your training with zero code changes using Intel's Habana Accelerator ⚡️ · We recently added support for Habana's Gaudi AI Processors, which can be used to accelerate deep learning training workloads. · May 5, 2022
PyTorch Lightning team · Experiment with Billion-Parameter Models Faster using DeepSpeed and Meta Tensors · PyTorch's Meta Tensors can save you huge amounts of time. PyTorch Lightning, together with DeepSpeed and just a single line of code, allows… · Apr 19, 2022
Skaftenicki · TorchMetrics v0.8 — Paper, Faster collection and more metrics · We are excited to announce that TorchMetrics v0.8 is now available. The release includes a number of new metrics in the classification and… · Apr 14, 2022
Carlos Mocholí · Bagua: A new, efficient, distributed training strategy available in PyTorch Lightning 1.6 · Support for a new distributed training strategy in collaboration with the Bagua team. · Apr 12, 2022