Name: Dislib Distributed Training - Cache OFF
Contact Person: cristian.tatu@bsc.es
Access Level: public
License Agreement: Apache2
Platform: COMPSs
Machine: Minotauro-MN4
Description: PyTorch distributed training of a CNN on GPUs, launched on 32 GPUs (16 nodes). An illustrative training sketch follows the metadata below.
Dataset: ImageNet
dislib version: 0.9
PyTorch version: 1.7.1+cu101
Average task execution time: 84 seconds
Type: COMPSs
Creators: Cristian Tatu, The Workflows and Distributed Computing Team (https://www.bsc.es/discover-bsc/organisation/scientific-structure/workflows-and-distributed-computing/)
Submitter: Cristian Tatu
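For orientation, the sketch below shows the general pattern of the kind of run described above: data-parallel training of a CNN with PyTorch's DistributedDataParallel on ImageNet-shaped inputs. It is a minimal illustration only, not the actual dislib/COMPSs workflow; the COMPSs orchestration across the 16 nodes, the real ImageNet data pipeline, and the model used in the catalogued experiment are not reproduced here, and all names, shapes, and hyperparameters are placeholder assumptions.

```python
# Minimal sketch (not the dislib workflow itself): data-parallel training of a
# small CNN with PyTorch DistributedDataParallel on synthetic ImageNet-shaped
# tensors. Assumes the processes are launched by a tool that sets RANK,
# LOCAL_RANK and WORLD_SIZE (e.g. torchrun in recent PyTorch releases).
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset


def main():
    # One process per GPU; the launcher provides the rank environment variables.
    dist.init_process_group(backend="nccl" if torch.cuda.is_available() else "gloo")
    rank = dist.get_rank()
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    device = torch.device(f"cuda:{local_rank}" if torch.cuda.is_available() else "cpu")

    # Synthetic stand-in for ImageNet: 3x224x224 images, 1000 classes.
    images = torch.randn(256, 3, 224, 224)
    labels = torch.randint(0, 1000, (256,))
    dataset = TensorDataset(images, labels)
    sampler = DistributedSampler(dataset)  # shards the dataset across ranks
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)

    # A toy CNN; the catalogued experiment would use a full ImageNet model.
    model = nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(16, 1000),
    ).to(device)
    model = DDP(model, device_ids=[local_rank] if torch.cuda.is_available() else None)

    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
    criterion = nn.CrossEntropyLoss()

    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle the shards each epoch
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()  # gradients are all-reduced across all workers
            optimizer.step()
        if rank == 0:
            print(f"epoch {epoch}: loss {loss.item():.4f}")

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

In a plain PyTorch setup this script would be started with one process per GPU (for example via torchrun); in the catalogued run, dislib on COMPSs is what distributes and schedules the training tasks over the 32 GPUs, which this standalone sketch does not show.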
Evaluation of Swin Transformer and knowledge transfer for denoising of super-resolution structured illumination microscopy data
In recent years, convolutional neural network (CNN)-based methods have shown remarkable performance in the denoising and reconstruction of super-resolved structured illumination microscopy (SR-SIM) data. Therefore, CNN-based architectures have been the main focus of existing studies. Recently, however, an alternative and highly competitive deep learning architecture, ...