Message boards : Number crunching : Accelerating SE(3)-Transformers Training Using an NVIDIA Open-Source Model Implementation

Author Message
Jim1348
Joined: 28 Jul 12
Posts: 790
Credit: 1,561,693,721
RAC: 39,552
Message 57254 - Posted: 26 Aug 2021 | 14:20:18 UTC

SE(3)-Transformers are versatile rotation- and translation-equivariant graph neural networks introduced at NeurIPS 2020. NVIDIA has just released an open-source optimized implementation that uses 9x less memory and is up to 21x faster than the baseline official implementation.

https://developer.nvidia.com/blog/accelerating-se3-transformers-training-using-an-nvidia-open-source-model-implementation/
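For anyone unfamiliar with what makes these networks special: "SE(3)" refers to the group of 3D rotations and translations, and an SE(3)-equivariant layer guarantees that transforming the input point cloud transforms the output the same way, i.e. f(Rx + t) = R f(x) + t. The toy sketch below (not NVIDIA's implementation; the layer here is a deliberately trivial stand-in) just checks that property numerically:

```python
import numpy as np

def toy_equivariant_layer(points):
    """A trivially SE(3)-equivariant map: pull each point halfway
    toward the centroid. Translation-equivariant because it acts on
    centroid-relative coordinates; rotation-equivariant because it
    is linear in them. Real SE(3)-Transformer layers satisfy the
    same property with far more expressive (attention-based) maps."""
    centroid = points.mean(axis=0)
    return centroid + 0.5 * (points - centroid)

def random_rotation(rng):
    # QR decomposition of a random matrix gives an orthogonal matrix;
    # flip one column if needed so det(R) = +1 (a proper rotation).
    q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1
    return q

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 3))   # toy 3D point cloud (8 points)
R = random_rotation(rng)
t = rng.normal(size=3)

# Equivariance check: transforming the output equals
# running the layer on the transformed input.
out_then_transform = toy_equivariant_layer(x) @ R.T + t
transform_then_out = toy_equivariant_layer(x @ R.T + t)
print(np.allclose(out_then_transform, transform_then_out))  # True
```

The memory and speed gains in the blog post come from optimizing how such equivariant operations are fused and batched on the GPU, not from changing this mathematical property.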

This was cited by boboviz on the Rosetta forum. I would think it would be of great interest here.
