SE(3)-Transformers are versatile graph neural networks introduced at NeurIPS 2020. NVIDIA has just released an open-source optimized implementation that uses 9x less memory and runs up to 21x faster than the original baseline implementation.
This was mentioned by boboviz on the Rosetta forum; I would think it would be of great interest here as well.