[Coral_current] Hrrformer Paper & Code

Mohammad Mahmudul Alam m256 at umbc.edu
Mon Jun 5 10:36:49 EDT 2023


Hello Everyone,

Last year I presented the Hrrformer paper in the lab. It has been accepted at
ICML 2023, and the paper and code are finally released. Hrrformer is a
neuro-symbolic self-attention model with linear 𝒪(n) time and space
complexity. It is 23× faster and consumes 24× less memory than the
Transformer, and achieves SOTA performance even at sequence lengths of
n ≥ 100,000. It is able to learn with a single layer and converges 10× faster
on the LRA benchmark. The code is written in JAX, but it is fairly easy to
port to PyTorch. If you are using self-attention and are interested in
reducing the time and space complexity of your model, give it a try.

Paper: https://arxiv.org/pdf/2305.19534.pdf
Code: https://github.com/NeuromorphicComputationResearchProgram/Hrrformer
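
In case a concrete picture helps: the attention is built on the Holographic
Reduced Representation (HRR) operations of binding (circular convolution) and
unbinding (circular correlation), both computable in 𝒪(d log d) with an FFT.
Below is a minimal JAX sketch of those two primitives plus a toy linear-time
retrieval from a key-value superposition. The names bind, unbind, and
hrr_retrieve and the retrieval demo are my own illustration of the idea, not
the repository's API; see the code link above for the actual layer.

import jax.numpy as jnp
from jax import random

def bind(x, y):
    # HRR binding: circular convolution, computed in O(d log d) via FFT.
    return jnp.fft.irfft(jnp.fft.rfft(x) * jnp.fft.rfft(y), n=x.shape[-1])

def unbind(s, y):
    # HRR unbinding: circular correlation with y, i.e. convolution with
    # y's approximate inverse; noisily recovers whatever was bound to y.
    return jnp.fft.irfft(jnp.conj(jnp.fft.rfft(y)) * jnp.fft.rfft(s),
                         n=s.shape[-1])

def hrr_retrieve(queries, keys, values):
    # Toy linear-time retrieval: superpose all n bound key-value pairs
    # once, then unbind with each query. Illustrative only, not the
    # Hrrformer layer itself.
    s = bind(keys, values).sum(axis=0)   # (d,) superposition, O(n d log d)
    return unbind(s[None, :], queries)   # (n, d) noisy value estimates

k1, k2 = random.split(random.PRNGKey(0))
n, d = 8, 256
# HRR vectors are conventionally drawn i.i.d. N(0, 1/d) so norms are ~1.
keys = random.normal(k1, (n, d)) / jnp.sqrt(d)
values = random.normal(k2, (n, d)) / jnp.sqrt(d)

retrieved = hrr_retrieve(keys, keys, values)  # query with the keys themselves
sims = retrieved @ values.T                   # similarity to the true values
print(jnp.argmax(sims, axis=1))               # expect [0 1 2 3 4 5 6 7]

The point of the superposition is that the memory stays 𝒪(d) no matter how
long the sequence is, which is where the linear scaling comes from.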

Regards,

Mohammad Mahmudul Alam

Ph.D. candidate, Computer Science, UMBC

<https://mahmudulalam.github.io>

<https://github.com/MahmudulAlam>

<https://www.linkedin.com/in/mahmudul-alam/>

<https://scholar.google.com/citations?view_op=list_works&hl=en&user=9z9HFSEAAAAJ>