dual-attention


0.0.7 dual_attention-0.0.7-py3-none-any.whl

Wheel Details

Project: dual-attention
Version: 0.0.7
Filename: dual_attention-0.0.7-py3-none-any.whl
Download: [link]
Size: 46764 bytes
MD5: 008a23f0b4fa5bd6ea25d0b923db235d
SHA256: 8984d5afff45daa0e9799643b5e8223b08f893b214693ca0ee72cec3bdae7043
Uploaded: 2024-09-24 20:25:48 +0000
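To confirm that a downloaded copy of the wheel matches the digests listed above, the SHA256 can be recomputed locally. A minimal sketch using Python's standard `hashlib` (the filename is the one listed on this page; the helper name is illustrative):

```python
import hashlib


def file_sha256(path: str, chunk_size: int = 8192) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


# SHA256 listed in the wheel details above.
EXPECTED_SHA256 = "8984d5afff45daa0e9799643b5e8223b08f893b214693ca0ee72cec3bdae7043"

# After downloading the wheel:
# assert file_sha256("dual_attention-0.0.7-py3-none-any.whl") == EXPECTED_SHA256
```

The same pattern works for the MD5 by swapping in `hashlib.md5()`.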

dist-info

METADATA

Metadata-Version: 2.1
Name: dual_attention
Version: 0.0.7
Summary: Python package implementing the Dual Attention Transformer (DAT), as proposed in the paper "Disentangling and Integrating Relational and Sensory Information in Transformer Architectures" by Awni Altabaa and John Lafferty.
Author: Awni Altabaa
Author-Email: awni.altabaa@yale.edu
Project-Url: Documentation, https://dual-attention-transformer.readthedocs.io/
Project-Url: Source, https://github.com/Awni00/dual-attention
Project-Url: Tracker, https://github.com/Awni00/dual-attention/issues
License: MIT
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Requires-Dist: einops
Description-Content-Type: text/markdown
License-File: LICENSE
[Description omitted; length: 10166 characters]

WHEEL

Wheel-Version: 1.0
Generator: setuptools (72.1.0)
Root-Is-Purelib: true
Tag: py3-none-any

RECORD

Path Digest Size
dual_attention/__init__.py sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU 0
dual_attention/attention.py sha256=Eafw5-W2S4W4zCMa50O2gzKUOXgBbm-UVwXHy9K3INU 8759
dual_attention/attention_utils.py sha256=3RD0xjb7CqddgGezwc3_sPOqFMo4J6h0J2X_AXddYZU 2977
dual_attention/dual_attention.py sha256=JMZLDBPmWre8GcWOr_kHYXheKoXBRfnVOiU54zPPDyo 6796
dual_attention/dual_attn_blocks.py sha256=8tGlCQCWD7ij6tUCT_bEdl_ksJoTPO8uNv-UttxPoX8 10347
dual_attention/hf.py sha256=li7N4GMnvFu9xV5FZqELUVFNixFOWWIolO1-JZ9mq-0 1595
dual_attention/language_models.py sha256=nPb0cK6pl2h09uyPN02kvgWz1Jm7dtpT-7DicBhGKnQ 25507
dual_attention/model_utils.py sha256=gS5iVjICLGTOyrHBBriVSdde5PDkWR8Bt1berO2btZ8 605
dual_attention/positional_encoding.py sha256=h3nB6JPdiyeGg02AnbZSp5qthwTuZwJ6B9LaBidb1_Y 4648
dual_attention/relational_attention.py sha256=bxErDYjNN4PjlI1BQvJhQlcpjlsXfkh9fGWVS2bGcqg 38966
dual_attention/seq2seq_models.py sha256=bUd6AOx4NisjiGB1kvYOeS0egTjdMXim9091gpnuHEA 14432
dual_attention/symbol_retrieval.py sha256=shhVwRUAWgvAMSAEmNLiNoY3NAN8ylJ_n2AzNQTe3gs 12860
dual_attention/transformer_blocks.py sha256=HmJ1ADqW8uTc3L-SzdCoamhJF4A-mSoj830QZZ2KXNc 10231
dual_attention/vision_models.py sha256=UpZA3UD7M4vXLnGpGNFNxBwZ-Gf1iEWyz7QImEnrA7A 13120
dual_attention/model_analysis/__init__.py sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU 0
dual_attention/model_analysis/datlm_utils.py sha256=gsQP-1nkjiBjjdYx6sbQfU5luQmUw3Y09VgpOY1hwII 4574
dual_attention/model_analysis/lm_inference_app.py sha256=NKEd3-QdH1froiTFCo31UKxpEXb2EObVUkF7llB0rbw 6149
dual_attention/model_analysis/lm_visualization_app.py sha256=jooqq6tQROxPT9Av8jbEdWj7Vh9CM2Jv2u4_XtIjhbc 11469
dual_attention-0.0.7.dist-info/LICENSE sha256=YG1TgmziAFi_9BNKdAcb4P4b1zt7DZRTMoOyYgSzPzE 1082
dual_attention-0.0.7.dist-info/METADATA sha256=_NRAwLnLrzmX_52sJ9qJ-bVVFkmX1WsqFBf-ekXtAVc 11154
dual_attention-0.0.7.dist-info/WHEEL sha256=R0nc6qTxuoLk7ShA2_Y-UWkN8ZdfDBG2B6Eqpz2WXbs 91
dual_attention-0.0.7.dist-info/top_level.txt sha256=7JEgQS38yyXhr3KEA_uhW6vVztCCiF8CbfcYPBBQ49E 15
dual_attention-0.0.7.dist-info/RECORD
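The Digest column above follows the wheel RECORD format (PEP 427): the urlsafe-base64-encoded SHA-256 of each file's bytes, with trailing `=` padding stripped. The two empty `__init__.py` files (size 0) therefore share the digest of the empty byte string. A quick sketch of the encoding (the function name is illustrative):

```python
import base64
import hashlib


def record_digest(data: bytes) -> str:
    """Encode data's SHA-256 the way wheel RECORD files do:
    urlsafe base64 with '=' padding stripped."""
    raw = hashlib.sha256(data).digest()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")


# An empty file (size 0) yields the digest listed for __init__.py above.
print(record_digest(b""))  # 47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU
```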

top_level.txt

dual_attention