tokenizers

View on PyPI · Reverse Dependencies (565)

0.21.0 tokenizers-0.21.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
tokenizers-0.21.0-cp39-abi3-manylinux_2_17_i686.manylinux2014_i686.whl
tokenizers-0.21.0-cp39-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl
tokenizers-0.21.0-cp39-abi3-manylinux_2_17_s390x.manylinux2014_s390x.whl
tokenizers-0.21.0-cp39-abi3-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl
tokenizers-0.21.0-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl
tokenizers-0.21.0-cp39-abi3-win_amd64.whl
tokenizers-0.21.0-cp39-abi3-win32.whl
tokenizers-0.21.0-cp39-abi3-macosx_10_12_x86_64.whl
tokenizers-0.21.0-cp39-abi3-musllinux_1_2_x86_64.whl
tokenizers-0.21.0-cp39-abi3-musllinux_1_2_i686.whl
tokenizers-0.21.0-cp39-abi3-musllinux_1_2_armv7l.whl
tokenizers-0.21.0-cp39-abi3-musllinux_1_2_aarch64.whl
tokenizers-0.21.0-cp39-abi3-macosx_11_0_arm64.whl
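Each filename above follows the standard PEP 427 layout, `name-version-pythontag-abitag-platformtag.whl`. As a minimal sketch (assuming the common 5-part form with no optional build tag, which holds for every wheel listed here), the components can be pulled apart like this; the function name is illustrative, not part of any library:

```python
def parse_wheel_filename(filename: str) -> dict:
    """Split a wheel filename into its PEP 427 components.

    Assumes the common 5-part form
    name-version-python_tag-abi_tag-platform_tag.whl (no build tag).
    """
    stem = filename.removesuffix(".whl")
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "name": name,
        "version": version,
        "python": python_tag,   # e.g. cp39 = CPython 3.9
        "abi": abi_tag,         # e.g. abi3 = CPython stable ABI
        "platform": platform_tag,
    }

info = parse_wheel_filename(
    "tokenizers-0.21.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl"
)
```

The `cp39-abi3` tag pair is why a single wheel per platform covers many Python versions: the stable ABI lets one binary built against CPython 3.9 run on 3.9 and later.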

Wheel Details

Project: tokenizers
Version: 0.21.0
Filename: tokenizers-0.21.0-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Download: [link]
Size: 3008868 bytes
MD5: 805817b4adbc8e61b9484362ac488ab5
SHA256: e84ca973b3a96894d1707e189c14a774b701596d579ffc7e69debfc036a61a04
Uploaded: 2024-11-27 13:11:03 +0000
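The SHA256 above can be used to confirm an intact download. A minimal stdlib sketch (the function name is illustrative; the expected digest is the one listed for this wheel):

```python
import hashlib

def sha256_hex(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Digest from the listing above; a match confirms an intact download:
EXPECTED = "e84ca973b3a96894d1707e189c14a774b701596d579ffc7e69debfc036a61a04"
# sha256_hex("tokenizers-0.21.0-cp39-abi3-manylinux_2_17_x86_64"
#            ".manylinux2014_x86_64.whl") == EXPECTED
```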

dist-info

METADATA

Metadata-Version: 2.3
Name: tokenizers
Version: 0.21.0
Author: Anthony MOI <m.anthony.moi@gmail.com>
Author-Email: Nicolas Patry <patry.nicolas[at]protonmail.com>, Anthony Moi <anthony[at]huggingface.co>
Project-Url: Homepage, https://github.com/huggingface/tokenizers
Project-Url: Source, https://github.com/huggingface/tokenizers
Keywords: NLP,tokenizer,BPE,transformer,deep learning
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Education
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.7
Requires-Dist: huggingface-hub (<1.0,>=0.16.4)
Requires-Dist: pytest; extra == "testing"
Requires-Dist: requests; extra == "testing"
Requires-Dist: numpy; extra == "testing"
Requires-Dist: datasets; extra == "testing"
Requires-Dist: black (==22.3); extra == "testing"
Requires-Dist: ruff; extra == "testing"
Requires-Dist: sphinx; extra == "docs"
Requires-Dist: sphinx-rtd-theme; extra == "docs"
Requires-Dist: setuptools-rust; extra == "docs"
Requires-Dist: tokenizers[testing]; extra == "dev"
Provides-Extra: testing
Provides-Extra: docs
Provides-Extra: dev
Description-Content-Type: text/markdown; charset=UTF-8; variant=GFM
[Description omitted; length: 5006 characters]
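The `Requires-Dist` lines above mix one unconditional dependency (`huggingface-hub`) with dependencies gated behind the `testing`, `docs`, and `dev` extras via `; extra == "..."` markers. A minimal sketch of sorting such lines into base vs per-extra dependencies (the sample list is a subset of the metadata shown above; real metadata parsing should use a proper parser such as `packaging`):

```python
# A subset of the Requires-Dist lines from the METADATA section above.
metadata_lines = [
    'Requires-Dist: huggingface-hub (<1.0,>=0.16.4)',
    'Requires-Dist: pytest; extra == "testing"',
    'Requires-Dist: sphinx; extra == "docs"',
    'Requires-Dist: tokenizers[testing]; extra == "dev"',
]

base, extras = [], {}
for line in metadata_lines:
    req = line.removeprefix("Requires-Dist: ")
    if "; extra == " in req:
        # Conditional dependency: only installed when its extra is requested,
        # e.g. `pip install "tokenizers[testing]"`.
        dep, _, marker = req.partition("; extra == ")
        extras.setdefault(marker.strip('"'), []).append(dep.strip())
    else:
        # Unconditional dependency: installed with the base package.
        base.append(req)
```

Note the `dev` extra simply re-requires `tokenizers[testing]`, so installing `tokenizers[dev]` pulls in the testing dependencies transitively.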

WHEEL

Wheel-Version: 1.0
Generator: maturin (1.7.5)
Root-Is-Purelib: false
Tag: cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64

RECORD

Path Digest Size
tokenizers-0.21.0.dist-info/METADATA sha256=oNldYkLKpnavqOq1XABp8c_yNaR65mGu_qaFlD0St2M 6719
tokenizers-0.21.0.dist-info/WHEEL sha256=PuLiPGpD0eVcoUkb9lueobt7VbCYShlDtLaTRPpT7Z0 127
tokenizers/models/__init__.py sha256=eJZ4HTAQZpxnKILNylWaTFqxXy-Ba6OKswWN47feeV8 176
tokenizers/models/__init__.pyi sha256=clPTwiyjz7FlVdEuwo_3Wa_TmQrbZhW0SGmnNylepnY 16929
tokenizers/decoders/__init__.py sha256=hfwM6CFUDvlMGGL4-xsaaYz81K9P5rQI5ZL5UHWK8Y4 372
tokenizers/decoders/__init__.pyi sha256=U0dfPVxoGpb-RmNKzZMZebe0fK2riRMbxQh9yJMHjYE 7378
tokenizers/trainers/__init__.py sha256=UTu22AGcp76IvpW45xLRbJWET04NxPW6NfCb2YYz0EM 248
tokenizers/trainers/__init__.pyi sha256=3TwFKts4me7zQfVRcSTmtXYiP4XwcRjfAYtwqoZVtoQ 5382
tokenizers/__init__.py sha256=ZE5ZagUvobBScrHBQdEobhx4wqM0bsq9F9aLYkBNjYQ 2615
tokenizers/tools/visualizer-styles.css sha256=zAydq1oGWD8QEll4-eyL8Llw0B1sty_hpIE3tYxL02k 4850
tokenizers/tools/__init__.py sha256=xG8caB9OHC8cbB01S5vYV14HZxhO6eWbLehsb70ppio 55
tokenizers/tools/visualizer.py sha256=gi-E2NCP7FuG6ujpQOdalSTXUlaV85V6NI-ZPPTvA_4 14625
tokenizers/normalizers/__init__.py sha256=_06w4cqRItveEgIddYaLMScgkSOkIAMIzYCesb5AA4U 841
tokenizers/normalizers/__init__.pyi sha256=dwfVsvg0YbeYoAaBSmKsImqL-tyxiDyHaaTFsZK4aZw 20897
tokenizers/implementations/base_tokenizer.py sha256=2TFZhLupaJiMDYGJuUNmxYJv-cnR8bDHmbMzaYpFROs 14206
tokenizers/implementations/__init__.py sha256=VzAsplaIo7rl4AFO8Miu7ig7MfZjvonwVblZw01zR6M 310
tokenizers/implementations/sentencepiece_unigram.py sha256=SYiVXL8ZtqLXKpuqwnwmrfxgGotu8yAkOu7dLztEXIo 7580
tokenizers/implementations/char_level_bpe.py sha256=Q2ZEAW0xMQHF7YCUtmplwaxbU-J0P2NK4PJGMxUb-_c 5466
tokenizers/implementations/bert_wordpiece.py sha256=sKCum0FKPYdSgJFJN8LDerVBoTDRSqyqSdrcm-lvQqI 5520
tokenizers/implementations/sentencepiece_bpe.py sha256=LwrofoohnUfME2lK2lQYoyQIhP84RP0CIlHRaj0hyNs 3738
tokenizers/implementations/byte_level_bpe.py sha256=OA_jyy3EQmYTa6hnf-EKwLOFuyroqFYOJz25ysM2BUk 4289
tokenizers/processors/__init__.py sha256=xM2DEKwKtHIumHsszM8AMkq-AlaqvBZFXWgLU8SNhOY 307
tokenizers/processors/__init__.pyi sha256=hx767ZY8SHhxb_hiXPRxm-f_KcoR4XDx7vfK2c0lR-Q 11357
tokenizers/pre_tokenizers/__init__.py sha256=wd6KYQA_RsGSQK-HeG9opTRhv4ttSRkyno2dk6az-PM 557
tokenizers/pre_tokenizers/__init__.pyi sha256=dLtaxOgcBa85vQC6byvfKGCOWTEi4c42IcqimfatksQ 23602
tokenizers/__init__.pyi sha256=jw34WZXaYu8NBBJ2_cypfOqJYxI7CXKPzlveisXw4XQ 40182
tokenizers/tokenizers.abi3.so sha256=uPe1mVjvIUFcXkPyyik0lcgOjPT3LKlTtAOTuDhZAN0 8942016
tokenizers-0.21.0.dist-info/RECORD
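The digests in the RECORD table are not hex: per the wheel RECORD format, each is the URL-safe base64 encoding of the SHA-256 digest with trailing `=` padding stripped. A minimal sketch (the function name is illustrative):

```python
import base64
import hashlib

def record_digest(data: bytes) -> str:
    """Compute a wheel-RECORD-style digest: urlsafe base64 of the
    SHA-256 digest, with trailing '=' padding stripped."""
    raw = hashlib.sha256(data).digest()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")

# Each RECORD row above pairs a path with "sha256=" + record_digest(file_bytes)
# and the file size in bytes; the RECORD file itself is listed without a digest.
```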