TakeSentenceTokenizer

View on PyPI · Reverse Dependencies (6)

1.0.2 TakeSentenceTokenizer-1.0.2-py3-none-any.whl

Wheel Details

Project: TakeSentenceTokenizer
Version: 1.0.2
Filename: TakeSentenceTokenizer-1.0.2-py3-none-any.whl
Download: [link]
Size: 408226 bytes
MD5: 5735c9974532de2b8467125f894cfcf9
SHA256: f9f533ef3a880c1ac01859f19bf0de44105e96a4615a8f8d67f5fb4a50465532
Uploaded: 2022-08-16 15:03:34 +0000
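The SHA256 listed above can be checked against a locally downloaded copy of the wheel. A minimal sketch (the local filename is an assumption; adjust the path to wherever the wheel was saved):

```python
import hashlib

def sha256_hex(path: str, chunk_size: int = 8192) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against the digest listed above (wheel assumed to be in the
# current directory):
# expected = "f9f533ef3a880c1ac01859f19bf0de44105e96a4615a8f8d67f5fb4a50465532"
# assert sha256_hex("TakeSentenceTokenizer-1.0.2-py3-none-any.whl") == expected
```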

dist-info

METADATA

Metadata-Version: 2.1
Name: TakeSentenceTokenizer
Version: 1.0.2
Summary: TakeSentenceTokenizer is a tool for tokenizing and pre-processing messages
Author: Karina Tiemi Kato
Author-Email: karinat[at]take.net
Keywords: Tokenization
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Requires-Dist: emoji (==1.7.0)
Description-Content-Type: text/markdown
[Description omitted; length: 1876 characters]

WHEEL

Wheel-Version: 1.0
Generator: bdist_wheel (0.37.1)
Root-Is-Purelib: true
Tag: py3-none-any

RECORD

Path Digest Size
SentenceTokenizer/__init__.py sha256=uXyzJs2041DZYI42duFrg4M-_4eiYCETduWTodUYVvY 40
SentenceTokenizer/tokenizer.py sha256=5b4j2btf61lrrMPXa0la70H4dAWsnL7NF8gdV2MeibI 10476
SentenceTokenizer/dictionaries/Titan_v2_dict_without_accentuation.json sha256=wR9it9B0dlrSa6EgkRKC3Npr9jki_AuZ-J_IVDgtI7s 527431
SentenceTokenizer/dictionaries/all_portuguese_words.txt sha256=8M8ZCU-WOeHDNT8NeEfCANXSiPn5KuOlzkm8fDp_93s 915376
SentenceTokenizer/dictionaries/correction_dict.json sha256=QKFTbfgrfYqw9_U0-P8cEx0nOwME9wWMI2QW3CG8xSo 2281
TakeSentenceTokenizer-1.0.2.dist-info/METADATA sha256=buqGL7hT8BE2vfOJ695cCRw4GMXstVQEYMrnScxEwp0 2385
TakeSentenceTokenizer-1.0.2.dist-info/WHEEL sha256=G16H4A3IeoQmnOrYV4ueZGKSjhipXx8zc8nu9FGlvMA 92
TakeSentenceTokenizer-1.0.2.dist-info/top_level.txt sha256=gPBMGOAnDHd-n7DS3AMdG4Ci6evJ5DWKyl0vgGaSQSg 18
TakeSentenceTokenizer-1.0.2.dist-info/RECORD
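Note that the digests in this table are not hex: per the wheel format (PEP 427, building on PEP 376), RECORD stores each file's SHA-256 digest as urlsafe base64 with the trailing `=` padding stripped, in a plain CSV file of path, digest, size (both fields left empty for RECORD itself). A minimal sketch of computing and parsing that form:

```python
import base64
import csv
import hashlib
import io

def record_digest(data: bytes) -> str:
    """Digest in wheel-RECORD form: urlsafe base64 of the SHA-256
    hash, with trailing '=' padding stripped."""
    raw = hashlib.sha256(data).digest()
    return "sha256=" + base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")

def parse_record(text: str):
    """Split RECORD's CSV rows into (path, digest, size) tuples."""
    return [tuple(row) for row in csv.reader(io.StringIO(text))]
```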

top_level.txt

SentenceTokenizer
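As top_level.txt shows, the install name and the import name differ: the project is installed as `pip install TakeSentenceTokenizer==1.0.2`, but the code imports as `SentenceTokenizer`. A small sketch for checking availability before importing:

```python
import importlib.util

def import_name_available(name: str) -> bool:
    """True if a top-level module/package `name` is importable here."""
    return importlib.util.find_spec(name) is not None

# The PyPI project name is TakeSentenceTokenizer, but per top_level.txt
# the installed package imports as SentenceTokenizer:
if import_name_available("SentenceTokenizer"):
    import SentenceTokenizer  # noqa: F401
else:
    print("not installed; run: pip install TakeSentenceTokenizer==1.0.2")
```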