Reverse Dependencies of simpletransformers
The following projects have a declared dependency on simpletransformers:
- animated-memory — A news aggregator with local training
- autotransformers — A Python package for automatic training and benchmarking of language models.
- chem-classification — Deep learning for SMILES win/loss evaluation.
- chess-classification — Deep learning for FEN win/loss evaluation.
- clinitokenizer — Sentence tokenizer for text from clinical notes.
- condolence-models — Detecting condolence, distress, and empathy in text
- datascience-toolkits — A package containing some handy and useful modules for practical data science.
- dmol-book — Style and Imports for dmol Book
- general-text-classifier — General Text Classification Library
- maitag — MAIT - Machine-Assisted Intent Tagging
- nlpaf — Implementation of different NLP tools.
- nlpboost — A package for automatic training of NLP (transformers) models
- omdenalore — AI for Good library
- respunct — An easy-to-use package to restore punctuation of Portuguese texts.
- rpunct — An easy-to-use package to restore punctuation of text.
- s2aff — Semantic Scholar's Affiliation Extraction: Link Your Raw Affiliations to ROR IDs
- smaberta — A wrapper for the Hugging Face Transformers library
- social-net-img-classifier — no summary
- textmining_utility — Text-mining package that builds on existing libraries
- tuhlbox — Personal toolbox of language processing models.
- verysimpletransformers — Very Simple Transformers provides a simplified interface for packaging, deploying, and serving Transformer models. It uses a custom .vst file format, CLI, and Python library to make sharing and using pretrained models easy.
- zyl-utils — optimizer
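A "declared dependency" here means the project lists simpletransformers in its package metadata (a `Requires-Dist` entry). As a minimal sketch, the check can be reproduced locally for any installed package with the standard library's `importlib.metadata`; the helper names below (`requirement_name`, `declares_dependency`) are illustrative, not part of any of the listed packages.

```python
import re
from importlib.metadata import requires, PackageNotFoundError

def requirement_name(req: str) -> str:
    """Extract the bare project name from a Requires-Dist string,
    e.g. "simpletransformers (>=0.63) ; extra == 'nlp'" -> "simpletransformers"."""
    return re.split(r"[\s<>=!~;\[(]", req.strip(), maxsplit=1)[0].lower()

def declares_dependency(package: str, dependency: str) -> bool:
    """True if the installed `package` declares `dependency` in its metadata."""
    try:
        reqs = requires(package) or []  # None when no requirements are declared
    except PackageNotFoundError:
        return False
    return any(requirement_name(r) == dependency.lower() for r in reqs)

# Example: check any installed project against simpletransformers.
# declares_dependency("rpunct", "simpletransformers")
```

Note that this inspects installed distributions only; the listing above is built from metadata published on the package index, which may differ from what is installed locally.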