Reverse Dependencies of rjieba
The following projects have a declared dependency on rjieba:
- adapter-transformers — A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models
- adapters — A Unified Library for Parameter-Efficient and Modular Transfer Learning
- amulety — Python package to create embeddings of BCR amino acid sequences.
- ExpoSeq — A package which provides various ways to analyze NGS data from phage display campaigns
- mw-adapter-transformers — A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models
- optimum — Optimum Library is an extension of the Hugging Face Transformers library, providing a framework to integrate third-party libraries from Hardware Partners and interface with their specific functionality.
- optimum-intel — Optimum Library is an extension of the Hugging Face Transformers library, providing a framework to integrate third-party libraries from Hardware Partners and interface with their specific functionality.
- optimum-neuron — Optimum Neuron is the interface between the Hugging Face Transformers and Diffusers libraries and AWS Trainium and Inferentia accelerators. It provides a set of tools enabling easy model loading, training and inference on single and multiple neuron core settings for different downstream tasks.
- roformer — PyTorch implementation of RoFormer (rotary position embedding Transformer)
- transformers — State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow