Reverse Dependencies of einops
The following projects have a declared dependency on einops (a brief sketch of what declaring and using einops looks like appears after the list):
- llm-foundry — LLM Foundry
- LLM-keyword-extractor — A Python package to extract keywords from text using LLMs
- llm-quantkit — CLI tool for downloading and quantizing LLMs
- llm-rewards — Lean, modular reward functions for RL training with LLMs
- llm-rs — Unofficial python bindings for llm-rs. 🐍❤️🦀
- llm-rs-cuda — Unofficial python bindings for llm-rs.
- llm-rs-metal — Unofficial python bindings for llm-rs. 🐍❤️🦀
- llm-rs-opencl — Unofficial python bindings for llm-rs. 🐍❤️🦀
- llm-sentence-transformers — Use sentence-transformers for embeddings with LLM
- llm-serve — An LLM inference solution for quickly deploying production LLM services
- llm-toolkit — LLM Finetuning resource hub + toolkit
- llmtuner — Easy-to-use LLM fine-tuning framework
- lm-buddy — Ray-centric library for finetuning and evaluation of (large) language models.
- lm-polygraph — Uncertainty Estimation Toolkit for Transformer Language Models
- lmetric — Large Model Metrics
- lmms-eval — A framework for evaluating large multimodal language models
- local-attention — Local attention, window with lookback, for language modeling
- local-attention-flax — Local Attention - Flax Module in Jax
- local-attention-tf — Local attention, window with lookback, for Image, Audio and language modeling
- locking-activations — Locking activations via k-sparse autoencoders.
- logavgexp-pytorch — LogAvgExp - Pytorch
- logix-ai — AI Logging for Interpretability and Explainability
- long-short-transformer — Long Short Transformer - Pytorch
- LongNet — LongNet - Pytorch
- Lossless-BS-RoFormer — Lossless BS-RoFormer - Band-Split Rotary Transformer for SOTA Music Source Separation
- lslcharge — A deep learning toolkit for proteomics, equipped with a few tools for network recycling.
- lsts — A lightweight, fast, advanced deep learning time-series package for long- and short-term forecasting and missing-value imputation of land surface variables.
- luffy — Deep Learning - Pytorch
- lumiere — Paper - Pytorch
- lumiere-pytorch — Lumiere
- lvsm-pytorch — LVSM - Pytorch
- lycoris-lora — Lora beYond Conventional methods, Other Rank adaptation Implementations for Stable diffusion
- m2pt — M2PT - Pytorch
- magic-pdf — A practical tool for converting PDF to Markdown
- magnet-pinn — A software package for simulating EM fields using neural networks
- magvit2-pytorch — MagViT2 - Pytorch
- make-a-video-pytorch — Make-A-Video - Pytorch
- mamba-former — Paper - Pytorch
- mamba-lens — TransformerLens port for Mamba
- mamba-r1 — Mamba R1 - Pytorch
- mambabyte — MambaByte - Pytorch
- mambatransformer — MambaTransformer - Pytorch
- mambavision — MambaVision: A Hybrid Mamba-Transformer Vision Backbone
- mambo-minhhai — A small example package
- mambular — A Python package for tabular deep learning with Mamba blocks.
- mammoth-nlp — Massively Multilingual Modular Open Translation @ Helsinki
- MaMMUT-pytorch — MaMMUT - Pytorch
- mantis-tsfm — Mantis: Lightweight Calibrated Foundation Model for User-Friendly Time Series Classification
- marcopolo-pytorch — MarcoPolo: a method to discover differentially expressed genes in single-cell RNA-seq data without depending on prior clustering
- marge-pytorch — Marge - Pytorch
- marinedebrisdetector — A detector of marine debris in Sentinel-2 scenes
- marlin-pytorch — Official PyTorch implementation of MARLIN.
- mase-tools — Machine-Learning Accelerator System Exploration Tools
- maskbit-pytorch — MaskBit
- maskinversion-torch — MaskInversion
- matcha-tts — 🍵 Matcha-TTS: A fast TTS architecture with conditional flow matching
- mblm — Multiscale Byte Language Model
- mcvit — Paper - Pytorch
- med-seg-diff-pytorch — MedSegDiff - SOTA medical image segmentation - Pytorch
- medim — medim is an all-in-one tool for medical image segmentation.
- MedPalm — MedPalm - Pytorch
- medvae — MedVAE is a family of six medical image autoencoders that can encode high-dimensional medical images into latent representations.
- Mega-pytorch — Mega - Pytorch
- mega-vit — mega-vit - Pytorch
- MEGABYTE-pytorch — MEGABYTE - Pytorch
- megatron-core — Megatron Core - a library for efficient and scalable training of transformer based models
- memesdb — Index and search your meme stash with AI
- memformer — Memformer - Pytorch
- memorizing-transformers-pytorch — Memorizing Transformer - Pytorch
- memory-efficient-attention-pytorch — Memory Efficient Attention - Pytorch
- memos — A package for memos
- meshgpt-pytorch — MeshGPT Pytorch
- metabolism — Multi-organ metabolic analysis framework
- metaego — Official repo for Meta-models.
- metaformer-gpt — Metaformer - GPT
- MetaQ-sc — The implementation of the paper 'MetaQ: fast, scalable and accurate metacell inference via deep single-cell quantization'. Please refer to the paper and code repository (https://github.com/XLearning-SCU/MetaQ) for more details.
- metatreelib — PyTorch Implementation for MetaTree: Learning a Decision Tree Algorithm with Transformers
- metnet — PyTorch MetNet Implementation
- metnet3 — Metnet - Pytorch
- metnet3-pytorch — MetNet 3 - Pytorch
- mgqa — mgqa - Pytorch
- mh-moe — Paper - Pytorch
- microsoft-aurora — Implementation of the Aurora model
- miiiii — mechanistic interpretability of irreducible integer identifiers
- miles-credit — (no description provided)
- mindstudio-probe — Ascend Probe Utils
- minestudio — A simple and efficient Minecraft development kit.
- minGRU-pytorch — minGRU
- mini-dust3r — Miniature version of dust3r, focused on inference
- mini-transformer — An advanced minimal transformer component framework
- minimagen — Minimal Imagen text-to-image model implementation.
- miniminiai — A mini version of fastai's miniai
- mirasol-pytorch — Mirasol - Pytorch
- MIRTorch — A PyTorch-based image reconstruction toolbox
- mist-medical — MIST is a simple, fully automated framework for 3D medical imaging segmentation.
- mist-vae — MIST: an interpretable and flexible deep learning framework for single-T cell transcriptome and receptor analysis
- mistral-jax — JAX implementation of the Mistral model.
- mistral-v0.2-jax — JAX implementation of the Mistral v0.2 base model.
- mixture-of-attention — Mixture of Attention
- ml-energy — PhD work/exploration @ FEUP, Porto
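Each entry above is a package whose metadata names einops as an install requirement. As a minimal sketch of what such a declared dependency looks like in practice (the shapes and patterns below are illustrative only and are not taken from any listed project):

```python
# A project declares einops in its packaging metadata, e.g. in pyproject.toml:
#   dependencies = ["einops"]
# and then uses it for readable tensor reshaping. einops' core API is
# rearrange / reduce / repeat; the tensor shapes here are made up for illustration.
import torch
from einops import rearrange, reduce

x = torch.randn(2, 3, 32, 32)  # (batch, channels, height, width)

# Flatten the spatial dims into a token axis, as many ViT-style packages above do.
tokens = rearrange(x, "b c h w -> b (h w) c")
print(tokens.shape)  # torch.Size([2, 1024, 3])

# Global average pooling expressed as an einops reduction.
pooled = reduce(x, "b c h w -> b c", "mean")
print(pooled.shape)  # torch.Size([2, 3])
```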