Reverse Dependencies of mistralai
The following projects have a declared dependency on mistralai:
- admyral — no summary
- ai-terminal — Interface to interact with a chatbot in Linux terminal
- api4all — Easy-to-use LLM APIs from state-of-the-art providers, with comparison
- arena-client — A client to use arena AI
- arize-phoenix-evals — LLM Evaluations
- autogen — A programming framework for agentic AI
- bambooai — A lightweight library for working with pandas dataframes using natural language queries
- bigcodebench — Evaluation package for BigCodeBench
- bl-vanna — Generate SQL queries from natural language
- camel-ai — Communicative Agents for AI Society Study
- chat-rag — no summary
- distilabel — Distilabel is an AI Feedback (AIF) framework for building datasets with and for LLMs.
- ecologits — EcoLogits tracks and estimates the energy consumption and environmental impacts of using generative AI models through APIs.
- eevee-chat — A single chat interface for multiple LLMs
- elsagendev — Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework
- gdm-concordia — A library for building a generative model of social interactions.
- gptcachelite — LLM (OpenAI and Mistral) API Wrapper with Semantic Caching via Vlite2. See more at https://github.com/raydelvecchio/gptcachelite.
- h2ogpt — no summary
- hexamind — Hexamind library to implement RAG solutions
- infinity-llm — use any llm api in a plug-and-play fashion
- inspect-ai — Framework for large language model evaluations
- instructor — structured outputs for llm
- knowledge-engineer — Engineer GPT Knowledge within a project
- langtrace-python-sdk — Python SDK for LangTrace
- latentscope — Quickly embed, project, cluster and explore a dataset.
- llama-index-embeddings-mistralai — llama-index embeddings mistralai integration
- llama-index-llms-mistralai — llama-index llms mistral ai integration
- llm-engines — A unified inference engine for large language models (LLMs) including open-source models (VLLM, SGLang, Together) and commercial models (OpenAI, Mistral, Claude).
- llm-taxi — Call LLM as easily as calling a taxi.
- llmmaster — A unified interface for interacting with multiple LLMs and generative AIs.
- llmonpy — AI pipeline framework for Python.
- lm-buddy — Ray-centric library for finetuning and evaluation of (large) language models.
- log10-io — Unified LLM data management
- lovelaice — An AI-powered chatbot for the terminal and editor
- miniogre — miniogre: from source code to reproducible environment, in seconds.
- mirascope — LLM abstractions that aren't obstructions
- MistralGPTIntegration — Integration utility for Mistral AI API to provide GPT-based functionalities.
- multiai — A Python library for text-based AI interactions
- mypyautogen20240904 — Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework
- opendatagen — Data preparation system to build controllable AI system
- openinference-instrumentation-mistralai — OpenInference Mistral AI Instrumentation
- optimodel-server — A smart framework for calling models in the most efficient way possible
- outerop — no summary
- parsee-core — no summary
- penelopa — Penelopa: AI-driven codebase modifier using OpenAI GPT models
- pilot-fusion — A package for generating code using various AI models.
- prem-utils — Prem generic utils to use across Prem Components.
- proxai — ProxAI is a lightweight abstraction layer for foundational AI model connections.
- proxyllm — LLM Proxy to reduce cost and complexity of using multiple LLMs
- py-llm-core — PyLLMCore provides a light-weighted interface with LLMs
- PyAutoGen — A programming framework for agentic AI
- pydevgpt — Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework
- pyllms — Minimal Python library to connect to LLMs (OpenAI, Anthropic, Google, Mistral, Reka, Groq, Together, Ollama, AI21, Cohere, Aleph-Alpha, HuggingfaceHub), with a built-in model performance benchmark.
- pyText2Sql — Generate SQL queries from natural language
- redisvl — Python client library and CLI for using Redis as a vector database
- scorecard-ai — no summary
- semantic-kernel — Semantic Kernel Python SDK
- semantic-router — Super fast semantic router for AI decision making
- semantix — Give Superpowers to your python function. GenAI Application development made easy.
- semroute — SemRoute is a semantic router that helps you route using the semantic meaning of the query
- sibila — Structured queries from local or online LLM models
- smartenough — Convert cheap LLMs into efficient, validated API calls.
- structgenie — LLM Generation Framework for structured outputs with type validation
- swarmauri — This repository includes core interfaces, standard ABCs and concrete references, third party plugins, and experimental modules for the swarmaURI framework.
- text2sql-metadata-filterer — Generate SQL queries from natural language
- unillm — Unified Large Language Model Interface for ChatGPT, LLaMA, Mistral, Claude, and RAG
- unstract-sdk — A framework for writing Unstract Tools/Apps
- utils-ai-nuuuwan — Utilities with AI support
- vanna — Generate SQL queries from natural language
- vanna-scalegen — Generate SQL queries from natural language
- wyn-agent — A simple yet powerful interface to create AI chatbots using the Mistral AI platform
- yosemite — yosemite
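A "declared dependency" as listed above is simply an entry in a project's packaging metadata. As a sketch (the project name and version bound here are hypothetical, not taken from any project in the list), a PEP 621-style `pyproject.toml` declaration looks like:

```toml
[project]
name = "example-project"        # hypothetical project name
version = "0.1.0"
dependencies = [
    "mistralai>=1.0",           # this entry is what package indexers pick up
]
```

Indexers read this field (exposed as `Requires-Dist` in the built distribution's metadata) to build reverse-dependency listings like the one above.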