Reverse Dependencies of langchain-openai
The following projects have a declared dependency on langchain-openai (a minimal sketch of what such a dependency looks like in practice follows the list):
- langchain-compressa — An integration package connecting Compressa and LangChain
- langchain-dartmouth — LangChain components for Dartmouth-hosted models.
- langchain-datalayer — no summary
- langchain-deepseek — An integration package connecting DeepSeek and LangChain
- langchain-deepseek-official — An integration package connecting DeepSeek and LangChain
- langchain-fmp-data — An integration package connecting FmpData and LangChain
- langchain-googledrive — This is a more advanced integration of Google Drive with langchain.
- langchain-icosa — LangChain plugin for Icosa Computing's Combinatorial Reasoning generative AI pipeline.
- langchain-jupyter — no summary
- langchain_llamacpp_chat_model — no summary
- langchain_llm_utils — Utilities module for building LLM based apps
- langchain-mcp-tools — Model Context Protocol (MCP) To LangChain Tools Conversion Utility
- langchain-modelscope-integration — An integration package connecting ModelScope and LangChain
- langchain-nexus — Langchain-Nexus is a Python library enabling easy integration with diverse language models like ChatGPT and GLM through a unified interface.
- langchain-openai-api-bridge — A bridge to use Langchain output as an OpenAI-compatible API.
- langchain-openrouter-chat — no summary
- langchain-pangu — langchain-pangu
- langchain-pipeshift — An integration package connecting Pipeshift and LangChain
- langchain-together — An integration package connecting Together AI and LangChain
- langchain-tools — Simplifying, enhancing, and extending the LangChain library functionality
- langchain-upstage — An integration package connecting Upstage and LangChain
- langchain-xai — An integration package connecting xAI and LangChain
- LangChainBridge — A unified wrapper for various LLM providers using LangChain.
- langchainmsai — Building applications with LLMs through composability
- langevals-legacy — LangEvals Legacy evaluator
- langevals-ragas — LangEvals Ragas evaluator
- langflow — A Python package with a built-in web application
- langflow-law — A Python package with a built-in web application
- langflow-nightly — A Python package with a built-in web application
- langfuzz — Project to fuzz language model applications
- langgraph_agents — no summary
- langgraph-studio — no summary
- langgraph-tracer — A small package for langgraph trace
- langinfra — A Python package with a built-in web application
- lango-cli-beta — no summary
- langrade — A library for grading documents using LLMs
- langsmith-evaluation-helper — Helper library for langsmith evaluation
- langswarm-core — A core framework for multi-agent LLM ecosystems
- langtask — A Python library for structured LLM development with schema validation
- langtrace-python-sdk — Python SDK for LangTrace
- language-transfer-flashcards — CLI tool converting Language Transfer lessons into Anki flashcards, automating content extraction for efficient language learning.
- langwatch — Python SDK for LangWatch for monitoring your LLMs
- lcstack — no summary
- Lense — For Q&A
- lib-resume-builder-AIHawk — A package to generate AI-assisted resumes using GPT models
- libro-ai — libro ai
- lightdash-ai-tools — AI tools for Lightdash
- linkedin-influencer-mcp — LinkedIn influencer automation with MCP
- livia — Add your description here
- llama-cookbook — Llama-cookbook is a companion project to the Llama models. Its goal is to provide examples for quickly getting started with fine-tuning for domain adaptation and for running inference on the fine-tuned models.
- llama-github — Llama-github is an open-source Python library that empowers LLM Chatbots, AI Agents, and Auto-dev Agents to conduct Retrieval from actively selected GitHub public projects. It Augments through LLMs and Generates context for any coding question, in order to streamline the development of sophisticated AI-driven applications.
- llama-recipes — This is a compatibility package to keep projects built on llama-recipes compatible with the new name llama-cookbook. If you're updating your project or starting a new one, please use the llama-cookbook package.
- llm-adaptive-router — An adaptive router for LLM model selection
- llm-app-test — A behavioral testing library for LLM applications that allows developers to write natural language specifications for unit and integration tests. Validate LLM application behavior using plain English assertions in a simple assert(str, str) form factor.
- llm-change-agent — llm-change-agent
- llm-cluster-optimizer — A drop-in replacement for sklearn clustering models, this package optimizes sklearn clusters using LLMs
- llm_foundation — LLM Foundation Tools
- llm-helpers — A helper package to work with LLMs
- llm-reflection — This system utilizes a large language model (LLM) and reflection
- llm-research — A minimum Python package built on top of the LangChain framework to interact with LLM.
- llm-roleplay — LLM Roleplay: Simulating Human-Chatbot Interaction
- llm-snowglobe — Snow Globe multi-agent system for open-ended wargames with large language models
- llm-term — A simple CLI to chat with LLM Models
- llmcompiler — LLMCompiler
- llmdantic — LLMdantic is a Python package that provides structured interaction with LLMs.
- llmdocparser — Using LLM to parse PDF and get better chunk for retrieval
- llmesh — HPE LLM Agentic Tool Mesh Platform is an innovative platform designed to streamline and enhance the use of AI in various applications. It serves as a central hub to orchestrate 'Intelligent Plugins,' optimizing AI interactions and processes.
- LLMeta — A tool for systematic reviews using large language models with RAG and HyDE
- llmgen — Effortlessly generate LLM APIs by simply defining input and output schemas.
- llmloader — Loads a Langchain LLM by model name as a string.
- llmservice — For creating fast and easy LLM apps
- lm-buddy — Ray-centric library for finetuning and evaluation of (large) language models.
- lmconf — no summary
- lmtr-agents — Various LLM agents (agent_chat, agent_tr, agent_ref, agent_imp, agent_comb) for translation
- local-operator — A Python-based agent for local command execution
- longtrainer — Production Ready LangChain
- lovecraft — Converse with H.P. Lovecraft via Retrieval Augmented Generation.
- lpw — Local Packet Whisperer (LPW): chat with PCAP/PCAPNG files locally, privately!
- lumivor — Seamlessly Integrating AI with the Web
- maeser — A package for building RAG chatbot applications for educational contexts.
- mainframe-orchestra — Mainframe-Orchestra is a lightweight, open-source agentic framework for building LLM based pipelines and self-orchestrating multi-agent teams
- marinabox — An open-source toolkit for deploying containerized desktops and browsers tailored for AI agents
- materials-eunomia — Chemist AI Agent for Developing Materials Datasets with Natural Language Prompts
- mcp_server_browser_use — MCP server for browser-use
- megaparse — no summary
- memary — Longterm Memory for Autonomous Agents
- memoripy — Memoripy provides context-aware memory management with support for OpenAI and Ollama APIs, offering structured short-term and long-term memory storage for interactive applications.
- metatool — A Python interface for metatool functionality
- microlearn-llm-factory — A python package for managing microlearn LLM interaction classes
- miksi-ai-sdk — Miksi-AI empowers your BI
- miksiai — Miksi-AI empowers your BI
- mimir-ai — no summary
- mintii — Mintii Router: Intelligent LLM Model Selection and Optimization Library. Automatically selects the best Large Language Model for each prompt, optimizing for cost, quality, and performance.
- miracle-helper — MIRACLE.cowf LangChain Helper
- mirai-ai — Your Autonomous AI Agent
- ml-ai-ops — no summary
- mlcopilot — Assistant for data scientists and machine learning developers.
- mle-core — Core modules necessary during application development
- mlx-use — Make macOS apps accessible to AI agents
- mmon — A customizable chat bot.
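Each project above declares langchain-openai in its dependency metadata and typically imports its chat or embedding classes in code. The sketch below is an illustrative example of that pattern, not taken from any specific project in the list; the version specifier and model name are assumptions.

```python
# Minimal sketch of a declared dependency on langchain-openai.
# The dependency is listed in the downstream project's metadata, e.g. pyproject.toml:
#
#   [project]
#   dependencies = ["langchain-openai>=0.2"]
#
# and the package is then imported and used in code (model name is illustrative):
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)  # expects OPENAI_API_KEY in the environment
reply = llm.invoke("Summarize what a reverse dependency is in one sentence.")
print(reply.content)
```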