Reverse Dependencies of google-cloud-aiplatform
The following projects have a declared dependency on google-cloud-aiplatform (a brief sketch of checking such a declaration and initializing the SDK follows the list):
- llama-index-vector-stores-vertexaivectorsearch — llama-index vector_stores Vertex AI Vector Search integration
- llm-vertex — Plugin for LLM adding support for Google Cloud Vertex AI
- llmware — An enterprise-grade LLM-based development framework, tools, and fine-tuned models
- log10-io — Unified LLM data management
- lpw — Local Packet Whisperer (LPW): chat with PCAP/PCAPNG files locally and privately
- lv-vectordb-gcp — A package for GCP, LangChain, and BigQuery Vector store integration
- magemaker — A CLI tool for fine-tuning and deploying open-source models
- mayan-document-classifier — Document classifier
- milocode — The all-in-one voice SDK
- mirascope — LLM abstractions that aren't obstructions
- modelsmith — Get Pydantic models and Python types as LLM responses from Google Vertex AI and OpenAI models.
- muzlin — Muzlin: a filtering toolset for semantic machine learning
- mypyautogen20240904 — Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework
- neo4j-graphrag — Python package to allow easy integration to Neo4j's GraphRAG features
- niwatoko — A new programming language that lets you program in natural language
- openfeet — OpenHands: Code Less, Make More
- openhands-ai — OpenHands: Code Less, Make More
- openhands-ai-test — OpenHands: Code Less, Make More
- openinference-instrumentation-vertexai — OpenInference VertexAI Instrumentation
- openrelik-ai-common — Common utilities for OpenRelik AI functionality
- orient-express — A library to simplify model deployment to Vertex AI
- pandasai — Chat with your database (SQL, CSV, pandas, polars, mongodb, noSQL, etc). PandasAI makes data analysis conversational using LLMs (GPT 3.5 / 4, Anthropic, VertexAI) and RAG.
- pandasai-google — Google AI integration for PandasAI
- pano-airflow — Programmatically author, schedule and monitor data pipelines
- parlant — no summary
- pascalnobereit-langchain-google-vertexai — An integration package connecting Google VertexAI and LangChain
- persona_ai — A distributed AI agent system built on Google Vertex AI large language models such as Gemini-pro, Text-Bison, and Code-Bison. Developed by Applica Software Guru for scalable, high-performance, and intelligent operations across varied data sets and applications, it uses Vertex AI to deliver insights and automation for AI-driven analytics and decision-making.
- phasellm — Wrappers for common large language models (LLMs) with support for evaluation.
- pillar1 — Official package for Pillar1 company
- platform-gen-ai — Pipeline code for accelerating solution accelerators
- playbooks — A framework for creating AI agents using human-readable playbooks
- pr-action — KhulnaSoft PR-Assistant helps efficiently review and handle pull requests by providing AI feedback and suggestions.
- pr-agent — CodiumAI PR-Agent helps efficiently review and handle pull requests by providing AI feedback and suggestions.
- pr-assist — KhulnaSoft PR-Assistant helps efficiently review and handle pull requests by providing AI feedback and suggestions.
- pr-insight — KhulnaSoftAI PR-Insight helps efficiently review and handle pull requests by providing AI feedback and suggestions.
- prefect-gcp — Prefect integrations for interacting with Google Cloud Platform.
- PromptMeteo — Enables the use of LLMs as conventional ML models
- prompto — Library for asynchronous querying of LLM API endpoints and logging progress
- promptweaver — PromptWeaver streamlines prompt development and management in Generative AI workflows
- proxyllm — LLM Proxy to reduce cost and complexity of using multiple LLMs
- prr — Command-line LLM prompt runner
- pvirie-utils — PVirie's python utility functions
- PyAutoGen — A programming framework for agentic AI
- pydevgpt — Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework
- pyllms — Minimal Python library to connect to LLMs (OpenAI, Anthropic, Google, Mistral, OpenRouter, Reka, Groq, Together, Ollama, AI21, Cohere, Aleph-Alpha, HuggingfaceHub), with a built-in model performance benchmark.
- pyText2Sql — Generate SQL queries from natural language
- python-jsonllm — LLM please cast to JSON
- python-lilypad — An open-source prompt engineering framework.
- redisvl — Python client library and CLI for using Redis as a vector database
- refuel-autolabel — Label, clean and enrich text datasets with LLMs
- scikit-llm — Scikit-LLM: Seamlessly integrate powerful language models like ChatGPT into scikit-learn for enhanced text analysis tasks.
- searchbible — Search Bible AI - Integrate Unique Bible App resources with AI tools
- searchbibleai — Search Bible AI - Integrate Unique Bible App resources with AI tools
- seed-autogen — A programming framework for agentic AI
- seed-pyautogen — A programming framework for agentic AI
- sekvo — Your project description
- semantic-kernel — Semantic Kernel Python SDK
- semantic-router — Super fast semantic router for AI decision making
- skyvern-client — A lightweight Python client for Skyvern
- smartjob — Little async python library for dealing with GCP/Cloud Run Jobs and GCP/VertexAI CustomJobs
- sunholo — Large Language Model DevOps - a package to help deploy LLMs to the Cloud.
- symposium — Interaction of multiple language models
- test_pkg_hmoazam — DSPy
- text-machina — Text Machina: Seamless Generation of Machine-Generated Text Datasets
- text2sql-metadata-filterer — Generate SQL queries from natural language
- tfx — TensorFlow Extended (TFX) is a TensorFlow-based general-purpose machine learning platform implemented at Google.
- tfx-helper — A helper library for TFX
- the-real-genotools — A collection of tools for genotype quality control and analysis
- totokenizers — Text tokenizers.
- unimpossible-langcraft — Framework to abstract common LLMs for completion, supporting vision and function calling into native Python
- unipipe — project_description
- utilsds — Solution for DS Team
- vanna — Generate SQL queries from natural language
- vanna-scalegen — Generate SQL queries from natural language
- vdf-io — This library uses a universal format for vector datasets to easily export and import data from all vector databases.
- vertex_ai_huggingface_inference_toolkit — 🤗 Hugging Face Inference Toolkit for Google Cloud Vertex AI (similar to SageMaker's Inference Toolkit, but unofficial)
- vertex-deployer — Check, compile, upload, run, and schedule Kubeflow Pipelines on GCP Vertex AI in a standardized manner.
- vertexai — Please run pip install vertexai to use the Vertex SDK.
- vesslflow — VESSLFlow
- voice-stream — A streaming library for creating voice bots using LLMs. Connects LLMs to speech recognition and speech synthesis APIs.
- wandb — A CLI and library for interacting with the Weights & Biases API.
- wanna-ml — CLI tool for managing ML projects on Vertex AI
- xmanager — A framework for managing machine learning experiments
- zenml — ZenML: Write production-ready ML code.
- zenml-nightly — ZenML: Write production-ready ML code.
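Every project above declares google-cloud-aiplatform (the Vertex AI Python SDK) in its package metadata. The sketch below is illustrative only: it scans the current environment for installed distributions whose requirement metadata names google-cloud-aiplatform, then initializes the SDK itself. The project ID and region are placeholder values, and the rough prefix match on requirement strings is a simplifying assumption, not a robust requirement parser.

```python
# Minimal sketch, assuming google-cloud-aiplatform is installed in the
# current environment and placeholder project/region values are replaced.
from importlib import metadata

from google.cloud import aiplatform

TARGET = "google-cloud-aiplatform"

# Find installed distributions whose Requires-Dist entries name the SDK —
# the same "declared dependency" relationship this list records.
dependents = sorted(
    dist.metadata["Name"]
    for dist in metadata.distributions()
    if any(req.lower().startswith(TARGET) for req in (dist.requires or []))
)
print(f"Installed packages depending on {TARGET}: {dependents}")

# Initialize the shared dependency itself (placeholder project and region).
aiplatform.init(project="my-gcp-project", location="us-central1")
print("Vertex AI SDK version:", aiplatform.__version__)
```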