Reverse Dependencies of snowflake-connector-python
The following projects have a declared dependency on snowflake-connector-python:
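A "declared dependency" here means the project lists snowflake-connector-python in its package metadata. In a modern Python project that typically lives in `pyproject.toml`; a minimal sketch (the project name, version, and version bound below are illustrative, not taken from any listed package):

```toml
[project]
name = "my-snowflake-etl"   # hypothetical project name
version = "0.1.0"
dependencies = [
    # A lower-bound pin is common; the exact constraint varies by project.
    "snowflake-connector-python>=3.0",
]
```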
- acedeploy — Deployment framework for Snowflake DWH
- acepolicies — CMP and RAP policy framework for Snowflake DWH
- acryl-datahub — A CLI to work with DataHub metadata
- acryl-datahub-airflow-plugin — Datahub Airflow plugin to capture executions and send to Datahub
- acryl-datahub-dev — A CLI to work with DataHub metadata
- acryl-datahub-local — A CLI to work with DataHub metadata
- adapta — Logging, data connectors, monitoring, secret handling and general lifehacks to make data people's lives easier.
- aigc-evals — aigc_evals
- airbyte — PyAirbyte
- airbyte-lib — AirbyteLib
- aladdinsdk — AladdinSDK
- alcedo-pdbc — no summary
- alo7-airflow — Programmatically author, schedule and monitor data pipelines
- altimate-dataminion — Internal package. Use this at your own risk, support not guaranteed
- amazon-sagemaker-sql-execution — SageMaker SQL Execution library
- amundsen-databuilder — Amundsen Data builder
- amundsen-databuilder-azure — Amundsen Data builder
- amundsen-databuilder-neo4j4 — Amundsen Data builder
- anai-opensource — Automated ML
- apache-airflow — Programmatically author, schedule and monitor data pipelines
- apache-airflow-backport-providers-snowflake — Backport provider package apache-airflow-backport-providers-snowflake for Apache Airflow
- apache-airflow-provider-transfers — This project contains the Universal Transfer Operator, which can transfer any data that can be read from the source Dataset into the destination Dataset. From a DAG author's standpoint, all transfers are performed through invocation of the Universal Transfer Operator alone.
- apache-airflow-providers-snowflake — Provider package apache-airflow-providers-snowflake for Apache Airflow
- apache-airflow-zack — Programmatically author, schedule and monitor data pipelines
- archetypon — Data Modeling and validation with Pydantic and Pandas, specifically designed for Jupyter Notebooks.
- arctic-training — Snowflake LLM training library
- astro-projects — A decorator that allows users to run SQL queries natively in Airflow.
- astro-sdk-python — Astro SDK allows rapid and clean development of {Extract, Load, Transform} workflows using Python and SQL, powered by Apache Airflow.
- astronomer-providers — Apache Airflow Providers containing Deferrable Operators & Sensors from Astronomer
- atscale — The AI-Link package created by AtScale
- authz-analyzer — Analyze authorization.
- balto-core — With balto, data analysts and engineers can build analytics the way engineers build applications.
- berryworld — Handy classes to improve ETL processes
- bl-vanna — Generate SQL queries from natural language
- blackneedles — no summary
- BobrTools — Tools designed to simplify routine tasks for analysts, enabling faster and more efficient data processing and analysis
- bqpipe — Wrapper around BigQuery & Snowflake libraries to simplify writing to/reading from Pandas DataFrames.
- bufu — Minimal file uploader CLI for Snowflake internal stage
- CADPR — Standardize and automate processes used by the DAMs at Nike in CA
- castor-extractor — Extract your metadata assets.
- cbtham-feast-az-provider — A Feast Azure Provider
- cdbt — A CLI tool to manage dbt builds with state handling and manifest management
- cdpdev-datahub — A CLI to work with DataHub metadata
- cdwutils — Common utils for CDW jobs.
- cecil — Python SDK for Cecil Earth
- chalkpy — Python SDK for Chalk
- cimdem-test-package — A short description of your package
- ck-vanna — Generate SQL queries from natural language
- cloe-extensions-snowflake-policies — CMP and RAP policy framework for Snowflake DWH
- cloe-util-snowflake-connector — Provides basic interface for connecting to and interacting with a Snowflake instance.
- CloeExtensions — Various CLOE based helper tools independent of CLOE Core.
- clope — Python package for interacting with the Cantaloupe/Seed vending system. Primarily the Spotlight API.
- cocoon-data — Cocoon is an open-source project that aims to free analysts from tedious data transformations with LLMs.
- collate-data-diff — Command-line tool and Python library to efficiently diff rows across two different databases.
- connector-factory — Connector Factory
- core-db — This project/library contains common elements and clients related to database engines...
- covalent-blueprints-ai — A collection of AI blueprints for Covalent Cloud.
- crewai-tools — Set of tools for the crewAI framework
- cropioai-tools — Set of tools for the cropioAI framework
- CrumblPy — Common utility functions for Crumbl Data Team
- csql — Simple library for writing composable SQL queries
- csv-to-snowflake — Creates a table in Snowflake based on CSV file
- custom-workflow-solutions — Programmatically author, schedule and monitor data pipelines
- cybee-grove — A Software as a Service (SaaS) log collection framework.
- cz-data-diff — Command-line tool and Python library to efficiently diff rows across two different databases.
- dagster-snowflake — Package for Snowflake Dagster framework components.
- dagster-snowflake-pandas — Package for integrating Snowflake and Pandas with Dagster.
- damn-tool — The DAMN (Data Assets Metric Navigation) tool extracts and reports metrics about your data assets
- dask-snowflake — Dask + Snowflake integration
- data-diff — Command-line tool and Python library to efficiently diff rows across two different databases.
- data-diff-customize — Command-line tool and Python library to efficiently diff rows across two different databases.
- database-factory — Database Factory
- databricks-uniform-sync — An SDK for syncing Databricks using Unity Catalog and Uniform
- databuilder-amundsen — Amundsen Data builder
- datachain-sources — Sources for DataChain library.
- datacompy — Dataframe comparison in Python
- datacontract-cli — The datacontract CLI is an open source command-line tool for working with Data Contracts. It uses data contract YAML files to lint the data contract, connect to data sources and execute schema and quality tests, detect breaking changes, and export to different formats. The tool is written in Python. It can be used as a standalone CLI tool, in a CI/CD pipeline, or directly as a Python library.
- datahyve — Server metrics in real-time with RAG-powered insights 🚀
- datajunction-query — OSS Implementation of a DataJunction Query Service
- dataligo — A library to accelerate ML and ETL pipeline by connecting all data sources
- dataops-testgen — DataKitchen's Data Quality DataOps TestGen
- datarobot-bosun — datarobot-bosun module providing MLOps Management framework and plug-ins
- DATAWAREHOUSE-CONNECTOR — A package for database session management
- davesci — no summary
- db-connector-kr — A Python library for connecting to various databases.
- db-utils — Helper class to connect to Redshift, Snowflake, DynamoDB and S3
- DBD — dbd is a data loading and transformation tool that enables data analysts and engineers to load and transform data in SQL databases.
- dbnd-snowflake — Machine Learning Orchestration
- dbt-coves — CLI tool for dbt users adopting analytics engineering best practices.
- dbt-fal — Run python scripts from any dbt project.
- dbt-gen — Tool to generate dbt resources.
- dbt-snowflake — The Snowflake adapter plugin for dbt
- dcs-cli — SDK for DataChecks
- dcs-sdk — SDK for DataChecks
- de-pytools — Data Engineering Python Tools
- defog — Defog is a Python library that helps you generate data queries from natural language questions.
- dftools-snowflake — Data Flooder Tools - Snowflake Package
- diepvries — diepvries - Picnic Data Vault framework
- django-snowflake — Django backend for Snowflake
- dlt — dlt is an open-source, Python-first, scalable data loading library that does not require any backend to run.