Reverse Dependencies of databricks-sql-connector
The following projects have a declared dependency on databricks-sql-connector:
- acryl-datahub — A CLI to work with DataHub metadata
- airbyte-databricks-cache — no summary
- airflow-tools — no summary
- apache-airflow — Programmatically author, schedule and monitor data pipelines
- apache-airflow-providers-databricks — Provider package apache-airflow-providers-databricks for Apache Airflow
- astronomer-providers — Apache Airflow Providers containing Deferrable Operators & Sensors from Astronomer
- atscale — The AI-Link package created by AtScale
- CADPR — Standardize and Automate processes utilized by the DAMs at Nike in CA
- castor-extractor — Extract your metadata assets.
- chalkpy — Python SDK for Chalk
- connector-factory — Connector Factory
- custom-workflow-solutions — Programmatically author, schedule and monitor data pipelines
- data-check — simple data validation
- databricks-bridge — Databricks read and write with sql connection
- databricks-session — A simple util to get a spark and mlflow session objects from an .env file
- databricks-sql — Databricks SQL framework, easy to learn, fast to code, ready for production.
- databricks-sql-cli — A DBCLI client for Databricks SQL
- databricks-sqlalchemy — Databricks SQLAlchemy plugin for Python
- databricks-sqlalchemy-oauth — SQLAlchemy OAuth connector to Databricks
- databricks-tool — no summary
- datachecks — Open Source Data Quality Monitoring
- DataConnect — This package contains connectors for databricks and sql server as well as setting up a flask environment project
- datacontract-cli — The datacontract CLI is an open source command-line tool for working with Data Contracts. It uses data contract YAML files to lint the data contract, connect to data sources and execute schema and quality tests, detect breaking changes, and export to different formats. The tool is written in Python. It can be used as a standalone CLI tool, in a CI/CD pipeline, or directly as a Python library.
- dataforge-core — Command line compiler for dataforge core projects
- dbqq — no summary
- dbsqlcli — no summary
- dbt-databricks — The Databricks adapter plugin for dbt
- dbtlogs — no summary
- dbtlogs-prebuild — no summary
- dbtlogs-prebuilt — no summary
- dbtlogs1 — no summary
- dbtunnel — Run app and get cluster proxy url for it in databricks clusters
- dbxio — High-level Databricks client
- dcs-core — Open Source Data Quality Monitoring
- dcs-sdk — SDK for DataChecks
- defog — Defog is a Python library that helps you generate data queries from natural language questions.
- dlt — dlt is an open-source python-first scalable data loading library that does not require any backend to run.
- dlt-dataops — dlt is an open-source python-first scalable data loading library that does not require any backend to run.
- do-data-utils — Functionalities to interact with Google and Azure, and clean data
- featurebyte — Python Library for FeatureOps
- functional-functions — Commonly used functions by the Compass FBI Team
- great-expectations — Always know what to expect from your data.
- great-expectations-cloud — Great Expectations Cloud
- harlequin-databricks — A Harlequin adapter for Databricks.
- hip-data-ml-utils — Common Python tools and utilities for Hipages ML work
- ie-package — Insight Extractor Package
- ingestr — ingestr is a command-line application that ingests data from various sources and stores them in any database.
- insight-extractor-package — Insight Extractor Package
- jupancon — Jupancon, connector to several DBs that returns pandas. Magic included.
- kabbes-database-connections — Provides database-agnostic connection tools
- lingualeo-sqlmesh — no summary
- llm-explorer — A Lakehouse LLM Explorer. Wrapper for spark, databricks and langchain processes
- llm-foundry — LLM Foundry
- lolpop — A software engineering framework for machine learning workflows
- mara-db — Configuration and monitoring of database connections
- metaphor-connectors — A collection of Python-based 'connectors' that extract metadata from various sources to ingest into the Metaphor app.
- metricflow-lite — Lite version of MetricFlow. This version is missing dependencies for using connectors with data warehouses
- mitzu — Product analytics over your data warehouse
- mycliapp12 — no summary
- mycliapp123 — no summary
- mycliapper132 — no summary
- mycliapper2 — no summary
- NikeCA — Standardize and Automate processes
- pano-airflow — Programmatically author, schedule and monitor data pipelines
- piperider — PiperRider CLI
- piperider-nightly — PiperRider CLI
- ploosh — A framework to automate your tests for data projects
- qsvin — Package provides functionality for decoding vin data
- quollio-core — Quollio Core
- readdatabrickstables — Databricks connectors to read tables
- rtdip-sdk — no summary
- shipyard-databricks-sql — A local client for connecting to and working with Databricks SQL Warehouses
- soda-core-spark — no summary
- sqlalchemy-databricks — SQLAlchemy Dialect for Databricks
- sqlframe — Turning PySpark Into a Universal DataFrame API
- sqlmesh — no summary
- sqlmesh-cube — SQLMesh extension for generating Cube semantic layer configurations
- tentaclio-databricks — A python project containing all the dependencies for the databricks+thrift tentaclio schema.
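A "declared dependency" means the package lists databricks-sql-connector in its distribution metadata (the PEP 508 requirement strings exposed via `Requires-Dist`). You can reproduce a reverse-dependency lookup like the one above for your own environment with the standard library alone. The sketch below uses `importlib.metadata` to scan installed distributions; the function name `reverse_dependencies` is illustrative, not part of any library.

```python
import re
from importlib import metadata

def reverse_dependencies(target: str) -> list[str]:
    """Return names of installed distributions that declare a dependency on `target`.

    Only inspects packages installed in the current environment, so the result
    is a local subset of what an index-wide listing (like the one above) shows.
    """
    # PEP 503 normalization: compare names case-insensitively with '-' and '_' folded.
    target = target.lower().replace("_", "-")
    dependents = set()
    for dist in metadata.distributions():
        for req in dist.requires or []:
            # Requirement strings look like:
            #   "databricks-sql-connector (>=2.9.3) ; extra == 'databricks'"
            # Take the leading project name, ignoring extras, versions, and markers.
            match = re.match(r"[A-Za-z0-9._-]+", req)
            if match and match.group(0).lower().replace("_", "-") == target:
                dependents.add(dist.metadata["Name"])
                break
    return sorted(dependents)

print(reverse_dependencies("databricks-sql-connector"))
```

In an environment where, say, dbt-databricks is installed, the connector would appear in that package's `requires` list and `dbt-databricks` would be returned; in a bare environment the function simply returns an empty list.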