Reverse Dependencies of apache-airflow
The following projects have a declared dependency on apache-airflow:
- data-tracking — Extract meta-data from DICOM and NIFTI files
- datamechanics-airflow-plugin — An Airflow plugin to launch and monitor Spark applications on the Data Mechanics platform
- dataverk-airflow — no summary
- dbt-af — Distributed dbt runs on Apache Airflow
- dbt-airflow — A Python package that creates fine-grained Airflow tasks for dbt
- dbt-purview — no summary
- dbt-purview-integration — no summary
- dbtlogs — no summary
- dbtlogs-prebuild — no summary
- dbtlogs-prebuilt — no summary
- dbtlogs1 — no summary
- ddui — no summary
- deirokay — A tool for data profiling and data validation
- detvista-airflow-common — detvista airflow support
- django-airflow — using Django features in Airflow DAG
- domino-py — Python package for Domino.
- dominodatalab — Python bindings for the Domino API
- earnest-airflow-plugin — Operators and Hooks for Airflow
- eb-airflow-providers-siafi — Provider for interactions with SIAFI and its derived systems
- fastdomino — Python bindings for the Domino API, forked by fast.co
- flowetl — FlowETL is a collection of special purposes Airflow operators and sensors for use with FlowKit.
- flowpyter-task — no summary
- flowui-project — FlowUI project
- flytekitplugins-airflow — This package holds the Airflow plugins for flytekit
- fsqlfly — Flink SQL Job Management Website
- funcron — funcron
- geniusrise — An LLM framework
- gps-building-blocks — Modules and tools useful for use with advanced data solutions on Google Ads, Google Marketing Platform and Google Cloud.
- gusty — Making DAG construction easier
- haiqv-streaming-dag-editor — A code editor and file manager about dag for haiqv-streaming
- helloheart-airflow-utils — Apache Airflow Utilities
- i2b2-import — Import data into an I2B2 DB schema
- idg-metadata-client — Ingestion Framework for OpenMetadata
- iomete-airflow-plugin — An Airflow plugin for interacting with IOMETE platform.
- jefferson-street-composer — Library for Jefferson Street Technologies Composer Configuration
- kedro-airflow — Kedro-Airflow makes it easy to deploy Kedro projects to Airflow
- ld-platform — A connector between local storage and S3
- lingualeo-sqlmesh — no summary
- metamart-ingestion — Ingestion Framework for MetaMart
- metamart-managed-apis — Airflow REST APIs to create and manage DAGS
- metro — Metro framework for Airflow
- mojap-airflow-tools — A few wrappers and tools to use Airflow on the Analytical Platform
- my-first-projecr-demo-1 — no summary
- my-provider — no summary
- mycliapper2 — no summary
- mylearn — mylearn: my Machine Learning framework
- mytriggernew — A custom Airflow trigger for monitoring multiple GCS prefixes
- neuro-airflow-plugin — Neu.ro Airflow plugin
- newrelic-airflow-plugin — New Relic Plugin for Apache Airflow
- nr-ops — An opinionated operator framework.
- o2a — Oozie To Airflow migration tool
- ocean-spark-airflow-provider — Apache Airflow connector for Ocean for Apache Spark
- odahu-flow-airflow-plugin — no summary
- omim-airmaps — This package contains tools for generating maps with Apache Airflow.
- onepassword-secrets-backend — Custom 1password secrets backend for airflow
- opendbt — Python opendbt
- openmetadata-airflow-managed-apis — Airflow REST APIs to create and manage DAGS
- openmetadata-data-profiler — Data Profiler Library for OpenMetadata
- openmetadata-ingestion — Ingestion Framework for OpenMetadata
- openmetadata-managed-apis — Airflow REST APIs to create and manage DAGS
- pandasdb — A library for doing data analytics on databases in Python without loading the data into the memory of the client computer
- pano-airflow — Programmatically author, schedule and monitor data pipelines
- pano-airflow-providers-amazon — Provider for Apache Airflow. Implements apache-airflow-providers-amazon package
- papermill-origami — The noteable API interface
- pingpong-datahub — A CLI to work with DataHub metadata
- plexflow — A short description of the package.
- ppextensions — PPExtensions: a set of IPython and Jupyter extensions
- preservation-database — A database builder for digital preservation information.
- profcomff-definitions — Data warehouse definitions and schemas
- py-orca — Python package for connecting services and building data pipelines
- pyddapi — DataDriver API package
- pypper — no summary
- pytest-airflow — pytest support for airflow.
- qbiz-airflow-presto — A containerized Presto cluster for AWS.
- quickpath-airflow-operator — Execute Blueprints Within the Quickpath Platform
- rb-message-writer — Writes messages to a data store, usually with Airflow.
- rcplus-alloy-common — RC+/Alloy helpers functions for Python
- records-mover — Records mover is a command-line tool and Python library you can use to move relational data from one place to another.
- RegScale-CLI — Command Line Interface (CLI) for bulk processing/loading data into RegScale
- reign-airflow-utils — A package of airflow utils
- rekcurd-airflow — Airflow plugins for Rekcurd
- restic-airflow — Airflow operators to backup and restore data using restic
- rudderstack-airflow-provider — Apache Airflow provider for managing Reverse ETL syncs and Profiles runs in RudderStack
- sagemaker — Open source library for training and deploying models on Amazon SageMaker.
- sai-airflow-plugins — A Python package with various operators, hooks and utilities for Apache Airflow
- secoda-airflow — Secoda Airflow Provider
- sewerpipe — Ties your debugging workflow to automated workflows elsewhere
- simple-dag-editor — Zero configuration Airflow Dag editor
- somenergia-dag-utils — utilities to run along with airflow dags used in somenergia
- soopervisor — no summary
- spark-command-airflow-operator — no summary
- spark-on-k8s — A Python package to submit and manage Apache Spark applications on Kubernetes.
- sparta-airflow-timetables — Airflow timetable plugin for Anbima and B3 holidays
- sqlmesh — no summary
- tencentcloud-dlc-provider — no summary
- tfx — TensorFlow Extended (TFX) is a TensorFlow-based general-purpose machine learning platform implemented at Google.
- tfx-ann-ct-pipeline — This project is created to provide a simple way to
- thisbeatest — A CLI to work with DataHub metadata
- time-interval-sensor — A custom Airflow sensor to check if the current time is within a specific interval and time zone.
- tokyo-lineage — Tokyo Lineage
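A reverse-dependency list like the one above can be approximated for a local environment by scanning the metadata of every installed distribution for a requirement on apache-airflow. The sketch below is a naive illustration, not how this list was generated: the requirement-string parsing is simplistic (a real tool would use `packaging.requirements`), and it only sees packages installed in the current environment, not all of PyPI.

```python
from importlib.metadata import distributions


def dist_name(requirement: str) -> str:
    """Extract the bare, normalized distribution name from a requirement
    string such as "apache-airflow>=2.3; extra == 'gcp'".
    Naive parsing: strips the environment marker, then cuts at the first
    space, version operator, or extras bracket."""
    name = requirement.split(";")[0].strip()
    for sep in " <>=!~([":
        name = name.split(sep)[0]
    return name.lower().replace("_", "-")


def reverse_dependencies(target: str) -> list[str]:
    """List installed distributions that declare a dependency on `target`."""
    target = dist_name(target)
    hits = {
        dist.metadata["Name"]
        for dist in distributions()
        if any(dist_name(req) == target for req in dist.requires or [])
    }
    return sorted(hits)


if __name__ == "__main__":
    for name in reverse_dependencies("apache-airflow"):
        print(name)
```

The output depends entirely on what is installed locally; site-wide listings such as this one are instead built from the declared dependency metadata of every release on the index.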