Reverse Dependencies of fastavro
The following projects have a declared dependency on fastavro:
- kafkit — Kafkit helps you write Kafka producers and consumers in Python with asyncio.
- kaflow — Python Stream processing backed by Apache Kafka.
- lamatic-airbyte-cdk — A framework for writing Airbyte Connectors.
- langbatch — LangBatch is a Python package with a unified API for AI batch-processing workloads. Supports OpenAI, Anthropic, Azure OpenAI, and Vertex AI.
- langflow — A Python package with a built-in web application
- langflow-nightly — A Python package with a built-in web application
- langstream-ai — LangStream user API
- latentscope — Quickly embed, project, cluster and explore a dataset.
- lc-correction — Scripts for ALeRCE light curve correction
- lintML — A security-first linter for machine learning training code.
- llm-cohere — Plugin for LLM adding support for Cohere's Generate and Summarize models
- lsst-alert-packet — Code for interacting with Vera C. Rubin Observatory alert packets
- lsst-alert-stream — Code for interacting with the Vera C. Rubin Observatory alert stream
- macrometa-target-bigquery — Macrometa target bigquery connector for loading data to BigQuery
- market-data-transcoder — Market Data Transcoder
- metamart-ingestion — Ingestion Framework for MetaMart
- metaphor-connectors — A collection of Python-based 'connectors' that extract metadata from various sources to ingest into the Metaphor app.
- minos-microservice-common — The common core of the Minos Framework
- mk-feature-store — Python SDK for Feast
- multivitamin — Serving infrastructure for ML and CV models
- nacre — Nacre
- neptyne-kernel — The Neptyne kernel
- neso-utils — Library of utilities for NESO.
- odd-collector — ODD Collector
- officelyTest — A brief description of your package
- openedx-events — Open edX events from the Hooks Extensions Framework
- openmetadata-ingestion — Ingestion Framework for OpenMetadata
- oracle-ads — Oracle Accelerated Data Science SDK
- osint-python-test-bed-adapter — Python adapter for Kafka
- pandavro — The interface between Avro and pandas DataFrame
- pdp-kafka-reader — PDP Kafka package
- petl-retouched-version — A Python package for extracting, transforming, and loading tables of data. Modified from the original petl library.
- pfb-fhir — Render a PFB graph from FHIR data
- pgb-broker-utils — Tools used by the Pitt-Google astronomical alert broker.
- pgb-utils — Tools to interact with Pitt-Google Broker data products and services.
- pgrsql2data — Utility for converting PgRouting SQL scripts output by osm2po to data files (CSV, JSON or Avro).
- pingpong-datahub — A CLI to work with DataHub metadata
- pipelinewise-target-bigquery — Singer.io target for loading data to BigQuery - PipelineWise compatible
- pittgoogle-client — Client utilities for the Pitt-Google astronomical alert broker.
- plantable — no summary
- predatools — no summary
- proteinshake — Protein structure datasets for machine learning.
- puavro — Pulsar Avro interface for reading/writing Avro messages from/to dicts
- pulsar-cli — no summary
- pulsar-client — Apache Pulsar Python client library
- pulsar-client-sn — Apache Pulsar Python client library
- pup-confluent-kafka — Patched version of Confluent's Python client for Apache Kafka
- py-adapter — Round-trip serialization/deserialization of any Python object to/from any serialization format including Avro and JSON.
- pyAnVIL — AnVIL client library. Data harmonization, Gen3, and Terra single sign-on use cases.
- pyavro — A Python Avro Schema Builder
- PygQuery — 🐷 Multithread your data with Google BigQuery
- pyinsta-functions — no summary
- pyoptimus — Optimus is the missing framework for cleaning and pre-processing data in a distributed fashion.
- pypfb — Python SDK for PFB format
- python-schema-registry-client — Python REST client for interacting with a Confluent Schema Registry server
- quixstreams — Python library for building stream processing applications with Apache Kafka
- race-strategist — Display telemetry data and spot anomalies.
- ragstack-ai-langflow — RAGStack Langflow
- RecordMapper — Transform records using an Avro schema and custom map functions.
- retake — Open Source Infrastructure for Vector Data Streams
- robotframework-confluentkafkalibrary — Confluent Kafka library for Robot Framework
- rpm-confluent-schemaregistry — Confluent Schema Registry lib
- scrapy-contrib-bigexporters — Scrapy exporter for Big Data formats
- selector-standardization-beam — Data Standardization pipeline in Apache Beam for Selector project
- selector-standardizers — Electoral Data Standardization classes for the Selector project
- sensorizer — Timeseries data generation and preparation for batch jobs at scale
- serdes — no summary
- spark-gaps-date-rorc-tools — spark_gaps_date_rorc_tools
- sq-blocks — Blocks provides a simple interface to read, organize, and manipulate structured data in files on local and cloud storage
- structured-data-validation — Python package for the validation of AROS [semi]structured data using pydantic models.
- superstream-confluent-kafka — Confluent's Python client for Apache Kafka
- superstream-confluent-kafka-beta — Confluent's Python client for Apache Kafka
- sysflow-tools — SysFlow APIs and utilities
- tarchia — Tarchia - Metadata Catalog
- target-avro — Singer.io target for extracting data
- TDY-PKG — An implementation of TF-2, Detectron, and YOLOv5
- TDY-PKG-saquibquddus — An implementation of TF-2, Detectron, and YOLOv5
- TFOD-Automatic — Automated object detection for beginners using Python and TensorFlow
- thisbeatest — A CLI to work with DataHub metadata
- tom-pittgoogle — Pitt-Google broker module for the TOM Toolkit
- tspace — io interface
- UDASwissKnife — Utils and common libraries for Python
- ul-api-utils — Python API utils
- vesslflow — VESSLFlow
- winterrb — no summary
- with-cloud-blob — no summary
- wunderkafka — librdkafka-powered Kafka client for Python with a (hopefully) handier API
- yaq-traits — package defining yaq traits
- yaqc — Generic yaq client.
- yaqd-core — no summary