Reverse Dependencies of sagemaker
The following projects have a declared dependency on sagemaker:
- accelerate — Accelerate
- adapter-transformers — A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models
- agent-gpt-aws — AgentGPT CLI for training and inference on AWS SageMaker
- aissemble-extensions-model-training-api-sagemaker — no summary
- anymodality — Make running Multi-Modal LLMs easy.
- autogluon.cloud — Train and deploy AutoGluon backed models on the cloud
- aws-fortuna — A Library for Uncertainty Quantification.
- aws-sagemaker-remote — Simplify running processing and training remotely on AWS SageMaker
- baram — AWS Framework for python
- batcat — BatCat, a cat that looks like a bat.
- cody-adapter-transformers — A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models
- cohere-aws — A Python library for the Cohere endpoints in AWS SageMaker & Bedrock
- cohere-sagemaker — A Python library for the Cohere endpoints in AWS SageMaker
- crecs — no summary
- css-mini — Customer Success Scorecard
- cyint-aws-ml-ops-tools — Tools for assisting with managing modern ml-ops-pipelines in AWS Sagemaker
- deepchecks-llm-client — The SDK client for communicating with Deepchecks LLM service
- deeppype — The AWS based Deep Learning Pipeline Framework
- deepsea-ai — DeepSeaAI is a Python package to simplify processing deep sea video in AWS from a command line.
- easy-sm — Easy SageMaker Ops
- easy-smr — Easy SageMaker Ops for R projects
- exasol-sagemaker-extension — Exasol SageMaker Integration
- ezsmdeploy — Amazon SageMaker and Bedrock custom model deployments made easy
- ezsmdeploydev — SageMaker custom deployments made easy
- fahr — Tool for running machine learning jobs remotely.
- FloTorch-core — A Python project for FloTorch
- fmbench — Benchmark performance of **any Foundation Model (FM)** deployed on **any AWS Generative AI service**, be it **Amazon SageMaker**, **Amazon Bedrock**, **Amazon EKS**, or **Amazon EC2**. FMs can be deployed on these platforms directly through `FMBench`, or, if already deployed, benchmarked through the **Bring your own endpoint** mode supported by `FMBench`.
- fmbt — Benchmark performance of **any model** on **any supported instance type** on Amazon SageMaker.
- fmeval — Amazon Foundation Model Evaluations
- fondant — Fondant - Large-scale data processing made easy and reusable
- gab-kedro-sagemaker — Kedro plugin with AWS SageMaker Pipelines support
- gluonts — Probabilistic time series modeling in Python.
- gmlutil — General Machine Learning Utility Package
- gmlutil-data-extraction — General Machine Learning Utility Package for Data Extraction
- hydro-integrations — HydroSDK integrations
- iquaflow — Image quality framework
- jenfi-pipeline-data-app — no summary
- kedro-sagemaker — Kedro plugin with AWS SageMaker Pipelines support
- kumuniverse — Data team shared library for accessing services
- lingo-fit — no summary
- LocalCat — Fine-tune Large Language Models locally.
- mab-ranking — Online Ranking with Multi-Armed-Bandits
- magemaker — A CLI tool for fine-tuning and deploying open-source models
- mapintel — MapIntel is a system for acquiring intelligence from vast collections of text data by representing each document as a multidimensional vector that captures its semantics. The system is designed to handle complex natural language queries and visual exploration of the corpus.
- metamaker — Simple CLI to train and deploy your ML models with AWS SageMaker
- ml-comp — An engine for running component based ML pipelines
- mlapp — IBM Services Framework for ML Applications: a Python 3 framework for building robust, production-ready machine learning applications. Official ML accelerator within the larger RAD-ML methodology.
- mlem — Version and deploy your models following GitOps principles
- mlops-utilities — no summary
- mlpiper — An engine for running component based ML pipelines
- mw-adapter-transformers — A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models
- peelml — Peel away the pain of ML deployment
- productivity-stack — Python productivity stack for simple environment setup
- py-dreambooth — Easily create your own AI avatar images!
- rastervision-aws-sagemaker — A rastervision plugin that adds an AWS SageMaker pipeline runner
- redbrick-sagemaker — RedBrick AI and AWS SageMaker integration
- remoterl — Remote RL for training and inference on AWS SageMaker
- renate — Library for Continual Learning for Practitioners
- rkt-mlops-common — doc1
- rudderlabs.data.apps — Rudderlabs data apps library
- sagemaker-experiments-logger — PyTorch Lightning Experiment Logger
- sagemaker-huggingface-inference-toolkit — Open source library for running inference workload with Hugging Face Deep Learning Containers on Amazon SageMaker.
- sagemaker-local — Some dependencies for running SageMaker locally.
- sagemaker-mlops-toolkit — A short description of the package.
- sagemaker-rightline — Python package to easily validate properties of a SageMaker Pipeline.
- sagemaker-ssh-helper — A helper library to connect into Amazon SageMaker with AWS Systems Manager and SSH (Secure Shell)
- sagemaker-tensorflow — Amazon SageMaker-specific TensorFlow extensions.
- sagemaker-tidymodels — SageMaker framework for Tidymodels
- sagemaker-utils — Helper functions to work with SageMaker
- sagemakerify — High level framework to simplify the use of SageMaker
- SageMakerWrapper — A simple wrapper for AWS SageMaker
- sagemode — Deploy, scale, and monitor your ML models all with one click. Native to AWS.
- sagerender — A library for configuring SageMaker pipelines using a hierarchical configuration pattern.
- sagesand — SageMaker Sandbox
- sageworks — SageWorks: A Dashboard and Python API for creating and deploying AWS SageMaker Model Pipelines
- sageworks-bridges — SageWorks Bridges: End User Application Bridges to SageWorks/AWS
- sagify — Machine Learning Training, Tuning and Deployment on AWS
- sapiens-accelerator — no summary
- simple-sagemaker — A **simpler** and **cheaper** way to distribute work (python/shell/training) on machines of your choice in the (AWS) cloud
- skellyai — generated labeled data
- sm-serverless-benchmarking — Benchmark SageMaker serverless endpoints for cost and performance
- smjsindustry — Open source library for industry machine learning on Amazon SageMaker.
- smspider — SageMaker Spider
- stepfunctions-without-sagemaker — Open source library for developing data science workflows on AWS Step Functions.
- superai — super.AI API
- sych-llm-playground — Sych LLM Playground is a command line tool to deploy and interact with language models in the cloud.
- syne-tune — Distributed Hyperparameter Optimization on SageMaker
- tdapiclient — Teradata API Client Python package
- tfduck-bsd — A small example package
- torchx — TorchX SDK and Components
- torchx-nightly — TorchX SDK and Components
- training-grounds — The framework for featurization and model training
- transformers — State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
- workbench — Workbench: A Dashboard and Python API for creating and deploying AWS SageMaker Model Pipelines
- zenml — ZenML: Write production-ready ML code.
- zenml-nightly — ZenML: Write production-ready ML code.
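
Because these relationships are recorded in each project's package metadata (the Requires-Dist field), a local, installed-only view of the same information can be reconstructed with the standard library. The sketch below is a minimal, illustrative example using importlib.metadata; the `reverse_dependencies` helper and its output format are assumptions for illustration, not part of the sagemaker package or any project listed above, and it only sees distributions installed in the current environment rather than the whole index.

```python
"""Minimal sketch: list locally installed distributions that declare a
dependency on "sagemaker". Standard library only; helper name and output
format are illustrative assumptions, not an official API."""
import re
from importlib import metadata


def reverse_dependencies(target: str = "sagemaker") -> list[tuple[str, str]]:
    hits = []
    for dist in metadata.distributions():
        name = dist.metadata.get("Name") or ""
        summary = dist.metadata.get("Summary") or "no summary"
        for req in dist.requires or []:
            # A requirement string looks like "sagemaker>=2.0; extra == 'aws'".
            # Keep only the project-name part before version/extras/marker syntax.
            req_name = re.split(r"[\s<>=!~;\[(]", req, maxsplit=1)[0]
            if req_name.lower().replace("_", "-") == target.lower():
                hits.append((name, summary))
                break
    return sorted(hits, key=lambda item: item[0].lower())


if __name__ == "__main__":
    for name, summary in reverse_dependencies():
        print(f"- {name}: {summary}")
```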