What Integrates with ZenML?
Find out what ZenML integrations exist in 2025, and learn what software and services currently integrate with ZenML. Below is a list of products that ZenML currently integrates with:
-
1
Google Cloud Platform
Google
Free ($300 in free credits) 55,297 Ratings
Google Cloud is an online service that lets you create everything from simple websites to complex applications for businesses of any size. New customers receive $300 in credits for testing, deploying, and running workloads, and can use more than 25 products free of charge. Use Google's core data analytics and machine learning capabilities, which are secure, fully featured, and available to enterprises of all sizes. Use big data to build better products and find answers faster. You can grow from prototypes to production and even to planet scale without worrying about reliability, capacity, or performance. From virtual machines with proven price/performance advantages to a fully managed app development platform, Google Cloud offers high-performance, scalable, resilient object storage and databases. Google's private fibre network delivers the latest software-defined networking solutions, along with fully managed data warehousing, data exploration, Hadoop/Spark, and messaging. -
2
TensorFlow
TensorFlow
Free 2 Ratings
An open-source platform for machine learning. TensorFlow offers a flexible, comprehensive ecosystem of tools, libraries, and community resources that lets researchers push the boundaries of machine learning and developers easily build and deploy ML-powered applications. High-level APIs such as Keras make model training and development straightforward, allowing for quick model iteration and easy debugging. No matter which language you choose, you can train and deploy models in the cloud, in the browser, on-premises, or on-device. Its simple and flexible architecture takes new ideas from concept to code, to state-of-the-art models, and to publication. TensorFlow makes it easy to build, deploy, and test. -
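As a rough illustration of the high-level Keras API mentioned above, here is a minimal sketch that trains a tiny classifier; the toy data and layer sizes are illustrative assumptions, not part of the listing.

```python
# Minimal Keras training sketch on synthetic data (shapes/sizes are arbitrary).
import numpy as np
import tensorflow as tf

x = np.random.rand(256, 8).astype("float32")   # toy features
y = np.random.randint(0, 2, size=(256,))       # toy binary labels

model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=3, batch_size=32)       # quick iteration and debugging
```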
3
Kubernetes
Kubernetes
Free 1 Rating
Kubernetes (K8s) is an open-source system that automates the deployment, scaling, and management of containerized applications. It groups the containers that make up an application into logical units for easy management and discovery. Kubernetes builds on 15 years of Google's experience running production workloads, combined with best-of-breed ideas and practices from the community. Built on the same principles that allow Google to run billions of containers per week, Kubernetes can scale without increasing your operations team. Its flexibility lets you deliver applications consistently and efficiently, no matter how complex they are, whether you're testing locally or running a global enterprise. Kubernetes is open source, giving you the freedom to use hybrid, on-premises, or public cloud infrastructure and to move workloads to wherever they matter most. -
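For a feel of how a cluster like this is driven programmatically, below is a minimal sketch using the official Kubernetes Python client to list pods; it assumes a kubeconfig is already available locally (for example from minikube or a managed cloud cluster).

```python
# List all pods in the cluster using the official Kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()            # read credentials from ~/.kube/config
v1 = client.CoreV1Api()

for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)
```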
4
Amazon S3
Amazon
Amazon Simple Storage Service (Amazon S3) is an object storage service offering industry-leading scalability, data availability, security, and performance. Customers of all sizes and industries can use Amazon S3 to store and protect any amount of data for a variety of purposes, including data lakes, websites, mobile applications, backup and restore, archiving, enterprise applications, big data analytics, and IoT devices. Amazon S3 offers easy-to-use management tools that let you organize your data and configure access controls tailored to your business, organizational, and compliance needs. Amazon S3 is designed for 99.999999999% (11 9's) of data durability and stores data for millions of applications for companies around the globe. You can scale your storage resources to meet changing demands without upfront investment or resource procurement cycles.
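A minimal sketch of storing and retrieving an object with the boto3 client follows; the bucket name, keys, and file names are placeholders, and AWS credentials are assumed to be configured in the environment.

```python
# Upload a local artifact to S3 and download it back (placeholder names).
import boto3

s3 = boto3.client("s3")
s3.upload_file("model.pkl", "my-example-bucket", "artifacts/model.pkl")
s3.download_file("my-example-bucket", "artifacts/model.pkl", "model_copy.pkl")
```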
-
5
OpenAI
OpenAI
OpenAI's mission is to ensure that artificial general intelligence (AGI), by which we mean highly autonomous systems that outperform humans at most economically valuable work, benefits all of humanity. We will attempt to directly build safe and beneficial AGI, but will also consider our mission fulfilled if our work aids others to achieve this outcome. Our API can be applied to virtually any language task, including summarization, sentiment analysis, and content generation. You can specify your task in plain English or provide a few examples. A simple integration gives you access to our constantly improving AI technology, and sample completions show you how to integrate with the API.
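As a rough sketch of the kind of language task described above, the snippet below sends a summarization prompt through the OpenAI Python library; the model name is an assumption, and OPENAI_API_KEY is expected in the environment.

```python
# Minimal summarization call via the OpenAI chat completions API.
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; substitute whichever model you use
    messages=[{"role": "user", "content": "Summarize in one sentence: ZenML orchestrates ML pipelines across tools."}],
)
print(response.choices[0].message.content)
```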
-
6
Python
Python
Defining functions is the core of extensible programming. Python supports mandatory and optional arguments, keyword arguments, and arbitrary argument lists. Whether you're new to programming or an experienced developer in other languages, Python is easy to learn, and these pages are a helpful starting point. The community hosts meetups and conferences to share code and much more, Python's documentation will help you along the way, and the mailing lists will keep you in touch. The Python Package Index (PyPI) hosts thousands of third-party Python modules. Both Python's standard library and the community-contributed modules allow for endless possibilities.
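A small sketch of the argument styles mentioned above (mandatory, optional with defaults, arbitrary positional lists, and keyword arguments); the function and its values are purely illustrative.

```python
# Demonstrates mandatory, default, *args, and **kwargs parameters.
def describe(name, greeting="Hello", *tags, **details):
    line = f"{greeting}, {name}!"
    if tags:
        line += " tags=" + ",".join(tags)
    if details:
        line += " " + ", ".join(f"{k}={v}" for k, v in details.items())
    return line

print(describe("ZenML"))
print(describe("ZenML", "Hi", "mlops", "pipelines", version="0.x", language="Python"))
```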
-
7
scikit-image
scikit-image
Free 1 Rating
Scikit-image is a collection of algorithms for image processing, available free of charge and free of restriction. We pride ourselves on high-quality, peer-reviewed code written by an active community of volunteers. Scikit-image is a Python library that provides a variety of image processing routines and is developed by its community; contributions are most welcome! Scikit-image aims to be the reference library for scientific image analysis in Python, which it achieves in part by being easy to use and easy to install. We take care when adding new dependencies, and sometimes remove existing ones or make them optional. All functions have detailed docstrings that clarify the expected inputs and outputs, and conceptually identical arguments share the same name and position within a function signature. The library has close to 100% test coverage, and all code is reviewed by at least two core developers before it is included. -
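As a minimal sketch of one of the image processing routines described above, this runs an edge filter on a sample image bundled with scikit-image.

```python
# Sobel edge detection on a bundled sample image.
from skimage import data, filters

image = data.camera()          # sample grayscale image shipped with scikit-image
edges = filters.sobel(image)   # edge magnitude via the Sobel filter
print(edges.shape, float(edges.min()), float(edges.max()))
```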
8
PyTorch
PyTorch
TorchScript lets you seamlessly transition between eager and graph modes, while TorchServe accelerates the path to production. The torch.distributed backend enables scalable distributed training and performance optimization in research and production. PyTorch is supported by a rich ecosystem of tools and libraries covering NLP, computer vision, and other areas, and it is well supported on major cloud platforms for frictionless development and easy scaling. Select your preferences, then run the install command. Stable is the most currently supported and tested version of PyTorch and is suitable for most users. Preview builds, generated nightly, are available for those who want the latest but not fully tested and supported 1.10 builds. Please ensure you have met the prerequisites, such as numpy, depending on which package manager you use. Anaconda is our recommended package manager, as it installs all dependencies.
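A minimal sketch of defining a model in eager mode and compiling it with TorchScript, as mentioned above; the tiny architecture is an illustrative assumption.

```python
# Define a small model eagerly, then script it to a TorchScript graph.
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(8, 2)

    def forward(self, x):
        return torch.relu(self.fc(x))

model = TinyNet()
scripted = torch.jit.script(model)          # switch from eager mode to a TorchScript graph
print(scripted(torch.randn(4, 8)).shape)    # torch.Size([4, 2])
```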
-
9
AWS
Amazon
Amazon Web Services (AWS) is the world's largest and most widely adopted cloud platform, offering over 175 fully featured services from more than 150 data centers worldwide. It provides database storage, compute power, content delivery, and other functionality that lets you build sophisticated applications with greater flexibility, scalability, and reliability. Millions of customers, including the fastest-growing startups, large enterprises, and top government agencies, use AWS to reduce costs, become more agile, and innovate faster. AWS offers more services and features than any other cloud provider, from infrastructure technologies such as storage and databases to emerging technologies such as machine learning, artificial intelligence, data lakes, analytics, and the Internet of Things. This makes it easier, cheaper, and faster to move your existing applications to the cloud.
-
10
MongoDB
MongoDB
Free 21 Ratings
MongoDB is a general-purpose, document-based, distributed database built for modern application developers. No other database is more productive to use. Our flexible document data model lets you ship and iterate faster, with a unified query interface for any kind of workload. Whether it's your first customer or 20 million users around the world, you can meet your performance SLAs in any environment. Easily ensure high availability and data integrity, and meet the compliance standards of your mission-critical workloads. A comprehensive suite of cloud database services lets you address a wide variety of use cases, from transactional to analytical, and from search to data visualization. Launch secure mobile apps with native, edge-to-cloud sync and automatic conflict resolution. MongoDB runs anywhere, from your laptop to the data center. -
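Below is a minimal sketch of the document model and query interface via PyMongo; the connection string, database, and collection names are placeholders.

```python
# Insert and query a document with PyMongo (database/collection created lazily).
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
runs = client["zenml_demo"]["pipeline_runs"]

runs.insert_one({"pipeline": "training", "status": "completed", "accuracy": 0.93})
doc = runs.find_one({"pipeline": "training"})
print(doc["status"], doc["accuracy"])
```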
11
GitHub
GitHub
$7 per month 22 Ratings
GitHub is the world's most trusted, secure, and scalable developer platform. Join millions of developers and businesses building the software that powers the world. Get the best tools, support, and services to help you build alongside the world's most innovative communities. There's a free option for managing multiple contributors, GitHub Team for Open Source, and GitHub Sponsors helps you fund your work. The Pack is back: we've partnered to give students and teachers free access to the best developer tools for the school year. Work for a government-recognized nonprofit, association, or 501(c)(3)? Get a discounted Organization account through us. -
12
GitLab
GitLab
$29 per user per month 14 Ratings
GitLab is a complete DevOps platform that gives you a full CI/CD toolchain out of the box. One interface. One conversation. One permission model. Delivered as a single application, GitLab fundamentally changes the way Development, Security, and Ops teams collaborate. It shortens development cycle times, reduces costs and application vulnerabilities, speeds up software delivery, and increases developer productivity. Source code management enables collaboration, sharing, and coordination across the entire software development team: track and merge branches, audit changes, and enable concurrent work to accelerate software delivery. Distributed teams can review code, discuss changes, share knowledge, and identify defects through asynchronous review, and code reviews can be automated, tracked, and reported. -
13
Discord
Discord
Free 52 Ratings
Discord is a free communication app that works on both mobile and desktop devices. Millions of people use the popular gaming platform every day to chat with friends over voice or text, or to stream their gameplay in crystal-clear quality to other Discord users. You can organize a voice/text party in seconds, find other players and teammates, search for specific types of groups or activities, or just talk about games in your free time. Discord isn't limited to any particular game or genre; it can be used to coordinate communications for any type of game. -
14
Slack
Slack
$6.67 per user per month 244 Ratings
Slack is a cloud-based project collaboration and team communication solution designed to integrate seamlessly across organizations. It brings powerful tools and services together on one platform, with private channels for interaction within smaller teams, direct channels for messaging colleagues, and public channels that let members start conversations across the organization. Slack is available on Mac, Windows, Android, and iOS, and offers features including chat, file sharing and collaboration, real-time notifications, two-way audio and video, screen sharing, document imaging, and activity tracking and logging. -
15
Bitbucket
Atlassian
$15 per month 10 Ratings
Bitbucket is more than just Git code management. It gives teams one place to plan projects, collaborate on code, test, and deploy. Bitbucket is free for small teams of up to 5, with Standard ($3/user/mo) and Premium ($6/user/mo) plans available as you scale. Keep your projects organized by creating Bitbucket branches right from Jira issues or Trello cards. Build, test, and deploy with integrated CI/CD via Bitbucket Pipelines, and benefit from configuration as code and fast feedback loops. Approve code reviews more efficiently with pull requests: create a merge checklist with designated approvers and discuss changes directly in the source with inline comments. Know your code is secure in the Cloud with IP whitelisting and 2-step verification, and restrict access to certain users while controlling their actions with branch permissions and merge checks for quality code. -
16
Microsoft Azure
Microsoft
21 Ratings
Microsoft Azure is a cloud computing platform that lets you quickly develop, test, and manage applications. Azure. Invent with purpose. Turn ideas into solutions with more than 100 services, backed by Microsoft's continued innovation to support your development today and your product visions for tomorrow. With open source and support for all languages and frameworks, you can build what you want and deploy where you want: we can meet you at the edge, on-premises, or in the cloud. Hybrid cloud services let you integrate and manage your environments, and proactive compliance plus support from experts helps you secure your environment from the ground up. It is a trusted service for startups, governments, and enterprises; the cloud you can trust, with the numbers to prove it. -
17
Lambda GPU Cloud
Lambda
$1.25 per hour 1 Rating
Train the most complex AI, ML, and deep learning models. Scale from a single machine to an entire fleet of VMs with just a few clicks. Lambda Cloud makes it easy to start or scale up your deep learning project: get started quickly, save on compute costs, and easily scale to hundreds of GPUs. Every VM comes preinstalled with the latest version of Lambda Stack, which includes major deep learning frameworks and CUDA® drivers. From the cloud dashboard you can instantly access a Jupyter Notebook development environment on each machine, connect via the Web Terminal, or use SSH directly with one of your SSH keys. By building compute infrastructure at scale for the needs of deep learning researchers, Lambda can pass on significant savings. Benefit from the flexibility and cost savings of cloud computing, even as your workloads increase rapidly. -
18
Amazon SageMaker
Amazon
Amazon SageMaker is a fully managed service that gives data scientists and developers the ability to quickly build, train, and deploy machine learning (ML) models. SageMaker removes the heavy lifting from each step of the machine learning process, making it easier to develop high-quality models. Traditional ML development is complex, expensive, and iterative, made even harder by the lack of integrated tools for the entire machine learning workflow; stitching together tools and workflows is tedious and error-prone. SageMaker solves this by bringing all the components needed for machine learning together in a single toolset, so models get to production faster and with much less effort. Amazon SageMaker Studio is a web-based visual interface where you can perform all ML development steps, giving you complete control over and visibility into each step. -
19
Hugging Face
Hugging Face
$9 per month
AutoTrain is a new way to automatically train, evaluate, and deploy state-of-the-art machine learning models. Seamlessly integrated into the Hugging Face ecosystem, AutoTrain is an automated way to develop and deploy state-of-the-art models. All data, including your training data, stays private to your account, and all data transfers are encrypted. Available options today include text classification, text scoring, and entity recognition. Files can be provided in CSV, TSV, or JSON and hosted anywhere, and we delete all training data after training is complete. Hugging Face also offers an AI-generated content detection tool. -
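While the listing above describes AutoTrain (which is driven through the Hugging Face platform rather than a code API), the sketch below shows a related piece of the same ecosystem, the `transformers` pipeline, performing one of the tasks mentioned (text classification); the library picks a default pretrained model.

```python
# Text classification with the Hugging Face transformers pipeline.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")   # downloads a default pretrained model
print(classifier("ZenML makes pipeline orchestration pleasant."))
# example output shape: [{'label': 'POSITIVE', 'score': 0.99}]
```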
20
Comet
Comet
$179 per user per month
Manage and optimize models across the entire ML lifecycle, from experiment tracking to monitoring models in production. The platform is designed to meet the demands of enterprise teams deploying ML at scale and supports any deployment strategy: private cloud, hybrid, or on-premise servers. Add two lines of code to your notebook or script to start tracking your experiments; it works with any machine learning library and for any task. Easily compare code, hyperparameters, and metrics to understand differences in model performance. Monitor your models from training through production, get alerts when something goes wrong, and debug your models to fix the issue. Increase productivity, collaboration, and visibility across data scientists, data science teams, and business stakeholders. -
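A minimal sketch of the "few lines of code" experiment tracking described above, using the `comet_ml` client; the API key, project name, and logged values are placeholders.

```python
# Start a Comet experiment and log a parameter and a metric.
from comet_ml import Experiment

experiment = Experiment(api_key="YOUR_API_KEY", project_name="zenml-demo")
experiment.log_parameter("learning_rate", 0.001)
experiment.log_metric("val_accuracy", 0.91)
experiment.end()
```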
21
Llama 3
Meta
Free
Meta AI is our intelligent assistant that helps people create, connect, and get things done, and we've built Llama 3 into it. You can use Meta AI for coding and problem solving to see Llama 3's performance for yourself. Whether you're building AI-powered agents or other applications, Llama 3, in 8B and 70B versions, gives you the capabilities and flexibility to bring your ideas to life. We've updated our Responsible Use Guide (RUG) to provide the most comprehensive and up-to-date guidance on responsible development with LLMs. Our system-centric approach includes updates to our trust and security tools, including Llama Guard 2, optimized to support MLCommons' newly announced taxonomy, Code Shield, and CyberSec Eval 2. -
22
Prodigy
Explosion
$490 one-time fee
Radically efficient machine teaching: an annotation tool powered by active learning. Prodigy is a scriptable annotation tool so efficient that data scientists can do the annotation themselves, enabling a new level of rapid iteration. Today's transfer learning technologies mean you can train production-quality models with very few examples, and Prodigy lets you take full advantage of modern machine learning by adopting a more agile approach to data collection. You'll be more productive, more independent, and deliver more successful projects. Prodigy brings together state-of-the-art insights from machine learning and user experience: you only annotate examples the model doesn't already know. The web application is powerful, flexible, and follows modern UX principles. It's designed to help you focus on one decision at a time and keep you clicking, much like Tinder for data. -
23
Seldon
Seldon Technologies
Deploy machine learning models at scale with greater accuracy. Turn R&D into ROI by getting more models into production, and get them working sooner thanks to reduced time-to-value. Scale with confidence and minimize risk through interpretable results and transparent model performance. Seldon Deploy cuts time to production by providing production-grade inference servers optimized for popular ML frameworks, plus custom language wrappers to fit your use cases. Seldon Core Enterprise provides access to trusted, globally tested MLOps software with enterprise-level support. It is designed for organizations that require coverage for any number of ML models plus unlimited users, additional assurances for models in staging and production, and confidence that their ML model deployments are supported and protected. -
24
KServe
KServe
Free
KServe is a standard model inference platform on Kubernetes, built for highly scalable use cases and trusted AI. It provides a performant, standardized inference protocol across ML frameworks and supports modern serverless inference workloads with autoscaling, including scale-to-zero on GPU. KServe offers high scalability, density packing, and intelligent routing with ModelMesh, plus simple, pluggable production ML serving covering pre/post-processing, monitoring, and explainability. Advanced deployments are supported through canary rollouts, experiments, ensembles, and transformers. ModelMesh is designed for high-scale, high-density, and frequently changing model use cases; it intelligently loads and unloads AI models to and from memory to strike a smart trade-off between responsiveness to users and computational footprint. -
25
BentoML
BentoML
Free
Serve your ML model in any cloud within minutes. A unified model packaging format enables both online and offline serving on any platform, and our micro-batching technology delivers up to 100x the throughput of a regular Flask-based model server. Write high-quality prediction services that speak the DevOps language and integrate seamlessly with common infrastructure tools: a unified format for deployment, high-performance model serving, and DevOps best practices baked in. An example service uses the TensorFlow framework and a BERT model to predict the sentiment of movie reviews. The DevOps-free BentoML workflow handles deployment automation, a prediction service registry, and endpoint monitoring, all set up automatically for your team, providing a solid foundation for serious ML workloads in production. Keep all your team's models, deployments, and changes visible, and control access via SSO, RBAC, client authentication, and audit logs. -
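A rough sketch of the unified packaging idea above: saving a trained model to BentoML's local model store and loading it back. The BentoML API has changed across major versions; this follows the 1.x-style save/load calls, and the model name is a placeholder.

```python
# Save a scikit-learn model into the BentoML model store and load it for serving.
import bentoml
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
clf = RandomForestClassifier().fit(X, y)

saved = bentoml.sklearn.save_model("iris_clf", clf)     # versioned entry in the model store
model = bentoml.sklearn.load_model("iris_clf:latest")
print(saved.tag, model.predict(X[:3]))
```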
26
Pillow
Pillow
Free
The Python Imaging Library adds image processing capabilities to your Python interpreter. The library supports many file formats, provides an efficient internal representation, and offers fairly powerful image processing capabilities. The core image library is designed for fast access to data stored in a few basic pixel formats and should provide a solid foundation for a general image processing tool. Pillow for enterprise is available to Tidelift subscribers. The Python Imaging Library is ideal for image archival and batch processing applications: you can use it to create thumbnails, convert between file formats, print images, and more. The current release identifies and reads a large number of formats; write support is restricted to the most common interchange and presentation formats. The library also provides basic image processing functionality, including point operations, filtering with a set of convolution kernels, and color space conversions. -
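As a minimal sketch of the batch-processing tasks mentioned above, here is thumbnail creation plus a format conversion with Pillow; the file names are placeholders.

```python
# Create a thumbnail and convert it from PNG to JPEG.
from PIL import Image

with Image.open("photo.png") as im:
    im.thumbnail((128, 128))                         # resize in place, preserving aspect ratio
    im.convert("RGB").save("photo_thumb.jpg", "JPEG")  # drop alpha channel before saving as JPEG
```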
27
Google Cloud Vertex AI Workbench
Google
$10 per GB
A single development environment for the entire data science workflow. Natively analyze your data without switching between services, and go from data to training at scale: build and train models 5X faster than in traditional notebooks. Scale up model development with simple connectivity to Vertex AI services. Integration with BigQuery, Dataproc, Spark, and Vertex AI simplifies data access and makes machine learning easier, while Vertex AI training lets you experiment and prototype at scale. Vertex AI Workbench lets you manage your training and deployment workflows for Vertex AI from one place. It offers Jupyter-based, fully managed, scalable, enterprise-ready compute infrastructure with security controls, and easy connections to Google Cloud's big data solutions for exploring data and training ML models. -
28
Azure OpenAI Service
Microsoft
$0.0004 per 1000 tokens
Apply advanced language and coding models to a variety of problems. Leverage large-scale, generative AI models with deep understanding of language and code to enable new reasoning and comprehension capabilities for building cutting-edge applications. These models can be applied to a range of use cases, such as writing assistance, code generation, and reasoning over data. Access enterprise-grade Azure security, and detect and mitigate harmful use. Use generative models that have been pretrained on trillions of words for new scenarios including code, reasoning, inferencing, and comprehension. Customize generative models with labeled data for your specific scenario using a simple REST API, and fine-tune your model's hyperparameters to improve output accuracy. Use the API's few-shot learning capability to provide examples and obtain more relevant results. -
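A minimal sketch of calling a deployed model through Azure OpenAI with the OpenAI Python library; the endpoint, API version, and deployment name are placeholders/assumptions specific to your Azure resource.

```python
# Chat completion against an Azure OpenAI deployment (placeholder configuration).
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",
    api_key="YOUR_KEY",
    api_version="2024-02-01",   # assumed; use the version your resource supports
)
response = client.chat.completions.create(
    model="my-gpt-deployment",  # the name of your Azure deployment
    messages=[{"role": "user", "content": "Explain what a data lake is in one sentence."}],
)
print(response.choices[0].message.content)
```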
29
Tekton
Tekton
Free
Tekton is a cloud-native platform for building CI/CD systems. It consists of Tekton Pipelines, which provides the building blocks, and supporting components such as the Tekton CLI and Tekton Catalog, which together make Tekton a complete ecosystem. Tekton standardizes CI/CD processes and tooling across vendors, languages, and deployment environments, and works with Jenkins, Skaffold, Knative, and many other popular CI/CD tools. Because Tekton abstracts the underlying implementation, you can choose the build-test-deploy workflow that suits your team's needs. Tekton makes it quick and easy to create serverless, cloud-native CI/CD systems that scale. -
30
Evidently AI
Evidently AI
$500 per month
The open-source ML observability platform. Evaluate, test, and monitor ML models from validation through production, from tabular data to NLP and LLMs. Built for data scientists and ML engineers, it provides everything you need to run ML systems reliably in production: start with simple ad-hoc checks and scale up to a full monitoring platform, all in one tool with consistent APIs and metrics. Useful, beautiful, and shareable: explore and debug with a comprehensive view of data and ML model quality, and get started in a matter of seconds. Test before shipping, validate in production, and run checks with every model update; by generating test conditions from a reference dataset, you can skip manual setup. Monitor all aspects of your data, models, and test results to proactively catch and resolve production model issues, maintain optimal performance, and continually improve it. -
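A rough sketch of an ad-hoc data drift check with Evidently; the library's API has evolved across releases, so the imports follow the widely documented Report/DataDriftPreset pattern, and the reference/current DataFrames are toy placeholders.

```python
# Compare a "current" dataset against a reference dataset for drift.
import pandas as pd
from evidently.report import Report
from evidently.metric_preset import DataDriftPreset

reference = pd.DataFrame({"feature": [1, 2, 3, 4, 5]})
current = pd.DataFrame({"feature": [2, 3, 4, 8, 9]})

report = Report(metrics=[DataDriftPreset()])
report.run(reference_data=reference, current_data=current)
report.save_html("drift_report.html")   # shareable HTML report
```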
31
BudgetML
ebhy
Free
BudgetML is for practitioners who want to quickly deploy a model to an endpoint without spending a lot of time, money, or effort figuring out how to do it end-to-end. BudgetML was created because it is hard to find a simple way to get a model into production quickly and cheaply: cloud functions have limited memory and are expensive at scale, Kubernetes clusters are overkill for a single model, and deploying from scratch requires a data scientist to learn too many concepts, such as SSL certificate generation, Docker, REST, Uvicorn/Gunicorn servers, and backend servers. BudgetML is the answer to this problem: it is meant to be fast, easy, and developer-friendly. It is not intended as a fully fledged, production-ready setup, but rather a way to get a server up and running as quickly as possible at the lowest possible cost. -
32
Llama 3.1
Meta
Free
An open-source AI model you can fine-tune, distill, and deploy anywhere. Our latest instruction-tuned models are available in 8B, 70B, and 405B versions. With our open ecosystem, you can build faster using a range of differentiated product offerings that support your use cases. Choose between real-time and batch inference, download model weights for further cost-per-token optimization, adapt the model to your application, improve it with synthetic data, and deploy on-prem. Use Llama components and extend the model with RAG and zero-shot tools to build agentic behaviors, and use high-quality data from the 405B model to improve specialized models for specific use cases. -
33
Deepchecks
Deepchecks
$1,000 per month
Release high-quality LLM applications quickly without compromising on testing, and never let the subjective and complex nature of LLM interactions hold you back. Generative AI produces subjective results, so determining whether a generated text is good usually requires manual review by a subject matter expert. If you're developing an LLM application, you probably know that you can't release it without addressing countless constraints and edge cases: hallucinations, incorrect answers, bias, deviations from policy, harmful content, and more all need to be identified, investigated, and mitigated both before and after release. Deepchecks lets you automate the evaluation process, giving you "estimated annotations" that you only need to override when necessary. Our LLM product is built on extensively tested, robust machinery, used by more than 1,000 companies and integrated into over 300 open-source projects. Validate machine learning models and data in both the research and production phases with minimal effort. -
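The last sentence above refers to the classic open-source validation workflow; a rough sketch of it follows, using the tabular train/test validation suite. The dataset split and label column are illustrative assumptions, and the hosted LLM evaluation product works differently.

```python
# Run Deepchecks' tabular train/test validation suite on a toy split.
from deepchecks.tabular import Dataset
from deepchecks.tabular.suites import train_test_validation
from sklearn.datasets import load_iris

data = load_iris(as_frame=True).frame.sample(frac=1, random_state=0)  # shuffle
train_ds = Dataset(data.iloc[:100], label="target")
test_ds = Dataset(data.iloc[100:], label="target")

result = train_test_validation().run(train_dataset=train_ds, test_dataset=test_ds)
result.save_as_html("validation_report.html")
```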
34
Llama 3.2
Meta
Free
The open-source AI models you can fine-tune, distill, and deploy anywhere are now available in more versions. Choose from the lightweight 1B and 3B text models or the 11B and 90B vision models. Llama 3.2 is a collection of pretrained and fine-tuned large language models (LLMs): the 1B and 3B sizes are multilingual and text-only, while the 11B and 90B sizes accept both text and images as inputs and output text. Our latest release lets you build highly efficient and performant applications: use the 1B and 3B models for on-device use cases, such as summarizing a conversation on your phone or calling on-device features like the calendar, and use the 11B and 90B models to transform an existing image or get more information from a picture of your surroundings. -
35
Llama 3.3
Meta
Free
Llama 3.3 is the latest iteration in the Llama series of language models, developed to push the boundaries of AI-powered understanding and communication. With enhanced contextual reasoning, improved language generation, and advanced fine-tuning capabilities, Llama 3.3 is designed to deliver highly accurate responses across diverse applications. Compared to previous versions, it was trained on a larger dataset, with refined algorithms for more nuanced understanding and reduced biases. Llama 3.3 excels at tasks such as multilingual communication, technical explanation, creative writing, and natural language understanding, making it an indispensable tool for researchers, developers, and businesses. Its modular architecture allows customization for specialized domains while ensuring performance at scale. -
36
Google Cloud Tekton
Google
Tekton is a powerful and flexible Kubernetes-native open-source framework for building continuous integration and delivery (CI/CD) systems. It lets you build, test, and deploy across multiple cloud providers as well as on-premises systems. Standardize your CI/CD tooling with built-in Kubernetes best practices, and run on hybrid or multi-cloud environments for maximum flexibility. -
37
HashiCorp Vault
HashiCorp
Secure, store, and tightly control access to tokens, passwords, certificates, and other secrets to protect sensitive data, using a UI, CLI, or HTTP API. -
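A minimal sketch of writing and reading a secret through Vault's HTTP API using the `hvac` Python client; the address, token, and secret path are placeholders, and the KV v2 secrets engine is assumed to be mounted at its default path.

```python
# Store and retrieve a secret in Vault's KV v2 engine via hvac.
import hvac

client = hvac.Client(url="http://127.0.0.1:8200", token="YOUR_VAULT_TOKEN")
client.secrets.kv.v2.create_or_update_secret(
    path="zenml/db", secret={"username": "svc", "password": "s3cr3t"}
)
read = client.secrets.kv.v2.read_secret_version(path="zenml/db")
print(read["data"]["data"]["username"])   # KV v2 nests the payload under data.data
```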
38
Feast
Tecton
Feast lets you use your offline data to make real-time predictions without building custom pipelines, and it keeps data consistent between offline training and online inference, eliminating training-serving skew. Standardize your data engineering workflows within a consistent framework; teams use Feast as the foundation of their internal ML platforms. Feast doesn't require deploying and managing dedicated infrastructure: it reuses your existing infrastructure and spins up new resources only when needed. Feast is a good fit if you don't want a managed solution and are happy to run and maintain your own implementation, if you have engineers who can support its implementation and management, if you want to build pipelines that transform raw data into features and integrate with other systems, and if you have specific requirements and want to build on an open-source solution. -
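A minimal sketch of reading features for online prediction with Feast; it assumes a feature repository already exists in the current directory and that a `driver_stats` feature view with these fields has been defined and materialized (names follow the Feast quickstart and are assumptions here).

```python
# Fetch online features for a single entity from an existing Feast repo.
from feast import FeatureStore

store = FeatureStore(repo_path=".")
features = store.get_online_features(
    features=["driver_stats:conv_rate", "driver_stats:avg_daily_trips"],
    entity_rows=[{"driver_id": 1001}],
).to_dict()
print(features)
```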
39
Amazon SageMaker Ground Truth
Amazon Web Services
$0.08 per month
Amazon SageMaker Ground Truth lets you label raw data, such as images, text files, and videos, add descriptive labels, generate synthetic data, and create high-quality training datasets for your machine learning (ML) models. SageMaker offers two options, Amazon SageMaker Ground Truth Plus and Amazon SageMaker Ground Truth, which let you either use an expert workforce or create and manage your own data labeling workflows. SageMaker Ground Truth is a data labeling tool that makes data labeling straightforward while also letting you use human annotators through Amazon Mechanical Turk or third-party providers. -
40
AWS AI Services
Amazon
AWS AI Services provide pre-trained, ready-made intelligence for your applications. AI Services integrate easily with your applications to address common use cases such as personalized recommendations, modernizing your contact center, improving safety and security, and increasing customer engagement. Because we use the same deep learning technology that powers Amazon.com and our ML Services, you get quality and accuracy from continuously learning APIs, and no machine learning experience is required. You can catalog assets, automate workflows, and extract meaning from your media and applications. For complete quality control, identify missing components, vehicle and structure damage, and other irregularities. Automated monitoring helps improve operations, identify bottlenecks, and assess safety and quality, and you can quickly extract valuable information from millions of documents. -
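As a rough illustration of one of these pre-trained services, the sketch below calls Amazon Comprehend through boto3 to extract sentiment from text; the region is a placeholder and AWS credentials are assumed to be configured.

```python
# Detect sentiment in a piece of text with Amazon Comprehend.
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")
result = comprehend.detect_sentiment(
    Text="The new deployment pipeline cut our release time in half.",
    LanguageCode="en",
)
print(result["Sentiment"], result["SentimentScore"])
```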
41
Label Studio
Label Studio
The most flexible data annotation tool, and quick to install. Build custom UIs or use pre-built labeling templates, with layouts and templates that can be customized to fit your dataset and workflow. Detect objects in images with boxes, polygons, and key points, partition images into multiple segments, and use ML models to pre-label and optimize the process. Webhooks, a Python SDK, and an API let you authenticate, create projects, import tasks, manage model predictions, and more. ML backend integration saves you time by using predictions to assist your labeling process, and you can connect directly to cloud object storage such as S3 and GCP and label the data there. The Data Manager lets you prepare and manage your datasets with advanced filters, and the platform supports multiple projects, use cases, and data types. You can preview the labeling interface as you type the configuration, with live serialization updates shown at the bottom of the page. -
42
Llama 2
Meta
Free
The next generation of our open-source large language model. This release includes model weights and starting code for pretrained and fine-tuned Llama language models, ranging from 7B to 70B parameters. Llama 2 was pretrained on 2 trillion tokens and has double the context length of Llama 1, and its fine-tuned models have been trained on over 1 million human annotations. Llama 2 outperforms other open-source language models on many external benchmarks, including tests of reasoning, coding, proficiency, and knowledge. Llama 2 was pretrained on publicly available online data sources, while the fine-tuned model, Llama 2 Chat, leverages publicly available instruction datasets and more than 1 million human annotations. We have a broad range of supporters around the world who believe in our open approach to today's AI, companies that have given early feedback and are excited to build with Llama 2. -
43
LangSmith
LangChain
Unexpected results happen all the time with LLMs. With full visibility into the entire chain of calls, you can pinpoint the source of errors and surprises in real time with surgical precision. Unit testing is essential in software engineering for building production-ready, performant applications, and LangSmith provides the same functionality for LLM applications: create test datasets, run your applications against them, and inspect the results without leaving LangSmith. LangSmith gives you mission-critical observability with only a few lines of code. LangSmith was built to help developers harness the power and tame the complexity of LLMs; we don't just build tools, we establish best practices you can rely on, so you can build and deploy LLM applications with confidence. Application-level usage stats, feedback collection, trace filtering, cost measurement, dataset curation, chain performance comparison, AI-assisted evaluation, and embracing best practices. -
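A minimal sketch of the "few lines of code" observability described above: decorating a function with the LangSmith Python client so each call is logged as a run. It assumes a LangSmith API key and tracing environment variables are set; the function body is a stand-in for a real LLM call.

```python
# Trace a function with LangSmith's @traceable decorator.
from langsmith import traceable

@traceable(name="summarize")
def summarize(text: str) -> str:
    # placeholder for a real LLM call; inputs/outputs are recorded as a run
    return text[:80] + "..."

print(summarize("LangSmith records inputs, outputs, and latency for this call."))
```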
44
DBRX
Databricks
Databricks has created DBRX, an open, general-purpose LLM. DBRX sets a new standard for open LLMs and gives open communities and enterprises building their own LLMs capabilities that were previously available only through closed model APIs. According to our measurements, it surpasses GPT-3.5 and is competitive with Gemini 1.0 Pro. It is an especially capable code model, surpassing specialized models such as CodeLLaMA-70B, while also having the strengths of a general-purpose LLM. This state-of-the-art quality comes with marked improvements in both training and inference performance. DBRX advances the efficiency of open models thanks to its fine-grained mixture-of-experts (MoE) architecture: inference is up to 2x faster than LLaMA2-70B, and DBRX is about 40% of the size of Grok-1 in both total and active parameter counts. -
45
ARGO
ARGO
Are your fraud losses higher than you expected? Is your fraud prevention effectiveness below 95%? Are you losing money at the ATM and on the teller line? Are your check verification thresholds higher than $500? Are you spending more than 0.01% of your bank's assets on systems or analysts to review suspects? Are you reviewing more than 250 checks for every item worth returning? Stop wasting time and money. ARGO provides all-in-one check, ACH, ATM, wire, and cash fraud security solutions, combining increased fraud prevention for financial transactions with case management options and compliance reporting. Innovative technology connects financial services and healthcare customers. -
46
Azure Kubernetes Service (AKS)
Microsoft
Azure Kubernetes Service (AKS) is a fully managed service that makes it easy to deploy and manage containerized applications. It offers serverless Kubernetes, integrated continuous integration/continuous delivery (CI/CD), and enterprise-grade security and governance. Bring your development and operations teams together on a single platform to rapidly build, deliver, and scale applications with confidence. Elastically provision additional capacity without managing the infrastructure, and add event-driven autoscaling with KEDA. Azure Dev Spaces provides a faster end-to-end development experience, including integration with Visual Studio Code Kubernetes tools and Azure DevOps. Azure Policy enables advanced identity and access management and dynamic rule enforcement across multiple clusters, and AKS is available in more regions than any other cloud provider. -
47
PostgreSQL
PostgreSQL Global Development Group
PostgreSQL is a powerful, open-source object-relational database system with over 30 years of active development that has earned it a strong reputation for reliability and feature robustness. -
48
Apache Spark
Apache Software Foundation
Apache Spark™ is a unified analytics engine for large-scale data processing. It achieves high performance for both batch and streaming data using a state-of-the-art DAG scheduler, a query optimizer, and a physical execution engine. Spark offers over 80 high-level operators that make it easy to build parallel apps, and you can use it interactively from the Scala, Python, R, and SQL shells. Spark powers a stack of libraries including SQL and DataFrames, MLlib for machine learning, GraphX, and Spark Streaming, and these libraries can be combined seamlessly in the same application. Spark runs on Hadoop, Apache Mesos, and Kubernetes, standalone, or in the cloud, and it can access a variety of data sources. You can run Spark in standalone cluster mode, on EC2, on Hadoop YARN, or on Mesos, and access data in HDFS and Alluxio. -
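A minimal sketch of the DataFrame API mentioned above, driven from Python (PySpark); the toy rows and column names are illustrative assumptions.

```python
# Create a small DataFrame and filter it with Spark SQL expressions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("zenml-demo").getOrCreate()
df = spark.createDataFrame(
    [("train", 0.91), ("eval", 0.88)], ["stage", "accuracy"]
)
df.filter(df.accuracy > 0.9).show()
spark.stop()
```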
49
Weights & Biases
Weights & Biases
Weights & Biases provides experiment tracking, hyperparameter optimization, and model and dataset versioning. Track, compare, and visualize ML experiments with just 5 lines of code: add a few lines to your script, and each time you train a new version of your model you'll see live updates on your dashboard. Optimize models with our massively scalable hyperparameter search tool; Sweeps are lightweight and plug into your existing infrastructure. Save every detail of your machine learning pipeline, including data preparation, data versions, training, and evaluation, and make it easier than ever to share project updates. Add experiment logging to your script in a matter of minutes; our lightweight integration works with any Python script. W&B Weave helps developers build and iterate on their AI applications with confidence. -
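A minimal sketch of the few-lines integration described above; the project name and metric values are placeholders, and a W&B API key is assumed to be configured.

```python
# Start a W&B run and stream metrics to the dashboard.
import wandb

run = wandb.init(project="zenml-demo", config={"lr": 0.001, "epochs": 3})
for epoch in range(3):
    wandb.log({"epoch": epoch, "loss": 1.0 / (epoch + 1)})
run.finish()
```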
50
MLflow
MLflow
MLflow is an open-source platform for managing the ML lifecycle, including experimentation, reproducibility, deployment, and a central model registry. MLflow currently has four components: Tracking records and queries experiments (code, data, config, and results); Projects packages data science code in a format that can be reproduced on any platform; Models deploys machine learning models in diverse serving environments; and the Model Registry stores, annotates, discovers, and manages models in a central repository. The MLflow Tracking component provides an API and UI for logging parameters, code versions, and metrics, and for visualizing the results later. MLflow Tracking lets you log and query experiments using the Python, REST, R, and Java APIs. An MLflow Project is a format for packaging data science code in a reusable and reproducible way, based primarily on conventions, and the Projects component includes an API and command-line tools for running projects.
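A minimal sketch of the MLflow Tracking API described above: logging parameters and metrics for a run. The experiment name and values are placeholders; by default, results are written to a local ./mlruns directory.

```python
# Log a parameter and a metric over two steps in an MLflow run.
import mlflow

mlflow.set_experiment("zenml-demo")
with mlflow.start_run():
    mlflow.log_param("learning_rate", 0.001)
    mlflow.log_metric("val_accuracy", 0.92)
    mlflow.log_metric("val_accuracy", 0.94, step=1)
```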