The WhyLabs Blog
Our ideas and thoughts on how to run AI with certainty
In the second article in this series, we break down what to look for in a data quality monitoring solution, the open-source and SaaS tools available, and how to choose the best one for your organization.
Stephen Oladele,
Danny D. Leybzon
May 18, 2022
- ML Monitoring
- Machine Learning
OTHER POSTS
Deploying and Monitoring Made Easy with TeachableHub and WhyLabs
Danny D. Leybzon,
Felipe Adachi
| Mar 16, 2022
Deploying a model into production and maintaining its performance can be harrowing for many Data Scientists, especially without specialized expertise and equipment. Fortunately, TeachableHub and WhyLabs make it easy to get models out of the sandbox and into a production-ready environment.
- integration
- ML Monitoring
- AI Observability
A Comprehensive Overview Of Data Quality Monitoring
Stephen Oladele,
Danny D. Leybzon
| Apr 29, 2022
In the first article in this series, we provide a detailed overview of why data quality monitoring is crucial for building successful data and machine learning systems and how to approach it.
- ML Monitoring
- Machine Learning
WhyLabs Now Available in AWS Marketplace
Maria Karaivanova
| Mar 18, 2022
AWS customers worldwide can now quickly deploy the WhyLabs AI Observatory to monitor, understand, and debug their machine learning models deployed in AWS.
- AI Observability
- WhyLabs
- Machine Learning
- ML Monitoring
How Observability Uncovers the Effects of ML Technical Debt
Bernease Herman,
Alessya Visnjic,
Danny D. Leybzon
| Mar 10, 2022
Many teams test their machine learning models offline but conduct little to no online evaluation after initial deployment. These teams are flying blind—running production systems with no insight into their ongoing performance.
- AI Observability
- WhyLabs
- whylogs
- Thought Leadership
- ML Monitoring
- MLOps
- Data Logging
Deploy your ML model with UbiOps and monitor it with WhyLabs
Danny D. Leybzon
| Jan 5, 2022
Machine learning models can only provide value for a business when they are brought out of the sandbox and into the real world... Fortunately, UbiOps and WhyLabs have partnered to make deploying and monitoring machine learning models easy.
- integration
- AI Observability
- Machine Learning
- ML Monitoring
- MLOps
- Startup
- WhyLabs
AI Observability for All
Alessya Visnjic
| Jan 4, 2022
We’re excited to announce our new Starter edition: a free tier of our model monitoring solution that allows users to access all of the features of the WhyLabs AI observability platform. It is entirely self-service, meaning that users can sign up for an account and get started right away.
- AI Observability
WhyLabs Achieves SOC 2 Type 2 Certification!
Maria Karaivanova
| Dec 27, 2021
We are very happy to announce that we successfully completed our SOC 2 Type 2 examination with zero exceptions. WhyLabs is committed to ensuring that our current and future customers are well informed about the robust capabilities and security of the WhyLabs AI Observatory platform.
- WhyLabs
- Security
Observability in Production: Monitoring Data Drift with WhyLabs and Valohai
Danny D. Leybzon
| Dec 3, 2021
What works today might not work tomorrow. And when a model is in real-world use, serving faulty predictions can lead to catastrophic consequences...
- AI Observability
- ML Monitoring
- MLOps
- Data Logging
- integration
Why You Need ML Monitoring
Danny D. Leybzon
| Dec 2, 2021
Machine learning models are increasingly becoming key to businesses of all shapes and sizes, performing myriad functions... If a machine learning model is providing value to a business, it’s essential that the model remains performant.
- AI
- AI Observability
- Data Logging
- Machine Learning
- ML Monitoring
- MLOps
- WhyLabs
Data Labeling Meets Data Monitoring with Superb AI and WhyLabs
Danny D. Leybzon
| Nov 29, 2021
Data quality is the key to a performant machine learning model. That’s why WhyLabs and Superb AI are on a mission to ensure that data scientists and machine learning engineers have access to tools designed specifically for their needs and workflows.
- integration
- DataOps
- Image Data
- Data Logging
- ML Monitoring
- AI Observability
Running and Monitoring Distributed ML with Ray and whylogs
Anthony Naddeo,
Danny D. Leybzon
| Nov 23, 2021
Running and monitoring distributed ML systems can be challenging. Fortunately, Ray makes parallelizing Python processes easy, and the open source whylogs enables users to monitor ML models in production, even if those models are running in a distributed environment.
- open source
- whylogs
- integration
- AI Observability
- Logging
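The key property that makes profiling work across a distributed system like Ray is that profiles are mergeable: each worker summarizes its own partition, and the summaries combine into a profile of the full dataset. The sketch below illustrates that idea with a hypothetical, stripped-down column profile; it is not the whylogs API, just the merge pattern under the hood.

```python
from dataclasses import dataclass

# Hypothetical minimal column profile illustrating mergeability;
# real whylogs profiles track far richer statistics (distributions,
# cardinality sketches, etc.), but merge in the same spirit.
@dataclass
class ColumnProfile:
    count: int = 0
    total: float = 0.0
    minimum: float = float("inf")
    maximum: float = float("-inf")

    def track(self, value: float) -> None:
        # Update summary statistics for one observed value.
        self.count += 1
        self.total += value
        self.minimum = min(self.minimum, value)
        self.maximum = max(self.maximum, value)

    def merge(self, other: "ColumnProfile") -> "ColumnProfile":
        # Merging two profiles yields the profile of the combined data,
        # so each worker can profile its partition independently.
        return ColumnProfile(
            count=self.count + other.count,
            total=self.total + other.total,
            minimum=min(self.minimum, other.minimum),
            maximum=max(self.maximum, other.maximum),
        )

# Two "workers" profile disjoint partitions, then the results are merged.
left, right = ColumnProfile(), ColumnProfile()
for v in [1.0, 2.0, 3.0]:
    left.track(v)
for v in [10.0, 20.0]:
    right.track(v)

combined = left.merge(right)
print(combined.count, combined.minimum, combined.maximum)  # 5 1.0 20.0
```

Because merging is associative, the same pattern scales from two workers to an arbitrary Ray cluster: profile locally, merge centrally.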
Monitor your SageMaker model with WhyLabs
Danny D. Leybzon
| Nov 18, 2021
In this blog post, we will dive into the WhyLabs AI Observatory, a data and ML monitoring and observability platform, and show how it complements Amazon SageMaker.
- SageMaker
- AI Observability
- Machine Learning
- integration
- WhyLabs
Deploy and Monitor your ML Application with Flask and WhyLabs
WhyLabs Staff
| Nov 9, 2021
In this article, we deploy a Flask application for pattern recognition based on the well-known Iris dataset. For application monitoring, we’ll explore the free Starter edition of the WhyLabs Observability Platform to set up our own model monitoring dashboard.
- AI Observability
- flask
- whylogs
- WhyLabs
- ML Monitoring
- MLOps
- Getting Started
WhyLabs Raises $10M from Andrew Ng, Defy Partners to bring AI observability to every AI practitioner
WhyLabs Staff
| Nov 4, 2021
SEATTLE, November 4, 2021 — WhyLabs, the leading provider of observability for AI and data applications, announced today the close of a $10 million Series A co-led by Defy Partners and Andrew Ng’s AI Fund.
- MLOps
- WhyLabs
- DataOps
- Andrew Ng
- AI Observability
Detecting Semantic Drift within Image Data: Monitoring Context-Full Data with whylogs
Leandro G. Almeida
| Aug 7, 2021
Concept drift can originate at different stages of your data pipeline, even before data collection itself.
In this article, we’ll show how whylogs can help you monitor your machine learning system’s data ingestion pipeline by enabling concept drift detection, specifically for image data.
- Data Analytics
- Data Logging
- Semantic Drift
- Image Data
- whylogs
- MLOps
- ML Monitoring
Don’t Let Your Data Fail You; Continuous Data Validation with whylogs and Github Actions
WhyLabs Staff
| Jul 20, 2021
Ensuring data quality should be among your top priorities when developing an ML pipeline. In this article, we’ll show how whylogs constraints with GitHub Actions can help with data validation, a key component of ensuring data quality.
- whylogs
- Data Logging
- Data Validation
- Github Actions
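At its core, constraint-based data validation means running named checks against each batch of data and failing the pipeline when any check fails. The sketch below is a hypothetical, minimal version of that pattern in plain Python (the post itself uses whylogs constraints); the check names and columns are made up for illustration.

```python
# Hypothetical constraint checks: each returns True if the batch passes.
def no_nulls(rows, column):
    # Every row must have a non-null value in this column.
    return all(row.get(column) is not None for row in rows)

def in_range(rows, column, low, high):
    # Every value in this column must fall inside [low, high].
    return all(low <= row[column] <= high for row in rows)

# An example batch, standing in for data arriving in a pipeline.
batch = [
    {"age": 34, "score": 0.91},
    {"age": 27, "score": 0.55},
]

checks = {
    "age has no nulls": no_nulls(batch, "age"),
    "score is a probability": in_range(batch, "score", 0.0, 1.0),
}

# In CI (e.g. a GitHub Actions job), a failing check fails the build,
# stopping bad data before it reaches training or production.
failed = [name for name, passed in checks.items() if not passed]
if failed:
    raise SystemExit(f"data validation failed: {failed}")
print("all checks passed")
```

Wiring a script like this into a workflow step turns data quality into a gate on every commit, the same way unit tests gate code quality.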
WhyLabs' Data Geeks Unleashed
Alessya Visnjic,
Leandro G. Almeida,
Andy Dang,
Bernease Herman
| May 21, 2021
This month, three members of the WhyLabs team are speaking at the Data and AI Summit. In this post you’ll find descriptions of, and links to, the talks by Alessya Visnjic, Leandro Almeida, and Andy Dang.
- Data Science
- Thought Leadership
- Big Data
- Data Analytics
- Data Logging
Integrating whylogs into your Kafka ML Pipeline
Chris Warth,
Alessya Visnjic
| Apr 7, 2021
Evaluating the quality of data in the Kafka stream is a non-trivial task due to large volumes of data and latency requirements. This is an ideal job for whylogs, an open-source package for Python or Java that uses Apache DataSketches to monitor and detect statistical anomalies in streaming data.
- Machine Learning
- Logging
- kafka
- whylogs
- MLOps
Monitoring High-Performance Machine Learning Models with RAPIDS and whylogs
Andy Dang,
Bernease Herman
| Mar 1, 2021
Machine learning (ML) data is big and messy. Organizations have increasingly adopted RAPIDS and cuML to help their teams run experiments faster and achieve better model performance on larger datasets.
- Apache Spark
- Data Analytics
- Data Logging
- Machine Learning
- MLflow
- MLOps
- RAPIDS
- whylogs
Streamlining data monitoring with whylogs and MLflow
Alex Sudbinin
| Feb 8, 2021
It's hard to overstate the importance of monitoring data quality in ML pipelines. In this post we will explore an elegant solution with whylogs and MLflow, which allows for a more informed analysis of model performance.
- Machine Learning
- Logging
- MLOps
- whylogs
- MLflow
Data Logging: Sampling versus Profiling
Isaac Backus,
Bernease Herman
| Oct 29, 2020
In traditional software, logging and instrumentation have been adopted as standard practice to create transparency and to make sense of the health of a complex system. When it comes to AI applications, the lack of tools and standardized approaches means that logging is often spotty and incomplete.
- MLOps
- Data Science
- Logging
- Artificial Intelligence
- Machine Learning
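The sampling-versus-profiling distinction the post draws can be made concrete: sampling retains a subset of raw records, while profiling retains aggregate statistics computed over every record. The following is a minimal illustrative sketch of that contrast, not code from the post.

```python
import random

random.seed(0)  # deterministic for illustration
data = list(range(1000))

# Sampling: keep a fixed-size random subset of the raw data.
# Cheap to store, but rare events can be missed entirely.
sample = random.sample(data, k=10)

# Profiling: keep summary statistics that cover every record.
# Constant-size output regardless of how much data flows through.
profile = {
    "count": len(data),
    "min": min(data),
    "max": max(data),
    "mean": sum(data) / len(data),
}

print(len(sample), profile["count"], profile["mean"])
```

The sample's size is fixed no matter how the data grows, but so is the profile's, and the profile is guaranteed to reflect outliers like the min and max, which a small sample can easily miss.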
WhyLabs: The AI Observability Platform
Alessya Visnjic
| Sep 23, 2020
Companies across industries are adopting AI applications to improve products and stay competitive, yet very few have seen a return on their investments. That’s because AI operations are expensive...
- AI
- Machine Learning
- Data Science
- Data Visualization
- Artificial Intelligence
Introducing WhyLabs, a Leap Forward in AI Reliability
Alessya Visnjic
| Sep 23, 2020
Today, we are excited to announce WhyLabs, a company that empowers AI practitioners to reap the benefits of AI without the spectacular failures that so often make the news.
- AI
- Machine Learning
- Startup
- Seattle
whylogs: Embrace Data Logging Across Your ML Systems
Andy Dang,
Bernease Herman
| Sep 23, 2020
Fire up your MLOps with a scalable, lightweight, open source data logging library
We are thrilled to announce the open-source package whylogs. It enables data logging for any ML/AI pipeline in a few lines of code. Data logging is a critic...
- Data Science
- MLOps
- DevOps
- Machine Learning
- Logging