We are launching a free Open-source ML observability course for data scientists and ML engineers!
Join us to learn production ML monitoring.
🗓 Start date: October 16, 2023.
The foundations of ML observability: from pre-deployment exploratory data analysis and model evaluation to continuous production monitoring and debugging.
The Open-source ML observability course starts on October 16, 2023.
Sign up to save your seat and receive weekly course updates.
Jump to the course website for all useful links and course notes. We will continue to publish new materials throughout the course.
Our course is organized into six modules. You can follow the complete course syllabus or pick only the modules that are most relevant to you.
📚 Module 1: Introduction to ML monitoring and observability
Basics of ML monitoring and observability.
📈 Module 2: ML monitoring metrics: model quality, data quality, data drift
Metrics and evaluation methods for structured data.
🔡 Module 3: ML monitoring for unstructured data: NLP, LLM and embeddings
Metrics and evaluation methods for unstructured data.
🏗 Module 4: Designing effective ML monitoring
Key questions to consider when customizing ML monitoring for your model.
✅ Module 5: ML pipelines validation and testing
How to deploy an end-to-end pipeline to check data and ML model quality.
📊 Module 6: Deploying an ML monitoring dashboard
How to deploy an ML monitoring service and design a live dashboard.
Emeli Dral is a Co-founder and CTO at Evidently AI, a startup developing open-source tools to evaluate, test, and monitor the performance of machine learning models.
Earlier, she co-founded an industrial AI startup and served as the Chief Data Scientist at Yandex Data Factory. She led over 50 applied ML projects across various industries, from banking to manufacturing. Emeli is a data science lecturer at GSOM SPbU and Harbour.Space University. She is a co-author of the Machine Learning and Data Analysis curriculum on Coursera with over 150,000 students.
Yes, the course is 100% free.
The course is organized into six modules. Each module consists of on-demand videos and practical code examples. We will release a new module every week, starting on October 16, 2023.
To earn a certificate of completion, you must enroll on the course platform and complete the quizzes and a final assignment before December 1, 2023.
This is optional: you can also learn at your own pace and go through the videos, which will remain publicly available after the course cohort ends. You can follow the complete course syllabus or choose only the modules most relevant to you.
The course lasts seven weeks. There are six modules with learning materials, and one extra week to complete the final assignment.
However, you can also learn at your own pace — whatever works best for you!
You will receive a certificate if you join the course and successfully complete all the assignments before December 1, 2023.
Note that the option to receive the certificate is available only to those who participate in the course cohort. If you are taking the course after December 1, 2023, you can freely access the materials, but you won’t be eligible to receive the certificate.
The course includes both theoretical and code-focused modules; the practical parts require knowledge of Python. But no worries: we will walk you through the code! You can also skip these parts and still learn a lot.
Have a question or just want to say “Hi”? Jump to our Discord #ml-observability-course channel to chat with fellow learners and get support from the course team.
We will also host weekly Q&A sessions over Zoom with the course instructor, Emeli Dral.
We’ll send all the information to those who signed up for the course before October 16, 2023. Please make sure that you sign up for the course to stay updated.
Jump to the course website for all useful links and course notes. We will continue to publish new materials throughout the course.
Yes, sign up here.
Yes! All course materials are public so you can get back to them at your convenience, during or after the course.
In the Evidently GitHub repository, we added a special set of issues labeled “hacktoberfest.”
Check out the issues we prepared for Hacktoberfest. You can pick one of them or propose a different metric.
Choose the drift method you want to implement and submit your pull request!
Wait for your pull request to be reviewed.
Evidently is an open-source Python library for data scientists and ML engineers. It helps evaluate, test, and monitor the performance of ML models from validation to production. You can check it out on GitHub or explore the documentation.
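To give a flavor of what this looks like in practice, here is a minimal sketch of a data drift check with Evidently. It uses the `Report` and `DataDriftPreset` API from the Evidently releases available around the course launch and a toy dataset split into reference and current batches purely for illustration; check the documentation for the exact API of your installed version.

```python
# Minimal sketch: compare a "current" data batch against a "reference" batch
# and generate an interactive data drift report with Evidently.
from sklearn import datasets

from evidently.report import Report
from evidently.metric_preset import DataDriftPreset

# Toy example data: split the iris dataset into reference and current batches.
iris = datasets.load_iris(as_frame=True).frame
reference = iris.iloc[:75]
current = iris.iloc[75:]

# Build and run a data drift report comparing the two batches.
report = Report(metrics=[DataDriftPreset()])
report.run(reference_data=reference, current_data=current)

# Save the interactive HTML report to inspect drift per column.
report.save_html("data_drift_report.html")
```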
Hacktoberfest is an annual event that celebrates open source and encourages contributions. It is running for the 9th time this year. Eligible participants will get prizes from DigitalOcean. But first and foremost, it is a great reason to create your first (or hundredth) pull request! You can check out the complete rules here.
Evidently is an open-source project, and is always open for contributions. For Hacktoberfest, we added a special set of issues labeled “hacktoberfest” to the Evidently GitHub repository. We invite data scientists to dip their toes into open-source contribution and help us add new statistical metrics and tests to detect data drift for production ML models. Check out Hacktoberfest issues on our GitHub. Head to the Evidently Hacktoberfest guide for clear steps and detailed examples. Sign up to receive the kick-off newsletter.
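For a sense of what a drift method involves, below is a library-agnostic sketch of a simple statistical drift test: a two-sample Kolmogorov-Smirnov check using scipy. It does not follow Evidently’s internal interfaces; the Hacktoberfest issues and the Evidently Hacktoberfest guide describe the actual structure a contribution should follow.

```python
# Illustrative sketch of a statistical drift test on one numerical feature.
# Not Evidently's internal interface; see the Hacktoberfest issues for the
# structure an actual contribution should follow.
import numpy as np
from scipy import stats


def ks_drift_detected(reference: np.ndarray, current: np.ndarray,
                      threshold: float = 0.05) -> bool:
    """Return True if the two-sample KS test p-value falls below the threshold,
    i.e. the current distribution likely drifted from the reference."""
    statistic, p_value = stats.ks_2samp(reference, current)
    return p_value < threshold


# Example usage: the current batch is shifted relative to the reference.
rng = np.random.default_rng(42)
reference = rng.normal(loc=0.0, scale=1.0, size=1_000)
current = rng.normal(loc=0.5, scale=1.0, size=1_000)
print(ks_drift_detected(reference, current))  # True for this shifted sample
```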
Don’t forget to register for Hacktoberfest by October 31! If you register and are among the first 40,000 participants to have 4 pull requests accepted, you can get a prize. Read more here.