From the course: Microsoft Azure Data Scientist Associate (DP-100) Cert Prep


Deploy a model to a batch endpoint

- [Instructor] Let's take a look at an end-to-end MLOps model workflow with Databricks, and how you can take a Databricks and MLflow workflow and port it to another platform if you'd like. So here's a good example. I have Kaggle here, where I could go in and pick pretty much any classification project and upload it into Databricks. Once I've uploaded the dataset into Databricks, I could use DBFS and the UI to create a table. Once I've done that, I could create an AutoML experiment. Once that AutoML experiment is completed, I would register the best model and then put that into a Databricks endpoint. If I chose to serve it out via Databricks, I don't necessarily have to do that, but I can. I also could call the MLflow API from any cloud environment, from Azure, from GitHub Codespaces, from AWS Cloud9, and I could develop a microservice-based approach and push that into some other environment. In fact, Amazon's Elastic Container Registry (ECR) could be one option. I…
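To make the "call it from any cloud environment" step concrete, here is a minimal sketch of invoking a Databricks model serving endpoint over its REST API using only the Python standard library. The workspace URL, endpoint name, token, and feature names below are all placeholder assumptions, not values from the course; the request shape (`/serving-endpoints/<name>/invocations` with a JSON `dataframe_records` payload) follows the Databricks model serving API.

```python
import json
import urllib.request


def build_invocation_request(workspace_url, endpoint_name, token, records):
    """Build (but do not send) an HTTP request for a Databricks
    model serving endpoint. All arguments are values you would
    supply for your own workspace."""
    url = f"{workspace_url}/serving-endpoints/{endpoint_name}/invocations"
    payload = json.dumps({"dataframe_records": records}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Placeholder values for illustration only; no network call is made here.
req = build_invocation_request(
    "https://adb-1234567890.1.azuredatabricks.net",  # hypothetical workspace
    "kaggle-classifier",                             # hypothetical endpoint
    "dapi-example-token",                            # hypothetical token
    [{"feature_a": 1.0, "feature_b": 0.5}],
)
# To actually send it you would call urllib.request.urlopen(req)
# from Azure, GitHub Codespaces, AWS Cloud9, or anywhere else.
```

Because the request is plain HTTPS with a bearer token, the same snippet works unchanged from any environment that can reach the workspace, which is the point of the microservice-based approach described above.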
