From the course: Microsoft Azure Data Scientist Associate (DP-100) Cert Prep
Configure compute for a batch deployment - Azure Tutorial
Configure compute for a batch deployment
- When you're deploying models via Azure ML Studio, a couple of options include batch and real time. There's also the ability to deploy directly as a web service, which essentially abstracts away some of the complexity of packaging a model. When you're doing batch, you're often going to be running it periodically. So let's say once a night you would go through and do credit card scoring. In the case of real time, you would do this on a 24/7 basis. So you would go through and have some endpoint deployed and continuously send requests to this real-time endpoint. So really there are a couple of key options, and then there's a peripheral option, which packages the entire service. Let's go ahead and take a look at how this would work. So if we go over to Azure ML Studio here, and we dive into the model prediction interface, one of the things that we see is that there's deploy to real time, deploy to batch, and deploy to service. So this is only for models that are…
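The batch-versus-real-time distinction described above can be sketched in plain Python. This is a minimal illustration, not Azure ML code: `score` is a hypothetical stand-in for a deployed model, and the credit card threshold rule is invented for the example. Batch mode processes an accumulated queue on a schedule (like the nightly scoring mentioned above), while real-time mode handles one request at a time against an always-on endpoint.

```python
def score(record):
    # Hypothetical stand-in for a deployed model:
    # a toy credit card rule that flags large transactions.
    return "review" if record["amount"] > 1000 else "approve"

def batch_score(records):
    """Batch mode: run periodically (e.g. once a night) over
    all records accumulated since the last run."""
    return [score(r) for r in records]

def realtime_score(record):
    """Real-time mode: an always-on endpoint scores each
    request individually as it arrives, 24/7."""
    return score(record)

# Nightly batch job scores the whole day's queue at once...
overnight_queue = [{"amount": 50}, {"amount": 2500}]
print(batch_score(overnight_queue))

# ...while a real-time endpoint answers a single request immediately.
print(realtime_score({"amount": 700}))
```

In Azure ML itself, the same split shows up as batch endpoints (scheduled or triggered jobs over stored data) versus online endpoints (an HTTPS service that scores per-request).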
Contents
- Configure compute for a batch deployment (2m 11s)
- Deploy a model to a batch endpoint (4m 2s)
- Test a real-time deployed service (4m 23s)
- Apply machine learning operations (MLOps) practices (4m 32s)
- Trigger an Azure Machine Learning pipeline, including from Azure DevOps or GitHub (2m 36s)
- Conclusion (1m 6s)