From the course: Databricks Certified Data Engineer Associate Cert Prep

Orchestrating workloads with jobs

- [Instructor] Here we have a Databricks Jobs overview. Let's take a look at what a typical job would look like. Say I wanted to interact with a notebook. This notebook could, for example, do ETL operations, create some new tables, or ingest some data. It could even create a dashboard automatically. That's a fine thing to do manually, but it's even better if you can automate it. That's where jobs come in: I can set up a new job here and hook it up to a notebook I've already created in, for example, a source code repo. Notice that we can point the job at the Databricks workspace, which is the default location, but I could also point it at the source code repo. That's the best practice here: I would check that notebook in. Again, it could create some new tables, ingest some data, and maybe create a dashboard.…
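The setup described above can be sketched as a Jobs API payload. This is a minimal illustration, not the instructor's exact configuration: the job name, repo URL, notebook path, and cluster settings are all placeholder assumptions. The key idea is the `git_source` block, which points the job at a source-code repo instead of the workspace.

```python
# Sketch of a Databricks Jobs API 2.1 job definition that runs a
# notebook from a Git repo (values below are illustrative placeholders).
job_spec = {
    "name": "nightly-etl",
    "tasks": [
        {
            "task_key": "run_etl_notebook",
            "notebook_task": {
                # Path of the notebook inside the repo, not the workspace
                "notebook_path": "notebooks/etl",
                "source": "GIT",
            },
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
        }
    ],
    # This is what ties the job to the checked-in notebook: the
    # "best practice" of running from a source-code repo.
    "git_source": {
        "git_url": "https://github.com/example/etl-repo",  # placeholder
        "git_provider": "gitHub",
        "git_branch": "main",
    },
}

# The spec would be sent to the workspace's /api/2.1/jobs/create
# endpoint (e.g. via the Databricks CLI or an HTTP client).
```

Because the notebook path is resolved against the repo at run time, each scheduled run picks up whatever is on the configured branch, so the job always executes the reviewed, checked-in version of the notebook.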
