Use Transfer Service to ingest data - BigQuery Tutorial
From the course: Build a No-Code ETL Pipeline with Google BigQuery
- [Instructor] We are finally ready to set up the transfer configuration and send the data to our table. So I am back in BigQuery. And I'm going to look at my table, which is in the dataset kaggle_stocks. And it's called stock_data. So if you followed the script that I sent, you should find your table in the same place. And here is the table schema that we defined. And if I select preview, I will get a preview of the content of the table, which right now is empty. There are no rows because we haven't sent any data to it. Next, I will go to this panel on the left and I will select data transfers. So this is the homepage for the BigQuery data transfer service. And what I want to do is go here and select create transfer. Now, the first thing that I need to decide is, what is the source for my data? And in our case, the source is Google Cloud Storage. Next, I can choose a name for my transfer configuration. So I can call this stock_data one, for example. You can choose a name that fits…
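The course sets this configuration up entirely in the BigQuery console, with no code. For reference only, roughly the same transfer could be created programmatically with the BigQuery Data Transfer Service Python client. This is a minimal sketch, not the course's method: the project ID, bucket path, and CSV settings below are assumptions, since the excerpt only names the dataset (kaggle_stocks), the table (stock_data), and the source (Google Cloud Storage).

```python
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

# Hypothetical project ID and bucket path; the transcript does not name them.
project_id = "your-project-id"

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="kaggle_stocks",          # dataset from the course
    display_name="stock_data",                       # transfer configuration name
    data_source_id="google_cloud_storage",           # source chosen in the transcript
    params={
        "data_path_template": "gs://your-bucket/stocks/*.csv",  # assumed file location
        "destination_table_name_template": "stock_data",        # target table
        "file_format": "CSV",                                    # assumed format
        "skip_leading_rows": "1",                                # assumed header row
    },
)

# Create the transfer configuration in the project.
created = client.create_transfer_config(
    parent=client.common_project_path(project_id),
    transfer_config=transfer_config,
)
print(f"Created transfer config: {created.name}")
```

Once created, the configuration appears on the same Data Transfers page shown in the video, where runs can be triggered or scheduled.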
Contents
- How data load will work (1m 40s)
- Introduction to data (4m 44s)
- What is Google Cloud Storage? (2m 42s)
- Put data in Google Cloud Storage (3m 35s)
- Create table in BigQuery (4m 35s)
- Introduction to BigQuery Data Transfer Service (1m 43s)
- How we will manage data (6m 4s)
- Use Transfer Service to ingest data (6m 40s)
- Schedule transfers with Transfer Service (3m 19s)
- Identify data transfer issues (6m 18s)
- Common issues with data transfer (5m 21s)