From the course: Implementing Data Engineering Solutions Using Microsoft Fabric (DP-700) Cert Prep by Microsoft Press


Using Eventstreams to process data

Using event streams to process data. Now when we're using event streams, this is our low-to-no-code method of connecting to data, applying the available transforms, and then saving the results to a destination downstream. We can also connect using the Apache Kafka protocol, so when we create an event stream, if we have any streaming sources that use Kafka, we can connect to them and process that data. So here's an example of an event stream. As you can see, this is a very graphical way of streaming and processing data. I'm loading some data and saving it to a lakehouse, and I'm also applying a transformation function, an aggregate, and saving that result to the lakehouse as well. We can drag and drop these transformations: filtering, aggregating, joining, and essentially we can transform and shape that data, and then save it to a destination. For destinations we have lakehouses, eventhouses, custom endpoints, Fabric Activator, and another event stream to save our…
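To make the filter and aggregate transforms mentioned above concrete, here is a minimal Python sketch of the same logic outside the Eventstream GUI. The event shape (a `device` field and a `temp` reading), the threshold value, and the function names are all hypothetical illustrations, not part of the Fabric Eventstream API; in Fabric itself these steps are configured visually on the canvas.

```python
from collections import defaultdict

# Hypothetical sensor events, shaped like records a streaming source might emit.
events = [
    {"device": "A", "temp": 21.5},
    {"device": "B", "temp": 98.0},
    {"device": "A", "temp": 22.5},
    {"device": "B", "temp": 102.0},
]

def filter_events(stream, threshold=100.0):
    """Mirrors a Filter transform: drop readings above a threshold."""
    return [e for e in stream if e["temp"] <= threshold]

def aggregate_events(stream):
    """Mirrors an Aggregate transform: average temperature per device."""
    sums, counts = defaultdict(float), defaultdict(int)
    for e in stream:
        sums[e["device"]] += e["temp"]
        counts[e["device"]] += 1
    return {d: sums[d] / counts[d] for d in sums}

filtered = filter_events(events)       # drops the 102.0 reading
averages = aggregate_events(filtered)  # per-device averages, ready for a destination
```

In a real event stream, the final dictionary would instead flow to a configured destination such as a lakehouse table.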