Using PySpark, write code that will transform and load the data from the data lake
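A minimal sketch of that transform-and-load step is below. The source path, file format, column names, and target location are assumptions, not part of the original task; adjust them to the actual data lake layout.

```python
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("datalake-transform-load")
    .getOrCreate()
)

# Read raw files from the data lake (hypothetical S3 path and JSON format).
raw = spark.read.json("s3a://data-lake/raw/events/")

# Example transformations: drop duplicates, cast the timestamp,
# derive a date column, and filter out records without an id.
transformed = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("event_id").isNotNull())
)

# Load the curated result back into the data lake, partitioned by date.
(
    transformed.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://data-lake/curated/events/")
)
```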
Using Kafka as the input source for Spark Structured Streaming and Delta Lake as the storage layer, build a complete streaming data pipeline to consolidate our data. Background reading: From Kafka to Delta Lake using Apache Spark Structured Streaming (michelin.io). A sketch of such a pipeline follows.
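The sketch below assumes the `spark-sql-kafka` and `delta-spark` packages are on the classpath; the broker address, topic name, payload schema, and storage paths are hypothetical placeholders.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = (
    SparkSession.builder
    .appName("kafka-to-delta")
    .getOrCreate()
)

# Schema of the JSON payload carried in the Kafka message value (hypothetical).
payload_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

# Read the topic as a streaming source.
kafka_stream = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .option("startingOffsets", "earliest")
    .load()
)

# Kafka delivers key/value as binary; decode the value and parse the JSON payload.
events = (
    kafka_stream
    .select(F.from_json(F.col("value").cast("string"), payload_schema).alias("payload"))
    .select("payload.*")
)

# Continuously append into a Delta table; the checkpoint location lets the
# sink recover and avoid duplicate writes after a restart.
query = (
    events.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "s3a://data-lake/checkpoints/events/")
    .start("s3a://data-lake/delta/events/")
)

query.awaitTermination()
```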