From the course: Data Engineering Project: Build Streaming Ingestion Pipelines for Snowflake with AWS
Final considerations
- [Instructor] With our streaming pipeline successfully up and running, let's talk through a few final considerations as we close out the chapter. While we used an EC2 instance to host Kafka Connect in this course, AWS does offer a service called Amazon MSK Connect, which takes care of the hosting duties for you. Let's quickly walk through how setup would differ should you choose this solution in a practical setting. Keep in mind this will only work if your MSK cluster is located in a private subnet. First, compress your Kafka Connect libraries into a ZIP file and upload that ZIP file to an S3 bucket. In the MSK console, click Custom plugins, then Create custom plugin, and reference that S3 location. Next, click Connectors in the left panel of the MSK console, then Create connector. Select Use existing plugin and choose the plugin you created in the previous step. Copy the contents from the…
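The console steps above can also be sketched from the command line. This is a minimal, hedged outline using the AWS CLI's `kafkaconnect` commands; the bucket name, plugin name, and file paths are hypothetical placeholders, and the final `create-connector` call takes many more required arguments (capacity, cluster details, IAM role) than shown here, so treat this as a starting point rather than a complete recipe:

```shell
# Assumptions: AWS CLI v2 configured with credentials, and a bucket
# named "my-connect-plugins" (hypothetical) already created in your region.

# 1. Compress the Kafka Connect libraries into a ZIP and upload to S3.
zip -r snowflake-connector.zip ./kafka-connect-libs
aws s3 cp snowflake-connector.zip s3://my-connect-plugins/snowflake-connector.zip

# 2. Register the ZIP as an MSK Connect custom plugin.
#    Note: --location expects the bucket's ARN, not its name.
aws kafkaconnect create-custom-plugin \
  --name snowflake-sink-plugin \
  --content-type ZIP \
  --location 's3Location={bucketArn=arn:aws:s3:::my-connect-plugins,fileKey=snowflake-connector.zip}'

# 3. The connector itself (console step "Create connector") is created with
#    aws kafkaconnect create-connector, which additionally requires your
#    cluster bootstrap details, a service execution role ARN, and the
#    connector configuration -- see the MSK Connect documentation.
```

The `create-custom-plugin` call returns the plugin's ARN, which is what you reference when creating the connector.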
Contents
- Setting up your MSK cluster and EC2 instance (3m 52s)
- Setting up your keys (5m 8s)
- What is Snowpipe? (1m 42s)
- Installing Snowpipe Kafka connector (3m 26s)
- Set up config for provider-MSK connection and create topic (6m 35s)
- Preparing your Snowflake environment (6m 23s)
- Setting up your Snowpipe Kafka connect config (4m 29s)
- Sending data to Snowflake (3m 25s)
- Final considerations (2m 8s)