From the course: Data Engineering Project: Build Streaming Ingestion Pipelines for Snowflake with AWS

Final considerations

- [Instructor] With our streaming pipeline successfully up and running, let's talk through a few final considerations as we close out the chapter. While we used an EC2 instance to host Kafka Connect in this course, AWS also offers Amazon MSK Connect, a managed service that handles the hosting for you. Let's quickly walk through how setup would differ if you chose this option in a practical setting. Keep in mind this only works if your MSK cluster is located in a private subnet. First, compress your Kafka Connect libraries into a ZIP file and upload it to an S3 bucket. Under MSK in the console, click Custom plugins, then Create custom plugin, and reference that S3 location. Next, click Connectors in the left panel of the MSK service, then Create connector. Select Use existing plugin and choose the plugin you created in the previous step. Copy the contents from the…
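The console steps above can also be scripted with the AWS CLI. A minimal sketch of the plugin-upload portion follows; the bucket name, file names, and plugin name here are placeholders you would swap for your own, and the connector-creation step (which additionally needs cluster, IAM role, and worker configuration) is omitted:

```shell
# Bundle the Kafka Connect libraries into a ZIP (placeholder paths).
zip -r snowflake-connector-plugin.zip ./kafka-connect-libs/

# Upload the ZIP to an S3 bucket that MSK Connect can read from.
aws s3 cp snowflake-connector-plugin.zip s3://my-plugin-bucket/

# Register the ZIP as a custom plugin with MSK Connect,
# referencing the S3 object by bucket ARN and key.
aws kafkaconnect create-custom-plugin \
  --name snowflake-sink-plugin \
  --content-type ZIP \
  --location '{"s3Location": {"bucketArn": "arn:aws:s3:::my-plugin-bucket", "fileKey": "snowflake-connector-plugin.zip"}}'
```

After the plugin is registered, `aws kafkaconnect create-connector` takes its ARN via the `--plugins` parameter, mirroring the "Use existing plugin" selection in the console.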
