From the course: Snowpark for Data Engineers
Using Snowpark from a locally running Jupyter Notebook - Snowflake Tutorial
- [Instructor] It's possible to process data in Snowflake using Snowpark by developing within your local environment. Here I'll show you how to connect to Snowflake from a Jupyter Notebook, so that you can develop your Snowpark programs locally. In this demo, we'll explore some of the functionality that Snowpark DataFrames offer for analyzing and processing your data. You'll see that Snowpark DataFrames are very similar to Apache Spark's PySpark DataFrames, and they offer lazy evaluation exactly like Apache Spark does. Just a heads-up: when you're working with Snowpark in the real world, you'd run your code within a virtual environment. Since our focus is on learning how to work with Snowpark DataFrames, I haven't gone through the process of setting up a virtual environment here. You need the Snowflake Snowpark Python library in order to get started. Go ahead and install this library so that it's available to work with. After pip installing the Snowpark Python…
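The setup described above can be sketched as follows. This is a minimal, illustrative sketch, not the instructor's exact code: it assumes you have a Snowflake account, the connection values are placeholders you must replace, and the table and column names (`SOME_TABLE`, `ID`, `AMOUNT`) are hypothetical.

```python
# Install the library first (run in a terminal or a notebook cell):
#   pip install snowflake-snowpark-python

from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

# Placeholder connection parameters -- replace with your own account details.
connection_parameters = {
    "account":   "<your_account_identifier>",
    "user":      "<your_username>",
    "password":  "<your_password>",
    "role":      "<your_role>",
    "warehouse": "<your_warehouse>",
    "database":  "<your_database>",
    "schema":    "<your_schema>",
}

# Open a Snowpark session against your Snowflake account.
session = Session.builder.configs(connection_parameters).create()

# Snowpark DataFrames are lazily evaluated, like PySpark DataFrames:
# building the query below sends nothing to Snowflake yet.
df = (
    session.table("SOME_TABLE")          # hypothetical table name
    .filter(col("AMOUNT") > 100)
    .select("ID", "AMOUNT")
)

# Execution happens only when an action is called, e.g. show() or collect().
df.show()

session.close()
```

Because evaluation is lazy, you can keep chaining transformations on `df` without incurring any query cost; Snowflake compiles and runs the whole pipeline only when an action such as `show()`, `collect()`, or `count()` is invoked.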
Contents
- Using Snowpark from a locally running Jupyter Notebook (5m 31s)
- Data transformations using Snowpark DataFrames (2m 3s)
- Performing union operations on DataFrames (2m 37s)
- Performing joins (3m 29s)
- Creating views (4m 4s)
- Working with semi-structured data (3m 15s)