From the course: Snowpark for Data Engineers


Using Snowpark from a locally running Jupyter Notebook


- [Instructor] It's possible to process data in Snowflake using Snowpark by developing within your local environment. Here I'll show you how to connect to Snowflake from a Jupyter Notebook, so that you can develop your Snowpark programs locally. In this demo we'll explore some of the functionality that Snowpark DataFrames offer for analyzing and processing your data. You'll see that Snowpark DataFrames are very similar to PySpark DataFrames, and they offer lazy evaluation exactly like Apache Spark does. Just a heads-up: when you're working with Snowpark in the real world, you should run your code within a virtual environment. Since our focus here is on learning how to work with Snowpark DataFrames, I haven't gone through the process of setting up a virtual environment. You need the Snowflake Snowpark Python library in order to get started, so go ahead and install it so that it's available to work with. After pip installing the Snowpark Python…
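The workflow just described, installing the library, creating a session from your local notebook, and running lazily evaluated DataFrame operations, can be sketched as follows. This is a minimal sketch, not code from the course: the account, warehouse, database, and table names are placeholder assumptions, while the API calls (`Session.builder.configs(...).create()`, `session.table(...)`, `col`) come from the `snowflake-snowpark-python` library.

```python
# A minimal sketch of using Snowpark from a local Jupyter Notebook.
# Requires: pip install snowflake-snowpark-python
import os

try:
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col
except ImportError:
    Session = None  # library not installed yet; run the pip command above

# Connection parameters are typically read from environment variables
# rather than hard-coded in the notebook. All names below are placeholders.
connection_parameters = {
    "account": os.environ.get("SNOWFLAKE_ACCOUNT", "<your_account_identifier>"),
    "user": os.environ.get("SNOWFLAKE_USER", "<your_user>"),
    "password": os.environ.get("SNOWFLAKE_PASSWORD", "<your_password>"),
    "warehouse": "COMPUTE_WH",  # placeholder warehouse name
    "database": "DEMO_DB",      # placeholder database name
    "schema": "PUBLIC",
}

if Session is not None and os.environ.get("SNOWFLAKE_ACCOUNT"):
    session = Session.builder.configs(connection_parameters).create()

    # Snowpark DataFrames are lazily evaluated: these lines only build a
    # query plan on the client; nothing runs in Snowflake yet.
    df = (
        session.table("ORDERS")             # placeholder table name
        .filter(col("AMOUNT") > 100)
        .select("ORDER_ID", "AMOUNT")
    )

    # An action such as show() or collect() triggers execution in Snowflake.
    df.show()
    session.close()
```

Because evaluation is lazy, you can chain as many transformations as you like at no cost; Snowpark compiles the whole chain into a single SQL query and executes it in your Snowflake warehouse only when an action runs.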
