From the course: Building Generative AI Apps to Talk to Your Data
Getting answers from data - Snowflake Tutorial
(upbeat music) - Have you ever wanted to talk to your data and have it talk back to you? Wouldn't it be nice if you could just ask your data a question in natural language and have it answer you? Well, by connecting the power of generative AI to data, we can do just that, and in this course, you'll learn how. Hi, I'm Josh. Thank you for joining me on this journey to build generative AI applications with data. This course will teach you how to easily talk to all of your data, whether structured or unstructured, using natural language. I'm excited to get started, and I hope you are too. Let's jump in.

Imagine you're a new hire and you've been given some high-priority questions to answer. You're asked to determine why customer churn is increasing and what the state of the relationship is between clients and account managers. You've also been asked to see if there are any insights that can be gleaned from the last 10 years of saved research papers. You pause and think about it: the structured data (sales numbers, customer demographics, and performance metrics) is all in the database and ready to query. But the emails, the research locked away in PDFs, and the feedback scattered across social media? That's all unstructured and not as easy to get insights from.

The thing is, the volume of this data is getting bigger, and the rate of growth is only accelerating. Now that we can get value out of all of this information, there's even more reason to record and store it. For example, have you joined a video meeting recently and seen a meeting assistant bot join? These AI assistants can now provide automatic summaries and action items for our meetings. Ten years ago, most of these meetings only occurred in the moment, but now they add to the data pile. Another example is wearables. From smartwatches and tech-enabled glasses to audio-recording pendants, there are more and more ways to capture data from our day-to-day lives. Armed with today's generative AI tools, we can now make sense of all of this data, gain longer-term memory, and ask questions about the data that these devices record.

This presents a big opportunity for anyone who wants to get answers to their questions from all of their data, not just the data that sits nicely in tables. Deploying applications that sit on top of this mountain of data gives us new power and insight that we did not have before.

Now is a good time to stop for a second and introduce myself. My name is Josh, and I'm a developer advocate at Snowflake. I joined the Snowflake team via a startup named TruEra that Snowflake acquired, which was focused on building tooling for trustworthy AI. As part of that work, I helped create and maintain an open source project called TruLens, a library that can be used to systematically track and evaluate LLM applications. This set me up well to teach this course on how to build AI applications that use data. My favorite part of my job is delivering tech talks, workshops, and courses to thousands of developers, data scientists, and data professionals just like you. I'm very excited to lead you through this course.

So in this course, we're going to learn how to build applications that enable users to ask questions of their data, whether structured or unstructured, all using natural language and without having to rely on an analyst or a programmer to get those answers.
We'll do this using two approaches: retrieval-augmented generation, or RAG, for unstructured data, and text-to-SQL for structured data. We'll first introduce these approaches in a general sense in this module, and then we'll learn how to implement them using Snowflake in upcoming modules. By the way, in this course we'll use the Cortex COMPLETE function to generate text responses from LLMs given a prompt. If you haven't taken our Intro to Generative AI course, where we learned how to use that function, please check the documentation and read up on how it works.

Want a quick peek at what you'll produce in Modules 2 and 3? Here's a preview of one of the apps you'll build by the end of the course: a quick demo of how you can chat with your data. Let's take a look. Here I'm asking questions like, "What's the highest revenue for each sales region?" In the background, the app first writes the SQL for us, which allows anyone who uses the app to talk to structured data. Then we get the response to our question in natural language. This example of talking with structured data is what you'll build in Module 3. We'll also build a Streamlit app in Module 2 that lets you talk with unstructured data.

So in this module, we'll get hands-on with the key GenAI components that make this work, including RAG, or retrieval-augmented generation, and text-to-SQL. We'll also learn some tricks along the way for building an engaging front end on top of generative AI apps with Streamlit, which will allow streaming and multi-turn conversations. Excited? I am.

In this video, we introduced the course and noted that we'll use RAG for unstructured data and text-to-SQL for structured data to enable users to ask questions of their data. When used together, these two tools can empower application development by enabling dynamic and intelligent interactions with databases, all using natural language. Think about how these tools could impact the performance and workflows around you. In the next video, we'll dive into an overview of how RAG works and how we use it to get answers from unstructured data. (upbeat music)
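To make the Cortex COMPLETE mention above a little more concrete, here is a minimal sketch of calling it from Python through a Snowpark session. The connection parameters are placeholders, and the model name "mistral-large" is only an example; check which models are available in your Snowflake account.

```python
# Minimal sketch: calling Snowflake's Cortex COMPLETE function from Python via Snowpark.
# The connection parameters are placeholders; "mistral-large" is an example model name.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<your_account>",
    "user": "<your_user>",
    "password": "<your_password>",
}
session = Session.builder.configs(connection_parameters).create()

# COMPLETE takes a model name and a prompt and returns the LLM's text response.
response = session.sql(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE("
    "'mistral-large', 'Explain retrieval-augmented generation in one sentence.'"
    ") AS answer"
).collect()[0]["ANSWER"]
print(response)
```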
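The demo described above follows a two-step text-to-SQL pattern: the LLM first writes a SQL query for the user's question, the query is run against the database, and the LLM then turns the raw result into a natural-language answer. The sketch below is one way to wire that up, assuming an existing Snowpark `session`; the SALES table, prompt wording, and model name are illustrative assumptions, not the course's exact implementation.

```python
# Rough sketch of a text-to-SQL flow, assuming an existing Snowpark `session`
# and a hypothetical SALES(REGION, REVENUE) table.
question = "What is the highest revenue for each sales region?"

# Step 1: ask the LLM to write SQL for the question.
sql_prompt = (
    "You are a SQL assistant. Using the table SALES(REGION, REVENUE), "
    "write a single Snowflake SQL query that answers: " + question +
    " Return only the SQL."
)
generated_sql = session.sql(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', ?) AS SQL_QUERY",
    params=[sql_prompt],
).collect()[0]["SQL_QUERY"]

# Step 2: run the generated SQL against the database.
# In a real app you would validate or clean the generated SQL before running it.
rows = session.sql(generated_sql).collect()

# Step 3: ask the LLM to turn the raw result into a natural-language answer.
answer_prompt = (
    "Question: " + question + "\nQuery result: " + str(rows) +
    "\nAnswer the question in plain English."
)
answer = session.sql(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', ?) AS ANSWER",
    params=[answer_prompt],
).collect()[0]["ANSWER"]
print(answer)
```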
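For the Streamlit front end with streaming and multi-turn conversations mentioned above, here is a minimal sketch of the pieces involved: `st.session_state` to remember the conversation across reruns, `st.chat_message` and `st.chat_input` for the chat UI, and `st.write_stream` to render a streamed response. The `generate_stream` helper is a hypothetical stand-in for whatever function yields response chunks from your LLM.

```python
# Minimal sketch of a Streamlit chat front end with multi-turn history and streaming.
import streamlit as st


def generate_stream(prompt: str):
    # Hypothetical stand-in: in a real app this would stream chunks from an LLM
    # (for example, a Cortex COMPLETE call); here it just yields a canned reply.
    for word in ("This", "is", "a", "placeholder", "streamed", "response."):
        yield word + " "


st.title("Talk to your data")

# Keep the conversation across reruns so the chat is multi-turn.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

# Accept a new user question.
if prompt := st.chat_input("Ask a question about your data"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    with st.chat_message("assistant"):
        # st.write_stream renders chunks as they arrive and returns the full text.
        full_response = st.write_stream(generate_stream(prompt))
    st.session_state.messages.append({"role": "assistant", "content": full_response})
```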