From the course: Text to SQL: Amazon Redshift Serverless for Generative SQL in Amazon Q


Generating SQL queries

- [Instructor] Before we build our first SQL query, let's understand the architecture of large language models, or LLMs. LLMs are neural networks trained on vast amounts of data. In the transformer architecture, developed by the Google researchers who authored the paper "Attention Is All You Need," the encoder and decoder are connected through an attention mechanism. This lets an LLM learn from text on its own, extracting words and phrases from documents to learn sentence structure and grammar, and then produce text. Through word tokenization, words are converted into numeric tokens, and each token is mapped to a vector that can be looked up in a word embedding table. In the neural network, the encoder's attention mechanism takes an input query, such as a phrase, weighs which words it considers most important, and predicts the output response as a vector. With multiple…
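To make the tokenization, embedding, and attention ideas concrete, here is a minimal sketch in Python. The vocabulary, embedding values, and the tiny scaled dot-product attention function are all hypothetical toy examples, not part of any real LLM or Amazon Q:

```python
import math

# Hypothetical toy vocabulary: each word maps to an integer token id.
vocab = {"select": 0, "name": 1, "from": 2, "users": 3}

# Hypothetical word embedding table: one 3-dimensional vector per token id.
embedding_table = [
    [0.1, 0.3, 0.2],  # select
    [0.5, 0.1, 0.4],  # name
    [0.2, 0.6, 0.1],  # from
    [0.4, 0.2, 0.5],  # users
]

def tokenize(text):
    """Convert words into numeric tokens via the vocabulary."""
    return [vocab[word] for word in text.lower().split()]

def embed(token_ids):
    """Look up each token's vector in the word embedding table."""
    return [embedding_table[t] for t in token_ids]

def attention_weights(query, keys):
    """Scaled dot-product attention: score the query against every key,
    then softmax-normalize so the weights sum to 1."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

tokens = tokenize("select name from users")   # -> [0, 1, 2, 3]
vectors = embed(tokens)                       # one vector per token
# How strongly the first token attends to every token in the phrase:
weights = attention_weights(vectors[0], vectors)
```

The higher a weight, the more the model treats that word as important when predicting its output vector; real transformers apply this mechanism across many attention heads and layers.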
