From the course: Hands-On AI: Building LLM-Powered Apps
Solution: Introduction to Chainlit solution - Python Tutorial
- [Instructor] All right. Hopefully you had fun implementing a basic Chainlit scaffold. The first question is where you need to add the proper decorator. This main function will get called whenever Chainlit receives a message, so we use the on_message decorator, and then we echo the content of the message back to the user. And this completes the scaffold. To see the application, we can now type chainlit run app/app.py -w in the terminal, and this should bring up the application in GitHub Codespaces. Cool. Now in this application you can type in anything and Chainlit will echo back whatever we send it. As an example, when I type, "I love LLM apps," it will echo back the same sentence. Next up, we will try to understand some basics about instructing and prompting large language models.
Contents
- Language models and tokenization (4m 53s)
- Large language model capabilities (1m 48s)
- Challenge: Introduction to Chainlit (2m 28s)
- Solution: Introduction to Chainlit solution (1m 18s)
- Prompts and prompt templates (3m)
- Obtaining an OpenAI token (1m 20s)
- Challenge: Adding an LLM to the Chainlit app (1m 31s)
- Solution: Adding an LLM to the Chainlit app (3m 20s)
- Large language model limitations (3m 43s)