From the course: AWS Certified Machine Learning Engineer Associate (MLA-C01) Cert Prep

Hands-on learning: Tweaking inference parameters

- [Instructor] Hello, everyone. In today's hands-on lab, we're going to walk through an example of tweaking the inference parameters of a foundation model: temperature, top P, top K, maximum length, and stop sequences. We'll start with temperature, which affects randomness: lower values lead to more predictable responses, while higher values create more imaginative ones. Top P and top K alter the riskiness of the word choices, which determines whether the response includes less common, more creative words or safer, more frequent ones. Maximum length controls the output's length, showing how the model can cut off early or continue longer depending on this setting. The stop sequence gives you control over where the model stops generating. For this example, we're going to test these inference parameters using the Amazon Bedrock service. Specifically, we're…
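As a rough sketch of the parameters described above, the snippet below builds an `inferenceConfig` payload in the shape the Bedrock Converse API expects (temperature, top P, maximum length, and stop sequences) and shows, in comments, how it could be passed to a model. The specific model ID and the helper function name are illustrative assumptions, not part of the lab; the actual call requires AWS credentials and Bedrock model access.

```python
import json

def build_inference_config(temperature=0.5, top_p=0.9, max_tokens=256,
                           stop_sequences=None):
    """Assemble an inferenceConfig dict for the Bedrock Converse API.

    Helper name and defaults are illustrative, not from the course.
    """
    return {
        "temperature": temperature,             # lower = predictable, higher = imaginative
        "topP": top_p,                          # nucleus sampling: cap cumulative probability
        "maxTokens": max_tokens,                # hard cap on output length
        "stopSequences": stop_sequences or [],  # generation halts at any of these strings
    }

config = build_inference_config(temperature=0.8, stop_sequences=["\n\nHuman:"])
print(json.dumps(config, indent=2))

# With credentials configured, the config could be used like this
# (model ID is a hypothetical choice):
#
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")
# response = client.converse(
#     modelId="anthropic.claude-3-haiku-20240307-v1:0",
#     messages=[{"role": "user", "content": [{"text": "Write a haiku."}]}],
#     inferenceConfig=config,
#     # top K is model-specific and goes outside inferenceConfig:
#     additionalModelRequestFields={"top_k": 250},
# )
# print(response["output"]["message"]["content"][0]["text"])
```

Note that top K is not part of the common `inferenceConfig`; it is passed through `additionalModelRequestFields` because its name and range vary by model provider.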