PyCon CZ 23
15–17 September
Prague

Low‑Rank Adaptation (LoRA) in Large Language Models a talk by Adam Zíka

Saturday 16 September 15:00 (30 minutes)
__floor__

In this presentation, we will explore the Low-Rank Adaptation (LoRA) technique in the context of large language models.

As the demand for efficient and versatile language models continues to surge, understanding how to optimize their performance becomes paramount.

Leveraging low-rank adaptation offers a promising avenue to achieve both efficiency and generalization, enabling language models to excel across diverse domains and contexts.
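To give a flavour of the idea the talk covers: LoRA freezes a pretrained weight matrix W and learns only a low-rank update B·A on top of it. The sketch below is a minimal NumPy illustration (the dimensions, rank r=8, and scaling alpha=16 are illustrative assumptions, not values from the talk):

```python
import numpy as np

# Minimal LoRA sketch: keep the large pretrained weight W frozen and
# learn only a low-rank update B @ A, with rank r << min(d_out, d_in).
rng = np.random.default_rng(0)

d_in, d_out, r = 512, 512, 8           # r is the LoRA rank (assumed value)
W = rng.normal(size=(d_out, d_in))     # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))               # zero-init, so B @ A = 0 at the start
alpha = 16                             # LoRA scaling hyperparameter (assumed)

def lora_forward(x):
    # Original projection plus the scaled low-rank update.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=d_in)
# Because B starts at zero, the adapted layer initially matches the base layer.
assert np.allclose(lora_forward(x), W @ x)

# Parameter count: full fine-tuning updates d_out * d_in weights,
# while LoRA trains only r * (d_in + d_out) of them.
full_params = d_out * d_in        # 262144
lora_params = r * (d_in + d_out)  # 8192
```

Zero-initialising B is what makes the adapter safe to bolt on: training starts from the unchanged pretrained model and only gradually departs from it.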

What do you need to know to enjoy this talk

Python level

Medium knowledge: You use frameworks and third-party libraries.

About the topic

No previous knowledge of the topic is required, basic concepts will be explained.

Adam Zíka

As a Machine Learning engineer at Salted CX, I specialize in Natural Language Processing, using Transformer models to give contact centers better visibility into and understanding of their operations. I hold an Engineering Doctorate (EngD) in Data Science from the Technical University of Eindhoven.
