Dear all,


This Wednesday we will host at ICTP the first lecture of a short course titled Large Language Models for Physics, taught by Dr. Cristiano De Nobili. This activity is part of ICOMP's initiatives for the new academic year. The course is designed for young researchers in applied/theoretical physics and climate science who are interested in applying large language models (LLMs) in their research. All interested postdocs are very welcome!

You can find more information below.


Short description of the course

To warm up, we will review the basics of deep learning and PyTorch. We will then introduce and code the Transformer architecture and its self-attention mechanism, and build a simple, small but complete autoregressive generative language model in the style of GPT-2. This will allow us to understand several relevant aspects of more sophisticated pre-trained LLMs, such as GPT-4, Mistral, or Llama. Afterwards, we will experiment with open-source pre-trained LLMs and, if possible, fine-tune one of them. In the last part of the course, we will explore some emergent abilities of LLMs that are interesting also from a physical point of view, and touch upon multi-agent systems and their collective behaviour.


Lecturer: Cristiano De Nobili

Schedule and rooms

Wednesday (6/11/24): 14.00 - 18.00, lecture room Giambiagi
Thursday (7/11/24): 9.00 - 13.00, lecture room Stasi
Friday (8/11/24): 9.00 - 13.00, lecture room Stasi


Best regards,

Serafina Di Gioia