Portfolio item number 1
Short description of portfolio item number 1
Short description of portfolio item number 2 
Published in the 39th Conference on Neural Information Processing Systems (NeurIPS), 2025.
We study the universal approximation property of Transformers via in-context learning and find that, when the context is represented by tokens from a finite set (a vocabulary), Transformers require positional encoding to supply the density needed for universal approximation.
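For readers unfamiliar with positional encoding, below is a minimal NumPy sketch of the standard sinusoidal encoding of Vaswani et al. (2017). This illustrates the general mechanism only; it is an assumption for illustration, not necessarily the specific encoding analyzed in the paper.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Standard sinusoidal positional encoding (Vaswani et al., 2017).

    Returns an array of shape (seq_len, d_model) whose rows are distinct
    position vectors, so tokens drawn from a finite vocabulary become
    distinguishable by their position in the context.
    """
    positions = np.arange(seq_len)[:, None]      # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]     # (1, d_model // 2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                 # even dimensions
    pe[:, 1::2] = np.cos(angles)                 # odd dimensions
    return pe

# Token embeddings from a finite vocabulary take only finitely many values;
# adding positional encodings makes the sequence representation
# position-dependent. (Shapes here are hypothetical.)
emb = np.random.randn(16, 64)                    # hypothetical token embeddings
x = emb + sinusoidal_positional_encoding(16, 64)
```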
Published:
This is a description of your talk. It is a Markdown file, so it can be formatted with Markdown like any other post. Yay markdown!
Published:
This is a description of your conference proceedings talk; note the different value in the `type` field. You can put anything in this field.
Undergraduate course, University 1, Department, 2014.
This is a description of a teaching experience. You can use Markdown here just as in any other post.
Workshop, University 1, Department, 2015.
This is a description of a teaching experience. You can use Markdown here just as in any other post.