Ashish Vaswani

Research Scientist at Google AI

Ashish Vaswani is a Senior Research Scientist on the Brain team at Google AI in San Francisco. He is best known for his work on purely attention-based models, most notably the Transformer, which have significantly advanced machine translation, document generation, and syntactic parsing. The Transformer architecture, introduced in a 2017 paper co-authored by Ashish, has been cited over 1,500 times and has enabled breakthroughs in natural language processing, including Google AI's BERT, OpenAI's GPT, and Microsoft's MT-DNN.

Before joining Google, Ashish was a PhD student and later a Research Scientist in natural language processing at the University of Southern California Information Sciences Institute. His collaborative research has been presented at leading conferences including NIPS 2017, ICML 2018, and ICLR 2019, and covers areas as diverse as music generation with long-term structure, fast decoding in sequence models, and image transformation.

Session:
Self-Attention: Language, Images and Music