2024 NYU STERN FINTECH CONFERENCE

Gašper Beguš

Living Large: The Latest on AI in Finance

Assistant Professor 
Department of Linguistics
University of California, Berkeley


I am an Assistant Professor in the Department of Linguistics at UC Berkeley, where I am also affiliated with the Institute of Cognitive and Brain Sciences. Previously, I was an Assistant Professor at the University of Washington. Before that, I received my Ph.D. from Harvard.

My research focuses on developing deep learning models for speech data and on using well-understood dependencies in speech to interpret internal representations in deep neural networks. More specifically, I build models that learn representations of spoken words from raw audio inputs. I combine machine learning and statistical models with neuroimaging and behavioral experiments to better understand how neural networks learn internal representations of speech and how humans learn to speak. I have worked on and published about the sound systems of several language families, including Indo-European, Caucasian, and Austronesian.

In a recent set of papers (here and here), I propose that language acquisition can be modeled with Generative Adversarial Networks and introduce a technique for exploring the relationship between learned representations and the latent space in deep convolutional networks.

I direct the Berkeley Speech and Computation Lab. Feel free to contact me at begus@berkeley.edu if you're interested in getting involved with the lab.


Research web page
Research

Toward understanding the communication in sperm whales

Machine learning has advanced dramatically over the past decade, and opportunities are ripe to apply this technology to a deeper understanding of non-human communication. We detail a scientific roadmap for advancing the understanding of sperm whale communication, one that can also serve as a template for deciphering other forms of animal and non-human communication.