Invited Talks

Next-Generation Intelligent Assistants for AR/VR Devices

Xin Luna Dong (Meta)

at 14:40, in Physical, for 50 min

An intelligent assistant should be an agent that knows you and the world, can receive your requests or predict your needs, and can provide the right services at the right time with your permission. As smart devices such as Amazon Alexa, Google Home, and Meta Ray-Ban Stories become popular, intelligent assistants are gradually playing an important role in people’s lives. The emergence of AR/VR devices brings more opportunities and calls for the next generation of intelligent assistants. In this talk, we discuss the many challenges and opportunities we face in growing intelligent assistants from server-side to on-device, from voice-only to multi-modal, from context-agnostic to context-aware, and from listening to users’ requests to predicting their needs. We also describe the roles that public and personal knowledge graphs play in empowering such an assistant. We expect these new challenges to open doors to new research areas and start a new chapter in providing personal assistance services.

Xin Luna Dong is a Principal Scientist at Meta, leading science for the Meta AR/VR Assistant. Prior to joining Meta, she was a Senior Principal Scientist at Amazon, where she led the construction of the Amazon Product Knowledge Graph. Before that, she was one of the major contributors to the Google Knowledge Vault project and led the Knowledge-based Trust project, which the Washington Post called the “Google Truth Machine”. She has co-authored the books “Machine Knowledge: Creation and Curation of Comprehensive Knowledge Bases” and “Big Data Integration”, was named an ACM Distinguished Member, and received the VLDB Early Career Research Contribution Award for “advancing the state of the art of knowledge fusion”. She serves on the VLDB Endowment Board and the PVLDB Advisory Committee, and has served as a PC co-chair for the KDD 2022 ADS track, WSDM 2022, VLDB 2021, and SIGMOD 2018.
