Ye Yuan 袁野
"To ordain conscience for Heaven and Earth. To secure life and fortune for the people. To continue lost teachings for past sages. To establish peace for all future generations." - Zhang Zai.
"为天地立心,为生民立命,为往圣继绝学,为万世开太平" - 张载.


I am a Ph.D. candidate in Computer Science at McGill
University and Mila - Quebec AI
Institute.
It is my great honor to work with and be supervised by Professor Xue (Steve) Liu.
I am also grateful to have Professor Adriana Romero Soriano
and Professor Gintare Karolina Dziugaite as my supervisory committee members.
Before that, I received my Bachelor of Science degree in Honours Computer Science from McGill
University.
What truly captivates me is the potential of intelligent systems and generative modelling to assist humans. How can artificial intelligence accurately and reliably complete tasks assigned by humans? How can generative models be applied to specific tasks? How do generative models mitigate the issues that afflict the predominantly used traditional approaches? More specifically, my research concentrates on score-based generative algorithms and large language models, with applications in (i) addressing offline black-box optimization challenges, (ii) alleviating the challenges of knowledge-centric NLP tasks, including developing a foundational knowledge model and enabling automatic knowledge base construction, and (iii) improving the efficiency and efficacy of AI systems for real-world applications.
Beyond the academic community, I have interned and closely collaborated with researchers at Microsoft Research, Samsung Research America, Noah's Ark Lab Canada, and RBC (Royal Bank of Canada) Borealis AI.
It is also my honor that my work has been recognized by the research community. I am fortunate to have received the DAAD AINeT Fellowship, the Bank of Montreal (BMO) Responsible AI Senior Scholar title, the BMO Responsible AI Fellowship, ICLR 2025 Financial Assistance, the NeurIPS 2023 Scholar Award, the McGill Faculty of Science Graduate Scholarship, and the McGill Graduate Excellence Awards. Moreover, during my internship at Noah's Ark Lab Canada, I received the Outstanding Contribution Award in R&D Peripheral Fields, the Overseas Business Contribution Award, and the Canada Research Institute President's Spot Award.
My selected publications are listed below:
▶ 🧬🤖 Offline Model-Based Optimization
- Single-Objective Optimization [NeurIPS 2024] [TMLR 2025]
- Multi-Objective Optimization [ICLR 2025]
- Comprehensive Survey of Offline Model-Based Optimization [arXiv Preprint]
▶ 📄📊 Knowledge-Centric Natural Language Processing
- Knowledge Foundation Model and Automatic Knowledge Base Construction [EMNLP 2024]
- Retrieval Augmented Generation System [arXiv Preprint]
▶ 🌍📡 Generative Models in Real-World Applications
News
Our paper "Prompting Wireless Networks: Reinforced In-Context Learning for Power Control" was accepted by ICML 2025 ML4Wireless Workshop!
My first-authored paper "Understanding 6G through Language Models: A Case Study on LLM-aided Structured Entity Extraction in Telecom Domain" was released!
I started my new summer research internship at RBC Borealis and worked with Dr. Amin Shabani, Dr. Siqi Liu, and Dr. Jiawei (Eric) He.
My (co)first-authored paper "Design Editing for Offline Model-based Optimization" was accepted by TMLR!
Our survey paper "Offline Model-Based Optimization: Comprehensive Review", written in collaboration with Turing Award recipient Professor Yoshua Bengio, was released!
I attended ICLR 2025 in Singapore.🇸🇬
My (co)first-authored paper "Design Editing for Offline Model-based Optimization" was accepted by ICLR 2025 DeLTa Workshop!
Our paper "Large Language Models for Wireless Networks: An Overview from the Prompt Engineering Perspective" was accepted by IEEE WCM!
I started the collaboration with Samsung Research America.
I successfully passed the Ph.D. oral comprehensive exam and formally became a Ph.D. candidate!
My (co)first-authored paper "ParetoFlow: Guided Flows in Multi-Objective Optimization" was accepted by ICLR 2025!
Our paper "Generative AI as a Service in 6G Edge-Cloud: Generation Task Offloading by In-Context Learning" was accepted by IEEE WCL!
I attended EMNLP 2024 in Miami.🇺🇸
My (co)first-authored paper "Learning to Extract Structured Entities Using Language Models" was accepted by EMNLP 2024 and selected as an oral presentation paper (top 7%🔥🔥🔥)!
Our survey paper "Large Language Model (LLM) for Telecommunications: A Comprehensive Survey on Principles, Key Techniques, and Opportunities" was accepted by IEEE COMST!
Our paper "Large Language Model (LLM)-enabled In-context Learning for Wireless Network Optimization: A Case Study of Power Control" was released!
Our survey paper "Retrieval-Augmented Generation for Natural Language Processing: A Survey" was released!
I started informal research visits to several professors in Nanjing, Hangzhou, Shanghai, Shenzhen, and Hong Kong, China.🇨🇳
I attended my first in-person conference at NeurIPS 2023 in New Orleans.🇺🇸
I received the Bank of Montreal (BMO) Responsible AI Fellowship and was awarded the title of BMO Responsible AI Senior Scholar.
My first (co)first-authored paper "Importance-aware co-teaching for offline model-based optimization" was accepted by NeurIPS 2023! I was thrilled to publish this paper at one of the best venues as a first-year Ph.D. student!
I started my internship as an Associate Researcher at Noah's Ark Lab Canada, where I worked on accelerating LLM inference and on next-generation language model architectures.
I started a close collaboration with Microsoft Research's Alexandria team, working with Dr. Bhaskar Mitra, Dr. James Hensman, Liana Mikaelyan, Pavel Myshkov, Dr. Alexander Meulemans (during his internship at MSR), Dr. Jan Tönshoff (during his internship at MSR), Dr. Taketomo Isazawa, and Dr. Tom Minka. We work on knowledge foundation models and automatic knowledge base construction.
I started my first research project on Offline Black Box Optimization under the mentorship of Can (Sam) Chen.
I received my Bachelor of Science degree in Honours Computer Science from McGill University.
I received the official offer from McGill University and decided to pursue a direct-entry Ph.D. there with Professor Xue (Steve) Liu as my supervisor.
I joined Professor Xue (Steve) Liu's CPS Lab as a research assistant and attended the group meetings regularly.
I started two research projects on Multi-agent Reinforcement Learning and Domain Adaptation for Human Activity Recognition with Dr. Jikun (Jaxon) Kang and Professor Xi (Alex) Chen, both former students of Professor Xue (Steve) Liu.
I started the course COMP 597 (Applications of Machine Learning in Real World Systems) at McGill University, which is a research-oriented course taught by Professor Xue (Steve) Liu. I was impressed by the research projects of the students in the course, and I decided to apply for the Ph.D. program at McGill University.
I realized I was not interested in the software engineering field and decided to pursue research in Machine Learning and Deep Learning. I then sent my first email to Professor Xue (Steve) Liu, who later became my Ph.D. supervisor.
I started my first industrial internship as a software engineer at CAMLUNI Education, a startup company in Beijing. Since Beijing is a large city, my daily commute to the office took two hours. During that time, I studied the basics of Machine Learning and Deep Learning on my own by watching Hung-yi Lee's lectures on YouTube.
I met my first research mentor, Professor Jie Fu, in Montreal, and began to gradually learn about the research career path.
I started to learn French.
I started my undergraduate study at McGill University.