Xixi WU 「吴茜茜」

Ph.D. Student,
The Chinese University of Hong Kong

Email / GitHub / Google Scholar


Short Bio

I am a Ph.D. student at The Chinese University of Hong Kong, under the supervision of Prof. Hong CHENG. Previously, I received my B.S. and M.S. in Computer Science from Fudan University in 2021 and 2024, respectively, under the supervision of Prof. Yun XIONG.

My research interests are in machine learning and language technologies, including reasoning, planning, and information access in language systems. Before focusing on language technologies, I conducted research on graph learning.

Recent News

  • [Aug 2025] Our paper MoLoRAG was accepted to EMNLP'2025. See you in Suzhou ✨
  • [May 2025] Our paper LLMNodeBed was accepted to ICML'2025. See you in Vancouver again 😄
  • [Nov 2024] I was awarded NeurIPS'2024 Top Reviewer 🥳
  • [Sep 2024] Our paper GNN4TaskPlan was accepted to NeurIPS'2024. See you in Vancouver!

Research Highlights

  • ReSum: Context Summarization for Long-Horizon Reasoning

    ReSum studies the use of periodic context summarization to support longer reasoning processes in language systems. It is designed as a lightweight extension with minimal changes to existing ReAct workflows, and shows improved performance on challenging information-seeking benchmarks like BrowseComp and GAIA.

  • Benchmark of LLM4Graph Algorithms (ICML'2025)

    We introduce LLMNodeBed, a PyG-based testbed for LLM-based node classification algorithms. It integrates 14 datasets, supports 8 LLM-based and 8 classic methods, and covers 3 learning configurations. With LLMNodeBed, we train and evaluate over 2,700 models to analyze the effects of factors like LLM type and size, prompt, learning paradigm, and dataset homophily. Our study reveals 8 key insights, including (1) optimal settings for each algorithm category, and (2) scenarios where LLMs significantly outperform LMs.

  • Graph Learning for Task Planning (NeurIPS'2024)

    In language agents, the available tasks naturally form a task graph, where nodes represent tasks and edges denote dependencies. In this setting, task planning amounts to selecting a path within the graph that fulfills the user's request. We find that the bottleneck in LLMs' planning ability lies in their limited understanding of the task graph, and we therefore introduce GNNs as a simple fix, in both training-free and training-required variants. Extensive experiments demonstrate that the GNN-based methods surpass existing solutions even without training.


Selected Publications

Experience

  • Microsoft Research Asia
    Research Intern, Shanghai AI/ML Group, Feb. 2024 - Jun. 2024

  • Alibaba Group
    Research Intern, Tongyi Lab, Jun. 2025 - Oct. 2025


Selected Awards

  • ACM Web Conference Student Travel Award 2023
  • Second Class Scholarship for Outstanding Student, Fudan University 2018 & 2021
  • Second Prize, China Undergraduate Mathematical Contest in Modeling (CUMCM), Shanghai, China 2019
  • First Prize, National Mathematical Olympiad in Provinces, Jiangsu, China 2016

Professional Services

  • Conference Reviewer: NeurIPS'2024 (Top Reviewer Award 🏆) - 2025, ICLR'2025 - 2026, ICML'2025 - 2026, WWW'2024 Graph Foundation Model (GFM) Workshop, SIGKDD'2024 - 2025, AAAI'2026
  • Journal Reviewer: IEEE Transactions on Knowledge and Data Engineering (TKDE), Transactions on Machine Learning Research (TMLR)

Miscellaneous

  • I love sports like swimming 🏊 and running 🏃. I also enjoy cooking Chinese food 🤣
  • During my undergraduate studies, I was interested in mobile app development (you can find all the source code on my GitHub):

    Lose Weight, a Flutter App

    Hulv, a Mini-Program

    Chatroom, a Desktop App

  • I enjoy exploring the unknown and strive to keep moving forward on this path of discovery and learning ✨

Last updated on Feb 2, 2026 by Xixi