πŸŽ‰ Congratulations on EMNLP 2025 Paper Acceptance!

Aug 20, 2025 Β· Yihang Wang Β· 1 min read
Image credit: EMNLP

We are overjoyed to share that our paper, “QUITO-X: A New Perspective on Context Compression from the Information Bottleneck Theory,” has been accepted to Findings of EMNLP 2025! πŸŽ‰

In this work, we propose a novel context compression method grounded in information bottleneck theory: cross-attention scores guide an efficient token selection that preserves task-relevant content. πŸ”
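For a bit of intuition, here is a minimal sketch of attention-guided token selection, assuming per-token cross-attention scores are already available. This is not the paper's implementation; the `compress_context` helper and the toy scores below are illustrative assumptions.

```python
import numpy as np

def compress_context(tokens, attn_scores, keep_ratio=0.75):
    """Keep the top `keep_ratio` fraction of context tokens ranked by
    cross-attention score, preserving the original token order."""
    k = max(1, int(len(tokens) * keep_ratio))
    # Indices of the k highest-scoring tokens, restored to document order.
    top = np.sort(np.argsort(attn_scores)[-k:])
    return [tokens[i] for i in top]

# Toy usage: these scores are made up; in practice they would come from a
# model's cross-attention between the question and the context tokens.
tokens = "The quick brown fox jumps over the lazy dog".split()
scores = np.array([0.02, 0.30, 0.05, 0.25, 0.15, 0.01, 0.02, 0.05, 0.15])
print(compress_context(tokens, scores, keep_ratio=0.75))
```

The key design choice sketched here is that compression is query-aware: how many tokens survive is a fixed budget, but which tokens survive depends on how much attention the question places on them.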

Our experiments on four major QA datasets (DROP, CoQA, SQuAD, Quoref) show that QUITO-X improves compression rates by nearly 25% over previous state-of-the-art methods. In some cases, even after removing 25% of the tokens, the Exact Match (EM) scores exceed those obtained with the full, uncompressed context! πŸ’₯

We look forward to sharing more insights and presenting this work at EMNLP 2025. Huge thanks to our amazing collaborators and supporters! πŸ™

Stay tuned for further updates as the conference approaches! ✨

