🎉 Two Papers Accepted to ACL 2026 Main Conference!
Our two papers, “Detoxification for LLM: From Dataset Itself” and “Distilling Large Embeddings via Hyperspherical Householder Quantization,” have been accepted to the ACL 2026 Main Conference! 🎉
M.Eng in Artificial Intelligence, University of Chinese Academy of Sciences (2025-09 – present)
B.Eng in Artificial Intelligence, Beijing University of Posts and Telecommunications (2021-09 – 2025-06)
My current research interests focus on representation learning and information retrieval, within the broader area of natural language processing.
I am dedicated to developing innovative methods for building more effective and robust representations, and advancing retrieval techniques to better support real-world applications.
Feel free to reach out if you’re interested in collaboration! 😃
Generative LLMs have achieved remarkable success in various industrial applications, owing to their promising In-Context Learning capabilities. However, the issue of long context in …
With the widespread adoption of large language models, automatically generating QA datasets for domain-specific fine-tuning has become crucial. However, considering the multifaceted …
Our paper “QUITO-X: A New Perspective on Context Compression from the Information Bottleneck Theory” has been accepted by EMNLP 2025! 🎉
Our paper “MDPO: Customized Direct Preference Optimization with a Metric-based Sampler for Question and Answer Generation” has been accepted by COLING 2025! 🎉