Zhao Meng

PhD Student @ETH Zurich
Office: ETZ G60.1, Gloriastrasse 35, 8092 Zurich
zhmeng@ethz.ch
[CV in PDF]

Hi, I am a PhD student at D-ITET, ETH Zurich, advised by Prof. Roger Wattenhofer. I am broadly interested in machine learning and natural language processing.


Publications

Papers

ForecastTKGQuestions: A Benchmark for Temporal Question Answering and Forecasting over Temporal Knowledge Graphs.
Zifeng Ding, Ruoxia Qi, Zongyue Li, Bailan He, Jingpei Wu, Yunpu Ma, Zhao Meng, Zhen Han, Volker Tresp.
In Proceedings of the International Semantic Web Conference (ISWC), 2023.

Beyond prompting: Making Pre-trained Language Models Better Zero-shot Learners by Clustering Representations.
Yu Fei, Zhao Meng*, Ping Nie*, Roger Wattenhofer, Mrinmaya Sachan.
In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022.

TempCaps: A Capsule Network-based Embedding Model for Temporal Knowledge Graph Completion.
Guirong Fu*, Zhao Meng*, Zhen Han*, Zifeng Ding, Yunpu Ma, Matthias Schubert, Volker Tresp, Roger Wattenhofer.
In the Workshop on Structured Prediction for NLP at ACL, 2022.

3D-RETR: End-to-End Single and Multi-View 3D Reconstruction with Transformers.
Zai Shi*, Zhao Meng*, Yiran Xing, Yunpu Ma, Roger Wattenhofer.
In Proceedings of the British Machine Vision Conference (BMVC), 2021. [Code]

BERT is Robust! A Case Against Synonym-Based Adversarial Examples in Text Classification.
Jens Hauser*, Zhao Meng*, Damián Pascual, Roger Wattenhofer.
In Proceedings of the International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2023.

Self-Supervised Contrastive Learning with Adversarial Perturbations for Robust Pretrained Language Models.
Zhao Meng*, Yihan Dong*, Mrinmaya Sachan, Roger Wattenhofer.
In Findings of the Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), 2022.

Towards Robust Graph Contrastive Learning.
Nikola Jovanović, Zhao Meng, Lukas Faber, Roger Wattenhofer.
In the Workshop on Self-Supervised Learning for the Web (SSL@WWW), 2021.

KM-BART: Knowledge Enhanced Multimodal BART for Visual Commonsense Generation.
Yiran Xing*, Zai Shi*, Zhao Meng*, Gerhard Lakemeyer, Yunpu Ma, Roger Wattenhofer.
In Proceedings of the Annual Meeting of the Association for Computational Linguistics (ACL), 2021. [Code]

A Geometry-Inspired Attack for Generating Natural Language Adversarial Examples.
Zhao Meng, Roger Wattenhofer.
In Proceedings of the International Conference on Computational Linguistics (COLING), 2020. [Code]

Towards Neural Speaker Modeling in Multi-Party Conversation: The Task, Dataset, and Models.
Zhao Meng, Lili Mou, Zhi Jin.
In Proceedings of the Language Resources and Evaluation Conference (LREC), 2018. [Code]

Hierarchical RNN with Static Sentence-Level Attention for Text-Based Speaker Change Detection.
Zhao Meng, Lili Mou, Zhi Jin.
In Proceedings of the ACM Conference on Information and Knowledge Management (CIKM), 2017. [Code]

How Transferable are Neural Networks in NLP Applications?
Lili Mou, Zhao Meng, Rui Yan, Ge Li, Yan Xu, Lu Zhang, Zhi Jin.
In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP), 2016.

Context-Aware Tree-Based Convolutional Neural Networks for Natural Language Inference.
Zhao Meng, Lili Mou, Ge Li, Zhi Jin.
In Proceedings of the International Conference on Knowledge Science, Engineering and Management (KSEM), 2016.


*: equal contribution