Mohna Chakraborty

CV
Please find my CV here

Publications
My publications are listed here

Contact
Please email me at cmohna@umich.edu

[GitHub] [LinkedIn] [Google Scholar]





Design copied from Dr. Nachi Nagappan's personal webpage

Welcome to Mohna Chakraborty’s homepage

I am a post-doctoral fellow at the University of Michigan (Michigan Institute for Data Science) under the guidance of Dr. David Jurgens and Dr. Lu Wang. I completed my Ph.D. in Computer Science at Iowa State University, where I worked as a Research Assistant in the Data Mining and Knowledge Lab under my advisor, Dr. Qi Li. I have also worked as a Data Science intern at The Home Depot and Epsilon, and as a Data Analytics intern at Delaware North. My research interests lie in data mining, natural language processing, and machine learning. Through my research, I have contributed several key methods published at top conferences such as PAKDD 2025, SIAM 2025, ACL 2023, UAI 2023, SIGKDD 2022, and ESEC/FSE 2021, as well as workshops at ICLR 2025, WWW 2025, PAKDD 2025, and RANLP 2021.

Publications

2023
Mohna Chakraborty, Adithya Kulkarni, and Qi Li. Zero-shot Approach to Overcome Perturbation Sensitivity of Prompts, ACL, 2023 [paper]

Adithya Kulkarni, Mohna Chakraborty, and Qi Li. Optimal Budget Allocation for Crowdsourcing Labels for Graphs, UAI, 2023 [paper]

2022
Mohna Chakraborty, Adithya Kulkarni, and Qi Li. Open-Domain Aspect-Opinion Co-Mining with Double-Layer Span Extraction, SIGKDD, 2022 [paper]

2021
Richard D Jiles and Mohna Chakraborty. [Re] Domain Generalization using Causal Matching, ML Reproducibility Challenge, 2021 [paper]

Abhishek Kumar Mishra and Mohna Chakraborty. Does local pruning offer task-specific models to learn effectively?, Proceedings of the Student Research Workshop Associated with RANLP, 2021 [paper]

Mohna Chakraborty. Does reusing pre-trained NLP model propagate bugs?, ESEC/FSE, 2021 [paper]

Recent News!


March ‘25: Visited the University of Bonn and the HumanCLAIM Workshop in Göttingen as part of the DAAD AInet fellowship.

March ‘25: Our paper “Beyond Single Parsers: An Empirical Analysis of Dependency Parse Tree Aggregation” has been accepted at the Research and Applications of Foundation Models for Data Mining and Affective Computing (RAFDA) workshop at PAKDD 2025.

March ‘25: Selected for the Data and AI Intensive Research with Rigor and Reproducibility (DAIR3) program with a full scholarship.

March ‘25: Our paper “From Pseudo-Code to Source Code: A Self-Supervised Search Approach” has been accepted at the Deep Learning for Code (DL4C) Workshop at ICLR 2025.

March ‘25: Serving as a PC member at ICLR 2025.

Jan ‘25: Our paper “Modeling Data Diversity for Joint Instance and Verbalizer Selection in Cold-Start Scenarios” has been accepted at the Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD 2025).

Jan ‘25: Our paper “Empirical Evaluation of Prompting Strategies for Fact Verification Tasks” has been accepted at the PromptEng Workshop at the ACM Web Conference (WWW 2025).

Jan ‘25: Our paper “Reducing Performance Gap between Commercial and Open-Source LLMs” has been accepted at SIAM 2025.

Dec ‘24: I will be serving as a PC member at AAAI 2025.

Dec ‘24: I will be serving as a PC member at WWW 2025.

Oct ‘24: Selected for the prestigious DAAD AInet fellowship.

Oct ‘24: I will be serving as a PC member at SDM 2025.

Sep ‘24: Joined the University of Michigan as a post-doctoral fellow at MIDAS (Michigan Institute for Data Science) under the guidance of Prof. David Jurgens and Prof. Lu Wang.

July ‘24: Defended my Ph.D. Final Oral Exam on Analysis of Textual-based Reviews with Minimal Supervision.

Nov ‘23: Served as a reviewer for EACL 2023.

Oct ‘23: Awarded Best Research Poster Award for my paper “Zero-shot Approach to Overcome Perturbation Sensitivity of Prompts” at MINK WIC (Missouri, Iowa, Nebraska, Kansas Women in Computing) Conference.

Oct ‘23: Selected to represent Iowa State University at the MINK WIC Conference (an ACM Celebration of Women in Computing).

Oct ‘23: Defended my Preliminary Exam on Analysis of Textual-based Reviews with Minimal Supervision.

Oct ‘23: Invited to teach a graduate-level course at Iowa State University as a guest lecturer for COM S 571X (Responsible AI: Risk Management in Data Driven Discovery).

May ‘23: Joined The Home Depot as a Data Science intern.

May ‘23: Our paper on “Optimal Budget Allocation for Crowdsourcing Labels for Graphs” has been accepted at UAI 2023.

May ‘23: Our paper on “Zero-shot Approach to Overcome Perturbation Sensitivity of Prompts” has been accepted at ACL 2023.

March ‘23: Awarded 2nd place in the 7th Annual Research Competition at Iowa State University.

Sep ‘22: Selected to represent Iowa State University for the prestigious and competitive Grace Hopper Celebration.

Aug ‘22: Presented our paper and poster “Open-Domain Aspect-Opinion Co-Mining with Double-Layer Span Extraction” at the SIGKDD 2022 conference in Washington, D.C.

Aug ‘22: Awarded Student Travel Award for SIGKDD 2022.

July ‘22: Served as a reviewer for HCOMP 2022.

July ‘22: Served as a reviewer for EMNLP 2022.

May ‘22: Our paper on “Open-Domain Aspect-Opinion Co-Mining with Double-Layer Span Extraction” has been accepted at SIGKDD 2022.

May ‘22: Joined Epsilon as a PhD intern.

April ‘22: Defended my Research Proficiency Exam on Weakly Supervised Review Analysis Based on Task Correlation.

March ‘22: Awarded 1st place in the 6th Annual Research Competition at Iowa State University.

March ‘22: Our paper on “[Re] Domain Generalization using Causal Matching” has been accepted at the ML Reproducibility Challenge 2021 (Fall Edition).

Dec ‘21: Served as a reviewer for PAKDD 2021.

Sep ‘21: Our paper on “Does local pruning offer task-specific models to learn effectively?” has been accepted at RANLP 2021.

Aug ‘21: Presented our SRC paper “Does reusing pre-trained NLP model propagate bugs?” at ESEC/FSE 2021.

June ‘21: Our paper on “Does reusing pre-trained NLP model propagate bugs?” has been accepted at the ESEC/FSE 2021 Student Research Competition (SRC).

May ‘21: Joined Epsilon as a PhD intern.

Aug ‘20: Joined the Ph.D. program in the Department of Computer Science at Iowa State University.