I am a Ph.D. student at UNC Chapel Hill working with Prof. Colin Raffel and Prof. Mohit Bansal. I am interested in updating language models efficiently and effectively, as well as in parameter-efficient and few-shot learning.
- Evaluating the Factual Consistency of Large Language Models Through Summarization
Derek Tam, Anisha Mascarenhas, Shiyue Zhang, Sarah Kwan, Mohit Bansal, Colin Raffel
arXiv, November 2022.
- Few-Shot Parameter-Efficient Fine-Tuning is Better and Cheaper than In-Context Learning
Haokun Liu*, Derek Tam*, Mohammed Muqeeth*, Jay Mohta, Tenghao Huang, Mohit Bansal, Colin Raffel
Conference on Neural Information Processing Systems (NeurIPS) 2022.
- An Empirical Survey of Data Augmentation for Limited Data Learning in NLP
Jiaao Chen*, Derek Tam*, Colin Raffel, Mohit Bansal, Diyi Yang
Transactions of the Association for Computational Linguistics (TACL) 2022.
- Isochrony-Aware Neural Machine Translation for Automatic Dubbing
Derek Tam, Surafel M. Lakew, Yogesh Virkar, Prashant Mathur, Marcello Federico
- Improving and Simplifying Pattern Exploiting Training
Derek Tam*, Rakesh R Menon*, Mohit Bansal, Shashank Srivastava, Colin Raffel
Empirical Methods in Natural Language Processing (EMNLP) 2021 (Short).
- Predicting Institution Hierarchies with Set-based Models
Derek Tam, Nicholas Monath, Ari Kobren, Andrew McCallum
Automated Knowledge Base Construction (AKBC).
- Optimal Transport-based Alignment of Learned Character Representations for String Similarity
Derek Tam, Nicholas Monath, Ari Kobren, Aaron Traylor, Rajarshi Das, Andrew McCallum
Association for Computational Linguistics (ACL).
I spent the summers of 2021 and 2022 interning at Amazon. Previously, I worked in Prof. Andrew McCallum's Information Extraction and Synthesis Lab while receiving an M.S. in Computer Science at the University of Massachusetts Amherst. Before that, I received a B.S. in Computer Science and Statistics from Carnegie Mellon University.
. Ultimately, I aim to live in obedience to and unto the glory of my Lord and Savior Jesus Christ.