I work on machine learning at Cohere. I recently graduated with a PhD in computer science from New York University, focusing on machine learning and NLP. I was jointly advised by Professors Sam Bowman and Kyunghyun Cho and was part of the Machine Learning for Language group at NYU. My dissertation is available here. Previously, I graduated from Harvard University with a bachelor’s in applied mathematics and a master’s in computer science, where I was advised by Alexander Rush and spent time with the Harvard Natural Language Processing group. See my CV, Google Scholar, or GitHub for more details. To contact me, email wangalexc at gmail or message me on Twitter. If you aren’t sure you have the right Alex Wang, try this directory. Stay tuned for what I’ll be up to next!

Papers

What Do NLP Researchers Believe? Results of the NLP Community Metasurvey
Julian Michael, Ari Holtzman, Alicia Parrish, Aaron Mueller, Alex Wang, Angelica Chen, Divyam Madaan, Nikita Nangia, Richard Yuanzhe Pang, Jason Phang, Samuel R. Bowman
arXiv preprint
[paper] [site]

SQuALITY: Building a Long-Document Summarization Dataset the Hard Way
Alex Wang, Richard Yuanzhe Pang, Angelica Chen, Jason Phang, Samuel R. Bowman
arXiv preprint
[paper] [code]

QuestEval: Summarization Asks for Fact-Based Evaluation
Thomas Scialom, Paul-Alexis Dray, Patrick Gallinari, Sylvain Lamprier, Benjamin Piwowarski, Jacopo Staiano, Alex Wang
EMNLP 2021
[paper]

Label Representations in Modeling Classification as Text Generation
Xinyi Chen, Jiangxian Xu, Alex Wang
AACL-IJCNLP SRW 2020 (Best Paper Award Runner-Up)
[paper]

Asking and Answering Questions to Evaluate the Factual Consistency of Summaries
Alex Wang, Kyunghyun Cho, Mike Lewis
ACL 2020
[paper] [code]

jiant: A Software Toolkit for Research on General-Purpose Text Understanding Models
Yada Pruksachatkun, Phil Yeres, Haokun Liu, Jason Phang, Phu Mon Htut, Alex Wang, Ian Tenney, Samuel R. Bowman
ACL 2020 (demo)
[paper] [site]

A Generalized Framework of Sequence Generation with Application to Undirected Sequence Models
Elman Mansimov, Alex Wang, Sean Welleck, Kyunghyun Cho
Northern European Journal of Language Technology
[paper] [code]

SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems
Alex Wang, Yada Pruksachatkun, Nikita Nangia, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
NeurIPS 2019
[paper] [site]

Can You Tell Me How to Get Past Sesame Street? Sentence-Level Pretraining Beyond Language Modeling
Alex Wang, Jan Hula, Patrick Xia, Raghavendra Pappagari, R. Thomas McCoy, Roma Patel, Najoung Kim, Ian Tenney, Yinghui Huang, Katherin Yu, Shuning Jin, Berlin Chen, Benjamin Van Durme, Edouard Grave, Ellie Pavlick, Samuel R. Bowman
ACL 2019
[paper] [code]

BERT Has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model
Alex Wang, Kyunghyun Cho
NeuralGen 2019
[paper] [demo]

Probing What Different NLP Tasks Teach Machines about Function Word Comprehension
Najoung Kim, Roma Patel, Adam Poliak, Alex Wang, Patrick Xia, R. Thomas McCoy, Ian Tenney, Alexis Ross, Tal Linzen, Benjamin Van Durme, Samuel R. Bowman, Ellie Pavlick
*SEM 2019 (Best Paper Award)
[paper] [code]

On Measuring Social Biases in Sentence Encoders
Chandler May, Alex Wang, Shikha Bordia, Samuel R. Bowman, Rachel Rudinger
NAACL 2019
[paper] [code]

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ICLR 2019
[paper] [site]

What do you learn from context? Probing for sentence structure in contextualized word representations
Ian Tenney, Patrick Xia, Berlin Chen, Alex Wang, Adam Poliak, Benjamin Van Durme, Sam Bowman, Dipanjan Das, Ellie Pavlick
ICLR 2019
[paper] [code]

Learning Linguistic Descriptors of User Roles in Online Communities
Alex Wang, William L. Hamilton, Jure Leskovec
NLP+CSS Workshop @ EMNLP 2016
[paper]

Other Projects

  • NYU AI School: formerly NYC AI Workshop, the NYU AI School is a free introductory workshop on machine learning organized by students and faculty from the NYU ML2 lab. The workshop is designed for early undergraduate students and features a week of lectures from experts in the field and hands-on coding experience. For more info see here.
  • A Neural Framework for One-Shot Learning: a thorough examination of matching networks, a hybrid of neural networks and nonparametric models, applied to one-shot learning in various settings. See the paper for more details. Completed as my senior thesis, which earned high honors.
  • Traffic Swarm Optimization: an investigation into the use of swarm optimization methods for optimizing traffic light cycles. For more info, see this article or this writeup.
  • Gaussian Processes for Crime Prediction: an exploration of using Gaussian processes to predict future crime rates in cities. See the writeup for details.
  • Twitter Plays Chess: crowdsourced chess playing against an AI where users vote for the human team’s next move via Twitter, à la Twitch Plays Pokemon.

Miscellaneous