Research

I do research on topics related to natural language processing.

At Aylien, our team works on a mix of product-focused prototyping and innovation, as well as open-ended exploratory research.

For a complete list of publications, please see my Google Scholar profile.


Selected Publications

Efficient Unsupervised Sentence Compression by Fine-tuning Transformers with Reinforcement Learning

We design loss functions for unsupervised sentence compression that use auxiliary signals for compression quality, such as PLM-derived fluency and consistency with the source input. The resulting models outperform existing approaches based on discrete search, and they are also very efficient at inference time because the policy-based reinforcement learning setup distills the ensemble of training targets into a single classification decision.

Code
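The policy-gradient idea can be illustrated with a toy sketch (my own simplification, not the paper's implementation: the real system fine-tunes a transformer encoder and scores candidates with a pretrained LM). Each source token gets a keep/drop decision, a sampled compression is scored by a reward, and REINFORCE nudges the per-token keep probabilities:

```python
import math
import random

def sample_mask(keep_probs):
    # Sample one keep(1)/drop(0) decision per source token.
    return [1 if random.random() < p else 0 for p in keep_probs]

def reward(tokens, mask, target_ratio=0.5):
    # Toy stand-in for the auxiliary reward: the real system combines a
    # PLM-derived fluency score with consistency against the source; here
    # we reward hitting a target compression ratio plus contiguous spans.
    ratio_term = -abs(sum(mask) / len(tokens) - target_ratio)
    adjacency = sum(a == b == 1 for a, b in zip(mask, mask[1:]))
    return ratio_term + 0.1 * adjacency

def reinforce_step(logits, tokens, lr=0.5):
    # One REINFORCE update on per-token keep logits: sample an action,
    # score it, then move each logit so the sampled action becomes more
    # likely when the reward is positive and less likely when negative.
    probs = [1.0 / (1.0 + math.exp(-z)) for z in logits]
    mask = sample_mask(probs)
    r = reward(tokens, mask)
    # d/dz log Bernoulli(a; sigmoid(z)) = a - sigmoid(z)
    new_logits = [z + lr * r * (a - p) for z, p, a in zip(logits, probs, mask)]
    return new_logits, mask, r
```

In the paper the per-token logits come from a fine-tuned transformer, so compression at inference time is a single forward pass; this toy keeps one free logit per token just to show the update rule.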


Lexically Constrained Decoding for Sequence Generation Using Grid Beam Search

Grid Beam Search (GBS) extends the beam search algorithm to allow the inclusion of pre-specified lexical constraints. The algorithm can be used with any model that generates sequences token by token. Lexical constraints take the form of words or phrases that must be present in the output sequence. This is a very general way to incorporate auxiliary knowledge into a model's output without requiring any modification of the parameters or training data.

Code
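The grid can be sketched in miniature (my own simplification, not the released implementation): hypotheses live in beams indexed by how many constraints they have covered so far, and at each step a hypothesis either lets the model generate freely or is forced to emit an uncovered constraint token. Assuming single-token constraints and a `step_fn(seq)` that stands in for the model's next-token log-probabilities:

```python
def grid_beam_search(step_fn, start, constraints, max_len, beam_size=2):
    # Beams are grouped into a grid by how many constraints they cover:
    # grid[c] holds (score, sequence, covered_constraints) hypotheses.
    C = len(constraints)
    grid = {c: [] for c in range(C + 1)}
    grid[0] = [(0.0, (start,), frozenset())]
    for _ in range(max_len):
        new_grid = {c: [] for c in range(C + 1)}
        for c in range(C + 1):
            for score, seq, covered in grid[c]:
                scores = dict(step_fn(seq))
                # "generate": let the model propose continuations freely
                for tok, lp in scores.items():
                    cov = covered | ({tok} & set(constraints))
                    new_grid[len(cov)].append((score + lp, seq + (tok,), cov))
                # "constraint": force an as-yet-uncovered constraint token
                for tok in set(constraints) - covered:
                    cov = covered | {tok}
                    # fallback score only for this toy; a real model scores
                    # every vocabulary token at every step
                    lp = scores.get(tok, -9.0)
                    new_grid[len(cov)].append((score + lp, seq + (tok,), cov))
        grid = {c: sorted(h, key=lambda x: x[0], reverse=True)[:beam_size]
                for c, h in new_grid.items()}
    # return the best hypothesis that covers every constraint
    return max(grid[C], key=lambda x: x[0], default=None)
```

Phrase (multi-token) constraints additionally require tracking partially completed phrases across steps, which this sketch omits.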