Code Change Embeddings

Unsupervised Learning of General-Purpose Embeddings for Code Changes
Mikhail Pravilov, Egor Bogomolov, Yaroslav Golubev, and Timofey Bryksin

We propose an approach for obtaining representations of code changes during pre-training and evaluate them on two downstream tasks: applying changes to code and commit message generation. During pre-training, the model learns to apply a given code change correctly. This task requires only the code changes themselves, which makes it unsupervised.