Paper Title

PanGu-Coder: Program Synthesis with Function-Level Language Modeling

Paper Authors

Fenia Christopoulou, Gerasimos Lampouras, Milan Gritta, Guchun Zhang, Yinpeng Guo, Zhongqi Li, Qi Zhang, Meng Xiao, Bo Shen, Lin Li, Hao Yu, Li Yan, Pingyi Zhou, Xin Wang, Yuchi Ma, Ignacio Iacobacci, Yasheng Wang, Guangtai Liang, Jiansheng Wei, Xin Jiang, Qianxiang Wang, Qun Liu

Paper Abstract

We present PanGu-Coder, a pretrained decoder-only language model adopting the PanGu-Alpha architecture for text-to-code generation, i.e. the synthesis of programming language solutions given a natural language problem description. We train PanGu-Coder using a two-stage strategy: the first stage employs Causal Language Modelling (CLM) to pre-train on raw programming language data, while the second stage uses a combination of Causal Language Modelling and Masked Language Modelling (MLM) training objectives that focus on the downstream task of text-to-code generation and train on loosely curated pairs of natural language program definitions and code functions. Finally, we discuss PanGu-Coder-FT, which is fine-tuned on a combination of competitive programming problems and code with continuous integration tests. We evaluate PanGu-Coder with a focus on whether it generates functionally correct programs and demonstrate that it achieves equivalent or better performance than similarly sized models, such as CodeX, while attending a smaller context window and training on less data.
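
The abstract itself contains no code, but as a minimal, hypothetical sketch of the text-to-code setting it describes (a decoder-only causal language model completing a function body from a natural language problem description), the snippet below uses the Hugging Face transformers API. The checkpoint name and prompt format are placeholders of my own, not artifacts released with the paper, and sampling settings are illustrative of pass@k-style evaluation rather than the authors' configuration.

```python
# Hypothetical sketch: prompting a decoder-only causal LM with a natural
# language problem description (as a docstring) and letting it synthesize
# the function body. The checkpoint name is a placeholder (assumption),
# not an official PanGu-Coder release.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "your-org/your-code-lm"  # placeholder; substitute any causal code LM

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

# Prompt: function signature plus a docstring describing the problem;
# the model is expected to continue with an implementation.
prompt = (
    "def fizzbuzz(n):\n"
    '    """Return a list of strings for 1..n, replacing multiples of 3\n'
    '    with "Fizz", multiples of 5 with "Buzz", and of both with "FizzBuzz"."""\n'
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,      # sampling multiple candidates is typical for pass@k evaluation
    temperature=0.8,
    num_return_sequences=1,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In a functional-correctness evaluation like the one the abstract refers to, each sampled completion would then be executed against the problem's unit tests, and a sample counts as correct only if all tests pass.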
