(Preprint) CLSEBERT: Contrastive Learning for Syntax Enhanced Code Pre-Trained Model
Xin Wang ¹, Yasheng Wang ², Pingyi Zhou ², Meng Xiao ², Yadao Wang ², Li Li ³, Xiao Liu ⁴, Hao Wu ⁵, Jin Liu ¹, Xin Jiang ²
¹ School of Computer Science, Wuhan University
² Noah's Ark Lab, Huawei
³ Faculty of Information Technology, Monash University
⁴ School of Information Technology, Deakin University
⁵ School of Information Science and Engineering, Yunnan University
arXiv, 2021-08-10
Abstract

Pre-trained models for programming languages have proven their significant value in various code-related tasks, such as code search, code clone detection, and code translation. Currently, most pre-trained models treat a code snippet as a sequence of tokens or focus only on the data flow between code identifiers.

However, rich code syntax and hierarchy are ignored, even though they provide important structural information and semantic rules that can enhance code representations. In addition, although BERT-based code pre-trained models achieve high performance on many downstream tasks, the sequence representations natively derived from BERT have been shown to be of low quality, and they perform poorly on code matching and similarity tasks.

To address these problems, we propose CLSEBERT, a Contrastive Learning Framework for Syntax Enhanced Code Pre-Trained Model, to deal with various code intelligence tasks. In the pre-training stage, we consider the code syntax and hierarchy contained in the Abstract Syntax Tree (AST) and leverage contrastive learning to learn noise-invariant code representations. Besides masked language modeling (MLM), we also introduce two novel pre-training objectives: one predicts the edges between nodes in the abstract syntax tree, and the other predicts the types of code tokens. Through extensive experiments on four code intelligence tasks, we show the effectiveness of our proposed model.
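To make the contrastive objective concrete, the following is a minimal sketch of an InfoNCE-style loss over two views of the same code snippet (e.g. the original token sequence and a noise-perturbed copy), which is the general technique the abstract refers to. This is not the authors' released implementation; the encoder, the augmentation, the temperature value, and the way the auxiliary losses are combined are illustrative assumptions.

```python
# Sketch: contrastive (InfoNCE-style) loss over two views of the same code snippet.
# NOT the authors' code; encoder, temperature, and loss combination are assumptions.
import torch
import torch.nn.functional as F

def contrastive_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
    """z1, z2: [batch, dim] snippet-level embeddings of two views of the same snippets."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    # Similarity of every view-1 embedding against every view-2 embedding in the batch.
    logits = z1 @ z2.t() / temperature            # [batch, batch]
    # The matching (positive) pair sits on the diagonal, so the target is the row index;
    # all other snippets in the batch act as negatives.
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)

# Hypothetical usage: embeddings come from a code encoder fed with tokens plus a
# linearized AST, and the contrastive term is added to the MLM and the two
# AST-related objectives (edge prediction, node-type prediction):
#   z1 = encoder(code_with_ast)
#   z2 = encoder(perturb(code_with_ast))
#   loss = mlm_loss + edge_pred_loss + node_type_loss + contrastive_loss(z1, z2)
```

Pulling the two views of a snippet toward each other while pushing other snippets in the batch away is what makes the learned sentence-level representations noise-invariant and better suited to matching and similarity tasks.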


