(Preprint) CLSEBERT: Contrastive Learning for Syntax Enhanced Code Pre-Trained Model
Xin Wang ¹, Yasheng Wang ², Pingyi Zhou ², Meng Xiao ², Yadao Wang ², Li Li ³, Xiao Liu ⁴, Hao Wu ⁵, Jin Liu ¹, Xin Jiang ²
¹ School of Computer Science, Wuhan University
² Noah's Ark Lab, Huawei
³ Faculty of Information Technology, Monash University
⁴ School of Information Technology, Deakin University
⁵ School of Information Science and Engineering, Yunnan University
arXiv, 2021-08-10
Abstract

Pre-trained models for programming languages have proven their significant value in various code-related tasks, such as code search, code clone detection, and code translation. Currently, most pre-trained models treat a code snippet as a sequence of tokens or focus only on the data flow between code identifiers.

However, rich code syntax and hierarchy are ignored, even though they provide important structural information and semantic rules of code that can help enhance code representations. In addition, although BERT-based code pre-trained models achieve high performance on many downstream tasks, the natively derived sequence representations of BERT have been shown to be of low quality, so they perform poorly on code matching and similarity tasks.

To address these problems, we propose CLSEBERT, a Contrastive Learning Framework for Syntax Enhanced Code Pre-Trained Model, to deal with various code intelligence tasks. In the pre-training stage, we consider the code syntax and hierarchy contained in the Abstract Syntax Tree (AST) and leverage contrastive learning to learn noise-invariant code representations. Besides masked language modeling (MLM), we also introduce two novel pre-training objectives: one predicts the edges between nodes in the abstract syntax tree, and the other predicts the types of code tokens. Through extensive experiments on four code intelligence tasks, we show the effectiveness of our proposed model.
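The contrastive component described in the abstract can be illustrated with a standard InfoNCE / NT-Xent objective over two noise-augmented views of each code snippet. The minimal PyTorch sketch below is an assumption-laden illustration, not the authors' released code; the function name nt_xent_loss, the batch size, embedding dimension, and temperature are placeholder choices.

    import torch
    import torch.nn.functional as F

    def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
        # InfoNCE / NT-Xent loss: row i of z1 and row i of z2 are embeddings of two
        # noise-augmented views of the same code snippet (positives lie on the diagonal).
        z1 = F.normalize(z1, dim=-1)
        z2 = F.normalize(z2, dim=-1)
        logits = z1 @ z2.t() / temperature                    # pairwise cosine similarities
        targets = torch.arange(z1.size(0), device=z1.device)  # view i of z1 matches view i of z2
        return F.cross_entropy(logits, targets)

    if __name__ == "__main__":
        # Stand-ins for [CLS]-style embeddings of two augmented views of 8 code snippets.
        z1, z2 = torch.randn(8, 768), torch.randn(8, 768)
        contrastive = nt_xent_loss(z1, z2)
        # In the paper, a contrastive term like this would be combined with the MLM,
        # AST-edge prediction, and node-type prediction losses (weights not shown here).
        print(contrastive.item())

In a full pre-training run, z1 and z2 would come from encoding two perturbed versions of the same snippet (e.g., with masked tokens), so the loss pulls representations of the same code together while pushing representations of different snippets apart.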