(Preprint) CLSEBERT: Contrastive Learning for Syntax Enhanced Code Pre-Trained Model
Xin Wang ¹, Yasheng Wang ², Pingyi Zhou ², Meng Xiao ², Yadao Wang ², Li Li ³, Xiao Liu ⁴, Hao Wu ⁵, Jin Liu ¹, Xin Jiang ²
¹ School of Computer Science, Wuhan University
² Noah's Ark Lab, Huawei
³ Faculty of Information Technology, Monash University
⁴ School of Information Technology, Deakin University
⁵ School of Information Science and Engineering, Yunnan University
arXiv, 2021-08-10
Abstract

Pre-trained models for programming languages have proven their significant value in various code-related tasks, such as code search, code clone detection, and code translation. Currently, most pre-trained models treat a code snippet as a sequence of tokens or focus only on the data flow between code identifiers.

However, rich code syntax and hierarchy are ignored, even though they provide important structural information and semantic rules that can help enhance code representations. In addition, although BERT-based code pre-trained models achieve high performance on many downstream tasks, the sequence representations natively derived from BERT are known to be of low quality, and they perform poorly on code matching and similarity tasks.
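To make the "syntax and hierarchy" concrete, the hierarchical structure a token sequence discards can be enumerated as parent–child edges of an abstract syntax tree. The sketch below uses Python's built-in `ast` module purely for illustration; a multilingual setting like the one described here would need a language-agnostic parser, and the helper name `ast_edges` is ours:

```python
import ast

def ast_edges(code):
    """Enumerate parent->child edges of a Python AST as node-type pairs.

    Each edge captures a piece of hierarchy that a flat token
    sequence does not expose (e.g. which expression an assignment
    target belongs to).
    """
    tree = ast.parse(code)
    edges = []
    for parent in ast.walk(tree):
        for child in ast.iter_child_nodes(parent):
            edges.append((type(parent).__name__, type(child).__name__))
    return edges
```

For example, `ast_edges("x = 1")` yields edges such as `("Module", "Assign")` and `("Assign", "Constant")`, structure that is invisible when the snippet is read as the token sequence `x`, `=`, `1`.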

To address these problems, we propose CLSEBERT, a Contrastive Learning framework for a Syntax Enhanced code pre-trained model, to deal with various code intelligence tasks. In the pre-training stage, we consider the code syntax and hierarchy contained in the Abstract Syntax Tree (AST) and leverage contrastive learning to learn noise-invariant code representations. Besides masked language modeling (MLM), we introduce two novel pre-training objectives: one predicts the edges between nodes in the abstract syntax tree, and the other predicts the types of code tokens. Through extensive experiments on four code intelligence tasks, we demonstrate the effectiveness of the proposed model.
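The contrastive objective sketched below shows one common way to learn noise-invariant representations: an NT-Xent (InfoNCE-style) loss that pulls together the embeddings of a code snippet and its noised counterpart while pushing apart all other snippets in the batch. This is our illustration of the general technique, not the paper's exact loss; the function name, the temperature value, and the pure-Python formulation are assumptions:

```python
import math

def nt_xent_loss(z1, z2, temperature=0.1):
    """NT-Xent contrastive loss over two views of the same code batch.

    z1, z2: lists of embedding vectors; row i of z1 and row i of z2
    come from the same snippet (one clean, one noise-augmented) and
    form a positive pair. All other rows act as negatives.
    """
    def normalize(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]

    z = [normalize(v) for v in z1 + z2]  # stack both views: 2B rows
    n, b = len(z), len(z1)
    total = 0.0
    for i in range(n):
        # cosine similarities of row i to every row, scaled by temperature
        sims = [sum(a * c for a, c in zip(z[i], z[j])) / temperature
                for j in range(n)]
        pos = i + b if i < b else i - b      # index of i's positive pair
        # log-softmax over all rows except i itself
        denom = sum(math.exp(s) for j, s in enumerate(sims) if j != i)
        total += -(sims[pos] - math.log(denom))
    return total / n
```

When the two views of each snippet embed close together and different snippets embed apart, the loss approaches zero; mismatched pairs drive it up, which is what makes the learned representations robust to the injected noise.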


