(Preprint) CLSEBERT: Contrastive Learning for Syntax Enhanced Code Pre-Trained Model
Xin Wang ¹, Yasheng Wang ², Pingyi Zhou ², Meng Xiao ², Yadao Wang ², Li Li ³, Xiao Liu ⁴, Hao Wu ⁵, Jin Liu ¹, Xin Jiang ²
¹ School of Computer Science, Wuhan University
² Noah's Ark Lab, Huawei
³ Faculty of Information Technology, Monash University
⁴ School of Information Technology, Deakin University
⁵ School of Information Science and Engineering, Yunnan University
arXiv, 2021-08-10
Abstract

Pre-trained models for programming languages have proven their significant value in various code-related tasks, such as code search, code clone detection, and code translation. Currently, most pre-trained models treat a code snippet as a sequence of tokens or focus only on the data flow between code identifiers.

However, the rich syntax and hierarchy of code, which provide important structural information and semantic rules that could enhance code representations, are ignored. In addition, although BERT-based code pre-trained models achieve high performance on many downstream tasks, the sequence representations natively derived from BERT have been shown to be of low quality, performing poorly on code matching and similarity tasks.

To address these problems, we propose CLSEBERT, a Contrastive Learning Framework for Syntax Enhanced Code Pre-Trained Model, to handle various code intelligence tasks. In the pre-training stage, we consider the code syntax and hierarchy contained in the Abstract Syntax Tree (AST) and leverage contrastive learning to learn noise-invariant code representations. Besides masked language modeling (MLM), we also introduce two novel pre-training objectives: one predicts the edges between nodes in the abstract syntax tree, and the other predicts the types of code tokens. Extensive experiments on four code intelligence tasks demonstrate the effectiveness of our proposed model.
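The abstract does not give the exact form of these losses, so the following is only a minimal PyTorch sketch of two of them under stated assumptions: an in-batch InfoNCE objective (a common choice for noise-invariant contrastive learning, where an augmented view such as an identifier-renamed snippet serves as the positive) and a pairwise AST edge-prediction head. The function names, the temperature value, and the label conventions are illustrative assumptions, not the paper's released implementation.

```python
# Minimal sketch (not the paper's code) of two pre-training losses the
# abstract describes. Assumes an encoder that yields one (B, D) vector
# per snippet and (B, T, D) per-token hidden states.
import torch
import torch.nn.functional as F

def contrastive_loss(anchor, positive, temperature=0.05):
    """In-batch InfoNCE loss for noise-invariant code representations.

    anchor:   (B, D) embeddings of the original snippets.
    positive: (B, D) embeddings of noise-augmented views of the same
              snippets; the other B-1 rows act as in-batch negatives.
    """
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    logits = anchor @ positive.t() / temperature   # (B, B) similarities
    targets = torch.arange(anchor.size(0), device=anchor.device)
    return F.cross_entropy(logits, targets)        # diagonal = positives

def ast_edge_loss(token_states, pair_index, edge_labels, head):
    """Predict whether two AST nodes are connected by an edge.

    token_states: (B, T, D) encoder hidden states.
    pair_index:   (P, 3) long tensor of (batch, node_i, node_j) positions.
    edge_labels:  (P,) long tensor, 1 if an AST edge links the pair else 0.
    head:         a small binary classifier, e.g. nn.Linear(2 * D, 2).
    """
    hi = token_states[pair_index[:, 0], pair_index[:, 1]]  # (P, D)
    hj = token_states[pair_index[:, 0], pair_index[:, 2]]  # (P, D)
    logits = head(torch.cat([hi, hj], dim=-1))             # (P, 2)
    return F.cross_entropy(logits, edge_labels)
```

The token-type objective would follow the same pattern as MLM: a per-token cross-entropy over AST node-type labels. Using in-batch negatives keeps the contrastive term cheap, since each batch of B snippets yields B positives and B(B-1) negatives without extra encoder passes.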