HexT5 is a unified pre-trained model trained on vast amounts of natural language comments, source identifiers, and pseudo-code.
We fine-tune HexT5 on various downstream tasks, including code summarization, variable name recovery, function name recovery, and similarity detection.
Conference Paper: HexT5: Unified Pre-Training for Stripped Binary Code Information Inference. September 2023. DOI: 10.1109/ASE56229.2023.00099.
We evaluate HexT5 on four downstream inference tasks, including three conditional sequence generation tasks: summarization, variable name recovery, and function name recovery.
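As a rough illustration of how such a unified model can serve several of these tasks through a single text-to-text interface, the sketch below prompts a CodeT5-style seq2seq model (HexT5 is reported to build on CodeT5) with per-task prefixes. This is not the authors' code: the Hugging Face `transformers` API, the public `Salesforce/codet5-base` checkpoint used as a stand-in, and the prefix strings are all assumptions for demonstration only.

```python
# Minimal sketch (not HexT5 itself): prefix-based multi-task inference with a
# CodeT5-style seq2seq model, assuming the Hugging Face `transformers` library.
from transformers import AutoTokenizer, T5ForConditionalGeneration

MODEL_NAME = "Salesforce/codet5-base"  # stand-in checkpoint, not a HexT5 release
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = T5ForConditionalGeneration.from_pretrained(MODEL_NAME)

# Hypothetical task prefixes; the prefixes actually used by HexT5 are not given here.
TASK_PREFIXES = {
    "summarization": "summarize: ",
    "variable_name_recovery": "recover variable names: ",
    "function_name_recovery": "recover function name: ",
}

def infer(task: str, pseudo_code: str, max_new_tokens: int = 64) -> str:
    """Run one conditional generation task on decompiled pseudo-code."""
    inputs = tokenizer(
        TASK_PREFIXES[task] + pseudo_code,
        return_tensors="pt", truncation=True, max_length=512,
    )
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Example: summarize the pseudo-code of a stripped function.
print(infer("summarization", "long sub_401000(long a1) { return a1 * a1; }"))
```

The fourth task, binary code similarity detection, would instead compare encoder-side representations of two functions rather than generate text; that step is omitted from this sketch.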
Affiliation: University of Science and Technology of China. Author of HexT5: Unified Pre-Training for Stripped Binary Code Information Inference (ASE 2023).
HexT5: Unified Pre-Training for Stripped Binary Code Information Inference, in 38th IEEE/ACM International Conference on Automated Software Engineering (ASE), 2023.
Furthermore, HexT5 [18] proposed a unified pre-training model also based on CodeT5, which allowed multi-task learning and supported function name recovery ...
HexT5: Unified Pre-training for Stripped Binary Code Information Inference. Jiaqi Xiong, Guoqiang Chen, Kejiang ...