Bi-Directional Transformers vs. word2vec: Discovering Vulnerabilities in Lifted Compiled Code

Bibliographic Details
Title: Bi-Directional Transformers vs. word2vec: Discovering Vulnerabilities in Lifted Compiled Code
Authors: McCully, Gary A., Hastings, John D., Xu, Shengjie, Fortier, Adam
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Cryptography and Security, Computer Science - Computation and Language, Computer Science - Machine Learning, Computer Science - Software Engineering, D.4.6, I.2.6, I.5.1
Description: Detecting vulnerabilities within compiled binaries is challenging due to lost high-level code structures and other factors such as architectural dependencies, compilers, and optimization options. To address these obstacles, this research explores vulnerability detection by using natural language processing (NLP) embedding techniques with word2vec, BERT, and RoBERTa to learn semantics from intermediate representation (LLVM) code. Long short-term memory (LSTM) neural networks were trained on embeddings from encoders created using approximately 118K LLVM functions from the Juliet dataset. This study is pioneering in its comparison of word2vec models with multiple bidirectional transformer (BERT, RoBERTa) embeddings built using LLVM code to train neural networks to detect vulnerabilities in compiled binaries. word2vec Continuous Bag of Words (CBOW) models achieved 92.3% validation accuracy in detecting vulnerabilities, outperforming word2vec Skip-Gram, BERT, and RoBERTa. This suggests that complex contextual NLP embeddings may not provide advantages over simpler word2vec models for this task when a limited number (e.g., 118K) of data samples are used to train the bidirectional transformer-based models. The comparative results provide novel insights into selecting optimal embeddings for learning compiler-independent semantic code representations to advance machine learning detection of vulnerabilities in compiled binaries.
Comment: 8 pages, 0 figures, IEEE 4th Cyber Awareness and Research Symposium 2024 (CARS'24)
Document Type: Working Paper
Access URL: http://arxiv.org/abs/2405.20611
Accession Number: edsarx.2405.20611
Database: arXiv
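To make the embedding approach in the abstract concrete, below is a minimal, hypothetical sketch of the first step a word2vec CBOW pipeline over LLVM IR would need: tokenizing an IR instruction and forming (context, target) training pairs, where CBOW predicts each target token from its surrounding context. The tokenizer, window size, and example IR line are illustrative assumptions, not the authors' actual preprocessing.

```python
# Hypothetical sketch: CBOW-style (context, target) pairs from tokenized
# LLVM IR. Tokenization here is deliberately coarse and illustrative;
# the paper's exact preprocessing of lifted binaries is not reproduced.

def tokenize_llvm(line: str) -> list[str]:
    """Split an LLVM IR instruction into coarse whitespace tokens,
    separating punctuation so operands become distinct tokens."""
    for ch in ",()":
        line = line.replace(ch, f" {ch} ")
    return line.split()

def cbow_pairs(tokens: list[str], window: int = 2):
    """Build (context, target) pairs: CBOW learns to predict the
    target token from up to `window` tokens on each side."""
    pairs = []
    for i, target in enumerate(tokens):
        context = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        pairs.append((context, target))
    return pairs

# Example IR instruction (real inputs would be lifted from compiled binaries).
ir = "%sum = add i32 %a , %b"
toks = tokenize_llvm(ir)
pairs = cbow_pairs(toks)
```

In a full pipeline along the lines the abstract describes, pairs like these would train a CBOW embedding model (e.g., gensim's `Word2Vec` with `sg=0`), and the resulting per-token vectors would then feed an LSTM classifier labeling functions as vulnerable or not.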