
Publication detail

2024, IEEE TRANSACTIONS ON DEPENDABLE AND SECURE COMPUTING

BinBert: Binary Code Understanding with a Fine-tunable and Execution-aware Transformer (01a Journal article)

Fiorella Artuso, Marco Mormando, Giuseppe Antonio Di Luna, Leonardo Querzoni

A recent trend in binary code analysis promotes the use of neural solutions based on instruction embedding models. An instruction embedding model is a neural network that transforms assembly instructions into embedding vectors. If the embedding network is able to process sequences of assembly instructions, transforming them into sequences of embedding vectors, then the network effectively represents an assembly code model. In this paper we present BinBert, a novel assembly code model. BinBert is built on a transformer pre-trained on a huge dataset of both assembly instruction sequences and symbolic execution information. BinBert can be applied to assembly instruction sequences and it is fine-tunable, i.e., it can be re-trained as part of a neural architecture on task-specific data. Through fine-tuning, BinBert learns how to apply the general knowledge acquired during pre-training to the specific task. We evaluated BinBert on a multi-task benchmark that we specifically designed to test the understanding of assembly code. The benchmark comprises several tasks, some taken from the literature and a few novel ones that we designed, mixing intrinsic and downstream tasks. Our results show that BinBert outperforms state-of-the-art models for binary instruction embedding, raising the bar for binary code understanding.
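As a rough illustration of the fine-tuning pattern the abstract describes (a pre-trained transformer encoder re-trained end-to-end together with a task-specific head on task data), the following PyTorch sketch may help. It is not the authors' implementation: the class names (AsmEncoder, FineTunedModel), the layer sizes, the two-class task, and the commented checkpoint path are all hypothetical assumptions.

```python
import torch
import torch.nn as nn

class AsmEncoder(nn.Module):
    """Stand-in for a pre-trained assembly-code transformer (hypothetical)."""
    def __init__(self, vocab_size=30000, d_model=768, n_layers=12, n_heads=12):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, token_ids):
        # (batch, seq_len) token ids -> (batch, seq_len, d_model) embeddings
        return self.encoder(self.embed(token_ids))

class FineTunedModel(nn.Module):
    """Pre-trained encoder plus a task-specific head, trained end-to-end."""
    def __init__(self, encoder, n_classes, d_model=768):
        super().__init__()
        self.encoder = encoder                      # weights come from pre-training
        self.head = nn.Linear(d_model, n_classes)   # task-specific layer

    def forward(self, token_ids):
        h = self.encoder(token_ids)                 # contextual instruction embeddings
        return self.head(h.mean(dim=1))             # mean-pool the sequence -> logits

encoder = AsmEncoder()
# encoder.load_state_dict(torch.load("pretrained.pt"))  # hypothetical checkpoint
model = FineTunedModel(encoder, n_classes=2)
opt = torch.optim.AdamW(model.parameters(), lr=2e-5)    # encoder weights update too

tokens = torch.randint(0, 30000, (4, 128))  # dummy batch of instruction tokens
labels = torch.randint(0, 2, (4,))
opt.zero_grad()
loss = nn.functional.cross_entropy(model(tokens), labels)
loss.backward()
opt.step()
```

Because the optimizer is given all parameters, the encoder's pre-trained weights are updated along with the new head, which is what distinguishes fine-tuning from using frozen embeddings.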