
Understanding Program Semantics
Here is a reflection on the need to represent code in a suitable form before feeding it into neural-network-based encoders such as code2vec, word2vec, and code2seq ... Read More

Can Code Be Translated?
Here we talk about code2seq, which differs from code2vec in adapting neural machine translation techniques to the task of mapping a snippet of code to a sequence of words ... Read More

Further Down Code2vec
Here is a tutorial on using code2vec to predict method names, measure the model's accuracy, and export the corresponding vector embeddings ... Read More

Embedding Code Into Vectors
Here we discuss code2vec's relation to word2vec and autoencoders to better grasp how feasible it is to represent code as vectors, which is our main interest ... Read More

The Vectors of Language
This post is an overview of word2vec, a method for obtaining vectors that represent natural language in a way that is suitable for machine learning algorithms ... Read More

Triage for Hackers
This post is a high-level review of our previous discussion concerning machine learning techniques applied to vulnerability discovery and exploitation ... Read More

Vulnerability Classifier
Here is a simple attempt to define a vulnerability classifier using categorical encoding and a basic neural network with a single hidden layer ... Read More

Digression to Regression
In this post, we begin to tackle why vectors are the most appropriate representation for data as input to machine learning algorithms ... Read More

Tainted Love
This blog post provides a brief description of static and dynamic taint analysis or taint checking ... Read More

Fool the Machine
You'll see how to create images that fool classifiers into seeing the wrong object while remaining visually similar to a correctly classified image ... Read More