Divide et impera - part I: coupling

This article is about coupling in IT; divide et impera is Latin for "divide and conquer".

Think like a vertex: using Go's concurrency for graph computation

A simple face detection utility from Python to Go

In this article, I explain how to build a tool that detects faces in a picture. It is a sort of how-to on designing and implementing such a tool with a neural network. For the design part, I describe how to: build the business model with a neural network; adapt the network to the specific domain of face detection by changing its knowledge; use the resulting domain with a Go-based infrastructure; and code a small application in Go to communicate with the outside world.
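As an illustration of that last step, here is a minimal sketch of what the Go entry point could look like: it loads a serialized network with onnx-go, runs it on a Gorgonia backend, and prints the raw detections. The onnx-go calls follow the project's getting-started example; the model file name, the input shape, and the imageToTensor helper are placeholders for illustration, not code from the article.

```go
package main

import (
	"io/ioutil"
	"log"

	"github.com/owulveryck/onnx-go"
	"github.com/owulveryck/onnx-go/backend/x/gorgonnx"
	"gorgonia.org/tensor"
)

func main() {
	// Create the Gorgonia-based execution backend and bind a model to it.
	backend := gorgonnx.NewGraph()
	model := onnx.NewModel(backend)

	// Load the serialized network (architecture + weights).
	b, err := ioutil.ReadFile("facedetection.onnx")
	if err != nil {
		log.Fatal(err)
	}
	if err := model.UnmarshalBinary(b); err != nil {
		log.Fatal(err)
	}

	// Feed the picture to the network and run the graph.
	model.SetInput(0, imageToTensor("picture.jpg"))
	if err := backend.Run(); err != nil {
		log.Fatal(err)
	}

	// Fetch the raw detections computed by the model.
	outputs, err := model.GetOutputTensors()
	if err != nil {
		log.Fatal(err)
	}
	log.Println(outputs)
}

// imageToTensor is a placeholder: a real implementation would decode the
// image, normalize the pixels, and lay them out in the shape the network
// expects (the shape below is arbitrary).
func imageToTensor(path string) tensor.Tensor {
	return tensor.New(tensor.WithShape(1, 3, 416, 416), tensor.Of(tensor.Float32))
}
```

A real version would also post-process the output tensors into bounding boxes before reporting them to the outside world.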

From a project to a product: the state of onnx-go

My journey with ONNX and Go - Running the graph

In the previous post, I gave an introduction and built a POC to interact with ONNX models from Go, decoding the information to reconstruct a graph. Now I propose to expand on that principle and create a proper execution backend based on Gorgonia. This post is a bit more technical than the previous one, but all the concepts needed to follow along are present in the last article. Decoding the tensor: in machine learning, the fundamental element of a computation graph is a tensor.
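To make "decoding the tensor" concrete, here is a minimal sketch of the idea: an ONNX tensor is essentially a shape plus a flat array of values, and rebuilding it in Go amounts to handing both to Gorgonia's tensor package. The decodeTensor helper is illustrative and not taken from the post; only the gorgonia.org/tensor calls are actual library API.

```go
package main

import (
	"fmt"

	"gorgonia.org/tensor"
)

// decodeTensor builds a Gorgonia *tensor.Dense from a shape and a flat
// slice of float32 values, which is essentially what an ONNX TensorProto
// carries (dims + raw data).
func decodeTensor(dims []int, values []float32) tensor.Tensor {
	return tensor.New(
		tensor.WithShape(dims...),
		tensor.WithBacking(values),
	)
}

func main() {
	// A 2x3 tensor, as it could appear as an initializer in an ONNX graph.
	t := decodeTensor([]int{2, 3}, []float32{1, 2, 3, 4, 5, 6})
	fmt.Println(t)
}
```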

My journey with ONNX and Go - The beginning

This year started with a lot of deep thoughts about software 2.0. My conclusion (which differs slightly from Andrej Karpathy's view) is that a software 2.0 is the combination of a neural network model and its associated weights. This is a concept; now the question is how to materialize the idea: what artifact represents a software 2.0? I came up with several ideas and tried one of them: serializing the mathematical model and the weights.
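As a rough illustration of that idea, the sketch below reads an .onnx file and decodes it with protocol buffers: the artifact is a single ModelProto message holding both the description of the graph and the trained weights. The onnxpb import path is hypothetical (Go code generated from the official onnx.proto with protoc-gen-go); the protobuf call itself is standard.

```go
package main

import (
	"fmt"
	"io/ioutil"
	"log"

	"github.com/golang/protobuf/proto"

	// Hypothetical import path: Go bindings generated from onnx.proto.
	onnxpb "example.com/generated/onnx"
)

func main() {
	// The .onnx file is the artifact: one protobuf message carrying both
	// the network definition and its weights.
	b, err := ioutil.ReadFile("model.onnx")
	if err != nil {
		log.Fatal(err)
	}

	model := &onnxpb.ModelProto{}
	if err := proto.Unmarshal(b, model); err != nil {
		log.Fatal(err)
	}

	// The graph carries the operators (nodes) and the weights (initializers).
	graph := model.GetGraph()
	fmt.Println("operators:", len(graph.GetNode()))
	fmt.Println("initializers (weights):", len(graph.GetInitializer()))
}
```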

Recurrent Neural Network, Serverless with Webassembly and S3

During the past weeks, I’ve had the opportunity to play a bit with Wasm and Go. All those experiments led me to write a proof of concept that illustrates what I have said recently about: thinking of the deep-learning stack like an Ops person (see my post about NNRE/NNDK), and capturing the real value of the training process (the knowledge) in a sequence of bits (the lightning talk I gave about it at dotAI should be online soon).
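For context on the Wasm side, here is a minimal sketch of the pattern such a proof of concept relies on: a Go program compiled with GOOS=js GOARCH=wasm that exposes a function to the browser through syscall/js. The predict name and its body are placeholders, not the actual code of the POC.

```go
//go:build js && wasm

package main

import "syscall/js"

func main() {
	// Keep the program alive so the exported function stays callable
	// from the page's JavaScript.
	done := make(chan struct{})

	// Expose a Go function to JavaScript under the name "predict".
	// In the real POC this is where the recurrent network would be fed.
	js.Global().Set("predict", js.FuncOf(func(this js.Value, args []js.Value) interface{} {
		input := args[0].String()
		// Placeholder: echo the input instead of running the RNN.
		return "prediction for: " + input
	}))

	<-done
}
```

The binary is built with GOOS=js GOARCH=wasm go build -o main.wasm and loaded in the page with the wasm_exec.js glue shipped with the Go distribution.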