Sciblog - A blog designed like a scientific paper


My most popular LinkedIn posts of 2021

Feb. 26, 2022

Miguel González-Fierro


In 2021, I started posting on LinkedIn more consistently. A lot of my posts are about simplifying AI concepts so people can discover the field, learn something new, or even grow their career in Data Science. Over the year, my content got around 2 million views; here are some of the most popular posts.


A Gentle Introduction to Distributed Training with DeepSpeed

Jan. 30, 2022

Miguel González-Fierro


DeepSpeed is an open-source library that facilitates the training of large deep learning models based on PyTorch. With minimal code changes, a developer can train a model on a single GPU, on one machine with multiple GPUs, or on multiple machines in a distributed fashion. In this post, we review DeepSpeed and explain how to get started.
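To give a flavor of the workflow, here is a minimal sketch of a DeepSpeed training step. The toy model, batch size, and config values are illustrative assumptions, not taken from the post.

```python
# Minimal DeepSpeed training step; the model and config are toy placeholders.
# Run with the DeepSpeed launcher, e.g. `deepspeed train.py`.
import torch
import deepspeed

model = torch.nn.Linear(10, 2)  # stand-in for a large model
ds_config = {
    "train_batch_size": 8,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-3}},
}

# deepspeed.initialize wraps the model in an engine that handles device
# placement, data parallelism, and the optimizer.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model, model_parameters=model.parameters(), config=ds_config
)

x = torch.randn(8, 10).to(engine.device)
y = torch.randint(0, 2, (8,)).to(engine.device)

loss = torch.nn.functional.cross_entropy(engine(x), y)
engine.backward(loss)  # replaces loss.backward()
engine.step()          # optimizer step plus gradient zeroing, via the engine
```

The same script scales from one GPU to many machines by changing only the launcher arguments and the config, which is the "minimal code changes" appeal described above.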


Matrix Factorization is one of the most widely used methods in Recommendation Systems, whose ultimate goal is to understand a user's preferences for a set of items. For example, Netflix tries to predict which movie you would like to watch next. In this post, we explain in simple terms how Matrix Factorization works.
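As a rough illustration of the idea, the sketch below factorizes a tiny ratings matrix into user and item factors with stochastic gradient descent. The matrix, rank, and hyperparameters are made up for the example.

```python
# Toy matrix factorization with SGD; zeros mark unobserved ratings.
import numpy as np

R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
k, lr, reg, epochs = 2, 0.01, 0.02, 500  # rank and hyperparameters (assumed)
rng = np.random.default_rng(0)
U = rng.normal(scale=0.1, size=(R.shape[0], k))  # user factors
V = rng.normal(scale=0.1, size=(R.shape[1], k))  # item factors

for _ in range(epochs):
    for u, i in zip(*np.nonzero(R)):  # iterate over observed ratings only
        err = R[u, i] - U[u] @ V[i]
        Uu = U[u].copy()
        U[u] += lr * (err * V[i] - reg * U[u])
        V[i] += lr * (err * Uu - reg * V[i])

print(np.round(U @ V.T, 1))  # reconstruction also fills in the missing cells
```

The predicted rating for a user-item pair is just the dot product of their two factor vectors, which is what lets the model score items the user has never rated.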


Understanding the new project of Elon Musk, the Tesla Bot

Aug. 22, 2021

Miguel González-Fierro


Elon Musk unveiled a new project to build a full-size humanoid robot. It will be lightweight, weighing only 57 kg, and will have 40 actuators; the first prototype will be ready next year. After more than 8 years as a researcher in humanoid robotics, before going full-time into the AI space, here are my thoughts about the Tesla Bot.


A Gentle Introduction to Fourier Transformers for NLP

May 23, 2021

Miguel González-Fierro


The attention mechanism is responsible for much of the recent success in NLP tasks such as text classification, named-entity recognition, question answering, and translation, to name a few. However, computing attention is expensive. In this post, we summarize a new approach that replaces self-attention with a Fast Fourier Transform, achieving 92% of BERT's accuracy while being 7 times faster.
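The core replacement is simple enough to sketch in a few lines: each self-attention sublayer is swapped for an unparameterized Fourier transform over the hidden and sequence dimensions, keeping only the real part. The tensor shapes below are illustrative assumptions.

```python
# FNet-style token mixing: a 2D FFT replaces self-attention (no learned weights).
import torch

def fourier_mixing(x: torch.Tensor) -> torch.Tensor:
    # x has shape (batch, seq_len, hidden); FFT over hidden, then sequence.
    return torch.fft.fft(torch.fft.fft(x, dim=-1), dim=-2).real

x = torch.randn(2, 16, 64)      # a dummy batch of token embeddings
print(fourier_mixing(x).shape)  # torch.Size([2, 16, 64]): shape is preserved
```

Because the mixing step has no learned parameters and the FFT runs in O(n log n) time, this is where the speedup over quadratic self-attention comes from.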