Every month, we deliver a community update with a list of recent articles and technologies, across various topics, that are worth knowing about.

Highlight: Self Supervised Learning

Yann LeCun, VP & Chief AI Scientist at Facebook, gave an excellent keynote at NVIDIA GTC titled "The Energy-Based View of Self-Supervised Learning (SSL)".

In his talk, he summarized the recent efforts to apply self-supervised learning to computer vision. Self-supervised learning originated in NLP language models, where the model is trained by guessing the next word or a missing word in a sentence. Training such models is relatively easy because training data is readily available: any raw text will do. The trained model is later trained further in a supervised manner on a downstream task (a.k.a. fine-tuning).
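To make the objective concrete, here is a toy sketch of how a masked-word training pair is built from raw text. No model is involved; the function name and mask token are illustrative, not from any specific library.

```python
# Toy illustration of the self-supervised language-model objective:
# hide one word in a sentence and treat it as the prediction target.
def make_masked_example(tokens, mask_index, mask_token="[MASK]"):
    """Return (masked_tokens, target_word) for one training example."""
    target = tokens[mask_index]
    masked = tokens[:mask_index] + [mask_token] + tokens[mask_index + 1:]
    return masked, target

tokens = "the cat sat on the mat".split()
masked, target = make_masked_example(tokens, 2)
print(masked)   # ['the', 'cat', '[MASK]', 'on', 'the', 'mat']
print(target)   # 'sat'
```

Because the label comes from the text itself, any unannotated corpus becomes training data.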

This idea was transferred to computer vision by predicting a 'covered' part of an image. However, it proved difficult due to the high-dimensional nature of images: instead of guessing a single missing word, the model has to predict a whole square patch of pixels. In his talk, LeCun reviews the different recent approaches and attempts and compares them through the lens of energy-based models.
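A back-of-the-envelope calculation shows the dimensionality gap (the patch size here is an arbitrary illustrative choice):

```python
# Predicting one masked word is a single choice from a vocabulary;
# predicting a masked image patch means regressing every pixel value in it.
patch_h, patch_w, channels = 32, 32, 3
values_to_predict = patch_h * patch_w * channels
print(values_to_predict)  # 3072 continuous values for one 32x32 RGB patch
```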

One such method is contrastive joint embedding, where a Siamese network judges whether two images show the same object by measuring the similarity (or distance) between their embeddings, an idea that originated in 1993 for signature verification. However, most of these contrastive prediction methods are computationally expensive for images in practice, especially high-resolution ones. The contrastive approach works better for speech: multilingual representations are created by masking part of the audio, training a Siamese network to generate vector embeddings for different audio samples, and then detecting whether they belong to the same spoken sentence. It is easy to see the resemblance to adversarial generators (GANs), which also belong to the contrastive family of methods.
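The idea behind the contrastive objective can be sketched in a few lines. This is a minimal illustration in the style of the classic contrastive loss, with hand-written embeddings standing in for a network's outputs; the margin value is an arbitrary choice.

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def contrastive_loss(a, b, same, margin=1.0):
    """Pull same-object embeddings together, push different ones apart."""
    d = euclidean(a, b)
    if same:
        return d ** 2                    # same pair: penalize any distance
    return max(0.0, margin - d) ** 2     # different pair: penalize closeness

anchor = [0.0, 1.0]
positive = [0.1, 0.9]    # same object -> small loss
negative = [2.0, -1.0]   # different object, beyond the margin -> zero loss

print(contrastive_loss(anchor, positive, same=True))
print(contrastive_loss(anchor, negative, same=False))
```

The expensive part in practice is mining enough informative negative pairs, which is one reason LeCun flags these methods as costly for images.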

Another such method is amortized inference, of which a famous example is the Variational Autoencoder (VAE). LeCun reviews several types of amortized inference and dives deeper into conditional VAEs, which use a Kullback-Leibler divergence term and dropout to control the embedding distribution. This method handles multimodality and uncertainty well.
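The KL term mentioned above has a well-known closed form when the encoder outputs a diagonal Gaussian and the prior is a standard normal. A small sketch, with the function name and inputs chosen for illustration:

```python
import math

def kl_to_standard_normal(mu, log_var):
    """KL( N(mu, exp(log_var)) || N(0, 1) ), summed over dimensions."""
    return -0.5 * sum(1 + lv - m ** 2 - math.exp(lv)
                      for m, lv in zip(mu, log_var))

# An encoder output that exactly matches the prior gives zero KL...
print(kl_to_standard_normal([0.0, 0.0], [0.0, 0.0]))  # 0.0
# ...while drifting away from it is penalized.
print(kl_to_standard_normal([1.0], [0.0]))            # 0.5
```

In the VAE loss this term is what keeps the embedding distribution close to the prior, which is how the method controls the latent space.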

The slides, available here, are very informative and approachable, and worth reading to understand the SOTA of self-supervised deep learning.

Machine Learning

Perceiver: General Perception with Iterative Attention

Mike Erlihson shared extensive details about the Perceiver, a new kind of Transformer that solves the quadratic complexity problem and can handle complex, long inputs.

Perceiver: General Perception with Iterative Attention
To overcome Transformers’ quadratic complexity (w.r.t. input length), the Perceiver article here offers a novel approach: attention is computed against a small learned latent array instead of the full input. Check it out!
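A rough cost comparison shows why the latent array helps; the input length and latent size below are illustrative values, not taken from the paper.

```python
# Standard self-attention computes scores between every pair of input
# positions: input_len ** 2. The Perceiver's input-facing cross-attention
# attends from a small latent array instead: input_len * num_latents.
input_len = 50_000   # e.g. raw pixels or audio samples
num_latents = 512    # size of the learned latent array (illustrative)

self_attention_ops = input_len ** 2
cross_attention_ops = input_len * num_latents
print(self_attention_ops // cross_attention_ops)  # 97
```

Since `num_latents` is fixed, the cost grows only linearly with the input length, which is what makes very long raw inputs feasible.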

Papers With Datasets

The popular website PapersWithCode added a special section for papers that introduce new datasets: Papers With Datasets.

Shapash: Model Reasoning

Data scientists from MAIF, a French insurance company, released a framework that helps investigate the connections between features and predictions. The package launches a local HTTP server for navigating between local and global explanations of a dataset.

It currently supports non-deep-learning models and frameworks such as CatBoost, XGBoost, LightGBM, scikit-learn ensembles, linear models, SVMs, etc.

Shapash makes Machine Learning models transparent and understandable by everyone - MAIF/shapash

Superset: Data Visualization

Superset is an open-source business intelligence and data visualization/exploration platform from the Apache Software Foundation. It is aimed at organizations that want to host it internally.

It can connect to many different data sources and supports querying and visualizing the data in 2D and 3D.

Apache Superset is a Data Visualization and Data Exploration Platform - apache/superset

Software Development

Learn Algorithms with Examples

The collection covers graph algorithms like Bellman-Ford, BFS & DFS, ciphers such as RSA and AES, machine learning, dynamic programming, sorting, and data structures.

All algorithms are implemented in 22 programming languages.
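As a taste of the kind of algorithm covered, here is a compact Bellman-Ford: single-source shortest paths that, unlike Dijkstra, tolerates negative edge weights and detects negative cycles. The implementation is our own sketch, not taken from the repository.

```python
def bellman_ford(num_nodes, edges, source):
    """edges: list of (u, v, weight). Returns distances, or None on a negative cycle."""
    INF = float("inf")
    dist = [INF] * num_nodes
    dist[source] = 0
    # Relax every edge num_nodes - 1 times.
    for _ in range(num_nodes - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # One more pass: any further improvement means a negative cycle.
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            return None
    return dist

edges = [(0, 1, 4), (0, 2, 1), (2, 1, 2), (1, 3, 1)]
print(bellman_ford(4, edges, 0))  # [0, 3, 1, 4]
```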

Faster Python

This isn't new, but Itamar Turner-Trauring has many good tips for improving the performance of both Python code and Pandas, especially when handling big datasets, as well as for using Docker.

Speed up your code
Helping you deploy with confidence, ship higher quality code, and speed up your application.
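One recurring tip for big datasets is to stream data in chunks instead of materializing it all in memory. A generic illustrative sketch (the linked articles cover the Pandas- and Docker-specific versions):

```python
def chunked(iterable, size):
    """Yield successive lists of at most `size` items from any iterable."""
    chunk = []
    for item in iterable:
        chunk.append(item)
        if len(chunk) == size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk

# Aggregate a "large" stream chunk by chunk, never holding it all at once.
total = sum(sum(c) for c in chunked(range(10), 4))
print(total)  # 45
```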

EverSQL: Database Query Optimization

EverSQL uses ML to optimize SQL queries. It parses your DB structure and offers suggestions such as adding indexes to a table or restructuring the query to improve its speed. Even if your DB is currently fast, performance may deteriorate as queries and users grow. Try it out for free here.
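To see why an index suggestion matters, here is a self-contained demo using Python's built-in sqlite3 (EverSQL itself targets production databases; the table and index names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(i, f"user{i}@example.com") for i in range(1000)])

def plan(sql):
    # The last column of EXPLAIN QUERY PLAN describes the access strategy.
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[-1]

query = "SELECT id FROM users WHERE email = 'user42@example.com'"
before = plan(query)   # a full table scan, e.g. "SCAN users"
conn.execute("CREATE INDEX idx_email ON users(email)")
after = plan(query)    # an index search, e.g. "SEARCH users USING INDEX idx_email"
print(before, "->", after)
```

The exact plan wording varies by SQLite version, but the shift from a scan to an index search is the speed-up such tools are suggesting.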

Natural Language Processing (NLP)

SummVis: Visualize and Evaluate automatic summaries

SummVis is a new open-source interactive visualization tool that enables in-depth exploration of summary quality. It marks where the generated summary overlaps with the original text and where the model was simply hallucinating.
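A toy version of that overlap check, to show the idea (SummVis itself uses proper tokenization and alignment; this naive whitespace version is only a sketch):

```python
def overlap_report(source, summary):
    """Mark each summary token as present in the source (True) or novel (False)."""
    source_tokens = set(source.lower().split())
    return [(tok, tok.lower() in source_tokens) for tok in summary.split()]

source = "The committee approved the budget on Tuesday"
summary = "committee approved budget unanimously"
print(overlap_report(source, summary))
# [('committee', True), ('approved', True), ('budget', True), ('unanimously', False)]
```

Novel tokens like "unanimously" are the ones worth inspecting as potential hallucinations.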

Textacy V. 0.11

Textacy, a natural language processing library built on top of spaCy, had a major release to comply with spaCy 3.0. It brings better support for pre-processing raw text, stripping HTML tags, more structured information-extraction methods, integrated word embeddings with new similarity/distance measures (Tversky, Jaro, etc.), and much more.
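For reference, the Tversky index mentioned above is a simple asymmetric set-overlap measure. A sketch on token sets (textacy's own implementation is more careful; this just shows the formula):

```python
def tversky(a, b, alpha=0.5, beta=0.5):
    """|A∩B| / (|A∩B| + alpha*|A-B| + beta*|B-A|); alpha = beta = 0.5 gives Dice."""
    a, b = set(a), set(b)
    common = len(a & b)
    if not a and not b:
        return 1.0
    return common / (common + alpha * len(a - b) + beta * len(b - a))

s1 = "natural language processing".split()
s2 = "natural language understanding".split()
print(tversky(s1, s2))  # 2 / (2 + 0.5 + 0.5) = 0.666...
```

Tuning alpha and beta lets you weight how much each side's unmatched tokens count, which is what makes the measure asymmetric.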

Upcoming Events

For more events, visit the AI Event Calendar.

May 25th-28th - Applied AI Conference in Vienna

How real world businesses make use of AI

AAIC is about real-world applications of artificial intelligence in business. This is where developers of AI solutions meet with users and potential users from all industries, internationally and nationally. This year we are putting a special focus on the USA!

We organize the AAIC together with AI Austria.


Check out open positions in selected companies.

That's it for this month. If you have feedback, news or events you wish to share with us, please feel free to contact us.

See you next month!