Yandex Open Sources CatBoost to Continue Building the Polyglot Deep Learning World

Another month, another machine learning framework to get familiar with. A few days ago, Russia’s search giant Yandex open sourced CatBoost, a machine learning library that specializes in a technique known as gradient boosting. CatBoost encapsulates Yandex’s latest machine learning research efforts and replaces an older framework known as MatrixNet, which has been widely adopted across many of Yandex’s services.

From a theoretical standpoint, gradient boosting is a machine learning technique that builds a strong predictor by combining many weak learners, typically decision trees, and it excels at finding patterns in sparse, structured datasets. Gradient boosting shines when applied to semi-structured or structured transactional and historical data, rather than the sensory data (video, audio, images) that deep learning algorithms are typically built for.
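To make the idea concrete, here is a minimal sketch of training a gradient boosted model on a small tabular dataset. It assumes the open sourced catboost Python package and its CatBoostClassifier interface; the dataset, column indices and hyperparameters below are purely illustrative.

```python
# Minimal gradient boosting sketch with CatBoost (illustrative only; the
# data and hyperparameters are made up for the example).
from catboost import CatBoostClassifier

# A tiny tabular dataset: two numeric columns and one categorical column.
X_train = [
    [25, 1200.0, "mobile"],
    [41, 300.5, "desktop"],
    [33, 870.0, "mobile"],
    [52, 150.0, "tablet"],
]
y_train = [1, 0, 1, 0]

# CatBoost handles categorical features natively via cat_features.
model = CatBoostClassifier(
    iterations=100,      # number of boosting rounds (trees)
    learning_rate=0.1,   # shrinkage applied to each tree's contribution
    depth=4,             # depth of each decision tree
    verbose=False,
)
model.fit(X_train, y_train, cat_features=[2])

# Each new tree is fit to the errors of the ensemble built so far,
# which is the essence of gradient boosting.
print(model.predict([[29, 640.0, "mobile"]]))
```

Passing the categorical column through cat_features, rather than one-hot encoding it by hand, is the kind of structured-data convenience that distinguishes gradient boosting libraries from the tensor-oriented deep learning frameworks.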

If you are reading this, you might already be thinking: Jeez… do we really need another machine learning framework? After all, the market is already inundated with open source deep learning libraries such as TensorFlow, Torch, Theano, Caffe and many others. This level of fragmentation gets an order of magnitude more complex if we consider that deep learning frameworks are not exactly easy for mainstream developers to learn.

The short answer to the previous question is that we are, and will continue to be, living in a multi-framework deep learning world. In this polyglot ecosystem, there are frameworks and runtimes that specialize in different areas of the broad machine learning theory. Undoubtedly, the proliferation of deep learning frameworks may seem intimidating to people getting started in the space, but there are some ideas that might help us better navigate the ecosystem.

Making Sense of a Polyglot Deep Learning World

Below, I’ve listed a few ideas that could help you understand the current polyglot deep learning ecosystem. If nothing else, some of these ideas should help you better reason about this emerging market:

1 — No Framework is Good at Everything: The first thing we should understand in order to not feel overwhelmed by the large number of deep learning frameworks in the market is that no single framework can be generically applied across the entire spectrum of machine learning problems. Some frameworks specialize in different models of learning (supervised, unsupervised, reinforcement…) while others excel at operating against specific types of data and scenarios.

2 — Big Software Companies Have Their Favorite Frameworks: Part of the fragmentation in the deep learning space is due to the fact that software incumbents are regularly putting resources behind different frameworks. Google open sourced TensorFlow, Microsoft its Cognitive Toolkit, Baidu recently released PaddlePaddle, Facebook is actively contributing to Caffe2, Amazon backs MXNet and now we have Yandex’s CatBoost.

3 — High & Low Level Frameworks: When inspecting the deep learning space, it helps to make a distinction between high- and low-level frameworks. In the low-level category we can place frameworks like TensorFlow, Torch or Theano that directly enable the manipulation of computation graphs. There are also higher-level frameworks such as Keras, Sonnet or Bonsai that provide simpler programming models for creating structures such as neural networks while using the “low-level” frameworks as the underlying runtime (see the sketch after this list).

4 — Keeping Up with Machine Learning Research: The proliferation of deep learning frameworks is directly related to the explosion in machine learning research in recent years. Often, developers want to take advantage of a new machine learning research technique (ex: gradient boosting), realize that the models are not easy to implement with the existing frameworks and, you guessed it, decide to create and open source a new library.

5 — The Opportunity for a Common Deep Learning Runtime: The fragmentation of the deep learning space has created a massive opportunity for startups providing tools and runtime infrastructure that can efficiently interoperate across the existing frameworks. Platforms such as Bonsai, BitFusion or Algorithmia CODEX are already dabbling in that space.
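To illustrate the high- vs. low-level distinction from point 3, here is a rough sketch of the same single-neuron model expressed both ways. It assumes the TensorFlow 1.x graph API and the Keras Sequential API running on a TensorFlow backend; the exact calls vary between versions, and the toy input data is purely illustrative.

```python
# A rough sketch of the high-level vs. low-level split (assumes the
# TensorFlow 1.x graph API and the Keras Sequential API).
import numpy as np
import tensorflow as tf

# Low level: build and run the computation graph explicitly.
x = tf.placeholder(tf.float32, shape=(None, 4))   # input features
w = tf.Variable(tf.random_normal([4, 1]))         # weights
b = tf.Variable(tf.zeros([1]))                    # bias
y = tf.sigmoid(tf.matmul(x, w) + b)               # one logistic unit

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(y, feed_dict={x: np.random.rand(3, 4).astype(np.float32)}))

# High level: Keras describes the same structure as layers and delegates
# graph construction to a low-level backend such as TensorFlow or Theano.
from keras.models import Sequential
from keras.layers import Dense

model = Sequential([Dense(1, activation="sigmoid", input_shape=(4,))])
model.compile(optimizer="sgd", loss="binary_crossentropy")
print(model.predict(np.random.rand(3, 4)))
```

The point of the contrast is that the Keras version never touches placeholders, sessions or variable initializers; it describes the network as layers and lets the low-level framework build and execute the graph.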

