Technology Fridays: Comet.ML Wants to be the Google Analytics of the Deep Learning World

Welcome to Technology Fridays! Today we would like to explore a brand new platform that just launched to address one of the biggest challenges in machine intelligence applications. If you’ve ever worked on a deep learning project in the real world, you probably found yourself trapped in a virtually never-ending cycle of testing, regularization, optimization and constant improvement of a model.

The continuously evolving nature of its lifecycle is one of the key characteristics that makes machine learning applications fundamentally different from other types of software systems. In machine learning scenarios, data scientists rarely stop conducting experiments targeted at optimizing and improving the behavior of models. However, the toolsets for performing that level of experimentation haven’t evolved at the pace of the corresponding deep learning runtimes and platforms. Recently, a new startup called Comet.ML launched with the promise of allowing data scientists to monitor and optimize machine learning models across different technology stacks.

The experience of using Comet.ML resembles the model adopted by technologies like Google Analytics that made them the standard for monitoring and testing web applications. Similar to Google Analytics, data scientists can start using Comet.ML by embedding a specific tracking code as part of their machine learning model. The script will track the specific behavior of the model, including important elements such as hyperparameters and other relevant metrics.

Using the Comet.ML portal, data scientists can rapidly run experiments on a specific model by tuning its hyperparameters. The UI allows them to visualize the results of different experiments and compare them based on specific hyperparameter values. This might sound like a trivial problem, but it is one of the biggest nightmares in real-world machine learning applications.
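To make the comparison idea concrete, here is a minimal local sketch of what the portal automates: running several trials with different hyperparameters, recording each configuration together with its result, and surfacing the best one. Everything here (`train_stub`, the trial records) is a hypothetical stand-in, not the Comet.ML API.

```python
import random

def train_stub(lr, batch_size):
    # Hypothetical stand-in for a real training run: returns a fake
    # validation accuracy that peaks around lr=0.01, plus small noise.
    random.seed(int(lr * 10**6) + batch_size)
    return 0.9 - abs(lr - 0.01) * 5 + random.uniform(-0.01, 0.01)

trials = []
for lr in (0.001, 0.01, 0.1):
    for batch_size in (64, 128):
        acc = train_stub(lr, batch_size)
        # Each trial records its hyperparameters alongside the metric,
        # the way a tracking script would report them to a dashboard.
        trials.append({"lr": lr, "batch_size": batch_size, "val_acc": acc})

best = max(trials, key=lambda t: t["val_acc"])
print(best)
```

The dashboard view Comet.ML offers is essentially this table of trials, rendered visually, with the comparison done for you.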

The platform automatically integrates with several deep learning frameworks such as Keras, TensorFlow, PyTorch, Scikit-Learn and several others. Data scientists can download the SDK for their specific runtime and start tracking any model. The code for achieving this is fundamentally simple. The following example illustrates a Keras model being monitored with Comet.ML:

from comet_ml import Experiment

experiment = Experiment(api_key="YOUR_API_KEY",
                        project_name="my project name",
                        auto_param_logging=False)

batch_size = 128
num_classes = 10
epochs = 20

params = {"batch_size": batch_size,
          "epochs": epochs,
          "num_classes": num_classes}
experiment.log_multiple_params(params)

Comet.ML complements its robust machine learning monitoring and optimization capabilities with simple collaboration features that enable data scientists to provide feedback and cooperate on the optimization of specific machine learning programs. The platform is also capable of providing intelligent recommendations for optimizing and regularizing models based on their runtime behavior.
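Behind a dashboard like this, the tracking client is essentially accumulating a stream of per-epoch metrics that can later be plotted and compared. The following sketch illustrates that pattern with a hypothetical local `MetricLog` class; it is not the Comet.ML SDK, just an illustration of what a monitoring script records during training.

```python
class MetricLog:
    """Hypothetical stand-in for a tracking client: accumulates
    per-epoch metric values the way a monitoring script would
    stream them to a dashboard."""

    def __init__(self):
        self.history = {}

    def log_metric(self, name, value, step):
        # Append (step, value) so the dashboard can plot metric curves.
        self.history.setdefault(name, []).append((step, value))


log = MetricLog()
for epoch in range(1, 4):
    # A real training loop would compute the loss here; a fixed
    # formula keeps the sketch self-contained and runnable.
    log.log_metric("loss", 1.0 / epoch, step=epoch)

print(log.history["loss"])
```

A service like Comet.ML layers visualization, comparison across experiments, and collaboration on top of exactly this kind of logged history.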

Competition? Comet.ML solves a very challenging aspect of machine learning solutions. However, the platform is not entering the market without competition. Cloud platforms such as Azure ML, AWS SageMaker or Google Cloud ML include their own toolsets for monitoring and optimizing machine learning models. Similarly, startups such as Floyd or BitFusion can also be considered competitors.

CEO of IntoTheBlock, Chief Scientist at Invector Labs, I write The Sequence Newsletter, Guest lecturer at Columbia University, Angel Investor, Author, Speaker.
