Five key analytics that clearly illustrate the momentum in the Bitcoin market.

Source: https://news.bitcoin.com/bitcoin-to-rise-to-20k-this-year-spurred-by-government-money-printing-and-covid-19-bloomberg/

The crypto market is experiencing tremendous momentum, led by the rally in Bitcoin prices. A lot has been written about the macro factors driving market trends in the crypto space, so I don’t plan to bore you with more subjectivity. The signals of the bullish momentum are everywhere but, sometimes, they are not properly captured in analytics. Today, I would like to show you 5 fascinating analytics that explain the Bitcoin momentum from different perspectives such as blockchain, social, and derivatives data.

1) Bitcoins that are Older than 1 Week and Younger than 1 Month are Moving More than the Rest of the Market

Look at the following chart comparing Bitcoin UTXO transactions from a month ago and recent days. You can clearly see that the share of Bitcoin aged between one week and one month has increased by over 6%. Bitcoin younger than one day has also increased, while Bitcoin between one day and one week old has decreased. …
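To make the analytic concrete, here is a minimal sketch of how an age-band breakdown like the one in the chart can be computed. It assumes you already have a list of moved UTXOs with their value and creation time; the sample data and band boundaries below are purely illustrative, not IntoTheBlock’s actual pipeline.

```python
from datetime import datetime, timedelta

# Hypothetical input: each moved UTXO is (btc_value, created_at).
utxos = [
    (0.5, datetime(2020, 11, 20)),
    (2.0, datetime(2020, 10, 2)),
    (1.2, datetime(2020, 11, 23)),
]

def age_band(created_at, now):
    """Bucket a UTXO into the age bands discussed above."""
    age = now - created_at
    if age < timedelta(days=1):
        return "< 1 day"
    if age < timedelta(weeks=1):
        return "1 day - 1 week"
    if age < timedelta(days=30):
        return "1 week - 1 month"
    return "> 1 month"

def band_distribution(utxos, now):
    """Share of moved BTC value falling in each age band."""
    totals = {}
    for value, created_at in utxos:
        band = age_band(created_at, now)
        totals[band] = totals.get(band, 0.0) + value
    grand_total = sum(totals.values())
    return {band: v / grand_total for band, v in totals.items()}

# Comparing this distribution across two snapshots (a month ago vs.
# recent days) yields the kind of shift shown in the chart.
print(band_distribution(utxos, now=datetime(2020, 11, 24)))
```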


GPipe and PipeDream are new frameworks for scaling the training of deep learning models.

Source: https://journal.jp.fujitsu.com/en/2017/01/12/01/

Microsoft and Google have been actively working on new models for training deep neural networks. The result of that work is the release of two new frameworks: Microsoft’s PipeDream and Google’s GPipe, which follow similar principles to scale the training of deep learning models. Both projects are detailed in their respective research papers (PipeDream, GPipe), which I will try to summarize today.

Training is one of those areas of the deep learning lifecycle that we don’t think of as challenging until models hit a certain scale. While training basic models during experimentation is relatively trivial, the complexity scales linearly with the quality and size of the model. For example, the winner of the 2014 ImageNet visual recognition challenge was GoogleNet, which achieved 74.8% top-1 accuracy with 4 million parameters, while just three years later the 2017 ImageNet challenge was won by Squeeze-and-Excitation Networks, which achieved 82.7% top-1 accuracy with 145.8 million (36x more) parameters. …
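Before diving into the papers, here is a conceptual single-device PyTorch sketch of the idea both frameworks build on: splitting the model into sequential stages and the minibatch into micro-batches, then accumulating gradients across micro-batches. This only mirrors the schedule under those assumptions; it is not code from GPipe or PipeDream, which place stages on separate accelerators and overlap their execution.

```python
import torch
import torch.nn as nn

# The model is split into sequential stages; each minibatch is split
# into micro-batches, and gradients are accumulated across them.
stages = nn.ModuleList([
    nn.Sequential(nn.Linear(32, 64), nn.ReLU()),   # stage 0
    nn.Sequential(nn.Linear(64, 64), nn.ReLU()),   # stage 1
    nn.Sequential(nn.Linear(64, 10)),              # stage 2
])
optimizer = torch.optim.SGD(stages.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(16, 32)                      # one minibatch
y = torch.randint(0, 10, (16,))
micro_batches = zip(x.chunk(4), y.chunk(4))  # 4 micro-batches

optimizer.zero_grad()
for xb, yb in micro_batches:
    out = xb
    for stage in stages:             # forward through each stage in order
        out = stage(out)
    loss = loss_fn(out, yb) / 4      # scale so gradients average correctly
    loss.backward()                  # accumulate gradients per micro-batch
optimizer.step()                     # single synchronous update (GPipe-style)
```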


Abstract reasoning is one of the hallmarks of human cognition. OpenAI research shows how it can be recreated in deep neural networks.

Source: https://medium.com/swlh/an-iq-test-proves-that-neural-networks-are-capable-of-abstract-reasoning-9b0da1b5e97a

I recently started a new newsletter focused on AI education. TheSequence is a no-BS (meaning no hype, no news, etc.) AI-focused newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers, and concepts. Please give it a try by subscribing below:

The evolution of artificial intelligence (AI) has highlighted some major differences between human and machine intelligence. One of those differences is that the current generation of AI methods is focused on mastering specific tasks, while humans are exceptional at learning generic concepts that transfer across many diverse tasks. …


Weekly newsletter that discusses impactful ML research papers, cool tech releases, the money in AI, and real-life implementations.


The Sequence Scope is a summary of the most important research papers, technology releases, and startup news in the AI ecosystem from the last week. This compendium is part of TheSequence newsletter. Data scientists, scholars, and developers from Microsoft Research, Intel Corporation, Linux Foundation AI, Google, Lockheed Martin, Cardiff University, Mellon College of Science, Warsaw University of Technology, Universitat Politècnica de València, and other companies and universities are already subscribed to TheSequence.

📝 Editorial: The First AI Startup IPOs

Initial public offerings (IPOs) are not only a major achievement in the lifetime of any company but also a strong sign of maturity for a given market. Across recent technology trends, the first group of startups to make it to the IPO line have gone on to become emblematic companies of their respective markets, while also signaling that the market is mature enough to produce viable standalone companies. The examples are everywhere: Salesforce.com in cloud computing, Okta in enterprise identity, Cloudera in big data, New Relic in application performance monitoring, Twilio in cloud communications, and the list goes on. The first IPOs of any tech trend are not necessarily the most successful companies, but they certainly pave the way for the rest of the market. In artificial intelligence (AI), there are several public companies such as Nvidia, Microsoft, and Alphabet that have capitalized on the trend, but we still haven’t seen startups from the AI era debut as public companies. …


The digital assistant incorporates a reformulation engine that can learn to correct responses in real time based on customer interactions.

Source: https://www.howtogeek.com/286076/how-to-make-the-amazon-echo-understand-you-better/

I recently started a new newsletter focused on AI education. TheSequence is a no-BS (meaning no hype, no news, etc.) AI-focused newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers, and concepts. Please give it a try by subscribing below:

Digital assistants such as Alexa, Siri, Cortana, or the Google Assistant are some of the best examples of mainstream adoption of artificial intelligence (AI) technologies. These assistants are becoming more prevalent and tackling new domain-specific tasks, which makes the maintenance of their underlying AI particularly challenging. The traditional approach to building digital assistants has been based on natural language understanding (NLU) and automatic speech recognition (ASR) methods that rely on annotated datasets. …
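As a toy illustration of the reformulation idea (not Amazon’s actual engine), the sketch below learns rewrites from hypothetical session logs in which a failed request is quickly followed by a rephrased one, then applies the most frequent rewrite at run time. All the queries and names here are made up.

```python
from collections import Counter, defaultdict

# Hypothetical log of (misrecognized_query, follow_up_query) pairs
# mined from sessions where the first attempt failed.
reformulation_pairs = [
    ("play maj and dragons", "play imagine dragons"),
    ("play maj and dragons", "play imagine dragons"),
    ("play maj and dragons", "play magic dragons"),
]

# Count, for each failing query, which corrections users issued.
rewrites = defaultdict(Counter)
for original, corrected in reformulation_pairs:
    rewrites[original][corrected] += 1

def reformulate(query):
    """Return the most common correction seen for this query, if any."""
    if query in rewrites:
        return rewrites[query].most_common(1)[0][0]
    return query

print(reformulate("play maj and dragons"))  # -> "play imagine dragons"
```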


An analysis of the core challenges of machine learning-based quant strategies in crypto.


Yesterday, we presented a session on the reasons why most machine learning-based quant strategies in the crypto space fail. The session was a continuation of a recent article we published at CoinDesk. In summary, here are the 10 core challenges we see for quant strategies in crypto markets.

1. Small datasets: Training datasets in crypto are relatively small by quant standards.

2. Regular “outlier” events: Rare events in crypto markets happen all the time and they trick prediction models.

3. Propensity to overfit: The small datasets, the pressure quant teams operate under, and the limited research all conspire to make many quant models overfit their training datasets, as illustrated in the sketch after this list. …
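Here is a minimal sketch of challenge #3 in action: a high-capacity model fit to a tiny synthetic “price” series. The data is a pure random walk, so any pattern the model finds is spurious by construction, yet in-sample error looks excellent.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0, 1, 40)
prices = 100 + np.cumsum(rng.normal(0, 1, 40))   # random walk, no signal

train_t, test_t = t[:30], t[30:]
train_p, test_p = prices[:30], prices[30:]

# A degree-10 polynomial has far too much capacity for 30 observations.
model = np.poly1d(np.polyfit(train_t, train_p, deg=10))

train_mse = np.mean((model(train_t) - train_p) ** 2)
test_mse = np.mean((model(test_t) - test_p) ** 2)
print(f"in-sample MSE:     {train_mse:.3f}")  # near zero
print(f"out-of-sample MSE: {test_mse:.3f}")   # orders of magnitude larger
```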


The TensorFlow Constrained Optimization framework helps incorporate fairness constraints into machine learning models.

Source: https://twitter.com/GoogleAI/status/1230922776849993729

I recently started a new newsletter focused on AI education. TheSequence is a no-BS (meaning no hype, no news, etc.) AI-focused newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers, and concepts. Please give it a try by subscribing below:

Fairness is a highly subjective concept, and machine learning is no different in that regard. We typically feel that the referees are being “unfair” to our favorite team when it loses a close match, or that an outcome is perfectly “fair” when it goes our way. Given that machine learning models cannot rely on subjectivity, we need an efficient way to quantify fairness. A lot of research has been done in this area, mostly framing fairness as an outcome optimization problem. …
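As a conceptual sketch of that framing (plain NumPy, not the TensorFlow Constrained Optimization API), the snippet below trains a logistic model while penalizing the gap in average predicted score between two groups, a soft, Lagrangian-style version of a demographic-parity constraint. The data and penalty weight are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 3))
group = rng.integers(0, 2, n)            # protected attribute (0 or 1)
y = (X[:, 0] + 0.5 * group + rng.normal(0, 0.5, n) > 0).astype(float)

w = np.zeros(3)
lam = 5.0                                # constraint weight (Lagrange-style)
lr = 0.1
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(500):
    p = sigmoid(X @ w)
    grad = X.T @ (p - y) / n             # gradient of the average log loss
    # gradient of the squared gap in mean score between the two groups
    gap = p[group == 0].mean() - p[group == 1].mean()
    dp = p * (1 - p)
    g0 = (X[group == 0] * dp[group == 0, None]).mean(axis=0)
    g1 = (X[group == 1] * dp[group == 1, None]).mean(axis=0)
    grad += lam * 2 * gap * (g0 - g1)
    w -= lr * grad

p = sigmoid(X @ w)
print("score gap between groups:",
      abs(p[group == 0].mean() - p[group == 1].mean()))
```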


Polygames, PyTorch3D and HiPlot are the new additions to Facebook’s open source deep learning stack.


I recently started a new newsletter focused on AI education. TheSequence is a no-BS (meaning no hype, no news, etc.) AI-focused newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers, and concepts. Please give it a try by subscribing below:

The research divisions of technology giants such as Microsoft, Google, Facebook, Amazon, and Uber have become some of the most active contributors to open source frameworks in the artificial intelligence (AI) space. Their contributions combine stacks that have been tested at scale in their internal solutions with some very advanced ideas from their research labs. While having the biggest tech firms in the world actively contributing to open source deep learning is certainly exciting, it makes it a bit hard for data scientists to keep up with new developments in the space. Today, I would like to cover three releases that Facebook AI Research (FAIR) open sourced in the last month: just one month into 2020, the FAIR team has already shipped three new open source projects. …
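As a quick taste of one of the three releases, here is what exploring hyperparameter results with HiPlot looks like, based on its published quick-start; the hyperparameter values below are made up for illustration.

```python
# HiPlot visualizes high-dimensional experiment data as interactive
# parallel-coordinates plots, which is handy for hyperparameter sweeps.
import hiplot as hip

experiments = [
    {"lr": 0.001, "dropout": 0.1, "layers": 2, "accuracy": 0.87},
    {"lr": 0.01,  "dropout": 0.3, "layers": 4, "accuracy": 0.91},
    {"lr": 0.1,   "dropout": 0.2, "layers": 3, "accuracy": 0.79},
]

exp = hip.Experiment.from_iterable(experiments)
exp.display()                  # renders inline in a Jupyter notebook
# exp.to_html("hiplot.html")   # or export a standalone HTML page
```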


Weekly newsletter that discusses impactful ML research papers, cool tech releases, the money in AI, and real-life implementations.


The Sequence Scope is a summary of the most important research papers, technology releases, and startup news in the AI ecosystem from the last week. This compendium is part of TheSequence newsletter. Data scientists, scholars, and developers from Microsoft Research, Intel Corporation, Linux Foundation AI, Google, Lockheed Martin, Cardiff University, Mellon College of Science, Warsaw University of Technology, Universitat Politècnica de València, and other companies and universities are already subscribed to TheSequence.

📝 Editorial: Deep Learning for Java and .NET Developers

The deep learning space is mostly a Python game. Python’s flexibility and mathematically intuitive syntax have made it a favorite language of AI researchers and technologists, producing a large and diverse group of data science frameworks and technology stacks. So if you like Python, you can find many options for building deep learning solutions. But what about the rest of the programming languages? For decades, Java and .NET have battled for a dominant position in the enterprise space, attracting millions of developers to their respective stacks. The developer market composition seems to indicate that deep learning platforms for Java and .NET …


The new algorithm takes a novel approach to neural architecture search.

Source: http://www.morrisriedel.de/neural-architecture-search

I recently started a new newsletter focused on AI education. TheSequence is a no-BS (meaning no hype, no news, etc.) AI-focused newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers, and concepts. Please give it a try by subscribing below:

Neural architecture search (NAS) is one of the hottest trends in modern deep learning. Conceptually, NAS methods focus on finding a suitable neural network architecture for a given problem and dataset; think of it as making machine learning architecture a machine learning problem in itself. In recent years, there has been an explosion in the number of NAS techniques making inroads into mainstream deep learning frameworks and platforms. However, the first generation of NAS models has encountered plenty of challenges adapting neural networks tested on one domain to another. As a result, the search for new NAS techniques is likely to keep driving innovation in the space. …
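To make the concept concrete, here is a toy sketch of the core NAS loop: sample candidate architectures from a search space, score each one, and keep the best. The search space and scoring function below are stand-ins; a real NAS system would train each candidate (or a weight-sharing proxy) on the target dataset.

```python
import random

# A tiny hypothetical search space over architecture choices.
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "width": [64, 128, 256],
    "activation": ["relu", "gelu", "tanh"],
}

def sample_architecture():
    """Randomly pick one value per architectural dimension."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for training + measuring validation accuracy."""
    random.seed(str(sorted(arch.items())))   # deterministic per architecture
    return random.uniform(0.70, 0.95)

best_arch, best_score = None, float("-inf")
for _ in range(20):                          # fixed search budget
    arch = sample_architecture()
    score = evaluate(arch)
    if score > best_score:
        best_arch, best_score = arch, score

print(best_arch, round(best_score, 3))
```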

About

Jesus Rodriguez

CEO of IntoTheBlock, Chief Scientist at Invector Labs, Guest lecturer at Columbia University, Angel Investor, Author, Speaker.
