Financial markets are complex systems with countless variables that can impact asset prices.
At Predicto, we recognize the importance of utilizing unique datasets and cutting-edge research
techniques to analyze and predict market trends.
Our auto-trading pipeline leverages the latest advancements in data science to collect and
process a wide range of datasets from sources such as Nasdaq Data Link.
At Predicto, we often need to train and experiment with dedicated deep learning models for different stocks and cryptocurrencies,
using similar categories of features but different data for each symbol.
To streamline this process, we developed the concept of Data Model Templates (DMTs).
In this blog post, we’ll explain how DMTs let us quickly and easily train hundreds of models at scale,
and how our low-code API makes it all possible.
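The core idea of a Data Model Template can be sketched as a reusable specification of feature categories and hyperparameters that gets bound to a concrete symbol. The class and field names below are purely illustrative assumptions, not Predicto's actual DMT schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataModelTemplate:
    # Shared, symbol-agnostic definition: which feature categories to
    # pull and how far back/forward the model looks. (Hypothetical fields.)
    feature_categories: tuple
    lookback_days: int
    horizon_days: int

def instantiate(template: DataModelTemplate, symbol: str) -> dict:
    """Bind a template to one symbol, yielding a concrete model config."""
    return {
        "symbol": symbol,
        "features": [f"{symbol}:{cat}" for cat in template.feature_categories],
        "lookback_days": template.lookback_days,
        "horizon_days": template.horizon_days,
    }

# One template fans out into per-symbol training configs.
template = DataModelTemplate(("price", "volume", "news_sentiment"), 365, 30)
configs = [instantiate(template, s) for s in ("AAPL", "TSLA", "BTC-USD")]
```

With a template defined once, training hundreds of models becomes a loop over symbols rather than hundreds of hand-written configurations.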
We are excited to announce the launch of the Predicto AI Android app, now available for download on the Google Play Store.
Our app brings the power of artificial intelligence and options data to your fingertips,
allowing you to stay up-to-date on the latest market forecasts and trades from anywhere.
In an effort to make Predicto more user-friendly, we now surface human-readable explainability information for our deep learning forecasts in the form of insights.
To detect which features "highly influence" a forecast, Predicto generates the same forecast several times, omitting specific features each time and observing how this affects the outcome.
Whenever PredictoAI detects a clear influence of one or more features on the outcome of a forecast, you will now see an insight box in the symbol's card.
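The omission idea described above can be sketched in a few lines. The `feature_influence` helper and the choice of substituting a baseline value for the omitted feature are our own illustrative assumptions, not Predicto's actual implementation:

```python
import numpy as np

def feature_influence(predict, x, baseline=0.0):
    """Re-run a forecast with each feature replaced by a baseline value
    and measure how far the output moves from the full-input forecast."""
    full = predict(x)
    influences = {}
    for i in range(len(x)):
        x_omitted = x.copy()
        x_omitted[i] = baseline  # "omit" feature i
        influences[i] = abs(predict(x_omitted) - full)
    return influences

# Toy model whose output is dominated by feature 0.
predict = lambda x: 3.0 * x[0] + 0.1 * x[1]
scores = feature_influence(predict, np.array([1.0, 1.0]))
```

A feature whose omission barely changes the forecast had little influence; a large shift flags it as highly influential, which is what would trigger an insight box.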
Today, we'll go through an example that showcases the simplicity of datafloat.ai - the No-Code AI Forecasting Platform -
and how easy it is for anyone to create explainable Deep Learning forecasting models with a
few clicks - no deep learning experience required. Datafloat is the platform that powers hundreds of models
used by Predicto and generates hundreds of forecasts daily.
Crypto is here to stay. As you might have noticed, the cryptocurrency market is extremely volatile
and easily swayed by trending news and high-profile tweets.
As a result, we decided to start tracking a small list of cryptocurrencies as a side project.
Today I want to talk about the importance of continuous experimentation and continuous integration,
and about why it is important to invest early in a solid infrastructure that supports those concepts.
Machine Learning and Deep Learning model predictions are used in applications running on your laptop, your desktop,
your phone, and even in your car or in your home. The list keeps growing as more smart devices enter our homes and lives:
smart vacuums, fridges, and anything else you can imagine.
Performance monitoring is critical, as it allows us to pinpoint issues early, then debug and update as needed.
Currently, Predicto tracks around 150 stocks, including the entire Nasdaq-100 list.
For every single one of those stocks we maintain at least 3 different Deep Learning models,
sometimes more based on our experimentation.
This means we maintain and occasionally retrain more than 450 models.
When first signing up for Predicto, you might be overwhelmed with all the information you are
suddenly exposed to. This is normal. You don't have to be a Deep Learning expert to use Predicto
but it is important to know its building blocks.
Going through this guide will help you understand how to use our platform the right way from Day One!
After reading this article you will be able to set up your own auto stock trader based on
Deep Learning forecasting. You will achieve this by retrieving the latest stock forecasts from the Predicto API
and submitting them daily to Alpaca using your own test account. You can be up and running in a couple of hours!
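As a rough sketch of the decision step, here is a hypothetical rule that turns a Predicto-style price forecast into a paper-trading order. The function name, threshold, and order format are assumptions for illustration, not the article's actual code:

```python
def forecast_to_order(symbol, last_price, forecast_price,
                      threshold=0.02, qty=1):
    """Hypothetical decision rule: buy if the forecast is more than
    `threshold` above the last price, sell if more than `threshold`
    below it, otherwise hold (return None)."""
    move = (forecast_price - last_price) / last_price
    if move > threshold:
        return {"symbol": symbol, "qty": qty, "side": "buy"}
    if move < -threshold:
        return {"symbol": symbol, "qty": qty, "side": "sell"}
    return None  # expected move too small to act on

buy = forecast_to_order("AAPL", last_price=100.0, forecast_price=105.0)
hold = forecast_to_order("AAPL", last_price=100.0, forecast_price=101.0)
```

Each resulting order dict could then be submitted to an Alpaca paper account, for example via the alpaca-trade-api library's `submit_order` call, in a small daily cron job.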
In this post we'll talk about time travel, in a sense.
Forecasting is all about trying to predict what the future holds.
But to do that, we need to understand the past first.
Not just a still snapshot of the past, but a continuous period of time.
We'll present how we are able to visualize our Deep Learning models' "vision" over time, kind of like a movie.
In a previous post, we covered how important it is to deal with uncertainty in financial Deep Learning forecasts.
In this post, we'll give a first introduction to how we deal with explainability.
Neural networks have been applied to various tasks including stock price prediction.
Although highly successful, these models are frequently treated as black boxes.
In most cases we know that the performance on the test data is satisfying,
but we do not know why the model came up with a specific output.
In this post, we want to focus on news momentum and narratives: how narratives form in preparation for a big move.
We will see, with examples, how those narratives are sustained until they fade out or stabilize at a new equilibrium.
This idea is described in the book Narrative Economics by Robert Shiller,
where he lays a foundation for narrative economics and shows how stories go viral and can drive major economic events.
In this post, we’ll share some of our thoughts on how to study market volatility and news.
We’ll give you some pointers on how we at Predic.to deal with volatility and uncertainty in order to understand
the risk involved at any given time. This is a continuation of our series of posts about Investing + Deep Learning.
In the last few years, Hedge Funds and Financial Institutions have been investing in building strong
Data Science and ML/DL teams. It’s no secret in those circles that there is value in using Big and Deep Data
to get valuable insights about investment decisions, short term or long term.