The Trump switch

Can predictive analytics help markets profit from the President’s tweets?


Even before he became president, some of Donald Trump’s social media posts were enough to give Wall Street the jitters. After he fired Twitter missiles at Boeing and Lockheed Martin, denting their stock prices, traders started looking at developing a ‘Trump Switch’ algorithm that would give them an advantage over the rest of the market. The issue has not gone away. Since his inauguration, President Trump has continued to use his personal Twitter account, and that has contributed to something the markets don’t like: uncertainty.

Is it possible to build a ‘Trump switch’?


From a technical perspective, there are two questions to ask about such an algorithm. First, is it possible? That one is easy to answer: yes, of course it is. According to the Los Angeles Times, many of the big traders already have one in place. That prompts the follow-up: how effective are these algorithms? That is the 64-million-dollar question.
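
What might such a switch look like in practice? Below is a minimal sketch in Python. Everything in it is an assumption made for illustration: the watchlist mapping company names to tickers, the crude keyword list standing in for sentiment analysis, and the one-hit sell threshold. A production system would use live data feeds and proper natural language processing, and nothing here reflects any firm’s actual model.

```python
# Minimal sketch of a 'Trump switch': scan an incoming tweet for
# watched company names, apply a crude negative-keyword score, and
# emit a trading signal. Watchlist, keywords, and threshold are
# illustrative assumptions, not any real trading system.

TICKERS = {"boeing": "BA", "lockheed martin": "LMT"}  # hypothetical watchlist
NEGATIVE_CUES = {"cancel", "out of control", "ridiculous", "too expensive"}

def trump_switch(tweet: str) -> list:
    """Return (ticker, action) signals for companies mentioned in the tweet."""
    text = tweet.lower()
    signals = []
    for name, ticker in TICKERS.items():
        if name in text:
            hits = sum(cue in text for cue in NEGATIVE_CUES)
            # One negative cue is enough to flag a sell in this toy model.
            signals.append((ticker, "SELL" if hits else "HOLD"))
    return signals

print(trump_switch("Boeing is building a brand new 747 Air Force One, "
                   "but costs are out of control. Cancel order!"))
# -> [('BA', 'SELL')]
```

Real versions are reportedly far more sophisticated, but the shape is the same: listen, score, act faster than a human can.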

To get an answer to this, we talked to Fabrizio Fantini, CEO of Evo Pricing.

The 3 things you need for a successful algorithm


“To put it simply,” Fantini says, “a successful algorithm depends on 3 main things: having a clear idea of what you want it to do; asking the right questions; and collating as much relevant data as possible. For example, our main algorithm, developed over the last 5 years, measures more than 1 million variables.

“The most famous algorithm in the world, Google’s search engine, took over a decade to reach its present form and has hundreds of the world’s brightest minds working on it. So by the time Wall Street develops a reliable working model, Trump could well be out of office.”

Evolution on steroids

So does that mean the ‘Trump Algorithms’ already in place don’t work properly?

“No. To ask that question is to misunderstand how machine learning works. As an algorithm takes in more data and tests its predictions in the real world, feedback enables it to improve incrementally. It’s an evolutionary process but, because of the incredible speeds that modern computers can achieve, it is a very fast one. Evolution on steroids, if you will. Darwin would have loved Big Data.

“I like to think of a new algorithm as a very talented intern who works their way up to the position of CEO in a matter of months, weeks, even days. They improve that quickly. And even while they are learning, they are still adding value to the company.”
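
That feedback loop is easy to picture in code. The sketch below, with invented data and a made-up learning rate, shows an online learner nudging its weights after every observation: the kind of incremental improvement Fantini describes.

```python
# Minimal sketch of the feedback loop: an online learner nudges its
# weights after every observation, improving incrementally as data
# arrives. Data, features, and learning rate are invented for illustration.

def predict(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

def update(w, x, y, lr=0.1):
    """One stochastic-gradient step on squared prediction error."""
    error = predict(w, x) - y
    return [wi - lr * error * xi for wi, xi in zip(w, x)]

w = [0.0, 0.0]                                # the 'intern' starts knowing nothing
stream = [([1.0, 2.0], 5.0), ([1.0, 3.0], 7.0), ([1.0, 1.0], 3.0)] * 50
for x, y in stream:                           # every observation is feedback
    w = update(w, x, y)

print([round(wi, 2) for wi in w])             # converges towards [1.0, 2.0]
```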

Machine learning with a human touch

“Having said that, the big question is, ‘How much human oversight of algorithms should there be?’ McKinsey predicted in 2015 that automated trading would become a mainstay of Wall Street firms, but we at Evo Pricing have always stressed ‘machine learning with a human touch’.

“In May 2010 there was a ‘flash crash’ in which automated high-frequency trading played a part, and the Dow Jones fell nearly 1,000 points in a matter of minutes. Fortunately, there was a failsafe that kicked in and the market recovered, but this does highlight the possible dangers of a ‘hands-off’ approach to automated systems. On the other hand, what we have to remember is that, back in 2010, those kinds of algorithms were still in their infancy. We have come a long way since then.
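
The failsafe in question is a circuit breaker: trading halts automatically when prices fall too far, too fast. Here is a minimal sketch of the idea; the 5% threshold and 60-second window are illustrative choices, not the actual rules of any exchange.

```python
# Minimal sketch of a circuit breaker: halt trading when the price
# falls more than a set fraction within a rolling time window.
# The 5% threshold and 60-second window are illustrative, not the
# actual rules of any exchange.
from collections import deque

class CircuitBreaker:
    def __init__(self, max_drop=0.05, window_s=60):
        self.max_drop = max_drop
        self.window_s = window_s
        self.ticks = deque()          # (timestamp, price) pairs

    def check(self, ts, price):
        """Record a tick; return True if trading should halt."""
        self.ticks.append((ts, price))
        while ts - self.ticks[0][0] > self.window_s:
            self.ticks.popleft()      # drop ticks outside the window
        high = max(p for _, p in self.ticks)
        return (high - price) / high >= self.max_drop

breaker = CircuitBreaker()
print(breaker.check(0, 10000.0))      # False: no drop yet
print(breaker.check(30, 9400.0))      # True: 6% drop within 60 seconds
```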

Google’s nowcasting algorithm ‘catches flu’

To many people, automated high-frequency trading is an abstract thing that doesn’t matter much in their lives. Are there other areas where this question of machine versus human oversight is more relevant to the man and woman on the street?

“Definitely. The area of public health is a big one. It’s vital for health authorities and hospitals to have accurate models tracking epidemics so they can allocate the right resources to the right place at the right time. We’re talking here about saving people’s lives.

“Between 2008 and 2012 the CDC (Centers for Disease Control and Prevention) could call on Google’s Flu Trends service to help with this important function. Unfortunately, in the winter of 2012/13, the forecasts were so wrong as to be useless.”

How did this happen?

“Well, generally speaking, the promise of Big Data is that, when you have enough of it, you don’t need a model or theory of what causes what, because the correlations alone are supposed to be enough to predict what will happen next.

“However, if you use the wrong metrics or methodology, your results can be badly skewed. It’s like a ship leaving port with the wrong compass setting: over the first few miles you are only a few yards off course but, the further you travel, the bigger the error.”
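
The compass analogy translates directly into numbers: a fixed one-degree heading error pushes a ship off track in proportion to the distance travelled, roughly distance times sin(1°). A quick illustration:

```python
# A constant one-degree compass error compounds with distance:
# cross-track error ≈ distance travelled × sin(heading error).
import math

ERROR_DEG = 1.0
for miles in (1, 10, 100, 1000):
    off_track = miles * math.sin(math.radians(ERROR_DEG))
    print(f"after {miles:>4} miles: {off_track:6.1f} miles off course")

# after    1 miles:    0.0 miles off course
# after   10 miles:    0.2 miles off course
# after  100 miles:    1.7 miles off course
# after 1000 miles:   17.5 miles off course
```

The same compounding applies to a model built on slightly wrong assumptions: negligible at first, enormous at scale.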

Google Flu Trends was a web service launched in 2008 and aimed at ‘nowcasting’ flu outbreaks, that is, estimating current flu levels in near real time rather than forecasting far ahead. Google analyzed 50 million search queries against historical (pre-2007) data about flu diffusion and settled on the 45 flu-related search keywords that best matched the historical record. The data-driven algorithm had some early successes, flagging flu outbreaks 7-10 days before they were reported by the CDC.
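
Under the hood, the published Flu Trends model was a simple regression: the log-odds of flu-related doctor visits fitted against the log-odds of flu-related search share. The sketch below compresses the 45 query terms into a single ‘query share’ series purely to illustrate the idea; all the numbers are invented.

```python
# Minimal sketch of a GFT-style nowcast: fit logit(doctor visits)
# as a linear function of logit(query share) on historical data,
# then plug in today's query share. All numbers are invented.
import math

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(z):
    return 1 / (1 + math.exp(-z))

# Hypothetical history: (flu query share, flu-like doctor-visit share)
history = [(0.002, 0.010), (0.004, 0.022), (0.008, 0.045), (0.012, 0.070)]
xs = [logit(q) for q, _ in history]
ys = [logit(p) for _, p in history]

# Ordinary least squares for y = a + b*x
n = len(xs)
b = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) \
    / (n * sum(x * x for x in xs) - sum(xs) ** 2)
a = (sum(ys) - b * sum(xs)) / n

# 'Nowcast' this week's flu level from the current query share
print(f"nowcast: {inv_logit(a + b * logit(0.010)):.1%} of visits flu-like")
```

Note what the model lacks: any theory of why people search, which, as Fantini explains below, is exactly where it went wrong.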

“Unfortunately, in 2012/13, Google overestimated peak flu levels by a large margin. The CDC reported 6% of the population with influenza-like illness, while Google had predicted over 10%. In a population of more than 300 million, that gap of 4 percentage points represents over 12 million people.”

Google and Big Data hubris

How did Google get it so wrong?

“We identified 4 possible areas where we thought their study was flawed:

  • The news in 2012/2013 was full of stories about the flu, inducing many healthy people to search for ‘flu’ or related terms.
  • People making flu-related searches may have known very little about how to diagnose flu, so they may have thought they had flu but actually had something else.
  • Google’s algorithm assumed that ‘flu searches’ = ‘flu patients’.
  • Google did not ask the more fundamental question of why people search for ‘flu’. It had no theory of human behavior underlying the algorithm.

“There is a concept in analytics called ‘Big Data Hubris’: the idea that just having exabytes or petabytes of data is enough to produce meaningful results. As the song goes, ‘It ain’t what you do, it’s the way that you do it’. The bottom line is that there are always ‘small’ method issues in Big Data (or in any data) and these issues matter for the type of insight we can generate from the data.

“Again, the human touch. You need humans to decide the correct methodology and humans to evaluate the data.”

Does your business need the kind of data-driven insights that are already helping companies around the world increase profitability by hundreds of millions of dollars?

To contact us or to find out more about ‘machine learning with a human touch’, go to https://evopricing.com/questions


About the writer


Martin Luxton is a writer and content strategist who specializes in explaining how technology affects business and everyday life.

Big Data and Predictive Analytics are here to stay, and we have only just begun tapping into their enormous potential.