Machine Learning in Economics and Finance
Abstract
The term Machine Learning (ML) was introduced by Arthur Samuel in 1959, while he was working for IBM, mainly to describe the pattern-recognition tasks that delivered the "learning" component of the then-pioneering Artificial Intelligence (AI) systems. The concept of Artificial Intelligence had been investigated theoretically and tentatively since the 1930s, though it was studied systematically only after the famous Dartmouth Workshop of 1956 (Kline 2011). There, among other things, John McCarthy, at that time at Dartmouth College, proposed the term Artificial Intelligence over Cybernetics. In these early years, ML systems were considered only as components of a wider AI system. Since then, the range of practical applications of Machine Learning has grown very wide, reaching far beyond the narrow limits defined by the AI framework. Today, there are more autonomous ML systems than there are ML components in AI architectures. The terms AI and ML are often loosely interchanged for many reasons (trendiness, funding, or even ignorance), creating confusion for non-experts. A general rule of thumb is that if a system acts without intervention, it is probably AI; if it classifies or forecasts through learning, it is ML. For example, self-driving cars use AI systems, while the automatic vision component that identifies an imminent accident is ML.

The learning process itself was formalized in 1997 by Professor Tom M. Mitchell of Carnegie Mellon University, in his famous definition: "A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P if its performance at tasks in T, as measured by P, improves with experience E" (Mitchell 1997). During this scientific evolution, ML followed the fate of AI and experienced long periods of low interest and low funding, often referred to as "AI winters". Nonetheless, the present period is quite different, as the timing of recent technological advances and the inception of new ML structures coincides ideally. Affordable parallel computing has brought complex and demanding Deep Learning (DL) architectures, such as Recurrent Neural Networks and Convolutional Neural Networks, to common applications. Moreover, algorithms such as Support Vector Machines and Random Forests, and techniques such as kernelization, bagging, and boosting, have allowed for the first time the application of ML to relatively small datasets. In addition, the