Advice

Dr Eden provides expert advice on strategic decisions that require both scientifically informed academic research and hands-on experience. His clients have included policy makers, hedge funds, venture capital investors, corporations, start-ups, and charities (details).

Advice is offered in three areas:

See also: Professional experience

The difference between theory and practice is greater in practice than in theory

— Origin unknown

Experience gained with —

  • Tracking, measuring & forecasting machine intelligence
  • Training in Effective Forecasting (Good Judgement Project)

Expertise offered in —

  • Future of artificial intelligence
  • Future of algorithmic trading
  • Cryptocurrency & Smart Contract evolution

Questions

Which investment algorithms will hedge funds use in the next decade? How will AI affect computer trading, now that almost all stocks are traded by algorithms? How will driverless cars affect transportation? Will AI displace 40% of jobs within 20 years? When will solar energy be cheaper than fossil fuels, if at all?

Making reliable technological forecasts seems to require a crystal ball.

Organizations spend staggering amounts of time and money trying to predict the future, but no time or money measuring their accuracy or improving their ability to do it

Prophecy was given to the fools…

Who expected the World Wide Web and mobile phones to take the 1990s by storm? Or BigDog, cryptocurrencies, and computer trading overtaking stock exchanges less than a decade later? Disruptive technology emerges at breakneck speed as paradigm shifts accelerate. As for experts: when pushed to make accurate forecasts, they perform worse on average than simply expecting the future to repeat the past, according to the Good Judgement Project.

Robert Seymour, Locomotion: Walking by Steam, Riding by Steam, Flying by Steam (ca. 1830)

... but technological forecasting is a science

Forecasting has been disrupted during the last decade or so by four discoveries:

  • Big Data
    Recording very large, statistically significant numbers of data points
  • Machine learning
    Artificial intelligence systems with human-level capabilities at superhuman speed
  • Universal Laws
    Stable trends across continents & centuries
  • Superforecasting
    Experiments with accurate forecasting (the Good Judgement Project)

Universal trends

Comprehensive analysis of millions of data points collected from research articles, government reports, and market research spanning decades and continents shows that technology progresses relatively uniformly. Wars, recessions, and natural calamities have short-term effects, and the trends quickly stabilize, as the graphs below (and many others) reveal. That, among other factors, allows reliable inference.
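A minimal sketch of that kind of inference, using synthetic data in place of any real dataset (the numbers below are illustrative assumptions, not measurements): a stable exponential trend is a straight line in log space, so fitting and extrapolating it takes only a few lines.

```python
import numpy as np

# Synthetic series (illustrative only): a cost that halves roughly
# every four years, observed with multiplicative noise.
years = np.arange(2000, 2021)
true_cost = 100 * 0.5 ** ((years - 2000) / 4.0)
rng = np.random.default_rng(0)
observed = true_cost * rng.lognormal(mean=0.0, sigma=0.1, size=years.size)

# A stable exponential trend is a straight line in log space, so an
# ordinary least-squares fit on log(cost) captures it.
slope, intercept = np.polyfit(years, np.log(observed), deg=1)

# Extrapolate the fitted trend a decade ahead.
future = np.arange(2021, 2031)
forecast = np.exp(intercept + slope * future)
for year, cost in zip(future, forecast):
    print(f"{year}: ~{cost:.2f} (arbitrary cost units)")
```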

The further backward you look, the further forward you can see

— Winston Churchill

Forecasting with AI

After beating the experts in predicting the Oscars, college football, and the Stanley Cup, Unanimous AI predicted the results of the Kentucky Derby at odds of 540:1. It used swarm intelligence, which combines the predictions of individual experts who were each only 23% accurate, to reach impressive accuracy: 100% in this example. In many domains (such as energy and stock prices), predictions with high reliability (72% and above) can be made using 'big data', data mining, and predictive modelling techniques, ranging from simple statistical and regression methods to more sophisticated machine learning inference algorithms.
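Unanimous AI's actual algorithm is not described here; the simulation below only sketches the underlying principle that aggregating many weak, partly independent predictions can yield a far more accurate collective one (every parameter is an illustrative assumption, not Unanimous AI's figures).

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative assumptions: 10 possible outcomes (say, horses in a race),
# 100 experts, each naming the true winner with probability 0.25 and
# otherwise picking uniformly at random.
n_outcomes, n_experts, n_races = 10, 100, 1000
p_correct = 0.25

wins = 0
for _ in range(n_races):
    truth = rng.integers(n_outcomes)
    knows = rng.random(n_experts) < p_correct
    picks = np.where(knows, truth, rng.integers(n_outcomes, size=n_experts))
    # Aggregate by plurality vote: the most-picked outcome is the forecast.
    if np.bincount(picks, minlength=n_outcomes).argmax() == truth:
        wins += 1

print(f"individual accuracy: {p_correct:.0%}")
print(f"aggregated accuracy: {wins / n_races:.0%}")
```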

Hyped ≠ Worthless

Even productive technologies get overhyped

Hype Cycle: Emerging Technologies (Gartner 2016)
Space travel, submarines, smartphones, and artificial intelligence were foreseen centuries ahead. Once a solid proof of concept is demonstrated, early expectations are inflated by fantasy literature and pop images for years, even decades, before the technology matures.

Compare, for example, the state of Enterprise 3D Printing in 2016 according to Gartner's Hype Cycle (above), which is approaching the Plateau of Productivity, with Machine Learning, which is at the Peak of Inflated Expectations. Both technologies are revolutionary. Unrealistic expectations do not prove that either technology is worthless.

The future is already here — it’s just not evenly distributed

— Attributed to William Gibson

Experience gained with —

  • Learning ‘hard’ (imprecisely defined) classes
    E.g. image/speech recognition
  • Implementing machine learning systems
  • Recommender systems

Expertise offered in —

  • Computer trading (algotrading)
  • Choosing machine learning algorithms
    Selecting the most suitable [un]supervised learning algorithm
  • Symbolic and nonsymbolic learning
    Symbolic: reasoning with 1st-order & temporal logic
    Non-symbolic: nearest-neighbour, neural nets, genetic algorithms…

Machine learning (a field of expertise since 1991) is replacing humans in investment planning, fraud detection, personal assistants, recommender systems, speech and image recognition, and many other tasks. Early classifiers such as decision trees and Bayesian nets were superseded by deep-learning neural nets: multi-layered networks that gradually build an 'ontological hierarchy', enabling much more complex reasoning.

Genetic algorithms, instance-based learning, or neural nets? Programmers today can use off-the-shelf machine learning libraries offering many algorithms, each suited to different tasks:

A comparison of supervised learning algorithms (scikit-learn.com)
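A sketch of such a comparison, assuming scikit-learn is installed and using a standard toy dataset in place of real data (the hyperparameters are illustrative defaults, not recommendations):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Three of the families named above: instance-based (nearest-neighbour),
# symbolic (decision tree), and non-symbolic (a small neural net).
classifiers = {
    "nearest neighbours": KNeighborsClassifier(n_neighbors=5),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "neural net (MLP)": MLPClassifier(max_iter=2000, random_state=0),
}

# Five-fold cross-validation gives a like-for-like accuracy estimate.
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name:20s} mean accuracy: {scores.mean():.2f}")
```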

Our machine learning research started in 1990 by asking how humans classify everyday objects quickly and without any formal criteria, and whether a learning algorithm can perform as well as a human. Since then, the variety of available neural network types has exploded.

Neural networks 'cheat sheet' (Fjodor van Veen, Asimov Institute)

Experience gained with —

  • Object-oriented programming: planning, staff training, and migrating legacy code (re-engineering)
  • Design & architecture of software platforms, infrastructure & applications: planning, evolving, and implementing

Expertise offered in —

  • Software design & architecture
  • Re-engineering & evolution of legacy software
  • Software visualization & reverse engineering
  • Software quality
  • Staff training for technology migration

There are two ways of constructing a software design. One way is to make it so simple that there are obviously no deficiencies. The other way is to make it so complicated that there are no obvious deficiencies.

— C.A.R. Hoare (Turing Award Lecture)

Failure is common

It is estimated that up to 65% of large software projects fail disastrously. In 2013 the BBC wrote off £100m on an overambitious IT project. In 2016 an incorrectly configured software update destroyed the $286m Japanese satellite Hitomi, and the Scottish Police Authority scrapped a £60m IT project because of “insurmountable flaws”. Also in 2016, the NHS revealed that its call-handling IT project was four years late and £40m over budget.

Cost of write-offs (£m)
There I fixed it

Patching only goes so far...

Failures commonly occur not for lack of skill but because technology changes at breakneck speed: hardware improves exponentially; software libraries, environments, and even programming languages are replaced before reaching maturity; and markets dictate unrealistic deadlines. In this climate design and reengineering seem a luxury, and there is only time for ‘firefighting’. Before you know it, your software has become a hopeless mess.

Research-informed solutions

Decades of studying government, health, open-source, and financial projects have taught us that successful projects dedicate effort to reducing complexity. How? Unfortunately, there is no silver bullet (yet); each project has its own needs. A stable architecture can emerge from the research-informed application of cutting-edge software tools and modelling techniques (not only UML). Bottlenecks can be diagnosed using software visualization and analysis techniques for reengineering and evolution. Flexibility can be built in using techniques such as design patterns, as the sketch below illustrates. And critical properties such as security can be enforced using industrial formal methods. The knowledge required comes from experimenting with research-informed software engineering tools and practices and keeping up with the scientific literature. This is where we could help.
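To make the design-patterns point concrete, here is a minimal sketch of the Strategy pattern in Python (the pricing example is hypothetical, invented purely for illustration): behaviour is passed in as a plug-in value, so new policies can be added without touching the code that uses them.

```python
from typing import Callable

# Strategy pattern: the algorithm is a plug-in value, so new
# policies can be added without modifying the calling code.
PricingStrategy = Callable[[float], float]

def flat_rate(amount: float) -> float:
    return amount

def bulk_discount(amount: float) -> float:
    # Hypothetical policy: 10% off orders over 1000.
    return amount * 0.9 if amount > 1000 else amount

def checkout(amount: float, strategy: PricingStrategy) -> float:
    # checkout() never needs editing when a new pricing policy appears.
    return strategy(amount)

print(checkout(1500.0, flat_rate))      # 1500.0
print(checkout(1500.0, bulk_discount))  # 1350.0
```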

Codechart of the Check-Point Pattern
Codechart of the Enterprise JavaBeans framework

He who loves practice without theory is like the sailor who boards ship without a rudder and compass and never knows where he may cast.

— Leonardo Da Vinci

Reports are useless

There is little point in writing lengthy documents. Nobody reads them — usually they’re obsolete anyway.

But some documentation must record, clearly and unambiguously, the following: What exactly can the user expect from the application you’re developing? Are these expectations realistic? Which changes in the requirements can be expected within six months of release? Within two or five years? Which strategic design decisions will ensure a smooth transition to the next version? How should programmers see and understand these decisions? How can we ensure that future changes will not violate these decisions accidentally?

Let's Communicate. Precisely.

Instead of lengthy descriptions we recommend using one or more experimental tools and methodologies that communicate the answers explicitly, visually, and precisely. They may not be easy to adopt, but small investments can yield very large returns. Alternative forms of specification can describe software function and structure clearly and precisely, and automated verification can effectively prevent the majority of bugs.
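One such approach, sketched here under the assumption that the Hypothesis property-based testing library is acceptable: the specification (“sorting yields an ordered result with the same elements”) is stated executably and checked automatically against generated inputs.

```python
from collections import Counter

from hypothesis import given
import hypothesis.strategies as st

def my_sort(xs: list[int]) -> list[int]:
    # Stand-in for the code under specification.
    return sorted(xs)

# The property below *is* the specification: precise, executable, and
# unable to drift silently out of date the way prose documentation can.
@given(st.lists(st.integers()))
def test_sort_spec(xs):
    out = my_sort(xs)
    assert all(a <= b for a, b in zip(out, out[1:]))  # ordered
    assert Counter(out) == Counter(xs)                # same elements

test_sort_spec()  # Hypothesis checks the property on many generated lists
```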

A variation on the tree swing cartoon for the software lifecycle in practice (Pre-1970; origin unknown)

The design of computing systems can only properly succeed if it is well grounded in theory, and the important concepts in a theory can only emerge through protracted exposure to application.

— Robin Milner (1986)