David Hunter Tow, Director of The Future Planet Research Centre, suggests that the major value of the proposed European flagship FuturICT project lies not just in its capacity as a global simulator and forecasting tool, but in its role as a stepping stone to a new class of prediction models capable of linking with the future Intelligent Web.
FuturICT is the project the world eventually has to have, and its proposed timing is impeccable.
The planet and its population are in dire straits, with the latest IPCC draft report predicting ongoing temperature rises that will trigger an increasing frequency of extreme climate events - droughts, floods, hurricanes and the like - into the foreseeable future.
Such physical events will have numerous disruptive impacts on all ecosystems and societies, and in order to survive we will have to get much smarter at forecasting their effects and implementing adaptive survival strategies.
Planning for our future has always been an intrinsic part of human nature - at the personal, community, state and, more recently, global level. And we've had some outstanding successes, such as forecasting the weather, technological progress, energy futures and sometimes even stock markets - although these last have tended to be more problematic.
But despite a range of mathematical improvements in our foresight and modelling methods, developed in tandem with a broader understanding of scientific and social principles, our capacity to forecast has been sadly lacking when outcomes don't follow obvious trends and predictable scenarios, or when the signals of emerging change are faint.
These are the so-called Black Swan events that seem to come out of nowhere. They can be immensely disruptive, and we need to get a much better fix on them to ensure our survival. In recent times these elusive events have included the GFC, the Occupy Wall Street movement, the Arab Spring, the Fukushima meltdown and the Eurozone crisis - all predictable in hindsight, but not so easy to see with foresight.
Now, with the survival of our society on a knife edge, civilisation needs tools that are a lot more robust and models that can deliver much more dependable outcomes.
Enter the FuturICT model.
It represents the next phase in the evolution of models powerful enough not just to deliver probable predictions, but to accurately prioritise the resources needed to help us survive the onslaught of massive change.
FuturICT is a prototype of the next-generation forecasting tool. It is massive - on a different scale from previous models - and this is part of the problem.
But the main difference is that this is a project that has to deliver. No more luxury of extended research timelines; no more egos on parade at glamorous six-star conference locations; no more addressing only a narrow elite of academic peers while throwing a few crumbs to social media. This is crunch time for FuturICT and for society.
Whatever the world's current knowledge base - and it is massive, covering advanced mathematical and AI techniques; new social, physical, materials, engineering and computational sciences; and a better understanding of emerging and emergent fields such as network, complexity, evolutionary and social theory, all built on exploding data sets and increasingly complex algorithms - it now has to be marshalled, corralled and delivered on its initial promise.
And FuturICT is the vehicle chosen to do it - a major advance in the science of modelling the future, at a cost of a billion euros over ten years, harnessing the best scientific minds from hundreds of research institutions across Europe; and that is just the beginning.
Society is now asking for the payoff from its massive investment - billions and even trillions of euros - not just in the FuturICT project, but in the research funding of multiple arcane disciplines over the last twenty years.
And make no mistake - it is not the universities and research councils that have provided this largesse; it is the humble taxpayer. And now the average citizen wants to know how their money can be applied to save their children's future. There are no ifs and buts here - they expect a payoff now, in their time of desperate need.
Numerous PR releases about this futuristic endeavour have already flooded both the old and new media. In essence it will involve developing the FuturICT Knowledge Accelerator and Crisis-Relief System, including:
Crisis Observatories - scanning for emerging problems;
the Planetary Nervous System - aggregating data streams from sensor systems and monitoring the pulse of civilisation around the globe;
the Living Earth Simulator - the heart and soul of the system, modelling the planet's social, physical, biological and environmental phenomena in search of insights into its future.
These components will apply state-of-the-art mathematical, statistical, AI and logical-inference techniques to mine Big Data and discover patterns of significant interest.
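To make the idea of pattern mining over live data streams a little more concrete, here is a minimal sketch of one plausible building block - a rolling z-score monitor that flags sudden deviations in a single sensor stream. The function name, window size and threshold are illustrative assumptions only, not anything specified in the FuturICT proposal.

```python
import numpy as np

def rolling_zscore_alerts(series, window=50, threshold=3.0):
    """Flag points that deviate sharply from the recent rolling mean.

    A toy stand-in for the kind of anomaly scan a Crisis Observatory
    might run over one of thousands of incoming data streams.
    """
    alerts = []
    for i in range(window, len(series)):
        history = series[i - window:i]
        mu, sigma = history.mean(), history.std()
        if sigma == 0:
            continue                      # flat history, no meaningful z-score
        z = (series[i] - mu) / sigma
        if abs(z) > threshold:
            alerts.append((i, float(series[i]), float(z)))
    return alerts

# Usage: a synthetic "sensor" stream with one injected shock.
rng = np.random.default_rng(42)
stream = rng.normal(0.0, 1.0, 1000)
stream[700] += 8.0                        # the faint signal of emerging change
print(rolling_zscore_alerts(stream))      # reports the spike near index 700, plus any chance 3-sigma crossings
```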
But a lot of this is uncharted territory. This is a giant leap from models aimed at solving specific problems with hundreds or thousands of variables - next week's weather, transport flows or even complex ecosystem interactions - to one that can be applied to a huge range of environmental and societal problems, encompassing tens of thousands, if not millions, of interweaving information channels, parameters and variables.
But modelling at this unprecedented level of real-world complexity is just the beginning. Managing the thousands of data streams and the research outputs from hundreds of institutions is the really difficult part. It won't be a neat jigsaw, but a constantly shifting, multi-dimensional network of knowledge links, feedback loops, algorithmic contortions and exponentially exploding potentials.
For a start, the constraints on the model's variables will have to be severe if it is to be managed at all. At least 95% will have to be pruned or drastically culled, and the techniques for doing this at the correct prioritisation levels have barely begun to be explored, let alone formalised for a model of this scale.
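As a rough sketch of what such pruning could look like in practice - using principal component analysis purely as an illustrative stand-in, since the proposal does not commit to any particular technique - the raw variables can be projected onto the small fraction of directions that carry most of the variance:

```python
import numpy as np

def prune_by_variance(X, keep_fraction=0.05):
    """Project data onto the few principal directions that carry most of
    the variance, discarding the rest of the raw variables.

    X : (n_samples, n_variables) array of observations.
    keep_fraction : rough share of components to retain (e.g. 5%).
    """
    Xc = X - X.mean(axis=0)                     # centre each variable
    cov = np.cov(Xc, rowvar=False)              # variable-by-variable covariance
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]           # strongest directions first
    k = max(1, int(keep_fraction * X.shape[1]))
    components = eigvecs[:, order[:k]]
    return Xc @ components                      # reduced representation

# Usage: 10,000 observations of 1,000 correlated variables, cut to 50.
rng = np.random.default_rng(0)
latent = rng.normal(size=(10_000, 20))
X = latent @ rng.normal(size=(20, 1_000)) + 0.1 * rng.normal(size=(10_000, 1_000))
print(prune_by_variance(X, keep_fraction=0.05).shape)   # (10000, 50)
```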
The Centre's own research suggests that evolutionary, network, complexity and decision theory will play key roles, but the challenges are myriad, including:
The need to refine and integrate into the model a rigorous theory of social psychology. This vital field is still in its infancy.
The challenge of developing a radically new economic model after the disaster of the GFC - a task that appears to have been largely swept under the carpet by most economists as too hard, with another set of regulatory controls substituted instead.
The challenge of smoothly combining the system's multiple disparate models, and of managing the interdependent interfaces, feedback loops and input-output relationships of an extremely complex and non-deterministic nature.
The problem of updating data inputs and algorithms in real or future time - extracting and extrapolating good models from past data alone is not enough, and could in fact be disastrously misleading.
The core problem of using old mathematics in 21st-century models. Entirely new approaches may be required, such as replacing partial differential equations with a cellular automata approach, as Stephen Wolfram has argued, or using coarse-grained rather than fine-grained forecasting to escape the problem of infinite regress (a minimal illustration follows this list).
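The sketch below shows what the cellular-automata alternative looks like in its simplest form - a generic one-dimensional automaton in the style Wolfram describes, offered only as an illustration of rule-based, coarse-grained state evolution rather than as anything FuturICT has specified:

```python
import numpy as np

def step(cells, rule=110):
    """Advance a 1-D binary cellular automaton one tick.

    Each cell's next state depends only on itself and its two
    neighbours, looked up in the 8-bit rule table (Wolfram numbering).
    """
    rule_bits = [(rule >> i) & 1 for i in range(8)]
    left = np.roll(cells, 1)
    right = np.roll(cells, -1)
    neighbourhood = 4 * left + 2 * cells + right   # value 0..7 per cell
    return np.array([rule_bits[n] for n in neighbourhood], dtype=np.uint8)

# Usage: evolve 80 coarse-grained cells for 40 ticks from a single seed.
cells = np.zeros(80, dtype=np.uint8)
cells[40] = 1
for _ in range(40):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells, rule=110)
```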
And then there's the human element. Each group of researchers will lobby to maximise the application of its own insights and expertise, to gain maximum kudos for itself and its institution. This is human nature, but if allowed to proliferate outside a disciplined framework it could rapidly spiral out of control.
The data management and reporting system will also have to use a variety of standard tools to rigorously link results from multiple sources, particularly as funding will depend on the quality and transparency of the overall program, not on individual or institutional progress.
Then there's the next phase - providing advice to policy makers based on the project's outcomes. As a critical potential EU Flagship project, there is a lot riding on the public verification of its results, particularly in the Eurozone's cash-strapped times.
At the same time, FuturICT will have to be integrated with the myriad other models that will be running at full throttle over the next ten years in competing and cooperating projects, particularly in the US and Asia. No one will expect FuturICT to operate as an island isolated in a sea of scientific progress.
And finally, it will have to be integrated with the full power of the Intelligent Web - Mark 4.0 - because this will be the inevitable outcome.
Evolution towards a full Web-Human partnership will be the major paradigm shift of the 21st century, as policy makers and scientists alike find the causal relationships far too complex to comprehend.
The process of scientific research is expected to change more fundamentally over the next thirty years than in the previous three hundred, moving towards a global commons approach - a decentralised, open marketplace of ideas driven by a combination of the Web's computational intelligence and human intelligence.
The epoch during which individual humans are able to conceptualise or understand increasingly complex phenomena is coming to an end. There will be just too many interacting variables for the human brain to get an intuitive understanding of the possible permutations and interactions.
With Big Data now a fact of life in all disciplines, combined with evolutionary discovery programs such as Eureqa, 95% of traditional science will be handled autonomously by the Web.
Eureqa is already being applied as an ideal tool for disentangling and optimising systems that are too complicated or time-consuming to analyse by traditional methods - for example aircraft wing design, network topology, financial forecasting and particle physics simulations.
But its significance goes well beyond this. It is being applied to discover new laws and relationships that are increasingly beyond the cognitive capability of its human counterparts - beyond the limits of current human knowledge.
Eureqa-type software could, and will, therefore be applied in the future across all complex scientific disciplines - economics, biology, the social sciences and climate science - and perhaps even to solving the universal Theory of Everything. The combination of descendants of the Web and of Eureqa could perhaps achieve this within the next several decades.
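Eureqa itself is proprietary and uses genetic programming over free-form expressions; the toy below only illustrates the underlying idea of symbolic regression - searching a space of candidate formulas for one that fits observed data. The candidate forms and random-search strategy are assumptions made for brevity, not Eureqa's actual method.

```python
import numpy as np

def symbolic_regression_sketch(x, y, n_candidates=20_000, seed=0):
    """Randomly search a tiny space of formulas a*f(x) + b for the one
    that best explains the data - a toy version of what symbolic
    regression tools such as Eureqa do far more cleverly.
    """
    rng = np.random.default_rng(seed)
    basis = {"x": lambda v: v, "x^2": lambda v: v**2,
             "sin(x)": np.sin, "cos(x)": np.cos, "exp(x)": np.exp}
    best = (np.inf, None)
    for _ in range(n_candidates):
        name, f = list(basis.items())[rng.integers(len(basis))]
        a, b = rng.uniform(-3, 3, size=2)
        error = np.mean((a * f(x) + b - y) ** 2)
        if error < best[0]:
            best = (error, f"{a:.2f}*{name} + {b:.2f}")
    return best

# Usage: recover an approximate law from noisy observations of 2*sin(x) + 1.
x = np.linspace(0, 6, 200)
y = 2.0 * np.sin(x) + 1.0 + np.random.default_rng(1).normal(0, 0.05, x.size)
print(symbolic_regression_sketch(x, y))   # mean-squared error and best formula found
```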
So if the challenges outlined here may be almost impossible to overcome, why is FuturICT so vital?
Because it is the next step in an essential learning curve that humanity has to climb in order to improve and refine its predictive capability - a process that will be essential if we are to survive the approaching Armageddon of climate change and many other critical challenges.
This is the next major step - a proof of concept for a new era of mega-modelling. That is why it is so important: not whether it can produce perfect results, but whether we can learn enough from it to keep making progress in an area vital to our future wellbeing.
There are no shortcuts in this endeavour - it will be, as it always has been, a step-by-step evolutionary journey.
FuturICT will play a key role in eventually linking with the full computational intelligence of the Web to create a new societal decision framework never before contemplated by human society.
Accepting the decision capability of the Web as an equal - and, in future, senior - decision partner, integrating up to 10 billion human minds, will be one of the defining paradigm shifts of our times. It will involve a very radical mind-shift. Large cooperative projects such as FuturICT are essential stepping stones towards this goal.
This is what FuturICT has to teach us.