Tuesday, October 26, 2010

The Future of Cars

By 2015 most cars will be powered by electricity from next-generation hydrogen fuel cells and longer-lasting, higher-power batteries using solid polymer electrolytes instead of liquids. These will be capable of rapid charging at the new electric vehicle infrastructure- power outlets at traditional fuel stations and garages- enabling simple recharging and straightforward replacement of batteries and liquid hydrogen storage.
Computer systems will increasingly control all vehicle functions as standard- including those already in use for navigation, entertainment, collision avoidance, adaptive cruise control, anti-collision radar, crash protection, stability control and automatic parking.

By 2020 in most larger cities, small efficient electric cars including single and dual passenger variations will be available for flexible and inexpensive hire for local transport needs via smart phone managed pickup pools, servicing urban neighbourhoods (Ref Future of Cities).

The major advance however will be fully automated cars, already being trialled- capable of navigating autonomously, guided by sensor- and processor-embedded smart roads and transit corridors, obeying traffic laws and avoiding collisions with other objects and vehicles. They will also be capable of interpreting traffic forecasts and communicating via local networks with other vehicles and public transit corridors to reduce road congestion. In addition they will be responsive to passenger requirements, linked via the wireless Web to their passengers' activity profiles- appointment schedules and regular destinations such as schools, child minding and leisure centres.

The car of 2020 will also be capable of providing and monitoring in-vehicle entertainment and communication, emergency assistance, scheduling and payment services for power charging, parking and security. Automated transit control will facilitate traffic streaming and congestion management, with specialised car, bus and cycle transit lanes in operation throughout most urban areas.

By 2030 most individual cars will have transformed into autonomous transport pods or capsules for individual and multiple passenger urban use. Pods will link seamlessly to other minimum carbon-emission forms of transport for local neighbourhood and inter-urban movement- light rail, electric cycles, scooters and bicycles. Pod streaming infrastructure will link to smart transport hubs, providing automated fast urban and intercity metro rail/bus transport services.

Pod infrastructure will be particularly valuable in high-density areas such as the East Coast of the US, which are already feeling the impact of climate change through major blizzards and ice events that make it impossible for standard transport vehicles and infrastructure to function. In such areas, as well as those experiencing regular heat waves, underground pod systems will be the only practical alternative.

By 2040 the car as we know it today will cease to exist in the developed world’s urban areas. In its place will be multi-purpose intelligent transit pods- systems seamlessly linked and customised to individual and community needs. Most ground-based vehicles except for bicycles/scooters will be totally autonomous and humans will become passengers only. All instructions managing human and urban infrastructure interaction such as pick-up/destination location and schedule requirements will be relayed by mobile links and automatically accessed by the pod system via the Intelligent Web 4.0 (Ref Future Web).

Autonomous pod/vehicle networks will then allow the primary role of passenger transportation to transform- merging with information, entertainment and education functions during transit times; providing major leisure and work productivity gains, both in urban and country population centres.

By 2050 3-dimensional multi-level transport systems will be in common use. Such systems will be suspended above the networked transit routes of cities with lower levels restricted to bicycles, scooters and walking. All levels will link with major transport hubs and mass electric rail for super-fast autonomous intercity and new low energy system air travel. The complex navigation, service and logistical decisions involved will be managed by adaptive intelligent software agents, operating via the dedicated and secure virtual networks of the Intelligent Web.

Humans and their transport infrastructure will be seamlessly and permanently inter-woven.

Tuesday, October 19, 2010

The Future of Space Exploration

By 2020- the Constellation moon-landing project will be back on track, allowing humans to return to the moon following the Apollo missions of the 60s and 70s, to begin creating a permanent space colony and base for future galactic exploration. The International Space Station will continue to play a vital test launch, scientific research, communications and training role, supporting future space missions.

India, China and Japan will also have proceeded with their own exploratory missions to the moon and planets, but will increasingly work cooperatively with the US and EU under International Space Treaty protocols administered by the UN. Other middle rank G20 countries such as Russia, Brazil, Turkey, Canada, Australia, UK, Germany, France and South Africa will also be major individual contributors to future space programs. Space exploration will have become a global cooperative enterprise.

The Constellation Orion spacecraft- the Space Shuttle's replacement- will be launched in 2015, supporting the space station and future lunar missions and providing a means of repair and escape for astronauts if their craft are damaged by space junk or solar radiation. Power sources for space vehicles and interstellar probes will routinely combine plutonium nuclear power, solar sail energy, gravity slingshot and ion drive technologies.

The construction and maintenance of future space stations, including the lunar colony and its instrumentation, will be carried out largely autonomously by robots, eventually involving the mining and transportation of local materials.

By 2030- most of the solar system's major objects- its planets, moons and larger asteroids will have been visited by probes and tested for signs of life. In addition, extensive modelling of the sun’s convection dynamics and heliosphere, extending the ICE missions, will be critical to gaining a better understanding of its cyclic impact on global warming. A significant sample of Mars terrain will also have been mapped by the next generation robot explorers, which will finally determine the existence of past and present microbial life on the red planet.

The potential for life to exist on many of the extrasolar planets similar to earth within 30 light years will also have been determined by the SIM- Space Interferometry Mission; rejuvenated by NASA because of growing public awareness of and involvement in extraterrestrial life search and contact programs such as SETI. In addition, the prevalence and nature of complex pre-life organic molecules within the solar system and near space will have been extensively mapped to determine their likely origins.

An asteroid and comet defence system will also have been established as a high priority, capable of tracking and eliminating most major threats to Earth. The threat to space missions from space junk and subatomic particle and electromagnetic impact will also be largely eliminated through extensive mapping and sweep technology as well as the use of new graphene-based protective materials.

The private sector’s commercial involvement in space missions will be increasingly significant, eventually surpassing Government investment and NASA’s role as primary project manager. Space tourism will become feasible but remain strictly limited because of the prohibitive energy costs and the ability to realistically replicate such experiences more safely in virtual reality.

By 2040- all navigation and exploration tasks will be automated and managed by the powerful capability of the Intelligent Web 4.0, extended to encompass communication with all spacecraft, exploratory vehicles, telescope observatories, satellites and robots involved in projects and missions across the solar system. This will include the use of intelligent robotic probes, relying on their own decision capability to analyse relevant data and determine items of interest for further exploration.

The nature of dark energy will also have been resolved, supported by the $2 billion WFIRST- Wide Field Infrared Space Telescope- project, the centrepiece of NASA's next-decade development program.

The entire space enterprise will be linked and coordinated via massive e-infrastructure such as the European grid EGEE- Enabling Grids for E-sciencE- which integrates networks, grids, middleware, computational resources, data repositories, instruments and operational support for global virtual science collaborations. A vast amount of data will need to be downloaded, stored and processed by global space programs. EGEE currently has access to more than 20,000 petabytes of storage and 80,000 CPUs. Projects by 2045 will increase this level of data processing by a factor of 100.

Globalisation and cooperation will have reached an advanced stage on earth in the face of the extreme risks to society from global warming. Therefore the risk of conflict between the major powers over sovereignty rights resulting from space exploration will be minimal. As the space program gathers momentum, humans will increasingly see themselves as belonging to one world in this domain- not separate nations.

By 2050- colonisation programs, including Mars and possibly the moons Europa and Titan, will be launched, as well as the first interstellar robotic probes. These will be capable of self-replicating and evolving as agents in their own right. This will herald the beginning of the second phase of the exploration and colonisation of the galaxy, as Transhumans move beyond their own home solar base and accelerate the search for new knowledge and experiences; including finally linking with other intelligent life forms.

Starships will follow later in the century, transporting the first interstellar robotic explorers; initially powered by nuclear pulse propulsion systems but later by more advanced technologies based on new physics. These will allow the nearest stars to be reached within several decades, with transhuman explorers following, primarily as observers and communicators in non-navigational support roles.

The primary task of exploring galactic space will be carried out instead by autonomous, self-learning, computationally-advanced probes, managed by a vast communications and knowledge network, extending across the galaxy.
This process will proceed exponentially as the ecosystem of smart AI probes replicates throughout the cosmos, with all life eventually becoming co-existent with the universe.

Wednesday, September 22, 2010

The Future of Economics

The recent failure of classical economics to predict and manage the catastrophic failure of the world’s financial system has triggered a re-evaluation of the whole basis of current economic theory, which has been applied to sustain capitalism for the last 100 years.

By the end of the 20th century traditional economics was dominated by the classical paradigm based on notions of rational consumers making rational choices in a simple supply/demand world of finite resources, with prices constrained by decreasing returns; all driving the economy to an optimal equilibrium point.

Twentieth century economists had finally realised their dream of creating a rational, rigorous and well-defined mathematical model for describing the workings of the global economy. This standard model has been applied by business leaders, finance ministers, central bankers and presidential advisers ever since.

Up until recently classical economic theory has appeared to work adequately by a process of trial and error. In times of growth people are generally optimistic and the theory describes reality reasonably well. But in extreme circumstances panic quickly spreads and the theory fails spectacularly, amplified by the performance of the quantitative risk algorithms beloved by hi-tech stock market traders.

Unfortunately such a clockwork model has proved over the last four decades to be seriously out of synch with reality, as global markets have been roiled by a series of disastrous credit, market, liquidity and commodity crises. The predictions of the standard model have failed to match real world outcomes, generated in succession by the Savings and Loan, Asian, Mexican, Dotcom and now GFC bubble disasters.

In this latest incarnation of excess greed debacles, high risk mortgage loans were repackaged many times over into opaque risk financial instruments, such as Collateralised Debt Obligations or CDOs, which ended up through an unregulated banking system in the portfolios of nearly every bank and financial institution around the world. Because of lack of controls, members of the shadow system such as hedge funds and merchant banks borrowed scores of times their own worth in cash. When the CDOs finally failed, the losses rippled through the world economy. The banks stopped lending, leading to further business failures and investors were then forced to sell previously sound stocks causing a stock market crash.

But this crash was far more serious- perhaps even more so than the Great Depression, as it could not be contained within borders as easily or so simply solved by pump priming mass lending and job creation programs. Now we’ve seen the biggest banks, car manufacturers, miners, energy suppliers and national economies toppling like dominoes around the world, under trillions of dollars of debt.

The current global interventions have now staunched the haemorrhaging but not cured the disease.

The stronger economies of China and south east Asia, Brazil and Germany, less affected by the carnage, have bounced back. But the European economy is still fragile, with Greece, Spain, Portugal and other smaller nations struggling to contain debt; while the recent G20 summit in Toronto failed to enforce the rigorous regulation and improved economic governance previously mandated. The US recovery is also weak, with the latest OECD report predicting that US unemployment will not return to pre-recession levels before 2013.

In fact a number of interdisciplinary thinkers, starting in the seventies, began to question the credibility of the entire basis of the classical economic model, likening it to a gigantic academic think tank experiment rather than a serious science. And it gradually began to dawn on this group that a number of the key premises or axioms underpinning the existing model were seriously flawed.

As mentioned, the first is the assumption that humans are rational players in the great game of market roulette. They are not. Behavioural scientists have shown that while people are very good at recognising useful patterns and interpreting ambiguous or incomplete information in their decision-making, they are very poor when it comes to performing complex logical analysis, preferring to follow market leaders or flock according to the latest fashion. This can further amplify distorting trends.

The new theories of behavioural finance argue that during a bubble the rate of buying and selling can become manic, resulting in irrational decisions. Making money actually stimulates investors' brain reward circuitry, causing them to ignore risk- increasing the difficulty of valuing stocks accurately.

But perhaps the most critically flawed assumption is that an economic system always reaches an ideal equilibrium of its own accord. In other words, the market is capable of benign self-regulation- automatically allocating resources and controlling excesses in an optimum way, best effected with minimum outside interference.

Since the nineteenth century the fundamental principle underpinning economics has therefore been based on the mythology that the economy is a system that moves from one equilibrium point to another, driven by shocks from external disruptions – whether technological, political, financial or cultural- but always eventually coming to rest in a natural equilibrium state.

The new emerging evolutionary paradigm however postulates that economies and markets, as well as the web, enterprises and the human brain, are all forms of complex systems in which agents dynamically interact, process information and adapt their behaviour to a constantly changing environment; never reaching a final stable equilibrium or goal.

In biological evolution, the natural environment selects those systems that are best able to adapt to its infinite variation. In economic evolution, the market is a combination of financial, logistical, cultural, organisational and government regulatory elements, which adapt to and in turn influence a constantly changing ecological, social and business environment.

In essence, economic and financial systems have been fundamentally misclassified. They are not perfect self-regulating systems. They are enormously complex adaptive networks, with topologies that include decision hubs, relationship connections and feedback loops linking multi-agent groups which interact dynamically in response to changes in their environment; not merely through simplified price setting mechanisms, tax and interest rate cuts, liquidity injections or job creation programs. They must be understood and managed at a far deeper level.
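
As a minimal illustration of this multi-agent view (a toy sketch only- the agent mix, parameters and price-update rule below are illustrative assumptions, not a calibrated economic model), consider a market in which fundamentalist and trend-following agents interact through a simple feedback loop; adding trend followers visibly amplifies volatility rather than driving the price to a stable equilibrium:

```python
import random

# Toy agent-based market: the price emerges from interacting adaptive agents
# rather than from a single equilibrium condition. All parameters are
# illustrative assumptions, not calibrated to any real market.

FUNDAMENTAL_VALUE = 100.0   # value fundamentalists believe the asset is worth
PRICE_IMPACT = 0.005        # how strongly net demand moves the price
NOISE_STD = 0.5             # random news shocks each step
STEPS = 500

class Agent:
    def __init__(self, style):
        self.style = style  # 'fundamentalist' or 'trend_follower'

    def demand(self, price, last_price):
        if self.style == 'fundamentalist':
            # Buy below perceived value, sell above it (negative feedback).
            return FUNDAMENTAL_VALUE - price
        # Trend followers chase the last move (positive feedback).
        return 2.0 * (price - last_price)

def simulate(trend_fraction, n_agents=200):
    agents = [Agent('trend_follower' if random.random() < trend_fraction
                    else 'fundamentalist') for _ in range(n_agents)]
    price = last_price = FUNDAMENTAL_VALUE
    history = []
    for _ in range(STEPS):
        net_demand = sum(a.demand(price, last_price) for a in agents)
        shock = random.gauss(0.0, NOISE_STD)
        last_price, price = price, max(1.0, price + PRICE_IMPACT * net_demand + shock)
        history.append(price)
    return history

def volatility(history):
    return (sum((p - FUNDAMENTAL_VALUE) ** 2 for p in history) / len(history)) ** 0.5

if __name__ == '__main__':
    random.seed(1)
    calm = simulate(trend_fraction=0.0)    # only fundamentalists
    manic = simulate(trend_fraction=0.4)   # 40% trend followers
    print(f'volatility without trend followers: {volatility(calm):.2f}')
    print(f'volatility with trend followers:    {volatility(manic):.2f}')
```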

Modern evolutionary theorists believe that evolution is a universal phenomenon and that both economic and biological systems are subclasses of a more general and universal class of evolutionary systems. And if economics is an evolutionary system, then it follows there are also general evolutionary laws of economics, which must be understood and harnessed if it is to be effectively managed.

This contradicts much of the standard theory in economics developed over the past one hundred years.

The economic evolutionary ecosystem is now fed by trillions of transactions, interactions and non-linear feedback loops daily. It may in fact have become too complex and interdependent for economists and governments to control or even understand. Therefore, as several eminent complexity theorists have recently stated, it might be on the verge of chaos. Too much or not enough regulation can distort the outcomes further- creating ongoing speculative pricing bubbles or supply and demand distortions.

There is now an urgent need to understand at a much deeper level the genie that modern capitalism has engineered and released. This can only be done by admitting the current crumbling edifice is beyond repair and building a radical new model from the ground up; a system that incorporates the hard sciences of network, evolutionary, behavioural and complexity theory.

Tinkering around the edges with the old reactive tools is not an option anymore.
To have any real chance of harnessing the economic machine of the 21st century for the benefit of all human society, not just the wealthy, it must be modelled at the network level and managed autonomously according to adaptive evolutionary principles.

If a business as usual economic philosophy prevails, it is likely that the resulting ultra-massive waste of resources and social turmoil of a second GFC would be catastrophic for our civilisation.

Saturday, September 11, 2010

Future of Cyber-Infrastructure for World 2.0

Our future World 2.0 will face enormous challenges for the foreseeable future, including global warming, globalisation, and social and business hyper-change.

Global Warming will create shortages of food and water and loss of critical ecosystems and species. It will require massive prioritisation and re-allocation of resources on a global scale.

Globalisation will require humans to live and work together cooperatively as one species on one planet- essential for our survival and finally eliminating the enormous destruction and loss of life that wars and conflict inevitably bring.

Social and Business Change will present myriad challenges relating to building and maintaining a cohesive social fabric- providing democracy and justice, adequate levels of health and education, and solutions to urban expansion, crime prevention, transport congestion and food and water security, in a fast-changing global environment. This will require adaptation on a vast scale.

It is apparent that in order to meet these challenges, humans must harness the enormous advances in computing and communications technologies to achieve a complete makeover of the world’s Cyber-Infrastructure.

The infrastructure of the new cyber reality now affects every aspect of our civilisation. In tomorrow’s globalised world a dense mesh of super-networks will be required to service society’s needs- the ability to conduct government, business, education, health, research and development at the highest quality standard.

This infrastructure will be conjoined with the intelligent Internet/Web, but will require additional innovation to facilitate its operation- a transparent and adaptable heterogeneous network of networks, interoperable at all levels of society.

In the last two decades tremendous progress has been made in the application of high-performance and distributed computer systems including complex software to manage and apply super-clusters, large scale grids, computational clouds and sensor-driven mobile systems. This will continue unabated, making the goal of providing ubiquitous and efficient computing on a worldwide scale possible.

But there’s a long road ahead. It is still difficult to combine multiple disparate systems to perform a single distributed application. Each cluster, grid and cloud provides its own set of access protocols, programming interfaces, security mechanisms and middleware to facilitate access to its resources. Attempting to combine multiple homogeneous software and hardware configurations in a seamless heterogeneous distributed system is still largely beyond our capability.

At the same time, tomorrow's World 2.0 enabling infrastructure must also be designed to cope with sustainability and security issues.
It is estimated that the ICT industry contributes 2-3% of total greenhouse gas emissions, growing at 6% per year compounded. If this trend continues, total ICT emissions could triple by 2020. The next generation cyber-architecture therefore needs to be more power-adaptive. Coupled with machine learning, this could achieve savings of up to 70% of total ICT greenhouse emissions by 2020.
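
As a rough sketch of the compound-growth arithmetic behind such projections (only the 6% annual rate comes from the estimate above; the 2002 baseline year is an assumption chosen for illustration):

```python
# Back-of-envelope compound growth for ICT emissions projections.
# The 6% annual growth rate is quoted in the text; the 2002 baseline year
# is an illustrative assumption about the reference point of the projection.

GROWTH_RATE = 0.06      # 6% per year, compounded
BASELINE_YEAR = 2002    # assumed reference year
TARGET_YEAR = 2020

def growth_factor(rate, years):
    """Total multiplication after `years` of compound growth."""
    return (1 + rate) ** years

years = TARGET_YEAR - BASELINE_YEAR
print(f'{years} years at {GROWTH_RATE:.0%}/yr -> x{growth_factor(GROWTH_RATE, years):.2f}')
# 18 years at 6%/yr gives roughly a 2.9x increase, i.e. close to a tripling.
```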

But the world is also grappling with the possibility of cyber-warfare as well as increasingly sophisticated criminal hacking, with an estimated 100 foreign intelligence organisations trying to break into US networks. A global protocol safeguarding cyber privacy rights between nations, combined with greater predictive warning of rogue attacks, is critically needed. The next generation of cyber-infrastructure will therefore have to incorporate autonomous intelligence and resilience in the face of both these challenges.

To meet these targets a lot will ride on future advances in the field of Self-Aware Networks- SANs. Previous blogs have emphasised the emergence of the networked enterprise as the next stage in advanced decision-making. SANs are a key evolutionary step on the path to this goal. Self-aware networks can be wired, wireless or peer-to-peer, allowing individual nodes to discover the presence of other nodes and links as required- largely autonomously. Packets of information can be forwarded to any node without traditional network routing tables, based on reinforcement learning and smart routing algorithms, resulting in reduced response times, traffic densities, noise and energy consumption.
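
A minimal sketch of the reinforcement-learning idea behind such routing, in the spirit of Q-routing: each node learns, per destination, an estimated delivery delay for each neighbour and forwards packets accordingly- no global routing table is ever built. The topology, link delays and parameters below are invented for illustration:

```python
import random

# Toy Q-routing sketch: nodes learn which neighbour delivers packets fastest
# to each destination, purely from local feedback. Topology, delays and
# learning parameters are illustrative assumptions.

ALPHA = 0.5      # learning rate
EPSILON = 0.1    # probability of exploring a non-greedy neighbour
INITIAL_ESTIMATE = 10.0

class Node:
    def __init__(self, name, neighbours):
        self.name = name
        self.neighbours = neighbours   # {neighbour_name: link_delay}
        self.q = {}                    # q[dest][neighbour] = estimated delay

    def table(self, dest):
        return self.q.setdefault(dest, {n: INITIAL_ESTIMATE for n in self.neighbours})

    def estimate(self, dest):
        """Best current estimate of the delay from this node to dest."""
        return 0.0 if dest == self.name else min(self.table(dest).values())

    def choose(self, dest):
        t = self.table(dest)
        if random.random() < EPSILON:
            return random.choice(list(t))        # occasionally explore
        return min(t, key=t.get)                 # otherwise exploit best estimate

    def update(self, dest, neighbour, observed):
        t = self.table(dest)
        t[neighbour] += ALPHA * (observed - t[neighbour])

def route(nodes, src, dest, max_hops=20):
    """Forward one packet hop by hop, learning as it goes; return total delay."""
    here, total = src, 0.0
    for _ in range(max_hops):
        if here == dest:
            break
        node = nodes[here]
        nxt = node.choose(dest)
        delay = node.neighbours[nxt]
        # Q-routing update: observed hop delay plus the neighbour's own estimate.
        node.update(dest, nxt, delay + nodes[nxt].estimate(dest))
        total += delay
        here = nxt
    return total

if __name__ == '__main__':
    nodes = {
        'A': Node('A', {'B': 1.0, 'C': 5.0}),
        'B': Node('B', {'A': 1.0, 'D': 1.0}),
        'C': Node('C', {'A': 5.0, 'D': 1.0}),
        'D': Node('D', {'B': 1.0, 'C': 1.0}),
    }
    for _ in range(500):          # repeated packets from A to D train the tables
        route(nodes, 'A', 'D')
    print(route(nodes, 'A', 'D')) # usually ~2.0, via the learned A-B-D path
```
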
Another major shift towards a networked world has been the rise of Social Networks. These have attracted billions of users for networking applications such as Facebook, LinkedIn, Twitter etc. These are providing the early social glue for World 2.0, offering pervasive connectivity by processing and sharing multi-media content. Together with smart portable devices, they cater to the user’s every desire, through hundreds of thousands of web applications covering all aspects of social experience– entertainment, lifestyle, finance, health, news, reference and utility management etc.
With increased user mobility, location sharing and a desire to always be connected, there is a growing trend towards personalized networks where body, home, urban and vehicle sensory inputs will be linked in densely connected meshes to intermediate specialised networks supporting healthcare, shopping, banking etc.
The explosion of social networked communities is triggering new interest in collaborative systems in general. Recent research in network science has made a significant contribution to a more profound understanding of collaborative behaviour in business ecosystems. As discussed in previous posts, networked ‘swarm’ behaviour can demonstrate an increase in collective intelligence. Such collective synergy in complex self-organising systems allows ‘smarter’ problem solving as well as greater decision agility. By linking together in strategic and operational networks, enterprises can therefore achieve superior performance than was previously possible.
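
One simplified illustration of how such swarm synergy solves problems without central control is particle swarm optimisation, sketched below; the test function and parameters are arbitrary choices for the example, not drawn from the research cited above:

```python
import random

# Minimal particle swarm optimisation: each "agent" combines its own experience
# with the best result found by the whole swarm, so the group converges on a
# solution no single agent was directed towards. Function and parameters are arbitrary.

def cost(x, y):
    return (x - 3.0) ** 2 + (y + 1.0) ** 2      # toy objective, minimum at (3, -1)

class Particle:
    def __init__(self):
        self.x, self.y = random.uniform(-10, 10), random.uniform(-10, 10)
        self.vx = self.vy = 0.0
        self.best = (self.x, self.y)             # this particle's best position so far

def swarm_search(n_particles=30, steps=200, inertia=0.7, c_own=1.5, c_swarm=1.5):
    swarm = [Particle() for _ in range(n_particles)]
    global_best = min((p.best for p in swarm), key=lambda b: cost(*b))
    for _ in range(steps):
        for p in swarm:
            r1, r2 = random.random(), random.random()
            # Velocity blends momentum, pull toward own best, pull toward swarm best.
            p.vx = inertia * p.vx + c_own * r1 * (p.best[0] - p.x) + c_swarm * r2 * (global_best[0] - p.x)
            p.vy = inertia * p.vy + c_own * r1 * (p.best[1] - p.y) + c_swarm * r2 * (global_best[1] - p.y)
            p.x, p.y = p.x + p.vx, p.y + p.vy
            if cost(p.x, p.y) < cost(*p.best):
                p.best = (p.x, p.y)
            if cost(*p.best) < cost(*global_best):
                global_best = p.best
    return global_best

if __name__ == '__main__':
    print(swarm_search())   # converges near (3, -1) with no central controller
```
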
The key characteristics of the smart business network of the future will be its ability to react rapidly to emerging opportunities or threats, by selecting and linking appropriate business processes. Such networks will be capable of quickly and opportunistically connecting and disconnecting relationship nodes, establishing business rules for participating members on the basis of risk and reward.
This ‘on the fly’ capacity to reconfigure operational rules, will be a crucial dynamic governing the success of tomorrow’s enterprise. CIOs must also learn to span the architectural boundaries between their own networked organisation and the increasingly complex social and economic networked ecosystems in which their organisations are embedded.
In fact the business community is now struggling to keep up with the continuous rate of innovation demanded by its users. Social network solutions have the potential to help meet this demand by shaping the design of future architectures to provide better ways to secure distributed systems.
So what is the future of this new collaborative, densely configured networked world? What we are witnessing is the inter-weaving of a vast number of evolving and increasingly autonomous networks, binding our civilisation in a web of computational nodes and relational connections, spanning personal to global interactions.

By 2050 the new World 2.0 cyber-infrastructure will link most individuals, enterprises and communities on the planet. Each will have a role to play in our networked future, as the cells of our brain do- but it will be a future in which the sum of the connected whole will also be an active player.

Friday, July 9, 2010

Evolutionary Thrashing and Social Chaos

Society may be on the cusp of social chaos triggered by ‘Evolutionary Thrashing’, which could result in major social breakdown for many decades.

The ‘evolutionary thrashing’ phenomenon occurs when the rate of change in a system’s environment exceeds its capacity to effectively adapt or evolve, before again being overwhelmed by the next wave of change.

At the biological level this can result in an organism’s inability to reach its optimal potential, making it less fit and more susceptible to extinction. This is currently occurring on the planet at an unprecedented rate. Many species are finding it increasingly difficult to adapt to the continuous changes in their habitat resulting from global warming and human destruction, with a quarter of vertebrate species predicted to become endangered or extinct by 2050.

However the phenomenon of ‘evolutionary thrashing’ is not restricted to biological systems. According to David Tow’s recently published generic evolutionary theory, outlined in his book– The Future of Life: A Unified Theory of Evolution, it can apply equally to social systems, including human society.

In this generic scenario, the same laws and principles of evolution apply to all systems at the quantum information level. Support for this thesis has recently been provided by physicist Wojciech Zurek’s ground-breaking work on Quantum Evolution and Decoherence, analogous to Charles Darwin’s theory of natural selection.

Such ‘thrashing’ at the human level can therefore lead to ineffective decision-making, social breakdown and eventually chaos, before long-term optimal evolution reasserts itself.

Global warming is a significant primary driver of this process because it has the potential to adversely impact all the planet’s ecosystems, which in turn will affect most aspects of human civilization including its social and democratic institutions.

A high level of ongoing adaptation is therefore required, but if critical social needs cannot be met in response to rapidly changing constraints, dysfunctional outcomes on a global level such as increased conflict, work and lifestyle stress, loss of community cohesion etc, will inevitably result.

But global warming is not the only contributor to social evolutionary thrashing. The second major driver is globalization, which is also occurring at hyper-speed, resulting in the blurring and mixing of cultures, religions and social norms as populations spread across the planet.

This is most apparent for example in the emergence of the major geopolitical blocs such as the EU linking nations in Europe, Asia, America and Africa, together with an increasing number of regional groupings and cross-over trading and political alliances such as APEC. In addition, each of these networks is increasingly coordinating its influence through global decision-making bodies such as the United Nations and more recently the G20.

In order to manage global issues such as climate change, crime and terrorism, disease, natural and man-made disasters, refugee flows and the allocation of key resources such as food, water and energy, global cooperation will be essential. But at the same time, traditional cultural and commercial practices that have evolved sometimes over thousands of years are being swamped in less than a generation- the blink of an eye in evolutionary terms; resulting in racial blowback, which can trigger reactions such as paranoia and conflict.

The third major driver of hyper-change is the information and communication revolution, facilitated by the Internet and Web Mark 2.0 incorporating the new cyber-world of virtual reality, mobile communication, social media and instant information access.

This is beginning to accelerate exponentially, threatening to outpace the capacity of populations of both developed and developing countries to adapt their social and cultural practices relating to democratic, educational, legal, financial and governance processes. With a third of the world’s population, including developing nations, now connected via inexpensive mobile phones and laptops to this infinite resource, the rate of change will become hyper-exponential within the next few decades.

No-one disputes the benefits of this massive egalitarian knowledge gain, providing the potential to deliver quality of life improvements to both poor and rich nations- combating the adverse effects of poverty and climate change. But there is a real risk that such hyper-change will outstrip the capacity of humanity to absorb and utilize it to best advantage, succumbing to the centrifugal forces that threaten to rip the fragile fabric of society apart.

In the space of a generation, the rate of social evolution driven by these three mega-forces- global warming, globalization and knowledge acceleration, each catalyzing the others in a frenzy of complex feedback systems, now threatens to destabilize the foundations of human civilization.
Non-adaptive evolutionary thrashing is likely to reach a critical threshold within the next decade, mirroring the likely point of no return for global warming.
This effectively means that coordination and synchronization of the major practices and protocols for managing the planet will be essential. It will involve the intermeshing of not just trade, but decision-making on all critical social issues.

It will require the rapid creation and strengthening of common frameworks for managing commerce, finance, economics, education, science and technology- including the management of energy, food, water and air quality on a world-wide scale. This has already begun on a regional basis with the strengthening of the European Union and on a global basis since the recent financial melt-down with the creation of the G20.

In other words, it will demand achieving an excruciatingly fine balance between continuing to encourage the creativity, innovation and development that drives our civilization and the risk of social overreach, with the potential to implode it. Only global commitment and good will by all populations on the planet can achieve this resolution.

Saturday, May 22, 2010

Life Creates Life

The first artificial life form has been created by human biological life. By bypassing natural evolution and designing the first artificial life form, humans have crossed the Rubicon of creation and opened the floodgates of life's evolutionary future.
Craig Venter and his team were the first scientists to sequence the human genome and have now created the first artificial life-form; a tiny new bacterium or synthetic cell, controlled by human engineered DNA, with its genetic instructions determined by human life.
The scientists have made a synthetic copy of the genome of a bacterium- Mycoplasma mycoides. This man-made genome was then transplanted into a related bacterium- Mycoplasma capricolum. This process “rebooted” the cell so that it was controlled by the synthetic genome, transforming it into another species. The cell has since divided more than a billion times.
The creation of this living organism is the culmination of 15 years of research, costing more than $47 million. But the cost is minuscule in comparison with its glittering potential benefits. It promises a new industry, generating synthetic bacteria capable of cleaning up pollution, producing new forms of green chemicals and fuels, capturing CO2 in designed algae and providing vaccines against disease.

The creation of life has been an ongoing human endeavour for at least 50 years, since Stanley Miller successfully synthesised amino acids, essential for the formation of proteins and life, using simple molecules such as water, ammonia and methane, exposed to an energy source such as ultraviolet radiation.

Since that time a number of paths have been taken by researchers to recreate the genesis of life including-
Resurrecting extinct species- such as the marsupial Tasmanian tiger and Woolly mammoth- extracting still viable DNA and implanting it in related species such as the Tasmanian devil and African elephant. But the notion of resurrecting Einstein or Shakespeare as present-day geniuses is highly doubtful, because evolution is not just a product of genes, but is a dance between genetics and the environment.

Re-engineering current species- reversing evolutionary changes and genetic switches to recreate the previous ancestor; for example producing teeth in chickens as birds related to ancient dinosaurs. The importance of this technique is that it demonstrates life as a continuum, with many of the genes from yeast and fruit flies still existing in modern humans.

Cloning new species- this can be achieved using the technique of hybrid speciation, which involves first mating two closely related species, such as single-cell yeasts. A small percentage of the offspring spontaneously clone themselves and some also change gender, thereby creating a new species of yeast.

The current artificial life-form has been created by manipulating the code of life- the chemical bases needed to develop artificial chromosomes and therefore novel amino acids, proteins and life.

Producing new life-forms to order by designing novel DNA is a comparatively recent process. It is a direct consequence of recent successes in sequencing DNA as well as the creation of component genome databases. This facilitates the assembly of genetic building blocks into living systems, in the same way that electronic components are combined to manufacture circuits and chips, or software modules are combined to create business services.
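
As a toy sketch of this components analogy, genetic parts can be treated as composable modules drawn from a design library; the part names, sequences and assembly 'grammar' below are invented placeholders, not real registry entries:

```python
from dataclasses import dataclass

# Toy illustration of composing standardised genetic parts into a construct,
# much as electronic components are wired together. Part names and sequences
# are invented placeholders, not entries from any real parts registry.

@dataclass
class Part:
    name: str
    role: str        # 'promoter', 'rbs', 'cds' or 'terminator'
    sequence: str    # DNA bases (placeholder strings here)

LIBRARY = {
    'P_strong': Part('P_strong', 'promoter', 'TTGACA' + 'N' * 17 + 'TATAAT'),
    'RBS_std':  Part('RBS_std', 'rbs', 'AGGAGG'),
    'GFP_cds':  Part('GFP_cds', 'cds', 'ATG' + 'GCT' * 50 + 'TAA'),
    'T_double': Part('T_double', 'terminator', 'GCGGCC' * 4),
}

EXPECTED_ORDER = ['promoter', 'rbs', 'cds', 'terminator']

def assemble(part_names):
    """Concatenate library parts into one construct, checking the expression 'grammar'."""
    parts = [LIBRARY[name] for name in part_names]
    roles = [p.role for p in parts]
    if roles != EXPECTED_ORDER:
        raise ValueError(f'parts out of order: {roles}')
    return ''.join(p.sequence for p in parts)

if __name__ == '__main__':
    construct = assemble(['P_strong', 'RBS_std', 'GFP_cds', 'T_double'])
    print(f'assembled construct of {len(construct)} bases')
```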

Flexible and reliable fabrication technology, together with standardised methods and design libraries, has already enabled a new generation of biological engineers to create new organisms from biological components from the ground up, providing the basis for the new science of synthetic biology.

Molecular biology has previously largely been applied as a reductive science, but now synthetic biologists are building organic machines from interchangeable DNA parts that work inside living cells- deriving energy, processing information and reproducing.

Concurrently with developments in synthetic biology, another new form of life- Intelligent Software Agents- has been developed by computer scientists, representing artificial life in the form of adaptable evolutionary software programs. These are designed to provide autonomous and cooperative problem-solving support to humans through the application of artificial intelligence- primarily evolutionary, swarm and knowledge-based algorithms.
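
A minimal sketch of the evolutionary-algorithm machinery such agents typically build on- selection, recombination and mutation over a population of candidate solutions; the toy fitness target and parameters are illustrative only:

```python
import random

# Minimal evolutionary algorithm: a population of candidate solutions is
# repeatedly selected, recombined and mutated until a fitness target is met.
# The all-ones target and all parameters are toy choices for illustration.

TARGET_LENGTH = 30
POP_SIZE = 50
MUTATION_RATE = 0.02

def fitness(candidate):
    return sum(candidate)                       # count of correct (1) bits

def mutate(candidate):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in candidate]

def crossover(a, b):
    cut = random.randrange(1, TARGET_LENGTH)    # single-point crossover
    return a[:cut] + b[cut:]

def evolve(generations=200):
    population = [[random.randint(0, 1) for _ in range(TARGET_LENGTH)]
                  for _ in range(POP_SIZE)]
    for gen in range(generations):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == TARGET_LENGTH:
            return gen, population[0]
        parents = population[:POP_SIZE // 2]    # simple truncation selection
        population = [mutate(crossover(random.choice(parents), random.choice(parents)))
                      for _ in range(POP_SIZE)]
    return generations, max(population, key=fitness)

if __name__ == '__main__':
    gen, best = evolve()
    print(f'best fitness {fitness(best)}/{TARGET_LENGTH} after {gen} generations')
```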


But the Holy Grail of life's creation- evolving a living cell from scratch- has yet to be achieved. This is because many separate evolutionary processes have to take place first, including the evolution of cell containment vesicles; an optimal genetic code such as DNA or RNA, with the machinery to translate it into amino acids and proteins; the incredibly complex epigenetic processes providing signalling pathways from the cell's environment and methods to fine-tune its basic DNA; plus the machinery of cell replication, development, apoptosis and metabolism.

In a sense Venter’s achievement has relied heavily on hijacking the machinery of existing cellular operation– much as Einstein did by borrowing Riemann’s mathematical framework for his theory of relativity. In other words he piggy-backed a free ride to life.

But this doesn’t detract in any way from the monumental human achievement in understanding better the enigma of life and creating it afresh in its full glory.

Because of this breakthrough it will now be possible to create not only new bacteria, but eventually the complete spectrum of new life-forms – plants and animals, including perhaps a new species of humans. In other words bringing artificial life from the super-natural to the human-natural realm of creation.

This glittering potential is balanced by unforeseeable risks: a synthetic bacterium with the capacity to mutate and proliferate outside the lab could do untold damage to the environment, accelerating new disease pathogens and affecting the genetic blueprint of crops and animals, including humans. It could also be used as a biochemical weapon.

But science’s Pandora’s Box has been opened yet again. Now there are three players in the great game of life- biological, artificial and virtual.
All three will have to learn to co-exist and accommodate with each other; as over time the biological, technological and social barriers dissolve and they eventually merge into a new form- Meta-life.

Wednesday, May 12, 2010

Managing the Planet

The Director of the Future Planet Research Centre, David Hunter Tow, forecasts an urgent need to harness the full resources and intelligence of the Web to coordinate and manage major programs relating to global warming and survival of the planet- including its life and human civilization.

The cards are now on the table- the climate skeptics' bluff has been called. The latest science suggests that of the critical indicators of the health of the planet, at least three have already passed the critical stage and the remainder are perilously close to the abyss.

These include- biodiversity loss, ozone depletion, ocean warming and acidification, land and freshwater over-use and chemical pollution including nitrogen and phosphorous runoff. Most importantly, at current levels of CO2 accumulation, the maximum 2 degree centigrade threshold increase will be breached within twenty years.

In addition, over the past 50 years the world's population has more than doubled to almost 7 billion, global consumption of food and fresh water has more than tripled, fossil fuel use has quadrupled and vertebrate populations have declined by over thirty percent.

It is clear that managing the planet's outcomes to provide life with a future is the paramount goal that must focus all humanity's skills, creativity and knowledge, from now into the far future.

Up until comparatively recently, managing resources, infrastructure and catastrophes has been largely an ad hoc affair run on a country rather than regional or global basis. This is not surprising considering the evolution of our civilization, which has been based on a largely competitive, winner-take-all model between individuals, organizations, cities and nation-states.

Over the last few decades however a realization has dawned that this is an extremely inefficient and counter-productive approach and totally unsustainable in the modern carbon-induced warming era. This is particularly the case when it comes to managing critical global issues such as climate change, spread of disease, ecosystem protection and major catastrophes- including mega-droughts, oil-spills and earthquakes.

Although still operating in largely fragmented mode, humans are beginning to mobilise cooperatively, creating global research consortiums, trade and business alliances and knowledge exchange networks. But a lot more is needed to ensure our survival- primarily by becoming a lot cleverer in focusing our scientific, technology and social resources.

One of the most significant advances recently announced was the European FuturICT project.

This ambitious European Commission-funded billion-euro enterprise was designed to simulate the knowledge resources of the entire planet- not just physical but social and economic- mobilising partners from most of the top university research centres in Europe.

The 'Living Earth Simulator' is a major part of this project, originally scheduled to be completed by 2022. It will mine economic, technological, environmental and health data to create a model of the entire planet's dynamics in real time, applying it to solve major problems relating to these areas.

There is now a vital need to better understand the global interrelationships enmeshing the society in which we live and the effect that these have on the planet as a whole. We also need to know how to leverage the benefits of global social systems, while at the same time limiting any downsides they may generate.

Labelled 'Reality Mining', the plan was to gather information about every aspect of the living planet, including its life-forms, and use it to simulate the behaviour and evolution of entire ecosystems and economies, helping predict and prevent future potential crises. The Living Earth Simulator was expected to predict, for example, potential economic bubbles, impacts of global warming, pandemics and conflicts, and how best to mitigate them.

The FuturICT project, since cancelled, had the potential to nucleate and accelerate this process, operating as an essential catalyst and mobiliser for managing our future. But there are many other advanced projects with the potential to complement this grand design, working in parallel to help complete the big picture.

The focus is on preparing for a smarter future for planet earth- creating solutions for managing more efficiently and reliably the world’s infrastructure, energy, food, water and health. This will be achieved through harnessing the immense power of advanced artificial intelligence, mathematical, computing, communication, control and modelling techniques.

Examples of the myriad current hi-tech initiatives include-
self-healing software capable of automatically detecting, identifying and fixing errors in the programs used in complex systems; a 'central nervous system' of 'smart dust' for the Earth, in which a trillion sensors will be deployed worldwide to monitor ecosystems, detect earthquakes, predict traffic patterns and study energy use; a system of computerized agents that can manage energy use in the home, designed to optimize individual electricity usage and improve the efficiency of the electricity grid; and leveraging the vast cornucopia of freely available services on the web to build mashups supporting humanitarian and disaster relief.
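
As a hedged illustration of the home energy agent idea mentioned above, the sketch below simply shifts flexible appliance loads into the cheapest hours of an assumed day-ahead price curve; the prices, appliances and run lengths are invented for the example:

```python
# Toy home-energy agent: schedule flexible appliance runs into the cheapest
# hours of a day-ahead electricity price curve. Prices, appliances and run
# lengths are invented for illustration.

PRICES = [18, 16, 14, 12, 11, 11, 13, 20, 26, 28, 27, 25,     # assumed hourly
          24, 23, 22, 23, 26, 30, 32, 31, 28, 24, 21, 19]     # prices, cents/kWh

LOADS = [('dishwasher', 1.2, 2),        # (name, power in kW, run length in hours)
         ('washing machine', 0.8, 1),
         ('ev charger', 3.3, 4)]

def cheapest_window(prices, hours):
    """Start hour of the cheapest contiguous block of the given length."""
    costs = [sum(prices[s:s + hours]) for s in range(len(prices) - hours + 1)]
    return costs.index(min(costs))

def schedule(prices, loads):
    plan = {}
    for name, power_kw, run_hours in loads:
        start = cheapest_window(prices, run_hours)
        cost = power_kw * sum(prices[start:start + run_hours]) / 100.0   # dollars
        plan[name] = (start, cost)
    return plan

if __name__ == '__main__':
    for name, (start, cost) in schedule(PRICES, LOADS).items():
        print(f'{name}: start at {start:02d}:00, estimated cost ${cost:.2f}')
```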

As mentioned, game-changing projects such as FuturICT are critical, but managing the planet requires much more- in essence coordinating and focusing the entire knowledge base and mind-power of our civilisation.

This should be implemented as a worldwide public project, in the same manner as the Internet and Web: with each component of the planet's intellectual mosaic- individuals, research groups, corporations and governments- contributing and mining their knowledge resources, each according to their creative capacity and expertise.

Such a global vision is too fundamentally vital and complex to be funnelled through individual private organisations, politicians or states. It must instead function as a self-organising supra-national entity- evolving eventually as a largely autonomous system.
Managing the planet therefore will involve the massive task of coordinating thousands of techniques, technologies, systems and initiatives to gain the maximum leverage within the timescale available.

But time now is precious. Most current ‘green’ applications are in the early stages- designed to improve energy efficiency by deploying breakthroughs in sustainable technologies such as solar, wind, biofuels, carbon capture etc. But this is just the beginning of the journey. Copenhagen demonstrated that gaining consensus even for the essential task of implementing a global carbon trading system - so vital in generating the momentum to transition from polluting fossil fuels to green power- is difficult to achieve.

Is this a feasible proposition? Yes, but only by applying adaptive, autonomic system technology, capable of responding dynamically and autonomously to changes in the physical and social environment. Such a system will need to include the ability to self-organise and self-optimise its planning and operations – to discover, innovate, simulate, create, predict, apply, learn and continuously gain intelligence- to ensure optimal outcomes.

As mentioned, although projects such as FuturICT have the potential to kick-start this process, there is only one practical mechanism to ensure the ultimate success of such a gargantuan endeavour- harnessing the intelligence of the Web itself. It must be nurtured and engineered to become self-organising and self-adaptive, in order to reach the goal of managing a sustainable future- essential for us and our planet.