Figure – A classic example of emergence: the exact shape of a termite mound is not reducible to the actions of individual termites. Even so, there are already computer models that can achieve it (for more, check "Stigmergic construction" or this blog's Stigmergy tag).
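The caption mentions computer models of stigmergic construction. As a rough illustration of the underlying mechanism — not the actual simulations the figure refers to — here is a minimal toy model in Python: agents deposit material at sites with probability proportional to the square of what is already there. This superlinear positive feedback is enough to break symmetry and grow a few "pillars" from uniform beginnings, with no agent planning the structure.

```python
import random

random.seed(42)

# Toy stigmergy sketch (an illustrative assumption, not a model from the
# literature cited here): deposits attract further deposits superlinearly,
# as in pheromone-mediated recruitment in termite building.
SITES, DEPOSITS = 30, 2000
grid = [1] * SITES  # one seed pellet per site

for _ in range(DEPOSITS):
    # An agent is recruited to a site with probability proportional to
    # the SQUARE of the material already there: positive feedback.
    site = random.choices(range(SITES), weights=[g * g for g in grid])[0]
    grid[site] += 1

# A few sites absorb most of the material; most stay near their seed value.
print(sorted(grid, reverse=True)[:5])
```

The design choice that matters is the superlinearity: with weights proportional to `g` rather than `g * g`, the process is a fair Pólya urn and deposits stay spread out; squaring the weight makes early random fluctuations self-amplifying, which is the essence of stigmergic pillar formation.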
“The world can no longer be understood like a chessboard… It’s a Jackson Pollock painting” ~ Carne Ross, 2012.
[…] As pointed out by Langton, there is more to life than mechanics – there is also dynamics. Life depends critically on principles of dynamical self-organization that have remained largely untouched by traditional analytic methods. There is a simple explanation for this – these self-organized dynamics are fundamentally non-linear phenomena, and non-linear phenomena in general depend critically on the interactions between parts: they necessarily disappear when parts are treated in isolation from one another, which is the basis for any analytic method. Rather, non-linear phenomena are most appropriately treated by a synthetic approach, where synthesis means “the combining of separate elements or substances to form a coherent whole”. In non-linear systems, the parts must be treated in each other’s presence, rather than independently from one another, because they behave very differently in each other’s presence than we would expect from a study of the parts in isolation. […] in Vitorino Ramos, 2002, http://arxiv.org/abs/cs/0412077.
What follows are passages from an important article on the consequences for science of the recent discovery of the Higgs boson. Written by Ashutosh Jogalekar, “The Higgs boson and the future of science” (link) appeared in the Scientific American blog section (July 2012). It starts by discussing reductionism, or how the Higgs boson points us to the culmination of reductionist thinking:
[…] And I say this with a suspicion that the Higgs boson may be the most fitting tribute to the limitations of what has been the most potent philosophical instrument of scientific discovery – reductionism. […]
[…] Yet as we enter the second decade of the twenty-first century, it is clear that reductionism as a principal weapon in our arsenal of discovery tools is no longer sufficient. Consider some of the most important questions facing modern science, almost all of which deal with complex, multifactorial systems. How did life on earth begin? How does biological matter evolve consciousness? What are dark matter and dark energy? How do societies cooperate to solve their most pressing problems? What are the properties of the global climate system? It is interesting to note at least one common feature among many of these problems: they result from the build-up rather than the breakdown of their operational entities. Their signature is collective emergence, the creation of attributes which are greater than the sum of their constituent parts. Whatever consciousness is, for instance, it is definitely a result of neurons acting together in ways that are not obvious from their individual structures. Similarly, the origin of life can be traced back to molecular entities undergoing self-assembly and then replication and metabolism, a process that supersedes the chemical behaviour of the isolated components. The puzzles of dark matter and dark energy also have as their salient feature the behaviour of matter at large length and time scales. Studying cooperation in societies essentially involves studying group dynamics and evolutionary conflict. The key processes that operate in the existence of all these problems seem to almost intuitively involve the opposite of reduction; they all result from the agglomeration of molecules, matter, cells, bodies and human beings across a hierarchy of unique levels. In addition, and this is key, they involve the manifestation of unique principles emerging at every level that cannot be merely reduced to those at the underlying level. […]
[…] While emergence had been implicitly appreciated by scientists for a long time, its modern salvo was undoubtedly a 1972 paper in Science by the Nobel Prize winning physicist Philip Anderson (link) titled “More is Different” (PDF), a title that has turned into a kind of clarion call for emergence enthusiasts. In his paper Anderson (who incidentally first came up with the so-called Higgs mechanism) argued that emergence was nothing exotic; for instance, a lump of salt has properties very different from those of its highly reactive components sodium and chlorine. A lump of gold evidences properties like color that don’t exist at the level of individual atoms. Anderson also appealed to the process of broken symmetry, invoked in all kinds of fundamental events – including the existence of the Higgs boson – as being instrumental for emergence. Since then, emergent phenomena have been invoked in hundreds of diverse cases, ranging from the construction of termite hills to the flight of birds. The development of chaos theory beginning in the 60s further illustrated how very simple systems could give rise to very complicated and counter-intuitive patterns and behaviour that are not obvious from the identities of the individual components. […]
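The point about chaos theory — very simple systems producing complicated, counter-intuitive behaviour — is easy to make concrete with the classic logistic map (a standard textbook example, not drawn from the article). The map is a single line of arithmetic, yet for the parameter value below it is chaotic: two trajectories starting a billionth apart soon become completely uncorrelated.

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n): for r = 4 the dynamics are
# chaotic, with sensitive dependence on initial conditions.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.300000000)  # two starts differing by 1e-9
b = logistic_trajectory(0.300000001)

# The tiny initial difference is roughly doubled at every iteration, so
# within about 35 steps the trajectories have fully decorrelated.
print(max(abs(x - y) for x, y in zip(a, b)))
```

Nothing in the one-line update rule hints at this behaviour; it only appears when the iteration is actually run — a small-scale analogue of the claim that emergent patterns are not obvious from the identities of the individual components.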
[…] Many scientists and philosophers have contributed to considered critiques of reductionism and an appreciation of emergence since Anderson wrote his paper. (…) These thinkers make the point that not only does reductionism fail in practice (because of the sheer complexity of the systems it purports to explain), but it also fails in principle on a deeper level. […]
[…] An even more forceful proponent of this contingency-based critique of reductionism is the complexity theorist Stuart Kauffman, who has laid out his thoughts in two books. Just like Anderson, Kauffman does not deny the great value of reductionism in illuminating our world, but he also points out the factors that greatly limit its application. One of his favourite examples is the role of contingency in evolution, and the object of his attention is the mammalian heart. Kauffman makes the case that no amount of reductionist analysis could tell you that the main function of the heart is to pump blood. Even in the unlikely case that you could predict the structure of hearts and the bodies that house them starting from the Higgs boson, such a deductive process could never tell you that of all the possible functions of the heart, the most important one is to pump blood. This is because the blood-pumping action of the heart is as much a result of historical contingency and the countless chance events that led to the evolution of the biosphere as it is of its bottom-up construction from atoms, molecules, cells and tissues. […]
[…] Reductionism then falls woefully short when trying to explain two things: origins and purpose. And one can see that if it has problems even when dealing with left-handed amino acids and human hearts, it would be in much more dire straits when attempting to account for, say, kin selection or geopolitical conflict. The fact is that each of these phenomena is better explained by fundamental principles operating at its own level. […]
[…] Every time the end of science has been announced, science itself proved that claims of its demise were vastly exaggerated. Firstly, reductionism will always be alive and kicking since the general approach of studying anything by breaking it down into its constituents will continue to be enormously fruitful. But more importantly, it’s not so much the end of reductionism as the beginning of a more general paradigm that combines reductionism with new ways of thinking. The limitations of reductionism should be seen as a cause not for despair but for celebration since it means that we are now entering new, uncharted territory. […]
Abraham, Ajith; Grosan, Crina; Ramos, Vitorino (Eds.), Stigmergic Optimization, Studies in Computational Intelligence series, Vol. 31, Springer-Verlag, ISBN 3-540-34689-9, 295 pp., Hardcover, 2006.
TABLE OF CONTENTS (short/full) / CHAPTERS:
- Stigmergic Optimization: Foundations, Perspectives and Applications.
- Stigmergic Autonomous Navigation in Collective Robotics.
- A General Approach to Swarm Coordination Using Circle Formation.
- Cooperative Particle Swarm Optimizers: A Powerful and Promising Approach.
- Parallel Particle Swarm Optimization Algorithms with Adaptive
- Termite: A Swarm Intelligent Routing Algorithm for Mobile Wireless Ad-hoc Networks.
- Linear Multiobjective Particle Swarm Optimization.
- Physically Realistic Self-Assembly Simulation System.
- Gliders and Riders: A Particle Swarm Selects for Coherent Space-Time Structures in Evolving Cellular Automata.
- Stigmergic Navigation for Multi-agent Teams in Complex Environments.
- Swarm Intelligence: Theoretical Proof That Empirical Techniques Are Optimal.
- Stochastic Diffusion Search: Partial Function Evaluation in Swarm Intelligence Dynamic Optimization.
Swarm robotics is a new approach to the coordination of multi-robot systems consisting of large numbers of mostly simple physical robots. The premise is that a desired collective behaviour emerges from the interactions among the robots and between the robots and the environment. The approach emerged from the field of artificial swarm intelligence, as well as from biological studies of insects, ants and other natural systems where swarm behaviour occurs (check for more).
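The same principle — simple individuals sharing local information and producing useful collective behaviour — is what particle swarm optimization, the subject of several chapters in the book above, distils into an algorithm. Below is a minimal sketch of the canonical PSO update on a toy objective; the parameter values (w, c1, c2) are common textbook defaults and the sphere function is an illustrative choice, neither taken from the book.

```python
import random

random.seed(0)

def sphere(x):
    # Toy test objective: global minimum of 0 at the origin.
    return sum(v * v for v in x)

# Canonical PSO: each particle keeps a velocity and its personal best, and
# is pulled both toward that personal best and the swarm's global best.
DIM, N, STEPS = 2, 20, 200
w, c1, c2 = 0.72, 1.49, 1.49  # common textbook defaults (an assumption)

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(N)]
vel = [[0.0] * DIM for _ in range(N)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=sphere)

for _ in range(STEPS):
    for i in range(N):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if sphere(pos[i]) < sphere(pbest[i]):
            pbest[i] = pos[i][:]
            if sphere(pbest[i]) < sphere(gbest):
                gbest = pbest[i][:]

print(sphere(gbest))  # close to 0
```

No particle knows the landscape; the swarm finds the minimum through the shared `gbest` signal — an algorithmic echo of the stigmergic coordination discussed throughout this post.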
Laboratories around the world that follow this line of research include, in Europe, Marco Dorigo’s Swarm-Bots Project in Brussels and the Swarm-Intelligent Systems Group at EPFL in Lausanne, as well as CORO at Caltech in the USA, among many others.