The Distributed Computing Revolution
Friday, November 16, 2007 · Submitted by Acumensch
Tags: Transhumanism

The scientific and cultural problems our society faces today are being addressed systematically not only by teams of scientists, but by volunteer distributed computing networks worldwide. Users donate their machines' spare processing power to university research departments, which send out work units of various sizes to home computers in order to solve a wide range of computational and algorithmic problems.
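To make the model concrete, here is a minimal sketch of that loop in Python. It is an illustrative toy, not the actual protocol of any real project; the chunk size and function names are invented for the example.

```python
# A minimal sketch of the volunteer-computing model described above:
# a coordinator splits one large problem into small work units, and
# each volunteer machine repeatedly fetches a unit, crunches it, and
# sends the result back. This is an illustrative toy, not the actual
# protocol of Folding@Home, SETI@Home, or any real project.

from queue import Queue

def make_work_units(data, unit_size):
    """Chunk a large dataset into work units of a chosen 'packet size'."""
    return [data[i:i + unit_size] for i in range(0, len(data), unit_size)]

def crunch(unit):
    """Stand-in for the real computation (e.g. one folding trajectory)."""
    return sum(x * x for x in unit)

# Coordinator side: enqueue the units.
pending = Queue()
for unit in make_work_units(list(range(1_000)), unit_size=100):
    pending.put(unit)

# Volunteer side: pull units until none remain, return the results.
results = []
while not pending.empty():
    results.append(crunch(pending.get()))

print(f"{len(results)} work units completed, checksum {sum(results)}")
```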
At my office of Instructional Technology at the university I attend, we installed the Stanford-based project "Folding@Home" onto all of the lab machines and workstations this summer. Each computer on our campus works in its spare time on problems that could lead to a greater understanding of how diseases and disorders develop. Since then our campus has workhorsed its way to one of the top folding-team scores: we rank 518th out of nearly 90,000 teams worldwide thanks to our combined processing power. This is the genius of outsourcing scientific computation, and the genius of encouraging productive competition.
The structure of scientific knowledge is effectively decentralized by distributed computing projects. Probably the most notable is the Berkeley-based "SETI@Home" project, which outsources SETI's computational problems to volunteer workstations with the goal of discovering extraterrestrial life through the examination of radio-telescope data. The SETI project has 5.2 million participants worldwide but, alas, has yet to find a single positive candidate in space.
The digital revolution as a whole is also moving faster than the sustainability revolution. The SETI project, while admirable in its mission, comes at a cost to the environment. Since global power grids are not sustainable, the electricity these volunteer machines draw still comes foremost from coal and other conventional power plants; most of the energy produced in the United States, in fact, still comes from coal and coke. And one estimate puts the total equivalent energy cost of the SETI@Home project alone at $500,000,000.
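For a sense of how a figure like that might be reached, here is the back-of-envelope arithmetic in Python. Aside from the participant count quoted above, every input is an assumption invented for illustration, not a published SETI@Home figure; under these guesses the total lands in the same order of magnitude as the quoted estimate.

```python
# Back-of-envelope sketch of how an aggregate energy-cost estimate
# could be computed. Every input below except the participant count
# is an illustrative assumption, not a published SETI@Home figure.

participants    = 5_200_000   # machines, per the figure quoted above
avg_extra_watts = 50          # assumed extra draw while crunching
hours_per_day   = 8           # assumed daily crunch time per machine
days            = 365 * 7     # assumed project lifetime considered
price_per_kwh   = 0.08        # assumed USD per kilowatt-hour

kwh_total = participants * avg_extra_watts / 1000 * hours_per_day * days
print(f"~${kwh_total * price_per_kwh:,.0f} in equivalent energy costs")
# Under these assumptions: roughly $425 million, the same order of
# magnitude as the $500,000,000 estimate cited above.
```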
Among distributed computing projects, Folding@Home is one of the most practical. The project simulates problems that occur in lab settings (and real-life settings, of course) where proteins fold, or assemble themselves, in ways that are still little understood by the scientific community. When proteins fail to fold properly, all sorts of disorders can arise: Alzheimer's disease, Mad Cow disease, Creutzfeldt-Jakob disease, amyotrophic lateral sclerosis, Huntington's disease, Parkinson's disease, and many forms of cancer. The idea is that understanding the folding process will open paths to treating these diseases. Folding@Home has in fact received numerous awards since it began in 2000 for completing tasks that could not have been completed in the laboratories at Stanford alone, including one for developing efficient algorithms and methods of distributing computational problems, thus reducing the environmental costs.
In some sense, the environmental critique is no different from an environmental critique of anything else. We could say the critique is non-unique, in that it applies equally to all sorts of other energy-intensive uses. Distributed computing is in fact not especially energy-intensive on a local level, since it runs at low priority and does not take precedence over other system processes. Yet distributed computing can be compared to less admirable uses of computational power, such as the aggregate energy spent by home computers viewing vast amounts of pornography every day. The environmental cost of pornography, to my knowledge, has never been calculated. But that activity does not funnel its energy into a single source the way distributed computing does, so it is difficult to hold anyone accountable. The only option would be to shame the entire pornographic community.
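That low-priority behavior is easy to picture in code. Below is a minimal sketch, assuming a POSIX system, of how a volunteer client can demote itself so the operating system only feeds it cycles that would otherwise sit idle; the work loop is a stand-in, not any real client's code.

```python
# Minimal sketch of how a volunteer client yields to other work: it
# lowers its own scheduling priority so the OS only gives it CPU time
# the machine would otherwise leave idle. POSIX-only (os.nice is not
# available on Windows); the loop body is a toy stand-in.

import os

try:
    os.nice(19)  # lowest priority: run only when the CPU is idle
except (AttributeError, OSError):
    pass  # e.g. on Windows, or if the priority is already floored

def crunch_forever():
    while True:
        # stand-in for one slice of real scientific computation
        _ = sum(i * i for i in range(10_000))

# crunch_forever()  # uncomment to run; it consumes only idle cycles
```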
While distributed computing projects have not solved the sustainability problem itself, they should not be singled out as greenhouse-gas proliferators any more 'unique' than other activities. That problem should be addressed separately, or perhaps addressed directly by distributed computing projects themselves. The most useful distributed computing project at this time would be one that tackles the creation of sustainable energy alternatives, for instance by distributing algorithmic problems to determine the most efficient way to produce energy from, say, photovoltaic cells.
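As a sketch of what that might look like: the toy below sweeps a grid of hypothetical cell-design parameters, chunks the grid into work units the way a volunteer network would, and reports the best configuration found. The 'efficiency' function and all the parameters are made-up stand-ins, not real photovoltaic physics.

```python
# Toy sketch of the distributed search proposed above: sweep a grid of
# hypothetical photovoltaic design parameters, chunk the grid into work
# units (one per volunteer), and keep the best configuration. The
# scoring function is a made-up stand-in, not real device physics.

from itertools import product

def efficiency(thickness_um, doping, angle_deg):
    """Hypothetical score; a real project would run a physics
    simulation here. This toy peaks at (2.0, 0.5, 30)."""
    return (-(thickness_um - 2.0) ** 2
            - (doping - 0.5) ** 2
            - (angle_deg - 30) ** 2)

grid = list(product(
    [1.0, 1.5, 2.0, 2.5, 3.0],   # wafer thickness, micrometres
    [0.1, 0.3, 0.5, 0.7, 0.9],   # doping fraction
    [0, 15, 30, 45, 60],         # panel tilt, degrees
))

# Chunk the search space into work units, one per volunteer machine.
units = [grid[i:i + 25] for i in range(0, len(grid), 25)]

# Each 'volunteer' scores its unit; the coordinator keeps the best.
best = max((max(unit, key=lambda p: efficiency(*p)) for unit in units),
           key=lambda p: efficiency(*p))
print("best design found:", best)
```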
A similar distributed computing program at DARPA uses scalable modeling to determine how and when the next terrorist attack might occur, and under what circumstances. Perhaps the same model could be used to determine under what circumstances our culture will become sympathetic to the damage our activities cause to the environment, and under what circumstances that culture will act on the environment's behalf. But by the time the algorithms are solved and the computational problems reach a standstill, our society's traditional energy sources will have expired and all our research will have been in vain.
We already understand the problems our society faces today; the real obstacle is that the potential benefits of the distributed computing revolution are greatly hindered by governmental policies and subsidies that prop up harmful, archaic energy industries.