Wrestling with Convergence, Part 1: What Convergence Hath Wrought
  • Emanuel Yi Pastreich
  • Published 2011.08.19 17:21

The Asia Institute recently held a round-table discussion on the topic of technology convergence. The discussion was led by the director of the Asia Institute, Emanuel Pastreich, who serves as a professor at Humanitas College of Kyung Hee University. Also in attendance were Charlie Wolf, director at the Social Impact Assessment Center; Paul Callomon, collections manager at the Academy of Natural Sciences; Stephanie Wan, the YGNSS Project Co-Lead and the North, Central America & Caribbean Regional Coordinator of the Space Generation Advisory Council; Daniel Lafontaine, business consultant at AMA Korea; Alan Engel, president of Paterra, Inc. of Japan; Matthew Weigand, founder of Responsiv.Asia; Tahir Hameed, research fellow at the Korea Advanced Institute of Science and Technology (KAIST); and Vince Rubino, Global Team Leader of Business Development at the Korea Institute of Toxicology. In this first part of a five-part series, the experts discussed the social implications of the rapid technological change that convergence represents.

Emanuel Pastreich: Moore's Law holds that "the number of transistors that can be placed inexpensively on an integrated circuit has doubled every two years for the last half century." This phenomenon could be said to drive all the central trends in society, in related technologies, in communications, and, increasingly, in the merging of fields of activity and inquiry.
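To make the arithmetic behind that doubling concrete, here is a minimal Python sketch (an editorial illustration, not part of the discussion) that projects a transistor count forward under an assumed fixed two-year doubling period; the 1971 figure of roughly 2,300 transistors for the Intel 4004 is used only as an illustrative baseline.

```python
# Minimal illustration of the exponential growth Moore's Law describes.
# Assumes a fixed two-year doubling period; the ~2,300-transistor Intel 4004
# of 1971 is used purely as an illustrative starting point.

def transistors(initial_count: float, years_elapsed: float,
                doubling_period: float = 2.0) -> float:
    """Project a transistor count forward under a fixed doubling period."""
    return initial_count * 2 ** (years_elapsed / doubling_period)

if __name__ == "__main__":
    start_year, start_count = 1971, 2300
    for year in (1981, 1991, 2001, 2011):
        print(year, f"{transistors(start_count, year - start_year):,.0f}")
```

Run as written, the sketch doubles the count five times per decade and reaches the billions by 2011, which is the scale of the trend the discussion takes as its starting point.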


Frances Cairncross's seminal book The Death of Distance postulated a radical drop in the cost of transporting information that is changing our world. That drop has been even steeper than expected because of the constant increase in computational power.

But the ability to combine nanotechnology and biotechnology is also driven by Moore's Law, because the task of manipulating carbon or silicon at the nano level has become so much easier and cheaper. Repeated nano-fabrication and sequencing are on their way to being as free as sending e-mail.

For that matter, in the study of literature we now find scholars using computers to trace common words and the evolution of expressions over centuries, taking advantage of the vastly cheaper supercomputing power available today.
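As a rough sketch of that kind of computational text analysis (an editorial illustration; the tiny corpus and its dates are made up), the Python snippet below tallies word frequencies by century, the sort of counting that cheap computing power now lets scholars run across entire digitized libraries.

```python
# A toy version of tracking word usage across centuries.
# The corpus here is a made-up stand-in for a real digitized text collection.

from collections import Counter, defaultdict

corpus = [
    (1600, "the quality of mercy is not strained"),
    (1719, "I was born in the year 1632 in the city of York"),
    (1851, "call me Ishmael some years ago never mind how long"),
]

counts_by_century = defaultdict(Counter)
for year, text in corpus:
    century = year // 100 + 1            # e.g. 1600 falls in the 17th century
    counts_by_century[century].update(text.lower().split())

for century in sorted(counts_by_century):
    print(f"{century}th century:", counts_by_century[century].most_common(3))
```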

So when we talk about radical social change, perhaps in almost every case we need to start with Moore's Law and its implications for specific technologies. Let me give a concrete example. At the University of Illinois, where I taught for six years, there is a next-generation supercomputing program called Blue Waters at the National Center for Supercomputing Applications (NCSA). Its description reads: "Scientists will create breakthroughs in nearly all fields of science using Blue Waters. They will predict the behavior of complex biological systems, understand how the cosmos evolved after the Big Bang, design new materials at the atomic level, predict the behavior of hurricanes and tornadoes, and simulate complex engineered systems like the power distribution system and airplanes and automobiles."

Thus, with such computing power, modeling and making full use of any physical structure or system becomes vastly easier.

Charlie Wolf: Speaking of the long-term social implications of technological change, the trends you note under Moore's Law would seem to pose some deep and as yet ill-defined questions for the social management of technology. It is unclear what is driving these trends and who is (or ought to be) in the driver's seat.

For example, they seem to suggest a counter trend in the intensification of social complexity that no amount of computational power is likely to solve or resolve. The current debate over net neutrality might be a case in point, not to mention the WikiLeaks fracas. Clearly, a major part of transformational technology assessment should be addressed to the downside of such development, and to anticipating the unanticipated consequences attendant to it.

Emanuel Pastreich: Well, the problem is to some degree one of overcapacity. The next generation of supercomputers will have such capacity that there will be two significant threats: first, that they will create complexity and problems just to take advantage of their full capacity. This is already happening. And second, that they will be able to create and sustain entire virtual universes over long periods of time that will evolve along their own trajectories and eventually create immense confusion as to what is true, because the false world has a convincing mimetic texture.

Charlie Wolf: Thanks for your response. I was thinking outside the box about the general relationship between system complexity and vulnerability. I come by this honestly, by way of air traffic control and nuclear safety systems, and most recently through a slight acquaintance with knowledge management.

There is another field out there, complexity science, which we might consider contacting. At any rate, I consider complexity management a professional responsibility deserving of further reflection and action. To frame actionable research questions in this area would be a useful exercise.

Paul Callomon: Complexity gradually works to seal systems against entry except through purpose-designed portals. You might be able to "hop over the wall" directly into a complex science, but without passing first through the halls of learning and experience, what you see will be incomprehensible.

As Moore's Law enables computers to create ever-greater system complexity, so the distance between the gate and the inner sanctum where the action is will become so great as to require either a very early start (precluding mid-career shifts, for example) or the packaging of whole chunks of evolutionary ontogeny into modules in whose use one can be trained but of whose composition one has no time to learn.

To those of us who use computers as tools in the way earlier generations used chisels or typewriters, not knowing how they work is no impediment whatsoever. Learning how to use them as a child is the knowledge equivalent of the "no-phones-to-cell-phones" phenomenon in China and India. As the previous posts indicate, however, there are wider implications for security in requiring even elementary school kids to use $500 machines in order to get pass grades, when half the world can barely write.

Emanuel Pastreich: One question is whether the technology will eventually make the evolution of computers as alien to human agency as the genetic evolution of animals. If there were a disaster and all research institutes were destroyed, then, unlike previous technologies, the computer chip could not be recreated, because all the previous generations of computers were required to design it. So you could end up with a technology that can be reproduced, but that humans can no longer design from scratch.

Paul Callomon: The far greater peril, to my mind, is that we will continue (and with the best and most innocuous of intentions) to broaden the information gap between the heavily computerized and systems-dependent societies of the first world, and the poor but zealous ones elsewhere. When people feel they have no way of winning even basic respect from those in the mansions, they tend to band together and conspire to burn those mansions down. And as we have seen, it doesn't take much more than determination and an utter disregard for one's own fate to strike serious blows.

Emanuel Pastreich: That point may be true.

I was not so much imagining a dystopia, as proposing an interesting situation. It is the difference between a candle maker who makes candles and a farmer who milks a cow. The candle maker can actually make the candle. The farmer can get milk from the cow, but he cannot make a cow. Eventually we may get to the point that we can guide the evolution of technology, but we cannot actually create it again because we have lost the path by which it evolved.

Paul Callomon: I think that is less of a danger with technologies than with languages, and the point to remember is that even when a language is lost, people do not become unable to speak. How much of a loss to humanity the extinction of a language represents is a matter of values. One day, there may be no-one on Earth who remembers how to render tallow from sheep fat, and that will be the end of tallow candles. I doubt whether anyone will miss them, though. Their ascendancy was the result of a specific set of historical circumstances - there were sheep, sheep fat burns, and it can be made cleaner and hotter by rendering it - not their superiority to, say, whale oil, town gas or electricity as a lighting medium.

How many other technologies developed not because they were the best way of doing something, but because they were feasible with the knowledge and materials to hand at the time? Think, for example, of the simple task of recording and replaying sound.

Emanuel Pastreich: If you are interested in the question of computer integration and convergence, and the implications for society, the classic film "Colossus: The Forbin Project" presents a very interesting scenario in which a computer encircles the world and takes control.

