How to Design a Safe Nano Robot or Virus
  • Korea IT Times
  • Published 2010.07.02 11:30

Ethics issues of a different type have begun to spring up in science and technology due to a combination of factors.  There are credibility issues that may hatch into trust issues, resulting from the intentional and unintentional (and, in a democracy at least, expected) questioning of findings in environmental studies, as scientists from various fields and varying epistemological backgrounds grapple with timeframes, conceptualisations, and facts.  But as consumers and scientists come to understand that science and technology's capacity to innovate is increasing rapidly, great thinkers on the side of wisdom are needed to steer us from wanton disaster.

Benjamin Franklin working at his desk

It is interesting and very noteworthy that modern society looks back for ethical guidance to thinkers such as Aristotle and Asimov.  Human ethics, in the sense of acceptable standards, have likely improved in some cases, yet only the innovation of technologies has its Moore's Law-style calculations; there is no such law for ethics, a most worthy calculation.

Isaac Asimov prescribed three laws for the design of robots, laws that movies (and, consumers worry, some scientists too) have reversed to create plots intended to entertain for 120 minutes.  Isaac Asimov's Three Laws of Robotics:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with either the First or Second Law.
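The hierarchy in these three laws is strict: each law yields to the ones above it. A toy sketch of that precedence in code (the class, field names, and function here are my own illustration, not from Asimov or this article):

```python
# Toy model of Asimov's Three Laws as a strict priority ordering.
# An action is described by three illustrative boolean flags; permitted()
# checks the laws in order, so a higher law always vetoes a lower one.
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool = False        # Law 1: harm, directly or by inaction
    ordered_by_human: bool = False   # Law 2: a human order to perform it
    endangers_robot: bool = False    # Law 3: risks the robot's existence

def permitted(action: Action) -> bool:
    # Law 1 overrides everything: never allow harm to a human being.
    if action.harms_human:
        return False
    # Law 2: obey human orders (this action already passed the Law 1 check),
    # even if carrying out the order endangers the robot itself.
    if action.ordered_by_human:
        return True
    # Law 3: self-preservation applies only when Laws 1 and 2 are silent.
    return not action.endangers_robot
```

For example, `permitted(Action(harms_human=True, ordered_by_human=True))` is false because Law 1 outranks Law 2, while an ordered action that merely endangers the robot is permitted because Law 2 outranks Law 3.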

 

As scientists get closer to making items such as Minority Report's screen a reality, the voice of caution seems to ask, too: what is a robot?  Nanotechnology has taken flight into the realms of fantasy, and so have consumers' fears about what motivates nano.  As companies post pictures of ingestible RFID tags, the Center for Responsible Nanotechnology [CRN] lists the major pitfalls, as they correspond to each of Asimov's laws, as follows:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

  • Nanotech weapons would be extremely powerful and could lead to a dangerously unstable arms race.
  • Criminals and terrorists could make effective use of the technology.
  • Extreme solutions and abusive regulations may be attempted [this is the surveillance concern].
  • Too little or too much regulation can result in unrestricted availability [Why worry about the human spies ... Beware the atomic spies].
  • Competing nanotech programs increase the danger [the new arms race...acknowledged].

 

2. A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.

  • Grey goo was an early concern of nanotechnology.
  • Too little or too much regulation can result in unrestricted availability [Why worry about the human spies ... Beware the atomic spies and remote activation of ingestibles].

 

3. A robot must protect its own existence as long as such protection does not conflict with either the First or Second Law.

  • Criminals and terrorists could make effective use of the technology.
  • Extreme solutions and abusive regulations may be attempted [this is the surveillance concern].
  • Collective environmental damage is a natural consequence of cheap manufacturing, as are health risks.
  • Grey goo was an early concern of nanotechnology.
  • Too little or too much regulation can result in unrestricted availability.

 

New robotic ethical points raised by the CRN, not falling clearly under Asimov's laws:

  • Disruption of the basis of economy is a strong possibility.
  • Major investment firms are conscious of potential economic impact.
  • Nano-built products may be vastly overpriced relative to their cost, perpetuating unnecessary poverty.
  • Society could be disrupted by the availability of new "immoral" products.
  • Collective environmental damage is a natural consequence of cheap manufacturing, as are health risks.
  • Grey goo [the ultra-modern 'oil spill'] won't happen by accident, but eventually could be developed on purpose.
  • Competing nanotech programs increase the danger.
  • Relinquishment is counterproductive [weapons facilities just got harder to spot].
  • Solving these problems won't be easy [We are talking ARMS, people; I, the nonchalant consumer, am not aware of a United Nations working group on nano non-proliferation thus far].

Above all, the CRN says, 'a single grey goo release, or unstable nanotech arms race, is intolerable. Threading a path between all these risks will require careful advanced planning.' Yet manufacture has begun ...

A presentation on the Korean Roboethics charter: www.roboethics.org/icra2007/contributions/.../Shim_icra%2007_ppt.pdf

The CRN encourages ALL to have their say and more information is available at: http://www.crnano.org/index.html

 


