Robots Almost Conquering Walking, Reading, Dancing
  • Matthew Weigand
  • Published 2009.08.18 11:03

The FIRA RoboWorld Congress 2009 was full of interesting presentation sessions on a variety of robotics-related topics, from Terrain Mapping to Emotion, Behavior, and Interaction. One of the more interesting sessions was called Biped / Humanoid Robotics. It dealt specifically with developing the robots most often found in science fiction - the ones that look like us. The session held three presentations: one about a robot that can read words, a second about a robot that can dance, and a third about a robot that has almost all the degrees of movement a person has.

Yu-Te Su, from the National Cheng Kung University, Taiwan

Yu-Te Su, a student at National Cheng Kung University in Taiwan, presented first. His robot, the aiRobot-2, was learning to look at words printed in black on a white background and posted on a wall, then read each word aloud. It does this by extensively filtering the image it receives from the sensor in its head. First it discards every part of the image that is not a word. It then divides the word area into letter areas and divides each letter area into a 5x5 grid. Comparing the pattern of light and dark grid squares mathematically against known letters, it chooses the closest match. After all this computation, it pronounces the word through its on-board speaker. The whole process takes about a minute.
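The grid-comparison step can be sketched as nearest-template matching on 5x5 binary grids. This is only a minimal illustration of the idea described above; the templates, letter set, and function names here are hypothetical, not taken from the presentation:

```python
# Sketch of 5x5-grid letter matching (hypothetical templates and names).
# Each letter is reduced to a 5x5 binary grid; recognition picks the stored
# template with the smallest Hamming distance (fewest differing cells).

TEMPLATES = {
    # 1 = dark cell, 0 = light cell; rows listed top to bottom
    "T": (1,1,1,1,1,
          0,0,1,0,0,
          0,0,1,0,0,
          0,0,1,0,0,
          0,0,1,0,0),
    "L": (1,0,0,0,0,
          1,0,0,0,0,
          1,0,0,0,0,
          1,0,0,0,0,
          1,1,1,1,1),
    "I": (1,1,1,1,1,
          0,0,1,0,0,
          0,0,1,0,0,
          0,0,1,0,0,
          1,1,1,1,1),
}

def hamming(a, b):
    """Count the grid cells where two 5x5 patterns disagree."""
    return sum(x != y for x, y in zip(a, b))

def classify(grid):
    """Return the template letter closest to the observed 5x5 grid."""
    return min(TEMPLATES, key=lambda letter: hamming(TEMPLATES[letter], grid))

# A "T" with one flipped cell of sensor noise still matches the T template.
noisy_t = (1,1,1,1,1,
           0,0,1,0,0,
           0,0,1,0,0,
           0,1,1,0,0,   # one-cell noise
           0,0,1,0,0)
```

Because the match is a nearest-neighbor comparison rather than an exact lookup, a few misclassified cells from imperfect thresholding still resolve to the right letter.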

The second presentation was by David Grunberg, a student from Drexel University in the US. His presentation was about developing a robot that can listen to a song, find the beat, and start dancing to it. Robots have danced to music before, but only by following a pre-arranged set of moves, paying no attention to the music itself; syncing the robot up with the music is left to its handlers.

Grunberg's team set out to make robots actually pay attention to the music, using a RoboNova model robot to develop the ability. They devised a way to identify the probable beat and perform moves based on it, but processing lag meant the robot could not identify the beat and move in time to it simultaneously. They ended up hooking the robot up to a faster processor, which let it move in time with the beat.
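Beat identification of this kind is often done by autocorrelating an onset-energy envelope of the audio. The sketch below assumes that approach; the team's actual method is not described above, and all names and parameters here are illustrative:

```python
# Minimal tempo-estimation sketch via autocorrelation of an onset-energy
# envelope (an assumed technique, not the Drexel team's documented method).
# The envelope is sampled at a fixed frame rate; the lag with the strongest
# self-similarity gives the beat period, from which moves can be scheduled.

FRAME_RATE = 100  # envelope frames per second (hypothetical)

def estimate_beat_period(envelope, min_lag=30, max_lag=120):
    """Return the lag (in frames) with the highest autocorrelation,
    i.e. the most likely beat period between min_lag and max_lag."""
    def autocorr(lag):
        return sum(envelope[i] * envelope[i + lag]
                   for i in range(len(envelope) - lag))
    return max(range(min_lag, max_lag + 1), key=autocorr)

# Synthetic envelope: an energy spike every 50 frames (120 BPM at 100 fps).
envelope = [1.0 if i % 50 == 0 else 0.0 for i in range(1000)]
period = estimate_beat_period(envelope)          # 50 frames
bpm = 60.0 * FRAME_RATE / period                 # 120 BPM
```

Even this toy version hints at the lag problem the team hit: the autocorrelation is a sum over the whole envelope, so computing it quickly enough to dance in real time demands a fast processor.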

Jacky Baltes, professor of computer science at the University of Manitoba

The third presentation was by Jacky Baltes, a professor of computer science at the University of Manitoba. Originally from Germany, Dr. Baltes has done work both in New Zealand and Canada. He gave a presentation on the new robot which his department received from Peter Kopacek, a retired professor of the Vienna Technical University. Dr. Baltes has inherited the robot and one Ph.D. student, Ahmad Byagowi. His goal is to make the robot dance.

This particular robot, named ARCHIE, was built with the explicit goal of mimicking human motion as closely as possible. It was built with all the degrees of freedom that a person's body has, and with custom-built joints that are unique in the robotics world. It does not use servomotors as most robots do, and therefore does not suffer from a servomotor breaking once a month as most other robots do. Another advantage over other robots is an articulated foot: the front of each foot can bend up and down, letting the foot flex into a shape the researchers anticipate will be good for running. For now, though, the team is still working on the basics. The robot is about the size of a child, 3 feet tall or so. Its extremely long arms and legs attached to a tiny frame make it look a little disconcerting, and the mannequin's head on top, with a puppet's mouth, does not help; it is a placeholder for a more functional head.

Altogether, the session was surprising in unexpected ways. One does not really appreciate how difficult it is to recreate even small things a person can do, such as recognizing beats or reading words, until one tries to do it with a robot. It shows that robotics has a long way to go before passable robots can walk around doing small tasks for people. And yet, with constant advancement, the future is bright.



  • Masthead: Korea IT Times. Copyright(C) Korea IT Times, All rights reserved.