Signal Processing for Ubiquitous Healthcare Applications
  • Cha Joo-hak
  • Approved 2009.05.08 19:14

Cha Joo-hak is the CEO and Representative Director of Mobicomm Inc. and KW U-Globe Corp.
Ubiquitous Healthcare shows its potential by facilitating the exchange of information between clinicians or between institutions, reducing costs, extending the scope and reach of medical facilities, enhancing the quality of services offered on-site, and providing new means of medical supervision and preemptive medicine. The integration of medical networking and medical information systems is now taken for granted: standalone medical networking environments are no longer the norm, and the term telemedicine is often used interchangeably with Ubiquitous Healthcare.

Let us review the field of Ubiquitous Healthcare and its supporting technologies, starting from what is typically referred to as the lower end: the signals themselves, a basic concept from the field of signal processing. Two good examples are the electroencephalogram (EEG) and the electrocardiogram (ECG or EKG).

Medical signals are the observable phenomena or stimuli of biological systems. To capture and document a signal, a medical practitioner may use a simple examination procedure such as measuring body temperature, but may sometimes have to resort to highly specialized and intrusive equipment such as an endoscope. After signal acquisition, practitioners go on to a second step, that of interpreting its meaning. This is usually done after some kind of signal enhancement or preprocessing that separates the captured information from noise and prepares it for specialized processing, classification, and recognition algorithms. Only then does the result of the acquisition process reveal the true meaning of the physical phenomenon that produced the signal under investigation.

As a general rule, the particular techniques depend on the actual nature of the signal and the information it may convey. Signals in medicine-related applications are found in many forms, such as bioelectric signals that are usually generated by nerve and muscle cells. Besides these, tissue bioimpedance signals may contain important information such as tissue composition, blood volume and distribution, or endocrine activity, while bioacoustic and biomechanical signals may be produced by movement or flow within the human body. This could be blood flow in the heart or veins, blood pressure, joint rotation or displacement, or other movements.

While medical signal processing techniques can sometimes be performed on raw analog signals, advanced frequency-domain methods typically require a digitization step to convert the signal into a digital form. Besides enabling techniques that would otherwise be impossible, digital signals are much more efficient when it comes to storing them, transmitting them over networks, or feeding them to automated feature extraction and recognition algorithms. The process begins with acquiring the raw signal in its analog form, which is then fed into an analog-to-digital (A/D) converter. Since computers cannot handle or store continuous data, the first step of the conversion procedure is to produce a discrete time series from the analog form of the raw signal. This step is known as sampling, and it creates a sequence of values taken from the original analog signal at predefined intervals, from which the initial waveform can be faithfully reconstructed. For this to happen, the sampling frequency must be at least double the signal bandwidth. This requirement is known as the Nyquist-Shannon sampling theorem and, in theory, it is the deciding factor for the sampling process to produce a faithful representation of the captured signal. In practice, however, picking the sampling frequency with only this theorem in mind may lead to other equally important problems, such as aliasing of out-of-band noise.
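To make the aliasing risk concrete, the short sketch below shows that a 5 Hz sine sampled at 8 Hz, below its 10 Hz Nyquist rate, produces exactly the same samples as a 3 Hz tone, so the original waveform is unrecoverable. The frequencies here are illustrative values, not drawn from any medical device:

```python
import math

# Illustrative values: a 5 Hz tone sampled at 8 Hz, below its 10 Hz Nyquist rate.
f_sig = 5.0
f_samp = 8.0
n_samples = 8          # one second of data

# Samples of the 5 Hz sine.
samples_5hz = [math.sin(2 * math.pi * f_sig * n / f_samp)
               for n in range(n_samples)]

# The alias: 5 Hz folds down to 5 - 8 = -3 Hz, i.e. a 3 Hz tone of opposite phase.
samples_alias = [math.sin(2 * math.pi * (f_sig - f_samp) * n / f_samp)
                 for n in range(n_samples)]

# The two sequences agree sample for sample, so no algorithm can tell
# the 5 Hz original apart from its 3 Hz alias after sampling.
identical = all(abs(a - b) < 1e-9 for a, b in zip(samples_5hz, samples_alias))
```

This is why practical capture stations place an analog anti-aliasing filter before the A/D converter rather than relying on the sampling rate alone.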

The second step of the digitization process is quantization, which works on the temporally sampled values of the initial signal and produces a signal that is both temporally and quantitatively discrete. This means that the initial values are converted and encoded according to properties such as bit allocation and value range. Essentially, quantization maps the sampled signal into a range of values that is both compact and efficient for algorithms to work with.
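A uniform quantizer of the kind described above can be sketched in a few lines; the bit depth, value range, and function names below are illustrative assumptions rather than part of any standard:

```python
# Illustrative uniform quantizer: map a sample in [lo, hi] onto 2**bits levels.
def quantize(sample, bits=8, lo=-1.0, hi=1.0):
    """Return (integer code, reconstructed value) for one sample."""
    levels = 2 ** bits
    step = (hi - lo) / (levels - 1)
    clipped = min(max(sample, lo), hi)      # clip to the encoder's range
    code = round((clipped - lo) / step)     # integer code in 0 .. levels - 1
    return code, lo + code * step           # the value the decoder will see

# With only 3 bits there are 8 levels between -1 and 1,
# so 0.5 is encoded as level 5 and reconstructed as 3/7 (about 0.43).
code, approx = quantize(0.5, bits=3)
```

Each extra bit halves the quantization step and hence the worst-case rounding error, which is the trade-off behind the bit-allocation and value-range properties mentioned above.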

Although there are a number of algorithms that operate on the sampled and quantized digital signals directly, most of the techniques that extract temporal information from and recognize patterns in a signal are performed in the frequency domain. The foundation of these algorithms is the fact that any analog or digital signal can be represented as an integral or sum of fundamental sine functions with varying amplitudes and phases. The reversible transformation between the initial representation and that of the frequency domain is given by the Fourier Transform (FT), which is defined in both continuous and discrete forms. An interesting alternative to the frequency representation is to apply the FT to the correlation function, resulting in the power spectral density function (PSD), which finds prominent use in recognizing the pathological states of EEGs.
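The PSD construction just described (applying the FT to the correlation function, a relationship known as the Wiener-Khinchin theorem) can be sketched with a naive DFT; the 4 Hz test tone, 32 Hz sampling rate, and helper names are illustrative choices:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform, O(N^2) but fine for short signals."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def autocorrelation(x):
    """Circular autocorrelation of a real signal."""
    N = len(x)
    return [sum(x[n] * x[(n + lag) % N] for n in range(N)) / N
            for lag in range(N)]

# Illustrative test tone: 4 Hz sampled at 32 Hz for one second.
N, f_samp, f_sig = 32, 32.0, 4.0
x = [math.sin(2 * math.pi * f_sig * n / f_samp) for n in range(N)]

# PSD as the Fourier transform of the correlation function.
psd = [abs(c) for c in dft(autocorrelation(x))]

# The spectral peak lands in bin 4, i.e. at 4 * f_samp / N = 4 Hz.
peak_bin = max(range(N // 2), key=lambda k: psd[k])
```

Production code would use an FFT instead of this O(N^2) transform, but the spectral peak it locates is the same.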

An ECG signal represents the electrical activity produced by the heart as acquired at the skin's surface, and it has been available since almost the beginning of the 20th century. Indeed, some of the techniques used to capture the signal - usually at the arms and legs - and to label its salient features originate from the pioneering work of Einthoven. ECGs were also among the earliest acquired signals to be transferred to remote locations over phone lines. In modern capture stations, dedicated hardware performs the A/D conversion of the differential signals captured at up to 12 sites on the human body. These 12 leads are then combined to produce conclusive measurements, which are fed to classification algorithms. ECGs are typically used to determine the rate and rhythm of the atria and ventricles and to pinpoint problems between or within the heart chambers. In addition, the signal can reveal evidence of myocardial blood perfusion, ischemia, or chronic alteration of the mechanical structure of the heart.
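As a toy illustration of rate estimation, the first use listed above, the sketch below detects R-peaks in a synthetic spike train by simple thresholding. The sampling rate, waveform, and thresholds are invented for the example, and real QRS detectors are considerably more elaborate:

```python
# Synthetic "ECG": narrow unit spikes on a flat baseline, one every 200 samples.
f_samp = 250.0              # illustrative sampling rate, samples per second
beat_interval = 200         # 200 samples per beat -> 75 beats per minute
n_samples = 2000

signal = [0.0] * n_samples
for i in range(0, n_samples, beat_interval):
    signal[i] = 1.0         # idealized R-peak

def detect_peaks(x, threshold=0.5, refractory=50):
    """Indices crossing the threshold, with a refractory period so a
    single beat is never counted twice."""
    peaks, last = [], -refractory
    for i, v in enumerate(x):
        if v >= threshold and i - last >= refractory:
            peaks.append(i)
            last = i
    return peaks

peaks = detect_peaks(signal)

# Mean R-R interval in samples, converted to beats per minute.
rr = [b - a for a, b in zip(peaks, peaks[1:])]
bpm = 60.0 * f_samp / (sum(rr) / len(rr))   # 75.0 for this synthetic trace
```

The same R-R intervals, once extracted, are what rhythm-analysis algorithms inspect for irregularities.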

Emergency ECG processing caters to the early detection of possibly severe pathologies, since the permanent or temporary effects of even minor cardiac dysfunctions on the waveform can be detected and classified. Regarding acquisition, the bulky Holter devices built in the 1960s have now been replaced by compact devices suited to easy deployment at mobile sites. Initially, Holter probes were combined with 24-hour recording and playback equipment to help identify periods of abnormal heart rate, possibly attributable to a developing heart block. These devices possessed minimal heartbeat segmentation capabilities and could also present results visually and audibly. In addition, identification of premature ventricular complexes (PVCs) and relevant treatment is also possible, and it has recently been enhanced by high-speed, hardware-assisted playback. Despite the decline in its use, the emergency ECG is still a very useful early detection tool. Based on devices with dedicated memory and storage capabilities, it can provide automated signal processing and classification without the need for a host computer.

High-resolution ECGs are the most recent improvement in signal acquisition and processing. They are capable of recording signals of relatively low magnitude that occur after the QRS complex - the deflection in the ECG that represents the ventricular activity of the heart - but are not evident on the standard ECG. Such signals can be related to an abnormally rapid heart rate, called ventricular tachycardia. ECG signal processing lends itself well to knowledge-based classification techniques. As a result, the best known and most widely used methods have been employed in this framework, including Bayesian algorithms and Markov models. Neural networks have also been put to use here, often with better results, since they can be designed to operate with incomplete data, which is quite often the case with ECG data sets.
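As a hedged illustration of the Bayesian approach mentioned above, the sketch below trains a two-class Gaussian naive Bayes classifier on made-up beat features (QRS width and preceding R-R interval, both in seconds). The data, feature choice, and class labels are invented for the example and assume equal class priors:

```python
import math

# Invented training data: (QRS width in s, R-R interval in s) per beat.
normal = [(0.08, 0.80), (0.09, 0.82), (0.07, 0.78), (0.08, 0.85)]
pvc    = [(0.14, 0.55), (0.15, 0.60), (0.16, 0.58), (0.13, 0.52)]

def fit(samples):
    """Per-feature mean and variance for one class."""
    n = len(samples)
    stats = []
    for j in range(len(samples[0])):
        col = [s[j] for s in samples]
        mu = sum(col) / n
        var = sum((v - mu) ** 2 for v in col) / n
        stats.append((mu, var))
    return stats

def log_likelihood(x, stats):
    """Sum of per-feature Gaussian log densities (the 'naive' independence)."""
    return sum(-0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
               for v, (mu, var) in zip(x, stats))

classes = {"normal": fit(normal), "PVC": fit(pvc)}

def classify(x):
    """Pick the class whose Gaussian model best explains the feature vector."""
    return max(classes, key=lambda c: log_likelihood(x, classes[c]))

# A wide QRS with a short preceding R-R interval is classified as a PVC.
label = classify((0.15, 0.57))
```

A production classifier would of course be trained on annotated clinical recordings and validated accordingly; the point here is only the shape of the Bayesian decision rule.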


  • Masthead: Korea IT Times. Copyright(C) Korea IT Times, All rights reserved.