Current BSU Projects
- Fluoroscopic analysis of knee joint kinematics: a fluoroscopy image matching system
- Segmentation of ultrasound images
- Automated counting of stained cells
- Segmentation of Visual Human muscles
- Developing a graphical user interface for analysis of electromyography (EMG) signals
Viewing medical images such as
- MRI images
- ultrasound images
- optical microscope images
allows doctors to better diagnose illnesses, and allows researchers to better understand human physiology in order to develop better therapies and design better rehabilitation or surgical devices. The raw images may not present information in a format useful for interpretation, or may require too much human labor to interpret. Adding automated image processing procedures can make the desired information available to downstream applications.
Fluoroscopic Analysis of Knee Joint Kinematics: A Fluoroscopy Image Matching System
Application areas: Biomedical image and signal processing, mechanics
Female athletes sustain anterior cruciate ligament (ACL) injuries at rates three to seven times higher than male athletes in the same sports. Several studies have recently suggested that differences between the genders in the mechanics of landing from jumps may result in increased ACL loads in female athletes. To date, no studies have quantified the internal kinematics of the knee joint during landings in athletes of either gender. Because the ACL connects the femur and tibia at the knee joint, relative motion between these two bones during landing may predispose the ligament to injury. Accurate bony motion data cannot be collected using standard non-invasive motion capture techniques. However, Dr. Michelle Sabick (Mechanical Engineering) and Dr. Elisa Barney Smith (Electrical Engineering) are combining computer vision and image processing algorithms to develop minimally invasive techniques, using new medical imaging technologies, to quantify joint motion in live human subjects. This technique matches 3-D images of human joints with 2-D video fluoroscopy (video X-ray) image sequences to track the motions of bones at a joint very accurately. This will enable researchers to collect accurate, three-dimensional kinematic data of bones and joints in vivo and to quantify precisely how bones in a joint move relative to one another during dynamic activities. Knowledge of the exact spatial position of the two joint bones will allow biomedical researchers to develop techniques to diagnose the extent of joint injuries.
This project involves the design of computer vision and image processing algorithms to match 3-D images of human joints with 2-D fluoroscopy (video X-ray) image sequences. The data for this project consist of sets of CT images representing a 3-D volume, and 2-D fluoroscopy image sequences of the same joint. The CT data have already been processed to extract a 3-D solid model of the bones. The procedure is to:
- Segment the 3-D CT volume to separate the two bones into two different 3-D CT volumes
- Match the 3-D solid models to the bones in the segmented CT volumes
- Use projection software developed for this research to produce 2-D simulated fluoroscopy images through a ray-tracing process
[Figure: the ray-tracing process used to form a simulated X-ray; a simulated X-ray shown alongside a real X-ray from the fluoroscopy sequence]
- Develop edge detection algorithms to locate the bone edges in the real and projected fluoroscopy images
- Use simulated annealing or another optimization procedure to iteratively adjust the pose of each bone model in 6 degrees of freedom (3 position and 3 orientation) until the edges detected in the previous step from a projection of the bone model match the edges detected in the real fluoroscopy image. From this, the exact spatial position of the two joint bones is known.
Steps 1-4 have been completed. Step 5 is currently under development.
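The optimization in step 5 can be sketched as a generic simulated-annealing loop over the six pose parameters. This is a minimal illustration, not the project's implementation: the cost function below is a toy stand-in for the real edge-matching score between projected and real fluoroscopy images.

```python
import math
import random

def simulated_annealing_pose(cost, initial_pose, n_iter=5000,
                             t_start=1.0, t_end=1e-3, step=0.1):
    """Iteratively perturb a 6-DOF pose (x, y, z, roll, pitch, yaw),
    accepting worse candidates with a temperature-dependent probability."""
    pose = list(initial_pose)
    c = cost(pose)
    best, best_c = list(pose), c
    for i in range(n_iter):
        # geometric cooling schedule from t_start down to t_end
        t = t_start * (t_end / t_start) ** (i / n_iter)
        cand = [p + random.gauss(0, step) for p in pose]
        cc = cost(cand)
        # always accept improvements; accept worse poses with prob exp(-dC/t)
        if cc < c or random.random() < math.exp((c - cc) / t):
            pose, c = cand, cc
            if c < best_c:
                best, best_c = list(pose), c
    return best, best_c

# Toy cost: squared distance of the candidate pose to a "true" pose.
# In the real system the cost would compare bone edges detected in the
# projected and real fluoroscopy images.
true_pose = [1.0, -2.0, 0.5, 0.1, -0.2, 0.3]
cost = lambda p: sum((a - b) ** 2 for a, b in zip(p, true_pose))

random.seed(0)
best, best_c = simulated_annealing_pose(cost, [0.0] * 6)
```

In the real problem each cost evaluation requires re-projecting the bone model, so the number of iterations and the cooling schedule become important tuning parameters.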
This will be used to study the differences in knee joint motions during landing between genders, to quantify joint motions in people with movement or skeletal abnormalities, and to study both normal and pathologic motion in a wide range of skeletal joints. Our goal is to extend the fluoroscopy technique for analysis of very dynamic activities, such as running, jumping, and cutting, which are of particular interest in the study of ACL injury mechanisms in athletes.
Segmentation of ultrasound images
Ultrasound imaging is a non-invasive and safe imaging modality. It is also portable and inexpensive. However, it produces very noisy images, and the images are susceptible to occlusion by dense objects such as bone.
Dr. Uwe Reischl in the Department of Community and Environmental Health is interested in using ultrasound images for spinal health monitoring. We are working to extract the information he needs from the ultrasound images.
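As an illustration of the kind of pre-processing noisy ultrasound images require before segmentation, the sketch below reduces speckle with a 3x3 median filter and then thresholds. This is a generic minimal example under assumed parameters, not the method used in the project; real ultrasound pipelines use stronger speckle models and adaptive thresholds.

```python
import numpy as np

def despeckle_and_segment(img, threshold):
    """Reduce speckle noise with a 3x3 median filter, then produce a
    binary mask by thresholding."""
    padded = np.pad(img, 1, mode="edge")
    # stack the 9 shifted views of the image and take the per-pixel median
    views = [padded[r:r + img.shape[0], c:c + img.shape[1]]
             for r in range(3) for c in range(3)]
    smoothed = np.median(np.stack(views), axis=0)
    return smoothed > threshold

# Noisy synthetic "image": a bright square on a dark background.
rng = np.random.default_rng(0)
img = rng.normal(0.2, 0.05, (32, 32))
img[8:24, 8:24] += 0.6
mask = despeckle_and_segment(img, 0.5)
```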
Automated counting of stained cells
Biologists frequently produce cell cultures and then analyze their contents, feeding the results back into their research.
The process of counting the cells on a slide is still often done manually. Automating, or semi-automating, this counting can increase researchers' productivity.
We are working on techniques to automate this process.
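The core of automated counting, labeling connected foreground regions in a thresholded slide image, can be sketched as follows. The flood-fill counter below is a minimal illustration, not the project's technique; a production tool would also need staining-specific thresholding, size filtering, and handling of touching cells.

```python
import numpy as np

def count_cells(mask):
    """Count connected foreground regions (4-connectivity) in a binary
    mask via iterative flood fill."""
    visited = np.zeros_like(mask, dtype=bool)
    count = 0
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                count += 1               # found a new, unvisited cell
                stack = [(r, c)]
                while stack:             # flood-fill that cell
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and mask[y, x] and not visited[y, x]):
                        visited[y, x] = True
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return count

# Three "cells" on an otherwise empty slide.
slide = np.zeros((10, 10), dtype=bool)
slide[1:3, 1:3] = True
slide[5:8, 5:8] = True
slide[8, 1] = True
n = count_cells(slide)   # n == 3
```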
Segmentation of Visual Human Muscles
Application areas: Biomedical image processing, mechanics
In an attempt to understand why female athletes sustain anterior cruciate ligament (ACL) injuries at rates three to seven times higher than male athletes in the same sports, a model of the skeletal and muscular systems of a human female is being developed. To populate this model, the non-linear strength and location of the muscles must be determined, along with the mass and placement of the muscles over their extent. The visible slices from the female Visual Human are being analyzed to determine the center of mass and the cross-sectional area of each slice. To aid in this endeavor, we are helping to automate the segmentation process by implementing the intelligent scissors algorithm (developed by Barrett at BYU) and combining it with interpolation to automate the acquisition of data from intermediate slices.
These slices will be ‘stacked’ to form a 3-D model of the muscles.
Research is progressing on methods to determine intermediate slice boundaries without human intervention. This will reduce the number of slices requiring manual segmentation, thereby automating the process further. The algorithms will be evaluated for smoothness and accuracy.
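One common way to estimate intermediate slice boundaries is to blend the signed distance fields of two segmented slices and re-threshold. The sketch below illustrates that idea under assumed shapes; it is one standard shape-interpolation approach, not necessarily the method under development here.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def signed_distance(mask):
    """Signed distance to the boundary: negative inside, positive outside."""
    inside = distance_transform_edt(mask)
    outside = distance_transform_edt(~mask)
    return outside - inside

def interpolate_slices(mask_a, mask_b, t):
    """Estimate a muscle boundary on an intermediate slice by linearly
    blending the signed distance fields of two segmented slices
    (t = 0 gives mask_a, t = 1 gives mask_b)."""
    sdf = (1 - t) * signed_distance(mask_a) + t * signed_distance(mask_b)
    return sdf < 0

# Two circular cross-sections of different radii; the halfway slice
# should have an intermediate area.
yy, xx = np.mgrid[0:64, 0:64]
small = (yy - 32) ** 2 + (xx - 32) ** 2 < 8 ** 2
large = (yy - 32) ** 2 + (xx - 32) ** 2 < 16 ** 2
mid = interpolate_slices(small, large, 0.5)
```

Distance-field blending handles smooth changes in cross-section well but cannot by itself track topology changes, e.g. a muscle splitting into two heads between slices.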
Developing a Graphical User Interface (GUI) for analysis of electromyography (EMG) signals
The need for this project arose due to questionable results encountered by researchers in using their system for processing electromyography (EMG) generated data. The purpose of this project is to design a Graphical User Interface (GUI) driven signal/EMG processing tool to be used in processing EMG signals collected from human subjects engaged in various activities, such as jumping and running.
A GUI was developed by BSU students as part of a Senior Design Project under the supervision of Dr. Barney Smith. It runs under the MATLAB environment. It can load DICOM files and perform amplification, rectification, multiple types of smoothing (RMS, mean, median), and filtering (Butterworth, Chebyshev I & II, Elliptical, Bessel, or user-defined by transfer function). The frequency response of filter transfer functions can be viewed graphically in linear or Bode plot form. The original and processed EMG signals can be shown simultaneously in the display window. Muscle onset times can be identified using rules with variable parameters; these times are shown graphically on the EMG signal plots.
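The processing chain the GUI exposes, filtering followed by rectification and RMS smoothing, can be sketched as below. The filter order, cutoff frequencies, and window length here are illustrative assumptions, not the tool's defaults.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def process_emg(raw, fs, lowcut=20.0, highcut=450.0, rms_window=0.05):
    """Band-pass filter, rectify, and RMS-smooth a raw EMG trace."""
    # 4th-order Butterworth band-pass, applied forward-backward
    # (filtfilt) for zero phase distortion
    b, a = butter(4, [lowcut / (fs / 2), highcut / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, raw)
    rectified = np.abs(filtered)
    # moving RMS over a window of rms_window seconds
    n = max(1, int(rms_window * fs))
    kernel = np.ones(n) / n
    return np.sqrt(np.convolve(rectified ** 2, kernel, mode="same"))

# Synthetic trace: a 0.2 s burst of "muscle activity" within 1 s of
# low-level baseline noise, sampled at 1 kHz.
fs = 1000
rng = np.random.default_rng(0)
raw = 0.05 * rng.standard_normal(fs)
raw[400:600] += rng.standard_normal(200)   # muscle-on burst
envelope = process_emg(raw, fs)
```

The resulting envelope is the kind of signal on which onset-detection rules (e.g. threshold crossings sustained for a minimum duration) are applied.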
This GUI is the primary EMG analysis tool used by researchers in the Center for Orthopaedic Biomechanical Research (COBR). Additional features (such as Power Spectral Density) are being added and the code is being fully documented. Researchers in the Biomechanics Lab indicate that it is much more user friendly than the commercial software they purchased with their EMG acquisition system. Plans are to make the source code available at no cost to researchers outside of Boise State once final testing is complete. People interested in using this tool should contact Dr. Barney Smith with inquiries.