Interpreting ultrasound using an app
A new app will offer health-care personnel training and experience in interpreting ultrasound images.
“The threshold for using ultrasound is steadily being lowered, and more and more people need to learn to interpret what they see on the images. For this reason, it is important to expose them to a wide range of ultrasound images,” says Frank Lindseth, a research scientist at SINTEF, Scandinavia’s largest independent research group.
“One major advantage of ultrasound is that it makes a number of examinations and operations simpler and less time-consuming, as well as being easier on the patient. For example, in many cases, ultrasound can mean that we don’t need to put the patient under general anaesthesia,” says anaesthetist Kaj Johansen, who supplied data for the project. He believes that the app is both useful and instructive.
Lindseth has many years of experience with ultrasound. He conceived the idea of an app several years ago, but it proved difficult to secure funding for the concept. Now he has developed a beta version with the help of MSc students in computer science at the Norwegian University of Science and Technology (NTNU) in Trondheim.
“The feedback we have received from the health personnel who have tested the app has been very positive. They were surprised that it was so much fun to use,” say Hanna Holler Kamperud and Solveig Hellan, MSc students in computer science at NTNU.
The game has three levels of difficulty, so that new challenges appear as ‘players’ improve. On the first level, they are given all the help they need, but at the highest and most difficult level, they work against the clock.
“This means that there is always room for improvement,” say the students.
The app presents users with an ultrasound image. At the simplest level, the task is to identify the artery shown in red. On the most difficult level, the user has the same task but receives no help and must complete it as fast as possible, so there is a constant challenge to improve.
The app trains users to identify the nerve, as well as the surrounding landmarks, in ultrasound images. The aim is to place a needle close to the nerve and inject anaesthetic around it so that the leg can be operated on.
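The level progression described above could be sketched roughly as follows. This is a minimal illustration, not the app's actual implementation: the level names, the hint flag, the time limit, and the scoring rule are all hypothetical, chosen only to mirror the article's description (full help at the first level, a timed challenge at the hardest).

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Level:
    """One difficulty level of a hypothetical identification game."""
    name: str
    hints_enabled: bool            # e.g. highlight the target artery in red
    time_limit_s: Optional[float]  # None means the level is untimed

# Hypothetical three-level progression mirroring the article
LEVELS = [
    Level("beginner", hints_enabled=True, time_limit_s=None),
    Level("intermediate", hints_enabled=False, time_limit_s=None),
    Level("expert", hints_enabled=False, time_limit_s=30.0),
]

def score(level: Level, correct: bool, elapsed_s: float) -> int:
    """Toy scoring: correct answers earn points; timed levels add a speed bonus."""
    if not correct:
        return 0
    if level.time_limit_s is None:
        return 100
    if elapsed_s > level.time_limit_s:
        return 0  # too slow on a timed level
    # Bonus scales with the time remaining, so there is always room to improve
    return 100 + int(100 * (1 - elapsed_s / level.time_limit_s))
```

A scheme like this captures why "there is always room for improvement": on the timed level, a faster correct identification always yields a higher score.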
So far, only a limited amount of illustrative material has been incorporated in the app.
“Our aim is to add more data to the app that will cover more of the anatomical variability seen in patients. When we have done that, the natural next step will be to put the app online for downloading,” says Lindseth.
“It is important that [the app] should be used a lot, because the only way to become good at interpreting ultrasound images is to completely internalise the process. The idea is that you should be able to bring it out on the bus, in a waiting room or at home on the sofa,” explains Lindseth.
The plan is to put the app on Google Play (Android) and the App Store (iOS) and make it available to everybody as soon as the development team has incorporated a little more data, says the SINTEF scientist.