This post was originally posted on the Adaptive Path blog.
Touch screen installations are by no means new. We have been using them in airports and at ATMs for years, and with advances in computing and gestural touch interfaces, we are starting to see them considered even at the local Ann Taylor. This trend has often streamlined processes and allowed people to interact with information and services in ways that were impossible a few years ago. There is a downside, however: germs. Recent studies have shown that our touchscreen devices, most notably our iPads, are germ magnets. So while we should not be running back to our caves in fear, we do need to understand the implications of touch-based interfaces, especially in public environments. One place where touch-based interactions pose a serious hazard is hospitals. The CDC estimates that 1.7 million hospital-associated (nosocomial) infections occur each year, with 99,000 resulting in death. When germs are a deadly issue, the last thing you want is thousands of people touching the same surface.
On the other hand, the systems currently in place are horribly outdated and will grow increasingly out of step as people come to expect smarter, simpler, and more streamlined points of interaction. There are plenty of areas in hospitals and other healthcare clinics where interactive, digital systems could drastically improve the experience of patients and their support networks. These solutions could also help expedite the digitization of medical records. Interactions such as patient check-in, directory information, patient location, and describing symptoms at an urgent care clinic all need a more modern approach. If touch screens pose a health risk, what are some other ways to solve this problem?
I believe spatial gesture KUIs (kinetic user interfaces), built on the kind of technology used in products like Microsoft Kinect and optionally supplemented by mobile devices, hold a lot of promise. The advances in this space have been astounding and have shown significant potential. The Xbox 360's Kinect hub shows just how much can be accomplished with simple gestures and an appropriate interface.
I could see interfaces like the example above being useful not only for patients to input their information at the clinic, but also to look through their own medical records, research their health issues, and better understand procedures and medications before opting in to something they may be uncomfortable with. These systems could also support families in waiting rooms who desperately want to know the status of a loved one or understand the details of their condition. This information could be made available today; it just poses a significant health risk to provide it through more traditional physical I/O devices.
There are potential challenges, of course. Inputting large amounts of text for things such as patient records could be laborious and frustrating. However, creative solutions such as allowing peripheral input from mobile devices or voice input could ease these processes. The overwhelming norm of repeatedly filling in your information with pen and paper, or having to call a front desk for simple patient information, is just no longer going to cut it. Our health care systems need a serious overhaul from start to finish. Perhaps this is a good place to start.
Minor update: Dave Johnson raised the appropriate concern that something like this could especially challenge people with disabilities. There are probably few places more likely to have people with motor impairments than a hospital. A solution such as the one I describe would either need to work within those limitations or be paired with well-designed alternatives to support them. It just goes to show how tricky this subject is.