Tablet usage has increased substantially over the past several years in health care facilities. According to a survey by Manhattan Research, 72% of physicians now use tablets in their everyday routines. This has fueled elevated demand for health care reference apps, like athenahealth's (ATHN) Epocrates and WebMD's (WBMD) Medscape, and mobile electronic health record, or EHR, apps from Allscripts (MDRX -2.78%), Cerner, and other major companies.
Thanks to its built-in camera, motion sensors, touch-screen interface, cloud-based connections, and portability, the tablet is a versatile medical tool for which new health care applications are being developed every day.
Augmented reality -- just like reality, but smarter
One of the most innovative areas of tablet health care is augmented reality, or AR -- an extra layer of information that enhances a real-time image with cloud-based data to create a virtual viewfinder. By simply holding a tablet's camera up to a patient or an open body cavity, a clinician can scan an image and download pertinent data directly onto a digital overlay -- creating an "enhanced reality" straight out of the realm of sci-fi.
Last year, Sheffield Hallam University became the first higher-education institution to use AR technology in health care education. Nursing students were instructed to use an app to view a dummy on a bed, and a living patient, played by an actor, would appear on the AR overlay to give students a taste of a real-life situation and to test their bedside manner.
German research institute Fraunhofer MEVIS went a big step further and created an app that allows surgeons to use the iPad as a real-time viewfinder during surgery. Holding the iPad's camera up to a patient's body lets the doctor see a digital overlay of key blood vessels. In the past, doctors needed to memorize the exact locations of these key vessels to avoid accidentally cutting them.
Last month, this technology was showcased for the first time in a liver surgery, and the details were documented by Reuters.
A new battleground for Apple and Google
At the center of this evolution in AR tech is Apple (AAPL -0.44%). The iPad's unified operating system and hardware, which are identical among units in the same generation, make it the easiest platform to test and develop software for. Google's (GOOGL -0.72%) Android, by comparison, is installed on a fragmented universe of tablets with a wide array of hardware configurations -- making it more difficult to develop reliable apps for.
However, the arrival of AR apps could shift the health care hardware market in Google's favor. Although AR apps on the iPad have proven useful as teaching and surgical tools, the platform has one major flaw -- it isn't hands-free. While that might not be a major problem for a nurse tending to a patient, it is a major handicap for surgeons relying on Fraunhofer MEVIS' surgical app. An assistant must hold up the iPad for the surgeon -- a clumsy arrangement that could obstruct the surgical procedure.
Will Glass be the preferred platform for AR app developers?
That's where Google Glass comes in. If these AR apps can be installed on Glass' heads-up display, it could become as standard a tool in ORs as scrubs or scalpels. Unlike Android tablets, Glass comes directly from Google, so it uses a unified software and hardware system that will be easier to develop apps for.
Startups like Augmedix and established companies like drchrono are already exploring the possibilities of AR Glass apps for bedside care and surgeries. With that kind of medical presence, Glass could eventually replace iPads in hospitals just as iPads replaced traditional PCs in homes.
The challenges facing AR app developers
Making a functional AR app, however, is not as simple as creating a regular mobile app for smartphones and tablets. On the hardware side, AR apps rely on sensors, cameras, and network connections all working at optimal levels. On the software side, they require robust image recognition algorithms that can identify objects from a variety of angles.
Most consumers have encountered this kind of technology in Facebook and Google's facial recognition engines. Creating software that can identify body parts and organs from multiple angles, on the other hand, will be much harder to achieve than matching facial features.
The foundations have already been built
Yet the foundation for these AR apps has already been built by software developers.
Allscripts' native iPad EHR app, Wand 2.0, allows doctors to take pictures of patients' bodies directly during an examination. The photos can then be documented and stored in a cloud-based EHR. AR could enhance this technology by scanning for problems the moment a photo is taken. For example, a photo of an infected eye could immediately be analyzed by an algorithm, with the data logged directly into the EHR.
Athenahealth's Epocrates and WebMD's Medscape, two of the most popular medical reference apps among medical students, could also be enhanced with AR technology. Skin diseases, for example, could be scanned by the camera like barcodes to suggest possible diagnoses immediately. The camera could also be used to capture and log images as study notes alongside the apps.
The Foolish bottom line
The health care tech industry is evolving every day. Augmented reality might still be in the testing phases, but it will still play a huge part in the mobile revolution in health care, which I discussed in a previous article. Investors interested in the evolution of EHRs and mobile health care tech should keep a close eye on the possible applications of these AR apps.