Apple’s latest hardware release has re-ignited the virtual reality discourse. We have been waiting with bated breath for the VR revolution ever since Facebook bought Oculus. Facebook has backed VR to such a degree that it renamed the corporation Meta, after the metaverse. The metaverse is where – we are told – we will soon be interacting: a shared, immersive, 3D virtual space. Interacting there will probably involve some form of VR-enabled headgear.
Some argue that the revolution has stalled. Heavy, cumbersome hardware leaves many users feeling nauseated. Even the gaming industry, which has arguably gone furthest in developing metaverse-like environments, has not yet fully embraced VR. However, Apple has now thrown its hat into the ring with the Vision Pro, and so things may change.
Apple has a track record of changing behaviour, and it remains to be seen whether the Vision Pro will change the world as the iPhone and iPod did. If VR really does take off, and we end up partly living in the metaverse, it is worth considering how surgery might benefit.
Interestingly, the Vision Pro allows the user to be immersed in the virtual world to varying degrees. Apple says that one can ‘…seamlessly blend digital content with your physical space’. Consider a situation of minimal immersion, where the real-world operating environment dominates, but in which a surgeon might pull up the patient’s MRI or CT partway through an operation (using the built-in eye-tracking technology). Taking this further, real-time electrophysiology could be superimposed on screen, or the electrocautery and other surgical instrumentation might be controlled via the headset, avoiding the need to ask the scrub team to change settings.
In the world of endoscopic or exoscopic surgery, where the surgeon visualises the surgical field via a screen, there is the immediate potential to stream that view within the headset itself. This could be a usefully immersive, high-fidelity environment with potentially improved visualisation. In addition to superimposing information as above, it would be possible to overlay patient-specific anatomical detail in real time (as is already possible using intra-operative surgical navigation hardware).
These developments could certainly prove useful and seem entirely feasible. However, might the extra information be too distracting? Settling down behind the microscope can be isolating in a good way: the rest of the room falls away and the surgical task at hand is magnified, quite literally. This focused immersion is arguably crucial to certain phases of an operation.