Page 10 - CUAJ Dec. 2022

Leveridge




         Consider whether the feel of the input need be congruent with the output. As it
       stands, we might think, “I want the July CT report,” or “I want to move this grasper to
       hoist up the prostate.” Currently, we search for the “imaging” tab or menu item and
       click to get a step closer, or we make arm movements that glide in an arc the robotic
       arm mimics. In each case, we perform a process toward an outcome, but there is no
       reason the process has to be so literal. The computer was always capable of these
       outputs; it was simply programmed in a way that grants the user a sense of agency
       through the input process. Programmed another way, one might speak into the air, “I’d
       like the July CT report,” and the same output comes up as before (yes, I’m describing
       Siri). Perhaps a “surgeon” at a screen uses the telestrator function (or any other
       interface: a sensor-studded glove, a grid of IR beams, arms waving with a VR headset,
       etc.) to simply point where they’d like the robot to go and which move to apply on arrival.
         Taken further, if the computer can coordinate the moves we conceive of, perhaps
       goal states (ligated DVC; stone fragments ≤2 mm; adenoma enucleated) will come to
       be standard, rather than processes. A mentor of mine envisioned a type of CNC
       (computer numerical control) machine for partial nephrectomy, using high-resolution
       imaging input to bloodlessly remove a renal mass the way an engine block might be
       hewn from a cube of aluminum. We already see artificial intelligence programs like
       DALL-E and GPT-3 rendering cogent art and prose from basic text instructions. Of
       course, surgery and medicine are complex, not merely complicated; the substrate
       changes during the act of care and biological variability is staggering, so it’s hard to
       compute (pun intended) a machine adaptive enough to modify a surgical plan on the
       fly and to think clinically as well as procedurally. Consider, though, that surgeons
       hone their skills and intuition through perhaps 1000 iterations of a given case over a
       career; a computer may have access to recordings of every robotic prostatectomy ever
       recorded in building its skillset.
         Well, we’ve managed to go from tapping A & B buttons on ’80s consoles to a bleak
       future of autonomous robo-surgeons in 1200 words, but our AI replacement is not
       imminent. There are still only 30-odd robots in Canada, and state-of-the-art prostate
       imaging takes a $6-million machine the size of a car, housed in a 2000-square-foot
       electromagnetic cage, 45 minutes to render an image that looks like a jacked-up
       cookie your toddler made. It’s just an angle on innovation that reminds us that our
       ability to take care of patients is tariffed by interface decisions outside our ken and
       control. We reap both the efficiencies and the fragility.

       Correspondence: Dr. Michael Leveridge, Department of Urology, Queen’s University, Kingston, ON, Canada; Michael.Leveridge@kingstonhsc.ca




























       384                                      CUAJ • December 2022 • Volume 16, Issue 12