Let's face it: a minimal condition for some sort of new, "post" human condition would certainly be a fundamental shift in our notions of material reality. By exploring the recent rapid developments in imaging technologies, robotics, and simulation in surgery, I hope to suggest some of the pathways through which a so-called post-human future might come about. Our experience of materiality and our notions of the real are deeply tied to technologies that affect how we experience space and time and how we use our bodies. Changes in these technologies have a profound impact on our sense of materiality and of the real.
The story I want to tell is deeply wrapped up with the history of new media. To paraphrase Friedrich Kittler: Media inscribe our situation.
But we do not just switch on the computer and awake to a new ontology. Media are institutions. At stake here is not only the difficult task of constructing computer chips, parallel processors, massively distributed networks, and the like, but also that of configuring users, and especially of configuring the senses of users. For me, media are institutions embedded in hardware, and the history of their construction is also a history of resistance. I hope to make that apparent as we go on.
One of the first systems to incorporate all these features in a surgical simulator was developed for eye surgery by MIT robotics scientist Ian Hunter. Hunter's microsurgical robot (MSR) system incorporated the features described above: data acquisition by CT and MRI scanning, finite element modeling of the planned surgical procedure, and a force-reflecting haptic feedback system that enables the perception of tissue-cutting forces, including those that would normally be imperceptible to the surgeon if transmitted directly to his hands.
A distinctive feature of Hunter's MSR is its immersive virtual environment, which fuses video, touch, and sound into a virtual reality experience. The haptic environment in Hunter's system is fused with 3D stereo camera images fed to a head-mounted display. As if in a flight simulator, the surgeon can rehearse his procedure on the model of the individual patient he has constructed. In addition, the model can serve as a training site for student surgeons, co-present during a practice surgery, sharing the same video screen and feeling the same surgical moves as the master surgeon. Such systems can also be deployed in a collaborative telesurgery system, allowing different specialists to be faded in to "take the controls" during different parts of the procedure. Indeed, a "collaborative clinic" incorporating these features was demonstrated at NASA-Ames on May 5, 1999, with participants at five different sites around the US.
Philip Green led a team at SRI that assembled the first working model of a telepresence surgery system in 1991, and with funding from the NIH Green went on to design and build a demonstration system. The proposal contained the diagram shown in Fig. 1, showing the concept of workstation, viewing arrangement, and manipulation configuration used in surgical telepresence systems today. In 1992 SRI obtained funding for a second-generation telepresence system for emergency surgeries in battlefield situations. For this second-generation system the SRI team developed the precise servo-mechanics, force feedback, 3D visualization, and surgical instruments needed to build a computer-driven system that could accurately reproduce a surgeon's hand motions with remote surgical instruments having five degrees of freedom and extremely sensitive tactile response.
In late 1995 SRI licensed this technology to Intuitive Surgical, Inc. of Mountain View, CA. Intuitive Surgical furthered the work begun at SRI by improving the precise control of the surgical instruments and by adding a new invention, EndoWrist™, patented by company cofounder Frederic Moll. EndoWrist added two degrees of freedom to the SRI device: inner pitch and inner yaw (inner pitch is the motion a wrist performs to knock on a door; inner yaw is the side-to-side movement used in wiping a table). These allow the system to better mimic a surgeon's actions, giving the robot the ability to reach around, beyond, and behind delicate body structures and to deliver these angles right at the surgical site. Through licenses of IBM patents, Intuitive also improved the 3D video imaging, navigation, and registration of the video image to the spatial frame in which the robot operates. The system employs 250 megaflops of parallel processing power.
Such examples demonstrate that computational modeling has added an entirely new dimension to surgery. For the first time the surgeon is able to plan and simulate a surgery based on a mathematical model that reflects the actual anatomy and physiology of the individual patient. Moreover, the model need not stay outside the operating room. Several groups of researchers have used these models to develop "augmented reality" systems that produce a precise, scalable registration of the model on the patient, so that the model and the 3D stereo camera images are fused. This procedure has been carried out successfully in removing brain tumors and in a number of prostatectomies in the Mayo Clinic's Virtual Reality Assisted Surgery Program (VRASP), headed by Richard Robb.
In addition to improving the performance of surgeons by putting predictive modeling and mathematically precise planning at their disposal, computers are playing a major role in improving surgical outcomes by providing surgeons opportunities to train and rehearse important procedures before they go into the operating theater. By 1995, modeling and planning systems had begun to be implemented both in surgical training simulators and in real-time surgeries.