VIPRR: A VIRTUALLY IN PERSON REHABILITATION ROBOT

Larry Leifer*, Shawn Stepper*, Matt Schaefer*, Machiel Van der Loos+
*Stanford Center for Design Research
+Rehabilitation R&D Center, Palo Alto VA

Abstract

A mandate in Rehabilitation Robotics is to disseminate R&D to the public and to scientists around the world. The World Wide Web (WWW) has become the medium of choice for this activity, especially for text and graphics. As the WWW has matured, sharing of other forms of data, like sound and video, has become possible. In this paper, we describe an interactive WWW video-camera-tele-robot that is unique in its ability to be remotely controlled by any web browser. Controls include camera position, orientation, zoom and image parameters. VIPRR allows people with severe physical limitations to visit the R&D lab anytime, from anywhere, to see what is going on and, by arrangement, to test-drive rehabilitation robots under development. Through this technology, the developer gains an extraordinary opportunity to obtain user, clinician and developer feedback, just-in-time.

Motivation

Training, clinical supervision and technical support for rehabilitation robots have proven difficult and expensive in most situations [1,2]. To overcome this barrier, we developed a digital video camera interface to the World Wide Web (WWW) that can be interactively controlled remotely, a kind of telepresence. The technology is broadly part of "tele-medicine" or "tele-rehabilitation". A second-generation web-robot-camera is featured in ProVAR, a 4th-generation rehabilitation robot [3], to support remote training, diagnostics and field service. The prototype can be test driven at (http://firstvip.stanford.edu) (see Fig. 1).

Interactive Internet Audio-Video

Internet-mediated audio conversations, video conferencing and teleoperation of equipment have been available for several years (e.g., CUseeMe), but they require dedicated equipment and technical expertise on both ends of the dialog. The WWW is different in that the two parties need not be technical equals. One side, the "server", carries the burden of information access, translation, storage and on-demand forwarding. The server requires special hardware, software and technical expertise. The "user" or "client" side requires only a modest personal computer, InterNet access and a generic (usually free) web "browser". In this context, we can now provide unlimited access to rehabilitation robots [4].

Hot web-camera locations

Transmitting video images through the WWW has been demonstrated at numerous sites (see http://www.rt66.com/~ozone/cam.html for an index). The site of the Stanford VIP project lets the user control the camera orientation in real time through the browser: (http://vip -at- me210.stanford.edu). "FirstVIP" delivers a fast (6-8 frames/sec update rate), small (120x120 pixel), 256-level gray image. Adding zoom capability is a current priority.

Why FirstVIP

FirstVIP was created to give remote members of a mechatronic systems design community informal browsing ability, the chance to "drop in" and see what is going on in the laboratory, any time, from anywhere, without prior arrangements, i.e., asynchronously. The user controls 3 degrees of freedom to pan, tilt and zoom the camera. This allows one to look at activity or objects of special interest when one wants to look. VIPRR was added to the ProVAR design requirements after experience with the FirstVIP system.

Figure 1: VIP browser page, showing camera image of programmer holding the camera and pan-tilt head. At bottom is the VIP control panel.

Remote Clinical Supervision

Remote clinical supervision of ProVAR, including training, diagnosis, therapy and servicing, is a key system feature for the future. We also see these capabilities as central to life-cycle cost-effectiveness.

Our experience with the use of video in rehabilitation started with DeVAR evaluation and continues with the Functional Performance Assessment and Training (F-PAT) Pilot Project. F-PAT makes innovative use of computer-controlled video to assist rehabilitation professionals in tracking recovery of function (http://ability.stanford.edu/Projects/FPAT/intro.html) [5][6]. Internet-mediated video and equipment control capability is part of the ProVAR development plan.

VIPRR System Specifications

The system consists of an Intel Pentium(TM) computer running Linux(TM) (a freeware Unix(TM) work-alike); a Quickcam(TM) digital grayscale camera; a Sunpak(TM) pan/tilt mechanism; and a custom embedded controller to interface between Linux(TM) and the motion platform. Drivers for the controller and Java(TM)-scripts for Linux(TM) are also required. The Pentium-Linux server, Quickcam(TM) and Sunpak(TM) are non-proprietary. The remaining components of the system were developed at Stanford and are licensable from the University.

The controller is addressed via a parallel port to control pan, tilt and zoom motion parameters. Software has been written in "C" to implement Linux(TM)-specific system calls. The program also interacts with a CGI script that accepts user input from the web browser. Cursor and keyboard data are processed by the CGI script and mapped to specific motion parameters.

VIPRR Pre-trials in the Classroom

A Stanford graduate engineering course [http://me210.stanford.edu/], "Team Based Design-Development with Corporate Partners" [7], was the first to develop and use this technology as a tool for information exchange amongst students, faculty and teaching assistants. Of the 45 students presently enrolled in this project-based-learning course, one-quarter are remote students who now interact with the larger community via the Internet-WWW rather than traditional videotape distribution and video-conferencing. VIP technology was developed to ameliorate the strained relations that often arise when team members are not co-located. The situation in this course resembles that encountered by physically impaired individuals who require an interdisciplinary, often distributed "team" of assistants for care and development. Our team performance data indicate that InterNet-supported teams can, on average, match or outperform co-located teams [7]. The implications for distributed rehabilitation are profound.

VIPRR Clinical Study

In corporate and federal R&D labs such as the VA Palo Alto Rehabilitation R&D Center (RRDC), a different problem requires virtually identical instrumentation. We propose that the process of debugging and applying complex prototype systems (like ProVAR) undergoing clinical beta testing can be performed faster and at lower cost, compared to current procedures, by WWW-based camera control and remote equipment operation. Lab-based developers and experts can operate and observe the remote clinical system, diagnose problems, and offer instruction to field-based operators. The need for such capabilities is especially acute in clinical settings where non-technical professional staff must safely and cost-effectively use sophisticated diagnostic and therapeutic systems for which they can never receive truly adequate, up-to-date training. The same line of reasoning supports distributed R&D teams during the development of advanced medical technologies.

Plans for Development

We propose to implement three test bed systems. The first VIP-camera has been installed and debugged in the ME210 design-loft. This is where the VIP design-team is working. A second system has been installed at the VA RRDC on the DeVAR assistive robot. The third implementation will be in an instructional television studio-classroom on the Stanford campus.

The ME210 VIP team will interview staff and remote students, as well as review class videotapes, to assess the level of interaction with and without VIP. Metrics include the number of questions asked in class, email activity, and web-site browsing activity.

Considerable work remains to be done on VIP's human-robot-interface. Following guidelines from telepresence research, this will be a particularly active R&D theme over the next two years and is addressed, in part, in the ProVAR development plan.

References

[1] Van der Loos HFM. VA/Stanford Rehabilitation Robotics Research and Development Program: Lessons Learned in the Application of Robotics Technology to the Field of Rehabilitation. IEEE Trans. Rehabil Eng, 3:1, March, 1995, 46-55.

[2] Hammel J, Van der Loos HFM, Perkash I. Evaluation of a Vocational Robot with a Quadriplegic Employee. Archives of Physical Medicine and Rehabil, 73, July, 1992, 683-693.

[3] Van der Loos HFM, Burgar CG. Development of an Assistive Robot for Effective Health Care Delivery. Device Development Merit Review Project, Letter of Intent B95-1017RA, Oct 1995.

[4] Leifer L, Toye G, Van der Loos HFM. Tele-Service-Robot: integrating the socio-technical framework of human service through the InterNet-WWW. Proc Intl Workshop Biorobotics: Human-Robot Symbiosis, Tsukuba, Japan, May 1995, Elsevier, 1996.

[5] Shafer D, Van der Loos HFM. Integrated video and computerized functional assessment. Proc RESNA'95, Vancouver, Canada, June 1995, 146-148.

[6] Shafer D, Van der Loos HFM. Individualized video-based stroke rehabilitation home program. Proc RESNA'96, Salt Lake City, UT, June, 1996, 89-91.

[7] Leifer L. Evaluating Product-Based-Learning Education. Proc US-Japan Conf Evaluation of Engineering Educational Reform, in press, Osaka, Japan, 1995, 9 pages.

Acknowledgments

The authors would like to thank technology consultants George Toye, Bill Wood, Brian Luehrs, and Joe Wagner. We would like to recognize that the proof of concept was done in ME210, "Team Based Design-Development with Corporate Partners", a Stanford graduate course supported in part by funding from the Stanford Design Industry Affiliates Program. Additional support has been received from the U.S. Department of Veterans Affairs Rehabilitation R&D Service and the Palo Alto Institute for Research and Education (PAIRE).

Address

Larry Leifer, PhD
Professor, Mechanical Engineering
Director, Center for Design Research
Stanford, California 94305-4026, USA
Fax: 650/725-8475
Tel: 650/725-0158