1. Environmental Media Project
Graduate School of Media and Governance, Keio University, Tokyo, Japan.
1999-current
2. Virtual Explorer
University of California, San Diego, CA.
1998
3. VRML Projects
Telepresence Research, Inc., San Francisco, CA.
1996
4. Virtual Brewery Adventure
Telepresence Research, Inc., San Francisco, CA.
1994
5. Menagerie
Telepresence Research, Inc., San Francisco, CA.
1993
6. Telepresence Mobile Robot
Telepresence Research, Inc., San Francisco, CA.
1991
7. NASA VIEWlab
NASA Ames Research Center, Mountain View, CA.
1985-90
8. Viewpoint Dependent Imaging
Architecture Machine Group, MIT, Cambridge, MA.
1981
9. Stereoscopic Workstation
Architecture Machine Group, MIT, Cambridge, MA.
1981
10. Dancing Images
Architecture Machine Group, MIT, Cambridge, MA.
1981
11. Stereoscopic Design Theater
Fiat/Lancia Design, Turin, Italy.
1979
12. Stereoscopic Art Works
Center for Advanced Visual Studies, MIT, Cambridge, MA.
1974-76
1. Environmental Media Project
Graduate School of Media and Governance, Keio University, Tokyo, Japan.
1999-current

Linking Virtual Environments to the Physical World

The overall intent of this research program is to enable a user to easily access location-specific information embedded in any site, making the experience of that site more "context-rich." The program has two interdependent objectives: to develop innovative interface techniques and authoring tools for the development, display, and access of location-linked virtual environments; and to develop design guidelines on how to make explicit, and display for a mobile user, the layers of information and digital data attached to objects, people, and places, as well as the relationships between them.

WEM: Wearable Environmental Media

Our preliminary approach has been to explore potential applications of location-based information services over wireless networks. To that end, we developed a prototype "wearable environmental media" (WEM) system that links virtual environments to the physical world, enabling a mobile user to browse a spatially correspondent multimedia information database about a specific location as it changes over time. As a testbed to evaluate these concepts and configurations, an initial technology platform was developed, consisting of a very lightweight stereoscopic camera and display system mounted on a remote user's head and body. Additional subsystems are added for: presenting visual and audio information; tracking the user's location and head orientation; interacting with virtual 3D icons; accessing and caching data about the environment (both archived and local sensor data); and configuring or generating data to be displayed.

By tracking the user's location and attention as they move through the actual site, the system can display, in various formats, a wide variety of information virtually encoded in the site. In effect, the user is able to browse a spatially correspondent digital information database about the site as it changes over time. Alternatively, the real-time video stream captured as a user walks around the site can be transmitted to a remote location, where observers can also experience the digitally augmented location or even request that the on-site user move to a different area.
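
As a rough illustration of this browsing mechanism, the sketch below (in Java, with hypothetical class and field names; it is not code from the actual WEM system) shows how location-tagged annotations might be selected for display from the user's tracked position and head orientation:

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical annotation record: a media item pinned to a site coordinate.
    class Annotation {
        double x, y;        // position in site coordinates (meters)
        String mediaUri;    // archived image, audio, or text attached to this spot
        Annotation(double x, double y, String mediaUri) {
            this.x = x; this.y = y; this.mediaUri = mediaUri;
        }
    }

    public class AnnotationBrowser {
        // Return annotations within `radius` meters of the user that also fall
        // inside a field-of-view cone around the head orientation `headingRad`.
        static List<Annotation> visible(List<Annotation> all,
                                        double ux, double uy,
                                        double headingRad, double radius,
                                        double fovRad) {
            List<Annotation> hits = new ArrayList<>();
            for (Annotation a : all) {
                double dx = a.x - ux, dy = a.y - uy;
                if (Math.hypot(dx, dy) > radius) continue;      // too far away
                double bearing = Math.atan2(dy, dx);
                double diff = Math.atan2(Math.sin(bearing - headingRad),
                                         Math.cos(bearing - headingRad));
                if (Math.abs(diff) <= fovRad / 2) hits.add(a);  // within view cone
            }
            return hits;
        }
    }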

More recent efforts extend this goal toward comprehensive 3D representations of specific locations derived from both mobile and static real-time wireless sensing devices, along with methods for the capture, organization, and visualization of real-time, site-specific environmental information for users who are both remote and local to the site. New developments also enable a user to post location-specific information in an open system architecture using a variety of technology platforms, and to access interpretive annotations posted by domain experts.

   "Environmental Media: Linking Virtual Environments to the Physical World", Second International Symposium on Mixed Reality, Yokohama, Japan (March 2001).

MEG: Mobile Environmental Data Gathering

The goal of this research is to develop comprehensive 3D representations of specific locations derived from real-time wireless sensing devices. The specific objectives are to develop methods for the capture, organization, and visualization of real-time, site-specific environmental information for users who are both remote and local to the site. Until recently, environmental data collection has depended on non-real-time input, either from installations of expensive and fragile data-logging equipment or from intermittent on-site field-note collection by researchers. Now, with the aid of high-bandwidth wireless Internet connections, inexpensive sensing devices, and high-resolution tracking technologies, site-specific environmental data can be captured and analyzed in real time. This data can also be made available in real time to users while they are exploring a specific site, as well as to a wider audience of both professional and non-professional users.

The project developed an integrated "environmental data gathering system" with three primary subsystems:

  • Data Capture Systems
  • Database Server System
  • Visualization Authoring System

A network of wireless real-time sensor stations, comprising mobile units, static units, and remotely controlled cameras, was designed and installed to collect information and transmit it to a site-specific database:

  • The mobile data capture system includes a video camera, a GPS unit, a custom sensor array, and a Java keitai (mobile phone) handset. The system is operated by using the keitai handset to select which sensor array to activate; voice, audio, and text information can also be recorded and transmitted by the handset. All of the collected data is stamped with time and location information and sent by wireless Ethernet to a remote database (see the record sketch after this list).
  • A static data capture system was designed and implemented to continuously add a range of basic environmental data, such as temperature and humidity, to the database.
  • A wireless video camera system was developed and installed to capture imagery of the test location and transmit it to the database. A Java application for the keitai handset was also developed to control the camera remotely.
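
The sketch below illustrates the kind of time- and location-stamped record the mobile capture system might transmit; the field names are assumptions for illustration, not the project's actual wire format:

    import java.io.Serializable;
    import java.time.Instant;

    // Hypothetical sensor record: every reading carries a timestamp and a GPS
    // fix so the database can index it by both time and location.
    public class SensorRecord implements Serializable {
        public final Instant timestamp;   // when the reading was taken
        public final double latitude;     // GPS fix at capture time
        public final double longitude;
        public final String sensorId;     // e.g. "temp-01", "humidity-02"
        public final double value;        // the measured quantity
        public final String mediaUri;     // optional voice/photo attachment, or null

        public SensorRecord(Instant timestamp, double latitude, double longitude,
                            String sensorId, double value, String mediaUri) {
            this.timestamp = timestamp;
            this.latitude = latitude;
            this.longitude = longitude;
            this.sensorId = sensorId;
            this.value = value;
            this.mediaUri = mediaUri;
        }
    }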

A database system was developed to receive and archive the multimedia information gathered by the Data Capture System. Currently the database runs on a Linux system using PostgreSQL. Eventually it will also contain existing imagery and data about the test site, such as satellite imagery, aerial photographs, IR imagery, topographic maps, and personal photographs, in order to compile a comprehensive, site-specific database.
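
Building on the SensorRecord sketch above, archiving one reading into PostgreSQL could look roughly like the following JDBC fragment; the table name, columns, and connection details are assumptions for illustration, not the project's actual schema:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Timestamp;

    // Minimal sketch: insert one time- and location-stamped reading into a
    // hypothetical "readings" table in the PostgreSQL archive.
    public class ReadingArchiver {
        public static void archive(SensorRecord r) throws Exception {
            try (Connection db = DriverManager.getConnection(
                     "jdbc:postgresql://localhost/envmedia", "meg", "secret");
                 PreparedStatement st = db.prepareStatement(
                     "INSERT INTO readings (ts, lat, lon, sensor_id, value, media_uri) "
                   + "VALUES (?, ?, ?, ?, ?, ?)")) {
                st.setTimestamp(1, Timestamp.from(r.timestamp));
                st.setDouble(2, r.latitude);
                st.setDouble(3, r.longitude);
                st.setString(4, r.sensorId);
                st.setDouble(5, r.value);
                st.setString(6, r.mediaUri);   // a null here becomes SQL NULL
                st.executeUpdate();
            }
        }
    }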

The visualization authoring system is a toolkit for linking information to specific physical locations. Currently the system is based on a 2D map of the testbed site. By clicking on any point of the map, the user can insert an icon, attach a data file to the icon, and then publish the new data configuration for display to a WEM system user in the physical testbed. The system is also an important tool for previsualization of the site-specific environmental data.
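
As a sketch of that authoring flow (hypothetical class and method names; the toolkit's real interfaces are not documented here), the map click, file attachment, and publish steps might be modeled like this:

    import java.nio.file.Path;
    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical authoring model: a click on the 2D site map places an icon,
    // the author attaches a data file, and the configuration is published so
    // WEM users in the physical testbed can see the new data.
    public class AuthoringSession {
        static class MapIcon {
            final double mapX, mapY;  // clicked position on the 2D site map
            final Path dataFile;      // media or sensor-data file for the icon
            MapIcon(double mapX, double mapY, Path dataFile) {
                this.mapX = mapX; this.mapY = mapY; this.dataFile = dataFile;
            }
        }

        private final List<MapIcon> icons = new ArrayList<>();

        // Insert an icon at the clicked point and attach a data file to it.
        public void placeIcon(double mapX, double mapY, Path dataFile) {
            icons.add(new MapIcon(mapX, mapY, dataFile));
        }

        // Publish the new data configuration (a stand-in for an upload to the
        // database server that WEM systems read from).
        public void publish() {
            for (MapIcon icon : icons) {
                System.out.printf("publish icon at (%.1f, %.1f): %s%n",
                                  icon.mapX, icon.mapY, icon.dataFile);
            }
        }
    }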

  Authoring_Toolkit_for_Mixed_Reality-IWEC2002.pdf

  Mobile Environmental Data Gathering Project demo: QuickTime movie (81 MB)
  Wearable Environmental Media Project demo: QuickTime movie (102 MB)


sfisher@telepresence.com