Projection augmented model

A projection augmented model (PA model) is an element sometimes employed in virtual reality systems. It consists of a physical three-dimensional model onto which a computer image is projected to create a realistic-looking object. Importantly, the physical model has the same geometric shape as the object that the PA model depicts.

Uniting physical and virtual objects


Spatially augmented reality (SAR) renders virtual objects directly within or on the user's physical space.[1] A key benefit of SAR is that the user does not need to wear a head-mounted display. Instead, with the use of spatial displays, wide field of view and possibly high-resolution images of virtual objects can be integrated directly into the environment. For example, the virtual objects can be realized by using digital light projectors to paint 2D/3D imagery onto real surfaces, or by using built-in flat panel displays.

Real objects can be physically handled and naturally manipulated to be viewed from any direction, which is essential for ergonomic evaluation and provides a strong sense of palpability.[2] Although simulated haptic feedback devices enable some aspects of computer-generated objects to be touched, they cannot match this level of functionality.[3] It is therefore unsurprising that physical objects are still used for many applications, such as product design.[4] However, computer-generated objects have a key advantage: they provide a level of flexibility that cannot be matched by physical objects. A display is therefore needed that joins the real physical world and computer-generated objects together, enabling them to be experienced simultaneously.[5]

Tangible user interfaces (TUI) and augmented reality both aim to address this issue. TUI systems use real physical objects both to represent and to interact with computer-generated information (Figure 1). However, while TUIs create a physical link between real and computer-generated objects, they do not create the illusion that the computer-generated objects are actually in a user's real environment. That is the aim of augmented reality.


Figure 1 Continuum of advanced computer interfaces, based on Milgram and Kishino (1994).

Unlike virtual reality (VR), which immerses a user in a computer-generated environment, augmented reality (AR) joins physical and virtual spaces together by creating the illusion that computer-generated objects are actually real objects in a user's environment[6] (Figure 1). Furthermore, head-mounted-display-based AR and VR systems can directly incorporate physical objects: as a user reaches out to a computer-generated object that they can see, they touch an equivalent physical model placed at the same spatial location.[7] Such systems enable the computer-generated visual appearance of the object to be dynamically altered, while the physical model provides haptic feedback for the object's underlying form. However, head-mounted-display-based systems require users to wear equipment, which limits the number of people who can use the display simultaneously.

A variant of the AR paradigm that does not suffer from these limitations is spatially augmented reality (Figure 1).[8] Spatially augmented reality displays project computer-generated information directly into the user's environment.[9] Although there are several possible display configurations, the most natural type is the projection augmented model.

Projection augmented models



Figure 2 The Projection Augmented model concept

A projection augmented model (PA model) consists of a physical three-dimensional model, onto which a computer image is projected to create a realistic-looking object (Figure 2). Importantly, the physical model has the same geometric shape as the object that the PA model depicts. For example, the image projected onto the objects shown in Figure 3 provides colour and visual texture, which makes them appear to be made from different materials.


Figure 3 An example of a Projection Augmented model (inset - with the projection off).
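In practice, the projected image is generated by rendering the virtual object from the projector's own viewpoint, so that each rendered pixel lands on the corresponding point of the physical model. The sketch below illustrates that mapping under an idealised pinhole projector model; the intrinsics `K` and pose `R`, `t` are hypothetical values, whereas a real system obtains them by calibration.

```python
import numpy as np

def project_to_projector(points_world, K, R, t):
    """Map 3D points on the physical model (world frame) to projector
    pixel coordinates with an idealised pinhole model: K holds the
    projector intrinsics, R and t its orientation and position."""
    cam = (R @ points_world.T).T + t      # world frame -> projector frame
    pix = (K @ cam.T).T                   # apply the intrinsics
    return pix[:, :2] / pix[:, 2:3]       # perspective divide

# Hypothetical projector looking down +Z with a 1000-pixel focal length
# and principal point (640, 360).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)
vertex = np.array([[0.0, 0.0, 2.0]])      # a model point 2 m away, on-axis
print(project_to_projector(vertex, K, R, t))  # lands on the principal point
```

A real system repeats this mapping for every vertex of the (scanned or CAD-derived) physical model and then renders colour and texture into the resulting projector pixels.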

PA models use a unique combination of physical objects and computer-generated information, and hence they inherit advantages from both. “The human interface to a physical model is the essence of ‘intuitive’. There are no widgets to manipulate, no sliders to move, and no displays to look through (or wear). Instead, we walk around objects, moving in and out to zoom, gazing and focusing on interesting components, all at very high visual, spatial, and temporal fidelity”.[10] PA models combine the high level of intuitiveness of physical models with the flexibility and functionality of computer graphics, such as the ability to be quickly altered, animated, saved and updated (Jacucci, Oulasvirta, Psik, Salovaara & Wagner, 2005). Thus, a PA model essentially gives a physical form to a computer-generated object, which a user can touch and grasp with their bare hands. It is therefore unsurprising that user studies comparing PA models with other virtual and augmented reality displays found PA models to be a natural and intuitive type of display (Nam & Lee, 2003; Stevens et al., 2002).

The PA model concept is not new, however. One of the first PA model type displays was Naimark's ‘Displacements’ art installation, created over twenty years ago (Naimark, 1984); the technique later appeared in the “Haunted Mansion” attraction at Walt Disney World (Liljegren & Foster, 1990). At the time, the technology did not exist for a PA model to be much more than an artistic statement. However, given the technology available today and a little “unfettered imagination”, exploring novel projection displays is now “potentially boundless”.[11]

The growth in PA model technology has been marked by the recent recreation of Naimark's ‘Displacements’ installation at SIGGRAPH (Displacements, 2005). Specifically, new technology has been developed that semi-automates the process of both creating and aligning the physical model and projected image. This supports multiple projectors, which enables a PA model to be illuminated from every direction. Furthermore, powerful projectors (2000-3000 lumens) can be used to allow a PA model to be located in a well-lit room (Nam, 2005; Umemoro, Keller & Stappers, 2003). However, whilst this technology enables a PA model to be a viable and useful type of display, it does not address its main aim.
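With multiple projectors, the system also has to decide how much each projector contributes at surface points that more than one of them can reach. One simple weighting scheme, shown below purely as an illustration rather than as the published method, favours the projector that faces the surface most squarely and normalises the weights so they sum to one:

```python
import numpy as np

def blend_weights(normal, projector_dirs):
    """Illustrative blending: weight each projector's contribution at a
    surface point by the cosine of its angle of incidence (how squarely
    it faces the surface), then normalise the weights to sum to one."""
    n = normal / np.linalg.norm(normal)
    cos = np.array([max(np.dot(n, -d / np.linalg.norm(d)), 0.0)
                    for d in projector_dirs])
    return cos / cos.sum()

# Two hypothetical projectors: one head-on, one at a grazing angle.
n = np.array([0.0, 0.0, 1.0])             # surface normal
dirs = [np.array([0.0, 0.0, -1.0]),       # pointing straight at the surface
        np.array([1.0, 0.0, -0.2])]       # glancing across it
w = blend_weights(n, dirs)
print(w)  # the head-on projector dominates
```

Published multi-projector systems use more sophisticated feathering near projector edges, but the underlying idea of per-point, normalised contributions is the same.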

A PA model aims to create the illusion of actually being the object that it depicts. For example, when used for a product design application, it is important that a PA model provides a convincing perceptual impression of actually being the final product (Nam, 2006; Saakes, 2006; Verlinden, Horváth & Edelenbos, 2006; Keller & Stappers, 2001). Similarly, when used for a museum display application to create a replica of an artefact, a PA model aims to create the illusion of being the real artefact (Hirooka & Saito, 2006; Senckenberg Museum, 2006; Bimber, Gatesy, Witmer, Raskar & Encarnacao, 2002; Museum of London, 1999).

However, no previous research has specifically considered this illusion. Therefore, this thesis defines the ‘Projection Augmented model illusion’ as the situation in which a PA model is perceived to actually be the object that it depicts. For example, this illusion occurs when a user perceives the PA model in Figure 3 to be real bricks, flower pots, and pieces of wood, as opposed to white models with an image projected onto them. However, the essence of this illusion does not involve deceiving the user. A user can perceive a PA model to be the object that it depicts, whilst knowing that it is actually a white model and a projected image.

Technology has been developed to enhance this illusion by increasing the physical similarity between the PA model and the object that it depicts, or in other words, by increasing the fidelity of the PA model. For example, the way in which the specular highlights on an object move as the viewer changes position can be dynamically simulated, which enables a PA model to appear to be made from a wide range of materials; a dull clay vase can appear to be made from a shiny plastic material.
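The view-dependent highlight simulation can be sketched with a standard Blinn-Phong specular term, recomputed every frame from the tracked viewer position so that the highlight appears to slide across the matte physical surface as the viewer moves. The geometry and shininess value below are illustrative assumptions, not values from the original systems:

```python
import numpy as np

def specular_intensity(surface_pt, normal, viewer_pos, light_pos,
                       shininess=64.0):
    """Blinn-Phong specular term for one surface point.  Because the
    viewer position enters the half-vector, the projected highlight
    moves as the (tracked) viewer does."""
    n = normal / np.linalg.norm(normal)
    v = viewer_pos - surface_pt
    v = v / np.linalg.norm(v)             # direction to the viewer
    l = light_pos - surface_pt
    l = l / np.linalg.norm(l)             # direction to the virtual light
    h = (v + l) / np.linalg.norm(v + l)   # half-vector
    return max(np.dot(n, h), 0.0) ** shininess

# Viewer standing exactly in the mirror direction of the light,
# so the highlight is at full strength at this point.
pt, n = np.zeros(3), np.array([0.0, 0.0, 1.0])
light = np.array([1.0, 0.0, 1.0])
viewer = np.array([-1.0, 0.0, 1.0])
print(specular_intensity(pt, n, viewer, light))  # → 1.0
```

Projecting this intensity per pixel onto a diffuse white model is what lets a matte object masquerade as a glossy one for a single tracked viewer.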

However, whether or not the PA model illusion occurs depends entirely on a user's subjective perceptual impression. Increasing the fidelity of different aspects of a PA model may therefore each have a different effect on the strength of the illusion. This is analogous to the way in which increasing the fidelity of different aspects of a computer-generated photorealistic image may each have a different effect on the degree to which the image is perceived to be a real photograph (Longhurst, Ledda & Chalmers, 2003; Rademacher, Lengyel, Cutrell & Whitted, 2001). For example, increasing the fidelity of the textures in the image may typically be more important than increasing the fidelity of the shadows. It cannot therefore be assumed that increasing the fidelity of any aspect of a PA model will automatically strengthen the PA model illusion, and similarly it cannot be assumed that decreasing the fidelity of any aspect will automatically weaken it. Given that no previous research has investigated this illusion, it is difficult to determine the success of the technology that aims to enhance it, and difficult to make informed decisions when developing new technology. The capabilities of the human perceptual system should guide the development of any advanced interface (Stanney et al., 2004), so this issue needs to be addressed.

Note: Projection Augmented models are sometimes referred to as 'Shader Lamps' (Raskar, Welch, Low & Bandyopadhyay, 2001, p. 89).

References



  1. ^ "Scientific Commons: Spatially Augmented Reality (1998), 1998 [Ramesh Raskar, Greg Welch, Henry Fuchs]". Archived from the original on 2012-10-23. Retrieved 2010-12-27.
  2. ^ Ishii & Ullmer, 1997.
  3. ^ Evans, Wallace, Cheshire & Sener, 2005; Baradaran & Stuerzlinger, 2005; Khoudja, Hafez & Kheddar, 2004.
  4. ^ Dutson & Wood, 2005.
  5. ^ Gibson, Gao & Campbell, 2004; Ishii & Ullmer, 1997.
  6. ^ Azuma et al., 2001
  7. ^ Whitton, Lok, Insko & Brooks, 2005; Billinghurst, Grasset & Looser, 2005; Borst & Volz, 2005; Lee, Chen, Kim, Han & Pan, 2004; Hoffman, Garcia-Palacios, Carlin, Furness & Botella-Arbona, 2003.
  8. ^ Raskar, Welch, Fuchs, 1998.
  9. ^ Bimber & Raskar, 2005.
  10. ^ Raskar, Welch, Low & Bandyopadhyay, 2001, p.89
  11. ^ Naimark, 2005, p.605

Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., & MacIntyre, B. (2001). Recent Advances in Augmented Reality. IEEE Computer Graphics and Applications, 21(6), 34-47.

Baradaran, H., & Stuerzlinger, W. (2005). A Comparison of Real and Virtual 3D Construction Tools with Novice Users. In Proceedings of International Conference on Computer Graphics & Virtual Reality – CGVR’06 – part of 2006 World Congress in Computer Science, Computer Engineering, and Applied Computing - WORLDCOMP'06. World Academy of Science.

Billinghurst, M., Grasset, R., & Looser, J. (2005). Designing Augmented Reality Interfaces. In Proceedings of Annual Conference on Computer Graphics and Interactive Techniques – SIGGRAPH’05 (pp. 17–22). New York: ACM Press.

Bimber, O., Gatesy, S., Witmer, L., Raskar, R., & Encarnacao, L. (2002). Merging Fossil Specimens with Computer-Generated Information. IEEE Computer, 35(9), 25-30.

Bimber, O., & Raskar, R. (2005). Spatial Augmented Reality: A Modern Approach to Augmented Reality. In Proceedings of Annual Conference on Computer Graphics and Interactive Techniques - SIGGRAPH’05. New York: ACM Press.

Borst, C., & Volz, R. (2005). Evaluation of a Haptic Mixed Reality System for Interactions with a Virtual Control Panel. Presence: Teleoperators and Virtual Environments, 14(6), 677-696.

Brooks, F. (1999). What's real about virtual reality? IEEE Computer Graphics and Applications, 19(6), 16-27.

Burdea, G., & Coffet, P. (2003). Virtual Reality Technology, 2nd Edition. Washington: Wiley-IEEE Press.

Cruz-Neira, C., Sandin, D., & DeFanti, T. (1993). Surround-screen projection-based virtual reality: the design and implementation of the CAVE. In Proceedings of Annual Conference on Computer Graphics and Interactive Techniques - SIGGRAPH’93 (pp. 135–142). New York: ACM Press.

"Michael Naimark: Interactive and Immersive Film Environments, 1977–1997. An Exhibition at Annual Conference on Computer Graphics and Interactive Techniques – SIGGRAPH'05". 2005. Archived from the original on 2015-11-05. Retrieved 2024-03-05.

Drettakis, G., Roussou, M., Tsingos, N., Reche, A., & Gallo, E. (2004). Image-based Techniques for the Creation and Display of Photorealistic Interactive Virtual Environments. In Proceedings of the 10th Eurographics Symposium on Virtual Environments – EGVE’04 (pp. 157–166).

Dutson, A., & Wood, K. (2005). Using rapid prototypes for functional evaluation of evolutionary product designs. Rapid Prototyping Journal, 11 (3), 125-11.

Evans, M., Wallace, D., Cheshire, D., & Sener, B. (2005). An evaluation of haptic feedback modelling during industrial design practice. Design Studies, 26, 487-508.

"CAVE: The Most Widely Installed Fully Immersive Visualization System in the World". FakeSpace. 2006. Archived from the original on 2008-01-08. Retrieved 2006-09-20.

Fischer, J., Bartz, D., & Straßer, W. (2006). Enhanced Visual Realism by Incorporating Camera Image Effects. In Proceedings of International Symposium on Mixed and Augmented Reality - ISMAR’06. Washington: IEEE Computer Society Press.

Gibson, I., Gao, Z., & Campbell, I. (2004). A Comparative Study of Virtual prototyping and Physical Prototyping. International Journal of Manufacturing Technology and Management, 6(6), 503-522.

Hirooka, S., & Saito, H. (2006). Calibration Free Virtual Display System Using Video Projector onto Real Object Surface. IEICE-Transactions on Info and Systems - Special Section on Artificial Reality and Telexistence, E89-D(1), 88-97.

Hoffman, H., Garcia-Palacios, A., Carlin, C., Furness, T., Botella-Arbona, C. (2003). Interfaces that heal: Coupling real and virtual objects to cure spider phobia. International Journal of Human-Computer Interaction, 16, 283-300.

Ichida, H., Itoh, Y., Kitamura, Y., & Kishino, F. (2004). ActiveCube and its 3D Applications. In Proceedings of IEEE Virtual Reality Conference – VR’04. Washington: IEEE Computer Society Press.

Ishii, H., & Ullmer, B. (1997). Tangible Bits: Towards Seamless Interfaces between People, Bits and Atoms. In Proceedings Conference on Human Factors in Computing Systems – CHI-97 (pp. 234–241). New York: ACM Press.

Ishii, H., & Ullmer, B. (2001). Emerging Framework for Tangible User Interfaces. In J. Carroll (Eds.), Human-Computer Interaction in the New Millennium (pp. 579–601). Addison-Wesley.

Jacucci, G., Oulasvirta, A., Psik, T., Salovaara, A., & Wagner, I. (2005). Augmented reality painting and collage: Evaluating tangible interaction in a field study. In Proceedings of Tenth IFIP-TC13 International Conference on Human-Computer Interaction INTERACT'05 (pp. 43–56).

Keller, I., & Stappers, P. (2001). TRI: Inspiration Support for a design studio environment. International Journal of Design Computing, 3, 1-17.

Khoudja M., Hafez M., & Kheddar A. (2004). Tactile Interfaces. A State of the Art Survey. In Proceedings of 35th International Symposium on Robotics (pp. 721–726).

Kölsch, M., Bane, R., Höllerer, T., & Turk, M. (2006). Multimodal interaction with a wearable augmented reality system. IEEE Computer Graphics and Applications, 26(3), 62 -71.

Lee, S., Chen, T., Kim, J., Han, S., & Pan, Z. (2004). Affective Property Evaluation of Virtual Product Designs. In Proceedings of IEEE Virtual Reality Conference – VR’04 (pp. 207–216). Washington: IEEE Computer Society Press.

Lee, W., & Park, J. (2006) Augmented Foam: Touchable and Graspable Augmented Reality for Product Design Simulation. Bulletin of Japanese Society for the Design Science, 52(6), 17-26.

Liljegren, G., & Foster, E. (1990). Figure with Back Projected Image Using Fibre Optics. US Patent 4,978,216, Walt Disney Company, Burbank, California, USA, December 18, 1990.

Longhurst, P., Ledda, P., & Chalmers, A. (2003). Psychophysically based artistic techniques for increased perceived realism of virtual environments. In Proceedings of the 4th International Conference on Computer Graphics, Virtual Reality, Visualisation and Interaction in Africa - AFRIGRAPH '03 (pp. 123–132). New York: ACM Press.

Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems Special Issue on Networked Reality (E77D), 12, 1321-1329.

Naimark, M. (2005). Two Unusual Projection Spaces. Presence: Teleoperators and Virtual Environments, Special Issue on Projection, 14(5), 597-605.

Naimark, M. (1984). "'Displacements'. An exhibit at San Francisco Museum of Modern Art". Retrieved 2006-09-20.

Nam, T. (2005). Sketch-Based Rapid Prototyping Platform for Hardware-Software Integrated Interactive Products. In Proceedings of the Third Symposium on Applied Perception in Graphics and Visualization at SIGGRAPH – APGV’05 (pp. 1689–1692). New York: ACM Press.

Nam, T. (2006). Sketching for Hardware Software Integrated Interactive Product Design. In Proceedings of Conference on Human Factors in Computing Systems - CHI’06, Workshop on “Sketching: Nurturing Creativity: Commonalities in Art, Design, Engineering and Research”. New York: ACM Press.

Nam, T., & Lee, W. (2003). Integrating hardware and software: augmented reality based on prototyping method for digital products. In Proceedings of Conference on Human Factors in Computing Systems CHI’03 (pp. 956–957). New York: ACM Press.

Ni, T., Schmidt, G., Staadt, O., Livingston, M., Ball, R., & May, R. (2006). A Survey of Large High-Resolution Display Technologies, Techniques, and Applications. In Proceedings of IEEE Virtual Reality Conference – VR’06 (pp. 223–236). Washington: IEEE Computer Society Press.

Rademacher, P., Lengyel, J., Cutrell, E., & Whitted, T. (2001). Measuring the perception of visual realism in images. In Proceedings of the 12th Eurographics Workshop on Rendering Techniques (pp. 235–248). Springer.

Raskar, R., Welch, G., Low K., & Bandyopadhyay, D. (2001). Shader Lamps: Animating Real Objects With Image-Based Illuminations. In Proceedings of the 12th Eurographics Workshop on Rendering Techniques (pp. 89–102). Springer.

Saakes, D. (2006). Material light: exploring expressive materials. Personal Ubiquitous Computing, 10(2), 144-147.

"Dinosaur Fossil Exhibit" (PDF). Senckenberg Museum. 2006. Archived from the original (PDF) on 2006-05-17. Retrieved 2006-09-20.

Stanney, K., Samman, S., Reeves, L., Hale, K., Buff, W., Bowers, C., Goldiez, B., Nicholson, D., & Lackey, S. (2004). A paradigm shift in interactive computing: Deriving multimodal design principles from behavioural and neurological foundations. International Journal of Human-Computer Interaction, 17(2), 229-257.

Stevens, B., Jerrams-Smith, J., Heathcote, D., & Callear, D. (2002). Putting the Virtual into Reality: Assessing Object-Presence with Projection-Augmented Models. Presence: Teleoperators and Virtual Environments, 11(1), 79-92.

Umemoro, H., Keller, I., & Stappers, P. (2003). More light on your table: Table-sized Sketchy VR in support of fluid collaboration. In Proceedings of the 6th Asian Design International Conference.

Verlinden, J., Horváth, I., & Edelenbos, E. (2006). Treatise of technologies for interactive augmented prototyping. Proceedings of the 7th International Symposium on Tools and Methods of Competitive Engineering – TMCE’06. Rotterdam: Millpress.

Whitton, M., Lok, B., Insko, B., & Brooks, F. (2005). Integrating Real and Virtual Objects in Virtual Environments – Invited Paper. In Proceedings of HCI International Conference.

Other relevant publications


Bennett, E., & Stevens, B. (2006). The effect that the visual and haptic problems associated with touching a Projection Augmented model have on object-presence. Presence: Teleoperators and Virtual Environments, special issue of the best papers from the International Presence Conference, 15(4), 419-437. MIT Press.

Bennett, E., & Stevens, B. (2006). The ‘Detection, Perception and Object-Presence framework’: A unified structure for investigating illusory representations of reality. In Proceedings of SIGGRAPH's Computer Graphics and Applied Perception Symposium.