The emergence of mixed reality and innovative display technologies promises to dissolve the boundary between the digital and the physical, moving us toward seamless ubiquitous computing. The conceptual foundation was laid in the early 1990s, when pioneers Paul Milgram and Fumio Kishino proposed the reality-virtuality continuum, spanning completely real to completely virtual environments. In between lies mixed reality (MR), which blends the two in varying proportions of augmented reality and augmented virtuality. This framework set the stage for dynamic, contextually aware interfaces situated across reality and cyberspace. VR (virtual reality) and AR (augmented reality) offered the first stepping stones toward commonplace mixed reality experiences, while interaction modalities like touchscreens, natural gestures, and spatial mapping unlocked more intuitive navigation of blended spaces. As these technologies mature, MR aims to make interfaces disappear into the fabric of reality: displays overlay graphics onto everyday physical objects and environments, touchable interfaces allow direct manipulation of virtual content, and spatial audio completes the multi-sensory immersion. An era of ubiquitously integrated reality-virtuality draws near, in which computing recedes invisibly into the background of life itself.

Jun Rekimoto · 01/04/2002
At the heart of Rekimoto's work is SmartSkin, a capacitive sensing architecture that turns tables, walls, and other everyday surfaces into interactive input devices, tracking the positions and shapes of hands and fingers through a grid of transmitter and receiver electrodes. It is a seminal work in gesture-based interaction that anticipated modern touchscreen technology.
Impact and Limitations: SmartSkin paved the way for contemporary touchscreen technology and the gestural interfaces now found in smartphone touchscreens and interactive kiosks. Nevertheless, issues like gesture ambiguity and 'Gorilla Arm Syndrome' remain unaddressed. Future research could focus on mitigating these challenges and refining gesture recognition algorithms.
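
As a rough illustration of the kind of sensing pipeline SmartSkin pioneered, the sketch below takes a 2D grid of normalized capacitance readings and estimates touch coordinates by finding local peaks and refining them with a weighted centroid. The threshold, grid size, and centroid refinement are assumptions chosen for clarity; Rekimoto's system interpolates its electrode readings differently, so treat this as a simplified stand-in rather than the paper's method.

```python
import numpy as np

def estimate_touch_points(grid: np.ndarray, threshold: float = 0.5):
    """Estimate touch coordinates from a 2D array of normalized
    capacitance readings (0 = no proximity, 1 = full contact).

    Keeps cells above `threshold` that are local maxima, then refines
    each to sub-cell precision with a weighted centroid of its 3x3
    neighborhood. Illustrative only; not SmartSkin's actual algorithm.
    """
    points = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            v = grid[r, c]
            if v < threshold:
                continue
            # Clip the 3x3 neighborhood to the grid borders.
            r0, r1 = max(r - 1, 0), min(r + 2, rows)
            c0, c1 = max(c - 1, 0), min(c + 2, cols)
            patch = grid[r0:r1, c0:c1]
            # Keep only local maxima so one touch yields one point.
            if v < patch.max():
                continue
            # Weighted centroid gives sub-cell precision.
            ys, xs = np.mgrid[r0:r1, c0:c1]
            total = patch.sum()
            points.append((float((ys * patch).sum() / total),
                           float((xs * patch).sum() / total)))
    return points

# Example: a single simulated touch peaking at grid cell (2, 3).
readings = np.zeros((8, 8))
readings[1:4, 2:5] = [[0.2, 0.5, 0.3],
                      [0.4, 0.9, 0.6],
                      [0.1, 0.4, 0.2]]
print(estimate_touch_points(readings))  # roughly [(1.92, 3.11)]
```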

Paul Milgram, Fumio Kishino · 01/12/1994
Paul Milgram and Fumio Kishino's "A Taxonomy of Mixed Reality Visual Displays" is a foundational paper that has significantly influenced the HCI field, particularly in the understanding and classification of mixed reality (MR) systems. Published in 1994, it predates the current boom in virtual and augmented reality, setting the stage for future research and development.
Impact and Limitations: This paper's impact has been long-lasting, helping shape the way we understand, design, and implement mixed-reality systems today. However, it was formulated at a time when MR technologies were in their infancy. The taxonomy may need to be updated to address newer technologies and interaction paradigms, such as brain-computer interfaces or advanced haptics.
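
For readers who think in code, here is a minimal sketch of how the taxonomy's three classification dimensions (extent of world knowledge, reproduction fidelity, and extent of presence metaphor) might be encoded for a given display system. The 0.0-1.0 scoring, the class name, and the continuum heuristic are illustrative assumptions, not part of the paper.

```python
from dataclasses import dataclass

@dataclass
class MixedRealityDisplay:
    """Places a display system on Milgram and Kishino's three taxonomy
    dimensions, each scored here on an illustrative 0.0-1.0 scale."""
    name: str
    extent_of_world_knowledge: float    # how completely the world is modeled
    reproduction_fidelity: float        # realism of the rendered imagery
    extent_of_presence_metaphor: float  # degree of immersion in the scene

    def continuum_position(self) -> str:
        """Rough placement on the reality-virtuality continuum, using the
        world-knowledge score as a simple proxy (a heuristic for
        illustration, not a rule from the taxonomy)."""
        if self.extent_of_world_knowledge < 0.33:
            return "augmented reality (mostly real environment)"
        if self.extent_of_world_knowledge < 0.66:
            return "mixed reality (substantial blending)"
        return "augmented virtuality / virtual environment"

# Illustrative examples with made-up scores.
overlay = MixedRealityDisplay("monitor-based AR overlay", 0.2, 0.4, 0.3)
headset = MixedRealityDisplay("fully modeled VR headset", 0.9, 0.7, 0.9)
for display in (overlay, headset):
    print(f"{display.name}: {display.continuum_position()}")
```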