2 March 2010 | Bo Begole
What's the difference between Ubiquitous Computing ("ubicomp") and Augmented Reality ("AR")? I hear this question often, and you could replace "augmented reality" in that question with any of the following buzzy paradigms for people-interacting-with-computers: Virtual Reality, Pervasive Computing, Mobile Computing, Wearable Computing, Multi-Device Interaction, Cloud Computing, Intelligent Systems, Ambient Intelligence, Context-Aware Computing, Adaptive Systems, Machine Perception, Social Computing, Smart Environments, Everyware, and so on. For the most part, I don’t find formal definitions useful; you can call it whatever suits your fancy. All that matters is that I understand what you mean when you use a term and that you understand what I mean when I use it. The attributes of a definition that carry lasting meaning are not technological properties (performance, cost, size, distribution, latency), but the core capabilities that the paradigm enables for usage.
26 October 2009 | Maurice Chu
This face tracker outputs a cloud of points mapped to image coordinates, which can be used to determine the 3D orientation of a face relative to the video camera. The algorithm is also robust to deformable features and occlusions, such as eyeglasses, eye blinks, and lip movements.
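The post doesn't describe the tracker's internals, but a common way to recover 3D head orientation from such a 2D point cloud is a weak-perspective (scaled orthographic) pose fit against a 3D landmark model. The sketch below is illustrative, not PARC's actual algorithm; the function name and the assumption of known 2D-3D landmark correspondences are mine.

```python
import numpy as np

def estimate_head_pose(model_pts, image_pts):
    """Fit a weak-perspective pose from 2D-3D landmark correspondences.

    model_pts: (N, 3) landmark positions in a head-centered frame.
    image_pts: (N, 2) corresponding image coordinates from the tracker.
    Returns (R, s): a 3x3 rotation estimate and a scale factor.
    """
    # Under scaled orthography: image = s * (R @ X)[:2] + t, so each image
    # coordinate is an affine function of X. Solve for the 4x2 affine map.
    A = np.hstack([model_pts, np.ones((len(model_pts), 1))])   # (N, 4)
    P, *_ = np.linalg.lstsq(A, image_pts, rcond=None)          # (4, 2)
    r1, r2 = P[:3, 0], P[:3, 1]                                # s * first two rows of R
    s = (np.linalg.norm(r1) + np.linalg.norm(r2)) / 2          # average out noise
    r1 = r1 / np.linalg.norm(r1)
    r2 = r2 / np.linalg.norm(r2)
    r3 = np.cross(r1, r2)                                      # complete the rotation
    R = np.vstack([r1, r2, r3])
    return R, s
```

With enough non-coplanar landmarks this gives the face's yaw/pitch/roll relative to the camera; a full-perspective solver (e.g. a PnP method) would refine it when the face is close to the lens.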
23 July 2009 | Lawrence Lee
Search engines like Google have trained us to believe we can find the answer to any question. Now activity streams from Twitter, Facebook, and others are changing our expectations around information yet again. We now demand information in real-time that’s socially and contextually relevant. Contextual information transforms our interactions within our physical environment... This area of research is called Augmented Reality, and it spans a wide spectrum of applications...
1 July 2009 | Bo Begole
We've found that there are certain types of information that shoppers need but still cannot get online. Certain kinds of tactile and physical information cannot easily be communicated electronically: texture, fit, drape, flow, movement, light refraction, heft, etc. So, people still visit stores to find out how things feel. We can, however, help shoppers by supplementing their decision-making processes with electronic information.
30 June 2009 | Bo Begole
"Responsive Media" applications are one of the most exciting areas of current research in human-computer interaction. Based on technologies that can detect human response using cameras and other sensors to glean demographic data (gender, race, age) and physiological states (eye gaze, orientation, pupil dilation, skin temp, expression), these applications can be used for human-robot interaction, marketing, gaming, digital concierge avatars, and more.