CN103858073B - Augmented reality device, method of operating augmented reality device, computer-readable medium - Google Patents


Info

Publication number: CN103858073B (application CN201280048836.8A)
Authority: CN (China)
Other versions: CN103858073A (zh)
Prior art keywords: real, image information, user, gesture, augmented reality
Inventors: I. Katz, A. Shenfeld
Current assignee / original assignee: Eyesight Mobile Technologies Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Application filed by Eyesight Mobile Technologies Ltd
Priority: CN202210808606.2A (CN115167675A)
Publication of CN103858073A; application granted; publication of CN103858073B
Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)

Classifications

    • G06F3/011 — arrangements for interaction with the human body, e.g. user immersion in virtual reality
    • G06F3/005 — input arrangements through a video camera
    • G06F3/012 — head tracking input arrangements
    • G06F3/016 — input arrangements with force or tactile feedback as computer-generated output to the user
    • G06F3/017 — gesture-based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0304 — detection arrangements using opto-electronic means
    • G06F3/0346 — pointing devices with detection of device orientation or free movement in 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers
    • G06F3/04817 — GUI interaction techniques using icons
    • G06F3/0484 — GUI interaction techniques for the control of specific functions or operations
    • G06F3/04842 — selection of displayed objects or displayed text elements
    • G06F3/04883 — touch-screen or digitiser input of data by handwriting, e.g. gesture or text
    • G06F3/165 — management of the audio stream, e.g. setting of volume, audio stream path
    • G06T19/006 — mixed reality
    • G06T7/70 — determining position or orientation of objects or cameras
    • G06V40/28 — recognition of hand or arm movements, e.g. recognition of deaf sign language
    • H04N23/667 — camera operation mode switching, e.g. between still and video modes
    • G02B27/017 — head-mounted head-up displays; G02B27/0172 — head mounted, characterised by optical features
    • G02B2027/0138 — head-up displays comprising image capture systems, e.g. camera
    • G02B2027/014 — head-up displays comprising information/image processing systems
    • G02B2027/0178 — eyeglass type
    • G02B2027/0187 — display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G06T2200/24 — indexing scheme for image data processing involving graphical user interfaces [GUIs]


Abstract

The present disclosure provides an augmented reality device, a method of operating an augmented reality device, and a computer-readable medium. Images of a real-world scene are obtained from one or more image sensors, and the orientation and/or position of the image sensor is obtained from one or more state sensors. A real-world object on which a predetermined pointing object performs a predetermined gesture is identified in the image of the real-world scene, and data associated with the identified object is displayed on a viewing device. Also provided is a computer program comprising computer program code means for performing all the steps of the method when the program is run on a computer.

Description

Augmented reality device, method of operating augmented reality device, computer-readable medium
Technical Field
The invention relates to a method and a system for augmented reality.
Prior Art
References relevant to the background of the presently disclosed subject matter are listed below:
U.S. Patent No. 7,126,558;
U.S. Published Patent Application No. 2011/0221669;
U.S. Published Patent Application No. 2011/0270522;
GB 2465280 (A);
U.S. Published Patent Application No. 2012/0068913;
U.S. Patent No. 7,215,322;
WO 2005/091125;
WO 2010/086866;
Crowley, J. L., et al., "Finger Tracking as an Input Device for Augmented Reality", Proceedings of the International Workshop on Face and Gesture Recognition, Zurich, Switzerland, June 1995.
Acknowledgement of the above references herein should not be inferred as meaning that they are in any way relevant to the patentability of the presently disclosed subject matter.
Background
Augmented reality is a term for a real-time, direct or indirect view of a physical, real-world environment, elements of which are augmented by computer-generated information, such as text, sound, video, graphics, or GPS data. Artificial information about the environment and its objects is thus overlaid on the real world view or image. Augmentation is typically done in real-time and in the semantic context of environmental factors, so that information about the user's surrounding real world becomes interactive and digitally actionable.
The main hardware components for augmented reality are processors, displays, sensors, and input devices. These elements, in particular a CPU, a display, a camera, and MEMS sensors (e.g., accelerometer, GPS, or solid-state compass), are present in portable devices such as smartphones, allowing them to act as augmented reality platforms.
Augmented reality systems have been widely used in entertainment, navigation, assembly processes, maintenance, and medical procedures. Portable augmented reality systems have also been widely used in tourist sightseeing, where augmented reality is used to present information about the real-world objects and locations being viewed.
An immersive augmented reality experience is provided using a head-mounted display, typically in the form of goggles or a helmet. With a head-mounted display, virtual visual objects are superimposed on the user's view of the real-world scene. The head-mounted display is tracked with sensors that allow the system to align the virtual information with the physical world. For example, tracking may be performed using any one or more of technologies such as digital cameras or other optical sensors, accelerometers, GPS, gyroscopes, solid-state compasses, RFID, and wireless sensors. The head-mounted display is either optical see-through or video see-through. Optical see-through solutions include half-silvered mirrors that pass images through the lens while overlaying the information reflected into the user's eyes, and transparent LCD projectors that display the digital information and images directly or indirectly onto the user's retina.
Disclosure of Invention
The invention provides an interactive system for augmented reality. The interactive system of the present invention comprises a wearable data display device that may, for example, be incorporated into a pair of glasses or goggles. The wearable display has a device providing location-extraction capability (e.g., GPS) and a compass. The system also includes a user interface that allows the user to select the computer-generated data that augments the real-world scene being viewed. A camera obtains an image of the real-world scene being viewed, and a processor detects a predetermined object, such as a user's finger, in the image captured by the camera. When the user points at an element in the scene, data relating to that element is displayed on the data display device, superimposed onto the user's view of the scene.
Accordingly, in one aspect, the invention provides a method for augmented reality, comprising:
(a) obtaining an image of a real-world scene from one or more image sensors;
(b) obtaining one or both of orientation and position data of the image sensor from one or more state sensors;
(c) identifying, in the image of the real-world scene obtained by the one or more image sensors, a real-world object on which a predetermined pointing object performs a predetermined gesture, the gesture detection module utilizing data provided by the one or more state sensors; and
(d) presenting data associated with the identified object on a display of a viewing device.
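Purely as an illustration, steps (a) through (d) can be sketched as one pass of a frame-processing loop. Everything below — the `process_frame` function, the frame dictionary layout, and the `"tap"` gesture name — is a hypothetical sketch, not taken from the patent:

```python
# Hypothetical sketch of steps (a)-(d); data layout and names are illustrative.

def process_frame(frame, sensor_pose, known_objects):
    """Return the overlay data to present for one captured frame, or None.

    frame         -- (a) the captured image, reduced here to the detected
                     pointing gesture and its target for brevity
    sensor_pose   -- (b) orientation/position read from the state sensors
    known_objects -- mapping of object id -> data associated with it
    """
    # (c) act only when the predetermined gesture targets an identified object
    if frame.get("gesture") != "tap":
        return None
    target = frame.get("pointed_at")
    if target not in known_objects:
        return None
    # (d) data associated with the identified object, ready for the display
    return {"object": target, "pose": sensor_pose, "data": known_objects[target]}
```

A real implementation would of course run object recognition and gesture detection on pixel data; the dictionary merely stands in for their output here.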
The image sensor may be selected from: a camera, a light sensor, an IR sensor, an ultrasonic sensor, a proximity sensor, a CMOS image sensor, a Short Wave Infrared (SWIR) image sensor, and a reflection sensor. One or more of the state sensors may be selected from: an optical sensor, an accelerometer, a GPS, a gyroscope, a compass, a magnetic sensor, a sensor indicating the direction of the device relative to the Earth's magnetic field, a gravity sensor, and an RFID detector.
The data associated with the identified object may be obtained by searching a memory for data associated with the real-world object.
The predetermined object may be, for example, a hand, a part of a hand, two hands, parts of two hands, a finger, a part of a finger, or a fingertip.
The viewing device may be configured to be worn by a user, such as glasses or goggles. The viewing device may be incorporated into a mobile communication device.
The step of identifying the real-world object in the image of the real-world scene obtained by the one or more image sensors may comprise: determining the position (X, Y) of the predetermined object in the image obtained by the image sensor; and determining one or both of the position and the orientation of the display device as provided by the state sensors.
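One simple way to resolve which recognized object the determined (X, Y) position selects — offered here only as an assumed illustration, not as the patent's method — is a hit test against image-space bounding boxes, preferring the smallest box that contains the fingertip:

```python
def object_at(finger_xy, object_boxes):
    """Return the id of the object whose image-space box contains finger_xy.

    object_boxes -- {object_id: (x_min, y_min, x_max, y_max)} in pixels.
    Ties are broken by the smallest box, assuming the user points at the most
    specific object under the fingertip.
    """
    x, y = finger_xy
    hits = [
        (oid, (x1 - x0) * (y1 - y0))                  # (id, box area)
        for oid, (x0, y0, x1, y1) in object_boxes.items()
        if x0 <= x <= x1 and y0 <= y <= y1
    ]
    return min(hits, key=lambda h: h[1])[0] if hits else None
```

The display-device orientation from the state sensors would, in a full system, first be used to project each real-world object into the current image frame before this 2D test is applied.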
The method of the present invention may further comprise communicating with an external device or website. The communication may include sending a message to an application running on the external device, a service running on the external device, an operating system running on the external device, a program running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or one or more services running on the external device. The method may likewise comprise sending such a message to an application, service, operating system, or program running on the mobile communication device, to one or more applications running on a processor of the mobile communication device, to a software program running in the background of the mobile communication device, or to one or more services running on the mobile communication device.
The method may further comprise sending a message requesting data relating to a real-world object identified in an image, the message being sent to an application running on the external device, a service running on the external device, an operating system running on the external device, a program running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or one or more services running on the external device. The method may likewise comprise sending such a message to an application, service, operating system, or program running on the mobile communication device, to one or more applications running on a processor of the mobile communication device, to a software program running in the background of the mobile communication device, or to one or more services running on the mobile communication device.
The message to the external device or website may be a command. The command may be selected from: a command to run an application on the external device or website, a command to stop an application running on the external device or website, a command to activate a service running on the external device or website, a command to stop a service running on the external device or website, or a command to send data related to a real-world object identified in an image.
The message to the mobile communication device may be a command. The command may be selected from: a command to run an application on the mobile communication device, a command to stop an application running on the mobile communication device, a command to activate a service running on the mobile communication device, a command to stop a service running on the mobile communication device, or a command to send data relating to a real-world object identified in an image.
The method may further comprise: receiving, from the external device or website, data relating to the real-world object identified in the image; and presenting the received data to a user.
Communication with the external device or website may be via a communication network.
The command to the external device may be selected from: pressing a virtual key displayed on a display device of the external device; rotating a selection dial; switching between desktops; running a predetermined software application on the external device; closing an application on the external device; turning speakers on or off; turning the volume up or down; locking the external device; unlocking the external device; skipping to another track in a media player or switching between IPTV channels; controlling a navigation application; initiating a call; ending a call; presenting a notification; displaying a notification; browsing a gallery of photographs or music albums; scrolling a web page; presenting an email; presenting one or more documents or maps; controlling an action in a game; pointing at a map; zooming in or out on a map or image; painting on an image; grabbing an activation icon and pulling it from the display device; rotating an activation icon; emulating a touch command on the external device; performing one or more multi-touch commands or touch-gesture commands; typing; clicking on a displayed video to pause or play it; tagging a frame or capturing a frame from a video; presenting an incoming message; answering an incoming call, muting or rejecting an incoming call, or opening an incoming-call reminder; presenting a notification received from a web community service; presenting a notification generated by the external device; opening a predetermined application; changing the external device from a locked mode and opening a recent-calls application; changing the external device from a locked mode and opening an online service application or browser; changing the external device from a locked mode and opening an email application; changing the device from a locked mode and opening a calendar application; changing the device from a locked mode and opening a reminder application; changing the device from a locked mode and opening a predetermined application set by the user, by the manufacturer of the external device, or by a service operator; activating an activation icon; selecting a menu item; moving a pointer on a display; manipulating a touch-free mouse on a display; activating an icon on the display; and changing information on the display.
In the method of the present invention, the predetermined gesture may be selected from: a swiping gesture, a pinching motion of two fingers, pointing, a left-to-right gesture, a right-to-left gesture, an upward gesture, a downward gesture, a pressing gesture, opening a clenched fist and moving towards the image sensor, a tapping gesture, a waving gesture, a clapping gesture, a reverse clapping gesture, a splaying-fingers gesture, a reverse splaying-fingers gesture, pointing at an activation icon, holding an activating object over an activation icon for a predetermined amount of time, clicking on an activation icon, double-clicking on an activation icon, clicking on an activation icon from the right side, from the left side, from the bottom, or from the top, grabbing an activation icon, gesturing towards an activation icon from the right or from the left, pushing an object, drumming the fingers, performing a waving gesture above an activation icon, performing a blast gesture, performing a tapping gesture, performing a clockwise or counterclockwise gesture over an activation icon, sliding an icon, grabbing an activation icon with two fingers, and performing a click-drag-release motion.
The data associated with the identified object may be any one or more of visual data, audio data, or textual data. The data associated with the identified object may be an activation icon. The activation icon may be a 2D or 3D activation icon. The activation icon may be perceived by a user in 3D space in front of the user.
The method of the invention may have two or more modes of operation, and the method may change the operating mode of the system after recognizing a predetermined gesture. The operating mode may be specified by any one or more of the following: the algorithm to be employed by the gesture detection module to recognize gestures; the resolution of images captured by the image sensor; the rate of image capture by the image sensor; the level of detail of the data to be presented; the activation icons to be presented to the user; the source of the data to be presented; the activation icons to be displayed on the display device; and the active online service.
The operating mode may be a mode selected from: a mode in which the image sensor records video after a predetermined gesture is recognized; a mode in which a microphone records sound after a predetermined gesture is recognized and stops recording after another predetermined gesture is recognized; a mode of continuously monitoring video or sound and, after a predetermined gesture is detected, recording the video or sound starting from a predetermined amount of time before the gesture was recognized and stopping the recording after another predetermined gesture is recognized; a mode of adding tags to captured and real-time recorded video after a predetermined gesture is recognized; a mode of selecting a region in the field of view captured by the camera and copying the region, resized, to another location in the field of view, where a tracker is used on the selected region and the selected region is presented in real time, resized and repositioned, on the display device; and a mode in which an image is captured after the predetermined gesture is recognized.
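The gesture-triggered change of operating mode described above reduces to a lookup from the recognized gesture to the next mode. The gesture names and mode table below are invented for illustration and do not come from the patent:

```python
# Illustrative gesture -> mode table; names are assumptions, not from the patent.
MODE_ON_GESTURE = {
    "start_recording": "video_record",
    "stop_recording": "monitoring",
    "tag_frame": "tagging",
}

def next_mode(current_mode, gesture):
    """Switch modes when a recognized gesture maps to one; otherwise stay put."""
    return MODE_ON_GESTURE.get(gesture, current_mode)
```

Unrecognized gestures leave the mode unchanged, which matches the described behaviour of recording continuing until another predetermined gesture is recognized.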
The method of the present invention may further comprise: running a tracking algorithm that tracks the identified real-world object and maintains the displayed relevant visual data in a fixed position relative to the identified real-world object.
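A minimal sketch of such tracking, assuming simple exponential smoothing of per-frame detections and a fixed label offset — both assumptions for illustration; the patent does not prescribe a particular tracker:

```python
def track(prev_xy, detected_xy, alpha=0.5):
    """Smoothed estimate of the object's image position across frames."""
    if prev_xy is None:
        return detected_xy
    return (prev_xy[0] + alpha * (detected_xy[0] - prev_xy[0]),
            prev_xy[1] + alpha * (detected_xy[1] - prev_xy[1]))

def overlay_position(tracked_xy, offset=(0, -20)):
    """Keep the displayed data at a fixed position relative to the object."""
    return (tracked_xy[0] + offset[0], tracked_xy[1] + offset[1])
```

Because the overlay is computed from the tracked position each frame, the displayed data stays anchored to the object as it moves through the field of view.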
The object identification module may be operative to detect the predetermined object only if the display device has a motion level below a predetermined threshold.
The method may further comprise: providing feedback when a predetermined gesture has been recognized. The feedback may be, for example, visual feedback, auditory feedback, haptic feedback, directional vibration, air-tactile feedback, or ultrasonic feedback. The feedback may be a visual indication in a form selected from: an activation icon displayed on the display device, a change in the color of an activation icon displayed on the display device, a change in the size of an activation icon displayed on the display device, an animation of an activation icon displayed on the display device, an indication light, an indicator moving on the display device, an indicator moving on the display device that appears on top of all other images or video appearing on the display device, and the appearance of a glow around the predetermined object. The feedback may also be a vibration, a directional vibration indication, or an air-tactile indication.
In the method of the invention, the portion of an activation icon displayed on the display device at which the predetermined object is located may not be presented, so that the predetermined object appears to be on top of the activation icon.
The activation icon may be removed from the display device when the display device has a motion level above a predetermined threshold, and may be displayed again when the motion level of the display device falls below the predetermined threshold.
The system may be brought into the active mode when a predetermined action is performed. The predetermined action may be selected from: bringing the predetermined object into the field of view from below; the user placing the predetermined object in a certain position or pose, e.g. pointing at the lower-right corner of the camera field of view or opening a hand in the camera field of view; when an activation icon is displayed, performing a predetermined gesture associated with the activation icon, such as pointing at the activation icon; performing a predetermined gesture such as moving a hand from right to left across the field of view, or performing a waving gesture at the location where the activation icon is presented; sliding a floating activation icon from one location to another by performing a gesture in 3D space at the location where the activation icon is perceived to be; touching the device; or tapping on the device, if the device has an accelerometer. As another example, if the device has a proximity sensor or an ultrasonic sensor, the system may enter the active mode when the user's hand is near the device. The system may also be activated by a voice command, or when the user places the predetermined object at a particular position in the field of view. As another example, the system may enter the active mode only if there is relevant data associated with the real world in the field of view of the user. The system may then indicate to the user when there is relevant data to be presented, or when it is ready for interaction.
The method of the present invention may further comprise attaching a visual indication to a real-world object to indicate that there is stored data related to the real-world object. The visual indication may be overlaid on an image of the real-world object. The visual indication may be selected from the group consisting of an activation icon, a photograph, and an image of an envelope.
The method of the present invention may further comprise a calibration process that records one or more physical parameters of the predetermined object. The calibration process may include any one or more steps selected from: presenting activation icons on the display at different locations in 3D space; extracting physical characteristics of the predetermined object; and determining a correlation between the size of the predetermined object and its distance from the camera. The calibration process may include the step of constructing a triangle having vertices at one of the image sensors and at the tip of the predetermined object, with a side formed by the user's line of sight. The distance of the real-world object from the camera may then be estimated based on the information extracted in the calibration.
The method may further comprise displaying a keyboard that enables text entry. The keyboard may be displayed after a predetermined gesture is detected, such as a right-to-left gesture, presenting an open hand, or presenting two open hands in a predetermined area of the field of view of the image sensor. The keyboard may also be displayed after a tapping gesture is performed in a 3D typing area or at a location where a predetermined activation icon is perceived to be.
The present invention also provides a system comprising an apparatus configured to perform the method of the present invention.
The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer. The computer program may be embodied on a computer readable medium.
The user may interact with the visual images, which are typically displayed through glasses. The user's real view is thus augmented by the information presented on the display. One problem with augmented reality devices is how the user interacts with and controls the device. Conventional control devices, such as a mouse, trackball, or touch screen, are difficult to use with augmented reality devices. Using gesture recognition in an augmented reality system is not simple, because the user, and therefore the augmented reality device, is continuously moving in real time.
The invention thus provides a computer program product comprising instructions for causing a processor to perform a method comprising the steps of:
receiving image information associated with an environment from an image sensor associated with an augmented reality device;
displaying, on a display associated with the device, augmentation information related to the environment;
recognizing, in the image information, a gesture of a user of the device;
associating the gesture with the augmentation information; and
changing the displayed augmentation information based on the association.
The augmentation information may include at least one of: information associated with an object in the environment; an image associated with the environment; and a distance associated with the environment.
The associating may include: determining a reference position in three-dimensional space of at least a portion of a user's hand; and determining at least one of augmentation information and image information data associated with the reference position.
The changing may include: changing the augmentation information according to the data associated with the reference position.
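The associating step above can be sketched in code: the gesture is reduced to a 3D reference position of the user's hand, and the augmentation item nearest to that position is selected. This is a hedged illustration only; the `associate` function, the item dictionaries, and the distance threshold are all invented assumptions, not the patent's implementation.

```python
import math

def associate(reference_pos, augmentation_items, max_distance=0.5):
    """Return the augmentation item whose 3D anchor is closest to the hand's
    reference position, or None if nothing lies within max_distance
    (metres, an assumed unit)."""
    best_item, best_dist = None, max_distance
    for item in augmentation_items:
        d = math.dist(reference_pos, item["anchor"])  # Euclidean distance in 3D
        if d < best_dist:
            best_item, best_dist = item, d
    return best_item

items = [
    {"label": "billboard info", "anchor": (0.0, 0.0, 2.0)},
    {"label": "building info",  "anchor": (1.5, 0.2, 3.0)},
]
selected = associate((0.1, 0.0, 2.1), items)
print(selected["label"])  # the billboard anchor is nearest to the fingertip
```

The "changing" step would then update only the display data attached to the returned item.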
Drawings
In order to understand the invention and to see how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
FIG. 1 schematically shows a system for augmented reality according to an embodiment of the invention;
FIG. 2 shows a system for augmented reality according to an embodiment of the invention, the system comprising a set of goggles;
FIG. 3 shows the system of FIG. 2 in use;
FIG. 4a shows a view of a real world scene displayed on a display device of the system of FIG. 2; FIG. 4b shows the view of FIG. 4a with the user's finger pointing at an object in the view; FIG. 4c shows visual text relating to the object pointed to by the user's finger overlaid on the view of FIG. 4b;
FIG. 5 shows a system for augmented reality integrated with a communication device according to another embodiment of the invention; and
FIG. 6a illustrates the designation of a region in the field of view of the image sensor by the user performing a gesture that "draws" the region outline; FIG. 6b shows resizing of the selected region by performing a second gesture; FIG. 6c shows the region after resizing; and FIG. 6d shows the region after being dragged to a new position in the field of view.
Detailed Description
FIG. 1 schematically shows a system 30 for augmented reality according to an embodiment of the invention. The system 30 includes one or more image sensors 32, the image sensors 32 configured to obtain images of a real-world scene. Any type of image sensor may be used in the system of the present invention, such as a camera, light sensor, IR sensor, ultrasonic sensor, proximity sensor, CMOS image sensor, short wave infrared (SWIR) image sensor, or reflective sensor.
The system 30 further includes a viewing device 34 having one or more display devices 35, the display devices 35 enabling a user to see the real world scene and external information, such as images, video or audio signals, superimposed on the real world scene. Any type of display device that allows a user to see the real world scene and the displayed data may be used in the system of the present invention.
The display device 35 may, for example, include a surface on which visual material is presented to the user, or one or more projectors that display images directly onto the user's retina. A processor 36 obtains orientation and/or location data of the system 30 from one or more state sensors 38, which may be, for example, any one or more of optical sensors, accelerometers, GPS receivers, gyroscopes, solid state compasses, magnetic sensors, gravity sensors, and RFID detectors. The processor 36 may be, for example, a dedicated processor, a general purpose processor, a DSP (digital signal processor), a GPU (graphics processing unit), dedicated hardware, or a processor that can run on an external device. The system 30 may run as software on the viewing device 34, or on another device 37 (e.g., a smartphone) that incorporates the other components of the system 30.
The processor 36 is configured to run a gesture detection module 40, the gesture detection module 40 identifying one or more real world objects to which the predetermined object is pointing in the image of the real world scene obtained by the image sensor 32. The real world object may be, for example, a building or a billboard. The determination of real world objects uses data provided by the status sensor 38. The predetermined object may be a user's finger or other object such as a stylus or wand.
When the processor 36 has identified a real world object to which the predetermined object is pointing, the processor searches the memory 42 for data associated with the identified object. The data may be, for example, visual data, audio data, or textual data. The visual data may be textual information relating to the identified object. The processor then displays the relevant visual data associated with the identified object on a display of the viewing device. The memory 42 may be integral to the system 30 or may be remotely located and accessed via a communications network, such as the internet. The system 30 may thus include a communication module 39, the communication module 39 allowing the system 30 to communicate with a network, a wireless network, a cellular network, an external device (e.g., another device 30, a cell phone, a tablet), or an internet website, etc.
The data may be an activation icon. As used herein, the term "activation icon" represents an area in an image or video associated with one or more messages or commands activated by user interaction. The activation icon may be, for example, a 2D or 3D visual element, such as a virtual button, a virtual keyboard, or an icon. The activation icon is activated by means of one or more predetermined objects which are recognizable by the system and which may for example be one or more of a stylus, a hand or a part of a hand of a user, one or more fingers or a part of a finger, e.g. a fingertip. Activation of one or more of the activation icons by the predetermined object generates a message or command that is directed to the operating system, one or more services, one or more applications, one or more devices, one or more remote applications, one or more remote services, or one or more remote devices.
The processor 36 may be configured to send a message or command to the device 37 or to a remote device, to an application running on the device, to a service running on the device 37, to an operating system running on the device, to a process running on the device, to a software program running in the background, or to one or more services or processes running on the device. The message or command may be sent over a communication network such as the internet or a cellular telephone network. The command may be, for example, a command to run an application on the device, a command to stop an application running on the device, a command to activate a service running on the device, a command to stop a service running on the device, or a command to send to the processor 36 data relating to a real-world object recognized in the image by the processor 36.
The command may be a command to the device 37, such as pressing a virtual key displayed on a display device of the device; rotating a selection dial; switching between desktops; running a predetermined software application on the device; closing an application on the device; turning a speaker on or off; turning the volume up or down; locking the device; unlocking the device; skipping to another track in a media player, or switching between IPTV channels; controlling a navigation application; initiating a call; ending a call; presenting a notification; displaying a notification; navigating in a photo or music album gallery; scrolling web pages; presenting an email; presenting one or more documents or maps; controlling actions in a game; controlling interactive video or animated content; editing a video or image; pointing at a map; zooming in or out on a map or image; painting on an image; pulling an activation icon away from the display device; grasping an activation icon and pulling it away from the display device; rotating an activation icon; emulating a touch command on the device; performing one or more multi-touch commands, touch gesture commands, or typing; clicking on a displayed video to pause or play it; editing video or music commands; tagging a frame or capturing a frame from a video; cutting a subset of a video out of a video; presenting an incoming message; answering an incoming call; muting or rejecting an incoming call; opening an incoming-call reminder; presenting a notification received from a network community service; presenting a notification generated by the device; changing the lock mode of the device and activating a recent-calls application; changing the lock mode of the device and activating an online service application or browser; changing the lock mode of the device and activating an email application; changing the lock mode of the device and activating a calendar application; changing the lock mode of the device and activating a reminder application; changing the lock mode of the device and activating a predetermined application set by the user, by the manufacturer of the device, or by a service operator; activating an activation icon; selecting a menu item; moving a pointer on the display; manipulating a touch-free mouse; activating an activation icon on the display; and changing information on the display.
The communication module may be used to send a message, which may be addressed, for example, to a remote device. The message may be, for example, a command to the remote device, such as a command to run an application on the remote device, a command to stop an application running on the remote device, a command to activate a service running on the remote device, or a command to stop a service running on the remote device. The message may be a command to the remote device selected from: pressing a virtual key displayed on a display device of the remote device; rotating a selection dial; switching between desktops; running a predetermined software application on the remote device; closing an application on the remote device; turning a speaker on or off; turning the volume up or down; locking the remote device; unlocking the remote device; skipping to another track in a media player, or switching between IPTV channels; controlling a navigation application; initiating a call; ending a call; presenting a notification; displaying a notification; navigating in a photo or music album gallery; scrolling web pages; presenting an email; presenting one or more documents or maps; controlling actions in a game; pointing at a map; zooming in or out on a map or image; painting on an image; grasping an activation icon and pulling it away from the display device; rotating an activation icon; emulating a touch command on the remote device; performing one or more multi-touch commands, touch gesture commands, or typing; clicking on a displayed video to pause or play it; tagging a frame or capturing a frame from a video; presenting an incoming message; answering an incoming call; muting or rejecting an incoming call; opening an incoming-call reminder; presenting a notification received from a network community service; presenting a notification generated by the remote device; opening a predetermined application; changing the lock mode of the remote device and opening a recent-calls application; changing the lock mode of the remote device and opening an online service application or browser; changing the lock mode of the remote device and opening an email application; changing the lock mode of the remote device and opening a calendar application; changing the lock mode of the remote device and opening a reminder application; changing the lock mode of the remote device and opening a predetermined application set by the user, by the manufacturer of the remote device, or by a service operator; activating an activation icon; selecting a menu item; moving a pointer on a display; manipulating a touch-free mouse; activating an activation icon on a display; and changing information on a display.
The message may be a request for data associated with the identified object. The data request message may be addressed to an application, service, process, or thread running on the device, or to an online service.
In order to reduce CPU load, the object recognition module that detects the predetermined object may be activated only when the headset is not moving significantly, as determined from information obtained from the state sensors.
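The motion-gating idea just described can be sketched as follows. This is an illustrative assumption, not the patent's implementation: the `MotionGate` class, its threshold, and its averaging window are invented names and values standing in for "does the headset move significantly according to the state sensor".

```python
from collections import deque

class MotionGate:
    """Run the (expensive) object-recognition step only when recent
    motion-sensor readings show the headset is nearly still."""

    def __init__(self, threshold=0.05, window=5):
        self.threshold = threshold           # max average motion magnitude (assumed units)
        self.samples = deque(maxlen=window)  # recent sensor motion magnitudes

    def update(self, motion_magnitude):
        self.samples.append(motion_magnitude)

    def recognition_allowed(self):
        if not self.samples:
            return False
        return sum(self.samples) / len(self.samples) < self.threshold

gate = MotionGate()
for m in (0.01, 0.02, 0.01):   # headset nearly still
    gate.update(m)
print(gate.recognition_allowed())  # True: run the object recognizer this frame
gate.update(0.9)                   # sudden head movement
print(gate.recognition_allowed())  # False: skip recognition, save CPU
```

The same gate can also drive the icon removal described earlier (remove the icon while the gate is closed, restore it when the gate reopens).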
FIG. 2 shows a system 2 for augmented reality according to an embodiment of the invention. The system 2 includes a portable viewing device, which may be, for example, an interactive head-mounted eyepiece, such as a pair of glasses or goggles 4. The goggles 4 have an image sensor 6 that obtains images of a real-world scene 8. Scene 8 may include, for example, one or more buildings 12, or one or more billboards 14. The goggles may have one or more display devices 10, the display devices 10 being located in the goggles 4 so as to be in front of the user's eyes when the user wears the goggles 4. The display device 10 may be, for example, a see-through device, such as a transparent LCD screen, through which the real world scene is viewed and on which external data is presented. The system 2 further includes a processor 16, the processor 16 being configured to identify, in the images captured by the image sensor 6, a predetermined object that performs a gesture or points at a real world object in the real world scene 8 or at an activation icon displayed to the user. The system 2 also includes one or more position and/or orientation sensors 23, such as a GPS receiver, accelerometer, gyroscope, solid state compass, magnetic sensor, or gravity sensor.
FIG. 5 shows a system 40 for augmented reality according to another embodiment of the invention. The system 40 is integrated into a mobile communication device 42, such as a cell phone, tablet, or camera. FIG. 5a shows a front view of the communication device 42, and FIG. 5b shows a rear view. On the rear side of the communication device 42, opposite the display device, is an image sensor 46 for obtaining images of the real world scene. The front of the communication device 42 carries a display device 48, which faces the user when the camera 46 is facing the real world scene. The display device 48 may be, for example, an LCD screen that presents to the user the images of the real world scene obtained by the camera 46, together with visual data as explained below. The system 40 uses the camera 46, the display device 48, and the processor of the communication device 42, and also includes one or more state sensors contained in the housing of the communication device 42, which are not shown in FIG. 5. The processor is configured to identify, in the images captured by the image sensor 46, a predetermined object that is pointing at a real-world object in the real world scene.
FIG. 3a shows the system 2 in use. The goggles 4 are placed over the eyes of the user 18. The user faces the real world scene 8 and thus views the scene 8. FIG. 3b shows the system 40 in use. The user 18 holds the communication device 42, which has the image sensor 46 facing the real world scene 8 and the display device 48 facing the user.
The system 2 or 40 now performs the following procedure. A view of the scene 8, as it will be seen by the user when using the system 2 or 40, is displayed on the display device. FIG. 4a shows such a view of the scene 8, as seen by a user viewing the real world scene 8 with the system 2 or 40. The processor 36 analyzes the images obtained by the image sensor to determine when a predetermined object in those images performs a predetermined gesture in relation to a real world object in the real world scene 8.
The viewing device 34, such as the goggles 4 or the communication device 42, is not held fixed in use, because of the movement that occurs as the user walks or moves his head or hands. In this situation, the signals generated by the sensors 38 may be noisy and inaccurate. In this case, the machine vision module 37 runs a tracking algorithm that tracks the identified real-world object and maintains the displayed relevant visual data in a fixed position relative to the identified real-world object.
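The tracking idea just described can be sketched minimally: instead of trusting the noisy state sensors frame by frame, the overlay is re-anchored each frame to the tracked screen position of the recognized object, keeping a fixed offset. The function name, coordinate convention, and offset are illustrative assumptions, not the patent's algorithm.

```python
def reanchor_overlay(tracked_object_xy, overlay_offset=(0, -30)):
    """Place the overlay at a fixed pixel offset from the tracked object,
    so the visual data stays pinned to the object as the user's head moves."""
    x, y = tracked_object_xy
    dx, dy = overlay_offset
    return (x + dx, y + dy)

# The billboard drifts across the image as the head moves; the overlay follows.
frames = [(100, 200), (104, 198), (110, 195)]
for obj_xy in frames:
    print(reanchor_overlay(obj_xy))
```

In a real system the `tracked_object_xy` values would come from a machine-vision tracker rather than a hard-coded list.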
The predetermined gesture relating to the real world object or activation icon may be, for example, pointing at the real world object or activation icon, or performing a page-turn gesture over it. The activation icon may or may not be related to the real world object.
Other possible predetermined gestures include a page-flip gesture, a pinching motion of two fingers (e.g., index finger and thumb, or middle finger and thumb), pointing, a left-to-right gesture, a right-to-left gesture, an upward gesture, a downward gesture, a pressing gesture, opening a closed fist and moving towards the image sensor, a tapping gesture, a waving gesture, a clapping gesture, a reverse clapping gesture, closing a hand into a fist, a pinching gesture, a reverse pinching gesture, a gesture of splaying the fingers of a hand, a reverse gesture of splaying the fingers of a hand, pointing at an activation icon or real-world object for a predetermined amount of time, clicking on an activation icon or real-world object, double-clicking on an activation icon or real-world object, clicking on an activation icon or real-world object with an index finger, clicking on an activation icon or real-world object with a middle finger, clicking on an activation icon or real-world object from below, clicking on an activation icon, grasping an activation icon or real-world object from above, pointing at an activation icon or real-world object from the right, pointing at an activation icon or real-world object from the left, passing through an activation icon or real-world object from the left, pushing an activation icon or real-world object, clapping or waving a hand over an activation icon or real-world object, performing a blast gesture, performing a tapping gesture, performing a clockwise or counterclockwise gesture over an activation icon or real-world object, sliding an activation icon or real-world object, grasping an activation icon or real-world object with two fingers, or performing a click-drag-release motion.
The predetermined object may be, for example, a user's hand, a portion of a user's hand, such as a user's finger 20, or portions of two different hands. Alternatively, the predetermined object may be a stylus or wand.
When the processor 16 determines that the predetermined gesture has been performed, this may be indicated to the user by any type of feedback, such as visual feedback, auditory feedback, tactile feedback, directional vibration, aero-tactile feedback, or ultrasonic feedback. The feedback may be a visual indication in a form selected from: an activation icon displayed on the display device, a change in the color of an activation icon displayed on the display device, a change in the size of an activation icon displayed on the display device, an animation of an activation icon displayed on the display device, an indication light, an indicator moving on the display device, a vibration, a directional vibration indication, or an aero-tactile indication. The indication may be provided by an indicator that moves on the display device and appears on top of all other images or videos appearing on the display device. The visual feedback may be the appearance of a glow around the predetermined object when the system recognizes it.
The gesture detection module 40 may use any method for detecting a predetermined object in an image obtained by the image sensor 32. For example, the gesture detection module may detect a predetermined object as disclosed in WO2005/091125 or WO 2010/086866.
The processor 16 is also configured to determine the real world object in the scene 8 on which the predetermined gesture is performed. Thus, for example, in the image shown in FIG. 4b, the processor 16 determines that the user's finger 20 is pointing at the billboard 14 by determining the location (X, Y) of the fingertip in the image and combining this information with the user's location, obtained from the sensors 23, and the direction of the goggles 4. The real world object is thus recognized by the processor without presenting a cursor or other marker to the user to indicate the real world object that the user wishes to select, enabling direct pointing at the real world object to initiate interaction. The processor 16 searches a memory, which may be integral with the processor 16 or may be remotely located, for data relating to the real world object pointed to by the user's finger 20. For example, the memory may store data relating to the billboard 14. When the user points at an object in the scene 8 whose data is stored in the memory or retrieved from a remote server, such as a website, the data is displayed on the display device 10 superimposed on the user's view of the scene. Thus, when the user points at the billboard 14 (FIG. 3), visual data 21 relating to the billboard 14 is displayed on the display device 10, as shown in FIG. 4c.
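The pointing determination just described can be sketched as a hit test: the fingertip's (X, Y) image position is checked against the image regions of known real-world objects (regions which, in the patent's terms, are derived from the user's location and the goggles' direction). The function, rectangle format, and object names below are invented for illustration only.

```python
def object_at(fingertip_xy, object_regions):
    """Return the name of the first object whose image rectangle
    (x_min, y_min, x_max, y_max) contains the fingertip position,
    or None if the fingertip points at no known object."""
    fx, fy = fingertip_xy
    for name, (x0, y0, x1, y1) in object_regions.items():
        if x0 <= fx <= x1 and y0 <= fy <= y1:
            return name
    return None

# Illustrative image regions for the billboard 14 and building 12.
regions = {"billboard": (50, 40, 120, 90), "building": (140, 10, 260, 200)}
print(object_at((80, 60), regions))  # fingertip falls inside the billboard region
```

Once a name is returned, the associated visual data would be fetched from memory or a remote server and overlaid, as the text describes.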
The visual data 21 may be static or dynamic. The visual data 21 may include one or more activation icons such that when a predetermined gesture is performed with respect to one of the activation icons, a command associated with the activation icon is executed. The command may be, for example, to display specific visual material related to the selected real-world object. The activation icon may be a 2D or 3D activation icon and may be presented to the user such that the user perceives the icon in 3D space in front of him. As used herein, an activation icon is an area in a 2D or 3D image or video associated with one or more messages that are interactively activated by a user. The activation icon may be, for example, a 2D or 3D visual element. The activation icon may be a virtual button, a virtual keyboard, a 2D or 3D activation icon, an area in an image or video. The activation icons may include two or more activation icons.
The processor may refrain from presenting the portion of the activation icon at which the predetermined object is located, so that the predetermined object appears to be on top of the activation icon. The activation icon may be removed when the user moves his head quickly, and restored when the head movement falls below a predetermined speed.
The system 2 may have two or more modes of operation, and the processor 16 may be configured to recognize one or more predetermined gestures to change between modes of operation. Thus, the gesture may be used to turn the system on or off, select a source of visual material to be presented, select a level of detail of visual material to be presented, select a button or activate an icon to be presented to the user, or activate an online service, such as an online service with respect to a selected real-world object. Another mode of operation may be to start video recording of images with the image sensor and/or recording of sounds with the microphone after a predetermined gesture is recognized, and stop recording after another predetermined gesture is recognized. Another mode of operation is to continuously monitor video and/or sound, but after a predetermined gesture is detected, record video/sound beginning a predetermined amount of time before the gesture is recognized, and stop recording after another predetermined gesture is recognized. The predetermined time may be user defined. Another mode of operation is to add tags to the captured and real-time recorded video after the predetermined gesture is recognized.
FIGS. 6a to 6d show another mode of operation. In FIG. 6a, a region 62 in the field of view 60 captured by the image sensor is designated by the user performing a gesture that "draws" the region outline, as shown by the dashed line in FIG. 6a. The selected region is then resized by the user performing a second gesture, e.g. spreading two fingers apart or bringing them closer together, as indicated by the arrows 66 in FIG. 6b, until the selected region reaches the desired size (67 in FIG. 6c). The region 67 is then dragged to a new position in the field of view (FIG. 6d) and copied to that new position. The system then uses a tracker on the selected region, and the selected region is presented in real time, in the resized and repositioned area set by the user, on the display device.
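The select/resize/drag mode of FIGS. 6a to 6d can be sketched as a small data structure with the two operations the gestures trigger. The `Region` class, its scaling-about-the-corner convention, and the numbers are illustrative assumptions, not the patent's implementation.

```python
class Region:
    """A user-designated rectangular region in the camera's field of view."""

    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h

    def resize(self, factor):
        """Scale about the region's top-left corner (the pinch/spread gesture)."""
        self.w = int(self.w * factor)
        self.h = int(self.h * factor)

    def move_to(self, x, y):
        """Drag the region to a new position in the field of view."""
        self.x, self.y = x, y

r = Region(10, 10, 40, 30)   # region "drawn" by the outline gesture (FIG. 6a)
r.resize(2.0)                # two-finger spread doubles its size (FIGS. 6b, 6c)
r.move_to(200, 120)          # region dragged to a new position (FIG. 6d)
print((r.x, r.y, r.w, r.h))  # (200, 120, 80, 60)
```

A tracker would then render the live content of the originally outlined area into this resized, repositioned rectangle each frame.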
To minimize CPU load, for each displayed activation icon, a bounding box around the displayed activation icon may be defined, and the image region containing it treated as unchanged. The system tracks this bounding box using a machine vision tracker. The bounding box is treated as unchanged as long as the distance between the positions of the bounding box in two frames of the video sequence, as determined using the video tracker, is less than a predetermined distance, and the correlation value of the tracker for the bounding box is below a predetermined value.
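The distance part of that stability test can be sketched in a few lines. The threshold and the (x, y) box-position convention are illustrative assumptions; a real implementation would also apply the correlation-value check the text mentions.

```python
import math

def box_unchanged(prev_box_xy, curr_box_xy, max_shift=5.0):
    """True when the tracked bounding box moved less than max_shift pixels
    between two frames, so the cached icon region can be reused instead of
    being re-analyzed."""
    return math.dist(prev_box_xy, curr_box_xy) < max_shift

print(box_unchanged((100, 50), (102, 51)))  # True: small jitter, reuse the cache
print(box_unchanged((100, 50), (140, 80)))  # False: the box moved, re-analyze
```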
When the system is in an operating mode in which only the activation icons are active and real world objects are not, CPU load may be minimized by searching for the predetermined object only in the vicinity of each displayed activation icon. To reduce CPU load further, the object recognition module is activated only when the headset is determined not to have moved significantly, based on the information obtained from the state sensors.
The user may select different filters to filter data related to real world objects, such as filters that "show data generated only by friends," or that show data from registered sources or data generated in the past three months.
System 2 may have a standby mode in which the power consumption of system 2 is minimal. The active mode may differ from the standby mode in, for example: the number of video frames per second analyzed by the system, the image resolution analyzed, the portion of the image frame analyzed, and/or the detection modules that are activated. The system 2 may be brought into the active mode by any technique. For example, system 2 may be brought into the active mode by: bringing the predetermined object into the field of view from below; the user placing the predetermined object in a certain position or pose, e.g. pointing at the lower-right corner of the camera field of view or opening a hand in the camera field of view; when the activation icon is displayed, performing a predetermined gesture associated with the activation icon, such as pointing at the activation icon; performing a predetermined gesture such as moving a hand from right to left across the field of view, or performing a waving gesture at the location where the activation icon is presented; sliding the floating activation icon from one location to another by performing a gesture in 3D space at the location where the activation icon is perceived to be; touching the device; or tapping on the device, if the device has an accelerometer. As another example, if the device has a proximity sensor or an ultrasonic sensor, the system may enter the active mode when the user's hand is near the device. The system may also be activated by a voice command, or when the user places the predetermined object at a specific position in the field of view. As another example, the system may enter the active mode only if there is relevant data associated with the real world in the user's field of view. The system may then indicate to the user when there is relevant data to be presented, or when it is ready for interaction.
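The standby/active distinction described above amounts to two per-frame workload profiles. The sketch below is a hedged illustration: the parameter names and all numbers are invented, chosen only to show how frame rate, analyzed resolution, analyzed frame fraction, and active detectors could differ between modes.

```python
# Illustrative mode profiles; every value here is an assumption.
MODES = {
    "standby": {"fps": 2,  "scale": 0.25, "frame_fraction": 0.5,
                "detectors": ["wake_gesture"]},
    "active":  {"fps": 30, "scale": 1.0,  "frame_fraction": 1.0,
                "detectors": ["wake_gesture", "object_recognition",
                              "gesture_recognition"]},
}

def frames_analyzed_per_second(mode):
    """Rough proxy for CPU load: frames/s weighted by the analyzed area
    (resolution scale squared times the fraction of the frame examined)."""
    p = MODES[mode]
    return p["fps"] * p["scale"] ** 2 * p["frame_fraction"]

print(frames_analyzed_per_second("standby"))  # a small fraction of the active load
print(frames_analyzed_per_second("active"))
```

Switching modes is then just swapping which profile drives the analysis loop when one of the activation events listed above is detected.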
A visual indication may be attached to a real-world object to let the user know that there is data related to that object.
The indication of the relevant data may be overlaid at the location of the real-world object as a small visual indication: for example, an activation icon of "i" may indicate information, a "photo" mark may indicate an image related to the real-world object, and an "envelope" mark may indicate a message related to the real-world object left by a friend or another user. The data may be presented when the user performs a predetermined gesture associated with the activation icon.
The system 2 may be configured to go through a calibration process that records various physical parameters of the predetermined object, to help the processor identify the predetermined object in images obtained by the camera. This may be done, for example, by presenting activation icons to the user at different locations in 3D space on the display; extracting physical features of the predetermined object, such as its size or orientation; and determining a correlation between the size of the predetermined object and its distance from the camera. Calibration may include computing the triangle formed by the camera, the user's line of sight, and the tip of the predetermined object, in order to determine the direction in which the user is pointing. Accuracy is improved by estimating the distance of a real-world object from the camera based on the information extracted during calibration.
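The size-to-distance correlation mentioned above can be illustrated with a simple pinhole-camera sketch: apparent size is roughly inversely proportional to distance, so their product is roughly constant and can be averaged over the calibration samples. This is an assumed model for illustration, not the system's actual calibration procedure:

```python
def calibrate_size_distance(samples):
    """
    samples: (apparent_size_px, distance_cm) pairs gathered while activation
    icons are presented at known positions in 3D space. Under a pinhole-camera
    assumption, apparent size ~ 1 / distance, so size * distance is roughly
    constant; average that product over the calibration samples.
    """
    return sum(size * dist for size, dist in samples) / len(samples)

def estimate_distance(k, apparent_size_px):
    """Estimate an object's distance from its apparent size, using the calibrated constant k."""
    return k / apparent_size_px
```

After calibration, each new observation of the predetermined object's apparent size yields a distance estimate, which is what improves the pointing-direction accuracy.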
The processor may be configured to identify, in images of the real-world scene obtained by the camera, another user of a system of the invention. Identifying another user in the real-world scene may be performed, for example, by notifying a remote server of the device's location within a particular geographic area; the locations of the other devices in that geographic area may then be transmitted to all devices in the area.
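The location-sharing step can be sketched with a toy presence server. This is a stand-in for illustration only; the class and method names are assumptions, and the actual protocol between device and remote server is not specified here:

```python
from collections import defaultdict

class PresenceServer:
    """Toy stand-in for the remote server that tracks which devices
    are currently in each geographic area."""

    def __init__(self):
        self._areas = defaultdict(set)

    def report_location(self, device_id, area):
        # A device re-reporting its location moves between areas.
        for members in self._areas.values():
            members.discard(device_id)
        self._areas[area].add(device_id)

    def devices_nearby(self, device_id, area):
        # The other devices in the area, as transmitted to each device.
        return self._areas[area] - {device_id}
```

A device that learns which peers share its geographic area can then attempt to match those peers against faces detected in the camera image.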
When a communication link exists between two systems of the present invention, the two systems can be used to play a game. The other user may be represented as an avatar with whom the user can interact through gestures, for example by sending the other user a message such as "like."
The processor may be configured to display a keyboard that enables text entry with one or more fingers or hands. Display of the keyboard may begin after a predetermined gesture is detected, such as a right-to-left gesture, presenting an open hand, or presenting two open hands in a predetermined area of the camera's field of view, such as the bottom of the field of view. Another way to begin displaying the keyboard is for the user to perform a tap gesture in the typing area, or in the 3D space where an activation icon is perceived to be located. The keyboard may be used, for example, to write a note, conduct a search, or communicate with an online service (e.g., Skype or Twitter) by typing on the virtual keyboard. The system may refrain from presenting the portion of the keyboard where the predetermined object is located, so that the predetermined object, e.g., the user's hand, appears to be "above" the keyboard.
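The masking just described can be sketched as a per-pixel composite: the keyboard is drawn everywhere except where a hand mask says the hand is, so the real hand stays visible on top. The plain nested-list representation below is an assumption for illustration, not the actual rendering pipeline:

```python
def composite_keyboard(frame, keyboard, hand_mask):
    """
    Overlay a virtual keyboard on a camera frame, skipping every pixel covered
    by the hand mask so the user's real hand appears to sit 'above' the
    keyboard. All three inputs are equally sized 2D grids (nested lists);
    hand_mask holds 1 where the hand is detected and 0 elsewhere.
    """
    return [
        [px if masked else key_px
         for px, key_px, masked in zip(frame_row, key_row, mask_row)]
        for frame_row, key_row, mask_row in zip(frame, keyboard, hand_mask)
    ]
```

Because the decision is made per pixel, the hand silhouette can be arbitrary; only the hand-detection mask needs to be accurate.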
When the system is in the input mode, an animated hand may be presented on the keyboard at a position corresponding to the user's hand and fingers. The fingertips of the animated hand may be positioned above the virtual keys so that the user can see the key characters. The keyboard and the animated hand are preferably opaque, so that the user cannot see the background behind the keyboard; this tends to make the keyboard clearer to the user.

Claims (18)

1. An augmented reality device, comprising: at least one processor configured to: receive, from an image sensor, video frames of image information associated with a real-world scene; provide the image information and augmented information for display; detect, in the image information, a predetermined gesture performed by a user; identify, in the image information, one or more real-world objects distinct from the user's hand and from the augmented information, wherein identifying the one or more real-world objects is associated with the detected predetermined gesture; designate selected image information associated with the identified one or more real-world objects in the video frames, wherein the selected image information is associated with the real-world scene, and wherein the information associated with the real-world scene does not include the user's hand or the displayed information; mark, by the gesture, the selected image information associated with the identified one or more real-world objects other than the user's hand, to designate an area associated with the selected image information; and record at least one of video information associated with the designated area of the selected image information and video information beginning a predetermined amount of time before the detected predetermined gesture.

2. The augmented reality device of claim 1, wherein the predetermined gesture includes at least one of: drawing an outline associated with a real-world object, or pointing at a real-world object.

3. The augmented reality device of claim 1, wherein the at least one processor is further configured to resize the designated area.

4. The augmented reality device of claim 1, wherein the recording begins a predetermined time before the detected gesture.

5. The augmented reality device of claim 1, wherein the recording ends a predetermined time after the detected gesture.

6. The augmented reality device of claim 1, wherein the at least one processor is further configured to stop the recording based on detection of a second gesture.

7. The augmented reality device of claim 1, wherein the at least one processor is further configured to capture a frame from the video frames of image information in association with a second predetermined gesture detected in the image information.

8. The augmented reality device of claim 1, wherein the time is a predetermined amount of time defined by the user.

9. The augmented reality device of claim 1, wherein the at least one processor is further configured to run a tracking algorithm in response to a detected second predetermined gesture, the tracking algorithm tracking the identified real-world object in the image information.

10. The augmented reality device of claim 1, wherein the at least one processor is further configured to employ a tracker on the selected image information to present a real-time view of the real-world objects in the selected information in a resized area in the user's field of view.

11. A method of operating an augmented reality device, the method comprising the steps of: receiving, by at least one processor, from an image sensor, video frames of image information associated with a real-world scene; providing the image information and augmented information for display; detecting, by the at least one processor, in the image information, a predetermined gesture performed by a user; designating, by the at least one processor, in the video frames, an area of selected image information associated with a real-world object distinct from the user's hand and from the augmented information, wherein the selected image information is associated with the real-world scene not including the user's hand and the displayed information, and wherein the designated area is associated with the detected predetermined gesture; tracking, based on the designated area, the selected image information associated with the real-world scene not including the user's hand; fixing the area of the selected image information based on the tracking; and recording at least one of video information associated with the designated area of the selected image information and video information beginning a predetermined amount of time before the detected predetermined gesture.

12. The method of claim 11, wherein the predetermined gesture includes at least one of: drawing an outline associated with a real-world object, or pointing at a real-world object.

13. The method of claim 11, wherein the at least one processor is further configured to resize the designated area.

14. The method of claim 13, wherein resizing the designated area is performed by a second predetermined gesture, and wherein the second predetermined gesture comprises spreading two fingers apart or bringing two fingers together.

15. The method of claim 11, further comprising zooming in or out in association with a second predefined gesture detected in the image information.

16. The method of claim 11, further comprising employing a tracker on the selected area to present a real-time view of the real-world objects in the selected information in a resized area in the user's field of view.

17. A non-transitory computer-readable medium storing instructions that, when executed, cause at least one processor to perform a method of operating a device, the method comprising the steps of: receiving, from an image sensor, video frames of image information associated with a real-world scene; detecting, in the image information, a predetermined gesture performed by a user; providing the image information and augmented information for display; designating, in the video frames, an area of selected image information associated with the detected gesture, wherein the selected image information is associated with a real-world scene that does not include the user's hand and the displayed information; and tracking the selected image information associated with the real-world scene not including the user's hand, to record a real-time video of the selected image information associated with the real-world scene in the selected information.

18. The non-transitory computer-readable medium of claim 17, wherein the real-world scene at least partially includes information displayed on a display device.
CN201280048836.8A 2011-09-19 2012-09-19 Augmented reality device, method of operating augmented reality device, computer-readable medium Expired - Fee Related CN103858073B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210808606.2A CN115167675A (en) 2011-09-19 2012-09-19 Augmented reality device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201161536144P 2011-09-19 2011-09-19
US61/536,144 2011-09-19
PCT/IL2012/050376 WO2013093906A1 (en) 2011-09-19 2012-09-19 Touch free interface for augmented reality systems

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210808606.2A Division CN115167675A (en) 2011-09-19 2012-09-19 Augmented reality device

Publications (2)

Publication Number Publication Date
CN103858073A CN103858073A (en) 2014-06-11
CN103858073B true CN103858073B (en) 2022-07-29

Family

ID=47189999

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210808606.2A Pending CN115167675A (en) 2011-09-19 2012-09-19 Augmented reality device
CN201280048836.8A Expired - Fee Related CN103858073B (en) 2011-09-19 2012-09-19 Augmented reality device, method of operating augmented reality device, computer-readable medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202210808606.2A Pending CN115167675A (en) 2011-09-19 2012-09-19 Augmented reality device

Country Status (5)

Country Link
US (8) US20140361988A1 (en)
JP (3) JP2014531662A (en)
KR (3) KR20140069124A (en)
CN (2) CN115167675A (en)
WO (1) WO2013093906A1 (en)

Families Citing this family (267)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9865125B2 (en) 2010-11-15 2018-01-09 Bally Gaming, Inc. System and method for augmented reality gaming
CN115167675A (en) 2011-09-19 2022-10-11 视力移动技术有限公司 Augmented reality device
WO2013093837A1 (en) * 2011-12-23 2013-06-27 Koninklijke Philips Electronics N.V. Method and apparatus for interactive display of three dimensional ultrasound images
US11068049B2 (en) 2012-03-23 2021-07-20 Microsoft Technology Licensing, Llc Light guide display and field of view
US11169611B2 (en) * 2012-03-26 2021-11-09 Apple Inc. Enhanced virtual touchpad
US9558590B2 (en) 2012-03-28 2017-01-31 Microsoft Technology Licensing, Llc Augmented reality light guide display
US10191515B2 (en) 2012-03-28 2019-01-29 Microsoft Technology Licensing, Llc Mobile device light guide display
US9717981B2 (en) 2012-04-05 2017-08-01 Microsoft Technology Licensing, Llc Augmented reality and physical games
US10502876B2 (en) 2012-05-22 2019-12-10 Microsoft Technology Licensing, Llc Waveguide optics focus elements
TWI475474B (en) * 2012-07-30 2015-03-01 Mitac Int Corp Gesture combined with the implementation of the icon control method
KR102001218B1 (en) * 2012-11-02 2019-07-17 삼성전자주식회사 Method and device for providing information regarding the object
US10192358B2 (en) 2012-12-20 2019-01-29 Microsoft Technology Licensing, Llc Auto-stereoscopic augmented reality display
US9770203B1 (en) 2013-01-19 2017-09-26 Bertec Corporation Force measurement system and a method of testing a subject
US10856796B1 (en) 2013-01-19 2020-12-08 Bertec Corporation Force measurement system
US10646153B1 (en) 2013-01-19 2020-05-12 Bertec Corporation Force measurement system
US10413230B1 (en) 2013-01-19 2019-09-17 Bertec Corporation Force measurement system
US10231662B1 (en) 2013-01-19 2019-03-19 Bertec Corporation Force measurement system
US10010286B1 (en) 2013-01-19 2018-07-03 Bertec Corporation Force measurement system
US11052288B1 (en) 2013-01-19 2021-07-06 Bertec Corporation Force measurement system
US11311209B1 (en) 2013-01-19 2022-04-26 Bertec Corporation Force measurement system and a motion base used therein
US11540744B1 (en) 2013-01-19 2023-01-03 Bertec Corporation Force measurement system
US12161477B1 (en) 2013-01-19 2024-12-10 Bertec Corporation Force measurement system
US9526443B1 (en) * 2013-01-19 2016-12-27 Bertec Corporation Force and/or motion measurement system and a method of testing a subject
US11857331B1 (en) 2013-01-19 2024-01-02 Bertec Corporation Force measurement system
US10133342B2 (en) * 2013-02-14 2018-11-20 Qualcomm Incorporated Human-body-gesture-based region and volume selection for HMD
EP2960867A4 (en) * 2013-02-21 2016-08-03 Fujitsu Ltd DISPLAY DEVICE, METHOD, PROGRAM, AND POSITION ADJUSTMENT SYSTEM
US20140240226A1 (en) * 2013-02-27 2014-08-28 Robert Bosch Gmbh User Interface Apparatus
US9122916B2 (en) * 2013-03-14 2015-09-01 Honda Motor Co., Ltd. Three dimensional fingertip tracking
US20140285520A1 (en) * 2013-03-22 2014-09-25 Industry-University Cooperation Foundation Hanyang University Wearable display device using augmented reality
US9507426B2 (en) * 2013-03-27 2016-11-29 Google Inc. Using the Z-axis in user interfaces for head mountable displays
US9213403B1 (en) 2013-03-27 2015-12-15 Google Inc. Methods to pan, zoom, crop, and proportionally move on a head mountable display
JP6108926B2 (en) * 2013-04-15 2017-04-05 オリンパス株式会社 Wearable device, program, and display control method for wearable device
US20140094148A1 (en) 2013-05-08 2014-04-03 Vringo Infrastructure Inc. Cognitive Radio System And Cognitive Radio Carrier Device
GB2513884B (en) 2013-05-08 2015-06-17 Univ Bristol Method and apparatus for producing an acoustic field
US9672627B1 (en) * 2013-05-09 2017-06-06 Amazon Technologies, Inc. Multiple camera based motion tracking
EP2818948B1 (en) * 2013-06-27 2016-11-16 ABB Schweiz AG Method and data presenting device for assisting a remote user to provide instructions
US10295338B2 (en) 2013-07-12 2019-05-21 Magic Leap, Inc. Method and system for generating map data from an image
US20150124566A1 (en) 2013-10-04 2015-05-07 Thalmic Labs Inc. Systems, articles and methods for wearable electronic devices employing contact sensors
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US10042422B2 (en) 2013-11-12 2018-08-07 Thalmic Labs Inc. Systems, articles, and methods for capacitive electromyography sensors
US10188309B2 (en) 2013-11-27 2019-01-29 North Inc. Systems, articles, and methods for electromyography sensors
US12504816B2 (en) 2013-08-16 2025-12-23 Meta Platforms Technologies, Llc Wearable devices and associated band structures for sensing neuromuscular signals using sensor pairs in respective pods with communicative pathways to a common processor
KR102157313B1 (en) * 2013-09-03 2020-10-23 삼성전자주식회사 Method and computer readable recording medium for recognizing an object using a captured image
KR102165818B1 (en) * 2013-09-10 2020-10-14 삼성전자주식회사 Method, apparatus and recovering medium for controlling user interface using a input image
JP5877824B2 (en) * 2013-09-20 2016-03-08 ヤフー株式会社 Information processing system, information processing method, and information processing program
KR102119659B1 (en) 2013-09-23 2020-06-08 엘지전자 주식회사 Display device and control method thereof
CN103501473B (en) * 2013-09-30 2016-03-09 陈创举 Based on multifunctional headphone and the control method thereof of MEMS sensor
KR101499044B1 (en) * 2013-10-07 2015-03-11 홍익대학교 산학협력단 Wearable computer obtaining text based on gesture and voice of user and method of obtaining the text
US9740935B2 (en) * 2013-11-26 2017-08-22 Honeywell International Inc. Maintenance assistant system
US9671826B2 (en) * 2013-11-27 2017-06-06 Immersion Corporation Method and apparatus of body-mediated digital content transfer and haptic feedback
US10586395B2 (en) 2013-12-30 2020-03-10 Daqri, Llc Remote object detection and local tracking using visual odometry
US9264479B2 (en) 2013-12-30 2016-02-16 Daqri, Llc Offloading augmented reality processing
EP2899609B1 (en) * 2014-01-24 2019-04-17 Sony Corporation System and method for name recollection
DE102014201578A1 (en) * 2014-01-29 2015-07-30 Volkswagen Ag Device and method for signaling an input area for gesture recognition of a human-machine interface
US20150227231A1 (en) * 2014-02-12 2015-08-13 Microsoft Corporation Virtual Transparent Display
KR20150110032A (en) * 2014-03-24 2015-10-02 삼성전자주식회사 Electronic Apparatus and Method for Image Data Processing
WO2015161062A1 (en) * 2014-04-18 2015-10-22 Bally Gaming, Inc. System and method for augmented reality gaming
US9501871B2 (en) 2014-04-30 2016-11-22 At&T Mobility Ii Llc Explorable augmented reality displays
TWI518603B (en) 2014-05-22 2016-01-21 宏達國際電子股份有限公司 Image editing method and electronic device
US10600245B1 (en) 2014-05-28 2020-03-24 Lucasfilm Entertainment Company Ltd. Navigating a virtual environment of a media content item
KR102303115B1 (en) * 2014-06-05 2021-09-16 삼성전자 주식회사 Method For Providing Augmented Reality Information And Wearable Device Using The Same
KR101595957B1 (en) * 2014-06-12 2016-02-18 엘지전자 주식회사 Mobile terminal and controlling system
EP3180676A4 (en) * 2014-06-17 2018-01-10 Osterhout Group, Inc. External user interface for head worn computing
JP6500355B2 (en) * 2014-06-20 2019-04-17 富士通株式会社 Display device, display program, and display method
US20150379770A1 (en) * 2014-06-27 2015-12-31 David C. Haley, JR. Digital action in response to object interaction
US9959591B2 (en) * 2014-07-31 2018-05-01 Seiko Epson Corporation Display apparatus, method for controlling display apparatus, and program
JP6638195B2 (en) * 2015-03-02 2020-01-29 セイコーエプソン株式会社 DISPLAY DEVICE, DISPLAY DEVICE CONTROL METHOD, AND PROGRAM
CN104133593A (en) * 2014-08-06 2014-11-05 北京行云时空科技有限公司 Character input system and method based on motion sensing
CN104156082A (en) * 2014-08-06 2014-11-19 北京行云时空科技有限公司 Control system and intelligent terminal of user interfaces and applications aimed at space-time scenes
US9696551B2 (en) * 2014-08-13 2017-07-04 Beijing Lenovo Software Ltd. Information processing method and electronic device
US9690375B2 (en) 2014-08-18 2017-06-27 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
CN104197950B (en) * 2014-08-19 2018-02-16 奇������������份有限公司 The method and system that geography information is shown
US9910504B2 (en) * 2014-08-21 2018-03-06 Samsung Electronics Co., Ltd. Sensor based UI in HMD incorporating light turning element
JP5989725B2 (en) * 2014-08-29 2016-09-07 京セラドキュメントソリューションズ株式会社 Electronic device and information display program
DE102014217843A1 (en) * 2014-09-05 2016-03-10 Martin Cudzilo Apparatus for facilitating the cleaning of surfaces and methods for detecting cleaning work done
GB2530036A (en) 2014-09-09 2016-03-16 Ultrahaptics Ltd Method and apparatus for modulating haptic feedback
TWI613615B (en) * 2014-10-15 2018-02-01 在地實驗文化事業有限公司 Navigation system and method
US20160109701A1 (en) * 2014-10-15 2016-04-21 GM Global Technology Operations LLC Systems and methods for adjusting features within a head-up display
US10108256B2 (en) * 2014-10-30 2018-10-23 Mediatek Inc. Systems and methods for processing incoming events while performing a virtual reality session
WO2016071244A2 (en) * 2014-11-06 2016-05-12 Koninklijke Philips N.V. Method and system of communication for use in hospitals
KR102038965B1 (en) * 2014-11-26 2019-10-31 삼성전자주식회사 Untrasound sensor and object detecting method thereof
EP3236335A4 (en) 2014-12-17 2018-07-25 Konica Minolta, Inc. Electronic instrument, method of controlling electronic instrument, and control program for same
CN104537401B (en) * 2014-12-19 2017-05-17 南京大学 Reality augmentation system and working method based on technologies of radio frequency identification and depth of field sensor
US9658693B2 (en) * 2014-12-19 2017-05-23 Immersion Corporation Systems and methods for haptically-enabled interactions with objects
US9600076B2 (en) * 2014-12-19 2017-03-21 Immersion Corporation Systems and methods for object manipulation with haptic feedback
US9685005B2 (en) * 2015-01-02 2017-06-20 Eon Reality, Inc. Virtual lasers for interacting with augmented reality environments
US20160196693A1 (en) * 2015-01-06 2016-07-07 Seiko Epson Corporation Display system, control method for display device, and computer program
US10317215B2 (en) 2015-01-09 2019-06-11 Boe Technology Group Co., Ltd. Interactive glasses and navigation system
CN104570354A (en) * 2015-01-09 2015-04-29 京东方科技集团股份有限公司 Interactive glasses and visitor guide system
TWI619041B (en) * 2015-01-09 2018-03-21 Chunghwa Telecom Co Ltd Augmented reality unlocking system and method
JP2016133541A (en) * 2015-01-16 2016-07-25 株式会社ブリリアントサービス Electronic spectacle and method for controlling the same
US10317677B2 (en) 2015-02-09 2019-06-11 Microsoft Technology Licensing, Llc Display system
US10018844B2 (en) 2015-02-09 2018-07-10 Microsoft Technology Licensing, Llc Wearable image display system
US20170061700A1 (en) * 2015-02-13 2017-03-02 Julian Michael Urbach Intercommunication between a head mounted display and a real world object
JP6771473B2 (en) 2015-02-20 2020-10-21 ウルトラハプティクス アイピー リミテッドUltrahaptics Ip Ltd Improved algorithm in the tactile system
US9886633B2 (en) * 2015-02-23 2018-02-06 Vivint, Inc. Techniques for identifying and indexing distinguishing features in a video feed
EP3267295B1 (en) * 2015-03-05 2021-12-29 Sony Group Corporation Information processing device, control method, and program
JP6596883B2 (en) 2015-03-31 2019-10-30 ソニー株式会社 Head mounted display, head mounted display control method, and computer program
US20160292920A1 (en) * 2015-04-01 2016-10-06 Caterpillar Inc. Time-Shift Controlled Visualization of Worksite Operations
US10156908B2 (en) * 2015-04-15 2018-12-18 Sony Interactive Entertainment Inc. Pinch and hold gesture navigation on a head-mounted display
JP6534292B2 (en) * 2015-04-24 2019-06-26 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ Panasonic Intellectual Property Corporation of America Head mounted display and control method of head mounted display
US10055888B2 (en) 2015-04-28 2018-08-21 Microsoft Technology Licensing, Llc Producing and consuming metadata within multi-dimensional data
DE102015211515A1 (en) * 2015-06-23 2016-12-29 Siemens Aktiengesellschaft Interaction system
US10409443B2 (en) * 2015-06-24 2019-09-10 Microsoft Technology Licensing, Llc Contextual cursor display based on hand tracking
US10156726B2 (en) * 2015-06-29 2018-12-18 Microsoft Technology Licensing, Llc Graphene in optical systems
US10818162B2 (en) 2015-07-16 2020-10-27 Ultrahaptics Ip Ltd Calibration techniques in haptic systems
CN105138763A (en) * 2015-08-19 2015-12-09 中山大学 Method for real scene and reality information superposition in augmented reality
CN112557676B (en) * 2015-08-25 2025-04-29 株式会社日立高新技术 Marking method
CN105205454A (en) * 2015-08-27 2015-12-30 深圳市国华识别科技开发有限公司 System and method for capturing target object automatically
KR102456597B1 (en) * 2015-09-01 2022-10-20 삼성전자주식회사 Electronic apparatus and operating method thereof
KR101708455B1 (en) * 2015-09-08 2017-02-21 엠더블유엔테크 주식회사 Hand Float Menu System
CN105183173B (en) * 2015-10-12 2018-08-28 重庆中电大宇卫星应用技术研究所 It is a kind of by tactics and Morse code gesture input and the device for being converted to voice
CN113220116A (en) 2015-10-20 2021-08-06 奇跃公司 System and method for changing user input mode of wearable device and wearable system
DE102015221860A1 (en) * 2015-11-06 2017-05-11 BSH Hausgeräte GmbH System and method for facilitating operation of a household appliance
CN105872815A (en) * 2015-11-25 2016-08-17 乐视网信息技术(北京)股份有限公司 Video playing method and device
EP3182328A1 (en) * 2015-12-17 2017-06-21 Nokia Technologies Oy A method, apparatus or computer program for controlling image processing of a captured image of a scene to adapt the captured image
US9697648B1 (en) 2015-12-23 2017-07-04 Intel Corporation Text functions in augmented reality
JP2017129406A (en) * 2016-01-19 2017-07-27 日本電気通信システム株式会社 Information processing device, smart glass and control method thereof, and computer program
CN105843390B (en) * 2016-02-24 2019-03-19 上海理湃光晶技术有限公司 A method of image scaling and AR glasses based on the method
US10168768B1 (en) 2016-03-02 2019-01-01 Meta Company Systems and methods to facilitate interactions in an interactive space
US10133345B2 (en) 2016-03-22 2018-11-20 Microsoft Technology Licensing, Llc Virtual-reality navigation
US9933855B2 (en) * 2016-03-31 2018-04-03 Intel Corporation Augmented reality in a field of view including a reflection
AU2017244109B2 (en) 2016-03-31 2022-06-23 Magic Leap, Inc. Interactions with 3D virtual objects using poses and multiple-DOF controllers
SE541141C2 (en) * 2016-04-18 2019-04-16 Moonlightning Ind Ab Focus pulling with a stereo vision camera system
US10186088B2 (en) 2016-05-13 2019-01-22 Meta Company System and method for managing interactive virtual frames for virtual objects in a virtual environment
US9990779B2 (en) 2016-05-13 2018-06-05 Meta Company System and method for modifying virtual objects in a virtual environment in response to user interactions
ES2643863B1 (en) * 2016-05-24 2018-10-26 Sonovisión Ingenieros España, S.A.U. METHOD FOR PROVIDING BY GUIDED INCREASED REALITY, INSPECTION AND SUPPORT IN INSTALLATION OR MAINTENANCE OF PROCESSES FOR COMPLEX ASSEMBLIES COMPATIBLE WITH S1000D AND DEVICE THAT MAKES SAME USE
CN105915715A (en) * 2016-05-25 2016-08-31 努比亚技术有限公司 Incoming call reminding method and device thereof, wearable audio device and mobile terminal
WO2017217752A1 (en) * 2016-06-17 2017-12-21 이철윤 System and method for generating three dimensional composite image of product and packing box
CN106157363A (en) * 2016-06-28 2016-11-23 广东欧珀移动通信有限公司 A camera method, device and mobile terminal based on augmented reality
CN106125932A (en) * 2016-06-28 2016-11-16 广东欧珀移动通信有限公司 A method, device, and mobile terminal for identifying target objects in augmented reality
CN106155315A (en) * 2016-06-28 2016-11-23 广东欧珀移动通信有限公司 Method, device and mobile terminal for adding augmented reality effect in shooting
CN106066701B (en) * 2016-07-05 2019-07-26 上海智旭商务咨询有限公司 A kind of AR and VR data processing equipment and method
KR20180009170A (en) * 2016-07-18 2018-01-26 엘지전자 주식회사 Mobile terminal and operating method thereof
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US10990174B2 (en) 2016-07-25 2021-04-27 Facebook Technologies, Llc Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
EP3487595A4 (en) 2016-07-25 2019-12-25 CTRL-Labs Corporation SYSTEM AND METHOD FOR MEASURING MOVEMENTS OF ARTICULATED RIGID BODIES
EP3487402B1 (en) 2016-07-25 2021-05-05 Facebook Technologies, LLC Methods and apparatus for inferring user intent based on neuromuscular signals
US11179066B2 (en) 2018-08-13 2021-11-23 Facebook Technologies, Llc Real-time spike detection and identification
US11635736B2 (en) 2017-10-19 2023-04-25 Meta Platforms Technologies, Llc Systems and methods for identifying biological structures associated with neuromuscular source signals
WO2020112986A1 (en) 2018-11-27 2020-06-04 Facebook Technologies, Inc. Methods and apparatus for autocalibration of a wearable electrode sensor system
US10268275B2 (en) 2016-08-03 2019-04-23 Ultrahaptics Ip Ltd Three-dimensional perceptions in haptic systems
CN106354257A (en) * 2016-08-30 2017-01-25 湖北睛彩视讯科技有限公司 Mobile scene fusion system and method based on augmented reality technology
CN106980362A (en) 2016-10-09 2017-07-25 阿里巴巴集团控股有限公司 Input method and device based on virtual reality scenario
US11119585B2 (en) 2016-10-13 2021-09-14 Ford Motor Company Dual-mode augmented reality interfaces for mobile devices
US10257558B2 (en) * 2016-10-26 2019-04-09 Orcam Technologies Ltd. Systems and methods for constructing and indexing a database of joint profiles for persons viewed by multiple wearable apparatuses
JP2018082363A (en) * 2016-11-18 2018-05-24 セイコーエプソン株式会社 Head-mounted display device and method for controlling the same, and computer program
WO2018100575A1 (en) 2016-11-29 2018-06-07 Real View Imaging Ltd. Tactile feedback in a display system
WO2018113740A1 (en) * 2016-12-21 2018-06-28 Zyetric Technologies Limited Combining virtual reality and augmented reality
US11507216B2 (en) * 2016-12-23 2022-11-22 Realwear, Inc. Customizing user interfaces of binary applications
US10620910B2 (en) 2016-12-23 2020-04-14 Realwear, Inc. Hands-free navigation of touch-based operating systems
US11099716B2 (en) 2016-12-23 2021-08-24 Realwear, Inc. Context based content navigation for wearable display
CN106682468A (en) * 2016-12-30 2017-05-17 Baidu Online Network Technology (Beijing) Co., Ltd. Method of unlocking electronic device and electronic device
USD864959S1 (en) 2017-01-04 2019-10-29 Mentor Acquisition One, Llc Computer glasses
US11572653B2 (en) * 2017-03-10 2023-02-07 Zyetric Augmented Reality Limited Interactive augmented reality
EP4250066A3 (en) 2017-03-21 2023-11-29 InterDigital VC Holdings, Inc. Method and system for the detection and augmentation of tactile interactions in augmented reality
US10489651B2 (en) * 2017-04-14 2019-11-26 Microsoft Technology Licensing, Llc Identifying a position of a marker in an environment
US10620779B2 (en) * 2017-04-24 2020-04-14 Microsoft Technology Licensing, Llc Navigating a holographic image
US10481755B1 (en) * 2017-04-28 2019-11-19 Meta View, Inc. Systems and methods to present virtual content in an interactive space
US11054894B2 (en) 2017-05-05 2021-07-06 Microsoft Technology Licensing, Llc Integrated mixed-input system
CN111033444B (en) 2017-05-10 2024-03-05 Humane, Inc. Wearable multimedia devices and cloud computing platform with application ecosystem
US12230029B2 (en) * 2017-05-10 2025-02-18 Humane, Inc. Wearable multimedia device and cloud computing platform with laser projection system
US11023109B2 (en) 2017-06-30 2021-06-01 Microsoft Technology Licensing, LLC Annotation using a multi-device mixed interactivity system
US10895966B2 (en) 2017-06-30 2021-01-19 Microsoft Technology Licensing, Llc Selection using a multi-device mixed interactivity system
CN107340871A (en) * 2017-07-25 2017-11-10 深识全球创新科技(北京)有限公司 Device integrating gesture recognition and ultrasonic haptic feedback, and method and use thereof
WO2019021447A1 (en) * 2017-07-28 2019-01-31 Optim Corporation Wearable terminal display system, wearable terminal display method and program
WO2019021446A1 (en) * 2017-07-28 2019-01-31 Optim Corporation Wearable terminal display system, wearable terminal display method and program
CN107635057A (en) * 2017-07-31 2018-01-26 Nubia Technology Co., Ltd. Virtual reality terminal control method, terminal and computer-readable storage medium
US10591730B2 (en) * 2017-08-25 2020-03-17 II Jonathan M. Rodriguez Wristwatch based interface for augmented reality eyewear
US10068403B1 (en) 2017-09-21 2018-09-04 Universal City Studios Llc Locker management techniques
US10506217B2 (en) * 2017-10-09 2019-12-10 Facebook Technologies, Llc Head-mounted display tracking system
US20190129607A1 (en) * 2017-11-02 2019-05-02 Samsung Electronics Co., Ltd. Method and device for performing remote control
JP2019086916A (en) * 2017-11-02 2019-06-06 Olympus Corporation Work support device, work support method, and work support program
US11531395B2 (en) 2017-11-26 2022-12-20 Ultrahaptics Ip Ltd Haptic effects from focused acoustic fields
WO2019123762A1 (en) * 2017-12-22 2019-06-27 Sony Corporation Information processing device, information processing method, and program
EP3729418B1 (en) 2017-12-22 2024-11-20 Ultrahaptics Ip Ltd Minimizing unwanted responses in haptic systems
WO2019122912A1 (en) 2017-12-22 2019-06-27 Ultrahaptics Limited Tracking in haptic systems
US10739861B2 (en) * 2018-01-10 2020-08-11 Facebook Technologies, Llc Long distance interaction with artificial reality objects using a near eye display interface
US11150730B1 (en) 2019-04-30 2021-10-19 Facebook Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
WO2019147956A1 (en) 2018-01-25 2019-08-01 Ctrl-Labs Corporation Visualization of reconstructed handstate information
US11907423B2 (en) * 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
US11961494B1 (en) 2019-03-29 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments
US10706628B2 (en) * 2018-02-28 2020-07-07 Lenovo (Singapore) Pte. Ltd. Content transfer
US20190324549A1 (en) * 2018-04-20 2019-10-24 Immersion Corporation Systems, devices, and methods for providing immersive reality interface modes
MX2020011492A (en) 2018-05-02 2021-03-25 Ultrahaptics Ip Ltd Blocking plate structure for improved acoustic transmission efficiency.
US20190339837A1 (en) * 2018-05-04 2019-11-07 Oculus Vr, Llc Copy and Paste in a Virtual Reality Environment
US10592001B2 (en) 2018-05-08 2020-03-17 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US10768426B2 (en) 2018-05-21 2020-09-08 Microsoft Technology Licensing, Llc Head mounted display system receiving three-dimensional push notification
EP3801216A1 (en) 2018-05-29 2021-04-14 Facebook Technologies, LLC. Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
CN112585600A (en) 2018-06-14 2021-03-30 脸谱科技有限责任公司 User identification and authentication using neuromuscular signatures
JP7056423B2 (en) * 2018-07-10 2022-04-19 Omron Corporation Input device
US11360558B2 (en) * 2018-07-17 2022-06-14 Apple Inc. Computer systems with finger devices
WO2020018892A1 (en) 2018-07-19 2020-01-23 Ctrl-Labs Corporation Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device
US10890653B2 (en) 2018-08-22 2021-01-12 Google Llc Radar-based gesture enhancement for voice interfaces
US10770035B2 (en) 2018-08-22 2020-09-08 Google Llc Smartphone-based radar system for facilitating awareness of user presence and orientation
US10698603B2 (en) * 2018-08-24 2020-06-30 Google Llc Smartphone-based radar system facilitating ease and accuracy of user interactions with displayed objects in an augmented-reality interface
US10909762B2 (en) 2018-08-24 2021-02-02 Microsoft Technology Licensing, Llc Gestures for facilitating interaction with pages in a mixed reality environment
EP4241661A1 (en) 2018-08-31 2023-09-13 Facebook Technologies, LLC Camera-guided interpretation of neuromuscular signals
CN109348003A (en) * 2018-09-17 2019-02-15 深圳市泰衡诺科技有限公司 Application control method and device
WO2020061451A1 (en) * 2018-09-20 2020-03-26 Ctrl-Labs Corporation Neuromuscular text entry, writing and drawing in augmented reality systems
CN110942518B (en) * 2018-09-24 2024-03-29 Apple Inc. Contextual Computer Generated Reality (CGR) digital assistant
US10921764B2 (en) 2018-09-26 2021-02-16 Facebook Technologies, Llc Neuromuscular control of physical objects in an environment
CN119454302A (en) * 2025-02-18 Meta Platforms Technologies, LLC Using neuromuscular signals to provide enhanced interaction with physical objects in augmented reality environments
KR102620702B1 (en) * 2018-10-12 2024-01-04 Samsung Electronics Co., Ltd. A mobile apparatus and a method for controlling the mobile apparatus
US10788880B2 (en) 2018-10-22 2020-09-29 Google Llc Smartphone-based radar system for determining user intention in a lower-power mode
US10929099B2 (en) * 2018-11-02 2021-02-23 Bose Corporation Spatialized virtual personal assistant
CN111273766B (en) 2018-12-04 2022-05-13 Apple Inc. Method, apparatus and system for generating an affordance linked to a simulated reality representation of an item
US10789952B2 (en) * 2018-12-20 2020-09-29 Microsoft Technology Licensing, Llc Voice command execution from auxiliary input
CN109782639A (en) * 2018-12-29 2019-05-21 深圳市中孚能电气设备有限公司 Control method and control device for an electronic device operating mode
US12373033B2 (en) 2019-01-04 2025-07-29 Ultrahaptics Ip Ltd Mid-air haptic textures
WO2020152828A1 (en) * 2019-01-24 2020-07-30 Maxell, Ltd. Display terminal, application control system and application control method
US10885322B2 (en) * 2019-01-31 2021-01-05 Huawei Technologies Co., Ltd. Hand-over-face input sensing for interaction with a device having a built-in camera
JP6720385B1 (en) * 2019-02-07 2020-07-08 Mercari, Inc. Program, information processing method, and information processing terminal
US10905383B2 (en) 2019-02-28 2021-02-02 Facebook Technologies, Llc Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces
US11842517B2 (en) 2019-04-12 2023-12-12 Ultrahaptics Ip Ltd Using iterative 3D-model fitting for domain adaptation of a hand-pose-estimation neural network
CN110109547A (en) * 2019-05-05 2019-08-09 芋头科技(杭州)有限公司 Command activation method and system based on gesture recognition
US11302081B2 (en) * 2019-05-21 2022-04-12 Magic Leap, Inc. Caching and updating of dense 3D reconstruction data
JP7331462B2 (en) * 2019-05-24 2023-08-23 Kyocera Document Solutions Inc. Robot system, robot control method and electronic device
US10747371B1 (en) * 2019-06-28 2020-08-18 Konica Minolta Business Solutions U.S.A., Inc. Detection of finger press from live video stream
USD1009884S1 (en) * 2019-07-26 2024-01-02 Sony Corporation Mixed reality eyeglasses or portion thereof with an animated graphical user interface
JP2021026260A (en) 2019-07-31 2021-02-22 Seiko Epson Corporation Display unit, display method, and computer program
US10909767B1 (en) * 2019-08-01 2021-02-02 International Business Machines Corporation Focal and interaction driven content replacement into augmented reality
US12229341B2 (en) 2019-09-23 2025-02-18 Apple Inc. Finger-mounted input devices
US11275453B1 (en) 2019-09-30 2022-03-15 Snap Inc. Smart ring for manipulating virtual objects displayed by a wearable device
US11374586B2 (en) 2019-10-13 2022-06-28 Ultraleap Limited Reducing harmonic distortion by dithering
US20210116249A1 (en) * 2019-10-16 2021-04-22 The Board Of Trustees Of The California State University Augmented reality marine navigation
US11288871B2 (en) * 2019-11-08 2022-03-29 Fujifilm Business Innovation Corp. Web-based remote assistance system with context and content-aware 3D hand gesture visualization
US12089953B1 (en) 2019-12-04 2024-09-17 Meta Platforms Technologies, Llc Systems and methods for utilizing intrinsic current noise to measure interface impedances
CN113012214A (en) * 2019-12-20 2021-06-22 北京外号信息技术有限公司 Method and electronic device for setting spatial position of virtual object
US11715453B2 (en) 2019-12-25 2023-08-01 Ultraleap Limited Acoustic transducer structures
CN115244263A (en) * 2020-02-28 2022-10-25 NEC Corporation Locker system, locker management method, and storage medium
US11277597B1 (en) 2020-03-31 2022-03-15 Snap Inc. Marker-based guided AR experience
US11798429B1 (en) 2020-05-04 2023-10-24 Snap Inc. Virtual tutorials for musical instruments with finger tracking in augmented reality
US11520399B2 (en) 2020-05-26 2022-12-06 Snap Inc. Interactive augmented reality experiences using positional tracking
US11816267B2 (en) 2020-06-23 2023-11-14 Ultraleap Limited Features of airborne ultrasonic fields
JP2022022568A (en) * 2020-06-26 2022-02-07 Oki Electric Industry Co., Ltd. Display operation unit and device
JP7515590B2 (en) * 2020-07-08 2024-07-12 Maxell, Ltd. Information processing terminal, remote control method and program
WO2022058738A1 (en) 2020-09-17 2022-03-24 Ultraleap Limited Ultrahapticons
US11925863B2 (en) 2020-09-18 2024-03-12 Snap Inc. Tracking hand gestures for interactive game control in augmented reality
US11546505B2 (en) * 2020-09-28 2023-01-03 Snap Inc. Touchless photo capture in response to detected hand gestures
US11644902B2 (en) * 2020-11-30 2023-05-09 Google Llc Gesture-based content transfer
WO2022146678A1 (en) 2020-12-29 2022-07-07 Snap Inc. Micro hand gestures for controlling virtual and graphical elements
WO2022146673A1 (en) 2020-12-30 2022-07-07 Snap Inc. Augmented reality precision tracking and display
US11740313B2 (en) 2020-12-30 2023-08-29 Snap Inc. Augmented reality precision tracking and display
US11531402B1 (en) 2021-02-25 2022-12-20 Snap Inc. Bimanual gestures for controlling virtual and graphical elements
CN113190110A (en) * 2021-03-30 2021-07-30 Qingdao Pico Technology Co., Ltd. Interface element control method and device of head-mounted display equipment
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
WO2022216784A1 (en) 2021-04-08 2022-10-13 Snap Inc. Bimanual interactions between mapped hand regions for controlling virtual and graphical elements
EP4280599A4 (en) * 2021-04-09 2024-07-17 Samsung Electronics Co., Ltd. PORTABLE ELECTRONIC DEVICE WITH MULTIPLE CAMERAS
WO2022225761A1 (en) 2021-04-19 2022-10-27 Snap Inc. Hand gestures for animating and controlling virtual and graphical elements
CN113141529B (en) * 2021-04-25 2022-02-25 Juhaokan Technology Co., Ltd. Display device and media resource playback method
US11435857B1 (en) * 2021-04-29 2022-09-06 Google Llc Content access and navigation using a head-mounted device
US11995899B2 (en) * 2021-04-29 2024-05-28 Google Llc Pointer-based content recognition using a head-mounted device
US12517585B2 (en) 2021-07-15 2026-01-06 Ultraleap Limited Control point manipulation techniques in haptic systems
WO2023283934A1 (en) * 2021-07-16 2023-01-19 Huawei Technologies Co.,Ltd. Devices and methods for gesture-based selection
KR102629771B1 (en) * 2021-09-30 2024-01-29 박두고 Wearable device for recognition object using hand or finger tracking
US11967147B2 (en) * 2021-10-01 2024-04-23 AT&T Intellectual Property I, L.P. Augmented reality visualization of enclosed spaces
CN114089879B (en) * 2021-11-15 2022-08-05 北京灵犀微光科技有限公司 Cursor control method of augmented reality display equipment
US12405661B2 (en) * 2022-01-10 2025-09-02 Apple Inc. Devices and methods for controlling electronic devices or systems with physical objects
US12265663B2 (en) * 2022-04-04 2025-04-01 Snap Inc. Gesture-based application invocation
US12282607B2 (en) 2022-04-27 2025-04-22 Snap Inc. Fingerspelling text entry
CN115309271B (en) * 2022-09-29 2023-03-21 Southern University of Science and Technology Information display method, device and equipment based on mixed reality and storage medium
KR102703511B1 (en) * 2022-12-29 2024-09-06 Industry-Academic Cooperation Foundation of Seoul National University of Science and Technology System for generating virtual space using level of detail of object
US20250321630A1 (en) * 2024-04-10 2025-10-16 Meta Platforms Technologies, Llc Single-Handed Mode for an Artificial Reality System

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6545663B1 (en) * 1999-04-19 2003-04-08 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method and input device for controlling the position of an object to be graphically displayed in virtual reality
CN101262830A (en) * 2005-07-20 2008-09-10 Bracco Imaging S.p.A. Method and system for mapping a virtual model of an object to an object
CN101300621A (en) * 2005-09-13 2008-11-05 Spacetime3D, Inc. System and method for providing a three-dimensional graphical user interface
CN101375599A (en) * 2004-06-01 2009-02-25 L-3 Communications Corporation Method and system for performing video flashlight
CN101542584A (en) * 2006-10-16 2009-09-23 Sony Corporation Display device, display method

Family Cites Families (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0981309A (en) * 1995-09-13 1997-03-28 Toshiba Corp Input device
JP3365246B2 (en) 1997-03-14 2003-01-08 Minolta Co., Ltd. Electronic still camera
JP3225882B2 (en) * 1997-03-27 2001-11-05 Nippon Telegraph and Telephone Corporation Landscape labeling system
AU7651100A (en) 1999-09-15 2001-04-17 Roche Consumer Health Ag Pharmaceutical and/or cosmetical compositions
US6771294B1 (en) * 1999-12-29 2004-08-03 Petri Pulli User interface
SE0000850D0 (en) * 2000-03-13 2000-03-13 Pink Solution Ab Recognition arrangement
CA2410427A1 (en) * 2000-05-29 2001-12-06 Vkb Inc. Virtual data entry device and method for input of alphanumeric and other data
JP2002157606A (en) * 2000-11-17 2002-05-31 Canon Inc Image display control device, mixed reality presentation system, image display control method, and medium providing processing program
US7215322B2 (en) 2001-05-31 2007-05-08 Siemens Corporate Research, Inc. Input devices for augmented reality applications
US7126558B1 (en) 2001-10-19 2006-10-24 Accenture Global Services Gmbh Industrial augmented reality
AU2003217587A1 (en) * 2002-02-15 2003-09-09 Canesta, Inc. Gesture recognition system using depth perceptive sensors
US6943754B2 (en) * 2002-09-27 2005-09-13 The Boeing Company Gaze tracking system, eye-tracking assembly and an associated method of calibration
US7676079B2 (en) * 2003-09-30 2010-03-09 Canon Kabushiki Kaisha Index identification method and apparatus
IL161002A0 (en) 2004-03-22 2004-08-31 Itay Katz Virtual video keyboard system
CN1304931C (en) * 2005-01-27 2007-03-14 Beijing Institute of Technology Head-mounted stereo vision hand gesture recognition device
US20060200662A1 (en) * 2005-02-01 2006-09-07 Microsoft Corporation Referencing objects in a virtual environment
KR100687737B1 (en) * 2005-03-19 2007-02-27 Electronics and Telecommunications Research Institute Virtual Mouse Device and Method Based on Two-Hand Gesture
JP4851771B2 (en) * 2005-10-24 2012-01-11 Kyocera Corporation Information processing system and portable information terminal
US7509588B2 (en) * 2005-12-30 2009-03-24 Apple Inc. Portable electronic device with interface reconfiguration mode
US7725547B2 (en) * 2006-09-06 2010-05-25 International Business Machines Corporation Informing a user of gestures made by others out of the user's line of sight
JP4961914B2 (en) * 2006-09-08 2012-06-27 Sony Corporation Imaging display device and imaging display method
WO2008153599A1 (en) * 2006-12-07 2008-12-18 Adapx, Inc. Systems and methods for data annotation, recordation, and communication
KR101285360B1 (en) * 2007-01-25 2013-07-11 Samsung Electronics Co., Ltd. Point of interest displaying apparatus and method for using augmented reality
US8942764B2 (en) * 2007-10-01 2015-01-27 Apple Inc. Personal media device controlled via user initiated movements utilizing movement based interfaces
JP4933406B2 (en) * 2007-11-15 2012-05-16 Canon Inc. Image processing apparatus and image processing method
US8165345B2 (en) * 2007-12-07 2012-04-24 Tom Chau Method, system, and computer program for detecting and characterizing motion
EP2258587A4 (en) * 2008-03-19 2013-08-07 Denso Corp Operation input device for vehicle
JP5250834B2 (en) * 2008-04-03 2013-07-31 Konica Minolta, Inc. Head-mounted image display device
WO2009128064A2 (en) * 2008-04-14 2009-10-22 Pointgrab Ltd. Vision based pointing device emulation
US8971565B2 (en) * 2008-05-29 2015-03-03 Hie-D Technologies, Llc Human interface electronic device
WO2010042880A2 (en) * 2008-10-10 2010-04-15 Neoflect, Inc. Mobile computing device with a virtual keyboard
US8397181B2 (en) 2008-11-17 2013-03-12 Honeywell International Inc. Method and apparatus for marking a position of a real world object in a see-through display
CN101739122A (en) * 2008-11-24 2010-06-16 玴荣科技股份有限公司 Gesture Recognition and Tracking Method
US9041660B2 (en) * 2008-12-09 2015-05-26 Microsoft Technology Licensing, Llc Soft keyboard control
US9405970B2 (en) 2009-02-02 2016-08-02 Eyesight Mobile Technologies Ltd. System and method for object recognition and tracking in a video stream
CN102326133B (en) * 2009-02-20 2015-08-26 Koninklijke Philips Electronics N.V. System, method and apparatus for enabling a device to enter an active mode
JP5304329B2 (en) * 2009-03-05 2013-10-02 Brother Industries, Ltd. Head mounted display device, image control method, and image control program
US8009022B2 (en) 2009-05-29 2011-08-30 Microsoft Corporation Systems and methods for immersive interaction with virtual objects
WO2010144050A1 (en) * 2009-06-08 2010-12-16 Agency For Science, Technology And Research Method and system for gesture based manipulation of a 3-dimensional image of object
KR101622196B1 (en) * 2009-09-07 2016-05-18 Samsung Electronics Co., Ltd. Apparatus and method for providing POI information in portable terminal
US20110107216A1 (en) * 2009-11-03 2011-05-05 Qualcomm Incorporated Gesture-based user interface
JP4679661B1 (en) * 2009-12-15 2011-04-27 Toshiba Corporation Information presenting apparatus, information presenting method, and program
KR20110075250A (en) 2009-12-28 2011-07-06 LG Electronics Inc. Object tracking method and device using object tracking mode
CN102117117A (en) * 2010-01-06 2011-07-06 Primax Electronics Ltd. System and method for controlling user gestures by using image capture device
US8490002B2 (en) * 2010-02-11 2013-07-16 Apple Inc. Projected display shared workspaces
US20110214082A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Projection triggering through an external marker in an augmented reality eyepiece
KR20130000401A (en) * 2010-02-28 2013-01-02 오스터하우트 그룹 인코포레이티드 Local advertising content on an interactive head-mounted eyepiece
US9128281B2 (en) * 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US20110213664A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US8788197B2 (en) 2010-04-30 2014-07-22 Ryan Fink Visual training devices, systems, and methods
US8593375B2 (en) 2010-07-23 2013-11-26 Gregory A Maltz Eye gaze user interface and method
JP5499985B2 (en) * 2010-08-09 2014-05-21 Sony Corporation Display assembly
US20120062602A1 (en) * 2010-09-13 2012-03-15 Nokia Corporation Method and apparatus for rendering a content display
US8941559B2 (en) 2010-09-21 2015-01-27 Microsoft Corporation Opacity filter for display device
US8768006B2 (en) * 2010-10-19 2014-07-01 Hewlett-Packard Development Company, L.P. Hand gesture recognition
CN102147926A (en) * 2011-01-17 2011-08-10 ZTE Corporation Three-dimensional (3D) icon processing method and device and mobile terminal
US9336240B2 (en) * 2011-07-15 2016-05-10 Apple Inc. Geo-tagging digital images
US20130050069A1 (en) * 2011-08-23 2013-02-28 Sony Corporation, A Japanese Corporation Method and system for use in providing three dimensional user interface
CN115167675A (en) * 2011-09-19 2022-10-11 Eyesight Mobile Technologies Ltd. Augmented reality device
WO2013136333A1 (en) 2012-03-13 2013-09-19 Eyesight Mobile Technologies Ltd. Touch free user interface


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on Eye-gaze Tracking Network Generated by Augmented Reality Application; Ming Chen et al.; IEEE; 2009-02-02; full text *
Application and Research of Augmented Reality Technology; Tu Ziyan et al.; Computer Engineering and Applications; 2003-04-21; full text *

Also Published As

Publication number Publication date
US10401967B2 (en) 2019-09-03
US20160291699A1 (en) 2016-10-06
US20140361988A1 (en) 2014-12-11
JP2014531662A (en) 2014-11-27
US20160306433A1 (en) 2016-10-20
US11093045B2 (en) 2021-08-17
KR20190133080A (en) 2019-11-29
JP2018028922A (en) 2018-02-22
WO2013093906A1 (en) 2013-06-27
KR20220032059A (en) 2022-03-15
US11494000B2 (en) 2022-11-08
US20170052599A1 (en) 2017-02-23
CN115167675A (en) 2022-10-11
CN103858073A (en) 2014-06-11
JP2021007022A (en) 2021-01-21
US20200097093A1 (en) 2020-03-26
US20220107687A1 (en) 2022-04-07
KR20140069124A (en) 2014-06-09
US20160259423A1 (en) 2016-09-08
US20160320855A1 (en) 2016-11-03
JP7297216B2 (en) 2023-06-26

Similar Documents

Publication Publication Date Title
US11494000B2 (en) Touch free interface for augmented reality systems
US11941764B2 (en) Systems, methods, and graphical user interfaces for adding effects in augmented reality environments
AU2022228121B2 (en) User interfaces for simulated depth effects
US12462498B2 (en) Systems, methods, and graphical user interfaces for adding effects in augmented reality environments
US10168792B2 (en) Method and wearable device for providing a virtual input interface
CN109739361B (en) Visibility improvement method based on eye tracking and electronic device
EP2956843B1 (en) Human-body-gesture-based region and volume selection for hmd
CN104662492B (en) For information processor, display control method and the program of the rolling for changing the content rolled automatically
EP4068115A1 (en) Mobile terminal and method for controlling the same
US20250110574A1 (en) User interfaces integrating hardware buttons
US9898183B1 (en) Motions for object rendering and selection
CN111240483A (en) Operation control method, head mounted device and medium
US20250217002A1 (en) Three-dimensional user interfaces
US20240291944A1 (en) Video application graphical effects
US20260019699A1 (en) Camera user interface

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220729