CN103858073A - Touch-free interface for augmented reality systems - Google Patents

Touch-free interface for augmented reality systems

Info

Publication number
CN103858073A
CN103858073A (application CN201280048836.8A; granted as CN103858073B)
Authority
CN
China
Prior art keywords
gesture
icon
external unit
moving
methods
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201280048836.8A
Other languages
Chinese (zh)
Other versions
CN103858073B (en)
Inventor
I. Katz
A. Shenfeld
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Eyesight Mobile Technologies Ltd
Original Assignee
Eyesight Mobile Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Eyesight Mobile Technologies Ltd filed Critical Eyesight Mobile Technologies Ltd
Priority to CN202210808606.2A priority Critical patent/CN115167675A/en
Publication of CN103858073A publication Critical patent/CN103858073A/en
Application granted granted Critical
Publication of CN103858073B publication Critical patent/CN103858073B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/165Management of the audio stream, e.g. setting of volume, audio stream path
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Optics & Photonics (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
  • Position Input By Displaying (AREA)

Abstract

A method and system for augmented reality. Images of the real-world scene are obtained from one or more image sensors, and the orientation and/or position of the image sensor is obtained from one or more state sensors. A real-world object on which a predetermined pointing object performs a predetermined gesture is identified in the image of the real-world scene, and data associated with the identified object is displayed on a viewing device. The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.

Description

Touch-free interface for augmented reality systems

Technical Field

The present invention relates to methods and systems for augmented reality.

Related Art

References relevant to the background of the presently disclosed subject matter are listed below:

US Patent No. 7,126,558;

US Published Patent Application 20110221669;

US Published Patent Application 20110270522;

GB 2465280 (A);

US Published Patent Application 20120068913;

US Patent No. 7,215,322;

WO 2005/091125;

WO 2010/086866;

Crowley, J.L., et al., "Finger Tracking as an Input Device for Augmented Reality," Proceedings of the International Workshop on Face and Gesture Recognition, Zurich, Switzerland, June 1995.

Acknowledgement of the above references should not be inferred as meaning that these are in any way relevant to the patentability of the presently disclosed subject matter.

Background

Augmented reality is a term for a real-time direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated information such as text, sound, video, graphics, or GPS data. Artificial information about the environment and its objects is thus overlaid on the real-world view or image. Augmentation is typically performed in real time and in the semantic context of environmental elements, making information about the user's surrounding real world interactive and digitally manipulable.

The main hardware components for augmented reality are a processor, a display, sensors, and input devices. These elements, in particular a CPU, a display, a camera, and MEMS sensors such as an accelerometer, GPS, or a solid-state compass, are present in portable devices such as smartphones, allowing them to serve as augmented reality platforms.

Augmented reality systems are already widely used in entertainment, navigation, assembly processes, maintenance, and medical procedures. Portable augmented reality systems are also widely used in tourism, where augmented reality is used to present information about the real-world objects and sites being viewed.

A head-mounted display, usually in the form of goggles or a helmet, is used to provide an immersive augmented reality experience. With a head-mounted display, virtual visual objects are superimposed on the user's view of the real-world scene. The head-mounted display is tracked with sensors that allow the system to align the virtual information with the physical world. Tracking may be performed, for example, using any one or more of technologies such as digital cameras or other optical sensors, accelerometers, GPS, gyroscopes, solid-state compasses, RFID, and wireless sensors. Head-mounted displays are either optical see-through or video see-through. Optical see-through employs solutions such as half-silvered mirrors, which pass the image through the lens and overlay the information to be reflected into the user's eyes, and transparent LCD projectors, which display the digital information and images directly or indirectly onto the user's retina.

Summary of the Invention

The present invention provides an interactive system for augmented reality. The interactive system of the invention includes a wearable data display device that may, for example, be incorporated into a pair of glasses or goggles. The wearable display has a device providing location extraction capability (such as GPS) and a compass. The system also includes a user interface that allows the user to select computer-generated data to augment the real-world scene the user is viewing. A camera obtains images of the real-world scene being viewed. A processor detects a predetermined object, such as the user's finger, in the images of the real-world scene captured by the camera. When the user points at an element in the scene, data related to the element is displayed on the data display device and superimposed on the user's view of the scene.

Thus, in one aspect, the present invention provides a method for augmented reality, comprising:

(a) obtaining images of a real-world scene from one or more image sensors;

(b) obtaining one or both of orientation and position data of the image sensor from one or more state sensors;

(c) identifying, in the images of the real-world scene obtained by the one or more image sensors, a real-world object on which a predetermined pointing object performs a predetermined gesture, the gesture-detection module utilizing data provided by the one or more state sensors; and

(d) presenting data associated with the identified object on a display of a viewing device.
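Steps (a) through (d) form a single sense-detect-identify-display loop. A minimal sketch of one pass of that loop follows; all of the names (`Frame`, `augmented_reality_step`, and the callables passed in) are hypothetical placeholders for the patent's image sensor, gesture-detection module, object-identification step, and viewing device, not an actual implementation from the patent.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """One sensor reading: image data plus the state-sensor outputs."""
    pixels: object        # step (a): image from the image sensor
    orientation: tuple    # step (b): orientation data from the state sensors
    position: tuple       # step (b): position data from the state sensors

def augmented_reality_step(frame, detect_gesture, identify_object,
                           lookup_data, display):
    """One pass of steps (a)-(d): if a predetermined pointing gesture is
    detected in the frame, identify the real-world object it targets,
    fetch the data associated with that object, and display it."""
    gesture = detect_gesture(frame)        # step (c): predetermined gesture?
    if gesture is None:
        return None                        # nothing to augment this frame
    obj = identify_object(frame, gesture)  # step (c): object under the gesture
    data = lookup_data(obj)                # data associated with the object
    display(data)                          # step (d): present on viewing device
    return data
```

With stub callables in place of the real modules, a detected gesture yields the displayed data, and a frame with no gesture yields nothing.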

The image sensor may be selected from, for example: a camera, a light sensor, an IR sensor, an ultrasonic sensor, a proximity sensor, a CMOS image sensor, a short-wave infrared (SWIR) image sensor, and a reflectivity sensor. One or more of the state sensors may be selected from: an optical sensor, an accelerometer, GPS, a gyroscope, a compass, a magnetic sensor, a sensor indicating the orientation of the device relative to the earth's magnetic field, a gravity sensor, and an RFID detector.

The data associated with the identified object may be obtained by searching a memory for data associated with the real-world object.

The predetermined object may be, for example, a hand, a part of a hand, two hands, parts of two hands, a finger, a part of a finger, or a fingertip.

The viewing device may be configured to be worn by a user, for example as eyeglasses or goggles. The viewing device may be incorporated into a mobile communication device.

The step of identifying in the images of the real-world scene obtained by the one or more image sensors may include: determining the location (X, Y) of the predetermined object in the images obtained by the image sensors; and determining one or both of the location and orientation of the display device as provided by the sensors.
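The combination of an in-image position (X, Y) and the device orientation from the state sensors is what lets the pointing target be resolved in world coordinates. A hedged sketch of one way this could work, assuming a simple pinhole camera centred on the display (the function name and parameters are illustrative, not taken from the patent):

```python
def pointing_direction(x, y, width, height, fov_h_deg, yaw_deg, pitch_deg):
    """Convert the pointing object's pixel position (x, y) plus the device
    yaw/pitch from the state sensors into a world-space bearing
    (azimuth, elevation), under a pinhole-camera assumption."""
    fov_v_deg = fov_h_deg * height / width    # assume square pixels
    az_offset = (x / width - 0.5) * fov_h_deg # angular offset from image centre
    el_offset = (0.5 - y / height) * fov_v_deg  # image y grows downward
    azimuth = (yaw_deg + az_offset) % 360.0
    elevation = pitch_deg + el_offset
    return azimuth, elevation
```

A fingertip at the image centre simply inherits the device's own heading; offsets from the centre shift the bearing proportionally to the field of view.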

The method of the invention may further comprise communicating with an external device or website. The communication may include sending a message to an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or one or more services running on the external device. The method may further include sending a message to an application running on the mobile communication device, a service running on the mobile communication device, an operating system running on the mobile communication device, a process running on the mobile communication device, one or more applications running on a processor of the mobile communication device, a software program running in the background of the mobile communication device, or one or more services running on the mobile communication device.

The method may further include sending, from an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, or a software program running in the background of the external device, a message requesting data related to a real-world object identified in an image, or sending the message to one or more services running on the external device. The method may further include sending, from an application running on the mobile communication device, a service running on the mobile communication device, an operating system running on the mobile communication device, a process running on the mobile communication device, one or more applications running on a processor of the mobile communication device, or a software program running in the background of the mobile communication device, a message requesting data related to a real-world object identified in an image, or sending the message to one or more services running on the mobile communication device.

The message to the external device or website may be a command. The command may be selected from: a command to run an application on the external device or website, a command to stop an application running on the external device or website, a command to activate a service running on the external device or website, a command to stop a service running on the external device or website, or a command to send data related to a real-world object identified in an image.

The message to the mobile communication device may be a command. The command may be selected from: a command to run an application on the mobile communication device, a command to stop an application running on the mobile communication device or website, a command to activate a service running on the mobile communication device, a command to stop a service running on the mobile communication device, or a command to send data related to a real-world object identified in an image.

The method may further include: receiving, from the external device or website, data related to a real-world object identified in an image; and presenting the received data to a user.

Communication with the external device or website may be over a communication network.

The command to the external device may be selected from: pressing a virtual key displayed on a display device of the external device; rotating a selection carousel; switching between desktops; running a predetermined software application on the external device; turning off an application on the external device; turning speakers on or off; turning the volume up or down; locking the external device; unlocking the external device; skipping to another track in a media player or switching between IPTV channels; controlling a navigation application; initiating a call; ending a call; presenting a notification; displaying a notification; navigating in a photo or music album gallery; scrolling web pages; presenting an email; presenting one or more documents or maps; controlling actions in a game; pointing at a map; zooming in or out on a map or image; painting on an image; grasping an activation icon and pulling the activation icon out of the display device; rotating an activation icon; emulating touch commands on the external device; performing one or more multi-touch commands, touch-gesture commands, or typing; clicking on a displayed video to pause or play it; tagging a frame or capturing a frame from a video; presenting an incoming message; answering an incoming call, muting or rejecting an incoming call, or opening an incoming-call reminder; presenting a notification received from a network community service; presenting a notification generated by the external device; opening a predetermined application; changing the external device from a locked mode and opening a recent-calls application; changing the external device from a locked mode and opening an online service application or browser; changing the external device from a locked mode and opening an email application; changing the device from a locked mode and opening a calendar application; changing the device from a locked mode and opening a reminder application; changing the device from a locked mode and opening a predetermined application set by the user, by the manufacturer of the external device, or by a service operator; activating an activation icon; selecting a menu item; moving a pointer on a display; manipulating a touch-free mouse on a display; activating an icon; and altering information on a display.
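However the command set above is realized, the external device needs some mapping from a recognized gesture command to an action. A minimal dispatch sketch follows; the handler names and return strings are purely illustrative placeholders, not commands defined by the patent.

```python
def dispatch_command(command, handlers):
    """Route a recognized gesture command (e.g. 'volume_up', 'lock_device')
    to the matching handler on the external device; unknown commands
    are silently ignored."""
    handler = handlers.get(command)
    return handler() if handler else None

# hypothetical handlers for a few of the commands listed above
handlers = {
    "volume_up": lambda: "volume raised",
    "lock_device": lambda: "device locked",
    "answer_call": lambda: "call answered",
}
```

A table-driven dispatcher like this keeps the (long) command vocabulary extensible: adding a new command is one new dictionary entry rather than a change to the routing logic.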

In the method of the present invention, the predetermined gesture may be selected from: a swiping gesture; a pinching motion of two fingers; pointing; a left-to-right gesture; a right-to-left gesture; an upward gesture; a downward gesture; a pushing gesture; opening a clenched fist; opening a clenched fist and moving towards the image sensor; a tapping gesture; a waving gesture; a clapping gesture; a reverse clapping gesture; closing a hand into a fist; a pinching gesture; a reverse pinching gesture; a gesture of splaying the fingers; a reverse gesture of splaying the fingers; pointing at an activation icon; holding an activating object over an activation icon for a predetermined amount of time; clicking on an activation icon; double-clicking on an activation icon; clicking on an activation icon from the right side; clicking on an activation icon from the left side; clicking on an activation icon from the bottom; clicking on an activation icon from the top; grasping an activation icon, i.e. the object; gesturing towards an activation icon, i.e. the object, from the right; gesturing towards an activation icon from the left; passing through an activation icon from the left; pushing the object; clapping; waving over an activation icon; performing a blast gesture; performing a tapping gesture; performing a clockwise or counter-clockwise gesture over an activation icon; sliding an icon; grasping an activation icon with two fingers; and performing a click-drag-release motion.

The data associated with the identified object may be any one or more of visual data, audio data, or textual data. The data associated with the identified object may be an activation icon. The activation icon may be a 2D or 3D activation icon. The activation icon may be perceived by the user in 3D space in front of the user.

The method of the present invention may have two or more modes of operation. The method may change the mode of operation of the system upon recognition of a predetermined gesture. A mode of operation may be specified by any one or more of: the gestures to be recognized; the algorithms that are active on the gesture detection module; the resolution of images captured by the image sensor and the capture rate of images captured by the image sensor; the level of detail of the data to be presented; the activation icons to be presented to the user; the source of the data to be presented; the level of detail of the data to be presented; the activation icons to be displayed on the display device; and the active online services.

The mode of operation may be a mode selected from: a mode in which the image sensor records video images after recognition of a predetermined gesture; a mode in which a microphone records sound after recognition of a predetermined gesture and stops recording after recognition of another predetermined gesture; a mode in which video or sound is continuously monitored and, after a predetermined gesture is detected, the video or sound is recorded starting from a predetermined amount of time before the gesture was recognized, the recording being stopped after recognition of another predetermined gesture; a mode in which tags are added to captured, real-time-recorded video after recognition of a predetermined gesture; a mode in which a region is selected in the field of view captured by the camera, copied to another location in the field of view, and resized; a mode in which a tracker is applied to a selected region of the image and the selected region is presented in real time, resized and repositioned, on the display device; and a mode in which an image is captured after recognition of a predetermined gesture.
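By way of a non-limiting illustration only, the continuous-monitoring mode described above, in which the recording starts a predetermined amount of time before the start gesture was recognized, can be sketched with a rolling pre-buffer. The class name, frame rate, and buffer length below are assumptions of the example, not part of the disclosure.

```python
from collections import deque

PRE_SECONDS = 2   # assumed pre-gesture window
FPS = 10          # assumed frame rate

class PreBufferRecorder:
    """Keep the last PRE_SECONDS of frames so a recording can begin
    before the start gesture was actually recognized."""

    def __init__(self):
        self.buffer = deque(maxlen=PRE_SECONDS * FPS)
        self.recording = None  # None until the start gesture

    def on_frame(self, frame):
        if self.recording is not None:
            self.recording.append(frame)
        else:
            self.buffer.append(frame)  # rolling pre-buffer

    def on_start_gesture(self):
        # Seed the recording with the buffered pre-gesture frames.
        self.recording = list(self.buffer)

    def on_stop_gesture(self):
        clip, self.recording = self.recording, None
        return clip

rec = PreBufferRecorder()
for i in range(30):          # 3 s of monitoring before the gesture
    rec.on_frame(i)
rec.on_start_gesture()
for i in range(30, 35):      # frames after the start gesture
    rec.on_frame(i)
clip = rec.on_stop_gesture()
assert clip[0] == 10         # clip includes the 2 s before the gesture
assert clip[-1] == 34
```

The same structure applies to sound: only the buffer contents differ.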

The method of the present invention may further comprise running a tracking algorithm that tracks the identified real-world object and maintains the displayed associated visual data in a fixed position relative to the identified real-world object.
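By way of a non-limiting illustration only, anchoring the displayed visual data to the tracked object can be sketched as follows. The centroid "tracker" and the function names are assumptions of the example; the disclosure does not specify a particular tracking algorithm.

```python
def track_centroid(bbox):
    """Return the center (x, y) of a tracked bounding box (x, y, w, h)."""
    x, y, w, h = bbox
    return (x + w / 2.0, y + h / 2.0)

def overlay_position(bbox, anchor_offset):
    """Place the overlay at a fixed offset from the tracked object,
    so the visual data stays attached to it as the view moves."""
    cx, cy = track_centroid(bbox)
    dx, dy = anchor_offset
    return (cx + dx, cy + dy)

# Simulate the headset drifting: the object's bounding box moves in the
# image, but the overlay keeps the same position *relative* to it.
frames = [(100, 80, 40, 20), (112, 74, 40, 20), (95, 90, 40, 20)]
offset = (0.0, -30.0)   # draw the label 30 px above the object center
for bbox in frames:
    ox, oy = overlay_position(bbox, offset)
    cx, cy = track_centroid(bbox)
    assert (ox - cx, oy - cy) == offset  # fixed relative position
```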

The object recognition module may be employed to detect the predetermined object only when the display device has a motion level below a predetermined threshold.
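By way of a non-limiting illustration only, this motion gating can be sketched as follows: the (comparatively expensive) detector runs only while the device's motion level, as reported by the state sensors, is below the threshold. The threshold value and function names are assumptions of the example.

```python
MOTION_THRESHOLD = 0.5  # assumed units from the state sensors

def should_run_detector(motion_level, threshold=MOTION_THRESHOLD):
    return motion_level < threshold

def process_frame(frame, motion_level, detect):
    """Invoke the detector only when the device is sufficiently still."""
    if should_run_detector(motion_level):
        return detect(frame)
    return None  # skip detection; saves CPU while the device moves

# A stand-in detector for demonstration.
calls = []
def fake_detect(frame):
    calls.append(frame)
    return "finger"

assert process_frame("f1", 0.2, fake_detect) == "finger"  # still: detect
assert process_frame("f2", 0.9, fake_detect) is None      # moving: skip
assert calls == ["f1"]
```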

The method may further comprise providing feedback when the predetermined gesture has been recognized. The feedback may be, for example, visual feedback, audio feedback, haptic feedback, directional vibration, air tactile feedback, or ultrasonic feedback. The feedback may be a visual indication in a form selected from: an activation icon displayed on the display device; a change in an activation icon displayed on the display device; a change in the color of an activation icon displayed on the display device; a change in the size of an activation icon displayed on the display device; an animation of an activation icon displayed on the display device; an indication light; an indicator moving on the display device; an indicator moving on the display device that appears on top of all other images or video appearing on the display device; and the appearance of a glow around the predetermined object. The feedback may be a vibration, a directional vibration indication, or an air tactile indication.

In the method of the present invention, part of an activation icon displayed on the display device may not be presented where the predetermined object is located, so that the predetermined object appears to be on top of the activation icon.

An activation icon may be removed from the display device when the display device has an activity level above a predetermined threshold. For example, the removed icon may be presented again on the display device when the display device has a motion level below the predetermined threshold.

The method may be brought into the active mode when a predetermined action is performed. The predetermined action may be selected from: bringing the predetermined object into the field of view from below when the user places the predetermined object in a certain position or pose, for example, pointing at the lower right corner of the camera's field of view or opening a hand in the camera's field of view; performing a predetermined gesture, such as moving a hand from right to left across the field of view, when an activation icon is displayed and the user performs a predetermined gesture associated with the activation icon, such as pointing at the activation icon; performing a waving gesture at the location where the activation icon is presented; sliding the floating activation icon from one location to another by performing a gesture in 3D space at the location where the activation icon is perceived to be, by touching the device, or by tapping on the device if the device is provided with an accelerometer. As another example, if the device is provided with a proximity sensor or an ultrasonic sensor, the system may enter the active mode when the user's hand is near the device. The system may also be activated by a voice command, or when the user places the predetermined object at a particular location in the field of view. As another example, the system may enter the active mode only when there is relevant data associated with the real world in the user's field of view. At that point, the system may indicate to the user when there is relevant data to be presented, or when it is ready for interaction.

The method of the present invention may further comprise attaching a visual indication to a real-world object to indicate the existence of stored data relevant to the real-world object. The visual indication may be overlaid on an image of the real-world object. The visual indication may be selected from an activation icon, a photograph, and an image of an envelope.

The method of the present invention may further comprise a calibration process that records one or more physical parameters of the predetermined object. The calibration process may include any one or more steps selected from: presenting activation icons at different locations in 3D space on the display; extracting the physical characteristics of the predetermined object; and determining a correlation between the dimensions of the predetermined object and its distance from the camera. The calibration process may include the step of constructing a triangle having vertices at one of the image sensors and at the tip of the predetermined object, and having sides formed by the user's lines of sight. The distance of the real-world object from the camera may be estimated based on information extracted in the calibration.
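By way of a non-limiting illustration only, the correlation between the predetermined object's apparent size and its distance from the camera can be sketched with a simple pinhole-camera assumption: apparent size in pixels falls off inversely with distance, so a single calibrated observation fixes the proportionality constant. This model and the function names are assumptions of the example, not the calibration procedure of the disclosure.

```python
def calibrate_scale(known_distance_cm, observed_px):
    """One-point calibration: under the pinhole model,
    observed_px * distance is roughly constant, so record it."""
    return known_distance_cm * observed_px

def estimate_distance(scale, observed_px):
    """Estimate the predetermined object's distance from the camera
    from its current apparent size in pixels."""
    return scale / observed_px

# Calibrate with the fingertip held at a known 30 cm, appearing 20 px wide.
scale = calibrate_scale(30.0, 20.0)
assert estimate_distance(scale, 20.0) == 30.0  # same size -> same distance
assert estimate_distance(scale, 10.0) == 60.0  # half the size -> twice as far
```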

The method may further comprise displaying a keyboard enabling text input. The keyboard may be displayed upon detection of a predetermined gesture, such as a right-to-left gesture, presenting an open hand, or presenting two open hands in a predetermined region of the field of view of the image sensor. The keyboard may be displayed after a clicking gesture is performed in a 3D typing area or at a location where a predetermined activation icon is perceived to be.

The invention also provides a system comprising a device configured to carry out the method of the invention.

The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when the program is run on a computer. The computer program may be embodied on a computer-readable medium.

Users can interact with visual images, typically displayed through eyewear. The user's view of reality is thus augmented by the information presented on the display. One issue with augmented reality devices is the manner in which the user interacts with and controls the device. Traditional control devices, such as a mouse, trackball, or touch screen, are difficult to use with augmented reality devices. Using gesture recognition in an augmented reality system is not straightforward, because the user, and hence the augmented reality device, is constantly moving in real time.

The present invention therefore provides a computer program product comprising instructions for causing a processor to perform a method comprising the steps of:

receiving image information associated with an environment from an image sensor associated with an augmented reality device;

displaying augmentation information related to the environment on a display associated with the device;

recognizing in the image information a gesture of a user of the device;

correlating the gesture with the augmentation information; and

changing the displayed augmentation information based on the correlation.

The augmentation information may include at least one of: information associated with objects in the environment; images associated with the environment; and distances associated with the environment.

The correlating may include determining a reference location in three-dimensional space of at least a portion of the user's hand, and determining at least one of augmentation information and image information data associated with the reference location.

The changing may include changing the augmentation information based on the data associated with the reference location.

Brief Description of the Drawings

In order to understand the invention and to see how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:

Fig. 1 schematically shows a system for augmented reality according to one embodiment of the invention;

Fig. 2 shows a system for augmented reality according to one embodiment of the invention, the system comprising a set of goggles;

Fig. 3 shows the system of Fig. 2 in use;

Fig. 4a shows a view of a real-world scene displayed on a display device of the system of Fig. 2; Fig. 4b shows the view of Fig. 4a with a user's finger pointing at an object in the view; and Fig. 4c shows visual text relating to the object at which the user's finger is pointing, overlaid on the view of Fig. 4b;

Fig. 5 shows a system for augmented reality integral with a communication device according to another embodiment of the invention; and

Fig. 6a shows a region in the field of view of the image sensor being designated by the user performing a gesture that "draws" the outline of the region; Fig. 6b shows the selected region being resized by performing a second gesture; Fig. 6c shows the region after resizing; and Fig. 6d shows the region after being dragged to a new location in the field of view.

Detailed Description

Fig. 1 schematically shows a system 30 for augmented reality according to one embodiment of the invention. The system 30 includes one or more image sensors 32 configured to obtain images of a real-world scene. Any type of image sensor may be used in the system of the invention, such as a camera, a light sensor, an IR sensor, an ultrasonic sensor, a proximity sensor, a CMOS image sensor, a short-wave infrared (SWIR) image sensor, or a reflectivity sensor.

The system 30 further includes a viewing device 34 having one or more display devices 35 that enable a user to see both the real-world scene and external information, such as images, video, or audio signals, superimposed on the real-world scene. Any type of display device that allows the user to see both the real-world scene and the displayed data may be used in the system of the invention.

The display devices 35 may comprise, for example, a surface on which visual material is presented to the user, or one or more projectors that display images directly onto the user's retina. A processor 36 obtains orientation and/or location data of the system 30 from one or more state sensors 38, which may be, for example, any one or more of an optical sensor, an accelerometer, a GPS, a gyroscope, a solid-state compass, a magnetic sensor, a gravity sensor, and an RFID detector. The processor 36 may be, for example, a dedicated processor, a general-purpose processor, a DSP (digital signal processor), a GPU (graphical processing unit), dedicated hardware, or a processor that can run on an external device. The system 30 may run as software on the viewing device 34, or on another device 37, such as a smartphone, that incorporates the other components of the system 30.

The processor 36 is configured to run a gesture detection module 40 that identifies, in the images of the real-world scene obtained by the image sensor 32, one or more real-world objects at which a predetermined object is pointing. A real-world object may be, for example, a building or a billboard. Determination of the real-world object utilizes data provided by the state sensors 38. The predetermined object may be a user's finger, or another object such as a stylus or a wand.

When the processor 36 has identified a real-world object at which the predetermined object is pointing, the processor searches a memory 42 for data associated with the identified object. The data may be, for example, visual data, audio data, or textual data. The visual data may be textual information relating to the identified object. The processor then displays the relevant visual data associated with the identified object on the display of the viewing device. The memory 42 may be integral to the system 30, or may be remotely located and accessed over a communication network such as the Internet. The system 30 may thus include a communication module 39 allowing the system 30 to communicate with a network, a wireless network, a cellular network, an external device (for example, another device 30, a mobile phone, or a tablet), an Internet website, and so on.

The data may be an activation icon. As used herein, the term "activation icon" refers to a region in an image or video that is associated with one or more messages or commands activated by user interaction. The activation icons may be, for example, 2D or 3D visual elements such as virtual buttons, a virtual keyboard, or icons. Activation icons are activated by means of one or more predetermined objects that are recognizable by the system, and which may be, for example, any one or more of a stylus, a user's hand or a portion of a hand, one or more fingers, or a portion of a finger such as a fingertip. Activation of one or more of the activation icons by a predetermined object generates a message or command addressed to the operating system, one or more services, one or more applications, one or more devices, one or more remote applications, one or more remote services, or one or more remote devices.

The processor 36 may be configured to send a message or command to the device 37 or to a remote device, to an application running on the device, to a service running on the device 37, to an operating system running on the device, to a program running on the device, to a software program running in the background, to one or more services running on the device, or to a process running on the device. The message or command may be sent over a communication network such as the Internet or a cellular telephone network. The command may be, for example, a command to run an application on the device, a command to stop an application running on the device, a command to activate a service running on the device, a command to stop a service running on the device, or a command to send to the processor 36 data relating to a real-world object identified by the processor 36 in an image.

The command may be a command to the device 37, such as: pressing a virtual key displayed on a display device of the device; rotating a selection carousel; switching between desktops; running a predetermined software application on the device; turning off an application on the device; turning speakers on or off; turning the volume up or down; locking the device; unlocking the device; skipping to another track in a media player or switching between IPTV channels; controlling a navigation application; initiating a call; ending a call; presenting a notification; displaying a notification; navigating in a photo or music album gallery; scrolling web pages; presenting an email; presenting one or more documents or maps; controlling actions in a game; controlling interactive video or animated content; editing video or images; pointing at a map; zooming in or out on a map or image; painting on an image; pulling an activation icon away from the display device; grasping an activation icon and pulling it out of the display device; rotating an activation icon; emulating touch commands on the device; performing one or more multi-touch commands, touch gesture commands, or typing; clicking on a displayed video to pause or play it; editing video or music commands; tagging a frame or capturing a frame from a video; cutting a subset of a video out of a video; presenting an incoming message; answering an incoming call, muting or rejecting an incoming call, or opening an incoming call reminder; presenting a notification received from a network community service; presenting a notification generated by the device; changing the lock mode of the device and activating a recent-calls application; changing the lock mode of the device and activating an online service application or browser; changing the lock mode of the device and activating an email application; changing the lock mode of the device and activating an online service application or browser; changing the lock mode of the device and activating a calendar application; changing the lock mode of the device and activating a reminder application; changing the lock mode of the device and activating a predetermined application set by the user, by the manufacturer of the device, or by the service operator; activating an activation icon; selecting a menu item; moving a pointer on the display; manipulating a touch-free mouse; activating an activation icon on the display; and changing information on the display.

The communication module may be used to send a message that may be addressed, for example, to a remote device. The message may be, for example, a command to the remote device. The command may be, for example, a command to run an application on the remote device, a command to stop an application running on the remote device, a command to activate a service running on the remote device, or a command to stop a service running on the remote device. The message may be a command to the remote device selected from: pressing a virtual key displayed on a display device of the remote device; rotating a selection carousel; switching between desktops; running a predetermined software application on the remote device; turning off an application on the remote device; turning speakers on or off; turning the volume up or down; locking the remote device; unlocking the remote device; skipping to another track in a media player or switching between IPTV channels; controlling a navigation application; initiating a call; ending a call; presenting a notification; displaying a notification; navigating in a photo or music album gallery; scrolling web pages; presenting an email; presenting one or more documents or maps; controlling actions in a game; pointing at a map; zooming in or out on a map or image; painting on an image; grasping an activation icon and pulling it out of the display device; rotating an activation icon; emulating touch commands on the remote device; performing one or more multi-touch commands, touch gesture commands, or typing; clicking on a displayed video to pause or play it; tagging a frame or capturing a frame from a video; presenting an incoming message; answering an incoming call, muting or rejecting an incoming call, or opening an incoming call reminder; presenting a notification received from a network community service; presenting a notification generated by the remote device; opening a predetermined application; changing the lock mode of the remote device and opening a recent-calls application; changing the lock mode of the remote device and opening an online service application or browser; changing the lock mode of the remote device and opening an email application; changing the lock mode of the remote device and opening an online service application or browser; changing the lock mode of the device and opening a calendar application; changing the lock mode of the device and opening a reminder application; changing the lock mode of the device and opening a predetermined application set by the user, by the manufacturer of the remote device, or by the service operator; activating an activation icon; selecting a menu item; moving a pointer on the display; manipulating a touch-free mouse; activating an icon on the display; and changing information on the display.

The message may be a request for data associated with the identified object. The data request message may be addressed to an application, a service, a process, or a thread running on the device, to an application, a service, a process, or a thread running on an external device, or to an online service.

In order to reduce CPU resources, the object recognition module that detects the predetermined object may be employed only when the headset is not moving significantly, as determined from information obtained from the state sensors.

Fig. 2 shows a system 2 for augmented reality according to one embodiment of the invention. The system 2 includes a portable viewing device, which may be, for example, an interactive head-mounted eyepiece such as a pair of glasses or goggles 4. The goggles 4 are provided with an image sensor 6 that obtains images of a real-world scene 8. The scene 8 may include, for example, one or more buildings 12, or one or more billboards 14. The goggles may be provided with one or more display devices 10 located in the goggles 4 so as to be positioned in front of the user's eyes when the goggles 4 are worn by the user. The display devices 10 may be, for example, see-through devices, such as transparent LCD screens, through which the real-world scene is viewed and on which external data is presented. The system 2 further includes a processor 16 configured to identify, in the images captured by the image sensor 6, a predetermined object performing a gesture or pointing at a real-world object in the real-world scene 8 or at an activation icon displayed to the user. The system 2 also includes one or more location and/or orientation sensors 23, such as a GPS, an accelerometer, a gyroscope, a solid-state compass, a magnetic sensor, or a gravity sensor.

Fig. 5 shows a system 40 for augmented reality according to another embodiment of the invention. The system 40 is integrated into a mobile communication device 42, such as a mobile phone, a tablet, or a camera. Fig. 5a shows a front view of the communication device 42, and Fig. 5b shows a rear view of the communication device 42. The communication device 42 is provided on its rear surface, opposite the display device, with an image sensor 46 that obtains images of a real-world scene. The communication device 42 is also provided on its front surface with a display device 48 that is positioned in front of the user when the camera 46 faces the real-world scene. The display device 48 may be, for example, an LCD screen that presents to the user the images of the real-world scene obtained by the camera 46, together with visual data as explained below. The system 40 utilizes the camera 46, the display device 48, and the processor of the communication device 42, and further includes one or more state sensors contained within the housing of the communication device 42 and not shown in Fig. 5. The processor is configured to identify, in the images captured by the image sensor 46, a predetermined object pointing at a real-world object in the real-world scene.

Fig. 3a shows the system 2 in use. The goggles 4 are placed over the eyes of a user 18. The user faces the real-world scene 8 and thus views the scene 8. Fig. 3b shows the system 40 in use. The user 18 holds the communication device 42 with the image sensor 46 facing the real-world scene 8 and the display device 48 facing the user.

The system 2 or 40 now carries out the following process. A view of the scene 8, as it would be seen by the user when using the system 2 or 40, is presented on the display device. Fig. 4a shows a view of the real-world scene 8 as it would be seen by a user viewing the scene 8 using the system 2 or 40. The processor 36 analyzes the images obtained by the image sensor to determine when a predetermined object in the images captured by the image sensor has performed a predetermined gesture relating to a real-world object in the real-world scene 8.
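By way of a non-limiting illustration only, the analysis step of deciding when a tracked predetermined object has performed a predetermined gesture can be sketched as a simple classifier over the fingertip's trajectory. The displacement thresholds and gesture labels below are assumptions of the example; practical systems use far more robust detection (see, for example, the methods referenced later as WO2005/091125 and WO2010/086866).

```python
def classify_gesture(trajectory, min_travel=50):
    """Classify a fingertip trajectory (list of (x, y) image positions
    over time) as one of a few predetermined swipe gestures."""
    if len(trajectory) < 2:
        return None
    dx = trajectory[-1][0] - trajectory[0][0]
    dy = trajectory[-1][1] - trajectory[0][1]
    if abs(dx) >= abs(dy):          # predominantly horizontal motion
        if dx > min_travel:
            return "left-to-right"
        if dx < -min_travel:
            return "right-to-left"
    else:                           # predominantly vertical motion
        if dy > min_travel:
            return "downward"
        if dy < -min_travel:
            return "upward"
    return None                     # too little travel: no gesture

assert classify_gesture([(0, 0), (30, 5), (90, 8)]) == "left-to-right"
assert classify_gesture([(50, 100), (48, 40), (47, 10)]) == "upward"
assert classify_gesture([(0, 0), (10, 10)]) is None
```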

A viewing device 34, such as the goggles 4 or the communication device 42, is not fixed while in use, owing to movements that occur as the user walks, or to movements of the user's head or hand. In this situation, the signals generated by the sensors 38 may be noisy and inaccurate. In this case, a machine vision module 37 runs a tracking algorithm that tracks the identified real-world object and maintains the displayed relevant visual data in a fixed position relative to the identified real-world object.

A predetermined gesture relating to a real-world object or activation icon may be, for example, pointing at the real-world object or activation icon, or performing a page-turning gesture over the real-world object or activation icon. The activation icon may or may not be related to a real-world object.

Other possible predetermined gestures include a page-turning gesture, a pinching movement of two fingers (for example, the forefinger and thumb, or the middle finger and thumb), pointing, a left-to-right gesture, a right-to-left gesture, an upward gesture, a downward gesture, a pressing gesture, opening a clenched fist, opening a clenched fist and moving towards the image sensor, a tapping gesture, a waving gesture, a clapping gesture, a reverse clapping gesture, closing a hand into a fist, a pinching gesture, a reverse pinching gesture, a gesture of spreading the fingers, a reverse gesture of spreading the fingers, pointing at an activation icon or real-world object, pointing at an activation icon or real-world object for a predetermined amount of time, clicking on an activation icon or real-world object, double-clicking on an activation icon or real-world object, clicking with the forefinger on an activation icon or real-world object, clicking with the middle finger on an activation icon or real-world object, clicking on an activation icon or real-world object from the bottom, clicking on an activation icon from the top, grasping an activation icon or real-world object, pointing at an activation icon or real-world object from the right, pointing at an activation icon or real-world object from the left, passing through an activation icon or real-world object from the left, pushing an activation icon or real-world object, clapping or waving over an activation icon or real-world object, performing a blast gesture, performing a tapping gesture, performing a clockwise or counterclockwise gesture over an activation icon or real-world object, sliding an activation icon or real-world object, grasping an activation icon or real-world object with two fingers, or performing a click-drag-release motion.

The predetermined object may be, for example, the user's hand, a part of the user's hand, such as the user's finger 20, or parts of two different hands. Alternatively, the predetermined object may be a stylus or wand.

When the processor 16 determines that a predetermined gesture has been performed, this may be indicated to the user by any type of feedback, such as visual feedback, audio feedback, haptic feedback, directional vibration, air tactile feedback, or ultrasonic feedback. The feedback may be a visual indication in a form selected from: an activation icon displayed on the display device; a change in an activation icon displayed on the display device; a change in the color of an activation icon displayed on the display device; a change in the size of an activation icon displayed on the display device; animation of an activation icon displayed on the display device; an indication light; an indicator moving on the display device; a vibration; a directional vibration indication; or an air tactile indication. The indication may be provided by an indicator moving on the display device that appears on top of all other images or video appearing on the display device. The visual feedback may be the appearance of a glow around the predetermined object when the system recognizes the predetermined object.

The gesture detection module 40 may use any method for detecting the predetermined object in images obtained by the image sensor 32. For example, the gesture detection module may detect the predetermined object as disclosed in WO2005/091125 or WO2010/086866.

The processor 16 is also configured to determine the real world object in the scene 8 with respect to which the predetermined gesture was performed. Thus, for example, in the image shown in Fig. 4b, the processor 16 determines that the user's finger 20 is pointing at the billboard 14 by determining the position (X, Y) of the fingertip in the image and combining this information with the user's location and the orientation of the goggles 4 obtained from the state sensor 21. The real world object is thus identified by the processor without presenting a cursor or other marker to the user to indicate the real world object he wishes to select, so that an interaction can be initiated by pointing directly at the real world object. The processor 16 searches a memory, which may be integral with the processor 16 or may be located remotely, for data relating to the real world object at which the user's finger 20 is pointing. For example, the memory may have stored in it data relating to the billboard 14. When the user points at an object in the scene 8 whose data is stored in the memory, or is retrieved from a remote server such as a website, the data is displayed on the display device 10 superimposed on the user's view of the scene. Thus, when the user points at the billboard 14 (Fig. 3), visual data 21 relating to the billboard 14 is displayed on the display device 10, as shown in Fig. 4c.
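As a rough illustration of the pointing determination described above, the sketch below combines a fingertip position detected in the image with the head orientation obtained from a state sensor to decide which known object the user is pointing at. The pinhole-camera assumption, the function names, and the object table are illustrative assumptions, not taken from the patent.

```python
import math

def pointed_object(fingertip_xy, fov_deg, image_size, head_yaw_pitch, objects):
    """Return the known object closest to the user's pointing direction.

    fingertip_xy   -- (x, y) fingertip position detected in the image
    head_yaw_pitch -- (yaw, pitch) of the goggles from the state sensor, in degrees
    objects        -- {name: (yaw, pitch)} world directions of known objects
    """
    w, h = image_size
    fx, fy = fingertip_xy
    # Convert the fingertip's pixel offset from the image centre into an
    # angular offset, assuming a simple pinhole camera with the given FOV.
    yaw_off = (fx - w / 2) / w * fov_deg
    pitch_off = (h / 2 - fy) / h * fov_deg
    yaw = head_yaw_pitch[0] + yaw_off
    pitch = head_yaw_pitch[1] + pitch_off

    # Pick the object whose known direction is nearest the pointing direction.
    def angular_distance(name):
        oy, op = objects[name]
        return math.hypot(oy - yaw, op - pitch)

    return min(objects, key=angular_distance)
```

With a fingertip slightly right of centre and the head facing forward, an object ten degrees to the right (such as the billboard of the example) would be selected over one far to the left.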

The visual data 21 may be static or dynamic. The visual data 21 may include one or more activation icons, such that when a predetermined gesture is performed with respect to one of the activation icons, a command associated with that icon is executed. The command may be, for example, to display particular visual material relating to the selected real world object. The activation icons may be 2D or 3D activation icons and may be presented to the user so that the user perceives the icons in the 3D space in front of him. As used herein, an activation icon is a region in a 2D or 3D image or video that is associated with one or more messages activated by user interaction. The activation icon may be, for example, a 2D or 3D visual element. The activation icon may be a virtual button, a virtual keyboard, a 2D or 3D activation icon, or a region in an image or video. An activation icon may consist of two or more activation icons.

The processor may refrain from rendering the part of an activation icon where the predetermined object is located, so that the predetermined object appears to be on top of the activation icon. When the user moves his head rapidly, the activation icons may be removed, returning when the head motion drops below a predetermined motion speed.

The system 2 may have two or more modes of operation, and the processor 16 may be configured to recognize one or more predetermined gestures for changing between the modes of operation. Thus, gestures may be used to turn the system on or off, to select the source of the visual material to be presented, to select the level of detail of the visual material to be presented, to select the buttons or activation icons to be presented to the user, or to activate an online service, such as an online service relating to a selected real world object. Another mode of operation may be to start video recording of images by the image sensor and/or recording of sounds by a microphone after recognition of a predetermined gesture, and to stop recording after recognition of another predetermined gesture. Yet another mode of operation is to monitor video and/or sound continuously, but after a predetermined gesture is detected, to record the video/sound starting from a predetermined amount of time before the gesture was recognized, and to stop recording after another predetermined gesture is recognized. The predetermined time may be defined by the user. Another mode of operation is to add tags to captured and real-time recorded video after recognition of a predetermined gesture.
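The continuous-monitoring mode, in which a recording includes footage from a predetermined amount of time before the start gesture, can be sketched with a rolling buffer. The class and event names below are illustrative assumptions, not terms from the patent.

```python
from collections import deque

class PreBufferRecorder:
    """Keep the last `pre_seconds` of frames so that, when a start gesture is
    recognized, the saved clip can include footage from before the gesture."""

    def __init__(self, pre_seconds, fps):
        self.buffer = deque(maxlen=pre_seconds * fps)  # rolling pre-gesture window
        self.recording = False
        self.clip = []

    def on_frame(self, frame):
        if self.recording:
            self.clip.append(frame)
        else:
            self.buffer.append(frame)  # continuously monitored, not yet saved

    def on_gesture(self, gesture):
        if gesture == "start" and not self.recording:
            self.clip = list(self.buffer)  # prepend the pre-gesture footage
            self.recording = True
        elif gesture == "stop" and self.recording:
            self.recording = False
            return self.clip  # finished clip, oldest pre-buffer frame first
        return None
```

For example, with a two-second pre-buffer at two frames per second, a clip started after ten frames would begin four frames before the start gesture was recognized.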

Fig. 6 shows another mode of operation. In Fig. 6a, a region 62 in the field of view 60 captured by the image sensor is designated by the user performing a gesture that "draws" the outline of the region, shown by the broken line in Fig. 6. The selected region is then resized by the user performing a second gesture, for example spreading two fingers apart or bringing two fingers closer together, as indicated by the arrows 66 in Fig. 6b, until the selected region attains the desired size (67 in Fig. 6c). The region 67 is then dragged to a new location in the field of view (Fig. 6d) and copied to the new location. The system then employs a tracker on the selected region, and the selected region is presented in real time on the display device, resized and repositioned as set by the user.

To minimize CPU resources, for each displayed activation icon, an image region containing a bounding box around the displayed activation icon may be defined as remaining unchanged. The system uses a machine vision tracker to track this bounding box, the region being considered unchanged when the distance between the positions of the bounding box in two frames of the video sequence, as determined using the video tracker, is less than a predetermined distance, and the tracker's correlation value for the bounding box meets a predetermined value.
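The bounding-box test above is stated somewhat ambiguously in the source; one plausible reading, sketched below, is that the icon's image region is treated as unchanged when the tracker reports both a small displacement between frames and a strong match score. The thresholds and names are illustrative only.

```python
def icon_region_unchanged(prev_box, cur_box, correlation,
                          max_dist=5.0, min_corr=0.9):
    """Decide whether the image region around a displayed activation icon can
    be treated as unchanged between two video frames, so that heavier
    processing for that region can be skipped.

    prev_box / cur_box -- (x, y) bounding-box centres reported by the tracker
    correlation        -- the tracker's match score for the bounding box
    """
    dx = cur_box[0] - prev_box[0]
    dy = cur_box[1] - prev_box[1]
    moved = (dx * dx + dy * dy) ** 0.5
    # Small displacement and a strong match: no need to re-run detection here.
    return moved < max_dist and correlation >= min_corr
```

A displacement of a couple of pixels with a high correlation passes the test, while a large jump of the box forces the region to be re-processed.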

When the system is in a mode of operation in which only activation icons, and not real world objects, can be activated, CPU usage can be minimized by searching for the predetermined object only in the vicinity of each displayed activation icon. To reduce CPU usage further, the object recognition module is activated only when the headset is determined not to be moving significantly, based on information obtained from the state sensors.

The user may select different filters to screen the data relating to real world objects, such as a filter "show only data generated by friends", or a filter showing data from registered sources, or data generated within the last three months.

The system 2 may have a standby mode in which its power consumption is minimal. The active mode may differ from the standby mode, for example, in the number of video frames per second that are analyzed by the system, the resolution of the images that are analyzed, the portion of each image frame that is analyzed, and/or the detection modules that are activated. The system 2 can be brought into the active mode by any technique. For example, the system 2 may be brought into the active mode by: bringing the predetermined object into the field of view from below, or the user placing the predetermined object in a certain position or pose, for example pointing at the lower right corner of the camera field of view or opening a hand in the camera field of view; the user performing a predetermined gesture associated with a displayed activation icon, such as pointing at the activation icon; performing a predetermined gesture, such as moving a hand across the field of view from right to left, or performing a waving gesture at the location where an activation icon is presented; sliding a floating activation icon from one location to another by performing a gesture in the 3D space where the activation icon is perceived to be located; touching the device; or, if the device has an accelerometer, tapping on the device. As another example, if the device has a proximity sensor or an ultrasonic sensor, the system may enter the active mode when the user's hand is near the device. The system may also be activated by a voice command, or when the user places the predetermined object at a particular location in the field of view. As yet another example, the system may enter the active mode only when there is relevant data associated with the real world in the user's field of view. The system may then indicate to the user when there is relevant data to be presented, or when it is ready for interaction.
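One way to picture the standby/active distinction is a minimal sketch in which the two modes differ in how many frames are analyzed and which detection modules run. The event names and numeric values are illustrative assumptions, not figures from the patent.

```python
STANDBY = {"fps_analyzed": 2, "resolution_scale": 0.25,
           "modules": {"wake_gesture"}}
ACTIVE = {"fps_analyzed": 30, "resolution_scale": 1.0,
          "modules": {"wake_gesture", "gesture_detection", "object_recognition"}}

class PowerManager:
    """Switch between low-power standby analysis and full active analysis."""

    def __init__(self):
        self.mode = STANDBY

    def handle_event(self, event):
        # Any of the wake events described above moves the system to active mode.
        if event in {"wake_gesture", "proximity", "voice_command", "tap"}:
            self.mode = ACTIVE
        elif event == "idle_timeout":
            self.mode = STANDBY

    def should_analyze(self, frame_index, camera_fps=30):
        # In standby, only a fraction of the camera frames is analyzed.
        step = max(1, camera_fps // self.mode["fps_analyzed"])
        return frame_index % step == 0
```

In standby only one frame in fifteen is analyzed under these example numbers; after a wake event, every frame is analyzed at full resolution.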

A visual indication may be attached to a real world object to let the user know that there is data associated with the real world object.

The indication of relevant data may be overlaid on the location of the real world object. For example, a small visual indication of an activation icon "i" may indicate information, a "photos" logo may indicate images related to the real world object, and an "envelope" logo may indicate a message relating to the real world object left by a friend or other user. When the user performs a predetermined gesture associated with the activation icon, the data may be presented.

The system 2 may be configured to undergo a calibration process to record various physical parameters of the predetermined object, in order to facilitate identification of the predetermined object by the processor 16 in images obtained by the camera. This may be done, for example, by presenting activation icons to the user at different locations in 3D space on the display; extracting physical features of the predetermined object, such as its size or orientation; and determining a correlation between the size of the predetermined object and its distance from the camera. The calibration may include calculating the triangle formed by the camera, the user's line of sight, and the tip of the predetermined object in order to determine what the user is pointing at. Accuracy is improved by estimating the distance of the real world object from the camera based on the information extracted during calibration.
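The size-to-distance correlation recorded during calibration can be sketched with a simple pinhole-camera relation: for a rigid object, apparent size times distance is roughly constant. This is an illustrative model with assumed names, not the patent's stated algorithm.

```python
def calibrate_scale(apparent_size_px, known_distance):
    """One-point calibration: record the product of apparent size and distance.

    Under a pinhole-camera model, apparent_size * distance is roughly constant
    for a rigid object such as the user's fingertip.
    """
    return apparent_size_px * known_distance

def estimate_distance(scale, apparent_size_px):
    """Estimate how far the calibrated object now is from the camera."""
    return scale / apparent_size_px
```

For example, if the fingertip spans 40 pixels at a known half-metre distance during calibration, a later observation of 20 pixels suggests the fingertip is about one metre away.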

The processor may be configured to identify, in the images of the real world scene obtained by the camera, other users of systems of the invention. Identification of another user in the real world scene may be performed, for example, by notifying a remote server of the locations of the devices in a particular geographical area. The locations of the other devices can be sent to all of the devices in the geographical area.

When a communication link exists between two systems of the invention, the two systems may be used to play a game. The other user may be represented as an avatar with whom the user can interact through gestures, for example to send the other user a message such as a "like".

The processor may be configured to display a keyboard enabling text entry with one or more fingers or hands. Display of the keyboard may be initiated after detection of a predetermined gesture, such as a right-to-left gesture, presenting an open hand, or presenting two open hands in a predetermined region of the camera's field of view, such as the bottom of the field of view. Another way to initiate display of the keyboard is for the user to perform a click gesture in a typing region, or in the 3D space where an activation icon is perceived to be located. The keyboard may be used, for example, to write a note, perform a search, or communicate with online services (such as Skype or Twitter) by typing on the virtual keyboard. The system may refrain from rendering the part of the keyboard where the predetermined object is located, so that the predetermined object appears to be on top of the keyboard, creating the illusion that the predetermined object, such as the user's hand, is "above" the keyboard.

When the system is in a typing mode, animated hands may be presented over the keyboard, with positions correlated to those of the user's hands and fingers. The fingertips of the animated hands may be located above the virtual keys at positions where the key characters can be seen. The keyboard and the animated hands are preferably opaque, so that the user cannot see the background behind the keyboard. This tends to make the keyboard clearer to the user.

Claims (56)

1. A method for augmented reality, comprising:
(a) obtaining images of a real world scene from one or more image sensors;
(b) obtaining, from one or more state sensors, one or both of orientation data and location data of the image sensors;
(c) identifying, in the images of the real world scene obtained by the one or more image sensors, a real world object on which a predetermined gesture is performed by a predetermined object, the gesture detection module utilizing data provided by the one or more state sensors; and
(d) presenting, on a display of a viewing device, data associated with the identified object.
2. the method for claim 1, one or more being selected from wherein said state sensor: optical sensor, accelerometer, GPS, gyroscope, compass, Magnetic Sensor, the described equipment of indication are with respect to sensor, gravity sensor and the RFID detecting device of the described direction in magnetic field of the earth.
3. method as claimed in claim 1 or 2, wherein obtains by the data associated with described real world objects of search in storer with the described data of the object association of described identification.
4. the method according to any one of the preceding claims, wherein said predetermine one is part, the finger of a part for hand, hand, two hands, two hands, a part or the finger tip of finger.
5. the method according to any one of the preceding claims, wherein saidly checks that equipment is configured to be worn by user.
6. method as claimed in claim 5, wherein saidly checks that equipment is glasses or safety goggles.
7. the method according to any one of the preceding claims, wherein saidly checks that equipment is merged in mobile communication equipment.
8. the method according to any one of the preceding claims, the step of identifying in the described image of the wherein said described real world scene obtaining at described one or more imageing sensors comprises: the position (X, Y) of determining predetermine one described in the image that described imageing sensor obtains; With one or two in position and the direction of the described display device that provides of described sensor is provided.
9. the method according to any one of the preceding claims, it also comprises: communicate by letter with external unit or website.
10. the method according to any one of the preceding claims, wherein said imageing sensor is selected from: camera, optical sensor, IR sensor, ultrasonic sensor, proximity transducer, cmos image sensor, short-wave infrared (SWIR) imageing sensor or reflective sensor, IR sensor, ultrasonic sensor, proximity transducer and reflective sensor.
11. The method of claim 9, further comprising sending a message to: an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or one or more services running on the external device.
12. The method of claim 7, further comprising sending a message to: an application running on the mobile communication device, a service running on the mobile communication device, an operating system running on the mobile communication device, a process running on the mobile communication device, one or more applications running on a processor of the mobile communication device, a software program running in the background of the mobile communication device, or one or more services running on the mobile communication device.
13. The method of claim 9, further comprising sending a message from an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, or a software program running in the background of the external device, the message requesting data relating to a real world object identified in an image, or sending the message to one or more services running on the external device.
14. The method of claim 7, further comprising sending a message from an application running on the mobile communication device, a service running on the mobile communication device, an operating system running on the mobile communication device, a process running on the mobile communication device, one or more applications running on a processor of the mobile communication device, or a software program running in the background of the mobile communication device, the message requesting data relating to a real world object identified in an image, or sending the message to one or more services running on the mobile communication device.
15. The method of claim 11, wherein the message to the external device or website is a command.
16. The method of claim 15, wherein the command is selected from: a command to run an application on the external device or website, a command to stop an application running on the external device or website, a command to activate a service running on the external device or website, a command to stop a service running on the external device or website, or a command to send data relating to a real world object identified in an image.
17. The method of claim 12, wherein the message to the mobile communication device is a command.
18. The method of claim 17, wherein the command is selected from: a command to run an application on the mobile communication device, a command to stop an application running on the mobile communication device or website, a command to activate a service running on the mobile communication device, a command to stop a service running on the mobile communication device, or a command to send data relating to a real world object identified in an image.
19. The method of claim 13, further comprising: receiving, from the external device or website, data relating to a real world object identified in an image; and presenting the received data to a user.
20. The method of any one of claims 9, 11 or 13, wherein the communication with the external device or website is over a communication network.
21. The method of claim 11, wherein the message is a command to the external device selected from: a command to depress a virtual key displayed on a display device of the external device; to rotate a selection carousel; to switch between desktops; to run a predetermined software application on the external device; to turn off an application on the external device; to turn speakers on or off; to turn the volume up or down; to lock the external device; to unlock the external device; to skip to another track in a media player or to change between IPTV channels; to control a navigation application; to initiate a call; to end a call; to present a notification; to display a notification; to navigate in a photo or music album gallery; to scroll web pages; to present an email; to present one or more documents or maps; to control actions in a game; to point at a map; to zoom in or out of a map or image; to paint on an image; to grasp an activation icon and pull the activation icon out of the display device; to rotate an activation icon; to emulate touch commands on the external device; to perform one or more multi-touch commands, touch gesture commands, or typing; to click on a displayed video to pause or play it; to tag a frame or to capture a frame from a video; to present an incoming message; to answer an incoming call, to mute or reject it; to open an incoming call reminder; to present a notification received from a network community service; to present a notification generated by the external device; to open a predetermined application; to change the external device from a locked mode and open a recent call application; to change the external device from a locked mode and open an online service application or browser; to change the external device from a locked mode and open an email application; to change the device from a locked mode and open a calendar application; to change the device from a locked mode and open a reminder application; to change the device from a locked mode and open a predetermined application set by the user, by the manufacturer of the external device, or by a service provider; to activate an activation icon; to select a menu item; to move a pointer on a display; to manipulate a touch-free mouse; to activate an activation icon on a display; or to change information on a display.
22. The method of any one of the preceding claims, wherein the predetermined gesture is selected from: a page-turning gesture, a pinching movement of two fingers, pointing, a left-to-right gesture, a right-to-left gesture, an upward gesture, a downward gesture, a pressing gesture, opening a clenched fist, opening a clenched fist and moving towards the image sensor, a tapping gesture, a waving gesture, a clapping gesture, a reverse clapping gesture, closing a hand into a fist, a pinching gesture, a reverse pinching gesture, a gesture of splaying the fingers, a reverse gesture of splaying the fingers, pointing at an activation icon, holding an activating object for a predetermined amount of time, clicking on an activation icon, double-clicking on an activation icon, clicking on an activation icon from the right side, clicking on an activation icon from the left side, clicking on an activation icon from below, clicking on an activation icon from above, grasping an activation icon or the object, pointing at an activation icon or the object from the right, pointing at an activation icon from the left, passing through an activation icon from the left, pushing the object, clapping, waving over an activation icon, performing a blast gesture, performing a tapping gesture, performing a clockwise or counter-clockwise gesture over an activation icon, sliding an icon, grasping an activation icon with two fingers, and performing a click-drag-release motion.
23. The method of any one of the preceding claims, wherein the data associated with the identified object is any one or more of visual data, audio data, or textual data.
24. The method of any one of the preceding claims, wherein the data associated with the identified object is an activation icon.
25. The method of claim 24, wherein the activation icon is a 2D or 3D activation icon.
26. The method of claim 24 or 25, wherein the activation icon is perceived by a user in 3D space in front of the user.
27. The method of any one of the preceding claims, having two or more modes of operation.
28. The method of claim 27, further comprising changing the mode of operation of the system upon recognition of a predetermined gesture.
29. The method of claim 27 or 28, wherein a mode of operation is specified by any one or more of: the gestures to be identified; the algorithms that are active in the gesture detection module; the resolution of the images captured by the image sensor; the image capture rate of the image sensor; the level of detail of the data to be presented; the activation icons to be presented to the user; the source of the data to be presented; the activation icons to be displayed on the display device; or the active online service.
30. The method of any one of claims 27 to 29, wherein the mode of operation is selected from: a mode in which the image sensor records video images upon recognition of a predetermined gesture; a mode in which a microphone records sounds upon recognition of a predetermined gesture and stops recording upon recognition of another predetermined gesture; a mode of continuously monitoring video or sound and, after a predetermined gesture is detected, recording the video or sound starting from a predetermined amount of time before the gesture was recognized and stopping the recording after another predetermined gesture is recognized; a mode of adding tags to captured and real-time recorded video upon recognition of a predetermined gesture; a mode in which a region within the field of view captured by the camera is selected and copied to another location in the field of view and resized; a mode in which a tracker is employed on a selected region in an image and the selected region is presented in real time, resized and relocated, on the display device; and a mode in which an image is captured upon recognition of a predetermined gesture.
31. the method according to any one of the preceding claims, it also comprises: operation track algorithm, the multi view data that described track algorithm is followed the tracks of the real world objects of described identification and maintained described demonstration are in a fixed position with respect to the real world objects of described identification.
32. the method according to any one of the preceding claims, wherein object identification module just detects described predetermine one in order to only have in the time that described display device has the sports level lower than predetermined threshold.
33. the method according to any one of the preceding claims, it also comprises: feedback is provided in the time recognizing prearranged gesture.
34. methods as claimed in claim 33, wherein said feedback is visual feedback, audio feedback, tactile feedback, directional vibration, air tactile feedback, or ultrasound wave feedback.
35. methods as described in claim 33 or 34, wherein said feedback is to be the vision indication that is selected from following form: the activation icon showing on described display device, the activation graph target showing on described display device changes, the variation of the activation graph target color showing on described display device, the variation of the activation graph target size showing on described display device, the activation graph target animation showing on described display device, pilot lamp, the indicator moving on display device, the indicator moving on described display device that all other images that occur on described display device or video top occur, described outward appearance with described predetermine one aura around.
36. methods as claimed in claim 33, wherein said feedback is vibration, directional vibration indication, air sense of touch indication.
37. the method according to any one of the preceding claims, the activation graph target part showing on wherein said display device does not present in the position at described predetermine one place, and described predetermine one is seemed at described activation graph target top.
38. The method of any one of the preceding claims, wherein activation icons are removed from the display device when the display device has a motion level above a predetermined threshold.
39. The method of claim 38, further comprising: redisplaying the removed icons on the display device when the display device has a motion level below the predetermined threshold.
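The hide/restore behavior of claims 38 and 39 amounts to toggling an icon layer on the motion level. A minimal sketch, with `IconLayer` as an illustrative name rather than anything from the patent:

```python
# Sketch: activation icons are hidden while the display device moves
# above a threshold (claim 38) and restored once it steadies (claim 39).

class IconLayer:
    def __init__(self, icons, threshold):
        self.icons = icons
        self.threshold = threshold
        self.visible = True

    def update(self, motion_level):
        """Return the icons to draw this frame given the motion level."""
        self.visible = motion_level <= self.threshold
        return self.icons if self.visible else []

layer = IconLayer(["call", "share"], threshold=1.0)
assert layer.update(2.5) == []                   # moving: icons hidden
assert layer.update(0.3) == ["call", "share"]    # steady: icons restored
```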
40. The method of any one of claims 27 to 30, wherein the method enters an active mode when a predetermined action is performed, the predetermined action being selected from: the user placing the predefined object at a certain location or in a certain pose, for example bringing the predefined object into the field of view from below, pointing to the lower right corner of the camera field of view, or opening a hand in the camera field of view; the user performing a predefined gesture associated with a displayed activation icon, for example pointing at the activation icon; performing a predefined gesture, for example moving a hand from right to left across the field of view, performing a waving gesture at the location where the activation icon is presented, or performing a gesture at the perceived in-position of a floating activation icon in 3D space; touching the device, or tapping the device if the device has an accelerometer; or sliding a floating activation icon from one location to another. As another example, if the device has a proximity sensor or an ultrasonic sensor, the system may enter the active mode when the user's hand is near the device. The system may also be activated by a voice command, or when the user places the predefined object at a specific location in the field of view. As a further example, the system may enter the active mode only when there is relevant data associated with the real world in the user's field of view; the system may then indicate to the user that it has relevant data to present, or that it is ready for interaction.
41. The method of any one of the preceding claims, further comprising: attaching a visual indication to a real-world object to indicate that memory holds data relevant to the real-world object.
42. The method of claim 41, wherein the visual indication is overlaid on an image of the real-world object.
43. The method of claim 42, wherein the visual indication is selected from an activation icon, a photo, and an image of an envelope.
44. The method of any one of the preceding claims, further comprising: a calibration process that records one or more physical parameters of the predefined object.
45. The method of claim 44, wherein the calibration process comprises any one or more of the following steps: presenting activation icons at different positions in 3D space on the display; extracting physical features of the predefined object; and determining a correlation between the size of the predefined object and its distance from the camera.
46. The method of claim 44 or 45, wherein the calibration process comprises the step of constructing a triangle having one vertex at the image sensor and another at the tip of the predefined object, wherein a side of the triangle is formed by the user's line of sight.
47. The method of claim 46, further comprising: estimating the distance between the real-world object and the camera based on the information extracted in the calibration.
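The size-distance correlation of claims 45 and 47 can be sketched with a standard pinhole-camera model. The numeric values and function names here are assumptions for illustration, not figures from the patent.

```python
# Hedged sketch: calibrate the size-distance correlation of the
# predefined object once, then estimate its distance from the camera
# from its apparent size alone (pinhole model).

def calibrate(real_width_m, apparent_width_px, known_distance_m):
    """Recover the focal length (in pixels) from one calibration view
    where the object's true distance from the camera is known."""
    return apparent_width_px * known_distance_m / real_width_m

def estimate_distance(real_width_m, apparent_width_px, focal_px):
    """Pinhole model: distance = real_width * focal / apparent_width."""
    return real_width_m * focal_px / apparent_width_px

# Calibration view: an 8 cm-wide hand spans 200 px at 0.5 m.
focal = calibrate(0.08, 200, 0.5)      # 1250.0 px (illustrative)
# Later the same hand spans only 100 px, i.e. it is twice as far away.
assert abs(estimate_distance(0.08, 100, focal) - 1.0) < 1e-9
```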
48. The method of any one of the preceding claims, further comprising: displaying a keyboard enabling text input.
49. The method of claim 48, wherein the keyboard is displayed after a predefined gesture is detected.
50. The method of claim 49, wherein the predefined gesture is selected from: a right-to-left gesture, presenting an open hand, and presenting two open hands in a predefined region of the field of view of the image sensor.
51. The method of claim 48, wherein the keyboard is displayed after a click gesture is performed in a 3D typing area or at the perceived location of a predefined activation icon.
52. The method of any one of the preceding claims, wherein a part of an activation icon is not presented on the viewing device at the location of the predefined object.
53. The method of any one of the preceding claims, wherein recording of video and/or sound begins a predetermined amount of time before the predefined gesture is recognized.
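Claim 53 implies a rolling pre-buffer: frames are retained continuously so that, once a gesture is recognized, the recording effectively began a fixed time earlier. A minimal sketch under that assumption (the class name and buffer size are illustrative):

```python
# Hedged sketch: keep a bounded pre-roll buffer of frames; on gesture
# recognition, seed the recording with the buffered frames so capture
# "starts" before the recognition moment.

from collections import deque

class PreRollRecorder:
    def __init__(self, preroll_frames):
        # Bounded deque: oldest frames fall off automatically.
        self.buffer = deque(maxlen=preroll_frames)
        self.recording = None

    def push(self, frame):
        if self.recording is not None:
            self.recording.append(frame)   # actively recording
        else:
            self.buffer.append(frame)      # just keeping the pre-roll

    def on_gesture_recognized(self):
        # Recording begins with the frames captured before recognition.
        self.recording = list(self.buffer)

rec = PreRollRecorder(preroll_frames=3)
for f in range(10):              # frames 0..9 arrive before the gesture
    rec.push(f)
rec.on_gesture_recognized()      # gesture detected after frame 9
rec.push(10)
assert rec.recording == [7, 8, 9, 10]   # includes pre-gesture frames
```

The same pattern works for audio samples; only the buffer length (pre-roll duration times sample rate) changes.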
54. A system comprising a device configured to perform the method of any one of claims 1 to 53.
55. A computer program comprising computer program code means for performing all the steps of any one of claims 1 to 53 when said program is run on a computer.
56. A computer program as claimed in claim 55, embodied on a computer-readable medium.
CN201280048836.8A 2011-09-19 2012-09-19 Augmented reality device, method of operating augmented reality device, computer-readable medium Expired - Fee Related CN103858073B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210808606.2A CN115167675A (en) 2011-09-19 2012-09-19 Augmented reality device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201161536144P 2011-09-19 2011-09-19
US61/536,144 2011-09-19
PCT/IL2012/050376 WO2013093906A1 (en) 2011-09-19 2012-09-19 Touch free interface for augmented reality systems

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210808606.2A Division CN115167675A (en) 2011-09-19 2012-09-19 Augmented reality device

Publications (2)

Publication Number Publication Date
CN103858073A true CN103858073A (en) 2014-06-11
CN103858073B CN103858073B (en) 2022-07-29

Family

ID=47189999

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210808606.2A Pending CN115167675A (en) 2011-09-19 2012-09-19 Augmented reality device
CN201280048836.8A Expired - Fee Related CN103858073B (en) 2011-09-19 2012-09-19 Augmented reality device, method of operating augmented reality device, computer-readable medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202210808606.2A Pending CN115167675A (en) 2011-09-19 2012-09-19 Augmented reality device

Country Status (5)

Country Link
US (8) US20140361988A1 (en)
JP (3) JP2014531662A (en)
KR (3) KR20140069124A (en)
CN (2) CN115167675A (en)
WO (1) WO2013093906A1 (en)

Cited By (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104133593A (en) * 2014-08-06 2014-11-05 北京行云时空科技有限公司 Character input system and method based on motion sensing
CN104156082A (en) * 2014-08-06 2014-11-19 北京行云时空科技有限公司 Control system and intelligent terminal of user interfaces and applications aimed at space-time scenes
CN104197950A (en) * 2014-08-19 2014-12-10 奇瑞汽车股份有限公司 Geographic information display method and system
CN104537401A (en) * 2014-12-19 2015-04-22 南京大学 Reality augmentation system and working method based on technologies of radio frequency identification and depth of field sensor
CN104570354A (en) * 2015-01-09 2015-04-29 京东方科技集团股份有限公司 Interactive glasses and visitor guide system
CN104915581A (en) * 2015-01-09 2015-09-16 中华电信股份有限公司 Augmented reality unlocking system and method
CN105183173A (en) * 2015-10-12 2015-12-23 重庆中电大宇卫星应用技术研究所 Device for typing and converting tactical gestures and Morse codes into voice
CN105319714A (en) * 2014-07-31 2016-02-10 精工爱普生株式会社 Display apparatus, method for controlling display apparatus, and program
WO2016059530A1 (en) * 2014-10-15 2016-04-21 在地实验文化事业有限公司 Guiding system and method
CN105527709A (en) * 2014-10-15 2016-04-27 通用汽车环球科技运作有限责任公司 Systems and methods for adjusting features within a head-up display
CN105759422A (en) * 2015-01-06 2016-07-13 精工爱普生株式会社 Display System And Control Method For Display Device
CN106066701A (en) * 2016-07-05 2016-11-02 成都福兰特电子技术股份有限公司 A kind of AR and VR data handling equipment and method
CN106066537A (en) * 2015-04-24 2016-11-02 松下电器(美国)知识产权公司 Head mounted display and the control method of head mounted display
CN106125932A (en) * 2016-06-28 2016-11-16 广东欧珀移动通信有限公司 A method, device, and mobile terminal for identifying target objects in augmented reality
CN106157363A (en) * 2016-06-28 2016-11-23 广东欧珀移动通信有限公司 A camera method, device and mobile terminal based on augmented reality
CN106155315A (en) * 2016-06-28 2016-11-23 广东欧珀移动通信有限公司 Method, device and mobile terminal for adding augmented reality effect in shooting
CN106200892A (en) * 2014-10-30 2016-12-07 联发科技股份有限公司 Virtual reality system, mobile device, wearable device and entry event processing method
CN106354257A (en) * 2016-08-30 2017-01-25 湖北睛彩视讯科技有限公司 Mobile scene fusion system and method based on augmented reality technology
CN106682468A (en) * 2016-12-30 2017-05-17 百度在线网络技术(北京)有限公司 Method of unlocking electronic device and electronic device
CN107003179A (en) * 2014-11-26 2017-08-01 三星电子株式会社 Ultrasonic sensor and object detection method thereof
CN107340871A (en) * 2017-07-25 2017-11-10 深识全球创新科技(北京)有限公司 The devices and methods therefor and purposes of integrated gesture identification and ultrasonic wave touch feedback
CN107407977A (en) * 2015-03-05 2017-11-28 索尼公司 Message processing device, control method and program
CN107408026A (en) * 2015-03-31 2017-11-28 索尼公司 Message processing device, information processing method and computer program
CN107635057A (en) * 2017-07-31 2018-01-26 努比亚技术有限公司 A kind of virtual reality terminal control method, terminal and computer-readable recording medium
CN108351695A (en) * 2015-11-06 2018-07-31 Bsh家用电器有限公司 System and method for the operation for simplifying household appliance
CN108885533A (en) * 2016-12-21 2018-11-23 杰创科科技有限公司 Combining virtual reality and augmented reality
CN109076164A (en) * 2016-04-18 2018-12-21 月光产业股份有限公司 Focus pulling is carried out by means of the range information from auxiliary camera system
CN109348003A (en) * 2018-09-17 2019-02-15 深圳市泰衡诺科技有限公司 Application control method and device
CN109782639A (en) * 2018-12-29 2019-05-21 深圳市中孚能电气设备有限公司 The control method and control device of a kind of electronic equipment operating mode
US10317215B2 (en) 2015-01-09 2019-06-11 Boe Technology Group Co., Ltd. Interactive glasses and navigation system
CN110109547A (en) * 2019-05-05 2019-08-09 芋头科技(杭州)有限公司 Order Activiation method and system based on gesture identification
CN110209263A (en) * 2018-02-28 2019-09-06 联想(新加坡)私人有限公司 Information processing method, information processing equipment and device-readable storage medium
CN110520899A (en) * 2017-04-14 2019-11-29 微软技术许可有限责任公司 The position of the label of mark in the environment
CN110582741A (en) * 2017-03-21 2019-12-17 Pcms控股公司 Method and system for detection and enhancement of haptic interactions in augmented reality
CN110622219A (en) * 2017-03-10 2019-12-27 杰创科增强现实有限公司 Interactive augmented reality
CN110942518A (en) * 2018-09-24 2020-03-31 苹果公司 Contextual computer-generated reality (CGR) digital assistant
CN111149079A (en) * 2018-08-24 2020-05-12 谷歌有限责任公司 Smart phone, system and method including radar system
CN111194423A (en) * 2017-10-09 2020-05-22 脸谱科技有限责任公司 Head Mounted Display Tracking System
CN111273766A (en) * 2018-12-04 2020-06-12 苹果公司 Method, apparatus and system for generating an affordance linked to a simulated reality representation of an item
CN111538405A (en) * 2019-02-07 2020-08-14 株式会社美凯利 Information processing method, terminal and non-transitory computer readable storage medium
CN111624770A (en) * 2015-04-15 2020-09-04 索尼互动娱乐股份有限公司 Pinch and hold gesture navigation on head mounted display
US10890653B2 (en) 2018-08-22 2021-01-12 Google Llc Radar-based gesture enhancement for voice interfaces
US10930251B2 (en) 2018-08-22 2021-02-23 Google Llc Smartphone-based radar system for facilitating awareness of user presence and orientation
CN112424730A (en) * 2018-07-17 2021-02-26 苹果公司 Computer system with finger device
CN112585566A (en) * 2019-01-31 2021-03-30 华为技术有限公司 Hand-covering face input sensing for interacting with device having built-in camera
CN112783700A (en) * 2019-11-08 2021-05-11 富士施乐株式会社 Computer readable medium for network-based remote assistance system
CN112822992A (en) * 2018-10-05 2021-05-18 脸谱科技有限责任公司 Providing enhanced interaction with physical objects using neuromuscular signals in augmented reality environments
CN113141529A (en) * 2021-04-25 2021-07-20 聚好看科技股份有限公司 Display device and media asset playing method
CN113190110A (en) * 2021-03-30 2021-07-30 青岛小鸟看看科技有限公司 Interface element control method and device of head-mounted display equipment
CN114245909A (en) * 2019-05-21 2022-03-25 奇跃公司 Caching and updating of dense 3D reconstruction data
US11314312B2 (en) 2018-10-22 2022-04-26 Google Llc Smartphone-based radar system for determining user intention in a lower-power mode
TWI777333B (en) * 2019-12-20 2022-09-11 大陸商北京外號信息技術有限公司 Method and electronic device for setting spatial positions of a virtual object
CN115309271A (en) * 2022-09-29 2022-11-08 南方科技大学 Mixed reality-based information display method, device, device and storage medium
US12554325B2 (en) 2022-05-10 2026-02-17 Meta Platforms Technologies, Llc Methods and apparatuses for low latency body state prediction based on neuromuscular data

Families Citing this family (214)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9865125B2 (en) 2010-11-15 2018-01-09 Bally Gaming, Inc. System and method for augmented reality gaming
CN115167675A (en) 2011-09-19 2022-10-11 视力移动技术有限公司 Augmented reality device
WO2013093837A1 (en) * 2011-12-23 2013-06-27 Koninklijke Philips Electronics N.V. Method and apparatus for interactive display of three dimensional ultrasound images
US11068049B2 (en) 2012-03-23 2021-07-20 Microsoft Technology Licensing, Llc Light guide display and field of view
US11169611B2 (en) * 2012-03-26 2021-11-09 Apple Inc. Enhanced virtual touchpad
US9558590B2 (en) 2012-03-28 2017-01-31 Microsoft Technology Licensing, Llc Augmented reality light guide display
US10191515B2 (en) 2012-03-28 2019-01-29 Microsoft Technology Licensing, Llc Mobile device light guide display
US9717981B2 (en) 2012-04-05 2017-08-01 Microsoft Technology Licensing, Llc Augmented reality and physical games
US10502876B2 (en) 2012-05-22 2019-12-10 Microsoft Technology Licensing, Llc Waveguide optics focus elements
TWI475474B (en) * 2012-07-30 2015-03-01 Mitac Int Corp Gesture combined with the implementation of the icon control method
KR102001218B1 (en) * 2012-11-02 2019-07-17 삼성전자주식회사 Method and device for providing information regarding the object
US10192358B2 (en) 2012-12-20 2019-01-29 Microsoft Technology Licensing, Llc Auto-stereoscopic augmented reality display
US9770203B1 (en) 2013-01-19 2017-09-26 Bertec Corporation Force measurement system and a method of testing a subject
US10856796B1 (en) 2013-01-19 2020-12-08 Bertec Corporation Force measurement system
US10646153B1 (en) 2013-01-19 2020-05-12 Bertec Corporation Force measurement system
US10413230B1 (en) 2013-01-19 2019-09-17 Bertec Corporation Force measurement system
US10231662B1 (en) 2013-01-19 2019-03-19 Bertec Corporation Force measurement system
US10010286B1 (en) 2013-01-19 2018-07-03 Bertec Corporation Force measurement system
US11052288B1 (en) 2013-01-19 2021-07-06 Bertec Corporation Force measurement system
US11311209B1 (en) 2013-01-19 2022-04-26 Bertec Corporation Force measurement system and a motion base used therein
US11540744B1 (en) 2013-01-19 2023-01-03 Bertec Corporation Force measurement system
US12161477B1 (en) 2013-01-19 2024-12-10 Bertec Corporation Force measurement system
US9526443B1 (en) * 2013-01-19 2016-12-27 Bertec Corporation Force and/or motion measurement system and a method of testing a subject
US11857331B1 (en) 2013-01-19 2024-01-02 Bertec Corporation Force measurement system
US10133342B2 (en) * 2013-02-14 2018-11-20 Qualcomm Incorporated Human-body-gesture-based region and volume selection for HMD
EP2960867A4 (en) * 2013-02-21 2016-08-03 Fujitsu Ltd DISPLAY DEVICE, METHOD, PROGRAM, AND POSITION ADJUSTMENT SYSTEM
US20140240226A1 (en) * 2013-02-27 2014-08-28 Robert Bosch Gmbh User Interface Apparatus
US9122916B2 (en) * 2013-03-14 2015-09-01 Honda Motor Co., Ltd. Three dimensional fingertip tracking
US20140285520A1 (en) * 2013-03-22 2014-09-25 Industry-University Cooperation Foundation Hanyang University Wearable display device using augmented reality
US9507426B2 (en) * 2013-03-27 2016-11-29 Google Inc. Using the Z-axis in user interfaces for head mountable displays
US9213403B1 (en) 2013-03-27 2015-12-15 Google Inc. Methods to pan, zoom, crop, and proportionally move on a head mountable display
JP6108926B2 (en) * 2013-04-15 2017-04-05 オリンパス株式会社 Wearable device, program, and display control method for wearable device
US20140094148A1 (en) 2013-05-08 2014-04-03 Vringo Infrastructure Inc. Cognitive Radio System And Cognitive Radio Carrier Device
GB2513884B (en) 2013-05-08 2015-06-17 Univ Bristol Method and apparatus for producing an acoustic field
US9672627B1 (en) * 2013-05-09 2017-06-06 Amazon Technologies, Inc. Multiple camera based motion tracking
EP2818948B1 (en) * 2013-06-27 2016-11-16 ABB Schweiz AG Method and data presenting device for assisting a remote user to provide instructions
US10295338B2 (en) 2013-07-12 2019-05-21 Magic Leap, Inc. Method and system for generating map data from an image
US20150124566A1 (en) 2013-10-04 2015-05-07 Thalmic Labs Inc. Systems, articles and methods for wearable electronic devices employing contact sensors
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US10042422B2 (en) 2013-11-12 2018-08-07 Thalmic Labs Inc. Systems, articles, and methods for capacitive electromyography sensors
US10188309B2 (en) 2013-11-27 2019-01-29 North Inc. Systems, articles, and methods for electromyography sensors
US12504816B2 (en) 2013-08-16 2025-12-23 Meta Platforms Technologies, Llc Wearable devices and associated band structures for sensing neuromuscular signals using sensor pairs in respective pods with communicative pathways to a common processor
KR102157313B1 (en) * 2013-09-03 2020-10-23 삼성전자주식회사 Method and computer readable recording medium for recognizing an object using a captured image
KR102165818B1 (en) * 2013-09-10 2020-10-14 삼성전자주식회사 Method, apparatus and recovering medium for controlling user interface using a input image
JP5877824B2 (en) * 2013-09-20 2016-03-08 ヤフー株式会社 Information processing system, information processing method, and information processing program
KR102119659B1 (en) 2013-09-23 2020-06-08 엘지전자 주식회사 Display device and control method thereof
CN103501473B (en) * 2013-09-30 2016-03-09 陈创举 Based on multifunctional headphone and the control method thereof of MEMS sensor
KR101499044B1 (en) * 2013-10-07 2015-03-11 홍익대학교 산학협력단 Wearable computer obtaining text based on gesture and voice of user and method of obtaining the text
US9740935B2 (en) * 2013-11-26 2017-08-22 Honeywell International Inc. Maintenance assistant system
US9671826B2 (en) * 2013-11-27 2017-06-06 Immersion Corporation Method and apparatus of body-mediated digital content transfer and haptic feedback
US10586395B2 (en) 2013-12-30 2020-03-10 Daqri, Llc Remote object detection and local tracking using visual odometry
US9264479B2 (en) 2013-12-30 2016-02-16 Daqri, Llc Offloading augmented reality processing
EP2899609B1 (en) * 2014-01-24 2019-04-17 Sony Corporation System and method for name recollection
DE102014201578A1 (en) * 2014-01-29 2015-07-30 Volkswagen Ag Device and method for signaling an input area for gesture recognition of a human-machine interface
US20150227231A1 (en) * 2014-02-12 2015-08-13 Microsoft Corporation Virtual Transparent Display
KR20150110032A (en) * 2014-03-24 2015-10-02 삼성전자주식회사 Electronic Apparatus and Method for Image Data Processing
WO2015161062A1 (en) * 2014-04-18 2015-10-22 Bally Gaming, Inc. System and method for augmented reality gaming
US9501871B2 (en) 2014-04-30 2016-11-22 At&T Mobility Ii Llc Explorable augmented reality displays
TWI518603B (en) 2014-05-22 2016-01-21 宏達國際電子股份有限公司 Image editing method and electronic device
US10600245B1 (en) 2014-05-28 2020-03-24 Lucasfilm Entertainment Company Ltd. Navigating a virtual environment of a media content item
KR102303115B1 (en) * 2014-06-05 2021-09-16 삼성전자 주식회사 Method For Providing Augmented Reality Information And Wearable Device Using The Same
KR101595957B1 (en) * 2014-06-12 2016-02-18 엘지전자 주식회사 Mobile terminal and controlling system
EP3180676A4 (en) * 2014-06-17 2018-01-10 Osterhout Group, Inc. External user interface for head worn computing
JP6500355B2 (en) * 2014-06-20 2019-04-17 富士通株式会社 Display device, display program, and display method
US20150379770A1 (en) * 2014-06-27 2015-12-31 David C. Haley, JR. Digital action in response to object interaction
JP6638195B2 (en) * 2015-03-02 2020-01-29 セイコーエプソン株式会社 DISPLAY DEVICE, DISPLAY DEVICE CONTROL METHOD, AND PROGRAM
US9696551B2 (en) * 2014-08-13 2017-07-04 Beijing Lenovo Software Ltd. Information processing method and electronic device
US9690375B2 (en) 2014-08-18 2017-06-27 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US9910504B2 (en) * 2014-08-21 2018-03-06 Samsung Electronics Co., Ltd. Sensor based UI in HMD incorporating light turning element
JP5989725B2 (en) * 2014-08-29 2016-09-07 京セラドキュメントソリューションズ株式会社 Electronic device and information display program
DE102014217843A1 (en) * 2014-09-05 2016-03-10 Martin Cudzilo Apparatus for facilitating the cleaning of surfaces and methods for detecting cleaning work done
GB2530036A (en) 2014-09-09 2016-03-16 Ultrahaptics Ltd Method and apparatus for modulating haptic feedback
WO2016071244A2 (en) * 2014-11-06 2016-05-12 Koninklijke Philips N.V. Method and system of communication for use in hospitals
EP3236335A4 (en) 2014-12-17 2018-07-25 Konica Minolta, Inc. Electronic instrument, method of controlling electronic instrument, and control program for same
US9658693B2 (en) * 2014-12-19 2017-05-23 Immersion Corporation Systems and methods for haptically-enabled interactions with objects
US9600076B2 (en) * 2014-12-19 2017-03-21 Immersion Corporation Systems and methods for object manipulation with haptic feedback
US9685005B2 (en) * 2015-01-02 2017-06-20 Eon Reality, Inc. Virtual lasers for interacting with augmented reality environments
JP2016133541A (en) * 2015-01-16 2016-07-25 株式会社ブリリアントサービス Electronic spectacle and method for controlling the same
US10317677B2 (en) 2015-02-09 2019-06-11 Microsoft Technology Licensing, Llc Display system
US10018844B2 (en) 2015-02-09 2018-07-10 Microsoft Technology Licensing, Llc Wearable image display system
US20170061700A1 (en) * 2015-02-13 2017-03-02 Julian Michael Urbach Intercommunication between a head mounted display and a real world object
JP6771473B2 (en) 2015-02-20 2020-10-21 ウルトラハプティクス アイピー リミテッドUltrahaptics Ip Ltd Improved algorithm in the tactile system
US9886633B2 (en) * 2015-02-23 2018-02-06 Vivint, Inc. Techniques for identifying and indexing distinguishing features in a video feed
US20160292920A1 (en) * 2015-04-01 2016-10-06 Caterpillar Inc. Time-Shift Controlled Visualization of Worksite Operations
US10055888B2 (en) 2015-04-28 2018-08-21 Microsoft Technology Licensing, Llc Producing and consuming metadata within multi-dimensional data
DE102015211515A1 (en) * 2015-06-23 2016-12-29 Siemens Aktiengesellschaft Interaction system
US10409443B2 (en) * 2015-06-24 2019-09-10 Microsoft Technology Licensing, Llc Contextual cursor display based on hand tracking
US10156726B2 (en) * 2015-06-29 2018-12-18 Microsoft Technology Licensing, Llc Graphene in optical systems
US10818162B2 (en) 2015-07-16 2020-10-27 Ultrahaptics Ip Ltd Calibration techniques in haptic systems
CN105138763A (en) * 2015-08-19 2015-12-09 中山大学 Method for real scene and reality information superposition in augmented reality
CN112557676B (en) * 2015-08-25 2025-04-29 株式会社日立高新技术 Marking method
CN105205454A (en) * 2015-08-27 2015-12-30 深圳市国华识别科技开发有限公司 System and method for capturing target object automatically
KR102456597B1 (en) * 2015-09-01 2022-10-20 삼성전자주식회사 Electronic apparatus and operating method thereof
KR101708455B1 (en) * 2015-09-08 2017-02-21 엠더블유엔테크 주식회사 Hand Float Menu System
CN113220116A (en) 2015-10-20 2021-08-06 奇跃公司 System and method for changing user input mode of wearable device and wearable system
CN105872815A (en) * 2015-11-25 2016-08-17 乐视网信息技术(北京)股份有限公司 Video playing method and device
EP3182328A1 (en) * 2015-12-17 2017-06-21 Nokia Technologies Oy A method, apparatus or computer program for controlling image processing of a captured image of a scene to adapt the captured image
US9697648B1 (en) 2015-12-23 2017-07-04 Intel Corporation Text functions in augmented reality
JP2017129406A (en) * 2016-01-19 2017-07-27 日本電気通信システム株式会社 Information processing device, smart glass and control method thereof, and computer program
CN105843390B (en) * 2016-02-24 2019-03-19 上海理湃光晶技术有限公司 A method of image scaling and AR glasses based on the method
US10168768B1 (en) 2016-03-02 2019-01-01 Meta Company Systems and methods to facilitate interactions in an interactive space
US10133345B2 (en) 2016-03-22 2018-11-20 Microsoft Technology Licensing, Llc Virtual-reality navigation
US9933855B2 (en) * 2016-03-31 2018-04-03 Intel Corporation Augmented reality in a field of view including a reflection
AU2017244109B2 (en) 2016-03-31 2022-06-23 Magic Leap, Inc. Interactions with 3D virtual objects using poses and multiple-DOF controllers
US10186088B2 (en) 2016-05-13 2019-01-22 Meta Company System and method for managing interactive virtual frames for virtual objects in a virtual environment
US9990779B2 (en) 2016-05-13 2018-06-05 Meta Company System and method for modifying virtual objects in a virtual environment in response to user interactions
ES2643863B1 (en) * 2016-05-24 2018-10-26 Sonovisión Ingenieros España, S.A.U. METHOD FOR PROVIDING BY GUIDED INCREASED REALITY, INSPECTION AND SUPPORT IN INSTALLATION OR MAINTENANCE OF PROCESSES FOR COMPLEX ASSEMBLIES COMPATIBLE WITH S1000D AND DEVICE THAT MAKES SAME USE
CN105915715A (en) * 2016-05-25 2016-08-31 努比亚技术有限公司 Incoming call reminding method and device thereof, wearable audio device and mobile terminal
WO2017217752A1 (en) * 2016-06-17 2017-12-21 이철윤 System and method for generating three dimensional composite image of product and packing box
KR20180009170A (en) * 2016-07-18 2018-01-26 엘지전자 주식회사 Mobile terminal and operating method thereof
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US10990174B2 (en) 2016-07-25 2021-04-27 Facebook Technologies, Llc Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
EP3487595A4 (en) 2016-07-25 2019-12-25 CTRL-Labs Corporation SYSTEM AND METHOD FOR MEASURING MOVEMENTS OF ARTICULATED RIGID BODIES
EP3487402B1 (en) 2016-07-25 2021-05-05 Facebook Technologies, LLC Methods and apparatus for inferring user intent based on neuromuscular signals
US11179066B2 (en) 2018-08-13 2021-11-23 Facebook Technologies, Llc Real-time spike detection and identification
US11635736B2 (en) 2017-10-19 2023-04-25 Meta Platforms Technologies, Llc Systems and methods for identifying biological structures associated with neuromuscular source signals
WO2020112986A1 (en) 2018-11-27 2020-06-04 Facebook Technologies, Inc. Methods and apparatus for autocalibration of a wearable electrode sensor system
US10268275B2 (en) 2016-08-03 2019-04-23 Ultrahaptics Ip Ltd Three-dimensional perceptions in haptic systems
CN106980362A (en) 2016-10-09 2017-07-25 阿里巴巴集团控股有限公司 Input method and device based on virtual reality scenario
US11119585B2 (en) 2016-10-13 2021-09-14 Ford Motor Company Dual-mode augmented reality interfaces for mobile devices
US10257558B2 (en) * 2016-10-26 2019-04-09 Orcam Technologies Ltd. Systems and methods for constructing and indexing a database of joint profiles for persons viewed by multiple wearable apparatuses
JP2018082363A (en) * 2016-11-18 2018-05-24 セイコーエプソン株式会社 Head-mounted display device and method for controlling the same, and computer program
WO2018100575A1 (en) 2016-11-29 2018-06-07 Real View Imaging Ltd. Tactile feedback in a display system
US11507216B2 (en) * 2016-12-23 2022-11-22 Realwear, Inc. Customizing user interfaces of binary applications
US10620910B2 (en) 2016-12-23 2020-04-14 Realwear, Inc. Hands-free navigation of touch-based operating systems
US11099716B2 (en) 2016-12-23 2021-08-24 Realwear, Inc. Context based content navigation for wearable display
USD864959S1 (en) 2017-01-04 2019-10-29 Mentor Acquisition One, Llc Computer glasses
US10620779B2 (en) * 2017-04-24 2020-04-14 Microsoft Technology Licensing, Llc Navigating a holographic image
US10481755B1 (en) * 2017-04-28 2019-11-19 Meta View, Inc. Systems and methods to present virtual content in an interactive space
US11054894B2 (en) 2017-05-05 2021-07-06 Microsoft Technology Licensing, Llc Integrated mixed-input system
CN111033444B (en) 2017-05-10 2024-03-05 优玛尼股份有限公司 Wearable multimedia devices and cloud computing platform with application ecosystem
US12230029B2 (en) * 2017-05-10 2025-02-18 Humane, Inc. Wearable multimedia device and cloud computing platform with laser projection system
US11023109B2 (en) 2017-06-30 2021-06-01 Microsoft Techniogy Licensing, LLC Annotation using a multi-device mixed interactivity system
US10895966B2 (en) 2017-06-30 2021-01-19 Microsoft Technology Licensing, Llc Selection using a multi-device mixed interactivity system
WO2019021447A1 (en) * 2017-07-28 2019-01-31 株式会社オプティム Wearable terminal display system, wearable terminal display method and program
WO2019021446A1 (en) * 2017-07-28 2019-01-31 株式会社オプティム Wearable terminal display system, wearable terminal display method and program
US10591730B2 (en) * 2017-08-25 2020-03-17 II Jonathan M. Rodriguez Wristwatch based interface for augmented reality eyewear
US10068403B1 (en) 2017-09-21 2018-09-04 Universal City Studios Llc Locker management techniques
US20190129607A1 (en) * 2017-11-02 2019-05-02 Samsung Electronics Co., Ltd. Method and device for performing remote control
JP2019086916A (en) * 2017-11-02 2019-06-06 オリンパス株式会社 Work support device, work support method, and work support program
US11531395B2 (en) 2017-11-26 2022-12-20 Ultrahaptics Ip Ltd Haptic effects from focused acoustic fields
WO2019123762A1 (en) * 2017-12-22 2019-06-27 ソニー株式会社 Information processing device, information processing method, and program
EP3729418B1 (en) 2017-12-22 2024-11-20 Ultrahaptics Ip Ltd Minimizing unwanted responses in haptic systems
WO2019122912A1 (en) 2017-12-22 2019-06-27 Ultrahaptics Limited Tracking in haptic systems
US10739861B2 (en) * 2018-01-10 2020-08-11 Facebook Technologies, Llc Long distance interaction with artificial reality objects using a near eye display interface
US11150730B1 (en) 2019-04-30 2021-10-19 Facebook Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
WO2019147956A1 (en) 2018-01-25 2019-08-01 Ctrl-Labs Corporation Visualization of reconstructed handstate information
US11907423B2 (en) * 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
US11961494B1 (en) 2019-03-29 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments
US20190324549A1 (en) * 2018-04-20 2019-10-24 Immersion Corporation Systems, devices, and methods for providing immersive reality interface modes
MX2020011492A (en) 2018-05-02 2021-03-25 Ultrahaptics Ip Ltd Blocking plate structure for improved acoustic transmission efficiency.
US20190339837A1 (en) * 2018-05-04 2019-11-07 Oculus Vr, Llc Copy and Paste in a Virtual Reality Environment
US10592001B2 (en) 2018-05-08 2020-03-17 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US10768426B2 (en) 2018-05-21 2020-09-08 Microsoft Technology Licensing, Llc Head mounted display system receiving three-dimensional push notification
EP3801216A1 (en) 2018-05-29 2021-04-14 Facebook Technologies, LLC. Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
CN112585600A (en) 2018-06-14 2021-03-30 脸谱科技有限责任公司 User identification and authentication using neuromuscular signatures
JP7056423B2 (en) * 2018-07-10 2022-04-19 Omron Corporation Input device
WO2020018892A1 (en) 2018-07-19 2020-01-23 Ctrl-Labs Corporation Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device
US10909762B2 (en) 2018-08-24 2021-02-02 Microsoft Technology Licensing, Llc Gestures for facilitating interaction with pages in a mixed reality environment
EP4241661A1 (en) 2018-08-31 2023-09-13 Facebook Technologies, LLC Camera-guided interpretation of neuromuscular signals
WO2020061451A1 (en) * 2018-09-20 2020-03-26 Ctrl-Labs Corporation Neuromuscular text entry, writing and drawing in augmented reality systems
US10921764B2 (en) 2018-09-26 2021-02-16 Facebook Technologies, Llc Neuromuscular control of physical objects in an environment
KR102620702B1 (en) * 2018-10-12 2024-01-04 Samsung Electronics Co., Ltd. A mobile apparatus and a method for controlling the mobile apparatus
US10929099B2 (en) * 2018-11-02 2021-02-23 Bose Corporation Spatialized virtual personal assistant
US10789952B2 (en) * 2018-12-20 2020-09-29 Microsoft Technology Licensing, Llc Voice command execution from auxiliary input
US12373033B2 (en) 2019-01-04 2025-07-29 Ultrahaptics Ip Ltd Mid-air haptic textures
WO2020152828A1 (en) * 2019-01-24 2020-07-30 Maxell, Ltd. Display terminal, application control system and application control method
US10905383B2 (en) 2019-02-28 2021-02-02 Facebook Technologies, Llc Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces
US11842517B2 (en) 2019-04-12 2023-12-12 Ultrahaptics Ip Ltd Using iterative 3D-model fitting for domain adaptation of a hand-pose-estimation neural network
JP7331462B2 (en) * 2019-05-24 2023-08-23 Kyocera Document Solutions Inc. Robot system, robot control method, and electronic device
US10747371B1 (en) * 2019-06-28 2020-08-18 Konica Minolta Business Solutions U.S.A., Inc. Detection of finger press from live video stream
USD1009884S1 (en) * 2019-07-26 2024-01-02 Sony Corporation Mixed reality eyeglasses or portion thereof with an animated graphical user interface
JP2021026260A (en) 2019-07-31 2021-02-22 Seiko Epson Corporation Display unit, display method, and computer program
US10909767B1 (en) * 2019-08-01 2021-02-02 International Business Machines Corporation Focal and interaction driven content replacement into augmented reality
US12229341B2 (en) 2019-09-23 2025-02-18 Apple Inc. Finger-mounted input devices
US11275453B1 (en) 2019-09-30 2022-03-15 Snap Inc. Smart ring for manipulating virtual objects displayed by a wearable device
US11374586B2 (en) 2019-10-13 2022-06-28 Ultraleap Limited Reducing harmonic distortion by dithering
US20210116249A1 (en) * 2019-10-16 2021-04-22 The Board Of Trustees Of The California State University Augmented reality marine navigation
US12089953B1 (en) 2019-12-04 2024-09-17 Meta Platforms Technologies, Llc Systems and methods for utilizing intrinsic current noise to measure interface impedances
US11715453B2 (en) 2019-12-25 2023-08-01 Ultraleap Limited Acoustic transducer structures
CN115244263A (en) * 2020-02-28 2022-10-25 NEC Corporation Locker system, locker management method, and storage medium
US11277597B1 (en) 2020-03-31 2022-03-15 Snap Inc. Marker-based guided AR experience
US11798429B1 (en) 2020-05-04 2023-10-24 Snap Inc. Virtual tutorials for musical instruments with finger tracking in augmented reality
US11520399B2 (en) 2020-05-26 2022-12-06 Snap Inc. Interactive augmented reality experiences using positional tracking
US11816267B2 (en) 2020-06-23 2023-11-14 Ultraleap Limited Features of airborne ultrasonic fields
JP2022022568A (en) * 2020-06-26 2022-02-07 Oki Electric Industry Co., Ltd. Display operation unit and device
JP7515590B2 (en) * 2020-07-08 2024-07-12 Maxell, Ltd. Information processing terminal, remote control method and program
WO2022058738A1 (en) 2020-09-17 2022-03-24 Ultraleap Limited Ultrahapticons
US11925863B2 (en) 2020-09-18 2024-03-12 Snap Inc. Tracking hand gestures for interactive game control in augmented reality
US11546505B2 (en) * 2020-09-28 2023-01-03 Snap Inc. Touchless photo capture in response to detected hand gestures
US11644902B2 (en) * 2020-11-30 2023-05-09 Google Llc Gesture-based content transfer
WO2022146678A1 (en) 2020-12-29 2022-07-07 Snap Inc. Micro hand gestures for controlling virtual and graphical elements
WO2022146673A1 (en) 2020-12-30 2022-07-07 Snap Inc. Augmented reality precision tracking and display
US11740313B2 (en) 2020-12-30 2023-08-29 Snap Inc. Augmented reality precision tracking and display
US11531402B1 (en) 2021-02-25 2022-12-20 Snap Inc. Bimanual gestures for controlling virtual and graphical elements
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
WO2022216784A1 (en) 2021-04-08 2022-10-13 Snap Inc. Bimanual interactions between mapped hand regions for controlling virtual and graphical elements
EP4280599A4 (en) * 2021-04-09 2024-07-17 Samsung Electronics Co., Ltd. Portable electronic device with multiple cameras
WO2022225761A1 (en) 2021-04-19 2022-10-27 Snap Inc. Hand gestures for animating and controlling virtual and graphical elements
US11435857B1 (en) * 2021-04-29 2022-09-06 Google Llc Content access and navigation using a head-mounted device
US11995899B2 (en) * 2021-04-29 2024-05-28 Google Llc Pointer-based content recognition using a head-mounted device
US12517585B2 (en) 2021-07-15 2026-01-06 Ultraleap Limited Control point manipulation techniques in haptic systems
WO2023283934A1 (en) * 2021-07-16 2023-01-19 Huawei Technologies Co.,Ltd. Devices and methods for gesture-based selection
KR102629771B1 (en) * 2021-09-30 2024-01-29 박두고 Wearable device for recognizing objects using hand or finger tracking
US11967147B2 (en) * 2021-10-01 2024-04-23 AT&T Intellectual Property I, L.P. Augmented reality visualization of enclosed spaces
CN114089879B (en) * 2021-11-15 2022-08-05 北京灵犀微光科技有限公司 Cursor control method of augmented reality display equipment
US12405661B2 (en) * 2022-01-10 2025-09-02 Apple Inc. Devices and methods for controlling electronic devices or systems with physical objects
US12265663B2 (en) * 2022-04-04 2025-04-01 Snap Inc. Gesture-based application invocation
US12282607B2 (en) 2022-04-27 2025-04-22 Snap Inc. Fingerspelling text entry
KR102703511B1 (en) * 2022-12-29 2024-09-06 Seoul National University of Science and Technology Industry-Academic Cooperation Foundation System for generating virtual space using level of detail of object
US20250321630A1 (en) * 2024-04-10 2025-10-16 Meta Platforms Technologies, Llc Single-Handed Mode for an Artificial Reality System

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6545663B1 (en) * 1999-04-19 2003-04-08 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method and input device for controlling the position of an object to be graphically displayed in virtual reality
US20040061831A1 (en) * 2002-09-27 2004-04-01 The Boeing Company Gaze tracking system, eye-tracking assembly and an associated method of calibration
US6771294B1 (en) * 1999-12-29 2004-08-03 Petri Pulli User interface
CN1815473A (en) * 2005-02-01 2006-08-09 Microsoft Corporation Referencing objects in a virtual environment
JP2007115167A (en) * 2005-10-24 2007-05-10 Sanyo Electric Co Ltd Information processing system and portable information terminal
CN101141611A (en) * 2006-09-06 2008-03-12 International Business Machines Corporation Method and system for informing a user of gestures made by others out of the user's line of sight
CN101262830A (en) * 2005-07-20 2008-09-10 Bracco Imaging S.p.A. Method and system for mapping a virtual model of an object to an object
CN101300621A (en) * 2005-09-13 2008-11-05 SpaceTime3D, Inc. System and method for providing a three-dimensional graphical user interface
KR20090000186A (en) * 2007-01-25 2009-01-07 Samsung Electronics Co., Ltd. Apparatus and method for displaying points of interest using augmented reality
CN101375599A (en) * 2004-06-01 2009-02-25 L-3 Communications Corporation Method and system for performing video flashlight
CN101430601A (en) * 2007-10-01 2009-05-13 Apple Inc. Mobile-based interface for personal media device
CN101520690A (en) * 2006-09-08 2009-09-02 Sony Corporation Image capturing and displaying apparatus and image capturing and displaying method
CN101542584A (en) * 2006-10-16 2009-09-23 Sony Corporation Display device, display method
CN101667098A (en) * 2005-12-30 2010-03-10 Apple Inc. Portable electronic device with interface reconfiguration mode
US20100125812A1 (en) * 2008-11-17 2010-05-20 Honeywell International Inc. Method and apparatus for marking a position of a real world object in a see-through display
US20110059759A1 (en) * 2009-09-07 2011-03-10 Samsung Electronics Co., Ltd. Method and apparatus for providing POI information in portable terminal
CN102147926A (en) * 2011-01-17 2011-08-10 ZTE Corporation Three-dimensional (3D) icon processing method and device and mobile terminal
US20110213664A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece

Family Cites Families (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0981309A (en) * 1995-09-13 1997-03-28 Toshiba Corp Input device
JP3365246B2 (en) 1997-03-14 2003-01-08 Minolta Co., Ltd. Electronic still camera
JP3225882B2 (en) * 1997-03-27 2001-11-05 Nippon Telegraph and Telephone Corporation Landscape labeling system
AU7651100A (en) 1999-09-15 2001-04-17 Roche Consumer Health Ag Pharmaceutical and/or cosmetical compositions
SE0000850D0 (en) * 2000-03-13 2000-03-13 Pink Solution Ab Recognition arrangement
CA2410427A1 (en) * 2000-05-29 2001-12-06 Vkb Inc. Virtual data entry device and method for input of alphanumeric and other data
JP2002157606A (en) * 2000-11-17 2002-05-31 Canon Inc Image display control device, mixed reality presentation system, image display control method, and medium providing processing program
US7215322B2 (en) 2001-05-31 2007-05-08 Siemens Corporate Research, Inc. Input devices for augmented reality applications
US7126558B1 (en) 2001-10-19 2006-10-24 Accenture Global Services Gmbh Industrial augmented reality
AU2003217587A1 (en) * 2002-02-15 2003-09-09 Canesta, Inc. Gesture recognition system using depth perceptive sensors
US7676079B2 (en) * 2003-09-30 2010-03-09 Canon Kabushiki Kaisha Index identification method and apparatus
IL161002A0 (en) 2004-03-22 2004-08-31 Itay Katz Virtual video keyboard system
CN1304931C (en) * 2005-01-27 2007-03-14 Beijing Institute of Technology Head-mounted stereo-vision gesture recognition device
KR100687737B1 (en) * 2005-03-19 2007-02-27 한국전자통신연구원 Virtual Mouse Device and Method Based on Two-Hand Gesture
WO2008153599A1 (en) * 2006-12-07 2008-12-18 Adapx, Inc. Systems and methods for data annotation, recordation, and communication
JP4933406B2 (en) * 2007-11-15 2012-05-16 Canon Inc. Image processing apparatus and image processing method
US8165345B2 (en) * 2007-12-07 2012-04-24 Tom Chau Method, system, and computer program for detecting and characterizing motion
EP2258587A4 (en) * 2008-03-19 2013-08-07 Denso Corp Operation input device for vehicle
JP5250834B2 (en) * 2008-04-03 2013-07-31 Konica Minolta, Inc. Head-mounted image display device
WO2009128064A2 (en) * 2008-04-14 2009-10-22 Pointgrab Ltd. Vision based pointing device emulation
US8971565B2 (en) * 2008-05-29 2015-03-03 Hie-D Technologies, Llc Human interface electronic device
WO2010042880A2 (en) * 2008-10-10 2010-04-15 Neoflect, Inc. Mobile computing device with a virtual keyboard
CN101739122A (en) * 2008-11-24 2010-06-16 玴荣科技股份有限公司 Gesture Recognition and Tracking Method
US9041660B2 (en) * 2008-12-09 2015-05-26 Microsoft Technology Licensing, Llc Soft keyboard control
US9405970B2 (en) 2009-02-02 2016-08-02 Eyesight Mobile Technologies Ltd. System and method for object recognition and tracking in a video stream
CN102326133B (en) * 2009-02-20 2015-08-26 皇家飞利浦电子股份有限公司 The equipment of being provided for enters system, the method and apparatus of activity pattern
JP5304329B2 (en) * 2009-03-05 2013-10-02 Brother Industries, Ltd. Head mounted display device, image control method, and image control program
US8009022B2 (en) 2009-05-29 2011-08-30 Microsoft Corporation Systems and methods for immersive interaction with virtual objects
WO2010144050A1 (en) * 2009-06-08 2010-12-16 Agency For Science, Technology And Research Method and system for gesture based manipulation of a 3-dimensional image of object
US20110107216A1 (en) * 2009-11-03 2011-05-05 Qualcomm Incorporated Gesture-based user interface
JP4679661B1 (en) * 2009-12-15 2011-04-27 Toshiba Corporation Information presenting apparatus, information presenting method, and program
KR20110075250A (en) 2009-12-28 2011-07-06 LG Electronics Inc. Object tracking method and device using object tracking mode
CN102117117A (en) * 2010-01-06 2011-07-06 致伸科技股份有限公司 System and method for controlling user gestures by using image capture device
US8490002B2 (en) * 2010-02-11 2013-07-16 Apple Inc. Projected display shared workspaces
US20110214082A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Projection triggering through an external marker in an augmented reality eyepiece
KR20130000401A (en) * 2010-02-28 2013-01-02 오스터하우트 그룹 인코포레이티드 Local advertising content on an interactive head-mounted eyepiece
US9128281B2 (en) * 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US8788197B2 (en) 2010-04-30 2014-07-22 Ryan Fink Visual training devices, systems, and methods
US8593375B2 (en) 2010-07-23 2013-11-26 Gregory A Maltz Eye gaze user interface and method
JP5499985B2 (en) * 2010-08-09 2014-05-21 Sony Corporation Display assembly
US20120062602A1 (en) * 2010-09-13 2012-03-15 Nokia Corporation Method and apparatus for rendering a content display
US8941559B2 (en) 2010-09-21 2015-01-27 Microsoft Corporation Opacity filter for display device
US8768006B2 (en) * 2010-10-19 2014-07-01 Hewlett-Packard Development Company, L.P. Hand gesture recognition
US9336240B2 (en) * 2011-07-15 2016-05-10 Apple Inc. Geo-tagging digital images
US20130050069A1 (en) * 2011-08-23 2013-02-28 Sony Corporation, A Japanese Corporation Method and system for use in providing three dimensional user interface
CN115167675A (en) * 2011-09-19 2022-10-11 Eyesight Mobile Technologies Ltd. Augmented reality device
WO2013136333A1 (en) 2012-03-13 2013-09-19 Eyesight Mobile Technologies Ltd. Touch free user interface

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MING CHEN et al.: "Research on Eye-gaze Tracking Network Generated by Augmented Reality Application", IEEE *
TU Ziyan et al.: "Application and Research of Augmented Reality Technology", Computer Engineering and Applications *

Cited By (87)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105319714B (en) * 2014-07-31 2019-09-06 Seiko Epson Corporation Display device, control method of display device, and computer storage medium
CN105319714A (en) * 2014-07-31 2016-02-10 Seiko Epson Corporation Display apparatus, method for controlling display apparatus, and program
CN104156082A (en) * 2014-08-06 2014-11-19 北京行云时空科技有限公司 Control system and intelligent terminal of user interfaces and applications aimed at space-time scenes
CN104133593A (en) * 2014-08-06 2014-11-05 北京行云时空科技有限公司 Character input system and method based on motion sensing
CN104197950A (en) * 2014-08-19 2014-12-10 Chery Automobile Co., Ltd. Geographic information display method and system
CN104197950B (en) * 2014-08-19 2018-02-16 Chery Automobile Co., Ltd. Method and system for displaying geographic information
CN105527709A (en) * 2014-10-15 2016-04-27 GM Global Technology Operations LLC Systems and methods for adjusting features within a head-up display
CN105527709B (en) * 2014-10-15 2019-08-27 GM Global Technology Operations LLC System and method for adjusting features in a head-up display
WO2016059530A1 (en) * 2014-10-15 2016-04-21 在地实验文化事业有限公司 Guiding system and method
CN106200892A (en) * 2014-10-30 2016-12-07 MediaTek Inc. Virtual reality system, mobile device, wearable device and incoming event processing method
US10108256B2 (en) 2014-10-30 2018-10-23 Mediatek Inc. Systems and methods for processing incoming events while performing a virtual reality session
CN107003179A (en) * 2014-11-26 2017-08-01 Samsung Electronics Co., Ltd. Ultrasonic sensor and object detection method thereof
US10684367B2 (en) 2014-11-26 2020-06-16 Samsung Electronics Co., Ltd. Ultrasound sensor and object detecting method thereof
CN104537401A (en) * 2014-12-19 2015-04-22 Nanjing University Reality augmentation system and working method based on technologies of radio frequency identification and depth of field sensor
CN104537401B (en) * 2014-12-19 2017-05-17 Nanjing University Reality augmentation system and working method based on technologies of radio frequency identification and depth of field sensor
CN105759422A (en) * 2015-01-06 2016-07-13 Seiko Epson Corporation Display system and control method for display device
US10317215B2 (en) 2015-01-09 2019-06-11 Boe Technology Group Co., Ltd. Interactive glasses and navigation system
CN104915581B (en) * 2015-01-09 2018-10-02 Chunghwa Telecom Co., Ltd. Augmented reality unlocking system and method
CN104915581A (en) * 2015-01-09 2015-09-16 Chunghwa Telecom Co., Ltd. Augmented reality unlocking system and method
CN104570354A (en) * 2015-01-09 2015-04-29 BOE Technology Group Co., Ltd. Interactive glasses and visitor guide system
CN107407977A (en) * 2015-03-05 2017-11-28 Sony Corporation Information processing device, control method and program
CN107407977B (en) * 2015-03-05 2020-12-08 Sony Corporation Information processing device, control method and program
US11449133B2 (en) 2015-03-31 2022-09-20 Sony Corporation Information processing apparatus and information processing method
US11868517B2 (en) 2015-03-31 2024-01-09 Sony Group Corporation Information processing apparatus and information processing method
US12353613B2 (en) 2015-03-31 2025-07-08 Sony Group Corporation Information processing apparatus and information processing method
US10948977B2 (en) 2015-03-31 2021-03-16 Sony Corporation Information processing apparatus and information processing method
CN107408026A (en) * 2015-03-31 2017-11-28 Sony Corporation Information processing device, information processing method and computer program
CN111624770A (en) * 2015-04-15 2020-09-04 Sony Interactive Entertainment Inc. Pinch and hold gesture navigation on head mounted display
CN111624770B (en) * 2015-04-15 2022-05-03 Sony Interactive Entertainment Inc. Pinch and hold gesture navigation on head mounted display
CN106066537B (en) * 2015-04-24 2020-07-24 Panasonic Intellectual Property Corporation of America Head mounted display and control method of head mounted display
CN106066537A (en) * 2015-04-24 2016-11-02 Panasonic Intellectual Property Corporation of America Head mounted display and control method of head mounted display
CN105183173B (en) * 2015-10-12 2018-08-28 重庆中电大宇卫星应用技术研究所 Device for inputting tactical gestures and Morse code and converting them to speech
CN105183173A (en) * 2015-10-12 2015-12-23 重庆中电大宇卫星应用技术研究所 Device for typing and converting tactical gestures and Morse codes into voice
CN108351695A (en) * 2015-11-06 2018-07-31 BSH Hausgeräte GmbH System and method for simplifying operation of a household appliance
CN109076164A (en) * 2016-04-18 2018-12-21 月光产业股份有限公司 Focus pulling using range information from an auxiliary camera system
CN109076164B (en) * 2016-04-18 2020-10-27 月光产业股份有限公司 Method, apparatus, and computer-readable storage medium for switching focus
CN106155315A (en) * 2016-06-28 2016-11-23 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Method, device and mobile terminal for adding augmented reality effects in shooting
CN106157363A (en) * 2016-06-28 2016-11-23 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Augmented-reality-based photographing method, device and mobile terminal
CN106125932A (en) * 2016-06-28 2016-11-16 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Method, device, and mobile terminal for identifying target objects in augmented reality
CN106066701A (en) * 2016-07-05 2016-11-02 成都福兰特电子技术股份有限公司 An AR and VR data processing device and method
CN106066701B (en) * 2016-07-05 2019-07-26 上海智旭商务咨询有限公司 An AR and VR data processing device and method
CN106354257A (en) * 2016-08-30 2017-01-25 湖北睛彩视讯科技有限公司 Mobile scene fusion system and method based on augmented reality technology
CN108885533B (en) * 2016-12-21 2021-05-07 杰创科科技有限公司 Combining virtual and augmented reality
CN108885533A (en) * 2016-12-21 2018-11-23 杰创科科技有限公司 Combining virtual reality and augmented reality
CN106682468A (en) * 2016-12-30 2017-05-17 Baidu Online Network Technology (Beijing) Co., Ltd. Method of unlocking electronic device and electronic device
CN110622219A (en) * 2017-03-10 2019-12-27 杰创科增强现实有限公司 Interactive augmented reality
CN110622219B (en) * 2017-03-10 2024-01-19 杰创科增强现实有限公司 Interactive augmented reality
CN110582741A (en) * 2017-03-21 2019-12-17 PCMS Holdings, Inc. Method and system for detection and enhancement of haptic interactions in augmented reality
US12008154B2 (en) 2017-03-21 2024-06-11 Interdigital Vc Holdings, Inc. Method and system for the detection and augmentation of tactile interactions in augmented reality
CN110582741B (en) * 2017-03-21 2024-04-02 InterDigital VC Holdings, Inc. Method and system for haptic interaction detection and augmentation in augmented reality
US11726557B2 (en) 2017-03-21 2023-08-15 Interdigital Vc Holdings, Inc. Method and system for the detection and augmentation of tactile interactions in augmented reality
CN110520899A (en) * 2017-04-14 2019-11-29 Microsoft Technology Licensing, LLC Identifying the position of a marker in an environment
CN107340871A (en) * 2017-07-25 2017-11-10 深识全球创新科技(北京)有限公司 Device, method and use integrating gesture recognition and ultrasonic haptic feedback
CN107635057A (en) * 2017-07-31 2018-01-26 Nubia Technology Co., Ltd. Virtual reality terminal control method, terminal and computer-readable storage medium
CN111194423A (en) * 2017-10-09 2020-05-22 Facebook Technologies, LLC Head-mounted display tracking system
CN110209263B (en) * 2018-02-28 2022-11-08 联想(新加坡)私人有限公司 Information processing method, information processing apparatus, and apparatus-readable storage medium
CN110209263A (en) * 2018-02-28 2019-09-06 联想(新加坡)私人有限公司 Information processing method, information processing equipment and device-readable storage medium
CN112424730A (en) * 2018-07-17 2021-02-26 Apple Inc. Computer system with finger device
US11176910B2 (en) 2018-08-22 2021-11-16 Google Llc Smartphone providing radar-based proxemic context
US10930251B2 (en) 2018-08-22 2021-02-23 Google Llc Smartphone-based radar system for facilitating awareness of user presence and orientation
US11435468B2 (en) 2018-08-22 2022-09-06 Google Llc Radar-based gesture enhancement for voice interfaces
US10890653B2 (en) 2018-08-22 2021-01-12 Google Llc Radar-based gesture enhancement for voice interfaces
CN111149079A (en) * 2018-08-24 2020-05-12 Google LLC Smartphone, system and method including a radar system
US11204694B2 (en) 2018-08-24 2021-12-21 Google Llc Radar system facilitating ease and accuracy of user interactions with a user interface
US10936185B2 (en) 2018-08-24 2021-03-02 Google Llc Smartphone-based radar system facilitating ease and accuracy of user interactions with displayed objects in an augmented-reality interface
CN109348003A (en) * 2018-09-17 2019-02-15 深圳市泰衡诺科技有限公司 Application control method and device
CN110942518B (en) * 2018-09-24 2024-03-29 Apple Inc. Contextual computer-generated reality (CGR) digital assistant
CN110942518A (en) * 2018-09-24 2020-03-31 Apple Inc. Contextual computer-generated reality (CGR) digital assistant
US11798242B2 (en) 2018-09-24 2023-10-24 Apple Inc. Contextual computer-generated reality (CGR) digital assistants
CN112822992A (en) * 2018-10-05 2021-05-18 Facebook Technologies, LLC Providing enhanced interaction with physical objects using neuromuscular signals in augmented reality environments
US11314312B2 (en) 2018-10-22 2022-04-26 Google Llc Smartphone-based radar system for determining user intention in a lower-power mode
US12111713B2 (en) 2018-10-22 2024-10-08 Google Llc Smartphone-based radar system for determining user intention in a lower-power mode
CN111273766B (en) * 2018-12-04 2022-05-13 Apple Inc. Method, apparatus and system for generating an affordance linked to a simulated reality representation of an item
CN111273766A (en) * 2018-12-04 2020-06-12 Apple Inc. Method, apparatus and system for generating an affordance linked to a simulated reality representation of an item
CN109782639A (en) * 2018-12-29 2019-05-21 深圳市中孚能电气设备有限公司 Control method and control device for an electronic device operating mode
US11393254B2 (en) 2019-01-31 2022-07-19 Huawei Technologies Co., Ltd. Hand-over-face input sensing for interaction with a device having a built-in camera
CN112585566A (en) * 2019-01-31 2021-03-30 Huawei Technologies Co., Ltd. Hand-over-face input sensing for interaction with a device having a built-in camera
CN111538405A (en) * 2019-02-07 2020-08-14 株式会社美凯利 Information processing method, terminal and non-transitory computer readable storage medium
CN110109547A (en) * 2019-05-05 2019-08-09 芋头科技(杭州)有限公司 Command activation method and system based on gesture recognition
CN114245909A (en) * 2019-05-21 2022-03-25 Magic Leap, Inc. Caching and updating of dense 3D reconstruction data
CN112783700A (en) * 2019-11-08 2021-05-11 Fuji Xerox Co., Ltd. Computer readable medium for network-based remote assistance system
TWI777333B (en) * 2019-12-20 2022-09-11 大陸商北京外號信息技術有限公司 Method and electronic device for setting spatial positions of a virtual object
CN113190110A (en) * 2021-03-30 2021-07-30 青岛小鸟看看科技有限公司 Interface element control method and device of head-mounted display equipment
CN113141529A (en) * 2021-04-25 2021-07-20 聚好看科技股份有限公司 Display device and media asset playing method
CN113141529B (en) * 2021-04-25 2022-02-25 聚好看科技股份有限公司 Display device and media resource playback method
US12554325B2 (en) 2022-05-10 2026-02-17 Meta Platforms Technologies, Llc Methods and apparatuses for low latency body state prediction based on neuromuscular data
CN115309271A (en) * 2022-09-29 2022-11-08 Southern University of Science and Technology Mixed reality-based information display method, apparatus, device and storage medium

Also Published As

Publication number Publication date
US10401967B2 (en) 2019-09-03
US20160291699A1 (en) 2016-10-06
US20140361988A1 (en) 2014-12-11
JP2014531662A (en) 2014-11-27
US20160306433A1 (en) 2016-10-20
US11093045B2 (en) 2021-08-17
KR20190133080A (en) 2019-11-29
CN103858073B (en) 2022-07-29
JP2018028922A (en) 2018-02-22
WO2013093906A1 (en) 2013-06-27
KR20220032059A (en) 2022-03-15
US11494000B2 (en) 2022-11-08
US20170052599A1 (en) 2017-02-23
CN115167675A (en) 2022-10-11
JP2021007022A (en) 2021-01-21
US20200097093A1 (en) 2020-03-26
US20220107687A1 (en) 2022-04-07
KR20140069124A (en) 2014-06-09
US20160259423A1 (en) 2016-09-08
US20160320855A1 (en) 2016-11-03
JP7297216B2 (en) 2023-06-26

Similar Documents

Publication Publication Date Title
US11494000B2 (en) Touch free interface for augmented reality systems
US11941764B2 (en) Systems, methods, and graphical user interfaces for adding effects in augmented reality environments
JP7324813B2 (en) Systems, methods, and graphical user interfaces for interacting with augmented and virtual reality environments
US12174006B2 (en) Devices and methods for measuring using augmented reality
US20240053859A1 (en) Systems, Methods, and Graphical User Interfaces for Interacting with Virtual Reality Environments
US20230185397A1 (en) Electronic communication based on user input
US10785413B2 (en) Devices, methods, and graphical user interfaces for depth-based annotation
US12462498B2 (en) Systems, methods, and graphical user interfaces for adding effects in augmented reality environments
US12429940B2 (en) Systems, methods, and graphical user interfaces for automatic measurement in augmented reality environments
KR20150116871A (en) Human-body-gesture-based region and volume selection for hmd
US20240364645A1 (en) User interfaces and techniques for editing, creating, and using stickers
US10007418B2 (en) Device, method, and graphical user interface for enabling generation of contact-intensity-dependent interface responses
US20240291944A1 (en) Video application graphical effects
US20250377717A1 (en) Systems, Methods, and Graphical User Interfaces for Automatic Measurement in Augmented Reality Environments
WO2024226225A1 (en) User interfaces and techniques for editing, creating, and using stickers
CN117120956A (en) Systems, methods and graphical user interfaces for automated measurements in augmented reality environments

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220729