CN115167675A - Augmented reality device - Google Patents
Augmented reality device
- Publication number
- CN115167675A CN202210808606.2A CN202210808606A
- Authority
- CN
- China
- Prior art keywords
- user
- real
- augmented reality
- predetermined
- gesture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/165—Management of the audio stream, e.g. setting of volume, audio stream path
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Optics & Photonics (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
This application is a divisional application of the invention patent application with application number 201280048836.8 (international application number PCT/IL2012/050376), filed on September 19, 2012, and entitled "Touch-free interface for augmented reality systems".
Technical Field
The present invention relates to methods and systems for augmented reality.
Related Art
References considered relevant as background to the presently disclosed subject matter are listed below:
U.S. Patent No. 7,126,558;
U.S. Published Patent Application 2011/0221669;
U.S. Published Patent Application 2011/0270522;
GB 2465280 (A);
U.S. Published Patent Application 2012/0068913;
U.S. Patent No. 7,215,322;
WO 2005/091125;
WO 2010/086866;
Crowley, J.L. et al., "Finger Tracking as an Input Device for Augmented Reality", published in the proceedings of the International Workshop on Face and Gesture Recognition, Zurich, Switzerland, June 1995.
Acknowledgement of the above references should not be inferred as meaning that they are in any way relevant to the patentability of the presently disclosed subject matter.
Background
Augmented reality is a term for a live, direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated information such as text, sound, video, graphics, or GPS data. Artificial information about the environment and its objects is thus overlaid on the real-world view or image. Augmentation typically occurs in real time and in semantic context with elements of the environment, so that information about the user's surrounding real world becomes interactive and digitally manipulable.
The main hardware components for augmented reality are a processor, a display, sensors, and input devices. These elements, in particular a CPU, a display, a camera, and MEMS sensors such as an accelerometer, GPS, or solid-state compass, are present in portable devices such as smartphones, allowing them to serve as augmented reality platforms.
Augmented reality systems have found wide application in entertainment, navigation, assembly processes, maintenance, and medical procedures. Portable augmented reality systems have also found wide application in travel and tourism, where augmented reality is used to present information about real-world objects and places being viewed.
An immersive augmented reality experience is provided using a head-mounted display, typically in the form of goggles or a helmet. With a head-mounted display, virtual visual objects are superimposed on the user's view of the real-world scene. The head-mounted display is tracked with sensors that allow the system to align the virtual information with the physical world. Tracking may be performed, for example, using any one or more of such technologies as digital cameras or other optical sensors, accelerometers, GPS, gyroscopes, solid-state compasses, RFID, and wireless sensors. Head-mounted displays are either optical see-through or video see-through. Optical see-through employs solutions such as half-silvered mirrors, which pass images through the lens and overlay the information to be reflected into the user's eyes, and transparent LCD projectors, which display the digital information and images directly or indirectly onto the user's retina.
Summary of the Invention
The present invention provides an interactive system for augmented reality. The interactive system of the invention includes a wearable data display device that may be incorporated, for example, into a pair of glasses or goggles. The wearable display has a device providing location extraction capability (such as GPS) and a compass. The system also includes a user interface that allows a user to select computer-generated data to augment the real-world scene that the user is viewing. A camera obtains images of the real-world scene being viewed. A processor detects a predetermined object, such as the user's finger, in images of the real-world scene captured by the camera. When the user points at an element in the scene, data relating to that element is shown on the data display device and superimposed on the user's view of the scene.
Thus, in one aspect, the present invention provides a method for augmented reality, comprising:
(a) obtaining images of a real-world scene from one or more image sensors;
(b) obtaining one or both of orientation and position data of the image sensors from one or more state sensors;
(c) identifying, in the images of the real-world scene obtained by the one or more image sensors, a real-world object on which a predetermined pointing object performs a predetermined gesture, the gesture detection module utilizing data provided by the one or more state sensors; and
(d) presenting data associated with the identified object on a display of a viewing device.
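Steps (a) through (d) can be sketched as a simple detection-and-lookup pipeline. The following Python sketch is illustrative only: all function names, the gesture-event representation, and the bounding-box object store are hypothetical, not prescribed by the patent text.

```python
# Illustrative sketch of method steps (a)-(d); names and data layout are
# hypothetical, not taken from the patent.

def identify_pointed_object(frame, pose, gesture_events, scene_objects):
    """Return the id of the scene object targeted by a 'point' gesture.

    frame          -- captured image of the scene, step (a) (unused in this toy sketch)
    pose           -- state-sensor orientation/position data, step (b)
                      (carried through but unused in this toy sketch)
    gesture_events -- list of (gesture_name, (x, y)) tuples from detection
    scene_objects  -- dict: object id -> bounding box (x0, y0, x1, y1)
    """
    for name, (x, y) in gesture_events:
        if name != "point":
            continue  # only the predetermined gesture selects an object
        for obj_id, (x0, y0, x1, y1) in scene_objects.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return obj_id  # step (c): identified real-world object
    return None

def present_data(obj_id, object_db):
    # step (d): look up the data associated with the identified object
    return object_db.get(obj_id, "no data")
```

For example, with a single object `"statue"` occupying the box (10, 10, 50, 50), a point gesture at (20, 30) identifies it, and `present_data` returns its stored annotation.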
The image sensors may be selected from a camera, a light sensor, an IR sensor, an ultrasonic sensor, a proximity sensor, a CMOS image sensor, a short-wave infrared (SWIR) image sensor, and a reflectivity sensor. One or more of the state sensors may be selected from an optical sensor, an accelerometer, GPS, a gyroscope, a compass, a magnetic sensor, a sensor indicating the direction of the device relative to the earth's magnetic field, a gravity sensor, and an RFID detector.
The data associated with the identified object may be obtained by searching a memory for data associated with the real-world object.
The predetermined object may be, for example, a hand, a part of a hand, two hands, parts of two hands, a finger, a part of a finger, or a fingertip.
The viewing device may be configured to be worn by a user, for example as glasses or goggles. The viewing device may be incorporated into a mobile communication device.
The step of identifying in the images of the real-world scene obtained by the one or more image sensors may comprise determining the location (X, Y) of the predetermined object in an image obtained by the image sensors, and determining one or both of the location and orientation of the display device provided by the sensors.
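One way such an identification step could combine the two determinations is to map the predetermined object's horizontal image position onto a world bearing using the display device's compass heading. The sketch below is a hedged illustration under assumed conditions (a pinhole-style camera with a known horizontal field of view); the function name and the 60-degree default are placeholders, not values from the patent.

```python
def pointing_bearing(x_pixel, image_width, device_heading_deg, hfov_deg=60.0):
    """Combine the fingertip's X image position with the display device's
    compass heading to estimate the world bearing being pointed at.

    hfov_deg is the camera's assumed horizontal field of view; the default
    of 60 degrees is an arbitrary placeholder.
    """
    centre = image_width / 2.0
    # fingertip offset from the image centre as a fraction in [-1, 1]
    frac = (x_pixel - centre) / centre
    return (device_heading_deg + frac * hfov_deg / 2.0) % 360.0
```

A fingertip at the image centre yields the device heading itself; one at the right edge yields the heading plus half the field of view.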
The method of the invention may further comprise communicating with an external device or website. The communication may comprise sending a message to an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or to one or more services running on the external device. The method may further comprise sending a message to an application running on the mobile communication device, a service running on the mobile communication device, an operating system running on the mobile communication device, a process running on the mobile communication device, one or more applications running on a processor of the mobile communication device, a software program running in the background of the mobile communication device, or to one or more services running on the mobile communication device.
The method may further comprise sending a message requesting data relating to a real-world object identified in an image from an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, or a software program running in the background of the external device, or sending the message to one or more services running on the external device. The method may further comprise sending a message requesting data relating to a real-world object identified in an image from an application running on the mobile communication device, a service running on the mobile communication device, an operating system running on the mobile communication device, a process running on the mobile communication device, one or more applications running on a processor of the mobile communication device, or a software program running in the background of the mobile communication device, or sending the message to one or more services running on the mobile communication device.
The message to the external device or website may be a command. The command may be selected from a command to run an application on the external device or website, a command to stop an application running on the external device or website, a command to activate a service running on the external device or website, a command to stop a service running on the external device or website, or a command to send data relating to a real-world object identified in an image.
The message to the mobile communication device may be a command. The command may be selected from a command to run an application on the mobile communication device, a command to stop an application running on the mobile communication device or website, a command to activate a service running on the mobile communication device, a command to stop a service running on the mobile communication device, or a command to send data relating to a real-world object identified in an image.
The method may further comprise receiving from the external device or website data relating to a real-world object identified in an image, and presenting the received data to a user.
The communication with the external device or website may be over a communication network.
The command to the external device may be selected from: pressing a virtual key displayed on a display device of the external device; rotating a selection carousel; switching between desktops; running a predetermined software application on the external device; turning off an application on the external device; turning speakers on or off; turning the volume up or down; locking the external device; unlocking the external device; skipping to another track in a media player or switching between IPTV channels; controlling a navigation application; initiating a call; ending a call; presenting a notification; displaying a notification; navigating in a photo or music album gallery; scrolling web pages; presenting an email; presenting one or more documents or maps; controlling actions in a game; pointing at a map; zooming in or out on a map or an image; painting on an image; grasping an activatable icon and pulling the activatable icon out of the display device; rotating an activatable icon; emulating touch commands on the external device; performing one or more multi-touch commands, touch gesture commands, or typing; clicking on a displayed video to pause or play it; tagging a frame or capturing a frame from a video; presenting an incoming message; answering an incoming call; muting or rejecting an incoming call; opening an incoming call reminder; presenting a notification received from a network community service; presenting a notification generated by the external device; opening a predetermined application; changing the external device from a locked mode and opening a recent call application; changing the external device from a locked mode and opening an online service application or browser; changing the external device from a locked mode and opening an email application; changing the external device from a locked mode and opening a calendar application; changing the external device from a locked mode and opening a reminder application; changing the external device from a locked mode and opening a predetermined application set by the user, set by the manufacturer of the external device, or set by a service operator; activating an activatable icon; selecting a menu item; moving a pointer on a display; manipulating a touch-free mouse on a display; activating an icon; and changing information on a display.
In the method of the invention, the predetermined gesture may be selected from: a swiping motion; a pinching motion of two fingers; pointing; a left-to-right gesture; a right-to-left gesture; an upward gesture; a downward gesture; a pushing gesture; opening a clenched fist; opening a clenched fist and moving towards the image sensor; a tapping gesture; a waving gesture; a clapping gesture; a reverse clapping gesture; closing a hand into a fist; a pinching gesture; a reverse pinching gesture; a gesture of splaying the fingers of a hand; a reverse gesture of splaying the fingers of a hand; pointing at an activatable icon; holding an activating object at an activatable icon for a predetermined amount of time; clicking on an activatable icon; double-clicking on an activatable icon; clicking on an activatable icon from the right side, from the left side, from the bottom, or from the top; grasping an activatable icon (the object); gesturing towards an activatable icon (the object) from the right or from the left; passing through an activatable icon from the left; pushing the object; clapping; waving over an activatable icon; performing a blast gesture; performing a tapping gesture; performing a clockwise or counter-clockwise gesture over an activatable icon; sliding an icon; grasping an activatable icon with two fingers; and performing a click-drag-release motion.
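In practice, a set of predetermined gestures like the one above is usually routed to actions through a dispatch table. The mapping below is purely hypothetical: the gesture names and the actions they trigger are illustrative placeholders, not pairings stated in the patent.

```python
# Hypothetical dispatch from recognized predetermined gestures to actions.
# Both the gesture names and the action strings are illustrative only.

GESTURE_ACTIONS = {
    "tap": "activate_icon",
    "left_to_right_swipe": "next_page",
    "pinch": "zoom_out",
    "reverse_pinch": "zoom_in",
    "open_fist_toward_sensor": "select_object",
}

def handle_gesture(gesture_name, default="ignore"):
    # gestures outside the predetermined set fall through to a default action
    return GESTURE_ACTIONS.get(gesture_name, default)
```

A table-driven design keeps the recognized-gesture set and its effects configurable, which fits the patent's notion that different operational modes can activate different gesture sets.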
The data associated with the identified object may be any one or more of visual data, audio data, or textual data. The data associated with the identified object may be an activatable icon. The activatable icon may be a 2D or 3D activatable icon, and may be perceived by the user in 3D space in front of the user.
The method of the invention may have two or more operational modes, and the method may change the operational mode of the system upon recognition of a predetermined gesture. An operational mode may be specified by any one or more of: the gestures to be recognized; the algorithms that are active on the gesture detection module; the resolution of images captured by the image sensors; the image capture rate of the image sensors; the level of detail of the data to be presented; the activatable icons to be presented to the user; the source of the data to be presented; the activatable icons to be displayed on the display device; and the active online services.
The mode of operation may be selected from: a mode in which the image sensor records video images after a predetermined gesture is recognized; a mode in which a microphone records sound after a predetermined gesture is recognized and stops recording after another predetermined gesture is recognized; a mode in which video or sound is continuously monitored and, after a predetermined gesture is detected, the video or sound is recorded starting from a predetermined amount of time before the gesture was recognized, the recording being stopped after another predetermined gesture is recognized; a mode in which tags are added to captured and real-time-recorded video after a predetermined gesture is recognized; a mode in which a region is selected in the field of view captured by the camera, copied to another location in the field of view, and resized; a mode in which a tracker is used on a selected region of an image and the selected region is presented in real time, resized and relocated, on the display device; and a mode in which an image is captured after a predetermined gesture is recognized.
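The third mode above, in which recording begins a predetermined amount of time before the gesture was recognized, implies a continuously maintained pre-roll buffer. A minimal sketch in Python; class and method names are illustrative, not taken from the patent:

```python
from collections import deque

class PreGestureRecorder:
    """Continuously buffers frames; when the start gesture is recognized,
    recording begins with the frames captured during the preceding
    pre-roll window, and a stop gesture ends the recording."""

    def __init__(self, fps, preroll_seconds):
        # Ring buffer holds only the last `preroll_seconds` of frames.
        self.buffer = deque(maxlen=int(fps * preroll_seconds))
        self.recording = None  # None while idle, list of frames while recording

    def on_frame(self, frame):
        if self.recording is not None:
            self.recording.append(frame)
        else:
            self.buffer.append(frame)

    def on_start_gesture(self):
        # Recording starts a predetermined amount of time BEFORE the gesture.
        self.recording = list(self.buffer)

    def on_stop_gesture(self):
        clip, self.recording = self.recording, None
        self.buffer.clear()
        return clip
```

With a 2-second pre-roll at the chosen frame rate, the returned clip contains the buffered frames followed by everything captured until the stop gesture.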
The method of the present invention may further comprise running a tracking algorithm that tracks the identified real-world object and maintains the displayed associated visual data in a fixed position relative to the identified real-world object.
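Such a tracking algorithm can pin the overlay to the object by recording the overlay's offset from the tracked bounding box at registration time and re-applying that offset every frame. A minimal sketch, assuming a separate tracker supplies the per-frame bounding box (all names are hypothetical):

```python
def register_overlay(initial_bbox, initial_overlay_xy):
    """Record the overlay's offset from the object's bounding box
    (x, y, w, h) at the moment the visual data is first displayed."""
    x, y, _, _ = initial_bbox
    ox, oy = initial_overlay_xy
    return (ox - x, oy - y)

def overlay_position(tracked_bbox, offset):
    """Re-anchor the overlay at the recorded offset from the tracked
    bounding box, so the visual data appears pinned to the object even
    as the object moves in the camera image."""
    x, y, w, h = tracked_bbox
    dx, dy = offset
    return (x + dx, y + dy)
```

As the tracker reports the object's new bounding box each frame, `overlay_position` keeps the displayed data at a fixed position relative to the object rather than to the screen.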
An object recognition module may be employed to detect the predetermined object only when the display device has a level of motion below a predetermined threshold.
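This motion gating can be expressed as a simple check on recent state-sensor readings. A sketch, assuming the motion level is taken as the mean magnitude of recent sensor samples (one of several plausible measures; the patent does not fix one):

```python
def should_run_object_recognition(motion_samples, threshold):
    """Gate the (expensive) object recognition module on device motion:
    run it only when the display device's motion level, here the mean
    magnitude of recent accelerometer/gyroscope readings, is below a
    predetermined threshold."""
    level = sum(abs(s) for s in motion_samples) / len(motion_samples)
    return level < threshold
```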
The method may further comprise providing feedback when a predetermined gesture has been recognized. The feedback may be, for example, visual feedback, auditory feedback, haptic feedback, directional vibration, air tactile feedback, or ultrasonic feedback. The feedback may be a visual indication in a form selected from: an activation icon displayed on the display device; a change in an activation icon displayed on the display device; a change in the color of an activation icon displayed on the display device; a change in the size of an activation icon displayed on the display device; an animation of an activation icon displayed on the display device; an indication light; an indicator moving on the display device; an indicator moving on the display device that appears on top of all other images or video appearing on the display device; and the appearance of a glow around the predetermined object. The feedback may be a vibration, a directional vibration indication, or an air tactile indication.
In the method of the present invention, a portion of an activation icon displayed on the display device is not presented at the location of the predetermined object, so that the predetermined object appears to be on top of the activation icon.
An activation icon may be removed from the display device when the display device has a level of motion above a predetermined threshold. The removed icon may then, for example, be displayed on the display device again when the level of motion of the display device falls below the predetermined threshold.
The method may be brought into an active mode when a predetermined action is performed. The predetermined action may be selected from: bringing the predetermined object into the field of view from below when the user places the predetermined object in a certain location or pose, for example pointing at the lower right corner of the camera's field of view or opening a hand in the camera's field of view; performing a predetermined gesture associated with an activation icon when the activation icon is displayed, for example pointing at the activation icon, moving a hand from right to left across the field of view, or performing a waving gesture at the location where the activation icon is presented; and sliding a floating activation icon from one location to another by performing a gesture in the 3D space at the location where the activation icon is perceived to be, by touching the device, or by tapping on the device if the device has an accelerometer. As another example, if the device has a proximity sensor or an ultrasonic sensor, the system may enter the active mode when the user's hand is near the device. The system may also be activated by a voice command, or when the user places the predetermined object at a specific location in the field of view. As another example, the system may enter the active mode only when there is relevant data associated with the real world in the user's field of view. At that point, the system may indicate to the user when there is relevant data to be presented, or when it is ready for interaction.
The method of the present invention may further comprise attaching a visual indication to a real-world object to indicate that there is stored data relating to the real-world object. The visual indication may be overlaid on an image of the real-world object. The visual indication may be selected from an activation icon, a photograph, and an image of an envelope.
The method of the present invention may further comprise a calibration process in which one or more physical parameters of the predetermined object are recorded. The calibration process may include any one or more steps selected from: presenting activation icons at different locations in 3D space on the display; extracting the physical characteristics of the predetermined object; and determining a correlation between the size of the predetermined object and its distance from the camera. The calibration process may include constructing a triangle having a vertex at one of the image sensors and at the front tip of the predetermined object, the sides of the triangle being formed by the user's lines of sight. The distance of the real-world object from the camera may then be estimated based on the information extracted in the calibration.
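One way to realise the size–distance correlation is the pinhole-camera relation, under which an object's apparent size in pixels is inversely proportional to its distance. The patent does not fix a formula, so the following is an illustrative sketch only:

```python
def calibrate_size_distance(apparent_width_px, known_distance):
    """During calibration, the user holds the predetermined object (e.g.
    a fingertip) at a known distance. Under a pinhole-camera assumption
    the product apparent_width * distance is constant, and that product
    serves as the calibration constant."""
    return apparent_width_px * known_distance

def estimate_distance(calibration_constant, apparent_width_px):
    """Later, distance is estimated from the object's apparent width."""
    return calibration_constant / apparent_width_px
```

An object that appears half as wide as it did during calibration is estimated to be twice as far from the camera.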
The method may further comprise displaying a keyboard enabling text input. The keyboard may be displayed upon detection of a predetermined gesture, for example a right-to-left gesture, presenting an open hand, or presenting two open hands in a predetermined region of the field of view of the image sensor. The keyboard may be displayed after a clicking gesture is performed in a 3D typing area or at the location where a predetermined activation icon is perceived to be.
The present invention also provides a system comprising a device configured to perform the method of the present invention.
The present invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when the program is run on a computer. The computer program may be embodied on a computer-readable medium.
A user may interact with visual images typically displayed through glasses. The user's view of reality is thus augmented by the information presented on the display. One problem with augmented reality devices is the manner in which the user interacts with and controls the device. Traditional control devices such as a mouse, trackball, or touchscreen are difficult to use with augmented reality devices. Using gesture recognition in an augmented reality system is not straightforward, because the user, and therefore the augmented reality device, is constantly moving in real time.
The present invention therefore provides a computer program product comprising instructions for causing a processor to perform a method comprising the steps of:
receiving image information associated with an environment from an image sensor associated with an augmented reality device;
displaying augmented information relating to the environment on a display associated with the device;
recognizing, in the image information, a gesture of a user of the device;
correlating the gesture with the augmented information; and
changing the displayed augmented information based on the correlation.
The augmented information may include at least one of: information associated with objects in the environment; images associated with the environment; and distances associated with the environment.
The correlating may include determining a reference location in three-dimensional space of at least a portion of the user's hand, and determining at least one of the augmented information and image information data associated with the reference location.
The changing may include changing the augmented information according to the data associated with the reference location.
Description of the Drawings
In order to understand the invention and to see how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
Fig. 1 schematically shows a system for augmented reality according to one embodiment of the present invention;
Fig. 2 shows a system for augmented reality according to one embodiment of the present invention, the system comprising a set of goggles;
Figs. 3a and 3b show the system of Fig. 2 in use;
Fig. 4a shows a view of a real-world scene displayed on the display device of the system of Fig. 2; Fig. 4b shows the view of Fig. 4a with a user's finger pointing at an object in the view; and Fig. 4c shows visual text relating to the object at which the user's finger is pointing, overlaid on the view of Fig. 4b;
Figs. 5a and 5b show a system for augmented reality integrated with a communication device according to another embodiment of the present invention; and
Fig. 6a shows a region in the field of view of the image sensor being designated by the user performing a gesture of "drawing" the outline of the region; Fig. 6b shows the selected region being resized by performing a second gesture; Fig. 6c shows the region after resizing; and Fig. 6d shows the region after being dragged to a new location in the field of view.
Detailed Description
Fig. 1 schematically shows a system 30 for augmented reality according to one embodiment of the present invention. The system 30 comprises one or more image sensors 32 configured to obtain images of a real-world scene. Any type of image sensor may be used in the system of the present invention, for example a camera, a light sensor, an IR sensor, an ultrasonic sensor, a proximity sensor, a CMOS image sensor, a short-wave infrared (SWIR) image sensor, or a reflectivity sensor.
The system 30 further comprises a viewing device 34 having one or more display devices 35 that enable a user to see both the real-world scene and external information, such as images, video, or audio signals, superimposed on the real-world scene. Any type of display device that allows the user to see both the real-world scene and the displayed data may be used in the system of the present invention.
The display device 35 may comprise, for example, a surface on which visual material is presented to the user, or one or more projectors that display images directly onto the user's retina. A processor 36 obtains orientation and/or location data of the system 30 from one or more state sensors 38, which may be, for example, any one or more of an optical sensor, an accelerometer, a GPS, a gyroscope, a solid-state compass, a magnetic sensor, a gravity sensor, and an RFID detector. The processor 36 may be, for example, a dedicated processor, a general-purpose processor, a DSP (digital signal processor), a GPU (visual processing unit) processor, dedicated hardware, or a processor that can run on an external device. The system 30 may run as software on the viewing device 34, or on another device 37, such as a smartphone, that incorporates the other components of the system 30.
The processor 36 is configured to run a gesture detection module 40 that identifies, in the images of the real-world scene obtained by the image sensor 32, one or more real-world objects at which a predetermined object is pointing. A real-world object may be, for example, a building or a billboard. The determination of the real-world object uses data provided by the state sensors 38. The predetermined object may be the user's finger or another object such as a stylus or wand.
When the processor 36 has identified the real-world object at which the predetermined object is pointing, the processor searches a memory 42 for data associated with the identified object. The data may be, for example, visual data, audio data, or textual data. The visual data may be textual information relating to the identified object. The processor then displays the associated visual data on the display of the viewing device. The memory 42 may be integral with the system 30, or may be located remotely and accessed over a communication network such as the Internet. The system 30 may therefore include a communication module 39 that allows the system 30 to communicate with a network, a wireless network, a cellular network, an external device (for example another device 30, a mobile phone, or a tablet), an Internet website, and so on.
The data may be an activation icon. As used herein, the term "activation icon" refers to a region in an image or video that is associated with one or more messages or commands activated by user interaction. An activation icon may be, for example, a 2D or 3D visual element such as a virtual button, a virtual keyboard, or an icon. Activation icons are activated by means of one or more predetermined objects recognizable by the system, which may be, for example, one or more of a stylus, a user's hand or part of a hand, one or more fingers, or a part of a finger such as a fingertip. Activation of one or more of the activation icons by a predetermined object generates a message or command addressed to the operating system, one or more services, one or more applications, one or more devices, one or more remote applications, one or more remote services, or one or more remote devices.
The processor 36 may be configured to send a message or command to the device 37 or to a remote device, to an application running on the device, to a service running on the device 37, to an operating system running on the device, to a process running on the device, to a software program running in the background and to one or more services running on the device, or to a process running in the device. The message or command may be sent over a communication network such as the Internet or a cellular telephone network. A command may be, for example, a command to run an application on the device, a command to stop an application running on the device, a command to activate a service running on the device, a command to stop a service running on the device, or a command to send to the processor 36 data relating to a real-world object identified by the processor 36 in an image.
The command may be a command to the device 37, for example: pressing a virtual key displayed on the display device of the device; rotating a selection carousel; switching between desktops; running a predetermined software application on the device; closing an application on the device; turning speakers on or off; turning the volume up or down; locking the device; unlocking the device; skipping to another track in a media player or switching between IPTV channels; controlling a navigation application; initiating a call; ending a call; presenting a notification; displaying a notification; navigating in a photo or music album gallery; scrolling web pages; presenting an email; presenting one or more documents or maps; controlling actions in a game; controlling interactive video or animated content; editing a video or an image; pointing at a map; zooming in or out on a map or image; painting on an image; pulling an activation icon away from the display device; grasping an activation icon and pulling it out of the display device; rotating an activation icon; emulating touch commands on the device; performing one or more multi-touch commands, touch gesture commands, or typing; clicking on a displayed video to pause or play it; editing video or music commands; tagging a frame or capturing a frame from a video; cutting a subset of a video out of a video; presenting an incoming message; answering an incoming call; muting or rejecting an incoming call; opening an incoming-call reminder; presenting a notification received from a network community service; presenting a notification generated by the device; changing the lock mode of the device and activating a recent-calls application; changing the lock mode of the device and activating an online service application or browser; changing the lock mode of the device and activating an email application; changing the lock mode of the device and activating a calendar application; changing the lock mode of the device and activating a reminder application; changing the lock mode of the device and activating a predetermined application set by the user, by the manufacturer of the device, or by the service operator; activating an activation icon; selecting a menu item; moving a pointer on a display; manipulating a touch-free mouse; activating an activation icon on a display; and changing information on a display.
The communication module may be used to send a message that may be addressed, for example, to a remote device. The message may be, for example, a command to the remote device. A command may be, for example, a command to run an application on the remote device, a command to stop an application running on the remote device, a command to activate a service running on the remote device, or a command to stop a service running on the remote device. The message may be a command to the remote device selected from: pressing a virtual key displayed on the display device of the remote device; rotating a selection carousel; switching between desktops; running a predetermined software application on the remote device; closing an application on the remote device; turning speakers on or off; turning the volume up or down; locking the remote device; unlocking the remote device; skipping to another track in a media player or switching between IPTV channels; controlling a navigation application; initiating a call; ending a call; presenting a notification; displaying a notification; navigating in a photo or music album gallery; scrolling web pages; presenting an email; presenting one or more documents or maps; controlling actions in a game; pointing at a map; zooming in or out on a map or image; painting on an image; grasping an activation icon and pulling it out of the display device; rotating an activation icon; emulating touch commands on the remote device; performing one or more multi-touch commands, touch gesture commands, or typing; clicking on a displayed video to pause or play it; tagging a frame or capturing a frame from a video; presenting an incoming message; answering an incoming call; muting or rejecting an incoming call; opening an incoming-call reminder; presenting a notification received from a network community service; presenting a notification generated by the remote device; opening a predetermined application; changing the lock mode of the remote device and opening a recent-calls application; changing the lock mode of the remote device and opening an online service application or browser; changing the lock mode of the remote device and opening an email application; changing the lock mode of the device and opening a calendar application; changing the lock mode of the device and opening a reminder application; changing the lock mode of the device and opening a predetermined application set by the user, by the manufacturer of the remote device, or by the service operator; activating an activation icon; selecting a menu item; moving a pointer on a display; manipulating a touch-free mouse; activating an icon on a display; and changing information on a display.
The message may be a request for data associated with the identified object. The data request message may be addressed to an application, a service, a process, or a thread running on the device, or to an application, a service, a process, or a thread running on an external device, or to an online service.
To reduce CPU resources, the object recognition module that detects the predetermined object may be employed only when the headset is not moving significantly, as determined from the information obtained from the state sensors.
Fig. 2 shows a system 2 for augmented reality according to one embodiment of the present invention. The system 2 comprises a portable viewing device, which may be, for example, an interactive head-mounted eyepiece such as a pair of glasses or goggles 4. The goggles 4 have an image sensor 6 that obtains images of a real-world scene 8. The scene 8 may include, for example, one or more buildings 12 or one or more billboards 14. The goggles may have one or more display devices 10 positioned in the goggles 4 so as to be in front of the user's eyes when the user wears the goggles 4. The display device 10 may be, for example, a see-through device, such as a transparent LCD screen, through which the real-world scene is viewed and on which external data is presented. The system 2 further comprises a processor 16 configured to identify, in the images captured by the image sensor 6, a predetermined object performing a gesture or pointing at a real-world object in the real-world scene 8 or at an activation icon displayed to the user. The system 2 also comprises one or more location and/or orientation sensors 23, such as a GPS, an accelerometer, a gyroscope, a solid-state compass, a magnetic sensor, or a gravity sensor.
Figs. 5a and 5b show a system 40 for augmented reality according to another embodiment of the present invention. The system 40 is integrated into a mobile communication device 42 such as a mobile phone, tablet, or camera. Fig. 5a shows a front view of the communication device 42, and Fig. 5b shows a rear view. On its rear face, opposite the display device, the communication device 42 has an image sensor 46 that obtains images of a real-world scene. On its front face, the communication device 42 has a display device 48 that faces the user when the camera 46 faces the real-world scene. The display device 48 may be, for example, an LCD screen that presents to the user the images of the real-world scene obtained by the camera 46, together with visual data as explained below. The system 40 uses the camera 46, the display device 48, and the processor of the communication device 42, and further comprises one or more state sensors contained within the housing of the communication device 42, not shown in Figs. 5a and 5b. The processor is configured to identify, in the images captured by the image sensor 46, a predetermined object pointing at a real-world object in the real-world scene.
Fig. 3a shows the system 2 in use. The goggles 4 are placed over the eyes of the user 18. The user faces the real-world scene 8 and thus views the scene 8. Fig. 3b shows the system 40 in use. The user 18 holds the communication device 42, which has the image sensor 46 facing the real-world scene 8 and the display device 48 facing the user.
The system 2 or 40 now performs the following process. The view of the scene 8 that the user will see when using the system 2 or 40 is displayed on the display device. Fig. 4a shows the view of the scene 8 that the user will see when viewing the real-world scene 8 using the system 2 or 40. The processor 36 analyzes the images obtained by the image sensor to determine when a predetermined object in the images captured by the image sensor performs a predetermined gesture relating to a real-world object in the real-world scene 8.
A viewing device 34 such as the goggles 4 or the communication device 42 is not stationary in use, because of the movement that occurs as the user walks or moves his head or hands. In this situation, the signals generated by the sensors 38 may be noisy and inaccurate. In this case, a machine vision module 37 runs a tracking algorithm that tracks the identified real-world object and maintains the displayed associated visual data in a fixed position relative to the identified real-world object.
A predetermined gesture relating to a real-world object or activation icon may be, for example, pointing at the real-world object or activation icon, or performing a page-turning gesture over the real-world object or activation icon. An activation icon may or may not be related to a real-world object.
Other possible predetermined gestures include a page-turning gesture; a pinching motion of two fingers (for example the index finger and thumb, or the middle finger and thumb); pointing; a left-to-right gesture; a right-to-left gesture; an upward gesture; a downward gesture; a pressing gesture; opening a clenched fist; opening a clenched fist and moving towards the image sensor; a tapping gesture; a waving gesture; a clapping gesture; a reverse clapping gesture; closing a hand into a fist; a pinching gesture; a reverse pinching gesture; a gesture of splaying the fingers; a reverse gesture of splaying the fingers; pointing at an activation icon or real-world object; pointing at an activation icon or real-world object for a predetermined amount of time; clicking on an activation icon or real-world object; double-clicking on an activation icon or real-world object; clicking on an activation icon or real-world object with the index finger; clicking on an activation icon or real-world object with the middle finger; clicking on an activation icon or real-world object from the bottom; clicking on an activation icon from the top; grasping an activation icon or real-world object; pointing at an activation icon or real-world object from the right; pointing at an activation icon or real-world object from the left; passing an activation icon or real-world object from the left; pushing an activation icon or real-world object; clapping or waving over an activation icon or real-world object; performing a blast gesture; performing a tapping gesture; performing a clockwise or counterclockwise gesture over an activation icon or real-world object; sliding an activation icon or real-world object; grasping an activation icon or real-world object with two fingers; or performing a click-drag-release motion.
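A recognized gesture is ultimately translated into one of the messages or commands described above. A minimal dispatch-table sketch; the gesture names and the actions bound to them are illustrative only, not part of the patent:

```python
def make_dispatcher(bindings):
    """Build a dispatcher mapping recognized gesture names to command
    callables; unbound gestures produce no command."""
    def dispatch(gesture, target):
        action = bindings.get(gesture)
        if action is None:
            return None  # unrecognized or unbound gesture: no command issued
        return action(target)
    return dispatch

# Hypothetical bindings: pointing shows data, a right-to-left swipe
# dismisses, a tap activates the targeted icon or object.
bindings = {
    "point": lambda target: f"show-data:{target}",
    "swipe-right-to-left": lambda target: f"dismiss:{target}",
    "tap": lambda target: f"activate:{target}",
}
dispatch = make_dispatcher(bindings)
```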
The predetermined object may be, for example, the user's hand, a part of the user's hand such as the user's finger 20, or parts of two different hands. Alternatively, the predetermined object may be a stylus or wand.
When the processor 16 determines that a predetermined gesture has been performed, this may be indicated to the user by any type of feedback, for example visual feedback, auditory feedback, haptic feedback, directional vibration, air tactile feedback, or ultrasonic feedback. The feedback may be a visual indication in a form selected from: an activation icon displayed on the display device; a change in an activation icon displayed on the display device; a change in the color of an activation icon displayed on the display device; a change in the size of an activation icon displayed on the display device; an animation of an activation icon displayed on the display device; an indication light; an indicator moving on the display device; a vibration; a directional vibration indication; or an air tactile indication. The indication may be provided by an indicator moving on the display device that appears on top of all other images or video appearing on the display device. The visual feedback may be the appearance of a glow around the predetermined object when the system recognizes the predetermined object.
The gesture detection module 40 may use any method for detecting a predetermined object in the images obtained by the image sensor 32. For example, the gesture detection module may detect the predetermined object as disclosed in WO 2005/091125 or WO 2010/086866.
The processor 16 is further configured to determine the real-world object in the scene 8 with respect to which the predetermined gesture was performed. Thus, for example, in the image shown in Fig. 4b, the processor 16 determines that the user's finger 20 is pointing at the billboard 14 by determining the position (X, Y) of the fingertip in the image and combining this information with the location of the user and the orientation of the goggles 4 obtained from the state sensors 21. The real-world object is thus identified by the processor without presenting a cursor or other marker to the user to indicate the real-world object the user wishes to select, enabling direct pointing at a real-world object to begin an interaction. The processor 16 searches a memory, which may be integral with the processor 16 or located remotely, for data relating to the real-world object at which the user's finger 20 is pointing. For example, the memory may store data relating to the billboard 14. When the user points at an object in the scene 8 whose data is stored in the memory or fetched from a remote server such as a website, the data is displayed on the display device 10 superimposed on the user's view of the scene. Thus, when the user points at the billboard 14 (Fig. 3a), visual data 21 relating to the billboard 14 is displayed on the display device 10, as shown in Fig. 4c.
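The fingertip-plus-orientation computation can be sketched as a bearing match: the fingertip's horizontal pixel offset is converted into an angle within the camera's field of view, combined with the device heading from the state sensors, and compared against the bearings of known landmarks. This is a simplified 2D illustration of the idea, not the patent's exact method; all names and thresholds are assumptions:

```python
def pointing_bearing(fingertip_x, image_width, horizontal_fov_deg, device_heading_deg):
    """Convert the fingertip's horizontal image position into a world
    bearing: the pixel offset from the image centre maps to an angular
    offset within the camera's field of view, which is added to the
    device heading reported by the state sensors (e.g. a compass)."""
    offset = (fingertip_x / image_width - 0.5) * horizontal_fov_deg
    return (device_heading_deg + offset) % 360.0

def identify_pointed_object(bearing, landmarks, tolerance_deg=5.0):
    """Match the pointing bearing against known landmark bearings
    (e.g. derived from the GPS positions of billboards and buildings);
    return the closest landmark within tolerance, or None."""
    best, best_err = None, tolerance_deg
    for name, landmark_bearing in landmarks.items():
        # Minimal angular difference, wrapping around 360 degrees.
        err = abs((bearing - landmark_bearing + 180.0) % 360.0 - 180.0)
        if err <= best_err:
            best, best_err = name, err
    return best
```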
The visual data 21 may be static or dynamic. The visual data 21 may include one or more activation icons, such that when a predetermined gesture is performed with respect to one of the activation icons, a command associated with that activation icon is executed. The command may be, for example, to display specific visual material relating to the selected real-world object. The activation icons may be 2D or 3D activation icons and may be presented to the user so that the user perceives the icons in the 3D space in front of him. As used herein, an activation icon is a region in a 2D or 3D image or video associated with one or more messages activated by user interaction. An activation icon may be, for example, a 2D or 3D visual element. An activation icon may be a virtual button, a virtual keyboard, a 2D or 3D activation icon, or a region in an image or video. An activation icon may consist of two or more activation icons.
The processor may not render the portion of an activation icon where the predetermined object is located, so that the predetermined object appears to be on top of the activation icon. An activation icon may be removed when the user moves his head quickly, and returned when the head motion falls below a predetermined speed.
The system 2 may have two or more modes of operation, and the processor 16 may be configured to recognize one or more predetermined gestures for changing between the modes of operation. Gestures may thus be used to turn the system on or off, to select the source of the visual material to be presented, to select the level of detail of the visual material to be presented, to select the buttons or activation icons to be presented to the user, or to activate an online service, such as an online service relating to a selected real-world object. Another mode of operation may be to start video recording of images by the image sensor and/or recording of sound by the microphone after a predetermined gesture is recognized, and to stop the recording after another predetermined gesture is recognized. In another mode of operation, video and/or sound is monitored continuously, but after a predetermined gesture is detected, the video/sound is recorded starting from a predetermined amount of time before the gesture was recognized, and the recording is stopped after another predetermined gesture is recognized. The predetermined time may be defined by the user. Another mode of operation adds tags to captured and real-time-recorded video after a predetermined gesture is recognized.
Figs. 6a to 6d show another mode of operation. In Fig. 6a, a region 62 in the field of view 60 captured by the image sensor is designated by the user performing a gesture of "drawing" the outline of the region, shown as a dashed line in Figs. 6a to 6d. The selected region is then resized by the user performing a second gesture, for example spreading two fingers apart or bringing two fingers closer together, as indicated by the arrows 66 in Fig. 6b, until the selected region reaches the desired size (67 in Fig. 6c). The region 67 is then dragged to a new location in the field of view (Fig. 6d) and copied to that location. The system then uses a tracker on the selected region, and the selected region is presented in real time in the resized and relocated region set by the user on the display device.
To minimize CPU resources, for each displayed activation icon, the region of the image containing a bounding box around the displayed activation icon may be defined as remaining unchanged. The system uses a machine vision tracker to track this bounding box, the bounding box being considered unchanged as long as the distance between its positions in two frames of the video sequence, as determined using the video tracker, is less than a predetermined distance, and the correlation value of the tracker for the bounding box is below a predetermined value.
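The bounding-box stability test can be sketched as follows. Note one assumption: a high tracker correlation is treated here as indicating a good match, which is the usual tracker convention; the thresholds and names are illustrative, not from the patent:

```python
def icon_region_unchanged(prev_box, curr_box, max_shift, correlation, min_correlation):
    """Treat the displayed icon's bounding box as unchanged when its
    tracked position moved less than a predetermined distance between
    two frames and the tracker's match correlation is high enough, so
    the costly per-icon search can be skipped for this frame."""
    (px, py), (cx, cy) = prev_box[:2], curr_box[:2]
    shift = ((cx - px) ** 2 + (cy - py) ** 2) ** 0.5
    return shift < max_shift and correlation >= min_correlation
```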
When the system is in a mode of operation in which only activation icons, and not real-world objects, can be activated, CPU usage may be minimized by searching for the predetermined object only in the vicinity of each displayed activation icon. To reduce CPU usage further, the object recognition module is activated only when the headset is determined not to be moving significantly, based on the information obtained from the state sensors.
The user may select different filters to screen data relating to real-world objects, for example a "display data generated only by friends" filter, or a filter for data from registered sources or data generated within the last three months.
The system 2 may have a standby mode in which the power consumption of the system 2 is minimal. The active mode may differ from the standby mode, for example, in the number of video frames per second analyzed by the system, the resolution of the images analyzed, the portion of each image frame analyzed, and/or the detection modules that are active. The system 2 can be brought into the active mode by any technique. For example, the system 2 may be brought into the active mode by: bringing the predetermined object into the field of view from below when the user places the predetermined object in a certain location or pose, for example pointing at the lower right corner of the camera's field of view or opening a hand in the camera's field of view; performing a predetermined gesture associated with an activation icon when the activation icon is displayed, for example pointing at the activation icon, moving a hand from right to left across the field of view, or performing a waving gesture at the location where the activation icon is presented; or sliding a floating activation icon from one location to another by performing a gesture in the 3D space where the activation icon is perceived to be, by touching the device, or by tapping on the device if the device has an accelerometer. As another example, if the device has a proximity sensor or an ultrasonic sensor, the system may enter the active mode when the user's hand is near the device. The system may also be activated by a voice command, or when the user places the predetermined object at a specific location in the field of view. As another example, the system may enter the active mode only when there is relevant data associated with the real world in the user's field of view. At that point, the system may indicate to the user when there is relevant data to be presented, or when it is ready for interaction.
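The standby/active switching described above amounts to a small state machine. A minimal sketch with illustrative trigger names (the trigger vocabulary is an assumption drawn from the examples above, not a defined interface):

```python
class PowerModeController:
    """Minimal sketch of standby/active switching: the system idles in a
    low-power standby mode and enters the active mode on any of several
    triggers (hand entering the view, activation gesture, device tap,
    proximity, voice command, or relevant data appearing in the view)."""

    TRIGGERS = {"object_in_view", "activation_gesture", "device_tap",
                "hand_near_device", "voice_command", "relevant_data_in_view"}

    def __init__(self):
        self.mode = "standby"

    def on_event(self, event):
        if self.mode == "standby" and event in self.TRIGGERS:
            self.mode = "active"
        elif self.mode == "active" and event == "deactivation_gesture":
            self.mode = "standby"
        return self.mode
```

In the active mode the system would analyze more frames per second at higher resolution; events not in the trigger set leave the standby mode untouched.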
A visual indication may be attached to a real-world object to let the user know that there is data relating to the real-world object.
An indication of relevant data may be overlaid at the location of the real-world object. For example, a small visual indication of an activation icon "i" may indicate information, a "photos" sign may indicate images relating to the real-world object, and an "envelope" sign may indicate a message relating to the real-world object left by a friend or other user. The data may be presented when the user performs a predetermined gesture relating to the activation icon.
System 2 may be configured to undergo a calibration process that records various physical parameters of the predetermined object, thereby facilitating identification of the predetermined object by the processor in images obtained by the camera. This may be done, for example, by presenting activation icons to the user at different locations in 3D space on the display; extracting physical properties of the predetermined object, such as its size or orientation; and determining a correlation between the size of the predetermined object and its distance from the camera. Calibration may include triangulating from the camera, the user's line of sight, and the tip of the predetermined object to determine where the user is pointing. Accuracy can be improved by estimating the distance of real-world objects from the camera based on the information extracted during calibration.
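The size-to-distance correlation can be sketched under a simple pinhole-camera assumption, where apparent size is inversely proportional to distance. The fitting procedure and the sample values below are illustrative, not from the patent:

```python
# During calibration, activation icons are shown at known 3D positions and
# the predetermined object's apparent size is measured at each, giving
# (pixel_size, distance) samples. Under a pinhole-camera model,
# distance ~= k / pixel_size for some constant k; we estimate k by
# least squares over the calibration samples.
def fit_size_distance(samples):
    """Fit k in distance = k / pixel_size.

    samples: list of (pixel_size_px, distance_m) pairs from calibration.
    Minimizes sum((d_i - k / s_i)**2), giving a closed-form k.
    """
    num = sum(d / s for s, d in samples)
    den = sum(1.0 / (s * s) for s, _ in samples)
    return num / den

def estimate_distance(k: float, pixel_size: float) -> float:
    """Estimate the object's distance from its apparent size in pixels."""
    return k / pixel_size

# Hypothetical calibration data: a hand spanning 200 px at 0.5 m spans
# 100 px at 1.0 m and 50 px at 2.0 m under the inverse-size model.
k = fit_size_distance([(200.0, 0.5), (100.0, 1.0), (50.0, 2.0)])
assert abs(estimate_distance(k, 100.0) - 1.0) < 1e-6
```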
The processor may be configured to identify, in images of the real-world scene obtained by the camera, another user of a system of the present invention. Identifying another user in the real-world scene may be performed, for example, by notifying a remote server of the locations of devices in a particular geographic area. The locations of the other devices can then be sent to all devices in that area.
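The location-sharing step can be sketched as a toy server that groups devices into grid cells standing in for "geographic areas"; the cell size and device identifiers are assumptions for illustration:

```python
from collections import defaultdict

CELL_DEG = 0.01  # grid cell of roughly 1 km; illustrative choice

class LocationServer:
    """Each device reports its position; the server replies with the
    positions of the other devices in the same geographic area, so they
    can be matched against users seen by the camera."""
    def __init__(self):
        # cell -> {device_id: (lat, lon)}
        self.cells = defaultdict(dict)

    def _cell(self, lat, lon):
        return (round(lat / CELL_DEG), round(lon / CELL_DEG))

    def report(self, device_id, lat, lon):
        cell = self._cell(lat, lon)
        self.cells[cell][device_id] = (lat, lon)
        # Return every other device currently known in this area.
        return {d: p for d, p in self.cells[cell].items() if d != device_id}

server = LocationServer()
server.report("glasses-A", 32.0853, 34.7818)
nearby = server.report("glasses-B", 32.0851, 34.7820)
assert "glasses-A" in nearby
```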
When a communication link exists between two systems of the present invention, the two systems can be used to play a game. The other user may be represented as an avatar with whom the user can interact through gestures, for example by sending a message such as a "like" to the other user.
The processor may be configured to display a keyboard that enables text entry with one or more fingers or hands. Display of the keyboard may begin after detection of a predetermined gesture, such as a right-to-left gesture, presenting an open hand, or presenting two open hands in a predetermined region of the camera's field of view, for example the bottom of the field of view. Another way to start displaying the keyboard is for the user to perform a tap gesture in the typing area, or at the perceived location of an activation icon in 3D space. The keyboard can be used, for example, to write a note, conduct a search, or communicate with an online service such as Skype or Twitter by typing on the virtual keyboard. The system may refrain from rendering the portion of the keyboard where the predetermined object is located, so that the predetermined object, for example the user's hand, appears to be "above" the keyboard.
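The occlusion trick in the last sentence amounts to zeroing the keyboard's alpha wherever the hand is, so the camera image of the hand shows through the virtual keyboard. A sketch, assuming a per-pixel hand mask is available from the detection module:

```python
import numpy as np

def composite_keyboard(frame, keyboard_rgba, hand_mask):
    """Composite the rendered keyboard over the camera frame, except where
    the predetermined object (e.g. the user's hand) is detected.

    frame:         HxWx3 uint8 camera image
    keyboard_rgba: HxWx4 uint8 rendered keyboard with alpha channel
    hand_mask:     HxW bool array, True where the hand occludes the view
    """
    out = frame.astype(np.float32)
    alpha = keyboard_rgba[..., 3:4].astype(np.float32) / 255.0
    # Suppress the keyboard under the hand so the hand appears "above" it.
    alpha[hand_mask] = 0.0
    kb = keyboard_rgba[..., :3].astype(np.float32)
    out = (1.0 - alpha) * out + alpha * kb
    return out.astype(np.uint8)
```

An opaque keyboard (alpha 255 outside the mask) matches the preference stated below that the user should not see the background behind the keyboard.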
When the system is in input mode, an animated hand whose position correlates with the user's hand and fingers can be presented on the keyboard. The fingertips of the animated hand can be positioned over the virtual keys at the locations where the key characters are visible. The keyboard and the animated hand are preferably opaque, so that the user cannot see the background behind the keyboard; this tends to make the keyboard clearer to the user.
Claims (10)
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201161536144P | 2011-09-19 | 2011-09-19 | |
| US61/536,144 | 2011-09-19 | ||
| CN201280048836.8A CN103858073B (en) | 2011-09-19 | 2012-09-19 | Augmented reality device, method of operating augmented reality device, computer-readable medium |
| PCT/IL2012/050376 WO2013093906A1 (en) | 2011-09-19 | 2012-09-19 | Touch free interface for augmented reality systems |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201280048836.8A Division CN103858073B (en) | 2011-09-19 | 2012-09-19 | Augmented reality device, method of operating augmented reality device, computer-readable medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN115167675A true CN115167675A (en) | 2022-10-11 |
Family
ID=47189999
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202210808606.2A Pending CN115167675A (en) | 2011-09-19 | 2012-09-19 | Augmented reality device |
| CN201280048836.8A Expired - Fee Related CN103858073B (en) | 2011-09-19 | 2012-09-19 | Augmented reality device, method of operating augmented reality device, computer-readable medium |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201280048836.8A Expired - Fee Related CN103858073B (en) | 2011-09-19 | 2012-09-19 | Augmented reality device, method of operating augmented reality device, computer-readable medium |
Country Status (5)
| Country | Link |
|---|---|
| US (8) | US20140361988A1 (en) |
| JP (3) | JP2014531662A (en) |
| KR (3) | KR20140069124A (en) |
| CN (2) | CN115167675A (en) |
| WO (1) | WO2013093906A1 (en) |
Families Citing this family (267)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9865125B2 (en) | 2010-11-15 | 2018-01-09 | Bally Gaming, Inc. | System and method for augmented reality gaming |
| CN115167675A (en) | 2011-09-19 | 2022-10-11 | 视力移动技术有限公司 | Augmented reality device |
| WO2013093837A1 (en) * | 2011-12-23 | 2013-06-27 | Koninklijke Philips Electronics N.V. | Method and apparatus for interactive display of three dimensional ultrasound images |
| US11068049B2 (en) | 2012-03-23 | 2021-07-20 | Microsoft Technology Licensing, Llc | Light guide display and field of view |
| US11169611B2 (en) * | 2012-03-26 | 2021-11-09 | Apple Inc. | Enhanced virtual touchpad |
| US9558590B2 (en) | 2012-03-28 | 2017-01-31 | Microsoft Technology Licensing, Llc | Augmented reality light guide display |
| US10191515B2 (en) | 2012-03-28 | 2019-01-29 | Microsoft Technology Licensing, Llc | Mobile device light guide display |
| US9717981B2 (en) | 2012-04-05 | 2017-08-01 | Microsoft Technology Licensing, Llc | Augmented reality and physical games |
| US10502876B2 (en) | 2012-05-22 | 2019-12-10 | Microsoft Technology Licensing, Llc | Waveguide optics focus elements |
| TWI475474B (en) * | 2012-07-30 | 2015-03-01 | Mitac Int Corp | Gesture combined with the implementation of the icon control method |
| KR102001218B1 (en) * | 2012-11-02 | 2019-07-17 | 삼성전자주식회사 | Method and device for providing information regarding the object |
| US10192358B2 (en) | 2012-12-20 | 2019-01-29 | Microsoft Technology Licensing, Llc | Auto-stereoscopic augmented reality display |
| US9770203B1 (en) | 2013-01-19 | 2017-09-26 | Bertec Corporation | Force measurement system and a method of testing a subject |
| US10856796B1 (en) | 2013-01-19 | 2020-12-08 | Bertec Corporation | Force measurement system |
| US10646153B1 (en) | 2013-01-19 | 2020-05-12 | Bertec Corporation | Force measurement system |
| US10413230B1 (en) | 2013-01-19 | 2019-09-17 | Bertec Corporation | Force measurement system |
| US10231662B1 (en) | 2013-01-19 | 2019-03-19 | Bertec Corporation | Force measurement system |
| US10010286B1 (en) | 2013-01-19 | 2018-07-03 | Bertec Corporation | Force measurement system |
| US11052288B1 (en) | 2013-01-19 | 2021-07-06 | Bertec Corporation | Force measurement system |
| US11311209B1 (en) | 2013-01-19 | 2022-04-26 | Bertec Corporation | Force measurement system and a motion base used therein |
| US11540744B1 (en) | 2013-01-19 | 2023-01-03 | Bertec Corporation | Force measurement system |
| US12161477B1 (en) | 2013-01-19 | 2024-12-10 | Bertec Corporation | Force measurement system |
| US9526443B1 (en) * | 2013-01-19 | 2016-12-27 | Bertec Corporation | Force and/or motion measurement system and a method of testing a subject |
| US11857331B1 (en) | 2013-01-19 | 2024-01-02 | Bertec Corporation | Force measurement system |
| US10133342B2 (en) * | 2013-02-14 | 2018-11-20 | Qualcomm Incorporated | Human-body-gesture-based region and volume selection for HMD |
| EP2960867A4 (en) * | 2013-02-21 | 2016-08-03 | Fujitsu Ltd | DISPLAY DEVICE, METHOD, PROGRAM, AND POSITION ADJUSTMENT SYSTEM |
| US20140240226A1 (en) * | 2013-02-27 | 2014-08-28 | Robert Bosch Gmbh | User Interface Apparatus |
| US9122916B2 (en) * | 2013-03-14 | 2015-09-01 | Honda Motor Co., Ltd. | Three dimensional fingertip tracking |
| US20140285520A1 (en) * | 2013-03-22 | 2014-09-25 | Industry-University Cooperation Foundation Hanyang University | Wearable display device using augmented reality |
| US9507426B2 (en) * | 2013-03-27 | 2016-11-29 | Google Inc. | Using the Z-axis in user interfaces for head mountable displays |
| US9213403B1 (en) | 2013-03-27 | 2015-12-15 | Google Inc. | Methods to pan, zoom, crop, and proportionally move on a head mountable display |
| JP6108926B2 (en) * | 2013-04-15 | 2017-04-05 | オリンパス株式会社 | Wearable device, program, and display control method for wearable device |
| US20140094148A1 (en) | 2013-05-08 | 2014-04-03 | Vringo Infrastructure Inc. | Cognitive Radio System And Cognitive Radio Carrier Device |
| GB2513884B (en) | 2013-05-08 | 2015-06-17 | Univ Bristol | Method and apparatus for producing an acoustic field |
| US9672627B1 (en) * | 2013-05-09 | 2017-06-06 | Amazon Technologies, Inc. | Multiple camera based motion tracking |
| EP2818948B1 (en) * | 2013-06-27 | 2016-11-16 | ABB Schweiz AG | Method and data presenting device for assisting a remote user to provide instructions |
| US10295338B2 (en) | 2013-07-12 | 2019-05-21 | Magic Leap, Inc. | Method and system for generating map data from an image |
| US20150124566A1 (en) | 2013-10-04 | 2015-05-07 | Thalmic Labs Inc. | Systems, articles and methods for wearable electronic devices employing contact sensors |
| US11921471B2 (en) | 2013-08-16 | 2024-03-05 | Meta Platforms Technologies, Llc | Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source |
| US10042422B2 (en) | 2013-11-12 | 2018-08-07 | Thalmic Labs Inc. | Systems, articles, and methods for capacitive electromyography sensors |
| US10188309B2 (en) | 2013-11-27 | 2019-01-29 | North Inc. | Systems, articles, and methods for electromyography sensors |
| US12504816B2 (en) | 2013-08-16 | 2025-12-23 | Meta Platforms Technologies, Llc | Wearable devices and associated band structures for sensing neuromuscular signals using sensor pairs in respective pods with communicative pathways to a common processor |
| KR102157313B1 (en) * | 2013-09-03 | 2020-10-23 | 삼성전자주식회사 | Method and computer readable recording medium for recognizing an object using a captured image |
| KR102165818B1 (en) * | 2013-09-10 | 2020-10-14 | 삼성전자주식회사 | Method, apparatus and recovering medium for controlling user interface using a input image |
| JP5877824B2 (en) * | 2013-09-20 | 2016-03-08 | ヤフー株式会社 | Information processing system, information processing method, and information processing program |
| KR102119659B1 (en) | 2013-09-23 | 2020-06-08 | 엘지전자 주식회사 | Display device and control method thereof |
| CN103501473B (en) * | 2013-09-30 | 2016-03-09 | 陈创举 | Based on multifunctional headphone and the control method thereof of MEMS sensor |
| KR101499044B1 (en) * | 2013-10-07 | 2015-03-11 | 홍익대학교 산학협력단 | Wearable computer obtaining text based on gesture and voice of user and method of obtaining the text |
| US9740935B2 (en) * | 2013-11-26 | 2017-08-22 | Honeywell International Inc. | Maintenance assistant system |
| US9671826B2 (en) * | 2013-11-27 | 2017-06-06 | Immersion Corporation | Method and apparatus of body-mediated digital content transfer and haptic feedback |
| US10586395B2 (en) | 2013-12-30 | 2020-03-10 | Daqri, Llc | Remote object detection and local tracking using visual odometry |
| US9264479B2 (en) | 2013-12-30 | 2016-02-16 | Daqri, Llc | Offloading augmented reality processing |
| EP2899609B1 (en) * | 2014-01-24 | 2019-04-17 | Sony Corporation | System and method for name recollection |
| DE102014201578A1 (en) * | 2014-01-29 | 2015-07-30 | Volkswagen Ag | Device and method for signaling an input area for gesture recognition of a human-machine interface |
| US20150227231A1 (en) * | 2014-02-12 | 2015-08-13 | Microsoft Corporation | Virtual Transparent Display |
| KR20150110032A (en) * | 2014-03-24 | 2015-10-02 | 삼성전자주식회사 | Electronic Apparatus and Method for Image Data Processing |
| WO2015161062A1 (en) * | 2014-04-18 | 2015-10-22 | Bally Gaming, Inc. | System and method for augmented reality gaming |
| US9501871B2 (en) | 2014-04-30 | 2016-11-22 | At&T Mobility Ii Llc | Explorable augmented reality displays |
| TWI518603B (en) | 2014-05-22 | 2016-01-21 | 宏達國際電子股份有限公司 | Image editing method and electronic device |
| US10600245B1 (en) | 2014-05-28 | 2020-03-24 | Lucasfilm Entertainment Company Ltd. | Navigating a virtual environment of a media content item |
| KR102303115B1 (en) * | 2014-06-05 | 2021-09-16 | 삼성전자 주식회사 | Method For Providing Augmented Reality Information And Wearable Device Using The Same |
| KR101595957B1 (en) * | 2014-06-12 | 2016-02-18 | 엘지전자 주식회사 | Mobile terminal and controlling system |
| EP3180676A4 (en) * | 2014-06-17 | 2018-01-10 | Osterhout Group, Inc. | External user interface for head worn computing |
| JP6500355B2 (en) * | 2014-06-20 | 2019-04-17 | 富士通株式会社 | Display device, display program, and display method |
| US20150379770A1 (en) * | 2014-06-27 | 2015-12-31 | David C. Haley, JR. | Digital action in response to object interaction |
| US9959591B2 (en) * | 2014-07-31 | 2018-05-01 | Seiko Epson Corporation | Display apparatus, method for controlling display apparatus, and program |
| JP6638195B2 (en) * | 2015-03-02 | 2020-01-29 | セイコーエプソン株式会社 | DISPLAY DEVICE, DISPLAY DEVICE CONTROL METHOD, AND PROGRAM |
| CN104133593A (en) * | 2014-08-06 | 2014-11-05 | 北京行云时空科技有限公司 | Character input system and method based on motion sensing |
| CN104156082A (en) * | 2014-08-06 | 2014-11-19 | 北京行云时空科技有限公司 | Control system and intelligent terminal of user interfaces and applications aimed at space-time scenes |
| US9696551B2 (en) * | 2014-08-13 | 2017-07-04 | Beijing Lenovo Software Ltd. | Information processing method and electronic device |
| US9690375B2 (en) | 2014-08-18 | 2017-06-27 | Universal City Studios Llc | Systems and methods for generating augmented and virtual reality images |
| CN104197950B (en) * | 2014-08-19 | 2018-02-16 | 奇瑞汽车股份有限公司 | The method and system that geography information is shown |
| US9910504B2 (en) * | 2014-08-21 | 2018-03-06 | Samsung Electronics Co., Ltd. | Sensor based UI in HMD incorporating light turning element |
| JP5989725B2 (en) * | 2014-08-29 | 2016-09-07 | 京セラドキュメントソリューションズ株式会社 | Electronic device and information display program |
| DE102014217843A1 (en) * | 2014-09-05 | 2016-03-10 | Martin Cudzilo | Apparatus for facilitating the cleaning of surfaces and methods for detecting cleaning work done |
| GB2530036A (en) | 2014-09-09 | 2016-03-16 | Ultrahaptics Ltd | Method and apparatus for modulating haptic feedback |
| TWI613615B (en) * | 2014-10-15 | 2018-02-01 | 在地實驗文化事業有限公司 | Navigation system and method |
| US20160109701A1 (en) * | 2014-10-15 | 2016-04-21 | GM Global Technology Operations LLC | Systems and methods for adjusting features within a head-up display |
| US10108256B2 (en) * | 2014-10-30 | 2018-10-23 | Mediatek Inc. | Systems and methods for processing incoming events while performing a virtual reality session |
| WO2016071244A2 (en) * | 2014-11-06 | 2016-05-12 | Koninklijke Philips N.V. | Method and system of communication for use in hospitals |
| KR102038965B1 (en) * | 2014-11-26 | 2019-10-31 | 삼성전자주식회사 | Untrasound sensor and object detecting method thereof |
| EP3236335A4 (en) | 2014-12-17 | 2018-07-25 | Konica Minolta, Inc. | Electronic instrument, method of controlling electronic instrument, and control program for same |
| CN104537401B (en) * | 2014-12-19 | 2017-05-17 | 南京大学 | Reality augmentation system and working method based on technologies of radio frequency identification and depth of field sensor |
| US9658693B2 (en) * | 2014-12-19 | 2017-05-23 | Immersion Corporation | Systems and methods for haptically-enabled interactions with objects |
| US9600076B2 (en) * | 2014-12-19 | 2017-03-21 | Immersion Corporation | Systems and methods for object manipulation with haptic feedback |
| US9685005B2 (en) * | 2015-01-02 | 2017-06-20 | Eon Reality, Inc. | Virtual lasers for interacting with augmented reality environments |
| US20160196693A1 (en) * | 2015-01-06 | 2016-07-07 | Seiko Epson Corporation | Display system, control method for display device, and computer program |
| US10317215B2 (en) | 2015-01-09 | 2019-06-11 | Boe Technology Group Co., Ltd. | Interactive glasses and navigation system |
| CN104570354A (en) * | 2015-01-09 | 2015-04-29 | 京东方科技集团股份有限公司 | Interactive glasses and visitor guide system |
| TWI619041B (en) * | 2015-01-09 | 2018-03-21 | Chunghwa Telecom Co Ltd | Augmented reality unlocking system and method |
| JP2016133541A (en) * | 2015-01-16 | 2016-07-25 | 株式会社ブリリアントサービス | Electronic spectacle and method for controlling the same |
| US10317677B2 (en) | 2015-02-09 | 2019-06-11 | Microsoft Technology Licensing, Llc | Display system |
| US10018844B2 (en) | 2015-02-09 | 2018-07-10 | Microsoft Technology Licensing, Llc | Wearable image display system |
| US20170061700A1 (en) * | 2015-02-13 | 2017-03-02 | Julian Michael Urbach | Intercommunication between a head mounted display and a real world object |
| JP6771473B2 (en) | 2015-02-20 | 2020-10-21 | ウルトラハプティクス アイピー リミテッドUltrahaptics Ip Ltd | Improved algorithm in the tactile system |
| US9886633B2 (en) * | 2015-02-23 | 2018-02-06 | Vivint, Inc. | Techniques for identifying and indexing distinguishing features in a video feed |
| EP3267295B1 (en) * | 2015-03-05 | 2021-12-29 | Sony Group Corporation | Information processing device, control method, and program |
| JP6596883B2 (en) | 2015-03-31 | 2019-10-30 | ソニー株式会社 | Head mounted display, head mounted display control method, and computer program |
| US20160292920A1 (en) * | 2015-04-01 | 2016-10-06 | Caterpillar Inc. | Time-Shift Controlled Visualization of Worksite Operations |
| US10156908B2 (en) * | 2015-04-15 | 2018-12-18 | Sony Interactive Entertainment Inc. | Pinch and hold gesture navigation on a head-mounted display |
| JP6534292B2 (en) * | 2015-04-24 | 2019-06-26 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | Head mounted display and control method of head mounted display |
| US10055888B2 (en) | 2015-04-28 | 2018-08-21 | Microsoft Technology Licensing, Llc | Producing and consuming metadata within multi-dimensional data |
| DE102015211515A1 (en) * | 2015-06-23 | 2016-12-29 | Siemens Aktiengesellschaft | Interaction system |
| US10409443B2 (en) * | 2015-06-24 | 2019-09-10 | Microsoft Technology Licensing, Llc | Contextual cursor display based on hand tracking |
| US10156726B2 (en) * | 2015-06-29 | 2018-12-18 | Microsoft Technology Licensing, Llc | Graphene in optical systems |
| US10818162B2 (en) | 2015-07-16 | 2020-10-27 | Ultrahaptics Ip Ltd | Calibration techniques in haptic systems |
| CN105138763A (en) * | 2015-08-19 | 2015-12-09 | 中山大学 | Method for real scene and reality information superposition in augmented reality |
| CN112557676B (en) * | 2015-08-25 | 2025-04-29 | 株式会社日立高新技术 | Marking method |
| CN105205454A (en) * | 2015-08-27 | 2015-12-30 | 深圳市国华识别科技开发有限公司 | System and method for capturing target object automatically |
| KR102456597B1 (en) * | 2015-09-01 | 2022-10-20 | 삼성전자주식회사 | Electronic apparatus and operating method thereof |
| KR101708455B1 (en) * | 2015-09-08 | 2017-02-21 | 엠더블유엔테크 주식회사 | Hand Float Menu System |
| CN105183173B (en) * | 2015-10-12 | 2018-08-28 | 重庆中电大宇卫星应用技术研究所 | It is a kind of by tactics and Morse code gesture input and the device for being converted to voice |
| CN113220116A (en) | 2015-10-20 | 2021-08-06 | 奇跃公司 | System and method for changing user input mode of wearable device and wearable system |
| DE102015221860A1 (en) * | 2015-11-06 | 2017-05-11 | BSH Hausgeräte GmbH | System and method for facilitating operation of a household appliance |
| CN105872815A (en) * | 2015-11-25 | 2016-08-17 | 乐视网信息技术(北京)股份有限公司 | Video playing method and device |
| EP3182328A1 (en) * | 2015-12-17 | 2017-06-21 | Nokia Technologies Oy | A method, apparatus or computer program for controlling image processing of a captured image of a scene to adapt the captured image |
| US9697648B1 (en) | 2015-12-23 | 2017-07-04 | Intel Corporation | Text functions in augmented reality |
| JP2017129406A (en) * | 2016-01-19 | 2017-07-27 | 日本電気通信システム株式会社 | Information processing device, smart glass and control method thereof, and computer program |
| CN105843390B (en) * | 2016-02-24 | 2019-03-19 | 上海理湃光晶技术有限公司 | A method of image scaling and AR glasses based on the method |
| US10168768B1 (en) | 2016-03-02 | 2019-01-01 | Meta Company | Systems and methods to facilitate interactions in an interactive space |
| US10133345B2 (en) | 2016-03-22 | 2018-11-20 | Microsoft Technology Licensing, Llc | Virtual-reality navigation |
| US9933855B2 (en) * | 2016-03-31 | 2018-04-03 | Intel Corporation | Augmented reality in a field of view including a reflection |
| AU2017244109B2 (en) | 2016-03-31 | 2022-06-23 | Magic Leap, Inc. | Interactions with 3D virtual objects using poses and multiple-DOF controllers |
| SE541141C2 (en) * | 2016-04-18 | 2019-04-16 | Moonlightning Ind Ab | Focus pulling with a stereo vision camera system |
| US10186088B2 (en) | 2016-05-13 | 2019-01-22 | Meta Company | System and method for managing interactive virtual frames for virtual objects in a virtual environment |
| US9990779B2 (en) | 2016-05-13 | 2018-06-05 | Meta Company | System and method for modifying virtual objects in a virtual environment in response to user interactions |
| ES2643863B1 (en) * | 2016-05-24 | 2018-10-26 | Sonovisión Ingenieros España, S.A.U. | METHOD FOR PROVIDING BY GUIDED INCREASED REALITY, INSPECTION AND SUPPORT IN INSTALLATION OR MAINTENANCE OF PROCESSES FOR COMPLEX ASSEMBLIES COMPATIBLE WITH S1000D AND DEVICE THAT MAKES SAME USE |
| CN105915715A (en) * | 2016-05-25 | 2016-08-31 | 努比亚技术有限公司 | Incoming call reminding method and device thereof, wearable audio device and mobile terminal |
| WO2017217752A1 (en) * | 2016-06-17 | 2017-12-21 | 이철윤 | System and method for generating three dimensional composite image of product and packing box |
| CN106157363A (en) * | 2016-06-28 | 2016-11-23 | 广东欧珀移动通信有限公司 | A camera method, device and mobile terminal based on augmented reality |
| CN106125932A (en) * | 2016-06-28 | 2016-11-16 | 广东欧珀移动通信有限公司 | A method, device, and mobile terminal for identifying target objects in augmented reality |
| CN106155315A (en) * | 2016-06-28 | 2016-11-23 | 广东欧珀移动通信有限公司 | Method, device and mobile terminal for adding augmented reality effect in shooting |
| CN106066701B (en) * | 2016-07-05 | 2019-07-26 | 上海智旭商务咨询有限公司 | A kind of AR and VR data processing equipment and method |
| KR20180009170A (en) * | 2016-07-18 | 2018-01-26 | 엘지전자 주식회사 | Mobile terminal and operating method thereof |
| US11216069B2 (en) | 2018-05-08 | 2022-01-04 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
| US10990174B2 (en) | 2016-07-25 | 2021-04-27 | Facebook Technologies, Llc | Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors |
| EP3487595A4 (en) | 2016-07-25 | 2019-12-25 | CTRL-Labs Corporation | SYSTEM AND METHOD FOR MEASURING MOVEMENTS OF ARTICULATED RIGID BODIES |
| EP3487402B1 (en) | 2016-07-25 | 2021-05-05 | Facebook Technologies, LLC | Methods and apparatus for inferring user intent based on neuromuscular signals |
| US11179066B2 (en) | 2018-08-13 | 2021-11-23 | Facebook Technologies, Llc | Real-time spike detection and identification |
| US11635736B2 (en) | 2017-10-19 | 2023-04-25 | Meta Platforms Technologies, Llc | Systems and methods for identifying biological structures associated with neuromuscular source signals |
| WO2020112986A1 (en) | 2018-11-27 | 2020-06-04 | Facebook Technologies, Inc. | Methods and apparatus for autocalibration of a wearable electrode sensor system |
| US10268275B2 (en) | 2016-08-03 | 2019-04-23 | Ultrahaptics Ip Ltd | Three-dimensional perceptions in haptic systems |
| CN106354257A (en) * | 2016-08-30 | 2017-01-25 | 湖北睛彩视讯科技有限公司 | Mobile scene fusion system and method based on augmented reality technology |
| CN106980362A (en) | 2016-10-09 | 2017-07-25 | 阿里巴巴集团控股有限公司 | Input method and device based on virtual reality scenario |
| US11119585B2 (en) | 2016-10-13 | 2021-09-14 | Ford Motor Company | Dual-mode augmented reality interfaces for mobile devices |
| US10257558B2 (en) * | 2016-10-26 | 2019-04-09 | Orcam Technologies Ltd. | Systems and methods for constructing and indexing a database of joint profiles for persons viewed by multiple wearable apparatuses |
| JP2018082363A (en) * | 2016-11-18 | 2018-05-24 | セイコーエプソン株式会社 | Head-mounted display device and method for controlling the same, and computer program |
| WO2018100575A1 (en) | 2016-11-29 | 2018-06-07 | Real View Imaging Ltd. | Tactile feedback in a display system |
| WO2018113740A1 (en) * | 2016-12-21 | 2018-06-28 | Zyetric Technologies Limited | Combining virtual reality and augmented reality |
| US11507216B2 (en) * | 2016-12-23 | 2022-11-22 | Realwear, Inc. | Customizing user interfaces of binary applications |
| US10620910B2 (en) | 2016-12-23 | 2020-04-14 | Realwear, Inc. | Hands-free navigation of touch-based operating systems |
| US11099716B2 (en) | 2016-12-23 | 2021-08-24 | Realwear, Inc. | Context based content navigation for wearable display |
| CN106682468A (en) * | 2016-12-30 | 2017-05-17 | 百度在线网络技术(北京)有限公司 | Method of unlocking electronic device and electronic device |
| USD864959S1 (en) | 2017-01-04 | 2019-10-29 | Mentor Acquisition One, Llc | Computer glasses |
| US11572653B2 (en) * | 2017-03-10 | 2023-02-07 | Zyetric Augmented Reality Limited | Interactive augmented reality |
| EP4250066A3 (en) | 2017-03-21 | 2023-11-29 | InterDigital VC Holdings, Inc. | Method and system for the detection and augmentation of tactile interactions in augmented reality |
| US10489651B2 (en) * | 2017-04-14 | 2019-11-26 | Microsoft Technology Licensing, Llc | Identifying a position of a marker in an environment |
| US10620779B2 (en) * | 2017-04-24 | 2020-04-14 | Microsoft Technology Licensing, Llc | Navigating a holographic image |
| US10481755B1 (en) * | 2017-04-28 | 2019-11-19 | Meta View, Inc. | Systems and methods to present virtual content in an interactive space |
| US11054894B2 (en) | 2017-05-05 | 2021-07-06 | Microsoft Technology Licensing, Llc | Integrated mixed-input system |
| CN111033444B (en) | 2017-05-10 | 2024-03-05 | 优玛尼股份有限公司 | Wearable multimedia devices and cloud computing platform with application ecosystem |
| US12230029B2 (en) * | 2017-05-10 | 2025-02-18 | Humane, Inc. | Wearable multimedia device and cloud computing platform with laser projection system |
| US11023109B2 (en) | 2017-06-30 | 2021-06-01 | Microsoft Technology Licensing, LLC | Annotation using a multi-device mixed interactivity system |
| US10895966B2 (en) | 2017-06-30 | 2021-01-19 | Microsoft Technology Licensing, Llc | Selection using a multi-device mixed interactivity system |
| CN107340871A (en) * | 2017-07-25 | 2017-11-10 | 深识全球创新科技(北京)有限公司 | The devices and methods therefor and purposes of integrated gesture identification and ultrasonic wave touch feedback |
| WO2019021447A1 (en) * | 2017-07-28 | 2019-01-31 | 株式会社オプティム | Wearable terminal display system, wearable terminal display method and program |
| WO2019021446A1 (en) * | 2017-07-28 | 2019-01-31 | 株式会社オプティム | Wearable terminal display system, wearable terminal display method and program |
| CN107635057A (en) * | 2017-07-31 | 2018-01-26 | 努比亚技术有限公司 | A kind of virtual reality terminal control method, terminal and computer-readable recording medium |
| US10591730B2 (en) * | 2017-08-25 | 2020-03-17 | II Jonathan M. Rodriguez | Wristwatch based interface for augmented reality eyewear |
| US10068403B1 (en) | 2017-09-21 | 2018-09-04 | Universal City Studios Llc | Locker management techniques |
| US10506217B2 (en) * | 2017-10-09 | 2019-12-10 | Facebook Technologies, Llc | Head-mounted display tracking system |
| US20190129607A1 (en) * | 2017-11-02 | 2019-05-02 | Samsung Electronics Co., Ltd. | Method and device for performing remote control |
| JP2019086916A (en) * | 2017-11-02 | 2019-06-06 | オリンパス株式会社 | Work support device, work support method, and work support program |
| US11531395B2 (en) | 2017-11-26 | 2022-12-20 | Ultrahaptics Ip Ltd | Haptic effects from focused acoustic fields |
| WO2019123762A1 (en) * | 2017-12-22 | 2019-06-27 | ソニー株式会社 | Information processing device, information processing method, and program |
| EP3729418B1 (en) | 2017-12-22 | 2024-11-20 | Ultrahaptics Ip Ltd | Minimizing unwanted responses in haptic systems |
| WO2019122912A1 (en) | 2017-12-22 | 2019-06-27 | Ultrahaptics Limited | Tracking in haptic systems |
| US10739861B2 (en) * | 2018-01-10 | 2020-08-11 | Facebook Technologies, Llc | Long distance interaction with artificial reality objects using a near eye display interface |
| US11150730B1 (en) | 2019-04-30 | 2021-10-19 | Facebook Technologies, Llc | Devices, systems, and methods for controlling computing devices via neuromuscular signals of users |
| US10937414B2 (en) | 2018-05-08 | 2021-03-02 | Facebook Technologies, Llc | Systems and methods for text input using neuromuscular information |
| US11481030B2 (en) | 2019-03-29 | 2022-10-25 | Meta Platforms Technologies, Llc | Methods and apparatus for gesture detection and classification |
| US11493993B2 (en) | 2019-09-04 | 2022-11-08 | Meta Platforms Technologies, Llc | Systems, methods, and interfaces for performing inputs based on neuromuscular control |
| WO2019147956A1 (en) | 2018-01-25 | 2019-08-01 | Ctrl-Labs Corporation | Visualization of reconstructed handstate information |
| US11907423B2 (en) * | 2019-11-25 | 2024-02-20 | Meta Platforms Technologies, Llc | Systems and methods for contextualized interactions with an environment |
| US11961494B1 (en) | 2019-03-29 | 2024-04-16 | Meta Platforms Technologies, Llc | Electromagnetic interference reduction in extended reality environments |
| US10706628B2 (en) * | 2018-02-28 | 2020-07-07 | Lenovo (Singapore) Pte. Ltd. | Content transfer |
| US20190324549A1 (en) * | 2018-04-20 | 2019-10-24 | Immersion Corporation | Systems, devices, and methods for providing immersive reality interface modes |
| MX2020011492A (en) | 2018-05-02 | 2021-03-25 | Ultrahaptics Ip Ltd | Blocking plate structure for improved acoustic transmission efficiency. |
| US20190339837A1 (en) * | 2018-05-04 | 2019-11-07 | Oculus Vr, Llc | Copy and Paste in a Virtual Reality Environment |
| US10592001B2 (en) | 2018-05-08 | 2020-03-17 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
| US10768426B2 (en) | 2018-05-21 | 2020-09-08 | Microsoft Technology Licensing, Llc | Head mounted display system receiving three-dimensional push notification |
| EP3801216A1 (en) | 2018-05-29 | 2021-04-14 | Facebook Technologies, LLC. | Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods |
| CN112585600A (en) | 2018-06-14 | 2021-03-30 | Facebook Technologies, LLC | User identification and authentication using neuromuscular signatures |
| JP7056423B2 (en) * | 2018-07-10 | 2022-04-19 | Omron Corporation | Input device |
| US11360558B2 (en) * | 2018-07-17 | 2022-06-14 | Apple Inc. | Computer systems with finger devices |
| WO2020018892A1 (en) | 2018-07-19 | 2020-01-23 | Ctrl-Labs Corporation | Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device |
| US10890653B2 (en) | 2018-08-22 | 2021-01-12 | Google Llc | Radar-based gesture enhancement for voice interfaces |
| US10770035B2 (en) | 2018-08-22 | 2020-09-08 | Google Llc | Smartphone-based radar system for facilitating awareness of user presence and orientation |
| US10698603B2 (en) * | 2018-08-24 | 2020-06-30 | Google Llc | Smartphone-based radar system facilitating ease and accuracy of user interactions with displayed objects in an augmented-reality interface |
| US10909762B2 (en) | 2018-08-24 | 2021-02-02 | Microsoft Technology Licensing, Llc | Gestures for facilitating interaction with pages in a mixed reality environment |
| EP4241661A1 (en) | 2018-08-31 | 2023-09-13 | Facebook Technologies, LLC | Camera-guided interpretation of neuromuscular signals |
| CN109348003A (en) * | 2018-09-17 | 2019-02-15 | Shenzhen Tinno Mobile Technology Co., Ltd. | Application control method and device |
| WO2020061451A1 (en) * | 2018-09-20 | 2020-03-26 | Ctrl-Labs Corporation | Neuromuscular text entry, writing and drawing in augmented reality systems |
| CN110942518B (en) * | 2018-09-24 | 2024-03-29 | 苹果公司 | Contextual Computer Generated Reality (CGR) digital assistant |
| US10921764B2 (en) | 2018-09-26 | 2021-02-16 | Facebook Technologies, Llc | Neuromuscular control of physical objects in an environment |
| CN119454302A (en) * | 2018-10-05 | 2025-02-18 | Meta Platforms Technologies, LLC | Using neuromuscular signals to provide enhanced interaction with physical objects in augmented reality environments |
| KR102620702B1 (en) * | 2018-10-12 | 2024-01-04 | 삼성전자주식회사 | A mobile apparatus and a method for controlling the mobile apparatus |
| US10788880B2 (en) | 2018-10-22 | 2020-09-29 | Google Llc | Smartphone-based radar system for determining user intention in a lower-power mode |
| US10929099B2 (en) * | 2018-11-02 | 2021-02-23 | Bose Corporation | Spatialized virtual personal assistant |
| CN111273766B (en) | 2018-12-04 | 2022-05-13 | 苹果公司 | Method, apparatus and system for generating an affordance linked to a simulated reality representation of an item |
| US10789952B2 (en) * | 2018-12-20 | 2020-09-29 | Microsoft Technology Licensing, Llc | Voice command execution from auxiliary input |
| CN109782639A (en) * | 2018-12-29 | 2019-05-21 | Shenzhen Zhongfuneng Electric Equipment Co., Ltd. | Control method and control device for an operating mode of an electronic device |
| US12373033B2 (en) | 2019-01-04 | 2025-07-29 | Ultrahaptics Ip Ltd | Mid-air haptic textures |
| WO2020152828A1 (en) * | 2019-01-24 | 2020-07-30 | Maxell, Ltd. | Display terminal, application control system and application control method |
| US10885322B2 (en) * | 2019-01-31 | 2021-01-05 | Huawei Technologies Co., Ltd. | Hand-over-face input sensing for interaction with a device having a built-in camera |
| JP6720385B1 (en) * | 2019-02-07 | 2020-07-08 | Mercari, Inc. | Program, information processing method, and information processing terminal |
| US10905383B2 (en) | 2019-02-28 | 2021-02-02 | Facebook Technologies, Llc | Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces |
| US11842517B2 (en) | 2019-04-12 | 2023-12-12 | Ultrahaptics Ip Ltd | Using iterative 3D-model fitting for domain adaptation of a hand-pose-estimation neural network |
| CN110109547A (en) * | 2019-05-05 | 2019-08-09 | Yutou Technology (Hangzhou) Co., Ltd. | Command activation method and system based on gesture recognition |
| US11302081B2 (en) * | 2019-05-21 | 2022-04-12 | Magic Leap, Inc. | Caching and updating of dense 3D reconstruction data |
| JP7331462B2 (en) * | 2019-05-24 | 2023-08-23 | Kyocera Document Solutions Inc. | Robot system, robot control method and electronic device |
| US10747371B1 (en) * | 2019-06-28 | 2020-08-18 | Konica Minolta Business Solutions U.S.A., Inc. | Detection of finger press from live video stream |
| USD1009884S1 (en) * | 2019-07-26 | 2024-01-02 | Sony Corporation | Mixed reality eyeglasses or portion thereof with an animated graphical user interface |
| JP2021026260A (en) | 2019-07-31 | 2021-02-22 | Seiko Epson Corporation | Display unit, display method, and computer program |
| US10909767B1 (en) * | 2019-08-01 | 2021-02-02 | International Business Machines Corporation | Focal and interaction driven content replacement into augmented reality |
| US12229341B2 (en) | 2019-09-23 | 2025-02-18 | Apple Inc. | Finger-mounted input devices |
| US11275453B1 (en) | 2019-09-30 | 2022-03-15 | Snap Inc. | Smart ring for manipulating virtual objects displayed by a wearable device |
| US11374586B2 (en) | 2019-10-13 | 2022-06-28 | Ultraleap Limited | Reducing harmonic distortion by dithering |
| US20210116249A1 (en) * | 2019-10-16 | 2021-04-22 | The Board Of Trustees Of The California State University | Augmented reality marine navigation |
| US11288871B2 (en) * | 2019-11-08 | 2022-03-29 | Fujifilm Business Innovation Corp. | Web-based remote assistance system with context and content-aware 3D hand gesture visualization |
| US12089953B1 (en) | 2019-12-04 | 2024-09-17 | Meta Platforms Technologies, Llc | Systems and methods for utilizing intrinsic current noise to measure interface impedances |
| CN113012214A (en) * | 2019-12-20 | 2021-06-22 | Beijing Whyhow Information Technology Co., Ltd. | Method and electronic device for setting spatial position of virtual object |
| US11715453B2 (en) | 2019-12-25 | 2023-08-01 | Ultraleap Limited | Acoustic transducer structures |
| CN115244263A (en) * | 2020-02-28 | 2022-10-25 | NEC Corporation | Locker system, locker management method, and storage medium |
| US11277597B1 (en) | 2020-03-31 | 2022-03-15 | Snap Inc. | Marker-based guided AR experience |
| US11798429B1 (en) | 2020-05-04 | 2023-10-24 | Snap Inc. | Virtual tutorials for musical instruments with finger tracking in augmented reality |
| US11520399B2 (en) | 2020-05-26 | 2022-12-06 | Snap Inc. | Interactive augmented reality experiences using positional tracking |
| US11816267B2 (en) | 2020-06-23 | 2023-11-14 | Ultraleap Limited | Features of airborne ultrasonic fields |
| JP2022022568A (en) * | 2020-06-26 | 2022-02-07 | Oki Electric Industry Co., Ltd. | Display operation unit and device |
| JP7515590B2 (en) * | 2020-07-08 | 2024-07-12 | Maxell, Ltd. | Information processing terminal, remote control method and program |
| WO2022058738A1 (en) | 2020-09-17 | 2022-03-24 | Ultraleap Limited | Ultrahapticons |
| US11925863B2 (en) | 2020-09-18 | 2024-03-12 | Snap Inc. | Tracking hand gestures for interactive game control in augmented reality |
| US11546505B2 (en) * | 2020-09-28 | 2023-01-03 | Snap Inc. | Touchless photo capture in response to detected hand gestures |
| US11644902B2 (en) * | 2020-11-30 | 2023-05-09 | Google Llc | Gesture-based content transfer |
| WO2022146678A1 (en) | 2020-12-29 | 2022-07-07 | Snap Inc. | Micro hand gestures for controlling virtual and graphical elements |
| WO2022146673A1 (en) | 2020-12-30 | 2022-07-07 | Snap Inc. | Augmented reality precision tracking and display |
| US11740313B2 (en) | 2020-12-30 | 2023-08-29 | Snap Inc. | Augmented reality precision tracking and display |
| US11531402B1 (en) | 2021-02-25 | 2022-12-20 | Snap Inc. | Bimanual gestures for controlling virtual and graphical elements |
| CN113190110A (en) * | 2021-03-30 | 2021-07-30 | Qingdao Pico Technology Co., Ltd. | Interface element control method and device for a head-mounted display device |
| US11868531B1 (en) | 2021-04-08 | 2024-01-09 | Meta Platforms Technologies, Llc | Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof |
| WO2022216784A1 (en) | 2021-04-08 | 2022-10-13 | Snap Inc. | Bimanual interactions between mapped hand regions for controlling virtual and graphical elements |
| EP4280599A4 (en) * | 2021-04-09 | 2024-07-17 | Samsung Electronics Co., Ltd. | PORTABLE ELECTRONIC DEVICE WITH MULTIPLE CAMERAS |
| WO2022225761A1 (en) | 2021-04-19 | 2022-10-27 | Snap Inc. | Hand gestures for animating and controlling virtual and graphical elements |
| CN113141529B (en) * | 2021-04-25 | 2022-02-25 | Juhaokan Technology Co., Ltd. | Display device and media resource playback method |
| US11435857B1 (en) * | 2021-04-29 | 2022-09-06 | Google Llc | Content access and navigation using a head-mounted device |
| US11995899B2 (en) * | 2021-04-29 | 2024-05-28 | Google Llc | Pointer-based content recognition using a head-mounted device |
| US12517585B2 (en) | 2021-07-15 | 2026-01-06 | Ultraleap Limited | Control point manipulation techniques in haptic systems |
| WO2023283934A1 (en) * | 2021-07-16 | 2023-01-19 | Huawei Technologies Co.,Ltd. | Devices and methods for gesture-based selection |
| KR102629771B1 (en) * | 2021-09-30 | 2024-01-29 | 박두고 | Wearable device for recognition object using hand or finger tracking |
| US11967147B2 (en) * | 2021-10-01 | 2024-04-23 | At&T Intellectual Property I, L.P. | Augmented reality visualization of enclosed spaces |
| CN114089879B (en) * | 2021-11-15 | 2022-08-05 | Beijing Lingxi Weiguang Technology Co., Ltd. | Cursor control method for an augmented reality display device |
| US12405661B2 (en) * | 2022-01-10 | 2025-09-02 | Apple Inc. | Devices and methods for controlling electronic devices or systems with physical objects |
| US12265663B2 (en) * | 2022-04-04 | 2025-04-01 | Snap Inc. | Gesture-based application invocation |
| US12282607B2 (en) | 2022-04-27 | 2025-04-22 | Snap Inc. | Fingerspelling text entry |
| CN115309271B (en) * | 2022-09-29 | 2023-03-21 | Southern University of Science and Technology | Information display method, device, equipment and storage medium based on mixed reality |
| KR102703511B1 (en) * | 2022-12-29 | 2024-09-06 | Seoul National University of Science and Technology Industry-Academic Cooperation Foundation | System for generating virtual space using level of detail of object |
| US20250321630A1 (en) * | 2024-04-10 | 2025-10-16 | Meta Platforms Technologies, Llc | Single-Handed Mode for an Artificial Reality System |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1648840A (en) * | 2005-01-27 | 2005-08-03 | Beijing Institute of Technology | Head-mounted stereo vision hand gesture recognition device |
| CN101739122A (en) * | 2008-11-24 | 2010-06-16 | 玴荣科技股份有限公司 | Gesture Recognition and Tracking Method |
| WO2010101081A1 (en) * | 2009-03-05 | 2010-09-10 | Brother Industries, Ltd. | Head-mounted display apparatus, image control method, and image control program |
| WO2010144050A1 (en) * | 2009-06-08 | 2010-12-16 | Agency For Science, Technology And Research | Method and system for gesture based manipulation of a 3-dimensional image of object |
| CN102117117A (en) * | 2010-01-06 | 2011-07-06 | Primax Electronics Ltd. | System and method for controlling user gestures by using image capture device |
| US20110197147A1 (en) * | 2010-02-11 | 2011-08-11 | Apple Inc. | Projected display shared workspaces |
| US20110213664A1 (en) * | 2010-02-28 | 2011-09-01 | Osterhout Group, Inc. | Local advertising content on an interactive head-mounted eyepiece |
Family Cites Families (58)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0981309A (en) * | 1995-09-13 | 1997-03-28 | Toshiba Corp | Input device |
| JP3365246B2 (en) | 1997-03-14 | 2003-01-08 | Minolta Co., Ltd. | Electronic still camera |
| JP3225882B2 (en) * | 1997-03-27 | 2001-11-05 | Nippon Telegraph and Telephone Corporation | Landscape labeling system |
| DE19917660A1 (en) * | 1999-04-19 | 2000-11-02 | Deutsch Zentr Luft & Raumfahrt | Method and input device for controlling the position of an object to be graphically represented in a virtual reality |
| AU7651100A (en) | 1999-09-15 | 2001-04-17 | Roche Consumer Health Ag | Pharmaceutical and/or cosmetical compositions |
| US6771294B1 (en) * | 1999-12-29 | 2004-08-03 | Petri Pulli | User interface |
| SE0000850D0 (en) * | 2000-03-13 | 2000-03-13 | Pink Solution Ab | Recognition arrangement |
| CA2410427A1 (en) * | 2000-05-29 | 2001-12-06 | Vkb Inc. | Virtual data entry device and method for input of alphanumeric and other data |
| JP2002157606A (en) * | 2000-11-17 | 2002-05-31 | Canon Inc | Image display control device, mixed reality presentation system, image display control method, and medium providing processing program |
| US7215322B2 (en) | 2001-05-31 | 2007-05-08 | Siemens Corporate Research, Inc. | Input devices for augmented reality applications |
| US7126558B1 (en) | 2001-10-19 | 2006-10-24 | Accenture Global Services Gmbh | Industrial augmented reality |
| AU2003217587A1 (en) * | 2002-02-15 | 2003-09-09 | Canesta, Inc. | Gesture recognition system using depth perceptive sensors |
| US6943754B2 (en) * | 2002-09-27 | 2005-09-13 | The Boeing Company | Gaze tracking system, eye-tracking assembly and an associated method of calibration |
| US7676079B2 (en) * | 2003-09-30 | 2010-03-09 | Canon Kabushiki Kaisha | Index identification method and apparatus |
| IL161002A0 (en) | 2004-03-22 | 2004-08-31 | Itay Katz | Virtual video keyboard system |
| CN101375599A (en) * | 2004-06-01 | 2009-02-25 | L-3 Communications Corporation | Method and system for performing video flashlight |
| US20060200662A1 (en) * | 2005-02-01 | 2006-09-07 | Microsoft Corporation | Referencing objects in a virtual environment |
| WO2007011306A2 (en) * | 2005-07-20 | 2007-01-25 | Bracco Imaging S.P.A. | A method of and apparatus for mapping a virtual model of an object to the object |
| KR100687737B1 (en) * | 2005-03-19 | 2007-02-27 | 한국전자통신연구원 | Virtual Mouse Device and Method Based on Two-Hand Gesture |
| CA2621488A1 (en) * | 2005-09-13 | 2007-03-22 | Spacetime3D, Inc. | System and method for providing three-dimensional graphical user interface |
| JP4851771B2 (en) * | 2005-10-24 | 2012-01-11 | Kyocera Corporation | Information processing system and portable information terminal |
| US7509588B2 (en) * | 2005-12-30 | 2009-03-24 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
| US7725547B2 (en) * | 2006-09-06 | 2010-05-25 | International Business Machines Corporation | Informing a user of gestures made by others out of the user's line of sight |
| JP4961914B2 (en) * | 2006-09-08 | 2012-06-27 | Sony Corporation | Imaging display device and imaging display method |
| JP5228307B2 (en) * | 2006-10-16 | 2013-07-03 | Sony Corporation | Display device and display method |
| WO2008153599A1 (en) * | 2006-12-07 | 2008-12-18 | Adapx, Inc. | Systems and methods for data annotation, recordation, and communication |
| KR101285360B1 (en) * | 2007-01-25 | 2013-07-11 | 삼성전자주식회사 | Point of interest displaying apparatus and method for using augmented reality |
| US8942764B2 (en) * | 2007-10-01 | 2015-01-27 | Apple Inc. | Personal media device controlled via user initiated movements utilizing movement based interfaces |
| JP4933406B2 (en) * | 2007-11-15 | 2012-05-16 | Canon Inc. | Image processing apparatus and image processing method |
| US8165345B2 (en) * | 2007-12-07 | 2012-04-24 | Tom Chau | Method, system, and computer program for detecting and characterizing motion |
| EP2258587A4 (en) * | 2008-03-19 | 2013-08-07 | Denso Corp | Operation input device for vehicle |
| JP5250834B2 (en) * | 2008-04-03 | 2013-07-31 | Konica Minolta, Inc. | Head-mounted image display device |
| WO2009128064A2 (en) * | 2008-04-14 | 2009-10-22 | Pointgrab Ltd. | Vision based pointing device emulation |
| US8971565B2 (en) * | 2008-05-29 | 2015-03-03 | Hie-D Technologies, Llc | Human interface electronic device |
| WO2010042880A2 (en) * | 2008-10-10 | 2010-04-15 | Neoflect, Inc. | Mobile computing device with a virtual keyboard |
| US8397181B2 (en) | 2008-11-17 | 2013-03-12 | Honeywell International Inc. | Method and apparatus for marking a position of a real world object in a see-through display |
| US9041660B2 (en) * | 2008-12-09 | 2015-05-26 | Microsoft Technology Licensing, Llc | Soft keyboard control |
| US9405970B2 (en) | 2009-02-02 | 2016-08-02 | Eyesight Mobile Technologies Ltd. | System and method for object recognition and tracking in a video stream |
| CN102326133B (en) * | 2009-02-20 | 2015-08-26 | Koninklijke Philips Electronics N.V. | System, method and apparatus for causing a device to enter an active mode |
| US8009022B2 (en) | 2009-05-29 | 2011-08-30 | Microsoft Corporation | Systems and methods for immersive interaction with virtual objects |
| KR101622196B1 (en) * | 2009-09-07 | 2016-05-18 | 삼성전자주식회사 | Apparatus and method for providing poi information in portable terminal |
| US20110107216A1 (en) * | 2009-11-03 | 2011-05-05 | Qualcomm Incorporated | Gesture-based user interface |
| JP4679661B1 (en) * | 2009-12-15 | 2011-04-27 | Toshiba Corporation | Information presenting apparatus, information presenting method, and program |
| KR20110075250A (en) | 2009-12-28 | 2011-07-06 | 엘지전자 주식회사 | Object tracking method and device using object tracking mode |
| US20110214082A1 (en) * | 2010-02-28 | 2011-09-01 | Osterhout Group, Inc. | Projection triggering through an external marker in an augmented reality eyepiece |
| KR20130000401A (en) * | 2010-02-28 | 2013-01-02 | 오스터하우트 그룹 인코포레이티드 | Local advertising content on an interactive head-mounted eyepiece |
| US9128281B2 (en) * | 2010-09-14 | 2015-09-08 | Microsoft Technology Licensing, Llc | Eyepiece with uniformly illuminated reflective display |
| US8788197B2 (en) | 2010-04-30 | 2014-07-22 | Ryan Fink | Visual training devices, systems, and methods |
| US8593375B2 (en) | 2010-07-23 | 2013-11-26 | Gregory A Maltz | Eye gaze user interface and method |
| JP5499985B2 (en) * | 2010-08-09 | 2014-05-21 | Sony Corporation | Display assembly |
| US20120062602A1 (en) * | 2010-09-13 | 2012-03-15 | Nokia Corporation | Method and apparatus for rendering a content display |
| US8941559B2 (en) | 2010-09-21 | 2015-01-27 | Microsoft Corporation | Opacity filter for display device |
| US8768006B2 (en) * | 2010-10-19 | 2014-07-01 | Hewlett-Packard Development Company, L.P. | Hand gesture recognition |
| CN102147926A (en) * | 2011-01-17 | 2011-08-10 | ZTE Corporation | Three-dimensional (3D) icon processing method and device and mobile terminal |
| US9336240B2 (en) * | 2011-07-15 | 2016-05-10 | Apple Inc. | Geo-tagging digital images |
| US20130050069A1 (en) * | 2011-08-23 | 2013-02-28 | Sony Corporation, A Japanese Corporation | Method and system for use in providing three dimensional user interface |
| CN115167675A (en) * | 2011-09-19 | 2022-10-11 | Eyesight Mobile Technologies Ltd. | Augmented reality device |
| WO2013136333A1 (en) | 2012-03-13 | 2013-09-19 | Eyesight Mobile Technologies Ltd. | Touch free user interface |
- 2012
- 2012-09-19 CN CN202210808606.2A patent/CN115167675A/en active Pending
- 2012-09-19 CN CN201280048836.8A patent/CN103858073B/en not_active Expired - Fee Related
- 2012-09-19 KR KR1020147009451A patent/KR20140069124A/en not_active Ceased
- 2012-09-19 KR KR1020227001961A patent/KR20220032059A/en not_active Ceased
- 2012-09-19 KR KR1020197034815A patent/KR20190133080A/en not_active Ceased
- 2012-09-19 US US14/345,592 patent/US20140361988A1/en not_active Abandoned
- 2012-09-19 JP JP2014531374A patent/JP2014531662A/en active Pending
- 2012-09-19 WO PCT/IL2012/050376 patent/WO2013093906A1/en not_active Ceased
- 2016
- 2016-03-03 US US15/060,533 patent/US20160259423A1/en not_active Abandoned
- 2016-04-04 US US15/090,527 patent/US20160291699A1/en not_active Abandoned
- 2016-04-12 US US15/096,674 patent/US20160306433A1/en not_active Abandoned
- 2016-05-02 US US15/144,209 patent/US10401967B2/en not_active Expired - Fee Related
- 2016-09-02 US US15/256,481 patent/US20170052599A1/en not_active Abandoned
- 2017
- 2017-10-02 JP JP2017192930A patent/JP2018028922A/en active Pending
- 2019
- 2019-08-30 US US16/557,183 patent/US11093045B2/en not_active Expired - Fee Related
- 2020
- 2020-09-18 JP JP2020157123A patent/JP7297216B2/en active Active
- 2021
- 2021-08-13 US US17/401,427 patent/US11494000B2/en active Active
Also Published As
| Publication number | Publication date |
|---|---|
| US10401967B2 (en) | 2019-09-03 |
| US20160291699A1 (en) | 2016-10-06 |
| US20140361988A1 (en) | 2014-12-11 |
| JP2014531662A (en) | 2014-11-27 |
| US20160306433A1 (en) | 2016-10-20 |
| US11093045B2 (en) | 2021-08-17 |
| KR20190133080A (en) | 2019-11-29 |
| CN103858073B (en) | 2022-07-29 |
| JP2018028922A (en) | 2018-02-22 |
| WO2013093906A1 (en) | 2013-06-27 |
| KR20220032059A (en) | 2022-03-15 |
| US11494000B2 (en) | 2022-11-08 |
| US20170052599A1 (en) | 2017-02-23 |
| CN103858073A (en) | 2014-06-11 |
| JP2021007022A (en) | 2021-01-21 |
| US20200097093A1 (en) | 2020-03-26 |
| US20220107687A1 (en) | 2022-04-07 |
| KR20140069124A (en) | 2014-06-09 |
| US20160259423A1 (en) | 2016-09-08 |
| US20160320855A1 (en) | 2016-11-03 |
| JP7297216B2 (en) | 2023-06-26 |
Similar Documents
| Publication | Title |
|---|---|
| JP7297216B2 (en) | Touch-free interface for augmented reality systems |
| US11941764B2 (en) | Systems, methods, and graphical user interfaces for adding effects in augmented reality environments |
| US11782571B2 (en) | Device, method, and graphical user interface for manipulating 3D objects on a 2D screen |
| US20220319100A1 (en) | User interfaces simulated depth effects |
| US12462498B2 (en) | Systems, methods, and graphical user interfaces for adding effects in augmented reality environments |
| US11468890B2 (en) | Methods and user interfaces for voice-based control of electronic devices |
| US20240053859A1 (en) | Systems, Methods, and Graphical User Interfaces for Interacting with Virtual Reality Environments |
| US12429940B2 (en) | Systems, methods, and graphical user interfaces for automatic measurement in augmented reality environments |
| KR20150116871A (en) | Human-body-gesture-based region and volume selection for HMD |
| US20250110574A1 (en) | User interfaces integrating hardware buttons |
| US11367416B1 (en) | Presenting computer-generated content associated with reading content based on user interactions |
| US20240291944A1 (en) | Video application graphical effects |
| US20190235710A1 (en) | Page Turning Method and System for Digital Devices |
| US20260019699A1 (en) | Camera user interface |
| US20250377717A1 (en) | Systems, Methods, and Graphical User Interfaces for Automatic Measurement in Augmented Reality Environments |
| WO2024238090A1 (en) | Techniques for detecting text |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |