Augmented Reality (real-time) Patents (Class 345/633)
  • Patent number: 10120505
    Abstract: According to an illustrative embodiment, an information processing apparatus is provided. The apparatus is used for processing a first image projected toward a target. The apparatus includes a processing unit for detecting that an object exists between a projector unit and the target, wherein when an object exists between the projector unit and the target, the apparatus determines an area of the object and generates a modified first image, based on the area of the object, for projection toward the target.
    Type: Grant
    Filed: June 21, 2017
    Date of Patent: November 6, 2018
    Assignee: Sony Corporation
    Inventors: Hiroyuki Mizunuma, Ikuo Yamano, Shunichi Kasahara, Ken Miyashita, Kazuyuki Yamamoto
  • Patent number: 10115236
    Abstract: Systems described herein allow for placement and presentation of virtual objects using mobile devices with a single camera lens. A device receives, from a first mobile device, a target image captured from a camera and target image data collected contemporaneously with the target image. The target image data includes a geographic location, a direction heading, and a tilt. The device receives, from the first mobile device, a first virtual object definition that includes an object type, a size, and a mobile device orientation for presenting a first virtual object within a video feed. The device generates a simplified model of the target image, and stores the first virtual object definition associated with the target image data and the simplified model of the target image. The device uploads the first virtual object definition and the target image data, so the first virtual object is discoverable by a second mobile device.
    Type: Grant
    Filed: September 21, 2016
    Date of Patent: October 30, 2018
    Assignee: Verizon Patent and Licensing Inc.
    Inventors: Manish Sharma, Anand Prakash, Devin Blong, Qing Zhang, Chetan Mugur Nagaraj, Srivatsan Rangarajan, Eric Miller
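    Example sketch: a minimal Python data model for the target-image metadata and virtual object definition described in the abstract above. All class, field, and function names are hypothetical illustrations, not taken from the patent.
      from dataclasses import dataclass, asdict
      import json

      @dataclass
      class TargetImageData:          # collected contemporaneously with the target image
          latitude: float
          longitude: float
          heading_deg: float          # direction heading
          tilt_deg: float

      @dataclass
      class VirtualObjectDefinition:  # how the first device wants the object presented
          object_type: str
          size_m: float
          device_orientation_deg: float

      def store_placement(target: TargetImageData, obj: VirtualObjectDefinition) -> str:
          """Serialize the placement so a second device can discover and render it."""
          return json.dumps({"target": asdict(target), "object": asdict(obj)})

      record = store_placement(
          TargetImageData(40.7128, -74.0060, heading_deg=90.0, tilt_deg=10.0),
          VirtualObjectDefinition("arrow", size_m=0.5, device_orientation_deg=0.0),
      )
      print(record)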
  • Patent number: 10105619
    Abstract: A comprehensive solution is provided for transforming locations and retail spaces into high-traffic VR attractions that provide a VR experience blended with a real-world tactile experience. A modular stage and kit of stage accessories suitable for a wide variety of commercial venues contain all of the necessary equipment, infrastructure, technology and content to assemble and operate a tactile, onsite VR attraction. Utilizing a modular set of set-design elements and physical props, the physical structure and layout of the installations are designed to be easily rearranged and adapted to new VR content, without requiring extensive construction or specialized expertise.
    Type: Grant
    Filed: October 13, 2017
    Date of Patent: October 23, 2018
    Assignee: UNCHARTEDVR INC.
    Inventors: Kalon Ross Gutierrez, John Joseph Duncan, Douglas Griffin, Richard Schulze
  • Patent number: 10110853
    Abstract: An electronic device displays an image during a communication between two people. The image represents one of the people in the communication. The electronic device determines a location at which to place the image and displays the image such that the image appears to exist at the location.
    Type: Grant
    Filed: June 28, 2018
    Date of Patent: October 23, 2018
    Inventor: Philip Scott Lyren
  • Patent number: 10102659
    Abstract: Systems and methods for utilizing a device as a marker for virtual content viewed in an augmented reality environment are discussed herein. The device (or sign post) may comprise a wirelessly connectable device linked to a power source and associated with multiple linkage points. The device may provide information to a user (or a device of a user) defining virtual content and a correlation between the linkage points and a reference frame of the virtual content. When rendered by a display device, the virtual content may be presented based on the reference frame of the virtual content correlated to the real world by virtue of the position of the linkage points in the real world.
    Type: Grant
    Filed: September 18, 2017
    Date of Patent: October 16, 2018
    Inventor: Nicholas T. Hariton
  • Patent number: 10102647
    Abstract: A device includes a processor configured to execute a first estimation process including detecting a marker from a captured image, and estimating first position and posture of a camera at a time when the captured image is captured, based on a shape of the marker, execute a second estimation process including obtaining a map point in a three-dimensional space from the memory, and estimating second position and posture of the camera at the time when the captured image is captured, based on a correspondence between a projection point in which the map point is projected on the captured image and a characteristic point extracted from the captured image, and select the first position and posture or the second position and posture based on a result of comparison between a first translational component of the first position and posture and a second translational component of the second position and posture.
    Type: Grant
    Filed: April 18, 2016
    Date of Patent: October 16, 2018
    Assignee: FUJITSU LIMITED
    Inventors: Nobuyasu Yamaguchi, Atsunori Moteki
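    Example sketch: a minimal version of the selection step, comparing the translational components of the marker-based and map-point-based pose estimates. The threshold, the preference for the map-point estimate when the two agree, and all names are assumptions for illustration only.
      import numpy as np

      def select_pose(marker_pose, map_pose, max_translation_gap=0.05):
          """Each pose is (R, t): a 3x3 rotation matrix and a 3-vector translation.
          If the translational components agree within the gap, keep the map-point
          estimate; otherwise fall back to the marker-based estimate."""
          _, t_marker = marker_pose
          _, t_map = map_pose
          gap = np.linalg.norm(np.asarray(t_marker) - np.asarray(t_map))
          return map_pose if gap <= max_translation_gap else marker_pose

      R = np.eye(3)
      chosen = select_pose((R, [0.0, 0.0, 1.0]), (R, [0.01, 0.0, 1.0]))
      print(chosen[1])   # -> [0.01, 0.0, 1.0]; the two estimates agree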
  • Patent number: 10095929
    Abstract: Systems and methods for presenting an augmented reality view are disclosed. Embodiments include a system with a database for personalizing an augmented reality view of a physical environment using at least one of a location of a physical environment or a location of a user. The system may further include a hardware device in communication with the database, the hardware device including a renderer configured to render the augmented reality view for display and a controller configured to determine a scope of the augmented reality view authenticating the augmented reality view. The hardware device may include a processor configured to receive the augmented reality view of the physical environment, and present, via a display, augmented reality content to the user while the user is present in the physical environment, based on the determined scope of the augmented reality view.
    Type: Grant
    Filed: March 7, 2018
    Date of Patent: October 9, 2018
    Assignee: Capital One Services, LLC
    Inventors: Jason Richard Hoover, Micah Price, Sunil Subrahmanyam Vasisht, Qiaochu Tang, Geoffrey Dagley, Stephen Michael Wylie
  • Patent number: 10096161
    Abstract: Embodiments relate to using sensor data and location data from a device to generate augmented reality images. A mobile device pose (a geographic position, direction, and three-dimensional orientation of the device) can be determined within a location. A type of destination in the location can be identified and multiple destinations can be identified, with the mobile device receiving queue information about the identified destinations from a server. A first image can be captured. Based on the queue information, one of the identified destinations can be selected. The geographic position of each identified destination can be identified, and these positions can be combined with the mobile device pose to generate a second image. Finally, an augmented reality image can be generated by combining the first image and the second image, the augmented reality image identifying the selected destination.
    Type: Grant
    Filed: September 8, 2015
    Date of Patent: October 9, 2018
    Assignee: Live Nation Entertainment, Inc.
    Inventor: James Paul Callaghan
  • Patent number: 10095031
    Abstract: A virtual reality (VR) headset includes a first camera and a second camera capturing image data of an environment of the VR headset. Each camera has a field of view, and a portion of the fields of view of the first and second cameras overlap while a portion of the fields of view do not overlap. A processor receiving the image data from the first and second cameras is configured to identify a first observation of a position of the VR headset in the environment and positions of a plurality of features based on the image data captured by the first camera. The processor also identifies a second observation of the position of the VR headset in the environment and the positions of the features based on the image data captured by the second camera. Based on the first and second observations, the processor determines a model of the environment.
    Type: Grant
    Filed: May 25, 2018
    Date of Patent: October 9, 2018
    Assignee: Oculus VR, LLC
    Inventors: Dov Katz, Oskar Linde, Kjell Erik Alexander Sjoholm, Rasmus Karl Emil Simander
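    Example sketch: fusing the two per-camera observations of headset position and feature positions into one environment model by simple averaging. This only illustrates the idea; a real system would use a proper estimator, and all names are hypothetical.
      import numpy as np

      def fuse_observations(obs_cam1, obs_cam2):
          """Each observation: {'headset': 3-vector, 'features': {id: 3-vector}}.
          Features seen by both cameras (the overlapping field of view) are averaged;
          features seen by only one camera are kept as observed."""
          model = {"headset": (np.asarray(obs_cam1["headset"], dtype=float) +
                               np.asarray(obs_cam2["headset"], dtype=float)) / 2.0,
                   "features": {}}
          for fid in set(obs_cam1["features"]) | set(obs_cam2["features"]):
              pts = [np.asarray(o["features"][fid], dtype=float)
                     for o in (obs_cam1, obs_cam2) if fid in o["features"]]
              model["features"][fid] = sum(pts) / len(pts)
          return model

      print(fuse_observations(
          {"headset": [0, 0, 0], "features": {"a": [1, 0, 0], "b": [0, 1, 0]}},
          {"headset": [0.02, 0, 0], "features": {"a": [1.1, 0, 0], "c": [0, 0, 1]}}))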
  • Patent number: 10088685
    Abstract: Aspects of the disclosed apparatuses, methods, and systems provide a wearable, augmented or virtual reality display system including two or more sets of backlight illumination arrangements. Each different set of backlight illumination is directed to a corresponding optical imaging system. The corresponding optical imaging system is designed to focus each set of illumination at a different distance. The selection of a particular optical focal distance is controlled by selecting the set of backlight illumination corresponding to the imaging system to provide the desired focal distance. As a result, selection of the backlight illumination determines at which distance an image is perceived by a user. When multiple backlight illumination sets are multiplexed with 2-D images, a single, 3-D image is perceived by the wearer of the display system.
    Type: Grant
    Filed: October 19, 2016
    Date of Patent: October 2, 2018
    Assignee: Meta Company
    Inventors: Rami Aharoni, Ashish Ahuja, Zhangyi Zhong
  • Patent number: 10089794
    Abstract: This invention is a system and method for defining a location-specific augmented reality capability for use in portable devices having a camera. The system and method uses recent photographs or digital drawings of a particular location to help the user of the system or method position the portable device in a specific place. Once aligned, a digital scene is displayed to the user transposed over (and combined with) the camera view of the current, real-world environment at that location, creating an augmented reality experience for the user.
    Type: Grant
    Filed: August 29, 2017
    Date of Patent: October 2, 2018
    Assignee: Membit Inc.
    Inventors: John Christopher Byrne, Andrew Herbert Byrne, Jennifer Mary Byrne
  • Patent number: 10088673
    Abstract: A 3D display apparatus and method that address the vergence-accommodation conflict. A display screen component includes a display screen pixel array adapted to display a display screen image, a microlens imaging component including an array of microlenses corresponding to the display screen pixel array that can form a virtual or a real image of the display screen image, and a controllable movement component coupled to the imaging component or the display screen, wherein the imaging component and the display screen are controllably movable relative to each other, further wherein upon a controlled movement of the imaging component relative to the display screen, a location of the virtual or the real image along an optical axis is controllably changed.
    Type: Grant
    Filed: February 23, 2017
    Date of Patent: October 2, 2018
    Assignee: DeepSee Inc.
    Inventor: Jing Xu
  • Patent number: 10083480
    Abstract: A system may provide an augmented environment that facilitates a transaction. The system may store profile data including user payment or user profile information. The system may then receive environmental data including audio and visual information representing a physical environment. The system may then receive first user input data indicative of a selection of one or more items present in the physical environment, and identify one or more action items in the environmental data based on the first user input data. In response to this identification, the system may augment the environmental data by adding virtual environmental data, and then provide this virtual environmental data to a device to create an augmented environment. The system can then receive second user input data, and provide purchase request data to a merchant terminal to enable a transaction related to the one or more action items.
    Type: Grant
    Filed: May 3, 2018
    Date of Patent: September 25, 2018
    Assignee: CAPITAL ONE SERVICES, LLC
    Inventors: Karen Nickerson, Justin Wishne, Drew Jacobs, Justin Smith, Marco S. Giampaolo, Hannes Jouhikainen
  • Patent number: 10078954
    Abstract: A reading tracking system specifically designed for children including a wrist-worn arm motion and heart rate sensor coupled to a parental monitoring system, a game-style application for the child's use, and a group application useful in a classroom setting. The user's heart rate and arm movements are monitored to detect reading-related behavior states, such as reading, fallen asleep, distracted, and awoke after having fallen asleep, independent from precise eye gaze tracking. Heart rate monitoring detects a high heart rate that is inconsistent with reading, a moderate heart rate indicative of a sedentary awake state consistent with reading, and a low heart rate indicating that the user has fallen asleep. Arm movement monitoring detects a high arm activity state that is inconsistent with reading, a moderate arm movement state or gestures consistent with page turning while reading, and a low arm activity rate indicating that the user has fallen asleep.
    Type: Grant
    Filed: March 14, 2018
    Date of Patent: September 18, 2018
    Inventor: Culin Tate
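    Example sketch: a threshold classifier over heart rate and arm activity along the lines of the abstract above. The thresholds, the activity score, and the function name are hypothetical, not values from the patent.
      def classify_reading_state(heart_rate_bpm: float, arm_activity: float) -> str:
          """Return a coarse behavior state from wrist-sensor readings.
          arm_activity is a normalized movement score in [0, 1]."""
          if heart_rate_bpm < 55 and arm_activity < 0.05:
              return "fallen asleep"
          if heart_rate_bpm > 100 or arm_activity > 0.6:
              return "distracted"        # activity inconsistent with reading
          if arm_activity < 0.3:
              return "reading"           # sedentary, occasional page turns
          return "unknown"

      print(classify_reading_state(68, 0.1))   # -> reading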
  • Patent number: 10070676
    Abstract: A smart cap includes a fixing sleeve, a brim portion, and a sun visor. The fixing sleeve rotatably connects with one side of the brim portion, and the sun visor connects with another side of the brim portion opposite the one side of the brim portion. The fixing sleeve includes a voice inputting device, a voice outputting device, a communication device, and a SIM card. A motherboard running programs is configured on the brim portion, and an upper surface of the sun visor has a solar charging battery, a lower surface of the sun visor being a display screen. The solar charging battery provides power for all components of the smart cap via a power management device.
    Type: Grant
    Filed: August 4, 2016
    Date of Patent: September 11, 2018
    Assignees: Fu Tai Hua Industry (Shenzhen) Co., Ltd., HON HAI PRECISION INDUSTRY CO., LTD.
    Inventor: Xue-Qin Zhang
  • Patent number: 10071306
    Abstract: A virtual reality tracking system accurately determines one or more controller orientations using data from tracking cameras and/or an inertial measurement unit (IMU) embedded in each controller. Each controller has two or more distinctive light-emitting tracking markers. The tracking system determines the locations of the tracking markers based on the location of tracking markers in tracking camera's images. The tracking system determines the controller orientation using the locations of the tracking markers and orientation data from the IMU. When the camera views of the markers are obstructed the tracking system relies solely on the less-accurate orientation data from the IMU.
    Type: Grant
    Filed: March 25, 2016
    Date of Patent: September 11, 2018
    Assignee: ZERO LATENCY PTY LTD
    Inventor: Scott Vandonkelaar
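    Example sketch: selecting the orientation source depending on whether the camera views of the controller's tracking markers are obstructed, as the abstract describes. Names and the two-marker requirement are placeholders; the camera/IMU fusion itself is omitted.
      def controller_orientation(visible_marker_positions, imu_orientation):
          """visible_marker_positions: 3-D marker locations triangulated from the
          tracking cameras (empty when the views are obstructed).
          imu_orientation: orientation (e.g. a quaternion) reported by the IMU."""
          if len(visible_marker_positions) >= 2:
              # Enough visible markers: combine camera-derived marker geometry
              # with the IMU data (fusion step not shown here).
              return {"source": "cameras+imu", "orientation": imu_orientation}
          # Camera views obstructed: rely solely on the less-accurate IMU data.
          return {"source": "imu-only", "orientation": imu_orientation}

      print(controller_orientation([], (1.0, 0.0, 0.0, 0.0)))   # -> imu-only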
  • Patent number: 10075758
    Abstract: Synchronizing an augmented reality video stream with a displayed video stream includes: accessing an augmented reality video stream which corresponds to a displayed video stream and accessing synchronizing metadata associated with the augmented reality video stream for synchronizing the augmented reality video stream to the displayed video stream; the synchronizing metadata includes processed key frames of the displayed video stream. Processed key frames are selected frames from the displayed video stream which have been processed to provide data usable to compare images of frames. The displayed video stream is tracked by capturing and processing a frame of the displayed video stream. The augmented reality video stream is synchronized to the displayed video stream by matching the captured and processed frame of the displayed video stream with a processed key frame of the metadata at a known location in the augmented reality video stream.
    Type: Grant
    Filed: January 17, 2017
    Date of Patent: September 11, 2018
    Assignee: International Business Machines Corporation
    Inventors: Ruth K. Ayers, Chris Bean, Kevin C. Brown, Giacomo G. Chiarella, Alexandra E. Wishart, John J. Wood
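    Example sketch: matching a captured, processed frame of the displayed video stream against the processed key frames carried in the synchronizing metadata to find the corresponding location in the augmented reality stream. The average-hash "processing" is a stand-in not specified by the abstract, and all names are hypothetical.
      import numpy as np

      def average_hash(frame):
          """Stand-in frame processing: an 8x8 average hash of a grayscale frame."""
          small = frame[::max(frame.shape[0] // 8, 1), ::max(frame.shape[1] // 8, 1)][:8, :8]
          return (small > small.mean()).astype(np.uint8)

      def sync_offset(captured_frame, key_frames):
          """key_frames: list of (timestamp_in_ar_stream, processed_key_frame)."""
          probe = average_hash(captured_frame)
          best_time, best_dist = None, float("inf")
          for timestamp, key in key_frames:
              dist = int(np.sum(probe != key))        # Hamming distance between hashes
              if dist < best_dist:
                  best_time, best_dist = timestamp, dist
          return best_time                            # where to start the AR stream

      rng = np.random.default_rng(0)
      displayed = rng.integers(0, 255, (64, 64))
      keys = [(0.0, average_hash(rng.integers(0, 255, (64, 64)))), (2.5, average_hash(displayed))]
      print(sync_offset(displayed, keys))             # -> 2.5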
  • Patent number: 10068369
    Abstract: To integrate a sensory property such as occlusion, shadowing, reflection, etc. among physical and notional (e.g., virtual/augmented) visual or other sensory content, providing an appearance of similar occlusion, shadowing, etc. in both models, a reference position, a physical data model representing physical entities, and a notional data model are created or accessed. A first sensory property from either data model is selected. A second sensory property is determined corresponding with the first sensory property, and notional sensory content is generated from the notional data model with the second sensory property applied thereto. The notional sensory content is outputted to the reference position with a see-through display. Consequently, notional entities may appear occluded by physical entities, physical entities may appear to cast shadows from notional light sources, etc.
    Type: Grant
    Filed: October 31, 2015
    Date of Patent: September 4, 2018
    Inventors: Greg James, Allen Yang Yang, Sleiman Itani
  • Patent number: 10067415
    Abstract: A wearable electronic device is configured to project an image on a glass. The wearable electronic device includes: a glass; a projector configured to output one or more images; a shutter unit positioned in front of the projector to direct the images output from the projector toward the glass or in an outward direction; and a control unit configured to control the shutter unit.
    Type: Grant
    Filed: March 19, 2015
    Date of Patent: September 4, 2018
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Sungchul Park, Taegun Park, Jiyoon Park
  • Patent number: 10056054
    Abstract: In accordance with one example embodiment of the present invention, a plurality of antennas that are arranged according to a predetermined geometrical pattern receive radio emission signals from nearby radio emitting objects. Said radio emission signals are used, at least in part, to exhibit augmented reality indicia on a display, wherein the position of said augmented reality indicia on said display approximately indicates the direction of arrival of said radio emission signals and is organized or corrected according to predetermined criteria. One or more databases, either positioned on the cloud, or on the headset, or at an intermediate apparatus, may store the data, settings, and authorizations associated with said radio emitting object to permit and regulate the representation of said augmented reality indicia.
    Type: Grant
    Filed: July 26, 2015
    Date of Patent: August 21, 2018
    Inventors: Federico Fraccaroli, Brian Joseph Bochicco
  • Patent number: 10055895
    Abstract: Systems and methods for local augmented reality (AR) tracking of an AR object are disclosed. In one example embodiment a device captures a series of video image frames. A user input is received at the device associating a first portion of a first image of the video image frames with an AR sticker object and a target. A first target template is generated to track the target across frames of the video image frames. In some embodiments, global tracking based on a determination that the target is outside a boundary area is used. The global tracking comprises using a global tracking template for tracking movement in the video image frames captured following the determination that the target is outside the boundary area. When the global tracking determines that the target is within the boundary area, local tracking is resumed along with presentation of the AR sticker object on an output display of the device.
    Type: Grant
    Filed: January 29, 2016
    Date of Patent: August 21, 2018
    Assignee: Snap Inc.
    Inventors: Jia Li, Linjie Luo, Rahul Sheth, Ning Xu, Jianchao Yang
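    Example sketch: the switch between local target-template tracking and global tracking based on whether the target lies inside the boundary area, with the template matching itself abstracted behind a callable. All names are hypothetical.
      def track_frame(frame, state, boundary, match_template):
          """state: {'mode': 'local' or 'global', 'target_box': (x, y, w, h)}.
          boundary: (x_min, y_min, x_max, y_max) region in which local tracking
          of the AR sticker target is used.
          match_template: callable(frame, mode) -> new (x, y, w, h) estimate."""
          x, y, w, h = match_template(frame, state["mode"])
          inside = (boundary[0] <= x and boundary[1] <= y and
                    x + w <= boundary[2] and y + h <= boundary[3])
          state["target_box"] = (x, y, w, h)
          state["mode"] = "local" if inside else "global"
          state["show_sticker"] = inside    # AR sticker is presented during local tracking
          return state

      state = {"mode": "local", "target_box": (10, 10, 20, 20)}
      print(track_frame(None, state, (0, 0, 100, 100), lambda f, m: (95, 40, 20, 20)))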
  • Patent number: 10057511
    Abstract: A method is provided for overlaying target contents on a physical display area using a projected light grid or a grid of light emitters. Information on the target contents is transmitted from the emitters to an augmented reality device using light. The information includes a light-based data stream of the target contents and physical coordinates of a frame. The augmented reality device positions and displays the target contents on an area defined by the frame.
    Type: Grant
    Filed: May 11, 2016
    Date of Patent: August 21, 2018
    Assignee: International Business Machines Corporation
    Inventors: Ben Z. Akselrod, Anthony DiLoreto, Steve McDuff, Kyle D. Robeson
  • Patent number: 10048498
    Abstract: An illumination module can comprise a circuit board, a semiconductor-based light source mounted to the circuit board, an encasing mounted to the circuit board, and one or more optical surfaces at least partially contained within the encasing. The semiconductor-based light source can emit light in a first illumination pattern. The one or more optical surfaces can be collectively configured to receive the light from the edge-emitting semiconductor-based light source. The one or more optical surfaces can include a single optical surface configured to receive, condition, and redirect the light from the edge-emitting semiconductor-based light source. As such, the one or more optical surfaces can be collectively configured to output the conditioned and redirected light from the illumination module in a second illumination pattern different from the first illumination pattern.
    Type: Grant
    Filed: March 25, 2016
    Date of Patent: August 14, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Raymond Kirk Price, Ravi Kiran Nalla
  • Patent number: 10049479
    Abstract: An embodiment of the present invention provides a method of updating a CAD model representing an environment. Such an embodiment begins by generating a point cloud representing one or more objects of an environment based on received signals that have reflected off the one or more objects of the environment. Next, one or more clusters of the point cloud are identified based on the density of the points that make up the one or more clusters. In turn, the one or more clusters are mapped to existing CAD diagrams and a CAD model of the environment is automatically updated using the existing CAD diagrams.
    Type: Grant
    Filed: December 30, 2015
    Date of Patent: August 14, 2018
    Assignee: Dassault Systemes
    Inventor: Nelia Gloria Mazula
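    Example sketch: density-based grouping of the point cloud followed by a lookup against existing CAD diagrams. The use of scikit-learn's DBSCAN, the parameters, and all names are assumptions for illustration, not the patent's method.
      import numpy as np
      from sklearn.cluster import DBSCAN

      def update_cad_model(points, cad_lookup):
          """points: Nx3 point cloud built from the received (reflected) signals.
          cad_lookup: callable(centroid) -> existing CAD diagram id, or None.
          Returns (diagram_id, centroid) pairs to apply to the CAD model."""
          labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(points)  # density clusters
          updates = []
          for label in set(labels) - {-1}:          # label -1 marks noise points
              centroid = points[labels == label].mean(axis=0)
              diagram = cad_lookup(centroid)        # map the cluster to a diagram
              if diagram is not None:
                  updates.append((diagram, centroid))
          return updates

      cloud = np.vstack([np.random.normal([0, 0, 0], 0.1, (50, 3)),
                         np.random.normal([5, 0, 0], 0.1, (50, 3))])
      print(update_cad_model(cloud, lambda c: "pump" if c[0] > 2 else "tank"))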
  • Patent number: 10044984
    Abstract: An electronic device displays an image during a communication between two people. The image represents one of the people in the communication. The electronic device determines a location at which to place the image and displays the image such that the image appears to exist at the location.
    Type: Grant
    Filed: April 4, 2018
    Date of Patent: August 7, 2018
    Inventor: Philip Scott Lyren
  • Patent number: 10043430
    Abstract: An apparatus for aligning an eyecup to an electronic display panel of a head-mounted display is presented in this disclosure. The eyecup is coupled to the electronic display panel forming an eyecup assembly. An imaging device captures one or more images of image light projected by the electronic display panel through the eyecup. A calibration controller, interfaced with the electronic display panel and the imaging device, determines physical locations of pixels of the electronic display panel on a sensor of the imaging device based on the captured one or more images. The calibration controller also determines a preferred alignment for presenting images by the electronic display panel based on the determined physical locations of the pixels and a projected location of the eyecup.
    Type: Grant
    Filed: July 25, 2016
    Date of Patent: August 7, 2018
    Assignee: Oculus VR, LLC
    Inventors: Samuel Redmond D'Amico, Simon Hallam, Kieran Tobias Levin
  • Patent number: 10037627
    Abstract: An augmented visualization system includes a camera operable to capture a base image of a field of view. A spatial sensor is configured to sense a position of the camera and to generate positional information corresponding to the position. A controller is in communication with the camera, the spatial sensor, and a data source having stored geospatial data. The controller is configured to determine when the geospatial data corresponds to a location in the field of view of the camera based on the positional information. The controller is also configured to generate a geospatial image in response to the controller determining that the location corresponding to the geospatial data is in the field of view. A display is in communication with the controller and is operable to display a composite image in which the geospatial image is overlaid with the base image.
    Type: Grant
    Filed: August 12, 2016
    Date of Patent: July 31, 2018
    Assignee: Argis Technologies LLC
    Inventors: Brady Hustad, Dolphus James Derry, III, Christopher Anderson, Alex Yrigoyen, Kevin Criss, Jerre Teague
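    Example sketch: deciding whether a stored geospatial record falls within the camera's field of view, given the sensed position and heading, before generating the overlay image. The flat-earth bearing math and all names are simplifications for illustration.
      import math

      def in_field_of_view(cam_lat, cam_lon, heading_deg, fov_deg, pt_lat, pt_lon):
          """Rough flat-earth check that a geospatial point lies inside the camera's
          horizontal field of view; adequate only over short distances."""
          d_east = (pt_lon - cam_lon) * math.cos(math.radians(cam_lat))
          d_north = pt_lat - cam_lat
          bearing = math.degrees(math.atan2(d_east, d_north)) % 360.0
          offset = (bearing - heading_deg + 180.0) % 360.0 - 180.0   # signed angle difference
          return abs(offset) <= fov_deg / 2.0

      # Camera at the origin facing due north with a 60-degree field of view.
      print(in_field_of_view(0.0, 0.0, 0.0, 60.0, 0.001, 0.0002))   # -> True
      print(in_field_of_view(0.0, 0.0, 0.0, 60.0, 0.0, -0.001))     # -> False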
  • Patent number: 10037077
    Abstract: This disclosure relates to systems and methods for providing users with augmented reality experiences based on a specific location of a user. An augmented reality experience may include augmented reality objects and/or other content. Augmented reality objects may be associated with one or more real-world locations. Requests for an augmented reality experience may include identifications of real-world locations of client computing devices and/or other information. An instance of an augmented reality object may be overlaid onto visual information obtained from client computing devices. An interaction between users, and/or between users and one or more augmented reality objects, may be facilitated.
    Type: Grant
    Filed: June 21, 2016
    Date of Patent: July 31, 2018
    Assignee: Disney Enterprises, Inc.
    Inventors: Robert Auten, Malcolm E. Murdock, Joshua Nakaya
  • Patent number: 10038807
    Abstract: An image display apparatus includes an image display, a receiving unit, a detection section, and a change processing section. The image display displays an image. The receiving unit receives an operation performed by a user on information displayed on the image display. The detection section detects a user's finger approaching the receiving unit or an operation performed by the user on the receiving unit. The change processing section performs a change process on the image displayed on the image display, and stops or does not perform the change process when the detection section detects the approaching or the operation.
    Type: Grant
    Filed: February 28, 2017
    Date of Patent: July 31, 2018
    Assignee: FUJI XEROX CO., LTD.
    Inventors: Ryoko Saitoh, Yuichi Kawata, Hideki Yamasaki, Yoshifumi Bando, Kensuke Okamoto, Tomoyo Nishida
  • Patent number: 10033941
    Abstract: A mobile device includes at least one imaging sensor to capture imagery of an environment of the mobile device, a privacy filter module, a spatial feature detection module, an assembly module, and a network interface. The privacy filter module is to perform at least one image-based privacy filtering process using the captured imagery to generate filtered imagery. The spatial feature detection module is to determine a set of spatial features in the filtered imagery. The assembly module is to generate an area description file representative of the set of spatial features. The network interface is to transmit the area description file to a remote computing system. The assembly module may select only a subset of the set of spatial features for inclusion in the area description file.
    Type: Grant
    Filed: May 11, 2015
    Date of Patent: July 24, 2018
    Assignee: Google LLC
    Inventors: Brian Patrick Williams, Ryan Michael Hickman
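    Example sketch: the capture-to-upload pipeline described above (privacy filtering, spatial feature detection, selecting a subset of features for the area description file). Every stage is a stub with hypothetical names; the real modules are not specified by the abstract.
      import json
      import numpy as np

      def privacy_filter(image):
          """Stand-in for image-based privacy filtering (coarse quantization here)."""
          return image // 16 * 16

      def detect_spatial_features(image):
          """Stand-in feature detector: (x, y, strength) for bright pixels."""
          ys, xs = np.nonzero(image > image.mean())
          return [(int(x), int(y), float(image[y, x])) for x, y in zip(xs, ys)]

      def build_area_description(features, max_features=100):
          """Keep only the strongest subset of features for the uploaded file."""
          subset = sorted(features, key=lambda f: f[2], reverse=True)[:max_features]
          return json.dumps({"features": subset})

      image = np.random.default_rng(1).integers(0, 255, (32, 32))
      area_file = build_area_description(detect_spatial_features(privacy_filter(image)))
      print(len(json.loads(area_file)["features"]))   # at most 100 features uploaded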
  • Patent number: 10033844
    Abstract: The present disclosure relates to a proximity illuminance sensor and a mobile terminal using the same. Disclosed is a mobile terminal whose upper bezel can be shortened by using: the proximity illuminance (IR) sensor disposed on the rear surface of a front case and disposed to be perpendicular to a display unit; and a light reflector disposed at one side of the proximity illuminance sensor, such that light is incident to the proximity illuminance sensor or emitted from the proximity illuminance sensor to the outside.
    Type: Grant
    Filed: January 15, 2015
    Date of Patent: July 24, 2018
    Assignee: LG ELECTRONICS INC.
    Inventors: Byunghwa Lee, Hyunsu Song
  • Patent number: 10025546
    Abstract: Aspects of the present invention disclose a method for controlling a device remotely. The method includes one or more processors identifying one or more electronic devices, viewed through a transparent display, connected to a network. The method further includes one or more processors determining a first electronic device, from the one or more electronic devices. The method further includes one or more processors mirroring the user interface of the first electronic device on the transparent display.
    Type: Grant
    Filed: July 14, 2015
    Date of Patent: July 17, 2018
    Assignee: International Business Machines Corporation
    Inventors: James E. Bostick, John M. Ganci, Jr., Martin G. Keen, Sarbajit K. Rakshit
  • Patent number: 10026033
    Abstract: A mobile device configured to enable a user to maintain a facility includes: a display device; a network interface configured to communicate across a computer network with an external computer server to retrieve facility data; an antenna for interrogating an RFID tag; a reader configured to read a response signal generated by the RFID tag in response to the interrogating, process the response signal to extract tag information, determine whether the tag information includes information identifying one of a room within the facility or equipment within the facility; and a processor configured to determine whether the tag information includes a room identifier identifying a room within the facility or an equipment identifier identifying equipment within the facility based on the facility data, retrieve display data from the stored facility data based on the identified information, and present the display data on the display device.
    Type: Grant
    Filed: October 16, 2017
    Date of Patent: July 17, 2018
    Inventor: Peter M. Curtis
  • Patent number: 10019964
    Abstract: A head mounted display (HMD) includes a magnetic sensor to produce a sensor signal responsive to detecting a magnet within a first threshold distance. The HMD also includes a circuit operatively coupled to the magnetic sensor. The circuit determines that the HMD is to be placed in a storage mode responsive to receiving the sensor signal from the magnetic sensor. The circuit powers down components of the HMD responsive to determining that the HMD is to be placed in the storage mode.
    Type: Grant
    Filed: January 10, 2017
    Date of Patent: July 10, 2018
    Assignee: Oculus VR, LLC
    Inventor: Nirav Rajendra Patel
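    Example sketch: the storage-mode decision driven by a magnetic sensor reading, with a field-strength threshold standing in for "magnet within a first threshold distance". The threshold value and component names are invented for the example.
      def handle_magnetic_sensor(field_strength_uT, components, threshold_uT=400.0):
          """Power down the listed HMD components when the sensed field indicates
          the storage-case magnet is within range."""
          if field_strength_uT >= threshold_uT:
              for component in components:
                  component["powered"] = False     # enter storage mode
              return "storage"
          return "active"

      display = {"name": "display", "powered": True}
      tracker = {"name": "tracker", "powered": True}
      print(handle_magnetic_sensor(520.0, [display, tracker]), display["powered"])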
  • Patent number: 10013795
    Abstract: An operation support method is disclosed. A three-dimensional panorama image is generated by overlapping multiple images with each other based on posture information of a camera and a feature point map of the multiple images captured by the camera. The three-dimensional panorama image is displayed at a first display device. At a second display device, position information of an indicated target is output, based on current posture information of the camera, in response to an indication of the target on the three-dimensional panorama image.
    Type: Grant
    Filed: September 12, 2016
    Date of Patent: July 3, 2018
    Assignee: FUJITSU LIMITED
    Inventors: Shan Jiang, Keiju Okabayashi
  • Patent number: 10015391
    Abstract: Operating a camera system that includes a forward-facing camera directed at a forward scene and a rearward-facing camera directed at a rearward scene opposite the forward scene includes analyzing a video stream containing a reflection of the forward scene in a cornea of an eye of a user of the camera system, the video stream captured with the rearward-facing camera. The analyzing includes identifying an item in the reflection moving relative to a field of view of the forward-facing camera and predicted to enter the field of view of the forward-facing camera and identifying a characteristic of the item. A camera setting of the forward-facing camera is adjusted according to the characteristic of the identified item, the adjusted camera setting selected to improve capturing a photograph or video containing the item with the forward-facing camera.
    Type: Grant
    Filed: February 13, 2014
    Date of Patent: July 3, 2018
    Assignee: Sony Mobile Communications Inc.
    Inventors: David de Leon, Linus Martensson, Ola Thorn
  • Patent number: 10015403
    Abstract: An image display method displays image data obtained through a photographing operation performed at an image-capturing apparatus having a photographic optical system. The method includes: (i) image-capturing in which image data are generated by photographing a subject within a photographic image plane with the image-capturing apparatus; (ii) information collection in which subject information related to the subject is collected; (iii) correlating in which the subject information is made to correlate to a position within the photographic image plane; (iv) image display in which the image data are displayed at a display screen; (v) position specification in which a position on the display screen at which the image data are displayed is specified; and (vi) information display in which the subject information made to correlate to a position within the photographic image plane corresponding to the position specified in the position specification is displayed.
    Type: Grant
    Filed: July 19, 2016
    Date of Patent: July 3, 2018
    Assignee: NIKON CORPORATION
    Inventors: Yosuke Kusaka, Shoei Nakamura
  • Patent number: 10012838
    Abstract: An optical system for a head-worn computer may include a light source positioned within the head-worn computer and adapted to project non-polarized illuminating light towards a partially reflective partially transmissive surface such that the illuminating light reflects through a field lens and towards a reflective display and a polarizing film adjacent to a surface of the reflective display that polarizes the illuminating light after it passes through the field lens. The illuminating light reflects off a surface of the reflective display, forming image light which is then analyzed by the polarizing film prior to being transmitted through the field lens and then through the partially reflective partially transmissive surface to a non-polarizing lower display optical system adapted to present the image light to an eye of a user wearing the head-worn computer.
    Type: Grant
    Filed: December 16, 2015
    Date of Patent: July 3, 2018
    Assignee: Osterhout Group, Inc.
    Inventor: John N. Border
  • Patent number: 10007118
    Abstract: A compact optical system with improved contrast for a head-worn computer includes a light source including a lens with positive optical power positioned within the head-worn computer and adapted to project converging illuminating light towards a partially reflective partially transmissive surface wherein the illuminating light forms a spot with an area smaller than the light source on the partially reflective partially transmissive surface prior to being reflected as diverging illuminating light that passes through a field lens and towards a reflective display. The illuminating light reflects off a surface of the reflective display, forming diverging image light which is transmitted through the field lens and then through the partially reflective partially transmissive surface to a lower display optical system adapted to present the image light to an eye of a user wearing the head-worn computer.
    Type: Grant
    Filed: December 16, 2015
    Date of Patent: June 26, 2018
    Assignee: Osterhout Group, Inc.
    Inventor: John N. Border
  • Patent number: 10008042
    Abstract: Systems, apparatuses and methods may provide a technology-based way to adapt non-augmented reality (AR) content from a content platform for display in an AR environment. More particularly, systems, apparatuses and methods may provide a way to render an AR environment including some portion of the adapted non-AR content based on one or more physical contexts or device contexts. Systems, apparatuses and methods may provide for modifying one or more readability parameters of the information rendered in the AR environment based on one or more physical contexts or device contexts to improve the readability of non-AR content adapted to AR.
    Type: Grant
    Filed: September 30, 2016
    Date of Patent: June 26, 2018
    Assignee: Intel Corporation
    Inventors: Alexis Menard, Bryan G. Bernhart
  • Patent number: 10008040
    Abstract: A method for virtual shoes fitting includes providing two augmented reality (AR) reference papers, each having a plurality of AR markers and a standard size reference object, one AR reference paper being placed on the ground and the other placed directly behind it at a right angle to the ground; and providing a mobile device having a camera, a processor, and a machine-readable medium with instructions that, when executed, cause the processor to produce AR markers on the mobile device and automatically take photos of a user's foot placed on the AR reference paper on the ground together with the standard size reference object. A system for virtual shoes fitting is also disclosed.
    Type: Grant
    Filed: July 29, 2016
    Date of Patent: June 26, 2018
    Assignees: OnePersonalization Limited, Hong Kong Productivity Council
    Inventors: Pong Yuen Lam, Chin Lok Poon, Sin Yu Fung
  • Patent number: 10007948
    Abstract: A system may provide an augmented environment that facilitates a transaction. The system may store profile data including user payment or user profile information. The system may then receive environmental data including audio and visual information representing a physical environment. The system may then receive first user input data indicative of a selection of one or more items present in the physical environment, and identify one or more action items in the environmental data based on the first user input data. In response to this identification, the system may augment the environmental data by adding virtual environmental data, and then provide this virtual environmental data to a device to create an augmented environment. The system can then receive second user input data, and provide purchase request data to a merchant terminal to enable a transaction related to the one or more action items.
    Type: Grant
    Filed: August 2, 2017
    Date of Patent: June 26, 2018
    Assignee: CAPITAL ONE SERVICES, LLC
    Inventors: Karen Nickerson, Justin Wishne, Drew Jacobs, Justin Smith, Marco S. Giampaolo, Hannes Jouhikainen
  • Patent number: 9998671
    Abstract: Disclosed herein is a method of controlling an electronic apparatus. The method of controlling an electronic apparatus includes: selecting a specific image; generating a first image having a contour line and an adjusted transparency for the selected image; and displaying the generated first image in a state in which the generated first image is overlapped with a photographed image corresponding to live-view-photographing of the electronic apparatus.
    Type: Grant
    Filed: May 27, 2014
    Date of Patent: June 12, 2018
    Assignee: OCEANS CO., LTD.
    Inventor: Sung Jyn Park
  • Patent number: 9996308
    Abstract: Provided are a tethering-type head-mounted display (HMD) and a control method thereof. The HMD is in communication with a mobile terminal and includes a display to selectively display content. A sensor detects a state in which the HMD is worn by a user, and the HMD presents notification information about an event detected in the mobile terminal. When the sensor senses a change in the state in which the HMD is worn by the user while the notification information is being presented by the display, a controller pauses the presentation of the content and, based on pausing the presentation of the content, generates bookmark information identifying the content and a point in time when the sensor senses the change in the state in which the HMD is worn by the user.
    Type: Grant
    Filed: October 4, 2016
    Date of Patent: June 12, 2018
    Assignee: LG ELECTRONICS INC.
    Inventors: Cheongha Park, Goeun Joo
  • Patent number: 9984301
    Abstract: A method for determining a pose of a camera includes obtaining both a first image of a scene and a second image of the scene, where both the first and second images are captured by the camera. A first set of features is extracted from the first image and a second set of features is extracted from the second image. The method includes calculating a value of a visual-motion parameter based on the first set of features and the second set of features without matching features of the first set with features of the second set. The method also includes determining the pose of the camera based, at least, on the value of the visual-motion parameter.
    Type: Grant
    Filed: April 18, 2016
    Date of Patent: May 29, 2018
    Assignee: QUALCOMM Incorporated
    Inventor: Paulo Ricardo Dos Santos Mendonca
  • Patent number: 9983410
    Abstract: The invention relates to a method and a device (1) for authenticating information contained in a document (6). The device includes at least one imager configured for acquiring at least one image of an acquisition field, at least one light source, and computer processing means configured for processing the image and for extracting therefrom data relating to the document. The device is adapted to be mounted on the head of a user (10) so that its acquisition field (11) covers at least one portion of the field of vision (12) of the user (10). The light source is configured to emit, in the acquisition field (11) of the imager, light of at least one non-visible wavelength, and the imager is adapted to allow acquisition at this wavelength of an image of a document (6) presented to the user (10) in the acquisition field of the imager. The invention also relates to the corresponding method.
    Type: Grant
    Filed: July 21, 2015
    Date of Patent: May 29, 2018
    Assignee: MORPHO
    Inventors: Eric Nguyen, Francois Rieul, Lauriane Couturier, Marie Jarlegan, Pierre Chastel, Vincent Bouatou
  • Patent number: 9975559
    Abstract: A computer-implemented method for rendering views to an output device and controlling a vehicle. The method includes determining a maneuver path for the vehicle within a spatial environment around the vehicle based on vehicle data from one or more vehicle systems of the vehicle. The method includes updating a view based on the spatial environment and the maneuver path, by augmenting one or more components of a model to provide a representation of the maneuver path virtually in the view as an available maneuver path or an unavailable maneuver path. The view is rendered to an output device and a vehicle maneuver request is generated based on the maneuver path. Further, the one or more vehicle systems are controlled to execute the vehicle maneuver request.
    Type: Grant
    Filed: March 10, 2017
    Date of Patent: May 22, 2018
    Assignee: Honda Motor Co., Ltd.
    Inventors: Arthur Alaniz, Joseph Whinnery, Robert Wesley Murrish, Michael Eamonn Gleeson-May
  • Patent number: 9971156
    Abstract: Aspects of the present invention relate to methods and systems for see-through computer display systems with a wide field of view.
    Type: Grant
    Filed: September 30, 2016
    Date of Patent: May 15, 2018
    Assignee: Osterhout Group, Inc.
    Inventors: John N. Border, Joseph Bietry, John D. Haddick
  • Patent number: 9972088
    Abstract: An image processing apparatus of the present invention includes an image analysis unit that performs image analysis processing on a plurality of frame images constituting a base dynamic image to obtain an overall analysis value, a statistical analysis unit that performs statistical analysis processing on a diagnostic region using the overall analysis value to obtain a first analysis value, a reference statistical value generating unit that outputs a reference statistical value, and a display unit that displays the first analysis value and the reference statistical value together.
    Type: Grant
    Filed: May 2, 2014
    Date of Patent: May 15, 2018
    Assignee: KONICA MINOLTA, INC.
    Inventors: Koichi Fujiwara, Osamu Toyama, Hiroshi Yamato, Kenta Shimamura, Shintaro Muraoka, Sho Noji
  • Patent number: 9971403
    Abstract: Methods for intentional user experience are provided herein. Exemplary methods include: getting point of interest data, including a location of a point of interest; receiving first user data, including a first location of the user and a first direction of a gaze of the user; calculating a first position of the gaze of the user using the first location of the user and the first direction of the gaze of the user; displaying a first indicator denoting the first position of the gaze of the user; receiving second user data, including a second location of the user and a second direction of a gaze of the user; calculating a second position of the gaze of the user using the second location of the user and the second direction of the gaze of the user; and displaying a second indicator denoting the second position of the gaze of the user.
    Type: Grant
    Filed: November 13, 2017
    Date of Patent: May 15, 2018
    Assignee: Emergent AR Platforms Corp.
    Inventors: Raymond Kallmeyer, Andrew Warren Dawson
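    Example sketch: computing successive gaze positions from a user location and gaze direction, as in the method above, using a deliberately simplified 2-D projection at a fixed distance. All names and the fixed-distance model are assumptions.
      import math

      def gaze_position(user_xy, gaze_heading_deg, distance_m=10.0):
          """Project a gaze point onto a 2-D map a fixed distance along the gaze direction."""
          heading = math.radians(gaze_heading_deg)
          return (user_xy[0] + distance_m * math.sin(heading),
                  user_xy[1] + distance_m * math.cos(heading))

      # First and second user samples; each yields an indicator position to display.
      first = gaze_position((0.0, 0.0), 45.0)
      second = gaze_position((1.0, 0.0), 90.0)
      print(first, second)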