Augmented Reality (real-time) Patents (Class 345/633)
  • Patent number: 10317679
    Abstract: An optical reflective device for homogenizing light including a waveguide having first and second waveguide surfaces and a partially reflective element is disclosed. The partially reflective element may be located between the first waveguide surface and the second waveguide surface. The partially reflective element may have a reflective axis parallel to a waveguide surface normal. The partially reflective element may be configured to reflect light incident on the partially reflective element at a first reflectivity for a first set of incidence angles and reflect light incident on the partially reflective element at a second reflectivity for a second set of incidence angles.
    Type: Grant
    Filed: April 4, 2017
    Date of Patent: June 11, 2019
    Assignee: Akonia Holographics, LLC
    Inventors: Mark R. Ayres, Adam Urness, Kenneth E. Anderson, Friso Schlottau
  • Patent number: 10317670
    Abstract: Examples are disclosed that relate to scanning image display systems. In one example, a scanning head-mounted display system includes a light source, a motion sensor, a scanning mirror system configured to scan light from the light source along at least one dimension to form an image, and a controller configured to control the scanning mirror system to scan the light to form the image, receive head motion data from the motion sensor, and adjust one or more of a scan rate and a phase offset between a first frame and a second frame of the image based upon the head motion data.
    Type: Grant
    Filed: January 12, 2018
    Date of Patent: June 11, 2019
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: John Allen Tardif, Joshua Owen Miller, Jeffrey N. Margolis
  • Patent number: 10319151
    Abstract: A device and method for hierarchical object recognition is provided. The device comprises: an augmented reality display device including a camera, a display device, and an eye-tracking device; an input device; a memory storing a hierarchical object recognition library arranged in a plurality of levels; and a controller. The controller receives, using the input device, an indication of a selected level of the hierarchical object recognition library. The controller determines, using the eye-tracking device, an eye-gaze direction. The controller recognizes at least one object in an image from the camera in the eye-gaze direction by comparing at least a region of the image in the eye-gaze direction with the selected level of the hierarchical object recognition library. The controller controls the display device to indicate a recognized object in the eye-gaze direction.
    Type: Grant
    Filed: July 7, 2017
    Date of Patent: June 11, 2019
    Assignee: MOTOROLA SOLUTIONS, INC.
    Inventors: Bing Qin Lim, Teik Kean Khoo, Mun Yew Tham
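    The level-selection step in the abstract above can be sketched as a simple lookup: the user picks a level of the library, and only labels defined at that level are reported for the region in the eye-gaze direction. This is a hypothetical illustration; the library contents, class names, and matching rule below are assumptions, not the patented implementation.

    ```python
    # Hypothetical sketch of a hierarchical object-recognition lookup.
    # Each level of the library trades breadth for specificity.
    HIERARCHICAL_LIBRARY = {
        0: {"vehicle", "person", "building"},            # coarse classes
        1: {"car", "truck", "pedestrian", "house"},      # mid-level classes
        2: {"sedan", "pickup", "adult", "bungalow"},     # fine-grained classes
    }

    def recognize(region_labels, selected_level):
        """Return labels from the gazed-at region that exist at the selected level.

        region_labels: candidate labels produced by a detector for the image
        region in the eye-gaze direction.
        selected_level: library level chosen via the input device.
        """
        library = HIERARCHICAL_LIBRARY[selected_level]
        return sorted(label for label in region_labels if label in library)
    ```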
  • Patent number: 10307085
    Abstract: A computer-implemented method of detecting physiological attributes of a wearer of a computerized wearable device having one or more sensors comprises (1) using the information from the one or more sensors to assess the physiology of the wearer; and (2) notifying the wearer of the wearer's physiology. In various embodiments, the method involves using the wearable device to determine the wearer's current posture, balance, alertness, and/or physical state and comparing the current posture, balance, alertness and/or physical state to at least one baseline measurement. For example, the system may measure a baseline posture to determine when the wearer's current posture deviates from the baseline posture, and notify the wearer so that the wearer may improve his or her posture. In other embodiments, the computerized wearable device may detect one or more of the wearer's physiological characteristics (e.g., oxygen levels, pulse rate, pupil size, etc.) and determine the wearer's alertness level.
    Type: Grant
    Filed: January 30, 2015
    Date of Patent: June 4, 2019
    Assignee: Vision Service Plan
    Inventors: Jay William Sales, Richard Chester Klosinski, Jr., Matthew Allen Workman, Meghan Kathleen Murphy, Matthew David Steen
  • Patent number: 10311646
    Abstract: A device may detect, in a field of view of a camera, one or more components of an automated teller machine (ATM) device using a computer vision technique based on generating a three dimensional model of the one or more components. The device may identify the ATM device as a particular device or as a particular type of device based on the one or more components of the ATM device, or first information related to the ATM device. The device may identify a set of tasks to be performed with respect to the ATM device. The device may provide, for display via a display associated with the device, second information associated with the set of tasks as an augmented reality overlay. The device may perform an action related to the set of tasks, the ATM device, or the augmented reality overlay.
    Type: Grant
    Filed: February 26, 2018
    Date of Patent: June 4, 2019
    Assignee: Capital One Services, LLC
    Inventors: David Kelly Wurmfeld, Kevin Osborn
  • Patent number: 10311642
    Abstract: An augmented reality service method for a coloring play according to the present invention comprises the steps of: 1) executing a service application, and selecting and downloading an image to be colored from a server by a user terminal; 2) coloring the downloaded image in a specific pattern, using a tool provided by the service application; 3) printing the colored image; 4) executing the service application, scanning a printed material of the colored image so as to recognize a marker, and extracting color information according to coordinates of the image; 5) coloring an augmented reality content provided by the server according to the extracted color information, so as to generate a colored augmented reality content on the basis of the image colored by a user; and 6) displaying the colored augmented reality content on a display unit of the user terminal.
    Type: Grant
    Filed: May 30, 2016
    Date of Patent: June 4, 2019
    Assignee: DS GLOBAL
    Inventor: Sam Hee Lee
  • Patent number: 10313634
    Abstract: An electronic device displays an image during a communication between two people. The image represents one of the people to the communication. The electronic device determines a location where to place the image and displays the image such that the image appears to exist at the location.
    Type: Grant
    Filed: January 12, 2019
    Date of Patent: June 4, 2019
    Inventor: Philip Scott Lyren
  • Patent number: 10303407
    Abstract: An image forming apparatus and a method of controlling the same, wherein stored user information is referenced to perform authentication of a user based on accepted user information, and the user is allowed to confirm whether or not to reset the user information of the user when authentication of the user fails. The user is caused to select a reset method for resetting the user information of the user in accordance with the confirmation, and reset the stored user information of the user in accordance with the selected reset method.
    Type: Grant
    Filed: June 27, 2018
    Date of Patent: May 28, 2019
    Assignee: Canon Kabushiki Kaisha
    Inventor: Tetsuya Yamada
  • Patent number: 10304234
    Abstract: A system for rendering a virtual environment includes a hardware processor, a memory, and a simulation generator software code stored in the memory. The hardware processor is configured to execute the simulation generator software code to receive a three-dimensional-representation (3D-representation) of a physical object, and to receive a two-dimensional-image (2D-image) of the physical object. The hardware processor is also configured to execute the simulation generator software code to compose a simulation of the virtual environment using the 3D-representation and the 2D-image, in which the 3D-representation of the physical object is situated between the 2D-image of the physical object and a predetermined viewpoint of the virtual environment. The hardware processor further executes the simulation generator software code to render the virtual environment such that the 2D-image of the physical object, but not the 3D-representation of the physical object, is visibly rendered.
    Type: Grant
    Filed: December 1, 2016
    Date of Patent: May 28, 2019
    Assignee: Disney Enterprises, Inc.
    Inventors: Taylor S. Hellam, Kimberly K. Porter, Mohammad S. Poswal, Malcolm E. Murdock, Aradhana Modi
  • Patent number: 10304250
    Abstract: A danger avoidance support program is described which makes it possible to always grasp the surrounding situation and avoid danger such as collision even when using a smartphone while walking. This danger avoidance support program detects a direction to each of a plurality of measurement objects located ahead of the portable information terminal and measures the distance to each measurement object to display the plurality of measurement objects in real time on the screen of the portable information terminal by mapping the plurality of measurement objects in the space of the screen of the portable information terminal on the basis of the detected directions and the measured distances. The plurality of measurement objects are displayed on the screen of the portable information terminal as translucent images which are superimposed on an image displayed by another application running on the portable information terminal.
    Type: Grant
    Filed: March 28, 2017
    Date of Patent: May 28, 2019
    Assignee: INTERMAN Corporation
    Inventor: Shigeki Uetabira
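    The mapping step described above, placing each detected object on screen from its measured direction and distance, amounts to a polar-to-screen coordinate conversion. A minimal sketch follows; the screen size, display scale, and mapping convention (ahead of the user maps toward the top of the screen) are assumptions for illustration, not details from the patent.

    ```python
    import math

    SCREEN_W, SCREEN_H = 1080, 1920   # portrait phone screen, pixels (assumed)
    PIXELS_PER_METER = 100            # assumed display scale

    def map_to_screen(bearing_deg, distance_m):
        """Map an obstacle ahead of the user onto the screen.

        bearing_deg: direction to the obstacle, 0 = straight ahead,
        positive = to the right.
        distance_m: measured distance to the obstacle.
        Returns (x, y) pixel coordinates; farther objects appear higher up.
        """
        x = SCREEN_W / 2 + math.sin(math.radians(bearing_deg)) * distance_m * PIXELS_PER_METER
        y = SCREEN_H - math.cos(math.radians(bearing_deg)) * distance_m * PIXELS_PER_METER
        return round(x), round(y)
    ```

    An overlay layer would then draw each object translucently at its computed position on top of whatever application is active.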
  • Patent number: 10304253
    Abstract: A method including acquiring a captured image of an object with a camera, detecting a first pose of the object on the basis of 2D template data and either the captured image at initial time or the captured image at time later than the initial time, detecting a second pose of the object corresponding to the captured image at current time on the basis of the first pose and the captured image at the current time, displaying an AR image in a virtual pose based on the second pose in the case where accuracy of the second pose at the current time falls in a range between a first criterion and a second criterion; and detecting a third pose of the object on the basis of the captured image at the current time and the 2D template data in the case where the accuracy falls in the range.
    Type: Grant
    Filed: September 20, 2017
    Date of Patent: May 28, 2019
    Assignee: SEIKO EPSON CORPORATION
    Inventors: Irina Kezele, Alex Levinshtein
  • Patent number: 10295403
    Abstract: A method, electronic device, and program product is disclosed. The method may include determining an environmental parameter of an environment at a location of an electronic device. The method may include identifying a first object of a virtual scenario that relates to the environmental parameter. The method may include generating a display parameter of the first object based on the environmental parameter. The method may include displaying the first object in the virtual scenario based on the display parameter.
    Type: Grant
    Filed: March 24, 2017
    Date of Patent: May 21, 2019
    Assignee: Lenovo (Beijing) Limited
    Inventor: Ben Xu
  • Patent number: 10297088
    Abstract: The present disclosure includes systems, methods, computer readable media, and devices that can generate accurate augmented reality objects based on tracking a writing device in relation to a real-world surface. In particular, the systems and methods described herein can detect an initial location of a writing device, and further track movement of the writing device on a real-world surface based on one or more sensory inputs. For example, disclosed systems and methods can generate an augmented reality object based on pressure detected at a tip of a writing device, based on orientation of the writing device, based on motion detector elements of the writing device (e.g., reflective materials, emitters, or object tracking shapes), and/or optical sensors. The systems and methods further render augmented reality objects within an augmented reality environment that appear on the real-world surface based on tracking the movement of the writing device.
    Type: Grant
    Filed: September 26, 2017
    Date of Patent: May 21, 2019
    Assignee: ADOBE INC.
    Inventors: Tenell Rhodes, Gavin S. P. Miller, Duygu Ceylan Aksit, Daichi Ito
  • Patent number: 10290152
    Abstract: Methods, computing devices and head-mounted display devices for displaying user interface elements with virtual objects are disclosed. In one example, a virtual object and one or more user interface elements are displayed within a physical environment. User input is received that moves one or more of the virtual object and the one or more user interface elements. One or more of the virtual object and the one or more user interface elements are determined to be within a predetermined distance of a physical surface. Based at least on this determination, the one or more user interface elements are displayed on the surface.
    Type: Grant
    Filed: April 3, 2017
    Date of Patent: May 14, 2019
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Julia Schwarz, Bo Robert Xiao, Hrvoje Benko, Andrew Wilson
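    The proximity test in the abstract above, displaying UI elements on a physical surface once they come within a predetermined distance of it, can be sketched as a point-to-plane distance check followed by a projection. The plane representation (point plus unit normal) and the threshold value are assumptions for this example.

    ```python
    SNAP_DISTANCE = 0.05  # meters; hypothetical threshold

    def place_element(element_pos, plane_point, plane_normal):
        """Return the element position, projected onto the surface when close.

        element_pos, plane_point: 3D points as (x, y, z) tuples.
        plane_normal: unit normal of the physical surface.
        If the element is within SNAP_DISTANCE of the surface, project it
        onto the plane; otherwise leave it floating where it is.
        """
        # Signed distance from the element to the plane.
        d = sum((p - q) * n for p, q, n in zip(element_pos, plane_point, plane_normal))
        if abs(d) <= SNAP_DISTANCE:
            # Subtract the normal component to land exactly on the surface.
            return tuple(p - d * n for p, n in zip(element_pos, plane_normal))
        return element_pos
    ```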
  • Patent number: 10282772
    Abstract: A system includes one or more memory devices storing instructions, and one or more processors configured to execute the instructions to perform steps of a method. The system may provide a virtual wardrobe management system. The system may store wardrobe data associated with a plurality of garments. The system may then receive input data indicative of a garment selection. The system may identify the garment based on the input data received. The system may then generate a garment recommendation based on the selected garment and the stored wardrobe data and provide an indication of the recommendation to a computing device.
    Type: Grant
    Filed: August 1, 2017
    Date of Patent: May 7, 2019
    Assignee: CAPITAL ONE SERVICES, LLC
    Inventors: Marco S. Giampaolo, Karen Nickerson, Justin Wishne, Drew Jacobs, Justin Smith, Hannes Jouhikainen
  • Patent number: 10281981
    Abstract: An augmented reality (AR) output method includes generating line of sight information of a user of the electronic device based on at least one of a sensor module or a camera module of the electronic device, extracting line of sight space information corresponding to the line of sight information of the user from three-dimensional (3D) space information corresponding to a real object around the user based on the line of sight information, recognizing a user input using the camera module or the sensor module, comparing a first portion of the line of sight space information with the user input, generating a compensated input information by compensating the user input based on comparing the first portion with the user input, generating a first virtual object based on the compensated input information, and outputting the first virtual object on a display of the electronic device.
    Type: Grant
    Filed: January 25, 2017
    Date of Patent: May 7, 2019
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Kyung Tae Kim, Su Young Park, Sang Chul Baek, Jae Yung Yeo, Young Keun Choi
  • Patent number: 10271726
    Abstract: Provided is an imaging apparatus capable of imaging a confocal image and a nonconfocal image, in which an image intended by an examiner is provided easily and rapidly. The imaging apparatus includes: an acquiring unit configured to acquire a confocal image and a nonconfocal image of an eye to be inspected; a display unit configured to display at least one of the acquired confocal image and the acquired nonconfocal image; an analysis unit configured to analyze the acquired confocal image and the acquired nonconfocal image; and a display control unit configured to change a display form displayed on the display unit in accordance with an analysis result obtained by the analysis unit.
    Type: Grant
    Filed: August 16, 2017
    Date of Patent: April 30, 2019
    Assignee: CANON KABUSHIKI KAISHA
    Inventors: Tsutomu Utagawa, Hiroshi Imamura
  • Patent number: 10275023
    Abstract: In a virtual reality system, an optical tracking device may detect and track a user's eye gaze direction and/or movement, and/or sensors may detect and track a user's head gaze direction and/or movement, relative to virtual user interfaces displayed in a virtual environment. A processor may process the detected gaze direction and/or movement as a user input, and may translate the user input into a corresponding interaction in the virtual environment. Gaze directed swipes on a virtual keyboard displayed in the virtual environment may be detected and tracked, and translated into a corresponding text input, either alone or together with user input(s) received by the controller. The user may also interact with other types of virtual interfaces in the virtual environment using gaze direction and movement to provide an input, either alone or together with a controller input.
    Type: Grant
    Filed: December 21, 2016
    Date of Patent: April 30, 2019
    Assignee: GOOGLE LLC
    Inventors: Chris McKenzie, Chun Yat Frank Li, Hayes S. Raffle
  • Patent number: 10277891
    Abstract: An image processing apparatus includes a first acquisition unit configured to acquire orientation information of a viewer, and an extraction unit configured to extract an overlapping region being a region in which relative positions from points intersecting with optical axes of lenses disposed in front of respective eyes of the viewer are equal, for a right eye display image and a left eye display image to be presented to the viewer. In addition, the image processing apparatus includes a generation unit configured to generate the display image from a predetermined reference image, based on the orientation information acquired by the first acquisition unit, and the overlapping region extracted by the extraction unit. In the overlapping region, the generation unit shares information about generation processing of the display image, between the right eye display image and the left eye display image.
    Type: Grant
    Filed: February 22, 2017
    Date of Patent: April 30, 2019
    Assignee: CANON KABUSHIKI KAISHA
    Inventor: Yuji Kato
  • Patent number: 10269159
    Abstract: A head wearable device, a method, and a system. The head wearable device may include a display, a camera, a convolutional neural network (CNN) processor, and a processor. The CNN processor may be configured to: receive real scene image data from the camera; identify and classify objects in a real scene image; and generate object classification and position data. The processor may be configured to receive the real scene image data; receive the object classification and position data from the CNN processor; perform an image segmentation operation on the real scene image to fill in the objects; generate filled-in object data indicative of filled-in objects; generate a pixel mask; receive virtual scene image data; create mixed reality scene image data; and output the mixed reality scene image data to the display for presentation to a user.
    Type: Grant
    Filed: July 27, 2017
    Date of Patent: April 23, 2019
    Inventors: Peter R. Bellows, Danilo P. Groppa
  • Patent number: 10255725
    Abstract: An augmented reality (AR) system for providing an AR experience to a user of an AR venue includes a hardware processor, a memory, a light detector, a display, and an AR application stored in the memory. The hardware processor can execute the AR application to provide a virtual environment corresponding to the AR venue on the display, and to detect a light event resulting from interaction of an infrared (IR) light produced by an AR accessory utilized by the user with a surface within the AR venue. The hardware processor can further execute the AR application to identify a location of the light event within the AR venue, determine whether the location of the light event is within an activation zone of the AR venue, and render a visual effect corresponding to the light event on the display if the location is within the activation zone.
    Type: Grant
    Filed: November 16, 2016
    Date of Patent: April 9, 2019
    Assignee: Disney Enterprises, Inc.
    Inventors: Nathan Nocon, Katherine M. Bassett
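    The "determine whether the location of the light event is within an activation zone" step above is essentially a point-in-region test. A minimal sketch, assuming rectangular zones in venue coordinates (the patent does not specify zone geometry):

    ```python
    ACTIVATION_ZONES = [
        # (x_min, y_min, x_max, y_max) in venue coordinates, meters (assumed)
        (0.0, 0.0, 2.0, 2.0),
        (5.0, 1.0, 7.0, 3.0),
    ]

    def should_render_effect(event_xy):
        """Return True if the detected IR light event lies within any activation zone."""
        x, y = event_xy
        return any(x0 <= x <= x1 and y0 <= y <= y1
                   for x0, y0, x1, y1 in ACTIVATION_ZONES)
    ```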
  • Patent number: 10255730
    Abstract: A system of properly displaying chroma key content is presented. The system obtains a digital representation of a 3D environment, for example a digital photo, and gathers data from that digital representation. The system renders the digital representation in an environmental model and displays that digital representation upon an output device. Depending upon the context, content anchors of the environmental model are selected which will be altered by suitable chroma key content. The chroma key content takes into consideration the position and orientation of the chroma key content relative to the content anchor and relative to the point of view that the environmental model is displayed from in order to accurately display chroma key content in a realistic manner.
    Type: Grant
    Filed: June 6, 2018
    Date of Patent: April 9, 2019
    Assignee: NANTMOBILE, LLC
    Inventors: Evgeny Dzhurinskiy, Ludmila Bezyaeva
  • Patent number: 10249096
    Abstract: Methods, computer program products, and systems are presented. The method computer program products, and systems can include, for instance: obtaining virtual image data representing a virtual object; and encoding the virtual image data with physical image data to provide a formatted image file, wherein the encoding includes for a plurality of spatial image elements providing one or more data field that specifies physical image information and one or more data field that specifies virtual image information based on the virtual image data so the formatted image file for each of the plurality of spatial image elements provides physical image information and virtual image information, and wherein the encoding includes providing indexing data that associates an identifier for the virtual object to spatial image elements for the virtual object.
    Type: Grant
    Filed: May 17, 2017
    Date of Patent: April 2, 2019
    Assignee: INTERNATIONAL BUSINESS MACHINES CORPORATION
    Inventors: David C. Bastian, Aaron K. Baughman, Nicholas A. McCrory, Todd R. Whitman
  • Patent number: 10249094
    Abstract: A method of synthetic representation of elements of interest in a viewing system for aircraft, the viewing system comprising location sensors, a cartographic database and a database of elements of interest, an image sensor, a unit for processing images, a unit for generating three-dimensional digital images representative of the terrain overflown, and a viewing device, wherein, when the terrain overflown comprises an element of interest, the method of synthetic representation comprises: a first step of searching for and detecting the element of interest in each image of a sequence of images, and a second step of generating three-dimensional digital images representative of the terrain overflown, the element of interest being represented according to a first representation if it has not been detected in any of the images of the sequence of images and according to a second representation if it is detected.
    Type: Grant
    Filed: March 26, 2017
    Date of Patent: April 2, 2019
    Assignee: THALES
    Inventors: Thierry Ganille, Bruno Aymeric, Johanna Lux
  • Patent number: 10250870
    Abstract: An adjustable virtual reality device includes a first display module, a first image capturing unit, a second display module, a second image capturing unit, a lateral driving module, and a control unit. The first display module and the second display module display a first image and a second image, respectively. The first image capturing unit and the second image capturing unit detect two positions of two pupils of two eyeballs, respectively. The control unit calculates a pupil distance between the two pupils according to the two positions of the two pupils and controls the lateral driving module to drive the first display module and the second display module to move in a lateral direction, so that a lateral distance between the first display module and the second display module is corresponding to the pupil distance, which enhances comfort in use.
    Type: Grant
    Filed: February 21, 2017
    Date of Patent: April 2, 2019
    Assignee: Wistron Corporation
    Inventors: Chen-Yi Liang, Yao-Tsung Chang
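    The control unit's arithmetic above, computing the pupil distance from two detected pupil positions and driving the display modules so their lateral gap matches it, can be illustrated in a few lines. The millimeter units and symmetric outward/inward split are our assumptions, not the patent's.

    ```python
    import math

    def lateral_adjustment(left_pupil, right_pupil, current_display_gap):
        """Return the per-module lateral shift needed to match the pupil distance.

        left_pupil, right_pupil: (x, y) pupil centers in millimeters.
        current_display_gap: present center-to-center distance between the
        two display modules, in millimeters.
        A positive result moves each module outward by that amount;
        a negative result moves each module inward.
        """
        pupil_distance = math.dist(left_pupil, right_pupil)
        return (pupil_distance - current_display_gap) / 2
    ```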
  • Patent number: 10250799
    Abstract: Disclosed are techniques that provide a “best” picture taken within a few seconds of the moment when a capture command is received (e.g., when the “shutter” button is pressed). In some situations, several still images are automatically (that is, without the user's input) captured. These images are compared to find a “best” image that is presented to the photographer for consideration. Video is also captured automatically and analyzed to see if there is an action scene or other motion content around the time of the capture command. If the analysis reveals anything interesting, then the video clip is presented to the photographer. The video clip may be cropped to match the still-capture scene and to remove transitory parts. Higher-precision horizon detection may be provided based on motion analysis and on pixel-data analysis.
    Type: Grant
    Filed: August 4, 2014
    Date of Patent: April 2, 2019
    Assignee: Google Technology Holdings LLC
    Inventors: Doina I. Petrescu, Thomas T. Lay, Steven R. Petrie, Bill Ryan, Snigdha Sinha, Jeffrey S. Vanhoof
  • Patent number: 10250866
    Abstract: A set of light field sensors may generate light field output signals conveying light field information within fields of view of the set of light field sensors. The generation of the light field output signals may be characterized by a subpixel accuracy. The subpixel accuracy may be enabled by a physical link between the set of light field sensors. The fields of view of the set of light field sensors may overlap over an overlap volume. An object may be located within the overlap volume. The light field information characterizing light field emanating from the object may be combined.
    Type: Grant
    Filed: December 21, 2016
    Date of Patent: April 2, 2019
    Assignee: GoPro, Inc.
    Inventor: Alexandre Jenny
  • Patent number: 10249060
    Abstract: A tool erosion detecting system for a machine having a ground engaging tool is disclosed. The tool erosion detecting system may include a camera configured to generate a first image of the ground engaging tool on a display device, an input device configured to receive a user input, and a controller in communication with the camera and the input device. The controller may be configured to generate an augmented reality view of the ground engaging tool. The augmented reality view may include the first image of the ground engaging tool generated by the camera and a second image of a ground engaging tool superimposed on the first image and being associated with a selected wear level, wherein the selected wear level is based on the user input.
    Type: Grant
    Filed: December 14, 2016
    Date of Patent: April 2, 2019
    Assignee: Caterpillar Inc.
    Inventors: James Edward Wagner, Patrick Simon Campomanes, Brian Charles Brown, Drew Steven Solorio, John M. Plouzek, David Jason Mcintyre, Paul Davis Jackson, Jr., Phillip John Kunz, Thomas Marshall Congdon, Shadi Naji Kfouf
  • Patent number: 10242456
    Abstract: A system and method for markers with digitally encoded geographic coordinate information for use in an augmented reality (AR) system. The method provides accurate location information for registration of digital data and real world images within an AR system. The method includes automatically matching digital data within an AR system by utilizing a digitally encoded marker (DEM) containing world coordinate information system and mathematical offset of digital data and a viewing device. The method further includes encoding geographic coordinate information into markers (e.g., DEMs) and decoding the coordinate information into an AR system. Through use of the method and corresponding system, marker technology and the basis of geo-location technology can be combined into a geo-located marker, thereby solving the problem of providing accurate registration within an augmented reality.
    Type: Grant
    Filed: June 4, 2012
    Date of Patent: March 26, 2019
    Assignee: LIMITLESS COMPUTING, INC.
    Inventors: Errin T. Weller, Jeffrey B. Franklin
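    Encoding geographic coordinates into a marker payload and decoding them back, the core operation the abstract above describes, can be sketched with a fixed-point packing scheme. The microdegree resolution, offset-to-unsigned trick, and hex field layout here are invented for illustration; the patent does not specify this encoding.

    ```python
    SCALE = 10**6  # microdegree resolution (assumed)

    def encode_marker(lat_deg, lon_deg):
        """Encode latitude/longitude as a compact 16-character hex payload."""
        lat_u = int(round((lat_deg + 90.0) * SCALE))    # offset so value is unsigned
        lon_u = int(round((lon_deg + 180.0) * SCALE))
        return f"{lat_u:08x}{lon_u:08x}"

    def decode_marker(payload):
        """Recover (lat, lon) in degrees from the hex payload."""
        lat_u = int(payload[:8], 16)
        lon_u = int(payload[8:], 16)
        return lat_u / SCALE - 90.0, lon_u / SCALE - 180.0
    ```

    An AR system reading such a marker would combine the decoded world coordinates with the viewing device's own pose to register digital content against the real scene.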
  • Patent number: 10242503
    Abstract: Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing at least one program, and a method for rendering three-dimensional virtual objects within real world environments. Virtual rendering of a three-dimensional virtual object can be altered appropriately as a user moves around the object in the real world, and the three-dimensional virtual object can exist similarly for multiple users. Virtual object rendering can be with respect to a reference surface in a real world environment, which reference surface can be selected by a user as part of the virtual object rendering process.
    Type: Grant
    Filed: January 5, 2018
    Date of Patent: March 26, 2019
    Assignee: Snap Inc.
    Inventors: Andrew James McPhee, Ebony James Charlton, Samuel Edward Hare, Michael John Evans, Jokubas Dargis, Ricardo Sanchez-Saez
  • Patent number: 10244204
    Abstract: A computer system may be used to project a communication to a user. The system may analyze camera data to detect the facial direction and location of a user. The system may also receive a communication for the user. The system may receive the communication from a user device associated with the user. Based on the detected facial direction and location of the user, the system may determine a target location for the projection. The system may identify a set of visual projectors based on the target location and transmit the communication data and the target location to the projectors.
    Type: Grant
    Filed: March 22, 2017
    Date of Patent: March 26, 2019
    Assignee: International Business Machines Corporation
    Inventors: James E. Bostick, John M. Ganci, Jr., Martin G. Keen, Sarbajit K. Rakshit
  • Patent number: 10237517
    Abstract: An electronic device displays an image during a communication between two people. The image represents one of the people to the communication. The electronic device determines a location where to place the image and displays the image such that the image appears to exist at the location.
    Type: Grant
    Filed: October 16, 2018
    Date of Patent: March 19, 2019
    Inventor: Philip Scott Lyren
  • Patent number: 10235776
    Abstract: According to one embodiment, an information processing device includes processing circuitry. The circuitry acquires a first picture including a first object and a second object; and acquires a second picture including the first object and an object different from the first object. The circuitry generates first data to display first associated information; and generates second data to display the first associated information superimposed onto the second picture in response to a change between a first positional relationship in the first picture and a second positional relationship in the second picture, the first positional relationship being a relative positional relationship between the first and second objects in the first picture. The second positional relationship is a relative positional relationship between the first object and the object different from the first object in the second picture.
    Type: Grant
    Filed: August 31, 2016
    Date of Patent: March 19, 2019
    Assignee: KABUSHIKI KAISHA TOSHIBA
    Inventors: Kazunori Imoto, Shihomi Takahashi, Osamu Yamaguchi
  • Patent number: 10235812
    Abstract: In accordance with an example aspect, there is provided an apparatus comprising at least one processing core and at least one memory including computer program code, the at least one memory and the computer program code being configured to, with the at least one processing core, cause the apparatus at least to obtain a movement context of a user, rank, based at least in part on the movement context, at least two objects based on their visibility to the user, and determine, based at least in part on the ranking, at least one of the at least two objects as a placement object for an augmented reality information element.
    Type: Grant
    Filed: August 18, 2017
    Date of Patent: March 19, 2019
    Assignee: Nokia Technologies Oy
    Inventors: Jussi Artturi Leppänen, Antti Johannes Eronen, Arto Juhani Lehtiniemi
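The ranking step described above can be sketched as a context-weighted scoring function: each candidate object gets a visibility score, the movement context changes the weighting, and the top-ranked object becomes the placement anchor. The score terms and weights below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical visibility ranking for AR element placement. When the user
# is walking, objects near the edge of view are penalised more heavily,
# because they leave the field of view sooner.

def visibility_score(obj: dict, context: str) -> float:
    edge_penalty = 2.0 if context == "walking" else 1.0
    return obj["apparent_size"] - edge_penalty * obj["edge_distance_deficit"]

def placement_object(objects: list, context: str) -> dict:
    # Rank all candidates by score and return the most visible one.
    ranked = sorted(objects, key=lambda o: visibility_score(o, context), reverse=True)
    return ranked[0]

objects = [
    {"name": "wall", "apparent_size": 5.0, "edge_distance_deficit": 1.0},
    {"name": "sign", "apparent_size": 4.0, "edge_distance_deficit": 0.1},
]
walking_choice = placement_object(objects, "walking")["name"]       # "sign"
stationary_choice = placement_object(objects, "stationary")["name"]  # "wall"
```

Note how the same candidate set yields different placement objects under different movement contexts, which is the core of the claimed behaviour.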
  • Patent number: 10235806
    Abstract: Methods and systems for selectively merging real-world objects into a virtual environment are disclosed. The method may include: receiving a first input for rendering of a virtual environment, a second input for rendering of a real-world environment, and a depth information regarding the rendering of the real-world environment; identifying at least one portion of the rendering of the real-world environment that is within a depth range and differentiable from a predetermined background; generating a merged rendering including the at least one portion of the rendering of the real-world environment into the rendering of the virtual environment; and displaying the merged rendering to a user.
    Type: Grant
    Filed: November 21, 2014
    Date of Patent: March 19, 2019
    Assignee: Rockwell Collins, Inc.
    Inventors: Danilo P. Groppa, Loyal J. Pyczynski
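The selective merge described above reduces to a per-pixel mask: a real-world pixel passes through into the virtual frame only when its depth falls inside the configured range and its colour differs enough from the known background. A minimal NumPy sketch, with illustrative thresholds and a uniform green-screen-style background assumed for simplicity:

```python
import numpy as np

def merge(virtual, real, depth, near, far, bg_color, tol=30):
    # A pixel is kept from the real-world rendering when it is inside the
    # depth range AND differs enough from the predetermined background colour.
    in_range = (depth >= near) & (depth <= far)
    not_bg = np.abs(real.astype(int) - bg_color).sum(axis=-1) > tol
    mask = in_range & not_bg
    out = virtual.copy()
    out[mask] = real[mask]
    return out

virtual = np.zeros((2, 2, 3), dtype=np.uint8)   # black virtual scene
real = np.full((2, 2, 3), 200, dtype=np.uint8)  # grey real-world pixels
real[0, 0] = (0, 255, 0)                        # background-coloured pixel
depth = np.array([[0.5, 0.5], [0.5, 5.0]])      # metres
merged = merge(virtual, real, depth, near=0.2, far=1.0,
               bg_color=np.array([0, 255, 0]))
# merged keeps the grey in-range pixels and leaves the background-coloured
# and out-of-range pixels showing the virtual scene.
```

A production implementation would typically replace the colour test with chroma keying or stereo-derived segmentation, but the two-condition mask is the essential structure.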
  • Patent number: 10235123
    Abstract: A method and apparatus for registering, by a first controller associated with a vehicle, at least one information device associated with an occupant when the occupant enters the vehicle. The first controller obtains vehicle information and obtains information associated with each registered occupant from the at least one information device associated with the registered occupant. The first controller transmits the vehicle information and information associated with each registered occupant of the vehicle to an augmented reality viewer, wherein transmitted information is overlaid on an image of the vehicle rendered on the augmented reality viewer.
    Type: Grant
    Filed: April 7, 2017
    Date of Patent: March 19, 2019
    Assignee: MOTOROLA SOLUTIONS, INC.
    Inventors: Stuart S. Kreitzer, Jesus F. Corretjer
  • Patent number: 10234847
    Abstract: A method for supporting 3D printing includes identifying a product, searching and providing a 3D model present on a web in relation to the identified product, calculating and simulating suitability between the identified product and the searched 3D model based on information of the identified product and the searched 3D model, and transmitting the 3D model information to a 3D printer to produce the 3D model. Accordingly, the 3D printing technique may be actively utilized for DIY in various fields, business models or the like.
    Type: Grant
    Filed: April 28, 2015
    Date of Patent: March 19, 2019
    Assignee: Korea Institute of Science and Technology
    Inventors: Byounghyun Yoo, Sangchul Ahn, Heedong Ko
  • Patent number: 10229312
    Abstract: Systems, methods, and non-transitory computer-readable media can identify one or more objects depicted in a camera view of a camera application displayed on a display of a user device. An augmented reality overlay is determined based on the one or more objects identified in the camera view. The camera view is modified based on the augmented reality overlay.
    Type: Grant
    Filed: December 20, 2017
    Date of Patent: March 12, 2019
    Assignee: Facebook, Inc.
    Inventors: John Samuel Barnett, Dantley Davis, Congxi Lu, Jonathan Morton, Peter Vajda, Joshua Charles Harris
  • Patent number: 10228566
    Abstract: An apparatus and a method are described herein related to the art of augmented reality type monitor virtualization. A monitor-virtualization system, such as a head-mountable device, an ophthalmic device or an intraocular implant, can render a virtual monitor in augmented reality. A liquid lens or an optical phased array can position the virtual monitor in space by optical means. A dimmable occlusion matrix can be additionally operated such as to make the image of the virtual monitor substantially opaque. A coordinator module can synchronize the activities of monitor positioning and occlusion masking. The virtual monitor can be anchored to real-world artifacts using bokode technology. Various dimming modes of the occlusion matrix reduce operator fatigue. The apparatus may operate in smart sunglass mode when the virtual monitor function is paused. The virtual monitor can be hidden or visualized differently when thresholds in terms of user geolocation or viewing angle are breached.
    Type: Grant
    Filed: August 7, 2017
    Date of Patent: March 12, 2019
    Inventor: Maximilian Ralph Peter von und zu Liechtenstein
  • Patent number: 10224006
    Abstract: A display device for a vehicle includes: a first display unit that displays a real image of first information in a first display portion; and a second display unit that displays a virtual image of second information in a second display portion by projecting an optical image to the second display portion, the second display portion being disposed above the first display portion and transmitting an external image. The display device includes: a light emission unit located adjacent to a side of the first display portion to form a light emission area; and a control unit that changes the light emission area in a guiding direction from the first display portion toward the second display portion during a linking period for linking a particular virtual image display of the second information with a real image display of the first information.
    Type: Grant
    Filed: April 15, 2016
    Date of Patent: March 5, 2019
    Assignee: DENSO CORPORATION
    Inventors: Katsumi Fujita, Satoru Tamura, Masashi Toyota, Seigo Tane
  • Patent number: 10217290
    Abstract: A user interface enables a user to calibrate the position of a three dimensional model with a real-world environment represented by that model. Using a device's sensor, the device's location and orientation is determined. A video image of the device's environment is displayed on the device's display. The device overlays a representation of an object from a virtual reality model on the video image. The position of the overlaid representation is determined based on the device's location and orientation. In response to user input, the device adjusts a position of the overlaid representation relative to the video image.
    Type: Grant
    Filed: March 26, 2018
    Date of Patent: February 26, 2019
    Assignee: Apple Inc.
    Inventors: Christopher G. Nicholas, Lukas M. Marti, Rudolph van der Merwe, John Kassebaum
  • Patent number: 10217292
    Abstract: According to various embodiments, devices, methods, and computer-readable media for reconstructing a 3D scene are described. A server device, sensor devices, and client devices may interoperate to reconstruct a 3D scene sensed by the sensor devices. The server device may generate one or more models for objects in the scene, including the identification of dynamic and/or static objects. The sensor devices may provide model data updates based on these generated models, such that only delta changes in the scene may be provided, in addition to raw sensor data. Models may utilize semantic knowledge, such as knowledge of the venue or identity of one or more persons in the scene, to further facilitate model generation and updating. Other embodiments may be described and/or claimed.
    Type: Grant
    Filed: April 1, 2016
    Date of Patent: February 26, 2019
    Assignee: Intel Corporation
    Inventors: Ignacio J. Alvarez, Ranganath Krishnan
  • Patent number: 10210659
    Abstract: Method, apparatus, and system for providing an item image to a client for display in a contextual environment are described. In some embodiments, the user may select an item for display in the contextual environment, and the user may position a camera coupled to a processing system to capture the contextual environment. A placeholder may be generated and associated with an item selected by a user. In an embodiment, the generated placeholder may be placed in a location within the contextual environment, and the user's processing system may send a visual data stream of the camera-captured environment to a server. In an embodiment, the user's processing device may receive a modified data stream including an image of the item, and the user's processing device may display the item image in the same location as the placeholder.
    Type: Grant
    Filed: September 28, 2015
    Date of Patent: February 19, 2019
    Assignee: eBay Inc.
    Inventors: John Tapley, Eric J. Farraro
  • Patent number: 10210663
    Abstract: A contextual local image recognition module of a device retrieves a primary content dataset from a server and then generates and updates a contextual content dataset based on an image captured with the device. The device stores the primary content dataset and the contextual content dataset. The primary content dataset comprises a first set of images and corresponding virtual object models. The contextual content dataset comprises a second set of images and corresponding virtual object models retrieved from the server.
    Type: Grant
    Filed: February 27, 2017
    Date of Patent: February 19, 2019
    Assignee: DAQRI, LLC
    Inventor: Brian Mullins
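The two-tier dataset described above behaves like a device-side cache: the contextual dataset (built from the device's own captures) is consulted first, the server-provided primary dataset is the fallback, and misses can be queued for the next server sync. A minimal sketch under those assumptions; string keys stand in for the actual image-recognition step, and all names are hypothetical:

```python
class ContentCache:
    def __init__(self, primary: dict):
        self.primary = primary    # first set of images -> virtual object models
        self.contextual = {}      # second set, refined per device from captures
        self.misses = []          # keys to request from the server later

    def lookup(self, image_key: str):
        # Contextual data is checked first, then the primary dataset;
        # unknown images are recorded for the next server update.
        if image_key in self.contextual:
            return self.contextual[image_key]
        if image_key in self.primary:
            return self.primary[image_key]
        self.misses.append(image_key)
        return None

    def update_contextual(self, image_key: str, model: str):
        self.contextual[image_key] = model

cache = ContentCache(primary={"logo": "model_logo"})
cache.update_contextual("poster", "model_poster")
hit_ctx = cache.lookup("poster")   # served from the contextual dataset
hit_pri = cache.lookup("logo")     # falls back to the primary dataset
miss = cache.lookup("mural")       # unknown: recorded as a miss
```

Keeping both datasets on-device is what lets recognition keep working offline while the contextual set adapts to what the device actually sees.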
  • Patent number: 10210565
    Abstract: A system may provide an augmented environment that facilitates a transaction. The system may store profile data including user payment or user profile information. The system may then receive environmental data including audio and visual information representing a physical environment. The system may then receive first user input data indicative of a selection of one or more items present in the physical environment, and identify one or more action items in the environmental data based on the first user input data. In response to this identification, the system may augment the environmental data by adding virtual environmental data, and then provide this virtual environmental data to a device to create an augmented environment. The system can then receive second user input data, and provide purchase request data to a merchant terminal to enable a transaction related to the one or more action items.
    Type: Grant
    Filed: August 21, 2018
    Date of Patent: February 19, 2019
    Assignee: CAPITAL ONE SERVICES, LLC
    Inventors: Karen Nickerson, Justin Wishne, Drew Jacobs, Justin Smith, Marco S. Giampaolo, Hannes Jouhikainen
  • Patent number: 10192339
    Abstract: A comprehensive solution is provided to transforming locations and retail spaces into high-traffic VR attractions that provide a VR experience blended with a real-world tactile experience. A modular stage and kit of stage accessories suitable for a wide variety of commercial venues contains all of the necessary equipment, infrastructure, technology and content to assemble and operate a tactile, onsite VR attraction. Utilizing a modular set of set design and physical props, the physical structure and layout of the installations are designed to be easily rearranged and adapted to new VR content, without requiring extensive construction or specialized expertise.
    Type: Grant
    Filed: November 30, 2017
    Date of Patent: January 29, 2019
    Assignee: Unchartedvr Inc.
    Inventors: Kalon Ross Gutierrez, John Duncan, Douglas Griffin, Richard Schulze
  • Patent number: 10192340
    Abstract: A comprehensive solution is provided to transforming locations and retail spaces into high-traffic VR attractions that provide a VR experience blended with a real-world tactile experience. A modular stage and kit of stage accessories suitable for a wide variety of commercial venues contains all of the necessary equipment, infrastructure, technology and content to assemble and operate a tactile, onsite VR attraction. Utilizing a modular set of set design and physical props, the physical structure and layout of the installations are designed to be easily rearranged and adapted to new VR content, without requiring extensive construction or specialized expertise.
    Type: Grant
    Filed: November 30, 2017
    Date of Patent: January 29, 2019
    Assignee: Unchartedvr Inc.
    Inventors: Kalon Ross Gutierrez, John Duncan, Douglas Griffin, Richard Schulze
  • Patent number: 10191538
    Abstract: An electronic device determines information about a target and provides the information to another electronic device that has an obstructed view of the target. The other electronic device displays an image of the target with an orientation and a location of the target.
    Type: Grant
    Filed: October 16, 2017
    Date of Patent: January 29, 2019
    Inventors: Philip Lyren, Robert Lyren
  • Patent number: 10192115
    Abstract: Described herein are a system and methods for generating a record of objects, as well as respective positions for those objects, with respect to a user. In some embodiments, a user may use a user device to scan an area that includes one or more objects. The one or more objects may be identified from image information obtained from the user device. Positional information for each of the one or more objects may be determined from depth information obtained from a depth sensor installed upon the user device. In some embodiments, the one or more objects may be mapped to object models stored in an object model database. The image information displayed on the user device may be augmented so that it depicts the object models associated with the one or more objects instead of the actual objects.
    Type: Grant
    Filed: December 13, 2017
    Date of Patent: January 29, 2019
    Assignee: Lowe's Companies, Inc.
    Inventors: Mason E. Sheffield, Josh Shabtai
  • Patent number: 10191284
    Abstract: Aspects of the present invention relate to methods and systems for see-through computer display systems with a wide field of view.
    Type: Grant
    Filed: May 11, 2017
    Date of Patent: January 29, 2019
    Assignee: Osterhout Group, Inc.
    Inventors: John N. Border, John D. Haddick