Patents Examined by Thomas J. Lett
  • Patent number: 11488367
    Abstract: A method for intervention in a radioactive zone includes production of a digital model representing the three-dimensional topography of the radioactive zone (1); and intervention of the at least one operator in the radioactive zone (1). The intervention step includes repeated measurement of the radiation intensity by a portable detector (3), and determination of the spatial coordinates of the portable detector (3) at the time of the measurement; recording of a plurality of said measurements and the corresponding spatial coordinates in the digital model; materialisation of the recorded measurements in an augmented reality device (5) worn by the at least one operator, by a plurality of discrete holographic symbols (7).
    Type: Grant
    Filed: June 20, 2019
    Date of Patent: November 1, 2022
    Assignee: FRAMATOME
    Inventors: Guillaume Pons, Franck Grember, Audrey Casteleira
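    A minimal Python sketch of the data handling described in the abstract of 11488367 above: dose-rate measurements tagged with the detector's coordinates are recorded into the digital model and later turned into discrete marker positions for the AR display. Class names, fields, and the display threshold are illustrative assumptions, not the patented implementation.
    ```python
    # Illustrative sketch only; names and thresholds are assumptions, not the patented method.
    from dataclasses import dataclass, field

    @dataclass
    class Measurement:
        x: float          # detector position (m) in the digital model's frame
        y: float
        z: float
        dose_rate: float  # measured radiation intensity (e.g. mSv/h)

    @dataclass
    class DigitalModel:
        """Digital model of the zone: topography plus recorded measurements."""
        measurements: list = field(default_factory=list)

        def record(self, m: Measurement) -> None:
            self.measurements.append(m)

        def hologram_symbols(self, threshold: float = 0.1):
            """Return (position, label) pairs to render as discrete AR symbols."""
            return [((m.x, m.y, m.z), f"{m.dose_rate:.2f} mSv/h")
                    for m in self.measurements if m.dose_rate >= threshold]

    model = DigitalModel()
    model.record(Measurement(1.0, 2.0, 0.5, dose_rate=0.35))
    model.record(Measurement(1.5, 2.2, 0.5, dose_rate=0.02))
    print(model.hologram_symbols())   # only the first point exceeds the display threshold
    ```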
  • Patent number: 11487121
    Abstract: Disclosed is an improved diffraction structure for 3D display systems. The improved diffraction structure includes an intermediate layer that resides between a waveguide substrate and a top grating surface. The top grating surface comprises a first material that corresponds to a first refractive index value, the underlayer comprises a second material that corresponds to a second refractive index value, and the substrate comprises a third material that corresponds to a third refractive index value.
    Type: Grant
    Filed: March 23, 2021
    Date of Patent: November 1, 2022
    Assignee: Magic Leap, Inc.
    Inventors: Robert D. Tekolste, Michael A. Klug, Paul M. Greco, Brian T. Schowengerdt
  • Patent number: 11477652
    Abstract: The system and methods described herein aid in the defense of unmanned vehicles, such as aerial vehicles, from Wi-Fi cyber attacks. Such attacks usually do not last long and, in the case of many point-to-point command and control systems, originate from close proximity to the unmanned vehicle. The system and methods described herein allow a team to rapidly identify and physically respond to an adversary trying to take control of the unmanned vehicle. Another aspect of the embodiment taught herein is to allow a Wi-Fi signal to be located hands-free, by visualizing the source of the signal on an augmented reality display coupled to an antenna array.
    Type: Grant
    Filed: November 25, 2019
    Date of Patent: October 18, 2022
    Assignee: United States of America as represented by the Secretary of the Navy
    Inventors: Mark Bilinski, Gerald Thomas Burnette, Fred William Greene, Garrison Buckminster Price
  • Patent number: 11475617
    Abstract: In implementations of path-constrained drawing with visual properties based on a drawing tool, a digital artwork editing system includes a user interface in which a constraint path can be designated in a digital artwork. A stroke input can be sampled as it is drawn with a drawing tool, and for each processing interval of the stroke input, a start point of the stroke input and a tangent line to the constraint path are determined. An end point of the stroke input is projected onto a line through the start point parallel to the tangent line, and a stroke is rendered along this line. Hence, the stroke is rendered based on the stroke input in a piecewise linear fashion, simultaneously constrained by the constraint path and rendered based on how the drawing tool is used.
    Type: Grant
    Filed: April 26, 2021
    Date of Patent: October 18, 2022
    Assignee: Adobe Inc.
    Inventor: Dwight O. Rodgers
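    A geometry sketch of the projection step described in the abstract of 11475617 above: find the tangent of the constraint path nearest the stroke's start point, then project the raw end point onto the line through the start point parallel to that tangent. Function names and the polyline path representation are hypothetical, not Adobe's API.
    ```python
    # Sketch of the per-interval projection; helper names are hypothetical.
    import numpy as np

    def nearest_segment_tangent(path: np.ndarray, p: np.ndarray) -> np.ndarray:
        """Unit tangent of the polyline segment closest to point p."""
        best, best_d = None, np.inf
        for a, b in zip(path[:-1], path[1:]):
            ab = b - a
            t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
            d = np.linalg.norm(p - (a + t * ab))
            if d < best_d:
                best_d, best = d, ab
        return best / np.linalg.norm(best)

    def constrained_stroke(path: np.ndarray, start: np.ndarray, end: np.ndarray):
        """Project the raw stroke end point onto the line through `start`
        parallel to the constraint path's tangent, one processing interval at a time."""
        tangent = nearest_segment_tangent(path, start)
        projected_end = start + np.dot(end - start, tangent) * tangent
        return start, projected_end     # piecewise-linear stroke segment to render

    path = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 10.0]])   # constraint path (polyline)
    print(constrained_stroke(path, np.array([2.0, 1.0]), np.array([5.0, 3.0])))
    ```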
  • Patent number: 11475650
    Abstract: A method for facilitating an environmentally adaptive extended reality display in a physical environment includes virtually displaying content via a wearable extended reality appliance operating in the physical environment, wherein displaying content via the wearable extended reality appliance is associated with at least one adjustable extended reality display parameter. Image data is obtained from the wearable extended reality appliance and a specific environmental change unrelated to the virtually displayed content is detected in the image data. A group of rules associating environmental changes with changes in the at least one adjustable extended reality display parameter is accessed and a specific rule of the group of rules is determined, the specific rule corresponding to the specific environmental change. The specific rule is implemented to adjust the at least one adjustable extended reality display parameter based on the specific environmental change.
    Type: Grant
    Filed: April 1, 2022
    Date of Patent: October 18, 2022
    Assignee: MULTINARITY LTD
    Inventors: Tamir Berliner, Tomer Kahan, Orit Dolev
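    A minimal sketch of the rule lookup described in the abstract of 11475650 above: a detected environmental change is matched against a group of rules, and the matching rule adjusts an extended-reality display parameter. The rules, parameters, and deltas are invented examples, not Multinarity's rule set.
    ```python
    # Invented example rules; only the change -> parameter-adjustment structure is from the abstract.
    RULES = {
        "ambient_light_increased":  ("brightness", +0.2),
        "ambient_light_decreased":  ("brightness", -0.2),
        "person_entered_view":      ("opacity",    -0.3),   # let the real world show through
        "user_started_walking":     ("content_density", -0.5),
    }

    display = {"brightness": 0.6, "opacity": 0.9, "content_density": 1.0}

    def apply_rule(detected_change: str) -> None:
        """Look up the rule for the detected change and adjust the display parameter."""
        if detected_change not in RULES:
            return                      # no matching rule; leave the display as-is
        param, delta = RULES[detected_change]
        display[param] = min(1.0, max(0.0, display[param] + delta))

    apply_rule("ambient_light_increased")
    print(display)   # brightness nudged up by 0.2
    ```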
  • Patent number: 11468258
    Abstract: An information processing apparatus acquires viewpoint information indicating a position and a direction of a virtual viewpoint designated for generating a virtual viewpoint image in which a predetermined object is set as an object to be targeted, and object information indicating a position of an object. The information processing apparatus further identifies an object to be targeted that corresponds to the viewpoint information, based on the acquired viewpoint information and the acquired object information, and outputs information regarding the identified object.
    Type: Grant
    Filed: October 15, 2020
    Date of Patent: October 11, 2022
    Assignee: Canon Kabushiki Kaisha
    Inventors: Rei Ishikawa, Kazuhiro Matsubayashi
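    A sketch of one plausible way to identify the "object to be targeted" from viewpoint information, as the abstract of 11468258 above describes: pick the object whose direction from the virtual viewpoint is most aligned with the viewing direction. The alignment heuristic is an assumption for illustration, not Canon's method.
    ```python
    # Assumed heuristic: best cosine alignment between view direction and object direction.
    import numpy as np

    def identify_target(view_pos, view_dir, objects):
        """objects: dict of name -> 3-D position. Returns the best-aligned object name."""
        view_dir = np.asarray(view_dir, dtype=float)
        view_dir /= np.linalg.norm(view_dir)
        best_name, best_cos = None, -1.0
        for name, pos in objects.items():
            to_obj = np.asarray(pos, dtype=float) - view_pos
            cos = np.dot(to_obj / np.linalg.norm(to_obj), view_dir)
            if cos > best_cos:
                best_name, best_cos = name, cos
        return best_name

    objects = {"player_7": [10.0, 0.0, 2.0], "ball": [3.0, 4.0, 0.0]}
    print(identify_target(np.array([0.0, 0.0, 1.0]), [1.0, 0.0, 0.0], objects))  # -> player_7
    ```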
  • Patent number: 11462128
    Abstract: A haptic feedback system to stimulate physical sensation of handling a fluid in virtual spaces. The system includes a physical vessel containing a solid object and mechanical means to move the solid object therein, a virtual reality module to track a position of the physical vessel and a corresponding virtual vessel, logic to simulate a fluid contained in the virtual vessel based on the tracked position of the virtual vessel, logic to calculate coordinates of a center of gravity for the simulated fluid based on the tracked position of the virtual vessel, logic to translate the calculated coordinates of the center of gravity for the simulated fluid into cylindrical coordinates to which to move the solid object in the physical vessel, and logic to send instructions to the mechanical means to move the solid object in the physical vessel based on the cylindrical coordinates to shift the weight of the physical vessel in accordance with the center of gravity of the simulated fluid in the virtual vessel.
    Type: Grant
    Filed: October 18, 2019
    Date of Patent: October 4, 2022
    Assignee: Arizona Board of Regents on behalf of Arizona State University
    Inventors: Robert LiKamWa, Shahabedin Sagheb, Alireza Bahremand, Byron Lahey, Frank Wencheng Liu, Assegid Kidane
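    A short sketch of the coordinate step in the abstract of 11462128 above: compute the simulated fluid's center of gravity and convert it from Cartesian to cylindrical coordinates for the actuator that shifts the solid object inside the physical vessel. The particle representation and frame conventions are assumptions.
    ```python
    # Center-of-gravity and Cartesian -> cylindrical conversion; frames are assumed.
    import math

    def center_of_gravity(particles):
        """Mass-weighted average position of simulated fluid particles [(mass, (x, y, z)), ...]."""
        total = sum(m for m, _ in particles)
        return tuple(sum(m * p[i] for m, p in particles) / total for i in range(3))

    def to_cylindrical(x, y, z):
        """Cartesian (x, y, z) -> cylindrical (r, theta, z), theta in radians."""
        return math.hypot(x, y), math.atan2(y, x), z

    cog = center_of_gravity([(1.0, (0.02, 0.00, 0.05)), (2.0, (0.05, 0.03, 0.04))])
    r, theta, z = to_cylindrical(*cog)
    print(f"move weight to r={r:.3f} m, theta={math.degrees(theta):.1f} deg, z={z:.3f} m")
    ```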
  • Patent number: 11455750
    Abstract: A system may include a processing resource, and a computing device comprising instructions executable to: extract data from objects in a digital image of a physical environment; utilize the extracted data to identify information about a task to be performed by an operator at the physical environment; and select, based on a characteristic of the operator, a portion of the identified information about the task to include in a visual overlay to be displayed to the operator at the physical environment.
    Type: Grant
    Filed: April 30, 2018
    Date of Patent: September 27, 2022
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Allen Owen Wright, Krishna Prasad Muraleedharan Pillai, Evan Scheessele
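    A minimal sketch of the selection step in the abstract of 11455750 above: filter the information identified for a task down to the portion appropriate for the operator, keyed on an operator characteristic (here, an assumed experience level). The levels and overlay items are invented for illustration.
    ```python
    # Invented task items and experience levels; only the characteristic-based filtering is from the abstract.
    TASK_INFO = [
        {"text": "Replace toner cartridge",            "min_level": "novice"},
        {"text": "Torque spec: 2.5 N·m on rear panel", "min_level": "technician"},
        {"text": "Firmware flash procedure",           "min_level": "engineer"},
    ]
    LEVEL_RANK = {"novice": 0, "technician": 1, "engineer": 2}

    def overlay_items(operator_level: str):
        """Keep only the items appropriate for the operator's experience level."""
        rank = LEVEL_RANK[operator_level]
        return [item["text"] for item in TASK_INFO if LEVEL_RANK[item["min_level"]] <= rank]

    print(overlay_items("technician"))   # novice + technician items; engineer item withheld
    ```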
  • Patent number: 11450101
    Abstract: A head-mounted device can be operated to detect and respond to a user's behavior. The head-mounted device can be regularly and frequently worn while the user performs regular daily tasks, allowing the head-mounted device to collect a large volume of data across a long duration of time so that the cognitive status of the user can be evaluated with greater confidence than may be achieved with an evaluation of short duration and/or frequency. Evaluations performed by the head-mounted device can facilitate determination of which disorder is present, the severity of the disorder, trends, and/or forecasts. The head-mounted device can provide feedback that can guide and direct a user to correct actions despite a tendency to do otherwise due to cognitive decline.
    Type: Grant
    Filed: March 25, 2020
    Date of Patent: September 20, 2022
    Assignee: Apple Inc.
    Inventor: Paul X. Wang
  • Patent number: 11450074
    Abstract: A method for showing objects in an augmented reality environment includes the steps of an Augmented Reality (AR) device obtaining spatial information and object identification (ID) from an anchor device. The AR device obtains parameters of the objects from a cloud system. The parameters of an object include the image of the object, the audio of the object, and a first spatial relationship between the object and an anchor device. The AR device obtains a second spatial relationship between the AR device and the anchor device. A third spatial relationship between the object and the AR device can be obtained according to the first spatial relationship and the second spatial relationship. The image of the object and the audio of the object are displayed or projected in the AR device according to the third spatial relationship.
    Type: Grant
    Filed: July 9, 2021
    Date of Patent: September 20, 2022
    Assignee: AMBIT MICROSYSTEMS (SHANGHAI) LTD.
    Inventors: Yu-Hu Yan, Chien-Sheng Wu
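    A sketch of the transform composition implied by the abstract of 11450074 above: given the object's pose relative to the anchor (first relationship) and the AR device's pose relative to the anchor (second relationship), the object's pose relative to the AR device (third relationship) follows by composing homogeneous transforms. The matrix conventions are assumptions.
    ```python
    # Composing the third spatial relationship from the first two; conventions assumed.
    import numpy as np

    def make_pose(rotation_3x3, translation_3):
        """Build a 4x4 homogeneous transform from rotation and translation."""
        T = np.eye(4)
        T[:3, :3] = rotation_3x3
        T[:3, 3] = translation_3
        return T

    # First relationship: object pose in the anchor's frame.
    T_anchor_object = make_pose(np.eye(3), [2.0, 0.0, 1.0])
    # Second relationship: AR device pose in the anchor's frame.
    T_anchor_device = make_pose(np.eye(3), [0.0, -1.0, 1.5])

    # Third relationship: object pose in the AR device's frame.
    T_device_object = np.linalg.inv(T_anchor_device) @ T_anchor_object
    print(T_device_object[:3, 3])   # where to render/play the object: [ 2.   1.  -0.5]
    ```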
  • Patent number: 11450033
    Abstract: Provided are an apparatus and method for experiencing an augmented reality (AR)-based screen sports match which enable even a child, an elderly person, and a person with a disability to easily and safely experience a ball sports match, such as tennis, badminton, or squash, as a screen sports match without using a wearable marker or sensor, a wearable display, an actual ball, and an actual tool.
    Type: Grant
    Filed: June 29, 2021
    Date of Patent: September 20, 2022
    Assignee: Electronics and Telecommunications Research Institute
    Inventors: Jong Sung Kim, Youn Hee Gil, Seong Min Baek, Hee Sook Shin, Seong Il Yang, Cho Rong Yu, Sung Jin Hong
  • Patent number: 11417031
    Abstract: Methods, devices, and systems related to a computing device for highlighting a tagged object with augmented reality (AR) are described. An example method can include identifying, using a computing device, an object tagged with a sensor within a plurality of objects. The example method can include tracking movement of the object based on communication between the sensor and the computing device and highlighting the object via AR based on the tracking and responsive to a request to locate the object.
    Type: Grant
    Filed: August 7, 2020
    Date of Patent: August 16, 2022
    Assignee: Micron Technology, Inc.
    Inventors: Bhumika Chhabra, Radhika Viswanathan, Carla L. Christensen, Zahra Hosseinimakarem
  • Patent number: 11417042
    Abstract: Methods and systems are provided for generating animation for non-player characters (NPCs) in a game. The method includes operations for examining a scene for an NPC that is providing voice output. The method further includes operations for examining the voice output to identify an intensity modulation of the voice output. In addition, the method further includes processing the intensity modulation to predict body language signals (BLS) for the NPC. Moreover, the BLS are used to cause features of the NPC to react consistently with the emotion content of the intensity modulation identified in the voice output.
    Type: Grant
    Filed: November 21, 2019
    Date of Patent: August 16, 2022
    Assignee: Sony Interactive Entertainment Inc.
    Inventor: Javier Fernandez Rico
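    A rough sketch of the analysis step named in the abstract of 11417042 above: measure the short-time intensity envelope of the NPC's voice output and map it to a coarse body-language signal. The RMS framing, thresholds, and signal labels are invented for illustration; the patent predicts the signals rather than using a fixed table.
    ```python
    # Invented intensity -> body-language mapping; only the intensity-modulation idea is from the abstract.
    import numpy as np

    def intensity_envelope(samples: np.ndarray, frame: int = 1024):
        """RMS energy per frame of a mono audio signal."""
        n = len(samples) // frame * frame
        frames = samples[:n].reshape(-1, frame)
        return np.sqrt((frames ** 2).mean(axis=1))

    def body_language_signal(rms: float) -> str:
        if rms < 0.05:
            return "idle"            # soft speech: subtle, relaxed gestures
        if rms < 0.2:
            return "engaged"         # normal speech: moderate head and hand motion
        return "animated"            # loud/emphatic speech: broad gestures

    voice = 0.3 * np.sin(np.linspace(0, 200 * np.pi, 48000))   # stand-in for decoded NPC audio
    for rms in intensity_envelope(voice)[:3]:
        print(body_language_signal(rms))
    ```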
  • Patent number: 11416789
    Abstract: Aspects of the disclosure provide a computer-implemented method and system for the assignment of roadside assistance service providers such as tow trucks to distressed vehicles/drivers requiring roadside assistance. The methods and systems may include a roadside assistance service provider system with a collection module, an assignment module, and a feedback module. The collection module collects roadside assistance service provider information and historical statistics from real-world information and stores the information in a database that may then be analyzed using particular rules and formulas. The assignment module assigns particular roadside assistance service providers to particular distressed vehicles/drivers based on one or more characteristics.
    Type: Grant
    Filed: January 5, 2021
    Date of Patent: August 16, 2022
    Assignee: ALLSTATE INSURANCE COMPANY
    Inventors: Jennifer A. Brandmaier, Jason Balabas, Tao Chen, Christopher J. Lieggi, Robert A. Spinneweber
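    A toy sketch of the assignment idea in the abstract of 11416789 above: score candidate roadside assistance providers for a distressed vehicle from a couple of characteristics (distance and historical rating) and pick the best. The providers, weights, and scoring formula are invented; the patented assignment module applies its own rules to collected real-world data.
    ```python
    # Invented providers and weights; only the characteristic-based assignment is from the abstract.
    providers = [
        {"name": "A-1 Towing",   "distance_km": 12.0, "rating": 4.8},
        {"name": "Fast Hook",    "distance_km":  4.5, "rating": 3.9},
        {"name": "Metro Rescue", "distance_km":  7.0, "rating": 4.6},
    ]

    def score(p, w_distance=1.0, w_rating=2.0):
        """Lower distance and higher rating are better; return a single comparable score."""
        return w_rating * p["rating"] - w_distance * p["distance_km"]

    def assign(candidates):
        return max(candidates, key=score)["name"]

    print(assign(providers))   # -> "Fast Hook" with these example weights
    ```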
  • Patent number: 11410217
    Abstract: An augmented reality-based lighting design method includes displaying, by an augmented reality device, a real-time image of a target physical area on a display screen. The method further includes displaying, by the augmented reality device, a lighting fixture 3-D model on the display screen in response to a user input, where the lighting fixture 3-D model is overlaid on the real-time image of the target physical area. The method also includes displaying, by the augmented reality device, a lighting pattern on the display screen overlaid on the real-time image of the target physical area, wherein the lighting pattern is generated based on at least photometric data associated with the lighting fixture 3-D model.
    Type: Grant
    Filed: June 30, 2020
    Date of Patent: August 9, 2022
    Assignee: SIGNIFY HOLDING B.V.
    Inventors: Nam Chin Cho, Parth Joshi, William Thomas Cook
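    A sketch of how a lighting pattern can be derived from photometric data, as the abstract of 11410217 above describes: a small intensity table (candela versus beam angle) plus the inverse-square and cosine laws give an illuminance value for each floor point to shade in the overlay. The intensity table is a made-up stand-in for real fixture photometry (e.g. an IES file).
    ```python
    # Made-up photometric table; the inverse-square/cosine illuminance math is standard.
    import math

    INTENSITY_CD = {0: 1000.0, 15: 900.0, 30: 600.0, 45: 250.0, 60: 50.0}   # candela by angle

    def intensity_at(angle_deg: float) -> float:
        """Nearest-entry lookup into the photometric table (a real tool would interpolate)."""
        key = min(INTENSITY_CD, key=lambda a: abs(a - angle_deg))
        return INTENSITY_CD[key]

    def illuminance(fixture_height_m: float, radial_offset_m: float) -> float:
        """Lux on a horizontal plane at a point offset from directly below the fixture."""
        d = math.hypot(fixture_height_m, radial_offset_m)
        angle = math.degrees(math.atan2(radial_offset_m, fixture_height_m))
        cos_incidence = fixture_height_m / d
        return intensity_at(angle) * cos_incidence / d ** 2

    for r in (0.0, 1.0, 2.0):
        print(f"{r} m from nadir: {illuminance(2.5, r):.0f} lx")
    ```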
  • Patent number: 11410400
    Abstract: Described herein are systems and methods for maintaining color calibration using common objects. In an exemplary embodiment, an AR system includes a forward-facing camera, an AR display, a processor, and a user interface. The processor is configured to receive image data from the forward-facing camera and identify any known objects depicted in the image data. The processor then determines RGB information for at least one test rendering of the identified known object and displays it via the AR display. Input from the user interface, indicating which of the test renderings was the closest match to the real-world object and the level of satisfaction with the match, is received by the processor and used to update an AR display color calibration model. More test renderings may be iteratively provided to improve the accuracy of the calibration.
    Type: Grant
    Filed: December 21, 2018
    Date of Patent: August 9, 2022
    Assignee: PCMS Holdings, Inc.
    Inventors: David Wyble, Patrick Igoe
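    A minimal sketch of the feedback loop in the abstract of 11410400 above: render candidate colorings of a recognized object, ask the user which looks closest to the real object, and nudge a per-channel gain toward that choice. The per-channel gain model and step size are assumptions; the patent maintains a fuller display color calibration model.
    ```python
    # Assumed per-channel-gain calibration model; the user-feedback loop is from the abstract.
    KNOWN_OBJECT_RGB = (200, 40, 40)          # reference colour of a recognised object (e.g. a logo)
    gain = [1.0, 1.0, 1.0]                    # current display calibration (one gain per channel)

    def render(rgb, g):
        return tuple(min(255, int(c * gi)) for c, gi in zip(rgb, g))

    def update_gain(chosen_rendering, step=0.5):
        """Move the calibration gains toward the rendering the user judged closest."""
        for i, (chosen, ref) in enumerate(zip(chosen_rendering, KNOWN_OBJECT_RGB)):
            implied = chosen / ref if ref else 1.0
            gain[i] += step * (implied - gain[i])

    candidates = [render(KNOWN_OBJECT_RGB, [0.9, 1.0, 1.1]),
                  render(KNOWN_OBJECT_RGB, [1.1, 1.0, 0.9])]
    user_choice = candidates[0]               # user picked the first test rendering as the closest match
    update_gain(user_choice)
    print(gain)                               # gains drift toward the user-preferred rendering
    ```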
  • Patent number: 11392876
    Abstract: Methods, systems, and computer-readable media for deploying and implementing enterprise policies that control augmented reality computing functions are presented. A computing device may receive policy information defining policies that, when implemented, control capture of augmented renderings. After receiving the policy information, the computing device may intercept a request to capture at least one view having at least one augmented reality element. In response to intercepting the request, the computing device may determine whether the policies allow capture of views comprising augmented reality elements. Based on determining that the policies allow capture, the computing device may store view information associated with the at least one view having the at least one augmented reality element. Based on determining that the policies do not allow capture, the computing device may prevent the at least one view having the at least one augmented reality element from being captured.
    Type: Grant
    Filed: January 4, 2019
    Date of Patent: July 19, 2022
    Assignee: Citrix Systems, Inc.
    Inventor: Thierry Duchastel
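    A minimal sketch of the interception logic in the abstract of 11392876 above: a capture request is checked against enterprise policy before any view containing augmented reality elements is stored. The policy fields and request shape are invented for illustration, not Citrix's schema.
    ```python
    # Invented policy and view structures; the allow/prevent decision flow is from the abstract.
    POLICY = {"allow_capture_of_ar_elements": False}

    def handle_capture_request(view):
        """Return the stored view, or None if policy blocks capture of augmented content."""
        has_ar = bool(view.get("ar_elements"))
        if has_ar and not POLICY["allow_capture_of_ar_elements"]:
            return None                         # prevent the capture
        return {"pixels": view["pixels"], "ar_elements": view.get("ar_elements", [])}

    view = {"pixels": "<frame buffer>", "ar_elements": ["confidential_dashboard_overlay"]}
    print(handle_capture_request(view))         # -> None while the policy disallows capture
    ```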
  • Patent number: 11386799
    Abstract: A vision-haptics fused augmented reality simulator for dental surgical skill training, including a dental simulation training platform constructed based on an artificial head phantom; a dental operation training system based on a haptic feedback device; an observation system based on an augmented reality head-mounted display; generating a virtual dental model by modeling based on CBCT data and scan data of a patient's dental cavity, to construct a virtual dental environment; based on the virtual dental model and feature points obtained through scanning on the artificial head phantom, performing a spatial matching of a virtual dental cavity and a dental model; in a virtual dental surgery simulation method, outputting haptics information and visual information at frequencies of not less than 1000 Hz and 60 Hz, respectively; performing a visual information processing method on grid data; and performing a haptics-vision space calibration method based on information of an operator's head.
    Type: Grant
    Filed: August 21, 2020
    Date of Patent: July 12, 2022
    Assignee: BEIJING UNIDRAW VR TECHNOLOGY RESEARECH INSTITUTE CO., LTD
    Inventors: Aimin Hao, Yu Cong, Yongtao Zhao, Xiaohan Zhao
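    A sketch of the dual-rate requirement stated in the abstract of 11386799 above (haptics output at no less than 1000 Hz, visuals at no less than 60 Hz): one fixed-timestep loop services both, running many haptic steps per rendered frame. The loop structure and placeholder functions are assumptions, not the simulator's architecture.
    ```python
    # Assumed fixed-timestep scheduler; only the 1000 Hz / 60 Hz rates are from the abstract.
    HAPTIC_DT = 1.0 / 1000.0      # 1 kHz force-feedback update
    VISUAL_DT = 1.0 / 60.0        # 60 Hz display update

    def haptic_step(t): pass      # placeholder: compute and send tool forces
    def visual_step(t): pass      # placeholder: update the AR head-mounted display

    def run(duration_s=0.1):
        t, next_visual, frames, haptic_ticks = 0.0, 0.0, 0, 0
        while t < duration_s:
            haptic_step(t); haptic_ticks += 1
            if t >= next_visual:
                visual_step(t); frames += 1
                next_visual += VISUAL_DT
            t += HAPTIC_DT
        print(f"{haptic_ticks} haptic updates, {frames} frames in {duration_s}s")

    run()   # roughly 100 haptic updates and 6 frames over 0.1 s of simulated time
    ```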
  • Patent number: 11388284
    Abstract: A process for real-time data exchange allows for sharing data, including metadata such as user data and phone data, between multiple handsets so that handsets may automatically display such data upon initiation or establishment of a telephone call, and in some arrangements throughout the life of the call. In this way, the need for subsequent data transmissions is reduced or eliminated, making bandwidth usage more efficient and conserving battery power.
    Type: Grant
    Filed: October 8, 2020
    Date of Patent: July 12, 2022
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Jason Mathew Eilts, Alvin Chardon, Ben Greenier
  • Patent number: 11368573
    Abstract: In some aspects, a user equipment (UE) determines, using an inertial measurement unit, an orientation of the UE and determines, using ambient light sensors, an ambient light condition of the UE. The UE determines, using a machine learning module and based on the orientation and the ambient light condition, a position of the UE. If the position comprises an on-body position, the UE uses the machine learning module and touch data received by a touchscreen of the UE to determine whether the position comprises an in-hand position. If the position comprises the in-hand position, the UE determines, using the machine learning module and based on the orientation and the touch data, a grip mode. If the position comprises an off-body position, the UE determines, using the machine learning module and at least one of the inertial measurement unit or the ambient light sensors, a user presence or a user absence.
    Type: Grant
    Filed: May 11, 2021
    Date of Patent: June 21, 2022
    Assignee: QUALCOMM Incorporated
    Inventors: Diyan Teng, Mehul Soman, Nisarg Trivedi, Rashmi Kulkarni, Justin McGloin
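    A sketch of the decision flow in the abstract of 11368573 above: an on-body/off-body determination drives which follow-up determination is made (in-hand and grip mode, or user presence/absence). The classifier calls are stubbed with simple heuristics; the patented method uses a trained machine learning module on the sensor data.
    ```python
    # Stubbed heuristics in place of the ML module; the branching logic follows the abstract.
    def classify_position(orientation_deg, ambient_lux):
        # Stub: dark suggests a pocket/bag (on-body); bright and near-flat suggests a table (off-body).
        if ambient_lux < 5:
            return "on_body"
        return "on_body" if abs(orientation_deg) > 30 else "off_body"

    def decide(orientation_deg, ambient_lux, touch_points, recent_motion):
        position = classify_position(orientation_deg, ambient_lux)
        if position == "on_body":
            if touch_points > 0:                       # touch data implies the device is in hand
                grip = "two_handed" if touch_points >= 4 else "one_handed"
                return {"position": "in_hand", "grip_mode": grip}
            return {"position": "on_body"}             # e.g. pocket or bag
        user = "present" if recent_motion else "absent"
        return {"position": "off_body", "user": user}

    print(decide(orientation_deg=45, ambient_lux=120, touch_points=2, recent_motion=False))
    ```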