Patents Examined by Jordany Nunez
  • Patent number: 11960790
    Abstract: A computer implemented method includes detecting user interaction with mixed reality displayed content in a mixed reality system. User focus is determined as a function of the user interaction using a spatial intent model. A length of time for extending voice engagement with the mixed reality system is modified based on the determined user focus. Detecting user interaction with the displayed content may include tracking eye movements to determine objects in the displayed content at which the user is looking and determining a context of a user dialog during the voice engagement.
    Type: Grant
    Filed: May 27, 2021
    Date of Patent: April 16, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Austin S. Lee, Jonathan Kyle Palmer, Anthony James Ambrus, Mathew J. Lamb, Sheng Kai Tang, Sophie Stellmach
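    The mechanism above (patent 11960790) amounts to scaling a voice-listening window by a gaze-derived focus score. Below is a minimal illustrative sketch in Python; the score formula, field names, and timeout values are assumptions for illustration, not the patented implementation.
    ```python
    # Illustrative sketch only: extend a voice-engagement timeout when gaze and
    # dialog context suggest the user is still focused. Values are assumptions.
    from dataclasses import dataclass

    BASE_TIMEOUT_S = 5.0   # hypothetical default listening window
    MAX_TIMEOUT_S = 20.0   # hypothetical upper bound after extension

    @dataclass
    class FocusEstimate:
        gazed_object: str    # object the eye tracker reports the user is looking at
        dwell_s: float       # how long the gaze has dwelt on that object
        dialog_topic: str    # topic inferred from the ongoing voice dialog

    def focus_score(estimate: FocusEstimate) -> float:
        """Crude stand-in for a spatial intent model: focus is high when the gazed
        object matches the dialog topic and the dwell time is long."""
        topical = 1.0 if estimate.gazed_object == estimate.dialog_topic else 0.3
        dwell = min(estimate.dwell_s / 2.0, 1.0)   # saturate after two seconds
        return topical * dwell

    def engagement_timeout(estimate: FocusEstimate) -> float:
        """Scale the listening window between the base and maximum timeouts."""
        score = focus_score(estimate)
        return BASE_TIMEOUT_S + score * (MAX_TIMEOUT_S - BASE_TIMEOUT_S)

    print(engagement_timeout(FocusEstimate("lamp", 3.0, "lamp")))   # 20.0
    print(engagement_timeout(FocusEstimate("lamp", 0.5, "chair")))  # 6.125
    ```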
  • Patent number: 11947729
    Abstract: The disclosure provides a gesture recognition method and device, a gesture control method and device, and a virtual reality apparatus. The gesture recognition method includes: obtaining a hand image of a user acquired by each lens of a binocular camera; recognizing, through a pre-constructed recognition model, a first group of hand bone points from the obtained hand image, to obtain a hand bone point image in which the first group of recognized hand bone points is marked on a hand region of the hand image; obtaining, according to the obtained hand bone point image, two-dimensional positional relations and three-dimensional positional relations between various bone points in a second group of hand bone points as hand gesture data of the user; and recognizing a gesture of the user according to the hand gesture data.
    Type: Grant
    Filed: August 10, 2022
    Date of Patent: April 2, 2024
    Assignee: QINGDAO PICO TECHNOLOGY CO., LTD.
    Inventor: Tao Wu
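    As a rough companion to patent 11947729's pipeline, the sketch below classifies a gesture from positional relations between a few hand bone points. The bone names, thresholds, and gesture labels are invented for illustration; the patent's recognition model and bone-point groups are not reproduced here.
    ```python
    # Illustrative sketch only: map distances between hand bone points to a
    # gesture label. Thresholds are in metres and are assumptions.
    import math
    from typing import Dict, Tuple

    Point3D = Tuple[float, float, float]

    def recognize_gesture(bones: Dict[str, Point3D]) -> str:
        """Classify a gesture from 3D positional relations between bone points."""
        pinch_gap = math.dist(bones["thumb_tip"], bones["index_tip"])
        spread = math.dist(bones["index_tip"], bones["pinky_tip"])
        if pinch_gap < 0.02:   # thumb and index nearly touching
            return "pinch"
        if spread > 0.10:      # fingers spread wide apart
            return "open_palm"
        return "unknown"

    bones = {
        "thumb_tip": (0.01, 0.00, 0.30),
        "index_tip": (0.02, 0.01, 0.30),
        "pinky_tip": (0.06, 0.00, 0.31),
    }
    print(recognize_gesture(bones))   # pinch
    ```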
  • Patent number: 11928303
    Abstract: The present disclosure generally relates to user interfaces for managing shared-content sessions. In some embodiments, content is shared with a group of users participating in a shared-content session. In some embodiments, the content is screen-share content that is shared from one device to other participants of the shared-content session. In some embodiments, the content is synchronized content for which output of the content is synchronized across the participants of the shared-content session.
    Type: Grant
    Filed: September 23, 2021
    Date of Patent: March 12, 2024
    Assignee: Apple Inc.
    Inventors: Jae Woo Chang, Taylor G. Carrigan, Marcel Van Os, Elliot A. Barer, Kyle William Horn, Kaely Coon
  • Patent number: 11928328
    Abstract: A non-transitory recording medium has a program recorded thereon for controlling a computer having first recording circuits, second recording circuits, an emulator that emulates an information processing device, and a communicator which communicates with the information processing device. The program controls the communicator to receive first information including input data, which is input to the information processing device, from the information processing device, causes recording of the first information in the first recording circuits, and controls the communicator to transmit second information to the information processing device, wherein the second information includes input data for the emulator.
    Type: Grant
    Filed: June 10, 2021
    Date of Patent: March 12, 2024
    Assignee: CASIO COMPUTER CO., LTD.
    Inventor: Manato Ono
  • Patent number: 11928307
    Abstract: Aspects disclosed herein provide an operator training system that can train an operator on basic machine movements, operations, and applications, such as asphalt compactor rolling patterns, paving-by-numbers, milling-by-numbers, etc., with the use of a virtual reality headset at any point or time on a work site. The operator training system uses data received from the virtual reality headset and associated hand controls to guide the operator via training scenarios. The operator training system also provides immediate feedback on the outcome of the training scenario and may identify one or more areas for improvement.
    Type: Grant
    Filed: March 11, 2022
    Date of Patent: March 12, 2024
    Assignee: Caterpillar Paving Products Inc.
    Inventors: David A. King, Todd Willis Mansell, Eric Remboldt, Matt Peasley
  • Patent number: 11914766
    Abstract: A window in a multi-window display configuration is provided. A gaze of one or more users is directed at the window. The multi-window display configuration has a plurality of windows that are each configured to display corresponding content. Further, a window attribute of the window is modified based upon the gaze. In addition, a request for the content corresponding to the window is sent to a server. The content corresponding to the window is received from the server. The content corresponding to the window is then displayed according to the modified window attribute at the window.
    Type: Grant
    Filed: March 4, 2022
    Date of Patent: February 27, 2024
    Assignee: Disney Enterprises, Inc.
    Inventor: Mehul Patel
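    The gaze-driven window behaviour described for patent 11914766 reduces to: modify an attribute of the gazed window, then request and display its content. A minimal sketch follows; the Window fields, the scale attribute, and the stubbed server call are assumptions, not the patented implementation.
    ```python
    # Illustrative sketch only: enlarge the gazed window in a multi-window wall
    # and fetch its content from a (stubbed) server.
    from dataclasses import dataclass

    @dataclass
    class Window:
        window_id: str
        scale: float = 1.0   # window attribute that gaze modifies
        content: str = ""

    def fetch_content(window_id: str) -> str:
        """Stand-in for the request/response exchange with the content server."""
        return f"<stream for {window_id}>"

    def on_gaze(windows: dict, gazed_id: str) -> None:
        for window_id, window in windows.items():
            window.scale = 1.5 if window_id == gazed_id else 1.0
        windows[gazed_id].content = fetch_content(gazed_id)

    wall = {name: Window(name) for name in ("news", "sports", "weather")}
    on_gaze(wall, "sports")
    print(wall["sports"].scale, wall["sports"].content)   # 1.5 <stream for sports>
    ```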
  • Patent number: 11914759
    Abstract: Examples of augmented reality (AR) environment control advantageously employ multi-factor intention determination and include: performing a multi-factor intention determination for summoning a control object (e.g., a menu, a keyboard, or an input panel) using a set of indications in an AR environment, the set of indications comprising a plurality of indications (e.g., two or more of a palm-facing gesture, an eye gaze, a head gaze, and a finger position simultaneously); and based on at least the set of indications indicating a summoning request by a user, displaying the control object in a position proximate to the user in the AR environment (e.g., docked to a hand of the user). Some examples continue displaying the control object while at least one indication remains, and continue displaying the control object during a timer period if one of the indications is lost.
    Type: Grant
    Filed: January 19, 2022
    Date of Patent: February 27, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Andrew Jackson Klein, Cory Ryan Bramall, Kyle Mouritsen, Ethan Harris Arnowitz, Jeremy Bruce Kersey, Victor Jia, Justin Thomas Savino, Stephen Michael Lucas, Darren A. Bennett
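    A minimal sketch of the multi-factor summoning logic in patent 11914759 appears below. The choice of factors, the two-factor requirement, and the grace-period length are assumptions for illustration only.
    ```python
    # Illustrative sketch only: show a control object when at least two
    # indications agree, keep it while one remains, and hide it only after a
    # grace timer expires once every indication is lost.
    import time
    from dataclasses import dataclass

    GRACE_PERIOD_S = 2.0   # assumed; the abstract does not specify a duration

    @dataclass
    class SummonState:
        visible: bool = False
        last_seen: float = 0.0

    def update(state: SummonState, indications: dict) -> SummonState:
        """indications example: {'palm_facing': True, 'eye_gaze': True, 'finger_pose': False}"""
        active = sum(indications.values())
        now = time.monotonic()
        if active >= 2:                      # multi-factor agreement: summon
            state.visible, state.last_seen = True, now
        elif active >= 1 and state.visible:  # keep showing while one indication remains
            state.last_seen = now
        elif state.visible and now - state.last_seen > GRACE_PERIOD_S:
            state.visible = False            # all indications lost and timer expired
        return state

    state = update(SummonState(), {"palm_facing": True, "eye_gaze": True, "finger_pose": False})
    print(state.visible)   # True
    ```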
  • Patent number: 11907419
    Abstract: Systems and methods disclosed herein are related to an intelligent UI element selection system using eye-gaze technology. In some example aspects, a UI element selection zone may be determined. The selection zone may be defined as an area surrounding a boundary of the UI element. Gaze input may be received and the gaze input may be compared with the selection zone to determine an intent of the user. The gaze input may comprise one or more gaze locations. Each gaze location may be assigned a value according to its proximity to the UI element and/or its relation to the UI element's selection zone. Each UI element may be assigned a threshold. If the aggregated value of gaze input is equal to or greater than the threshold for the UI element, then the UI element may be selected.
    Type: Grant
    Filed: June 29, 2021
    Date of Patent: February 20, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Narasimhan Raghunath, Austin B. Hodges, Fei Su, Akhilesh Kaza, Peter John Ansell, Jonathan T. Campbell, Harish S. Kulkarni
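    The selection-zone idea in patent 11907419 can be pictured as a weighted accumulation of gaze samples against a per-element threshold. The sketch below is illustrative only; the weighting function, zone radius, and threshold value are assumptions.
    ```python
    # Illustrative sketch only: gaze samples nearer the element contribute more,
    # and the element is selected once the accumulated value crosses a threshold.
    from typing import Iterable, Tuple

    Point = Tuple[float, float]

    def sample_value(gaze: Point, center: Point, zone_radius: float) -> float:
        """Weight a gaze sample by its proximity to the element's center;
        samples outside the selection zone contribute nothing."""
        dx, dy = gaze[0] - center[0], gaze[1] - center[1]
        distance = (dx * dx + dy * dy) ** 0.5
        return max(0.0, 1.0 - distance / zone_radius)

    def is_selected(gazes: Iterable[Point], center: Point,
                    zone_radius: float = 50.0, threshold: float = 3.0) -> bool:
        total = sum(sample_value(gaze, center, zone_radius) for gaze in gazes)
        return total >= threshold

    button_center = (400.0, 300.0)
    samples = [(402, 300), (398, 305), (410, 295), (401, 299)]
    print(is_selected(samples, button_center))   # True: samples cluster on the button
    ```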
  • Patent number: 11908243
    Abstract: Systems and methods are provided for performing operations comprising: capturing, by an electronic mirroring device, a video feed received from a camera of the electronic mirroring device, the video feed depicting a user; displaying, by one or more processors of the electronic mirroring device, one or more menu options on the video feed that depicts the user, the one or more menu options relating to a first level in a hierarchy of levels; detecting a gesture performed by the user in the video feed; and in response to detecting the gesture, displaying a set of options related to a given option of the one or more menu options, the set of options relating to a second level in the hierarchy of levels.
    Type: Grant
    Filed: March 16, 2021
    Date of Patent: February 20, 2024
    Assignee: Snap Inc.
    Inventors: Dylan Shane Eirinberg, Kyle Goodrich, Andrew James McPhee, Daniel Moreno
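    The hierarchy traversal described in patent 11908243 is essentially a gesture-driven descent from one menu level to the next. The sketch below is illustrative; the menu contents and gesture name are invented.
    ```python
    # Illustrative sketch only: a detected 'select' gesture on a first-level menu
    # option reveals that option's second-level options.
    MENU = {
        "filters": ["sepia", "noir", "vivid"],
        "stickers": ["hat", "glasses"],
        "timer": ["3s", "10s"],
    }

    def on_gesture(gesture: str, highlighted_option: str) -> list:
        """Return the options to display after a gesture is detected."""
        if gesture == "pinch_select" and highlighted_option in MENU:
            return MENU[highlighted_option]   # descend to the second level
        return list(MENU.keys())              # otherwise stay on the first level

    print(on_gesture("pinch_select", "filters"))   # ['sepia', 'noir', 'vivid']
    ```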
  • Patent number: 11887062
    Abstract: A method for proactively providing information to a user about an event involving the user includes receiving user input indicating a request to display a search page and displaying the search page in response to the user input. In addition, event information may be automatically obtained and displayed on the search page in response to the user input. Some examples of event information that may be provided include participant information about a participant in the event other than the user, file information about a file that is relevant to the event, and conversation information about a conversation that is relevant to the event.
    Type: Grant
    Filed: February 3, 2022
    Date of Patent: January 30, 2024
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Yuan Wei, Wayne Sun, Tali Roth, Miles Cole Fitzgerald, Michael Francis Palermiti, II
  • Patent number: 11875021
    Abstract: The present disclosure generally relates to underwater user interfaces. In some embodiments, a method includes at an electronic device with a display and one or more input devices, receiving a first request to display a user interface for accessing a first function of the electronic device. In response to receiving the first request, and in accordance with a determination that the electronic device is under water, the method includes displaying a first user interface for accessing the first function. In response to receiving the first request, and in accordance with a determination that the electronic device is not under water, the method also includes displaying a second user interface for accessing the first function.
    Type: Grant
    Filed: April 5, 2021
    Date of Patent: January 16, 2024
    Assignee: Apple Inc.
    Inventors: Benjamin W. Bylenok, Alan An, Alyssa C. Ramdyal, Andrew Chen, Anya Prasitthipayong, Cheng-I Lin, Eric Shi, Kenneth H. Mahan, Ki Myung Lee, Kyle B. Cruz, Maxime Chevreton, Richard J. Blanco, Sung Chang Lee, Walton Fong, Wei Guang Wu, Xuefeng Wang
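    Patent 11875021 boils down to branching between two interfaces for the same function based on an underwater determination. The sketch below is illustrative; the sensing heuristic and names are assumptions, not the patented detection method.
    ```python
    # Illustrative sketch only: pick an interface variant for a function depending
    # on an (assumed) underwater heuristic.
    def is_under_water(pressure_kpa: float, touch_noise: float) -> bool:
        """Assumed heuristic: elevated pressure plus noisy capacitive readings."""
        return pressure_kpa > 110.0 and touch_noise > 0.5

    def ui_for_function(function: str, pressure_kpa: float, touch_noise: float) -> str:
        if is_under_water(pressure_kpa, touch_noise):
            return f"{function}:underwater_ui"   # e.g. larger targets, button-only input
        return f"{function}:standard_ui"

    print(ui_for_function("camera", 118.0, 0.8))   # camera:underwater_ui
    print(ui_for_function("camera", 101.3, 0.1))   # camera:standard_ui
    ```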
  • Patent number: 11875081
    Abstract: A system, method, and computer-readable media for persisting an annotated screen share within a group-based communication system are provided. A screen share may be received from a sharing user within a synchronous multimedia collaboration session. The screen share may be transmitted to viewing users for display. A viewing user may submit an annotation for the screen share. A sharing or viewing user may then submit a request to save the annotated screen share. The annotated screen share may be automatically persisted in association with the synchronous multimedia collaboration session.
    Type: Grant
    Filed: January 31, 2022
    Date of Patent: January 16, 2024
    Assignee: Salesforce, Inc.
    Inventors: Noah Weiss, Anna Niess, Kevin Marshall, Katie Steigman, Dolapo Falola
  • Patent number: 11875031
    Abstract: An electronic device includes a display screen and computing hardware to execute a software product. Executing the software product results in generating and rendering a graphical user interface on the display screen to facilitate user interaction. The graphical user interface, when rendered, presents one or more graphical objects and a pointer object configured to be movable over one or more of the graphical objects and to invoke a menu list containing one or more user selectable options when the pointer object is clicked or tapped over one or more of the graphical objects. A user selectable option is selected when the pointer object swipes a touch sensitive object, and the software product can maintain an effect corresponding to the selected option to be applied to the graphical objects and enable a change in status of the graphical objects.
    Type: Grant
    Filed: September 10, 2021
    Date of Patent: January 16, 2024
    Assignee: Supercell Oy
    Inventors: Timur Haussila, Touko Tahkokallio, Mikko Hokkanen
  • Patent number: 11822771
    Abstract: Techniques for detecting one or more focus areas of a user and structuring activities and content around the focus area(s) are disclosed. The activities of a user, the people associated with the activities, and/or the content associated with the activities are automatically inferred or detected based on the channels of collaboration associated with the focus area(s). Example channels of collaboration include, but are not limited to, electronic mail, instant messages, documents, and in-person and online meetings. Some or all of the activities, people, and/or content are grouped into one or more focus areas, where a focus area relates to an endeavor on which the user focuses over a period of time. Some or all of the focus areas are presented to the user.
    Type: Grant
    Filed: June 30, 2021
    Date of Patent: November 21, 2023
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Torbjørn Helvik, Andreas Eide, Kjetil Krogvig Bergstrand, Lene Christin Rydningen
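    One way to picture the grouping step in patent 11822771 is shown below: items drawn from several collaboration channels are clustered into focus areas and ordered by activity. The topic-tag heuristic and sample data are assumptions; the patent's inference method is not reproduced here.
    ```python
    # Illustrative sketch only: group items from different collaboration channels
    # into focus areas by a shared topic and surface the busiest areas first.
    from collections import defaultdict

    activities = [
        {"channel": "email",    "topic": "Q3 launch", "title": "Launch checklist"},
        {"channel": "meeting",  "topic": "Q3 launch", "title": "Go/no-go review"},
        {"channel": "chat",     "topic": "hiring",    "title": "Interview loop"},
        {"channel": "document", "topic": "Q3 launch", "title": "Press brief"},
    ]

    def build_focus_areas(items):
        areas = defaultdict(list)
        for item in items:
            areas[item["topic"]].append(item)   # one focus area per topic
        return sorted(areas.items(), key=lambda pair: len(pair[1]), reverse=True)

    for topic, grouped in build_focus_areas(activities):
        print(topic, [item["channel"] for item in grouped])
    # Q3 launch ['email', 'meeting', 'document']
    # hiring ['chat']
    ```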
  • Patent number: 11822728
    Abstract: An electronic device may include a display, a communication circuit, at least one camera, a memory, and a processor operatively connected to the display, the communication circuit, the at least one camera, and the memory. The memory may store instructions that, when executed, cause the processor to provide an augmented reality (AR) environment or a virtual reality (VR) environment through the display, connect the electronic device and at least one external electronic device through the communication circuit, display the at least one external electronic device through the display, specify a first external electronic device among the displayed at least one external electronic device based on an input interface switching event, and control an operation of the electronic device in the augmented reality environment or the virtual reality environment using the specified first external electronic device.
    Type: Grant
    Filed: December 7, 2021
    Date of Patent: November 21, 2023
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Heonjun Ha, Seungnyun Kim, Junwhon Uhm, Jinchoul Lee, Hyunsoo Kim, Hyunjun Kim
  • Patent number: 11809635
    Abstract: In response to a current gesture performed by a user interacting with an application, a gesture detection module accesses a plurality of predefined control gestures, and compares the current gesture with the plurality of predefined control gestures. The gesture detection module detects whether the current gesture matches at least an initial partial gesture of a particular control gesture associated with at least one application function. A feedback generator module generates confirmation feedback for the user if the current gesture matches at least an initial partial gesture of the particular control gesture. The confirmation feedback encodes, for the user, at least a first information identifying the application function associated with the particular control gesture and a second information about a time interval remaining before the associated application function will be executed. The application triggers the associated function if the particular control gesture is completed during said time interval.
    Type: Grant
    Filed: June 13, 2022
    Date of Patent: November 7, 2023
    Assignee: Treye Tech UG (haftungsbeschränkt)
    Inventor: Anton Wachner
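    The partial-gesture confirmation loop in patent 11809635 can be sketched as a prefix match against known control gestures plus a countdown. The gesture alphabet, mapping, and time window below are invented for illustration.
    ```python
    # Illustrative sketch only: match the gesture-so-far against prefixes of known
    # control gestures and report which function would fire and how soon.
    CONTROL_GESTURES = {
        ("swipe_left", "hold"): "next_page",
        ("swipe_up", "hold"): "volume_up",
    }
    CONFIRM_WINDOW_S = 1.5   # assumed interval to complete the gesture

    def match_partial(current: tuple, elapsed_s: float):
        for gesture, function in CONTROL_GESTURES.items():
            if gesture[: len(current)] == current:   # current is a prefix of the gesture
                remaining = max(0.0, CONFIRM_WINDOW_S - elapsed_s)
                return {
                    "function": function,             # first piece of feedback information
                    "time_remaining_s": remaining,    # second piece of feedback information
                    "execute": current == gesture and remaining > 0.0,
                }
        return None

    print(match_partial(("swipe_left",), 0.4))
    # {'function': 'next_page', 'time_remaining_s': 1.1, 'execute': False}
    ```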
  • Patent number: 11809704
    Abstract: Provided is a control system applied to a screen projection scenario. The system includes a mobile phone and a display device. The display device receives first screen content and a target navigation function identifier sent by the mobile phone; generates, according to the target navigation function identifier, a collaboration window including a screen projection area and a navigation bar, where the navigation bar includes three virtual navigation keys; displays the first screen content in the screen projection area; receives a keyboard and mouse operation acting on the virtual navigation keys; and generates a key instruction according to the keyboard and mouse operation and sends it to the mobile phone, so that the mobile phone executes a navigation function according to the key instruction and can adjust the first screen content to second screen content, which the display device then displays in the screen projection area.
    Type: Grant
    Filed: July 22, 2020
    Date of Patent: November 7, 2023
    Assignee: HONOR DEVICE CO., LTD.
    Inventors: Hejin Gu, Siyue Niu
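    The key-instruction step in patent 11809704 maps a mouse click on a virtual navigation key to an instruction the phone can execute. The sketch below is illustrative; the message format is invented, and the keycodes merely mirror Android's conventional back/home/app-switch values.
    ```python
    # Illustrative sketch only: translate a click on a virtual navigation key in
    # the collaboration window into a key instruction for the phone.
    NAV_KEYCODES = {"back": 4, "home": 3, "recents": 187}   # Android-style keycodes

    def key_instruction_for_click(clicked_key: str) -> dict:
        if clicked_key not in NAV_KEYCODES:
            raise ValueError(f"unknown navigation key: {clicked_key}")
        return {"type": "key_event", "keycode": NAV_KEYCODES[clicked_key]}

    print(key_instruction_for_click("back"))   # {'type': 'key_event', 'keycode': 4}
    ```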
  • Patent number: 11768594
    Abstract: A method of updating a protocol for a Virtual Reality (VR) medical test via a user device having a processor, the VR medical test being performed on a subject via a VR device worn by the subject, wherein the method is performed by the processor and the method comprises: displaying GUI elements associated with the protocol on the user device, the GUI elements having user adjustable settings for modifying a functioning of the VR medical test; receiving a selection input from the user device corresponding to a selection of the GUI elements; receiving a setting input from the user device that corresponds to the selected GUI elements; modifying the user adjustable setting for each of the selected GUI elements according to the corresponding setting input; and updating the protocol based on the user adjustable setting for each of the selected GUI elements and operations associated with the VR device.
    Type: Grant
    Filed: November 26, 2020
    Date of Patent: September 26, 2023
    Assignee: Electric Puppets Incorporated
    Inventor: Ryan Cameron
  • Patent number: 11762476
    Abstract: A virtual reality (VR) or augmented reality (AR) system detects through a camera when a user's hand positions match a predefined position and in response thereto renders an overlay including a crown with which the user can interact to lock onto an object or scene. The system then detects and renders a centre indicator of the crown, tracks the user's hands, enables or disables actions depending on the position of the centre indicator relative to a neutral zone, responds to user hand movement to implement actions, and also detects unlock of the object or scene.
    Type: Grant
    Filed: September 7, 2020
    Date of Patent: September 19, 2023
    Assignee: INTERDIGITAL CE PATENT HOLDINGS, SAS
    Inventors: Sylvain Lelievre, Philippe Schmouker, Jean-Eudes Marvie
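    The neutral-zone gating in patent 11762476 can be reduced to a dead-zone check on the crown's centre indicator. The sketch below is illustrative; the radius and the action names are assumptions.
    ```python
    # Illustrative sketch only: actions driven by the crown overlay are enabled
    # only once its centre indicator leaves a neutral zone around the origin.
    NEUTRAL_RADIUS = 0.05   # assumed dead-zone radius around the crown centre

    def crown_action(indicator_x: float, indicator_y: float) -> str:
        offset = (indicator_x ** 2 + indicator_y ** 2) ** 0.5
        if offset <= NEUTRAL_RADIUS:
            return "none"   # inside the neutral zone: actions disabled
        return "rotate_right" if indicator_x > 0 else "rotate_left"

    print(crown_action(0.01, 0.02))   # none
    print(crown_action(0.20, 0.00))   # rotate_right
    ```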
  • Patent number: 11762965
    Abstract: Methods and systems may incorporate voice interaction and other audio interaction to facilitate access to prescription related information and processes. Particularly, voice/audio interactions may be utilized to achieve authentication to access prescription-related information and action capabilities. Additionally, voice/audio interactions may be utilized in performance of processes such as obtaining prescription refills and receiving reminders to consume prescription products.
    Type: Grant
    Filed: March 3, 2020
    Date of Patent: September 19, 2023
    Assignee: WALGREEN CO.
    Inventors: Andrew Schweinfurth, Julija Alegra Petkus, Gunjan Dhanesh Bhow