Display Peripheral Interface Input Device Patents (Class 345/156)
  • Patent number: 11994676
    Abstract: Techniques for resolving hemisphere ambiguity are disclosed. One or more magnetic fields are emitted at a handheld controller. The one or more magnetic fields are detected by one or more sensors positioned relative to a headset. Movement data corresponding to the handheld controller or the headset is detected. During a first time interval, a first position and a first orientation of the handheld controller within a first hemisphere are determined based on the detected one or more magnetic fields, and a first discrepancy is calculated based on the first position, the first orientation, and the movement data. During a second time interval, a second position and a second orientation of the handheld controller within a second hemisphere are determined based on the detected one or more magnetic fields, and a second discrepancy is calculated based on the second position, the second orientation, and the movement data.
    Type: Grant
    Filed: March 1, 2022
    Date of Patent: May 28, 2024
    Assignee: Magic Leap, Inc.
    Inventors: Ronald Joseph Degges, Jr., Sheng Wan, Andy Warner, Akash Gujarati
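The two-hypothesis check described in this abstract can be sketched as follows. This is an illustrative reading, not Magic Leap's implementation; the `discrepancy` metric (mismatch between magnetically tracked displacement and IMU-measured displacement) is an assumption.

```python
def discrepancy(start_pos, end_pos, imu_displacement):
    # Mismatch between the displacement implied by the magnetic solution
    # and the displacement measured inertially over the same interval.
    return sum(
        ((e - s) - m) ** 2
        for s, e, m in zip(start_pos, end_pos, imu_displacement)
    ) ** 0.5

def resolve_hemisphere(track_h1, track_h2, imu_displacement):
    # Keep the hemisphere hypothesis whose magnetic solution better agrees
    # with the movement data; the mirrored (wrong) hemisphere moves oppositely.
    d1 = discrepancy(track_h1[0], track_h1[-1], imu_displacement)
    d2 = discrepancy(track_h2[0], track_h2[-1], imu_displacement)
    return "first" if d1 <= d2 else "second"
```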
  • Patent number: 11995910
    Abstract: An optical sensing module and an electronic device are provided. The optical sensing module includes a substrate, a plurality of optical sensing elements, and a light-blocking element. The substrate has a sensing region and a non-sensing region around the sensing region. The plurality of optical sensing elements is disposed on the sensing region. The light-blocking element is disposed on the non-sensing region and a portion of the sensing region. The light-blocking element overlaps a portion of the plurality of optical sensing elements in a normal direction of the substrate.
    Type: Grant
    Filed: December 28, 2022
    Date of Patent: May 28, 2024
    Assignee: INNOLUX CORPORATION
    Inventors: Te-Yu Lee, Yu-Tsung Liu, Wei-Ju Liao
  • Patent number: 11996093
    Abstract: An information processing apparatus and an information processing method are provided that enable suitable determination of sensing results used in estimating a user state. The information processing apparatus is provided with a determination unit that determines, on the basis of a predetermined reference, one or more second sensing results used in estimating the user state from among a plurality of first sensing results received from a plurality of devices. The information processing apparatus is further provided with an output control unit that controls an output of information on the basis of the one or more second sensing results.
    Type: Grant
    Filed: July 12, 2018
    Date of Patent: May 28, 2024
    Inventors: Shinichi Kawano, Hiro Iwase, Mari Saito, Yuhei Taki
  • Patent number: 11995899
    Abstract: A head-mounted device (HMD) can be configured to determine a request for recognizing at least one content item included within content framed within a display of the HMD. The HMD can be configured to initiate a head-tracking process that maintains a coordinate system with respect to the content, and a pointer-tracking process that tracks a pointer that is visible together with the content within the display. The HMD can be configured to capture a first image of the content and a second image of the content, the second image including the pointer. The HMD can be configured to map a location of the pointer within the second image to a corresponding image location within the first image, using the coordinate system, and provide the at least one content item from the corresponding image location.
    Type: Grant
    Filed: April 29, 2021
    Date of Patent: May 28, 2024
    Assignee: Google LLC
    Inventors: Qinge Wu, Grant Yoshida, Catherine Boulanger, Erik Hubert Dolly Goossens, Cem Keskin, Sofien Bouaziz, Jonathan James Taylor, Nidhi Rathi, Seth Raphael
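The mapping step can be sketched as below, assuming the head-tracking coordinate system yields a 3x3 homography relating the two captures (the homography form and function name are illustrative, not Google's API):

```python
def map_pointer(homography, pointer_xy):
    # Map a pixel location in the second image (the one containing the
    # pointer) into the first image via the shared coordinate system,
    # expressed here as a 3x3 homography in row-major nested lists.
    x, y = pointer_xy
    hx = [row[0] * x + row[1] * y + row[2] for row in homography]
    return (hx[0] / hx[2], hx[1] / hx[2])
```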
  • Patent number: 11995780
    Abstract: The subject technology receives a set of frames. The subject technology detects a first gesture corresponding to an open trigger finger gesture. The subject technology receives a second set of frames. The subject technology detects, from the second set of frames, a second gesture corresponding to a closed trigger finger gesture. The subject technology detects a location and a position of a representation of a finger from the closed trigger finger gesture. The subject technology generates a first virtual object based at least in part on the location and the position of the representation of the finger. The subject technology renders a movement of the first virtual object along a vector away from the location and the position of the representation of the finger within a first scene. The subject technology provides for display the rendered movement of the first virtual object along the vector within the first scene.
    Type: Grant
    Filed: September 9, 2022
    Date of Patent: May 28, 2024
    Assignee: Snap Inc.
    Inventors: Kyle Goodrich, Maxim Maximov Lazarov, Andrew James McPhee, Daniel Moreno
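The rendered motion away from the finger can be sketched as a parametric update along the cast vector (names and the constant-speed model are illustrative assumptions):

```python
def object_position(origin, direction, speed, t):
    # Move the spawned virtual object along a vector away from the
    # detected finger location at a constant speed for time t.
    return tuple(o + d * speed * t for o, d in zip(origin, direction))
```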
  • Patent number: 11992372
    Abstract: A surgical hub may have cooperative interactions with one or more means of displaying the image from the laparoscopic scope and information from one or more other smart devices. The surgical hub may have the capacity of interacting with these multiple displays using an algorithm or control program that enables the combined display and control of the data distributed across the number of displays in communication with the surgical hub. The hub can obtain display control parameter(s) associated with a surgical procedure. The hub may determine, based on the display control parameter, different contents for different displays. The hub may generate and send the display contents to their respective displays. For example, the display control parameter may be a progression of the surgical procedure. The surgical hub may determine different display contents for the primary and the secondary displays based on the progression of the surgical procedure.
    Type: Grant
    Filed: October 2, 2020
    Date of Patent: May 28, 2024
    Assignee: Cilag GmbH International
    Inventors: Frederick E. Shelton, IV, Jason L. Harris, Kevin M. Fiebig, Michael J. Vendely, Shane R. Adams
  • Patent number: 11995239
    Abstract: A display unit comprising a touch surface touchable by a user, a drive unit for moving the touch surface when the touch surface is touched, in particular for haptic feedback to the user, and at least one damping element, which damps mechanical oscillations of the movement of the touch surface. Each damping element comprises an elastic element and a pressure setting element (for example, a valve). This pressure setting element is designed to fill the elastic element with gas in a variable or settable manner for the damping. The damping properties of the display unit can thus be set arbitrarily via the gas pressure inside the elastic element.
    Type: Grant
    Filed: July 20, 2022
    Date of Patent: May 28, 2024
    Assignee: Harman Becker Automotive Systems GmbH
    Inventors: Joerg Welke, Peter Brandt
  • Patent number: 11990107
    Abstract: An information processing apparatus includes a processor configured to, in response to dividing an area of a display surface of a deformable display into multiple areas with a folded portion located as a boundary between the multiple areas as a result of deformation of the deformable display, decide layout of multiple images to be displayed in the area of the display surface. The layout is decided on a basis of information regarding each of the multiple images.
    Type: Grant
    Filed: February 2, 2021
    Date of Patent: May 21, 2024
    Assignee: FUJIFILM Business Innovation Corp.
    Inventor: Kengo Tokuchi
  • Patent number: 11988841
    Abstract: Systems, methods, and computer readable media for voice input for augmented reality (AR) wearable devices are disclosed. Embodiments are disclosed that enable a user to interact with the AR wearable device without using physical user interface devices. A keyword is used to indicate that the user is about to speak an action or command. The AR wearable device divides the processing of the audio data into a keyword module that is trained to recognize the keyword and a module to process the audio data after the keyword. In some embodiments, the AR wearable device transmits the audio data after the keyword to a host device to process. The AR wearable device maintains an application registry that associates actions with applications. Applications can be downloaded and the application registry updated, with each application indicating the actions to associate with it.
    Type: Grant
    Filed: August 2, 2022
    Date of Patent: May 21, 2024
    Assignee: Snap Inc.
    Inventors: Sharon Moll, Piotr Gurgul, Tomasz Zakrzewski
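The action-to-application registry might look like the following minimal sketch; the class, method, and example names are assumptions, not Snap's API:

```python
class ApplicationRegistry:
    # Associates spoken action names with the applications that handle them.
    def __init__(self):
        self._handlers = {}

    def register(self, app_name, actions):
        # Called when an application is downloaded; the application
        # declares which actions it wants associated with it.
        for action in actions:
            self._handlers[action] = app_name

    def dispatch(self, action):
        # Return the application for an action recognized after the
        # keyword, or None if no application has claimed it.
        return self._handlers.get(action)
```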
  • Patent number: 11989475
    Abstract: Examples of methods performed by an electronic device are described. In some examples of the methods, a machine learning model is trained based on a plurality of interaction events and a corresponding plurality of images. In an example, each of the plurality of interaction events corresponds to one of a plurality of displays. In some examples of the methods, a display is selected of the plurality of displays based on the machine learning model. In an example, an object is presented on the display.
    Type: Grant
    Filed: October 9, 2018
    Date of Patent: May 21, 2024
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Fernando Lemes da Silva, Ricardo Ribani
  • Patent number: 11989892
    Abstract: The present disclosure relates to an information processing apparatus, an information processing method, and a program that enable more efficient acquisition of high-quality textures. A motion generation unit generates, on the basis of a state of acquisition of textures that constitute a 3D model of a user, a motion for imaging an area where the textures have not been acquired. Then, a navigation execution unit provides a navigation for making the user execute an action in accordance with the motion generated by the motion generation unit. The present technology can be applied to, for example, an information processing apparatus that performs 3D model generation processing.
    Type: Grant
    Filed: July 26, 2019
    Date of Patent: May 21, 2024
    Assignee: SONY CORPORATION
    Inventor: Ryohei Okada
  • Patent number: 11983329
    Abstract: In one embodiment, a method includes presenting a suggestion to a user of a head-mounted device by the head-mounted device via an assistant xbot during a dialog session between the user and the assistant xbot, wherein the suggestion is associated with a plurality of actions to be performed by an assistant system associated with the assistant xbot, accessing signals from inertial measurement unit (IMU) sensors of the head-mounted device by the head-mounted device during the dialog session, determining a head gesture performed by the user during the dialog session by an on-device head-gesture detection model and based only on the signals from the IMU sensors, and executing a first action from multiple actions by the assistant system executing on the head-mounted device, wherein the first action is selected based on the determined head gesture during the dialog session.
    Type: Grant
    Filed: December 5, 2022
    Date of Patent: May 14, 2024
    Assignee: Meta Platforms, Inc.
    Inventors: Shervin Ghasemlou, Devashish Prasad Joshi, Rongzhou Shen, Riza Kazemi
  • Patent number: 11983352
    Abstract: An MFR sensor array having a first supporting layer, a second supporting layer, and a force sensing component disposed between the first and second supporting layers. The array has protrusions combined with the force sensing component, where there are only two supporting layers. When a force is applied to the second supporting layer, the force causes the second supporting layer to contact the protrusions, so the force is transmitted through the protrusions to and through the force sensing component. A system and a method for sensing are also described.
    Type: Grant
    Filed: June 18, 2015
    Date of Patent: May 14, 2024
    Assignees: Tactonic Technologies, LLC, New York University
    Inventors: Kenneth Perlin, Charles Hendee, Alex Grau
  • Patent number: 11983398
    Abstract: A foldable screen of an electronic device includes a first display region, a second display region, and a third display region. When the electronic device is in a folded form, an included angle between the first display region and the second display region is less than or equal to a first preset angle. The third display region is disposed between the first display region and the second display region. A touch control method for the electronic device includes detecting a first operation in the third display region and controlling the first display region and/or the second display region. When the electronic device is in the folded form, a primary screen and/or a secondary screen may be controlled by using an operation detected on a side screen.
    Type: Grant
    Filed: July 24, 2020
    Date of Patent: May 14, 2024
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventor: Jing Zhang
  • Patent number: 11983363
    Abstract: A user gesture behavior simulation system includes a touch gesture recording and editing device and a touch gesture simulation device. When at least one touch gesture is implemented on a record touch object with at least one finger of a user, the at least one touch gesture is recorded by the touch gesture recording and editing device, and at least one touch gesture operating trajectory is correspondingly generated by the touch gesture recording and editing device. The touch gesture simulation device includes at least one artificial finger. The at least one artificial finger is driven and moved to an under-test touch object by the touch gesture simulation device. The at least one touch gesture is simulated by the touch gesture simulation device according to the at least one touch gesture operating trajectory.
    Type: Grant
    Filed: September 5, 2023
    Date of Patent: May 14, 2024
    Assignee: PRIMAX ELECTRONICS LTD.
    Inventors: Yung-Tai Pan, Jui-Hung Hsu, Chang-Ming Huang
  • Patent number: 11979984
    Abstract: A splicing display screen is provided. The splicing display screen includes a circuit board, a plurality of display panels, and a plurality of first metal lines. The circuit board includes a plurality of circuit areas, and each of the circuit areas includes one of mounting areas and one of electrical connection areas. A plurality of first metal parts are disposed in the electrical connection areas. The display panels are disposed on the circuit board and positioned in the mounting areas, and metal connection pads on the display panels are electrically connected to the first metal parts by the first metal lines.
    Type: Grant
    Filed: September 8, 2020
    Date of Patent: May 7, 2024
    Assignee: TCL CHINA STAR OPTOELECTRONICS TECHNOLOGY CO., LTD.
    Inventor: Changming Xiang
  • Patent number: 11974619
    Abstract: A smart mask includes a first material layer, at least one display, a first sensor, and a control module. The first material layer is configured to cover a portion of a face of a person. The at least one display is connected to the first material layer and configured to display images over a mouth of the person. The first sensor is configured to detect movement of the mouth of the person and generate a signal indicative of the movement of the mouth. The control module is configured to receive the signal and display the images on the display based on the movement of the mouth.
    Type: Grant
    Filed: August 16, 2023
    Date of Patent: May 7, 2024
    Assignee: Nantworks, LLC
    Inventors: Nicholas James Witchey, Patrick Soon-Shiong
  • Patent number: 11977724
    Abstract: An information processing method includes determining a target area associated with a plurality of widgets that are different, determining a target display mode of the target area, and displaying the plurality of widgets in the target area based on the target display mode. Each of the plurality of widgets obtains a display content based on a target address, and the plurality of widgets are displayed differently in the target area under different display modes.
    Type: Grant
    Filed: February 22, 2022
    Date of Patent: May 7, 2024
    Assignee: LENOVO (BEIJING) LIMITED
    Inventor: Xiao Liu
  • Patent number: 11977224
    Abstract: Systems and methods for headset windowing may include determining a geometry of a first object used with a computing system in a work space; determining, based on information received from a sensor, when a user wearing a VR headset tilts their head to bring the first object into view of the VR headset; and displaying a first window on a display of the VR headset, the first window being dimensioned to conform to the geometry of the first object in view of the headset.
    Type: Grant
    Filed: March 27, 2023
    Date of Patent: May 7, 2024
    Assignee: Voyetra Turtle Beach, Inc.
    Inventors: Juergen Stark, Michael Stark
  • Patent number: 11978169
    Abstract: A wrist-pose isolation system can infer a wrist pose (e.g., the user's hand position relative to her forearm) and can reduce wrist-induced jitter for projection casting in an XR environment. A user's projection cast can be determined as a combination of a “low-wrist contribution” component (e.g., a body-and-arm component) and a “high-wrist contribution” component (e.g., the pose of the wrist with respect to the arm). Using input from a gesture-tracking system, the contribution of the user's wrist pose to the user's current projection cast is calculated as a “wrist-contribution vector.” A projection cast direction can be determined as the interpolation of the current low-wrist contribution component and the high-wrist contribution component. This interpolation can be performed by weighting each by a specified amount and combining them.
    Type: Grant
    Filed: March 13, 2023
    Date of Patent: May 7, 2024
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Norah Riley Smith, Matthew Alan Insley
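The interpolation of the two components can be sketched as a weighted blend of unit direction vectors; the function name and default weight are illustrative assumptions:

```python
import math

def blend_cast_direction(body_arm_dir, wrist_dir, wrist_weight=0.25):
    # Interpolate between the low-wrist (body-and-arm) component and the
    # high-wrist component, then renormalize to a unit cast direction.
    blended = [
        (1.0 - wrist_weight) * a + wrist_weight * w
        for a, w in zip(body_arm_dir, wrist_dir)
    ]
    norm = math.sqrt(sum(c * c for c in blended))
    return [c / norm for c in blended]
```

Setting `wrist_weight` low damps wrist-induced jitter; raising it restores fine wrist control.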
  • Patent number: 11969084
    Abstract: Systems, methods, and apparatuses for tactile user input are described herein. A tactile user input system includes a tactile input device and a smart table. The smart table comprises a display screen, a network interface, and a processing circuit. The processing circuit comprises a processor and a memory. The processing circuit is configured to sense a placement of the tactile input device on the smart table. The processing circuit is further configured to determine a necessary user input and determine a location of the tactile input device on the smart table. The processing circuit is further configured to rearrange a user interface of the smart table. The processing circuit is further configured to communicate a necessary input configuration to the tactile input device, to be used by the tactile input device to generate a tactile user interface having an appropriate layout based on the necessary input configuration.
    Type: Grant
    Filed: September 8, 2022
    Date of Patent: April 30, 2024
    Assignee: Wells Fargo Bank, N.A.
    Inventors: Kourtney Eidam, Darren M. Goetz, Dennis E. Montenegro
  • Patent number: 11972103
    Abstract: In a computer-implemented method, a portion of an electronic document is displayed on a touch screen display of a portable multifunction device. The displayed portion has a vertical position and a horizontal position in the electronic document. An object is detected on or near the displayed portion of the electronic document. In response to detecting the object, a vertical bar and a horizontal bar are displayed on top of the displayed portion. The vertical bar has a vertical position on top of the displayed portion that corresponds to the vertical position in the electronic document of the displayed portion. The horizontal bar has a horizontal position on top of the displayed portion that corresponds to the horizontal position in the electronic document of the displayed portion. After a predetermined condition is met, display of the vertical bar and of the horizontal bar is ceased.
    Type: Grant
    Filed: October 7, 2022
    Date of Patent: April 30, 2024
    Assignee: Apple Inc.
    Inventors: Scott Forstall, Henri C. Lamiraux, Andrew Emilio Platzer, Michael Matas, Imran Chaudhri
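The proportional geometry of such overlay bars can be sketched as below (the function and parameter names are illustrative; the same helper serves the vertical and horizontal bars):

```python
def bar_geometry(doc_extent, view_extent, view_offset, track_extent):
    # The bar's length is the visible fraction of the document scaled to
    # the track, and its offset mirrors the view's offset in the document.
    length = max(1, round(track_extent * view_extent / doc_extent))
    offset = round(track_extent * view_offset / doc_extent)
    return offset, length
```

For a 1000-unit document showing a 100-unit window at offset 500, a 100-unit track gets a bar of length 10 at offset 50.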
  • Patent number: 11972047
    Abstract: A method for controlling a near eye display system, includes obtaining an eye movement of a user wearing the near eye display system using an eye tracking sensor of the near eye display system, determining a target area of a display of the near eye display system based on the eye movement of the user, the target area of the display being an area that the user is looking at, and controlling a brightness of the target area of the display of the near eye display system to cause a size of a pupil of the user to be in a predetermined range.
    Type: Grant
    Filed: March 20, 2023
    Date of Patent: April 30, 2024
    Assignee: TENCENT AMERICA LLC
    Inventors: John D. Le, Kun Gao, Yi Zhang, Youngshik Yoon, Hao Zheng, Hongdong Li, Jianru Shi
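One control step of such a brightness loop might look like this; the target pupil range and step size are assumptions for illustration:

```python
def adjust_brightness(brightness, pupil_mm, pupil_range=(3.0, 5.0), step=0.05):
    # Brighten the gazed-at target area to constrict an over-dilated pupil,
    # dim it to dilate an over-constricted one; brightness stays in [0, 1].
    lo, hi = pupil_range
    if pupil_mm > hi:
        return min(1.0, brightness + step)
    if pupil_mm < lo:
        return max(0.0, brightness - step)
    return brightness
```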
  • Patent number: 11972050
    Abstract: Instances of a single brain computer interface (BCI) system can be implemented on multiple devices. An active instance can control the associated device. The instances can each communicate with a neural decoding system that can receive neural signals from a user, process the neural signals, and output a command based on the processed neural signals. A device running the active instance can be in communication with the neural decoding system to receive a command. The device can include a display, a non-transitory memory storing instructions, and a processor to execute the instructions to: run an instance of a control program; and execute the task based on the command.
    Type: Grant
    Filed: November 1, 2022
    Date of Patent: April 30, 2024
    Assignees: BROWN UNIVERSITY, THE GENERAL HOSPITAL CORPORATION, THE UNITED STATES GOVERNMENT AS REPRESENTED BY THE DEPARTMENT OF VETERANS AFFAIRS
    Inventors: Leigh Hochberg, John D. Simeral, Tyler Singer-Clark, Ronnie Gross, Thomas Hosman, Anastasia Kapitonava, Rekha Crawford
  • Patent number: 11972059
    Abstract: A gesture-recognition (GR) device made to be held or worn by a user includes an electronic processor configured by program instructions in memory to recognize a gesture. The device or a cooperating system may match a gesture identifier to an action identifier for one or more target devices in a user's environment, enabling control of the target devices by user movement of the GR device in three-dimensional space.
    Type: Grant
    Filed: March 4, 2022
    Date of Patent: April 30, 2024
    Inventors: Gregory I Gewickey, William Zajac, Ha Nguyen, Sam Maliszewski
  • Patent number: 11972040
    Abstract: A virtual space configuration system of an artificial reality system can detect a user posture and provide various corresponding customizations of the system's virtual space. The virtual space configuration system can, when a user is in a seated posture, provide for seated virtual space customizations. In various implementations, these customizations can include allowing adjustment of a floor height; setting a flag that can be surfaced to applications to adjust the applications' mechanics for seated users; customizing display of virtual space boundaries when in seated mode to be less intrusive; providing options to detect when a user leaves seated mode and trigger corresponding actions; providing a passthrough workspace area allowing a user to interact with certain real-world objects naturally without having to remove a virtual reality headset; or automatically determining virtual space dimensions for seated users.
    Type: Grant
    Filed: January 11, 2023
    Date of Patent: April 30, 2024
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Samuel Alan Johnson, Shaik Shabnam Nizamudeen Basha, Mahdi Salmani Rahimi, Benjamin Antoine Georges Lefaudeux
  • Patent number: 11972052
    Abstract: Human interactive texture generation and search systems and methods are described. A deep convolutional generative adversarial network is used for mapping information in a latent space into texture models. An interactive evolutionary computation algorithm for searching a texture through an evolving latent space driven by human preference is also described. Advantages of a generative model and an evolutionary computation are combined to realize a controllable and bounded texture tuning process under the guidance of human preferences. Additionally, a fully haptic user interface is described, which can be used to evaluate the systems and methods in terms of their efficiency and accuracy of searching and generating new virtual textures that are closely representative of given real textures.
    Type: Grant
    Filed: April 21, 2022
    Date of Patent: April 30, 2024
    Assignee: UNIVERSITY OF SOUTHERN CALIFORNIA
    Inventors: Shihan Lu, Heather Culbertson, Matthew Fontaine, Mianlun Zheng
  • Patent number: 11972058
    Abstract: A system is provided for generating variable haptic feedback. The system comprises a first haptic feedback device configured to generate haptic feedback according to a received output signal, an input device configured to receive instructions from a remote source, the instructions comprising a haptic feedback output identifier, and a memory device for storing a plurality of haptic feedback profiles, the haptic feedback profiles defining unique haptic feedback patterns characterized at least by a duration and intensity of feedback to be generated by the first haptic feedback device, and said haptic feedback profiles being associated with a haptic feedback output identifier. The system also includes a processor configured to generate and transmit an output signal to the first haptic feedback device according to a haptic feedback profile associated with a received haptic feedback output identifier, said output signal comprising a haptic feedback pattern defined by said haptic feedback profile.
    Type: Grant
    Filed: May 11, 2023
    Date of Patent: April 30, 2024
    Assignee: Capital One Services, LLC
    Inventors: James R. Dillon, Jr., Todd McPherson
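The identifier-to-profile lookup can be sketched as below; the profile names and field values are invented for illustration, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class HapticProfile:
    # A pattern characterized at least by duration and intensity.
    duration_ms: int
    intensity: float

PROFILES = {
    "short-confirm": HapticProfile(duration_ms=80, intensity=0.4),
    "long-alert": HapticProfile(duration_ms=300, intensity=0.9),
}

def output_signal(identifier):
    # Resolve a received haptic feedback output identifier to the stored
    # profile defining the pattern to drive the feedback device with.
    profile = PROFILES[identifier]
    return {"duration_ms": profile.duration_ms, "intensity": profile.intensity}
```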
  • Patent number: 11966055
    Abstract: A head mounted display system for displaying image content to a user comprises at least one display configured to be worn by a user to present virtual content to first and second eyes of the user, one or more inwardly facing sensors or cameras configured to monitor one or both of the user's eyes, and processing electronics. This head mounted display system is configured such that virtual content activity can be initiated and/or driven from eye inputs such as gaze direction, eyelid motions (e.g., blinking), and/or other eye gestures.
    Type: Grant
    Filed: July 19, 2019
    Date of Patent: April 23, 2024
    Assignee: Magic Leap, Inc.
    Inventors: Heather Michelle Martin, Kevin John O'Brien, Pedro Luis Arroyo, Flavio DeOliveira
  • Patent number: 11964198
    Abstract: A directional pad assembly including a directional-control unit having a plate-like structure and two or more plunger elements slidable relative to the plate-like structure; a main support structure to support a centre of the plate-like structure such that the plate-like structure is multi-directional tiltable; an interface frame structure suspended above the plate-like structure of the directional-control unit and in a fixed spatial relationship and non-movable relative to the main support structure; and two or more switches arranged around the main support structure and aligned to the two or more plunger elements. The directional-control unit being operable between a first mode, whereby the plate-like structure is tiltable, and a second mode, whereby the two or more plunger elements are individually slidable relative to the plate-like structure, to activate the two or more switches.
    Type: Grant
    Filed: December 17, 2020
    Date of Patent: April 23, 2024
    Assignee: Razer (Asia-Pacific) Pte. Ltd.
    Inventor: Gil Palma Guerrero, Jr.
  • Patent number: 11967075
    Abstract: A method of measuring working distance between a handheld digital device and eyes of a user, including capturing an image of at least eyes of a user via an onboard camera of the handheld digital device while the user is viewing a display of the handheld digital device and comparing an apparent angular size of a structure of the eyes or face of the user to a previously captured image of the structure of the eyes or the face that was taken in the presence of an object of known size. The method further includes calculating a working distance based on the apparent angular size of the structure of the eyes or the face; and saving at least the working distance to memory or reporting out the calculated working distance on the display. A handheld digital device programmed with an algorithm to perform the method is also included.
    Type: Grant
    Filed: October 17, 2022
    Date of Patent: April 23, 2024
    Inventor: William F. Wiley
  • Patent number: 11968480
    Abstract: A display method includes detecting a write operation on a display surface by a pointer, determining whether an operation button provided on the pointer is pressed or not, displaying an image corresponding to the write operation as a first image on the display surface, when the write operation is detected and it is determined that the operation button is pressed, and erasing the first image when it is determined that the operation button is not pressed during a period when the first image is displayed.
    Type: Grant
    Filed: March 25, 2022
    Date of Patent: April 23, 2024
    Assignee: SEIKO EPSON CORPORATION
    Inventor: Tomonori Kumagai
  • Patent number: 11966518
    Abstract: Implementations set forth herein relate to effectuating device arbitration in a multi-device environment using data available from a wearable computing device, such as computerized glasses. The computerized glasses can include a camera, which can be used to provide image data for resolving issues related to device arbitration. In some implementations, a direction that a user is directing their computerized glasses, and/or directing their gaze (as detected by the computerized glasses with prior permission from the user), can be used to prioritize a particular device in a multi-device environment. A detected orientation of the computerized glasses can also be used to determine how to simultaneously allocate content between a graphical display of the computerized glasses and another graphical display of another client device. When content is allocated to the computerized glasses, content-specific gestures can be enabled and actionable at the computerized glasses.
    Type: Grant
    Filed: September 19, 2022
    Date of Patent: April 23, 2024
    Assignee: GOOGLE LLC
    Inventors: Alexander Chu, Jarlan Perez
  • Patent number: 11963792
    Abstract: Devices, systems and methods that track various aspects of a person's sleep and environment to optimize one or more aspects of the user's environment and sleep conditions, quality and duration, together or alone, and help the one or more users maintain and prolong deep sleep and improve their sleep duration and quality.
    Type: Grant
    Filed: March 15, 2016
    Date of Patent: April 23, 2024
    Assignee: DP Technologies, Inc.
    Inventors: Philippe Richard Kahn, Arthur Kinsolving, Mark Andrew Christensen, Venkat Easwar, Steven D. Powell
  • Patent number: 11966509
    Abstract: An eye-tracking system (100, 302) comprises light source(s) (102, 202, 306); means (104, 200, 308) for changing the direction of a light beam (210) emitted by the light source(s), wherein said means is employed to steer the light beam to scan the surface of a user's eye (214); and light sensor(s) (106, 310) employed to sense reflected light signals. Said means is employed to detect when the light beam is incident upon the pupil of the user's eye, based on variations in at least one parameter of the reflected light signals; and to steer the light beam to be incident upon the centre of the pupil, wherein the light beam is determined to be incident upon the centre of the pupil when a minimum value from amongst values of the at least one parameter of the reflected light signals is detected.
    Type: Grant
    Filed: January 31, 2023
    Date of Patent: April 23, 2024
    Assignee: Pixieray Oy
    Inventors: Ari Pitkänen, Klaus Melakari
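The minimum-value criterion in this abstract (the beam is at the pupil centre where the reflected-signal parameter is minimal) reduces to an argmin over scan positions. The sketch below is illustrative, with an assumed toy intensity function; it is not Pixieray's implementation.

```python
def find_pupil_center(scan_positions, reflected_intensity):
    """Steer the beam across scan positions and return the position where
    the sensed reflection parameter is minimal (the abstract's criterion
    for the pupil centre)."""
    return min(scan_positions, key=reflected_intensity)
```

For a 1-D scan whose reflection dips at position 3, the argmin recovers that position as the pupil centre.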
  • Patent number: 11960641
    Abstract: The present disclosure relates to determining when the head position of a user viewing user interfaces in a computer-generated reality environment is not in a comfortable and/or ergonomic position and repositioning the displayed user interface so that the user will reposition her/his head to view the user interface at a more comfortable and/or ergonomic head position.
    Type: Grant
    Filed: June 21, 2022
    Date of Patent: April 16, 2024
    Assignee: Apple Inc.
    Inventor: Aaron M. Burns
  • Patent number: 11960660
    Abstract: A terminal device according to one aspect presents an augmented reality space to a user. The terminal device includes: an image-capturing unit configured to capture an image of a real space; a display unit configured to display an augmented reality space image representing the augmented reality space including the real space captured by the image-capturing unit and a virtual object; a determination unit configured to determine at least a part of the virtual object as an operation target; and an object control unit configured to control an operation of the operation target in the augmented reality space. The object control unit detects a direction and an amount of a movement of the terminal device after the operation target is determined and moves the operation target based on the detected direction and amount of the movement of the terminal device.
    Type: Grant
    Filed: July 16, 2021
    Date of Patent: April 16, 2024
    Assignee: POPOPO INC
    Inventor: Shinnosuke Iwaki
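The object-control step this abstract describes (applying the detected direction and amount of terminal movement to the operation target) amounts to translating the target by the device's displacement. A minimal sketch, with assumed names and a simple vector-addition model:

```python
def move_operation_target(target_pos, device_delta):
    """After the operation target is determined, move it by the detected
    direction and amount of the terminal device's movement (modeled here
    as a displacement vector added componentwise)."""
    return tuple(p + d for p, d in zip(target_pos, device_delta))
```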
  • Patent number: 11960091
    Abstract: Embodiments of the present disclosure provide a method, a computer program product, and a wearable device for controlling display of content. The method is performed in a wearable device (10) comprising a head mounted display having a display region (12). The method comprises causing (S12) to display a first visual content (32) on a first portion (14) of the display region, corresponding to an eye gaze direction (22) of the user (20). The method comprises determining (S13) to transmit a second visual content (34) to one or more external display devices (40a-40n) based on presence of the one or more external display devices (40a-40n) in a field of view, FoV, of the wearable device (10). Further, the method comprises sending (S14) a request to at least one of the one or more external display devices (40a-40n) to display the second visual content (34). The method further comprises causing (S15) to display the second visual content (34), at least outside the first portion (14) of the display region (12).
    Type: Grant
    Filed: May 27, 2020
    Date of Patent: April 16, 2024
    Assignee: Telefonaktiebolaget LM Ericsson (publ)
    Inventors: Tommy Arngren, Andreas Kristensson, Peter Ökvist, Jonas Pettersson
  • Patent number: 11960834
    Abstract: An attention application, such as a web browser, includes a pipeline optimized for faster, more secure, and more private, viewing of hypermedia documents using a reader mode. The reader mode is “always on” in the sense that a classifier runs on every web page and every compatible page is rendered in the reader mode and not rendered in full, referred to as the bloat page. Significant time savings are gained by avoiding fetching and rendering the bloat page at all because the bloat page devours network bandwidth and computing resources. Avoiding loading the bloat page also avoids exposing the user to what are often abusive privacy infringements and security vulnerabilities from running executable code in the browser, while providing an uncluttered viewing experience of content that is actually of interest to the user.
    Type: Grant
    Filed: September 30, 2019
    Date of Patent: April 16, 2024
    Assignee: Brave Software, Inc.
    Inventors: Benjamin Livshits, Peter Snyder, Andrius Aucinas
  • Patent number: 11960705
    Abstract: A user terminal device and a controlling method thereof are provided. The user terminal device includes a display configured to be divided into a first area and a second area which is larger than the first area with reference to a folding line, a cover disposed on a rear side of the display, a detector configured to detect a user interaction on the display and the cover, and a controller configured to, in response to the display being folded along the folding line such that the first area and the second area face each other, control the detector to detect a user interaction through an exposure area, which is an exposed part of the second area, and the cover, and, in response to the display being folded such that the two parts of the cover face each other with reference to the folding line, control the detector to detect a user interaction through the first area and the second area.
    Type: Grant
    Filed: November 22, 2022
    Date of Patent: April 16, 2024
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Dong-goo Kang, Yun-kyung Kim, Yong-yeon Lee, Ji-yeon Kwak, Hyun-jin Kim
  • Patent number: 11961288
    Abstract: An object is to reduce the processing time. One embodiment is a non-transitory computer readable storage medium storing a program for causing an information processing apparatus to function as: a detection unit configured to detect an area of a first object for each second object from a captured image obtained by capturing the second object including the first object; a first display control unit configured to display a display item indicating a detected area at a position corresponding to the area detected by the detection unit in the captured image displayed on a screen; and a second display control unit configured to display, in the captured image of a second object that is displayed on the screen and in which area detection has failed, a temporary area created based on the area of the first object detected in a second object in which area detection has succeeded.
    Type: Grant
    Filed: July 13, 2021
    Date of Patent: April 16, 2024
    Assignee: Canon Kabushiki Kaisha
    Inventor: Natsumi Tokunaga
  • Patent number: 11954241
    Abstract: A position of a user is recognized. A display unit (102) displays a content-display-frame (112). A determination unit (103) determines whether or not the position of the user is within a predetermined range from a wall surface (12). A display control unit (104) is configured to display the content-display-frame on a floor surface (11) when the position of the user is farther than the predetermined range from the wall surface, and to display an operation input area (113) that receives an operation input from the user on the wall surface while displaying the content-display-frame to span from the floor surface to the wall surface when the position of the user is within the predetermined range from the wall surface.
    Type: Grant
    Filed: January 6, 2020
    Date of Patent: April 9, 2024
    Assignee: SHIMIZU CORPORATION
    Inventors: Takashi Matsumoto, Yuya Igarashi, Motoaki Yamazaki, Michihito Shiraishi
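The placement rule this abstract describes is a distance threshold against the wall. A minimal sketch, assuming a hypothetical threshold value and return format (the patent does not specify either):

```python
def plan_display(dist_to_wall, threshold=1.5):
    """Sketch of the abstract's rule: frame on the floor when the user is
    beyond the predetermined range from the wall; otherwise span the frame
    floor-to-wall and add an operation input area on the wall."""
    if dist_to_wall > threshold:
        return {"content_frame": "floor", "input_area": None}
    return {"content_frame": "floor-to-wall", "input_area": "wall"}
```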
  • Patent number: 11954263
    Abstract: Systems and methods for gesture control are described. In some embodiments, a system may include a wearable device configured to be worn on a wrist of a user. The wearable device may include a hub and a wristband. The hub may include a plurality of hub electrodes and a biopotential microchip. The wristband may include one or more wristband electrodes and one or more wristband conductors, which may electrically connect the one or more wristband electrodes to the electrical port of a sealed housing of the hub, which may in turn electrically connect the one or more wristband electrodes to inputs of the biopotential microchip.
    Type: Grant
    Filed: September 26, 2022
    Date of Patent: April 9, 2024
    Assignee: Pison Technology, Inc.
    Inventors: Tristan McLaurin, Tanya Wang, David Cipoletta, Dexter Ang
  • Patent number: 11954251
    Abstract: Enhanced eye-tracking techniques for augmented or virtual reality display systems. An example method includes obtaining an image of an eye of a user of a wearable system, the image depicting glints on the eye caused by respective light emitters, wherein the image is a low dynamic range (LDR) image; generating a high dynamic range (HDR) image via computation of a forward pass of a machine learning model using the image; determining location information associated with the glints as depicted in the HDR image, wherein the location information is usable to inform an eye pose of the eye.
    Type: Grant
    Filed: April 21, 2023
    Date of Patent: April 9, 2024
    Assignee: Magic Leap, Inc.
    Inventors: Hao Zheng, Zhiheng Jia
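The pipeline this abstract describes is: expand the LDR eye image to HDR via a learned model's forward pass, then locate the glints in the HDR result. The sketch below stands in for that pipeline with a caller-supplied expansion function in place of the machine learning model and a simple threshold in place of glint localization; it is illustrative only, not Magic Leap's method.

```python
def locate_glints(ldr_image, expand, threshold):
    """Apply an LDR->HDR expansion function per pixel (a stand-in for the
    model's forward pass), then return (row, col) coordinates of pixels
    whose HDR value exceeds the glint threshold."""
    return [(r, c)
            for r, row in enumerate(ldr_image)
            for c, v in enumerate(row)
            if expand(v) > threshold]
```

In practice the bright specular glints saturate in LDR; the HDR expansion is what lets the thresholding separate them reliably.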
  • Patent number: 11954278
    Abstract: The present application provides a display panel and an electronic device. A functional region and a transition region of the display panel are located between two adjacent touch electrodes, each touch electrode is electrically connected to a first signal transmission line, first signal shielding lines are disposed between the first signal transmission lines of a same touch electrode group and the touch electrodes of an adjacent touch electrode group, and a part of the first signal shielding line is located in the transition region, so as to alleviate a problem of poor touch performance of an existing out-cut screen.
    Type: Grant
    Filed: November 26, 2021
    Date of Patent: April 9, 2024
    Assignee: WUHAN CHINA STAR OPTOELECTRONICS SEMICONDUCTOR DISPLAY TECHNOLOGY CO., LTD.
    Inventors: Liang Fang, Ding Ding
  • Patent number: 11954774
    Abstract: Systems and methods enable users to build augmented reality (AR) experiences with Internet of Things (IoT) devices. The system includes an AR object studio that includes a list of IoT devices and control signals for the respective IoT devices and a list of AR objects (e.g., an AR lens). The AR object studio receives selections from users and correlates at least one IoT device to at least one AR object in response to the user selections. During use, a server receives an indication that an AR object has been activated and interacted with on a display of an AR camera device and, in response, sends a control signal to a correlated IoT device. Conversely, the server may receive a signal from an IoT device and, in response, present and control a correlated AR object on the display of the AR camera device.
    Type: Grant
    Filed: August 29, 2021
    Date of Patent: April 9, 2024
    Assignee: Snap Inc.
    Inventors: Yu Jiang Tham, Rajan Vaish, Andrés Monroy-Hernández, Sven Kratz, Ana Maria Cardenas Gasca
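The correlation step this abstract describes (pairing an AR object with an IoT device and its control signal, then dispatching on interaction) can be sketched as a lookup table. All class, method, and device names below are hypothetical, not Snap's API:

```python
class ARObjectStudio:
    """Minimal sketch of the abstract's AR-object-to-IoT correlation table."""

    def __init__(self):
        self.links = {}

    def correlate(self, ar_object, device, signal):
        """Record a user selection pairing an AR object with a device's
        control signal."""
        self.links[ar_object] = (device, signal)

    def on_ar_interaction(self, ar_object):
        """AR object activated on the camera display: return the
        (device, control signal) pair to send, if one is correlated."""
        return self.links.get(ar_object)
```

The reverse direction in the abstract (an IoT signal driving a correlated AR object) would use the same table looked up by device.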
  • Patent number: 11956874
    Abstract: A wrist-wearable electronic device comprising first and second light emitting elements, a sensor, and a processor. The processor is configured to transmit a first command to the first light emitting element in response to the wrist-wearable device reaching a forward position relative to a user based on data received from the sensor and transmit a second command to the second light emitting element in response to the wrist-wearable device reaching a rearward position relative to the user based on the data from the sensor.
    Type: Grant
    Filed: June 6, 2023
    Date of Patent: April 9, 2024
    Assignee: Garmin International, Inc.
    Inventors: Jonathan R. Hosler, Aaron J. Lindh, Steven J. Christy, Vincent G. Marco, Thomas I. Loschen, Brent E. Barberis, Hans K. Fritze
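The two-emitter behaviour this abstract describes (first light at the forward swing position, second at the rearward position) can be sketched as thresholds on a sensed wrist angle. The angle representation and threshold values here are assumptions for illustration, not Garmin's design:

```python
def light_command(wrist_angle_deg):
    """Return which emitter command to send based on sensor-derived wrist
    angle: first emitter at the forward position, second at the rearward
    position, nothing in between."""
    if wrist_angle_deg >= 45:
        return "first_emitter_on"    # forward position reached
    if wrist_angle_deg <= -45:
        return "second_emitter_on"   # rearward position reached
    return None
```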
  • Patent number: 11955298
    Abstract: A button module is provided. The button module comprises a base, a pressing part, and an elastic part. The pressing part includes a fixed end and a free end. The fixed end is pivotally connected to the base in a first axial direction. The elastic part is disposed on a side of the pressing part facing the base. The elastic part includes a first damping portion and a second damping portion selectively pressing against the base, where a hardness of the first damping portion is different from a hardness of the second damping portion.
    Type: Grant
    Filed: August 30, 2022
    Date of Patent: April 9, 2024
    Assignee: ASUSTEK COMPUTER INC.
    Inventors: Te-Wei Huang, Zih-Siang Huang, Jhih-Wei Rao, Hung-Chieh Wu, Liang-Jen Lin
  • Patent number: 11956288
    Abstract: A smart conference room system is provided. The smart conference room system includes an intelligent conference interactive panel including one or more processors configured to convert user input into a conference information, and a memory configured to store the conference information. The intelligent conference interactive panel is configured to transmit conference information to one or more terminal devices. The one or more terminal devices are configured to perform a controllable operation based on the conference information transmitted to the one or more terminal devices.
    Type: Grant
    Filed: June 24, 2022
    Date of Patent: April 9, 2024
    Assignee: BOE Technology Group Co., Ltd.
    Inventors: Xiangxiang Zou, Yangyang Zhang, Longyu Wang, Hailong Zhang
  • Patent number: 11953690
    Abstract: Multiple head-mounted devices and/or other electronic devices can operate in concert to provide multiple users with shared experiences and content enjoyment. Such operations can be facilitated by a connection between multiple head-mounted devices and/or other electronic devices to allow different users to receive content. Such a connection can be made possible by a connector that directly and physically connects head-mounted devices and/or other electronic devices to each other and transmits signals therebetween. By providing a physical connection, the signals can be efficiently transmitted even when other types of connections (e.g., wireless) are not available.
    Type: Grant
    Filed: March 15, 2023
    Date of Patent: April 9, 2024
    Assignee: Apple Inc.
    Inventors: Forrest C. Wang, Ritu Shah