Gesture-based Patents (Class 715/863)
  • Patent number: 12288493
    Abstract: An information processing apparatus includes a display, a memory that stores display data to be displayed on the display, and a processor that controls the stored display data. The processor is configured to switch the display between a first display mode in which an entire screen area of the display is used as a display area and a second display mode in which the entire screen area is divided into a plurality of areas, to display a selection screen on which either the first display mode or the second display mode is selectable by a user operation, to control the selection screen to be hidden after a predetermined time elapses since the user's display-mode selection operation is performed on the selection screen, and to change a setting of the predetermined time based on a predetermined condition.
    Type: Grant
    Filed: January 31, 2024
    Date of Patent: April 29, 2025
    Assignee: Lenovo (Singapore) Pte. Ltd.
    Inventors: Yuichi Sone, Yoshinori Ito
  • Patent number: 12284128
    Abstract: Disclosed is a 5th generation (5G) or pre-5G communication system for supporting a data transmission rate higher than that of a 4th generation (4G) communication system, such as long term evolution (LTE). The purpose of the disclosure is to detect interference between base stations in a wireless communication system, and a base station operating method can comprise the steps of: receiving signals through a resource allocated for reference signals (RSs) for interference measurement; detecting at least one RS on the basis of the signals; and determining that at least one among the RSs has been received on the basis of cross-correlation values between candidate RSs.
    Type: Grant
    Filed: July 9, 2020
    Date of Patent: April 22, 2025
    Assignee: Samsung Electronics Co., Ltd.
    Inventors: Dongjae Lee, Hyuncheol Kim
  • Patent number: 12282653
    Abstract: An interactive device is described that is configured to: display an output of a remote device, wherein an output delay exists between the output being generated by the remote device and the output being displayed on the interactive device; generate interaction data in dependence on an interaction with the interactive device at a location; transmit the interaction data to the remote device; and display an updated output of the remote device, the updated output being generated by the remote device subsequent to receiving the interaction data. The interactive device is further configured to: generate an interaction image of an area of the updated output of the remote device corresponding to the location of the interaction, determine an interaction indication in dependence on the interaction image, and display the interaction indication on the interactive device at a location of one or more further interactions with the interactive device.
    Type: Grant
    Filed: February 5, 2021
    Date of Patent: April 22, 2025
    Assignee: FLATFROG LABORATORIES AB
    Inventors: Markus Andreasson, Gunnar Weibull
  • Patent number: 12282607
    Abstract: A text entry process for an Augmented Reality (AR) system. The AR system detects, using one or more cameras of the AR system, a start text entry gesture made by a user of the AR system. During text entry, the AR system detects, using the one or more cameras, a symbol corresponding to a fingerspelling sign made by the user. The AR system generates entered text data based on the symbol and provides text in a text scene component of an AR overlay provided by the AR system to the user based on the entered text data.
    Type: Grant
    Filed: April 27, 2022
    Date of Patent: April 22, 2025
    Assignee: Snap Inc.
    Inventors: Austin Vaday, Rebecca Jean Lee, Jennica Pounds
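    Illustrative sketch (not part of the patent): a minimal Python sketch of the text-entry flow described in the abstract above: a start gesture opens a text buffer and each recognized fingerspelling symbol is appended to it. The gesture and symbol recognizers are stubbed out as plain string events; the event names are assumptions for illustration only.
      from dataclasses import dataclass, field

      @dataclass
      class TextEntrySession:
          active: bool = False
          buffer: list[str] = field(default_factory=list)

          def on_event(self, event: str) -> None:
              # "start"/"stop" stand in for the detected start/stop text entry
              # gestures; single characters stand in for fingerspelling symbols.
              if event == "start":
                  self.active = True
              elif event == "stop":
                  self.active = False
              elif self.active and len(event) == 1:
                  self.buffer.append(event)

          @property
          def text(self) -> str:
              return "".join(self.buffer)

      session = TextEntrySession()
      for ev in ["start", "h", "i", "stop", "x"]:  # "x" arrives after stop and is ignored
          session.on_event(ev)
      print(session.text)  # -> "hi", the entered text shown in the AR text scene component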
  • Patent number: 12283948
    Abstract: A touch button, a control method, and an electronic device are provided. The touch button includes a first conductive part, a second conductive part, and a touch part, where the second conductive part surrounds the first conductive part to form a capacitance structure. The method includes receiving a touch input acting on the touch part; detecting whether the second conductive part has any capacitance change in a plurality of preset directions; and determining, in a case where a capacitance change exists in a target direction among the plurality of preset directions, a control instruction corresponding to the target direction. According to the method, when the touch button receives the touch input, the touch part transfers pressure to the first conductive part, so that the first conductive part moves relative to the second conductive part, resulting in a change of capacitance.
    Type: Grant
    Filed: June 21, 2022
    Date of Patent: April 22, 2025
    Assignee: VIVO MOBILE COMMUNICATION CO., LTD.
    Inventor: Honggen Li
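    Illustrative sketch (not part of the patent): a minimal Python sketch of the direction-to-instruction mapping described in the abstract above. The threshold value, direction names, and instruction names are assumptions for illustration only.
      CAPACITANCE_THRESHOLD = 0.5  # assumed change threshold; purely illustrative

      # Hypothetical mapping from a preset direction to a control instruction.
      INSTRUCTIONS = {
          "up": "volume_up",
          "down": "volume_down",
          "left": "previous_track",
          "right": "next_track",
      }

      def resolve_instruction(capacitance_deltas: dict[str, float]) -> str | None:
          """Return the instruction for the direction with the largest
          above-threshold capacitance change, or None if no direction qualifies."""
          candidates = {d: abs(v) for d, v in capacitance_deltas.items()
                        if abs(v) >= CAPACITANCE_THRESHOLD}
          if not candidates:
              return None
          target_direction = max(candidates, key=candidates.get)
          return INSTRUCTIONS.get(target_direction)

      # Pressure on the touch part shifts the first conductive part toward the
      # "right" electrode, producing the largest capacitance change there.
      print(resolve_instruction({"up": 0.1, "down": 0.0, "left": 0.2, "right": 0.9}))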
  • Patent number: 12279081
    Abstract: An information processing apparatus includes a processing unit that generates an operation log of an operation target device operated by a user, by identifying an operation content of the operation target device, based on a captured image obtained by capturing an image of the operation target device, and operation pattern information of a model of the operation target device.
    Type: Grant
    Filed: February 24, 2022
    Date of Patent: April 15, 2025
    Assignee: Yokogawa Electric Corporation
    Inventors: Tomoaki Eto, Sawako Ishihara, Hayato Waki, Ken Muramatsu
  • Patent number: 12271532
    Abstract: Systems and methods herein describe a multi-modal interaction system. The multi-modal interaction system receives a selection of an augmented reality (AR) experience within an application on a computer device, displays a set of AR objects associated with the AR experience on a graphical user interface (GUI) of the computer device, displays textual cues associated with the set of augmented reality objects on the GUI, receives a hand gesture and a voice command, modifies a subset of augmented reality objects of the set of augmented reality objects based on the hand gesture and the voice command, and displays the modified subset of augmented reality objects on the GUI.
    Type: Grant
    Filed: April 3, 2024
    Date of Patent: April 8, 2025
    Assignee: Snap Inc.
    Inventors: Jonathan Solichin, Xinyao Wang
  • Patent number: 12273339
    Abstract: A system and method provide automatic access to applications or data. A portable physical device, referred to herein as a Personal Digital Key or “PDK”, stores one or more profiles in memory, including a biometric profile acquired in a secure trusted process and uniquely associated with a user that is authorized to use and associated with the PDK. The PDK wirelessly transmits identification information including a unique PDK identification number, the biometric profile and a profile over a secure wireless channel to a reader. A computing device is coupled to the reader. An auto login server is coupled to the reader and the computing device and launches one or more applications associated with a user name identified by the received profile.
    Type: Grant
    Filed: June 10, 2021
    Date of Patent: April 8, 2025
    Assignee: Proxense, LLC
    Inventor: John J. Giobbi
  • Patent number: 12265667
    Abstract: Embodiments of this application provide a stylus-based data processing method and apparatus, which are applied to a communication system. The communication system includes a stylus and an electronic device. The electronic device displays a first interface. After a user performs a first operation on a first target area in the first interface by using the stylus, the electronic device may determine first target content in the first target area. The stylus may obtain the first target content from the electronic device, and obtain a first target result corresponding to the first target content. The electronic device may obtain the first target result from the stylus and display the first target result on the first interface.
    Type: Grant
    Filed: August 26, 2022
    Date of Patent: April 1, 2025
    Assignee: Honor Device Co., Ltd.
    Inventors: Jing Yang, Yingjun Xi
  • Patent number: 12265690
    Abstract: A computer system displays a first view of a user interface of a first application with a first size at a first position corresponding to a location of at least a portion of a palm that is currently facing a viewpoint corresponding to a view of a three-dimensional environment provided via a display generation component. While displaying the first view, the computer system detects a first input that corresponds to a request to transfer display of the first application from the palm to a first surface that is within a first proximity of the viewpoint. In response to detecting the first input, the computer system displays a second view of the user interface of the first application with a second size and an orientation that corresponds to the first surface at a second position defined by the first surface.
    Type: Grant
    Filed: November 29, 2023
    Date of Patent: April 1, 2025
    Assignee: APPLE INC.
    Inventors: Stephen O. Lemay, Gregg S. Suzuki, Matthew J. Sundstrom, Jonathan Ive, Jeffrey M. Faulkner, Jonathan R. Dascola, William A. Sorrentino, III, Kristi E. S. Bauerly, Giancarlo Yerkes, Peter D. Anton
  • Patent number: 12248527
    Abstract: A computer-implemented method comprises displaying a grid comprising a plurality of cells; receiving user input modifying a state of one or more of the plurality of cells to create a graphical shape in the grid, wherein each of the plurality of cells is limited to one of a plurality of predefined states; requesting an internet protocol (IP) address corresponding to the graphical shape; and in response to receiving the IP address corresponding to the graphical shape, retrieving a web resource located at the IP address.
    Type: Grant
    Filed: November 11, 2022
    Date of Patent: March 11, 2025
    Assignee: International Business Machines Corporation
    Inventor: Venkataramana Logasundaram Jaganathan
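    Illustrative sketch (not part of the patent): a minimal Python sketch of the grid-shape lookup described in the abstract above. The abstract does not say how a drawn shape is resolved to an IP address, so the resolver below is a hypothetical local table standing in for the request; the grid size, cell states, and address are assumptions for illustration only.
      GRID = [
          [1, 1, 1],
          [1, 0, 1],
          [1, 1, 1],
      ]  # each cell limited to one of two predefined states (0 or 1)

      def grid_key(grid):
          """Serialize the drawn shape into a stable lookup key."""
          return "".join(str(cell) for row in grid for cell in row)

      # Hypothetical shape-to-address table standing in for the lookup service.
      SHAPE_TO_IP = {"111101111": "93.184.216.34"}

      ip = SHAPE_TO_IP.get(grid_key(GRID))
      if ip is not None:
          print(f"Retrieving web resource at http://{ip}/")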
  • Patent number: 12248654
    Abstract: Systems and methods are described for generating and interacting with an extended reality user interface. For example, a system may detect a launch of an extended reality application on a computing device. The system may generate, in a first virtual zone of a plurality of virtual zones of an extended reality user interface, a first plurality of selectable elements. The system may identify, by a camera, a surface in a real world environment of the computing device and map the plurality of virtual zones to a plurality of physical zones on the surface. The system may detect, by the camera, a first gesture over a first physical zone of the plurality of physical zones. In response to determining that the first physical zone corresponds to the first virtual zone, the system may execute a first interaction action on the first plurality of selectable elements based on the first gesture.
    Type: Grant
    Filed: February 16, 2024
    Date of Patent: March 11, 2025
    Assignee: ISOVIST LIMITED
    Inventors: Dale Alan Herigstad, Jack Turpin, Eric Fanghanel Santibanez
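    Illustrative sketch (not part of the patent): a minimal Python sketch of the zone mapping described in the abstract above: virtual zones of the extended reality interface are mapped to physical zones of a detected surface, and a gesture over a physical zone triggers an interaction on the elements of the corresponding virtual zone. The zone names, elements, and gesture-to-action mapping are assumptions for illustration only.
      # Hypothetical mapping from physical zones on the surface to virtual zones.
      PHYSICAL_TO_VIRTUAL = {"table_left": "zone_1", "table_right": "zone_2"}
      ZONE_ELEMENTS = {"zone_1": ["Play", "Pause"], "zone_2": ["Next", "Prev"]}

      def handle_gesture(physical_zone, gesture):
          """Dispatch a gesture seen over a physical zone to its virtual zone."""
          virtual_zone = PHYSICAL_TO_VIRTUAL.get(physical_zone)
          if virtual_zone is None:
              return "ignored"
          elements = ZONE_ELEMENTS[virtual_zone]
          if gesture == "tap":
              return f"select {elements[0]} in {virtual_zone}"
          if gesture == "swipe":
              return f"cycle through {elements} in {virtual_zone}"
          return "ignored"

      print(handle_gesture("table_left", "tap"))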
  • Patent number: 12246859
    Abstract: Disclosed are an apparatus and method for controlling an aerial vehicle. The apparatus includes a camera that obtains a terrain image during flight, a sensor that obtains scan information of a landing point of the aerial vehicle, and a controller that estimates a location of the landing point based on the terrain image if it is determined that the landing point is recognized in the terrain image, determines whether an obstacle exists at the landing point based on the scan information, and determines whether landing of the aerial vehicle is possible based on a determination result.
    Type: Grant
    Filed: November 10, 2022
    Date of Patent: March 11, 2025
    Assignees: Hyundai Motor Company, Kia Corporation
    Inventors: Hyun Jee Ryu, Kyu Nam Kim, Chang Hyun Sung, Jun Young Lim
  • Patent number: 12243238
    Abstract: The technology disclosed performs hand pose estimation on a so-called "joint-by-joint" basis. So, when a plurality of estimates for the 28 hand joints are received from a plurality of expert networks (and from master experts in some high-confidence scenarios), the estimates are analyzed at a joint level and a final location for each joint is calculated based on the plurality of estimates for a particular joint. This is a novel solution discovered by the technology disclosed because nothing in the field of art determines hand pose estimates at such granularity and precision. Regarding granularity and precision, because hand pose estimates are computed on a joint-by-joint basis, this allows the technology disclosed to detect in real time even the minutest and most subtle hand movements, such as a bend/yaw/tilt/roll of a segment of a finger or a tilt of an occluded finger, as demonstrated supra in the Experimental Results section of this application.
    Type: Grant
    Filed: July 20, 2023
    Date of Patent: March 4, 2025
    Assignee: ULTRAHAPTICS IP TWO LIMITED
    Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
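    Illustrative sketch (not part of the patent): a minimal Python sketch of joint-by-joint fusion as described in the abstract above: each expert network proposes a 3D location for every joint, and a final location is computed per joint from all proposals. The abstract does not specify the combination rule; a per-joint median is an assumed, illustrative choice.
      import statistics

      NUM_JOINTS = 28  # number of hand joints named in the abstract

      def fuse_joint_estimates(expert_estimates):
          """expert_estimates[e][j] is expert e's (x, y, z) estimate for joint j;
          returns one fused (x, y, z) location per joint."""
          fused = []
          for j in range(NUM_JOINTS):
              per_joint = [expert[j] for expert in expert_estimates]
              fused.append(tuple(statistics.median(p[axis] for p in per_joint)
                                 for axis in range(3)))
          return fused

      # Two toy "experts" that agree up to a small offset on every joint.
      expert_a = [(float(j), 0.0, 0.0) for j in range(NUM_JOINTS)]
      expert_b = [(float(j) + 0.1, 0.0, 0.0) for j in range(NUM_JOINTS)]
      print(fuse_joint_estimates([expert_a, expert_b])[0])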
  • Patent number: 12236145
    Abstract: A display apparatus displays handwritten data that is input with an inputter. The display apparatus includes processing circuitry; and a memory storing computer-executable instructions that cause the processing circuitry to convert the handwritten data into a character string; and conceal the character string generated by converting the handwritten data, in response to determining that a first time has elapsed after the character string is displayed.
    Type: Grant
    Filed: August 30, 2021
    Date of Patent: February 25, 2025
    Assignee: RICOH COMPANY, LTD.
    Inventor: Kiyoshi Kasatani
  • Patent number: 12236528
    Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured using image information or the equivalents. Predictive information can be improved by applying techniques for correlating it with information from observations.
    Type: Grant
    Filed: September 30, 2022
    Date of Patent: February 25, 2025
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Kevin A. Horowitz, David S. Holz
  • Patent number: 12238092
    Abstract: A system and method provide automatic access to applications or data. A portable physical device, referred to herein as a Personal Digital Key or “PDK”, stores one or more profiles in memory, including a biometric profile acquired in a secure trusted process and uniquely associated with a user that is authorized to use and associated with the PDK. The PDK wirelessly transmits identification information including a unique PDK identification number, the biometric profile and a profile over a secure wireless channel to a reader. A computing device is coupled to the reader. An auto login server is coupled to the reader and the computing device and launches one or more applications associated with a user name identified by the received profile.
    Type: Grant
    Filed: June 10, 2021
    Date of Patent: February 25, 2025
    Assignee: Proxense, LLC
    Inventor: John J. Giobbi
  • Patent number: 12236006
    Abstract: An image forming device stores a play area where a user wearing a head mounted display is movable during play of an application in a space around the user. The image forming device causes the head mounted display to display an image indicating the stored play area. The image forming device receives an operation performed by the user to edit the play area. The image forming device expands or reduces the play area according to the operation performed by the user.
    Type: Grant
    Filed: February 14, 2022
    Date of Patent: February 25, 2025
    Assignee: SONY INTERACTIVE ENTERTAINMENT INC.
    Inventors: Yuto Hayakawa, Shoi Yonetomi, Yurika Mulase, Masanori Nomura
  • Patent number: 12229397
    Abstract: Techniques for a controller graphical user interface based on interaction data are described and are implementable to generate a controller graphical user interface for display by a first device to control digital content displayed on a second device. For instance, an application such as a mobile gaming application is displayed on a mobile device and digital content from the application is communicated for display by a display device as part of a content connectivity session. One or more control regions of the application are determined based on a monitored interaction with a user interface of the mobile device. One or more control graphics are identified that correlate to the one or more control regions. A controller graphical user interface is generated that displays the one or more control graphics and filters out extraneous digital content. The controller graphical user interface is then displayed by the mobile device.
    Type: Grant
    Filed: February 8, 2023
    Date of Patent: February 18, 2025
    Assignee: Motorola Mobility LLC
    Inventors: Amit Kumar Agrawal, Satyabrata Rout, J Amarnath, Gokula Ramanan
  • Patent number: 12223163
    Abstract: An interaction method, an interaction apparatus, an electronic device, and a storage medium are provided. The interaction method includes: displaying a target object in a first display mode on a target object display interface, where in the first display mode, the target object display interface includes a first interactive control; receiving a display mode switching operation on the target object display interface; and in response to the display mode switching operation, displaying the target object in a second display mode on the target object display interface, where in the second display mode, the target object display interface includes a second interactive control different from the first interactive control.
    Type: Grant
    Filed: December 21, 2023
    Date of Patent: February 11, 2025
    Assignee: BEIJING ZITIAO NETWORK TECHNOLOGY CO., LTD.
    Inventors: Weiyi Chang, Ziyang Zheng
  • Patent number: 12219239
    Abstract: A method of controlling an electronic device by recognizing movement of an object includes obtaining at least one image including an image of the object; dividing the obtained at least one image into a middle zone and a peripheral zone; extracting one or more feature points of the object that are within the peripheral zone; recognizing movement of the object based on the extracted one or more feature points; and controlling the electronic device based on the recognized movement.
    Type: Grant
    Filed: October 5, 2021
    Date of Patent: February 4, 2025
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Yevhenii Yakishyn, Svitlana Alkhimova
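    Illustrative sketch (not part of the patent): a minimal Python sketch of the zone split described in the abstract above: the frame is divided into a middle zone and a peripheral zone, and only feature points falling in the peripheral zone are kept for movement recognition. A centered middle zone covering half of each dimension is an assumed parameter.
      def in_peripheral_zone(point, width, height, middle_fraction=0.5):
          """True if the point lies outside the centered middle zone."""
          x, y = point
          mx0 = width * (1 - middle_fraction) / 2
          mx1 = width * (1 + middle_fraction) / 2
          my0 = height * (1 - middle_fraction) / 2
          my1 = height * (1 + middle_fraction) / 2
          return not (mx0 <= x <= mx1 and my0 <= y <= my1)

      features = [(10, 10), (320, 240), (620, 30)]  # detected feature points
      peripheral = [p for p in features if in_peripheral_zone(p, 640, 480)]
      print(peripheral)  # movement would be recognized from these points only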
  • Patent number: 12216881
    Abstract: A method and device for operating an application icon are provided, to prompt a user where a currently operated application icon is located. The method includes: displaying a first screen page of a desktop, wherein the first screen page comprises N application icons, the N application icons comprise a first application icon and at least one first-layer application icon, the first-layer application icon is located around the first application icon and adjacent to the first application icon, and the first-layer application icon comprises a foreground layer and a background layer; receiving an operation of a user related to the first application icon; and in response to the operation, displaying the at least one first-layer application icon in a first display manner, wherein in the first display manner, a foreground layer of the at least one first-layer application icon moves relative to the first application icon.
    Type: Grant
    Filed: June 6, 2023
    Date of Patent: February 4, 2025
    Assignee: Huawei Technologies Co., Ltd.
    Inventor: Lang Song
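    Illustrative sketch (not part of the patent): a minimal Python sketch of the first-layer neighborhood described in the abstract above: the icons located around and adjacent to the operated icon are the ones displayed in the first display manner. A 4-column home-screen grid is an assumed layout for illustration.
      COLUMNS = 4  # assumed icons per row on the screen page

      def first_layer_icons(index, total, columns=COLUMNS):
          """Return indices of icons adjacent (including diagonals) to the operated icon."""
          row, col = divmod(index, columns)
          neighbors = []
          for dr in (-1, 0, 1):
              for dc in (-1, 0, 1):
                  if dr == 0 and dc == 0:
                      continue
                  r, c = row + dr, col + dc
                  n = r * columns + c
                  if r >= 0 and 0 <= c < columns and n < total:
                      neighbors.append(n)
          return neighbors

      # Operating on icon 5 of a 12-icon page animates the icons surrounding it.
      print(first_layer_icons(5, 12))  # [0, 1, 2, 4, 6, 8, 9, 10]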
  • Patent number: 12216829
    Abstract: Methods, systems, and devices for device-inputted commands are described. A system may receive physiological data, including motion data, associated with a user from a wearable device worn by the user and may detect multiple motion pulses based on the motion data, where each motion pulse includes motion data that exceeds a motion threshold. The system may additionally identify an input command pattern comprising at least one motion pulse preceded and followed by a static period in a time domain. Additionally, the system may identify one or more user inputs based on the input command pattern matching a reference command pattern and generate one or more instructions based on identifying the one or more user inputs.
    Type: Grant
    Filed: April 26, 2022
    Date of Patent: February 4, 2025
    Assignee: Oura Health Oy
    Inventors: Heli Koskimäki, Johanna Still, Mari Karsikas, Alec Singleton, Petteri Lajunen, Henri Huttunen, Veli-Pekka Halme, Marcus Ziade, Janne Kukka
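    Illustrative sketch (not part of the patent): a minimal Python sketch of the input-command detection described in the abstract above: samples above a motion threshold form pulses, and a command is recognized when the pulse count matches a reference pattern that is preceded and followed by static periods. The threshold, reference pattern, and static-window length are assumptions for illustration only.
      MOTION_THRESHOLD = 1.2  # assumed motion-magnitude threshold
      REFERENCE_PULSES = 2    # assumed reference command pattern (e.g. two taps)

      def count_motion_pulses(samples):
          """Count runs of consecutive samples exceeding the motion threshold."""
          pulses, in_pulse = 0, False
          for s in samples:
              if s > MOTION_THRESHOLD and not in_pulse:
                  pulses, in_pulse = pulses + 1, True
              elif s <= MOTION_THRESHOLD:
                  in_pulse = False
          return pulses

      def matches_command(samples, static_len=3):
          """Require quiet leading/trailing windows around the detected pulses."""
          lead, tail = samples[:static_len], samples[-static_len:]
          static_ok = all(s <= MOTION_THRESHOLD for s in lead + tail)
          return static_ok and count_motion_pulses(samples) == REFERENCE_PULSES

      stream = [0.1, 0.2, 0.1, 2.0, 0.3, 1.9, 0.2, 0.1, 0.1]
      print(matches_command(stream))  # True -> generate the mapped instruction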
  • Patent number: 12210687
    Abstract: The disclosure provides gesture recognition methods and apparatuses, and relates to the field of artificial intelligence. One example gesture recognition method includes obtaining an image stream, and determining, based on a plurality of consecutive frames of hand images in the image stream, whether a user makes a preparatory action. When the user makes the preparatory action, the method continues to obtain an image stream, and determines a gesture action of the user based on a plurality of consecutive frames of hand images in the continuously obtained image stream. Next, the method further responds to the gesture action to implement gesture interaction with the user. In this application, to reduce erroneous recognition in a gesture recognition process, the preparatory action is determined before gesture recognition is performed.
    Type: Grant
    Filed: March 8, 2022
    Date of Patent: January 28, 2025
    Assignee: Huawei Technologies Co., Ltd.
    Inventors: Xiaofei Wu, Fei Huang, Songcen Xu, Youliang Yan
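    Illustrative sketch (not part of the patent): a minimal Python sketch of the two-stage flow described in the abstract above: gesture recognition only responds after a preparatory action is detected in the incoming frames, which is what reduces erroneous recognition. Frame classification is stubbed out with string labels; the label names are assumptions for illustration only.
      def recognize(frame_labels):
          """frame_labels are stand-ins for classified hand images, e.g. 'open_palm'."""
          armed = False
          for label in frame_labels:
              if not armed:
                  if label == "open_palm":  # assumed preparatory action
                      armed = True
              elif label in ("swipe_left", "swipe_right", "fist"):
                  return label              # respond to the gesture action
          return None

      # A swipe seen before the preparatory action is ignored; the one after is not.
      print(recognize(["swipe_left", "noise", "open_palm", "noise", "swipe_right"]))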
  • Patent number: 12203771
    Abstract: An information processing device is configured to include: a function allocating unit that allocates a first function to a first peripheral area inside a screen along a first side of the screen and allocates a second function to a second peripheral area inside the screen along a second side of the screen; a touch detection unit that detects a touch operation of a user on the screen; and a function executing unit that, upon detection of a slide operation from outside a peripheral area to inside a peripheral area, implements the function allocated to the peripheral area that is the destination of the slide operation, based on the touch operation detected in that peripheral area.
    Type: Grant
    Filed: April 26, 2023
    Date of Patent: January 21, 2025
    Assignee: FAURECIA CLARION ELECTRONICS CO., LTD.
    Inventor: Takuma Shioguchi
  • Patent number: 12200163
    Abstract: An electronic device with a dynamic wallpaper is provided. The electronic device includes a touch screen and a processor. The processor is coupled to the touch screen. The processor displays a first dynamic wallpaper mode by the touch screen when determining that the electronic device is in a non-lock screen mode.
    Type: Grant
    Filed: March 4, 2022
    Date of Patent: January 14, 2025
    Assignee: ASUSTeK COMPUTER INC.
    Inventors: Tzu-Chuan Weng Huang, Ya-Wen Mei
  • Patent number: 12192612
    Abstract: A shooting method is provided. By implementing the shooting method provided in embodiments of this application, a terminal electronic device such as a smartphone or a smart television can determine to change a shooting mode based on a recognized user gesture. The electronic device can recognize a user's waving or sliding gesture more accurately based on the track of the user's hand position changes across an image frame sequence and on whether the position changes include hand movements in two directions.
    Type: Grant
    Filed: August 26, 2022
    Date of Patent: January 7, 2025
    Assignee: HONOR DEVICE CO., LTD.
    Inventors: Shiyu Zhu, Yonghua Wang
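    Illustrative sketch (not part of the patent): a minimal Python sketch of the wave-versus-slide distinction described in the abstract above: the hand position track across the frame sequence is reduced to per-step horizontal movements, and a track with movements in two directions is treated as a wave while a one-directional track is treated as a slide. The minimum per-step movement is an assumed value.
      MIN_STEP = 5  # assumed minimum per-frame movement (pixels) to count

      def classify_track(xs):
          """xs is the hand's horizontal position in successive image frames."""
          steps = [b - a for a, b in zip(xs, xs[1:]) if abs(b - a) >= MIN_STEP]
          if not steps:
              return "none"
          has_left = any(s < 0 for s in steps)
          has_right = any(s > 0 for s in steps)
          return "wave" if has_left and has_right else "slide"

      print(classify_track([100, 140, 180, 150, 110]))  # wave -> change shooting mode
      print(classify_track([100, 140, 180, 220]))       # slide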
  • Patent number: 12189864
    Abstract: The present invention is an information processing apparatus adapted to a user interface that enables non-contact operation according to a screen displayed on a display. The information processing apparatus includes an input operation identification unit that identifies a predetermined input operation on the basis of a sequence of position information of a specific part of a user in a predetermined spatial coordinate system, and a processing execution unit that executes predetermined processing on the basis of the identified predetermined input operation. The input operation identification unit identifies the predetermined input operation in a case of judging, on the basis of a second sequence of the position information, that the specific part of the user is in a moving state after the specific part of the user is judged, on the basis of a first sequence of the position information, to be in a staying state at a reference position in the predetermined spatial coordinate system.
    Type: Grant
    Filed: March 24, 2021
    Date of Patent: January 7, 2025
    Assignee: SONY SEMICONDUCTOR SOLUTIONS CORPORATION
    Inventors: Suguru Kobayashi, Ryuichi Omiya, Takahisa Ohgami, Takumi Tsuji
  • Patent number: 12182905
    Abstract: A method, a processing device, and a system for information display are provided, and the system includes a light transmissive display. A first information extraction device extracts spatial position information of a user, and a second information extraction device extracts spatial position information of a target object. The processing device performs the following steps. Display position information of virtual information of the target object on the display is determined according to the spatial position information of the user and the spatial position information of the target object. The display position information includes a first display reference position corresponding to a previous time and a second display reference position corresponding to a current time. An actual display position of the virtual information on the display corresponding to the current time is determined according to a distance between the first display reference position and the second display reference position.
    Type: Grant
    Filed: September 7, 2022
    Date of Patent: December 31, 2024
    Assignee: Industrial Technology Research Institute
    Inventors: Yi-Wei Luo, Jian-Lung Chen, Ting-Hsun Cheng, Yu-Ju Chao, Yu-Hsin Lin
  • Patent number: 12183103
    Abstract: Systems and methods for transferring data via a finger tracking smart device from a first user interface (“UI”) to a second UI are provided. The data transferred may be documentation data, including signatures and corrections. The finger tracking smart device may include one or more smart lenses. Methods may include triggering a tracking of the movement of the user's fingers on the first UI and further tracking a start point and an end point of the movement of the user's fingers based on detection of deliberate movements and gestures. Methods may further include capturing a segment of data within the start point of movement and the end point of movement and storing the segment of data in memory on the finger tracking smart device. Methods may further include updating the second UI based on an instruction in a data packet transmitted to the second UI, by inputting the segment of data at a point of movement of the user's fingers on the second UI.
    Type: Grant
    Filed: May 23, 2023
    Date of Patent: December 31, 2024
    Assignee: Bank of America Corporation
    Inventors: Jennifer Sanctis, Mary Bangs, Veronica Andrea Cadavid, Taylor Farris, Trish Gillis, Jesse James Godley, Brian Meyers, Vishwas Korde
  • Patent number: 12163368
    Abstract: A vehicle operation detection device includes a storage unit configured to store a trained model obtained by machine learning using training data in which an image captured in advance and a body part used for a gesture of a user are associated with each other, an entry determination unit configured to determine whether, based on a position of the body part in the image obtained by inputting a newly captured image into the trained model, the body part enters a recognition area set in an imaging area of a camera, and a gesture determination unit configured to calculate a displacement vector of the body part based on images captured at a time interval after it is determined that the body part enters the recognition area, and determine, in accordance with whether a direction of the displacement vector is a direction corresponding to the gesture, whether the gesture is made.
    Type: Grant
    Filed: June 1, 2022
    Date of Patent: December 10, 2024
    Assignee: AISIN CORPORATION
    Inventor: Takuya Tamura
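    Illustrative sketch (not part of the patent): a minimal Python sketch of the displacement-vector check described in the abstract above: once the body part has entered the recognition area, its displacement between two captures is compared against the direction associated with the gesture. The gesture direction and angular tolerance are assumptions for illustration only.
      import math

      GESTURE_DIRECTION = (0.0, 1.0)  # assumed unit direction for the gesture
      ANGLE_TOLERANCE_DEG = 30.0      # assumed tolerance around that direction

      def gesture_made(p_earlier, p_later):
          """True if the body part's displacement points along the gesture direction."""
          dx, dy = p_later[0] - p_earlier[0], p_later[1] - p_earlier[1]
          norm = math.hypot(dx, dy)
          if norm == 0.0:
              return False
          cos_sim = (dx * GESTURE_DIRECTION[0] + dy * GESTURE_DIRECTION[1]) / norm
          angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_sim))))
          return angle <= ANGLE_TOLERANCE_DEG

      print(gesture_made((0.0, 0.0), (0.1, 1.0)))   # roughly along the direction -> True
      print(gesture_made((0.0, 0.0), (1.0, -0.2)))  # wrong direction -> False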
  • Patent number: 12164943
    Abstract: In an aspect, a system for dynamic user interface generation includes a computing device configured to capture user interaction data from an engagement module on the user interface, to receive server feedback data, to determine a current user interface state as a function of the captured user interaction data and the received server feedback data, to predict an optimal user interface state as a function of the user interaction data and the server feedback data, to generate an updated display data structure based on the predicted optimal user interface state, and to configure a remote computing device to receive further user interaction data from the user. The computing device then updates the display data structure, and consequently the user interface, as a function of the further user interaction data.
    Type: Grant
    Filed: July 10, 2023
    Date of Patent: December 10, 2024
    Assignee: THE LONDON OSTEOPOROSIS CLINIC LIMITED
    Inventor: Taher Mahmud
  • Patent number: 12164743
    Abstract: A portable electronic device having a touch screen with a floating soft trigger icon for enabling various functions of the electronic device, such as bar code reading, capturing RFID data, capturing video and images, calling applications, and/or placing phone calls. The floating trigger icon is displayed on the touch screen to enable easy identification and access of the trigger icon. The trigger icon may be selected via application of various unique control gestures to configure the electronic device. Based on the selected mode or function of the device, the trigger icon may alter its appearance to facilitate use of the device. The operation and functionality of the trigger icon may be programmed to customize operation of the device.
    Type: Grant
    Filed: January 30, 2023
    Date of Patent: December 10, 2024
    Assignee: Datalogic USA, Inc.
    Inventors: Elva Martinez Ballesteros, Thomas Burke
  • Patent number: 12153571
    Abstract: This document describes techniques and devices for a radar recognition-aided search. Through use of a radar-based recognition system, gestures made by, and physiological information about, persons can be determined. In the case of physiological information, the techniques can use this information to refine a search. For example, if a person requests a search for a coffee shop, the techniques may refine the search to coffee shops in the direction that the person is walking. In the case of a gesture, the techniques may refine or base a search solely on the gesture. Thus, a search for information about a store, car, or tree can be made responsive to a gesture pointing at the store, car, or tree with or without explicit entry of a search query.
    Type: Grant
    Filed: October 26, 2023
    Date of Patent: November 26, 2024
    Assignee: Google LLC
    Inventors: Ivan Poupyrev, Gaetano Roberto Aiello
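    Illustrative sketch (not part of the patent): a minimal Python sketch of the search refinement described in the abstract above: radar-derived context (a walking heading or a pointing gesture) is folded into the query before it is issued. The refinement rules below are assumptions for illustration, not the patent's actual logic.
      def refine_search(query, heading=None, pointed_at=None):
          """Refine a search query using gesture or physiological context."""
          if pointed_at is not None:
              # Base the search on the gesture target ("what are the hours of that store?").
              return f"{pointed_at} {query}".strip()
          if heading is not None:
              # Bias location-sensitive queries toward the direction of travel.
              return f"{query} to the {heading} of me"
          return query

      print(refine_search("coffee shop", heading="north"))
      print(refine_search("opening hours", pointed_at="Maple Street Bookstore"))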
  • Patent number: 12147611
    Abstract: In a method for sensing a displacement of a pointing device, such as a mouse, the pointing device includes at least one light source configured to illuminate a surface, at least one first secondary photodetector, at least one second secondary photodetector, and at least one primary photodetector. Each individual value of the photodetectors is weighted and compared so as to sense the displacement of the pointing device.
    Type: Grant
    Filed: December 14, 2021
    Date of Patent: November 19, 2024
    Assignee: EM Microelectronic-Marin SA
    Inventors: Sylvain Grosjean, Jérémy Schlachter
  • Patent number: 12141362
    Abstract: A text entry process for an Augmented Reality (AR) system. The AR system detects, using one or more cameras of the AR system, a start text entry gesture made by a user of the AR system. During text entry, the AR system detects, using the one or more cameras, a symbol corresponding to a fingerspelling sign made by the user. The AR system generates entered text data based on the symbol and provides text in a text scene component of an AR overlay provided by the AR system to the user based on the entered text data.
    Type: Grant
    Filed: April 27, 2022
    Date of Patent: November 12, 2024
    Assignee: Snap Inc.
    Inventors: Austin Vaday, Rebecca Jean Lee, Jennica Pounds
  • Patent number: 12137259
    Abstract: A system is configured to enhance a video feed in real time. A live video feed captured by a video capturing device is received. A presentation of the live video feed on one or more client devices is enhanced. The enhancing includes causing a first content item of a plurality of content items to be displayed at a first location within the presentation of the live video feed. Based on a detecting of a first instance of a first gesture made by a hand at the first location in the live video feed, a content item manipulation mode with respect to the first content item is entered. The entering of the content item manipulation mode with respect to the first content item includes at least one of causing the first content item to be moved within the presentation of the live video feed based on a movement of the hand or causing a scale of the first content item to be changed within the presentation of the live video feed based on a detecting of a second gesture made by the hand.
    Type: Grant
    Filed: May 16, 2023
    Date of Patent: November 5, 2024
    Assignee: Prezi, Inc.
    Inventors: Adam Somlai-Fischer, Zsuzsa Weiner, Dániel Varga
  • Patent number: 12131010
    Abstract: Disclosed herein are clutch and boom features that can enable the manipulation of user interface elements when using a touch-sensitive component to build or otherwise design a graphical display, such as a website, video game, magazine layout, etc. Upon touching the user interface element to be manipulated, the user interface element can be targeted for manipulation. In response to a clutch user interface element being engaged, the targeted user interface element can be enabled for manipulation (e.g., colored, rotated, moved, etc.) by the user, while the non-targeted user interface elements are protected from manipulation. Boom is an example of manipulation functionality provided by some embodiments, which can be configured to move the targeted user interface element a precise amount (e.g., pixel-by-pixel).
    Type: Grant
    Filed: January 17, 2022
    Date of Patent: October 29, 2024
    Assignee: Newman Infinite, Inc.
    Inventor: Matthew Allan Newman
  • Patent number: 12125137
    Abstract: Exemplary embodiments include an intelligent secure networked architecture configured by at least one processor to execute instructions stored in memory, the architecture comprising a data retention system and a machine learning system, a web services layer providing access to the data retention and machine learning systems, an application server layer that provides a user-facing application that accesses the data retention and machine learning systems through the web services layer and performs processing based on user interaction with an interactive graphical user interface provided by the user-facing application, the user-facing application configured to execute instructions for a method for room labeling for activity tracking and detection, the method including making a 2D sketch of a first room on an interactive graphical user interface, and using machine learning to turn the 2D sketch of the first room into a 3D model of the first room.
    Type: Grant
    Filed: May 11, 2021
    Date of Patent: October 22, 2024
    Assignee: Electronic Caregiver, Inc.
    Inventors: Judah Tveito, Bryan John Chasko, Hannah S. Rich
  • Patent number: 12124543
    Abstract: A permission configuration method includes: receiving a first input performed by a user on a first object and a second object, where the first object is an object that indicates a first application on a first interface, and the second object is an object that indicates a second application or a target function on a second interface; and displaying, on the first interface in response to the first input, a target permission set used to configure a permission for the first application, where the target permission set is an intersection set between a first permission set and a second permission set, the first permission set is a permission set of the first application, and the second permission set is a permission set of the second application or the target function; and the first interface is different from the second interface.
    Type: Grant
    Filed: September 28, 2021
    Date of Patent: October 22, 2024
    Assignee: VIVO MOBILE COMMUNICATION CO., LTD.
    Inventor: Shaoling Liu
  • Patent number: 12120261
    Abstract: An incoming call processing method includes: in a case that an electronic device is connected to an external device, receiving an incoming call answering request; displaying an incoming call screen, where the incoming call screen includes a first answering mode icon and a second answering mode icon; and answering an incoming call through the external device if a first input performed on the first answering mode icon is received, and answering the incoming call through the electronic device if a second input performed on the second answering mode icon is received.
    Type: Grant
    Filed: February 15, 2022
    Date of Patent: October 15, 2024
    Assignee: VIVO MOBILE COMMUNICATION CO., LTD.
    Inventor: Chuxin Wang
  • Patent number: 12115935
    Abstract: A wireless interface for detecting an authorized access of a user to a keyless access system of a transportation vehicle. The part of the wireless interface provided for the wireless communication is detachably connected to the part of the wireless interface containing the electronics to improve the diversity of installation and mounting of the wireless interface.
    Type: Grant
    Filed: May 1, 2023
    Date of Patent: October 15, 2024
    Assignee: VOLKSWAGEN AKTIENGESELLSCHAFT
    Inventor: Bernd Ette
  • Patent number: 12111973
    Abstract: The present disclosure provides AR systems and methods. The computer-implemented method comprises displaying, on a mobile device, a rendered virtual object in an augmented reality (AR) scene, and detecting, using an input device of the mobile device, a gesture having a detected speed. The method further includes identifying the gesture, wherein the gesture is identified as a first command to implement a first function related to the virtual object responsive to the detected speed of the gesture being less than a speed threshold, and the gesture is identified as a second command to implement a second function related to the virtual object responsive to the detected speed of the gesture being greater than the speed threshold. The second command and second function are different from the first command and the first function, respectively. The identified gesture is then processed.
    Type: Grant
    Filed: September 2, 2022
    Date of Patent: October 8, 2024
    Assignee: SHOPIFY INC.
    Inventors: Brennan Letkeman, Bradley Joseph Aldridge
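    Illustrative sketch (not part of the patent): a minimal Python sketch of the speed-threshold dispatch described in the abstract above: the same detected gesture maps to different commands on the virtual object depending on whether its speed is below or above a threshold. The threshold value and command names are assumptions for illustration only.
      SPEED_THRESHOLD = 400.0  # assumed threshold, e.g. pixels per second

      def identify_command(gesture, detected_speed):
          """Map a gesture plus its detected speed to a command on the virtual object."""
          if gesture == "swipe":
              # Slow swipe -> first command/function; fast swipe -> second.
              if detected_speed < SPEED_THRESHOLD:
                  return "rotate_object"
              return "dismiss_object"
          return "noop"

      print(identify_command("swipe", 120.0))  # rotate_object
      print(identify_command("swipe", 900.0))  # dismiss_object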
  • Patent number: 12099772
    Abstract: The present disclosure generally relates to engaging in cross device interactions. The method includes at a first device with a first display, while a second device having a second display is placed over a first region of the first display, detecting, via input devices of the first device, a first input. In response to detecting the first input and in accordance with a determination that the first input occurred while focus was directed to the second device, the method includes causing a response to the first input to be displayed on the second display. In response to detecting the first input and in accordance with a determination that the first input occurred while focus was directed to the first device, the method includes displaying, on the first display, a response to the first input without causing a response to the first input to be displayed on the second display.
    Type: Grant
    Filed: August 22, 2022
    Date of Patent: September 24, 2024
    Assignee: Apple Inc.
    Inventors: Tianjia Sun, Chang Zhang, Paul X. Wang, Aaron Wang
  • Patent number: 12099586
    Abstract: The present disclosure generally relates to methods and user interfaces for authentication, including providing and controlling authentication at a computer system using an external device in accordance with some embodiments.
    Type: Grant
    Filed: January 28, 2022
    Date of Patent: September 24, 2024
    Assignee: Apple Inc.
    Inventors: Grant R. Paul, Benjamin Biron, Kyle C. Brogle, Naresh Kumar Chinnathanbi Kailasam, Brent M. Ledvina, Robert W. Mayor, Nicole M. Wells
  • Patent number: 12093516
    Abstract: The present invention relates to an interaction method using eye tracking data, used in a human-computer interface system with at least a screen configured to display at least user contents and an eye tracking device to collect eye tracking data. The method comprises the following steps: displaying selection keys on the screen when there is a selecting interaction control signal of the user reflecting that the user intends to automatically change the user contents being displayed on the screen; defining the selection key which the user wants to input, based on at least the eye tracking data, when the user inputs the selection key by eye gaze; and automatically changing the user contents in a predefined changing manner corresponding to the selection key inputted by the eye gaze of the user, wherein said predefined changing manner is selected from a group including at least scroll down, scroll up, scroll right, scroll left, zoom in, zoom out, turn the pages, move the pages, and insert new contents.
    Type: Grant
    Filed: August 31, 2023
    Date of Patent: September 17, 2024
    Inventors: Ha Thanh Le, Duyen Thi Ngo, Bong Thanh Nguyen
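    Illustrative sketch (not part of the patent): a minimal Python sketch of the gaze-driven content change described in the abstract above: selection keys are shown, the key the user is gazing at is resolved from the eye tracking data, and the corresponding predefined change is applied to the displayed content. The key layout, dwell handling, and change amounts are assumptions for illustration only.
      # Hypothetical on-screen key regions: name -> (x0, y0, x1, y1) in pixels.
      KEYS = {
          "scroll_down": (0, 400, 100, 500),
          "scroll_up": (0, 0, 100, 100),
          "zoom_in": (500, 0, 600, 100),
      }

      def key_at_gaze(gaze):
          """Return the selection key whose region contains the gaze point, if any."""
          gx, gy = gaze
          for name, (x0, y0, x1, y1) in KEYS.items():
              if x0 <= gx <= x1 and y0 <= gy <= y1:
                  return name
          return None

      def apply_change(key, scroll_offset, zoom):
          """Apply the predefined changing manner for the inputted selection key."""
          if key == "scroll_down":
              return scroll_offset + 100, zoom
          if key == "scroll_up":
              return scroll_offset - 100, zoom
          if key == "zoom_in":
              return scroll_offset, zoom * 1.25
          return scroll_offset, zoom

      selected = key_at_gaze((50, 450))  # gaze rests on the scroll-down key
      print(selected, apply_change(selected, 0, 1.0))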
  • Patent number: 12093523
    Abstract: In some embodiments, an electronic device receives handwritten inputs in text entry fields and converts the handwritten inputs into font-based text. In some embodiments, an electronic device selects and deletes text based on inputs from a stylus. In some embodiments, an electronic device inserts text into pre-existing text based on inputs from a stylus. In some embodiments, an electronic device manages the timing of converting handwritten inputs into font-based text. In some embodiments, an electronic device presents a handwritten entry menu. In some embodiments, an electronic device controls the characteristic of handwritten inputs based on selections on the handwritten entry menu. In some embodiments, an electronic device presents autocomplete suggestions. In some embodiments, an electronic device converts handwritten input to font-based text. In some embodiments, an electronic device displays options in a content entry palette.
    Type: Grant
    Filed: May 6, 2020
    Date of Patent: September 17, 2024
    Assignee: Apple Inc.
    Inventors: Julian Missig, Matan Stauber, Guillaume Ardaud, Jeffrey Traer Bernstein, Marisa Rei Lu, Christopher D. Soli
  • Patent number: 12088755
    Abstract: Techniques for displaying relevant user interface objects when a device is placed into a viewing position are disclosed. The device can update its display in response to a user approaching a vehicle. Display updates can be based on an arrangement of user interface information for unlocking the vehicle.
    Type: Grant
    Filed: April 25, 2022
    Date of Patent: September 10, 2024
    Assignee: Apple Inc.
    Inventors: Gary Ian Butcher, Imran Chaudhri, Jonathan R. Dascola, Alan C. Dye, Christopher Patrick Foss, Daniel C. Gross, Chanaka G. Karunamuni, Stephen O. Lemay, Natalia Maric, Christopher Wilson, Lawrence Y. Yang
  • Patent number: 12086359
    Abstract: Embodiments of the present disclosure provide an electronic apparatus and a data processing method. The electronic apparatus includes a first sensor (101) configured to collect a distance sensing parameter, and a second sensor (102) configured to collect a touch control sensing parameter. A touch control area of the second sensor (102) covers a specific area of a first surface of the electronic apparatus. The first sensor collects the distance sensing parameter in a space above the touch control sensing area. The distance sensing parameter may be collected in the space above the touch control sensing area by arranging a distance sensor. Thus, the electronic apparatus may realize both a touch control input and an input through distance, which enriches the input methods of the electronic apparatus and improves the user experience.
    Type: Grant
    Filed: May 28, 2020
    Date of Patent: September 10, 2024
    Assignee: LENOVO (BEIJING) LIMITED
    Inventors: Ying Gao, Xiaoren Cheng, Zhou Yu
  • Patent number: 12086535
    Abstract: A template built by a user may be converted by a Server Script Generation Engine (SSGE) into script code. In converting, the SSGE may load and parse a framework file containing static script syntax to locate insertion points, each associated with an iteration number, and may iteratively parse the template, utilizing the iteration number to resolve, in order, tags and sub-tags contained in the template. If a tag is set to respond to the iteration number, a function of the tag is invoked to process any related sub-tags and return a script associated therewith at the appropriate insertion point. The framework file (with the appropriate script code inserted) is compiled and stored in a compiled script object which can be run multiple times to perform all of the output functions expected by the user in lieu of the need to reconvert the template.
    Type: Grant
    Filed: July 6, 2023
    Date of Patent: September 10, 2024
    Assignee: OPEN TEXT SA ULC
    Inventor: Gregory R. Petti