Gesture-based Patents (Class 715/863)
-
Patent number: 12246859
Abstract: Disclosed are an apparatus and method for controlling an aerial vehicle. The apparatus includes a camera that obtains a terrain image during flight, a sensor that obtains scan information of a landing point of the aerial vehicle, and a controller that estimates a location of the landing point based on the terrain image if it is determined that the landing point is recognized in the terrain image, determines whether an obstacle exists at the landing point based on the scan information, and determines whether landing of the aerial vehicle is possible based on a determination result.
Type: Grant
Filed: November 10, 2022
Date of Patent: March 11, 2025
Assignees: Hyundai Motor Company, Kia Corporation
Inventors: Hyun Jee Ryu, Kyu Nam Kim, Chang Hyun Sung, Jun Young Lim
-
Patent number: 12248654
Abstract: Systems and methods are described for generating and interacting with an extended reality user interface. For example, a system may detect a launch of an extended reality application on a computing device. The system may generate, in a first virtual zone of a plurality of virtual zones of an extended reality user interface, a first plurality of selectable elements. The system may identify, by a camera, a surface in a real world environment of the computing device and map the plurality of virtual zones to a plurality of physical zones on the surface. The system may detect, by the camera, a first gesture over a first physical zone of the plurality of physical zones. In response to determining that the first physical zone corresponds to the first virtual zone, the system may execute a first interaction action on the first plurality of selectable elements based on the first gesture.
Type: Grant
Filed: February 16, 2024
Date of Patent: March 11, 2025
Assignee: ISOVIST LIMITED
Inventors: Dale Alan Herigstad, Jack Turpin, Eric Fanghanel Santibanez
-
Patent number: 12248527
Abstract: A computer-implemented method comprises displaying a grid comprising a plurality of cells; receiving user input modifying a state of one or more of the plurality of cells to create a graphical shape in the grid, wherein each of the plurality of cells is limited to one of a plurality of predefined states; requesting an internet protocol (IP) address corresponding to the graphical shape; and in response to receiving the IP address corresponding to the graphical shape, retrieving a web resource located at the IP address.
Type: Grant
Filed: November 11, 2022
Date of Patent: March 11, 2025
Assignee: International Business Machines Corporation
Inventor: Venkataramana Logasundaram Jaganathan
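As a rough illustration of the idea in this abstract, a shape drawn on a grid of binary cells could be encoded into an IPv4 address. The abstract does not specify the shape-to-address mapping, so the bit-per-cell encoding below is purely a hypothetical sketch:

```python
def grid_to_ipv4(grid):
    """grid: 32 cells (e.g. a 4x8 grid, flattened row by row), each 0 or 1.

    Interprets the drawn shape as the 32 bits of an IPv4 address, eight
    cells per octet. This encoding is an illustrative assumption; the
    patent leaves the actual shape-to-address mapping unspecified.
    """
    assert len(grid) == 32, "expects exactly 32 binary cells"
    octets = []
    for i in range(0, 32, 8):
        value = 0
        for bit in grid[i:i + 8]:       # fold 8 cells into one octet
            value = (value << 1) | bit
        octets.append(str(value))
    return ".".join(octets)
```

For example, a grid whose first row encodes the bits of 192 and 168 would resolve toward a private address such as `192.168.0.1`.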
-
Patent number: 12243238
Abstract: The technology disclosed performs hand pose estimation on a so-called "joint-by-joint" basis. So, when a plurality of estimates for the 28 hand joints are received from a plurality of expert networks (and from master experts in some high-confidence scenarios), the estimates are analyzed at a joint level and a final location for each joint is calculated based on the plurality of estimates for a particular joint. This is a novel solution discovered by the technology disclosed because nothing in the field of art determines hand pose estimates at such granularity and precision. Regarding granularity and precision, because hand pose estimates are computed on a joint-by-joint basis, this allows the technology disclosed to detect in real time even the minutest and most subtle hand movements, such as a bend/yaw/tilt/roll of a segment of a finger or a tilt of an occluded finger, as demonstrated supra in the Experimental Results section of this application.
Type: Grant
Filed: July 20, 2023
Date of Patent: March 4, 2025
Assignee: ULTRAHAPTICS IP TWO LIMITED
Inventors: Jonathan Marsden, Raffi Bedikian, David Samuel Holz
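The per-joint fusion this abstract describes can be sketched as follows: each expert network proposes a 3D location for every joint, and a final location is computed independently per joint from those proposals. The confidence-weighted averaging used here is an illustrative assumption, not necessarily the patented method:

```python
def fuse_joint_estimates(estimates):
    """Fuse hand pose estimates joint by joint.

    estimates: list of (joints, confidence) pairs, where joints is a list
    of (x, y, z) tuples, one per hand joint, from one expert network.
    Returns one fused (x, y, z) per joint (hypothetical weighted average).
    """
    n_joints = len(estimates[0][0])
    total_w = sum(conf for _, conf in estimates)
    fused = []
    for j in range(n_joints):  # each joint is fused independently
        x = sum(joints[j][0] * conf for joints, conf in estimates) / total_w
        y = sum(joints[j][1] * conf for joints, conf in estimates) / total_w
        z = sum(joints[j][2] * conf for joints, conf in estimates) / total_w
        fused.append((x, y, z))
    return fused
```

The point of the joint-level granularity is that an expert that is wrong about one finger does not drag down the estimate for the others.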
-
Patent number: 12236528
Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured using image information or the equivalents. Predictive information can be improved by applying techniques for correlating with information from observations.
Type: Grant
Filed: September 30, 2022
Date of Patent: February 25, 2025
Assignee: Ultrahaptics IP Two Limited
Inventors: Kevin A. Horowitz, David S. Holz
-
Patent number: 12236006
Abstract: An image forming device stores a play area where a user wearing a head mounted display is movable during play of an application in a space around the user. The image forming device causes the head mounted display to display an image indicating the stored play area. The image forming device receives an operation performed by the user to edit the play area. The image forming device expands or reduces the play area according to the operation performed by the user.
Type: Grant
Filed: February 14, 2022
Date of Patent: February 25, 2025
Assignee: SONY INTERACTIVE ENTERTAINMENT INC.
Inventors: Yuto Hayakawa, Shoi Yonetomi, Yurika Mulase, Masanori Nomura
-
Patent number: 12238092
Abstract: A system and method provide automatic access to applications or data. A portable physical device, referred to herein as a Personal Digital Key or "PDK", stores one or more profiles in memory, including a biometric profile acquired in a secure trusted process and uniquely associated with a user that is authorized to use and associated with the PDK. The PDK wirelessly transmits identification information including a unique PDK identification number, the biometric profile and a profile over a secure wireless channel to a reader. A computing device is coupled to the reader. An auto login server is coupled to the reader and the computing device and launches one or more applications associated with a user name identified by the received profile.
Type: Grant
Filed: June 10, 2021
Date of Patent: February 25, 2025
Assignee: Proxense, LLC
Inventor: John J. Giobbi
-
Patent number: 12236145
Abstract: A display apparatus displays handwritten data that is input with an inputter. The display apparatus includes processing circuitry; and a memory storing computer-executable instructions that cause the processing circuitry to convert the handwritten data into a character string; and conceal the character string generated by converting the handwritten data, in response to determining that a first time has elapsed after the character string is displayed.
Type: Grant
Filed: August 30, 2021
Date of Patent: February 25, 2025
Assignee: RICOH COMPANY, LTD.
Inventor: Kiyoshi Kasatani
-
Patent number: 12229397
Abstract: Techniques for a controller graphical user interface based on interaction data are described and are implementable to generate a controller graphical user interface for display by a first device to control digital content displayed on a second device. For instance, an application such as a mobile gaming application is displayed on a mobile device and digital content from the application is communicated for display by a display device as part of a content connectivity session. One or more control regions of the application are determined based on a monitored interaction with a user interface of the mobile device. One or more control graphics are identified that correlate to the one or more control regions. A controller graphical user interface is generated that displays the one or more control graphics and filters out extraneous digital content. The controller graphical user interface is then displayed by the mobile device.
Type: Grant
Filed: February 8, 2023
Date of Patent: February 18, 2025
Assignee: Motorola Mobility LLC
Inventors: Amit Kumar Agrawal, Satyabrata Rout, J Amarnath, Gokula Ramanan
-
Patent number: 12223163
Abstract: An interaction method, an interaction apparatus, an electronic device, and a storage medium are provided. The interaction method includes: displaying a target object in a first display mode on a target object display interface, where in the first display mode, the target object display interface includes a first interactive control; receiving a display mode switching operation on the target object display interface; and in response to the display mode switching, displaying the target object in a second display mode on the target object display interface, where in the second display mode, the target object display interface includes a second interactive control different from the first interactive control.
Type: Grant
Filed: December 21, 2023
Date of Patent: February 11, 2025
Assignee: BEIJING ZITIAO NETWORK TECHNOLOGY CO., LTD.
Inventors: Weiyi Chang, Ziyang Zheng
-
Patent number: 12219239
Abstract: A method of controlling an electronic device by recognizing movement of an object includes obtaining at least one image including an image of the object; dividing the obtained at least one image into a middle zone and a peripheral zone; extracting one or more feature points of the object that are within the peripheral zone; recognizing movement of the object based on the extracted one or more feature points; and controlling the electronic device based on the recognized movement.
Type: Grant
Filed: October 5, 2021
Date of Patent: February 4, 2025
Assignee: SAMSUNG ELECTRONICS CO., LTD.
Inventors: Yevhenii Yakishyn, Svitlana Alkhimova
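The middle/peripheral split this abstract relies on can be sketched as a simple geometric test: a centered "middle zone" is carved out of the frame, and feature points falling outside it belong to the peripheral zone. The margin ratio below is an illustrative assumption; the patent does not fix the zone sizes:

```python
def in_peripheral_zone(point, width, height, margin_ratio=0.25):
    """True when a feature point lies in the peripheral band of a frame,
    i.e. outside a centered middle zone.

    The middle zone here is the rectangle left after trimming a margin of
    margin_ratio * width (resp. height) from each side; the 0.25 default
    is a hypothetical choice, not taken from the patent.
    """
    x, y = point
    mx, my = width * margin_ratio, height * margin_ratio
    in_middle = (mx <= x <= width - mx) and (my <= y <= height - my)
    return not in_middle
```

Feature extraction would then run only on points for which this predicate is true, which is what lets the method react to objects entering from the edges of the frame.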
-
Patent number: 12216881
Abstract: A method and device for operating an application icon are provided, to prompt a user where a currently operated application icon is located. The method includes: displaying a first screen page of a desktop, wherein the first screen page comprises N application icons, the N application icons comprise a first application icon and at least one first-layer application icon, and the first-layer application icon is located around the first application icon and adjacent to the first application icon; and the first-layer application icon comprises a foreground layer and a background layer; and receiving an operation of a user related to the first application icon; and in response to the operation, displaying the at least one first-layer application icon in a first display manner, wherein the first display manner comprises that a foreground layer of the at least one first-layer application icon moves relative to the first application icon.
Type: Grant
Filed: June 6, 2023
Date of Patent: February 4, 2025
Assignee: Huawei Technologies Co., Ltd.
Inventor: Lang Song
-
Patent number: 12216829
Abstract: Methods, systems, and devices for device-inputted commands are described. A system may receive physiological data, including motion data, associated with a user from a wearable device worn by the user and may detect multiple motion pulses based on the motion data, where each motion pulse includes motion data that exceeds a motion threshold. The system may additionally identify an input command pattern comprising at least one motion pulse preceded and followed by a static period in a time domain. Additionally, the system may identify one or more user inputs based on the input command pattern matching a reference command pattern and generate one or more instructions based on identifying the one or more user inputs.
Type: Grant
Filed: April 26, 2022
Date of Patent: February 4, 2025
Assignee: Oura Health Oy
Inventors: Heli Koskimäki, Johanna Still, Mari Karsikas, Alec Singleton, Petteri Lajunen, Henri Huttunen, Veli-Pekka Halme, Marcus Ziade, Janne Kukka
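The motion-pulse detection step this abstract describes reduces to finding contiguous runs of samples above a threshold. A minimal sketch, assuming the motion data arrives as a list of scalar magnitudes:

```python
def detect_motion_pulses(samples, threshold):
    """Return (start, end) index pairs for each contiguous run of samples
    whose motion magnitude exceeds the threshold.

    Each returned pair is one candidate 'motion pulse' in the sense of the
    abstract; checking for static periods before and after each pulse
    would be a separate pattern-matching pass.
    """
    pulses, start = [], None
    for i, value in enumerate(samples):
        if value > threshold and start is None:
            start = i                      # pulse begins
        elif value <= threshold and start is not None:
            pulses.append((start, i - 1))  # pulse ends
            start = None
    if start is not None:                  # pulse runs to end of data
        pulses.append((start, len(samples) - 1))
    return pulses
```

A downstream matcher would then compare the pulse count and the gaps between pulses against a reference command pattern (e.g. "two taps separated by stillness").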
-
Patent number: 12210687
Abstract: The disclosure provides gesture recognition methods and apparatuses, which relate to the field of artificial intelligence. One example gesture recognition method includes obtaining an image stream, and determining, based on a plurality of consecutive frames of hand images in the image stream, whether a user makes a preparatory action. When the user makes the preparatory action, continuing to obtain an image stream, and determining a gesture action of the user based on a plurality of consecutive frames of hand images in the continuously obtained image stream. Next, further responding to the gesture action to implement gesture interaction with the user. In this application, to reduce erroneous recognition in the gesture recognition process, the preparatory action is determined before gesture recognition is performed.
Type: Grant
Filed: March 8, 2022
Date of Patent: January 28, 2025
Assignee: Huawei Technologies Co., Ltd.
Inventors: Xiaofei Wu, Fei Huang, Songcen Xu, Youliang Yan
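The two-stage structure described here (gesture classification runs only after a preparatory action is seen) is essentially a small state machine over the frame stream. A sketch, where both detector callbacks are hypothetical placeholders for the networks the patent describes:

```python
def gesture_state_machine(frames, is_preparatory, classify_gesture):
    """Two-stage gesture recognition over a frame stream.

    is_preparatory(frame) -> bool and classify_gesture(frame) -> gesture
    or None are assumed detector callbacks (e.g. backed by hand-image
    models); only after a preparatory action arms the machine does
    gesture classification run on subsequent frames.
    """
    armed = False
    for frame in frames:
        if not armed:
            if is_preparatory(frame):
                armed = True          # preparatory action detected
        else:
            gesture = classify_gesture(frame)
            if gesture is not None:
                return gesture        # respond to the recognized gesture
    return None                       # no (armed) gesture in this stream
```

Gating on the preparatory action is what suppresses false positives: a swipe-like motion with no preceding preparatory pose is simply never classified.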
-
Patent number: 12203771
Abstract: An information processing device is configured to include: a function allocating unit that allocates a first function to a first peripheral area inside a screen along a first side of the screen and allocates a second function to a second peripheral area inside the screen along a second side of the screen; a touch detection unit that detects a touch operation of a user on the screen; and a function executing unit that, upon detection of a slide operation from outside a peripheral area to inside a peripheral area, executes the function allocated to the peripheral area that is the slide operation destination, based on the touch operation detected in that peripheral area.
Type: Grant
Filed: April 26, 2023
Date of Patent: January 21, 2025
Assignee: FAURECIA CLARION ELECTRONICS CO., LTD.
Inventor: Takuma Shioguchi
-
Patent number: 12200163
Abstract: An electronic device with a dynamic wallpaper is provided. The electronic device includes a touch screen and a processor. The processor is coupled to the touch screen. The processor displays a first dynamic wallpaper mode by the touch screen when determining that the electronic device is in a non-lock screen mode.
Type: Grant
Filed: March 4, 2022
Date of Patent: January 14, 2025
Assignee: ASUSTeK COMPUTER INC.
Inventors: Tzu-Chuan Weng Huang, Ya-Wen Mei
-
Patent number: 12192612
Abstract: A shooting method is provided. By implementing the shooting method provided in embodiments of this application, a terminal electronic device such as a smartphone or a smart television can determine to change a shooting mode based on a recognized user gesture. The electronic device can recognize a user's waving or sliding gesture more accurately based on the user's hand position change track in an image frame sequence and whether there are hand movements in two directions in the position changes.
Type: Grant
Filed: August 26, 2022
Date of Patent: January 7, 2025
Assignee: HONOR DEVICE CO., LTD.
Inventors: Shiyu Zhu, Yonghua Wang
-
Patent number: 12189864
Abstract: The present invention is an information processing apparatus adapted to a user interface that enables non-contact operation according to a screen displayed on a display. The information processing apparatus includes an input operation identification unit that identifies a predetermined input operation on the basis of a sequence of position information of a specific part of a user in a predetermined spatial coordinate system, and a processing execution unit that executes predetermined processing on the basis of the identified predetermined input operation. The input operation identification unit identifies the predetermined input operation in a case of judging, on the basis of a second sequence of the position information, that the specific part of the user is in a moving state after the specific part of the user is in a staying state at a reference position in the predetermined spatial coordinate system on the basis of a first sequence of the position information.
Type: Grant
Filed: March 24, 2021
Date of Patent: January 7, 2025
Assignee: SONY SEMICONDUCTOR SOLUTIONS CORPORATION
Inventors: Suguru Kobayashi, Ryuichi Omiya, Takahisa Ohgami, Takumi Tsuji
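The stay-then-move pattern this abstract describes can be sketched as two checks over a position sequence: the tracked body part must first dwell near a reference position, and only afterward move away from it. The frame counts and distance thresholds below are illustrative assumptions:

```python
import math

def detect_stay_then_move(positions, stay_frames=5, stay_radius=2.0,
                          move_dist=10.0):
    """True when the sequence of 2D positions shows a staying state
    (stay_frames consecutive samples within stay_radius of a reference
    position) followed by a moving state (a later sample at least
    move_dist away from that reference). All thresholds are hypothetical.
    """
    for i in range(len(positions) - stay_frames):
        ref = positions[i]
        window = positions[i:i + stay_frames]
        if all(math.dist(ref, p) <= stay_radius for p in window):
            # staying state established at reference position `ref`;
            # now look for a subsequent moving state
            for p in positions[i + stay_frames:]:
                if math.dist(ref, p) >= move_dist:
                    return True
    return False
```

Requiring the dwell first plays the same role as the preparatory action in patent 12210687 above: incidental hand motion with no preceding pause never triggers the operation.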
-
Patent number: 12183103
Abstract: Systems and methods for transferring data via a finger tracking smart device from a first user interface ("UI") to a second UI are provided. The data transferred may be documentation data, including signatures and corrections. The finger tracking smart device may include one or more smart lenses. Methods may include triggering a tracking of the movement of the user's fingers on the first UI and further tracking a start point and an end point of the movement of the user's fingers based on detection of deliberate movements and gestures. Methods may further include capturing a segment of data within the start point of movement and the end point of movement and storing the segment of data in memory on the finger tracking smart device. Methods may further include updating the second UI based on an instruction in a data packet transmitted to the second UI, by inputting the segment of data at a point of movement of the user's fingers on the second UI.
Type: Grant
Filed: May 23, 2023
Date of Patent: December 31, 2024
Assignee: Bank of America Corporation
Inventors: Jennifer Sanctis, Mary Bangs, Veronica Andrea Cadavid, Taylor Farris, Trish Gillis, Jesse James Godley, Brian Meyers, Vishwas Korde
-
Patent number: 12182905
Abstract: A method, a processing device, and a system for information display are provided, and the system includes a light transmissive display. A first information extraction device extracts spatial position information of a user, and a second information extraction device extracts spatial position information of a target object. The processing device performs the following steps. Display position information of virtual information of the target object on the display is determined according to the spatial position information of the user and the spatial position information of the target object. The display position information includes a first display reference position corresponding to a previous time and a second display reference position corresponding to a current time. An actual display position of the virtual information on the display corresponding to the current time is determined according to a distance between the first display reference position and the second display reference position.
Type: Grant
Filed: September 7, 2022
Date of Patent: December 31, 2024
Assignee: Industrial Technology Research Institute
Inventors: Yi-Wei Luo, Jian-Lung Chen, Ting-Hsun Cheng, Yu-Ju Chao, Yu-Hsin Lin
-
Patent number: 12164743
Abstract: A portable electronic device having a touch screen with a floating soft trigger icon for enabling various functions of the electronic device, such as bar code reading, capturing RFID data, capturing video and images, calling applications, and/or placing phone calls. The floating trigger icon is displayed on the touch screen to enable easy identification and access of the trigger icon. The trigger icon may be selected via application of various unique control gestures to configure the electronic device. Based on the selected mode or function of the device, the trigger icon may alter its appearance to facilitate use of the device. The operation and functionality of the trigger icon may be programmed to customize operation of the device.
Type: Grant
Filed: January 30, 2023
Date of Patent: December 10, 2024
Assignee: Datalogic USA, Inc.
Inventors: Elva Martinez Ballesteros, Thomas Burke
-
Patent number: 12164943
Abstract: In an aspect, a system for dynamic user interface generation includes a computing device configured to capture user interaction data from an engagement module on the user interface, to receive server feedback data, to determine a current user interface state as a function of the captured user interaction data and the received server feedback data, to predict an optimal user interface state as a function of the user interaction data and the server feedback data, to generate an updated display data structure based on the predicted optimal user interface state, and to configure a remote computing device to receive further user interaction data from the user. The computing device then updates the display data structure and consequently the user interface as a function of the further user interaction data.
Type: Grant
Filed: July 10, 2023
Date of Patent: December 10, 2024
Assignee: THE LONDON OSTEOPOROSIS CLINIC LIMITED
Inventor: Taher Mahmud
-
Patent number: 12163368
Abstract: A vehicle operation detection device includes a storage unit configured to store a trained model obtained by machine learning using training data in which an image captured in advance and a body part used for a gesture of a user are associated with each other, an entry determination unit configured to determine whether, based on a position of the body part in the image obtained by inputting a newly captured image into the trained model, the body part enters a recognition area set in an imaging area of a camera, and a gesture determination unit configured to calculate a displacement vector of the body part based on images captured at a time interval after it is determined that the body part enters the recognition area, and determine, in accordance with whether a direction of the displacement vector is a direction corresponding to the gesture, whether the gesture is made.
Type: Grant
Filed: June 1, 2022
Date of Patent: December 10, 2024
Assignee: AISIN CORPORATION
Inventor: Takuya Tamura
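The displacement-vector test in this abstract amounts to computing the body part's movement between two captures and checking whether its direction is close enough to the gesture's expected direction. A sketch using cosine similarity, where the similarity threshold is an illustrative assumption:

```python
import math

def matches_gesture_direction(p_prev, p_curr, gesture_dir,
                              cos_threshold=0.9):
    """Compute the displacement vector of a tracked body part between two
    frames and accept it when its direction is close enough to the
    gesture's expected direction (cosine similarity test).

    p_prev, p_curr: (x, y) positions at the two capture times.
    gesture_dir: expected direction, e.g. (1, 0) for a rightward swipe.
    The 0.9 cosine threshold is a hypothetical tuning choice.
    """
    dx, dy = p_curr[0] - p_prev[0], p_curr[1] - p_prev[1]
    mag = math.hypot(dx, dy)
    gmag = math.hypot(*gesture_dir)
    if mag == 0 or gmag == 0:
        return False  # no movement, or degenerate expected direction
    cos = (dx * gesture_dir[0] + dy * gesture_dir[1]) / (mag * gmag)
    return cos >= cos_threshold
```

A perpendicular or backward movement yields a low (or negative) cosine and is rejected, which is how the direction check filters out incidental motion near, say, a vehicle's tailgate sensor.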
-
Patent number: 12153571
Abstract: This document describes techniques and devices for a radar recognition-aided search. Through use of a radar-based recognition system, gestures made by, and physiological information about, persons can be determined. In the case of physiological information, the techniques can use this information to refine a search. For example, if a person requests a search for a coffee shop, the techniques may refine the search to coffee shops in the direction that the person is walking. In the case of a gesture, the techniques may refine or base a search solely on the gesture. Thus, a search for information about a store, car, or tree can be made responsive to a gesture pointing at the store, car, or tree with or without explicit entry of a search query.
Type: Grant
Filed: October 26, 2023
Date of Patent: November 26, 2024
Assignee: Google LLC
Inventors: Ivan Poupyrev, Gaetano Roberto Aiello
-
Patent number: 12147611
Abstract: In a method for sensing a displacement of a pointing device, like a mouse, said pointing device includes at least one light source configured to illuminate a surface, at least one first secondary photodetector, at least one second secondary photodetector, and at least one primary photodetector. Each individual value of the photodetectors is weighted and compared such as to sense said displacement of the pointing device.
Type: Grant
Filed: December 14, 2021
Date of Patent: November 19, 2024
Assignee: EM Microelectronic-Marin SA
Inventors: Sylvain Grosjean, Jérémy Schlachter
-
Patent number: 12141362
Abstract: A text entry process for an Augmented Reality (AR) system. The AR system detects, using one or more cameras of the AR system, a start text entry gesture made by a user of the AR system. During text entry, the AR system detects, using the one or more cameras, a symbol corresponding to a fingerspelling sign made by the user. The AR system generates entered text data based on the symbol and provides text in a text scene component of an AR overlay provided by the AR system to the user based on the entered text data.
Type: Grant
Filed: April 27, 2022
Date of Patent: November 12, 2024
Assignee: Snap Inc.
Inventors: Austin Vaday, Rebecca Jean Lee, Jennica Pounds
-
Patent number: 12137259
Abstract: A system is configured to enhance a video feed in real time. A live video feed captured by a video capturing device is received. A presentation of the live video feed on one or more client devices is enhanced. The enhancing includes causing a first content item of a plurality of content items to be displayed at a first location within the presentation of the live video feed. Based on a detecting of a first instance of a first gesture made by a hand at the first location in the live video feed, a content item manipulation mode with respect to the first content item is entered. The entering of the content item manipulation mode with respect to the first content item includes at least one of causing the first content item to be moved within the presentation of the live video feed based on a movement of the hand or causing a scale of the first content item to be changed within the presentation of the live video feed based on a detecting of a second gesture made by the hand.
Type: Grant
Filed: May 16, 2023
Date of Patent: November 5, 2024
Assignee: Prezi, Inc.
Inventors: Adam Somlai-Fischer, Zsuzsa Weiner, Dániel Varga
-
Patent number: 12131010
Abstract: Disclosed herein are clutch and boom features that can enable the manipulation of user interface elements when using a touch-sensitive component to build or otherwise design a graphical display, such as a website, video game, magazine layout, etc. Upon touching a user interface element to be manipulated, the user interface element can be targeted for manipulation. In response to a clutch user interface element being engaged, the targeted user interface element can be enabled for manipulation (e.g., colored, rotated, moved, etc.) by the user, while the non-targeted user interface elements are protected from manipulation. Boom is an example of manipulation functionality provided by some embodiments, which can be configured to move the targeted user interface element a precise amount (e.g., pixel-by-pixel).
Type: Grant
Filed: January 17, 2022
Date of Patent: October 29, 2024
Assignee: Newman Infinite, Inc.
Inventor: Matthew Allan Newman
-
Patent number: 12124543
Abstract: A permission configuration method includes: receiving a first input performed by a user on a first object and a second object, where the first object is an object that indicates a first application on a first interface, and the second object is an object that indicates a second application or a target function on a second interface; and displaying, on the first interface in response to the first input, a target permission set used to configure a permission for the first application, where the target permission set is an intersection set between a first permission set and a second permission set, the first permission set is a permission set of the first application, and the second permission set is a permission set of the second application or the target function; and the first interface is different from the second interface.
Type: Grant
Filed: September 28, 2021
Date of Patent: October 22, 2024
Assignee: VIVO MOBILE COMMUNICATION CO., LTD.
Inventor: Shaoling Liu
-
Patent number: 12125137
Abstract: Exemplary embodiments include an intelligent secure networked architecture configured by at least one processor to execute instructions stored in memory, the architecture comprising a data retention system and a machine learning system, a web services layer providing access to the data retention and machine learning systems, an application server layer that provides a user-facing application that accesses the data retention and machine learning systems through the web services layer and performs processing based on user interaction with an interactive graphical user interface provided by the user-facing application, the user-facing application configured to execute instructions for a method for room labeling for activity tracking and detection, the method including making a 2D sketch of a first room on an interactive graphical user interface, and using machine learning to turn the 2D sketch of the first room into a 3D model of the first room.
Type: Grant
Filed: May 11, 2021
Date of Patent: October 22, 2024
Assignee: Electronic Caregiver, Inc.
Inventors: Judah Tveito, Bryan John Chasko, Hannah S. Rich
-
Patent number: 12115935
Abstract: A wireless interface for detecting an authorized access of a user to a keyless access system of a transportation vehicle. The part of the wireless interface provided for the wireless communication is detachably connected to the part of the wireless interface containing the electronics to improve the diversity of installation and mounting of the wireless interface.
Type: Grant
Filed: May 1, 2023
Date of Patent: October 15, 2024
Assignee: VOLKSWAGEN AKTIENGESELLSCHAFT
Inventor: Bernd Ette
-
Patent number: 12120261
Abstract: An incoming call processing method includes: in a case that an electronic device is connected to an external device, receiving an incoming call answering request; displaying an incoming call screen, where the incoming call screen includes a first answering mode icon and a second answering mode icon; and answering an incoming call through the external device if a first input performed on the first answering mode icon is received, and answering the incoming call through the electronic device if a second input performed on the second answering mode icon is received.
Type: Grant
Filed: February 15, 2022
Date of Patent: October 15, 2024
Assignee: VIVO MOBILE COMMUNICATION CO., LTD.
Inventor: Chuxin Wang
-
Patent number: 12111973
Abstract: The present disclosure provides AR systems and methods. The computer-implemented method comprises displaying, on a mobile device, a rendered virtual object in an augmented reality (AR) scene, and detecting, using an input device of the mobile device, a gesture having a detected speed. The method further includes identifying the gesture, wherein the gesture is identified as a first command to implement a first function related to the virtual object responsive to the detected speed of the gesture being less than a speed threshold, and the gesture is identified as a second command to implement a second function related to the virtual object responsive to the detected speed of the gesture being greater than the speed threshold. The second command and second function are different from the first command and the first function, respectively. The identified gesture is then processed.
Type: Grant
Filed: September 2, 2022
Date of Patent: October 8, 2024
Assignee: SHOPIFY INC.
Inventors: Brennan Letkeman, Bradley Joseph Aldridge
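The speed-threshold dispatch this abstract describes is a one-branch classifier: the same physical gesture maps to one of two commands depending on how fast it was performed. A sketch where the threshold value and command names are illustrative assumptions (the abstract names neither):

```python
def classify_gesture_command(speed, speed_threshold=0.5):
    """Map one detected gesture to one of two commands based on its
    measured speed, per the abstract's scheme.

    speed: detected gesture speed (units per second; normalization is an
    implementation detail). The 0.5 threshold and both command names are
    hypothetical placeholders.
    """
    if speed < speed_threshold:
        return "first_command"   # slow gesture -> first function
    return "second_command"      # fast gesture -> second function
```

In practice the threshold would be tuned (or personalized) so that a deliberate drag and a quick flick on the same virtual object land reliably on opposite sides of it.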
-
Patent number: 12099586
Abstract: The present disclosure generally relates to methods and user interfaces for authentication, including providing and controlling authentication at a computer system using an external device in accordance with some embodiments.
Type: Grant
Filed: January 28, 2022
Date of Patent: September 24, 2024
Assignee: Apple Inc.
Inventors: Grant R. Paul, Benjamin Biron, Kyle C. Brogle, Naresh Kumar Chinnathanbi Kailasam, Brent M. Ledvina, Robert W. Mayor, Nicole M. Wells
-
Patent number: 12099772
Abstract: The present disclosure generally relates to engaging in cross device interactions. The method includes, at a first device with a first display, while a second device having a second display is placed over a first region of the first display, detecting, via input devices of the first device, a first input. In response to detecting the first input and in accordance with a determination that the first input occurred while focus was directed to the second device, the method includes causing a response to the first input to be displayed on the second display. In response to detecting the first input and in accordance with a determination that the first input occurred while focus was directed to the first device, the method includes displaying, on the first display, a response to the first input without causing a response to the first input to be displayed on the second display.
Type: Grant
Filed: August 22, 2022
Date of Patent: September 24, 2024
Assignee: Apple Inc.
Inventors: Tianjia Sun, Chang Zhang, Paul X. Wang, Aaron Wang
-
Patent number: 12093523
Abstract: In some embodiments, an electronic device receives handwritten inputs in text entry fields and converts the handwritten inputs into font-based text. In some embodiments, an electronic device selects and deletes text based on inputs from a stylus. In some embodiments, an electronic device inserts text into pre-existing text based on inputs from a stylus. In some embodiments, an electronic device manages the timing of converting handwritten inputs into font-based text. In some embodiments, an electronic device presents a handwritten entry menu. In some embodiments, an electronic device controls the characteristic of handwritten inputs based on selections on the handwritten entry menu. In some embodiments, an electronic device presents autocomplete suggestions. In some embodiments, an electronic device converts handwritten input to font-based text. In some embodiments, an electronic device displays options in a content entry palette.
Type: Grant
Filed: May 6, 2020
Date of Patent: September 17, 2024
Assignee: Apple Inc.
Inventors: Julian Missig, Matan Stauber, Guillaume Ardaud, Jeffrey Traer Bernstein, Marisa Rei Lu, Christopher D. Soli
-
Patent number: 12093516
Abstract: The present invention relates to an interaction method using eye tracking data, used in a human-computer interface system with at least a screen configured to display user contents and an eye tracking device to collect eye tracking data. The method comprises the following steps: displaying selection keys on the screen when an interaction control signal reflects that the user intends to automatically change the user contents being displayed on the screen; identifying, based on at least the eye tracking data, the selection key the user inputs by eye gaze; and automatically changing the user contents in a predefined changing manner corresponding to the selection key inputted by the eye gaze of the user, wherein said predefined changing manner is selected from a group including at least scroll down, scroll up, scroll right, scroll left, zoom in, zoom out, turn the pages, move the pages, and insert new contents.
Type: Grant
Filed: August 31, 2023
Date of Patent: September 17, 2024
Inventors: Ha Thanh Le, Duyen Thi Ngo, Bong Thanh Nguyen
-
Patent number: 12086535
Abstract: A template built by a user may be converted by a Server Script Generation Engine (SSGE) into script code. In converting, the SSGE may load and parse a framework file containing static script syntax to locate insertion points, each associated with an iteration number, and may iteratively parse the template, utilizing the iteration number to resolve, in order, tags and sub-tags contained in the template. If a tag is set to respond to the iteration number, a function of the tag is invoked to process any related sub-tags and return a script associated therewith at the appropriate insertion point. The framework file (with the appropriate script code inserted) is compiled and stored in a compiled script object which can be run multiple times to perform all of the output functions expected by the user in lieu of the need to reconvert the template.
Type: Grant
Filed: July 6, 2023
Date of Patent: September 10, 2024
Assignee: OPEN TEXT SA ULC
Inventor: Gregory R. Petti
-
Patent number: 12086901
Abstract: The present disclosure relates to systems, methods, and non-transitory computer readable media for generating painted digital images utilizing an intelligent painting process that includes progressive layering, sequential brushstroke guidance, and/or brushstroke regularization. For example, the disclosed systems utilize an image painting model to perform progressive layering to generate and apply digital brushstrokes in a progressive fashion for different layers associated with a background canvas and foreground objects. In addition, the disclosed systems utilize sequential brushstroke guidance to generate painted foreground objects by sequentially shifting through attention windows for regions of interest in a target digital image. Furthermore, the disclosed systems utilize brushstroke regularization to generate and apply an efficient brushstroke sequence to generate a painted digital image.
Type: Grant
Filed: March 29, 2022
Date of Patent: September 10, 2024
Assignee: Adobe Inc.
Inventors: Jaskirat Singh, Jose Ignacio Echevarria Vallespi, Cameron Smith
-
Patent number: 12088755
Abstract: Techniques for displaying relevant user interface objects when a device is placed into a viewing position are disclosed. The device can update its display in response to a user approaching a vehicle. Display updates can be based on an arrangement of user interface information for unlocking the vehicle.
Type: Grant
Filed: April 25, 2022
Date of Patent: September 10, 2024
Assignee: Apple Inc.
Inventors: Gary Ian Butcher, Imran Chaudhri, Jonathan R. Dascola, Alan C. Dye, Christopher Patrick Foss, Daniel C. Gross, Chanaka G. Karunamuni, Stephen O. Lemay, Natalia Maric, Christopher Wilson, Lawrence Y. Yang
-
Patent number: 12086359
Abstract: Embodiments of the present disclosure provide an electronic apparatus and a data processing method. The electronic apparatus includes a first sensor (101) configured to collect a distance sensing parameter, and a second sensor (102) configured to collect a touch control sensing parameter. A touch control area of the second sensor (102) covers a specific area of a first surface of the electronic apparatus. The first sensor collects the distance sensing parameter in the space above the touch control sensing area. By arranging a distance sensor, the distance sensing parameter may be collected in the space above the touch control sensing area. Thus, the electronic apparatus supports both touch input and distance-based input, which enriches the input manners of the electronic apparatus and improves the user experience.
Type: Grant
Filed: May 28, 2020
Date of Patent: September 10, 2024
Assignee: LENOVO (BEIJING) LIMITED
Inventors: Ying Gao, Xiaoren Cheng, Zhou Yu
-
Patent number: 12079457
Abstract: Methods, devices, and processor-readable media for scaling up graphical user interface (GUI) elements on a smart watch touchscreen based on the initial position of a stroke gesture, thereby facilitating selection of a target GUI element by a user. The user only needs to perform one stroke action to select the desired target. During the procedure, GUI elements of the main screen are scaled up to make target selection easier, and the GUI elements are automatically reset to their original size when the procedure is complete. To achieve this result, a “stroke with initial position” gesture is used to select a target.
Type: Grant
Filed: August 24, 2022
Date of Patent: September 3, 2024
Assignee: Huawei Technologies Canada Co., Ltd.
Inventors: Zhe Liu, Qiang Xu, Hanaë Rateau, Damien Masson
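One way the scale-up step described above might be realized is to grow each element's bounding box away from the stroke's initial touch point, so nearby targets spread apart under the finger. This is an illustrative sketch only; the function name, the rectangle representation, and the geometry are assumptions, not taken from the patent.

```python
def scale_elements(elements, origin, factor):
    """Scale each element's (x, y, w, h) bounding box about the stroke's
    initial position `origin`, so targets spread apart for easier selection.
    Hypothetical helper; not the patented algorithm itself."""
    ox, oy = origin
    scaled = []
    for (x, y, w, h) in elements:
        # Move each rect's corner away from the origin and enlarge it,
        # so elements farther from the touch point shift proportionally more.
        nx = ox + (x - ox) * factor
        ny = oy + (y - oy) * factor
        scaled.append((nx, ny, w * factor, h * factor))
    return scaled
```

Resetting to the original layout when the stroke completes is then just redrawing the unscaled element list.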
-
Patent number: 12073075
Abstract: A system, a method, and a computer program product for providing wearable continuous blood glucose monitoring. In some embodiments, there is provided a method that includes receiving, at a smartwatch, an alert representative of a glucose state of a host-patient coupled to a glucose sensor; detecting, at the smartwatch, a predetermined action indicative of a request to generate a glance view providing an indication of the glucose state of the host-patient; and presenting, at the smartwatch and in response to the detecting, the glance view providing the indication of the glucose state of the host-patient.
Type: Grant
Filed: May 13, 2020
Date of Patent: August 27, 2024
Assignee: Dexcom, Inc.
Inventors: Naresh C. Bhavaraju, Eric Cohen, Arturo Garcia, Katherine Yerre Koehler, Michael Robert Mensinger, Eli Reihman, Brian Christopher Smith, Peter Hedlund, Esteban Cabrera, Jr.
-
Patent number: 12073027
Abstract: Implementations are directed to receiving a first set of images included in a first video captured by a camera that monitors a human performing a task; processing the first set of images using a first machine learning (ML) model to determine whether the first set of images depicts a gesture that is included in a predefined set of gestures; in response to determining that the first set of images depicts a gesture included in the predefined set of gestures, processing a second set of images included in the first video using a second ML model to determine a first gesture type of the gesture; comparing the first gesture type with a first expected gesture type to determine whether performance of the task conforms to a standard operating procedure (SOP) for the task; and providing feedback representative of a comparison result in a user interface.
Type: Grant
Filed: December 20, 2022
Date of Patent: August 27, 2024
Assignee: ACCENTURE GLOBAL SOLUTIONS LIMITED
Inventors: Keyu Qi, Hailing Zhou, Nan Ke, David Nguyen, Binghao Tang
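The two-stage pipeline this abstract describes — first detect whether any known gesture is present, then classify which gesture it is and check it against the SOP's expected order — can be sketched as below. The function names `detect_gesture` and `classify_gesture` stand in for the first and second ML models and are hypothetical; so is the sample SOP sequence.

```python
def check_sop(frame_sets, detect_gesture, classify_gesture, expected):
    """Run the two-stage check: gate each frame set through a detector,
    classify the detected gestures, and compare each against the next
    expected gesture type in the SOP. Returns (gesture, conforms) pairs.
    A sketch of the described pipeline, not the patented implementation."""
    results = []
    step = 0
    for frames in frame_sets:
        # Stage 1: does this set of images depict any gesture from the
        # predefined set? If not, skip classification entirely.
        if not detect_gesture(frames):
            continue
        # Stage 2: determine the gesture type with the second model.
        gesture = classify_gesture(frames)
        # Compare against the expected gesture type at this SOP step.
        conforms = step < len(expected) and gesture == expected[step]
        results.append((gesture, conforms))
        step += 1
    return results
```

Feedback in a UI would then highlight each `(gesture, False)` pair as a deviation from the SOP.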
-
Patent number: 12067553
Abstract: A method of determining the proximity of an antenna located in a payment instrument to an antenna located within an electronic device may include receiving, at the antenna located within the electronic device, a first signal from the antenna located in the payment instrument, the first signal received at a first time; receiving, at the antenna located within the electronic device, a second signal from the antenna located in the payment instrument, the second signal received at a second time, the second time being later than the first time; determining a difference in signal strength between the first and second signals; and displaying, on a display of the electronic device, an indication based on the determination.
Type: Grant
Filed: September 29, 2022
Date of Patent: August 20, 2024
Assignee: Worldpay Limited
Inventors: Jonathan Stewart Vokes, Nicholas Telford-Reed
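The core of the method above is comparing signal strength across two reads: a rising strength suggests the payment instrument is approaching, a falling one that it is receding. A minimal sketch follows; the function name, the dBm units, and the dead-band threshold are illustrative assumptions, not details from the patent.

```python
def proximity_trend(rssi_first, rssi_second, threshold_db=2.0):
    """Infer the payment instrument's movement from the change in received
    signal strength (in dBm) between two reads taken at successive times.
    A small dead band avoids flip-flopping on measurement noise."""
    delta = rssi_second - rssi_first
    if delta > threshold_db:
        return "approaching"   # signal grew stronger between the two reads
    if delta < -threshold_db:
        return "receding"      # signal grew weaker between the two reads
    return "steady"            # change within the noise threshold
```

The displayed indication could then be as simple as an arrow or a "move card closer" prompt keyed off the returned trend.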
-
Patent number: 12067345
Abstract: The embodiments of the present disclosure relate to a table displaying method, device, and medium, wherein the method includes: determining a displaying mode of a table on a current interface; and, in response to a browse triggering operation on the table, displaying table information matched with the displaying size of the current interface according to the displaying mode.
Type: Grant
Filed: October 19, 2023
Date of Patent: August 20, 2024
Assignee: Beijing Zitiao Network Technology Co., Ltd.
Inventors: Shiqi Wan, Hongxiao Xin
-
Patent number: 12061915
Abstract: An electronic device displays one or more views of a software application with a plurality of gesture recognizers, including at least one discrete gesture recognizer configured to send a single action message in response to a respective gesture, and at least one continuous gesture recognizer configured to send action messages at successive recognized sub-events of a respective recognized gesture. The device detects one or more events and processes each event using one or more of the gesture recognizers, including: processing the respective event at a respective gesture recognizer in accordance with a respective gesture definition corresponding to the respective gesture recognizer, and conditionally sending one or more respective action messages to the software application in accordance with an outcome of the processing of the respective event. The device executes the software application in accordance with one or more action messages received from one or more of the gesture recognizers.
Type: Grant
Filed: July 6, 2020
Date of Patent: August 13, 2024
Assignee: APPLE INC.
Inventors: Joshua H. Shaffer, Bradford Allen Moore, Jason Clay Beaver
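The distinction the abstract draws — a discrete recognizer emits one action message when its whole gesture definition matches, while a continuous recognizer emits a message for every recognized sub-event — can be sketched as below. The class and method names are hypothetical stand-ins, not the patented API.

```python
class DiscreteRecognizer:
    """Matches a fixed event sequence; sends a single action message
    only when the full gesture definition is satisfied."""
    def __init__(self, sequence, action):
        self.sequence, self.action, self._seen = sequence, action, []

    def feed(self, event):
        self._seen.append(event)
        if self._seen == self.sequence:      # full gesture recognized
            self._seen = []
            return [self.action]             # exactly one action message
        if self.sequence[:len(self._seen)] != self._seen:
            self._seen = []                  # definition violated: reset
        return []

class ContinuousRecognizer:
    """Sends an action message for every recognized sub-event."""
    def __init__(self, allowed, action):
        self.allowed, self.action = allowed, action

    def feed(self, event):
        return [f"{self.action}:{event}"] if event in self.allowed else []

def dispatch(recognizers, events):
    """Process each event at every recognizer; collect the action
    messages each one conditionally emits."""
    messages = []
    for ev in events:
        for r in recognizers:
            messages.extend(r.feed(ev))
    return messages
```

The application then acts on the collected messages, e.g. a single "tap" versus a stream of "pan" updates.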
-
Patent number: 12056287
Abstract: A system having a gesture sensor is provided. The gesture sensor includes an image sensing unit and a processing unit. The image sensing unit captures at least one gesture image of a user. The processing unit is electrically connected to the image sensing unit. The processing unit sends at least one control command to a control valve of the system according to the gesture image to change a status of the flow.
Type: Grant
Filed: June 30, 2023
Date of Patent: August 6, 2024
Assignee: PIXART IMAGING INC.
Inventors: Chung-Yuo Wu, Yi-Hsien Ko, Nientse Chen
-
Patent number: 12045449
Abstract: The present disclosure generally relates to user interfaces for managing user contributions and activity for shared files and folders. With respect to a document, at least a portion of the document is presented. User input in the form of a request to shift the document in a first direction is received. In response to the request passing a predetermined boundary, indications of content contributions attributable to different users are displayed. With respect to one or more digital files, a user input component is presented and is associated with the one or more digital files. User input is received via the user input component. In response to the received user input, an update activity panel is presented which shows multiple activities attributable to different users.
Type: Grant
Filed: June 6, 2022
Date of Patent: July 23, 2024
Assignee: Apple Inc.
Inventors: Evan S. Torchin, Allen W. Lucas, Jesse W. Bunch, Joseph J. Stelmach, Markus Hagele, Steffen Ryll
-
Patent number: 12036091
Abstract: Disclosed is a 3D scanning system and a method for using such a system, where a handheld scanner of the 3D scanning system is used to control the operation mode of the system, so that the user does not need to engage an external unit, e.g., during a scanning procedure in which the teeth in a patient's upper and lower jaws are scanned.
Type: Grant
Filed: March 19, 2018
Date of Patent: July 16, 2024
Assignee: 3SHAPE A/S
Inventor: Kristian Evers Hansen