Gesture-based Patents (Class 715/863)
-
Patent number: 11964882
Abstract: An IoT-based system for measurement of the contamination distribution of contaminated groundwater through real-time monitoring of the contamination degree of a contaminated groundwater well, for control of a contaminated groundwater purification device and prediction of a purification period based on the measurement result.
Type: Grant
Filed: December 12, 2019
Date of Patent: April 23, 2024
Assignee: HYORIM CO., LTD.
Inventors: Sung Kook Cho, Seong Ghui Cho, Myeong Gwang Oh, Sang Hwan Lee
-
Patent number: 11964200
Abstract: In a particular implementation, a user environment space for haptic feedback and interactivity (HapSpace) is proposed. In one embodiment, the HapSpace is a virtual space attached to the user and is defined by the maximum distance that the user's body can reach. The HapSpace may move as the user moves. Haptic objects and haptic devices, and the associated haptic properties, may also be defined within the HapSpace. New descriptors, such as those that enable precise locations of, and links between, the user and haptic objects/devices, are defined for describing the HapSpace.
Type: Grant
Filed: July 7, 2016
Date of Patent: April 23, 2024
Assignee: InterDigital CE Patent Holdings, SAS
Inventors: Philippe Guillotel, Fabien Danieau, Julien Fleureau, Didier Doyen
-
Patent number: 11966516
Abstract: Methods and systems for gesture-based control of a device are described. A virtual gesture-space is determined in a received input frame. The virtual gesture-space is associated with a primary user from a ranked user list of users. The received input frame is processed in only the virtual gesture-space, to detect and track a hand. Using a hand bounding box generated by detecting and tracking the hand, gesture classification is performed to determine a gesture input associated with the hand. A command input associated with the determined gesture input is processed. The device may be a smart television, a smart phone, a tablet, etc.
Type: Grant
Filed: May 30, 2022
Date of Patent: April 23, 2024
Assignee: HUAWEI TECHNOLOGIES CO., LTD.
Inventors: Juwei Lu, Sayem Mohammad Siam, Wei Zhou, Peng Dai, Xiaofei Wu, Songcen Xu
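The pipeline this abstract describes (rank users, restrict the search to a virtual gesture-space around the primary user, detect a hand, classify its gesture, map to a command) can be sketched as below. All function names, the ranking criterion, and the margin value are illustrative assumptions, not APIs from the patent.

```python
# Sketch of a gesture-control pipeline: crop to a virtual gesture-space
# around the primary user, then detect -> classify -> dispatch.
# Detector/classifier are passed in as callables (assumed interfaces).

def rank_users(detected_users):
    """Pick a primary user; here, largest detection wins (assumption)."""
    return sorted(detected_users, key=lambda u: u["size"], reverse=True)

def virtual_gesture_space(user, margin=1.5):
    """A region around the primary user's bounding box, scaled by a margin."""
    x, y, w, h = user["bbox"]
    return (x - w * (margin - 1) / 2, y, w * margin, h * margin)

def process_frame(frame, detected_users, detect_hand, classify_gesture, commands):
    """Process only the virtual gesture-space, then map gesture -> command."""
    primary = rank_users(detected_users)[0]
    space = virtual_gesture_space(primary)
    hand_bbox = detect_hand(frame, space)          # restricted search region
    if hand_bbox is None:
        return None                                # no hand, no command
    gesture = classify_gesture(frame, hand_bbox)   # e.g. "swipe_left"
    return commands.get(gesture)                   # e.g. "previous_channel"
```

Restricting detection to the gesture-space is the point of the design: the per-frame cost of hand detection scales with the searched area, and ranking users first resolves which person's gestures count.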
-
Patent number: 11954936
Abstract: A cross-correlation system includes control circuitry that obtains first sensor data of a first user from a radio detection and ranging system. A first portable device carried by the first user is detected based on the first sensor data of the first user. Second sensor data is obtained from the first portable device based on the detection of the first portable device of the first user. The first sensor data and the second sensor data are cross-correlated to obtain cross-correlated information of the first user. A first gesture specific to the first user is recognized based on the cross-correlated information. A first controllable device is identified from a plurality of controllable devices, along with a first action that is to be executed at the identified first controllable device, based on the first gesture. The identified first controllable device is controlled to execute the first action based on the first gesture.
Type: Grant
Filed: June 24, 2020
Date of Patent: April 9, 2024
Assignee: AR & NS Investment, LLC
Inventor: Alireza Tarighat Mehrabani
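The core step here is attributing a radar-observed motion to a specific person by correlating it against the motion reported by each person's portable device. A minimal sketch, assuming equal-length, time-aligned signals and an illustrative 0.8 threshold (neither value comes from the patent):

```python
# Sketch: match a radar motion track to the wearable whose accelerometer
# trace correlates best with it, attributing the gesture to that user.

def cross_correlate(a, b):
    """Normalized zero-lag correlation of two equal-length signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def attribute_gesture(radar_signal, device_signals, threshold=0.8):
    """Return the device id whose motion best matches the radar track,
    or None if nothing correlates above the threshold."""
    best_id, best_score = None, threshold
    for dev_id, sig in device_signals.items():
        score = cross_correlate(radar_signal, sig)
        if score > best_score:
            best_id, best_score = dev_id, score
    return best_id
```

Normalizing the correlation makes the match insensitive to amplitude differences between the radar-derived and device-derived signals, which matters because the two sensors measure motion in different units.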
-
Patent number: 11954320
Abstract: Methods and systems are disclosed for dynamically and automatically modifying a user interface (UI) of an application based on the UI capabilities of a computer device running the application, in particular whether the computer device is touch-enabled or not.
Type: Grant
Filed: April 9, 2018
Date of Patent: April 9, 2024
Assignee: VFA, INC.
Inventors: Vladislav M. Mangeym, Oleg Puzatkin
-
Patent number: 11947751
Abstract: An electronic device that is in communication with a display generation component, and with one or more sensors for detecting the location of an input object, displays a content selection object within selectable content, wherein the content selection object includes a first edge and a second edge. The device detects a first portion of an input by the input object, including detecting the input object at a first hover location that corresponds to the first edge of the content selection object. In response to detecting the first portion of the input, and in accordance with a determination that the first portion of the input meets first criteria requiring that the input object meet proximity criteria with respect to the content selection object, the device changes an appearance of the first edge relative to the second edge of the content selection object to indicate that the first edge will be selected for movement when the input object meets second criteria.
Type: Grant
Filed: March 30, 2023
Date of Patent: April 2, 2024
Assignee: APPLE INC.
Inventors: Mark K. Hauenstein, Jeffrey T. Bernstein, Julian Missig, Marek A. Bereza
-
Patent number: 11947793
Abstract: Provided is a portable terminal including a communication section configured to acquire operational information about a monitoring target device, a display section, a detector configured to detect a gesture motion involving a change in orientation of the portable terminal, and a processor configured to, when the gesture motion is detected while a first screen including an object in accordance with content of the operational information is displayed on the display section, cause a screen in accordance with a type of the gesture motion and a type of the object included in the first screen to be displayed on the display section. When the operational information differs, the object included in the first screen differs and, even when the same gesture motion is detected, a different screen is displayed.
Type: Grant
Filed: January 31, 2022
Date of Patent: April 2, 2024
Assignee: Seiko Epson Corporation
Inventor: Shinichiro Niiyama
-
Patent number: 11941164
Abstract: The disclosure relates to a method, implemented in an electronic device that can be configured in a mono-user mode, where the electronic device is controlled by input elements obtained from a single tracked user, and in a multi-user mode, where the electronic device is controlled by input elements obtained from a group of one or more tracked users. The method comprises: tracking actions of one or more users selected amongst candidate users according to a current mode of the electronic device and to selection requests of the candidate users, the selection requests being captured by at least one sensor coupled to said electronic device; and determining at least one input element adapted for controlling the electronic device according to the tracked actions. The disclosure also relates to a corresponding device, electronic assembly, system, computer-readable program product, and storage medium.
Type: Grant
Filed: March 20, 2020
Date of Patent: March 26, 2024
Assignee: INTERDIGITAL MADISON PATENT HOLDINGS, SAS
Inventors: Sylvain Lelievre, Serge Defrance, Olivier Mocquard
-
Patent number: 11936937
Abstract: In one general aspect, a method can include detecting at least one indicator of user-initiated interaction with a computing device, obtaining data related to a demographic of a user of the computing device, identifying a current state of the computing device, determining that content displayed on a first display device included in the computing device is to be cast to a second display device separate from the computing device based on the at least one indicator of the user-initiated interaction with the computing device, the data related to a demographic of a user of the computing device, and the current state of the computing device, and casting the content displayed on the first display device to the second display device.
Type: Grant
Filed: March 16, 2023
Date of Patent: March 19, 2024
Assignee: Google LLC
Inventors: James Grafton, James Kent
-
Patent number: 11933822
Abstract: A method for estimating actuator parameters for an actuator, in-situ and in real-time, may include driving the actuator with a test signal imperceptible to a user of a device comprising the actuator during real-time operation of the device, measuring a voltage and a current associated with the actuator and caused by the test signal, determining one or more parameters of the actuator based on the voltage and the current, determining an actuator type of the actuator based on the one or more parameters, and controlling a playback signal to the actuator based on the actuator type.
Type: Grant
Filed: January 12, 2022
Date of Patent: March 19, 2024
Assignee: Cirrus Logic Inc.
Inventors: Jorge L. Reynaga, Marco A. Janko, Emmanuel A. Marchais, John L. Melanson
-
Patent number: 11928306
Abstract: Disclosed is a method of receiving and processing navigation inputs executed by one or more processors in a head-worn device system including one or more display devices, one or more cameras, and a generally vertically-arranged touchpad. The method comprises displaying a first carousel of AR effects icons, receiving a first horizontal input on the touchpad, rotating the first carousel of AR effects icons in response to the first horizontal input, receiving a first touch input on the touchpad to select a particular AR effects icon that is in a selection position in the first carousel, displaying a scene viewed by the one or more cameras, the scene being enhanced with AR effects corresponding to the particular AR effects icon, receiving content capture user input, and, in response to the content capture user input, capturing a new content item corresponding to the scene.
Type: Grant
Filed: September 20, 2021
Date of Patent: March 12, 2024
Assignee: SNAP INC.
Inventors: Karen Stolzenberg, David Meisenholder, Mathieu Emmanuel Vignau, Joseph Timothy Fortier, Kaveh Anvaripour, Daniel Moreno, Kyle Goodrich, Ilteris Kaan Canberk, Shin Hwun Kang
-
Patent number: 11921947
Abstract: A touch function setting method is provided. The method comprises: receiving a sequence parameter that includes multiple clicks, each click corresponding to one of the areas of a touch panel or screen; receiving a function parameter corresponding to the sequence parameter, the function parameter corresponding to activation of a function; and storing a group of touch function parameters that includes the sequence parameter and the function parameter.
Type: Grant
Filed: February 18, 2022
Date of Patent: March 5, 2024
Assignee: EGALAX_EMPIA TECHNOLOGY INC.
Inventors: Chin-Fu Chang, Shang-Tai Yeh, Chia-Ling Sun, Jia-Ming Chen
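The stored "group of touch function parameters" amounts to a lookup table keyed by a tap sequence over panel areas. A minimal sketch, with the class and method names invented for illustration:

```python
# Sketch: map a sequence of tapped panel areas to a function to activate.

class TouchFunctionTable:
    def __init__(self):
        self._table = {}

    def store(self, sequence, function):
        """Store a (sequence parameter, function parameter) group.
        `sequence` is an ordered collection of area ids."""
        self._table[tuple(sequence)] = function

    def lookup(self, clicks):
        """Return the function activated by a sequence of area clicks,
        or None if no stored sequence matches."""
        return self._table.get(tuple(clicks))
```

Converting sequences to tuples makes them hashable dictionary keys, so matching a completed click sequence is a single O(1) lookup regardless of how many gestures are registered.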
-
Patent number: 11914788
Abstract: Methods and systems for gesture-based control of a device are described. A virtual gesture-space is determined in a received input frame. The virtual gesture-space is associated with a primary user from a ranked user list of users. The received input frame is processed in only the virtual gesture-space, to detect and track a hand. Using a hand bounding box generated by detecting and tracking the hand, gesture classification is performed to determine a gesture input associated with the hand. A command input associated with the determined gesture input is processed. The device may be a smart television, a smart phone, a tablet, etc.
Type: Grant
Filed: May 30, 2022
Date of Patent: February 27, 2024
Assignee: HUAWEI TECHNOLOGIES CO., LTD.
Inventors: Juwei Lu, Sayem Mohammad Siam, Wei Zhou, Peng Dai, Xiaofei Wu, Songcen Xu
-
Patent number: 11914762
Abstract: A system having at least a first component and a second component positioned at different locations on a user's body (e.g., on the user's head and held in the user's hand). Each component includes at least one inertial measurement unit (IMU) configured to generate measurements indicating acceleration and angular rate data. The generated measurements of the IMUs are used with ground truth information indicating the positions of the first and second components to generate a set of training data to train a neural network configured to predict a relative position between the first and second components based on IMU measurements received over a predetermined time period. Because the neural network is trained based upon movements of a human user, the neural network model takes into account physiological constraints of the user in determining how the set of potential positions of the different components may change over time, reducing potential error.
Type: Grant
Filed: December 17, 2021
Date of Patent: February 27, 2024
Assignee: META PLATFORMS TECHNOLOGIES, LLC
Inventors: Doruk Senkal, Sheng Shen
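The training-data construction step can be sketched as sliding windows over the two synchronized IMU streams, each window labeled with the ground-truth relative position at its end. The field layout, window length, and function name are assumptions for illustration; the network itself (which would regress the label from the features) is omitted:

```python
# Sketch: build (features, label) pairs from two body-worn IMU streams
# and ground-truth relative positions, for training a position predictor.

def make_training_windows(imu_head, imu_hand, ground_truth, window=5):
    """Pair sliding windows of both IMU streams with the relative
    position at the window's end. All three streams are assumed
    time-aligned and of equal length."""
    examples = []
    for t in range(window, len(ground_truth) + 1):
        features = imu_head[t - window:t] + imu_hand[t - window:t]
        label = ground_truth[t - 1]      # relative position, e.g. (x, y, z)
        examples.append((features, label))
    return examples
```

Labeling with the position at the end of the window (rather than its middle) matches the prediction task described: estimate the current relative pose from the measurements received over the preceding time period.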
-
Patent number: 11907741
Abstract: The present disclosure provides a method for remotely controlling a personal computer (PC) from a mobile device. The method includes displaying, by a first input module operating on the mobile device, an input GUI on the mobile device. The first input module interacts with a second input module operating on the PC. The method includes, upon receiving an activation of a virtual input on the input GUI, retrieving a first simulated input that is associated with the virtual input. The method further includes transmitting the first simulated input to the second input module, wherein the second input module is configured to perform the first simulated input on the PC in response to the activation of the virtual input on the input GUI.
Type: Grant
Filed: November 29, 2021
Date of Patent: February 20, 2024
Assignee: Shanghai Dalong Technology Co., Ltd.
Inventors: Zheng Wang, Bingyan Yang, Shuying Liu, Yilei Chai, Meilong Yao
-
Patent number: 11899923
Abstract: An information handling system touchpad includes an application area with a display to present control icons of an application executing on the information handling system, such as camera and microphone icons to control camera and microphone functions of a videoconference application, or a calculator user interface. The touchpad accepts inputs at the control icons with a predetermined touch, such as a tap or a double tap. The touchpad continues to provide cursor touch inputs across the full touch surface, including over the control icons, by applying separate touch logic to isolate control icon inputs, such as isolating finger taps as control icon inputs when control icons are presented.
Type: Grant
Filed: August 31, 2022
Date of Patent: February 13, 2024
Assignee: Dell Products L.P.
Inventors: Barry Paul Easter, Andelon Xuan Tra
-
Patent number: 11893236
Abstract: This application discloses a method of displaying information in a program interface of an application performed by a computer device. The method includes: displaying a virtual keyboard control and an extension bar control in the program interface; in response to an input operation in the virtual keyboard control, displaying at least one character string in the extension bar control, the at least one character string being determined according to the input operation in the virtual keyboard control; and in response to a select operation on a target string among the at least one character string in the extension bar control, displaying a function interface of applying a target function to the target string. This embodiment allows a user to quickly switch between function interfaces when using an application, thereby reducing operation steps of the user and improving human-computer interaction efficiency.
Type: Grant
Filed: November 29, 2022
Date of Patent: February 6, 2024
Assignee: TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED
Inventors: Hao Zhou, Zhe Feng, Yuxuan Zhang, Xiaosi Lai, Xiangyi Feng, Jiayi Ding, Tao Huang, Ge Wang, Chuangmu Yao, Yixiang Fang, Haitao Chen, Jiashuai Shi, Meng Zhao, Qiang Yan, Jianxiong Feng, Cong Jiang, Jiamin Chen, Tianyi Liang, Hongfa Qiu, Huawei Zhang, Heyi Zhang
-
Patent number: 11893204
Abstract: In some examples, a method to present an affordance user interface element within a user interface of an interaction application includes detecting an association of a supplemental media content item with a primary media content item presented within the user interface. The supplemental media content item is identified from among a plurality of supplemental media content items supported by the interaction application. The method may include retrieving metadata related to the supplemental media content item and presenting, within the user interface, a supplementation affordance that presents the metadata. In some examples, the supplementation affordance is user selectable via the user interface to invoke a supplementation function that enables a user to apply the supplemental media content item to a further primary media content item. The supplementation function is invoked responsive to detecting a user selection of the supplementation affordance within the user interface.
Type: Grant
Filed: May 19, 2023
Date of Patent: February 6, 2024
Assignee: Snap Inc.
Inventors: Christie Marie Heikkinen, David Phillip Taitz, Jeremy Baker Voss
-
Patent number: 11893228
Abstract: While concurrently displaying representations of a plurality of recently used applications, including a first application representation that corresponds to a first application and a second application representation that corresponds to a second application, at least a first portion of a first user input, including first movement of an input object, is detected. If the first movement of the input object includes movement in a first direction that meets first criteria, the representations of the plurality of recently used applications cease to be displayed and an application launching user interface is displayed. If the first movement of the input object includes movement in a second direction that meets second criteria, the representations of the plurality of recently used applications cease to be displayed and a first user interface of the first application is displayed in an enhanced-reachability mode and shifted in a predefined direction at least partially off of the display.
Type: Grant
Filed: April 28, 2022
Date of Patent: February 6, 2024
Assignee: APPLE INC.
Inventors: Marcos Alonso Ruiz, Chanaka G. Karunamuni, Brandon M. Walkin, Shubham Kedia
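The direction-dependent dispatch described above can be sketched as follows. Which directions map to which outcome, the travel threshold, and all names are assumptions for illustration; the patent only says "a first direction" and "a second direction":

```python
# Sketch: classify an input object's movement delta into a coarse
# direction, then dispatch to the app launcher or reachability mode.

def classify_swipe(dx, dy, min_travel=50):
    """Map a movement delta to a direction, or None if travel is too
    short to meet any criteria. Screen y grows downward."""
    if abs(dx) < min_travel and abs(dy) < min_travel:
        return None
    if abs(dy) >= abs(dx):
        return "up" if dy < 0 else "down"
    return "left" if dx < 0 else "right"

def handle_app_switcher_input(dx, dy):
    direction = classify_swipe(dx, dy)
    if direction == "up":        # assumed first direction / first criteria
        return "show_app_launcher"
    if direction == "down":      # assumed second direction / second criteria
        return "enhanced_reachability"
    return "keep_switcher"       # criteria not met: stay in the switcher
```

Comparing |dx| against |dy| keeps diagonal movements unambiguous: whichever axis dominates decides the direction, so the two criteria can never both fire for one input.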
-
Patent number: 11893167
Abstract: An information processing device controls a display for displaying a guide that instructs a user to perform each of a plurality of patterns of motions, including translational motions or rotational motions of an inertial sensor, that correspond to a first axis and a second axis which are perpendicular to each other. Based on inertial information and position and orientation information acquired by causing the inertial sensor to perform the plurality of patterns of motions, the information processing device acquires parameters including a degree of correlation between actual inertial values of the inertial sensor corresponding to the first axis and the inertial information corresponding to the second axis.
Type: Grant
Filed: May 15, 2023
Date of Patent: February 6, 2024
Assignee: CANON KABUSHIKI KAISHA
Inventor: Yu Okano
-
Patent number: 11886647
Abstract: An in-air gesture control method based on visible light signals transmits light signals through a display device, collects the light signals reflected by the hand, and realizes gesture recognition after analysis; the transmitted light signals are in the visible light band. The method enables gesture recognition and control on mobile devices without modifying hardware, providing a visible light-based gesture recognition system on commercial mobile devices. Compared with existing gesture recognition methods on commercial mobile devices, the invention avoids special hardware modification, that is, it requires no additional components on the mobile device, such as a depth camera, and protects the user's privacy well.
Type: Grant
Filed: September 21, 2022
Date of Patent: January 30, 2024
Assignee: SHANGHAI JIAOTONG UNIVERSITY
Inventors: Fan Wu, Zimo Liao, Guihai Chen
-
Patent number: 11880550
Abstract: An electronic device includes a touch-sensitive display and one or more programs stored in memory for execution by one or more processors. The one or more programs include instructions for displaying a first application view that corresponds to a first application in a plurality of concurrently open applications. The one or more programs include instructions for detecting a first input, and in response, concurrently displaying a group of open application icons that correspond to at least some of the plurality of concurrently open applications with at least a portion of the first application view. The open application icons are displayed in accordance with a predetermined sequence of the open applications. The one or more programs include instructions for detecting a first gesture distinct from the first input, and in response, displaying a second application view that corresponds to a second application adjacent to the first application in the predetermined sequence.
Type: Grant
Filed: September 21, 2022
Date of Patent: January 23, 2024
Assignee: Apple Inc.
Inventors: Kenneth Kocienda, Imran Chaudhri
-
Patent number: 11882418
Abstract: Embodiments of an audio distribution module and system provide a compact and rugged audio switching device including a radio control unit in communication with an operator control panel. In various embodiments, an audio switching fabric is included with audio relays for directing the transmission and receipt of audio content between a headset in communication with the operator control panel and one or more radios in communication with the radio control unit, facilitating transmission and receipt of audio communications between the radio(s) and the headset.
Type: Grant
Filed: June 3, 2022
Date of Patent: January 23, 2024
Assignee: MA Federal, Inc.
Inventors: Greg Kasson, Mathew Denault
-
Patent number: 11874970
Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
Type: Grant
Filed: June 6, 2022
Date of Patent: January 16, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
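One way to realize a virtual surface that trails the control object is an exponential follow: the plane sits a fixed offset beyond the fingertip and slowly drifts toward it, so a slow hand drift stays in hover mode while a quick push penetrates the plane and switches modes. The update rule, constants, and names below are illustrative assumptions, not the patent's method:

```python
# Sketch: mode switching via penetration of a virtual plane whose
# position is updated from the control object's location over time.

class VirtualPlane:
    def __init__(self, offset=0.1, follow=0.2):
        self.z = None          # plane depth, initialized on first update
        self.offset = offset   # plane sits this far beyond the fingertip
        self.follow = follow   # fraction of the gap closed per update

    def update(self, finger_z):
        """Return 'engaged' when the finger penetrates the plane,
        else 'hover'. z grows in the pointing direction."""
        if self.z is None:
            self.z = finger_z + self.offset
        if finger_z >= self.z:                 # penetration: mode switch
            return "engaged"
        # plane drifts toward (finger_z + offset), so slow movement
        # never catches it, but a fast push crosses it
        self.z += self.follow * (finger_z + self.offset - self.z)
        return "hover"
```

The follow rate is the knob that separates intent from drift: with `follow` well below 1, only motion faster than the plane's drift can cross it.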
-
Patent number: 11875813
Abstract: Disclosed are methods, systems, devices, and other implementations, including a method (performed by, for example, a hearing aid device) that includes obtaining a combined sound signal for signals combined from multiple sound sources in an area in which a person is located, and obtaining neural signals for the person, with the neural signals being indicative of one or more target sound sources, from the multiple sound sources, that the person is attentive to. The method further includes determining a separation filter based, at least in part, on the neural signals obtained for the person, and applying the separation filter to a representation of the combined sound signal to derive a resultant separated signal representation associated with sound from the one or more target sound sources the person is attentive to.
Type: Grant
Filed: March 31, 2023
Date of Patent: January 16, 2024
Assignee: The Trustees of Columbia University in the City of New York
Inventors: Nima Mesgarani, Enea Ceolini, Cong Han
-
Patent number: 11873000
Abstract: An example operation includes one or more of detecting a movement in a transport, determining whether the movement includes a gesture definition, when the movement includes a gesture definition, identifying a gesture associated with the gesture definition, and performing an action, via the transport, corresponding to the identified gesture.
Type: Grant
Filed: February 18, 2020
Date of Patent: January 16, 2024
Assignee: TOYOTA MOTOR NORTH AMERICA, INC.
Inventors: Sachin J. Ahire, Manoj Kalamkar, Christopher J. Risberg
-
Patent number: 11869231
Abstract: A method includes detecting a user input comprising an incomplete three-dimensional (3D) gesture performed by one or more hands of a first user by a virtual-reality (VR) headset, selecting candidate 3D gestures from pre-defined 3D gestures based on a personalized gesture-recognition model, wherein each of the candidate 3D gestures is associated with a confidence score representing a likelihood the first user intended to input the respective candidate 3D gesture, and presenting one or more suggested inputs corresponding to one or more of the candidate 3D gestures at the VR headset.
Type: Grant
Filed: January 5, 2023
Date of Patent: January 9, 2024
Assignee: Meta Platforms Technologies, LLC
Inventors: William Crosby Presant, Francislav P Penov, Anuj Kumar
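Selecting candidates for an incomplete gesture can be sketched as prefix matching against pre-defined pose sequences, with a per-user prior standing in for the "personalized gesture-recognition model". The token representation, the scoring formula, and all names are assumptions for illustration:

```python
# Sketch: suggest completions for a partial 3D gesture. Each pre-defined
# gesture is a sequence of pose tokens; confidence is the fraction of
# the gesture already observed, weighted by a per-user prior.

def suggest_gestures(partial, predefined, user_prior, top_k=3):
    """Return up to top_k (name, confidence) pairs whose pose sequence
    starts with the observed partial input, best first."""
    scored = []
    for name, sequence in predefined.items():
        n = len(partial)
        if n <= len(sequence) and sequence[:n] == partial:
            confidence = (n / len(sequence)) * user_prior.get(name, 1.0)
            scored.append((name, confidence))
    scored.sort(key=lambda item: -item[1])
    return scored[:top_k]
```

Note the effect of the prior: two gestures sharing a prefix get identical structural scores, so the personalized weight is what lets the system rank the gesture this user actually tends to perform first.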
-
Patent number: 11861113
Abstract: A contactless touchscreen interface has a digital display to display digital information and a proximity detector comprising an image sensor to detect user interaction at a virtual touch intersection plane offset a distance from the digital display and to resolve the interaction into XY offset-plane interaction coordinates with reference to the digital display. A gaze-determining imaging system comprising an image sensor determines a gaze relative offset with respect to the digital display using facial image data captured by the image sensor. An interface controller comprises a parallax adjustment controller to convert the XY offset-plane interaction coordinates to XY on-screen apparent coordinates using the gaze relative offset and the distance, and an input controller to generate an input at the XY on-screen apparent coordinates accordingly.
Type: Grant
Filed: May 28, 2021
Date of Patent: January 2, 2024
Inventor: Marthinus Van Der Merwe
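The parallax adjustment is plain similar-triangles geometry: the finger touches a virtual plane a distance d in front of the display, and the tap must be projected along the user's line of sight onto the screen. A worked sketch, assuming the screen at z = 0, the touch plane at z = d, and an eye position derived from the gaze-determining system (all coordinates illustrative):

```python
# Sketch: project a touch on the virtual plane z = d onto the screen
# z = 0, along the ray from the user's eye through the touch point.

def parallax_correct(touch_xy, d, eye_xyz):
    """Return the XY on-screen apparent coordinates for a touch at
    `touch_xy` on the offset plane, given the eye position."""
    ex, ey, ez = eye_xyz          # eye is farther from the screen than d
    tx, ty = touch_xy
    t = ez / (ez - d)             # ray parameter where z reaches 0
    return (ex + t * (tx - ex), ey + t * (ty - ey))
```

When the eye is directly behind the touch point the correction vanishes (the on-screen point equals the touch point); the farther the touch is from the gaze axis, the larger the shift, which is exactly the parallax error the controller compensates for.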
-
Patent number: 11861944
Abstract: Video output is generated based on first video data that depicts the user performing an activity. Poses of the user during performance of the activity are compared with second video data that depicts an instructor performing the activity. Corresponding poses of the user's body and the instructor's body may be determined through comparison of the first and second video data. The video data is used to determine the rate of motion of the user and to generate video output in which a visual representation of the instructor moves at a rate similar to that of the user. For example, video output generated based on an instructional fitness video may be synchronized so that movement of the presented instructor matches the rate of movement of the user performing an exercise, improving user comprehension and performance.
Type: Grant
Filed: September 25, 2019
Date of Patent: January 2, 2024
Assignee: AMAZON TECHNOLOGIES, INC.
Inventors: Ido Yerushalmy, Ianir Ideses, Eli Alshan, Mark Kliger, Liza Potikha, Dotan Kaufman, Sharon Alpert, Eduard Oks, Noam Sorek
-
Patent number: 11855947
Abstract: A server maintains a gallery of ephemeral messages. Each ephemeral message is posted to the gallery by a user for viewing by recipients via recipient devices. In response to a gallery view request from any of the recipient devices, the ephemeral messages in the gallery are displayed on the requesting device in automated sequence, each message being displayed for a respective display duration before display of the next message in the gallery. Each ephemeral message has an associated message availability parameter. Each ephemeral message is removed from the gallery, thus being unavailable for viewing upon request, at expiry of the corresponding message availability parameter.
Type: Grant
Filed: July 29, 2016
Date of Patent: December 26, 2023
Assignee: Snap Inc.
Inventors: Nicholas Allen, Donald Giovannini, Chiayi Lin, Robert Murphy, Evan Spiegel
-
Patent number: 11856049
Abstract: Systems and methods for processing and displaying information for multiple applications on a computing device are disclosed herein. An example method includes a mobile device retrieving, from a pinboard server, pin blocks corresponding to applications. The mobile device may then filter the pin blocks based on (i) a geolocation of the mobile device, (ii) a user profile activated on the mobile device, (iii) a time of day, (iv) a date, or (v) an activity associated with a user. The mobile device may then display the filtered pin blocks.
Type: Grant
Filed: October 26, 2020
Date of Patent: December 26, 2023
Assignee: Zebra Technologies Corporation
Inventors: Sundar Ranganathan, Sridhar Srinivasan, Venu Challa, Murali Viswanathan, Arun Santhanam
-
Patent number: 11847199
Abstract: A method for biometric authentication is disclosed. Reference biometric data established at a first device can be stored at a backend server computer. The server computer can then provide the reference biometric data to a second device when needed for biometric authentication at the second device.
Type: Grant
Filed: March 2, 2022
Date of Patent: December 19, 2023
Assignee: Visa International Service Association
Inventor: John Sheets
-
Patent number: 11833961
Abstract: A method for controlling a lighting system in an interior chamber of a vehicle having controllable light sources that generate at least one reading light with a defined light cone and a light spot is provided. A gesture camera recognizes gestures of a person located in the interior chamber, and the lighting system is controlled by the recognized gestures. A light controller is active between a starting point in time and an end point in time. When the light controller is active and in the event of a gripping or pointing gesture of a hand in the region of the light cone or light spot, the light cone or light spot is guided along with the hand carrying out the gesture.
Type: Grant
Filed: March 19, 2020
Date of Patent: December 5, 2023
Assignee: MERCEDES-BENZ GROUP AG
Inventor: Daniel Betz
-
Patent number: 11836341
Abstract: A method includes displaying, by an electronic device, a graphical user interface (GUI) of a first application on a touchscreen of the electronic device, detecting, by the electronic device, a screenshot operation from a user, taking, by the electronic device, a screenshot of the GUI in response to the screenshot operation, displaying, on the touchscreen, a first preview image corresponding to an obtained first screenshot, detecting, by the electronic device, a first touch operation on the first preview image, updating the first preview image to a second preview image in response to the first touch operation, and displaying the second preview image on the touchscreen.
Type: Grant
Filed: November 29, 2022
Date of Patent: December 5, 2023
Assignee: HUAWEI TECHNOLOGIES CO., LTD.
Inventors: Yun Duan, Zhiyan Yang, Kai Qian
-
Patent number: 11822353
Abstract: A method includes generating one or more calibration instructions for calibrating one or more sensors coupled to a movable object and providing at least some of the one or more calibration instructions to a user interface. The at least some of the one or more calibration instructions include human-readable user instructions for moving the movable object along a predetermined pattern. The method further includes calibrating one or more sensor parameters for the one or more sensors based at least in part on sensor data collected while the movable object moves along the predetermined pattern.
Type: Grant
Filed: June 11, 2021
Date of Patent: November 21, 2023
Assignee: SZ DJI TECHNOLOGY CO., LTD.
Inventors: You Zhou, Lei Han, Jianzhao Cai
-
Patent number: 11822728
Abstract: An electronic device may include a display, a communication circuit, at least one camera, a memory, and a processor operatively connected to the display, the communication circuit, the at least one camera, and the memory. The memory may store instructions that, when executed, cause the processor to provide an augmented reality (AR) environment or a virtual reality (VR) environment through the display, connect the electronic device and at least one external electronic device through the communication circuit, display the at least one external electronic device through the display, specify a first external electronic device among the displayed at least one external electronic device based on an input interface switching event, and control an operation of the electronic device in the augmented reality environment or the virtual reality environment using the specified first external electronic device.
Type: Grant
Filed: December 7, 2021
Date of Patent: November 21, 2023
Assignee: Samsung Electronics Co., Ltd.
Inventors: Heonjun Ha, Seungnyun Kim, Junwhon Uhm, Jinchoul Lee, Hyunsoo Kim, Hyunjun Kim
-
Patent number: 11816324
Abstract: Technologies and techniques for setting a value for a parameter in which a graphical user interface is generated and output, wherein the graphical user interface includes a selection object that is assigned to the parameter. A first user input, which includes a positioning gesture related to the selection object, is captured, wherein a setting object is generated depending on the first user input and is positioned on the graphical user interface according to the positioning gesture. A second user input, which includes a setting gesture, is captured, with the value of the parameter being set using the setting object according to the setting gesture.
Type: Grant
Filed: November 29, 2019
Date of Patent: November 14, 2023
Assignee: Volkswagen Aktiengesellschaft
Inventor: Luigi Trabucco
-
Patent number: 11809635
Abstract: In response to a current gesture performed by a user interacting with an application, a gesture detection module accesses a plurality of predefined control gestures, and compares the current gesture with the plurality of predefined control gestures. The gesture detection module detects whether the current gesture matches at least an initial partial gesture of a particular control gesture associated with at least one application function. A feedback generator module generates confirmation feedback for the user if the current gesture matches at least an initial partial gesture of the particular control gesture. The confirmation feedback encodes, for the user, at least a first information identifying the application function associated with the particular control gesture and a second information about a time interval remaining before the associated application function will be executed. The application triggers the associated function if the particular control gesture is completed during said time interval.
Type: Grant
Filed: June 13, 2022
Date of Patent: November 7, 2023
Assignee: Treye Tech UG (haftungsbeschränkt)
Inventor: Anton Wachner
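The partial-gesture matching with a countdown described in this abstract can be sketched roughly as follows. All names and the token-sequence encoding of gestures are illustrative assumptions, not taken from the patent:

```python
import time

# Hypothetical sketch: control gestures are stored as token sequences.
# A current gesture that matches an initial partial segment of a stored
# control gesture yields the associated application function plus the
# time remaining before that function would be executed.
CONTROL_GESTURES = {
    ("right", "right"): {"function": "next_page", "hold_seconds": 1.5},
    ("left", "left"): {"function": "prev_page", "hold_seconds": 1.5},
}

def match_prefix(current, control):
    """True if `current` is an initial partial segment of `control`."""
    return len(current) <= len(control) and control[:len(current)] == tuple(current)

def detect(current_tokens, started_at, now=None):
    """Return (function, seconds_remaining) for the first control gesture
    whose initial segment matches, or None when nothing matches."""
    now = time.monotonic() if now is None else now
    for control, spec in CONTROL_GESTURES.items():
        if match_prefix(current_tokens, control):
            remaining = max(0.0, spec["hold_seconds"] - (now - started_at))
            return spec["function"], remaining
    return None
```

In this reading, the confirmation feedback would render the returned function name and the remaining seconds, and the application fires the function only when the full sequence completes before `remaining` hits zero.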
-
Patent number: 11803247
Abstract: A computer-implemented method includes: predicting, by a computing device, devices for inclusion in an interface; generating, by the computing device, the interface including areas corresponding to the devices; detecting, by the computing device, a user selection of one of the areas of the interface; detecting, by the computing device, a hand gesture associated with the selected one of the areas; and transmitting, by the computing device, data defining the hand gesture to a respective one of the devices corresponding to the selected one of the areas, thereby causing the respective one of the devices to execute a command based on the hand gesture.
Type: Grant
Filed: October 25, 2021
Date of Patent: October 31, 2023
Assignee: KYNDRYL, INC.
Inventors: Mauro Marzorati, Todd Russell Whitman, Jeremy R. Fox, Sarbajit K. Rakshit
-
Patent number: 11793435
Abstract: The present disclosure provides a method for detecting the focus of attention. The method includes: obtaining the face of a person in a first image, as well as the result of facial recognition; determining whether the distance between the person and the target is within an effective attention range; determining whether the face is frontal; determining whether the effective attention period is not shorter than a period threshold; and detecting the focus of attention of the person on the target.
Type: Grant
Filed: March 17, 2021
Date of Patent: October 24, 2023
Assignee: ACER INCORPORATED
Inventors: Kuan-Chung Hou, Bo-Ting Wu, Chian-Ying Li, Ming-Hsuan Tu, Chien-Hung Lin, Jian-Chi Lin, Fu-Heng Wu, Kai-Lun Chang, Tsung-Yao Chen
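The three checks this abstract names (effective attention range, frontal face, minimum attention period) amount to a simple gating function. A minimal sketch, with threshold values chosen purely for illustration:

```python
def focus_of_attention(distance_m, is_frontal, attention_seconds,
                       max_range_m=3.0, min_seconds=2.0):
    """Gate the three conditions named in the abstract: the person must be
    within the effective attention range, facing the target frontally, and
    attending for at least the period threshold. Threshold defaults are
    assumed values, not taken from the patent."""
    return (distance_m <= max_range_m) and is_frontal and (attention_seconds >= min_seconds)
```

Only when all three conditions hold would the system register the person's focus of attention on the target.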
-
Patent number: 11792267
Abstract: A network-connected electronic device may obtain, at a location, a notification that a network-connected sensor device at the location is activated in response to a triggering of a critical rule of the network-connected sensor device, determine whether the triggering of the critical rule of the network-connected sensor device will activate a function of the network-connected electronic device, and activate the function of the network-connected electronic device in response to determining that the triggering of the critical rule of the network-connected sensor device will activate the function of the network-connected electronic device.
Type: Grant
Filed: October 12, 2021
Date of Patent: October 17, 2023
Assignee: AT&T Intellectual Property I, L.P.
Inventors: Barrett Kreiner, James Pratt, Adrianne Binh Luu, Robert T. Moton, Jr., Walter Cooper Chastain, Ari Craine, Robert Koch
-
Patent number: 11783622
Abstract: Provided are a fingerprint authentication device and a fingerprint authentication method capable of efficiently performing registration of fingerprint information or authentication of a fingerprint. A presentation image that presents a position of a fingernail root at a time of detecting the fingerprint is generated, and an image of the fingerprint is obtained using the generated presentation image.
Type: Grant
Filed: September 4, 2020
Date of Patent: October 10, 2023
Assignee: SONY GROUP CORPORATION
Inventor: Yu Tanaka
-
Patent number: 11785069
Abstract: SphericRTC provides real-time 360-degree video communication, which allows the viewer to observe the environment in any direction from the camera location. This allows users to exchange information more efficiently and can be beneficial in the real-time setting. The system selects representations of 360-degree frames to allow efficient, content-adaptive delivery. The system performs joint content and bitrate adaptation in real-time by offloading expensive transformation operations to a GPU. The system demonstrates that its multiple sub-components (viewport feedback, representation selection, and joint content and bitrate adaptation) can be effectively integrated within a single framework. Compared to a baseline implementation, views in SphericRTC have consistently higher visual quality. The median Viewport-PSNR of such views is 2.25 dB higher than that of views in the baseline system.
Type: Grant
Filed: October 4, 2021
Date of Patent: October 10, 2023
Assignee: The Research Foundation for The State University of New York
Inventors: Yao Liu, Shuoqian Wang
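The quality figure quoted above is in PSNR, a standard metric; Viewport-PSNR restricts it to the pixels inside the rendered viewport. For reference, plain PSNR over a flat pixel list looks like this (the viewport-masking step is omitted):

```python
import math

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio between two equal-length pixel sequences:
    10 * log10(peak^2 / MSE). Identical inputs give infinite PSNR."""
    assert len(ref) == len(test) and ref, "inputs must be non-empty and equal-length"
    mse = sum((a - b) ** 2 for a, b in zip(ref, test)) / len(ref)
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(peak * peak / mse)
```

A 2.25 dB median improvement, as claimed for SphericRTC over its baseline, corresponds to a substantially lower mean squared error inside the viewport.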
-
Patent number: 11774940
Abstract: A human machine interface for an industrial automation control system includes at least one touchless input device that is adapted to be in a first state in which said human machine interface provides a first input to said industrial automation control system or a second state in which said human machine interface provides a second input to said industrial automation control system. The at least one touchless input device includes first and second touchless input sensors each configured to detect hand gestures of an operator's hand to provide input to said human machine interface based upon said gestures. The first and second touchless input sensors can be identical with respect to each other or different. In one example, both sensors are time-of-flight sensors; in another, one of the sensors is an electric field proximity sensor. A method of providing a human machine interface with at least one touchless input device is also provided.
Type: Grant
Filed: March 29, 2021
Date of Patent: October 3, 2023
Assignee: Rockwell Automation Technologies, Inc.
Inventors: Yanbin Zhang, Xiaobo Peng, Gary D. Dotson, Christopher G. Mason
-
Patent number: 11755178
Abstract: Systems and methods are provided for customizing user interface controls around a cursor. One example method includes receiving, at a computing device, a request to display an indicator menu and identifying at least one user interface element of a program being executed at the computing device. An interaction frequency for each identified user interface element is generated. In response to the request and for each identified user interface element, a spatial relationship between an indicator of the computing device and the identified user interface element is determined. Based on the interaction frequency value and the determined spatial relationship, an element set comprising one or more of the identified user interface elements is generated. The indicator menu comprising at least a portion of the element set is generated for display. The indicator menu is displayed proximate the indicator on a display of the computing device.
Type: Grant
Filed: May 13, 2022
Date of Patent: September 12, 2023
Assignee: ROVI GUIDES, INC.
Inventors: Charishma Chundi, Rajendra Pandey
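Combining interaction frequency with the element's spatial relationship to the indicator, as this abstract describes, could be done with a simple scoring pass. The weighting scheme and field names below are assumptions for illustration only:

```python
import math

def rank_elements(elements, cursor, max_items=4):
    """Score each UI element by interaction frequency divided by its
    distance to the cursor (the 'indicator'), then keep the top
    `max_items` elements for the indicator menu. The 1/(1+distance)
    weighting is an illustrative choice, not from the patent."""
    def score(el):
        dx, dy = el["x"] - cursor[0], el["y"] - cursor[1]
        distance = math.hypot(dx, dy)
        return el["frequency"] / (1.0 + distance)
    return sorted(elements, key=score, reverse=True)[:max_items]
```

With such a scoring function, frequently used controls that are also near the cursor surface first in the generated element set, matching the abstract's two ranking signals.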
-
Patent number: 11755171
Abstract: A screenshot or screen capture operation method for an electronic device includes: capturing a screen displayed on a display; determining whether additional information exists on the displayed screen; when the additional information exists, extracting the additional information, based on the displayed screen; determining whether a command for modifying the captured screen has been received; when the command for modifying the captured screen is determined as having been received, modifying the captured screen according to a user input; and storing the extracted additional information and/or the captured screen as a captured image. The resulting method enables a user to capture a screen intended or desired by the user.
Type: Grant
Filed: September 30, 2022
Date of Patent: September 12, 2023
Assignee: SAMSUNG ELECTRONICS CO., LTD.
Inventors: Inhyung Jung, Sangheon Kim, Yeunwook Lim
-
Patent number: 11748071
Abstract: Developer and runtime environments supporting multi-modal input for computing systems are disclosed. The developer environment includes a gesture library of human body gestures (e.g., hand gestures) that a previously-trained, system-level gesture recognition machine is configured to recognize. The developer environment further includes a user interface for linking a gesture of the gesture library with a semantic descriptor that is assigned to a function of the application program. The application program is executable to implement the function responsive to receiving an indication of the gesture recognized by the gesture recognition machine within image data captured by a camera. The semantic descriptor may be additionally linked to a different input modality than the gesture, such as a natural language input.
Type: Grant
Filed: December 14, 2022
Date of Patent: September 5, 2023
Inventors: Soumya Batra, Hany Mohamed Salah Eldeen Mohamed Khalil, Imed Zitouni
-
Patent number: 11740688
Abstract: A virtual space experience system capable of preventing unintentional contact between players without inhibiting a sense of immersion is provided. A virtual space image determination unit 35 of a VR system S causes an image of a virtual space to be perceived by a second player to include an image in which a first avatar corresponding to a first player moves so as to generate a gap in a correspondence relationship between coordinates of the first player and coordinates of the first avatar when a trigger event is recognized, and causes the image of the virtual space to be perceived by the second player to include an image of a second avatar corresponding to the second player and a third avatar corresponding to the first player after the recognition. An avatar coordinate determination unit 34 determines coordinates of the third avatar based on the coordinates of the first player.
Type: Grant
Filed: November 13, 2019
Date of Patent: August 29, 2023
Assignee: Abal Inc.
Inventors: Shota Suzuki, Tsuyoshi Nomura, Yoshikatsu Kanemaru, Yoshiya Okoyama, Yoshimasa Takahashi
-
Patent number: 11740758
Abstract: Aspects of the present disclosure involve a system comprising a computer-readable storage medium storing a program and method for presenting participant reactions within a virtual working environment.
Type: Grant
Filed: August 31, 2022
Date of Patent: August 29, 2023
Assignee: Snap Inc.
Inventors: Brandon Francis, Andrew Cheng-min Lin, Walton Lin
-
Patent number: 11740772
Abstract: A method and device for controlling a hotspot recommendation pop-up window, a computer-readable medium and an electronic device are provided. On reception of a pop-up window display instruction, a pop-up window including prompt options corresponding to multiple hotspots is displayed on the hot video playing page. In a case that the pop-up window is displayed on the hot video playing page, a first control on the hot video playing page except for the pop-up window is set to a disabled state. Further, the user can select among the prompt options corresponding to multiple hotspots included in the pop-up window.
Type: Grant
Filed: March 16, 2022
Date of Patent: August 29, 2023
Assignee: BEIJING BYTEDANCE NETWORK TECHNOLOGY CO., LTD.
Inventors: Zesong Dong, Huijun Yu