Patents by Inventor Gaganpreet Singh

Gaganpreet Singh has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11693483
    Abstract: Methods and systems for controlling a display device, including detecting a mid-air gesture using a sensing device; mapping the detected mid-air gesture to locations of an interaction region, the interaction region including an on-screen region of the display device and an off-screen region that is located outside an edge of the on-screen region; and performing a display device control action upon detecting an edge interaction based on the mapping of the detected mid-air gesture to locations that interact with the edge of the on-screen region.
    Type: Grant
    Filed: November 10, 2021
    Date of Patent: July 4, 2023
    Assignee: Huawei Technologies Co., Ltd.
    Inventors: Gaganpreet Singh, Wei Zhou, Pourang Polad Irani, Wei Li
  • Patent number: 11693551
    Abstract: Devices and methods for providing feedback for a control display (CD) gain of a slider control on a gesture-controlled device are described. The method includes detecting a speed of a dynamic dragging gesture, determining the CD gain for the slider control based on the speed, and generating an auditory feedback or a visual feedback for the CD gain. A gesture-controlled device, which carries out the method, may have an image-capturing device, a processor, and a memory coupled to the processor, the memory storing machine-executable instructions for carrying out the method. Advantageously, providing feedback for the CD gain of a slider control facilitates accurate adjustment of a system parameter associated with the slider control. The devices and methods may be used in industrial applications and vehicle control, among other applications.
    Type: Grant
    Filed: May 21, 2021
    Date of Patent: July 4, 2023
    Assignee: Huawei Technologies Co., Ltd.
    Inventors: Wei Zhou, Sachi Mizobuchi, Gaganpreet Singh, Ghazaleh Saniee-Monfared, Sitao Wang, Futian Zhang, Wei Li
  • Patent number: 11693482
    Abstract: A gesture-controlled device and a method thereon are provided. The method includes capturing a plurality of video frames of a user's body and processing the plurality of video frames to allow detecting a portion of the user's body and to allow recognizing hand gestures. In response to detecting the portion of the user's body, the method includes generating at least one widget interaction region corresponding to the portion of the user's body. The method further includes recognizing a mid-air hand gesture in the at least one widget interaction region, mapping the mid-air hand gesture to at least one virtual widget on the gesture-controlled device, and manipulating the at least one virtual widget based on the mid-air hand gesture. The method and device allow manipulation of virtual widgets using mid-air gestures on a gesture-controlled device.
    Type: Grant
    Filed: May 28, 2021
    Date of Patent: July 4, 2023
    Assignee: Huawei Technologies Co., Ltd.
    Inventors: Mona Hosseinkhani Loorak, Wei Zhou, Gaganpreet Singh, Xiu Yi, Wei Li
  • Publication number: 20230145592
    Abstract: Methods and systems for controlling a display device, including detecting a mid-air gesture using a sensing device; mapping the detected mid-air gesture to locations of an interaction region, the interaction region including an on-screen region of the display device and an off-screen region that is located outside an edge of the on-screen region; and performing a display device control action upon detecting an edge interaction based on the mapping of the detected mid-air gesture to locations that interact with the edge of the on-screen region.
    Type: Application
    Filed: November 10, 2021
    Publication date: May 11, 2023
    Inventors: Gaganpreet SINGH, Wei ZHOU, Pourang Polad IRANI, Wei LI
  • Publication number: 20230116341
    Abstract: Methods and apparatuses for controlling a selection focus of a user interface using gestures, in particular mid-air hand gestures, are described. A hand is detected within a defined activation region in a first frame of video data. The detected hand is tracked to determine a tracked location of the detected hand in at least a second frame of video data. A control signal is outputted to control the selection focus to focus on a target in the user interface, where movement of the selection focus is controlled based on a displacement between the tracked location and a reference location in the activation region.
    Type: Application
    Filed: September 30, 2022
    Publication date: April 13, 2023
    Inventors: Futian ZHANG, Edward LANK, Sachi MIZOBUCHI, Gaganpreet SINGH, Wei ZHOU, Wei LI
  • Publication number: 20220382377
    Abstract: A gesture-controlled device and a method thereon are provided. The method includes capturing a plurality of video frames of a user's body and processing the plurality of video frames to allow detecting a portion of the user's body and to allow recognizing hand gestures. In response to detecting the portion of the user's body, the method includes generating at least one widget interaction region corresponding to the portion of the user's body. The method further includes recognizing a mid-air hand gesture in the at least one widget interaction region, mapping the mid-air hand gesture to at least one virtual widget on the gesture-controlled device, and manipulating the at least one virtual widget based on the mid-air hand gesture. The method and device allow manipulation of virtual widgets using mid-air gestures on a gesture-controlled device.
    Type: Application
    Filed: May 28, 2021
    Publication date: December 1, 2022
    Inventors: Mona HOSSEINKHANI LOORAK, Wei ZHOU, Gaganpreet SINGH, Xiu YI, Wei LI
  • Publication number: 20220374138
    Abstract: Devices and methods for providing feedback for a control display (CD) gain of a slider control on a gesture-controlled device are described. The method includes detecting a speed of a dynamic dragging gesture, determining the CD gain for the slider control based on the speed, and generating an auditory feedback or a visual feedback for the CD gain. A gesture-controlled device, which carries out the method, may have an image-capturing device, a processor, and a memory coupled to the processor, the memory storing machine-executable instructions for carrying out the method. Advantageously, providing feedback for the CD gain of a slider control facilitates accurate adjustment of a system parameter associated with the slider control. The devices and methods may be used in industrial applications and vehicle control, among other applications.
    Type: Application
    Filed: May 21, 2021
    Publication date: November 24, 2022
    Inventors: Wei ZHOU, Sachi MIZOBUCHI, Gaganpreet SINGH, Ghazaleh SANIEE-MONFARED, Sitao WANG, Futian ZHANG, Wei LI
  • Patent number: 11416138
    Abstract: Methods of detecting swipe gestures, of filtering and locating desired information in a multi-attributed search space in response to detected swipe gestures, and devices for performing the same are provided. The method includes activating an attribute associated with at least one element of a list of elements having a visible list portion rendered on the viewing area, in response to receiving an activation input for the attribute, and displaying a plurality of attribute field controls associated with the attribute, in response to its activation. The method further includes receiving a manipulation action of at least one attribute field control, updating an attribute value in response to the manipulation action, and updating the visible list portion based on the attribute value. The method may be used with touchscreen displays, and with virtual or augmented reality displays.
    Type: Grant
    Filed: December 11, 2020
    Date of Patent: August 16, 2022
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Gaganpreet Singh, Qiang Xu, Wei Li
  • Publication number: 20220187983
    Abstract: Methods of detecting swipe gestures, of filtering and locating desired information in a multi-attributed search space in response to detected swipe gestures, and devices for performing the same are provided. The method includes activating an attribute associated with at least one element of a list of elements having a visible list portion rendered on the viewing area, in response to receiving an activation input for the attribute, and displaying a plurality of attribute field controls associated with the attribute, in response to its activation. The method further includes receiving a manipulation action of at least one attribute field control, updating an attribute value in response to the manipulation action, and updating the visible list portion based on the attribute value. The method may be used with touchscreen displays, and with virtual or augmented reality displays.
    Type: Application
    Filed: December 11, 2020
    Publication date: June 16, 2022
    Inventors: Gaganpreet SINGH, Qiang XU, Wei LI
  • Patent number: 11301049
    Abstract: User interface control based on elbow-anchored arm gestures is disclosed. In one aspect, there is provided a method of user interface control on a computing device based on elbow-anchored arm gestures. A visual user interface (VUI) screen is displayed on a display of a computing device. The VUI screen comprises a plurality of VUI elements arranged in a plurality of VUI element levels, each VUI element level comprising one or more VUI elements. A spatial location of an elbow of an arm of a user and a spatial location of a wrist of the arm of the user are determined. A three-dimensional (3D) arm vector extending from the spatial location of the elbow to the spatial location of the wrist is then determined. A VUI element in the VUI screen corresponding to the 3D arm vector is then determined based on a predetermined 3D spatial mapping between 3D arm vectors and VUI elements for the VUI screen.
    Type: Grant
    Filed: September 17, 2020
    Date of Patent: April 12, 2022
    Assignee: Huawei Technologies Co., Ltd.
    Inventors: Gaganpreet Singh, Rafael Veras Guimaraes, Wei Zhou, Pourang Polad Irani, Farzin Farhadi-Niaki, Wei Li
  • Patent number: 11249565
    Abstract: A touch input tool for interacting with a capacitive touchscreen display. The touch input tool includes a plurality of spaced apart conductive touchscreen touch elements arranged to simultaneously operatively engage a screen of the touchscreen display at a corresponding plurality of discrete respective touch locations.
    Type: Grant
    Filed: May 5, 2020
    Date of Patent: February 15, 2022
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Qiang Xu, Junwei Sun, Jun Li, Chenhe Li, Wei Li, Gaganpreet Singh
  • Patent number: 11249557
    Abstract: Methods and apparatus for gesture-based control of a device in a multi-user environment are described. The methods prioritize users or gestures based on a predetermined priority ruleset. A first-user-in-time ruleset prioritizes gestures based on when they were begun by a user in the camera FOV. An action-hierarchy ruleset prioritizes gestures based on the actions they correspond to, and the relative positions of those actions within an action hierarchy. A designated-master-user ruleset prioritizes gestures performed by an explicitly designated master user. Methods for designating a new master user and for providing gesture-control-related user feedback in a multi-user environment are also described.
    Type: Grant
    Filed: May 5, 2021
    Date of Patent: February 15, 2022
    Assignee: Huawei Technologies Co., Ltd.
    Inventors: Wei Zhou, Mona Hosseinkhani Loorak, Gaganpreet Singh, Xiu Yi, Juwei Lu, Wei Li
  • Publication number: 20210349625
    Abstract: A method is provided including generating touch coordinate information corresponding to touch interactions with a touchscreen display of an electronic device and updating information rendered on the touchscreen display in response to determining that the touch coordinate information matches a tool shaft movement gesture corresponding to movement of a touch tool shaft over an area of the touchscreen display. A device is provided, including a touchscreen display comprising a display and a touch sensing system configured to generate signals corresponding to screen touches of the display, a processing device operatively coupled to the touchscreen display, and a non-transitory memory coupled to the processing device and storing software instructions that when executed by the processing device configure the processing device to carry out the provided method. The ability to process tool shaft gestures may improve one or both of the operation of the device and the user experience with the device.
    Type: Application
    Filed: May 5, 2020
    Publication date: November 11, 2021
    Inventors: Wei LI, Jun LI, Qiang XU, Chenhe LI, Junwei SUN, Gaganpreet SINGH
  • Publication number: 20210349555
    Abstract: A touch input tool for interacting with a capacitive touchscreen display. The touch input tool includes a plurality of spaced apart conductive touchscreen touch elements arranged to simultaneously operatively engage a screen of the touchscreen display at a corresponding plurality of discrete respective touch locations.
    Type: Application
    Filed: May 5, 2020
    Publication date: November 11, 2021
    Inventors: Qiang XU, Junwei SUN, Jun LI, Chenhe LI, Wei LI, Gaganpreet SINGH
  • Patent number: 11132104
    Abstract: A method of managing user interface items in a visual user interface (VUI) is disclosed. In one aspect, the method comprises displaying a VUI screen on a display. The VUI screen comprises a scrollable area comprising a plurality of selectable items. The VUI screen provides a pin holder located at an edge of a display area of the display. The state of the pin holder and the second pin holder persists in response to changes in a visual state of the scrollable area. A selected item in the plurality of selectable items is pinned to the pin holder in response to detection of a pinning command. A visual state of the scrollable area is changed in response to detection of corresponding input. A pinned item corresponding to a selected pin from the pin holder is unpinned in response to detection of an unpinning command.
    Type: Grant
    Filed: October 5, 2020
    Date of Patent: September 28, 2021
    Assignee: Huawei Technologies Co., Ltd.
    Inventors: Gaganpreet Singh, Qiang Xu, Wei Li
  • Publication number: 20210294427
    Abstract: Methods and apparatus for gesture-based control of a device in a multi-user environment are described. The methods prioritize users or gestures based on a predetermined priority ruleset. A first-user-in-time ruleset prioritizes gestures based on when they were begun by a user in the camera FOV. An action-hierarchy ruleset prioritizes gestures based on the actions they correspond to, and the relative positions of those actions within an action hierarchy. A designated-master-user ruleset prioritizes gestures performed by an explicitly designated master user. Methods for designating a new master user and for providing gesture-control-related user feedback in a multi-user environment are also described.
    Type: Application
    Filed: May 5, 2021
    Publication date: September 23, 2021
    Inventors: Wei ZHOU, Mona HOSSEINKHANI LOORAK, Gaganpreet SINGH, Xiu YI, Juwei LU, Wei LI
  • Publication number: 20210294423
    Abstract: Methods and apparatus for gesture-based control of a device in a multi-user environment are described. The methods prioritize users or gestures based on a predetermined priority ruleset. A first-user-in-time ruleset prioritizes gestures based on when they were begun by a user in the camera FOV. An action-hierarchy ruleset prioritizes gestures based on the actions they correspond to, and the relative positions of those actions within an action hierarchy. A designated-master-user ruleset prioritizes gestures performed by an explicitly designated master user. Methods for designating a new master user and for providing gesture-control-related user feedback in a multi-user environment are also described.
    Type: Application
    Filed: April 7, 2020
    Publication date: September 23, 2021
    Inventors: Wei ZHOU, Mona HOSSEINKHANI LOORAK, Gaganpreet SINGH, Xiu YI, Juwei LU, Wei LI
  • Patent number: 11119652
    Abstract: A method and system including sensing a touch event on a touchscreen display of an electronic device, and changing a display layout rendered on the touchscreen display in response to determining that the touch event matches a stylus shaft gesture that corresponds to interaction of the stylus shaft with the touchscreen display. The stylus shaft may be located between a first end and a second end of a stylus.
    Type: Grant
    Filed: May 5, 2020
    Date of Patent: September 14, 2021
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Qiang Xu, Jun Li, Junwei Sun, Chenhe Li, Wei Li, Gaganpreet Singh
  • Patent number: 11112875
    Abstract: Methods and apparatus for gesture-based control of a device in a multi-user environment are described. The methods prioritize users or gestures based on a predetermined priority ruleset. A first-user-in-time ruleset prioritizes gestures based on when they were begun by a user in the camera FOV. An action-hierarchy ruleset prioritizes gestures based on the actions they correspond to, and the relative positions of those actions within an action hierarchy. A designated-master-user ruleset prioritizes gestures performed by an explicitly designated master user. Methods for designating a new master user and for providing gesture-control-related user feedback in a multi-user environment are also described.
    Type: Grant
    Filed: April 7, 2020
    Date of Patent: September 7, 2021
    Assignee: HUAWEI TECHNOLOGIES CO., LTD.
    Inventors: Wei Zhou, Mona Hosseinkhani Loorak, Gaganpreet Singh, Xiu Yi, Juwei Lu, Wei Li
  • Publication number: 20210081052
    Abstract: User interface control based on elbow-anchored arm gestures is disclosed. In one aspect, there is provided a method of user interface control on a computing device based on elbow-anchored arm gestures. A visual user interface (VUI) screen is displayed on a display of a computing device. The VUI screen comprises a plurality of VUI elements arranged in a plurality of VUI element levels, each VUI element level comprising one or more VUI elements. A spatial location of an elbow of an arm of a user and a spatial location of a wrist of the arm of the user are determined. A three-dimensional (3D) arm vector extending from the spatial location of the elbow to the spatial location of the wrist is then determined. A VUI element in the VUI screen corresponding to the 3D arm vector is then determined based on a predetermined 3D spatial mapping between 3D arm vectors and VUI elements for the VUI screen.
    Type: Application
    Filed: September 17, 2020
    Publication date: March 18, 2021
    Inventors: Gaganpreet SINGH, Rafael VERAS GUIMARAES, Wei ZHOU, Pourang Polad IRANI, Farzin FARHADI-NIAKI, Wei LI
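
The sketches that follow illustrate, in rough Python and under clearly stated assumptions, several of the interaction techniques summarized in the abstracts above. They are minimal sketches only; none reproduces the claimed implementations.

For patent 11693483 and publication 20230145592, a minimal sketch of edge-interaction detection over an interaction region that extends beyond the screen. The region sizes, the gesture path format, and the resulting control action are illustrative assumptions, not the patented design.

```python
# Hypothetical sketch: mid-air gestures mapped to an interaction region that
# includes on-screen and off-screen parts; an edge interaction fires when the
# mapped path crosses the on-screen edge. All constants are assumptions.

from dataclasses import dataclass

@dataclass
class InteractionRegion:
    screen_w: int   # on-screen width in pixels
    screen_h: int   # on-screen height in pixels
    margin: int     # off-screen margin, in pixels, on every side

    def contains(self, x: float, y: float) -> bool:
        """True if (x, y) falls anywhere in the on-screen + off-screen region."""
        return (-self.margin <= x <= self.screen_w + self.margin and
                -self.margin <= y <= self.screen_h + self.margin)

    def is_on_screen(self, x: float, y: float) -> bool:
        return 0 <= x <= self.screen_w and 0 <= y <= self.screen_h

def detect_edge_interaction(region: InteractionRegion, path) -> bool:
    """Return True when a mapped gesture path crosses the on-screen edge,
    i.e. moves between the on-screen and off-screen parts of the region."""
    prev_on_screen = None
    for x, y in path:
        if not region.contains(x, y):
            continue  # ignore samples outside the whole interaction region
        on_screen = region.is_on_screen(x, y)
        if prev_on_screen is not None and on_screen != prev_on_screen:
            return True  # the gesture crossed the screen edge
        prev_on_screen = on_screen
    return False

# Example: a mid-air swipe mapped from just off the right edge onto the screen.
region = InteractionRegion(screen_w=1920, screen_h=1080, margin=200)
swipe = [(2050, 500), (1990, 500), (1900, 500), (1700, 500)]
if detect_edge_interaction(region, swipe):
    print("edge interaction detected -> perform display control action")
```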
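For patent 11693551 and publication 20220374138, a minimal sketch of speed-dependent control-display (CD) gain for a slider with simple feedback. The gain curve, thresholds, and feedback strings are illustrative assumptions.

```python
# Hypothetical sketch: dragging speed drives the CD gain of a slider, and the
# chosen gain also drives a coarse/fine feedback cue. Constants are assumptions.

def cd_gain(speed: float, slow: float = 0.05, fast: float = 0.5,
            low_gain: float = 0.2, high_gain: float = 2.0) -> float:
    """Map dragging speed (e.g. metres/second of the tracked hand) to a CD gain:
    slow, precise motion gets low gain; fast motion gets high gain."""
    if speed <= slow:
        return low_gain
    if speed >= fast:
        return high_gain
    t = (speed - slow) / (fast - slow)  # linear interpolation in between
    return low_gain + t * (high_gain - low_gain)

def update_slider(value: float, hand_delta: float, speed: float):
    gain = cd_gain(speed)
    new_value = min(1.0, max(0.0, value + gain * hand_delta))
    feedback = "low gain (fine)" if gain < 1.0 else "high gain (coarse)"
    return new_value, feedback  # feedback could drive a tone pitch or an on-screen cue

value, feedback = update_slider(value=0.5, hand_delta=0.02, speed=0.03)
print(value, feedback)  # small, precise adjustment with "fine" feedback
```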
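For patent 11693482 and publication 20220382377, a minimal sketch of widget interaction regions anchored to a detected portion of the user's body. The region layout relative to a shoulder keypoint and the widget actions are illustrative assumptions.

```python
# Hypothetical sketch: interaction regions are generated relative to a detected
# body portion, and a mid-air hand gesture inside a region manipulates the
# corresponding virtual widget. Layout and actions are assumptions.

def widget_regions(shoulder_x: int, shoulder_y: int, size: int = 150) -> dict:
    """Generate interaction regions positioned relative to a detected body portion
    (here, a shoulder keypoint from a pose estimator)."""
    return {
        "volume":   (shoulder_x + size, shoulder_y - size, size, size),
        "playback": (shoulder_x + size, shoulder_y,        size, size),
    }

def region_at(regions: dict, hand_x: int, hand_y: int):
    for name, (x, y, w, h) in regions.items():
        if x <= hand_x <= x + w and y <= hand_y <= y + h:
            return name
    return None

def manipulate(widget, gesture):
    """Map the recognized mid-air gesture to an action on the virtual widget."""
    actions = {("volume", "pinch_up"): "volume +5",
               ("playback", "swipe_left"): "previous track"}
    return actions.get((widget, gesture), "no-op")

regions = widget_regions(shoulder_x=300, shoulder_y=200)
widget = region_at(regions, hand_x=480, hand_y=90)  # hand inside the "volume" region
print(manipulate(widget, "pinch_up"))                # -> "volume +5"
```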
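For publication 20230116341, a minimal sketch of displacement-driven selection focus: a hand detected inside an activation region is tracked, and the focus moves according to its displacement from a reference location. The rate mapping and dead zone are illustrative assumptions.

```python
# Hypothetical sketch: the displacement between the tracked hand location and a
# reference location in the activation region drives the selection focus.
# Coordinates are normalised to the activation region; constants are assumptions.

def focus_velocity(tracked, reference, dead_zone: float = 0.05, rate: float = 5.0):
    """Convert the hand's displacement from the reference location into a
    focus-movement velocity, with a small dead zone around the reference."""
    dx = tracked[0] - reference[0]
    dy = tracked[1] - reference[1]
    vx = 0.0 if abs(dx) < dead_zone else rate * dx
    vy = 0.0 if abs(dy) < dead_zone else rate * dy
    return vx, vy

# The hand has moved to the right of the reference point, so the selection focus
# drifts toward targets on the right of the user interface.
print(focus_velocity(tracked=(0.7, 0.5), reference=(0.5, 0.5)))  # -> (1.0, 0.0)
```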
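For patent 11416138 and publication 20220187983, a minimal sketch of updating a visible list portion after an attribute field control is manipulated. The data set and the attribute fields are illustrative assumptions.

```python
# Hypothetical sketch: activating an attribute and manipulating its field control
# updates an attribute value, which re-filters the visible list portion.
# The element data and attribute names are assumptions.

hotels = [
    {"name": "Harbour Inn", "price": 120, "rating": 4.2},
    {"name": "City Lodge",  "price": 85,  "rating": 3.6},
    {"name": "Park Suites", "price": 210, "rating": 4.8},
]

def update_visible_list(elements, attribute, attribute_value):
    """Re-filter the visible list portion after a manipulation action updates
    an attribute field control (here, a maximum-price or minimum-rating field)."""
    if attribute == "max_price":
        return [e for e in elements if e["price"] <= attribute_value]
    if attribute == "min_rating":
        return [e for e in elements if e["rating"] >= attribute_value]
    return elements

# A swipe gesture activates the "price" attribute; dragging its field control
# down to 150 updates the attribute value and hence the visible list portion.
for hotel in update_visible_list(hotels, "max_price", 150):
    print(hotel["name"])  # Harbour Inn, City Lodge
```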
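For patent 11301049 and publication 20210081052, a minimal sketch of selecting a VUI element from the 3D arm vector between elbow and wrist. The sector layout mapping arm elevation to VUI element levels is an illustrative assumption.

```python
# Hypothetical sketch: a 3D arm vector from elbow to wrist is matched against a
# predetermined spatial mapping of direction sectors to VUI elements.
# The elevation thresholds and element labels are assumptions.

import math

def arm_vector(elbow, wrist):
    """3D vector from the elbow joint to the wrist joint."""
    return tuple(w - e for w, e in zip(wrist, elbow))

def select_vui_element(elbow, wrist) -> str:
    """Pick a VUI element from the elevation of the arm vector: raising the
    forearm moves the selection to higher VUI element levels."""
    vx, vy, vz = arm_vector(elbow, wrist)
    elevation = math.degrees(math.atan2(vy, math.hypot(vx, vz)))
    if elevation > 30:
        return "top-level element"
    if elevation > -10:
        return "middle-level element"
    return "bottom-level element"

# Forearm raised well above horizontal -> a top-level VUI element is selected.
print(select_vui_element(elbow=(0.0, 1.0, 0.5), wrist=(0.1, 1.3, 0.6)))
```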
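For patents 11249557 and 11112875 (and publications 20210294427 and 20210294423), a minimal sketch of the three priority rulesets for multi-user gesture control. The gesture fields and the action hierarchy are illustrative assumptions, not the claimed data model.

```python
# Hypothetical sketch: concurrent gestures from multiple users are resolved to a
# single gesture by a first-user-in-time, action-hierarchy, or
# designated-master-user ruleset. Fields and ranks are assumptions.

from dataclasses import dataclass

# Lower number = closer to the top of the action hierarchy (higher priority).
ACTION_RANK = {"power_off": 0, "volume": 1, "seek": 2}

@dataclass
class Gesture:
    user_id: str
    action: str
    start_time: float  # seconds since the gesture began in the camera FOV

def pick_gesture(gestures, ruleset, master_user=None):
    """Resolve concurrent gestures from multiple users to a single one."""
    if ruleset == "first_user_in_time":
        return min(gestures, key=lambda g: g.start_time)
    if ruleset == "action_hierarchy":
        return min(gestures, key=lambda g: ACTION_RANK.get(g.action, 99))
    if ruleset == "designated_master_user":
        master = [g for g in gestures if g.user_id == master_user]
        return master[0] if master else None
    raise ValueError(f"unknown ruleset: {ruleset}")

gestures = [Gesture("alice", "seek", 1.2), Gesture("bob", "power_off", 1.5)]
print(pick_gesture(gestures, "first_user_in_time").user_id)             # alice
print(pick_gesture(gestures, "action_hierarchy").action)                # power_off
print(pick_gesture(gestures, "designated_master_user", "bob").user_id)  # bob
```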
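Finally, for patent 11132104, a minimal sketch of a pin holder whose state persists while the scrollable area changes. The item model and commands are illustrative assumptions.

```python
# Hypothetical sketch: pinned items persist at the edge of the display while the
# visual state of the scrollable area changes. The item model is an assumption.

class PinHolder:
    def __init__(self):
        self._pinned = []  # persists across scrolling of the list

    def pin(self, item):
        """Pin a selected item in response to a pinning command."""
        if item not in self._pinned:
            self._pinned.append(item)

    def unpin(self, item):
        """Unpin a pinned item in response to an unpinning command."""
        if item in self._pinned:
            self._pinned.remove(item)

    def items(self):
        return list(self._pinned)

items = [f"contact {i}" for i in range(50)]  # selectable items in the scrollable area
holder = PinHolder()
holder.pin(items[3])    # pinning command on a selected item
scroll_offset = 40      # scrolling changes the visual state of the list...
print(holder.items())   # ...but the pin holder's state persists: ['contact 3']
holder.unpin(items[3])  # unpinning command on the selected pin
```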