Patents by Inventor David S. HOLZ

David S. HOLZ has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11720180
    Abstract: A region of space may be monitored for the presence or absence of one or more control objects, and object attributes and changes thereto may be interpreted as control information provided as input to a machine or application. In some embodiments, the region is monitored using a combination of scanning and image-based sensing.
    Type: Grant
    Filed: July 11, 2022
    Date of Patent: August 8, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
  • Publication number: 20230245500
    Abstract: The technology disclosed can provide methods and systems for identifying users while capturing motion and/or determining the path of a portion of the user with one or more optical, acoustic or vibrational sensors. Implementations can enable use of security aware devices, e.g., automated teller machines (ATMs), cash registers and banking machines, other secure vending or service machines, security screening apparatus, secure terminals, airplanes, automobiles and so forth that comprise sensors and processors employing optical, audio or vibrational detection mechanisms suitable for providing gesture detection, personal identification, user recognition, authorization of control inputs, and other machine control and/or machine communications applications. A virtual experience can be provided to the user in some implementations by the addition of haptic, audio and/or other sensory information projectors.
    Type: Application
    Filed: March 31, 2023
    Publication date: August 3, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Maxwell SILLS, Aaron SMITH, David S. HOLZ, Hongyuan (Jimmy) HE
  • Publication number: 20230205321
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Application
    Filed: February 17, 2023
    Publication date: June 29, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
  • Patent number: 11676349
    Abstract: The technology disclosed can provide capabilities to view and/or interact with the real world to the user of a wearable (or portable) device using a sensor configured to capture motion and/or determining the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved user experience, greater safety, greater functionality to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted devices (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Grant
    Filed: July 23, 2021
    Date of Patent: June 13, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
  • Publication number: 20230169236
    Abstract: The technology disclosed relates to simplifying the updating of a predictive model by clustering observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space.
    Type: Application
    Filed: January 30, 2023
    Publication date: June 1, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Kevin HOROWITZ, Raffi BEDIKIAN, Hua YANG
  • Publication number: 20230140737
    Abstract: The technology disclosed can provide capabilities such as using motion sensors and/or other types of sensors coupled to a motion-capture system to monitor motions within a real environment. A virtual object can be projected to a user of a portable device integrated into an augmented rendering of a real environment about the user. Motion information of a user body portion is determined based at least in part upon sensory information received from imaging or acoustic sensory devices. Control information is communicated to a system based in part on a combination of the motion of the portable device and the detected motion of the user. The virtual device experience can be augmented in some implementations by the addition of haptic, audio and/or other sensory information projectors.
    Type: Application
    Filed: December 23, 2022
    Publication date: May 4, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David S. HOLZ
  • Publication number: 20230125265
    Abstract: The technology disclosed can provide improved safety by detecting potential unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determining the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved safety to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Application
    Filed: December 21, 2022
    Publication date: April 27, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Robert Samuel GORDON, Matias PEREZ
  • Publication number: 20230121570
    Abstract: The technology disclosed relates to determining intent for an interaction by calculating a center of effort for the applied forces. Movement of the points of virtual contacts and the center of effort are then monitored to determine a gesture type intended for the interaction. The number of points of virtual contacts of the feeler zones and proximities between the points of virtual contacts are used to determine a degree of precision of a control object gesture.
    Type: Application
    Filed: December 19, 2022
    Publication date: April 20, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Pohung CHEN, David S. HOLZ
  • Patent number: 11620859
    Abstract: The technology disclosed can provide methods and systems for identifying users while capturing motion and/or determining the path of a portion of the user with one or more optical, acoustic or vibrational sensors. Implementations can enable use of security aware devices, e.g., automated teller machines (ATMs), cash registers and banking machines, other secure vending or service machines, security screening apparatus, secure terminals, airplanes, automobiles and so forth that comprise sensors and processors employing optical, audio or vibrational detection mechanisms suitable for providing gesture detection, personal identification, user recognition, authorization of control inputs, and other machine control and/or machine communications applications. A virtual experience can be provided to the user in some implementations by the addition of haptic, audio and/or other sensory information projectors.
    Type: Grant
    Filed: July 28, 2020
    Date of Patent: April 4, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Maxwell Sills, Aaron Smith, David S. Holz, Hongyuan (Jimmy) He
  • Publication number: 20230094182
    Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured from image information or equivalents. Predictive information can be improved by applying techniques that correlate it with information from observations.
    Type: Application
    Filed: September 30, 2022
    Publication date: March 30, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Kevin A. HOROWITZ, David S. HOLZ
  • Patent number: 11599237
    Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video stream to a second user of the wearable sensor system.
    Type: Grant
    Filed: February 12, 2021
    Date of Patent: March 7, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
  • Patent number: 11586292
    Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
    Type: Grant
    Filed: March 1, 2021
    Date of Patent: February 21, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
  • Publication number: 20230042990
    Abstract: The technology disclosed relates to a motion sensory and imaging device capable of acquiring imaging information of the scene and providing at least a near real time pass-through of imaging information to a user. The sensory and imaging device can be used stand-alone or coupled to a wearable or portable device to create a wearable sensory system capable of presenting to the wearer the imaging information augmented with virtualized or created presentations of information.
    Type: Application
    Filed: October 24, 2022
    Publication date: February 9, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Neeloy ROY, Hongyuan HE
  • Patent number: 11568105
    Abstract: The technology disclosed relates to simplifying the updating of a predictive model by clustering observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space.
    Type: Grant
    Filed: May 5, 2021
    Date of Patent: January 31, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Kevin Horowitz, Raffi Bedikian, Hua Yang
  • Patent number: 11567578
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its contact with, a “virtual touch plane or surface” (i.e., a plane, portion of a plane, and/or surface computationally defined in space, or corresponding to any physical surface).
    Type: Grant
    Filed: November 9, 2020
    Date of Patent: January 31, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Hua Yang, Leonid Kontsevich, James Donald, David S. Holz, Jonathan Marsden, Paul Durdik
  • Publication number: 20220413566
    Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
    Type: Application
    Filed: September 1, 2022
    Publication date: December 29, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Paul DURDIK
  • Patent number: 11537196
    Abstract: The technology disclosed can provide capabilities such as using motion sensors and/or other types of sensors coupled to a motion-capture system to monitor motions within a real environment. A virtual object can be projected to a user of a portable device integrated into an augmented rendering of a real environment about the user. Motion information of a user body portion is determined based at least in part upon sensory information received from imaging or acoustic sensory devices. Control information is communicated to a system based in part on a combination of the motion of the portable device and the detected motion of the user. The virtual device experience can be augmented in some implementations by the addition of haptic, audio and/or other sensory information projectors.
    Type: Grant
    Filed: August 23, 2021
    Date of Patent: December 27, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventor: David S. Holz
  • Patent number: 11537208
    Abstract: The technology disclosed relates to determining intent for an interaction by calculating a center of effort for the applied forces. Movement of the points of virtual contacts and the center of effort are then monitored to determine a gesture type intended for the interaction. The number of points of virtual contacts of the feeler zones and proximities between the points of virtual contacts are used to determine a degree of precision of a control object gesture.
    Type: Grant
    Filed: April 16, 2020
    Date of Patent: December 27, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Pohung Chen, David S. Holz
  • Patent number: 11538224
    Abstract: The technology disclosed can provide improved safety by detecting potential unsafe conditions (e.g., collisions, loss of situational awareness, etc.) confronting the user of a wearable (or portable) sensor configured to capture motion and/or determining the path of an object based on imaging, acoustic or vibrational waves. Implementations can enable improved safety to users of virtual reality for machine control and/or machine communications applications using wearable (or portable) devices, e.g., head mounted displays (HMDs), wearable goggles, watch computers, smartphones, and so forth, or mobile devices, e.g., autonomous and semi-autonomous robots, factory floor material handling systems, autonomous mass-transit vehicles, automobiles (human or machine driven), and so forth, equipped with suitable sensors and processors employing optical, audio or vibrational detection.
    Type: Grant
    Filed: November 7, 2019
    Date of Patent: December 27, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Robert Samuel Gordon, Matias Perez
  • Publication number: 20220404917
    Abstract: Methods and systems are provided for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine a user's intent to change modes of operation; and, upon determining that intent, interpreting subsequent user gestures in accordance with a second mode of operation.
    Type: Application
    Filed: August 26, 2022
    Publication date: December 22, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventor: David S. HOLZ
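Several of the abstracts above describe concrete computations. As a rough, hypothetical sketch of the surface-normal clustering step described in patent 11568105 (and publication 20230169236): the function name, the azimuthal quantization scheme, and the sample data below are illustrative assumptions, not taken from the patent text, which also clusters by spatial adjacency.

```python
import math
from collections import defaultdict

def cluster_by_normal(points, normals, bins=8):
    """Group observed 3D points by the coarse direction of their
    surface normals. Each normal is quantized into one of `bins`
    azimuthal sectors; points sharing a sector form one cluster.
    (Illustrative only; the patented method also uses adjacency.)"""
    clusters = defaultdict(list)
    for p, n in zip(points, normals):
        azimuth = math.atan2(n[1], n[0])  # normal direction in the x-y plane
        sector = int((azimuth + math.pi) / (2 * math.pi) * bins) % bins
        clusters[sector].append(p)
    return dict(clusters)

# Two points whose normals face roughly +x, and one facing -x:
pts = [(0, 0, 0), (1, 0, 0), (5, 5, 0)]
nrm = [(1, 0, 0), (1, 0.1, 0), (-1, 0, 0)]
groups = cluster_by_normal(pts, nrm)
```

Here the two +x-facing points land in one cluster and the -x-facing point in another; a hand model's segments could then be matched against such clusters.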
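The "center of effort" calculation named in patent 11537208 (and publication 20230121570) can likewise be sketched as a force-weighted centroid of the virtual contact points. This is a minimal illustration under that assumption; the helper name and data layout are not from the patent.

```python
def center_of_effort(contacts):
    """Force-weighted centroid of virtual contact points, where each
    contact is ((x, y, z), force_magnitude). Monitoring how this
    center moves over time is what the abstract describes using to
    infer the intended gesture type. (Hypothetical sketch.)"""
    total = sum(f for _, f in contacts)
    if total == 0:
        raise ValueError("no applied force")
    return tuple(
        sum(p[i] * f for p, f in contacts) / total for i in range(3)
    )

# Two fingertip contacts; the stronger press pulls the center toward it:
coe = center_of_effort([((0, 0, 0), 1.0), ((2, 0, 0), 3.0)])
# coe == (1.5, 0.0, 0.0)
```

Tracking `coe` across frames, together with the count and spacing of contact points, would give the gesture-type and precision signals the abstract mentions.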