Patents by Inventor Paul A. Durdik

Paul A. Durdik has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240028131
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: September 26, 2023
    Publication date: January 25, 2024
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventors: Isaac COHEN, Maxwell SILLS, Paul DURDIK
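The gesture-to-camera mapping described in the abstract above can be sketched as follows; the class, the pinch/swipe parameterization, and the gain value are illustrative assumptions, not details from the patent:

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    """Hypothetical virtual-camera state in the 3D sensor space."""
    zoom: float = 1.0
    pan_x: float = 0.0
    pan_y: float = 0.0

def apply_control_gesture(camera, pinch_scale, swipe_dx, swipe_dy, pan_gain=0.01):
    """Map gesture parameters to visual parameters of the virtual camera:
    the pinch ratio scales zoom, and the swipe displacement pans the view."""
    camera.zoom = max(0.1, camera.zoom * pinch_scale)
    camera.pan_x += swipe_dx * pan_gain
    camera.pan_y += swipe_dy * pan_gain
    return camera

# A pinch that doubles finger separation while the hand swipes 50 units right:
cam = apply_control_gesture(VirtualCamera(), pinch_scale=2.0, swipe_dx=50.0, swipe_dy=0.0)
```

The point of the sketch is the claimed structure: each gesture parameter drives exactly one visual parameter of the virtual camera.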
  • Publication number: 20240004438
    Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
    Type: Application
    Filed: September 19, 2023
    Publication date: January 4, 2024
    Applicant: ULTRAHAPTICS IP TWO LIMITED
    Inventors: David S. HOLZ, Paul DURDIK
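The geometry in the abstract above (a prism bending a camera's optical axis so its field of view covers a band of angles measured from the display's tangential axis) can be sketched as a small calculation; the specific axis, half-field, and deviation values are illustrative numbers chosen to land in the 45-80 degree band the abstract mentions, not values from the patent:

```python
def redirected_fov(optical_axis_deg, half_fov_deg, prism_deviation_deg):
    """Return the (min, max) angles from the display's tangential axis that a
    camera covers after a prismatic element bends its optical axis."""
    axis = optical_axis_deg + prism_deviation_deg
    return (axis - half_fov_deg, axis + half_fov_deg)

# A camera whose optical axis points 90 degrees from tangential (straight out
# of the screen), with a 17.5-degree half field of view, bent 27.5 degrees
# toward the screen surface by the prism:
lo, hi = redirected_fov(90.0, 17.5, -27.5)
```

With these assumed numbers the redirected field spans 45 to 80 degrees from tangential, covering blind spots near the screen surface that the unbent camera could not see.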
  • Publication number: 20230342024
    Abstract: The technology disclosed relates to selecting a virtual item from a virtual grid in a three-dimensional (3D) sensory space. It also relates to navigating a virtual modality displaying a plurality of virtual items arranged in a grid, and automatically selecting a virtual item in a virtual grid at a terminal end of a control gesture of a control object, responsive to a terminal gesture that transitions the control object from one physical arrangement to another. In one implementation, the control object is a hand. In some implementations, physical arrangements of the control object include at least a flat hand with thumb parallel to fingers, closed, half-open, pinched, curled, fisted, mime gun, okay sign, thumbs-up, ILY sign, one-finger point, two-finger point, thumb point, or pinkie point.
    Type: Application
    Filed: June 29, 2023
    Publication date: October 26, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Bingxin Ku, Pohung Chen, Isaac Cohen, Paul A. Durdik
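The selection rule in the abstract above (pick the grid cell under the control object when it transitions between physical arrangements) can be sketched as follows; the arrangement labels, cell size, and grid layout are illustrative assumptions:

```python
def select_grid_item(position, arrangement_before, arrangement_after,
                     cell_size=1.0, grid_cols=4):
    """Return the flat index of the grid cell under `position` when the hand
    transitions between physical arrangements (e.g. 'open' -> 'pinched'),
    or None when no terminal gesture occurred."""
    if arrangement_before == arrangement_after:
        return None  # no transition, so no selection
    x, y = position
    col = int(x // cell_size)
    row = int(y // cell_size)
    return row * grid_cols + col

# Hand pinches at the terminal end of a gesture over column 2, row 1:
idx = select_grid_item((2.5, 1.2), "open", "pinched")
```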
  • Patent number: 11775033
    Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
    Type: Grant
    Filed: September 1, 2022
    Date of Patent: October 3, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Paul Durdik
  • Patent number: 11775080
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Grant
    Filed: October 3, 2022
    Date of Patent: October 3, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
  • Patent number: 11567578
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its contact with, a “virtual touch plane or surface” (i.e., a plane, portion of a plane, and/or surface computationally defined in space, or corresponding to any physical surface).
    Type: Grant
    Filed: November 9, 2020
    Date of Patent: January 31, 2023
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Hua Yang, Leonid Kontsevich, James Donald, David S. Holz, Jonathan Marsden, Paul Durdik
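The mode-switching idea in the abstract above (crossing a computationally defined plane toggles the control mode) can be sketched with a simple crossing test; the plane position and the mode names are illustrative assumptions:

```python
def control_mode(z_prev, z_curr, plane_z=0.0):
    """Classify a control object's motion relative to a virtual touch plane
    at z = plane_z: 'engage' when it crosses toward the plane, 'release'
    when it withdraws back through it, otherwise 'hover'."""
    if z_prev > plane_z >= z_curr:
        return "engage"
    if z_prev <= plane_z < z_curr:
        return "release"
    return "hover"

# Fingertip moves from 5 cm in front of the plane to 1 cm behind it:
mode = control_mode(0.05, -0.01)
```

Tracking successive frame pairs this way lets free-space motion emulate touch-screen press and release events without any physical surface.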
  • Publication number: 20230025269
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: October 3, 2022
    Publication date: January 26, 2023
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Isaac COHEN, Maxwell SILLS, Paul DURDIK
  • Publication number: 20220413566
    Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
    Type: Application
    Filed: September 1, 2022
    Publication date: December 29, 2022
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Paul DURDIK
  • Patent number: 11460929
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Grant
    Filed: July 16, 2021
    Date of Patent: October 4, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
  • Patent number: 11435788
    Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
    Type: Grant
    Filed: March 1, 2021
    Date of Patent: September 6, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Paul Durdik
  • Patent number: 11307282
    Abstract: The technology disclosed relates to determining positional information about an object of interest. In particular, it includes conducting scanning of a field of interest with an emission from a transmission area according to an ordered scan pattern. The emission can be received to form a signal based upon at least one salient property (e.g., intensity, amplitude, frequency, polarization, phase, or other detectable feature) of the emission varying with time at the object of interest. Synchronization information about the ordered scan pattern can be derived from a source (e.g., a second signal broadcast separately, a social media share, others, and/or combinations thereof). A correspondence between at least one characteristic of the signal and the synchronization information can be established. Positional information can be determined based at least in part upon the correspondence.
    Type: Grant
    Filed: October 24, 2014
    Date of Patent: April 19, 2022
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Robert Samuel Gordon, Gabriel A. Hare, Neeloy Roy, Maxwell Sills, Paul Durdik
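The correspondence step in the abstract above (match a feature of the received signal against synchronization information about the ordered scan pattern to recover position) can be sketched as follows; the raster pattern, millisecond timing, and parameter names are illustrative assumptions:

```python
def position_from_scan(peak_time_ms, sync_start_ms, dwell_ms, scan_pattern):
    """Recover positional information from an ordered scan: the emitter visits
    each direction in `scan_pattern` for `dwell_ms` milliseconds starting at
    `sync_start_ms`; the scan step active when the receiver observes peak
    intensity (`peak_time_ms`) gives the object's direction."""
    step = ((peak_time_ms - sync_start_ms) // dwell_ms) % len(scan_pattern)
    return scan_pattern[step]

# A 3x2 raster scan; the intensity peak arrives 250 ms after the pattern
# started, i.e. during the third 100 ms dwell step:
pattern = [(0, 0), (1, 0), (2, 0), (0, 1), (1, 1), (2, 1)]
pos = position_from_scan(10250, 10000, 100, pattern)
```

The key property is that the receiver never needs to image the emitter; shared timing of the scan pattern alone turns a time-of-peak measurement into a position.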
  • Publication number: 20210382563
    Abstract: Methods and systems for processing an input are disclosed that detect a portion of a hand and/or other detectable object in a region of space monitored by a 3D sensor. The method further includes determining a zone corresponding to the region of space in which the portion of the hand or other detectable object was detected. Also, the method can include determining from the zone a correct way to interpret inputs made by a position, shape or a motion of the portion of the hand or other detectable object.
    Type: Application
    Filed: August 23, 2021
    Publication date: December 9, 2021
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Avinash DABIR, Paul DURDIK, Keith MERTENS, Michael ZAGORSEK
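The zone logic in the abstract above (the region of space where the hand is detected determines how its inputs are interpreted) can be sketched with a lookup over axis-aligned zones; the zone table and interpretation labels are illustrative assumptions, not the patent's layout:

```python
def interpret_input(position, zones):
    """Find which zone of the monitored region contains the detected hand
    portion and return (zone_name, interpretation)."""
    x, y, z = position
    for name, (bounds, interpretation) in zones.items():
        (x0, x1), (y0, y1), (z0, z1) = bounds
        if x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1:
            return name, interpretation
    return None, "ignore"

# Hand detected 10 cm from the sensor, inside the assumed 'near' zone:
zones = {
    "near": (((-1, 1), (-1, 1), (0.0, 0.2)), "touch"),
    "far":  (((-1, 1), (-1, 1), (0.2, 1.0)), "pointer"),
}
zone, mode = interpret_input((0.0, 0.5, 0.1), zones)
```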
  • Publication number: 20210342012
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Application
    Filed: July 16, 2021
    Publication date: November 4, 2021
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Isaac COHEN, Maxwell SILLS, Paul DURDIK
  • Patent number: 11099653
    Abstract: Methods and systems for processing an input are disclosed that detect a portion of a hand and/or other detectable object in a region of space monitored by a 3D sensor. The method further includes determining a zone corresponding to the region of space in which the portion of the hand or other detectable object was detected. Also, the method can include determining from the zone a correct way to interpret inputs made by a position, shape or a motion of the portion of the hand or other detectable object.
    Type: Grant
    Filed: October 21, 2019
    Date of Patent: August 24, 2021
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Michael Zagorsek, Avinash Dabir, Paul Durdik, Keith Mertens
  • Patent number: 11068070
    Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
    Type: Grant
    Filed: February 28, 2020
    Date of Patent: July 20, 2021
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
  • Publication number: 20210181810
    Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
    Type: Application
    Filed: March 1, 2021
    Publication date: June 17, 2021
    Applicant: Ultrahaptics IP Two Limited
    Inventors: David S. HOLZ, Paul DURDIK
  • Publication number: 20210081054
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its contact with, a “virtual touch plane or surface” (i.e., a plane, portion of a plane, and/or surface computationally defined in space, or corresponding to any physical surface).
    Type: Application
    Filed: November 9, 2020
    Publication date: March 18, 2021
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Hua Yang, Leonid Kontsevich, James Donald, David S. Holz, Jonathan Marsden, Paul Durdik
  • Patent number: 10936022
    Abstract: The technology disclosed relates to enhancing the fields of view of one or more cameras of a gesture recognition system for augmenting the three-dimensional (3D) sensory space of the gesture recognition system. The augmented 3D sensory space allows for inclusion of previously uncaptured regions and points for which gestures can be interpreted, i.e., blind spots of the cameras of the gesture recognition system. Some examples of such blind spots include areas underneath the cameras and/or within 20-85 degrees of a tangential axis of the cameras. In particular, the technology disclosed uses a Fresnel prismatic element and/or a triangular prism element to redirect the optical axis of the cameras, giving the cameras fields of view that cover at least 45 to 80 degrees from tangential to the vertical axis of a display screen on which the cameras are mounted.
    Type: Grant
    Filed: February 22, 2019
    Date of Patent: March 2, 2021
    Assignee: Ultrahaptics IP Two Limited
    Inventors: David S. Holz, Paul Durdik
  • Publication number: 20200401232
    Abstract: The technology disclosed relates to motion capture and gesture recognition. In particular, it calculates the exerted force implied by a human hand motion and applies the equivalent through a robotic arm to a target object. In one implementation, this is achieved by tracking the motion and contact of the human hand and generating corresponding robotic commands that replicate the motion and contact of the human hand on a workpiece through a robotic tool.
    Type: Application
    Filed: September 3, 2020
    Publication date: December 24, 2020
    Applicant: Ultrahaptics IP Two Limited
    Inventors: Maxwell Sills, Robert S. Gordon, Paul Durdik
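The force-replication idea in the abstract above (infer the force implied by a tracked hand motion and apply the equivalent through a robotic tool) can be sketched with a finite-difference acceleration estimate; the effective mass, 1-D simplification, and command format are illustrative assumptions:

```python
def robot_force_command(samples, dt, effective_mass=0.4):
    """Estimate the force implied by a tracked 1-D hand motion via a
    second-order finite difference (a = (x2 - 2*x1 + x0) / dt**2) and
    return an equivalent force command for the robotic tool."""
    x0, x1, x2 = samples  # three consecutive position samples (metres)
    accel = (x2 - 2.0 * x1 + x0) / (dt * dt)
    return {"force_newtons": effective_mass * accel}

# Hand accelerating through three frames sampled 0.1 s apart:
cmd = robot_force_command([0.0, 0.01, 0.04], dt=0.1)
```

A real controller would work in three dimensions and fold in the contact state of the hand, but the core step is the same: differentiate the tracked motion twice and scale by an assumed mass to obtain the force to replicate on the workpiece.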
  • Patent number: 10831281
    Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its contact with a “virtual touch plane or surface” (i.e., a plane, portion of a plane, and/or surface computationally defined in space, or corresponding to any physical surface).
    Type: Grant
    Filed: May 2, 2019
    Date of Patent: November 10, 2020
    Assignee: Ultrahaptics IP Two Limited
    Inventors: Hua Yang, Leonid Kontsevich, James Donald, David S. Holz, Jonathan Marsden, Paul Durdik