Patents Assigned to Ultrahaptics IP Two Limited
-
Patent number: 12105889
Abstract: The technology disclosed relates to automatically interpreting motion of a control object in a three-dimensional (3D) sensor space by sensing a movement of the control object in the 3D sensor space, interpreting the movement of the control object, and presenting the interpreted movement as a path on a display. The path may be displayed once the speed of the movement exceeds a pre-determined threshold measured in cm per second. Once the path is displayed, the technology duplicates a display object that intersects the path on the display. In some implementations, the control object may be a device, a hand, or a portion of a hand (such as a finger).
Type: Grant
Filed: June 23, 2023
Date of Patent: October 1, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, David S. Holz, Maxwell Sills
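The speed-gated path and the duplicate-on-intersection behavior can be pictured with a short sketch. The following Python is illustrative only: the 20 cm/s threshold is invented (the patent says only "pre-determined"), and the rectangular display object is a stand-in for whatever the display actually shows.

```python
import math
from dataclasses import dataclass

SPEED_THRESHOLD_CM_S = 20.0  # hypothetical value; the patent only says "pre-determined"

@dataclass
class DisplayObject:
    x: float
    y: float
    w: float
    h: float
    duplicated: bool = False

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def step(samples, path, objects):
    """samples: [(t_s, x_cm, y_cm), ...] sensed control-object positions."""
    if len(samples) < 2:
        return
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    speed = math.hypot(x1 - x0, y1 - y0) / (t1 - t0)  # cm per second
    if speed > SPEED_THRESHOLD_CM_S:
        path.append((x1, y1))  # the path becomes visible once the threshold is exceeded
    for obj in list(objects):  # duplicate each display object the path intersects, once
        if not obj.duplicated and any(obj.contains(px, py) for px, py in path):
            obj.duplicated = True
            objects.append(DisplayObject(obj.x + obj.w, obj.y, obj.w, obj.h))
```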
-
Patent number: 12099660
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: November 10, 2022
Date of Patent: September 24, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Isaac Cohen, Maxwell Sills
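One way to picture "gesture parameters driving camera parameters" is a direct mapping from measured gesture quantities to camera settings. The sketch below is a guess at such a mapping; every pairing (pinch to zoom, sweep to orbit, hand speed to dolly rate) and every constant is an assumption, not the patented method.

```python
def camera_from_gesture(pinch_width_cm: float, sweep_angle_deg: float,
                        hand_speed_cm_s: float) -> dict:
    """Map gesture parameters onto virtual-camera parameters.

    All three mappings and their constants are illustrative assumptions.
    """
    return {
        "field_of_view_deg": min(90.0, max(20.0, pinch_width_cm * 6.0)),  # pinch -> zoom
        "yaw_deg": sweep_angle_deg % 360.0,                               # sweep -> orbit
        "dolly_speed_cm_s": 0.1 * hand_speed_cm_s,                        # speed -> travel
    }

print(camera_from_gesture(pinch_width_cm=8.0, sweep_angle_deg=45.0, hand_speed_cm_s=30.0))
```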
-
Patent number: 12086327
Abstract: A method for detecting a finger is provided. The method includes obtaining a plurality of digital images including a first digital image captured by a camera from a field of view containing a background and a hand including at least one finger, and obtaining an identification of pixels of the plurality of digital images that correspond to at least one finger that is visible in the plurality of digital images rather than to the background, the pixels being identified by: obtaining, from the digital images, a Gaussian brightness falloff pattern indicative of at least one finger, identifying an axis of the at least one finger based on the obtained Gaussian brightness falloff pattern indicative of the at least one finger without identifying edges of the at least one finger, and identifying the pixels that correspond to the at least one finger based on the identified axis.
Type: Grant
Filed: September 18, 2023
Date of Patent: September 10, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Hua Yang
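The key idea, finding the finger's axis from the brightness ridge rather than from its edges, can be sketched simply: the peak of each image row's brightness profile lies on the axis of a Gaussian falloff, so a line fit through the per-row peaks recovers the axis. This is a minimal sketch under simplifying assumptions (one roughly vertical finger, brighter than the background), not the patented pipeline.

```python
import numpy as np

def finger_axis(image: np.ndarray):
    """Estimate a finger axis from per-row brightness peaks, with no edge detection.

    Assumes a single, roughly vertical finger that is brighter than the
    background, with brightness falling off roughly Gaussian across its width.
    """
    rows, cols = [], []
    threshold = image.mean() + image.std()        # assumed "finger present" cutoff
    for r, row in enumerate(image):
        c = int(np.argmax(row))                   # column of this row's brightness peak
        if row[c] > threshold:
            rows.append(r)
            cols.append(c)
    if len(rows) < 2:
        return None                               # no finger-like ridge found
    slope, intercept = np.polyfit(rows, cols, 1)  # least-squares line through the peaks
    return slope, intercept                       # axis: col = slope * row + intercept
```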
-
Patent number: 12086323
Abstract: A method and system for controlling an electronic device using gestures and/or a device is provided. The method includes capturing, in a 3D sensor space, an image including a user-manipulable hand-held input device and a body part of a user; finding an entry in a database of multiple user-manipulable hand-held input devices that matches the image of the user-manipulable hand-held input device, wherein each user-manipulable hand-held input device having an entry in the database respectively generates signals in response to performing one or more specific control manipulations; determining a primary control mode of primarily controlling the electronic device using 3D gestures or using control manipulations directly from the user-manipulable hand-held input device, the primary control mode being determined based on a predetermined priority level associated with the user-manipulable hand-held input device; and controlling the electronic device using the determined primary control mode.
Type: Grant
Filed: November 22, 2021
Date of Patent: September 10, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: David Holz
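The priority-based mode selection can be pictured as a lookup plus a comparison. In the sketch below, the database contents, device names, and priority numbers are all invented for illustration; the abstract specifies only that the priority level is predetermined and associated with the device.

```python
from typing import Optional

# Hypothetical database: each recognized hand-held device carries a
# predetermined priority level (all names and numbers invented).
DEVICE_DB = {
    "tv_remote": 2,
    "game_wand": 3,
    "laser_pointer": 1,
}

GESTURE_PRIORITY = 2  # assumed rank of free-space 3D gestures

def primary_control_mode(matched_device: Optional[str]) -> str:
    """Choose the primary control mode from the database match, if any."""
    if matched_device is None:
        return "3d_gestures"              # nothing recognized in the image
    if DEVICE_DB.get(matched_device, 0) > GESTURE_PRIORITY:
        return "device_manipulations"     # the device's own controls take priority
    return "3d_gestures"

print(primary_control_mode("game_wand"))      # -> device_manipulations
print(primary_control_mode("laser_pointer"))  # -> 3d_gestures
```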
-
Patent number: 12086322
Abstract: The technology disclosed relates to automatically (e.g., programmatically) initializing predictive information for tracking a complex control object (e.g., a hand, a hand-and-tool combination, a robot end effector) based upon information about characteristics of the object determined from sets of collected observed information. Automated initialization techniques obviate the need for the special and often bizarre start-up rituals (place your hands on the screen at the places indicated during a full moon, and so forth) required by conventional techniques. In implementations, systems can refine the initial predictive information to reflect an observed condition based on comparison of the observed condition with an analysis of sets of collected observed information.
Type: Grant
Filed: April 19, 2021
Date of Patent: September 10, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: Kevin A. Horowitz
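As one illustration of initializing predictive information from observed characteristics rather than a start-up ritual, the sketch below scales a canonical hand model to an observed palm width. The canonical proportions and the scaling rule are invented for the example; the patent does not specify this scheme.

```python
# Canonical hand proportions -- invented numbers for illustration only.
CANONICAL_HAND = {
    "palm_width_cm": 8.5,
    "finger_lengths_cm": [7.0, 7.6, 7.2, 5.8, 5.2],  # thumb .. pinky
}

def init_hand_model(observed_palm_width_cm: float) -> dict:
    """Initialize predictive information directly from an observed measurement,
    leaving later refinement steps to adjust it as more observations arrive."""
    scale = observed_palm_width_cm / CANONICAL_HAND["palm_width_cm"]
    return {
        "palm_width_cm": observed_palm_width_cm,
        "finger_lengths_cm": [scale * f for f in CANONICAL_HAND["finger_lengths_cm"]],
    }

print(init_hand_model(9.2))
```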
-
Publication number: 20240296588
Abstract: The technology disclosed relates to coordinating motion-capture of a hand by a network of motion-capture sensors having overlapping fields of view. In particular, it relates to designating a first sensor among three or more motion-capture sensors as having a master frame of reference, observing motion of a hand as it passes through the overlapping fields of view of the respective motion-capture sensors, synchronizing capture of images of the hand within the overlapping fields of view by pairs of the motion-capture devices, and using the pairs of hand images captured by the synchronized motion-capture devices to automatically calibrate the motion-capture sensors to the master frame of reference.
Type: Application
Filed: May 13, 2024
Publication date: September 5, 2024
Applicant: Ultrahaptics IP Two Limited
Inventor: David S. HOLZ
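Calibrating a second sensor to the master frame of reference from synchronized observations of the same hand is, at its core, a rigid-transform estimation from corresponding 3D points. The sketch below uses the standard Kabsch algorithm, one well-known technique for this step, not necessarily the one claimed.

```python
import numpy as np

def rigid_transform(points_slave: np.ndarray, points_master: np.ndarray):
    """Least-squares rotation R and translation t with master ≈ R @ slave + t.

    Inputs are (N, 3) arrays of the same hand points observed simultaneously
    by two sensors whose fields of view overlap (Kabsch algorithm).
    """
    a = np.asarray(points_slave, float)
    b = np.asarray(points_master, float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    H = (a - ca).T @ (b - cb)                 # 3x3 cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t
```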
-
Patent number: 12067157
Abstract: The technology disclosed can provide capabilities such as using motion sensors and/or other types of sensors coupled to a motion-capture system to monitor motions within a real environment. A virtual object can be projected to a user of a portable device, integrated into an augmented rendering of the real environment about the user. Motion information of a user body portion is determined based at least in part upon sensory information received from imaging or acoustic sensory devices. Control information is communicated to a system based in part on a combination of the motion of the portable device and the detected motion of the user. The virtual device experience can be augmented in some implementations by the addition of haptic, audio, and/or other sensory information projectors.
Type: Grant
Filed: December 23, 2022
Date of Patent: August 20, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: David S. Holz
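The "combination of the motion of the portable device and the detected motion of the user" admits many readings; one simple one is to cancel the device's own motion out of the sensed hand motion so that carrying the device does not register as a gesture. The convention below is an assumption for illustration.

```python
def relative_hand_motion(hand_velocity, device_velocity):
    """Combine portable-device motion with sensed hand motion (3-vectors, cm/s)
    by subtracting the device's velocity -- one simple, assumed convention for
    deriving control information from the two motions together."""
    return tuple(h - d for h, d in zip(hand_velocity, device_velocity))

# A hand sweeping right while the user walks right at the same speed: no gesture.
print(relative_hand_motion((12.0, 0.0, 0.0), (12.0, 0.0, 0.0)))  # -> (0.0, 0.0, 0.0)
```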
-
Publication number: 20240257481
Abstract: The technology disclosed relates to a method of realistic rendering of a real object as a virtual object in a virtual space using an offset in the position of the hand in a three-dimensional (3D) sensory space. An offset between expected positions of the eye(s) of a wearer of a head-mounted device and a sensor attached to the head-mounted device for sensing a position of at least one hand in a three-dimensional (3D) sensory space is determined. A position of the hand in the three-dimensional (3D) sensory space can be sensed using a sensor. The sensed position of the hand can be transformed by the offset into a re-rendered position of the hand as it would appear to the wearer of the head-mounted device if the wearer were looking at the actual hand. The re-rendered hand can be depicted to the wearer of the head-mounted device.
Type: Application
Filed: April 8, 2024
Publication date: August 1, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: Alex Marcolina, David Holz
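The core transform is small: shift the sensed hand position by the eye-to-sensor offset before rendering. A minimal sketch follows; the offset vector is an assumed mounting geometry, and a full solution would also account for the sensor's rotation relative to the eyes.

```python
import numpy as np

# Assumed mounting geometry: the sensor sits a few centimeters below and in
# front of the wearer's eyes. A real system would measure or calibrate this.
EYE_TO_SENSOR_OFFSET_CM = np.array([0.0, -4.0, 8.0])

def rerender_position(hand_in_sensor_frame) -> np.ndarray:
    """Translate a sensed hand position from the head-mounted sensor's frame
    into the wearer's eye frame, so the rendered hand lands where the wearer
    would see the real one. This sketch assumes axis-aligned frames; a full
    transform would apply the sensor's rotation as well."""
    return np.asarray(hand_in_sensor_frame, float) + EYE_TO_SENSOR_OFFSET_CM

print(rerender_position([10.0, 5.0, 40.0]))
```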
-
Patent number: 12050757
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspectives of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Grant
Filed: February 24, 2023
Date of Patent: July 30, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
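One plausible reading of combining RGB and IR pixels for tracking is to fuse the two modalities into a single candidate mask: IR isolates nearby illuminated objects such as hands, while RGB luminance contributes scene detail. The blend weights and threshold below are invented tuning values, not the patented combination.

```python
import numpy as np

def tracking_mask(rgb: np.ndarray, ir: np.ndarray,
                  ir_threshold: float = 0.6) -> np.ndarray:
    """Fuse RGB and IR pixels into one candidate mask for motion tracking.

    Assumes `rgb` is (H, W, 3) in [0, 1] and `ir` is (H, W) in [0, 1], with the
    IR channel mostly lighting up nearby objects. Thresholds are illustrative.
    """
    luminance = rgb @ np.array([0.299, 0.587, 0.114])    # RGB -> grayscale
    fused = 0.5 * luminance + 0.5 * ir                   # blend both modalities
    return (ir > ir_threshold) & (fused > fused.mean())  # boolean per-pixel mask
```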
-
Patent number: 12039103
Abstract: The technology disclosed relates to determining intent for an interaction by calculating a center of effort for the applied forces. Movement of the points of virtual contact and of the center of effort is then monitored to determine a gesture type intended for the interaction. The number of points of virtual contact of the feeler zones and the proximities between the points of virtual contact are used to determine the degree of precision of a control-object gesture.
Type: Grant
Filed: December 19, 2022
Date of Patent: July 16, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Pohung Chen, David S. Holz
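A center of effort for applied forces is naturally expressed as a force-weighted centroid of the contact points. The sketch below computes it; the comment about interpreting its motion (press vs. swipe) is an assumed illustration, not the patented classifier.

```python
import numpy as np

def center_of_effort(contact_points, forces) -> np.ndarray:
    """Force-weighted centroid of the points of virtual contact.

    contact_points: (N, 3) positions; forces: (N,) applied force magnitudes.
    Tracking how this point moves frame to frame is one way to separate, say,
    a press (center stays put) from a swipe (center translates).
    """
    p = np.asarray(contact_points, float)
    f = np.asarray(forces, float)
    return (f[:, None] * p).sum(axis=0) / f.sum()

# Two fingertips pressing, the right one twice as hard: the center skews right.
print(center_of_effort([[0, 0, 0], [4, 0, 0]], [1.0, 2.0]))  # -> [2.667 0. 0.]
```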
-
Patent number: 12032746
Abstract: The technology disclosed relates to a method of realistic displacement of a virtual object for an interaction between a control object in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and, in response to detecting a 2D sub-component free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and the resulting rotation of the virtual object by the 3D solid control object model.
Type: Grant
Filed: July 18, 2022
Date of Patent: July 9, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Alex Marcolina, David S. Holz
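A realistic rotation from a virtual contact can be pictured with simple rigid-body intuition: an off-center push produces a torque proportional to r × F. The sketch below applies that rule; the gain constant is invented, and this physics shorthand stands in for whatever displacement model the patent actually claims.

```python
import numpy as np

def rotation_from_contact(contact_point, push_dir, obj_center, gain: float = 0.05):
    """Axis-angle rotation a virtual object could receive when a fingertip
    pushes it off-center: torque ~ r x F, so the farther from the center the
    contact lands, the more the object spins. The gain is an assumed tuning
    constant for illustration."""
    r = np.asarray(contact_point, float) - np.asarray(obj_center, float)
    torque = np.cross(r, np.asarray(push_dir, float))
    magnitude = np.linalg.norm(torque)
    if magnitude < 1e-9:
        return None, 0.0                          # push through the center: no rotation
    return torque / magnitude, gain * magnitude   # unit axis, rotation angle (radians)
```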
-
Patent number: 12025719
Abstract: A method and system determine object orientation using a light source to create a shadow line extending from the light source. A camera captures an image including the shadow line on an object surface. An orientation module determines the surface orientation from the shadow line. In some examples, a transparency imperfection in a window through which a camera receives light can be detected, and a message sent to a user as to the presence of a light-blocking or light-distorting substance or particle. A system can control illumination while imaging an object in space using a light source mounted to a support structure so that a camera captures an image of the illuminated object. Direct illumination of the camera by light from the light source can be prevented, such as by blocking the light or by using a light-transmissive window adjacent to the camera to reject light transmitted directly from the light source.
Type: Grant
Filed: June 10, 2021
Date of Patent: July 2, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: David Holz
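The window-imperfection detection admits a simple heuristic sketch: a region that stays unexpectedly dark across many frames, while the rest of the scene changes, is more likely dirt on the window than scene content. Both thresholds below are invented tuning values.

```python
import numpy as np

def window_warning(frames, dark_level: int = 30, dark_fraction: float = 0.05):
    """Flag a possible smudge or particle on the sensor window.

    frames: sequence of (H, W) grayscale images. Heuristic sketch with assumed
    thresholds: pixels dark in every frame suggest a light-blocking substance.
    """
    stack = np.asarray(frames, float)               # (num_frames, H, W)
    always_dark = (stack < dark_level).all(axis=0)  # dark in every single frame
    if always_dark.mean() > dark_fraction:
        return "Possible smudge or particle on the device window; please wipe it."
    return None
```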
-
Patent number: 12020458
Abstract: The technology disclosed relates to coordinating motion-capture of a hand by a network of motion-capture sensors having overlapping fields of view. In particular, it relates to designating a first sensor among three or more motion-capture sensors as having a master frame of reference, observing motion of a hand as it passes through the overlapping fields of view of the respective motion-capture sensors, synchronizing capture of images of the hand within the overlapping fields of view by pairs of the motion-capture devices, and using the pairs of hand images captured by the synchronized motion-capture devices to automatically calibrate the motion-capture sensors to the master frame of reference.
Type: Grant
Filed: January 18, 2022
Date of Patent: June 25, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: David S. Holz
-
Patent number: 11994377
Abstract: Methods and systems for capturing motion and/or determining the shapes and positions of one or more objects in 3D space utilize cross-sections thereof. In various embodiments, images of the cross-sections are captured using a camera, based on reflections therefrom or shadows cast thereby.
Type: Grant
Filed: September 2, 2020
Date of Patent: May 28, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: David S. Holz
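To make the cross-section idea concrete, the sketch below stacks per-slice descriptions into a coarse 3D shape. Modeling each cross-section as an ellipse (center plus major and minor axes) is an illustrative simplification chosen here, not a detail taken from the patent.

```python
import math

def shape_from_cross_sections(slices, dz_cm: float = 0.5):
    """Rebuild a coarse 3D description of an object from a stack of elliptical
    cross-sections, one per imaged slice.

    slices: [(cx_cm, cy_cm, major_cm, minor_cm), ...] ordered along the z axis.
    Returns the slices with z coordinates attached, plus an approximate volume.
    """
    solid = [(cx, cy, i * dz_cm, a, b) for i, (cx, cy, a, b) in enumerate(slices)]
    volume = sum(math.pi * (a / 2) * (b / 2) * dz_cm for _, _, a, b in slices)
    return solid, volume

# Three slices of a finger-like object, thinning toward the tip.
print(shape_from_cross_sections([(0, 0, 1.8, 1.6), (0, 0, 1.7, 1.5), (0, 0, 1.4, 1.2)]))
```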
-
Patent number: 11954808
Abstract: The technology disclosed relates to a method of realistic rendering of a real object as a virtual object in a virtual space using an offset in the position of the hand in a three-dimensional (3D) sensory space. An offset between expected positions of the eye(s) of a wearer of a head-mounted device and a sensor attached to the head-mounted device for sensing a position of at least one hand in a three-dimensional (3D) sensory space is determined. A position of the hand in the three-dimensional (3D) sensory space can be sensed using a sensor. The sensed position of the hand can be transformed by the offset into a re-rendered position of the hand as it would appear to the wearer of the head-mounted device if the wearer were looking at the actual hand. The re-rendered hand can be depicted to the wearer of the head-mounted device.
Type: Grant
Filed: February 7, 2022
Date of Patent: April 9, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Alex Marcolina, David Holz
-
Patent number: 11941163
Abstract: The technology disclosed relates to a method of realistic simulation of real-world interactions as virtual interactions between a control object sensed acting in a three-dimensional (3D) sensory space and a virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and, in response to detecting a free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and the resulting motions of the virtual object by the 3D solid control object model.
Type: Grant
Filed: January 27, 2022
Date of Patent: March 26, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: John Adrian Arthur Johnston, Johnathon Scott Selstad, Alex Marcolina
-
Publication number: 20240094860
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real-world space from the perspectives of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video streams to a second user of the wearable sensor system.
Type: Application
Filed: February 24, 2023
Publication date: March 21, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Publication number: 20240077950
Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The technology disclosed includes determining from the motion information whether a motion of the control object with respect to the virtual control construct is an engagement gesture, such as a virtual mouse click or other control device operation. The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
Type: Application
Filed: November 9, 2023
Publication date: March 7, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ
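A virtual control construct can be pictured as a plane in front of the user that a fingertip pierces to produce an engagement, with the plane's position drifting toward the hovering finger over time. The constants below are invented, and the re-homing rule is one simple reading of "updated from time to time", not the claimed method.

```python
class VirtualPlane:
    """A virtual control construct: a plane at depth z_cm that a fingertip can
    pierce to produce an engagement (akin to a mouse click). While the finger
    hovers in front of the plane, the plane slowly re-homes toward the fingertip
    so it stays within comfortable reach. All constants are assumptions."""

    def __init__(self, z_cm: float = 30.0, follow: float = 0.02):
        self.z_cm = z_cm      # plane depth, measured from the sensor
        self.follow = follow  # fraction of the gap closed per update
        self.engaged = False

    def update(self, fingertip_z_cm: float) -> bool:
        """Return True exactly on the frame the fingertip pierces the plane."""
        was_engaged = self.engaged
        self.engaged = fingertip_z_cm < self.z_cm
        if not self.engaged:  # only drift while disengaged, so clicks stay crisp
            self.z_cm += self.follow * (fingertip_z_cm - self.z_cm)
        return self.engaged and not was_engaged

plane = VirtualPlane()
for z in (40.0, 35.0, 29.0, 28.0):  # finger approaching, then piercing
    print(plane.update(z))          # False, False, True, False
```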
-
Patent number: 11914792
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Grant
Filed: February 17, 2023
Date of Patent: February 27, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
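The abstract features named above (pose, grab strength, pinch strength, confidence) can be made concrete with a small sketch. The field names, ranges, and the distance-based pinch formula below are illustrative assumptions, not the patented definitions.

```python
import math
from dataclasses import dataclass

@dataclass
class HandFeatures:
    """Abstract features of a tracked hand, in the spirit of the list above.
    Field names and ranges are illustrative assumptions."""
    grab_strength: float   # 0.0 = open hand .. 1.0 = fist
    pinch_strength: float  # 0.0 = thumb and index apart .. 1.0 = touching
    confidence: float      # tracker's confidence in this observation, 0..1

def pinch_strength(thumb_tip, index_tip, max_gap_cm: float = 7.0) -> float:
    """Map the thumb-to-index distance onto [0, 1]; max_gap_cm is an assumed
    fully-open spread beyond which the pinch reads as zero."""
    gap = math.dist(thumb_tip, index_tip)
    return max(0.0, 1.0 - gap / max_gap_cm)

print(HandFeatures(0.2, pinch_strength((0, 0, 0), (0.7, 0, 0)), 0.95))
```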
-
Publication number: 20240061511
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or of a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, the gesture is identified as an engagement gesture and compared with reference gestures from a library of reference gestures. In some embodiments, a degree of completion of the recognized engagement gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Application
Filed: July 7, 2023
Publication date: February 22, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ, Maxwell SILLS, Matias PEREZ, Gabriel HARE, Ryan JULIAN
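Degree of completion and dominant-gesture selection both admit short sketches. Below, completion is measured as the fraction of a reference gesture's arc length traversed, and the dominant gesture is simply the strongest-scoring candidate; both measures are illustrative choices, not the claimed methods.

```python
import math

def degree_of_completion(observed_path, reference_path) -> float:
    """Fraction of a reference engagement gesture the user has traversed,
    compared by arc length. Paths are ordered (x, y) points."""
    def arc_length(path):
        return sum(math.dist(a, b) for a, b in zip(path, path[1:]))
    return min(1.0, arc_length(observed_path) / arc_length(reference_path))

def dominant_gesture(candidates: dict) -> str:
    """Pick the gesture that should drive the display from simultaneous
    candidates, here simply the one with the greatest strength score."""
    return max(candidates, key=candidates.get)

# Halfway through a 10-unit swipe, while a weak wave is also detected.
print(degree_of_completion([(0, 0), (5, 0)], [(0, 0), (10, 0)]))  # -> 0.5
print(dominant_gesture({"swipe": 0.8, "wave": 0.3}))              # -> swipe
```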