Patents Assigned to Ultrahaptics IP Two Limited
-
Patent number: 12086935
Abstract: Free space machine interface and control can be facilitated by predictive entities useful in interpreting a control object's position and/or motion (including objects having one or more articulating members, i.e., humans and/or animals and/or machines). Predictive entities can be driven using motion information captured using image information or the equivalents. Predictive information can be improved by applying techniques that correlate it with information from observations.
Type: Grant
Filed: September 26, 2023
Date of Patent: September 10, 2024
Assignee: ULTRAHAPTICS IP TWO LIMITED
Inventors: Kevin A. Horowitz, David S. Holz
-
Patent number: 12086328
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: September 26, 2023
Date of Patent: September 10, 2024
Assignee: ULTRAHAPTICS IP TWO LIMITED
Inventors: Isaac Cohen, Maxwell Sills, Paul Durdik
-
Patent number: 12086327
Abstract: A method for detecting a finger is provided. The method includes obtaining a plurality of digital images including a first digital image captured by a camera from a field of view containing a background and a hand including at least one finger, and obtaining an identification of pixels of the plurality of digital images that correspond to at least one finger that is visible in the plurality of digital images rather than to the background. The pixels are identified by: obtaining, from the digital images, a Gaussian brightness falloff pattern indicative of at least one finger; identifying an axis of the at least one finger based on the obtained Gaussian brightness falloff pattern without identifying edges of the at least one finger; and identifying the pixels that correspond to the at least one finger based on the identified axis.
Type: Grant
Filed: September 18, 2023
Date of Patent: September 10, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Hua Yang
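The edge-free axis identification described in this abstract can be sketched numerically: treat each pixel's brightness as a weight and take the principal direction of the brightness-weighted pixel cloud. This is an illustrative sketch under assumed inputs, not the patented method; the function name and the weighted-PCA formulation are assumptions for illustration.

```python
import numpy as np

def finger_axis_from_brightness(image):
    """Estimate an axis through a bright, elongated region (e.g., a finger
    with Gaussian brightness falloff) without any edge detection.

    Returns the brightness-weighted centroid (x, y) and a unit vector
    along the principal direction of the weighted pixel cloud."""
    ys, xs = np.nonzero(image > 0)
    w = image[ys, xs].astype(float)
    pts = np.column_stack([xs, ys]).astype(float)
    centroid = np.average(pts, axis=0, weights=w)
    # Scale centered points by sqrt(weight) so the SVD performs a
    # brightness-weighted principal component analysis.
    centered = (pts - centroid) * np.sqrt(w)[:, None]
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centroid, vt[0]
```

On a synthetic image whose brightness falls off as a Gaussian across rows (a horizontal "finger"), the recovered axis is horizontal and the centroid sits on the brightest row.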
-
Patent number: 12086323
Abstract: A method and system for controlling an electronic device using a gesture and/or a device is provided. The method includes capturing, in a 3D sensor space, an image including a user manipulable hand-held input device and a body part of a user; finding an entry in a database of multiple user manipulable hand-held input devices that matches the image of the user manipulable hand-held input device, wherein each user manipulable hand-held input device having an entry in the database respectively generates signals in response to performing one or more specific control manipulations; determining a primary control mode of primarily controlling the electronic device using 3D gestures or using control manipulations directly from the user manipulable hand-held input device, the primary control mode being determined based on a predetermined priority level associated with the user manipulable hand-held input device; and controlling the electronic device using the determined primary control mode.
Type: Grant
Filed: November 22, 2021
Date of Patent: September 10, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: David Holz
-
Patent number: 12086322
Abstract: The technology disclosed relates to automatically (e.g., programmatically) initializing predictive information for tracking a complex control object (e.g., hand, hand and tool combination, robot end effector) based upon information about characteristics of the object determined from sets of collected observed information. Automated initialization techniques obviate the need for special and often bizarre start-up rituals (place your hands on the screen at the places indicated during a full moon, and so forth) required by conventional techniques. In implementations, systems can refine initial predictive information to reflect an observed condition based on comparison of the observed with an analysis of sets of collected observed information.
Type: Grant
Filed: April 19, 2021
Date of Patent: September 10, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: Kevin A. Horowitz
-
Publication number: 20240296588
Abstract: The technology disclosed relates to coordinating motion-capture of a hand by a network of motion-capture sensors having overlapping fields of view. In particular, it relates to designating a first sensor among three or more motion-capture sensors as having a master frame of reference, observing motion of a hand as it passes through overlapping fields of view of the respective motion-capture sensors, synchronizing capture of images of the hand within the overlapping fields of view by pairs of the motion-capture devices, and using the pairs of the hand images captured by the synchronized motion-capture devices to automatically calibrate the motion-capture sensors to the master frame of reference.
Type: Application
Filed: May 13, 2024
Publication date: September 5, 2024
Applicant: Ultrahaptics IP Two Limited
Inventor: David S. HOLZ
-
Patent number: 12067157
Abstract: The technology disclosed can provide capabilities such as using motion sensors and/or other types of sensors coupled to a motion-capture system to monitor motions within a real environment. A virtual object can be projected to a user of a portable device integrated into an augmented rendering of a real environment about the user. Motion information of a user body portion is determined based at least in part upon sensory information received from imaging or acoustic sensory devices. Control information is communicated to a system based in part on a combination of the motion of the portable device and the detected motion of the user. The virtual device experience can be augmented in some implementations by the addition of haptic, audio and/or other sensory information projectors.
Type: Grant
Filed: December 23, 2022
Date of Patent: August 20, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: David S. Holz
-
Publication number: 20240257481
Abstract: The technology disclosed relates to a method of realistic rendering of a real object as a virtual object in a virtual space using an offset in the position of the hand in a three-dimensional (3D) sensory space. An offset between expected positions of the eye(s) of a wearer of a head mounted device and a sensor attached to the head mounted device for sensing a position of at least one hand in a three-dimensional (3D) sensory space is determined. A position of the hand in the three-dimensional (3D) sensory space can be sensed using the sensor. The sensed position of the hand can be transformed by the offset into a re-rendered position of the hand as would appear to the wearer of the head mounted device if the wearer were looking at the actual hand. The re-rendered hand can be depicted to the wearer of the head mounted device.
Type: Application
Filed: April 8, 2024
Publication date: August 1, 2024
Applicant: Ultrahaptics IP Two Limited
Inventors: Alex Marcolina, David Holz
-
Patent number: 12050757
Abstract: The technology disclosed relates to user interfaces for controlling augmented reality environments. Real and virtual objects can be seamlessly integrated to form an augmented reality by tracking motion of one or more real objects within view of a wearable sensor system using a combination of RGB (red, green, and blue) and IR (infrared) pixels of one or more cameras. It also relates to enabling multi-user collaboration and interaction in an immersive virtual environment. In particular, it relates to capturing different sceneries of a shared real world space from the perspective of multiple users. The technology disclosed further relates to sharing content between wearable sensor systems. In particular, it relates to capturing images and video streams from the perspective of a first user of a wearable sensor system and sending an augmented version of the captured images and video stream to a second user of the wearable sensor system.
Type: Grant
Filed: February 24, 2023
Date of Patent: July 30, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Barrett Fox, Kyle A. Hay, Gabriel A. Hare, Wilbur Yung Sheng Yu, Dave Edelhart, Jody Medich, Daniel Plemmons
-
Patent number: 12045394
Abstract: Methods and systems are provided for processing input from an image-capture device for gesture recognition. The method includes computationally interpreting user gestures in accordance with a first mode of operation; analyzing the path of movement of an object to determine an intent of a user to change modes of operation; and, upon determining an intent of the user to change modes of operation, subsequently interpreting user gestures in accordance with a second mode of operation.
Type: Grant
Filed: July 26, 2023
Date of Patent: July 23, 2024
Assignee: ULTRAHAPTICS IP TWO LIMITED
Inventor: David S. Holz
-
Patent number: 12039103
Abstract: The technology disclosed relates to determining intent for an interaction by calculating a center of effort for the applied forces. Movement of the points of virtual contacts and the center of effort are then monitored to determine a gesture-type intended for the interaction. The number of points of virtual contacts of the feeler zones and proximities between the points of virtual contacts are used to determine a degree of precision of a control object-gesture.
Type: Grant
Filed: December 19, 2022
Date of Patent: July 16, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Pohung Chen, David S. Holz
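The center of effort described in this abstract can be illustrated as a force-magnitude-weighted average of the virtual contact points. This is a hypothetical sketch of the concept; the function name and input format are invented for illustration and are not taken from the patent.

```python
def center_of_effort(contacts):
    """Compute a center of effort as the force-weighted average of
    contact points.

    `contacts` is a list of ((x, y), force) pairs: each virtual contact
    point paired with the magnitude of the force applied there."""
    total = sum(force for _, force in contacts)
    x = sum(px * force for (px, _), force in contacts) / total
    y = sum(py * force for (_, py), force in contacts) / total
    return (x, y)
```

For example, a contact of force 1 at (0, 0) and a contact of force 3 at (2, 0) place the center of effort at (1.5, 0), pulled toward the harder press; tracking how this point moves relative to the contacts themselves is one way to discriminate gesture types.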
-
Patent number: 12032746
Abstract: The technology disclosed relates to a method of realistic displacement of a virtual object for an interaction between a control object in a three-dimensional (3D) sensory space and the virtual object in a virtual space that the control object interacts with. In particular, it relates to detecting free-form gestures of a control object in a three-dimensional (3D) sensory space and generating for display a 3D solid control object model for the control object during the free-form gestures, including sub-components of the control object, and, in response to detecting a 2D sub-component free-form gesture of the control object in the 3D sensory space in virtual contact with the virtual object, depicting, in the generated display, the virtual contact and resulting rotation of the virtual object by the 3D solid control object model.
Type: Grant
Filed: July 18, 2022
Date of Patent: July 9, 2024
Assignee: Ultrahaptics IP Two Limited
Inventors: Alex Marcolina, David S. Holz
-
Patent number: 12025719
Abstract: A method and system determines object orientation using a light source to create a shadow line extending from the light source. A camera captures an image including the shadow line on an object surface. An orientation module determines the surface orientation from the shadow line. In some examples a transparency imperfection in a window through which a camera receives light can be detected and a message sent to a user as to the presence of a light-blocking or light-distorting substance or particle. A system can control illumination while imaging an object in space using a light source mounted to a support structure so a camera captures an image of the illuminated object. Direct illumination of the camera by light from the light source can be prevented, such as by blocking the light or using a light-transmissive window adjacent the camera to reject light transmitted directly from the light source.
Type: Grant
Filed: June 10, 2021
Date of Patent: July 2, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: David Holz
-
Patent number: 12020458
Abstract: The technology disclosed relates to coordinating motion-capture of a hand by a network of motion-capture sensors having overlapping fields of view. In particular, it relates to designating a first sensor among three or more motion-capture sensors as having a master frame of reference, observing motion of a hand as it passes through overlapping fields of view of the respective motion-capture sensors, synchronizing capture of images of the hand within the overlapping fields of view by pairs of the motion-capture devices, and using the pairs of the hand images captured by the synchronized motion-capture devices to automatically calibrate the motion-capture sensors to the master frame of reference.
Type: Grant
Filed: January 18, 2022
Date of Patent: June 25, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: David S. Holz
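Calibrating a sensor to a master frame of reference from synchronized observations of the same hand is commonly posed as estimating a rigid transform between matched 3D points. The sketch below uses the standard Kabsch (least-squares) algorithm as a stand-in for that step; it is an assumption for illustration, not the method claimed in the patent.

```python
import numpy as np

def rigid_transform(src, dst):
    """Kabsch algorithm: least-squares rotation R and translation t
    mapping matched points `src` onto `dst` (dst ~ R @ src + t).

    Given hand positions observed by two synchronized sensors, this
    recovers the transform from one sensor's frame into the other's
    (e.g., the master frame)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Correct an improper rotation (reflection) if the SVD produced one.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

Given noise-free correspondences generated by a known rotation and translation, the algorithm recovers them exactly; with noisy real observations it returns the least-squares best fit.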
-
Publication number: 20240201794
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract feature information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Application
Filed: February 26, 2024
Publication date: June 20, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
-
Patent number: 11994377
Abstract: Methods and systems for capturing motion and/or determining the shapes and positions of one or more objects in 3D space utilize cross-sections thereof. In various embodiments, images of the cross-sections are captured using a camera based on reflections therefrom or shadows cast thereby.
Type: Grant
Filed: September 2, 2020
Date of Patent: May 28, 2024
Assignee: Ultrahaptics IP Two Limited
Inventor: David S. Holz
-
Patent number: 11995245
Abstract: The technology disclosed relates to creating user-defined interaction spaces and modalities in a three-dimensional (3D) sensor space in response to control gestures. It also relates to controlling virtual cameras in the 3D sensor space using control gestures and manipulating controls of the virtual cameras through the control gestures. In particular, it relates to defining one or more spatial attributes of the interaction spaces and modalities in response to one or more gesture parameters of the control gesture. It also particularly relates to defining one or more visual parameters of a virtual camera in response to one or more gesture parameters of the control gesture.
Type: Grant
Filed: January 20, 2023
Date of Patent: May 28, 2024
Assignee: ULTRAHAPTICS IP TWO LIMITED
Inventors: Isaac Cohen, Maxwell Sills
-
Publication number: 20240168602
Abstract: The technology disclosed relates to positioning and revealing a control interface in a virtual or augmented reality, including causing display of a plurality of interface projectiles at a first region of a virtual or augmented reality. Input is received that is interpreted as user interaction with an interface projectile. User interaction includes selecting and throwing the interface projectile in a first direction. An animation of the interface projectile is displayed along a trajectory in the first direction to a place where it lands. The control interface is then displayed blooming from the interface projectile at the place where it lands.
Type: Application
Filed: January 11, 2024
Publication date: May 23, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventor: Nicholas James BENSON
-
Patent number: 11983401
Abstract: The technology disclosed relates to selecting a virtual item from a virtual grid in a three-dimensional (3D) sensory space. It also relates to navigating a virtual modality displaying a plurality of virtual items arranged in a grid by automatically selecting a virtual item in a virtual grid at a terminal end of a control gesture of a control object, responsive to a terminal gesture that transitions the control object from one physical arrangement to another. In one implementation, the control object is a hand. In some implementations, physical arrangements of the control object include at least a flat hand with thumb parallel to fingers, closed, half-open, pinched, curled, fisted, mime gun, okay sign, thumbs-up, ILY sign, one-finger point, two-finger point, thumb point, or pinkie point.
Type: Grant
Filed: December 5, 2017
Date of Patent: May 14, 2024
Assignee: ULTRAHAPTICS IP TWO LIMITED, United Kingdom
Inventors: Bingxin Ku, Pohung Chen, Isaac L. Cohen, Paul A. Durdik
-
Publication number: 20240143871
Abstract: The technology disclosed relates to simplifying updating of a predictive model by clustering observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space.
Type: Application
Filed: January 5, 2024
Publication date: May 2, 2024
Applicant: ULTRAHAPTICS IP TWO LIMITED
Inventors: David S. HOLZ, Kevin HOROWITZ, Raffi BEDIKIAN, Hua YANG
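The clustering-by-surface-normal step in this abstract can be sketched with a simple greedy grouping by angular similarity: each observed point joins the first cluster whose reference normal lies within a threshold angle, otherwise it seeds a new cluster. This is a simplified illustration under assumed inputs; it omits the spatial-adjacency criterion the abstract also mentions, and the function name and threshold are invented for illustration.

```python
import numpy as np

def cluster_by_normals(points, normals, angle_thresh_deg=20.0):
    """Greedily cluster observed points by surface-normal direction.

    Each point is assigned to the first cluster whose reference normal
    is within `angle_thresh_deg` of the point's (normalized) normal;
    otherwise the point starts a new cluster. Returns a list of clusters,
    each a list of point indices."""
    cos_thresh = np.cos(np.radians(angle_thresh_deg))
    clusters = []  # list of (reference_normal, member_indices) pairs
    for i, n in enumerate(normals):
        n = np.asarray(n, float)
        n = n / np.linalg.norm(n)
        for ref, members in clusters:
            # Dot product of unit vectors = cosine of the angle between them.
            if ref @ n >= cos_thresh:
                members.append(i)
                break
        else:
            clusters.append((n, [i]))
    return [members for _, members in clusters]
```

Points whose normals agree (e.g., those sampled from the same flat segment of a finger) fall into one cluster, which can then be matched against a segment of the predictive hand model.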