Patents by Inventor Raffi A. Bedikian
Raffi A. Bedikian has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20210256182
Abstract: The technology disclosed relates to simplifying updating of a predictive model using clustering observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space.
Type: Application
Filed: May 5, 2021
Publication date: August 19, 2021
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. HOLZ, Kevin HOROWITZ, Raffi BEDIKIAN, Hua YANG
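The clustering step this abstract describes can be sketched in a few lines. The greedy region-growing strategy, the angle and distance thresholds, and the function name below are illustrative assumptions, not the patented implementation:

```python
import numpy as np

def cluster_by_normals(points, normals, angle_thresh_deg=20.0, dist_thresh=0.05):
    """Group 3D points whose surface normals point the same way (within an
    angular threshold) and that are spatially adjacent, via greedy region growing."""
    points = np.asarray(points, dtype=float)
    normals = np.asarray(normals, dtype=float)
    cos_thresh = np.cos(np.radians(angle_thresh_deg))
    labels = -np.ones(len(points), dtype=int)   # -1 means unassigned
    cluster_id = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        labels[seed] = cluster_id
        stack = [seed]
        while stack:  # grow the cluster outward from the seed point
            j = stack.pop()
            near = np.linalg.norm(points - points[j], axis=1) < dist_thresh
            similar = normals @ normals[j] > cos_thresh
            for k in np.where(near & similar & (labels == -1))[0]:
                labels[k] = cluster_id
                stack.append(k)
        cluster_id += 1
    return labels
```

Points on two differently oriented surfaces come back with two distinct labels; matching those clusters to segments of the hand model is a separate step not shown here.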
-
Publication number: 20210233323
Abstract: In some implementations, a method includes: determining first usage patterns associated with a physical object within the physical environment; obtaining a first objective for an objective-effectuator (OE) instantiated in a computer-generated reality (CGR) environment, wherein the first objective is associated with a representation of the physical object; obtaining a first directive for the OE that limits actions for performance by the OE to achieve the first objective to the first usage patterns associated with the physical object; generating first actions, for performance by the OE, in order to achieve the first objective as limited by the first directive, wherein the first actions correspond to a first subset of the first usage patterns associated with the physical object; and presenting the OE performing the first actions on the representation of the physical object overlaid on the physical environment.
Type: Application
Filed: April 15, 2021
Publication date: July 29, 2021
Inventors: Gutemberg B. Guerra Filho, Ian M. Richter, Raffi A. Bedikian
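The directive mechanism can be illustrated with a minimal sketch. The class shape, the string-valued actions, and the `plan` method are assumptions for illustration; the claimed method is broader:

```python
class ObjectiveEffectuator:
    """Agent whose actions toward an objective are limited, by a directive,
    to usage patterns actually observed for the physical object."""

    def __init__(self, objective, usage_patterns):
        self.objective = objective                 # e.g. "demonstrate the object"
        self.usage_patterns = set(usage_patterns)  # first usage patterns observed

    def plan(self, candidate_actions):
        # The directive: keep only candidate actions that match an
        # observed usage pattern of the physical object.
        return [a for a in candidate_actions if a in self.usage_patterns]
```

An OE given the patterns `["lift", "pour", "set down"]` would, for instance, discard a candidate "throw" action and keep "lift" and "pour".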
-
Publication number: 20210181859
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Application
Filed: March 1, 2021
Publication date: June 17, 2021
Applicant: Ultrahaptics IP Two Limited
Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
-
Patent number: 11010982
Abstract: In some implementations, a method includes: identifying, within first image data that corresponds to a first pose of a physical environment, a target physical object associated with a set of physical features that satisfies a mapping criterion for a computer-generated reality (CGR) object; assigning a secondary semantic label to the target physical object, wherein the secondary semantic label links the target physical object to the CGR object; and generating a CGR overlay associated with the CGR object based on one or more characteristics of the target physical object.
Type: Grant
Filed: April 27, 2020
Date of Patent: May 18, 2021
Assignee: APPLE INC.
Inventors: Gutemberg B. Guerra Filho, Ian M. Richter, Raffi A. Bedikian
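The matching-and-labeling flow can be sketched briefly. Treating the mapping criterion as "all required features present" is an assumption for illustration, as are the dictionary layout and function name:

```python
def assign_secondary_label(physical_objects, cgr_object):
    """Find a physical object whose feature set satisfies the CGR object's
    mapping criterion (here: all required features present) and link the
    two via a secondary semantic label."""
    required = set(cgr_object["required_features"])
    for obj in physical_objects:
        if required <= set(obj["features"]):
            obj["secondary_label"] = cgr_object["name"]
            return obj
    return None  # no physical object satisfies the criterion
```

For example, a CGR chessboard requiring a flat, horizontal feature set would be linked to a detected table rather than a lamp, and the overlay would then be sized to that table.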
-
Patent number: 11010512
Abstract: The technology disclosed relates to simplifying updating of a predictive model using clustering observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space.
Type: Grant
Filed: November 25, 2019
Date of Patent: May 18, 2021
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Kevin Horowitz, Raffi Bedikian, Hua Yang
-
Patent number: 10936082
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Grant
Filed: September 30, 2019
Date of Patent: March 2, 2021
Assignee: Ultrahaptics IP Two Limited
Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
-
Publication number: 20200363874
Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
Type: Application
Filed: August 6, 2020
Publication date: November 19, 2020
Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ
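The mode-switching idea can be sketched as a plane that the fingertip either hovers above or penetrates, with the plane drifting to follow the fingertip. The class, the mode names, and the follow-rate update are assumptions for illustration only:

```python
class VirtualSurface:
    """Virtual control construct: a plane whose penetration by the tracked
    control object switches the interface between control modes."""

    def __init__(self, z=0.0, engage_offset=0.05, follow_rate=0.1):
        self.z = z                        # current plane position
        self.engage_offset = engage_offset
        self.follow_rate = follow_rate

    def update(self, fingertip_z):
        # Penetration of the plane switches from "hover" to "touch" mode.
        mode = "touch" if fingertip_z < self.z else "hover"
        if mode == "hover":
            # Let the construct drift toward the control object so the
            # engagement distance stays roughly constant over time.
            target = fingertip_z - self.engage_offset
            self.z += self.follow_rate * (target - self.z)
        return mode
```

Because the plane follows the hand while hovering, a deliberate forward push is always interpreted relative to the hand's recent resting position rather than a fixed location in space.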
-
Publication number: 20200356238
Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and positions of one or more of the calculation points, and manipulating the virtual object by interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
Type: Application
Filed: July 28, 2020
Publication date: November 12, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: David S. HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Hua YANG, Gabriel A. HARE, Maxwell SILLS
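The dynamic selection step can be sketched with a simple proximity rule. Using the centroid of the hand's calculation points and a nearest-candidate criterion is an assumption for illustration; the claimed method also weighs detected hand motion:

```python
import numpy as np

def select_manipulation_point(calc_points, candidate_points):
    """Pick the candidate manipulation point on the virtual object that is
    closest to the centroid of the hand's calculation points
    (e.g. fingertips and joints); returns the candidate's index."""
    centroid = np.mean(np.asarray(calc_points, dtype=float), axis=0)
    dists = np.linalg.norm(np.asarray(candidate_points, dtype=float) - centroid, axis=1)
    return int(np.argmin(dists))
```

Re-running the selection each frame makes the manipulation point follow the hand around the object, which is what lets a grab feel continuous as the fingers move.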
-
Publication number: 20200348755
Abstract: One implementation involves a device receiving a stream of pixel events output by an event camera. The device derives an input image by accumulating pixel events for multiple event camera pixels. The device generates a gaze characteristic using the derived input image as input to a neural network trained to determine the gaze characteristic. The neural network is configured in multiple stages. The first stage of the neural network is configured to determine an initial gaze characteristic, e.g., an initial pupil center, using reduced resolution input(s). The second stage of the neural network is configured to determine adjustments to the initial gaze characteristic using location-focused input(s), e.g., using only a small input image centered around the initial pupil center. The determinations at each stage are thus efficiently made using relatively compact neural network configurations. The device tracks a gaze of the eye based on the gaze characteristic.
Type: Application
Filed: January 23, 2019
Publication date: November 5, 2020
Inventors: Thomas GEBAUER, Raffi BEDIKIAN
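The two-stage pipeline can be sketched with simple stand-ins: here a peak-finding heuristic plays the role of each trained network stage, which is purely an assumption so the data flow (accumulate, coarse estimate at reduced resolution, refine on a small crop) stays visible:

```python
import numpy as np

def accumulate_events(events, shape):
    """Integrate event-camera pixel events (x, y, polarity) into an input image."""
    img = np.zeros(shape)
    for x, y, polarity in events:
        img[y, x] += polarity
    return img

def coarse_stage(img, factor=4):
    """Stage 1 stand-in: estimate an initial pupil center from a
    reduced-resolution image (a compact trained CNN would do this)."""
    small = img[::factor, ::factor]
    y, x = np.unravel_index(np.argmax(np.abs(small)), small.shape)
    return x * factor, y * factor

def refine_stage(img, cx, cy, crop=8):
    """Stage 2 stand-in: adjust the estimate using only a small,
    location-focused crop centered on the initial pupil center."""
    y0, x0 = max(cy - crop, 0), max(cx - crop, 0)
    patch = img[y0:y0 + 2 * crop, x0:x0 + 2 * crop]
    dy, dx = np.unravel_index(np.argmax(np.abs(patch)), patch.shape)
    return x0 + dx, y0 + dy
```

The efficiency argument is visible in the shapes: stage 1 only ever sees the downsampled image, and stage 2 only a small crop, so neither stage needs to process the full-resolution frame.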
-
Publication number: 20200278539
Abstract: In one implementation, a method includes emitting light with modulating intensity from a plurality of light sources towards an eye of a user. The method includes receiving light intensity data indicative of an intensity of the emitted light reflected by the eye of the user in the form of a plurality of glints, and determining an eye tracking characteristic of the user based on the light intensity data. In one implementation, a method includes generating, using an event camera comprising a plurality of light sensors at a plurality of respective locations, a plurality of event messages, each of the plurality of event messages being generated in response to a particular light sensor detecting a change in intensity of light and indicating a particular location of the particular light sensor. The method includes determining an eye tracking characteristic of a user based on the plurality of event messages.
Type: Application
Filed: September 28, 2017
Publication date: September 3, 2020
Inventors: Branko Petljanski, Raffi A. Bedikian, Daniel Kurz, Thomas Gebauer, Li Jia
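The event-message generation in the second method can be sketched directly from the abstract: one message per sensor whose intensity change crosses a threshold, carrying the sensor's location. The threshold value and message layout are assumptions for illustration:

```python
def make_event_messages(prev, curr, threshold=0.2):
    """Generate one event message per light sensor whose intensity changed
    by more than a threshold between two readings; each message carries
    that sensor's location and the direction of the change."""
    events = []
    for location, (p, c) in enumerate(zip(prev, curr)):
        if abs(c - p) > threshold:
            events.append({"location": location, "brighter": c > p})
    return events
```

Sensors whose intensity is stable produce no messages at all, which is what makes the event-camera approach sparse compared to reading out full frames.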
-
Patent number: 10739862
Abstract: During control of a user interface via free-space motions of a hand or other suitable control object, switching between control modes can be facilitated by tracking the control object's movements relative to, and its penetration of, a virtual control construct (such as a virtual surface construct). The position of the virtual control construct can be updated, continuously or from time to time, based on the control object's location.
Type: Grant
Filed: August 3, 2018
Date of Patent: August 11, 2020
Assignee: Ultrahaptics IP Two Limited
Inventors: Raffi Bedikian, Jonathan Marsden, Keith Mertens, David Holz
-
Patent number: 10739965
Abstract: The technology disclosed relates to providing simplified manipulation of virtual objects by detected hand motions. In particular, it relates to detecting hand motion and positions of calculation points relative to a virtual object to be manipulated, dynamically selecting at least one manipulation point proximate to the virtual object based on the detected hand motion and positions of one or more of the calculation points, and manipulating the virtual object by interaction between the detected hand motion and positions of one or more of the calculation points and the dynamically selected manipulation point.
Type: Grant
Filed: December 20, 2018
Date of Patent: August 11, 2020
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Raffi Bedikian, Adrian Gasinski, Hua Yang, Gabriel A. Hare, Maxwell Sills
-
Publication number: 20200167513
Abstract: The technology disclosed relates to simplifying updating of a predictive model using clustering observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space.
Type: Application
Filed: November 25, 2019
Publication date: May 28, 2020
Inventors: David S. HOLZ, Kevin HOROWITZ, Raffi BEDIKIAN, Hua YANG
-
Patent number: 10600189
Abstract: Determining a movement of an object may include obtaining an event stream corresponding to a scene over time, wherein the event stream includes events associated with detected changes in brightness, wherein each event includes a pixel location, a timestamp, and a brightness indication; selecting a first subset of pixels from the plurality of pixels corresponding to an object for a first time period; determining a first movement of the object based on the brightness indications and timestamps for the first subset of pixels; determining that the first movement is insufficient to generate a velocity measure; selecting, based on the first movement, a second subset of pixels associated with the object for a second time period; determining an updated movement of the object based on the brightness indications and timestamps for the second subset of pixels and the first movement; and generating a velocity measure based on the updated movement.
Type: Grant
Filed: August 13, 2019
Date of Patent: March 24, 2020
Assignee: Apple Inc.
Inventor: Raffi A. Bedikian
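The velocity step, including the fallback to a second time period, can be sketched with a least-squares fit of pixel location against timestamp. The fit, the displacement criterion, and the function names are assumptions for illustration, not the claimed method:

```python
import numpy as np

def estimate_velocity(events):
    """Least-squares fit of pixel location against timestamp for a subset
    of events attributed to one object, giving velocity in pixels/second."""
    t = np.array([e["t"] for e in events])
    vx = np.polyfit(t, [e["x"] for e in events], 1)[0]
    vy = np.polyfit(t, [e["y"] for e in events], 1)[0]
    return vx, vy

def velocity_with_fallback(events_t1, events_t2, min_disp=1.0):
    """If the movement over the first time period is too small to support
    a velocity measure, fold in a second subset of events and refit."""
    vx, vy = estimate_velocity(events_t1)
    span = events_t1[-1]["t"] - events_t1[0]["t"]
    if (vx * vx + vy * vy) ** 0.5 * span < min_disp:
        # First movement insufficient: extend over the second time period.
        vx, vy = estimate_velocity(events_t1 + events_t2)
    return vx, vy
```

Extending the observation window trades latency for accuracy: a slow-moving object simply needs more elapsed time before its displacement rises above the sensor noise.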
-
Publication number: 20200033951
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Application
Filed: September 30, 2019
Publication date: January 30, 2020
Inventors: Kevin A. HOROWITZ, Matias PEREZ, Raffi BEDIKIAN, David S. HOLZ, Gabriel A. HARE
-
Publication number: 20200004403
Abstract: The technology disclosed relates to using virtual attraction between a hand or other control object in a three-dimensional (3D) sensory space and a virtual object in a virtual space. In particular, it relates to defining a virtual attraction zone of a hand or other control object that is tracked in a three-dimensional (3D) sensory space and generating one or more interaction forces between the control object and a virtual object in a virtual space that cause motion of the virtual object responsive to proximity of the control object to the virtual object and escalation with a virtual pinch or grasp action of the control object directed to a manipulation point of the virtual object.
Type: Application
Filed: September 11, 2019
Publication date: January 2, 2020
Applicant: Ultrahaptics IP Two Limited
Inventors: David S HOLZ, Raffi BEDIKIAN, Adrian GASINSKI, Hua YANG, Maxwell SILLS, Gabriel HARE
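The attraction-zone force can be sketched as a spring-like pull that only acts inside the zone and grows with pinch strength. The spring form, the gain constants, and the zone radius are assumptions for illustration only:

```python
import numpy as np

def attraction_force(hand_pos, obj_pos, zone_radius=0.3, pinch_strength=0.0, k=5.0):
    """Interaction force pulling a virtual object toward the hand while it
    is inside the attraction zone; a virtual pinch or grasp action
    (pinch_strength in [0, 1]) escalates the pull."""
    delta = np.asarray(hand_pos, dtype=float) - np.asarray(obj_pos, dtype=float)
    dist = np.linalg.norm(delta)
    if dist == 0.0 or dist > zone_radius:
        return np.zeros(3)  # no force outside the attraction zone
    gain = k * (1.0 + 3.0 * pinch_strength)      # escalation with pinch/grasp
    return gain * delta * (1.0 - dist / zone_radius)
```

Scaling the force down toward the zone boundary makes objects ease in smoothly instead of snapping the moment the hand crosses the radius.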
-
Patent number: 10489531
Abstract: The technology disclosed relates to simplifying updating of a predictive model using clustering observed points. In particular, it relates to observing a set of points in 3D sensory space, determining surface normal directions from the points, clustering the points by their surface normal directions and adjacency, accessing a predictive model of a hand, refining positions of segments of the predictive model, matching the clusters of the points to the segments, and using the matched clusters to refine the positions of the matched segments. It also relates to distinguishing between alternative motions between two observed locations of a control object in a 3D sensory space by accessing first and second positions of a segment of a predictive model of a control object such that motion between the first position and the second position was at least partially occluded from observation in a 3D sensory space.
Type: Grant
Filed: June 8, 2018
Date of Patent: November 26, 2019
Assignee: Ultrahaptics IP Two Limited
Inventors: David S. Holz, Raffi Bedikian, Kevin Horowitz, Hua Yang
-
Patent number: 10429943
Abstract: The technology disclosed relates to providing command input to a machine under control. It further relates to gesturally interacting with the machine. The technology disclosed also relates to providing monitoring information about a process under control. The technology disclosed further relates to providing biometric information about an individual. The technology disclosed yet further relates to providing abstract features information (pose, grab strength, pinch strength, confidence, and so forth) about an individual.
Type: Grant
Filed: May 24, 2018
Date of Patent: October 1, 2019
Assignee: Ultrahaptics IP Two Limited
Inventors: Kevin A. Horowitz, Matias Perez, Raffi Bedikian, David S. Holz, Gabriel A. Hare
-
Patent number: 10416834
Abstract: The technology disclosed relates to using virtual attraction between a hand or other control object in a three-dimensional (3D) sensory space and a virtual object in a virtual space. In particular, it relates to defining a virtual attraction zone of a hand or other control object that is tracked in a three-dimensional (3D) sensory space and generating one or more interaction forces between the control object and a virtual object in a virtual space that cause motion of the virtual object responsive to proximity of the control object to the virtual object and escalation with a virtual pinch or grasp action of the control object directed to a manipulation point of the virtual object.
Type: Grant
Filed: November 13, 2014
Date of Patent: September 17, 2019
Assignee: Leap Motion, Inc.
Inventors: David S Holz, Raffi Bedikian, Adrian Gasinski, Hua Yang, Maxwell Sills, Gabriel Hare
-
Publication number: 20190155394
Abstract: Embodiments of display control based on dynamic user interactions generally include capturing a plurality of temporally sequential images of the user, or a body part or other control object manipulated by the user, and computationally analyzing the images to recognize a gesture performed by the user. In some embodiments, a scale indicative of an actual gesture distance traversed in performance of the gesture is identified, and a movement or action is displayed on the device based, at least in part, on a ratio between the identified scale and the scale of the displayed movement. In some embodiments, a degree of completion of the recognized gesture is determined, and the display contents are modified in accordance therewith. In some embodiments, a dominant gesture is computationally determined from among a plurality of user gestures, and an action displayed on the device is based on the dominant gesture.
Type: Application
Filed: November 19, 2018
Publication date: May 23, 2019
Applicant: Leap Motion, Inc.
Inventors: Raffi BEDIKIAN, Jonathan MARSDEN, Keith MERTENS, David HOLZ, Maxwell SILLS, Matias PEREZ, Gabriel HARE, Ryan JULIAN
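The scale-ratio mapping and the dominant-gesture selection can each be sketched in a line or two. Treating "dominant" as "largest traversed distance" is an assumption for illustration; the claimed determination may weigh other factors:

```python
def display_movement(gesture_delta, gesture_scale, display_scale):
    """Map an actual gesture distance onto the display using the ratio
    between the gesture-space scale and the display-space scale."""
    return gesture_delta * (display_scale / gesture_scale)

def dominant_gesture(gestures):
    """Pick the dominant gesture from several concurrent ones; here,
    simply the gesture with the largest traversed distance."""
    return max(gestures, key=lambda g: g["distance"])
```

For example, with a 0.5 m gesture space mapped to a 1920-pixel display, a 0.1 m hand motion produces a 384-pixel on-screen movement.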