Patents by Inventor Itay Bar-Yosef
Itay Bar-Yosef has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250147596
Abstract: Enabling gesture input includes obtaining hand tracking data based on one or more camera frames, detecting a contact event between a first finger and a second finger based on the hand tracking data, and determining a first contact location on the first finger and a second contact location on the second finger. In accordance with a determination that the first contact location and the second contact location are within a first predefined gesture zone for a first gesture, an input action is enabled corresponding to the first gesture.
Type: Application
Filed: January 14, 2025
Publication date: May 8, 2025
Inventors: Victor Belyaev, Bhavin Vinodkumar Nayak, Daniel J. Brewer, Itay Bar Yosef, Julian K. Shutzberg, Matthias M. Schroeder
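The abstract above describes gating an input action on whether both contact locations fall inside a predefined gesture zone. A minimal sketch of that gating logic follows; the 1-D finger parameterization, zone boundaries, and all names are hypothetical illustrations, not taken from the patent:

```python
from dataclasses import dataclass

# Hypothetical 1-D parameterization of a finger: 0.0 = base, 1.0 = fingertip.
@dataclass(frozen=True)
class Contact:
    finger: str      # e.g. "thumb", "index"
    location: float  # normalized position along the finger

# Hypothetical gesture zone: the finger segments where a pinch is accepted.
PINCH_ZONE = {
    "thumb": (0.6, 1.0),  # distal portion of the thumb
    "index": (0.6, 1.0),  # distal portion of the index finger
}

def in_zone(contact, zone):
    bounds = zone.get(contact.finger)
    return bounds is not None and bounds[0] <= contact.location <= bounds[1]

def enable_gesture(first, second, zone=PINCH_ZONE):
    """Enable the input action only when both contact locations
    fall inside the predefined gesture zone."""
    return in_zone(first, zone) and in_zone(second, zone)

# Fingertip-to-fingertip contact is accepted; contact near the base is not.
print(enable_gesture(Contact("thumb", 0.9), Contact("index", 0.85)))  # True
print(enable_gesture(Contact("thumb", 0.9), Contact("index", 0.2)))   # False
```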
-
Patent number: 12229344
Abstract: Enabling gesture input includes obtaining hand tracking data based on one or more camera frames, detecting a contact event between a first finger and a second finger based on the hand tracking data, and determining a first contact location on the first finger and a second contact location on the second finger. In accordance with a determination that the first contact location and the second contact location are within a first predefined gesture zone for a first gesture, an input action is enabled corresponding to the first gesture.
Type: Grant
Filed: September 29, 2023
Date of Patent: February 18, 2025
Assignee: Apple Inc.
Inventors: Victor Belyaev, Bhavin Vinodkumar Nayak, Daniel J. Brewer, Itay Bar Yosef, Julian K. Shutzberg, Matthias M. Schroeder
-
Publication number: 20240402823
Abstract: Enabling gesture input includes obtaining hand tracking data based on one or more camera frames, detecting a contact event between a first finger and a second finger based on the hand tracking data, and determining a first contact location on the first finger and a second contact location on the second finger. In accordance with a determination that the first contact location and the second contact location are within a first predefined gesture zone for a first gesture, an input action is enabled corresponding to the first gesture.
Type: Application
Filed: September 29, 2023
Publication date: December 5, 2024
Inventors: Victor Belyaev, Bhavin Vinodkumar Nayak, Daniel J. Brewer, Itay Bar Yosef, Julian K. Shutzberg, Matthias M. Schroeder
-
Publication number: 20240402800
Abstract: Various implementations disclosed herein include devices, systems, and methods that interpret user activity as user interactions with user interface (UI) elements positioned within a three-dimensional (3D) space such as an extended reality (XR) environment. Some implementations enable user interactions with virtual elements displayed in 3D environments that utilize alternative input modalities, e.g., XR environments that interpret user activity as either direct interactions or indirect interactions with virtual elements.
Type: Application
Filed: May 29, 2024
Publication date: December 5, 2024
Inventors: Julian K. Shutzberg, David J. Meyer, David M. Teitelbaum, Mehmet N. Agaoglu, Ian R. Fasel, Chase B. Lortie, Daniel J. Brewer, Tim H. Cornelissen, Leah M. Gum, Alexander G. Berardino, Lorenzo Soto Doblado, Vinay Chawda, Itay Bar Yosef, Dror Irony, Eslam A. Mostafa, Guy Engelhard, Paul A. Lacey, Ashwin Kumar Asoka Kumar Shenoi, Bhavin Vinodkumar Nayak, Liuhao Ge, Lucas Soffer, Victor Belyaev, Bharat C. Dandu, Matthias M. Schroeder, Yirong Tang
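The abstract above distinguishes direct interactions (the hand touching a virtual element) from indirect interactions (e.g. selecting it by another modality such as gaze). A toy classifier along those lines is sketched below; the distance threshold, the use of gaze as the indirect modality's selector, and all names are assumptions for illustration, not details from the publication:

```python
import math

# Hypothetical threshold: a hand within 10 cm of a UI element counts as direct.
DIRECT_REACH_M = 0.10

def classify_interaction(hand_pos, element_pos, gaze_target, element_id):
    """Return "direct" when the hand is close enough to touch the element,
    "indirect" when the user's gaze selects it instead, else "none"."""
    if math.dist(hand_pos, element_pos) <= DIRECT_REACH_M:
        return "direct"
    if gaze_target == element_id:
        return "indirect"
    return "none"

# Hand 5 cm from the element: direct interaction.
print(classify_interaction((0.0, 0.0, 0.05), (0.0, 0.0, 0.0), None, "btn"))   # direct
# Hand 1 m away but gaze on the element: indirect interaction.
print(classify_interaction((0.0, 0.0, 1.0), (0.0, 0.0, 0.0), "btn", "btn"))  # indirect
```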
-
Publication number: 20240331447
Abstract: Processing gesture input includes obtaining hand tracking data based on a set of camera frames, determining a hand pose based on the hand tracking data, and determining an intentionality classification for a gesture based on the hand pose. An input action corresponding to the gesture is enabled based on the hand pose and the intentionality classification. An occlusion classification is determined for the hand based on the hand pose and the input gesture can be determined based on the occlusion classification.
Type: Application
Filed: September 29, 2023
Publication date: October 3, 2024
Inventors: Itay Bar Yosef, Bhavin Vinodkumar Nayak, Chao-Ming Yen, Chase B. Lortie, Daniel J. Brewer, Dror Irony, Eslam A. Mostafa, Guy Engelhard, Ian R. Fasel, Julian K. Shutzberg, Liuhao Ge, Lucas Soffer, Matthias M. Schroeder, Mohammadhadi Kiapour, Victor Belyaev, Yirong Tang
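The abstract above chains three stages: pose estimation, an intentionality classification, and an occlusion classification that together gate the input action. A toy pipeline in that shape is sketched below; the feature names, thresholds, and fallback behavior are invented for illustration and are not specified by the publication:

```python
# Hypothetical pose features: pinch strength, hand speed, visible joint count.

def intentionality(pose):
    """Classify whether a pinch-like pose looks deliberate:
    a firm, slow-moving pinch counts; fast incidental motion does not."""
    return pose["pinch_strength"] > 0.8 and pose["hand_speed"] < 0.5

def occlusion(pose):
    """Classify how much of the hand the cameras can actually see."""
    return "occluded" if pose["visible_joints"] < 15 else "visible"

def process_gesture(pose):
    if not intentionality(pose):
        return None  # ignore motion classified as unintentional
    if occlusion(pose) == "occluded":
        return "pinch_low_confidence"  # degrade gracefully when the hand is hidden
    return "pinch"

pose = {"pinch_strength": 0.9, "hand_speed": 0.2, "visible_joints": 21}
print(process_gesture(pose))  # pinch
```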
-
Publication number: 20240094825
Abstract: Aspects of the subject technology provide improved techniques for gesture recognition. Improved techniques may include detecting and/or classifying an interaction between the body part and another object in a scan of the body part, and then controlling recognition of a gesture based on the interaction. In an aspect, recognition parameters may be selected based on the interaction classification that disable recognition of one or more gestures while not disabling recognition of other gestures.
Type: Application
Filed: September 15, 2023
Publication date: March 21, 2024
Inventors: Lailin Chen, Ashwin Kumar Asoka Kumar Shenoi, Daniel J. Brewer, Eslam A. Mostafa, Itay Bar Yosef, Julian K. Shutzberg, Leah M. Gum, Martin Meloun, Minhaeng Lee, Victor Belyaev
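The abstract above describes selecting recognition parameters from an interaction classification so that some gestures are disabled while others remain recognizable. One way to sketch that selection is a lookup table from interaction class to disabled-gesture set; the interaction classes, gesture names, and table below are all hypothetical:

```python
# Hypothetical mapping from a detected hand-object interaction class to the
# recognition parameters (here, just the set of gestures to disable).
RECOGNITION_PARAMS = {
    "holding_object": {"disabled": {"pinch", "grab"}},          # hand is occupied
    "typing":         {"disabled": {"pinch", "swipe", "grab"}}, # hand is on a keyboard
    "free":           {"disabled": set()},
}

def recognizable_gestures(interaction, all_gestures):
    """Return the gestures still recognizable under the selected parameters;
    unknown interaction classes disable nothing."""
    params = RECOGNITION_PARAMS.get(interaction, {"disabled": set()})
    return all_gestures - params["disabled"]

gestures = {"pinch", "swipe", "grab", "point"}
print(sorted(recognizable_gestures("holding_object", gestures)))  # ['point', 'swipe']
print(sorted(recognizable_gestures("free", gestures)))
```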
-
Patent number: 10126826
Abstract: A user interface apparatus for controlling any kind of device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system, which analyzes the images to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions.
Type: Grant
Filed: June 27, 2016
Date of Patent: November 13, 2018
Assignee: Eyesight Mobile Technologies Ltd.
Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef
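The abstract above describes a message decision maker whose output depends on both the identified gesture and the current recognition mode, with the mode itself changing under certain conditions. A minimal state-machine sketch of that idea follows; the mode names ("standby", "active"), gesture names, and messages are invented for illustration:

```python
# Hypothetical sketch: the same gesture produces different messages (or none)
# depending on the recognition mode, and a gesture can change the mode.
class GestureController:
    def __init__(self):
        self.mode = "standby"

    def on_gesture(self, gesture):
        if self.mode == "standby":
            if gesture == "wave":
                self.mode = "active"  # condition under which the mode changes
                return "wake_device"
            return None               # other gestures are ignored in standby
        if self.mode == "active":
            if gesture == "swipe_left":
                return "previous_item"
            if gesture == "swipe_right":
                return "next_item"
        return None

ctrl = GestureController()
print(ctrl.on_gesture("swipe_left"))  # None: standby mode ignores swipes
print(ctrl.on_gesture("wave"))        # wake_device
print(ctrl.on_gesture("swipe_left"))  # previous_item
```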
-
Publication number: 20180024643
Abstract: A user interface apparatus for controlling any kind of device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system, which analyzes the images to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions.
Type: Application
Filed: September 29, 2017
Publication date: January 25, 2018
Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef
-
Patent number: 9377867
Abstract: A user interface apparatus for controlling any kind of device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system, which analyzes the images to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions.
Type: Grant
Filed: August 8, 2012
Date of Patent: June 28, 2016
Assignee: Eyesight Mobile Technologies Ltd.
Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef
-
Publication number: 20140306877
Abstract: A user interface apparatus for controlling any kind of device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system, which analyzes the images to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions.
Type: Application
Filed: August 8, 2012
Publication date: October 16, 2014
Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef
-
Patent number: 8842919
Abstract: Systems, methods, and computer-readable media for gesture recognition are disclosed. The systems include, for example, at least one processor that is configured to receive at least one image from at least one image sensor. The processor may also be configured to detect, in the image, data corresponding to an anatomical structure of a user. The processor may also be configured to identify, in the image, information corresponding to a suspected hand gesture by the user. In addition, the processor may also be configured to discount the information corresponding to the suspected hand gesture if the data corresponding to the anatomical structure of the user is not identified in the image.
Type: Grant
Filed: February 8, 2014
Date of Patent: September 23, 2014
Assignee: Eyesight Mobile Technologies Ltd.
Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef
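The abstract above describes discounting a suspected hand gesture when the expected anatomical structure of the user is not also found in the image. A minimal sketch of that discounting rule follows; the field names, the 0.3 discount factor, and the idea of expressing the discount as a confidence multiplier are all assumptions for illustration:

```python
# Hypothetical sketch: down-weight a suspected gesture's confidence when the
# user's anatomical structure (e.g. face or forearm) is absent from the image,
# since a "hand" with no attached user is likely a false positive.
def score_gesture(detection):
    """Return a confidence score, discounted when anatomical context is missing."""
    score = detection["gesture_confidence"]
    if not detection["anatomy_found"]:
        score *= 0.3  # hypothetical discount factor
    return score

print(score_gesture({"gesture_confidence": 0.9, "anatomy_found": True}))   # 0.9
print(score_gesture({"gesture_confidence": 0.9, "anatomy_found": False}))  # ≈ 0.27
```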
-
Publication number: 20140157210
Abstract: A user interface apparatus for controlling any kind of device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system, which analyzes the images to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions.
Type: Application
Filed: February 8, 2014
Publication date: June 5, 2014
Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef