Patents by Inventor Itay Bar-Yosef

Itay Bar-Yosef has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20250147596
    Abstract: Enabling gesture input includes obtaining hand tracking data based on one or more camera frames, detecting a contact event between a first finger and a second finger based on the hand tracking data, and determining a first contact location on the first finger and a second contact location on the second finger. In accordance with a determination that the first contact location and the second contact location are within a first predefined gesture zone for a first gesture, an input action is enabled corresponding to the first gesture.
    Type: Application
    Filed: January 14, 2025
    Publication date: May 8, 2025
    Inventors: Victor Belyaev, Bhavin Vinodkumar Nayak, Daniel J. Brewer, Itay Bar Yosef, Julian K. Shutzberg, Matthias M. Schroeder
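The abstract above describes a pipeline of contact detection followed by a zone check. A minimal sketch of that zone check is below; the finger labels, normalized location scale, and pinch-zone thresholds are illustrative assumptions, not details taken from the patent claims.

```python
from dataclasses import dataclass

@dataclass
class ContactEvent:
    finger_a: str      # e.g. "index"
    finger_b: str      # e.g. "thumb"
    location_a: float  # normalized position along finger_a (0 = base, 1 = tip)
    location_b: float  # normalized position along finger_b

# Hypothetical predefined gesture zone: a pinch requires both contact
# locations to lie near the fingertips.
PINCH_ZONE = {
    "fingers": ("index", "thumb"),
    "range_a": (0.7, 1.0),
    "range_b": (0.7, 1.0),
}

def enable_gesture(event: ContactEvent, zone: dict) -> bool:
    """Enable the input action only when both contact locations fall
    inside the predefined gesture zone for the gesture."""
    if (event.finger_a, event.finger_b) != zone["fingers"]:
        return False
    lo_a, hi_a = zone["range_a"]
    lo_b, hi_b = zone["range_b"]
    return lo_a <= event.location_a <= hi_a and lo_b <= event.location_b <= hi_b

# A fingertip-to-thumb-tip contact falls inside the pinch zone,
# while a contact near the base of the index finger does not.
tip_pinch = ContactEvent("index", "thumb", 0.9, 0.85)
base_touch = ContactEvent("index", "thumb", 0.2, 0.8)
```

Keying the decision on *where* the fingers touch, rather than on contact alone, is what lets the same contact event map to different gestures in different zones.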
  • Patent number: 12229344
    Abstract: Enabling gesture input includes obtaining hand tracking data based on one or more camera frames, detecting a contact event between a first finger and a second finger based on the hand tracking data, and determining a first contact location on the first finger and a second contact location on the second finger. In accordance with a determination that the first contact location and the second contact location are within a first predefined gesture zone for a first gesture, an input action is enabled corresponding to the first gesture.
    Type: Grant
    Filed: September 29, 2023
    Date of Patent: February 18, 2025
    Assignee: Apple Inc.
    Inventors: Victor Belyaev, Bhavin Vinodkumar Nayak, Daniel J. Brewer, Itay Bar Yosef, Julian K. Shutzberg, Matthias M. Schroeder
  • Publication number: 20240402823
    Abstract: Enabling gesture input includes obtaining hand tracking data based on one or more camera frames, detecting a contact event between a first finger and a second finger based on the hand tracking data, and determining a first contact location on the first finger and a second contact location on the second finger. In accordance with a determination that the first contact location and the second contact location are within a first predefined gesture zone for a first gesture, an input action is enabled corresponding to the first gesture.
    Type: Application
    Filed: September 29, 2023
    Publication date: December 5, 2024
    Inventors: Victor Belyaev, Bhavin Vinodkumar Nayak, Daniel J. Brewer, Itay Bar Yosef, Julian K. Shutzberg, Matthias M. Schroeder
  • Publication number: 20240402800
    Abstract: Various implementations disclosed herein include devices, systems, and methods that interpret user activity as user interactions with user interface (UI) elements positioned within a three-dimensional (3D) space such as an extended reality (XR) environment. Some implementations enable user interactions with virtual elements displayed in 3D environments that utilize alternative input modalities, e.g., XR environments that interpret user activity as either direct interactions or indirect interactions with virtual elements.
    Type: Application
    Filed: May 29, 2024
    Publication date: December 5, 2024
    Inventors: Julian K. Shutzberg, David J. Meyer, David M. Teitelbaum, Mehmet N. Agaoglu, Ian R. Fasel, Chase B. Lortie, Daniel J. Brewer, Tim H. Cornelissen, Leah M. Gum, Alexander G. Berardino, Lorenzo Soto Doblado, Vinay Chawda, Itay Bar Yosef, Dror Irony, Eslam A. Mostafa, Guy Engelhard, Paul A. Lacey, Ashwin Kumar Asoka Kumar Shenoi, Bhavin Vinodkumar Nayak, Liuhao Ge, Lucas Soffer, Victor Belyaev, Bharat C. Dandu, Matthias M. Schroeder, Yirong Tang
  • Publication number: 20240331447
    Abstract: Processing gesture input includes obtaining hand tracking data based on a set of camera frames, determining a hand pose based on the hand tracking data, and determining an intentionality classification for a gesture based on the hand pose. An input action corresponding to the gesture is enabled based on the hand pose and the intentionality classification. An occlusion classification is determined for the hand based on the hand pose and the input gesture can be determined based on the occlusion classification.
    Type: Application
    Filed: September 29, 2023
    Publication date: October 3, 2024
    Inventors: Itay Bar Yosef, Bhavin Vinodkumar Nayak, Chao-Ming Yen, Chase B. Lortie, Daniel J. Brewer, Dror Irony, Eslam A. Mostafa, Guy Engelhard, Ian R. Fasel, Julian K. Shutzberg, Liuhao Ge, Lucas Soffer, Matthias M. Schroeder, Mohammadhadi Kiapour, Victor Belyaev, Yirong Tang
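This abstract gates the input action on both an intentionality classification and an occlusion classification. The sketch below shows one way such gating could look; the score range, threshold, visibility fractions, and class labels are assumptions for illustration only.

```python
def classify_occlusion(visible_joint_fraction: float) -> str:
    """Coarse occlusion classification from how much of the hand is visible."""
    if visible_joint_fraction >= 0.8:
        return "unoccluded"
    if visible_joint_fraction >= 0.4:
        return "partially_occluded"
    return "occluded"

def enable_input_action(intentionality: float,
                        visible_joint_fraction: float,
                        threshold: float = 0.5) -> bool:
    """Enable the input action only for a sufficiently intentional gesture
    on a hand that is not fully occluded."""
    occlusion = classify_occlusion(visible_joint_fraction)
    return intentionality >= threshold and occlusion != "occluded"
```

The point of the two-signal gate is robustness: an incidental finger twitch (low intentionality) or a mostly hidden hand (heavy occlusion) should not trigger an action even if the pose momentarily resembles a gesture.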
  • Publication number: 20240094825
Abstract: Aspects of the subject technology provide improved techniques for gesture recognition. Improved techniques may include detecting and/or classifying an interaction between a body part and another object in a scan of the body part, and then controlling recognition of a gesture based on the interaction. In an aspect, recognition parameters may be selected based on the interaction classification that disable recognition of one or more gestures while not disabling recognition of other gestures.
    Type: Application
    Filed: September 15, 2023
    Publication date: March 21, 2024
    Inventors: Lailin Chen, Ashwin Kumar Asoka Kumar Shenoi, Daniel J. Brewer, Eslam A. Mostafa, Itay Bar Yosef, Julian K. Shutzberg, Leah M. Gum, Martin Meloun, Minhaeng Lee, Victor Belyaev
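The idea of selecting recognition parameters that disable some gestures but not others can be sketched as a simple lookup from interaction class to a disabled set. The interaction classes and gesture names below are invented for illustration.

```python
# Hypothetical mapping from interaction classification to the gestures
# whose recognition is disabled in that context.
DISABLED_GESTURES = {
    "holding_object": {"pinch", "grab"},   # fingers are busy gripping
    "resting_on_surface": {"tap"},         # incidental surface contact
    "free_hand": set(),                    # nothing disabled
}

def recognizable(gesture: str, interaction_class: str) -> bool:
    """A gesture stays recognizable unless the current interaction
    classification disables it."""
    return gesture not in DISABLED_GESTURES.get(interaction_class, set())
```

Disabling only the conflicting gestures, rather than pausing recognition entirely, is what lets the user keep issuing unrelated commands while, say, holding a cup.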
  • Patent number: 10126826
Abstract: A user interface apparatus for controlling any kind of device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system which analyzes images obtained by the image sensor to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions.
    Type: Grant
    Filed: June 27, 2016
    Date of Patent: November 13, 2018
    Assignee: Eyesight Mobile Technologies Ltd.
    Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef
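The message decision maker described above maps an identified gesture plus the current recognition mode to an outgoing message, with the mode itself changing under certain conditions. A minimal sketch follows; the mode names, gestures, and messages are illustrative assumptions.

```python
class GestureController:
    """Toy message decision maker: the message produced for a gesture
    depends on the current recognition mode, and some gestures also
    switch the mode."""

    def __init__(self):
        self.mode = "standby"  # initial recognition mode

    def on_gesture(self, gesture: str):
        if self.mode == "standby":
            if gesture == "wave":          # wake condition changes the mode
                self.mode = "active"
                return "wake"
            return None                    # other gestures ignored in standby
        if self.mode == "active":
            if gesture == "swipe_left":
                return "previous_item"
            if gesture == "swipe_right":
                return "next_item"
            if gesture == "palm":          # dismissal returns to standby
                self.mode = "standby"
                return "sleep"
        return None
```

Making the message depend on the mode, not just the gesture, is what prevents background hand motion from controlling the device before an explicit wake gesture.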
  • Publication number: 20180024643
Abstract: A user interface apparatus for controlling any kind of device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system which analyzes images obtained by the image sensor to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions.
    Type: Application
    Filed: September 29, 2017
    Publication date: January 25, 2018
    Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef
  • Patent number: 9377867
Abstract: A user interface apparatus for controlling any kind of device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system which analyzes images obtained by the image sensor to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions.
    Type: Grant
    Filed: August 8, 2012
    Date of Patent: June 28, 2016
    Assignee: EYESIGHT MOBILE TECHNOLOGIES LTD.
    Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef
  • Publication number: 20140306877
Abstract: A user interface apparatus for controlling any kind of device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system which analyzes images obtained by the image sensor to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions.
    Type: Application
    Filed: August 8, 2012
    Publication date: October 16, 2014
    Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef
  • Patent number: 8842919
    Abstract: Systems, methods, and computer-readable media for gesture recognition are disclosed. The systems include, for example, at least one processor that is configured to receive at least one image from at least one image sensor. The processor may also be configured to detect, in the image, data corresponding to an anatomical structure of a user. The processor may also be configured to identify, in the image, information corresponding to a suspected hand gesture by the user. In addition, the processor may also be configured to discount the information corresponding to the suspected hand gesture if the data corresponding to the anatomical structure of the user is not identified in the image.
    Type: Grant
    Filed: February 8, 2014
    Date of Patent: September 23, 2014
    Assignee: Eyesight Mobile Technologies Ltd.
    Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef
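The discounting step this abstract describes, lowering confidence in a suspected hand gesture when the expected anatomical structure is absent from the image, can be sketched as a simple confidence adjustment. The structure label, score values, and discount factor are assumptions for this example.

```python
def score_gesture(gesture_confidence: float,
                  detections: set,
                  required_structure: str = "forearm",
                  discount: float = 0.5) -> float:
    """Discount the confidence of a suspected hand gesture when the
    anatomical structure that should accompany a real hand (here, a
    forearm) is not identified in the image, e.g. a hand-shaped shadow."""
    if required_structure not in detections:
        return gesture_confidence * discount
    return gesture_confidence
```

Requiring supporting anatomy is a cheap plausibility check: a detection that looks like a hand but connects to nothing is more likely to be a shadow, a picture of a hand, or sensor noise.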
  • Publication number: 20140157210
Abstract: A user interface apparatus for controlling any kind of device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system which analyzes images obtained by the image sensor to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions.
    Type: Application
    Filed: February 8, 2014
    Publication date: June 5, 2014
    Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef