Patents Assigned to Eyesight Mobile Technologies Ltd.
  • Patent number: 11726577
    Abstract: Systems, methods and non-transitory computer-readable media for triggering actions based on touch-free gesture detection are disclosed. The disclosed systems may include at least one processor. A processor may be configured to receive image information from an image sensor, detect in the image information a gesture performed by a user, detect a location of the gesture in the image information, access information associated with at least one control boundary, the control boundary relating to a physical dimension of a device in a field of view of the user, or a physical dimension of a body of the user as perceived by the image sensor, and cause an action associated with the detected gesture, the detected gesture location, and a relationship between the detected gesture location and the control boundary.
    Type: Grant
    Filed: February 11, 2022
    Date of Patent: August 15, 2023
    Assignee: eyeSight Mobile Technologies, Ltd.
    Inventor: Itay Katz
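The control-boundary logic this abstract describes — triggering an action from a detected gesture, its location, and that location's relationship to a boundary tied to the device or the user's body — can be sketched as follows. This is a minimal illustration with hypothetical names and action mappings, not the patented implementation:

```python
from dataclasses import dataclass

@dataclass
class ControlBoundary:
    """Rectangular boundary (e.g. the device edges) in image coordinates."""
    left: float
    right: float
    top: float
    bottom: float

    def relation(self, x: float, y: float) -> str:
        """Classify a gesture location relative to the boundary."""
        if x < self.left:
            return "left_of"
        if x > self.right:
            return "right_of"
        if y < self.top:
            return "above"
        if y > self.bottom:
            return "below"
        return "inside"

# Hypothetical action table: (gesture, relation to boundary) -> action name.
ACTIONS = {
    ("swipe", "right_of"): "next_page",
    ("swipe", "left_of"): "previous_page",
    ("tap", "inside"): "select",
}

def trigger_action(gesture: str, location: tuple, boundary: ControlBoundary):
    """Look up the action for a gesture and its boundary relationship."""
    relation = boundary.relation(*location)
    return ACTIONS.get((gesture, relation))
```

The same gesture can thus trigger different actions depending on which side of the boundary it occurs.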
  • Patent number: 11494000
    Abstract: A method and system for augmented reality. Images of a real world scene are obtained from one or more image sensors. An orientation and/or location of the image sensors is obtained from one or more state sensors. A real world object at which a predefined pointing object is performing a predefined gesture in the images of the real world scene is identified and data associated with the identified object is displayed on a viewing device. The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.
    Type: Grant
    Filed: August 13, 2021
    Date of Patent: November 8, 2022
    Assignee: Eyesight Mobile Technologies Ltd.
    Inventors: Itay Katz, Amnon Shenfeld
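One way to picture the identification step in this abstract — resolving which known real-world object a pointing gesture indicates, given the sensor's position and orientation — is an angular-nearest-object test. A minimal sketch under the assumption that object positions are known in world coordinates; the names and tolerance are hypothetical:

```python
import math

def nearest_object_to_ray(origin, direction, objects, max_angle_deg=5.0):
    """Return the label of the object whose bearing from `origin` is closest
    to the pointing `direction` (3D vectors in world coordinates), or None
    if nothing falls within the angular tolerance."""
    def norm(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)

    d = norm(direction)
    best, best_angle = None, max_angle_deg
    for label, pos in objects.items():
        to_obj = norm(tuple(p - o for p, o in zip(pos, origin)))
        dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(d, to_obj))))
        angle = math.degrees(math.acos(dot))
        if angle <= best_angle:
            best, best_angle = label, angle
    return best
```

Once an object is identified, its associated data (a label, a description) would be rendered on the viewing device.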
  • Publication number: 20220261112
    Abstract: Systems, devices, methods, and non-transitory computer-readable media are provided for receiving data input via touch-free gestures and movements. For example, a data input device includes at least one processor for receiving information from a sensor. The processor may be configured to receive sensor data from the sensor of a user's hand spaced a distance from a displayed keyboard and in non-contact with the displayed keyboard, and track, using the received sensor data, one or more fingers in air a distance from the displayed keyboard image. The processor may also be configured to correlate locations of the one or more fingers in the air with images of a plurality of keys in the displayed keyboard, and select keys from the keyboard image based on the correlated locations of the one or more fingers in the air, and a detection of a predefined gesture performed by the user.
    Type: Application
    Filed: November 24, 2021
    Publication date: August 18, 2022
    Applicant: EYESIGHT MOBILE TECHNOLOGIES LTD.
    Inventor: Itay KATZ
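The two steps in this abstract — correlating in-air fingertip locations with key images, then committing a key only when the predefined selection gesture is detected — can be sketched like this. The key layout, coordinates, and gesture representation are hypothetical:

```python
# Hypothetical key layout: key label -> (x0, y0, x1, y1) in display coordinates.
KEYS = {
    "A": (0, 0, 40, 40),
    "B": (40, 0, 80, 40),
    "C": (80, 0, 120, 40),
}

def key_under_finger(finger_xy, keys=KEYS):
    """Correlate a tracked fingertip, projected onto the displayed keyboard,
    with the key whose rectangle contains it."""
    x, y = finger_xy
    for label, (x0, y0, x1, y1) in keys.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return label
    return None

def select_keys(finger_track, gesture_frames):
    """Emit a key only at frames where the predefined selection gesture
    (e.g. a pinch) was detected, not merely where the finger hovered."""
    typed = []
    for frame, xy in enumerate(finger_track):
        if frame in gesture_frames:
            key = key_under_finger(xy)
            if key is not None:
                typed.append(key)
    return typed
```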
  • Patent number: 11314335
    Abstract: Systems, methods, and non-transitory computer-readable media are disclosed. For example, a touch-free gesture recognition system is disclosed that includes at least one processor. The processor may be configured to enable presentation of first display information to a user to prompt a first touch-free gesture at at least a first location on a display. The processor may also be configured to receive first gesture information from at least one image sensor corresponding to a first gesturing location on the display correlated to a first touch-free gesture by the user, wherein the first gesturing location differs from a location of the first display information at least in part as a result of one eye of the user being dominant over another eye of the user. In addition, the processor may be configured to determine a first offset associated with the location of the first display information and the first gesturing location.
    Type: Grant
    Filed: October 26, 2020
    Date of Patent: April 26, 2022
    Assignee: Eyesight Mobile Technologies Ltd.
    Inventor: Itay Katz
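The offset determination this abstract describes amounts to a calibration step: compare where the prompt was displayed with where the user's gesture actually landed, and reuse that offset to correct later gestures. A minimal sketch with hypothetical coordinates:

```python
def calibration_offset(display_point, gesture_point):
    """Offset between where the prompt was shown and where the user's
    pointing gesture landed (an effect of one eye being dominant)."""
    return (gesture_point[0] - display_point[0],
            gesture_point[1] - display_point[1])

def corrected_location(raw_gesture_point, offset):
    """Apply the calibrated offset to a later gesture so the system
    resolves the point the user intends rather than the raw location."""
    return (raw_gesture_point[0] - offset[0],
            raw_gesture_point[1] - offset[1])
```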
  • Patent number: 11307666
    Abstract: A system and method for recognizing an aimed point on a plane is provided. Images captured by one or more image sensors are processed for obtaining data indicative of location of at least one pointing element in the viewing space and data indicative of at least one predefined user's body part in the viewing space; using the obtained data, an aimed point on the plane is identified. In case it is determined that a predefined condition is met, a predefined command and/or message is executed.
    Type: Grant
    Filed: April 1, 2019
    Date of Patent: April 19, 2022
    Assignee: Eyesight Mobile Technologies Ltd.
    Inventors: Itay Katz, Amnon Shenfeld
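A common way to realize the aimed-point identification described here is to cast a ray from the tracked body part (for instance the user's eye) through the pointing element (a fingertip) and intersect it with the target plane. A minimal sketch, assuming the plane is z = 0 in the sensor's coordinate frame; the geometry and names are illustrative only:

```python
def aimed_point_on_plane(eye, fingertip, plane_z=0.0):
    """Intersect the ray from a tracked body part (e.g. the user's eye)
    through the pointing element (fingertip) with the plane z = plane_z.
    Returns (x, y) on the plane, or None if the ray never reaches it."""
    ex, ey, ez = eye
    fx, fy, fz = fingertip
    dz = fz - ez
    if dz == 0:
        return None  # ray parallel to the plane
    t = (plane_z - ez) / dz
    if t <= 0:
        return None  # plane is behind the user
    return (ex + t * (fx - ex), ey + t * (fy - ey))
```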
  • Publication number: 20220107687
    Abstract: A method and system for augmented reality. Images of a real world scene are obtained from one or more image sensors. An orientation and/or location of the image sensors is obtained from one or more state sensors. A real world object at which a predefined pointing object is performing a predefined gesture in the images of the real world scene is identified and data associated with the identified object is displayed on a viewing device. The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.
    Type: Application
    Filed: August 13, 2021
    Publication date: April 7, 2022
    Applicant: Eyesight Mobile Technologies Ltd.
    Inventors: Itay KATZ, Amnon Shenfeld
  • Patent number: 11249555
    Abstract: Systems, methods and non-transitory computer-readable media for triggering actions based on touch-free gesture detection are disclosed. The disclosed systems may include at least one processor. A processor may be configured to receive image information from an image sensor, detect in the image information a gesture performed by a user, detect a location of the gesture in the image information, access information associated with at least one control boundary, the control boundary relating to a physical dimension of a device in a field of view of the user, or a physical dimension of a body of the user as perceived by the image sensor, and cause an action associated with the detected gesture, the detected gesture location, and a relationship between the detected gesture location and the control boundary.
    Type: Grant
    Filed: December 23, 2019
    Date of Patent: February 15, 2022
    Assignee: Eyesight Mobile Technologies, Ltd.
    Inventor: Itay Katz
  • Patent number: 11137834
    Abstract: A system for inputting operating system (OS) commands to a data processing device. The system comprises a video camera that captures images of a viewing space. A processor detects a predetermined object in the images using an object recognition algorithm not involving background information in an image. One or more image analysis parameters of the object are extracted from the images and one or more motion detection tests are applied. Each motion detection test has an associated OS command, and when a test succeeds, the OS command associated with the test is executed. By not relying on background information in an image, the system of the invention may be used in devices that are moved in use, such as a Palm Pilot, a personal digital assistant (PDA), a mobile telephone, a digital camera, and a mobile game machine.
    Type: Grant
    Filed: April 16, 2018
    Date of Patent: October 5, 2021
    Assignee: eyeSight Mobile Technologies Ltd.
    Inventor: Itay Katz
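The test-to-command dispatch described in this abstract — a set of motion detection tests over a tracked object's positions, each carrying an associated OS command — can be sketched as follows. The tests, thresholds, and command names are hypothetical:

```python
def detect_swipe_right(positions, min_dx=50):
    """Motion test: the tracked object moved right by at least `min_dx`."""
    return len(positions) >= 2 and positions[-1][0] - positions[0][0] >= min_dx

def detect_swipe_left(positions, min_dx=50):
    """Motion test: the tracked object moved left by at least `min_dx`."""
    return len(positions) >= 2 and positions[0][0] - positions[-1][0] >= min_dx

# Each motion detection test carries an associated OS command name.
MOTION_TESTS = [
    (detect_swipe_right, "next_track"),
    (detect_swipe_left, "previous_track"),
]

def run_motion_tests(positions):
    """Apply each motion test to the object's tracked positions and
    return the command of the first test that succeeds, if any."""
    for test, command in MOTION_TESTS:
        if test(positions):
            return command
    return None
```

Because the tests operate on the tracked object alone, a moving background does not disturb them.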
  • Patent number: 11137832
    Abstract: Systems, methods and non-transitory computer-readable media for triggering actions based on touch-free gesture detection are disclosed. The disclosed systems may include at least one processor. A processor may be configured to receive image information from an image sensor, detect in the image information a gesture performed by a user, detect a location of the gesture in the image information, access information associated with at least one control boundary, the control boundary relating to a physical dimension of a device in a field of view of the user, or a physical dimension of a body of the user as perceived by the image sensor, and cause an action associated with the detected gesture, the detected gesture location, and a relationship between the detected gesture location and the control boundary.
    Type: Grant
    Filed: December 4, 2019
    Date of Patent: October 5, 2021
    Assignee: Eyesight Mobile Technologies, Ltd.
    Inventor: Itay Katz
  • Patent number: 11093045
    Abstract: A method and system for augmented reality. Images of a real world scene are obtained from one or more image sensors. An orientation and/or location of the image sensors is obtained from one or more state sensors. A real world object at which a predefined pointing object is performing a predefined gesture in the images of the real world scene is identified and data associated with the identified object is displayed on a viewing device. The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.
    Type: Grant
    Filed: August 30, 2019
    Date of Patent: August 17, 2021
    Assignee: Eyesight Mobile Technologies Ltd.
    Inventors: Itay Katz, Amnon Shenfeld
  • Publication number: 20210096651
    Abstract: Systems, methods, and non-transitory computer-readable media are disclosed. For example, a dual sensor control device is disclosed that includes at least one processor for receiving information from a proximity sensor and an image sensor. The processor may be configured to receive first data from the proximity sensor while the image sensor is in a first state, and determine, using the first data, a presence of an object in proximity to the proximity sensor. The processor may also be configured to output, based on the determined presence of the object in proximity to the proximity sensor, a signal to the image sensor to cause the image sensor to enter a second state, different from the first state. The processor may also be configured to receive second data from the image sensor in the second state, and output at least one of a message and a command associated with the second data.
    Type: Application
    Filed: August 31, 2020
    Publication date: April 1, 2021
    Applicant: EYESIGHT MOBILE TECHNOLOGIES, LTD.
    Inventor: Itay KATZ
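The two-state scheme in this abstract — a cheap proximity sensor gating a power-hungrier image sensor between states — is essentially a small state machine. A minimal sketch; the state names, threshold, and command format are hypothetical:

```python
LOW_POWER, ACTIVE = "low_power", "active"

class DualSensorController:
    """Keep the image sensor in a first (low-power) state until the
    proximity sensor reports a nearby object, then switch it to a
    second (active) state and consume its data."""

    def __init__(self, proximity_threshold=10.0):
        self.proximity_threshold = proximity_threshold
        self.image_sensor_state = LOW_POWER

    def on_proximity_reading(self, distance):
        """First data path: wake the image sensor when an object is close."""
        if distance <= self.proximity_threshold and self.image_sensor_state == LOW_POWER:
            self.image_sensor_state = ACTIVE

    def on_image_data(self, gesture):
        """Second data path: only consumed while the image sensor is active."""
        if self.image_sensor_state != ACTIVE:
            return None
        return {"command": f"handle_{gesture}"}
```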
  • Publication number: 20200409529
    Abstract: The present invention provides a system and method for interacting with a 3D virtual image containing activatable objects. The system of the invention includes a 3D display device that presents to a user a 3D image, an image sensor and a processor. The processor analyzes images obtained by the image sensor to determine when the user has placed an activating object such as a hand or a finger, or has performed a gesture related to an activatable object, in the 3D space, at the location where the user perceives an activatable object to be located. The user thus perceives that he is "touching" the activatable object with the activating object.
    Type: Application
    Filed: March 30, 2020
    Publication date: December 31, 2020
    Applicant: Eyesight Mobile Technologies Ltd.
    Inventor: Itay Katz
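The perceived-touch determination described here reduces to a proximity check between the tracked activating object and the 3D location where the user perceives each activatable object. A minimal sketch with hypothetical coordinates and tolerance:

```python
def is_touching(hand_pos, object_pos, tolerance=3.0):
    """True when the tracked activating object (hand/finger) is within
    `tolerance` of the perceived 3D location of an activatable object."""
    dist = sum((h - o) ** 2 for h, o in zip(hand_pos, object_pos)) ** 0.5
    return dist <= tolerance

def activated_objects(hand_pos, scene):
    """scene: mapping of object name -> perceived 3D location.
    Returns the names of all objects the user is 'touching'."""
    return [name for name, pos in scene.items() if is_touching(hand_pos, pos)]
```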
  • Patent number: 10817067
    Abstract: Systems, methods, and non-transitory computer-readable media are disclosed. For example, a touch-free gesture recognition system is disclosed that includes at least one processor. The processor may be configured to enable presentation of first display information to a user to prompt a first touch-free gesture at at least a first location on a display. The processor may also be configured to receive first gesture information from at least one image sensor corresponding to a first gesturing location on the display correlated to a first touch-free gesture by the user, wherein the first gesturing location differs from a location of the first display information at least in part as a result of one eye of the user being dominant over another eye of the user. In addition, the processor may be configured to determine a first offset associated with the location of the first display information and the first gesturing location.
    Type: Grant
    Filed: December 18, 2017
    Date of Patent: October 27, 2020
    Assignee: Eyesight Mobile Technologies Ltd.
    Inventor: Itay Katz
  • Patent number: 10761610
    Abstract: Systems, methods, and non-transitory computer-readable media are disclosed. For example, a dual sensor control device is disclosed that includes at least one processor for receiving information from a proximity sensor and an image sensor. The processor may be configured to receive first data from the proximity sensor while the image sensor is in a first state, and determine, using the first data, a presence of an object in proximity to the proximity sensor. The processor may also be configured to output, based on the determined presence of the object in proximity to the proximity sensor, a signal to the image sensor to cause the image sensor to enter a second state, different from the first state. The processor may also be configured to receive second data from the image sensor in the second state, and output at least one of a message and a command associated with the second data.
    Type: Grant
    Filed: May 21, 2018
    Date of Patent: September 1, 2020
    Assignee: Eyesight Mobile Technologies, Ltd.
    Inventor: Itay Katz
  • Patent number: 10606442
    Abstract: The present invention provides a system and method for interacting with a 3D virtual image containing activatable objects. The system of the invention includes a 3D display device that presents to a user a 3D image, an image sensor and a processor. The processor analyzes images obtained by the image sensor to determine when the user has placed an activating object such as a hand or a finger, or has performed a gesture related to an activatable object, in the 3D space, at the location where the user perceives an activatable object to be located. The user thus perceives that he is “touching” the activatable object with the activating object.
    Type: Grant
    Filed: August 14, 2017
    Date of Patent: March 31, 2020
    Assignee: Eyesight Mobile Technologies, Ltd.
    Inventor: Itay Katz
  • Publication number: 20200097093
    Abstract: A method and system for augmented reality. Images of a real world scene are obtained from one or more image sensors. An orientation and/or location of the image sensors is obtained from one or more state sensors. A real world object at which a predefined pointing object is performing a predefined gesture in the images of the real world scene is identified and data associated with the identified object is displayed on a viewing device. The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.
    Type: Application
    Filed: August 30, 2019
    Publication date: March 26, 2020
    Applicant: Eyesight Mobile Technologies, LTD.
    Inventors: Itay KATZ, Amnon Shenfeld
  • Publication number: 20190324595
    Abstract: Systems, devices, methods, and non-transitory computer-readable media are provided for receiving data input via touch-free gestures and movements. For example, a data input device includes at least one processor for receiving information from a sensor. The processor may be configured to receive sensor data from the sensor of a user's hand spaced a distance from a displayed keyboard and in non-contact with the displayed keyboard, and track, using the received sensor data, one or more fingers in air a distance from the displayed keyboard image. The processor may also be configured to correlate locations of the one or more fingers in the air with images of a plurality of keys in the displayed keyboard, and select keys from the keyboard image based on the correlated locations of the one or more fingers in the air, and a detection of a predefined gesture performed by the user.
    Type: Application
    Filed: February 11, 2019
    Publication date: October 24, 2019
    Applicant: EYESIGHT MOBILE TECHNOLOGIES LTD.
    Inventor: Itay KATZ
  • Patent number: 10401967
    Abstract: A method and system for augmented reality. Images of a real world scene are obtained from one or more image sensors. An orientation and/or location of the image sensors is obtained from one or more state sensors. A real world object at which a predefined pointing object is performing a predefined gesture in the images of the real world scene is identified and data associated with the identified object is displayed on a viewing device. The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.
    Type: Grant
    Filed: May 2, 2016
    Date of Patent: September 3, 2019
    Assignee: Eyesight Mobile Technologies, Ltd.
    Inventors: Itay Katz, Amnon Shenfeld
  • Patent number: 10248218
    Abstract: A method of recognizing an aimed point on a plane is provided. Images captured by one or more image sensors are processed for obtaining data indicative of location of at least one pointing element in the viewing space and data indicative of at least one predefined user's body part in the viewing space; using the obtained data, an aimed point on the plane is identified. In case it is determined that a predefined condition is met, a predefined command and/or message is executed.
    Type: Grant
    Filed: May 1, 2017
    Date of Patent: April 2, 2019
    Assignee: Eyesight Mobile Technologies, Ltd.
    Inventors: Itay Katz, Amnon Shenfeld
  • Patent number: 10203812
    Abstract: Systems, devices, methods, and non-transitory computer-readable media are provided for receiving data input via touch-free gestures and movements. For example, a data input device includes at least one processor for receiving information from a sensor. The processor may be configured to receive sensor data from the sensor of a user's hand spaced a distance from a displayed keyboard and in non-contact with the displayed keyboard, and track, using the received sensor data, one or more fingers in air a distance from the displayed keyboard image. The processor may also be configured to correlate locations of the one or more fingers in the air with images of a plurality of keys in the displayed keyboard, and select keys from the keyboard image based on the correlated locations of the one or more fingers in the air, and a detection of a predefined gesture performed by the user.
    Type: Grant
    Filed: October 9, 2014
    Date of Patent: February 12, 2019
    Assignee: Eyesight Mobile Technologies, Ltd.
    Inventor: Itay Katz