Patents Assigned to Eyesight Mobile Technologies Ltd.
-
Patent number: 10203764
Abstract: Systems, methods and non-transitory computer-readable media for triggering actions based on touch-free gesture detection are disclosed. The disclosed systems may include at least one processor. A processor may be configured to receive image information from an image sensor, detect in the image information a gesture performed by a user, detect a location of the gesture in the image information, access information associated with at least one control boundary, the control boundary relating to a physical dimension of a device in a field of view of the user, or a physical dimension of a body of the user as perceived by the image sensor, and cause an action associated with the detected gesture, the detected gesture location, and a relationship between the detected gesture location and the control boundary.
Type: Grant
Filed: February 29, 2016
Date of Patent: February 12, 2019
Assignee: Eyesight Mobile Technologies, Ltd.
Inventors: Itay Katz, Ofer Affias
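The control-boundary idea above can be sketched in a few lines: an action is chosen from the detected gesture, its location, and where that location falls relative to a boundary derived from the device's physical dimensions. All names, gestures, and the action mapping below are illustrative assumptions, not the patented method itself.

```python
# Hypothetical sketch: the same gesture triggers different actions
# depending on whether it lands inside or outside a control boundary.

def classify_relative_to_boundary(location, boundary):
    """Return 'inside' or 'outside' for an (x, y) point and a boundary
    given as (left, top, right, bottom) in image coordinates."""
    x, y = location
    left, top, right, bottom = boundary
    return "inside" if left <= x <= right and top <= y <= bottom else "outside"

def choose_action(gesture, location, boundary):
    """Map a (gesture, location-vs-boundary) pair to an action label."""
    relation = classify_relative_to_boundary(location, boundary)
    actions = {
        ("swipe", "outside"): "switch_app",  # gesture beyond the device edge
        ("swipe", "inside"): "scroll",       # same gesture inside the boundary
        ("tap", "inside"): "select",
    }
    return actions.get((gesture, relation), "ignore")

print(choose_action("swipe", (500, 120), (0, 0, 400, 300)))  # switch_app
print(choose_action("swipe", (200, 120), (0, 0, 400, 300)))  # scroll
```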
-
Publication number: 20180356896
Abstract: Systems, methods, and non-transitory computer-readable media are disclosed. For example, a dual sensor control device is disclosed that includes at least one processor for receiving information from a proximity sensor and an image sensor. The processor may be configured to receive first data from the proximity sensor while the image sensor is in a first state, determine, using the first data, a presence of an object in proximity to the proximity sensor. The processor may also be configured to output, based on the determined presence of the object in proximity to the proximity sensor, a signal to the image sensor to cause the image sensor to enter a second state, different from the first state. The processor may also be configured to receive second data from the image sensor in the second state, and output at least one of a message and a command associated with the second data.
Type: Application
Filed: May 21, 2018
Publication date: December 13, 2018
Applicant: EYESIGHT MOBILE TECHNOLOGIES, LTD.
Inventor: Itay KATZ
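The dual-sensor flow described above amounts to a small state machine: the proximity sensor is consulted while the image sensor idles in a low-power first state, a nearby object switches the image sensor to an active second state, and only then does image data yield a command. The state names, threshold, and command format below are assumptions for illustration.

```python
# Minimal sketch of a proximity-gated image sensor, per the abstract's flow.

LOW_POWER, ACTIVE = "low_power", "active"

class DualSensorController:
    def __init__(self, proximity_threshold_cm=10):
        self.image_sensor_state = LOW_POWER
        self.threshold = proximity_threshold_cm

    def on_proximity_data(self, distance_cm):
        """First data: decide presence and, if present, wake the image sensor."""
        if distance_cm <= self.threshold and self.image_sensor_state == LOW_POWER:
            self.image_sensor_state = ACTIVE  # signal: enter the second state
        return self.image_sensor_state

    def on_image_data(self, detected_gesture):
        """Second data: emit a command only while the image sensor is active."""
        if self.image_sensor_state != ACTIVE:
            return None
        return f"command:{detected_gesture}"

ctrl = DualSensorController()
ctrl.on_proximity_data(25)          # too far: image sensor stays in low power
print(ctrl.image_sensor_state)      # low_power
ctrl.on_proximity_data(5)           # object near: wake the image sensor
print(ctrl.on_image_data("swipe"))  # command:swipe
```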
-
Patent number: 10126826
Abstract: A user interface apparatus for controlling any kind of device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system, which analyzes the images to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions.
Type: Grant
Filed: June 27, 2016
Date of Patent: November 13, 2018
Assignee: Eyesight Mobile Technologies Ltd.
Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef
-
Patent number: 10120454
Abstract: Systems, devices, methods, and non-transitory computer-readable media are provided for gesture recognition and control. For example, a processor of a gesture recognition system may be configured to receive first image(s) from an image sensor and process the image(s) to detect a first position of an object. The processor may also define a first navigation region in relation to the position and define a second navigation region in relation to the first navigation region, the second region surrounding the first region. The processor may also receive second image(s) from the image sensor and process the image(s) to detect a transition of the object from the first region to the second region. The processor may also determine a first command associated with a device and that corresponds to the transition of the object from the first region to the second region and provide the determined command to the device.
Type: Grant
Filed: September 6, 2016
Date of Patent: November 6, 2018
Assignee: eyeSight Mobile Technologies Ltd.
Inventors: Erez Steinberg, Roey Lehmann, Itay Katz
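The navigation-region mechanism above can be illustrated with concentric regions: a first region centred on the object's detected position, a surrounding second region, and a command issued when the object crosses from the first into the second. The circular region shape, radii, and direction-to-command mapping are all assumptions of this sketch.

```python
# Illustrative sketch: command selection from a first-to-second region
# transition of a tracked object, as described in the abstract.
import math

def make_regions(center, inner_radius=50, outer_radius=120):
    return {"center": center, "inner": inner_radius, "outer": outer_radius}

def region_of(point, regions):
    d = math.dist(point, regions["center"])
    if d <= regions["inner"]:
        return "first"
    if d <= regions["outer"]:
        return "second"
    return "none"

def command_for_transition(prev_point, new_point, regions):
    """Return a command only when the object moves first -> second region."""
    if region_of(prev_point, regions) == "first" and \
       region_of(new_point, regions) == "second":
        # direction of motion selects the command (an assumed mapping)
        dx = new_point[0] - prev_point[0]
        dy = new_point[1] - prev_point[1]
        if abs(dx) >= abs(dy):
            return "next_item" if dx > 0 else "previous_item"
        return "volume_down" if dy > 0 else "volume_up"
    return None

regions = make_regions((200, 200))
print(command_for_transition((210, 200), (300, 200), regions))  # next_item
```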
-
Patent number: 9977507
Abstract: Systems, methods, and non-transitory computer-readable media are disclosed. For example, a dual sensor control device is disclosed that includes at least one processor for receiving information from a proximity sensor and an image sensor. The processor may be configured to receive first data from the proximity sensor while the image sensor is in a first state, determine, using the first data, a presence of an object in proximity to the proximity sensor. The processor may also be configured to output, based on the determined presence of the object in proximity to the proximity sensor, a signal to the image sensor to cause the image sensor to enter a second state, different from the first state. The processor may also be configured to receive second data from the image sensor in the second state, and output at least one of a message and a command associated with the second data.
Type: Grant
Filed: March 13, 2014
Date of Patent: May 22, 2018
Assignee: EYESIGHT MOBILE TECHNOLOGIES LTD.
Inventor: Itay Katz
-
Patent number: 9946362
Abstract: A system for inputting commands to a processor. A processor detects an object in one or more images and extracts one or more image analysis parameters associated with the object. A first motion detection test is applied to the one or more image analysis parameters. Based on a result of the first motion detection test, a second motion detection test is applied to the one or more image analysis parameters. A command associated with the second motion detection test is executed.
Type: Grant
Filed: June 1, 2015
Date of Patent: April 17, 2018
Assignee: eyeSight Mobile Technologies Ltd.
Inventor: Itay Katz
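The two-stage flow above (a first motion test whose result gates a second test, whose associated command is then executed) can be sketched as a short pipeline. The specific image-analysis parameters, thresholds, and commands here are stand-in assumptions, not the patent's actual tests.

```python
# Hypothetical two-stage motion-detection pipeline, per the abstract.

def first_test(params):
    """Coarse test: did the object move at all between frames?"""
    return params["displacement"] > 5           # pixels, assumed threshold

def second_test_horizontal(params):
    """Finer test: is the motion predominantly horizontal?"""
    return abs(params["dx"]) > abs(params["dy"])

def run_pipeline(params):
    if not first_test(params):
        return None                             # no motion: no command
    # the result of the first test gates which command the second test picks
    if second_test_horizontal(params):
        return "cmd_swipe_horizontal"
    return "cmd_swipe_vertical"

print(run_pipeline({"displacement": 12, "dx": 9, "dy": 2}))  # cmd_swipe_horizontal
print(run_pipeline({"displacement": 2, "dx": 1, "dy": 1}))   # None
```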
-
Patent number: 9846486
Abstract: Systems, methods, and non-transitory computer-readable media are disclosed. For example, a touch-free gesture recognition system is disclosed that includes at least one processor. The processor may be configured to enable presentation of first display information to a user to prompt a first touch-free gesture at at least a first location on a display. The processor may also be configured to receive first gesture information from at least one image sensor corresponding to a first gesturing location on the display correlated to a first touch-free gesture by the user, wherein the first gesturing location differs from a location of the first display information at least in part as a result of one eye of the user being dominant over another eye of the user. In addition, the processor may be configured to determine a first offset associated with the location of the first display information and the first gesturing location.
Type: Grant
Filed: June 27, 2014
Date of Patent: December 19, 2017
Assignee: EYESIGHT MOBILE TECHNOLOGIES LTD.
Inventor: Itay Katz
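The calibration step above has a simple arithmetic core: prompt the user to gesture at a known on-screen location, measure where the sensed gesture actually lands (shifted because one eye is dominant), and keep the difference as an offset for later gestures. The coordinates and function names below are illustrative assumptions.

```python
# Minimal sketch of the eye-dominance offset calibration in the abstract.

def calibrate_offset(display_location, gesturing_location):
    """First offset = sensed gesturing location minus the prompted target."""
    return (gesturing_location[0] - display_location[0],
            gesturing_location[1] - display_location[1])

def correct_location(raw_location, offset):
    """Subtract the calibrated offset from subsequent gesture locations."""
    return (raw_location[0] - offset[0], raw_location[1] - offset[1])

offset = calibrate_offset((100, 100), (112, 103))  # user gestured 12 px right, 3 px down
print(offset)                                      # (12, 3)
print(correct_location((412, 203), offset))        # (400, 200)
```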
-
Patent number: 9733789
Abstract: The present invention provides a system and method for interacting with a 3D virtual image containing activatable objects. The system of the invention includes a 3D display device that presents to a user a 3D image, an image sensor and a processor. The processor analyzes images obtained by the image sensor to determine when the user has placed an activating object such as a hand or a finger, or has performed a gesture related to an activatable object, in the 3D space, at the location where the user perceives an activatable object to be located. The user thus perceives that he is "touching" the activatable object with the activating object.
Type: Grant
Filed: August 2, 2012
Date of Patent: August 15, 2017
Assignee: Eyesight Mobile Technologies Ltd.
Inventor: Itay Katz
-
Patent number: 9671869
Abstract: Systems and methods for recognizing an aimed point on a plane are provided. Images captured by one or more image sensors are processed to obtain data indicative of the location of at least one pointing element in the viewing space and data indicative of at least one predefined body part of the user in the viewing space; using the obtained data, an aimed point on the plane is identified. If it is determined that a predefined condition is met, a predefined command and/or message is executed.
Type: Grant
Filed: March 12, 2013
Date of Patent: June 6, 2017
Assignee: EYESIGHT MOBILE TECHNOLOGIES LTD.
Inventors: Itay Katz, Amnon Shenfeld
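The aimed-point computation above reduces to a ray-plane intersection: cast a ray from the predefined body part (here assumed to be an eye position) through the pointing element (a fingertip) and intersect it with the display plane. The sketch assumes the plane is z = 0 in sensor coordinates; the point names and units are illustrative.

```python
# Illustrative geometry for the abstract's aimed-point idea.

def aimed_point_on_plane(eye, fingertip):
    """Intersect the eye -> fingertip ray with the plane z = 0.
    Points are (x, y, z); returns (x, y) on the plane, or None when the
    ray is parallel to it."""
    ex, ey, ez = eye
    fx, fy, fz = fingertip
    dz = fz - ez
    if dz == 0:
        return None                  # ray never reaches the plane
    t = -ez / dz                     # ray parameter where z becomes 0
    return (ex + t * (fx - ex), ey + t * (fy - ey))

# Eye 60 cm from the screen, fingertip 30 cm out and slightly right/up:
print(aimed_point_on_plane((0, 0, 60), (5, 3, 30)))  # (10.0, 6.0)
```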
-
Publication number: 20170052599
Abstract: A method and system for augmented reality. Images of a real world scene are obtained from one or more image sensors. An orientation and/or location of the image sensors is obtained from one or more state sensors. A real world object at which a predefined pointing object is performing a predefined gesture in the images of the real world scene is identified, and data associated with the identified object is displayed on a viewing device. The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.
Type: Application
Filed: September 2, 2016
Publication date: February 23, 2017
Applicant: Eyesight Mobile Technologies, LTD.
Inventors: Itay Katz, Amnon Shenfeld
-
Publication number: 20160320855
Abstract: A method and system for augmented reality. Images of a real world scene are obtained from one or more image sensors. An orientation and/or location of the image sensors is obtained from one or more state sensors. A real world object at which a predefined pointing object is performing a predefined gesture in the images of the real world scene is identified, and data associated with the identified object is displayed on a viewing device. The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.
Type: Application
Filed: May 2, 2016
Publication date: November 3, 2016
Applicant: Eyesight Mobile Technologies, LTD.
Inventors: Itay Katz, Amnon Shenfeld
-
Publication number: 20160306433
Abstract: A method and system for augmented reality. Images of a real world scene are obtained from one or more image sensors. An orientation and/or location of the image sensors is obtained from one or more state sensors. A real world object at which a predefined pointing object is performing a predefined gesture in the images of the real world scene is identified, and data associated with the identified object is displayed on a viewing device. The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.
Type: Application
Filed: April 12, 2016
Publication date: October 20, 2016
Applicant: Eyesight Mobile Technologies, LTD.
Inventors: Itay Katz, Amnon Shenfeld
-
Publication number: 20160291699
Abstract: A method and system for augmented reality. Images of a real world scene are obtained from one or more image sensors. An orientation and/or location of the image sensors is obtained from one or more state sensors. A real world object at which a predefined pointing object is performing a predefined gesture in the images of the real world scene is identified, and data associated with the identified object is displayed on a viewing device. The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.
Type: Application
Filed: April 4, 2016
Publication date: October 6, 2016
Applicant: Eyesight Mobile Technologies, LTD.
Inventors: Itay Katz, Amnon Shenfeld
-
Publication number: 20160253044
Abstract: Systems, devices, methods, and non-transitory computer-readable media are provided for receiving data input via touch-free gestures and movements. For example, a data input device includes at least one processor for receiving information from a sensor. The processor may be configured to receive sensor data from the sensor of a user's hand spaced a distance from a displayed keyboard and in non-contact with the displayed keyboard, and track, using the received sensor data, one or more fingers in air a distance from the displayed keyboard image. The processor may also be configured to correlate locations of the one or more fingers in the air with images of a plurality of keys in the displayed keyboard, and select keys from the keyboard image based on the correlated locations of the one or more fingers in the air, and a detection of a predefined gesture performed by the user.
Type: Application
Filed: October 9, 2014
Publication date: September 1, 2016
Applicant: EYESIGHT MOBILE TECHNOLOGIES LTD.
Inventor: Itay KATZ
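The key-selection logic above combines two conditions: a fingertip location correlated with a key of the displayed keyboard, and a detected "press" gesture. A minimal sketch, assuming a plain rectangular key grid and a boolean gesture flag per tracked sample (the layout and key size are invented for illustration):

```python
# Hypothetical air-typing sketch: fingertip positions over a displayed
# keyboard image are mapped to keys, selected only on a press gesture.

KEY_W, KEY_H = 40, 40              # assumed key size in image pixels
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def key_at(x, y):
    """Map an (x, y) location over the keyboard image to a key label."""
    row, col = int(y // KEY_H), int(x // KEY_W)
    if 0 <= row < len(ROWS) and 0 <= col < len(ROWS[row]):
        return ROWS[row][col]
    return None

def select_keys(finger_samples):
    """Each sample: (x, y, press_gesture_detected). Only samples with the
    predefined gesture produce a selection, mirroring the two conditions."""
    return [key_at(x, y) for x, y, pressed in finger_samples if pressed]

samples = [(10, 10, False), (10, 10, True), (170, 50, True)]
print("".join(select_keys(samples)))  # qg
```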
-
Patent number: 9405970
Abstract: Provided is a system and method for object detection and tracking in a video stream. Frames of the video stream are divided into regions of interest and a probability that the region contains at least a portion of an object to be tracked is calculated for each region of interest. The regions of interest in each frame are then classified based on the calculated probabilities. A region of interest (RI) frame is then constructed for each video frame that reports the classification of regions of interest in the video frame. Two or more RI frames are then compared in order to determine a motion of the object. Also provided is a system executing the presently described method, as well as a device including the system. The device may be, for example, a portable computer, a mobile telephone, or an entertainment device.
Type: Grant
Filed: February 2, 2010
Date of Patent: August 2, 2016
Assignee: eyeSight Mobile Technologies Ltd.
Inventors: Nadav Israel, Itay Katz, Dudi Cohen, Amnon Shenfeld
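The RI-frame pipeline above can be sketched end to end: split each frame into a grid of regions, score each region with a probability of containing the object, threshold those scores into a classified RI frame, and compare two RI frames to read off motion. The probability values, threshold, and centroid-based comparison below are stand-in assumptions.

```python
# Illustrative sketch of region-of-interest (RI) frames for object tracking.

def build_ri_frame(prob_grid, threshold=0.5):
    """Classify each region: 1 if it likely contains the object, else 0."""
    return [[1 if p >= threshold else 0 for p in row] for row in prob_grid]

def centroid(ri_frame):
    """(row, col) centre of the positively classified regions, or None."""
    cells = [(r, c) for r, row in enumerate(ri_frame)
             for c, v in enumerate(row) if v]
    if not cells:
        return None
    return (sum(r for r, _ in cells) / len(cells),
            sum(c for _, c in cells) / len(cells))

def motion_between(ri_a, ri_b):
    """Compare two RI frames: (row, col) displacement of the object."""
    a, b = centroid(ri_a), centroid(ri_b)
    if a is None or b is None:
        return None
    return (b[0] - a[0], b[1] - a[1])

frame1 = build_ri_frame([[0.9, 0.1], [0.2, 0.1]])  # object in the top-left
frame2 = build_ri_frame([[0.1, 0.8], [0.1, 0.2]])  # object moved top-right
print(motion_between(frame1, frame2))              # (0.0, 1.0)
```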
-
Patent number: 9377867
Abstract: A user interface apparatus for controlling any kind of device. Images obtained by an image sensor in a region adjacent to the device are input to a gesture recognition system, which analyzes the images to identify one or more gestures. A message decision maker generates a message based upon an identified gesture and a recognition mode of the gesture recognition system. The recognition mode is changed under one or more various conditions.
Type: Grant
Filed: August 8, 2012
Date of Patent: June 28, 2016
Assignee: EYESIGHT MOBILE TECHNOLOGIES LTD.
Inventors: Itay Katz, Nadav Israel, Tamir Anavi, Shahaf Grofit, Itay Bar-Yosef
-
Patent number: 9274608
Abstract: Systems, methods and non-transitory computer-readable media for triggering actions based on touch-free gesture detection are disclosed. The disclosed systems may include at least one processor. A processor may be configured to receive image information from an image sensor, detect in the image information a gesture performed by a user, detect a location of the gesture in the image information, access information associated with at least one control boundary, the control boundary relating to a physical dimension of a device in a field of view of the user, or a physical dimension of a body of the user as perceived by the image sensor, and cause an action associated with the detected gesture, the detected gesture location, and a relationship between the detected gesture location and the control boundary.
Type: Grant
Filed: November 13, 2013
Date of Patent: March 1, 2016
Assignee: EYESIGHT MOBILE TECHNOLOGIES LTD.
Inventors: Itay Katz, Ofer Affias
-
Publication number: 20160026255
Abstract: Systems, methods, and non-transitory computer-readable media are disclosed. For example, a dual sensor control device is disclosed that includes at least one processor for receiving information from a proximity sensor and an image sensor. The processor may be configured to receive first data from the proximity sensor while the image sensor is in a first state, determine, using the first data, a presence of an object in proximity to the proximity sensor. The processor may also be configured to output, based on the determined presence of the object in proximity to the proximity sensor, a signal to the image sensor to cause the image sensor to enter a second state, different from the first state. The processor may also be configured to receive second data from the image sensor in the second state, and output at least one of a message and a command associated with the second data.
Type: Application
Filed: March 13, 2014
Publication date: January 28, 2016
Applicant: Eyesight Mobile Technologies Ltd.
Inventor: Itay KATZ
-
Patent number: 9046929
Abstract: A system for inputting operating system (OS) commands to a data processing device. The system comprises a video camera that captures images of a viewing space. A processor detects a predetermined object in the images using an object recognition algorithm not involving background information in an image. One or more image analysis parameters of the object are extracted from the images and one or more motion detection tests are applied. Each motion detection test has an associated OS command, and when a test succeeds, the OS command associated with the test is executed. By not relying on background information in an image, the system of the invention may be used in devices that are moved in use, such as a Palm Pilot, personal digital assistant (PDA), a mobile telephone, a digital camera, and a mobile game machine.
Type: Grant
Filed: January 17, 2014
Date of Patent: June 2, 2015
Assignee: eyeSight Mobile Technologies Ltd.
Inventor: Itay Katz
-
Publication number: 20140375547
Abstract: The presently disclosed subject matter includes a method of recognizing an aimed point on a plane. Images captured by one or more image sensors are processed to obtain data indicative of the location of at least one pointing element in the viewing space and data indicative of at least one predefined body part of the user in the viewing space; using the obtained data, an aimed point on the plane is identified. If it is determined that a predefined condition is met, a predefined command and/or message is executed.
Type: Application
Filed: March 12, 2013
Publication date: December 25, 2014
Applicant: Eyesight Mobile Technologies Ltd.
Inventors: Itay Katz, Amnon Shenfeld