Patents by Inventor Micha Galor

Micha Galor has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20170373105
    Abstract: Provided are techniques for simulating an aperture in a digital imaging device, the aperture simulation generated by a multi-diode pixel image sensor. In one aspect, a method includes detecting light incident on a first light sensitive region on a first photodiode of a pixel, and detecting light incident on a second light sensitive region on a second photodiode of the pixel. The method further includes combining, for each pixel, signals from the first and second light sensitive regions, generating, for a first aperture setting, a first image based at least in part on the light received from the first light sensitive region, and generating, for a second aperture setting, a second image based at least in part on the light received from the second light sensitive region.
    Type: Application
    Filed: June 23, 2016
    Publication date: December 28, 2017
    Inventor: Micha Galor Gluskin
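    A minimal sketch (Python) of the two-image idea the abstract above describes: one rendering uses only a single light-sensitive region per pixel, approximating a narrower aperture, while a second rendering sums both regions, approximating a wider aperture. The array names and the simple summation model are illustrative assumptions, not the patented implementation.

        import numpy as np

        def simulate_apertures(diode_a, diode_b):
            """Form two images from a dual-photodiode sensor readout.

            diode_a, diode_b: 2D arrays holding one sample per pixel from each
            light-sensitive region (e.g. left/right halves under a shared microlens).
            Returns (narrow_aperture_image, wide_aperture_image).
            """
            # Narrow-aperture rendering: use only one light-sensitive region,
            # which sees a restricted cone of incident angles.
            narrow = diode_a.astype(np.float32)
            # Wide-aperture rendering: combine both regions per pixel, which
            # integrates light over the full cone admitted by the lens.
            wide = diode_a.astype(np.float32) + diode_b.astype(np.float32)
            return narrow, wide

        # Example with synthetic 4x4 readouts.
        rng = np.random.default_rng(0)
        a = rng.integers(0, 512, size=(4, 4))
        b = rng.integers(0, 512, size=(4, 4))
        narrow_img, wide_img = simulate_apertures(a, b)
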
  • Patent number: 9836201
    Abstract: A user interface method, including presenting, by a computer executing a user interface, multiple interactive items on a display. A first sequence of images is captured indicating a position in space of a hand of a user in proximity to the display, and responsively to the position, one of the interactive items is associated with the hand. After associating the item, a second sequence of images is captured indicating a movement of the hand, and responsively to the movement, the size of the associated item is changed on the display.
    Type: Grant
    Filed: September 15, 2014
    Date of Patent: December 5, 2017
    Assignee: APPLE INC.
    Inventors: Martin Frey, Marcus Hauer, Dario Buzzini, Philipp Schaefer, Adi Berenson, Micha Galor, Nili Metuki, Alexander Shpunt
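    A rough sketch (Python) of the resize behaviour in the abstract above, assuming hand positions arrive as 3D coordinates and that displacement along one axis is mapped to a scale factor applied to the associated item; the axis choice, gain, and size limits are illustrative assumptions.

        def resize_item(item_size, hand_start, hand_now, gain=0.005,
                        min_size=20.0, max_size=800.0):
            """Scale an on-screen item based on hand movement since association.

            item_size:  (width, height) in pixels when the item was associated.
            hand_start: (x, y, z) hand position at association time.
            hand_now:   current (x, y, z) hand position, same units (e.g. mm).
            """
            # Use movement along the x axis as the resize control: moving the
            # hand one way enlarges the item, moving it back shrinks it.
            dx = hand_now[0] - hand_start[0]
            scale = max(0.1, 1.0 + gain * dx)
            width = min(max(item_size[0] * scale, min_size), max_size)
            height = min(max(item_size[1] * scale, min_size), max_size)
            return (width, height)

        # The item grows as the hand moves 100 mm to the right.
        print(resize_item((200, 150), (0, 0, 600), (100, 0, 600)))  # (300.0, 225.0)
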
  • Patent number: 9829988
    Abstract: A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
    Type: Grant
    Filed: August 11, 2016
    Date of Patent: November 28, 2017
    Assignee: APPLE INC.
    Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
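    One way to picture the gesture in the abstract above is as a forward-then-back motion along a chosen axis. The sketch below (Python) detects such a motion from a stream of 3D hand coordinates and toggles a two-state flag; the distance threshold, the return tolerance, and the axis choice are illustrative assumptions.

        def detect_push_pull(coords, axis=2, threshold=80.0):
            """Return True once the coordinate stream shows a motion of at least
            `threshold` along `axis` followed by a comparable motion back in the
            opposite direction (e.g. a push toward the screen, then a pull back)."""
            start = coords[0][axis]
            extreme = start
            for point in coords:
                value = point[axis]
                # Track the farthest excursion from the starting position.
                if abs(value - start) > abs(extreme - start):
                    extreme = value
                # The gesture completes once the hand has moved far enough out
                # and has returned most of the way toward where it started.
                if abs(extreme - start) >= threshold and abs(value - start) <= threshold * 0.25:
                    return True
            return False

        state = "first"
        hand_path = [(0, 0, 500), (0, 0, 450), (0, 0, 410), (0, 0, 455), (0, 0, 495)]
        if detect_push_pull(hand_path):
            state = "second" if state == "first" else "first"
        print(state)  # "second"
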
  • Patent number: 9804357
    Abstract: Certain aspects relate to systems and techniques for using imaging pixels (that is, non-phase detection pixels) in addition to phase detection pixels for calculating autofocus information. Imaging pixel values can be used to interpolate a value at a phase detection pixel location. The interpolated value and a value received from the phase difference detection pixel can be used to obtain a virtual phase detection pixel value. The interpolated value, value received from the phase difference detection pixel, and the virtual phase detection pixel value can be used to obtain a phase difference detection signal indicating a shift direction (defocus direction) and a shift amount (defocus amount) of image focus.
    Type: Grant
    Filed: September 25, 2015
    Date of Patent: October 31, 2017
    Assignee: QUALCOMM Incorporated
    Inventors: Micha Galor Gluskin, Ruben Manuel Velarde, Jisoo Lee
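    A simplified sketch (Python) of the pairing the abstract above describes: an imaging value is interpolated at a phase-detection pixel's location from neighbouring imaging pixels, that value can be paired with the measured phase-detection sample, and a phase difference is estimated by comparing the left-looking and right-looking sequences over candidate shifts. The horizontal-neighbour kernel, the one-dimensional layout, and the sum-of-absolute-differences matching are assumptions for illustration.

        import numpy as np

        def interpolate_at(image, y, x):
            """Estimate an imaging value at a phase-detection pixel location from
            its horizontal neighbours (illustrative kernel)."""
            return 0.5 * (image[y, x - 1] + image[y, x + 1])

        def phase_difference(left_vals, right_vals, max_shift=4):
            """Pick the shift minimising the sum of absolute differences between the
            left-looking and right-looking sequences; the sign indicates the defocus
            direction and the magnitude relates to the defocus amount."""
            left = np.asarray(left_vals, dtype=np.float32)
            right = np.asarray(right_vals, dtype=np.float32)
            best_shift, best_cost = 0, np.inf
            for s in range(-max_shift, max_shift + 1):
                if s >= 0:
                    cost = np.abs(left[s:] - right[:len(right) - s]).mean()
                else:
                    cost = np.abs(left[:s] - right[-s:]).mean()
                if cost < best_cost:
                    best_shift, best_cost = s, cost
            return best_shift

        image = np.arange(25, dtype=np.float32).reshape(5, 5)
        print(interpolate_at(image, 2, 2))  # 12.0, average of the two neighbours

        # Toy example: the right sequence is the left sequence offset by two samples.
        left = [10, 12, 20, 35, 50, 40, 25, 15, 12, 10]
        right = left[2:] + [10, 10]
        print(phase_difference(left, right))  # 2
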
  • Publication number: 20170230649
    Abstract: Methods and apparatuses for calibration of hybrid auto focus (AF) imaging systems are disclosed. In one aspect, a method is operable by an imaging device including a hybrid auto focus (AF) system comprising a lens. The method may include capturing an image of a scene, determining that the image of the scene is out of focus, estimating, via a first AF process of the hybrid AF system, an initial lens position at which the image is in focus in response to determining that the image is out of focus, and moving the lens to the initial lens position. The method may also include determining, via a second AF process of the hybrid AF system, a final lens position in response to the movement of the lens to the initial lens position, and calibrating the first AF process based on the determined final lens position.
    Type: Application
    Filed: February 5, 2016
    Publication date: August 10, 2017
    Inventors: Micha Galor Gluskin, Jisoo Lee
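    The calibration loop in the abstract above can be pictured as: a fast first AF process proposes an initial lens position, a slower second AF process settles on a final position, and the observed error feeds back into future proposals. The running-average correction below (Python) is an assumed illustration, not the claimed method.

        class HybridAFCalibrator:
            """Maintain a correction for the first (fast) AF process based on where
            the second (fine) AF process finally places the lens."""

            def __init__(self, learning_rate=0.2):
                self.offset = 0.0              # running correction, lens-position units
                self.learning_rate = learning_rate

            def corrected_estimate(self, initial_lens_position):
                # Apply the learned correction before moving the lens.
                return initial_lens_position + self.offset

            def update(self, initial_lens_position, final_lens_position):
                # Error between where the fast process pointed and where the fine
                # process ended up; blend it into the running correction.
                error = final_lens_position - self.corrected_estimate(initial_lens_position)
                self.offset += self.learning_rate * error

        calibrator = HybridAFCalibrator()
        calibrator.update(initial_lens_position=120.0, final_lens_position=128.0)
        print(calibrator.corrected_estimate(120.0))  # 121.6, drifting toward 128
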
  • Patent number: 9729779
    Abstract: Certain aspects relate to systems and techniques for using imaging pixels (that is, non-phase detection pixels) for performing noise reduction on phase detection autofocus. Advantageously, this can provide for more accurate phase detection autofocus and for optimized processor usage when performing phase detection. The phase difference detection pixels are provided to obtain a phase difference detection signal indicating a shift direction (defocus direction) and a shift amount (defocus amount) of image focus, and analysis of imaging pixel values can be used to estimate a level of focus of an in-focus region of interest and to limit the identified phase difference accordingly.
    Type: Grant
    Filed: July 27, 2016
    Date of Patent: August 8, 2017
    Assignee: QUALCOMM Incorporated
    Inventors: Micha Galor Gluskin, Ruben Manuel Velarde, Jisoo Lee
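    A loose sketch (Python) of the limiting step in the abstract above: a sharpness score computed from ordinary imaging pixels in the region of interest caps how large a defocus the noisier phase-detection signal is allowed to report. The gradient-based sharpness measure and the linear cap schedule are illustrative assumptions.

        import numpy as np

        def sharpness(roi):
            """Mean absolute horizontal gradient of the imaging pixels in the ROI;
            larger values suggest the region is closer to being in focus."""
            roi = np.asarray(roi, dtype=np.float32)
            return float(np.abs(np.diff(roi, axis=1)).mean())

        def limited_defocus(phase_defocus, roi, sharp_level=30.0, max_defocus=50.0):
            """Clamp the phase-detection defocus estimate according to how sharp the
            imaging pixels already look."""
            s = sharpness(roi)
            # The sharper the ROI appears, the smaller the correction we accept
            # from the phase-detection signal.
            allowed = max_defocus * max(0.0, 1.0 - s / sharp_level)
            return float(np.clip(phase_defocus, -allowed, allowed))

        rng = np.random.default_rng(1)
        blurry_roi = rng.normal(128, 2, size=(16, 16))   # low gradients, looks defocused
        print(limited_defocus(40.0, blurry_roi))         # passed through nearly unchanged
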
  • Publication number: 20170185161
    Abstract: A gesture-based user interface includes a movement monitor configured to monitor a user's hand and to provide a signal based on movements of the hand. A processor is configured to provide at least one interface state in which a cursor is confined to movement within a single-dimension region responsive to the signal from the movement monitor, and to actuate different commands responsive to the signal from the movement monitor and the location of the cursor in the single-dimension region.
    Type: Application
    Filed: February 16, 2017
    Publication date: June 29, 2017
    Inventors: Amir Hoffnung, Micha Galor, Jonathan Pokrass, Roee Shenberg, Shlomo Zippel
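    A bare-bones sketch (Python) of one such interface state: the hand position reported by the movement monitor is projected onto a single axis, the cursor is clamped to that single-dimension region, and the command actuated depends on where in the region the cursor sits. The zone boundaries and command names are illustrative assumptions.

        def cursor_position(hand_x, region_min=0.0, region_max=1.0,
                            x_min=-200.0, x_max=200.0):
            """Map the hand's x coordinate (e.g. mm) into a cursor confined to a
            one-dimensional region."""
            t = (hand_x - x_min) / (x_max - x_min)
            cursor = region_min + t * (region_max - region_min)
            return min(max(cursor, region_min), region_max)

        def command_for(cursor):
            """Actuate a different command depending on the cursor's location within
            the single-dimension region (illustrative zones)."""
            if cursor < 0.33:
                return "previous_item"
            if cursor < 0.66:
                return "select"
            return "next_item"

        for hand_x in (-180.0, 0.0, 150.0):
            cursor = cursor_position(hand_x)
            print(round(cursor, 2), command_for(cursor))
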
  • Publication number: 20170094210
    Abstract: An example image capture device includes an image sensor having diodes for sensing light from a target scene, a color filter array disposed above the diodes and including color filters each positioned over one of the diodes, single-diode microlenses positioned above some color filters arranged in a Bayer pattern, and multi-diode microlenses each positioned above at least two adjacent color filters that pass the same wavelengths of light to corresponding adjacent diodes below the color filters, each multi-diode microlens formed such that light incident in a first direction is collected in one of the adjacent diodes and light incident in a second direction is collected in another of the adjacent diodes. An image signal processor of the image capture device can perform phase detection autofocus using signals received from the adjacent diodes and can interpolate color values for the adjacent diodes.
    Type: Application
    Filed: September 24, 2015
    Publication date: March 30, 2017
    Inventor: Micha Galor Gluskin
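    For the sensor layout in the abstract above, the processing side can be pictured roughly as follows: the two same-color diodes under a shared multi-diode microlens are summed to recover an imaging sample while their individual values feed phase-detection autofocus, and color values at those locations can be interpolated from neighbouring mosaic samples. The flat 2x1 layout and the simple four-neighbour kernel below (Python) are assumptions for illustration.

        import numpy as np

        def dual_diode_samples(diode_left, diode_right):
            """Return (imaging_value, pd_left, pd_right) for two adjacent same-color
            diodes under one multi-diode microlens: the sum serves as an imaging
            sample, the individual diodes serve phase-detection autofocus."""
            return float(diode_left) + float(diode_right), float(diode_left), float(diode_right)

        def interpolate_color(mosaic, y, x):
            """Interpolate a color value at (y, x) from the four nearest neighbours of
            the mosaic (illustrative kernel; a real demosaic is more involved)."""
            return 0.25 * (mosaic[y - 1, x] + mosaic[y + 1, x] +
                           mosaic[y, x - 1] + mosaic[y, x + 1])

        mosaic = np.arange(36, dtype=np.float32).reshape(6, 6)
        print(dual_diode_samples(mosaic[2, 2], mosaic[2, 3]))  # (29.0, 14.0, 15.0)
        print(interpolate_color(mosaic, 3, 3))                 # 21.0
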
  • Publication number: 20170090149
    Abstract: Certain aspects relate to systems and techniques for using imaging pixels (that is, non-phase detection pixels) in addition to phase detection pixels for calculating autofocus information. Imaging pixel values can be used to interpolate a value at a phase detection pixel location. The interpolated value and a value received from the phase difference detection pixel can be used to obtain a virtual phase detection pixel value. The interpolated value, value received from the phase difference detection pixel, and the virtual phase detection pixel value can be used to obtain a phase difference detection signal indicating a shift direction (defocus direction) and a shift amount (defocus amount) of image focus.
    Type: Application
    Filed: September 25, 2015
    Publication date: March 30, 2017
    Inventors: Micha Galor Gluskin, Ruben Manuel Velarde, Jisoo Lee
  • Publication number: 20170094149
    Abstract: Certain aspects relate to systems and techniques for using imaging pixels (that is, non-phase detection pixels) for performing noise reduction on phase detection autofocus. Advantageously, this can provide for more accurate phase detection autofocus and for optimized processor usage when performing phase detection. The phase difference detection pixels are provided to obtain a phase difference detection signal indicating a shift direction (defocus direction) and a shift amount (defocus amount) of image focus, and analysis of imaging pixel values can be used to estimate a level of focus of an in-focus region of interest and to limit the identified phase difference accordingly.
    Type: Application
    Filed: July 27, 2016
    Publication date: March 30, 2017
    Inventors: Micha Galor Gluskin, Ruben Manuel Velarde, Jisoo Lee
  • Publication number: 20160370860
    Abstract: A method includes receiving a sequence of three-dimensional (3D) maps of at least a part of a body of a user of a computerized system and extracting, from the 3D maps, 3D coordinates of a head of the user. Based on the 3D coordinates of the head, a direction of a gaze performed by the user and an interactive item presented in the direction of the gaze on a display coupled to the computerized system are identified. An indication that the user is moving a limb of the body in a specific direction is extracted from the 3D maps, and the identified interactive item is repositioned on the display responsively to the indication.
    Type: Application
    Filed: September 5, 2016
    Publication date: December 22, 2016
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
  • Publication number: 20160349853
    Abstract: A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
    Type: Application
    Filed: August 11, 2016
    Publication date: December 1, 2016
    Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
  • Patent number: 9459758
    Abstract: A method includes presenting, on a display coupled to a computer, an image of a keyboard comprising multiple keys, and receiving a sequence of three-dimensional (3D) maps including a hand of a user positioned in proximity to the display. An initial portion of the sequence of 3D maps is processed to detect a transverse gesture performed by the hand, and a cursor is presented on the display at a position indicated by the transverse gesture. While the cursor is presented in proximity to one of the multiple keys, that key is selected upon detecting a grab gesture followed by a pull gesture followed by a release gesture in a subsequent portion of the sequence of 3D maps.
    Type: Grant
    Filed: May 29, 2013
    Date of Patent: October 4, 2016
    Assignee: APPLE INC.
    Inventors: Adi Berenson, Micha Galor, Jonathan Pokrass, Ran Shani, Daniel Shein, Eran Weissenstern, Martin Frey, Amir Hoffnung, Nili Metuki
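    The key-selection sequence in the abstract above (grab, then pull, then release while the cursor hovers over a key) can be modelled as a small state machine over recognised gesture events; the event names and the reset-on-unexpected-gesture behaviour below (Python) are illustrative assumptions.

        class KeySelector:
            """Select the hovered key only after observing grab -> pull -> release."""

            SEQUENCE = ("grab", "pull", "release")

            def __init__(self):
                self.stage = 0

            def feed(self, gesture, hovered_key):
                """Consume one recognised gesture; return the selected key or None."""
                if gesture == self.SEQUENCE[self.stage]:
                    self.stage += 1
                    if self.stage == len(self.SEQUENCE):
                        self.stage = 0
                        return hovered_key      # full sequence completed on this key
                else:
                    # Any out-of-order gesture restarts the sequence.
                    self.stage = 1 if gesture == "grab" else 0
                return None

        selector = KeySelector()
        selected = None
        for gesture in ("grab", "pull", "release"):
            selected = selector.feed(gesture, hovered_key="H") or selected
        print(selected)  # "H"
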
  • Patent number: 9454225
    Abstract: A method includes receiving an image including an eye of a user of a computerized system and identifying, based on the image of the eye, a direction of a gaze performed by the user. Based on the direction of the gaze, a region on a display coupled to the computerized system is identified, and an operation is performed on content presented in the region.
    Type: Grant
    Filed: August 7, 2013
    Date of Patent: September 27, 2016
    Assignee: APPLE INC.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
  • Patent number: 9420164
    Abstract: Certain aspects relate to systems and techniques for using imaging pixels (that is, non-phase detection pixels) for performing noise reduction on phase detection autofocus. Advantageously, this can provide for more accurate phase detection autofocus and for optimized processor usage when performing phase detection. The phase difference detection pixels are provided to obtain a phase difference detection signal indicating a shift direction (defocus direction) and a shift amount (defocus amount) of image focus, and analysis of imaging pixel values can be used to estimate a level of focus of an in-focus region of interest and to limit the identified phase difference accordingly.
    Type: Grant
    Filed: September 25, 2015
    Date of Patent: August 16, 2016
    Assignee: QUALCOMM Incorporated
    Inventors: Micha Galor Gluskin, Ruben Manuel Velarde, Jisoo Lee
  • Patent number: 9417706
    Abstract: A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
    Type: Grant
    Filed: May 17, 2015
    Date of Patent: August 16, 2016
    Assignee: APPLE INC.
    Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
  • Patent number: 9377863
    Abstract: A method, including presenting, by a computer, multiple interactive items on a display coupled to the computer, and receiving an input indicating a direction of a gaze of a user of the computer. In response to the gaze direction, one of the multiple interactive items is selected, and subsequent to the one of the interactive items being selected, a sequence of three-dimensional (3D) maps is received containing at least a hand of the user. The 3D maps are analyzed to detect a gesture performed by the user, and an operation is performed on the selected interactive item in response to the gesture.
    Type: Grant
    Filed: March 24, 2013
    Date of Patent: June 28, 2016
    Assignee: APPLE INC.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Eshel
  • Patent number: 9377865
    Abstract: A method includes arranging, by a computer, multiple interactive objects as a hierarchical data structure, each node of the hierarchical data structure associated with a respective one of the multiple interactive objects, and presenting, on a display coupled to the computer, a first subset of the multiple interactive objects that are associated with one or more child nodes of one of the multiple interactive objects. A sequence of three-dimensional (3D) maps including at least part of a hand of a user positioned in proximity to the display is received, the hand is identified in the 3D maps performing a transverse gesture followed by a grab gesture followed by a longitudinal gesture followed by an execute gesture, and an operation associated with the object selected by these gestures is accordingly performed.
    Type: Grant
    Filed: May 29, 2013
    Date of Patent: June 28, 2016
    Assignee: APPLE INC.
    Inventors: Adi Berenson, Dana Cohen, Micha Galor, Jonathan Pokrass, Ran Shani, Daniel Shein, Orlee Tal, Arnon Yaari, Eran Weissenstern, Martin Frey, Amir Hoffnung, Nili Metuki, Marcus Hauer
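    The hierarchical arrangement in the abstract above maps naturally onto a tree whose nodes each hold one interactive object; presenting a subset then amounts to listing the children of the currently selected node, with the gesture sequence driving navigation and execution. The node class and the small media hierarchy below (Python) are assumptions for illustration.

        from dataclasses import dataclass, field
        from typing import List, Optional

        @dataclass
        class InteractiveObject:
            """One node of the hierarchy; each node holds one interactive object."""
            name: str
            children: List["InteractiveObject"] = field(default_factory=list)
            parent: Optional["InteractiveObject"] = None

            def add(self, child: "InteractiveObject") -> "InteractiveObject":
                child.parent = self
                self.children.append(child)
                return child

        def present(node: InteractiveObject) -> List[str]:
            """Return the subset of objects to show: the children of `node`."""
            return [child.name for child in node.children]

        # Build a small hierarchy and navigate one level down.
        root = InteractiveObject("library")
        movies = root.add(InteractiveObject("movies"))
        root.add(InteractiveObject("music"))
        movies.add(InteractiveObject("comedies"))
        movies.add(InteractiveObject("dramas"))

        print(present(root))    # ['movies', 'music']
        print(present(movies))  # ['comedies', 'dramas']
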
  • Patent number: 9342146
    Abstract: A method includes receiving and segmenting a first sequence of three-dimensional (3D) maps over time of at least a part of a body of a user of a computerized system in order to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a motion of the second point with respect to a display coupled to the computerized system. A line segment that intersects the first point and the second point is calculated, and a target point is identified where the line segment intersects the display. An interactive item presented on the display in proximity to the target point is engaged.
    Type: Grant
    Filed: August 7, 2013
    Date of Patent: May 17, 2016
    Assignee: APPLE INC.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
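    The pointing geometry in the abstract above reduces to intersecting the line through the two extracted body points (for example an eye or shoulder and a fingertip) with the display plane; the sketch below (Python) assumes the display lies in the plane z = 0 and ignores the conversion from physical coordinates to screen pixels.

        def target_point(first_point, second_point):
            """Intersect the line through first_point and second_point with the
            display plane z = 0 (coordinates in a display-centred frame, e.g. mm).

            Returns the (x, y) target point on the display, or None if the line is
            parallel to the display plane."""
            x1, y1, z1 = first_point
            x2, y2, z2 = second_point
            dz = z2 - z1
            if dz == 0:
                return None
            # Parametrise the line as P(t) = P1 + t * (P2 - P1) and solve for z = 0.
            t = -z1 / dz
            return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

        # Shoulder 600 mm from the screen, fingertip 400 mm from it and offset up
        # and to the right: the target lands further along that direction.
        print(target_point((0.0, 0.0, 600.0), (100.0, 50.0, 400.0)))  # (300.0, 150.0)
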
  • Patent number: 9285874
    Abstract: A method, including receiving a three-dimensional (3D) map of at least a part of a body of a user (22) of a computerized system, and receiving a two dimensional (2D) image of the user, the image including an eye (34) of the user. 3D coordinates of a head (32) of the user are extracted from the 3D map and the 2D image, and a direction of a gaze performed by the user is identified based on the 3D coordinates of the head and the image of the eye.
    Type: Grant
    Filed: February 9, 2012
    Date of Patent: March 15, 2016
    Assignee: APPLE INC.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner