Patents by Inventor Micha Galor
Micha Galor has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10079968
Abstract: Embodiments may be directed to lens cameras, which may be cameras arranged as a sensor in a lens cap. A lens camera may comprise a printed circuit board with a digital image sensor and associated components enclosed in a cylindrical body that may be constructed of metal, plastic, or the like, or a combination thereof. Lens cameras may be fitted with lens mounts for attaching to host devices, cameras, interchangeable lenses, or the like. Lens mounts on a lens camera may be arranged to be compatible with one or more standard lens mounts. Accordingly, a lens camera may be attached to cameras that have compatible lens mounts. Also, interchangeable lenses having lens mounts compatible with the lens camera may be attached to the lens camera. Further, lens cameras may communicate with host devices using wired or wireless communication facilities.
Type: Grant
Filed: November 27, 2013
Date of Patent: September 18, 2018
Assignee: QUALCOMM Incorporated
Inventors: Micha Galor, Eran David Pinhasov
-
Patent number: 10044926
Abstract: Systems, methods, and devices for optimizing phase detection autofocus (PDAF) processing are provided. One aspect provides an apparatus comprising: an image sensor configured to capture image data of a scene; a buffer; and a processor. The processor may be configured to store the image data in the buffer as a current frame and divide the current frame into a plurality of windows each corresponding to a different spatial region of the scene. The processor may be further configured to identify a central portion of the current frame comprising a subset of the plurality of windows. The processor may be further configured to determine a depth value of the central portion based on performing PDAF on the subset of the plurality of windows and determine a confidence value for the central portion based on the depth value and image data corresponding to the subset of the plurality of windows.
Type: Grant
Filed: July 19, 2017
Date of Patent: August 7, 2018
Assignee: QUALCOMM Incorporated
Inventors: Micha Galor Gluskin, Sanket Krishnamurthy Sagar
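The windowed PDAF flow described in this abstract can be pictured with a short, hypothetical sketch: split a frame into a grid of windows, take the central subset, estimate a phase disparity there, and score a confidence from local contrast. The function names, grid size, and confidence heuristic below are illustrative assumptions, not the patented implementation.

# Illustrative sketch only (not the patented implementation): windowed PDAF
# with a depth estimate and a confidence score for the central region.
import numpy as np

def split_into_windows(frame, rows=5, cols=5):
    """Split a 2-D frame into a grid of roughly equal windows."""
    h, w = frame.shape
    return [frame[r * h // rows:(r + 1) * h // rows,
                  c * w // cols:(c + 1) * w // cols]
            for r in range(rows) for c in range(cols)]

def pdaf_disparity(left, right, max_shift=8):
    """Estimate phase disparity by minimizing the sum of absolute differences."""
    costs = [np.abs(np.roll(right, s, axis=1) - left).mean()
             for s in range(-max_shift, max_shift + 1)]
    return np.argmin(costs) - max_shift

def central_depth_and_confidence(left_frame, right_frame, rows=5, cols=5):
    left_windows = split_into_windows(left_frame, rows, cols)
    right_windows = split_into_windows(right_frame, rows, cols)
    # Central portion: the single middle window of the grid (a simplification
    # of "a subset of the plurality of windows").
    center = (rows // 2) * cols + cols // 2
    disparity = pdaf_disparity(left_windows[center], right_windows[center])
    # Confidence heuristic: local contrast of the central window, scaled down
    # when the estimated disparity is large (far from focus).
    contrast = left_windows[center].std()
    confidence = contrast / (1.0 + abs(disparity))
    return disparity, confidence

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    left = rng.random((100, 100))
    right = np.roll(left, 2, axis=1)      # simulate a 2-pixel phase shift
    print(central_depth_and_confidence(left, right))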
-
Patent number: 10044959
Abstract: An example image capture device includes an image sensor having diodes for sensing light from a target scene, a color filter array disposed above the diodes and including color filters each positioned over one of the diodes, single-diode microlenses positioned above some color filters arranged in a Bayer pattern, and multi-diode microlenses each positioned above at least two adjacent color filters that pass the same wavelengths of light to corresponding adjacent diodes below the color filters, each multi-diode microlens formed such that light incident in a first direction is collected in one of the adjacent diodes and light incident in a second direction is collected in another of the adjacent diodes. An image signal processor of the image capture device can perform phase detection autofocus using signals received from the adjacent diodes and can interpolate color values for the adjacent diodes.
Type: Grant
Filed: September 24, 2015
Date of Patent: August 7, 2018
Assignee: QUALCOMM Incorporated
Inventor: Micha Galor Gluskin
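As a rough picture of the two processing roles the abstract mentions (phase detection from diode pairs, plus color interpolation at those locations), here is a hypothetical sketch. The pair layout, the left-minus-right defocus proxy, and the neighbour-averaging interpolation are invented for illustration.

# Hypothetical sketch (names invented here): pairs of adjacent same-color
# diodes under one multi-diode microlens provide left/right phase samples,
# and their color values are interpolated from surrounding imaging diodes.
import numpy as np

def phase_samples(raw, pair_locations):
    """Collect (left, right) samples from diode pairs under shared microlenses.

    pair_locations: list of (row, col) giving the left diode of each pair;
    the right diode is assumed to be the next column over.
    """
    return [(raw[r, c], raw[r, c + 1]) for r, c in pair_locations]

def estimate_defocus(samples):
    """Signed defocus proxy: mean left-minus-right difference of the pairs."""
    lefts = np.array([l for l, _ in samples], dtype=float)
    rights = np.array([r for _, r in samples], dtype=float)
    return float((lefts - rights).mean())

def interpolate_pairs(raw, pair_locations):
    """Replace each phase-pair value with the average of its horizontal
    neighbours so the pair does not leave an artifact in the color image."""
    out = raw.astype(float).copy()
    for r, c in pair_locations:
        left_n = raw[r, c - 1] if c > 0 else raw[r, c + 2]
        right_n = raw[r, c + 2] if c + 2 < raw.shape[1] else raw[r, c - 1]
        out[r, c] = out[r, c + 1] = (float(left_n) + float(right_n)) / 2.0
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    raw = rng.integers(0, 1023, size=(8, 8))
    pairs = [(2, 2), (5, 4)]
    print(estimate_defocus(phase_samples(raw, pairs)))
    print(interpolate_pairs(raw, pairs)[2, 2:4])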
-
Patent number: 10031578
Abstract: A method includes receiving a sequence of three-dimensional (3D) maps of at least a part of a body of a user of a computerized system and extracting, from the 3D maps, 3D coordinates of a head of the user. Based on the 3D coordinates of the head, a direction of a gaze performed by the user and an interactive item presented in the direction of the gaze on a display coupled to the computerized system are identified. An indication is extracted from the 3D maps that the user is moving a limb of the body in a specific direction, and the identified interactive item is repositioned on the display responsively to the indication.
Type: Grant
Filed: September 5, 2016
Date of Patent: July 24, 2018
Assignee: APPLE INC.
Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
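A minimal, hypothetical sketch of the flow this abstract describes: take head coordinates from a stream of 3D maps, pick the on-screen item in the gaze direction, and when a limb moves in a given direction, move that item. The data structures, projection, and thresholds below are assumptions, not Apple's code.

from dataclasses import dataclass

@dataclass
class Item:
    name: str
    x: float   # normalized screen position, 0..1
    y: float

def gaze_point(head_xyz):
    """Project the head position onto a normalized screen point
    (a crude stand-in for a real gaze estimate)."""
    x, y, z = head_xyz
    return (min(max(0.5 + x / max(z, 1e-6), 0.0), 1.0),
            min(max(0.5 + y / max(z, 1e-6), 0.0), 1.0))

def item_under_gaze(items, gaze):
    gx, gy = gaze
    return min(items, key=lambda it: (it.x - gx) ** 2 + (it.y - gy) ** 2)

def limb_direction(limb_positions, threshold=0.05):
    """Return a direction if the limb moved far enough, else None."""
    (x0, y0, _), (x1, y1, _) = limb_positions[0], limb_positions[-1]
    dx, dy = x1 - x0, y1 - y0
    if dx * dx + dy * dy < threshold * threshold:
        return None
    return dx, dy

def reposition(items, head_xyz, limb_positions, step=0.1):
    target = item_under_gaze(items, gaze_point(head_xyz))
    direction = limb_direction(limb_positions)
    if direction is not None:
        dx, dy = direction
        # Nudge the gazed-at item in the direction of the limb movement.
        target.x += step if dx > 0 else -step if dx < 0 else 0.0
        target.y += step if dy > 0 else -step if dy < 0 else 0.0
    return target

if __name__ == "__main__":
    icons = [Item("mail", 0.2, 0.5), Item("photos", 0.8, 0.5)]
    moved = reposition(icons, head_xyz=(0.3, 0.0, 1.0),
                       limb_positions=[(0.0, 0.0, 1.0), (0.2, 0.0, 1.0)])
    print(moved)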
-
Publication number: 20180131862
Abstract: Systems, methods, and devices for optimizing phase detection autofocus (PDAF) processing are provided. One aspect provides an apparatus comprising: an image sensor configured to capture image data of a scene; a buffer; and a processor. The processor may be configured to store the image data in the buffer as a current frame and divide the current frame into a plurality of windows each corresponding to a different spatial region of the scene. The processor may be further configured to identify a central portion of the current frame comprising a subset of the plurality of windows. The processor may be further configured to determine a depth value of the central portion based on performing PDAF on the subset of the plurality of windows and determine a confidence value for the central portion based on the depth value and image data corresponding to the subset of the plurality of windows.
Type: Application
Filed: July 19, 2017
Publication date: May 10, 2018
Inventors: Micha Galor Gluskin, Sanket Krishnamurthy Sagar
-
Publication number: 20170373105
Abstract: Provided are techniques for simulating an aperture in a digital imaging device, the aperture simulation generated by a multi-diode pixel image sensor. In one aspect, a method includes detecting light incident on a first light sensitive region on a first photodiode of a pixel, and detecting light incident on a second light sensitive region on a second photodiode of the pixel. The method further includes combining, for each pixel, signals from the first and second light sensitive regions, generating, for a first aperture setting, a first image based at least in part on the light received from the first light sensitive region, and generating, for a second aperture setting, a second image based at least in part on the light received from the second light sensitive region.
Type: Application
Filed: June 23, 2016
Publication date: December 28, 2017
Inventor: Micha Galor Gluskin
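One way to picture the dual-photodiode idea is the toy sketch below: summing the two per-pixel signals stands in for the wide-aperture capture, and reading only one sub-diode stands in for a narrower effective aperture. The simple sum-and-scale model and array names are assumptions made for illustration, not the published method.

# Sketch under assumed behaviour: each pixel has two photodiodes; combining
# the two signals approximates the wide-aperture image, while a single
# sub-diode approximates a narrower aperture (less of the lens pupil sampled).
import numpy as np

def simulate_apertures(left_diode, right_diode):
    """Return (wide_aperture_image, narrow_aperture_image) from the two
    per-pixel photodiode signal planes."""
    wide = left_diode.astype(float) + right_diode.astype(float)
    # Using a single sub-diode samples roughly half the pupil; scale by 2 so
    # the two outputs have comparable brightness.
    narrow = 2.0 * left_diode.astype(float)
    return wide, narrow

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    left = rng.random((4, 4))
    right = rng.random((4, 4))
    wide, narrow = simulate_apertures(left, right)
    print(wide.mean(), narrow.mean())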
-
Patent number: 9836201
Abstract: A user interface method, including presenting, by a computer executing a user interface, multiple interactive items on a display. A first sequence of images is captured indicating a position in space of a hand of a user in proximity to the display, and responsively to the position, one of the interactive items is associated with the hand. After associating the item, a second sequence of images is captured indicating a movement of the hand, and responsively to the movement, a size of the one of the items is changed on the display.
Type: Grant
Filed: September 15, 2014
Date of Patent: December 5, 2017
Assignee: APPLE INC.
Inventors: Martin Frey, Marcus Hauer, Dario Buzzini, Philipp Schaefer, Adi Berenson, Micha Galor, Nili Metuki, Alexander Shpunt
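A minimal sketch under assumed inputs: a first hand position associates the nearest on-screen item with the hand, and subsequent hand motion scales that item. The mapping from motion to size change, and all names, are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Icon:
    name: str
    x: float
    y: float
    size: float = 1.0

def associate(icons, hand_xy):
    """Associate the hand with the nearest interactive item."""
    hx, hy = hand_xy
    return min(icons, key=lambda i: (i.x - hx) ** 2 + (i.y - hy) ** 2)

def apply_movement(icon, start_xy, end_xy, gain=2.0, min_size=0.25):
    # Moving the hand to the right grows the item, to the left shrinks it.
    icon.size = max(min_size, icon.size + gain * (end_xy[0] - start_xy[0]))
    return icon

if __name__ == "__main__":
    icons = [Icon("mail", 0.2, 0.5), Icon("photos", 0.8, 0.5)]
    grabbed = associate(icons, hand_xy=(0.75, 0.55))
    print(apply_movement(grabbed, (0.75, 0.55), (0.95, 0.55)))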
-
Patent number: 9829988
Abstract: A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
Type: Grant
Filed: August 11, 2016
Date of Patent: November 28, 2017
Assignee: APPLE INC.
Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
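The gesture described here, a motion along an axis followed by the opposite motion along the same axis, lends itself to a small state sketch. The axis choice, threshold, and state names below are assumptions, not the claimed detector.

# Hypothetical sketch: detect a push along one axis followed by a pull back
# along the same axis, then toggle the interface state.
def detect_push_pull(coords, axis=2, threshold=0.10):
    """coords: list of (x, y, z) hand positions. Returns True if the hand moved
    at least `threshold` along `axis` in one direction and then came back."""
    values = [c[axis] for c in coords]
    start = values[0]
    peak_index = max(range(len(values)), key=lambda i: abs(values[i] - start))
    forward = values[peak_index] - start
    back = values[-1] - values[peak_index]
    return abs(forward) >= threshold and abs(back) >= threshold and forward * back < 0

class Interface:
    def __init__(self):
        self.state = "inactive"

    def feed(self, coords):
        if detect_push_pull(coords):
            # Completed gesture: transition between the two interface states.
            self.state = "active" if self.state == "inactive" else "inactive"
        return self.state

if __name__ == "__main__":
    ui = Interface()
    push_pull = [(0, 0, 0.0), (0, 0, 0.08), (0, 0, 0.15), (0, 0, 0.05), (0, 0, 0.0)]
    print(ui.feed(push_pull))   # expected: "active"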
-
Patent number: 9804357
Abstract: Certain aspects relate to systems and techniques for using imaging pixels (that is, non-phase detection pixels) in addition to phase detection pixels for calculating autofocus information. Imaging pixel values can be used to interpolate a value at a phase detection pixel location. The interpolated value and a value received from the phase difference detection pixel can be used to obtain a virtual phase detection pixel value. The interpolated value, value received from the phase difference detection pixel, and the virtual phase detection pixel value can be used to obtain a phase difference detection signal indicating a shift direction (defocus direction) and a shift amount (defocus amount) of image focus.
Type: Grant
Filed: September 25, 2015
Date of Patent: October 31, 2017
Assignee: QUALCOMM Incorporated
Inventors: Micha Galor Gluskin, Ruben Manuel Velarde, Jisoo Lee
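One plausible reading of the "virtual phase detection pixel" step, sketched with invented names: a phase detection pixel sees roughly half of the light an ordinary imaging pixel would, so an interpolated full-pixel estimate minus the measured half-pixel value gives a stand-in sample for the opposite half. This arithmetic is an assumption made for illustration, not the patented formula.

import numpy as np

def interpolate_at(raw, r, c):
    """Estimate the full imaging value at (r, c) from its four vertical and
    horizontal neighbours (assumed to be ordinary imaging pixels)."""
    neighbours = [raw[r - 1, c], raw[r + 1, c], raw[r, c - 1], raw[r, c + 1]]
    return float(np.mean(neighbours))

def virtual_pd_value(raw, r, c):
    measured = float(raw[r, c])            # e.g. a left-shielded PD pixel
    interpolated = interpolate_at(raw, r, c)
    virtual = interpolated - measured       # stand-in for the right-side sample
    return measured, virtual, interpolated

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    raw = rng.random((5, 5)) * 100
    raw[2, 2] *= 0.5                        # PD pixel collects roughly half
    print(virtual_pd_value(raw, 2, 2))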
-
Publication number: 20170230649
Abstract: Methods and apparatuses for calibration of hybrid auto focus (AF) imaging systems are disclosed. In one aspect, a method is operable by an imaging device including a hybrid auto focus (AF) system comprising a lens. The method may include capturing an image of a scene, determining that the image of the scene is out of focus, estimating, via a first AF process of the hybrid AF system, an initial lens position at which the image is in focus in response to determining that the image is out of focus, and moving the lens to the initial lens position. The method may also include determining, via a second AF process of the hybrid AF system, a final lens position in response to the movement of the lens to the initial lens position and calibrating the first AF process based on the determined final lens position.
Type: Application
Filed: February 5, 2016
Publication date: August 10, 2017
Inventors: Micha Galor Gluskin, Jisoo Lee
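An illustrative sketch of the feedback loop this abstract outlines: a coarse AF stage proposes a lens position, a fine AF stage settles on the final one, and the offset between them is fed back as a running calibration for the coarse stage. The class, the distance-to-position mapping, and the exponential smoothing are assumptions, not the published method.

class HybridAF:
    def __init__(self, smoothing=0.25):
        self.offset = 0.0          # learned correction for the coarse stage
        self.smoothing = smoothing

    def coarse_estimate(self, measured_distance_mm):
        """Hypothetical first AF process: map a distance estimate to a lens
        position (arbitrary units), then apply the learned correction."""
        return 100.0 / max(measured_distance_mm, 1.0) + self.offset

    def fine_search(self, start_position, focus_metric):
        """Hypothetical second AF process: local hill climb on a focus metric."""
        best = start_position
        for step in (4.0, 2.0, 1.0):
            while focus_metric(best + step) > focus_metric(best):
                best += step
            while focus_metric(best - step) > focus_metric(best):
                best -= step
        return best

    def focus(self, measured_distance_mm, focus_metric):
        initial = self.coarse_estimate(measured_distance_mm)
        final = self.fine_search(initial, focus_metric)
        # Calibrate the coarse stage toward the observed error.
        self.offset += self.smoothing * (final - initial)
        return final

if __name__ == "__main__":
    true_peak = 12.0
    metric = lambda pos: -(pos - true_peak) ** 2   # toy focus-sharpness curve
    af = HybridAF()
    for _ in range(3):
        print(af.focus(measured_distance_mm=10.0, focus_metric=metric))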
-
Patent number: 9729779
Abstract: Certain aspects relate to systems and techniques for using imaging pixels (that is, non-phase detection pixels) for performing noise reduction on phase detection autofocus. Advantageously, this can provide more accurate phase detection autofocus and also optimize processor usage for performing phase detection. The phase difference detection pixels are provided to obtain a phase difference detection signal indicating a shift direction (defocus direction) and a shift amount (defocus amount) of image focus, and analysis of imaging pixel values can be used to estimate a level of focus of an in-focus region of interest and to limit the identified phase difference accordingly.
Type: Grant
Filed: July 27, 2016
Date of Patent: August 8, 2017
Assignee: QUALCOMM Incorporated
Inventors: Micha Galor Gluskin, Ruben Manuel Velarde, Jisoo Lee
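A hedged sketch of the idea: estimate how close the region of interest already is to focus from ordinary imaging pixels (via a sharpness measure), and clamp the phase-difference result so sensor noise cannot command a large lens move when the image is already sharp. The sharpness metric and clamp rule are illustrative assumptions.

import numpy as np

def sharpness(roi):
    """Simple gradient-energy sharpness of an imaging-pixel region."""
    gy, gx = np.gradient(roi.astype(float))
    return float((gx ** 2 + gy ** 2).mean())

def limited_phase_difference(raw_phase_diff, roi, sharp_threshold=50.0,
                             max_when_sharp=1.0):
    """Clamp the PDAF shift when the region of interest already looks in focus."""
    if sharpness(roi) >= sharp_threshold:
        return float(np.clip(raw_phase_diff, -max_when_sharp, max_when_sharp))
    return float(raw_phase_diff)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    sharp_roi = np.tile([0.0, 255.0], (16, 8))    # high-contrast pattern
    blurry_roi = np.full((16, 16), 128.0) + rng.normal(0, 1, (16, 16))
    print(limited_phase_difference(6.0, sharp_roi))    # clamped toward 0
    print(limited_phase_difference(6.0, blurry_roi))   # passed through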
-
Publication number: 20170185161
Abstract: A gesture based user interface includes a movement monitor configured to monitor a user's hand and to provide a signal based on movements of the hand. A processor is configured to provide at least one interface state in which a cursor is confined to movement within a single dimension region responsive to the signal from the movement monitor, and to actuate different commands responsive to the signal from the movement monitor and the location of the cursor in the single dimension region.
Type: Application
Filed: February 16, 2017
Publication date: June 29, 2017
Inventors: Amir Hoffnung, Micha Galor, Jonathan Pokrass, Roee Shenberg, Shlomo Zippel
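A hypothetical sketch of such an interface state: only the horizontal component of hand motion moves the cursor along a strip, and the command fired on a "select" signal depends on which segment of the strip the cursor occupies. The segment names and gains are invented here.

class OneDimensionalMenu:
    def __init__(self, commands=("back", "play_pause", "forward"), width=1.0):
        self.commands = list(commands)
        self.width = width
        self.cursor = width / 2.0

    def on_hand_move(self, dx, dy):
        # Confine movement to a single dimension: ignore the vertical component.
        self.cursor = min(max(self.cursor + dx, 0.0), self.width)

    def on_select(self):
        # The actuated command depends on the cursor's segment of the strip.
        segment = min(int(self.cursor / self.width * len(self.commands)),
                      len(self.commands) - 1)
        return self.commands[segment]

if __name__ == "__main__":
    menu = OneDimensionalMenu()
    menu.on_hand_move(dx=0.4, dy=-0.3)   # vertical motion has no effect
    print(menu.cursor)                   # 0.9
    print(menu.on_select())              # "forward"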
-
Publication number: 20170094149
Abstract: Certain aspects relate to systems and techniques for using imaging pixels (that is, non-phase detection pixels) for performing noise reduction on phase detection autofocus. Advantageously, this can provide more accurate phase detection autofocus and also optimize processor usage for performing phase detection. The phase difference detection pixels are provided to obtain a phase difference detection signal indicating a shift direction (defocus direction) and a shift amount (defocus amount) of image focus, and analysis of imaging pixel values can be used to estimate a level of focus of an in-focus region of interest and to limit the identified phase difference accordingly.
Type: Application
Filed: July 27, 2016
Publication date: March 30, 2017
Inventors: Micha Galor Gluskin, Ruben Manuel Velarde, Jisoo Lee
-
Publication number: 20170094210
Abstract: An example image capture device includes an image sensor having diodes for sensing light from a target scene, a color filter array disposed above the diodes and including color filters each positioned over one of the diodes, single-diode microlenses positioned above some color filters arranged in a Bayer pattern, and multi-diode microlenses each positioned above at least two adjacent color filters that pass the same wavelengths of light to corresponding adjacent diodes below the color filters, each multi-diode microlens formed such that light incident in a first direction is collected in one of the adjacent diodes and light incident in a second direction is collected in another of the adjacent diodes. An image signal processor of the image capture device can perform phase detection autofocus using signals received from the adjacent diodes and can interpolate color values for the adjacent diodes.
Type: Application
Filed: September 24, 2015
Publication date: March 30, 2017
Inventor: Micha Galor Gluskin
-
Publication number: 20170090149
Abstract: Certain aspects relate to systems and techniques for using imaging pixels (that is, non-phase detection pixels) in addition to phase detection pixels for calculating autofocus information. Imaging pixel values can be used to interpolate a value at a phase detection pixel location. The interpolated value and a value received from the phase difference detection pixel can be used to obtain a virtual phase detection pixel value. The interpolated value, value received from the phase difference detection pixel, and the virtual phase detection pixel value can be used to obtain a phase difference detection signal indicating a shift direction (defocus direction) and a shift amount (defocus amount) of image focus.
Type: Application
Filed: September 25, 2015
Publication date: March 30, 2017
Inventors: Micha Galor Gluskin, Ruben Manuel Velarde, Jisoo Lee
-
Publication number: 20160370860
Abstract: A method includes receiving a sequence of three-dimensional (3D) maps of at least a part of a body of a user of a computerized system and extracting, from the 3D maps, 3D coordinates of a head of the user. Based on the 3D coordinates of the head, a direction of a gaze performed by the user and an interactive item presented in the direction of the gaze on a display coupled to the computerized system are identified. An indication is extracted from the 3D maps that the user is moving a limb of the body in a specific direction, and the identified interactive item is repositioned on the display responsively to the indication.
Type: Application
Filed: September 5, 2016
Publication date: December 22, 2016
Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
-
Publication number: 20160349853
Abstract: A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
Type: Application
Filed: August 11, 2016
Publication date: December 1, 2016
Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
-
Patent number: 9459758
Abstract: A method includes presenting, on a display coupled to a computer, an image of a keyboard comprising multiple keys, and receiving a sequence of three-dimensional (3D) maps including a hand of a user positioned in proximity to the display. An initial portion of the sequence of 3D maps is processed to detect a transverse gesture performed by the hand of the user, and a cursor is presented on the display at a position indicated by the transverse gesture. While presenting the cursor in proximity to one of the multiple keys, that key is selected upon detecting a grab gesture followed by a pull gesture followed by a release gesture in a subsequent portion of the sequence of 3D maps.
Type: Grant
Filed: May 29, 2013
Date of Patent: October 4, 2016
Assignee: APPLE INC.
Inventors: Adi Berenson, Micha Galor, Jonathan Pokrass, Ran Shani, Daniel Shein, Eran Weissenstern, Martin Frey, Amir Hoffnung, Nili Metuki
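The grab, pull, release sequence can be pictured as a small state machine over per-frame hand descriptors. The descriptor fields ("closed", a z depth that grows as the hand pulls back) and the threshold are illustrative assumptions, not Apple's detector.

def select_key(frames, key_under_cursor, pull_threshold=0.10):
    """frames: iterable of dicts like {"closed": bool, "z": float}, where z
    grows as the hand pulls back toward the user. Returns the selected key
    or None if no complete grab-pull-release gesture is seen."""
    state = "idle"
    grab_z = 0.0
    for frame in frames:
        if state == "idle" and frame["closed"]:
            state, grab_z = "grabbed", frame["z"]
        elif state == "grabbed" and frame["closed"]:
            if frame["z"] - grab_z >= pull_threshold:
                state = "pulled"
        elif state == "pulled" and not frame["closed"]:
            return key_under_cursor          # released after a pull: select
        elif not frame["closed"]:
            state = "idle"                   # released too early: reset
    return None

if __name__ == "__main__":
    gesture = [{"closed": False, "z": 0.00},
               {"closed": True,  "z": 0.00},
               {"closed": True,  "z": 0.12},
               {"closed": False, "z": 0.12}]
    print(select_key(gesture, key_under_cursor="G"))   # "G"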
-
Patent number: 9454225
Abstract: A method includes receiving an image including an eye of a user of a computerized system and identifying, based on the image of the eye, a direction of a gaze performed by the user. Based on the direction of the gaze, a region on a display coupled to the computerized system is identified, and an operation is performed on content presented in the region.
Type: Grant
Filed: August 7, 2013
Date of Patent: September 27, 2016
Assignee: APPLE INC.
Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
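A hedged sketch of the region step: assuming the gaze direction has already been extracted from the eye image as yaw/pitch angles, quantize it onto a grid of display regions and run an operation on the content there. The angle ranges, grid, and placeholder "operation" are illustrative assumptions.

def gaze_to_region(yaw_deg, pitch_deg, cols=3, rows=3,
                   h_fov_deg=40.0, v_fov_deg=30.0):
    """Quantize a gaze direction into one of rows x cols display regions."""
    u = min(max(yaw_deg / h_fov_deg + 0.5, 0.0), 0.999)
    v = min(max(-pitch_deg / v_fov_deg + 0.5, 0.0), 0.999)
    return int(v * rows), int(u * cols)

def perform_operation(content_grid, region, operation=str.upper):
    """Apply an operation to the content presented in the gazed-at region."""
    r, c = region
    content_grid[r][c] = operation(content_grid[r][c])
    return content_grid[r][c]

if __name__ == "__main__":
    grid = [["north-west", "north", "north-east"],
            ["west", "center", "east"],
            ["south-west", "south", "south-east"]]
    region = gaze_to_region(yaw_deg=15.0, pitch_deg=0.0)
    print(region, perform_operation(grid, region))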
-
Patent number: 9417706
Abstract: A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
Type: Grant
Filed: May 17, 2015
Date of Patent: August 16, 2016
Assignee: APPLE INC.
Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung