Patents by Inventor Micha Galor

Micha Galor has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20160041623
    Abstract: A method, including receiving, by a computer, a sequence of three-dimensional maps containing at least a hand of a user of the computer, and identifying, in the maps, a device coupled to the computer. The maps are analyzed to detect a gesture performed by the user toward the device, and the device is actuated responsively to the gesture.
    Type: Application
    Filed: October 22, 2015
    Publication date: February 11, 2016
    Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung, Ofir Or
  • Patent number: 9229534
    Abstract: A method, including receiving, by a computer, a sequence of signals indicating a motion of a hand of a user within a predefined area, and segmenting the area into multiple regions. Responsively to the signals, a region is identified in which the hand is located, and a mapping ratio is assigned to the motion of the hand based on a direction of the motion and the region in which the hand is located. Using the assigned mapping ratio, a cursor on a display is presented responsively to the indicated motion of the hand.
    Type: Grant
    Filed: February 27, 2013
    Date of Patent: January 5, 2016
    Assignee: Apple Inc.
    Inventor: Micha Galor
  • Patent number: 9218063
    Abstract: A method, including receiving, by a computer, a sequence of three-dimensional maps containing at least a hand of a user of the computer, and identifying, in the maps, a device coupled to the computer. The maps are analyzed to detect a gesture performed by the user toward the device, and the device is actuated responsively to the gesture.
    Type: Grant
    Filed: August 23, 2012
    Date of Patent: December 22, 2015
    Assignee: Apple Inc.
    Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung, Ofir Or
  • Publication number: 20150248171
    Abstract: A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
    Type: Application
    Filed: May 17, 2015
    Publication date: September 3, 2015
    Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
  • Patent number: 9122311
    Abstract: A method, including presenting, by a computer, a scrollable list of interactive items on a display driven by the computer, and receiving an input from a user of the computer. The list is scrolled at a speed indicated by the input, and the list is zoomed in response to the speed of the scrolling.
    Type: Grant
    Filed: August 23, 2012
    Date of Patent: September 1, 2015
    Assignee: Apple Inc.
    Inventor: Micha Galor
  • Patent number: 9035876
    Abstract: A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
    Type: Grant
    Filed: October 17, 2013
    Date of Patent: May 19, 2015
    Assignee: Apple Inc.
    Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
  • Patent number: 9030498
    Abstract: A method including presenting, by a computer, multiple interactive items on a display coupled to the computer, and receiving, from a depth sensor, a sequence of three-dimensional (3D) maps containing at least a hand of a user of the computer. An explicit select gesture performed by the user toward one of the interactive items is detected in the maps, and the one of the interactive items is selected responsively to the explicit select gesture. Subsequent to selecting the one of the interactive items, a TimeClick functionality is actuated for subsequent interactive item selections to be made by the user.
    Type: Grant
    Filed: August 14, 2012
    Date of Patent: May 12, 2015
    Assignee: Apple Inc.
    Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung, Ofir Or
  • Patent number: 8959013
    Abstract: A method, including presenting, by a computer system executing a non-tactile three dimensional user interface, a virtual keyboard on a display, the virtual keyboard including multiple virtual keys, and capturing a sequence of depth maps over time of a body part of a human subject. On the display, a cursor is presented at positions indicated by the body part in the captured sequence of depth maps, and one of the multiple virtual keys is selected in response to an interruption of a motion of the presented cursor in proximity to the one of the multiple virtual keys.
    Type: Grant
    Filed: September 25, 2011
    Date of Patent: February 17, 2015
    Assignee: Apple Inc.
    Inventors: Micha Galor, Ofir Or, Shai Litvak, Erez Sali
  • Publication number: 20150022687
    Abstract: A method of capturing a digital image with a digital camera includes determining a first exposure level for capturing an image based on a first luminance level of the image, determining a second exposure level for capturing the image based on a threshold exposure level of the image, configuring an exposure level of a sensor of the digital camera based on the second exposure level, capturing the image as a digital image, and adding a non-linear digital gain to the digital image based on a difference between the first exposure level and the second exposure level.
    Type: Application
    Filed: July 19, 2013
    Publication date: January 22, 2015
    Inventor: Micha Galor
  • Patent number: 8933876
    Abstract: A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
    Type: Grant
    Filed: December 8, 2011
    Date of Patent: January 13, 2015
    Assignee: Apple Inc.
    Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
  • Publication number: 20140380241
    Abstract: A user interface method, including presenting by a computer executing a user interface, multiple interactive items on a display. A first sequence of images is captured indicating a position in space of a hand of a user in proximity to the display, and responsively to the position, one of the interactive items is associated with the hand. After associating the item, a second sequence of images is captured indicating a movement of the hand, and responsively to the movement, a size of the one of the items is changed on the display.
    Type: Application
    Filed: September 15, 2014
    Publication date: December 25, 2014
    Inventors: Martin Frey, Marcus Hauer, Dario Buzzini, Philipp Schaefer, Adi Berenson, Micha Galor, Nili Metuki, Alexander Shpunt
  • Patent number: 8881051
    Abstract: A user interface method, including presenting by a computer executing a user interface, multiple interactive items on a display. A first sequence of images is captured indicating a position in space of a hand of a user in proximity to the display, and responsively to the position, one of the interactive items is associated with the hand. After associating the item, a second sequence of images is captured indicating a movement of the hand, and responsively to the movement, a size of the one of the items is changed on the display.
    Type: Grant
    Filed: July 5, 2012
    Date of Patent: November 4, 2014
    Assignee: PrimeSense Ltd.
    Inventors: Martin Frey, Marcus Hauer, Dario Buzzini, Philipp Schaefer, Adi Berenson, Micha Galor, Nili Metuki, Alexander Shpunt
  • Patent number: 8872762
    Abstract: A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a first set of multiple 3D coordinates representing a gesture performed by a user positioned within a field of view of a sensing device coupled to the computer, the first set of 3D coordinates comprising multiple points in a fixed 3D coordinate system local to the sensing device. The first set of multiple 3D coordinates are transformed to a second set of corresponding multiple 3D coordinates in a subjective 3D coordinate system local to the user.
    Type: Grant
    Filed: December 8, 2011
    Date of Patent: October 28, 2014
    Assignee: PrimeSense Ltd.
    Inventors: Micha Galor, Idan Gelbourt, Ofir Or, Jonathan Pokrass, Amir Hoffnung
  • Publication number: 20140160304
    Abstract: Embodiments may be directed to lens cameras, which may be cameras arranged as a sensor in a lens cap. A lens camera may comprise a printed circuit board with a digital image sensor and associated components enclosed in a cylindrical body that may be constructed of metal, plastic, or the like, or a combination thereof. Lens cameras may be fitted with lens mounts for attaching host devices, cameras, interchangeable lenses, or the like. Lens mounts on a lens camera may be arranged to be compatible with one or more standard lens mounts. Accordingly, a lens camera may be attached to cameras that have compatible lens mounts. Also, interchangeable lenses having lens mounts compatible with the lens camera may be attached to the lens camera. Further, lens cameras may communicate with host devices using wired or wireless communication facilities.
    Type: Application
    Filed: November 27, 2013
    Publication date: June 12, 2014
    Applicant: CSR Technology Inc.
    Inventors: Micha Galor, Eran Pinhasov
  • Publication number: 20140152777
    Abstract: Embodiments may be directed to lens cameras, which may be cameras arranged as a sensor in a lens cap. A lens camera may comprise a printed circuit board with a digital image sensor and associated components enclosed in a cylindrical body that may be constructed of metal, plastic, or the like, or a combination thereof. Lens cameras may be fitted with lens mounts for attaching host devices, cameras, interchangeable lenses, or the like. Lens mounts on a lens camera may be arranged to be compatible with one or more standard lens mounts. Accordingly, a lens camera may be attached to cameras that have compatible lens mounts. Also, interchangeable lenses having lens mounts compatible with the lens camera may be attached to the lens camera. Further, lens cameras may communicate with host devices using wired or wireless communication facilities.
    Type: Application
    Filed: November 27, 2013
    Publication date: June 5, 2014
    Applicant: CSR Technology Inc.
    Inventors: Micha Galor, Eran Pinhasov
  • Publication number: 20140043230
    Abstract: A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
    Type: Application
    Filed: October 17, 2013
    Publication date: February 13, 2014
    Applicant: PrimeSense Ltd.
    Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
  • Publication number: 20140028548
    Abstract: A method, including receiving a three-dimensional (3D) map of at least a part of a body of a user of a computerized system, and receiving a two-dimensional (2D) image of the user, the image including an eye of the user. 3D coordinates of a head of the user are extracted from the 3D map and the 2D image, and a direction of a gaze performed by the user is identified based on the 3D coordinates of the head and the image of the eye.
    Type: Application
    Filed: February 9, 2012
    Publication date: January 30, 2014
    Applicant: PrimeSense Ltd.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
  • Publication number: 20130321265
    Abstract: A method includes receiving an image including an eye of a user of a computerized system and identifying, based on the image of the eye, a direction of a gaze performed by the user. Based on the direction of the gaze, a region on a display coupled to the computerized system is identified, and an operation is performed on content presented in the region.
    Type: Application
    Filed: August 7, 2013
    Publication date: December 5, 2013
    Applicant: PrimeSense Ltd.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
  • Publication number: 20130321271
    Abstract: A method includes receiving and segmenting a first sequence of three-dimensional (3D) maps over time of at least a part of a body of a user of a computerized system in order to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a motion of the second point with respect to a display coupled to the computerized system. A line segment that intersects the first point and the second point is calculated, and a target point is identified where the line segment intersects the display. An interactive item presented on the display in proximity to the target point is engaged.
    Type: Application
    Filed: August 7, 2013
    Publication date: December 5, 2013
    Applicant: PrimeSense Ltd.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
  • Publication number: 20130283208
    Abstract: A method, including presenting, by a computer, multiple interactive items on a display coupled to the computer, and receiving an input indicating a direction of a gaze of a user of the computer. In response to the gaze direction, one of the multiple interactive items is selected, and subsequent to the one of the interactive items being selected, a sequence of three-dimensional (3D) maps is received containing at least a hand of the user. The 3D maps are analyzed to detect a gesture performed by the user, and an operation is performed on the selected interactive item in response to the gesture.
    Type: Application
    Filed: March 24, 2013
    Publication date: October 24, 2013
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Eshel
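
To make the cursor-control idea in patent 9,229,534 above more concrete, here is a minimal Python sketch of a region- and direction-dependent mapping ratio. The three-region split, the specific ratio values, and the update_cursor() helper are illustrative assumptions, not the patented method.

    # Hypothetical sketch: scale hand motion into cursor motion with a ratio
    # that depends on the region of the interaction area and the direction
    # of the motion.
    LEFT, CENTER, RIGHT = "left", "center", "right"

    def classify_region(x, area_width):
        """Segment the interaction area into three horizontal regions."""
        if x < area_width / 3:
            return LEFT
        if x < 2 * area_width / 3:
            return CENTER
        return RIGHT

    # Assumed ratios: in an edge region, motion back toward the center is
    # amplified, while motion heading further outward is damped.
    RATIOS = {
        (LEFT, -1): 0.5, (LEFT, +1): 1.5,
        (CENTER, -1): 1.0, (CENTER, +1): 1.0,
        (RIGHT, -1): 1.5, (RIGHT, +1): 0.5,
    }

    def update_cursor(cursor_x, hand_x, hand_dx, area_width, screen_width):
        """Move the cursor by the hand motion scaled by a region/direction ratio."""
        region = classify_region(hand_x, area_width)
        direction = +1 if hand_dx >= 0 else -1
        ratio = RATIOS[(region, direction)]
        new_x = cursor_x + hand_dx * ratio * (screen_width / area_width)
        return min(max(new_x, 0.0), float(screen_width))  # clamp to the display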
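
Several entries above (for example, patents 8,933,876 and 9,035,876) describe a motion along a selected axis followed by a return motion that toggles a non-tactile 3D interface between two states. The following Python sketch illustrates one way such a detector could work; the depth axis, the travel threshold, and the state names are assumptions for illustration only.

    AXIS_Z = 2          # index of the depth coordinate in an (x, y, z) tuple
    THRESHOLD = 0.10    # assumed travel (in metres) required in the first direction

    class PushPullGestureDetector:
        """Detects motion along one axis followed by a return motion, then toggles state."""

        def __init__(self, axis=AXIS_Z, threshold=THRESHOLD):
            self.axis = axis
            self.threshold = threshold
            self.origin = None      # coordinate where the gesture started
            self.extreme = None     # furthest point reached in the first direction
            self.ui_state = "first"

        def feed(self, point):
            """Feed one 3D hand coordinate; returns the current interface state."""
            value = point[self.axis]
            if self.origin is None:
                self.origin = self.extreme = value
                return self.ui_state
            # track how far the hand has travelled away from the starting point
            if abs(value - self.origin) > abs(self.extreme - self.origin):
                self.extreme = value
            # the gesture completes when the hand has moved far enough out and
            # has come back close to where it started
            if (abs(self.extreme - self.origin) >= self.threshold
                    and abs(value - self.origin) <= self.threshold / 2):
                self.ui_state = "second" if self.ui_state == "first" else "first"
                self.origin = self.extreme = value
            return self.ui_state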
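
Publication 2015/0022687 above describes capturing at a capped exposure level and recovering the remaining brightness with a non-linear digital gain. The Python sketch below illustrates that flow under simplifying assumptions: pixel values are normalized to [0, 1], the knee-style gain curve is an invented example, and sensor.capture_frame() is a hypothetical camera API.

    def choose_exposures(scene_luminance, target_luminance, max_exposure):
        """First level: what metering asks for; second level: capped at a threshold."""
        first = target_luminance / max(scene_luminance, 1e-6)   # desired exposure
        second = min(first, max_exposure)                       # sensor-safe exposure
        return first, second

    def nonlinear_gain(pixel, gain, knee=0.8):
        """Apply full gain below the knee and roll off highlights above it."""
        value = pixel * gain
        if value <= knee:
            return value
        # compress gained highlights so they do not clip harshly
        return min(1.0, knee + (value - knee) / (1.0 + (value - knee)))

    def capture_with_digital_gain(sensor, scene_luminance, target_luminance, max_exposure):
        """Capture at the capped exposure, then add gain for the exposure shortfall."""
        first, second = choose_exposures(scene_luminance, target_luminance, max_exposure)
        frame = sensor.capture_frame(exposure=second)    # hypothetical sensor call
        gain = first / second                            # difference between the two levels
        return [[nonlinear_gain(p, gain) for p in row] for row in frame]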