Patents by Inventor Jonathan Pokrass
Jonathan Pokrass has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20180314329
Abstract: A method, including receiving a three-dimensional (3D) map of at least a part of a body of a user (22) of a computerized system, and receiving a two-dimensional (2D) image of the user, the image including an eye (34) of the user. 3D coordinates of a head (32) of the user are extracted from the 3D map and the 2D image, and a direction of a gaze performed by the user is identified based on the 3D coordinates of the head and the image of the eye.
Type: Application
Filed: June 20, 2018
Publication date: November 1, 2018
Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
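The two-stage estimate described in the abstract, a coarse direction from the head's 3D coordinates refined by the eye's appearance in the 2D image, can be sketched as a simple additive model. The linear form, the gain, and all numbers below are illustrative assumptions, not taken from the patent:

```python
def gaze_direction(head_yaw, head_pitch, eye_dx, eye_dy, gain=40.0):
    """Combine head orientation (degrees, from the 3D map) with the
    pupil offset measured in the 2D eye image (normalized units) to
    estimate a gaze direction in degrees. The additive model and the
    gain converting pupil offset to degrees are assumptions."""
    return head_yaw + gain * eye_dx, head_pitch + gain * eye_dy

# Head turned 10 deg right, pupil offset a quarter of its range:
print(gaze_direction(10.0, -5.0, 0.25, 0.0))  # -> (20.0, -5.0)
```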
-
Patent number: 10088909
Abstract: A method, including receiving, by a computer, a sequence of three-dimensional maps containing at least a hand of a user of the computer, and identifying, in the maps, a device coupled to the computer. The maps are analyzed to detect a gesture performed by the user toward the device, and the device is actuated responsively to the gesture.
Type: Grant
Filed: October 22, 2015
Date of Patent: October 2, 2018
Assignee: Apple Inc.
Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung, Ofir Or
-
Patent number: 10031578
Abstract: A method includes receiving a sequence of three-dimensional (3D) maps of at least a part of a body of a user of a computerized system and extracting, from the 3D maps, 3D coordinates of a head of the user. Based on the 3D coordinates of the head, a direction of a gaze performed by the user and an interactive item presented in the direction of the gaze on a display coupled to the computerized system are identified. An indication that the user is moving a limb of the body in a specific direction is extracted from the 3D maps, and the identified interactive item is repositioned on the display responsively to the indication.
Type: Grant
Filed: September 5, 2016
Date of Patent: July 24, 2018
Assignee: Apple Inc.
Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
-
Patent number: 9829988
Abstract: A method, including receiving, by a computer executing a non-tactile three-dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
Type: Grant
Filed: August 11, 2016
Date of Patent: November 28, 2017
Assignee: Apple Inc.
Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
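The "there and back" gesture in this family of filings, a motion along a selected axis followed by a return motion in the opposite direction, reduces to a small amount of bookkeeping over the coordinate trace. A minimal sketch, with an assumed threshold and, for brevity, only one orientation of the axis handled:

```python
def detect_push_pull(values, threshold=0.1):
    """Return True once the trace moves at least `threshold` in one
    direction along the selected axis and then at least `threshold`
    back the other way. The threshold is illustrative, not from the
    patent, and only the positive-first orientation is handled."""
    start = values[0]
    peak = start
    pushed = False
    for v in values[1:]:
        if not pushed:
            peak = max(peak, v)
            if peak - start >= threshold:   # first motion completed
                pushed = True
        elif peak - v >= threshold:         # return motion completed
            return True
        else:
            peak = max(peak, v)
    return False

state = "first"
if detect_push_pull([0.0, 0.04, 0.09, 0.13, 0.10, 0.05, 0.01]):
    state = "second"   # transition the interface to its second state
print(state)  # -> second
```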
-
Publication number: 20170185161
Abstract: A gesture-based user interface includes a movement monitor configured to monitor a user's hand and to provide a signal based on movements of the hand. A processor is configured to provide at least one interface state in which a cursor is confined to movement within a single-dimension region responsive to the signal from the movement monitor, and to actuate different commands responsive to the signal from the movement monitor and the location of the cursor in the single-dimension region.
Type: Application
Filed: February 16, 2017
Publication date: June 29, 2017
Inventors: Amir Hoffnung, Micha Galor, Jonathan Pokrass, Roee Shenberg, Shlomo Zippel
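Confining a cursor to a single-dimension region and dispatching commands from its position can be sketched as a clamp plus a range lookup. The region bounds and command bindings below are invented for illustration:

```python
def cursor_position(hand_x, region_start=100, region_end=500):
    """Map the monitored hand coordinate onto the one-dimensional
    cursor region, clamping so the cursor never leaves it."""
    return max(region_start, min(region_end, int(hand_x)))

def command_for(cursor, bindings):
    """Pick the command bound to the sub-range containing the cursor."""
    for (lo, hi), command in bindings.items():
        if lo <= cursor <= hi:
            return command
    return None

# Hypothetical bindings splitting the region into three command zones:
bindings = {(100, 233): "volume", (234, 366): "channel", (367, 500): "mute"}
print(command_for(cursor_position(250), bindings))  # -> channel
print(command_for(cursor_position(900), bindings))  # -> mute (clamped)
```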
-
Publication number: 20160370860
Abstract: A method includes receiving a sequence of three-dimensional (3D) maps of at least a part of a body of a user of a computerized system and extracting, from the 3D maps, 3D coordinates of a head of the user. Based on the 3D coordinates of the head, a direction of a gaze performed by the user and an interactive item presented in the direction of the gaze on a display coupled to the computerized system are identified. An indication that the user is moving a limb of the body in a specific direction is extracted from the 3D maps, and the identified interactive item is repositioned on the display responsively to the indication.
Type: Application
Filed: September 5, 2016
Publication date: December 22, 2016
Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
-
Publication number: 20160349853
Abstract: A method, including receiving, by a computer executing a non-tactile three-dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
Type: Application
Filed: August 11, 2016
Publication date: December 1, 2016
Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
-
Patent number: 9459758
Abstract: A method includes presenting, on a display coupled to a computer, an image of a keyboard comprising multiple keys, and receiving a sequence of three-dimensional (3D) maps including a hand of a user positioned in proximity to the display. An initial portion of the sequence of 3D maps is processed to detect a transverse gesture performed by the hand, and a cursor is presented on the display at a position indicated by the transverse gesture. While the cursor is presented in proximity to one of the multiple keys, that key is selected upon detecting a grab gesture followed by a pull gesture followed by a release gesture in a subsequent portion of the sequence of 3D maps.
Type: Grant
Filed: May 29, 2013
Date of Patent: October 4, 2016
Assignee: Apple Inc.
Inventors: Adi Berenson, Micha Galor, Jonathan Pokrass, Ran Shani, Daniel Shein, Eran Weissenstern, Martin Frey, Amir Hoffnung, Nili Metuki
-
Patent number: 9454225
Abstract: A method includes receiving an image including an eye of a user of a computerized system and identifying, based on the image of the eye, a direction of a gaze performed by the user. Based on the direction of the gaze, a region on a display coupled to the computerized system is identified, and an operation is performed on content presented in the region.
Type: Grant
Filed: August 7, 2013
Date of Patent: September 27, 2016
Assignee: Apple Inc.
Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
-
Patent number: 9417706
Abstract: A method, including receiving, by a computer executing a non-tactile three-dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
Type: Grant
Filed: May 17, 2015
Date of Patent: August 16, 2016
Assignee: Apple Inc.
Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
-
Patent number: 9377863
Abstract: A method, including presenting, by a computer, multiple interactive items on a display coupled to the computer, and receiving an input indicating a direction of a gaze of a user of the computer. In response to the gaze direction, one of the multiple interactive items is selected, and subsequent to the one of the interactive items being selected, a sequence of three-dimensional (3D) maps is received containing at least a hand of the user. The 3D maps are analyzed to detect a gesture performed by the user, and an operation is performed on the selected interactive item in response to the gesture.
Type: Grant
Filed: March 24, 2013
Date of Patent: June 28, 2016
Assignee: Apple Inc.
Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Eshel
-
Patent number: 9377865
Abstract: A method includes arranging, by a computer, multiple interactive objects as a hierarchical data structure, each node of the hierarchical data structure associated with a respective one of the multiple interactive objects, and presenting, on a display coupled to the computer, a first subset of the multiple interactive objects that are associated with one or more child nodes of one of the multiple interactive objects. A sequence of three-dimensional (3D) maps including at least part of a hand of a user positioned in proximity to the display is received. The hand performing a transverse gesture followed by a grab gesture followed by a longitudinal gesture followed by an execute gesture is identified in the sequence, and an operation associated with the object selected by these gestures is performed accordingly.
Type: Grant
Filed: May 29, 2013
Date of Patent: June 28, 2016
Assignee: Apple Inc.
Inventors: Adi Berenson, Dana Cohen, Micha Galor, Jonathan Pokrass, Ran Shani, Daniel Shein, Orlee Tal, Arnon Yaari, Eran Weissenstern, Martin Frey, Amir Hoffnung, Nili Metuki, Marcus Hauer
-
Patent number: 9342146
Abstract: A method includes receiving and segmenting a first sequence of three-dimensional (3D) maps over time of at least a part of a body of a user of a computerized system in order to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a motion of the second point with respect to a display coupled to the computerized system. A line segment that intersects the first point and the second point is calculated, and a target point is identified where the line segment intersects the display. An interactive item presented on the display in proximity to the target point is engaged.
Type: Grant
Filed: August 7, 2013
Date of Patent: May 17, 2016
Assignee: Apple Inc.
Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
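The geometry in this abstract, extending the line through two body points until it meets the display, is a standard ray-plane intersection. A sketch assuming the display lies flat in the plane z = 0 (the coordinate frame and point names are illustrative, not the patented implementation):

```python
def pointing_target(first, second, display_z=0.0):
    """Extend the line through two tracked body points (e.g. eye and
    fingertip) until it crosses the display plane at z = display_z,
    returning the (x, y) target point, or None if the line never
    reaches the display."""
    dx = second[0] - first[0]
    dy = second[1] - first[1]
    dz = second[2] - first[2]
    if dz == 0:                      # pointing parallel to the screen
        return None
    t = (display_z - first[2]) / dz  # parameter where the ray hits z = display_z
    if t <= 0:                       # screen is behind the user
        return None
    return first[0] + t * dx, first[1] + t * dy

# First point 60 cm from the screen, second point 40 cm from it:
print(pointing_target((0.0, 0.0, 0.6), (0.1, 0.05, 0.4)))  # -> approximately (0.3, 0.15)
```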
-
Patent number: 9285874
Abstract: A method, including receiving a three-dimensional (3D) map of at least a part of a body of a user (22) of a computerized system, and receiving a two-dimensional (2D) image of the user, the image including an eye (34) of the user. 3D coordinates of a head (32) of the user are extracted from the 3D map and the 2D image, and a direction of a gaze performed by the user is identified based on the 3D coordinates of the head and the image of the eye.
Type: Grant
Filed: February 9, 2012
Date of Patent: March 15, 2016
Assignee: Apple Inc.
Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
-
Publication number: 20160041623
Abstract: A method, including receiving, by a computer, a sequence of three-dimensional maps containing at least a hand of a user of the computer, and identifying, in the maps, a device coupled to the computer. The maps are analyzed to detect a gesture performed by the user toward the device, and the device is actuated responsively to the gesture.
Type: Application
Filed: October 22, 2015
Publication date: February 11, 2016
Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung, Ofir Or
-
Patent number: 9218063
Abstract: A method, including receiving, by a computer, a sequence of three-dimensional maps containing at least a hand of a user of the computer, and identifying, in the maps, a device coupled to the computer. The maps are analyzed to detect a gesture performed by the user toward the device, and the device is actuated responsively to the gesture.
Type: Grant
Filed: August 23, 2012
Date of Patent: December 22, 2015
Assignee: Apple Inc.
Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung, Ofir Or
-
Patent number: 9152234
Abstract: A method includes receiving, from a three-dimensional (3D) sensing device coupled to a computer, a sequence of 3D maps including at least part of a hand of a user positioned in proximity to the computer. In embodiments of the present invention, the computer is coupled to one or more peripheral devices, and upon identifying, in the sequence of 3D maps, a movement of the hand toward a given peripheral device, an action preparatory to disengaging the given peripheral device is initiated.
Type: Grant
Filed: December 1, 2013
Date of Patent: October 6, 2015
Assignee: Apple Inc.
Inventors: Tamir Berliner, Jonathan Pokrass
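The preparatory-disengagement idea can be approximated by watching the hand's distance to the peripheral shrink over consecutive frames. The frame count and the triggered action below are assumptions for illustration:

```python
def approaching(distances, frames=3):
    """Flag a hand moving toward a peripheral when its distance to the
    device shrinks for `frames` consecutive samples. The required run
    length is an illustrative assumption."""
    run = 0
    prev = None
    for d in distances:
        if prev is not None and d < prev:
            run += 1
            if run >= frames:
                return True
        else:
            run = 0           # any pause or retreat resets the run
        prev = d
    return False

# Distances (meters) from hand to a peripheral over successive 3D maps:
if approaching([0.50, 0.44, 0.37, 0.29]):
    print("run preparatory action before the peripheral is disengaged")
```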
-
Publication number: 20150248171
Abstract: A method, including receiving, by a computer executing a non-tactile three-dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
Type: Application
Filed: May 17, 2015
Publication date: September 3, 2015
Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung
-
Publication number: 20150156485
Abstract: A mobile device includes a camera module, including a lens and an image sensor. An inertial sensor in the mobile device outputs a signal indicative of an orientation of the device. A controller corrects one or more focal properties of the camera module responsively to the orientation indicated by the inertial sensor.
Type: Application
Filed: December 2, 2014
Publication date: June 4, 2015
Inventors: Daniel Kravitz, Niv Gilboa, Yohai Zmora, Zafrir Mor, Jonathan Pokrass
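One plausible reading of such a correction is compensating lens sag under gravity, which scales with the gravity component along the optical axis. The linear step model and the step count below are assumptions for illustration, not the patent's method:

```python
import math

def focus_correction(pitch_deg, max_sag_steps=6):
    """Estimate how many focus-actuator steps to add to offset lens sag
    under gravity. `pitch_deg` is the camera pitch from the inertial
    sensor (0 = level, -90 = pointing straight down); the linear model
    and step count are illustrative assumptions."""
    # Gravity component along the optical axis: zero when level,
    # full when the camera points straight down.
    along_axis = math.sin(math.radians(pitch_deg))
    return round(-max_sag_steps * along_axis)

print(focus_correction(0))    # camera held level -> 0
print(focus_correction(-90))  # pointing straight down -> 6
```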
-
Patent number: 9035876
Abstract: A method, including receiving, by a computer executing a non-tactile three-dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
Type: Grant
Filed: October 17, 2013
Date of Patent: May 19, 2015
Assignee: Apple Inc.
Inventors: Micha Galor, Jonathan Pokrass, Amir Hoffnung