Patents by Inventor Adi Berenson

Adi Berenson has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20220244838
    Abstract: The present disclosure generally relates to using avatars and image data for enhanced user interactions. In some examples, user status dependent avatars are generated and displayed with a message associated with the user status. In some examples, a device captures image information to scan an object to create a 3D model of the object. The device determines an algorithm for the 3D model based on the captured image information and provides visual feedback on additional image data that is needed for the algorithm to build the 3D model. In some examples, an application's operation on a device is restricted based on whether an authorized user is identified as using the device based on captured image data.
    Type: Application
    Filed: April 20, 2022
    Publication date: August 4, 2022
    Inventors: Marek BEREZA, Adi BERENSON, Jeffrey Traer BERNSTEIN, Lukas Robert Tom GIRLING, Mark HAUENSTEIN, Amir HOFFNUNG, William D. LINDMEIER, Joseph A. MALIA, Julian MISSIG
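
    This application describes, among other things, guiding the user through an object scan by indicating which additional images the 3D-reconstruction algorithm still needs. The following is a minimal, hypothetical sketch of that feedback loop; the viewpoint binning, class names, and thresholds are illustrative assumptions, not Apple's implementation.

```python
# A hypothetical sketch of the feedback idea in US 2022/0244838: track which
# viewpoints of an object have already been captured and report which
# additional views the reconstruction still needs.

from dataclasses import dataclass


@dataclass
class CapturedFrame:
    azimuth_deg: float    # camera azimuth around the object, in degrees
    elevation_deg: float  # camera elevation relative to the object, in degrees


class ScanCoverageTracker:
    """Bin captured viewpoints into coarse sectors and report the missing ones."""

    def __init__(self, azimuth_bins: int = 8, elevation_bins: int = 3):
        self.azimuth_bins = azimuth_bins
        self.elevation_bins = elevation_bins
        self.covered = set()  # set of (azimuth_bin, elevation_bin) pairs

    def add_frame(self, frame: CapturedFrame) -> None:
        az = int(frame.azimuth_deg % 360 // (360 / self.azimuth_bins))
        el = min(self.elevation_bins - 1,
                 int((frame.elevation_deg + 90) // (180 / self.elevation_bins)))
        self.covered.add((az, el))

    def missing_sectors(self):
        all_sectors = {(a, e)
                       for a in range(self.azimuth_bins)
                       for e in range(self.elevation_bins)}
        return sorted(all_sectors - self.covered)

    def feedback(self) -> str:
        missing = self.missing_sectors()
        if not missing:
            return "Coverage complete: the model can be built."
        return f"{len(missing)} viewing sectors still need images, e.g. sector {missing[0]}."


if __name__ == "__main__":
    tracker = ScanCoverageTracker()
    for az in range(0, 360, 90):                  # four frames around the equator
        tracker.add_frame(CapturedFrame(az, 0.0))
    print(tracker.feedback())
```
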
  • Patent number: 11169611
    Abstract: A method, including receiving, by a computer, a two-dimensional (2D) image containing at least a physical surface and segmenting the physical surface into one or more physical regions. A functionality is assigned to each of the one or more physical regions, each of the functionalities corresponding to a tactile input device, and a sequence of three-dimensional (3D) maps is received, the sequence of 3D maps containing at least a hand of a user of the computer, the hand positioned on one of the physical regions. The 3D maps are analyzed to detect a gesture performed by the user, and based on the gesture, an input is simulated for the tactile input device corresponding to the one of the physical regions.
    Type: Grant
    Filed: March 24, 2013
    Date of Patent: November 9, 2021
    Assignee: APPLE INC.
    Inventors: Eran Guendelman, Ofir Or, Eyal Bychkov, Oren Brezner, Adi Berenson
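
    This patent maps regions of a physical surface to virtual tactile input devices and converts detected gestures into simulated input for the matching device. The sketch below is an illustrative reading of that mapping under assumed region shapes, device names, and gesture types; it is not the patented implementation.

```python
# A hypothetical sketch of the mapping in US Patent 11,169,611: each region of a
# physical surface is assigned a tactile input device, and a detected hand
# gesture becomes simulated input for the device whose region the hand is on.
# Region shapes, device names and the Gesture fields are illustrative assumptions.

from dataclasses import dataclass


@dataclass(frozen=True)
class Region:
    name: str
    device: str   # e.g. "keyboard", "touchpad"
    x0: int
    y0: int
    x1: int
    y1: int       # bounding box in 2D image coordinates

    def contains(self, x: int, y: int) -> bool:
        return self.x0 <= x < self.x1 and self.y0 <= y < self.y1


@dataclass
class Gesture:
    kind: str     # e.g. "tap", "swipe", as reported by the 3D-map analysis
    x: int        # hand position in the same image coordinates
    y: int


def simulate_input(regions, gesture):
    """Return a description of the simulated input event, or None if the hand
    is outside every assigned region."""
    for region in regions:
        if region.contains(gesture.x, gesture.y):
            return f"{region.device}: {gesture.kind} in region '{region.name}'"
    return None


if __name__ == "__main__":
    surface = [
        Region("left half", "keyboard", 0, 0, 320, 480),
        Region("right half", "touchpad", 320, 0, 640, 480),
    ]
    print(simulate_input(surface, Gesture("tap", x=100, y=200)))    # keyboard event
    print(simulate_input(surface, Gesture("swipe", x=500, y=200)))  # touchpad event
```
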
  • Publication number: 20210168347
    Abstract: A system includes a stereo camera, a memory and a processor. The camera includes a first imaging device configured to acquire a first image of an object at a first wavelength range from a first direction, and a second imaging device configured to acquire a second image of the object at a second wavelength range from a second direction. The memory is configured to store weights of an artificial neural network (ANN) trained to estimate a spatial disparity between a first image patch of the first image and a second image patch of the second image. The processor is configured to (a) apply the ANN to the first and second image patches so as to estimate (i) the spatial disparity and (ii) a degree of matching between the first and second image patches at the estimated spatial disparity, and (b) output the estimated degree of matching.
    Type: Application
    Filed: November 23, 2020
    Publication date: June 3, 2021
    Inventors: Ran Margolin, Adi Berenson
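
    The core of this application is a trained network that, given one patch from each camera, outputs a disparity estimate and a degree of matching. The toy network below only mirrors that input/output shape; its patch size, layer sizes, activations, and random (untrained) weights are assumptions for illustration.

```python
# A toy, untrained stand-in for the ANN in US 2021/0168347: given one patch from
# each imaging device, the network outputs an estimated disparity and a
# degree-of-matching score. All sizes and weights here are assumptions; the
# real system stores trained weights in memory.

import numpy as np

rng = np.random.default_rng(0)
PATCH = 9  # 9x9 patches, an arbitrary illustrative size

# Two-layer perceptron: input is both patches flattened and concatenated.
W1 = rng.standard_normal((2 * PATCH * PATCH, 64)) * 0.05
b1 = np.zeros(64)
W2 = rng.standard_normal((64, 2)) * 0.05  # outputs: [disparity, match logit]
b2 = np.zeros(2)


def estimate(patch_a, patch_b):
    """Return (estimated disparity in pixels, matching score in [0, 1])."""
    x = np.concatenate([patch_a.ravel(), patch_b.ravel()])
    h = np.tanh(x @ W1 + b1)
    disparity, match_logit = h @ W2 + b2
    match_score = 1.0 / (1.0 + np.exp(-match_logit))  # sigmoid
    return float(disparity), float(match_score)


if __name__ == "__main__":
    a = rng.random((PATCH, PATCH))  # stand-in for a patch from the first device
    b = rng.random((PATCH, PATCH))  # stand-in for a patch from the second device
    d, m = estimate(a, b)
    print(f"disparity estimate: {d:.2f} px, match score: {m:.2f}")
```
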
  • Patent number: 10943092
    Abstract: A method, system and a computer readable medium for monitoring a person. The method may include (a) acquiring a group of other images and acquiring a group of long-wave infrared (LWIR) images of a face of the person, where the LWIR imager is mechanically coupled to the other imager and has a lower resolution; (b) determining locations of a facial feature of the person within the groups of other and LWIR images, by applying a compensation process that compensates for differences between the acquiring of the group of LWIR images and the acquiring of the group of other images; (c) applying a thermal dynamic analysis on pixels of the facial feature within the LWIR images; and (d) determining, based on an outcome of the thermal dynamic analysis, at least one parameter of the person.
    Type: Grant
    Filed: February 3, 2019
    Date of Patent: March 9, 2021
    Assignee: CLAIRLABS LTD.
    Inventors: Ran Margolin, Adi Berenson
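
    This patent combines a lower-resolution LWIR imager with another imager, compensates for the differences between the two image streams, and analyses how the temperature of a facial feature changes over time. The sketch below illustrates one plausible version of that pipeline; the coordinate-scaling compensation and the FFT-based rate estimate are assumptions, not Clairlabs' published method.

```python
# A simplified sketch of the pipeline in US Patent 10,943,092: map a
# facial-feature location found in the higher-resolution image into the
# lower-resolution LWIR frames, then analyse how the feature's pixel
# temperatures change over time. The scaling model and the FFT-based rate
# estimate are illustrative assumptions.

import numpy as np


def map_to_lwir(xy, other_shape, lwir_shape):
    """Compensate for the resolution difference by scaling image coordinates."""
    x, y = xy
    sx = lwir_shape[1] / other_shape[1]
    sy = lwir_shape[0] / other_shape[0]
    return int(round(x * sx)), int(round(y * sy))


def dominant_rate_hz(signal, fps):
    """Return the strongest periodic component of a temperature-over-time signal."""
    signal = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return float(freqs[1:][np.argmax(spectrum[1:])])  # skip the DC bin


if __name__ == "__main__":
    fps, seconds = 8.0, 32.0
    t = np.arange(int(fps * seconds)) / fps
    # Synthetic LWIR signal around the nostrils: 0.25 Hz breathing plus noise.
    temps = 34.0 + 0.2 * np.sin(2 * np.pi * 0.25 * t) + 0.02 * np.random.randn(len(t))
    print("feature in LWIR frame:", map_to_lwir((640, 360), (720, 1280), (120, 160)))
    print(f"estimated rate: {dominant_rate_hz(temps, fps):.2f} Hz")
```
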
  • Publication number: 20190362133
    Abstract: A method, system and a computer readable medium for monitoring a person. The method may include (a) acquiring a group of other images and acquiring a group of long-wave infrared (LWIR) images of a face of the person, where the LWIR imager is mechanically coupled to the other imager and has a lower resolution; (b) determining locations of a facial feature of the person within the groups of other and LWIR images, by applying a compensation process that compensates for differences between the acquiring of the group of LWIR images and the acquiring of the group of other images; (c) applying a thermal dynamic analysis on pixels of the facial feature within the LWIR images; and (d) determining, based on an outcome of the thermal dynamic analysis, at least one parameter of the person.
    Type: Application
    Filed: February 3, 2019
    Publication date: November 28, 2019
    Inventors: Ran Margolin, Adi Berenson
  • Patent number: 10444963
    Abstract: The present disclosure generally relates to using avatars and image data for enhanced user interactions. In some examples, user status dependent avatars are generated and displayed with a message associated with the user status. In some examples, a device captures image information to scan an object to create a 3D model of the object. The device determines an algorithm for the 3D model based on the captured image information and provides visual feedback on additional image data that is needed for the algorithm to build the 3D model. In some examples, an application's operation on a device is restricted based on whether an authorized user is identified as using the device based on captured image data. In some examples, depth data is used to combine two sets of image data.
    Type: Grant
    Filed: July 13, 2018
    Date of Patent: October 15, 2019
    Assignee: Apple Inc.
    Inventors: Marek Bereza, Adi Berenson, Jeffrey Traer Bernstein, Lukas Robert Tom Girling, Mark Hauenstein, Amir Hoffnung, William D. Lindmeier, Joseph A. Malia, Julian Missig
  • Publication number: 20180321826
    Abstract: The present disclosure generally relates to using avatars and image data for enhanced user interactions. In some examples, user status dependent avatars are generated and displayed with a message associated with the user status. In some examples, a device captures image information to scan an object to create a 3D model of the object. The device determines an algorithm for the 3D model based on the captured image information and provides visual feedback on additional image data that is needed for the algorithm to build the 3D model. In some examples, an application's operation on a device is restricted based on whether an authorized user is identified as using the device based on captured image data. In some examples, depth data is used to combine two sets of image data.
    Type: Application
    Filed: July 13, 2018
    Publication date: November 8, 2018
    Inventors: Marek BEREZA, Adi BERENSON, Jeffrey Traer BERNSTEIN, Lukas Robert Tom GIRLING, Mark HAUENSTEIN, Amir HOFFNUNG, William D. LINDMEIER, Joseph A. MALIA, Julian MISSIG
  • Publication number: 20180088787
    Abstract: The present disclosure generally relates to using avatars and image data for enhanced user interactions. In some examples, user status dependent avatars are generated and displayed with a message associated with the user status. In some examples, a device captures image information to scan an object to create a 3D model of the object. The device determines an algorithm for the 3D model based on the captured image information and provides visual feedback on additional image data that is needed for the algorithm to build the 3D model. In some examples, an application's operation on a device is restricted based on whether an authorized user is identified as using the device based on captured image data.
    Type: Application
    Filed: September 25, 2017
    Publication date: March 29, 2018
    Inventors: Marek BEREZA, Adi BERENSON, Jeffrey Traer BERNSTEIN, Lukas Robert Tom GIRLING, Mark HAUENSTEIN, Amir HOFFNUNG, William D. LINDMEIER, Joseph A. MALIA, Julian MISSIG
  • Patent number: 9836201
    Abstract: A user interface method, including presenting, by a computer executing a user interface, multiple interactive items on a display. A first sequence of images is captured indicating a position in space of a hand of a user in proximity to the display, and responsively to the position, one of the interactive items is associated with the hand. After associating the item, a second sequence of images is captured indicating a movement of the hand, and responsively to the movement, a size of the one of the items is changed on the display.
    Type: Grant
    Filed: September 15, 2014
    Date of Patent: December 5, 2017
    Assignee: APPLE INC.
    Inventors: Martin Frey, Marcus Hauer, Dario Buzzini, Philipp Schaefer, Adi Berenson, Micha Galor, Nili Metuki, Alexander Shpunt
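
    This patent associates an on-screen item with the position of the user's hand and then resizes that item in response to hand movement. The sketch below is a minimal illustration under an assumed hit test and an assumed 2%-per-centimetre scaling rule, not the patented implementation.

```python
# A minimal illustration of the interaction in US Patent 9,836,201: the hand's
# position first associates the nearest on-screen item with the hand, and a
# subsequent hand movement changes that item's size.

from dataclasses import dataclass


@dataclass
class Item:
    name: str
    x: float
    y: float
    size: float = 100.0  # nominal size in pixels


def associate(items, hand_xy):
    """Associate the hand with the item closest to its projected screen position."""
    hx, hy = hand_xy
    return min(items, key=lambda it: (it.x - hx) ** 2 + (it.y - hy) ** 2)


def apply_movement(item, dz_cm):
    """Resize the associated item: moving the hand toward the display enlarges it,
    pulling it back shrinks it (2% per centimetre, assumed)."""
    item.size = max(10.0, item.size * (1.0 - 0.02 * dz_cm))


if __name__ == "__main__":
    items = [Item("photo", 100, 100), Item("map", 400, 120), Item("clock", 250, 300)]
    selected = associate(items, hand_xy=(390, 130))  # hand hovers near "map"
    apply_movement(selected, dz_cm=-15)              # hand moves 15 cm toward display
    print(selected)                                  # the map grows by 30%
```
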
  • Patent number: 9459758
    Abstract: A method includes presenting, on a display coupled to a computer, an image of a keyboard comprising multiple keys, and receiving a sequence of three-dimensional (3D) maps including a hand of a user positioned in proximity to the display. An initial portion of the sequence of 3D maps is processed to detect a transverse gesture performed by the hand, and a cursor is presented on the display at a position indicated by the transverse gesture. While the cursor is presented in proximity to one of the multiple keys, that key is selected upon detecting a grab gesture followed by a pull gesture followed by a release gesture in a subsequent portion of the sequence of 3D maps.
    Type: Grant
    Filed: May 29, 2013
    Date of Patent: October 4, 2016
    Assignee: APPLE INC.
    Inventors: Adi Berenson, Micha Galor, Jonathan Pokrass, Ran Shani, Daniel Shein, Eran Weissenstern, Martin Frey, Amir Hoffnung, Nili Metuki
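
    This patent selects an on-screen key once a transverse gesture has positioned the cursor and a grab, pull, release sequence is detected. Below is a hypothetical state machine for that sequence; the gesture labels are assumed to come from an upstream 3D-map analyser that is not shown here.

```python
# A hypothetical state machine for the gesture sequence in US Patent 9,459,758:
# a transverse gesture positions a cursor over the on-screen keyboard, and a
# grab -> pull -> release sequence selects the key under the cursor.

class KeySelector:
    SEQUENCE = ("grab", "pull", "release")

    def __init__(self, keyboard):
        self.keyboard = keyboard  # maps cursor cells (row, col) to key labels
        self.cursor = (0, 0)
        self.progress = 0         # how much of grab/pull/release has been seen

    def on_gesture(self, gesture, cursor=None):
        """Feed one recognised gesture; return the selected key label, if any."""
        if gesture == "transverse":   # move the cursor, reset the sequence
            if cursor is not None:
                self.cursor = cursor
            self.progress = 0
            return None
        if gesture == self.SEQUENCE[self.progress]:
            self.progress += 1
            if self.progress == len(self.SEQUENCE):
                self.progress = 0
                return self.keyboard.get(self.cursor)
        else:
            self.progress = 0         # out-of-order gesture cancels the selection
        return None


if __name__ == "__main__":
    keys = {(0, 0): "Q", (0, 1): "W", (1, 0): "A"}
    selector = KeySelector(keys)
    selected = None
    for gesture, cursor in [("transverse", (0, 1)), ("grab", None),
                            ("pull", None), ("release", None)]:
        selected = selector.on_gesture(gesture, cursor) or selected
    print("selected key:", selected)  # W
```
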
  • Patent number: 9377865
    Abstract: A method includes arranging, by a computer, multiple interactive objects as a hierarchical data structure, each node of the hierarchical data structure associated with a respective one of the multiple interactive objects, and presenting, on a display coupled to the computer, a first subset of the multiple interactive objects that are associated with one or more child nodes of one of the multiple interactive objects. A sequence of three-dimensional (3D) maps including at least part of a hand of a user positioned in proximity to the display is received, and the hand performing a transverse gesture followed by a grab gesture followed by a longitudinal gesture followed by an execute gesture is identified in the sequence of 3D maps, and an operation associated with the selected object is performed accordingly.
    Type: Grant
    Filed: May 29, 2013
    Date of Patent: June 28, 2016
    Assignee: APPLE INC.
    Inventors: Adi Berenson, Dana Cohen, Micha Galor, Jonathan Pokrass, Ran Shani, Daniel Shein, Orlee Tal, Arnon Yaari, Eran Weissenstern, Martin Frey, Amir Hoffnung, Nili Metuki, Marcus Hauer
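
    This patent arranges interactive objects as a tree, presents the children of the current node, and triggers an operation after a transverse, grab, longitudinal, execute gesture sequence. The sketch below is one illustrative reading of those gesture semantics (transverse highlights the next sibling, longitudinal descends), not the patented behaviour.

```python
# A simplified model of the hierarchy in US Patent 9,377,865: interactive objects
# form a tree, the children of the current node are presented, and a gesture
# sequence highlights a child, descends into it and runs its operation.

from dataclasses import dataclass, field


@dataclass
class Node:
    name: str
    operation: str = "open"
    children: list = field(default_factory=list)


class Navigator:
    def __init__(self, root: Node):
        self.current = root
        self.index = 0  # which presented child is highlighted

    def presented(self):
        """The first subset presented on the display: the current node's children."""
        return [child.name for child in self.current.children]

    def gesture(self, kind):
        if kind == "transverse":      # highlight the next sibling
            self.index = (self.index + 1) % max(1, len(self.current.children))
        elif kind == "grab":          # lock onto the highlighted child
            pass
        elif kind == "longitudinal":  # descend into the highlighted child
            if self.current.children:
                self.current = self.current.children[self.index]
                self.index = 0
        elif kind == "execute":       # perform the selected object's operation
            return f"{self.current.operation} {self.current.name}"
        return None


if __name__ == "__main__":
    root = Node("menu", children=[
        Node("photos", children=[Node("holiday", operation="show")]),
        Node("music", operation="play"),
    ])
    nav = Navigator(root)
    print("presented:", nav.presented())  # ['photos', 'music']
    result = None
    for g in ("transverse", "grab", "longitudinal", "execute"):
        result = nav.gesture(g) or result
    print(result)                         # play music
```
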
  • Publication number: 20140380241
    Abstract: A user interface method, including presenting, by a computer executing a user interface, multiple interactive items on a display. A first sequence of images is captured indicating a position in space of a hand of a user in proximity to the display, and responsively to the position, one of the interactive items is associated with the hand. After associating the item, a second sequence of images is captured indicating a movement of the hand, and responsively to the movement, a size of the one of the items is changed on the display.
    Type: Application
    Filed: September 15, 2014
    Publication date: December 25, 2014
    Inventors: Martin Frey, Marcus Hauer, Dario Buzzini, Philipp Schaefer, Adi Berenson, Micha Galor, Nili Metuki, Alexander Shpunt
  • Patent number: 8881051
    Abstract: A user interface method, including presenting, by a computer executing a user interface, multiple interactive items on a display. A first sequence of images is captured indicating a position in space of a hand of a user in proximity to the display, and responsively to the position, one of the interactive items is associated with the hand. After associating the item, a second sequence of images is captured indicating a movement of the hand, and responsively to the movement, a size of the one of the items is changed on the display.
    Type: Grant
    Filed: July 5, 2012
    Date of Patent: November 4, 2014
    Assignee: Primesense Ltd
    Inventors: Martin Frey, Marcus Hauer, Dario Buzzini, Philipp Schaefer, Adi Berenson, Micha Galor, Nili Metuki, Alexander Shpunt
  • Publication number: 20130283213
    Abstract: A method, including receiving, by a computer, a two-dimensional (2D) image containing at least a physical surface and segmenting the physical surface into one or more physical regions. A functionality is assigned to each of the one or more physical regions, each of the functionalities corresponding to a tactile input device, and a sequence of three-dimensional (3D) maps is received, the sequence of 3D maps containing at least a hand of a user of the computer, the hand positioned on one of the physical regions. The 3D maps are analyzed to detect a gesture performed by the user, and based on the gesture, an input is simulated for the tactile input device corresponding to the one of the physical regions.
    Type: Application
    Filed: March 24, 2013
    Publication date: October 24, 2013
    Inventors: Eran Guendelman, Ofir Or, Eyal Bychkov, Oren Brezner, Adi Berenson
  • Publication number: 20130265222
    Abstract: A method includes arranging, by a computer, multiple interactive objects as a hierarchical data structure, each node of the hierarchical data structure associated with a respective one of the multiple interactive objects, and presenting, on a display coupled to the computer, a first subset of the multiple interactive objects that are associated with one or more child nodes of one of the multiple interactive objects. A sequence of three-dimensional (3D) maps including at least part of a hand of a user positioned in proximity to the display is received, and the hand performing a transverse gesture followed by a grab gesture followed by a longitudinal gesture followed by an execute gesture is identified in the sequence of 3D maps, and an operation associated with the selected object is performed accordingly.
    Type: Application
    Filed: May 29, 2013
    Publication date: October 10, 2013
    Inventors: Adi Berenson, Dana Cohen, Micha Galor, Jonathan Pokrass, Ran Shani, Daniel Shein, Orlee Tal, Arnon Yaari, Eran Weissenstern, Martin Frey, Amir Hoffnung, Nili Metuki, Marcus Hauer
  • Publication number: 20130263036
    Abstract: A method includes presenting, on a display coupled to a computer, an image of a keyboard comprising multiple keys, and receiving a sequence of three-dimensional (3D) maps including a hand of a user positioned in proximity to the display. An initial portion of the sequence of 3D maps is processed to detect a transverse gesture performed by the hand, and a cursor is presented on the display at a position indicated by the transverse gesture. While the cursor is presented in proximity to one of the multiple keys, that key is selected upon detecting a grab gesture followed by a pull gesture followed by a release gesture in a subsequent portion of the sequence of 3D maps.
    Type: Application
    Filed: May 29, 2013
    Publication date: October 3, 2013
    Applicant: PRIMESENSE LTD.
    Inventors: Adi Berenson, Micha Galor, Jonathan Pokrass, Ran Shani, Daniel Shein, Eran Weissenstern, Martin Frey, Amir Hoffnung, Nili Metuki
  • Publication number: 20130014052
    Abstract: A user interface method, including presenting, by a computer executing a user interface, multiple interactive items on a display. A first sequence of images is captured indicating a position in space of a hand of a user in proximity to the display, and responsively to the position, one of the interactive items is associated with the hand. After associating the item, a second sequence of images is captured indicating a movement of the hand, and responsively to the movement, a size of the one of the items is changed on the display.
    Type: Application
    Filed: July 5, 2012
    Publication date: January 10, 2013
    Applicant: PRIMESENSE LTD.
    Inventors: Martin Frey, Marcus Hauer, Dario Buzzini, Philipp Schaefer, Adi Berenson, Micha Galor, Nili Metuki, Alexander Shpunt