Patents by Inventor Oren Brezner

Oren Brezner has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11900182
    Abstract: A method by one or more computing devices functioning as a ticket master for a website that has a virtual waiting room, wherein the ticket master is communicatively coupled to a plurality of proxies controlling access to the website. When the ticket master is in a relaxed mode (as opposed to a pressure mode), the method includes pre-allocating a number of tickets to the plurality of proxies for a first upcoming time period and setting a queue head for the first upcoming time period to a ticket number of a last ticket created, wherein the number of tickets that are pre-allocated for the first upcoming time period is greater than a target number of users allowed to enter the website during the first upcoming time period but less than a predefined maximum sudden spike number.
    Type: Grant
    Filed: October 6, 2021
    Date of Patent: February 13, 2024
    Assignee: Imperva, Inc.
    Inventors: Oren Brezner, Nir Gabay, Ortal Hasid, Shlomit Abergel
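The relaxed-mode behavior described in this abstract — pre-allocating more tickets than the per-period target, but fewer than the sudden-spike cap, and setting the queue head to the last ticket created — can be sketched as follows. This is a minimal illustration, not the patented implementation; the function name, the halfway allocation policy, and the per-proxy split are all assumptions.

```python
def preallocate_relaxed(last_ticket, target_users, max_spike, num_proxies):
    """Pre-allocate tickets for the upcoming time period in relaxed mode.

    The count pre-allocated is greater than the target number of users
    but strictly below the maximum sudden-spike number, and the queue
    head is set to the ticket number of the last ticket created.
    """
    # Pick a count strictly between target and the spike cap
    # (halfway here, purely as an illustrative policy).
    count = min(max_spike - 1, target_users + (max_spike - target_users) // 2)
    queue_head = last_ticket
    # Split the tickets evenly across the proxies controlling access.
    per_proxy = count // num_proxies
    allocations = {i: per_proxy for i in range(num_proxies)}
    # Hand any remainder to the first few proxies.
    for i in range(count % num_proxies):
        allocations[i] += 1
    return queue_head, allocations
```

With a target of 100 users, a spike cap of 200, and 4 proxies, the sketch pre-allocates 150 tickets — between the target and the cap, as the claim requires.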
  • Publication number: 20230107052
    Abstract: A method by one or more computing devices functioning as a ticket master for a website that has a virtual waiting room, wherein the ticket master is communicatively coupled to a plurality of proxies controlling access to the website. When the ticket master is in a relaxed mode (as opposed to a pressure mode), the method includes pre-allocating a number of tickets to the plurality of proxies for a first upcoming time period and setting a queue head for the first upcoming time period to a ticket number of a last ticket created, wherein the number of tickets that are pre-allocated for the first upcoming time period is greater than a target number of users allowed to enter the website during the first upcoming time period but less than a predefined maximum sudden spike number.
    Type: Application
    Filed: October 6, 2021
    Publication date: April 6, 2023
    Applicant: Imperva, Inc.
    Inventors: Oren Brezner, Nir Gabay, Ortal Hasid, Shlomit Abergel
  • Publication number: 20220164032
    Abstract: A method, including receiving, by a computer, a two-dimensional (2D) image containing at least a physical surface and segmenting the physical surface into one or more physical regions. A functionality is assigned to each of the one or more physical regions, each of the functionalities corresponding to a tactile input device, and a sequence of three-dimensional (3D) maps is received, the sequence of 3D maps containing at least a hand of a user of the computer, the hand positioned on one of the physical regions. The 3D maps are analyzed to detect a gesture performed by the user, and based on the gesture, an input is simulated for the tactile input device corresponding to the one of the physical regions.
    Type: Application
    Filed: November 7, 2021
    Publication date: May 26, 2022
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Eshel
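The region-to-device mapping in this abstract can be sketched minimally: each segmented physical region is assigned a tactile device, and a gesture detected over that region is translated into a simulated input event. The region names, device types, and gesture-to-event mapping below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical layout: each physical region is assigned a tactile device.
REGIONS = {
    "left":  "keyboard",
    "right": "touchpad",
}

def simulate_input(region, gesture):
    """Translate a hand gesture detected over a physical region into a
    simulated event for the tactile device assigned to that region."""
    device = REGIONS[region]
    if device == "keyboard" and gesture == "press":
        return {"device": "keyboard", "event": "keydown"}
    if device == "touchpad" and gesture == "swipe":
        return {"device": "touchpad", "event": "scroll"}
    # Unrecognized gestures produce no simulated input.
    return {"device": device, "event": "none"}
```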
  • Patent number: 11262840
    Abstract: A method, including receiving a three-dimensional (3D) map of at least a part of a body of a user (22) of a computerized system, and receiving a two-dimensional (2D) image of the user, the image including an eye (34) of the user. 3D coordinates of a head (32) of the user are extracted from the 3D map and the 2D image, and a direction of a gaze performed by the user is identified based on the 3D coordinates of the head and the image of the eye.
    Type: Grant
    Filed: June 20, 2018
    Date of Patent: March 1, 2022
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
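The gaze estimate this abstract describes — head orientation recovered from the 3D map, refined by the eye in the 2D image — can be sketched as a simple combination. The normalized pupil offsets and the `gain` calibration constant are assumptions for illustration only.

```python
def gaze_direction(head_yaw_deg, head_pitch_deg, eye_dx, eye_dy, gain=30.0):
    """Estimate gaze direction by combining head orientation (from the
    3D map) with the pupil offset seen in the 2D eye image.

    eye_dx / eye_dy are normalized pupil offsets in [-1, 1]; `gain`
    maps them to degrees of additional rotation (assumed calibration).
    """
    yaw = head_yaw_deg + gain * eye_dx
    pitch = head_pitch_deg + gain * eye_dy
    return yaw, pitch
```

A head turned 10° right with the pupil half-deflected in the same direction yields a 25° gaze under this assumed calibration.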
  • Patent number: 11169611
    Abstract: A method, including receiving, by a computer, a two-dimensional image (2D) containing at least a physical surface and segmenting the physical surface into one or more physical regions. A functionality is assigned to each of the one or more physical regions, each of the functionalities corresponding to a tactile input device, and a sequence of three-dimensional (3D) maps is received, the sequence of 3D maps containing at least a hand of a user of the computer, the hand positioned on one of the physical regions. The 3D maps are analyzed to detect a gesture performed by the user, and based on the gesture, an input is simulated for the tactile input device corresponding to the one of the physical regions.
    Type: Grant
    Filed: March 24, 2013
    Date of Patent: November 9, 2021
    Assignee: APPLE INC.
    Inventors: Eran Guendelman, Ofir Or, Eyal Bychkov, Oren Brezner, Adi Berenson
  • Publication number: 20180314329
    Abstract: A method, including receiving a three-dimensional (3D) map of at least a part of a body of a user (22) of a computerized system, and receiving a two-dimensional (2D) image of the user, the image including an eye (34) of the user. 3D coordinates of a head (32) of the user are extracted from the 3D map and the 2D image, and a direction of a gaze performed by the user is identified based on the 3D coordinates of the head and the image of the eye.
    Type: Application
    Filed: June 20, 2018
    Publication date: November 1, 2018
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
  • Patent number: 10031578
    Abstract: A method includes receiving a sequence of three-dimensional (3D) maps of at least a part of a body of a user of a computerized system and extracting, from the 3D maps, 3D coordinates of a head of the user. Based on the 3D coordinates of the head, a direction of a gaze performed by the user and an interactive item presented in the direction of the gaze on a display coupled to the computerized system are identified. An indication that the user is moving a limb of the body in a specific direction is extracted from the 3D maps, and the identified interactive item is repositioned on the display responsively to the indication.
    Type: Grant
    Filed: September 5, 2016
    Date of Patent: July 24, 2018
    Assignee: APPLE INC.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
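The two-step interaction above — identify the item under the gaze, then move it in the direction the limb is traveling — can be sketched as below. The item model (name mapped to screen coordinates) and the fixed step size are illustrative assumptions.

```python
def reposition(items, gaze_target, limb_direction, step=10):
    """Move the interactive item under the user's gaze in the direction
    the limb is moving.  `items` maps item name -> (x, y) position."""
    dx, dy = {"left": (-step, 0), "right": (step, 0),
              "up": (0, -step), "down": (0, step)}[limb_direction]
    x, y = items[gaze_target]
    items[gaze_target] = (x + dx, y + dy)
    return items
```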
  • Patent number: 10020941
    Abstract: Techniques related to virtual encryption patching are described. A security gateway includes multiple Transport Layer Security Implementations (TLSI) that can be used for creating secure communications channels to carry application-layer traffic between one or more clients and one or more server applications. In some embodiments, upon determining that one of the multiple TLSIs contains a security vulnerability, that TLSI can be disabled, leaving one or more others of the multiple TLSIs enabled and available to be used to carry traffic of new connections between the clients and server applications.
    Type: Grant
    Filed: November 17, 2015
    Date of Patent: July 10, 2018
    Assignee: Imperva, Inc.
    Inventors: Amichai Shulman, Itsik Mantin, Nadav Avital, Offir Zigelman, Oren Brezner, Dmitry Babich
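The virtual-patching idea in this abstract — keeping several TLS implementations available and routing new connections only through those not flagged vulnerable — can be sketched with a simple gateway model. The class and TLSI names are hypothetical; a real gateway would terminate actual TLS sessions.

```python
class SecurityGateway:
    """Sketch of virtual encryption patching across multiple TLS
    implementations (TLSIs): new connections use only TLSIs that have
    not been disabled for a security vulnerability."""

    def __init__(self, tlsis):
        self.tlsis = list(tlsis)
        self.disabled = set()

    def disable(self, name):
        # "Virtually patch" by taking the vulnerable TLSI out of rotation;
        # the others remain enabled for new connections.
        self.disabled.add(name)

    def pick_tlsi(self):
        available = [t for t in self.tlsis if t not in self.disabled]
        if not available:
            raise RuntimeError("no enabled TLS implementation")
        return available[0]
```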
  • Publication number: 20170093824
    Abstract: Techniques related to virtual encryption patching are described. A security gateway includes multiple Transport Layer Security Implementations (TLSI) that can be used for creating secure communications channels to carry application-layer traffic between one or more clients and one or more server applications. In some embodiments, upon determining that one of the multiple TLSIs contains a security vulnerability, that TLSI can be disabled, leaving one or more others of the multiple TLSIs enabled and available to be used to carry traffic of new connections between the clients and server applications.
    Type: Application
    Filed: November 17, 2015
    Publication date: March 30, 2017
    Inventors: Amichai Shulman, Itsik Mantin, Nadav Avital, Offir Zigelman, Oren Brezner, Dmitry Babich
  • Publication number: 20160370860
    Abstract: A method includes receiving a sequence of three-dimensional (3D) maps of at least a part of a body of a user of a computerized system and extracting, from the 3D maps, 3D coordinates of a head of the user. Based on the 3D coordinates of the head, a direction of a gaze performed by the user and an interactive item presented in the direction of the gaze on a display coupled to the computerized system are identified. An indication that the user is moving a limb of the body in a specific direction is extracted from the 3D maps, and the identified interactive item is repositioned on the display responsively to the indication.
    Type: Application
    Filed: September 5, 2016
    Publication date: December 22, 2016
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
  • Patent number: 9454225
    Abstract: A method includes receiving an image including an eye of a user of a computerized system and identifying, based on the image of the eye, a direction of a gaze performed by the user. Based on the direction of the gaze, a region on a display coupled to the computerized system is identified, and an operation is performed on content presented in the region.
    Type: Grant
    Filed: August 7, 2013
    Date of Patent: September 27, 2016
    Assignee: APPLE INC.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
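The gaze-to-region mapping in this abstract can be sketched by quantizing the gaze point into display regions and attaching an operation to the result. The quadrant split and the "highlight" operation are illustrative assumptions, not the claimed method.

```python
def region_for_gaze(gaze_x, gaze_y, screen_w, screen_h):
    """Map a gaze point to one of four display quadrants and report the
    operation to perform on content presented there."""
    col = "left" if gaze_x < screen_w / 2 else "right"
    row = "top" if gaze_y < screen_h / 2 else "bottom"
    region = f"{row}-{col}"
    # Example policy: highlight whichever region the user is looking at.
    return region, {"op": "highlight", "region": region}
```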
  • Patent number: 9377863
    Abstract: A method, including presenting, by a computer, multiple interactive items on a display coupled to the computer, and receiving an input indicating a direction of a gaze of a user of the computer. In response to the gaze direction, one of the multiple interactive items is selected, and subsequent to the one of the interactive items being selected, a sequence of three-dimensional (3D) maps is received containing at least a hand of the user. The 3D maps are analyzed to detect a gesture performed by the user, and an operation is performed on the selected interactive item in response to the gesture.
    Type: Grant
    Filed: March 24, 2013
    Date of Patent: June 28, 2016
    Assignee: APPLE INC.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Eshel
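The select-then-operate flow in this abstract — gaze chooses the item, a subsequent hand gesture chooses the operation — reduces to a small sketch. The gesture-to-operation mapping is an assumption for illustration.

```python
def select_and_operate(items, gaze_index, gesture):
    """Select the interactive item indicated by the gaze direction, then
    apply an operation chosen by the subsequent hand gesture."""
    selected = items[gaze_index]
    # Hypothetical mapping from detected gestures to operations.
    ops = {"grab": "move", "pinch": "resize", "wave": "dismiss"}
    return selected, ops.get(gesture, "none")
```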
  • Patent number: 9342146
    Abstract: A method includes receiving and segmenting a first sequence of three-dimensional (3D) maps over time of at least a part of a body of a user of a computerized system in order to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a motion of the second point with respect to a display coupled to the computerized system. A line segment that intersects the first point and the second point is calculated, and a target point is identified where the line segment intersects the display. An interactive item presented on the display in proximity to the target point is engaged.
    Type: Grant
    Filed: August 7, 2013
    Date of Patent: May 17, 2016
    Assignee: APPLE INC.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
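The geometry in this abstract — extend the line through two tracked body points until it hits the display, then engage the item near the hit point — is straightforward to sketch. Below, the display is modeled as the plane z = `display_z`, an assumed coordinate convention.

```python
def target_point(p_first, p_second, display_z):
    """Extend the line through two body points (e.g. head and fingertip)
    to the display plane z = display_z, returning the (x, y) hit point."""
    (x1, y1, z1), (x2, y2, z2) = p_first, p_second
    # Parameter along the line where z reaches the display plane.
    t = (display_z - z1) / (z2 - z1)
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
```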
  • Patent number: 9285874
    Abstract: A method, including receiving a three-dimensional (3D) map of at least a part of a body of a user (22) of a computerized system, and receiving a two-dimensional (2D) image of the user, the image including an eye (34) of the user. 3D coordinates of a head (32) of the user are extracted from the 3D map and the 2D image, and a direction of a gaze performed by the user is identified based on the 3D coordinates of the head and the image of the eye.
    Type: Grant
    Filed: February 9, 2012
    Date of Patent: March 15, 2016
    Assignee: APPLE INC.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
  • Publication number: 20140028548
    Abstract: A method, including receiving a three-dimensional (3D) map of at least a part of a body of a user (22) of a computerized system, and receiving a two-dimensional (2D) image of the user, the image including an eye (34) of the user. 3D coordinates of a head (32) of the user are extracted from the 3D map and the 2D image, and a direction of a gaze performed by the user is identified based on the 3D coordinates of the head and the image of the eye.
    Type: Application
    Filed: February 9, 2012
    Publication date: January 30, 2014
    Applicant: PRIMESENSE LTD
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
  • Publication number: 20130321265
    Abstract: A method includes receiving an image including an eye of a user of a computerized system and identifying, based on the image of the eye, a direction of a gaze performed by the user. Based on the direction of the gaze, a region on a display coupled to the computerized system is identified, and an operation is performed on content presented in the region.
    Type: Application
    Filed: August 7, 2013
    Publication date: December 5, 2013
    Applicant: PRIMESENSE LTD.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
  • Publication number: 20130321271
    Abstract: A method includes receiving and segmenting a first sequence of three-dimensional (3D) maps over time of at least a part of a body of a user of a computerized system in order to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a motion of the second point with respect to a display coupled to the computerized system. A line segment that intersects the first point and the second point is calculated, and a target point is identified where the line segment intersects the display. An interactive item presented on the display in proximity to the target point is engaged.
    Type: Application
    Filed: August 7, 2013
    Publication date: December 5, 2013
    Applicant: PRIMESENSE LTD
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Hoffnung, Tamir Berliner
  • Publication number: 20130283208
    Abstract: A method, including presenting, by a computer, multiple interactive items on a display coupled to the computer, and receiving an input indicating a direction of a gaze of a user of the computer. In response to the gaze direction, one of the multiple interactive items is selected, and subsequent to the one of the interactive items being selected, a sequence of three-dimensional (3D) maps is received containing at least a hand of the user. The 3D maps are analyzed to detect a gesture performed by the user, and an operation is performed on the selected interactive item in response to the gesture.
    Type: Application
    Filed: March 24, 2013
    Publication date: October 24, 2013
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Eshel
  • Publication number: 20130283213
    Abstract: A method, including receiving, by a computer, a two-dimensional (2D) image containing at least a physical surface and segmenting the physical surface into one or more physical regions. A functionality is assigned to each of the one or more physical regions, each of the functionalities corresponding to a tactile input device, and a sequence of three-dimensional (3D) maps is received, the sequence of 3D maps containing at least a hand of a user of the computer, the hand positioned on one of the physical regions. The 3D maps are analyzed to detect a gesture performed by the user, and based on the gesture, an input is simulated for the tactile input device corresponding to the one of the physical regions.
    Type: Application
    Filed: March 24, 2013
    Publication date: October 24, 2013
    Inventors: Eran Guendelman, Ofir Or, Eyal Bychkov, Oren Brezner, Adi Berenson