Patents by Inventor Amir Eshel

Amir Eshel has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240112427
    Abstract: Systems, methods, and non-transitory computer readable media including instructions for presenting location-based content are described. Presenting location-based content includes obtaining an indication of a current physical location of an extended reality appliance; providing the indication to a first server that maps physical locations to a plurality of content addresses; receiving from the first server, at least one specific content address associated with the current physical location; using the at least one specific content address to access a second server; receiving content, associated with the current physical location, from the second server; and presenting the content via the extended reality appliance, while the extended reality appliance is in the current physical location.
    Type: Application
    Filed: December 5, 2023
    Publication date: April 4, 2024
    Applicant: Sightful Computers Ltd
    Inventors: Tamir Berliner, Doron Assayas Terre, Tomer Kahan, Dori Peleg, Oded Noam, Orit Dolev, Amir Eshel
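The two-server lookup flow this abstract describes can be sketched in a few lines. All names below (ContentDirectory, ContentServer, present_location_based_content) are illustrative assumptions, not terms from the patent:

```python
class ContentDirectory:
    """First server: maps physical locations to content addresses."""
    def __init__(self, mapping):
        self._mapping = mapping  # location -> list of content addresses

    def addresses_for(self, location):
        return self._mapping.get(location, [])


class ContentServer:
    """Second server: serves content for a given content address."""
    def __init__(self, store):
        self._store = store  # content address -> content

    def fetch(self, address):
        return self._store[address]


def present_location_based_content(location, directory, server):
    """Query the directory for the current location's content addresses,
    fetch the content from the content server, and return it for display."""
    addresses = directory.addresses_for(location)
    return [server.fetch(addr) for addr in addresses]
```

The split between a directory server (location to address) and a content server (address to content) mirrors the two-server arrangement in the claims.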
  • Patent number: 11650626
    Abstract: Systems, methods, and non-transitory computer readable media for virtually extending a physical keyboard are disclosed. In one implementation, a non-transitory computer readable medium contains instructions that cause a processor to: receive image data representing a keyboard placed on a surface from an image sensor associated with a wearable extended reality appliance; determine that the keyboard is paired with the wearable extended reality appliance; receive an input for causing a display of a virtual controller with the keyboard; display, via the wearable extended reality appliance, the virtual controller in a first location on the surface, with original spatial orientation relative to the keyboard; detect a movement of the keyboard to a different location on the surface; and in response to the detected movement, present the virtual controller in a second location on the surface, with a subsequent spatial orientation relative to the keyboard that corresponds to the original spatial orientation.
    Type: Grant
    Filed: April 4, 2022
    Date of Patent: May 16, 2023
    Assignee: MULTINARITY LTD
    Inventors: Tamir Berliner, Tomer Kahan, Orit Dolev, Oded Noam, Doron Assayas Terre, Amir Eshel, Aviv Burshtein
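The core geometric idea in this abstract, keeping a virtual controller at a fixed position and orientation relative to a keyboard as the keyboard moves, reduces to a frame transform. A minimal 2-D sketch, with all function and parameter names assumed for illustration:

```python
import math


def controller_pose(keyboard_pos, keyboard_angle, offset, relative_angle):
    """Place the virtual controller at `offset` (expressed in the keyboard's
    own frame) from the keyboard, preserving `relative_angle` between the
    controller and the keyboard as the keyboard translates and rotates."""
    ox, oy = offset
    c, s = math.cos(keyboard_angle), math.sin(keyboard_angle)
    # Rotate the offset into surface coordinates, then translate to the keyboard.
    x = keyboard_pos[0] + c * ox - s * oy
    y = keyboard_pos[1] + s * ox + c * oy
    return (x, y), keyboard_angle + relative_angle
```

Calling this again after the keyboard moves yields the "second location ... with a subsequent spatial orientation that corresponds to the original" behavior the claim describes.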
  • Publication number: 20220253139
    Abstract: Systems, methods, and non-transitory computer readable media for virtually extending a physical keyboard are disclosed. In one implementation, a non-transitory computer readable medium contains instructions that cause a processor to: receive image data representing a keyboard placed on a surface from an image sensor associated with a wearable extended reality appliance; determine that the keyboard is paired with the wearable extended reality appliance; receive an input for causing a display of a virtual controller with the keyboard; display, via the wearable extended reality appliance, the virtual controller in a first location on the surface, with original spatial orientation relative to the keyboard; detect a movement of the keyboard to a different location on the surface; and in response to the detected movement, present the virtual controller in a second location on the surface, with a subsequent spatial orientation relative to the keyboard that corresponds to the original spatial orientation.
    Type: Application
    Filed: April 4, 2022
    Publication date: August 11, 2022
    Applicant: Multinarity Ltd
    Inventors: Tamir Berliner, Tomer Kahan, Orit Dolev, Oded Noam, Doron Assayas Terre, Amir Eshel, Aviv Burshtein
  • Publication number: 20220164032
    Abstract: A method, including receiving, by a computer, a two-dimensional (2D) image containing at least a physical surface and segmenting the physical surface into one or more physical regions. A functionality is assigned to each of the one or more physical regions, each of the functionalities corresponding to a tactile input device, and a sequence of three-dimensional (3D) maps is received, the sequence of 3D maps containing at least a hand of a user of the computer, the hand positioned on one of the physical regions. The 3D maps are analyzed to detect a gesture performed by the user, and based on the gesture, an input is simulated for the tactile input device corresponding to the one of the physical regions.
    Type: Application
    Filed: November 7, 2021
    Publication date: May 26, 2022
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Eshel
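The region-to-device mapping this abstract describes can be sketched with rectangular regions, an assumption for illustration only; the patent does not specify region shapes or a gesture model:

```python
def region_for(regions, hand_xy):
    """Return the tactile device assigned to the region under the hand.

    `regions` is a list of ((x0, y0, x1, y1), device_name) pairs, one per
    segmented physical region on the surface."""
    x, y = hand_xy
    for (x0, y0, x1, y1), device in regions:
        if x0 <= x < x1 and y0 <= y < y1:
            return device
    return None


def simulate_input(regions, hand_xy, gesture):
    """Simulate input for the device whose region the hand rests on."""
    device = region_for(regions, hand_xy)
    if device is None:
        return None
    return {"device": device, "gesture": gesture}
```

In the claimed method the hand position and gesture would come from analyzing the sequence of 3D maps; here they are passed in directly to keep the sketch self-contained.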
  • Patent number: 9377863
    Abstract: A method, including presenting, by a computer, multiple interactive items on a display coupled to the computer, and receiving an input indicating a direction of a gaze of a user of the computer. In response to the gaze direction, one of the multiple interactive items is selected, and subsequent to the one of the interactive items being selected, a sequence of three-dimensional (3D) maps is received containing at least a hand of the user. The 3D maps are analyzed to detect a gesture performed by the user, and an operation is performed on the selected interactive item in response to the gesture.
    Type: Grant
    Filed: March 24, 2013
    Date of Patent: June 28, 2016
    Assignee: APPLE INC.
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Eshel
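The gaze-to-select, gesture-to-operate pipeline in this abstract can be sketched as two steps. The item layout and the gesture vocabulary below are assumptions for illustration, not from the patent:

```python
def select_by_gaze(items, gaze_point):
    """Select the interactive item whose on-screen position is closest to
    where the user's gaze lands (each item is a dict with 'name', 'x', 'y')."""
    return min(
        items,
        key=lambda item: (item["x"] - gaze_point[0]) ** 2
                       + (item["y"] - gaze_point[1]) ** 2,
    )


def perform_gesture(selected, gesture):
    """Map a detected hand gesture to an operation on the selected item."""
    ops = {"pinch": "activate", "swipe": "scroll"}  # assumed gesture mapping
    return (selected["name"], ops.get(gesture, "ignore"))
```

In the claimed method the gesture would be detected by analyzing a sequence of 3D maps of the user's hand; this sketch takes the recognized gesture as input.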
  • Publication number: 20130283208
    Abstract: A method, including presenting, by a computer, multiple interactive items on a display coupled to the computer, and receiving an input indicating a direction of a gaze of a user of the computer. In response to the gaze direction, one of the multiple interactive items is selected, and subsequent to the one of the interactive items being selected, a sequence of three-dimensional (3D) maps is received containing at least a hand of the user. The 3D maps are analyzed to detect a gesture performed by the user, and an operation is performed on the selected interactive item in response to the gesture.
    Type: Application
    Filed: March 24, 2013
    Publication date: October 24, 2013
    Inventors: Eyal Bychkov, Oren Brezner, Micha Galor, Ofir Or, Jonathan Pokrass, Amir Eshel