Patents by Inventor Eli Elhadad
Eli Elhadad has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240045495
Abstract: Systems, methods, and non-transitory computer readable media including instructions for extracting content from a virtual display are disclosed. Extracting content from a virtual display includes generating a virtual display via a wearable extended reality appliance, wherein the virtual display presents a group of virtual objects and is located at a first virtual distance from the wearable extended reality appliance; generating an extended reality environment via the wearable extended reality appliance including at least one additional virtual object at a second virtual distance from the wearable extended reality appliance; receiving input for causing a specific virtual object to move from the virtual display to the extended reality environment; and in response, generating a presentation of a version of the specific virtual object in the extended reality environment at a third virtual distance different from the first virtual distance and the second virtual distance.
Type: Application
Filed: October 20, 2023
Publication date: February 8, 2024
Applicant: MULTINARITY LTD
Inventors: Eli Elhadad, Amit Knaani, Tomer Kahan, Tamir Berliner, Orit Dolev
-
Patent number: 11829524
Abstract: Systems, methods, and non-transitory computer readable media including instructions for extracting content from a virtual display are disclosed. Extracting content from a virtual display includes generating a virtual display via a wearable extended reality appliance, wherein the virtual display presents a group of virtual objects and is located at a first virtual distance from the wearable extended reality appliance; generating an extended reality environment via the wearable extended reality appliance including at least one additional virtual object at a second virtual distance from the wearable extended reality appliance; receiving input for causing a specific virtual object to move from the virtual display to the extended reality environment; and in response, generating a presentation of a version of the specific virtual object in the extended reality environment at a third virtual distance different from the first virtual distance and the second virtual distance.
Type: Grant
Filed: December 28, 2022
Date of Patent: November 28, 2023
Assignee: MULTINARITY LTD.
Inventors: Eli Elhadad, Amit Knaani, Tomer Kahan, Tamir Berliner, Orit Dolev
-
Publication number: 20230237752
Abstract: Systems, methods, and non-transitory computer readable media including instructions for extracting content from a virtual display are disclosed. Extracting content from a virtual display includes generating a virtual display via a wearable extended reality appliance, wherein the virtual display presents a group of virtual objects and is located at a first virtual distance from the wearable extended reality appliance; generating an extended reality environment via the wearable extended reality appliance including at least one additional virtual object at a second virtual distance from the wearable extended reality appliance; receiving input for causing a specific virtual object to move from the virtual display to the extended reality environment; and in response, generating a presentation of a version of the specific virtual object in the extended reality environment at a third virtual distance different from the first virtual distance and the second virtual distance.
Type: Application
Filed: December 28, 2022
Publication date: July 27, 2023
Applicant: MULTINARITY LTD
Inventors: Eli Elhadad, Amit Knaani, Tomer Kahan, Tamir Berliner, Orit Dolev
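The extraction mechanism described in the abstracts above can be sketched in a few lines. This is a minimal, hypothetical model (the class names, the midpoint placement policy, and the distances are illustrative assumptions, not taken from the patent): an object is removed from the virtual display at the first virtual distance and re-presented in the surrounding environment at a third distance distinct from both the display and the ambient objects.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualObject:
    name: str
    distance: float  # virtual distance from the wearable appliance


@dataclass
class ExtendedRealityScene:
    display_distance: float      # first virtual distance (the virtual display)
    environment_distance: float  # second virtual distance (ambient virtual objects)
    display_objects: list = field(default_factory=list)
    environment_objects: list = field(default_factory=list)

    def extract(self, name: str) -> VirtualObject:
        """Move a virtual object off the display into the environment,
        re-presenting it at a third, distinct virtual distance."""
        obj = next(o for o in self.display_objects if o.name == name)
        self.display_objects.remove(obj)
        # Hypothetical placement policy: midway between the display plane
        # and the ambient objects, so the third distance differs from
        # both the first and the second.
        obj.distance = (self.display_distance + self.environment_distance) / 2
        self.environment_objects.append(obj)
        return obj
```

With a display at 1.0 and ambient objects at 3.0, an extracted object lands at 2.0, satisfying the "different from the first and second virtual distance" condition in the claims.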
-
Patent number: 11599148
Abstract: Consistent with disclosed embodiments, systems, methods, and computer readable media including instructions for implementing hybrid virtual keys in an extended reality environment are disclosed. Embodiments may include a processor to receive signals from a touch-sensitive surface, wherein a wearable extended reality appliance may virtually project a plurality of virtual activatable elements on the touch-sensitive surface. The plurality of virtual activatable elements virtually projected on the touch-sensitive surface may be a proper subset of a group of virtual activatable elements, based on the action of a user. The processor may receive touch inputs from the user via the touch-sensitive surface and identify one of the plurality of virtual activatable elements. The processor may cause a change in virtual content based on the identified virtual activatable element.
Type: Grant
Filed: March 31, 2022
Date of Patent: March 7, 2023
Assignee: MULTINARITY LTD
Inventors: Tamir Berliner, Tomer Kahan, Eli Elhadad
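The hybrid-virtual-keys abstract describes two steps: projecting a proper subset of a group of virtual activatable elements based on the user's action, then resolving a touch on the surface to one of the projected elements. A minimal sketch, assuming a hypothetical layout table and nearest-element hit testing (the element names, actions, and coordinates are invented for illustration):

```python
# Hypothetical group of virtual activatable elements, and the proper
# subsets projected onto the touch-sensitive surface per user action.
ALL_ELEMENTS = {"a", "b", "c", "shift", "emoji", "send"}
LAYOUTS = {
    "typing": {"a", "b", "c", "shift"},
    "chatting": {"emoji", "send"},
}


def project_elements(user_action):
    """Select the subset of elements to project for this action."""
    subset = LAYOUTS[user_action]
    assert subset < ALL_ELEMENTS  # proper subset, per the claim language
    return subset


def identify_element(subset, touch, positions):
    """Resolve a touch coordinate on the surface to the nearest
    projected element (surface coordinates are illustrative)."""
    return min(
        subset,
        key=lambda e: (positions[e][0] - touch[0]) ** 2
        + (positions[e][1] - touch[1]) ** 2,
    )
```

The identified element would then drive a change in the virtual content, e.g. inserting the character into a virtual display.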
-
Patent number: 11574452
Abstract: Systems, methods, and non-transitory computer readable media containing instructions for causing at least one processor to perform operations to enable cursor control in an extended reality space are provided. In one implementation, the processor is configured to perform operations comprising receiving from an image sensor first image data reflecting a first region of focus of a user of a wearable extended reality appliance; causing a first presentation of a virtual cursor in the first region of focus; receiving from the image sensor second image data reflecting a second region of focus of the user outside the initial field of view in the extended reality space; receiving input data indicative of a desire of the user to interact with the virtual cursor; and causing a second presentation of the virtual cursor in the second region of focus in response to the input data.
Type: Grant
Filed: April 1, 2022
Date of Patent: February 7, 2023
Assignee: MULTINARITY LTD
Inventors: Tamir Berliner, Tomer Kahan, Orit Dolev, Oded Noam, Doron Assayas Terre, Eli Elhadad
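The cursor-control abstract pairs a change in the user's region of focus with a separate input signaling the desire to interact, and only then re-presents the cursor. A minimal sketch of that intent gating (class and parameter names are illustrative, not from the patent):

```python
class VirtualCursor:
    """Intent-gated cursor re-presentation: the cursor moves to a new
    region of focus only when input signals a desire to interact, so a
    passive glance outside the initial field of view does not drag it."""

    def __init__(self, initial_region):
        # First presentation, in the first region of focus.
        self.position = initial_region

    def on_focus_change(self, new_region, interaction_input):
        # Second image data reports a new region of focus; re-present
        # the cursor there only when interaction input accompanies it.
        if interaction_input:
            self.position = new_region
        return self.position
```

Without the gate, every saccade would teleport the cursor; with it, the cursor stays put until the user actually wants to interact.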
-
Publication number: 20220261149
Abstract: Consistent with disclosed embodiments, systems, methods, and computer readable media including instructions for implementing hybrid virtual keys in an extended reality environment are disclosed. Embodiments may include a processor to receive signals from a touch-sensitive surface, wherein a wearable extended reality appliance may virtually project a plurality of virtual activatable elements on the touch-sensitive surface. The plurality of virtual activatable elements virtually projected on the touch-sensitive surface may be a proper subset of a group of virtual activatable elements, based on the action of a user. The processor may receive touch inputs from the user via the touch-sensitive surface and identify one of the plurality of virtual activatable elements. The processor may cause a change in virtual content based on the identified virtual activatable element.
Type: Application
Filed: March 31, 2022
Publication date: August 18, 2022
Applicant: Multinarity Ltd
Inventors: Tamir Berliner, Tomer Kahan, Eli Elhadad
-
Publication number: 20220253194
Abstract: Systems, methods, and non-transitory computer readable media containing instructions for causing at least one processor to perform operations to enable cursor control in an extended reality space are provided. In one implementation, the processor is configured to perform operations comprising receiving from an image sensor first image data reflecting a first region of focus of a user of a wearable extended reality appliance; causing a first presentation of a virtual cursor in the first region of focus; receiving from the image sensor second image data reflecting a second region of focus of the user outside the initial field of view in the extended reality space; receiving input data indicative of a desire of the user to interact with the virtual cursor; and causing a second presentation of the virtual cursor in the second region of focus in response to the input data.
Type: Application
Filed: April 1, 2022
Publication date: August 11, 2022
Applicant: Multinarity Ltd
Inventors: Tamir Berliner, Tomer Kahan, Orit Dolev, Oded Noam, Doron Assayas Terre, Eli Elhadad
-
Patent number: 9395821
Abstract: Embodiments of systems and techniques for user interface (UI) control are disclosed herein. In some embodiments, a UI control system may determine locations of landmarks on a body of a user of a computing system, determine a pointer based at least in part on the landmark locations, and identify a UI element of a UI of the computing system based at least in part on the pointer. Other embodiments may be described and/or claimed.
Type: Grant
Filed: March 27, 2014
Date of Patent: July 19, 2016
Assignee: INTEL CORPORATION
Inventors: Yaron Yanai, Eli Elhadad, Amir Rosenberger
-
Publication number: 20150324001
Abstract: Embodiments of systems and techniques for user interface (UI) control are disclosed herein. In some embodiments, a UI control system may determine locations of landmarks on a body of a user of a computing system, determine a pointer based at least in part on the landmark locations, and identify a UI element of a UI of the computing system based at least in part on the pointer. Other embodiments may be described and/or claimed.
Type: Application
Filed: March 27, 2014
Publication date: November 12, 2015
Inventors: Yaron Yanai, Eli Elhadad, Amir Rosenberger
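The landmark-based UI control in the two entries above has three stages: locate body landmarks, derive a pointer from them, and identify the UI element the pointer indicates. One common pointer model is a ray cast from a base landmark (such as an eye or shoulder) through a tip landmark (such as a fingertip) onto the screen plane; the patent does not fix a specific model, so the geometry below is an illustrative assumption:

```python
def pointer_from_landmarks(base, tip):
    """Cast a ray from a base landmark through a tip landmark and
    intersect it with the screen plane z = 0. Landmarks are (x, y, z)
    coordinates with the screen at z = 0 (an assumed convention)."""
    (bx, by, bz), (tx, ty, tz) = base, tip
    t = bz / (bz - tz)  # ray parameter where the ray crosses z = 0
    return (bx + t * (tx - bx), by + t * (ty - by))


def hit_test(point, elements):
    """Identify the UI element whose rectangle contains the pointer.
    Elements map names to (x0, y0, x1, y1) screen rectangles."""
    x, y = point
    for name, (x0, y0, x1, y1) in elements.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```

For example, a shoulder at (0, 0, 2) and a fingertip at (0.5, 0.5, 1) point at (1.0, 1.0) on the screen, which `hit_test` can match against element rectangles.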
-
Publication number: 20140123077
Abstract: A system and method for close range object tracking are described. Close range depth images of a user's hands and fingers are acquired using a depth sensor. Movements of the user's hands and fingers are identified and tracked. This information is used to permit the user to interact with a virtual object, such as an icon or other object displayed on a screen, or the screen itself.
Type: Application
Filed: November 13, 2012
Publication date: May 1, 2014
Applicants: Intel Corporation, Omek Interactive, Ltd.
Inventors: Gershom Kutliroff, Yaron Yanai, Eli Elhadad
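The close-range tracking pipeline above can be sketched as two steps: segment the depth image to the near band where the hand is expected, then track the hand across frames by a summary statistic such as the centroid of the segmented pixels. This is a simplified illustration, not the patented method; the depth band thresholds (in meters) and the centroid tracker are assumptions:

```python
def segment_close_range(depth, near=0.10, far=0.60):
    """Keep only depth pixels inside the close-range band where the
    user's hand is expected. `depth` is a row-major grid of depths in
    meters; returns a boolean mask of the same shape."""
    return [[near <= d <= far for d in row] for row in depth]


def hand_centroid(mask):
    """Summarize the segmented hand by its pixel centroid; comparing
    centroids across successive frames yields the hand's motion, which
    can drive interaction with an on-screen virtual object."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, keep in enumerate(row) if keep]
    if not pts:
        return None  # no close-range object in this frame
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))
```

A real implementation would add per-finger tracking and gesture classification on top of the segmentation, but the near-band threshold is the step that makes "close range" tractable: everything beyond the hand is discarded before any shape analysis.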