Patents by Inventor Avi Bar-Zeev
Avi Bar-Zeev has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20220108537
Abstract: A system comprising: a user device, comprising: sensors configured to sense data related to a physical environment of the user device, displays; hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the hardware processors to: place a virtual object in a 3D scene displayed by the second user device, determine a pose of the user device with respect to the physical location in the physical environment of the user device, and generate an image of virtual content based on the pose of the user device with respect to the placed virtual object, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the physical location in the physical environment of the user device.
Type: Application
Filed: December 14, 2021
Publication date: April 7, 2022
Applicant: Campfire3D, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
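The placement logic this abstract describes (render virtual content at a predetermined location relative to a physical anchor, based on the device's pose) can be sketched as follows. This is a simplified 2D illustration under assumed names, not the patent's implementation.

```python
import math

def world_to_device(point_xy, device_xy, device_yaw):
    """Transform a world-frame point into the device's local frame.
    device_yaw is the device heading in radians (illustrative 2D case)."""
    dx = point_xy[0] - device_xy[0]
    dy = point_xy[1] - device_xy[1]
    cos_y, sin_y = math.cos(-device_yaw), math.sin(-device_yaw)
    # Rotate the world-frame offset into the device's local frame.
    return (dx * cos_y - dy * sin_y, dx * sin_y + dy * cos_y)

def place_virtual_content(anchor_xy, offset_xy, device_xy, device_yaw):
    """Return the device-frame position at which to project the virtual
    content: the physical anchor plus a predetermined offset."""
    world_pt = (anchor_xy[0] + offset_xy[0], anchor_xy[1] + offset_xy[1])
    return world_to_device(world_pt, device_xy, device_yaw)
```

Because the content's position is computed fresh from the current pose each frame, it stays pinned to the physical location as the user moves.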
-
Publication number: 20220084297
Abstract: A system comprising: a user device, comprising: sensors configured to sense data related to a physical environment of the user device, displays; hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the hardware processors to: place a virtual object in a 3D scene displayed by the second user device, determine a pose of the user device with respect to the physical location in the physical environment of the user device, and generate an image of virtual content based on the pose of the user device with respect to the placed virtual object, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the physical location in the physical environment of the user device.
Type: Application
Filed: October 5, 2021
Publication date: March 17, 2022
Applicant: Meta View, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
-
Publication number: 20220084235
Abstract: An augmented reality collaboration system comprises a first system configured to display virtual content, comprising: a structure comprising a plurality of radiation emitters arranged in a predetermined pattern, and a user device comprising: one or more sensors configured to sense outputs of the plurality of radiation emitters, and one or more displays; one or more hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the one or more hardware processors to, for the user device: determine a pose of the user device with respect to the structure based on the sensed outputs of the plurality of radiation emitters, and generate an image of virtual content based on the pose of the user device with respect to the structure, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the structure.Type: Application
Filed: September 16, 2020
Publication date: March 17, 2022
Applicant: Meta View, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
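Determining the device's pose from emitters at known positions in a predetermined pattern can be illustrated with a minimal 2D example: two emitters with known world positions, sensed in the device's local frame, are enough to solve for heading and position. The math below is a toy sketch, not the patent's method.

```python
import math

def pose_from_emitters(world_a, world_b, local_a, local_b):
    """Estimate device yaw and position from two radiation emitters whose
    world positions are known (the predetermined pattern) and whose
    positions were sensed in the device's local frame (2D sketch)."""
    # Yaw: angle of the emitter baseline in world frame minus local frame.
    ang_world = math.atan2(world_b[1] - world_a[1], world_b[0] - world_a[0])
    ang_local = math.atan2(local_b[1] - local_a[1], local_b[0] - local_a[0])
    yaw = ang_world - ang_local
    # Position: rotate one local observation by yaw, subtract from world.
    cos_y, sin_y = math.cos(yaw), math.sin(yaw)
    rx = local_a[0] * cos_y - local_a[1] * sin_y
    ry = local_a[0] * sin_y + local_a[1] * cos_y
    return yaw, (world_a[0] - rx, world_a[1] - ry)
```

With more emitters, a least-squares fit over all sensed outputs would make the estimate robust to sensor noise.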
-
Publication number: 20220076496
Abstract: The present disclosure relates to techniques for providing tangibility visualization of virtual objects within a computer-generated reality (CGR) environment, such as a CGR environment based on virtual reality and/or a CGR environment based on mixed reality. A visual feedback indicating tangibility is provided for a virtual object within a CGR environment that does not correspond to a real, tangible object in the real environment. A visual feedback indicating tangibility is not provided for a virtual representation of a real object within a CGR environment that corresponds to a real, tangible object in the real environment.
Type: Application
Filed: November 16, 2021
Publication date: March 10, 2022
Inventors: Alexis H. PALANGIE, Avi BAR-ZEEV
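The rule in this abstract reduces to a simple predicate: show the tangibility cue only for virtual objects with no real counterpart. The dictionary keys below are assumed names for illustration.

```python
def needs_tangibility_feedback(obj):
    """Show a tangibility cue only for a virtual object that does NOT
    correspond to a real, tangible object in the real environment."""
    return obj["is_virtual"] and not obj["corresponds_to_real_object"]

# A purely virtual lamp gets the cue; a virtual stand-in for a real
# table does not, because the real table is itself tangible.
virtual_lamp = {"is_virtual": True, "corresponds_to_real_object": False}
real_table_proxy = {"is_virtual": True, "corresponds_to_real_object": True}
```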
-
Publication number: 20220058965
Abstract: This disclosure describes an unmanned aerial vehicle (“UAV”) configured to autonomously deliver items of inventory to various destinations. The UAV may receive inventory information and a destination location and autonomously retrieve the inventory from a location within a materials handling facility, compute a route from the materials handling facility to a destination and travel to the destination to deliver the inventory.
Type: Application
Filed: November 4, 2021
Publication date: February 24, 2022
Inventors: Gur Kimchi, Daniel Buchmueller, Scott A. Green, Brian C. Beckman, Scott Isaacs, Amir Navot, Fabian Hensel, Avi Bar-Zeev, Severan Sylvain Jean-Michel Rault
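The route-computation step described above can be sketched in its simplest form as evenly spaced waypoints from the facility to the destination. This straight-line version is an illustration only; a real planner would also weigh airspace restrictions, terrain, weather, and battery.

```python
import math

def compute_route(start, dest, step=1.0):
    """Lay out evenly spaced waypoints from the materials handling
    facility (start) to the delivery destination (dest)."""
    dx, dy = dest[0] - start[0], dest[1] - start[1]
    n = max(1, int(math.hypot(dx, dy) // step))
    return [(start[0] + dx * i / n, start[1] + dy * i / n)
            for i in range(n + 1)]
```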
-
Patent number: 11243402
Abstract: A mixed reality system including a head-mounted display (HMD) and a base station. Information collected by HMD sensors may be transmitted to the base via a wired or wireless connection. On the base, a rendering engine renders frames including virtual content based in part on the sensor information, and an encoder compresses the frames according to an encoding protocol before sending the frames to the HMD over the connection. Instead of using a previous frame to estimate motion vectors in the encoder, motion vectors from the HMD and the rendering engine are input to the encoder and used in compressing the frame. The motion vectors may be embedded in the data stream along with the encoded frame data and transmitted to the HMD over the connection. If a frame is not received at the HMD, the HMD may synthesize a frame from a previous frame using the motion vectors.
Type: Grant
Filed: February 5, 2021
Date of Patent: February 8, 2022
Assignee: Apple Inc.
Inventors: Geoffrey Stahl, Avi Bar-Zeev
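The frame-synthesis fallback (reconstruct a missing frame by shifting blocks of the previous frame along their motion vectors) can be sketched as follows. This is a toy single-channel version with assumed data shapes, not the patent's codec.

```python
def synthesize_frame(prev, motion_vectors, block=2):
    """Reconstruct a missing frame by moving each block of the previous
    frame along its motion vector, as the HMD can do when an encoded
    frame fails to arrive. prev is a 2D list of pixel values;
    motion_vectors maps (block_row, block_col) -> (dy, dx)."""
    h, w = len(prev), len(prev[0])
    out = [[0] * w for _ in range(h)]
    for (by, bx), (dy, dx) in motion_vectors.items():
        for y in range(block):
            for x in range(block):
                sy, sx = by + y, bx + x
                ty, tx = sy + dy, sx + dx
                if 0 <= ty < h and 0 <= tx < w:
                    out[ty][tx] = prev[sy][sx]
    return out
```

Because the motion vectors arrive embedded in the data stream, the HMD can apply them without re-estimating motion itself.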
-
Publication number: 20220012002Abstract: In an exemplary process for interacting with user interface objects using an eye gaze, an affordance associated with a first object is displayed. A gaze direction or a gaze depth is determined. While the gaze direction or the gaze depth is determined to correspond to a gaze at the affordance, a first input representing user instruction to take action on the affordance is received, and the affordance is selected responsive to receiving the first input.Type: ApplicationFiled: September 27, 2021Publication date: January 13, 2022Inventors: Avi BAR-ZEEV, Ryan S. BURGOYNE, Devin W. CHALMERS, Luis R. DELIZ CENTENO, Rahul NAIR, Timothy R. ORIOL, Alexis H. PALANGIE
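The interaction model above (gaze identifies the target, a separate input confirms the action) can be sketched as two predicates. The tolerance values and field names are illustrative assumptions.

```python
def gaze_hits(gaze, affordance, dir_tol=0.1, depth_tol=0.2):
    """True when both gaze direction and gaze depth fall within
    tolerance of the affordance (tolerances are assumed values)."""
    return (abs(gaze["direction"] - affordance["direction"]) <= dir_tol
            and abs(gaze["depth"] - affordance["depth"]) <= depth_tol)

def try_select(gaze, affordance, confirm_received):
    """Gaze alone never selects: the affordance is selected only when
    the confirming input arrives while the gaze rests on it."""
    return gaze_hits(gaze, affordance) and confirm_received
```

Requiring the separate confirming input avoids the classic "Midas touch" problem of gaze-only interfaces, where everything looked at gets activated.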
-
Patent number: 11223704
Abstract: In server/client architectures, the server application and client applications are often developed in different languages and execute in different environments specialized for the different contexts of each application (e.g., low-level, performant, platform-specialized, and stateless instructions on the server, and high-level, flexible, platform-agnostic, and stateful languages on the client) and are often executed on different devices. Convergence of these environments (e.g., server-side JavaScript using Node.js) enables the provision of a server that services client applications executing on the same device. The local server may monitor local events occurring on the device, and may execute one or more server scripts associated with particular local events on behalf of local clients subscribing to the local event (e.g., via a subscription model).
Type: Grant
Filed: August 29, 2018
Date of Patent: January 11, 2022
Assignee: Microsoft Technology Licensing, LLC
Inventors: Avi Bar-Zeev, Gur Kimchi, Brian C. Beckman, Scott Isaacs, Meir Ben-Itay, Eran Yariv, Blaise Aguera y Arcas
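The subscription model described above is essentially local publish/subscribe: clients register scripts against named local events, and the server runs those scripts on their behalf when the event fires. A minimal sketch (class and method names are assumptions):

```python
class LocalEventServer:
    """Minimal sketch of the subscription model: local client apps
    subscribe scripts to named device events; when an event occurs,
    the server executes each subscribed script on the client's behalf."""

    def __init__(self):
        self._subs = {}

    def subscribe(self, event, script):
        """Register a callable to run when `event` occurs."""
        self._subs.setdefault(event, []).append(script)

    def publish(self, event, payload):
        """Run every script subscribed to `event`, returning results."""
        return [script(payload) for script in self._subs.get(event, [])]
```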
-
Patent number: 11195422
Abstract: This disclosure describes an unmanned aerial vehicle (“UAV”) configured to autonomously deliver items of inventory to various destinations. The UAV may receive inventory information and a destination location and autonomously retrieve the inventory from a location within a materials handling facility, compute a route from the materials handling facility to a destination and travel to the destination to deliver the inventory.
Type: Grant
Filed: July 25, 2019
Date of Patent: December 7, 2021
Assignee: Amazon Technologies, Inc.
Inventors: Gur Kimchi, Daniel Buchmueller, Scott A. Green, Brian C. Beckman, Scott Isaacs, Amir Navot, Fabian Hensel, Avi Bar-Zeev, Severan Sylvain Jean-Michel Rault
-
Publication number: 20210365116
Abstract: One exemplary implementation provides an improved user experience on a device by using physiological data to initiate a user interaction for the user experience based on an identified interest or intention of a user. For example, a sensor may obtain physiological data (e.g., pupil diameter) of a user during a user experience in which content is displayed on a display. The physiological data varies over time during the user experience and a pattern is detected. The detected pattern is used to identify an interest of the user in the content or an intention of the user regarding the content. The user interaction is then initiated based on the identified interest or the identified intention.
Type: Application
Filed: August 6, 2021
Publication date: November 25, 2021
Inventors: Avi Bar-Zeev, Devin W. Chalmers, Fletcher R. Rothkopf, Grant H. Mulliken, Holly E. Gerhard, Lilli I. Jonsson
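One way to picture "a pattern is detected" in the pupil-diameter example is a sustained dilation above baseline. The threshold-and-streak heuristic below is purely an illustrative assumption, not the disclosed detection method.

```python
def detect_interest(pupil_diameters, baseline, threshold=0.15, streak_len=3):
    """Flag an interest/intention pattern when pupil diameter stays a
    threshold fraction above baseline for `streak_len` consecutive
    samples (an assumed heuristic, for illustration only)."""
    streak = 0
    for d in pupil_diameters:
        streak = streak + 1 if d > baseline * (1 + threshold) else 0
        if streak >= streak_len:
            return True
    return False
```

Once the pattern fires, the system would initiate the corresponding interaction, e.g. surfacing more detail about the content the user is fixating on.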
-
Patent number: 11182964
Abstract: The present disclosure relates to techniques for providing tangibility visualization of virtual objects within a computer-generated reality (CGR) environment, such as a CGR environment based on virtual reality and/or a CGR environment based on mixed reality. A visual feedback indicating tangibility is provided for a virtual object within a CGR environment that does not correspond to a real, tangible object in the real environment. A visual feedback indicating tangibility is not provided for a virtual representation of a real object within a CGR environment that corresponds to a real, tangible object in the real environment.
Type: Grant
Filed: April 4, 2019
Date of Patent: November 23, 2021
Assignee: Apple Inc.
Inventors: Alexis H. Palangie, Avi Bar-Zeev
-
Publication number: 20210354038
Abstract: Systems and methods to provide a mobile computing platform as a physical interface for an interactive space are presented herein. The interactive space may be experienced by a user of a host device (e.g., headset). The interactive space may include views of virtual content. A position and/or heading of the mobile computing platform relative to a perceived position and/or heading of the virtual content of the interactive space may be determined. Remote command information may be determined based on the relative position information and/or user input information conveying user entry and/or selection of one or more input elements of the mobile computing platform. The remote command information may be configured to effectuate user interactions with the virtual content in the interactive space based on user interactions with the mobile computing platform.
Type: Application
Filed: October 16, 2019
Publication date: November 18, 2021
Inventors: Avi BAR-ZEEV, Gerald WRIGHT JR, Alex TURIN, Diego LEYTON
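A toy version of the mapping above: the phone's position and heading relative to the virtual content, plus a touchscreen input, determine the remote command sent to the headset. The command names and the 0.2 rad aiming cone are illustrative assumptions.

```python
import math

def remote_command(platform_pos, platform_heading, content_pos, tapped):
    """Derive remote command information from the mobile computing
    platform's pose relative to the virtual content plus a tap on one
    of its input elements (2D sketch with assumed command names)."""
    aim = math.atan2(content_pos[1] - platform_pos[1],
                     content_pos[0] - platform_pos[0])
    pointing = abs(aim - platform_heading) < 0.2
    if pointing and tapped:
        return "select"
    return "highlight" if pointing else None
```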
-
Patent number: 11176756
Abstract: A system comprising: a user device, comprising: sensors configured to sense data related to a physical environment of the user device, displays; hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the hardware processors to: place a virtual object in a 3D scene displayed by the second user device, determine a pose of the user device with respect to the physical location in the physical environment of the user device, and generate an image of virtual content based on the pose of the user device with respect to the placed virtual object, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the physical location in the physical environment of the user device.
Type: Grant
Filed: September 16, 2020
Date of Patent: November 16, 2021
Assignee: Meta View, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
-
Publication number: 20210339143
Abstract: In various implementations, methods and devices for attenuation of co-user interactions in SR space are described. In one implementation, a method of attenuating avatars based on a breach of avatar social interaction criteria is performed at a device provided to deliver simulated reality (SR) content. In one implementation, a method of close collaboration in SR setting is performed at a device provided to deliver SR content.
Type: Application
Filed: September 17, 2019
Publication date: November 4, 2021
Inventors: Avi Bar-Zeev, Alexis Henri Palangie, Luis Rafael Deliz Centeno, Rahul Nair
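"Attenuating avatars based on a breach of avatar social interaction criteria" can be pictured as fading an offending avatar in proportion to its violations. The linear fade and the function below are illustrative assumptions only.

```python
def avatar_opacity(violations, fade_per_violation=0.25):
    """Attenuate an avatar's rendered opacity in proportion to how many
    social interaction criteria it has breached (assumed linear fade;
    0.0 means fully attenuated, 1.0 fully visible)."""
    return max(0.0, 1.0 - fade_per_violation * violations)
```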
-
Publication number: 20210312694
Abstract: A mixed reality system that includes a device and a base station that communicate via a wireless connection. The device may include sensors that collect information about the user's environment and about the user. The information collected by the sensors may be transmitted to the base station via the wireless connection. The base station renders frames or slices based at least in part on the sensor information received from the device, encodes the frames or slices, and transmits the compressed frames or slices to the device for decoding and display. The base station may provide more computing power than conventional stand-alone systems, and the wireless connection does not tether the device to the base station as in conventional tethered systems. The system may implement methods and apparatus to maintain a target frame rate through the wireless link and to minimize latency in frame rendering, transmittal, and display.
Type: Application
Filed: June 18, 2021
Publication date: October 7, 2021
Applicant: Apple Inc.
Inventors: Arthur Y Zhang, Ray L. Chang, Timothy R. Oriol, Ling Su, Gurjeet S. Saund, Guy Cote, Jim C. Chou, Hao Pan, Tobias Eble, Avi Bar-Zeev, Sheng Zhang, Justin A. Hensley, Geoffrey Stahl
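One common way to "maintain a target frame rate through the wireless link" is closed-loop control of encode quality against measured frame time. The step sizes, thresholds, and function below are illustrative assumptions, not the disclosed mechanism.

```python
def adjust_encode_quality(quality, frame_time_ms, target_ms=11.1):
    """Nudge encoder quality down when frames take longer than the
    target frame time (e.g., 11.1 ms for 90 fps) over the wireless
    link, and back up when there is headroom. Step sizes are assumed."""
    if frame_time_ms > target_ms:
        return max(0.1, quality - 0.05)   # over budget: reduce quality
    if frame_time_ms < 0.8 * target_ms:
        return min(1.0, quality + 0.02)   # headroom: recover quality
    return quality                         # within band: hold steady
```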
-
Patent number: 11137967
Abstract: In an exemplary process for interacting with user interface objects using an eye gaze, an affordance associated with a first object is displayed. A gaze direction or a gaze depth is determined. While the gaze direction or the gaze depth is determined to correspond to a gaze at the affordance, a first input representing user instruction to take action on the affordance is received, and the affordance is selected responsive to receiving the first input.
Type: Grant
Filed: March 24, 2020
Date of Patent: October 5, 2021
Assignee: Apple Inc.
Inventors: Avi Bar-Zeev, Ryan S. Burgoyne, Devin W. Chalmers, Luis R. Deliz Centeno, Rahul Nair
-
Patent number: 11132162
Abstract: In an exemplary process for interacting with user interface objects using an eye gaze, an affordance associated with a first object is displayed. A gaze direction or a gaze depth is determined. While the gaze direction or the gaze depth is determined to correspond to a gaze at the affordance, a first input representing user instruction to take action on the affordance is received, and the affordance is selected responsive to receiving the first input.
Type: Grant
Filed: March 24, 2020
Date of Patent: September 28, 2021
Assignee: Apple Inc.
Inventors: Avi Bar-Zeev, Ryan S. Burgoyne, Devin W. Chalmers, Luis R. Deliz Centeno, Rahul Nair, Timothy R. Oriol, Alexis H. Palangie
-
Patent number: 11127182
Abstract: Techniques for alerting a user, who is immersed in a virtual reality environment, to physical obstacles in their physical environment are disclosed.
Type: Grant
Filed: March 27, 2020
Date of Patent: September 21, 2021
Assignee: Apple Inc.
Inventors: Seyedkoosha Mirhosseini, Avi Bar-Zeev, Duncan A. K. Mcroberts
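The core of such an alerting technique is a proximity check between the immersed user and tracked physical obstacles. The distance-threshold sketch below is an illustrative assumption, not the patented method.

```python
import math

def obstacle_alert(user_pos, obstacles, warn_dist=1.0):
    """Return the physical obstacles close enough to the VR-immersed
    user to warrant an alert (simple 2D distance threshold sketch)."""
    return [o for o in obstacles
            if math.hypot(o[0] - user_pos[0], o[1] - user_pos[1]) < warn_dist]
```

The returned obstacles could then drive a visual or audio warning inside the virtual environment before the user reaches them.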
-
Patent number: 11119573
Abstract: One exemplary implementation provides an improved user experience on a device by using physiological data to initiate a user interaction for the user experience based on an identified interest or intention of a user. For example, a sensor may obtain physiological data (e.g., pupil diameter) of a user during a user experience in which content is displayed on a display. The physiological data varies over time during the user experience and a pattern is detected. The detected pattern is used to identify an interest of the user in the content or an intention of the user regarding the content. The user interaction is then initiated based on the identified interest or the identified intention.
Type: Grant
Filed: September 12, 2019
Date of Patent: September 14, 2021
Assignee: Apple Inc.
Inventors: Avi Bar-Zeev, Devin W. Chalmers, Fletcher R. Rothkopf, Grant H. Mulliken, Holly E. Gerhard, Lilli I. Jonsson
-
Patent number: 11087518
Abstract: The claimed subject matter relates to an architecture that can provide for a second-person avatar. The second-person avatar can rely upon a second-person-based perspective such that the avatar is displayed to appear to encompass all or portions of a target user. Accordingly, actions or a configuration of the avatar can serve as a model or demonstration for the user in order to aid the user in accomplishing a particular task. Updates to avatar activity or configuration can be provided by a dynamic virtual handbook. The virtual handbook can be constructed based upon a set of instructions associated with accomplishing the desired task and further based upon features or aspects of the user as well as those of the local environment.
Type: Grant
Filed: August 4, 2016
Date of Patent: August 10, 2021
Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
Inventors: Eyal Ofek, Blaise H. Aguera y Arcas, Avi Bar-Zeev, Gur Kimchi, Jason Szabo