Patents by Inventor Alexander Tyurin
Alexander Tyurin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11922652
Abstract: An augmented reality collaboration system comprises a first system configured to display virtual content, comprising: a structure comprising a plurality of radiation emitters arranged in a predetermined pattern, and a user device comprising: one or more sensors configured to sense outputs of the plurality of radiation emitters, and one or more displays; one or more hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the one or more hardware processors to, for the user device: determine a pose of the user device with respect to the structure based on the sensed outputs of the plurality of radiation emitters, and generate an image of virtual content based on the pose of the user device with respect to the structure, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the structure.
Type: Grant
Filed: January 13, 2023
Date of Patent: March 5, 2024
Assignee: Campfire 3D, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
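The abstract does not specify how the pose is computed from the sensed emitter outputs. When the emitter pattern is known in advance and the sensors recover 3D emitter positions, a standard approach is rigid-transform estimation via the Kabsch algorithm. The sketch below is illustrative only (all names and the example layout are hypothetical, not from the patent):

```python
import numpy as np

def estimate_pose(emitters_struct, emitters_device):
    """Recover the rigid transform (R, t) with device = R @ struct + t
    from matched 3D emitter positions, via the Kabsch algorithm."""
    cs = emitters_struct.mean(axis=0)
    cd = emitters_device.mean(axis=0)
    H = (emitters_struct - cs).T @ (emitters_device - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # guard against reflections
    t = cd - R @ cs
    return R, t

# Known emitter layout in the structure's frame (the predetermined pattern).
pattern = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
# Simulated sensed positions: structure rotated 90 degrees about z, then shifted.
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
sensed = (Rz @ pattern.T).T + np.array([0.5, 0.2, 1.0])
R, t = estimate_pose(pattern, sensed)  # recovers Rz and the shift exactly
```

With noisy measurements the same code returns the least-squares best-fit pose; real systems would typically also reject outlier detections first.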
-
Patent number: 11847752
Abstract: A system comprising: a user device, comprising: sensors configured to sense data related to a physical environment of the user device, displays; hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the hardware processors to: place a virtual object in a 3D scene displayed by the second user device, determine a pose of the user device with respect to the physical location in the physical environment of the user device, and generate an image of virtual content based on the pose of the user device with respect to the placed virtual object, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the physical location in the physical environment of the user device.
Type: Grant
Filed: January 4, 2023
Date of Patent: December 19, 2023
Assignee: Campfire 3D, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
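The core step this family of abstracts describes — generating an image of virtual content from the device's pose relative to a placed object — amounts to transforming the object's anchor into the device frame and projecting it. A minimal pinhole-camera sketch (hypothetical names and intrinsics, not taken from the patent):

```python
import numpy as np

def world_to_device(p_world, R_dev, t_dev):
    """Express a world-frame point in the device (camera) frame,
    given the device pose (R_dev, t_dev): x_world = R_dev @ x_dev + t_dev."""
    return R_dev.T @ (p_world - t_dev)

def project(p_dev, f=800.0, cx=640.0, cy=360.0):
    """Pinhole projection of a device-frame point to pixel coordinates."""
    x, y, z = p_dev
    return np.array([f * x / z + cx, f * y / z + cy])

# Virtual object anchored at a physical location in the world frame.
anchor = np.array([0.0, 0.0, 2.0])
# Device at the origin, looking down +z (identity pose).
R_dev, t_dev = np.eye(3), np.zeros(3)
pixel = project(world_to_device(anchor, R_dev, t_dev))
# An object straight ahead lands at the principal point: [640., 360.]
```

Because the anchor is fixed in the world frame, re-running this each frame with the latest device pose keeps the rendered content locked to its predetermined physical location.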
-
Patent number: 11756225
Abstract: An augmented reality collaboration system comprises a first system configured to display virtual content, comprising: a structure comprising a plurality of radiation emitters arranged in a predetermined pattern, and a user device comprising: one or more sensors configured to sense outputs of the plurality of radiation emitters, and one or more displays; one or more hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the one or more hardware processors to, for the user device: determine a pose of the user device with respect to the structure based on the sensed outputs of the plurality of radiation emitters, and generate an image of virtual content based on the pose of the user device with respect to the structure, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the structure.
Type: Grant
Filed: September 16, 2020
Date of Patent: September 12, 2023
Assignee: Campfire 3D, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
-
Patent number: 11710284
Abstract: A system comprising: a user device, comprising: sensors configured to sense data related to a physical environment of the user device, displays; hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the hardware processors to: place a virtual object in a 3D scene displayed by the second user device, determine a pose of the user device with respect to the physical location in the physical environment of the user device, and generate an image of virtual content based on the pose of the user device with respect to the placed virtual object, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the physical location in the physical environment of the user device.
Type: Grant
Filed: December 14, 2021
Date of Patent: July 25, 2023
Assignee: Campfire 3D, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
-
Patent number: 11697068
Abstract: Systems and methods to provide a mobile computing platform as a physical interface for an interactive space are presented herein. The interactive space may be experienced by a user of a host device (e.g., headset). The interactive space may include views of virtual content. A position and/or heading of the mobile computing platform relative to a perceived position and/or heading of the virtual content of the interactive space may be determined. Remote command information may be determined based on the relative position information and/or user input information conveying user entry and/or selection of one or more input elements of the mobile computing platform. The remote command information may be configured to effectuate user interactions with the virtual content in the interactive space based on user interactions with the mobile computing platform.
Type: Grant
Filed: October 16, 2019
Date of Patent: July 11, 2023
Assignee: Campfire 3D, Inc.
Inventors: Avi Bar-Zeev, Gerald Wright, Jr., Alexander Tyurin, Diego Leyton
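The abstract describes deriving remote command information from the phone's heading relative to virtual content plus an input selection. One simple interpretation of that pipeline — gating a tapped input element on whether the platform is pointed at the content — can be sketched as follows (the angle threshold and command format are invented for illustration):

```python
def relative_yaw(device_yaw, target_bearing):
    """Signed smallest angle (degrees) from the device heading to the target."""
    return (target_bearing - device_yaw + 180.0) % 360.0 - 180.0

def remote_command(device_yaw, target_bearing, input_element, fov=30.0):
    """Map a tap on the mobile platform to a command on the virtual
    content, but only when the platform is pointed at that content."""
    if abs(relative_yaw(device_yaw, target_bearing)) <= fov / 2:
        return {"target": "virtual-content", "action": input_element}
    return None

# Pointing 10 degrees off the content's bearing while tapping "select":
cmd = remote_command(80.0, 90.0, "select")       # within the 30-degree cone
miss = remote_command(0.0, 90.0, "select")       # pointed elsewhere -> None
```

A full implementation would use the complete 6-DoF relative transform rather than yaw alone, but the gating logic is the same.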
-
Patent number: 11688147
Abstract: A system comprising: a user device, comprising: sensors configured to sense data related to a physical environment of the user device, displays; hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the hardware processors to: place a virtual object in a 3D scene displayed by the second user device, determine a pose of the user device with respect to the physical location in the physical environment of the user device, and generate an image of virtual content based on the pose of the user device with respect to the placed virtual object, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the physical location in the physical environment of the user device.
Type: Grant
Filed: December 14, 2021
Date of Patent: June 27, 2023
Assignee: Campfire 3D, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
-
Publication number: 20230154036
Abstract: An augmented reality collaboration system comprises a first system configured to display virtual content, comprising: a structure comprising a plurality of radiation emitters arranged in a predetermined pattern, and a user device comprising: one or more sensors configured to sense outputs of the plurality of radiation emitters, and one or more displays; one or more hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the one or more hardware processors to, for the user device: determine a pose of the user device with respect to the structure based on the sensed outputs of the plurality of radiation emitters, and generate an image of virtual content based on the pose of the user device with respect to the structure, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the structure.
Type: Application
Filed: January 13, 2023
Publication date: May 18, 2023
Applicant: Campfire 3D, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
-
Publication number: 20230143213
Abstract: A system comprising: a user device, comprising: sensors configured to sense data related to a physical environment of the user device, displays; hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the hardware processors to: place a virtual object in a 3D scene displayed by the second user device, determine a pose of the user device with respect to the physical location in the physical environment of the user device, and generate an image of virtual content based on the pose of the user device with respect to the placed virtual object, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the physical location in the physical environment of the user device.
Type: Application
Filed: January 4, 2023
Publication date: May 11, 2023
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
-
Patent number: 11587295
Abstract: A system comprising: a user device, comprising: sensors configured to sense data related to a physical environment of the user device, displays; hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the hardware processors to: place a virtual object in a 3D scene displayed by the second user device, determine a pose of the user device with respect to the physical location in the physical environment of the user device, and generate an image of virtual content based on the pose of the user device with respect to the placed virtual object, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the physical location in the physical environment of the user device.
Type: Grant
Filed: October 5, 2021
Date of Patent: February 21, 2023
Assignee: Meta View, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
-
Publication number: 20220108538
Abstract: A system comprising: a user device, comprising: sensors configured to sense data related to a physical environment of the user device, displays; hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the hardware processors to: place a virtual object in a 3D scene displayed by the second user device, determine a pose of the user device with respect to the physical location in the physical environment of the user device, and generate an image of virtual content based on the pose of the user device with respect to the placed virtual object, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the physical location in the physical environment of the user device.
Type: Application
Filed: December 14, 2021
Publication date: April 7, 2022
Applicant: Campfire3D, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
-
Publication number: 20220108537
Abstract: A system comprising: a user device, comprising: sensors configured to sense data related to a physical environment of the user device, displays; hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the hardware processors to: place a virtual object in a 3D scene displayed by the second user device, determine a pose of the user device with respect to the physical location in the physical environment of the user device, and generate an image of virtual content based on the pose of the user device with respect to the placed virtual object, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the physical location in the physical environment of the user device.
Type: Application
Filed: December 14, 2021
Publication date: April 7, 2022
Applicant: Campfire3D, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
-
Publication number: 20220083307
Abstract: In general, one aspect disclosed features a system comprising: a first user device configured to display virtual content, the first user device comprising one or more displays; one or more hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the one or more hardware processors to: generate a first image depicting virtual content in a virtual location corresponding to a physical location in a physical environment of the first user device, display the first image in the one or more displays of the first user device, enable a user of the first user device to create media and associate that media with the virtual content in the first image in the form of an annotation, and store the annotation and virtual content, and make it available for access by a plurality of additional user devices.
Type: Application
Filed: November 9, 2021
Publication date: March 17, 2022
Applicant: Meta View, Inc.
Inventors: Alexander Tyurin, Gerald V. Wright, Jr.
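The annotation workflow this abstract describes — attach user-created media to a piece of virtual content, then make it readable by other devices — maps naturally onto a keyed shared store. A minimal sketch with invented types and field names (the publication does not specify a data model):

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    author: str
    media: str            # e.g. text, or a URI to audio/image media
    virtual_object_id: str

class AnnotationStore:
    """Shared store: annotations attached to virtual content, readable
    by any number of additional user devices."""
    def __init__(self):
        self._by_object = {}

    def attach(self, note: Annotation):
        self._by_object.setdefault(note.virtual_object_id, []).append(note)

    def fetch(self, virtual_object_id: str):
        return list(self._by_object.get(virtual_object_id, []))

store = AnnotationStore()
store.attach(Annotation("device-1", "check this joint", "obj-42"))
notes = store.fetch("obj-42")  # visible to every other user device
```

In a multi-device deployment the store would live behind a network service, but the attach/fetch keying by virtual object is the essential idea.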
-
Publication number: 20220084297
Abstract: A system comprising: a user device, comprising: sensors configured to sense data related to a physical environment of the user device, displays; hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the hardware processors to: place a virtual object in a 3D scene displayed by the second user device, determine a pose of the user device with respect to the physical location in the physical environment of the user device, and generate an image of virtual content based on the pose of the user device with respect to the placed virtual object, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the physical location in the physical environment of the user device.
Type: Application
Filed: October 5, 2021
Publication date: March 17, 2022
Applicant: Meta View, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
-
Publication number: 20220084235
Abstract: An augmented reality collaboration system comprises a first system configured to display virtual content, comprising: a structure comprising a plurality of radiation emitters arranged in a predetermined pattern, and a user device comprising: one or more sensors configured to sense outputs of the plurality of radiation emitters, and one or more displays; one or more hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the one or more hardware processors to, for the user device: determine a pose of the user device with respect to the structure based on the sensed outputs of the plurality of radiation emitters, and generate an image of virtual content based on the pose of the user device with respect to the structure, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the structure.
Type: Application
Filed: September 16, 2020
Publication date: March 17, 2022
Applicant: Meta View, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.
-
Patent number: 11176756
Abstract: A system comprising: a user device, comprising: sensors configured to sense data related to a physical environment of the user device, displays; hardware processors; and a non-transitory machine-readable storage medium encoded with instructions executable by the hardware processors to: place a virtual object in a 3D scene displayed by the second user device, determine a pose of the user device with respect to the physical location in the physical environment of the user device, and generate an image of virtual content based on the pose of the user device with respect to the placed virtual object, wherein the image of the virtual content is projected by the one or more displays of the user device in a predetermined location relative to the physical location in the physical environment of the user device.
Type: Grant
Filed: September 16, 2020
Date of Patent: November 16, 2021
Assignee: Meta View, Inc.
Inventors: Avi Bar-Zeev, Alexander Tyurin, Gerald V. Wright, Jr.