Patents by Inventor Alessia Marra
Alessia Marra has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11914836
Abstract: A method includes accessing an image of a physical environment of a user, the image depicting a physical input device and a physical hand of the user, determining that a contrast between the physical input device and the physical hand depicted in the image is lower than a predetermined threshold, modifying the image to increase the contrast, determining a pose of the physical input device, generating a three-dimensional model representing the physical hand of the user, generating an image mask by projecting the three-dimensional model onto an image plane, generating a cropped image depicting at least the physical hand of the user in the image, rendering, based on the perspective of the user and the pose of the physical input device, a virtual input device to represent the physical input device, and displaying the cropped image depicting at least the physical hand over the rendered virtual input device.
Type: Grant
Filed: December 20, 2022
Date of Patent: February 27, 2024
Assignee: Meta Platforms Technologies, LLC
Inventors: Adrian Brian Ratter, Alessia Marra, Yugeng He, Panya Inversin
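The mask-and-composite step this abstract describes could be sketched roughly as follows. Everything here is an illustrative assumption, not the patented implementation: the function name, the array layout, and in particular the luminance-based contrast heuristic and the fixed 1.5x contrast boost are stand-ins for whatever the claimed method actually uses.

```python
import numpy as np

def composite_hand_over_virtual_device(camera_img, rendered_img, hand_mask,
                                       contrast_threshold=0.1):
    """Overlay the user's real hand on a rendered virtual input device.

    camera_img:   HxWx3 float array in [0, 1], the passthrough camera frame.
    rendered_img: HxWx3 float array, the rendered virtual input device.
    hand_mask:    HxW bool array, True where the projected 3-D hand
                  model lands on the image plane.
    """
    # Crude stand-in for the contrast test: compare mean luminance of the
    # hand region against the rest of the frame.
    lum = camera_img.mean(axis=2)
    hand_lum = lum[hand_mask].mean() if hand_mask.any() else 0.0
    bg_lum = lum[~hand_mask].mean()

    img = camera_img
    if abs(hand_lum - bg_lum) < contrast_threshold:
        # Contrast too low: stretch values around the mid-point.
        img = np.clip(0.5 + (img - 0.5) * 1.5, 0.0, 1.0)

    # "Crop" the hand via the mask and display it over the rendered device.
    out = rendered_img.copy()
    out[hand_mask] = img[hand_mask]
    return out
```

In a real pipeline the mask would come from rasterizing the tracked hand model with the headset camera's projection, rather than being handed in directly.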
-
Patent number: 11902288
Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
Type: Grant
Filed: February 9, 2023
Date of Patent: February 13, 2024
Assignee: Meta Platforms Technologies, LLC
Inventors: Michael James Lebeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
-
Patent number: 11770384
Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
Type: Grant
Filed: February 18, 2022
Date of Patent: September 26, 2023
Assignee: Meta Platforms Technologies, LLC
Inventors: Michael James Lebeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
-
Publication number: 20230259194
Abstract: In one embodiment, a method includes capturing, by a first VR display device, one or more frames of a shared real-world environment. The first VR display device identifies one or more anchor points within the shared real-world environment from the one or more frames. The first VR display device receives localization information with respect to a second VR display device in the shared real-world environment and determines a pose of the first VR display device with respect to the second VR display device based on the localization information. A first output image is rendered for one or more displays of the first VR display device. The rendered image may comprise a proximity warning with respect to the second VR display device based on determining that the pose of the first VR display device with respect to the second VR display device is within a threshold distance.
Type: Application
Filed: February 16, 2022
Publication date: August 17, 2023
Inventors: David Frederick Geisert, Alessia Marra, Gioacchino Noris, Panya Inversin
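Once both headsets are localized in the same anchor-aligned frame, the proximity check itself reduces to a distance test between the two pose translations. A minimal sketch, with an assumed 1-metre default threshold and 4x4 homogeneous pose matrices (both are illustrative choices, not from the application):

```python
import numpy as np

def proximity_warning(pose_self, pose_other, threshold=1.0):
    """Return True if the other headset is within `threshold` metres.

    pose_self / pose_other: 4x4 homogeneous transforms giving each VR
    display device's pose in the shared, anchor-aligned world frame.
    """
    # The last column of a homogeneous transform holds the translation,
    # i.e. each device's position in the shared frame.
    p_self = pose_self[:3, 3]
    p_other = pose_other[:3, 3]
    return float(np.linalg.norm(p_other - p_self)) < threshold
```

The interesting part of the claimed method is upstream of this test: establishing the shared frame from anchor points so that the two devices' poses are comparable at all.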
-
Publication number: 20230188533
Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
Type: Application
Filed: February 9, 2023
Publication date: June 15, 2023
Applicant: Meta Platforms Technologies, LLC
Inventors: Michael James LEBEAU, Manuel Ricardo FREIRE SANTOS, Aleksejs ANPILOGOVS, Alexander SORKINE HORNUNG, Björn WANBO, Connor TREACY, Fangwei LEE, Federico RUIZ, Jonathan MALLINSON, Jonathan Richard MAYOH, Marcus TANNER, Panya INVERSIN, Sarthak RAY, Sheng SHEN, William Arthur Hugh STEPTOE, Alessia MARRA, Gioacchino NORIS, Derrick READINGER, Jeffrey Wai-King LOCK, Jeffrey WITTHUHN, Jennifer Lynn SPURLOCK, Larissa Heike LAICH, Javier Alejandro Sierra SANTOS
-
Publication number: 20230131667
Abstract: A method includes accessing an image of a physical environment of a user, the image depicting a physical input device and a physical hand of the user, determining that a contrast between the physical input device and the physical hand depicted in the image is lower than a predetermined threshold, modifying the image to increase the contrast, determining a pose of the physical input device, generating a three-dimensional model representing the physical hand of the user, generating an image mask by projecting the three-dimensional model onto an image plane, generating a cropped image depicting at least the physical hand of the user in the image, rendering, based on the perspective of the user and the pose of the physical input device, a virtual input device to represent the physical input device, and displaying the cropped image depicting at least the physical hand over the rendered virtual input device.
Type: Application
Filed: December 20, 2022
Publication date: April 27, 2023
Inventors: Adrian Brian Ratter, Alessia Marra, Yugeng He, Panya Inversin
-
Patent number: 11606364
Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
Type: Grant
Filed: October 30, 2020
Date of Patent: March 14, 2023
Assignee: Meta Platforms Technologies, LLC
Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
-
Patent number: 11582245
Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
Type: Grant
Filed: October 30, 2020
Date of Patent: February 14, 2023
Assignee: Meta Platforms Technologies, LLC
Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
-
Publication number: 20230037750
Abstract: A method includes detecting an object of interest in a real environment and depth information of the object; determining one or more anchor locations in a three-dimensional space that correspond to a position of the object in the three-dimensional space; and generating a virtual surface anchored in the three-dimensional space. The method may further determine a pose of a camera when an image is captured and determine a region in the image that corresponds to the virtual surface. The method may further determine a first viewpoint of a first eye of the user; render a first output image based on (1) the first viewpoint relative to the virtual surface and (2) the image region corresponding to the virtual surface; and display the first output image on a first display of the computing device, the first display being configured to be viewed by the first eye of the user.
Type: Application
Filed: October 24, 2022
Publication date: February 9, 2023
Inventors: Alessia Marra, Gioacchino Noris, Panya Inversin
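The step of finding "a region in the image that corresponds to the virtual surface" can be sketched as projecting the anchored surface's corners through the camera's pose and intrinsics and taking their bounding box. This is a simplified pinhole-camera illustration under assumed conventions (camera-to-world pose matrix, OpenCV-style intrinsics `K`), not the method's actual projection model:

```python
import numpy as np

def surface_image_region(surface_corners_world, cam_pose, K):
    """Project an anchored virtual surface into a camera image.

    surface_corners_world: Nx3 array of the surface's corner points.
    cam_pose: 4x4 camera-to-world transform at capture time.
    K: 3x3 camera intrinsic matrix.
    Returns (u_min, v_min, u_max, v_max), the pixel bounding box of the
    region that, as viewed from the camera, corresponds to the surface.
    """
    # World -> camera: invert the camera-to-world pose.
    T = np.linalg.inv(cam_pose)
    pts = T[:3, :3] @ surface_corners_world.T + T[:3, 3:4]  # 3xN camera coords

    # Pinhole projection and perspective divide.
    uv = K @ pts
    uv = uv[:2] / uv[2]

    u_min, v_min = uv.min(axis=1)
    u_max, v_max = uv.max(axis=1)
    return (u_min, v_min, u_max, v_max)
```

Per-eye rendering would then sample this image region onto the surface as seen from each eye's viewpoint; that resampling is omitted here.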
-
Patent number: 11537258
Abstract: In one embodiment, a method includes a computer system accessing an image of a physical environment of a user, the image being associated with a perspective of the user and depicting a physical input device and a physical hand of the user, determining a pose of the physical input device, generating a three-dimensional model representing the physical hand of the user, generating an image mask by projecting the three-dimensional model onto an image plane associated with the perspective of the user, generating, by applying the image mask to the image, a cropped image depicting at least the physical hand of the user in the image, rendering, based on the perspective of the user and the pose of the physical input device, a virtual input device to represent the physical input device, and displaying the cropped image depicting at least the physical hand over the rendered virtual input device.
Type: Grant
Filed: October 16, 2020
Date of Patent: December 27, 2022
Assignee: Meta Platforms Technologies, LLC
Inventors: Adrian Brian Ratter, Alessia Marra, Yugeng He, Panya Inversin
-
Publication number: 20220358715
Abstract: In one embodiment, a method includes displaying, for one or more displays of a VR display device, a first output image comprising a passthrough view of a real-world environment. The method includes identifying, using one or more images captured by one or more cameras of the VR display device, a real-world object in the real-world environment. The method includes receiving a user input indicating a first dimension corresponding to the real-world object. The method includes automatically determining, based on the first dimension, a second and third dimension corresponding to the real-world object. The method includes rendering, for the one or more displays of the VR display device, a second output image of a VR environment. The VR environment includes an MR object that corresponds to the real-world object. The MR object is defined by the determined first, second, and third dimensions.
Type: Application
Filed: June 30, 2022
Publication date: November 10, 2022
Inventors: Christopher Richard Tanner, Amir Mesguich Havilio, Michelle Pujals, Gioacchino Noris, Alessia Marra, Nicholas Wallen
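One simple way to "automatically determine" the second and third dimensions from a single user-supplied dimension is to scale a template proportion triple for the recognized object class. This is purely an illustrative assumption about how such an inference could work; the function name and the desk-like width:depth:height ratios are invented for the example:

```python
def infer_dimensions(first_dim, proportions=(1.0, 0.5, 0.45)):
    """Derive three object dimensions from one measured dimension.

    first_dim:   the user-indicated dimension (e.g. width in metres).
    proportions: assumed width : depth : height ratios for the object
                 class; the default is a hypothetical desk-like shape.
    Returns (width, depth, height), all scaled so width == first_dim.
    """
    scale = first_dim / proportions[0]
    return tuple(round(p * scale, 3) for p in proportions)
```

A production system would more likely pick the proportion template per recognized object category, or fit it from the captured camera images.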
-
Patent number: 11481960
Abstract: A method includes a computing system tracking motions performed by a hand of a user, determining one or more anchor locations in a three-dimensional space, and generating a virtual surface anchored in the three-dimensional space. An image of a real environment is captured using a camera worn by the user, and a pose of the camera when the image is captured is determined. The computing system determines a first viewpoint of a first eye of the user and a region in the image that, as viewed from the camera, corresponds to the virtual surface. The computing system renders an output image based on (1) the first viewpoint relative to the virtual surface and (2) the image region corresponding to the virtual surface, and displays the output image on a first display of the device, the first display being configured to be viewed by the first eye of the user.
Type: Grant
Filed: December 30, 2020
Date of Patent: October 25, 2022
Assignee: Meta Platforms Technologies, LLC
Inventors: Alessia Marra, Gioacchino Noris, Panya Inversin
-
Patent number: 11417054
Abstract: In one embodiment, a method includes displaying, for one or more displays of a VR display device, a first output image comprising a passthrough view of a real-world environment. The method includes identifying, using one or more images captured by one or more cameras of the VR display device, a real-world object in the real-world environment. The method includes receiving a user input indicating a first dimension corresponding to the real-world object. The method includes automatically determining, based on the first dimension, a second and third dimension corresponding to the real-world object. The method includes rendering, for the one or more displays of the VR display device, a second output image of a VR environment. The VR environment includes an MR object that corresponds to the real-world object. The MR object is defined by the determined first, second, and third dimensions.
Type: Grant
Filed: March 17, 2021
Date of Patent: August 16, 2022
Assignee: Facebook Technologies, LLC
Inventors: Christopher Richard Tanner, Amir Mesguich Havilio, Michelle Pujals, Gioacchino Noris, Alessia Marra, Nicholas Wallen
-
Publication number: 20220207816
Abstract: A method includes a computing system tracking motions performed by a hand of a user, determining one or more anchor locations in a three-dimensional space, and generating a virtual surface anchored in the three-dimensional space. An image of a real environment is captured using a camera worn by the user, and a pose of the camera when the image is captured is determined. The computing system determines a first viewpoint of a first eye of the user and a region in the image that, as viewed from the camera, corresponds to the virtual surface. The computing system renders an output image based on (1) the first viewpoint relative to the virtual surface and (2) the image region corresponding to the virtual surface, and displays the output image on a first display of the device, the first display being configured to be viewed by the first eye of the user.
Type: Application
Filed: December 30, 2020
Publication date: June 30, 2022
Inventors: Alessia Marra, Gioacchino Noris, Panya Inversin
-
Publication number: 20220172444
Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
Type: Application
Filed: February 18, 2022
Publication date: June 2, 2022
Applicant: Facebook Technologies, LLC
Inventors: Michael James LEBEAU, Manuel Ricardo FREIRE SANTOS, Aleksejs ANPILOGOVS, Alexander SORKINE HORNUNG, Björn WANBO, Connor TREACY, Fangwei LEE, Federico RUIZ, Jonathan MALLINSON, Jonathan Richard MAYOH, Marcus TANNER, Panya INVERSIN, Sarthak RAY, Sheng SHEN, William Arthur Hugh STEPTOE, Alessia MARRA, Gioacchino NORIS, Derrick READINGER, Jeffrey Wai-King LOCK, Jeffrey WITTHUHN, Jennifer Lynn SPURLOCK, Larissa Heike LAICH, Javier Alejandro Sierra SANTOS
-
Patent number: 11315329
Abstract: In one embodiment, a method includes accessing a plurality of points, wherein each point (1) corresponds to a spatial location associated with an observed feature of a physical environment and (2) is associated with a patch representing the observed feature, determining a density associated with each of the plurality of points based on the spatial locations of the plurality of points, scaling the patch associated with each of the plurality of points based on the density associated with the point, and reconstructing a scene of the physical environment based on at least the scaled patches.
Type: Grant
Filed: February 25, 2020
Date of Patent: April 26, 2022
Assignee: Facebook Technologies, LLC
Inventors: Alexander Sorkine Hornung, Alessia Marra, Fabian Langguth, Matthew James Alderman
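The density-based patch scaling could be approximated as follows: estimate each point's local density by counting neighbours within a radius, then shrink patches where points are dense (to limit overlap) and grow them where points are sparse (to avoid holes). The radius, base scale, inverse-square-root law, and the brute-force O(n²) neighbour search are all illustrative assumptions, not the patented scheme:

```python
import numpy as np

def patch_scales(points, radius=0.1, base_scale=0.05):
    """Compute a per-point patch scale from local point density.

    points: Nx3 array of feature point locations.
    Returns an array of N scales: smaller where points are dense,
    larger where they are sparse.
    """
    # Pairwise distances (fine for small N; a k-d tree would be used
    # for real point clouds).
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)

    # Neighbour count within `radius` as a density proxy
    # (each point counts itself, so density >= 1).
    density = (dists < radius).sum(axis=1)

    # Shrink patches with the square root of density so that patch
    # area scales roughly inversely with how crowded the region is.
    return base_scale / np.sqrt(density)
```

The scaled patches would then be splatted or meshed to reconstruct the scene; that stage is beyond this sketch.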
-
Publication number: 20220121343
Abstract: In one embodiment, a method includes a computer system accessing an image of a physical environment of a user, the image being associated with a perspective of the user and depicting a physical input device and a physical hand of the user, determining a pose of the physical input device, generating a three-dimensional model representing the physical hand of the user, generating an image mask by projecting the three-dimensional model onto an image plane associated with the perspective of the user, generating, by applying the image mask to the image, a cropped image depicting at least the physical hand of the user in the image, rendering, based on the perspective of the user and the pose of the physical input device, a virtual input device to represent the physical input device, and displaying the cropped image depicting at least the physical hand over the rendered virtual input device.
Type: Application
Filed: October 16, 2020
Publication date: April 21, 2022
Inventors: Adrian Brian Ratter, Alessia Marra, Yugeng He, Panya Inversin
-
Patent number: 11302085
Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
Type: Grant
Filed: October 30, 2020
Date of Patent: April 12, 2022
Assignee: Facebook Technologies, LLC
Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
-
Publication number: 20220086205
Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
Type: Application
Filed: October 30, 2020
Publication date: March 17, 2022
Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
-
Publication number: 20220084288
Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
Type: Application
Filed: October 30, 2020
Publication date: March 17, 2022
Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos