Patents by Inventor Alessia Marra

Alessia Marra has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11914836
    Abstract: A method includes accessing an image of a physical environment of a user, the image depicting a physical input device and a physical hand of the user, determining that a contrast between the physical input device and the physical hand depicted in the image is lower than a predetermined threshold, modifying the image to increase the contrast, determining a pose of the physical input device, generating a three-dimensional model representing the physical hand of the user, generating an image mask by projecting the three-dimensional model onto an image plane, generating, by applying the image mask to the image, a cropped image depicting at least the physical hand of the user in the image, rendering, based on the perspective of the user and the pose of the physical input device, a virtual input device to represent the physical input device, and displaying the cropped image depicting at least the physical hand over the rendered virtual input device.
    Type: Grant
    Filed: December 20, 2022
    Date of Patent: February 27, 2024
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Adrian Brian Ratter, Alessia Marra, Yugeng He, Panya Inversin
  • Patent number: 11902288
    Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas to XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto population of users and content items into the artificial reality working environment.
    Type: Grant
    Filed: February 9, 2023
    Date of Patent: February 13, 2024
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
  • Patent number: 11770384
    Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas to XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto population of users and content items into the artificial reality working environment.
    Type: Grant
    Filed: February 18, 2022
    Date of Patent: September 26, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
  • Publication number: 20230259194
    Abstract: In one embodiment, a method includes capturing, by a first VR display device, one or more frames of a shared real-world environment. The first VR display device identifies one or more anchor points within the shared real-world environment from the one or more frames. The first VR display device receives localization information with respect to a second VR display device in the shared real-world environment and determines a pose of the first VR display device with respect to the second VR display device based on the localization information. A first output image is rendered for one or more displays of the first VR display device. The rendered image may comprise a proximity warning with respect to the second VR display device based on determining that the pose of the first VR display device with respect to the second VR display device is within a threshold distance.
    Type: Application
    Filed: February 16, 2022
    Publication date: August 17, 2023
    Inventors: David Frederick Geisert, Alessia Marra, Gioacchino Noris, Panya Inversin
  • Publication number: 20230188533
    Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas to XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto population of users and content items into the artificial reality working environment.
    Type: Application
    Filed: February 9, 2023
    Publication date: June 15, 2023
    Applicant: Meta Platforms Technologies, LLC
    Inventors: Michael James LEBEAU, Manuel Ricardo FREIRE SANTOS, Aleksejs ANPILOGOVS, Alexander SORKINE HORNUNG, Björn WANBO, Connor TREACY, Fangwei LEE, Federico RUIZ, Jonathan MALLINSON, Jonathan Richard MAYOH, Marcus TANNER, Panya INVERSIN, Sarthak RAY, Sheng SHEN, William Arthur Hugh STEPTOE, Alessia MARRA, Gioacchino NORIS, Derrick READINGER, Jeffrey Wai-King LOCK, Jeffrey WITTHUHN, Jennifer Lynn SPURLOCK, Larissa Heike LAICH, Javier Alejandro Sierra SANTOS
  • Publication number: 20230131667
    Abstract: A method includes accessing an image of a physical environment of a user, the image depicting a physical input device and a physical hand of the user, determining that a contrast between the physical input device and the physical hand depicted in the image is lower than a predetermined threshold, modifying the image to increase the contrast, determining a pose of the physical input device, generating a three-dimensional model representing the physical hand of the user, generating an image mask by projecting the three-dimensional model onto an image plane, generating, by applying the image mask to the image, a cropped image depicting at least the physical hand of the user in the image, rendering, based on the perspective of the user and the pose of the physical input device, a virtual input device to represent the physical input device, and displaying the cropped image depicting at least the physical hand over the rendered virtual input device.
    Type: Application
    Filed: December 20, 2022
    Publication date: April 27, 2023
    Inventors: Adrian Brian Ratter, Alessia Marra, Yugeng He, Panya Inversin
  • Patent number: 11606364
    Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas to XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto population of users and content items into the artificial reality working environment.
    Type: Grant
    Filed: October 30, 2020
    Date of Patent: March 14, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
  • Patent number: 11582245
    Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas to XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto population of users and content items into the artificial reality working environment.
    Type: Grant
    Filed: October 30, 2020
    Date of Patent: February 14, 2023
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
  • Publication number: 20230037750
    Abstract: A method includes detecting an object of interest in a real environment and depth information of the object; determining one or more anchor locations in a three-dimensional space that correspond to a position of the object in the three-dimensional space; and generating a virtual surface anchored in the three-dimensional space. The method may further determine a pose of a camera when an image is captured and determine a region in the image that corresponds to the virtual surface. The method may further determine a first viewpoint of a first eye of the user; render a first output image based on (1) the first viewpoint relative to the virtual surface and (2) the image region corresponding to the virtual surface; and display the first output image on a first display of the computing device, the first display being configured to be viewed by the first eye of the user.
    Type: Application
    Filed: October 24, 2022
    Publication date: February 9, 2023
    Inventors: Alessia Marra, Gioacchino Noris, Panya Inversin
  • Patent number: 11537258
    Abstract: In one embodiment, a method includes a computer system accessing an image of a physical environment of a user, the image being associated with a perspective of the user and depicting a physical input device and a physical hand of the user, determining a pose of the physical input device, generating a three-dimensional model representing the physical hand of the user, generating an image mask by projecting the three-dimensional model onto an image plane associated with the perspective of the user, generating, by applying the image mask to the image, a cropped image depicting at least the physical hand of the user in the image, rendering, based on the perspective of the user and the pose of the physical input device, a virtual input device to represent the physical input device, and displaying the cropped image depicting at least the physical hand over the rendered virtual input device.
    Type: Grant
    Filed: October 16, 2020
    Date of Patent: December 27, 2022
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Adrian Brian Ratter, Alessia Marra, Yugeng He, Panya Inversin
  • Publication number: 20220358715
    Abstract: In one embodiment, a method includes displaying, for one or more displays of a VR display device, a first output image comprising a passthrough view of a real-world environment. The method includes identifying, using one or more images captured by one or more cameras of the VR display device, a real-world object in the real-world environment. The method includes receiving a user input indicating a first dimension corresponding to the real-world object. The method includes automatically determining, based on the first dimension, a second and third dimension corresponding to the real-world object. The method includes rendering, for the one or more displays of the VR display device, a second output image of a VR environment. The VR environment includes an MR object that corresponds to the real-world object. The MR object is defined by the determined first, second, and third dimensions.
    Type: Application
    Filed: June 30, 2022
    Publication date: November 10, 2022
    Inventors: Christopher Richard Tanner, Amir Mesguich Havilio, Michelle Pujals, Gioacchino Noris, Alessia Marra, Nicholas Wallen
  • Patent number: 11481960
    Abstract: A method includes a computing system tracking motions performed by a hand of a user, determining one or more anchor locations in a three-dimensional space, and generating a virtual surface anchored in the three-dimensional space. An image of a real environment is captured using a camera worn by the user, and a pose of the camera when the image is captured is determined. The computing system determines a first viewpoint of a first eye of the user and a region in the image that, as viewed from the camera, corresponds to the virtual surface. The computing system renders an output image based on (1) the first viewpoint relative to the virtual surface and (2) the image region corresponding to the virtual surface, and displays the output image on a first display of the device, the first display being configured to be viewed by the first eye of the user.
    Type: Grant
    Filed: December 30, 2020
    Date of Patent: October 25, 2022
    Assignee: Meta Platforms Technologies, LLC
    Inventors: Alessia Marra, Gioacchino Noris, Panya Inversin
  • Patent number: 11417054
    Abstract: In one embodiment, a method includes displaying, for one or more displays of a VR display device, a first output image comprising a passthrough view of a real-world environment. The method includes identifying, using one or more images captured by one or more cameras of the VR display device, a real-world object in the real-world environment. The method includes receiving a user input indicating a first dimension corresponding to the real-world object. The method includes automatically determining, based on the first dimension, a second and third dimension corresponding to the real-world object. The method includes rendering, for the one or more displays of the VR display device, a second output image of a VR environment. The VR environment includes an MR object that corresponds to the real-world object. The MR object is defined by the determined first, second, and third dimensions.
    Type: Grant
    Filed: March 17, 2021
    Date of Patent: August 16, 2022
    Assignee: Facebook Technologies, LLC
    Inventors: Christopher Richard Tanner, Amir Mesguich Havilio, Michelle Pujals, Gioacchino Noris, Alessia Marra, Nicholas Wallen
  • Publication number: 20220207816
    Abstract: A method includes a computing system tracking motions performed by a hand of a user, determining one or more anchor locations in a three-dimensional space, and generating a virtual surface anchored in the three-dimensional space. An image of a real environment is captured using a camera worn by the user, and a pose of the camera when the image is captured is determined. The computing system determines a first viewpoint of a first eye of the user and a region in the image that, as viewed from the camera, corresponds to the virtual surface. The computing system renders an output image based on (1) the first viewpoint relative to the virtual surface and (2) the image region corresponding to the virtual surface, and displays the output image on a first display of the device, the first display being configured to be viewed by the first eye of the user.
    Type: Application
    Filed: December 30, 2020
    Publication date: June 30, 2022
    Inventors: Alessia Marra, Gioacchino Noris, Panya Inversin
  • Publication number: 20220172444
    Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas to XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto population of users and content items into the artificial reality working environment.
    Type: Application
    Filed: February 18, 2022
    Publication date: June 2, 2022
    Applicant: Facebook Technologies, LLC
    Inventors: Michael James LEBEAU, Manuel Ricardo FREIRE SANTOS, Aleksejs ANPILOGOVS, Alexander SORKINE HORNUNG, Björn WANBO, Connor TREACY, Fangwei LEE, Federico RUIZ, Jonathan MALLINSON, Jonathan Richard MAYOH, Marcus TANNER, Panya INVERSIN, Sarthak RAY, Sheng SHEN, William Arthur Hugh STEPTOE, Alessia MARRA, Gioacchino NORIS, Derrick READINGER, Jeffrey Wai-King LOCK, Jeffrey WITTHUHN, Jennifer Lynn SPURLOCK, Larissa Heike LAICH, Javier Alejandro Sierra SANTOS
  • Patent number: 11315329
    Abstract: In one embodiment, a method includes accessing a plurality of points, wherein each point (1) corresponds to a spatial location associated with an observed feature of a physical environment and (2) is associated with a patch representing the observed feature, determining a density associated with each of the plurality of points based on the spatial locations of the plurality of points, scaling the patch associated with each of the plurality of points based on the density associated with the point, and reconstructing a scene of the physical environment based on at least the scaled patches.
    Type: Grant
    Filed: February 25, 2020
    Date of Patent: April 26, 2022
    Assignee: Facebook Technologies, LLC
    Inventors: Alexander Sorkine Hornung, Alessia Marra, Fabian Langguth, Matthew James Alderman
  • Publication number: 20220121343
    Abstract: In one embodiment, a method includes a computer system accessing an image of a physical environment of a user, the image being associated with a perspective of the user and depicting a physical input device and a physical hand of the user, determining a pose of the physical input device, generating a three-dimensional model representing the physical hand of the user, generating an image mask by projecting the three-dimensional model onto an image plane associated with the perspective of the user, generating, by applying the image mask to the image, a cropped image depicting at least the physical hand of the user in the image, rendering, based on the perspective of the user and the pose of the physical input device, a virtual input device to represent the physical input device, and displaying the cropped image depicting at least the physical hand over the rendered virtual input device.
    Type: Application
    Filed: October 16, 2020
    Publication date: April 21, 2022
    Inventors: Adrian Brian Ratter, Alessia Marra, Yugeng He, Panya Inversin
  • Patent number: 11302085
    Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas to XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto population of users and content items into the artificial reality working environment.
    Type: Grant
    Filed: October 30, 2020
    Date of Patent: April 12, 2022
    Assignee: Facebook Technologies, LLC
    Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
  • Publication number: 20220086205
    Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas to XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto population of users and content items into the artificial reality working environment.
    Type: Application
    Filed: October 30, 2020
    Publication date: March 17, 2022
    Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
  • Publication number: 20220084288
    Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas to XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto population of users and content items into the artificial reality working environment.
    Type: Application
    Filed: October 30, 2020
    Publication date: March 17, 2022
    Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
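The hand-compositing abstracts above (patent numbers 11914836 and 11537258) describe projecting a three-dimensional hand model onto the image plane to build a mask, cropping the camera image with that mask, and displaying the cropped hand over a rendered virtual input device, optionally boosting contrast first. The sketch below shows those two steps in NumPy; the function names, array shapes, and the contrast heuristic are illustrative assumptions, not taken from the patents.

```python
import numpy as np

def composite_hand(camera_image, virtual_render, hand_mask):
    """Overlay the masked (real-hand) pixels of the camera image
    onto the rendered virtual input device.

    camera_image:   (H, W, 3) uint8 passthrough frame
    virtual_render: (H, W, 3) uint8 rendered virtual keyboard
    hand_mask:      (H, W) bool, True where the projected 3-D hand
                    model covers the image plane
    """
    out = virtual_render.copy()
    out[hand_mask] = camera_image[hand_mask]  # masked copy == cropped hand
    return out

def boost_contrast(image, hand_mask, threshold=30.0, gain=1.5):
    """Stand-in for the 'modify the image to increase the contrast'
    step: if the mean-brightness gap between the hand region and the
    rest of the frame is below `threshold`, stretch the whole frame
    around its mean by `gain`."""
    hand = image[hand_mask].astype(np.float32)
    rest = image[~hand_mask].astype(np.float32)
    if abs(hand.mean() - rest.mean()) < threshold:
        mid = image.astype(np.float32).mean()
        image = np.clip((image - mid) * gain + mid, 0, 255).astype(np.uint8)
    return image
```

In a real pipeline the mask would come from rasterizing the tracked hand model with the user's eye pose; here it is simply taken as input.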
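Patent 11315329 above describes determining a density for each reconstructed point from the points' spatial locations and scaling each point's patch accordingly. One way this could look is a brute-force k-nearest-neighbor density estimate with an inverse scaling rule; both the estimator and the cube-root scaling are assumptions for illustration, since the abstract does not fix a specific method.

```python
import numpy as np

def knn_density(points, k=8):
    """Estimate a density score for each 3-D point as the inverse of
    its mean distance to its k nearest neighbors (brute force, O(N^2);
    real systems would use a spatial index)."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)     # (N, N) pairwise distances
    dists.sort(axis=1)
    mean_knn = dists[:, 1:k + 1].mean(axis=1)  # column 0 is self-distance 0
    return 1.0 / np.maximum(mean_knn, 1e-9)

def scale_patches(base_radius, density):
    """Shrink patches where points are dense and grow them where
    sparse, so the scaled patches roughly cover the observed surface
    without excessive overlap."""
    return base_radius / np.cbrt(density)
```

Used together: sparse regions (low density) get larger patches, which is one plausible reading of the abstract's "scaling the patch ... based on the density associated with the point".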
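Publication 20230259194 above renders a proximity warning when the pose of one VR display device comes within a threshold distance of another's. Once both device positions are expressed in a shared, anchor-aligned coordinate frame (which the abstract's anchor points and localization information would establish), the core check reduces to a distance test; the threshold value below is an illustrative assumption.

```python
import numpy as np

def proximity_warning(pos_a, pos_b, threshold=1.0):
    """Return True when two headset positions (3-vectors in a shared,
    anchor-aligned frame) are closer than `threshold` meters, i.e.
    when the renderer should draw a proximity warning."""
    gap = np.linalg.norm(np.asarray(pos_a, float) - np.asarray(pos_b, float))
    return bool(gap < threshold)
```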
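Patent 11481960 and the related filings above describe determining the region of a captured image that, as viewed from the camera, corresponds to a virtual surface anchored in 3-D space. A minimal pinhole-projection sketch of that lookup follows; the intrinsic matrix, the corner-based surface representation, and the bounding-box simplification are assumptions for illustration, not details from the patent.

```python
import numpy as np

def project_points(K, world_to_cam, world_pts):
    """Project world-space points to pixel coordinates with a pinhole
    model: x = K [R|t] X.  `world_to_cam` is a 4x4 pose matrix,
    `K` a 3x3 intrinsic matrix."""
    pts_h = np.c_[world_pts, np.ones(len(world_pts))]   # homogeneous (N, 4)
    cam = (world_to_cam @ pts_h.T)[:3]                  # camera frame (3, N)
    px = K @ cam
    return (px[:2] / px[2]).T                           # pixel coords (N, 2)

def surface_region(K, world_to_cam, surface_corners):
    """Axis-aligned bounding box (x0, y0, x1, y1) of the anchored
    virtual surface's projection: the image region whose pixels are
    re-displayed on the surface."""
    uv = project_points(K, world_to_cam, surface_corners)
    return (*uv.min(axis=0), *uv.max(axis=0))
```

A production system would clip against the image bounds and handle corners behind the camera; this sketch assumes all corners project in front of it.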