Patents by Inventor Panya Inversin
Panya Inversin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Patent number: 11914836
  Abstract: A method includes accessing an image of a physical environment of a user, the image depicting a physical input device and a physical hand of the user, determining that a contrast between the physical input device and the physical hand depicted in the image is lower than a predetermined threshold, modifying the image to increase the contrast, determining a pose of the physical input device, generating a three-dimensional model representing the physical hand of the user, generating an image mask by projecting the three-dimensional model onto an image plane, generating a cropped image depicting at least the physical hand of the user in the image, rendering, based on the perspective of the user and the pose of the physical input device, a virtual input device to represent the physical input device, and displaying the cropped image depicting at least the physical hand over the rendered virtual input device.
  Type: Grant
  Filed: December 20, 2022
  Date of Patent: February 27, 2024
  Assignee: Meta Platforms Technologies, LLC
  Inventors: Adrian Brian Ratter, Alessia Marra, Yugeng He, Panya Inversin
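The last two steps of this abstract (crop the hand out of the camera frame with the projected mask, then overlay it on the rendered virtual input device) can be sketched in a few lines. This is a minimal illustration, not the patented implementation: the function name, the luminance-difference contrast measure, and the threshold value are all assumptions.

```python
import numpy as np

def composite_hand_over_virtual_keyboard(frame, hand_mask, virtual_kbd_img,
                                         contrast_threshold=0.15):
    """Hypothetical sketch of the claimed pipeline.

    frame:            HxWx3 float passthrough-camera image in [0, 1]
    hand_mask:        HxW boolean mask obtained by projecting the 3-D hand
                      model onto the image plane
    virtual_kbd_img:  HxWx3 float rendering of the virtual input device
    """
    # One plausible contrast measure: difference of mean luminance between
    # hand pixels and the rest of the frame (the patent does not specify one).
    lum = frame.mean(axis=2)
    hand_lum = lum[hand_mask].mean()
    bg_lum = lum[~hand_mask].mean()
    if abs(hand_lum - bg_lum) < contrast_threshold:
        # Low contrast: boost it, e.g. by stretching luminance about 0.5.
        frame = np.clip((frame - 0.5) * 1.5 + 0.5, 0.0, 1.0)

    # Crop the hand with the mask and overlay it on the rendered device.
    out = virtual_kbd_img.copy()
    out[hand_mask] = frame[hand_mask]
    return out
```

In practice the mask would come from rasterizing the tracked hand mesh with the headset's camera intrinsics; here it is taken as a given input.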
- Patent number: 11902288
  Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
  Type: Grant
  Filed: February 9, 2023
  Date of Patent: February 13, 2024
  Assignee: Meta Platforms Technologies, LLC
  Inventors: Michael James Lebeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
- Patent number: 11887249
  Abstract: A method includes receiving video data of a user, the video data comprising a first captured image and a second captured image, generating a two-dimensional planar proxy of the user, determining a pose comprising a location and orientation of the two-dimensional planar proxy within a three-dimensional virtual environment, rendering one or more display images for one or more displays of an artificial-reality device based on the two-dimensional planar proxy having the determined pose and at least one of the first and second captured images, displaying the rendered one or more display images using the one or more displays, respectively, determining that a viewing angle of the artificial-reality device relative to the two-dimensional planar proxy exceeds a predetermined maximum threshold, and based on the determination that the viewing angle exceeds the predetermined maximum threshold, ceasing to display the one or more display images.
  Type: Grant
  Filed: December 22, 2022
  Date of Patent: January 30, 2024
  Assignee: Meta Platforms Technologies, LLC
  Inventors: Alexander Sorkine Hornung, Panya Inversin
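The viewing-angle test in this abstract (stop showing the planar proxy once it is viewed too obliquely) reduces to comparing the angle between the proxy's facing direction and the direction toward the headset against a maximum. A minimal sketch follows; the function name, the 60° default, and the vector conventions are assumptions, not values from the patent.

```python
import numpy as np

def proxy_visible(device_pos, proxy_pos, proxy_normal, max_angle_deg=60.0):
    """Return False when the viewing angle of the device relative to the
    planar proxy exceeds the maximum threshold (hypothetical sketch)."""
    # Unit vector from the proxy toward the viewing device.
    to_device = np.asarray(device_pos, float) - np.asarray(proxy_pos, float)
    to_device /= np.linalg.norm(to_device)
    # Unit normal of the proxy plane (its facing direction).
    n = np.asarray(proxy_normal, float)
    n /= np.linalg.norm(n)
    # Angle between the two; a head-on view gives 0 degrees.
    cos_angle = np.clip(np.dot(to_device, n), -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle)) <= max_angle_deg
```

The point of such a threshold is that a flat billboard of a person only looks plausible near head-on; past a certain obliquity it is better to hide it than to show a visibly flat cutout.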
- Patent number: 11770384
  Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
  Type: Grant
  Filed: February 18, 2022
  Date of Patent: September 26, 2023
  Assignee: Meta Platforms Technologies, LLC
  Inventors: Michael James Lebeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Björn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
- Publication number: 20230259194
  Abstract: In one embodiment, a method includes capturing, by a first VR display device, one or more frames of a shared real-world environment. The first VR display device identifies one or more anchor points within the shared real-world environment from the one or more frames. The first VR display device receives localization information with respect to a second VR display device in the shared real-world environment and determines a pose of the first VR display device with respect to the second VR display device based on the localization information. A first output image is rendered for one or more displays of the first VR display device. The rendered image may comprise a proximity warning with respect to the second VR display device based on determining the pose of the first VR display device with respect to the second VR display device is within a threshold distance.
  Type: Application
  Filed: February 16, 2022
  Publication date: August 17, 2023
  Inventors: David Frederick Geisert, Alessia Marra, Gioacchino Noris, Panya Inversin
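Once both headsets are localized against the same anchor points, the proximity test described above is a distance comparison between their poses in that shared frame. A minimal sketch, assuming poses are (position, orientation) tuples and a hypothetical 1 m default threshold:

```python
import numpy as np

def needs_proximity_warning(pose_a, pose_b, threshold_m=1.0):
    """Hypothetical sketch: poses are (position, orientation) tuples
    expressed in the shared anchor-point frame, so a plain Euclidean
    distance between headset positions suffices for the check."""
    pa = np.asarray(pose_a[0], float)
    pb = np.asarray(pose_b[0], float)
    return np.linalg.norm(pa - pb) < threshold_m
```

The anchor-point step matters because each headset tracks in its own coordinate system; without expressing both poses in a common frame, the distance would be meaningless.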
- Publication number: 20230188533
  Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
  Type: Application
  Filed: February 9, 2023
  Publication date: June 15, 2023
  Applicant: Meta Platforms Technologies, LLC
  Inventors: Michael James LEBEAU, Manuel Ricardo FREIRE SANTOS, Aleksejs ANPILOGOVS, Alexander SORKINE HORNUNG, Björn WANBO, Connor TREACY, Fangwei LEE, Federico RUIZ, Jonathan MALLINSON, Jonathan Richard MAYOH, Marcus TANNER, Panya INVERSIN, Sarthak RAY, Sheng SHEN, William Arthur Hugh STEPTOE, Alessia MARRA, Gioacchino NORIS, Derrick READINGER, Jeffrey Wai-King LOCK, Jeffrey WITTHUHN, Jennifer Lynn SPURLOCK, Larissa Heike LAICH, Javier Alejandro Sierra SANTOS
- Publication number: 20230125961
  Abstract: A method includes receiving video data of a user, the video data comprising a first captured image and a second captured image, generating a two-dimensional planar proxy of the user, determining a pose comprising a location and orientation of the two-dimensional planar proxy within a three-dimensional virtual environment, rendering one or more display images for one or more displays of an artificial-reality device based on the two-dimensional planar proxy having the determined pose and at least one of the first and second captured images, displaying the rendered one or more display images using the one or more displays, respectively, determining that a viewing angle of the artificial-reality device relative to the two-dimensional planar proxy exceeds a predetermined maximum threshold, and based on the determination that the viewing angle exceeds the predetermined maximum threshold, ceasing to display the one or more display images.
  Type: Application
  Filed: December 22, 2022
  Publication date: April 27, 2023
  Inventors: Alexander Sorkine Hornung, Panya Inversin
- Publication number: 20230131667
  Abstract: A method includes accessing an image of a physical environment of a user, the image depicting a physical input device and a physical hand of the user, determining that a contrast between the physical input device and the physical hand depicted in the image is lower than a predetermined threshold, modifying the image to increase the contrast, determining a pose of the physical input device, generating a three-dimensional model representing the physical hand of the user, generating an image mask by projecting the three-dimensional model onto an image plane, generating a cropped image depicting at least the physical hand of the user in the image, rendering, based on the perspective of the user and the pose of the physical input device, a virtual input device to represent the physical input device, and displaying the cropped image depicting at least the physical hand over the rendered virtual input device.
  Type: Application
  Filed: December 20, 2022
  Publication date: April 27, 2023
  Inventors: Adrian Brian Ratter, Alessia Marra, Yugeng He, Panya Inversin
- Patent number: 11606364
  Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
  Type: Grant
  Filed: October 30, 2020
  Date of Patent: March 14, 2023
  Assignee: Meta Platforms Technologies, LLC
  Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Bjorn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
- Patent number: 11582245
  Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
  Type: Grant
  Filed: October 30, 2020
  Date of Patent: February 14, 2023
  Assignee: Meta Platforms Technologies, LLC
  Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Bjorn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
- Publication number: 20230037750
  Abstract: A method includes detecting an object of interest in a real environment and depth information of the object; determining one or more anchor locations in a three-dimensional space that correspond to a position of the object in the three-dimensional space; and generating a virtual surface anchored in the three-dimensional space. The method may further determine a pose of a camera when an image is captured and determine a region in the image that corresponds to the virtual surface. The method may further determine a first viewpoint of a first eye of the user; render a first output image based on (1) the first viewpoint relative to the virtual surface and (2) the image region corresponding to the virtual surface; and display the first output image on a first display of the computing device, the first display being configured to be viewed by the first eye of the user.
  Type: Application
  Filed: October 24, 2022
  Publication date: February 9, 2023
  Inventors: Alessia Marra, Gioacchino Noris, Panya Inversin
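The step of "determining a region in the image that corresponds to the virtual surface" amounts to projecting the surface's anchored 3-D corners into the captured frame using the camera pose. A minimal pinhole-projection sketch follows; the function name, the pose convention (p_cam = R · p_world + t), and the intrinsics are assumptions, since the publication does not fix a camera model.

```python
import numpy as np

def project_anchor_points(points_world, cam_R, cam_t, fx, fy, cx, cy):
    """Hypothetical sketch: map anchored 3-D surface corners into the
    captured image with a pinhole camera model.

    points_world: Nx3 corner positions in the world (anchor) frame
    cam_R, cam_t: camera pose such that p_cam = R @ p_world + t
    fx, fy, cx, cy: focal lengths and principal point in pixels
    Returns Nx2 pixel coordinates delimiting the image region that
    corresponds to the virtual surface.
    """
    # Transform corners into the camera frame, then apply perspective divide.
    p = np.asarray(points_world, float) @ np.asarray(cam_R, float).T + cam_t
    u = fx * p[:, 0] / p[:, 2] + cx
    v = fy * p[:, 1] / p[:, 2] + cy
    return np.stack([u, v], axis=1)
```

The polygon spanned by the returned pixel coordinates is the region that would then be sampled and re-rendered from each eye's viewpoint.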
- Patent number: 11537258
  Abstract: In one embodiment, a method includes a computer system accessing an image of a physical environment of a user, the image being associated with a perspective of the user and depicting a physical input device and a physical hand of the user, determining a pose of the physical input device, generating a three-dimensional model representing the physical hand of the user, generating an image mask by projecting the three-dimensional model onto an image plane associated with the perspective of the user, generating, by applying the image mask to the image, a cropped image depicting at least the physical hand of the user in the image, rendering, based on the perspective of the user and the pose of the physical input device, a virtual input device to represent the physical input device, and displaying the cropped image depicting at least the physical hand over the rendered virtual input device.
  Type: Grant
  Filed: October 16, 2020
  Date of Patent: December 27, 2022
  Assignee: Meta Platforms Technologies, LLC
  Inventors: Adrian Brian Ratter, Alessia Marra, Yugeng He, Panya Inversin
- Patent number: 11538214
  Abstract: A method includes receiving, through a network by a computing system associated with an artificial-reality device, video data of a user of a second computing system comprising a first and second image of the user. The first and second images may be captured concurrently by a first camera and a second camera of the second computing system, respectively. The computing system generates a planar proxy for displaying the user and determines a pose for the planar proxy within a three-dimensional virtual environment. The computing system renders a left image for a left-eye display and a right image for a right-eye display of the artificial-reality device based on the planar proxy having the determined pose and the first image, and the planar proxy having the determined pose and the second image, respectively. The computing system displays the rendered left image and right image using the left-eye display and right-eye display, respectively.
  Type: Grant
  Filed: November 9, 2020
  Date of Patent: December 27, 2022
  Assignee: Meta Platforms Technologies, LLC
  Inventors: Alexander Sorkine Hornung, Panya Inversin
- Patent number: 11481960
  Abstract: A method includes a computing system tracking motions performed by a hand of a user, determining one or more anchor locations in a three-dimensional space, and generating a virtual surface anchored in the three-dimensional space. An image of a real environment is captured using a camera worn by the user, and a pose of the camera when the image is captured is determined. The computing system determines a first viewpoint of a first eye of the user and a region in the image that, as viewed from the camera, corresponds to the virtual surface. The computing system renders an output image based on (1) the first viewpoint relative to the virtual surface and (2) the image region corresponding to the virtual surface, and displays the output image on a first display of the device, the first display being configured to be viewed by the first eye of the user.
  Type: Grant
  Filed: December 30, 2020
  Date of Patent: October 25, 2022
  Assignee: Meta Platforms Technologies, LLC
  Inventors: Alessia Marra, Gioacchino Noris, Panya Inversin
- Publication number: 20220207816
  Abstract: A method includes a computing system tracking motions performed by a hand of a user, determining one or more anchor locations in a three-dimensional space, and generating a virtual surface anchored in the three-dimensional space. An image of a real environment is captured using a camera worn by the user, and a pose of the camera when the image is captured is determined. The computing system determines a first viewpoint of a first eye of the user and a region in the image that, as viewed from the camera, corresponds to the virtual surface. The computing system renders an output image based on (1) the first viewpoint relative to the virtual surface and (2) the image region corresponding to the virtual surface, and displays the output image on a first display of the device, the first display being configured to be viewed by the first eye of the user.
  Type: Application
  Filed: December 30, 2020
  Publication date: June 30, 2022
  Inventors: Alessia Marra, Gioacchino Noris, Panya Inversin
- Publication number: 20220172444
  Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
  Type: Application
  Filed: February 18, 2022
  Publication date: June 2, 2022
  Applicant: Facebook Technologies, LLC
  Inventors: Michael James LEBEAU, Manuel Ricardo FREIRE SANTOS, Aleksejs ANPILOGOVS, Alexander SORKINE HORNUNG, Björn WANBO, Connor TREACY, Fangwei LEE, Federico RUIZ, Jonathan MALLINSON, Jonathan Richard MAYOH, Marcus TANNER, Panya INVERSIN, Sarthak RAY, Sheng SHEN, William Arthur Hugh STEPTOE, Alessia MARRA, Gioacchino NORIS, Derrick READINGER, Jeffrey Wai-King LOCK, Jeffrey WITTHUHN, Jennifer Lynn SPURLOCK, Larissa Heike LAICH, Javier Alejandro Sierra SANTOS
- Publication number: 20220148254
  Abstract: A method includes receiving, through a network by a computing system associated with an artificial-reality device, video data of a user of a second computing system comprising a first and second image of the user. The first and second images may be captured concurrently by a first camera and a second camera of the second computing system, respectively. The computing system generates a planar proxy for displaying the user and determines a pose for the planar proxy within a three-dimensional virtual environment. The computing system renders a left image for a left-eye display and a right image for a right-eye display of the artificial-reality device based on the planar proxy having the determined pose and the first image, and the planar proxy having the determined pose and the second image, respectively. The computing system displays the rendered left image and right image using the left-eye display and right-eye display, respectively.
  Type: Application
  Filed: November 9, 2020
  Publication date: May 12, 2022
  Inventors: Alexander Sorkine Hornung, Panya Inversin
- Publication number: 20220121343
  Abstract: In one embodiment, a method includes a computer system accessing an image of a physical environment of a user, the image being associated with a perspective of the user and depicting a physical input device and a physical hand of the user, determining a pose of the physical input device, generating a three-dimensional model representing the physical hand of the user, generating an image mask by projecting the three-dimensional model onto an image plane associated with the perspective of the user, generating, by applying the image mask to the image, a cropped image depicting at least the physical hand of the user in the image, rendering, based on the perspective of the user and the pose of the physical input device, a virtual input device to represent the physical input device, and displaying the cropped image depicting at least the physical hand over the rendered virtual input device.
  Type: Application
  Filed: October 16, 2020
  Publication date: April 21, 2022
  Inventors: Adrian Brian Ratter, Alessia Marra, Yugeng He, Panya Inversin
- Patent number: 11302085
  Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
  Type: Grant
  Filed: October 30, 2020
  Date of Patent: April 12, 2022
  Assignee: Facebook Technologies, LLC
  Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Bjorn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos
- Publication number: 20220086205
  Abstract: Aspects of the present disclosure are directed to creating and administering artificial reality collaborative working environments and providing interaction modes for them. An XR work system can provide and control such artificial reality collaborative working environments to enable, for example, A) links between real-world surfaces and XR surfaces; B) links between multiple real-world areas and XR areas with dedicated functionality; C) maintaining access, while inside the artificial reality working environment, to real-world work tools such as the user's computer screen and keyboard; D) various hand and controller modes for different interaction and collaboration modalities; E) use-based, multi-desk collaborative room configurations; and F) context-based auto-population of users and content items into the artificial reality working environment.
  Type: Application
  Filed: October 30, 2020
  Publication date: March 17, 2022
  Inventors: Michael James LeBeau, Manuel Ricardo Freire Santos, Aleksejs Anpilogovs, Alexander Sorkine Hornung, Bjorn Wanbo, Connor Treacy, Fangwei Lee, Federico Ruiz, Jonathan Mallinson, Jonathan Richard Mayoh, Marcus Tanner, Panya Inversin, Sarthak Ray, Sheng Shen, William Arthur Hugh Steptoe, Alessia Marra, Gioacchino Noris, Derrick Readinger, Jeffrey Wai-King Lock, Jeffrey Witthuhn, Jennifer Lynn Spurlock, Larissa Heike Laich, Javier Alejandro Sierra Santos