Patents by Inventor Charlene Mary ATLAS
Charlene Mary ATLAS has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11887263
Abstract: In one embodiment, a computing device may determine a virtual content to be displayed with a scene of a real-world environment. The device may generate an image depicting the virtual content. Using one or more sensors, the device may detect characteristics of the scene of the real-world environment. Based on the image and the characteristics of the scene, the device may determine that a visual enhancement is to be applied to the virtual content depicted in the image to enhance a contrast between the depicted virtual content and the scene. The device may generate a visually-enhanced image depicting the virtual content by applying the visual enhancement to the virtual content depicted in the image. The device may display the visually-enhanced image of the virtual content on a display of the computing device, wherein the scene of the real-world environment is visible through the display.
Type: Grant
Filed: July 8, 2022
Date of Patent: January 30, 2024
Assignee: Meta Platforms Technologies, LLC
Inventors: Charlene Mary Atlas, Romain Bachy, Kevin James MacKenzie, Nathan Matsuda, Thomas Scott Murdison, Ocean Quigley, Jasmine Soria Sears
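The contrast-enhancement idea in this abstract can be illustrated with a minimal sketch. This is not the patented implementation: the luminance model, the WCAG-style 4.5:1 threshold, and the brighten/darken fallback are all assumptions made here for illustration.

```python
# Illustrative sketch (assumptions, not the patent's method): decide whether
# virtual content needs a visual enhancement to stay legible against the
# real-world scene, using a simple luminance-contrast heuristic.

def contrast_ratio(l1: float, l2: float) -> float:
    """WCAG-style contrast ratio between two relative luminances in [0, 1]."""
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

def enhance_if_needed(virtual_lum: float, scene_lum: float,
                      threshold: float = 4.5) -> float:
    """Return an adjusted virtual-content luminance that meets the threshold."""
    if contrast_ratio(virtual_lum, scene_lum) >= threshold:
        return virtual_lum  # already legible, leave untouched
    needed_light = threshold * (scene_lum + 0.05) - 0.05
    if needed_light <= 1.0:
        return needed_light  # brighten the virtual content
    # Brightening alone cannot reach the ratio; darken it instead.
    return max((scene_lum + 0.05) / threshold - 0.05, 0.0)

bright_scene = 0.8   # e.g. a sunlit wall behind the overlay
dim_virtual = 0.3    # virtual content as originally rendered
adjusted = enhance_if_needed(dim_virtual, bright_scene)
print(adjusted)      # darker value, restoring a 4.5:1 ratio
```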
-
Publication number: 20240012243
Abstract: In some implementations, the disclosed systems and methods can include one or more tinting elements (e.g., flip-down blacked-out lenses, a blacked-out slider, or a blacked-out removable cover) configured to cover the lenses of the MR glasses. In some implementations, the disclosed systems and methods can be coupled to one or more micro electrical motors configured to drive the clear fluid into and out of the pairs of clear flexible membranes, in order to make the focus tunable lenses concave to correct myopia, or convex to correct hyperopia or presbyopia. In some implementations, the disclosed systems and methods can be directed to online calibration of headset proximity sensors to mitigate after factory sensor drift and prevent automatic OFF and ON system failures.
Type: Application
Filed: August 24, 2023
Publication date: January 11, 2024
Applicant: Meta Platforms Technologies, LLC
Inventors: Charlene Mary ATLAS, Nadine Sharon ANGLIN, Dong YANG, Jianjun JU, Chih-Chen SUN, Jian ZHANG, Wanli WU
-
Publication number: 20230281929
Abstract: The present disclosure relates to systems, methods, and non-transitory computer-readable media that initiate communication between users of a networking system within an extended reality environment. For example, the disclosed systems can generate an extended-reality lobby window graphical user interface element for display on an extended-reality device of a user. The disclosed systems can further determine a connection between the user and a co-user and provide an animated visual representation of the co-user for display within the extended-reality lobby window graphical user interface element. In response to receiving user input targeting the animated visual representation of the co-user, the disclosed systems can generate and send, for display on an extended-reality device of the co-user, an invitation to join an extended-reality communication session with the user.
Type: Application
Filed: September 13, 2022
Publication date: September 7, 2023
Inventors: Charlene Mary Atlas, Mark Terrano
-
Patent number: 11475634
Abstract: The present disclosure relates to systems, methods, and non-transitory computer-readable media that initiate communication between users of a networking system within an extended reality environment. For example, the disclosed systems can generate an extended-reality lobby window graphical user interface element for display on an extended-reality device of a user. The disclosed systems can further determine a connection between the user and a co-user and provide an animated visual representation of the co-user for display within the extended-reality lobby window graphical user interface element. In response to receiving user input targeting the animated visual representation of the co-user, the disclosed systems can generate and send, for display on an extended-reality device of the co-user, an invitation to join an extended-reality communication session with the user.
Type: Grant
Filed: July 2, 2020
Date of Patent: October 18, 2022
Assignee: Meta Platforms Technologies, LLC
Inventors: Charlene Mary Atlas, Mark Terrano
-
Patent number: 11423621
Abstract: In one embodiment, a computing device may determine a virtual content to be displayed with a scene of a real-world environment. The device may generate an image depicting the virtual content. Using one or more sensors, the device may detect characteristics of the scene of the real-world environment. Based on the image and the characteristics of the scene, the device may determine that a visual enhancement is to be applied to the virtual content depicted in the image to enhance a contrast between the depicted virtual content and the scene. The device may generate a visually-enhanced image depicting the virtual content by applying the visual enhancement to the virtual content depicted in the image. The device may display the visually-enhanced image of the virtual content on a display of the computing device, wherein the scene of the real-world environment is visible through the display.
Type: Grant
Filed: May 21, 2020
Date of Patent: August 23, 2022
Assignee: Facebook Technologies, LLC
Inventors: Charlene Mary Atlas, Romain Bachy, Kevin James MacKenzie, Nathan Matsuda, Thomas Scott Murdison, Ocean Quigley, Jasmine Soria Sears
-
Publication number: 20220230352
Abstract: One embodiment is directed to controlling a computing system based on an interpreted user intention. Another embodiment is directed to generating a smoothed position of a feature based upon detected and reprojected positions of the feature. Another embodiment is directed to performing one or more image treatments on a facial region of a user until the perceived SQS satisfies the predetermined target SQS. Another embodiment is directed to video conferencing monitoring the quality of video feed coming from the participants of the video conferencing and creating an image or video from the feed when that participant's feed is good and replacing the live video with the newly created good quality image or video when the feed is bad. Another embodiment is directed to a process of baked triplanar projection using triangles generated from a tessellation, where the baked triplanar projection can generate a 2D mesh including UV coordinates.
Type: Application
Filed: February 4, 2022
Publication date: July 21, 2022
Inventors: Mahdi Salmani Rahimi, Yu Mao, Zhiqing Rao, Charlene Mary Atlas, Jasmine Soria Sears, Ocean Quigley, Romain Bachy, Yehezkel Shraga Resheff, Michal Rosen, Michael Bunnell, Bret Hobbs
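The "smoothed position from detected and reprojected positions" embodiment can be sketched as a simple blend. This is an assumption about one plausible reading, not the filed algorithm: here the reprojected position is approximated by the previous smoothed estimate, and the blend weight is arbitrary.

```python
# Illustrative sketch (assumption, not the publication's algorithm): blend a
# freshly detected feature position with a reprojected prediction to
# suppress per-frame jitter.

def smooth(detected, reprojected, alpha=0.3):
    """Exponential blend: alpha weights the new detection."""
    return tuple(alpha * d + (1 - alpha) * r
                 for d, r in zip(detected, reprojected))

estimate = (0.0, 0.0, 1.0)  # last frame's smoothed position
detections = [(0.02, -0.01, 1.01), (0.01, 0.02, 0.99), (-0.01, 0.0, 1.0)]
for det in detections:
    # Here the reprojected position is simply the previous estimate.
    estimate = smooth(det, estimate)
print(estimate)  # stays close to (0, 0, 1) despite noisy detections
```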
-
Publication number: 20220005275
Abstract: The present disclosure relates to systems, methods, and non-transitory computer-readable media that initiate communication between users of a networking system within an extended reality environment. For example, the disclosed systems can generate an extended-reality lobby window graphical user interface element for display on an extended-reality device of a user. The disclosed systems can further determine a connection between the user and a co-user and provide an animated visual representation of the co-user for display within the extended-reality lobby window graphical user interface element. In response to receiving user input targeting the animated visual representation of the co-user, the disclosed systems can generate and send, for display on an extended-reality device of the co-user, an invitation to join an extended-reality communication session with the user.
Type: Application
Filed: July 2, 2020
Publication date: January 6, 2022
Inventors: Charlene Mary Atlas, Mark Terrano
-
Publication number: 20210327156
Abstract: This disclosure describes an artificial reality system that presents artificial reality content in the context of a physical environment that includes a mirror or other reflective surface. In one example, this disclosure describes a method that includes capturing capture data representative of a physical environment, wherein the physical environment includes a reflective surface and a plurality of objects, determining a pose of the HMD, determining a map of the physical environment, wherein the map includes position information about the reflective surface and position information about each of the plurality of physical objects in the physical environment, identifying a visible object from among the plurality of physical objects, and generating artificial reality content associated with the visible object.
Type: Application
Filed: July 1, 2021
Publication date: October 21, 2021
Inventors: Chad Austin Bramwell, Caryn Vainio, Charlene Mary Atlas, Mark Terrano
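A core geometric step in any mirror-aware AR system of this kind is reflecting positions across the mirror plane, so content rendered "in" the mirror lines up with the reflection. The sketch below shows only that standard reflection math; the plane parameters and positions are invented for illustration and nothing here comes from the patent's claims.

```python
# Illustrative sketch (assumption): reflect a 3-D point across a mirror
# plane, given a point on the plane and its unit normal. A mirror-aware AR
# renderer could use this to place virtual content seen in the reflection.

def reflect(point, plane_point, plane_normal):
    """Reflect a 3-D point across a plane (normal assumed unit length)."""
    # Signed distance from the point to the plane along the normal.
    d = sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))
    return tuple(p - 2 * d * n for p, n in zip(point, plane_normal))

mirror_origin = (0.0, 0.0, 0.0)
mirror_normal = (0.0, 0.0, 1.0)   # mirror lying in the z = 0 plane
hmd_position = (0.2, 1.6, 1.5)    # user standing 1.5 m in front of it
print(reflect(hmd_position, mirror_origin, mirror_normal))
# -> the mirrored position, 1.5 m "behind" the glass
```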
-
Patent number: 11145126
Abstract: This disclosure describes an artificial reality system that presents artificial reality content in the context of a physical environment that includes a mirror or other reflective surface. In one example, this disclosure describes a method that includes capturing capture data representative of a physical environment, wherein the physical environment includes a reflective surface and a plurality of objects, determining a pose of the HMD, determining a map of the physical environment, wherein the map includes position information about the reflective surface and position information about each of the plurality of physical objects in the physical environment, identifying a visible object from among the plurality of physical objects, and generating artificial reality content associated with the visible object.
Type: Grant
Filed: June 27, 2019
Date of Patent: October 12, 2021
Assignee: Facebook Technologies, LLC
Inventors: Chad Austin Bramwell, Caryn Vainio, Charlene Mary Atlas, Mark Terrano
-
Patent number: 11055920
Abstract: This disclosure describes an artificial reality system that presents artificial reality content in the context of a physical environment that includes a mirror or other reflective surface. In one example, this disclosure describes a method that includes capturing capture data representative of a physical environment, wherein the physical environment includes a reflective surface and a plurality of objects, determining a pose of the HMD, determining a map of the physical environment, wherein the map includes position information about the reflective surface and position information about each of the plurality of physical objects in the physical environment, identifying a visible object from among the plurality of physical objects, and generating artificial reality content associated with the visible object.
Type: Grant
Filed: June 27, 2019
Date of Patent: July 6, 2021
Assignee: Facebook Technologies, LLC
Inventors: Chad Austin Bramwell, Caryn Vainio, Charlene Mary Atlas, Mark Terrano
-
Patent number: 11036987
Abstract: This disclosure describes an artificial reality system that presents artificial reality content in the context of a physical environment that includes a mirror or other reflective surface. In one example, this disclosure describes a method that includes capturing capture data representative of a physical environment, wherein the physical environment includes a reflective surface and a plurality of objects, determining a pose of the HMD, determining a map of the physical environment, wherein the map includes position information about the reflective surface and position information about each of the plurality of physical objects in the physical environment, identifying a visible object from among the plurality of physical objects, and generating artificial reality content associated with the visible object.
Type: Grant
Filed: June 27, 2019
Date of Patent: June 15, 2021
Assignee: Facebook Technologies, LLC
Inventors: Chad Austin Bramwell, Caryn Vainio, Charlene Mary Atlas, Mark Terrano
-
Patent number: 11023035
Abstract: In general, the disclosure describes artificial reality (AR) systems and techniques for generating and presenting virtual surfaces within an artificial reality environment and for facilitating user interaction with the virtual surfaces using a physical peripheral device. For example, AR systems are described that generate and render virtual surfaces, such as a virtual pinboard or a virtual drawing surface (e.g., a virtual canvas) in an artificial reality environment, for display to a user. The AR systems enable the user to interact with the virtual surfaces using a physical peripheral device, which may be manipulated and otherwise interacted with by the user to provide input to an AR system through pose tracking of the peripheral device and/or via one or more input devices of the peripheral device, such as a presence-sensitive surface.
Type: Grant
Filed: July 9, 2019
Date of Patent: June 1, 2021
Assignee: Facebook Technologies, LLC
Inventors: Charlene Mary Atlas, Chad Austin Bramwell, Mark Terrano, Caryn Vainio
-
Patent number: 11023036
Abstract: In some examples, a virtual surface presented by an artificial reality (AR) system may be a virtual drawing surface in an artificial reality environment, with which users may interact using a physical peripheral device. For example, the AR system may enable a user to draw or write on a surface of the peripheral device and may simultaneously, i.e., along with the user inputs, render virtual markings corresponding to the user inputs on a virtual drawing surface locked to the surface of the peripheral device. In some cases, the AR system enables the user to "transfer" the virtual markings rendered on the surface of the peripheral device (e.g., a virtual drawing) to another virtual drawing surface (e.g., to a planar or other surface in the artificial reality environment).
Type: Grant
Filed: July 9, 2019
Date of Patent: June 1, 2021
Assignee: Facebook Technologies, LLC
Inventors: Charlene Mary Atlas, Chad Austin Bramwell, Mark Terrano, Caryn Vainio
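The "transfer" of virtual markings from the peripheral device to another surface can be pictured as a coordinate re-anchoring. The sketch below is an assumption about one simple way to do this (strokes stored in normalized surface coordinates, then scaled onto a target surface); it is not drawn from the patent's claims.

```python
# Illustrative sketch (assumption, not the patented method): strokes drawn
# on the peripheral device are kept as normalized (u, v) coordinates, so
# transferring them to another virtual surface is a re-anchor plus scale.

def transfer(strokes, target_origin, target_size):
    """Map normalized (u, v) stroke points onto a target surface."""
    ox, oy = target_origin
    w, h = target_size
    return [[(ox + u * w, oy + v * h) for (u, v) in stroke]
            for stroke in strokes]

tablet_strokes = [[(0.1, 0.1), (0.5, 0.5)]]  # drawn on the handheld surface
wall = transfer(tablet_strokes,
                target_origin=(2.0, 1.0),    # wall canvas corner (meters)
                target_size=(1.5, 1.0))      # wall canvas extent (meters)
print(wall)  # the same stroke, re-anchored to the wall canvas
```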
-
Patent number: 10984242
Abstract: An artificial reality system includes a head mounted display (HMD) and a compass generator that generates a proximity compass used to locate virtual or physical elements within an artificial reality environment. The artificial reality system can render a proximity compass around or in proximity to a virtual device, a virtual or physical object, or at a designated location. The proximity compass includes graphical elements that can represent virtual experiences (applications, games, utilities etc.), physical or virtual objects, and physical or virtual locations within the artificial reality environment. The positioning of a graphical element within the proximity compass represents a direction of the associated experience, object or location with respect to the HMD. The graphical elements within the proximity compass may rotate around a virtual hand-held device when the user changes the orientation of the hand-held device.
Type: Grant
Filed: September 5, 2019
Date of Patent: April 20, 2021
Assignee: Facebook Technologies, LLC
Inventors: Charlene Mary Atlas, Chad Austin Bramwell, Mark Terrano, Caryn Vainio
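The geometric heart of a proximity compass is computing each target's bearing relative to the HMD so its graphical element can be placed at the matching angle. The sketch below shows that bearing computation only; the coordinate convention (y up, yaw about y, 0° = straight ahead) and all positions are assumptions for illustration.

```python
import math

# Illustrative sketch (assumption): a compass element's angular position is
# the bearing of its target relative to the HMD's forward direction,
# computed in the horizontal plane (y is up).

def compass_bearing(hmd_pos, hmd_yaw_deg, target_pos):
    """Bearing in degrees: 0 = straight ahead, positive = to the right."""
    dx = target_pos[0] - hmd_pos[0]
    dz = target_pos[2] - hmd_pos[2]
    absolute = math.degrees(math.atan2(dx, dz))  # target's world-space yaw
    # Wrap the relative bearing into (-180, 180].
    return (absolute - hmd_yaw_deg + 180.0) % 360.0 - 180.0

hmd = (0.0, 1.6, 0.0)
print(compass_bearing(hmd, 0.0, (0.0, 1.0, 3.0)))  # target directly ahead
print(compass_bearing(hmd, 0.0, (2.0, 1.0, 0.0)))  # target to the right
```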
-
Patent number: 10976804
Abstract: In some examples, a method comprises: obtaining, by an artificial reality system including a head mounted display (HMD), image data via an image capture device, the HMD configured to output artificial reality content; detecting, by the artificial reality system, a physical peripheral device from the image data; detecting, by the artificial reality system, a pose of the peripheral device; generating, by the artificial reality system, based on the pose of the peripheral device, a virtual pointer along at least a portion of a line between the peripheral device and a virtual surface, the virtual pointer pointing at a location of the virtual surface; performing, by the artificial reality system, one or more actions based on the location of the virtual surface; and rendering, by the artificial reality system, the virtual pointer and the virtual surface for display at the HMD.
Type: Grant
Filed: July 9, 2019
Date of Patent: April 13, 2021
Assignee: Facebook Technologies, LLC
Inventors: Charlene Mary Atlas, Chad Austin Bramwell, Mark Terrano, Caryn Vainio, Xiaoyu Liu
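Finding where the virtual pointer meets the virtual surface reduces to a ray/plane intersection from the peripheral device's pose. The sketch below shows that standard intersection test; the poses, plane, and tolerance are assumptions for illustration, not details from the patent.

```python
# Illustrative sketch (assumption): cast a ray from the peripheral device
# along its forward axis and intersect it with a planar virtual surface to
# find the location the virtual pointer indicates.

def ray_plane_hit(origin, direction, plane_point, plane_normal):
    """Return the intersection point, or None if parallel or behind."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the surface
    t = sum((p - o) * n for p, o, n in
            zip(plane_point, origin, plane_normal)) / denom
    if t < 0:
        return None  # surface is behind the device
    return tuple(o + t * d for o, d in zip(origin, direction))

device_pos = (0.0, 1.2, 0.0)   # handheld device position
device_fwd = (0.0, 0.0, 1.0)   # its forward axis (unit vector)
hit = ray_plane_hit(device_pos, device_fwd,
                    (0.0, 0.0, 2.0), (0.0, 0.0, -1.0))  # wall 2 m ahead
print(hit)  # where the virtual pointer lands on the surface
```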
-
Publication number: 20210011556
Abstract: In general, the disclosure describes artificial reality systems and techniques for generating and presenting a virtual user interface with which users may interact using a physical peripheral device. In some examples, an artificial reality system includes an image capture device configured to capture image data; a head-mounted display (HMD) configured to output artificial reality content; a user interface engine configured to detect a peripheral device from the image data, wherein the user interface engine is configured to generate a virtual user interface comprising one or more virtual user interface elements; and a rendering engine configured to render the artificial reality content and to render, at a user interface position locked relative to a position of the peripheral device in an artificial reality environment, the virtual user interface for display at the HMD.
Type: Application
Filed: July 9, 2019
Publication date: January 14, 2021
Inventors: Charlene Mary Atlas, Chad Austin Bramwell, Mark Terrano, Caryn Vainio
-
Publication number: 20200019242
Abstract: Examples are disclosed that relate to evoking an emotion and/or other expression of an avatar via a gesture and/or posture sensed by a wearable device. One example provides a computing device including a logic subsystem and memory storing instructions executable by the logic subsystem to receive, from a wearable device configured to be worn on a hand of a user, an input of data indicative of one or more of a gesture and a posture. The instructions are further executable to, based on the input of data received, determine a digital personal expression corresponding to the one or more of the gesture and the posture, and output the digital personal expression.
Type: Application
Filed: July 12, 2018
Publication date: January 16, 2020
Applicant: Microsoft Technology Licensing, LLC
Inventors: Charlene Mary ATLAS, Sean Kenneth MCBETH, Andrew Frederick MUEHLHAUSEN, Kenneth Mitchell JAKUBZAK
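At its simplest, mapping a sensed gesture or posture to a digital personal expression is a lookup with a neutral fallback. The sketch below illustrates only that shape; every gesture name and expression label here is invented for illustration and does not come from the application.

```python
# Illustrative sketch (assumption): map sensed hand gestures/postures to a
# "digital personal expression" for the avatar. All names are hypothetical.

EXPRESSIONS = {
    ("gesture", "thumbs_up"): "avatar_smile",
    ("gesture", "wave"): "avatar_greet",
    ("posture", "fist"): "avatar_determined",
}

def expression_for(kind: str, name: str) -> str:
    """Return the avatar expression for a sensed input, defaulting to neutral."""
    return EXPRESSIONS.get((kind, name), "avatar_neutral")

print(expression_for("gesture", "wave"))     # a recognized gesture
print(expression_for("posture", "unknown"))  # falls back to neutral
```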
-
Publication number: 20190302903
Abstract: Examples are disclosed herein that relate to a six degree-of-freedom (DOF) input device. An example provides an input device comprising a body, a sensor system configured to sense motion of the input device with six DOF, a communication interface and a controller. The controller is configured to transmit output based on sensor data from the sensor system for use in controlling an application in a first mode in which each of the six degrees-of-freedom is used as input, the application being controlled in the first mode in response to detecting a first condition, and transmit output based on sensor data from the sensor system for use in controlling the application in a second mode in which one or more of the six degrees-of-freedom is not used as input, the application being controlled in the second mode in response to detecting a second condition.
Type: Application
Filed: March 30, 2018
Publication date: October 3, 2019
Applicant: Microsoft Technology Licensing, LLC
Inventors: Charlene Mary ATLAS, Ishac BERTRAN, Benjamin Hunter BOESEL, Lorenz Henric JENTZ, Nikolai Michael FAALAND, Christian KLEIN, Xin Xian LIANG, Orr SROUR
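The two modes described here amount to passing all six degrees of freedom through in the first mode and masking some of them in the second. The sketch below illustrates that masking step; the axis names, sample values, and choice of which DOF the second mode keeps are assumptions made for illustration.

```python
# Illustrative sketch (assumption, not the application's design): in the
# second mode, unused degrees of freedom are zeroed out before the sensor
# sample is forwarded to the application.

DOF = ("x", "y", "z", "pitch", "yaw", "roll")

def filter_sample(sample: dict, active_dof: set) -> dict:
    """Zero out any degree of freedom the current mode does not use."""
    return {axis: (sample[axis] if axis in active_dof else 0.0)
            for axis in DOF}

raw = {"x": 0.1, "y": -0.2, "z": 0.05, "pitch": 3.0, "yaw": -1.5, "roll": 0.7}
# First mode: all six DOF pass through unchanged.
print(filter_sample(raw, set(DOF)))
# Second mode (e.g. planar pointing): only pitch and yaw are used.
print(filter_sample(raw, {"pitch", "yaw"}))
```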