Patents by Inventor Sapna Shroff
Sapna Shroff has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
- Publication number: 20250030940
  Abstract: A wearable device for use in immersive reality applications is provided. The wearable device has a frame including an eyepiece to provide a forward-image to a user, a first forward-looking camera mounted on the frame, the first forward-looking camera having a field of view within the forward-image, a sensor configured to receive a command from the user, the command indicative of a region of interest within the field of view, and an interface device to indicate to the user that a field of view of the first forward-looking camera is aligned with the region of interest. Methods of use of the device, a memory storing instructions and a processor to execute the instructions to cause the device to perform the methods of use, are also provided.
  Type: Application
  Filed: September 13, 2024
  Publication date: January 23, 2025
  Inventors: Sapna Shroff, Sebastian Sztuk, Johana Gabriela Coyoc Escudero, Jun Hu
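The application above (publication 20250030940) describes an interface that tells the user when the camera's field of view covers a region of interest the user has indicated. The following is a minimal sketch of that kind of alignment check, not the claimed method; the rectangle representation, the coverage threshold, and all names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned rectangle in normalized display coordinates (0..1)."""
    x: float
    y: float
    w: float
    h: float

def coverage(fov: Rect, roi: Rect) -> float:
    """Fraction of the region of interest that falls inside the camera field of view."""
    ix = max(0.0, min(fov.x + fov.w, roi.x + roi.w) - max(fov.x, roi.x))
    iy = max(0.0, min(fov.y + fov.h, roi.y + roi.h) - max(fov.y, roi.y))
    roi_area = roi.w * roi.h
    return (ix * iy) / roi_area if roi_area > 0 else 0.0

def alignment_cue(fov: Rect, roi: Rect, threshold: float = 0.95) -> str:
    """Return a simple user-facing cue: aligned, or which way to turn."""
    if coverage(fov, roi) >= threshold:
        return "aligned"
    dx = (roi.x + roi.w / 2) - (fov.x + fov.w / 2)
    dy = (roi.y + roi.h / 2) - (fov.y + fov.h / 2)
    horizontal = "right" if dx > 0 else "left"
    vertical = "down" if dy > 0 else "up"
    return f"turn {horizontal if abs(dx) >= abs(dy) else vertical}"

if __name__ == "__main__":
    fov = Rect(0.30, 0.30, 0.40, 0.40)   # current camera field of view
    roi = Rect(0.55, 0.35, 0.20, 0.20)   # user-indicated region of interest
    print(alignment_cue(fov, roi))        # prints "turn right"
```

A real device would derive the two rectangles from the camera geometry and the user's command rather than from hard-coded values.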
- Publication number: 20250024138
  Abstract: A wearable device for use in immersive reality applications is provided. The wearable device includes eyepieces to provide a forward-image to a user, a first forward-looking camera mounted on the frame and having a field of view, a processor configured to identify a region of interest within the forward-image, and an interface device to indicate to the user that a field of view of the first forward-looking camera is misaligned with the region of interest. Methods of use of the device, a memory storing instructions and a processor to execute the instructions to cause the device to perform the methods of use, are also provided.
  Type: Application
  Filed: September 26, 2024
  Publication date: January 16, 2025
  Inventors: Sapna Shroff, Sebastian Sztuk, Jun Hu, Johana Gabriela Coyoc Escudero
- Patent number: 12132983
  Abstract: A wearable device for use in immersive reality applications is provided. The wearable device includes eyepieces to provide a forward-image to a user, a first forward-looking camera mounted on the frame and having a field of view, a processor configured to identify a region of interest within the forward-image, and an interface device to indicate to the user that a field of view of the first forward-looking camera is misaligned with the region of interest. Methods of use of the device, a memory storing instructions and a processor to execute the instructions to cause the device to perform the methods of use, are also provided.
  Type: Grant
  Filed: July 7, 2022
  Date of Patent: October 29, 2024
  Assignee: Meta Platforms Technologies, LLC
  Inventors: Sapna Shroff, Sebastian Sztuk, Jun Hu, Johana Gabriela Coyoc Escudero
- Patent number: 12101549
  Abstract: A method for using cameras in an augmented reality headset is provided. The method includes receiving a signal from a sensor mounted on a headset worn by a user, the signal being indicative of a user intention for capturing an image. The method also includes identifying the user intention for capturing the image, based on a model to classify the signal from the sensor according to the user intention, selecting a first image capturing device in the headset based on a specification of the first image capturing device and the user intention for capturing the image, and capturing the image with the first image capturing device. An augmented reality headset, a memory storing instructions, and a processor to execute the instructions to cause the augmented reality headset as above are also provided.
  Type: Grant
  Filed: July 1, 2022
  Date of Patent: September 24, 2024
  Assignee: Meta Platforms Technologies, LLC
  Inventors: Sapna Shroff, Linsen Bie, Jun Hu, David Tao, Demetrios Basil Karanikos, Sebastian Sztuk, Zhaonian Zhang, Jiangtao Kuang, Danni Luo, Yilei Li
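Patent 12101549 describes classifying a sensor signal into a user intention and then choosing a capture device whose specification fits that intention. A minimal sketch of that selection step is below; the camera specifications, the thresholding stand-in for the classifier model, and every name are hypothetical, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class CameraSpec:
    name: str
    field_of_view_deg: float   # horizontal field of view
    resolution_mp: float       # sensor resolution, megapixels

# Hypothetical headset cameras; real specifications would come from the device.
CAMERAS = [
    CameraSpec("wide", field_of_view_deg=120.0, resolution_mp=5.0),
    CameraSpec("tele", field_of_view_deg=40.0, resolution_mp=12.0),
]

def classify_intention(sensor_signal: dict) -> str:
    """Stand-in for the learned model that maps a sensor signal to an intention."""
    # A long fixation is taken here to suggest the user wants a detailed close-up.
    return "close_up" if sensor_signal.get("fixation_ms", 0) > 800 else "scene"

def select_camera(intention: str) -> CameraSpec:
    """Pick the device whose specification best matches the intention."""
    if intention == "close_up":
        return max(CAMERAS, key=lambda c: c.resolution_mp)
    return max(CAMERAS, key=lambda c: c.field_of_view_deg)

if __name__ == "__main__":
    signal = {"fixation_ms": 1200}                   # simulated sensor reading
    cam = select_camera(classify_intention(signal))
    print(f"capturing with {cam.name} camera")       # prints "capturing with tele camera"
```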
- Patent number: 12096112
  Abstract: A wearable device for use in immersive reality applications is provided. The wearable device has a frame including an eyepiece to provide a forward-image to a user, a first forward-looking camera mounted on the frame, the first forward-looking camera having a field of view within the forward-image, a sensor configured to receive a command from the user, the command indicative of a region of interest within the field of view, and an interface device to indicate to the user that a field of view of the first forward-looking camera is aligned with the region of interest. Methods of use of the device, a memory storing instructions and a processor to execute the instructions to cause the device to perform the methods of use, are also provided.
  Type: Grant
  Filed: June 2, 2022
  Date of Patent: September 17, 2024
  Assignee: Meta Platforms Technologies, LLC
  Inventors: Sapna Shroff, Sebastian Sztuk, Johana Gabriela Coyoc Escudero, Jun Hu
- Publication number: 20230156314
  Abstract: A method for capturing a scene in a virtual environment for an immersive reality application running in a headset is provided. The method includes determining initiation of an auto-capture session in a headset by a user, the headset running an immersive reality application hosted by a remote server, executing a gaze model based on the initiation, detecting through the gaze model a gaze of the user, tracking the gaze of the user, capturing a scene in a virtual environment based on the gaze of the user, and storing the scene as a media file in storage. A headset and a memory storing instructions to cause the headset and a remote server to perform the above method are also provided.
  Type: Application
  Filed: November 7, 2022
  Publication date: May 18, 2023
  Inventors: Sebastian Sztuk, Salvael Ortega Estrada, Sapna Shroff, Jun Hu, Yilei Li
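The method in publication 20230156314 runs a gaze model once an auto-capture session starts, tracks the gaze, captures scenes based on it, and stores them as a media file. The sketch below only mirrors that control flow under heavy simplification; the gaze and capture functions are stand-ins and the JSON output file is an assumption, not the patented pipeline.

```python
import json
import random
import time

def detect_gaze() -> tuple[float, float]:
    """Stand-in for the gaze model; returns a normalized gaze point."""
    return random.random(), random.random()

def capture_scene(gaze: tuple[float, float]) -> dict:
    """Stand-in for capturing the virtual scene around the current gaze point."""
    return {"gaze": gaze, "timestamp": time.time()}

def auto_capture_session(num_frames: int = 5, out_path: str = "capture.json") -> None:
    """Track the gaze for a few frames and store the captured scenes as a media file."""
    frames = []
    for _ in range(num_frames):
        gaze = detect_gaze()              # one gaze sample per frame while the session runs
        frames.append(capture_scene(gaze))
    with open(out_path, "w") as f:
        json.dump(frames, f, indent=2)

if __name__ == "__main__":
    auto_capture_session()
```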
- Publication number: 20230031871
  Abstract: A wearable device for use in immersive reality applications is provided. The wearable device has a frame including an eyepiece to provide a forward-image to a user, a first forward-looking camera mounted on the frame, the first forward-looking camera having a field of view within the forward-image, a sensor configured to receive a command from the user, the command indicative of a region of interest within the field of view, and an interface device to indicate to the user that a field of view of the first forward-looking camera is aligned with the region of interest. Methods of use of the device, a memory storing instructions and a processor to execute the instructions to cause the device to perform the methods of use, are also provided.
  Type: Application
  Filed: June 2, 2022
  Publication date: February 2, 2023
  Inventors: Sapna Shroff, Sebastian Sztuk, Johana Gabriela Coyoc Escudero, Jun Hu
- Publication number: 20230032467
  Abstract: A wearable device for use in immersive reality applications is provided. The wearable device includes eyepieces to provide a forward-image to a user, a first forward-looking camera mounted on the frame and having a field of view, a processor configured to identify a region of interest within the forward-image, and an interface device to indicate to the user that a field of view of the first forward-looking camera is misaligned with the region of interest. Methods of use of the device, a memory storing instructions and a processor to execute the instructions to cause the device to perform the methods of use, are also provided.
  Type: Application
  Filed: July 7, 2022
  Publication date: February 2, 2023
  Inventors: Sapna Shroff, Sebastian Sztuk, Jun Hu, Johana Gabriela Coyoc Escudero
- Publication number: 20230012426
  Abstract: A method for using cameras in an augmented reality headset is provided. The method includes receiving a signal from a sensor mounted on a headset worn by a user, the signal being indicative of a user intention for capturing an image. The method also includes identifying the user intention for capturing the image, based on a model to classify the signal from the sensor according to the user intention, selecting a first image capturing device in the headset based on a specification of the first image capturing device and the user intention for capturing the image, and capturing the image with the first image capturing device. An augmented reality headset, a memory storing instructions, and a processor to execute the instructions to cause the augmented reality headset as above are also provided.
  Type: Application
  Filed: July 1, 2022
  Publication date: January 12, 2023
  Inventors: Sapna Shroff, Linsen Bie, Jun Hu, David Tao, Demetrios Basil Karanikos, Sebastian Sztuk, Zhaonian Zhang, Jiangtao Kuang, Danni Luo, Yilei Li
- Patent number: 11494960
  Abstract: A display assembly generates environmentally matched virtual content for an electronic display. The display assembly includes a display controller and a display. The display controller is configured to estimate environmental matching information for a target area within a local area based in part on light information received from a light sensor. The target area is a region for placement of a virtual object. The light information describes light values. The display controller generates display instructions for the target area based in part on a human vision model, the estimated environmental matching information, and rendering information associated with the virtual object. The display is configured to present the virtual object as part of artificial reality content in accordance with the display instructions. The color and brightness of the virtual object is environmentally matched to the portion of the local area surrounding the target area.
  Type: Grant
  Filed: June 15, 2021
  Date of Patent: November 8, 2022
  Assignee: Meta Platforms Technologies, LLC
  Inventors: Jiangtao Kuang, Edward Buckley, Honghong Peng, Sapna Shroff, Romain Bachy
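Patent 11494960 estimates the lighting around a target area and adjusts a virtual object's color and brightness to match it before display. The snippet below is a rough, hypothetical illustration of one such adjustment (a simple brightness gain plus a von Kries-style white-point shift); it is not the patented display pipeline, and all constants are invented.

```python
import numpy as np

def match_to_environment(rgb: np.ndarray,
                         ambient_lux: float,
                         ambient_white: np.ndarray) -> np.ndarray:
    """Scale brightness and shift the white point of a rendered virtual object
    so it roughly matches the light measured around the target area."""
    # Very rough brightness adaptation: brighter surroundings -> brighter object.
    gain = np.clip(ambient_lux / 500.0, 0.2, 2.0)
    # Chromatic adaptation toward the measured ambient white point.
    adapted = rgb * gain * (ambient_white / ambient_white.max())
    return np.clip(adapted, 0.0, 1.0)

if __name__ == "__main__":
    virtual_object = np.array([0.8, 0.2, 0.2])      # rendered linear RGB color
    ambient_white = np.array([1.0, 0.95, 0.85])     # warm indoor light (made up)
    print(match_to_environment(virtual_object, ambient_lux=250.0,
                               ambient_white=ambient_white))
```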
- Publication number: 20210358187
  Abstract: A display assembly generates environmentally matched virtual content for an electronic display. The display assembly includes a display controller and a display. The display controller is configured to estimate environmental matching information for a target area within a local area based in part on light information received from a light sensor. The target area is a region for placement of a virtual object. The light information describes light values. The display controller generates display instructions for the target area based in part on a human vision model, the estimated environmental matching information, and rendering information associated with the virtual object. The display is configured to present the virtual object as part of artificial reality content in accordance with the display instructions. The color and brightness of the virtual object is environmentally matched to the portion of the local area surrounding the target area.
  Type: Application
  Filed: June 15, 2021
  Publication date: November 18, 2021
  Inventors: Jiangtao Kuang, Edward Buckley, Honghong Peng, Sapna Shroff, Romain Bachy
- Patent number: 11069104
  Abstract: A display assembly generates environmentally matched virtual content for an electronic display. The display assembly includes a display controller and a display. The display controller is configured to estimate environmental matching information for a target area within a local area based in part on light information received from a light sensor. The target area is a region for placement of a virtual object. The light information describes light values. The display controller generates display instructions for the target area based in part on a human vision model, the estimated environmental matching information, and rendering information associated with the virtual object. The display is configured to present the virtual object as part of artificial reality content in accordance with the display instructions. The color and brightness of the virtual object is environmentally matched to the portion of the local area surrounding the target area.
  Type: Grant
  Filed: May 13, 2020
  Date of Patent: July 20, 2021
  Assignee: Facebook Technologies, LLC
  Inventors: Jiangtao Kuang, Edward Buckley, Honghong Peng, Sapna Shroff, Romain Bachy
- Patent number: 11042034
  Abstract: A system is described that includes a head mounted display (HMD) and a portable docking station configured to receive the HMD for calibration of one or more components of the HMD. The portable docking station includes at least one calibration target, e.g., a checkerboard pattern and/or a convex reflector. Techniques of this disclosure include calibrating an image capture device of the HMD based on one or more images of the calibration target captured by the image capture device when the HMD is placed in the portable docking station. The disclosed techniques may be applied to calibrate multiple different components of the HMD, including image capture devices such as eye-tracking cameras and inside-out cameras, displays, illuminators, sensors, and the like. In some examples, a rechargeable battery of the HMD may be charged when the HMD is placed in the portable docking station.
  Type: Grant
  Filed: August 30, 2019
  Date of Patent: June 22, 2021
  Assignee: Facebook Technologies, LLC
  Inventors: Sebastian Sztuk, Javier San Agustin Lopez, Sapna Shroff, Robert Dale Cavin, Alexander Jobe Fix
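Patent 11042034 calibrates an HMD camera from images of a calibration target, such as a checkerboard, captured while the headset sits in the docking station. As an illustration only, the sketch below runs a standard OpenCV chessboard calibration on such images; the board size and the image path are assumptions, and the patented procedure may differ.

```python
import glob

import cv2
import numpy as np

BOARD = (9, 6)  # inner corner count of the assumed checkerboard target

# 3-D coordinates of the board corners in the target's own frame (z = 0 plane).
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
image_size = None
for path in glob.glob("dock_captures/*.png"):   # images taken while docked (hypothetical path)
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]

if obj_points:
    # Recover intrinsics and distortion coefficients from the calibration-target views.
    rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)
    print("reprojection error:", rms)
    print("camera matrix:\n", K)
```

A docked headset gives a known, repeatable pose relative to the target, which is what makes this kind of calibration practical outside a lab.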
- Publication number: 20200209628
  Abstract: A system is described that includes a head mounted display (HMD) and a portable docking station configured to receive the HMD for calibration of one or more components of the HMD. The portable docking station includes at least one calibration target, e.g., a checkerboard pattern and/or a convex reflector. Techniques of this disclosure include calibrating an image capture device of the HMD based on one or more images of the calibration target captured by the image capture device when the HMD is placed in the portable docking station. The disclosed techniques may be applied to calibrate multiple different components of the HMD, including image capture devices such as eye-tracking cameras and inside-out cameras, displays, illuminators, sensors, and the like. In some examples, a rechargeable battery of the HMD may be charged when the HMD is placed in the portable docking station.
  Type: Application
  Filed: August 30, 2019
  Publication date: July 2, 2020
  Inventors: Sebastian Sztuk, Javier San Agustin Lopez, Sapna Shroff, Robert Dale Cavin, Alexander Jobe Fix
- Patent number: 10552676
  Abstract: A device for eye tracking is disclosed. The device includes a first depth profiler configured to determine a distance from the first depth profiler to a surface of an eye. The device may also include a display device configured to display one or more images selected based on a position of the eye. The position of the eye is determined based on the determined distance. Also disclosed is a method for eye tracking. The method includes determining, with a first depth profiler, a distance from the first depth profiler to a surface of an eye. A position of the eye is determined based on the determined distance. One or more images selected based on the position of the eye are displayed on a display device.
  Type: Grant
  Filed: December 30, 2016
  Date of Patent: February 4, 2020
  Assignee: Facebook Technologies, LLC
  Inventors: Sapna Shroff, Mary Lou Jepsen
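Patent 10552676 infers the position of the eye from a depth profiler's distance measurement to the eye surface. Below is a toy geometric sketch of one way such an inference could look, assuming a known profiler ray and a nominal eyeball radius; none of this comes from the patent itself.

```python
import numpy as np

EYE_RADIUS_MM = 12.0   # assumed approximate radius of the eyeball

def eye_center_from_depth(profiler_origin: np.ndarray,
                          profiler_direction: np.ndarray,
                          measured_distance_mm: float) -> np.ndarray:
    """Estimate the eye center: the measured ray hits the eye surface,
    so the center lies roughly one eye radius farther along the same direction."""
    d = profiler_direction / np.linalg.norm(profiler_direction)
    surface_point = profiler_origin + measured_distance_mm * d
    return surface_point + EYE_RADIUS_MM * d

if __name__ == "__main__":
    origin = np.array([0.0, 0.0, 0.0])        # depth profiler location (mm)
    direction = np.array([0.0, 0.0, 1.0])     # profiler boresight direction
    print(eye_center_from_depth(origin, direction, measured_distance_mm=25.0))
```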
- Patent number: 10274730
  Abstract: A display device includes a two-dimensional array of tiles. Each tile includes a two-dimensional array of pixels and a lens of a two-dimensional array of lenses. Each pixel is configured to output light so that the two-dimensional array of pixels outputs a respective pattern of light. Each lens is configured to direct at least a portion of the respective pattern of light from the two-dimensional array of pixels to a pupil of an eye of a user. The display device also includes an array of sensors for determining a location of the pupil of the eye of the user.
  Type: Grant
  Filed: March 9, 2016
  Date of Patent: April 30, 2019
  Assignee: Facebook Technologies, LLC
  Inventors: Mary Lou Jepsen, Sapna Shroff
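Patent 10274730 pairs each tile of pixels with a lens that steers the tile's light toward the pupil located by a sensor array. The snippet below shows the simple pinhole-lens geometry such steering could use, with made-up distances; it is an illustration, not the claimed optics.

```python
import numpy as np

GAP_MM = 3.0        # spacing between the pixel array and its lens (assumption)
EYE_DIST_MM = 20.0  # distance from the lens array to the pupil plane (assumption)

def pixel_for_pupil(lens_center: np.ndarray, pupil: np.ndarray) -> np.ndarray:
    """Pinhole-lens approximation: position of the pixel behind a lens whose
    ray through the lens center reaches the detected pupil position."""
    return lens_center - GAP_MM * (pupil - lens_center) / EYE_DIST_MM

if __name__ == "__main__":
    lens = np.array([2.0, 0.0])    # lens center in the lens-array plane (mm)
    pupil = np.array([0.0, 1.0])   # pupil location reported by the sensor array (mm)
    print(pixel_for_pupil(lens, pupil))
```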
- Publication number: 20170109562
  Abstract: A device for eye tracking is disclosed. The device includes a first depth profiler configured to determine a distance from the first depth profiler to a surface of an eye. The device may also include a display device configured to display one or more images selected based on a position of the eye. The position of the eye is determined based on the determined distance. Also disclosed is a method for eye tracking. The method includes determining, with a first depth profiler, a distance from the first depth profiler to a surface of an eye. A position of the eye is determined based on the determined distance. One or more images selected based on the position of the eye are displayed on a display device.
  Type: Application
  Filed: December 30, 2016
  Publication date: April 20, 2017
  Inventors: Sapna Shroff, Mary Lou Jepsen
- Publication number: 20170038836
  Abstract: A display device includes a two-dimensional array of tiles. Each tile includes a two-dimensional array of pixels and a lens of a two-dimensional array of lenses. Each pixel is configured to output light so that the two-dimensional array of pixels outputs a respective pattern of light. Each lens is configured to direct at least a portion of the respective pattern of light from the two-dimensional array of pixels to a pupil of an eye of a user. The display device also includes an array of sensors for determining a location of the pupil of the eye of the user.
  Type: Application
  Filed: March 9, 2016
  Publication date: February 9, 2017
  Inventors: Mary Lou Jepsen, Sapna Shroff
- Patent number: 8949078
  Abstract: A method for designing the spatial partition of a filter module used in an aperture-multiplexed imaging system. The filter module is spatially partitioned into filter cells, and the spatial partition is designed by considering data captured at the sensor in light of an application-specific performance metric.
  Type: Grant
  Filed: March 4, 2011
  Date of Patent: February 3, 2015
  Assignee: Ricoh Co., Ltd.
  Inventors: Kathrin Berkner, Sapna Shroff
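Patent 8949078 designs the filter-module partition by scoring candidate partitions with an application-specific metric computed on simulated sensor data. The sketch below shows a generic brute-force version of that design loop with a made-up simulator and metric; the filter types, cell count, and metric are all placeholders, not the patented design method.

```python
import itertools

import numpy as np

FILTERS = ["R", "G", "B", "NIR"]   # available filter types (assumption)
CELLS = 4                          # number of cells in the aperture partition (assumption)

def simulate_sensor_data(partition: tuple[str, ...]) -> np.ndarray:
    """Stand-in for simulating what the sensor records for a given partition."""
    seed = abs(hash(partition)) % (2**32)   # arbitrary, repeatable within a run
    return np.random.default_rng(seed).random((CELLS, 16))

def performance_metric(data: np.ndarray) -> float:
    """Made-up application metric: reward diverse, well-conditioned measurements."""
    return float(np.linalg.matrix_rank(data) + data.std())

# Exhaustively score every candidate partition and keep the best one.
best = max(itertools.combinations_with_replacement(FILTERS, CELLS),
           key=lambda p: performance_metric(simulate_sensor_data(p)))
print("best partition:", best)
```

In practice the search would be over spatial layouts of the aperture and the simulation and metric would come from the target application, but the select-by-score loop is the same shape.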
- Patent number: 8279329
  Abstract: An object to be imaged is illuminated with a structured (e.g., sinusoidal) illumination at a plurality of phase shifts to allow lateral superresolution and axial sectioning in images. When an object is to be imaged in vitro or in another situation in which the phase shifts cannot be accurately determined a priori, the images are taken, and the phase shifts are estimated a posteriori from peaks in the Fourier transforms. The technique is extended to the imaging of fluorescent and non-fluorescent objects as well as stationary and non-stationary objects.
  Type: Grant
  Filed: April 10, 2008
  Date of Patent: October 2, 2012
  Assignee: University of Rochester
  Inventors: Sapna Shroff, David R. Williams, James Fienup
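Patent 8279329 estimates the unknown phase shifts of the sinusoidal illumination a posteriori from peaks in the Fourier transforms of the captured images. The 1-D toy below illustrates the idea for a single image with a known illumination frequency: the phase of the Fourier coefficient at that frequency approximately recovers the shift. It is an illustration of the principle, not the patented algorithm.

```python
import numpy as np

N = 512
k = 17                # illumination spatial frequency in cycles per window (assumed known)
true_phase = 1.2      # unknown phase shift to be recovered
x = np.arange(N) / N

# Object modulated by a sinusoidal illumination pattern with an unknown phase shift.
obj = 1.0 + 0.3 * np.random.default_rng(0).random(N)
image = obj * (1.0 + np.cos(2 * np.pi * k * x + true_phase))

# The phase of the Fourier component at the illumination frequency carries the shift;
# the object's own spectrum only adds a small perturbation at that bin.
spectrum = np.fft.fft(image)
estimated_phase = np.angle(spectrum[k])
print(f"true {true_phase:.3f}  estimated {estimated_phase:.3f}")
```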