Patents by Inventor Alexandre CHAPIRO
Alexandre CHAPIRO has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240073376
Abstract: In holographic calling, it is difficult to capture the eyes of a caller due to lighting effects on an artificial reality (XR) headset. However, it can be important to capture the eyes when rendering the caller as they can show emotion, gaze, physical characteristics, etc., that aid in natural communication. Thus, implementations can capture the eyes of the caller using an external image capture device by briefly turning off the lighting effects on the XR headset. Some implementations can trigger the image capture device to capture an image of the eyes by temporal multiplexing in which timers on both the image capture device and the XR headset are synchronized. In other implementations, the image capture device can be an event-based camera that is automatically triggered to capture an image of the eyes based on a detected pixel change caused by deactivation of the lighting effects on the XR headset.
Type: Application
Filed: August 26, 2022
Publication date: February 29, 2024
Inventors: Jean-Charles BAZIN, Alexandre CHAPIRO
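The temporal-multiplexing idea in this abstract can be sketched as follows. This is a minimal illustrative simulation, not the patented implementation; the period, dark-window length, and all function names are assumptions.

```python
# Hypothetical sketch: the headset disables its lighting effects for a short
# window on a fixed schedule, and an external camera with a synchronized
# clock captures eye images only inside those dark windows.

def lighting_is_off(t_ms: float, period_ms: float = 100.0,
                    off_window_ms: float = 5.0) -> bool:
    """Headset side: lighting is deactivated briefly once per period."""
    return (t_ms % period_ms) < off_window_ms

def camera_should_capture(t_ms: float, clock_offset_ms: float = 0.0) -> bool:
    """Camera side: with clocks synchronized (offset ~0), capture inside
    the same window the headset uses for its dark interval."""
    return lighting_is_off(t_ms - clock_offset_ms)

# With synchronized timers, every capture moment falls in a dark window.
captures = [t for t in range(0, 1000) if camera_should_capture(t)]
assert all(lighting_is_off(t) for t in captures)
```

The event-camera variant in the abstract would replace the shared clock with a trigger on the large pixel-intensity change that occurs when the lighting deactivates.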
-
Publication number: 20240045502
Abstract: In an embodiment, an electronic device includes a display and an eye tracker. The display includes one or more foveated areas. In the embodiment, the eye tracker is configured to collect eye tracking data regarding a gaze of one or more eyes of a user on the display. The electronic device also includes processing circuitry operatively coupled to the display. In the embodiment, the processing circuitry is configured to receive an indication of a motion associated with the gaze from the eye tracker. The processing circuitry is also configured to determine a previous location associated with the gaze during a previous frame and a target position associated with the gaze during a target frame. In the embodiment, the processing circuitry is configured to expand one or more foveated areas of the display adjacent a previous position of the gaze of the user.
Type: Application
Filed: October 17, 2023
Publication date: February 8, 2024
Inventors: Yang Li, Alexandre Chapiro, Mehmet N. Agaoglu, Nicolas Pierre Marie Frederic Bonnier, Yi-Pai Huang, Chaohao Wang, Andrew B. Watson, Pretesh A. Mascarenhas
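One simple reading of the expansion step described above is a foveated region grown to cover both the previous gaze location and the target gaze location during a saccade. The sketch below is an assumed geometric interpretation, not the claimed method, and the rectangle representation and radius are illustrative.

```python
# Illustrative sketch: expand a full-resolution (foveated) area so it spans
# both the previous and target gaze positions, padded by a base radius.

def expand_foveated_area(prev: tuple, target: tuple, base_radius: float) -> tuple:
    """Return an (x0, y0, x1, y1) box covering both gaze positions
    plus the base foveal radius on every side."""
    x0 = min(prev[0], target[0]) - base_radius
    y0 = min(prev[1], target[1]) - base_radius
    x1 = max(prev[0], target[0]) + base_radius
    y1 = max(prev[1], target[1]) + base_radius
    return (x0, y0, x1, y1)

# Gaze moving from (100, 100) toward (300, 150) with a 50 px foveal radius.
box = expand_foveated_area((100, 100), (300, 150), 50)
assert box == (50, 50, 350, 200)
```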
-
Publication number: 20230393405
Abstract: Eyewear such as a head-mounted device may include display-optimized lenses. The lenses may be optimized for viewing an external display while also providing sun protection for the user's eyes. The external display may form part of a handheld electronic device that serves as a controller for the head-mounted device. The lenses in the head-mounted device may reduce ambient light brightness while maintaining the brightness of the external display so that the user can use the external display while wearing the head-mounted device. The user may, for example, provide touch input to the external display to adjust display content on the head-mounted display. The lenses may include a polarizer and a color filter having a transmission spectrum curve with peaks corresponding to the primary colors of the external display. The lenses may be removable clip-on lenses and the light filter may be an electrochromic filter.
Type: Application
Filed: August 22, 2023
Publication date: December 7, 2023
Inventors: Jonathan C. Moisant-Thompson, Alexandre Chapiro, Bennett S. Wilburn, Seth E. Hunter, Ove Lyngnes, Nicolas P. Bonnier, Cameron A. Harder, Nawaf Al-Baghly
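The key optical idea here, a transmission spectrum with peaks at the display's primary wavelengths, can be modeled numerically. The Gaussian peak shape, peak wavelengths, and bandwidth below are assumptions for illustration; real notch-pass filter designs and actual display primaries will differ.

```python
import math

def filter_transmission(wavelength_nm: float,
                        peaks=(460.0, 530.0, 620.0),  # assumed B/G/R primaries
                        width_nm: float = 15.0) -> float:
    """Toy transmission curve: Gaussian pass-bands centered on the
    display's primary wavelengths, near-opaque elsewhere."""
    total = sum(math.exp(-((wavelength_nm - p) / width_nm) ** 2) for p in peaks)
    return min(1.0, total)

# Narrow-band display light at a primary passes almost fully...
assert filter_transmission(530.0) > 0.9
# ...while broadband sunlight between the primaries is strongly attenuated.
assert filter_transmission(495.0) < 0.1
```

This is why such a filter can dim sunlight (spread across the whole visible band) much more than display light (concentrated at the primaries).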
-
Patent number: 11822715
Abstract: In an embodiment, an electronic device includes a display and an eye tracker. The display includes one or more foveated areas. In the embodiment, the eye tracker is configured to collect eye tracking data regarding a gaze of one or more eyes of a user on the display. The electronic device also includes processing circuitry operatively coupled to the display. In the embodiment, the processing circuitry is configured to receive an indication of a motion associated with the gaze from the eye tracker. The processing circuitry is also configured to determine a previous location associated with the gaze during a previous frame and a target position associated with the gaze during a target frame. In the embodiment, the processing circuitry is configured to expand one or more foveated areas of the display adjacent a previous position of the gaze of the user.
Type: Grant
Filed: May 10, 2021
Date of Patent: November 21, 2023
Assignee: Apple Inc.
Inventors: Yang Li, Alexandre Chapiro, Mehmet N. Agaoglu, Nicolas Pierre Marie Frederic Bonnier, Yi-Pai Huang, Chaohao Wang, Andrew B. Watson, Pretesh A. Mascarenhas
-
Patent number: 11803060
Abstract: Eyewear such as sunglasses may include display-optimized lenses. The lenses may be optimized for viewing an external display and/or for viewing a display in the head-mounted device while also providing sun protection for the user's eyes. The lenses may include a polarizer and a color filter that are designed for a given target display. Lenses that are optimized for a display that emits linearly polarized light may include a linear polarizer. Lenses that are optimized for a display that emits circularly polarized light may include a circular polarizer. The circular polarizer may include a quarter wave plate and a linear polarizer. The color filter may have a transmission spectrum curve with peaks corresponding to the primary colors of the target display so that the color and brightness of display light is preserved while the brightness of sunlight is reduced.
Type: Grant
Filed: April 6, 2021
Date of Patent: October 31, 2023
Assignee: Apple Inc.
Inventors: Bennett S. Wilburn, Jonathan C. Moisant-Thompson, Alexandre Chapiro, Seth E. Hunter, Ove Lyngnes, Nicolas P. Bonnier
-
Publication number: 20230282183
Abstract: One or more media contents are received. A viewer's light adaptive states are predicted as a function of time as if the viewer is watching display mapped images derived from the one or more media contents. The viewer's light adaptive states are used to detect an excessive change in luminance in a specific media content portion of the one or more media contents. The excessive change in luminance in the specific media content portion of the one or more media contents is caused to be reduced while the viewer is watching one or more corresponding display mapped images derived from the specific media content portion of the one or more media contents.
Type: Application
Filed: February 17, 2023
Publication date: September 7, 2023
Applicant: Dolby Laboratories Licensing Corporation
Inventors: Alexandre CHAPIRO, Robin ATKINS, Scott DALY
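The pipeline described in this abstract (predict an adaptation state, flag excessive luminance jumps against it, then attenuate them) can be sketched with a toy first-order adaptation model. The exponential-smoothing model, the log2 jump threshold, and all parameter values are assumptions; actual viewer light-adaptation models are far more sophisticated.

```python
import math

def limit_luminance_changes(frame_lums, tau=0.5, max_jump=2.0):
    """Track a toy exponentially-adapting viewer state and compress any
    frame whose log-luminance jump from that state exceeds max_jump
    (in log2 units, i.e. stops). Illustrative only."""
    adapted = frame_lums[0]
    out = []
    for lum in frame_lums:
        jump = math.log2(lum / adapted)
        if abs(jump) > max_jump:  # excessive change detected
            lum = adapted * 2.0 ** (max_jump if jump > 0 else -max_jump)
        out.append(lum)
        adapted += tau * (lum - adapted)  # adaptation drifts toward display
    return out

# A sudden 100x brightness jump is compressed to the 4x (2-stop) limit.
out = limit_luminance_changes([10.0, 10.0, 1000.0, 1000.0])
assert out[2] == 40.0
```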
-
Patent number: 11735126
Abstract: An electronic device such as a watch may include a display and a light sensor located behind the display. The light sensor may be used to measure the color of external objects. During color sampling operations, the display may emit light towards the external object in front of the display while the light sensor gathers color measurements. The display may emit light of different colors and the light sensor may detect an amount of reflected light for each color, which in turn may be used to determine the color of the external object. The control circuitry may use a watch-band-specific algorithm to determine the color of watch bands and may use a clothing-specific algorithm to determine the color of clothing. The control circuitry may display the color on the display so that the face of the watch matches the user's clothing or matches the user's watch band.
Type: Grant
Filed: July 19, 2021
Date of Patent: August 22, 2023
Assignee: Apple Inc.
Inventors: Jackson K. Roland, Nicolas P. Bonnier, Alexandre Chapiro, David A. Doyle, Guillaume Lestoquoy, Jonathan C. Moisant-Thompson
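The sequential color-sampling loop in this abstract (flash each primary, read the reflected amount, infer the object's color) can be sketched as below. The normalization step and the `measure` callback are illustrative assumptions standing in for the device's display driver and light sensor.

```python
def estimate_object_color(measure):
    """Flash red, green, and blue from the display in turn and record the
    light-sensor reading for each; the normalized ratios approximate the
    object's reflectance per channel. `measure` is a hypothetical stand-in
    for driving the display and reading the behind-display sensor."""
    readings = {c: measure(c) for c in ("red", "green", "blue")}
    peak = max(readings.values()) or 1.0  # avoid division by zero on black
    return {c: v / peak for c, v in readings.items()}

# A hypothetical orange watch band: reflects mostly red, some green, little blue.
band_reflectance = {"red": 0.9, "green": 0.45, "blue": 0.1}
color = estimate_object_color(lambda c: band_reflectance[c])
assert color["red"] == 1.0 and color["green"] == 0.5
```

A band-specific or clothing-specific algorithm, as the abstract suggests, could then post-process these ratios differently depending on the expected material.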
-
Patent number: 11587526
Abstract: One or more media contents are received. A viewer's light adaptive states are predicted as a function of time as if the viewer is watching display mapped images derived from the one or more media contents. The viewer's light adaptive states are used to detect an excessive change in luminance in a specific media content portion of the one or more media contents. The excessive change in luminance in the specific media content portion of the one or more media contents is caused to be reduced while the viewer is watching one or more corresponding display mapped images derived from the specific media content portion of the one or more media contents.
Type: Grant
Filed: December 17, 2019
Date of Patent: February 21, 2023
Assignee: Dolby Laboratories Licensing Corporation
Inventors: Alexandre Chapiro, Robin Atkins, Scott Daly
-
Publication number: 20220414823
Abstract: Peripheral-vision expanded images are streamed to a video streaming client. The peripheral-vision expanded images are generated from source images in reference to view directions of the viewer at respective time points. View direction data is collected and received in real time while the viewer is viewing display images derived from the peripheral-vision expanded images. A second peripheral-vision expanded image is generated from a second source image in reference to a second view direction of the viewer at a second time point. The second peripheral-vision expanded image has a focal-vision image portion covering the second view direction of the viewer and a peripheral-vision image portion outside the focal-vision image portion. The second peripheral-vision expanded image is transmitted to the video streaming client.
Type: Application
Filed: August 31, 2022
Publication date: December 29, 2022
Inventors: Alexandre CHAPIRO, Chaitanya ATLURU, Chun Chi WAN, Haricharan LAKSHMAN, William ROZZI, Shane RUGGIERI, Ajit NINAN
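The focal-vision/peripheral-vision split in this abstract can be illustrated with a sampling-density function: full detail around the tracked view direction, reduced detail outside it. The 1-D eccentricity model, the focal radius, and the linear falloff are assumptions; the patent's actual image-generation method is not specified here.

```python
def sampling_density(angle_deg: float, gaze_deg: float,
                     focal_radius_deg: float = 10.0,
                     falloff: float = 0.05) -> float:
    """Toy model: relative image detail as a function of angular distance
    (eccentricity) from the viewer's current view direction. Full density
    in the focal-vision portion, linear decay (floored) in the periphery."""
    ecc = abs(angle_deg - gaze_deg)
    if ecc <= focal_radius_deg:
        return 1.0  # focal-vision image portion: full detail
    return max(0.1, 1.0 - falloff * (ecc - focal_radius_deg))

# Detail is highest where the viewer is looking and drops in the periphery.
assert sampling_density(5.0, 0.0) == 1.0
assert sampling_density(30.0, 0.0) < sampling_density(5.0, 0.0)
```

Streaming images encoded this way, regenerated as view-direction data arrives, spends bandwidth where the viewer can actually resolve detail.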
-
Patent number: 11461871
Abstract: Peripheral-vision expanded images are streamed to a video streaming client. The peripheral-vision expanded images are generated from source images in reference to view directions of the viewer at respective time points. View direction data is collected and received in real time while the viewer is viewing display images derived from the peripheral-vision expanded images. A second peripheral-vision expanded image is generated from a second source image in reference to a second view direction of the viewer at a second time point. The second peripheral-vision expanded image has a focal-vision image portion covering the second view direction of the viewer and a peripheral-vision image portion outside the focal-vision image portion. The second peripheral-vision expanded image is transmitted to the video streaming client.
Type: Grant
Filed: August 10, 2020
Date of Patent: October 4, 2022
Assignee: DOLBY LABORATORIES LICENSING CORPORATION
Inventors: Alexandre Chapiro, Chaitanya Atluru, Chun Chi Wan, Haricharan Lakshman, William Rozzi, Shane Ruggieri, Ajit Ninan
-
Publication number: 20220011858
Abstract: In an embodiment, an electronic device includes a display and an eye tracker. The display includes one or more foveated areas. In the embodiment, the eye tracker is configured to collect eye tracking data regarding a gaze of one or more eyes of a user on the display. The electronic device also includes processing circuitry operatively coupled to the display. In the embodiment, the processing circuitry is configured to receive an indication of a motion associated with the gaze from the eye tracker. The processing circuitry is also configured to determine a previous location associated with the gaze during a previous frame and a target position associated with the gaze during a target frame. In the embodiment, the processing circuitry is configured to expand one or more foveated areas of the display adjacent a previous position of the gaze of the user.
Type: Application
Filed: May 10, 2021
Publication date: January 13, 2022
Inventors: Yang Li, Alexandre Chapiro, Mehmet N. Agaoglu, Nicolas Pierre Marie Frederic Bonnier, Yi-Pai Huang, Chaohao Wang, Andrew B. Watson, Pretesh A. Mascarenhas
-
Patent number: 11140440
Abstract: Novel systems and methods are described for creating, compressing, and distributing video or image content graded for a plurality of displays with different dynamic ranges. In implementations, the created content is "continuous dynamic range" (CDR) content — a novel representation of pixel-luminance as a function of display dynamic range. The creation of the CDR content includes grading a source content for a minimum dynamic range and a maximum dynamic range, and defining a luminance of each pixel of an image or video frame of the source content as a continuous function between the minimum and the maximum dynamic ranges. In additional implementations, a novel graphical user interface for creating and editing the CDR content is described.
Type: Grant
Filed: May 2, 2019
Date of Patent: October 5, 2021
Assignee: Disney Enterprises, Inc.
Inventors: Aljoscha Smolic, Alexandre Chapiro, Simone Croci, Tunc Ozan Aydin, Nikolce Stefanoski, Markus Gross
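The core CDR idea (each pixel's luminance as a continuous function of the display's dynamic range, anchored by two grades) can be sketched as an interpolation between the minimum-range and maximum-range grades. The log-linear interpolation and the peak-luminance endpoints below are assumptions for illustration; the patent does not prescribe this particular function.

```python
import math

def cdr_luminance(lum_min_grade: float, lum_max_grade: float,
                  display_peak: float,
                  peak_min: float = 100.0, peak_max: float = 4000.0) -> float:
    """Pixel luminance as a continuous function of the target display's
    peak luminance, interpolating (log-linearly, an assumption) between
    the grade for the minimum dynamic range and the grade for the maximum."""
    t = (math.log(display_peak) - math.log(peak_min)) / \
        (math.log(peak_max) - math.log(peak_min))
    t = min(1.0, max(0.0, t))  # clamp to the graded range
    return lum_min_grade * (lum_max_grade / lum_min_grade) ** t

# The endpoints reproduce the two graded versions exactly; any display
# peak in between yields an intermediate luminance for that pixel.
assert abs(cdr_luminance(50.0, 800.0, 100.0) - 50.0) < 1e-9
assert abs(cdr_luminance(50.0, 800.0, 4000.0) - 800.0) < 1e-9
```

Distributing the two anchor grades plus the per-pixel function lets any display evaluate content matched to its own dynamic range, rather than shipping a separate grade per display class.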
-
Patent number: 11019302
Abstract: Methods for encoding and decoding high-dynamic range signals are presented. The signals are encoded in a high frame rate and are accompanied by frame-rate conversion metadata defining a preferred set of frame-rate down-conversion parameters, which are determined according to the maximum luminance of a target display, display playback priority modes, or judder control modes. A decoder uses the frame-rate conversion metadata to apply frame-rate down-conversion to the input high-frame-rate signal according to at least the maximum luminance of the target display and/or the characteristics of the signal itself. Frame-based and pixel-based frame-rate conversions, and judder models for judder control via metadata are also discussed.
Type: Grant
Filed: September 27, 2018
Date of Patent: May 25, 2021
Assignee: Dolby Laboratories Licensing Corporation
Inventors: Jaclyn Anne Pytlarz, Robin Atkins, Elizabeth G. Pieri, Alexandre Chapiro, Scott Daly
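The decoder-side flow in this abstract (look up the preferred down-conversion parameters in metadata keyed by the target display's maximum luminance, then apply a frame-based conversion) can be sketched as below. The metadata layout, luminance tiers, and frame-averaging scheme are all illustrative assumptions, not the standardized metadata format.

```python
def downconvert(frames, metadata, display_max_nits):
    """Pick a down-conversion factor from hypothetical metadata mapping
    luminance tiers (nits) to drop factors, choosing the highest tier not
    exceeding the display's maximum luminance, then average each group of
    frames (a simple frame-based frame-rate conversion)."""
    tiers = sorted(metadata)
    tier = max((t for t in tiers if t <= display_max_nits), default=tiers[0])
    factor = metadata[tier]
    return [sum(frames[i:i + factor]) / factor
            for i in range(0, len(frames), factor)]

meta = {100: 4, 600: 2, 1000: 1}  # assumed tiers: dimmer displays drop more
# A 700-nit display falls in the 600-nit tier, so the rate is halved.
out = downconvert([1.0, 3.0, 5.0, 7.0], meta, display_max_nits=700)
assert out == [2.0, 6.0]
```

Averaging groups of frames (rather than dropping them) is one simple way to trade frame rate against motion blur; the patent also discusses pixel-based conversions and explicit judder models carried in the metadata.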
-
Publication number: 20200372605
Abstract: Peripheral-vision expanded images are streamed to a video streaming client. The peripheral-vision expanded images are generated from source images in reference to view directions of the viewer at respective time points. View direction data is collected and received in real time while the viewer is viewing display images derived from the peripheral-vision expanded images. A second peripheral-vision expanded image is generated from a second source image in reference to a second view direction of the viewer at a second time point. The second peripheral-vision expanded image has a focal-vision image portion covering the second view direction of the viewer and a peripheral-vision image portion outside the focal-vision image portion. The second peripheral-vision expanded image is transmitted to the video streaming client.
Type: Application
Filed: August 10, 2020
Publication date: November 26, 2020
Inventors: Alexandre Chapiro, Chaitanya Atluru, Chun Chi Wan, Haricharan Lakshman, William Rozzi, Shane Ruggieri, Ajit Ninan
-
Patent number: 10769754
Abstract: Peripheral-vision expanded images are streamed to a video streaming client. The peripheral-vision expanded images are generated from source images in reference to view directions of the viewer at respective time points. View direction data is collected and received in real time while the viewer is viewing display images derived from the peripheral-vision expanded images. A second peripheral-vision expanded image is generated from a second source image in reference to a second view direction of the viewer at a second time point. The second peripheral-vision expanded image has a focal-vision image portion covering the second view direction of the viewer and a peripheral-vision image portion outside the focal-vision image portion. The second peripheral-vision expanded image is transmitted to the video streaming client.
Type: Grant
Filed: October 22, 2019
Date of Patent: September 8, 2020
Assignee: Dolby Laboratories Licensing Corporation
Inventors: Alexandre Chapiro, Chaitanya Atluru, Chun Chi Wan, Haricharan Lakshman, William Rozzi, Shane Ruggieri, Ajit Ninan
-
Publication number: 20200275050
Abstract: Methods for encoding and decoding high-dynamic range signals are presented. The signals are encoded in a high frame rate and are accompanied by frame-rate conversion metadata defining a preferred set of frame-rate down-conversion parameters, which are determined according to the maximum luminance of a target display, display playback priority modes, or judder control modes. A decoder uses the frame-rate conversion metadata to apply frame-rate down-conversion to the input high-frame-rate signal according to at least the maximum luminance of the target display and/or the characteristics of the signal itself. Frame-based and pixel-based frame-rate conversions, and judder models for judder control via metadata are also discussed.
Type: Application
Filed: September 27, 2018
Publication date: August 27, 2020
Applicant: Dolby Laboratories Licensing Corporation
Inventors: Jaclyn Anne PYTLARZ, Robin ATKINS, Elizabeth G. PIERI, Alexandre CHAPIRO, Scott DALY
-
Publication number: 20200202814
Abstract: One or more media contents are received. A viewer's light adaptive states are predicted as a function of time as if the viewer is watching display mapped images derived from the one or more media contents. The viewer's light adaptive states are used to detect an excessive change in luminance in a specific media content portion of the one or more media contents. The excessive change in luminance in the specific media content portion of the one or more media contents is caused to be reduced while the viewer is watching one or more corresponding display mapped images derived from the specific media content portion of the one or more media contents.
Type: Application
Filed: December 17, 2019
Publication date: June 25, 2020
Applicant: Dolby Laboratories Licensing Corporation
Inventors: Alexandre CHAPIRO, Robin ATKINS, Scott DALY
-
Publication number: 20200134780
Abstract: Peripheral-vision expanded images are streamed to a video streaming client. The peripheral-vision expanded images are generated from source images in reference to view directions of the viewer at respective time points. View direction data is collected and received in real time while the viewer is viewing display images derived from the peripheral-vision expanded images. A second peripheral-vision expanded image is generated from a second source image in reference to a second view direction of the viewer at a second time point. The second peripheral-vision expanded image has a focal-vision image portion covering the second view direction of the viewer and a peripheral-vision image portion outside the focal-vision image portion. The second peripheral-vision expanded image is transmitted to the video streaming client.
Type: Application
Filed: October 22, 2019
Publication date: April 30, 2020
Applicant: Dolby Laboratories Licensing Corporation
Inventors: Alexandre Chapiro, Chaitanya Atluru, Chun Chi Wan, Haricharan Lakshman, William Rozzi, Shane Ruggieri, Ajit Ninan
-
Publication number: 20190261049
Abstract: Novel systems and methods are described for creating, compressing, and distributing video or image content graded for a plurality of displays with different dynamic ranges. In implementations, the created content is "continuous dynamic range" (CDR) content — a novel representation of pixel-luminance as a function of display dynamic range. The creation of the CDR content includes grading a source content for a minimum dynamic range and a maximum dynamic range, and defining a luminance of each pixel of an image or video frame of the source content as a continuous function between the minimum and the maximum dynamic ranges. In additional implementations, a novel graphical user interface for creating and editing the CDR content is described.
Type: Application
Filed: May 2, 2019
Publication date: August 22, 2019
Inventors: Aljoscha Smolic, Alexandre Chapiro, Simone Croci, Tunc Ozan Aydin, Nikolce Stefanoski, Markus Gross
-
Patent number: 10349127
Abstract: Novel systems and methods are described for creating, compressing, and distributing video or image content graded for a plurality of displays with different dynamic ranges. In implementations, the created content is "continuous dynamic range" (CDR) content — a novel representation of pixel-luminance as a function of display dynamic range. The creation of the CDR content includes grading a source content for a minimum dynamic range and a maximum dynamic range, and defining a luminance of each pixel of an image or video frame of the source content as a continuous function between the minimum and the maximum dynamic ranges. In additional implementations, a novel graphical user interface for creating and editing the CDR content is described.
Type: Grant
Filed: September 22, 2015
Date of Patent: July 9, 2019
Assignees: Disney Enterprises, Inc., Eidgenoessische Technische Hochschule Zurich (ETH Zurich)
Inventors: Aljoscha Smolic, Alexandre Chapiro, Simone Croci, Tunc Ozan Aydin, Nikolce Stefanoski, Markus Gross