Patents by Inventor Jaclyn Anne Pytlarz

Jaclyn Anne Pytlarz has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20180234704
    Abstract: In a method to improve backwards compatibility when decoding high-dynamic-range (HDR) images coded in a wide color gamut (WCG) space that may not be compatible with legacy color spaces, hue and/or saturation values of images in an image database are computed for both a legacy color space (say, YCbCr-gamma) and a preferred WCG color space (say, IPT-PQ). Based on a cost function, a reshaped color space is computed so that the distance between the hue values in the legacy color space and rotated hue values in the preferred color space is minimized. HDR images are then coded in the reshaped color space. Legacy devices can still decode standard dynamic range images assuming they are coded in the legacy color space, while updated devices can use color reshaping information to decode HDR images in the preferred color space at full dynamic range. (A minimal sketch of this hue-alignment idea appears after this listing.)
    Type: Application
    Filed: August 3, 2016
    Publication date: August 16, 2018
    Applicant: Dolby Laboratories Licensing Corporation
    Inventors: Robin Atkins, Peng Yin, Taoran Lu, Jaclyn Anne Pytlarz
  • Publication number: 20180007374
    Abstract: Downsampled video content is generated in a subsampling color space from linearized video content in the subsampling color space. The linearized video content represents a first spatial dimension, whereas the downsampled video content represents a second spatial dimension lower than the first spatial dimension. Opponent channel data is derived in a transmission color space from the downsampled video content. Output video content is generated from luminance data in the linearized video content and the opponent channel data in the transmission color space. The output video content may be decoded by a downstream recipient device to generate video content in an output color space. (A simplified sketch of this flow appears after this listing.)
    Type: Application
    Filed: March 24, 2016
    Publication date: January 4, 2018
    Applicant: Dolby Laboratories Licensing Corporation
    Inventors: Robin Atkins, Jaclyn Anne Pytlarz
  • Patent number: 9729801
    Abstract: Input video signals characterized by a source electro-optical transfer function (EOTF) are to be blended and displayed on a target display with a target EOTF that is different from the source EOTF. Given an input set of blending parameters, an output set of blending parameters is generated as follows. The input blending parameters are scaled by video signal metrics computed in the target EOTF to generate scaled blending parameters. The scaled blending parameters are mapped back to the source EOTF space to generate mapped blending parameters. Finally, the mapped blending parameters are normalized to generate the output blending parameters. An output blended image is generated by blending the input video signals using the output blending parameters. Examples of generating the video signal metrics are also provided. (A simplified sketch of the scale-and-normalize step appears after this listing.)
    Type: Grant
    Filed: September 24, 2015
    Date of Patent: August 8, 2017
    Assignee: Dolby Laboratories Licensing Corporation
    Inventors: Jaclyn Anne Pytlarz, Robin Atkins
  • Publication number: 20160100108
    Abstract: Input video signals characterized by a source electro-optical transfer function (EOTF) are to be blended and displayed on a target display with a target EOTF that is different from the source EOTF. Given an input set of blending parameters, an output set of blending parameters is generated as follows. The input blending parameters are scaled by video signal metrics computed in the target EOTF to generate scaled blending parameters. The scaled blending parameters are mapped back to the source EOTF space to generate mapped blending parameters. Finally, the mapped blending parameters are normalized to generate the output blending parameters. An output blended image is generated by blending the input video signals using the output blending parameters. Examples of generating the video signal metrics are also provided.
    Type: Application
    Filed: September 24, 2015
    Publication date: April 7, 2016
    Applicant: Dolby Laboratories Licensing Corporation
    Inventors: Jaclyn Anne Pytlarz, Robin Atkins
  • Patent number: 9230338
    Abstract: A method for merging graphics and high dynamic range video data is disclosed. In a video receiver, a display management process uses metadata to map input video data from a first dynamic range into the dynamic range of available graphics data. The remapped video signal is blended with the graphics data to generate a video composite signal. An inverse display management process uses the metadata to map the video composite signal to an output video signal with the first dynamic range. To alleviate perceptual tone-mapping jumps during video scene changes, a metadata transformer transforms the metadata so that, on a television (TV) receiver, metadata values transition smoothly between consecutive scenes. The TV receiver receives the output video signal and the transformed metadata to generate video data mapped to the dynamic range of the TV's display. (A simplified sketch of this map/blend/inverse-map flow appears after this listing.)
    Type: Grant
    Filed: February 26, 2015
    Date of Patent: January 5, 2016
    Assignee: Dolby Laboratories Licensing Corporation
    Inventors: Timo Kunkel, Robin Atkins, Tao Chen, Samir N. Hulyalkar, Jaclyn Anne Pytlarz
  • Publication number: 20150256860
    Abstract: A method for merging graphics and high dynamic range video data is disclosed. In a video receiver, a display management process uses metadata to map input video data from a first dynamic range into the dynamic range of available graphics data. The remapped video signal is blended with the graphics data to generate a video composite signal. An inverse display management process uses the metadata to map the video composite signal to an output video signal with the first dynamic range. To alleviate perceptual tone-mapping jumps during video scene changes, a metadata transformer transforms the metadata so that, on a television (TV) receiver, metadata values transition smoothly between consecutive scenes. The TV receiver receives the output video signal and the transformed metadata to generate video data mapped to the dynamic range of the TV's display.
    Type: Application
    Filed: February 26, 2015
    Publication date: September 10, 2015
    Applicant: Dolby Laboratories Licensing Corporation
    Inventors: Timo Kunkel, Robin Atkins, Tao Chen, Samir N. Hulyalkar, Jaclyn Anne Pytlarz
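
The hue-alignment step described in publication 20180234704 can be illustrated with a short sketch. This is not the patented algorithm: it is a minimal example assuming hue angles (in radians) have already been computed for the same sample set in both the legacy color space and the preferred WCG color space, and it estimates a single chroma rotation that brings the WCG hues close to the legacy hues. The helpers optimal_hue_rotation and rotate_chroma are hypothetical names.

```python
import numpy as np

def optimal_hue_rotation(legacy_hue, wcg_hue):
    """Estimate one rotation angle (radians) that, applied to hue values in
    the preferred WCG space, minimizes their circular distance to the
    corresponding legacy-space hue values (a stand-in for the patent's
    cost-function minimization)."""
    # Wrapped angular differences, so +pi and -pi are treated as equal.
    diff = np.angle(np.exp(1j * (legacy_hue - wcg_hue)))
    # The circular mean of the differences serves as the minimizing rotation.
    return np.angle(np.mean(np.exp(1j * diff)))

def rotate_chroma(a, b, theta):
    """Rotate a pair of chroma components (e.g., IPT's P/T axes) by theta."""
    return (np.cos(theta) * a - np.sin(theta) * b,
            np.sin(theta) * a + np.cos(theta) * b)
```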
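
The pipeline summarized in publication 20180007374 (full-resolution luminance carried alongside spatially downsampled opponent-channel data) can be sketched as below. This is a simplified illustration, not the patented method: the Rec. 2020 luma weights, the 2x2 box downsampling, and the RGB-based opponent channels are assumptions standing in for the linearization, downsampling, and transmission-color-space steps named in the abstract.

```python
import numpy as np

def encode_luma_plus_downsampled_opponent(linear_rgb):
    """Keep luminance at full resolution and derive opponent-channel data
    from a 2x2-downsampled copy of the linear-light image.

    linear_rgb: float array of shape (H, W, 3) in linear light, H and W even.
    """
    # Full-resolution luminance (Rec. 2020 weights, an illustrative choice).
    luminance = linear_rgb @ np.array([0.2627, 0.6780, 0.0593])

    # Spatially downsample the linear image by 2x2 averaging.
    h, w, _ = linear_rgb.shape
    down = linear_rgb.reshape(h // 2, 2, w // 2, 2, 3).mean(axis=(1, 3))

    # Two opponent-style channels (red-green, blue-yellow) from the
    # downsampled image; a real encoder would form these in its
    # transmission color space rather than directly in RGB.
    rg = down[..., 0] - down[..., 1]
    by = down[..., 2] - 0.5 * (down[..., 0] + down[..., 1])
    return luminance, rg, by
```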
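
Patent 9729801 adjusts blending parameters using signal metrics evaluated through the target display's EOTF. The sketch below shows only the scale-and-renormalize idea, using displayed light level as the metric; the patent's additional step of mapping the scaled parameters back to the source EOTF space is omitted here. blend_eotf_aware is a hypothetical helper, and the target EOTF is passed in as a plain callable.

```python
import numpy as np

def blend_eotf_aware(a, b, alpha, target_eotf):
    """Blend two code-value signals a and b with nominal weight alpha for a.

    Each weight is scaled by a per-pixel metric of its signal evaluated
    through the target display's EOTF (here, simply the displayed light
    level), then the weights are renormalized before blending."""
    m_a = target_eotf(a)            # metric of signal a in target display light
    m_b = target_eotf(b)            # metric of signal b in target display light
    w_a = alpha * m_a               # scale the input blending parameters
    w_b = (1.0 - alpha) * m_b
    total = w_a + w_b + 1e-12       # renormalize so the weights sum to one
    return (w_a * a + w_b * b) / total

# Usage with a simple gamma-2.4 target EOTF:
frame_a = np.random.rand(4, 4)
frame_b = np.random.rand(4, 4)
blended = blend_eotf_aware(frame_a, frame_b, alpha=0.5,
                           target_eotf=lambda v: np.clip(v, 0.0, 1.0) ** 2.4)
```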
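
Patent 9230338 merges graphics with HDR video by mapping the video into the graphics dynamic range, blending, and mapping the composite back, while smoothing metadata across scene changes. The sketch below illustrates that flow under stated assumptions: the two mappings are user-supplied callables standing in for the patent's metadata-driven display management and its inverse, and a moving average stands in for the metadata transformer.

```python
import numpy as np

def composite_hdr_with_graphics(video, graphics, alpha,
                                to_graphics_range, to_video_range):
    """Map HDR video into the graphics dynamic range, alpha-blend it with
    the graphics, then map the composite back to the video's dynamic range."""
    mapped = to_graphics_range(video)                 # display management
    composite = alpha * graphics + (1.0 - alpha) * mapped
    return to_video_range(composite)                  # inverse display management

def smooth_metadata(metadata_values, window=5):
    """Temporally smooth a per-frame metadata sequence (e.g., a brightness
    statistic) with a moving average so that mapped output does not jump
    abruptly at scene changes."""
    kernel = np.ones(window) / window
    return np.convolve(np.asarray(metadata_values, dtype=float),
                       kernel, mode="same")
```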