Patents by Inventor Ozgur TASDIZEN

Ozgur TASDIZEN has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
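Short illustrative code sketches of the four distinct techniques described in the abstracts below are provided after the listing; they are approximate readings of the abstracts, not the patented implementations.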

  • Patent number: 11501571
    Abstract: A method comprises the steps of obtaining at least one frame of input data from at least one sensor, the frame of input data being representative of a real-world environment at a given time. The frame is analysed to determine at least a foveal region within the frame, and at least one method for generating depth information associated with the real-world environment based on the frame of input data is selected. The selected method is applied to the foveal region to generate depth information associated with the foveal region, and at least the depth information associated with the foveal region is output.
    Type: Grant
    Filed: October 2, 2020
    Date of Patent: November 15, 2022
    Assignee: Arm Limited
    Inventor: Ozgur Tasdizen
  • Patent number: 11082628
    Abstract: Examples of the present disclosure relate to a method for reducing artefacts caused by the presence of flicker during capture of a video. A sequence of frames of the video is captured, each frame comprising a plurality of predefined regions and each region comprising a plurality of pixels. A time-varying oscillation of the flicker is characterized based on variations, across the sequence of frames, of data relating to pixel intensities in at least one such region. Based on this characterization of the time-varying oscillation of the flicker, a flicker correction is applied to a frame of the video.
    Type: Grant
    Filed: November 30, 2018
    Date of Patent: August 3, 2021
    Assignees: Apical Ltd, ARM Limited
    Inventors: Ozgur Tasdizen, Alexey Kornienko
  • Publication number: 20210142031
    Abstract: A method comprises the steps of obtaining at least one frame of input data from at least one sensor, the frame of input data being representative of a real-world environment at a given time. The frame is analysed to determine at least a foveal region within the frame, and at least one method for generating depth information associated with the real-world environment based on the frame of input data is selected. The selected method is applied to the foveal region to generate depth information associated with the foveal region, and at least the depth information associated with the foveal region is output.
    Type: Application
    Filed: October 2, 2020
    Publication date: May 13, 2021
    Inventor: Ozgur TASDIZEN
  • Patent number: 10893297
    Abstract: Examples of the present disclosure relate to methods for processing image data. In one such example, data elements defining a portion of a line of pixels of an image are received, the image comprising one or more lines of pixels definable by one or more respective sets of data elements. In some cases, a transform operation is performed on the data elements to obtain a plurality of binary transform coefficients, wherein the transform operation is performed independently of data elements defining any other line of pixels. The plurality of transform coefficients is encoded as a sequence of tiered bit-layers, each bit-layer in the sequence comprising a set of bits corresponding to a given bit position in each of the plurality of transform coefficients. The encoded plurality of transform coefficients is output.
    Type: Grant
    Filed: March 22, 2018
    Date of Patent: January 12, 2021
    Assignee: Apical Ltd.
    Inventors: Ozgur Tasdizen, Evren Cesur
  • Patent number: 10861167
    Abstract: A graphics processing system includes a processing circuit operable to render or decode a sequence of frames and generate extrapolated frames by extrapolating object motion from rendered or decoded frames. The system also includes a processing circuit operable to extrapolate object motion from first and second rendered or decoded frames in the sequence to a later extrapolated frame. The processing circuit is also operable to test candidate motion vectors from a region of the extrapolated frame through a region of the first frame to a region of the second frame by comparing the region of the first frame with the region of the second frame. A similarity measure from the comparison is used to select a motion vector and an indication representative of the selected motion vector is stored.
    Type: Grant
    Filed: February 19, 2019
    Date of Patent: December 8, 2020
    Assignee: Arm Limited
    Inventor: Ozgur Tasdizen
  • Publication number: 20200265585
    Abstract: A graphics processing system includes a processing circuit operable to render or decode a sequence of frames and generate extrapolated frames by extrapolating object motion from rendered or decoded frames. The system also includes a processing circuit operable to extrapolate object motion from first and second rendered or decoded frames in the sequence to a later extrapolated frame. The processing circuit is also operable to test candidate motion vectors from a region of the extrapolated frame through a region of the first frame to a region of the second frame by comparing the region of the first frame with the region of the second frame. A similarity measure from the comparison is used to select a motion vector and an indication representative of the selected motion vector is stored.
    Type: Application
    Filed: February 19, 2019
    Publication date: August 20, 2020
    Applicant: Arm Limited
    Inventor: Ozgur Tasdizen
  • Publication number: 20190297353
    Abstract: Examples of the present disclosure relate to methods for processing image data. In one such example, data elements defining a portion of a line of pixels of an image are received, the image comprising one or more lines of pixels definable by one or more respective sets of data elements. In some cases, a transform operation is performed on the data elements to obtain a plurality of binary transform coefficients, wherein the transform operation is performed independently of data elements defining any other line of pixels. The plurality of transform coefficients is encoded as a sequence of tiered bit-layers, each bit-layer in the sequence comprising a set of bits corresponding to a given bit position in each of the plurality of transform coefficients. The encoded plurality of transform coefficients is output.
    Type: Application
    Filed: March 22, 2018
    Publication date: September 26, 2019
    Inventors: Ozgur TASDIZEN, Evren CESUR
  • Publication number: 20190166298
    Abstract: Examples of the present disclosure relate to a method for reducing artefacts caused by the presence of flicker during capture of a video. A sequence of frames of the video is captured, each frame comprising a plurality of predefined regions and each region comprising a plurality of pixels. A time-varying oscillation of the flicker is characterized based on variations, across the sequence of frames, of data relating to pixel intensities in at least one such region. Based on this characterization of the time-varying oscillation of the flicker, a flicker correction is applied to a frame of the video.
    Type: Application
    Filed: November 30, 2018
    Publication date: May 30, 2019
    Inventors: Ozgur TASDIZEN, Alexey KORNIENKO
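
The foveated depth-generation method of patent 11501571 (and its companion application 20210142031) selects a depth-generation method per region and applies it to the foveal region in particular. The sketch below is a minimal Python reading of that idea: a circular foveal mask is assumed, a coarse stand-in serves as the low-cost depth method for the rest of the frame, and a full-resolution stand-in serves as the expensive method applied only to the foveal crop. The gaze point, region shape and both depth methods are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def foveal_mask(shape, gaze_xy, radius):
    """Boolean mask marking a circular foveal region around the gaze point (assumed shape)."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    return (xs - gaze_xy[0]) ** 2 + (ys - gaze_xy[1]) ** 2 <= radius ** 2

def cheap_depth(frame):
    """Placeholder low-cost depth estimate: coarse values upsampled back to full size."""
    small = frame[::8, ::8].astype(np.float32) / 255.0
    return np.kron(small, np.ones((8, 8)))[:frame.shape[0], :frame.shape[1]]

def dense_depth(patch):
    """Placeholder high-quality depth estimate used only on the foveal crop."""
    return patch.astype(np.float32) / 255.0

def foveated_depth(frame, gaze_xy, radius=64):
    """Run the cheap method everywhere, then refine only the foveal region."""
    mask = foveal_mask(frame.shape, gaze_xy, radius)
    depth = cheap_depth(frame)
    ys, xs = np.nonzero(mask)
    if ys.size:                                   # foveal region lies inside the frame
        y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
        crop_mask = mask[y0:y1, x0:x1]
        fine = dense_depth(frame[y0:y1, x0:x1])   # expensive method sees only the crop
        depth[y0:y1, x0:x1][crop_mask] = fine[crop_mask]
    return depth, mask

if __name__ == "__main__":
    frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
    depth, mask = foveated_depth(frame, gaze_xy=(320, 240))
    print(depth.shape, int(mask.sum()), "foveal pixels refined")
```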
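
Patent 11082628 (also published as 20190166298) characterizes the time-varying oscillation of flicker from how per-region pixel statistics vary across a sequence of frames, and then corrects a frame on that basis. The sketch below is one plausible Python rendition under stated assumptions: the predefined regions are taken to be horizontal bands, the deviation of the newest frame's band means from their temporal average is used as the flicker profile, and correction is a per-band gain. Band height, sequence length and the gain model are all assumptions.

```python
import numpy as np

BAND_HEIGHT = 16  # assumed height of each predefined region (a horizontal band)

def band_means(frame):
    """Mean intensity of each full horizontal band of the frame."""
    h = frame.shape[0] - frame.shape[0] % BAND_HEIGHT
    bands = frame[:h].reshape(-1, BAND_HEIGHT, frame.shape[1])
    return bands.mean(axis=(1, 2))

def characterize_flicker(frames):
    """Per-band gain of the newest frame relative to the temporal average of the
    sequence -- a crude sample of the flicker oscillation across the image."""
    means = np.stack([band_means(f) for f in frames])   # (n_frames, n_bands)
    reference = means.mean(axis=0)                       # flicker-free estimate
    return means[-1] / np.maximum(reference, 1e-6)       # gain per band

def correct_flicker(frame, gains):
    """Divide each band of the frame by its estimated flicker gain."""
    out = frame.astype(np.float32)
    for i, g in enumerate(gains):
        out[i * BAND_HEIGHT:(i + 1) * BAND_HEIGHT] /= max(float(g), 1e-6)
    return np.clip(out, 0, 255).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = [rng.integers(80, 120, (240, 320), dtype=np.uint8) for _ in range(8)]
    gains = characterize_flicker(frames)
    corrected = correct_flicker(frames[-1], gains)
    print(gains.round(3), corrected.shape)
```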
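
Patent 10893297 (published as 20190297353) transforms the data elements of one portion of a line of pixels independently of any other line, then writes the coefficients out as tiered bit-layers, each layer holding the bits at one bit position of every coefficient. The sketch below uses an order-8 Walsh-Hadamard transform and a sign-plus-magnitude bit-plane packing purely as stand-ins; the actual transform, coefficient width and layer ordering are assumptions.

```python
import numpy as np

def hadamard8():
    """Order-8 Walsh-Hadamard matrix built by the Sylvester construction."""
    h = np.array([[1]])
    for _ in range(3):
        h = np.block([[h, h], [h, -h]])
    return h

def transform_line_segment(pixels):
    """Integer transform of one 8-pixel portion of a line, independent of other lines."""
    return hadamard8() @ np.asarray(pixels, dtype=np.int32)

def encode_bit_layers(coeffs, n_bits=12):
    """Encode coefficients as tiered bit-layers: a sign layer, then one layer per
    bit position of the magnitudes, most significant layer first."""
    mags = np.abs(coeffs)
    layers = [[int(c < 0) for c in coeffs]]                  # sign layer
    for bit in range(n_bits - 1, -1, -1):
        layers.append([int((m >> bit) & 1) for m in mags])   # bits at one position
    return layers

def decode_bit_layers(layers):
    """Invert encode_bit_layers by re-stacking the magnitude layers, MSB first."""
    signs = np.array(layers[0])
    mags = np.zeros(len(layers[0]), dtype=np.int64)
    for layer in layers[1:]:
        mags = (mags << 1) | np.array(layer)
    return np.where(signs == 1, -mags, mags)

if __name__ == "__main__":
    segment = np.array([12, 15, 14, 13, 90, 92, 91, 89], dtype=np.uint8)
    coeffs = transform_line_segment(segment)
    layers = encode_bit_layers(coeffs)
    assert np.array_equal(decode_bit_layers(layers), coeffs)
    print(coeffs, len(layers), "bit-layers")
```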
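
Patent 10861167 (published as 20200265585) selects a motion vector for a region of an extrapolated frame by testing candidate vectors that pass from that region through a region of a first rendered or decoded frame to a region of a second one, comparing those two regions with a similarity measure. The block-matching sketch below assumes constant motion (a candidate v maps a block at p in the extrapolated frame to p - v in the second frame and p - 2v in the first), a 16x16 block, an 8-pixel search range and SAD as the similarity measure; all of these are illustrative choices.

```python
import numpy as np

BLOCK = 16   # assumed region size
SEARCH = 8   # assumed search range in pixels per axis

def block_at(frame, y, x):
    """Return the BLOCK x BLOCK region at (y, x), or None if it falls outside the frame."""
    h, w = frame.shape
    if 0 <= y and 0 <= x and y + BLOCK <= h and x + BLOCK <= w:
        return frame[y:y + BLOCK, x:x + BLOCK].astype(np.int32)
    return None

def select_motion_vector(frame1, frame2, y, x):
    """For the extrapolated-frame block at (y, x), pick the candidate vector whose
    trajectory through frame1 and frame2 gives the most similar (lowest-SAD) regions."""
    best_v, best_sad = (0, 0), None
    for dy in range(-SEARCH, SEARCH + 1):
        for dx in range(-SEARCH, SEARCH + 1):
            b2 = block_at(frame2, y - dy, x - dx)            # one step back in time
            b1 = block_at(frame1, y - 2 * dy, x - 2 * dx)    # two steps back in time
            if b1 is None or b2 is None:
                continue
            sad = int(np.abs(b1 - b2).sum())                 # similarity measure
            if best_sad is None or sad < best_sad:
                best_v, best_sad = (dy, dx), sad
    return best_v

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    frame1 = rng.integers(0, 256, (128, 128), dtype=np.uint8)
    frame2 = np.roll(frame1, shift=(2, 3), axis=(0, 1))      # content moved by (2, 3)
    print("selected motion vector:", select_motion_vector(frame1, frame2, y=64, x=64))
```

In this toy run the selected vector recovers the (2, 3) shift applied to the test frames; per the abstract, an indication of the selected vector would then be stored for use when generating the extrapolated frame.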