Patents by Inventor Randy M. Fayan

Randy M. Fayan has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11017751
    Abstract: Playback of a graphical representation of a digital musical score is synchronized with an expressive audio rendering of the score that contains tempo and dynamics beyond those specified in the score. The method involves determining a set of offsets for occurrences of score events in the audio rendering by comparing and temporally aligning audio waveforms of successive subclips of the audio rendering with corresponding audio waveforms of successive subclips of an audio rendering synthesized directly from the score. Tempos and dynamics of human performances may be extracted and used to generate expressive renderings synthesized from the corresponding digital score. This enables parties who wish to distribute or share music scores, such as composers and publishers, to allow prospective licensees to evaluate the score by listening to an expressive musical recording instead of a mechanically synthesized rendering.
    Type: Grant
    Filed: October 15, 2019
    Date of Patent: May 25, 2021
    Assignee: Avid Technology, Inc.
    Inventor: Randy M. Fayan
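
The abstract of Patent 11017751 (shared by the related publication and grant below) describes estimating per-subclip time offsets between an expressive recording and a rendering synthesized directly from the score. The following is a minimal sketch of that idea under simplifying assumptions of my own: both renderings are mono float arrays at the same sample rate, and each offset is found by cross-correlating mean-removed subclips. Function names and parameters are illustrative, not Avid's implementation.

```python
import numpy as np

def subclip_offsets(mechanical, expressive, sample_rate,
                    subclip_seconds=1.0, max_shift_seconds=0.25):
    """Estimate, for each successive subclip of the mechanical rendering, the
    shift (in seconds) that best aligns it with the expressive rendering."""
    n = int(subclip_seconds * sample_rate)
    max_shift = int(max_shift_seconds * sample_rate)
    offsets = []
    for start in range(0, len(mechanical) - n, n):
        ref = mechanical[start:start + n]
        # Search a window of the expressive rendering around the same position.
        lo = max(0, start - max_shift)
        hi = min(len(expressive), start + n + max_shift)
        window = expressive[lo:hi]
        # Cross-correlate (mean-removed) and keep the lag with the highest score.
        corr = np.correlate(window - window.mean(), ref - ref.mean(), mode="valid")
        offsets.append((lo + int(np.argmax(corr)) - start) / sample_rate)
    return offsets

# Toy check: an "expressive" rendering lagging a noise-like mechanical one by 50 ms.
sr = 8000
rng = np.random.default_rng(0)
mechanical = rng.standard_normal(8 * sr)
expressive = np.roll(mechanical, int(0.05 * sr))
print(subclip_offsets(mechanical, expressive, sr))   # values near 0.05
```
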
  • Publication number: 20210110803
    Abstract: Playback of a graphical representation of a digital musical score is synchronized with an expressive audio rendering of the score that contains tempo and dynamics beyond those specified in the score. The method involves determining a set of offsets for occurrences of score events in the audio rendering by comparing and temporally aligning audio waveforms of successive subclips of the audio rendering with corresponding audio waveforms of successive subclips of an audio rendering synthesized directly from the score. Tempos and dynamics of human performances may be extracted and used to generate expressive renderings synthesized from the corresponding digital score. This enables parties who wish to distribute or share music scores, such as composers and publishers, to allow prospective licensees to evaluate the score by listening to an expressive musical recording instead of a mechanically synthesized rendering.
    Type: Application
    Filed: October 15, 2019
    Publication date: April 15, 2021
    Inventor: Randy M. Fayan
  • Patent number: 10460712
    Abstract: Playback of a graphical representation of a digital musical score is synchronized with an expressive audio rendering of the score that contains tempo and dynamics beyond those specified in the score. The method involves determining a set of offsets for occurrences of score events in the audio rendering by comparing and temporally aligning audio waveforms of successive subclips of the audio rendering with corresponding audio waveforms of successive subclips of an audio rendering synthesized directly from the score. Tempos and dynamics of human performances may be extracted and used to generate expressive renderings synthesized from the corresponding digital score. This enables parties who wish to distribute or share music scores, such as composers and publishers, to allow prospective licensees to evaluate the score by listening to an expressive musical recording instead of a mechanically synthesized rendering.
    Type: Grant
    Filed: March 12, 2019
    Date of Patent: October 29, 2019
    Inventor: Randy M. Fayan
  • Patent number: 7545957
    Abstract: In calculating motion between two images, a single channel image may be generated for each image based on measurement of a desired characteristic of those images. Given a desired characteristic (such as edge strength or edge magnitude) in an image, a function measures the strength of the desired characteristic in a region around a pixel in an image. A range of values can represent the likelihood, or measure of confidence, of the occurrence of the desired characteristic in the region around the pixel. Thus, each pixel in the single channel image has a value from the range of values that is determined according to a function. This function operates on a neighborhood in the input image that corresponds to the pixel in the single channel image, and measures the likelihood of occurrence of, or strength of, the desired characteristic in that neighborhood.
    Type: Grant
    Filed: April 20, 2001
    Date of Patent: June 9, 2009
    Assignee: Avid Technology, Inc.
    Inventors: Katherine H. Cornog, Randy M. Fayan
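
Patent 7545957 (and the related publication later in this list) describes building a single-channel image whose pixel values measure a desired characteristic, such as edge strength, in a neighborhood around each input pixel. Below is a minimal sketch under assumptions of my own (luminance conversion, central-difference gradients, normalization into a [0, 1] confidence range); it is not the patented formulation.

```python
import numpy as np

def edge_strength_image(rgb):
    """rgb: (H, W, 3) float array in [0, 1]. Returns an (H, W) single-channel
    image whose values measure edge strength around each pixel, scaled to [0, 1]."""
    gray = rgb @ np.array([0.299, 0.587, 0.114])        # luminance as the measured quantity
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    gx[:, 1:-1] = gray[:, 2:] - gray[:, :-2]            # central differences over a small neighborhood
    gy[1:-1, :] = gray[2:, :] - gray[:-2, :]
    mag = np.hypot(gx, gy)
    peak = mag.max()
    return mag / peak if peak > 0 else mag              # map to the confidence range [0, 1]

# Toy usage: a vertical step edge yields high values along the boundary columns.
img = np.zeros((8, 8, 3))
img[:, 4:] = 1.0
print(np.round(edge_strength_image(img), 2))
```
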
  • Patent number: 7280117
    Abstract: A keyer is provided with a graphical user interface that helps a user visualize the relationship between the key and the image to be processed using that key. A color space swatch is processed by the keyer using the defined key. The output of the keyer as applied to the color space swatch is displayed to the user to illustrate which colors match the defined key. The alpha matte generated by applying the key to the color space swatch also may be displayed. Each pixel in either the color space swatch or the alpha matte generated by applying the key to the color space swatch may be modified to indicate whether its corresponding color is present in the input image or in the preprocessed input image. Luminance processing also may be applied. In particular, the alpha value for a pixel may be adjusted according to the luminance of the pixel according to a user defined function. The alpha matte displayed to the user may include the effects of such luminance processing.
    Type: Grant
    Filed: April 22, 2005
    Date of Patent: October 9, 2007
    Assignee: Avid Technology, Inc.
    Inventor: Randy M. Fayan
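
Patent 7280117 describes applying the defined key to a color space swatch so the user can see which colors the key matches. The sketch below illustrates that visualization idea only: the key model (RGB distance with tolerance and softness) and the red/green swatch are my own assumptions, not the patented keyer.

```python
import numpy as np

def key_alpha(rgb, key_color, tolerance=0.15, softness=0.15):
    """Alpha is 0 where a pixel matches the key color, 1 where it clearly does not,
    with a linear falloff of width `softness` beginning at `tolerance`."""
    dist = np.linalg.norm(rgb - key_color, axis=-1)
    return np.clip((dist - tolerance) / softness, 0.0, 1.0)

def red_green_swatch(size=256):
    """A swatch sweeping red across columns and green across rows (blue fixed)."""
    r, g = np.meshgrid(np.linspace(0, 1, size), np.linspace(0, 1, size))
    return np.stack([r, g, np.full_like(r, 0.2)], axis=-1)

swatch = red_green_swatch()
matte = key_alpha(swatch, key_color=np.array([0.0, 0.8, 0.2]))   # green-screen-style key
# Dark regions of `matte` show which swatch colors the current key removes.
print(matte.shape, matte.min(), matte.max())
```
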
  • Patent number: 7194676
    Abstract: A retiming function that defines a rampable retiming effect is used to generate new audio and video samples at appropriate output times. In particular, for each output time, a corresponding input time is determined from the output time by using the retiming function. The retiming function may be a speed curve, a position curve that maps output times to input times directly or a mapping defining correspondence times between points in the video data and points in the audio data. An output sample is computed for the output time based on at least the data in the neighborhood of the corresponding input time, using a resampling function for the type of media data. Synchronization is achieved by ensuring that the input times determined to correspond to output times for video samples correspond to the input times determined to correspond to the same output times for audio samples.
    Type: Grant
    Filed: March 1, 2002
    Date of Patent: March 20, 2007
    Assignee: Avid Technology, Inc.
    Inventors: Randy M. Fayan, Katherine H. Cornog
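
Patent 7194676 (and the related publication later in this list) describes mapping every output time to an input time through a retiming function and resampling audio and video through the same mapping so they stay synchronized. Here is a minimal sketch under assumptions of my own: a piecewise position curve (1x, then a constant slower speed, then 1x), linear interpolation for audio, and nearest-frame selection for video.

```python
import numpy as np

def input_time(t_out, ramp_start=1.0, ramp_end=3.0, slow_speed=0.5):
    """Position curve mapping output time to input time: 1x playback, a constant
    slower speed between ramp_start and ramp_end, then 1x again. (A real rampable
    effect would vary the speed smoothly; this is a simplified stand-in.)"""
    t_out = np.asarray(t_out, dtype=float)
    seg1 = np.minimum(t_out, ramp_start)
    seg2 = np.clip(t_out - ramp_start, 0.0, ramp_end - ramp_start) * slow_speed
    seg3 = np.maximum(t_out - ramp_end, 0.0)
    return seg1 + seg2 + seg3

def retime_audio(audio, sample_rate, duration_out):
    t_out = np.arange(0, duration_out, 1.0 / sample_rate)
    t_in = input_time(t_out)
    # Audio resampling: interpolate around the corresponding input time.
    return np.interp(t_in, np.arange(len(audio)) / sample_rate, audio)

def retime_video(frame_count, frame_rate, duration_out):
    t_out = np.arange(0, duration_out, 1.0 / frame_rate)
    t_in = input_time(t_out)   # same mapping as the audio keeps the streams in sync
    # Video resampling: nearest source frame (a real system might blend frames).
    return np.clip(np.round(t_in * frame_rate).astype(int), 0, frame_count - 1)

sr, fps = 8000, 24
audio_in = np.sin(2 * np.pi * 220 * np.arange(0, 4.0, 1.0 / sr))
print(retime_audio(audio_in, sr, 5.0).shape, retime_video(4 * fps, fps, 5.0)[:10])
```
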
  • Patent number: 7103231
    Abstract: Two images are analyzed to compute a set of motion vectors that describes motion between the first and second images. A motion vector is computed for each pixel in an image at a time between the first and second images. This set of motion vectors may be defined at any time between the first and second images, such as the midpoint. The motion vectors may be computed using any of several techniques. An example technique is based on the constant brightness constraint, also referred to as optical flow. Each vector is specified at a pixel center in an image defined at the time between the first and second images. The vectors may point to points in the first and second images that are not on pixel centers. The motion vectors are used to warp the first and second images to a point in time of an output image between the first and second images using a factor that represents the time between the first and second image at which the output image occurs.
    Type: Grant
    Filed: November 3, 2003
    Date of Patent: September 5, 2006
    Assignee: Avid Technology, Inc.
    Inventors: Katherine H. Cornog, Garth A. Dickie, Peter J. Fasciano, Randy M. Fayan, Robert A. Gonsalves
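
Patent 7103231 (and the related publications and grants elsewhere in this list) describes warping two source images to an intermediate output time using a per-pixel motion vector field and a blending factor. The sketch below illustrates that interpolation step under assumptions of my own: the flow field is simply given (rather than computed via optical flow), sampling is nearest-neighbor, and the blend is linear in t.

```python
import numpy as np

def warp(image, flow, scale):
    """Sample `image` at each pixel center displaced by `scale * flow` (nearest neighbor)."""
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.round(xs + scale * flow[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys + scale * flow[..., 1]).astype(int), 0, h - 1)
    return image[src_y, src_x]

def interpolate_frame(img1, img2, flow_1_to_2, t):
    """Output image at time t in (0, 1), between img1 (t=0) and img2 (t=1)."""
    from_img1 = warp(img1, flow_1_to_2, -t)          # pull pixels back along the motion
    from_img2 = warp(img2, flow_1_to_2, 1.0 - t)     # pull pixels forward from img2
    return (1.0 - t) * from_img1 + t * from_img2

# Toy usage: a bright square moving 4 pixels right between two 16x16 frames.
img1 = np.zeros((16, 16))
img1[6:10, 2:6] = 1.0
img2 = np.zeros((16, 16))
img2[6:10, 6:10] = 1.0
flow = np.zeros((16, 16, 2))
flow[..., 0] = 4.0                                    # uniform rightward motion
print(interpolate_frame(img1, img2, flow, t=0.5).round(1))
```
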
  • Patent number: 7043058
    Abstract: Visible artifacts in images created using image processing based on motion vector maps may be reduced by providing one or more mechanisms for correcting the vector map. In general, the set of motion vectors is changed by selecting one or more portions of the image. The vectors corresponding to the selected one or more portions are modified. Various image processing operations, such as motion compensated interpolation, may be performed using the changed set of motion vectors.
    Type: Grant
    Filed: April 20, 2001
    Date of Patent: May 9, 2006
    Assignee: Avid Technology, Inc.
    Inventors: Katherine H. Cornog, Randy M. Fayan, Garth Dickie
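
Patent 7043058 (and the related publication at the end of this list) describes correcting a motion vector map by selecting portions of the image and modifying the vectors there, for example by assigning a single motion vector to a tracked region. A minimal sketch follows, with assumptions of my own: a boolean mask stands in for the user selection, and the replacement vector defaults to the region's median motion.

```python
import numpy as np

def correct_region(flow, mask, vector=None):
    """Return a copy of `flow` (H, W, 2) with the vectors under `mask` (H, W bool)
    replaced by `vector`; if no vector is given, use the region's median motion."""
    corrected = flow.copy()
    if vector is None:
        vector = np.median(flow[mask], axis=0)         # robust single-motion estimate
    corrected[mask] = vector
    return corrected

# Toy usage: a mostly-uniform field with noisy outliers inside a selected box.
rng = np.random.default_rng(0)
flow = np.full((12, 12, 2), [2.0, 0.0])
flow[4:8, 4:8] += rng.normal(0, 3.0, size=(4, 4, 2))   # spurious vectors to fix
mask = np.zeros((12, 12), dtype=bool)
mask[4:8, 4:8] = True                                  # user-selected portion
fixed = correct_region(flow, mask)
print(np.round(fixed[4:8, 4:8, 0], 2))                 # now a single consistent value
```

The corrected field can then drive operations such as the motion-compensated interpolation sketched above.
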
  • Publication number: 20040091170
    Abstract: Two images are analyzed to compute a set of motion vectors that describes motion between the first and second images. A motion vector is computed for each pixel in an image at a time between the first and second images. This set of motion vectors may be defined at any time between the first and second images, such as the midpoint. The motion vectors may be computed using any of several techniques. An example technique is based on the constant brightness constraint, also referred to as optical flow. Each vector is specified at a pixel center in an image defined at the time between the first and second images. The vectors may point to points in the first and second images that are not on pixel centers. The motion vectors are used to warp the first and second images to a point in time of an output image between the first and second images using a factor that represents the time between the first and second image at which the output image occurs.
    Type: Application
    Filed: November 3, 2003
    Publication date: May 13, 2004
    Inventors: Katherine H. Cornog, Garth A. Dickie, Peter J. Fasciano, Randy M. Fayan, Robert A. Gonsalves
  • Patent number: 6665450
    Abstract: Two images are analyzed to compute a set of motion vectors that describes motion between the first and second images. A motion vector is computed for each pixel in an image at a time between the first and second images. This set of motion vectors may be defined at any time between the first and second images, such as the midpoint. The motion vectors may be computed using any of several techniques. An example technique is based on the constant brightness constraint, also referred to as optical flow. Each vector is specified at a pixel center in an image defined at the time between the first and second images. The vectors may point to points in the first and second images that are not on pixel centers. The motion vectors are used to warp the first and second images to a point in time of an output image between the first and second images using a factor that represents the time between the first and second image at which the output image occurs.
    Type: Grant
    Filed: September 8, 2000
    Date of Patent: December 16, 2003
    Assignee: Avid Technology, Inc.
    Inventors: Katherine H. Cornog, Garth A. Dickie, Peter J. Fasciano, Randy M. Fayan
  • Publication number: 20030164845
    Abstract: A retiming function that defines a rampable retiming effect is used to generate new audio and video samples at appropriate output times. In particular, for each output time, a corresponding input time is determined from the output time by using the retiming function. The retiming function may be a speed curve, a position curve that maps output times to input times directly or a mapping defining correspondence times between points in the video data and points in the audio data. An output sample is computed for the output time based on at least the data in the neighborhood of the corresponding input time, using a resampling function for the type of media data. Synchronization is achieved by ensuring that the input times determined to correspond to output times for video samples correspond to the input times determined to correspond to the same output times for audio samples.
    Type: Application
    Filed: March 1, 2002
    Publication date: September 4, 2003
    Inventors: Randy M. Fayan, Katherine H. Cornog
  • Patent number: 6570624
    Abstract: Two images are analyzed to compute a set of motion vectors that describes motion between the first and second images. A motion vector is computed for each pixel in an image at a time between the first and second images. This set of motion vectors may be defined at any time between the first and second images, such as the midpoint. The motion vectors may be computed using any of several techniques. An example technique is based on the constant brightness constraint, also referred to as optical flow. Each vector is specified at a pixel center in an image defined at the time between the first and second images. The vectors may point to points in the first and second images that are not on pixel centers. The motion vectors are used to warp the first and second images to a point in time of an output image between the first and second images using a factor that represents the time between the first and second image at which the output image occurs.
    Type: Grant
    Filed: April 20, 2001
    Date of Patent: May 27, 2003
    Assignee: Avid Technology, Inc.
    Inventors: Katherine H. Cornog, Garth A. Dickie, Peter J. Fasciano, Randy M. Fayan, Robert A. Gonsalves, Michael Laird
  • Publication number: 20030035592
    Abstract: Two images are analyzed to compute a set of motion vectors that describes motion between the first and second images. A motion vector is computed for each pixel in an image at a time between the first and second images. This set of motion vectors may be defined at any time between the first and second images, such as the midpoint. The motion vectors may be computed using any of several techniques. An example technique is based on the constant brightness constraint, also referred to as optical flow. Each vector is specified at a pixel center in an image defined at the time between the first and second images. The vectors may point to points in the first and second images that are not on pixel centers. The motion vectors are used to warp the first and second images to a point in time of an output image between the first and second images using a factor that represents the time between the first and second image at which the output image occurs.
    Type: Application
    Filed: April 20, 2001
    Publication date: February 20, 2003
    Inventors: Katherine H. Cornog, Garth A. Dickie, Peter J. Fasciano, Randy M. Fayan, Robert A. Gonsalves, Michael Laird
  • Publication number: 20020154792
    Abstract: In calculating motion between two images, a single channel image may be generated for each image based on measurement of a desired characteristic of those images. Given a desired characteristic (such as edge strength or edge magnitude) in an image, a function measures the strength of the desired characteristic in a region around a pixel in an image. A range of values can represent the likelihood, or measure of confidence, of the occurrence of the desired characteristic in the region around the pixel. Thus, each pixel in the single channel image has a value from the range of values that is determined according to a function. This function operates on a neighborhood in the input image that corresponds to the pixel in the single channel image, and measures the likelihood of occurrence of, or strength of, the desired characteristic in that neighborhood.
    Type: Application
    Filed: April 20, 2001
    Publication date: October 24, 2002
    Inventors: Katherine H. Cornog, Randy M. Fayan
  • Publication number: 20020154695
    Abstract: Visible artifacts in images created using image processing based on motion vector maps may be reduced by providing one or more mechanisms for correcting the vector map. In general, the set of motion vectors is changed by selecting one or more portions of the image. The vectors corresponding to the selected one or more portions are modified. Various image processing operations, such as motion compensated interpolation, may be performed using the changed set of motion vectors. Various mechanisms for obtaining a changed set of motion vectors may be used separately or in combination by a user. A region in an image may be defined. The region may be segmented into foreground and background regions. A tracker then may be used to track either the foreground region or the background region or both. A single motion vector or a parameterized motion model obtained from the tracker may be assigned to the tracked region.
    Type: Application
    Filed: April 20, 2001
    Publication date: October 24, 2002
    Inventors: Katherine H. Cornog, Randy M. Fayan, Garth Dickie