Patents by Inventor Karthik Sathyanarayana

Karthik Sathyanarayana has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO). Short, illustrative code sketches of the techniques described in these abstracts follow the listing.

  • Patent number: 11089124
    Abstract: Hybrid pull and push based streaming selectively performs a pull-based distribution of a stream to a first point-of-presence (“PoP”) of a distributed platform having low demand for the stream, and a push-based distribution of the stream to a second PoP of the distributed platform having high demand for the stream. The push-based distribution may be used to prepopulate the second PoP cache with the live stream data as the live stream data is uploaded from an encoder to a source PoP of the distributed platform, and before that live stream data is requested by the second PoP. In doing so, requests for the live stream data received at the second PoP may result in cache hits with the requested live stream data being immediately served from the second PoP cache without having to retrieve the live stream data from outside the second PoP.
    Type: Grant
    Filed: July 19, 2018
    Date of Patent: August 10, 2021
    Assignee: Verizon Digital Media Services Inc.
    Inventors: Karthik Sathyanarayana, Harkeerat Singh Bedi, Sergio Leonardo Ruiz
  • Patent number: 10972515
    Abstract: Server assisted live stream failover involves detecting a manifest of a stream provided by a first source exceeding a staleness quotient before the stream ends, and initiating or triggering failover of the stream from the first source to a second source in response to detecting the stale manifest. A server initiates the failover on behalf of a client, wherein the client requests objects (i.e., manifests and segments) of the stream, the server distributes those objects from at least the first source to the client, and the server detects that a particular requested object has become stale past a staleness quotient. The server indirectly redirects a client from a first source to a second source by passing a message with a 4xx or 5xx code to the client in place of a message with a 3xx code provided by the first source.
    Type: Grant
    Filed: July 31, 2017
    Date of Patent: April 6, 2021
    Assignee: Verizon Digital Media Services Inc.
    Inventors: Karthik Sathyanarayana, Seungyeob Choi
  • Patent number: 10904307
    Abstract: The solution distributes the management of stream segments from a central storage cluster to different edge servers that upload stream segments to and receive stream segments from the central storage cluster. Each edge server tracks the stream segments it has uploaded to the central storage cluster as well as the expiration times for those segments. The tracking is performed without a database using a log file and file system arrangement. First-tier directories are created in the file system for different expiration intervals. Entries under the first-tier directories track individual segments that expire within the expiration interval of the first-tier directory with the file system entries being files or a combination of subdirectories and files. Upon identifying expired stream segments, the edge servers instruct the central storage cluster to delete those stream segments. This removes the management overhead from the central storage cluster and implements the distributed management without a database.
    Type: Grant
    Filed: December 6, 2017
    Date of Patent: January 26, 2021
    Assignee: Verizon Digital Media Services Inc.
    Inventors: Karthik Sathyanarayana, Harkeerat Singh Bedi, Derek Shiell, Robert J. Peters
  • Patent number: 10567493
    Abstract: Some embodiments provide intelligent predictive stream caching for live, linear, or video-on-demand streaming content using prefetching, segmented caching, and request clustering. Prefetching involves retrieving streaming content segments from an origin server prior to the segments being requested by users. Prefetching live or linear streaming content segments involves continually reissuing requests to the origin until the segments are obtained or a preset retry duration is completed. Prefetching is initiated in response to a first request for a segment falling within a particular interval. Request clustering commences thereafter. Subsequent requests are queued until the segments are retrieved. Segmented caching involves caching segments for one particular interval. Segments falling within a next interval are not prefetched until a first request for one such segment in the next interval is received.
    Type: Grant
    Filed: February 20, 2018
    Date of Patent: February 18, 2020
    Assignee: Verizon Digital Media Services Inc.
    Inventors: Jonathan DiVincenzo, Seungyeob Choi, Karthik Sathyanarayana, Robert J. Peters, Eric Dyoniziak
  • Publication number: 20200028927
    Abstract: Hybrid pull and push based streaming selectively performs a pull-based distribution of a stream to a first point-of-presence (“PoP”) of a distributed platform having low demand for the stream, and a push-based distribution of the stream to a second PoP of the distributed platform having high demand for the stream. The push-based distribution may be used to prepopulate the second PoP cache with the live stream data as the live stream data is uploaded from an encoder to a source PoP of the distributed platform, and before that live stream data is requested by the second PoP. In doing so, requests for the live stream data received at the second PoP may result in cache hits with the requested live stream data being immediately served from the second PoP cache without having to retrieve the live stream data from outside the second PoP.
    Type: Application
    Filed: July 19, 2018
    Publication date: January 23, 2020
    Applicant: Verizon Digital Media Services Inc.
    Inventors: Karthik Sathyanarayana, Harkeerat Singh Bedi, Sergio Leonardo Ruiz
  • Publication number: 20190036986
    Abstract: Server assisted live stream failover involves detecting a manifest of a stream provided by a first source exceeding a staleness quotient before the stream ends, and initiating or triggering failover of the stream from the first source to a second source in response to detecting the stale manifest. A server initiates the failover on behalf of a client, wherein the client requests objects (i.e., manifests and segments) of the stream, the server distributes those objects from at least the first source to the client, and the server detects that a particular requested object has become stale past a staleness quotient. The server indirectly redirects a client from a first source to a second source by passing a message with a 4xx or 5xx code to the client in place of a message with a 3xx code provided by the first source.
    Type: Application
    Filed: July 31, 2017
    Publication date: January 31, 2019
    Inventors: Karthik Sathyanarayana, Seungyeob Choi
  • Publication number: 20180176297
    Abstract: Some embodiments provide intelligent predictive stream caching for live, linear, or video-on-demand streaming content using prefetching, segmented caching, and request clustering. Prefetching involves retrieving streaming content segments from an origin server prior to the segments being requested by users. Prefetching live or linear streaming content segments involves continually reissuing requests to the origin until the segments are obtained or a preset retry duration is completed. Prefetching is initiated in response to a first request for a segment falling within a particular interval. Request clustering commences thereafter. Subsequent requests are queued until the segments are retrieved. Segmented caching involves caching segments for one particular interval. Segments falling within a next interval are not prefetched until a first request for one such segment in the next interval is received.
    Type: Application
    Filed: February 20, 2018
    Publication date: June 21, 2018
    Inventors: Jonathan DiVincenzo, Seungyeob Choi, Karthik Sathyanarayana, Robert J. Peters, Eric Dyoniziak
  • Publication number: 20180167434
    Abstract: The solution distributes the management of stream segments from a central storage cluster to different edge servers that upload stream segments to and receive stream segments from the central storage cluster. Each edge server tracks the stream segments it has uploaded to the central storage cluster as well as the expiration times for those segments. The tracking is performed without a database using a log file and file system arrangement. First-tier directories are created in the file system for different expiration intervals. Entries under the first-tier directories track individual segments that expire within the expiration interval of the first-tier directory with the file system entries being files or a combination of subdirectories and files. Upon identifying expired stream segments, the edge servers instruct the central storage cluster to delete those stream segments. This removes the management overhead from the central storage cluster and implements the distributed management without a database.
    Type: Application
    Filed: December 6, 2017
    Publication date: June 14, 2018
    Inventors: Karthik Sathyanarayana, Harkeerat Singh Bedi, Derek Shiell, Robert J. Peters
  • Patent number: 9906590
    Abstract: Some embodiments provide intelligent predictive stream caching for live, linear, or video-on-demand streaming content using prefetching, segmented caching, and request clustering. Prefetching involves retrieving streaming content segments from an origin server prior to the segments being requested by users. Prefetching live or linear streaming content segments involves continually reissuing requests to the origin until the segments are obtained or a preset retry duration is completed. Prefetching is initiated in response to a first request for a segment falling within a particular interval. Request clustering commences thereafter. Subsequent requests are queued until the segments are retrieved. Segmented caching involves caching segments for one particular interval. Segments falling within a next interval are not prefetched until a first request for one such segment in the next interval is received.
    Type: Grant
    Filed: August 20, 2015
    Date of Patent: February 27, 2018
    Assignee: Verizon Digital Media Services Inc.
    Inventors: Jonathan DiVincenzo, Seungyeob Choi, Karthik Sathyanarayana, Robert J. Peters, Eric Dyoniziak
  • Patent number: 9866650
    Abstract: Some embodiments provide a system for simultaneously monitoring a content stream that is streamed using any of a plurality of streaming protocols from different points-of-presence (PoP) from within a distributed platform in real-time without the need for manual visual verification. The system is implemented with different emulation engines, each providing client-side player emulation for a different streaming protocol. The client-side player emulation involves requesting and downloading content stream chunks from a specified PoP according to the streaming protocol that is used by the distributed platform to stream the content stream under test. As part of the emulation, each instance inspects the downloaded chunks without decoding or rendering in order to track real-time performance and any errors in the server-side transmission of the content stream under test.
    Type: Grant
    Filed: December 3, 2014
    Date of Patent: January 9, 2018
    Assignee: Verizon Digital Media Services Inc.
    Inventors: Karthik Sathyanarayana, Jonathan DiVincenzo
  • Patent number: 9755945
    Abstract: Some embodiments dynamically test capacity of a streaming server under test (SUT). The dynamic testing involves a test server generating different test scenarios. Each test scenario specifies a mix of different streaming protocols, content streams, and content stream upload-to-download ratio. The test server tests the SUT with a gradually increasing traffic load from each test scenario while monitoring SUT performance under each load. The test server records each load from each test scenario under which the SUT becomes saturated. The test server produces a grid mapping the observed SUT saturation points to the test loads that caused them. The grid is used when the SUT is deployed to a production environment to determine if SUT saturation is imminent based on current traffic patterns being serviced by the SUT in the production environment. If so, a remedial action is dynamically performed to prevent the saturation from occurring.
    Type: Grant
    Filed: April 1, 2015
    Date of Patent: September 5, 2017
    Assignee: Verizon Digital Media Services Inc.
    Inventors: Karthik Sathyanarayana, Jonathan DiVincenzo, Jie Feng
  • Publication number: 20170054800
    Abstract: Some embodiments provide intelligent predictive stream caching for live, linear, or video-on-demand streaming content using prefetching, segmented caching, and request clustering. Prefetching involves retrieving streaming content segments from an origin server prior to the segments being requested by users. Prefetching live or linear streaming content segments involves continually reissuing requests to the origin until the segments are obtained or a preset retry duration is completed. Prefetching is initiated in response to a first request for a segment falling within a particular interval. Request clustering commences thereafter. Subsequent requests are queued until the segments are retrieved. Segmented caching involves caching segments for one particular interval. Segments falling within a next interval are not prefetched until a first request for one such segment in the next interval is received.
    Type: Application
    Filed: August 20, 2015
    Publication date: February 23, 2017
    Inventors: Jonathan DiVincenzo, Seungyeob Choi, Karthik Sathyanarayana, Robert J. Peters, Eric Dyoniziak
  • Publication number: 20160292058
    Abstract: Some embodiments dynamically test capacity of a streaming server under test (SUT). The dynamic testing involves a test server generating different test scenarios. Each test scenario specifies a mix of different streaming protocols, content streams, and content stream upload-to-download ratio. The test server tests the SUT with a gradually increasing traffic load from each test scenario while monitoring SUT performance under each load. The test server records each load from each test scenario under which the SUT becomes saturated. The test server produces a grid mapping the observed SUT saturation points to the test loads that caused them. The grid is used when the SUT is deployed to a production environment to determine if SUT saturation is imminent based on current traffic patterns being serviced by the SUT in the production environment. If so, a remedial action is dynamically performed to prevent the saturation from occurring.
    Type: Application
    Filed: April 1, 2015
    Publication date: October 6, 2016
    Inventors: Karthik Sathyanarayana, Jonathan DiVincenzo, Jie Feng
  • Publication number: 20160164761
    Abstract: Some embodiments provide a system for simultaneously monitoring a content stream that is streamed using any of a plurality of streaming protocols from different points-of-presence (PoP) from within a distributed platform in real-time without the need for manual visual verification. The system is implemented with different emulation engines, each providing client-side player emulation for a different streaming protocol. The client-side player emulation involves requesting and downloading content stream chunks from a specified PoP according to the streaming protocol that is used by the distributed platform to stream the content stream under test. As part of the emulation, each instance inspects the downloaded chunks without decoding or rendering in order to track real-time performance and any errors in the server-side transmission of the content stream under test.
    Type: Application
    Filed: December 3, 2014
    Publication date: June 9, 2016
    Inventors: Karthik Sathyanarayana, Jonathan DiVincenzo
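
The sketches below illustrate, in simplified form, the techniques summarized in the abstracts above. First, a minimal sketch of the hybrid pull and push based streaming of patent 11089124 (publication 20200028927). The demand threshold, the PoP structure, and the function names are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch of hybrid pull/push distribution (patent 11089124).
# DEMAND_THRESHOLD, PoP, and the function names are assumptions for illustration.
from dataclasses import dataclass, field


@dataclass
class PoP:
    name: str
    requests_per_minute: int          # observed demand for the stream at this PoP
    cache: dict = field(default_factory=dict)


DEMAND_THRESHOLD = 100  # assumed cutoff separating "low" from "high" demand


def distribute_segment(source_pop: PoP, edge_pops: list[PoP],
                       segment_id: str, segment_data: bytes) -> None:
    """Push newly uploaded live segments to high-demand PoPs; let low-demand
    PoPs pull them later on a cache miss."""
    source_pop.cache[segment_id] = segment_data
    for pop in edge_pops:
        if pop.requests_per_minute >= DEMAND_THRESHOLD:
            # Push-based distribution: prepopulate the edge cache before
            # any client asks for the segment.
            pop.cache[segment_id] = segment_data


def serve_request(pop: PoP, source_pop: PoP, segment_id: str) -> bytes:
    """Cache hit at the edge, otherwise pull from the source PoP (pull-based)."""
    if segment_id in pop.cache:
        return pop.cache[segment_id]
    segment = source_pop.cache[segment_id]   # pull on demand
    pop.cache[segment_id] = segment
    return segment
```

The trade illustrated here is that push-based prepopulation spends some speculative transfer to guarantee cache hits at PoPs where demand justifies it, while low-demand PoPs avoid receiving data nobody requests.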
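
Next, a minimal sketch of the staleness detection behind the server assisted live stream failover of patent 10972515 (publication 20190036986), assuming a fixed staleness quotient in seconds and an HTTP error status as the failover signal to the client; both are illustrative assumptions.

```python
# Hypothetical sketch of manifest staleness detection (patent 10972515).
# The 12-second quotient and the 503 failover signal are assumptions.
import time

STALENESS_QUOTIENT = 12.0  # assumed: seconds a live manifest may go unchanged


class StalenessMonitor:
    def __init__(self, quotient_seconds: float = STALENESS_QUOTIENT):
        self.quotient = quotient_seconds
        self.last_manifest = None
        self.last_change = time.monotonic()

    def response_for_client(self, manifest: str) -> tuple[int, str]:
        """Return the (status, body) the distributing server should send the client."""
        if manifest != self.last_manifest:
            self.last_manifest = manifest
            self.last_change = time.monotonic()
        elif time.monotonic() - self.last_change > self.quotient:
            # The live manifest stopped updating before the stream ended:
            # trigger failover by answering with an error status, which the
            # client treats as its cue to switch to the second source.
            return 503, ""
        return 200, manifest
```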
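
A minimal sketch of the database-free expiration tracking of patent 10904307 (publication 20180167434): first-tier directories named after expiration intervals, with one file per uploaded segment. The tracking root, the one-hour interval, and the delete_from_cluster callback are assumptions for illustration.

```python
# Hypothetical sketch of file-system-based segment expiration tracking
# (patent 10904307). TRACK_ROOT, INTERVAL, and delete_from_cluster are assumed.
import os
import shutil
import time

TRACK_ROOT = "/var/tmp/segment-tracking"   # assumed tracking root on the edge server
INTERVAL = 3600                            # assumed one-hour expiration buckets


def record_upload(segment_name: str, ttl_seconds: int) -> None:
    """Track a segment uploaded to the central storage cluster by creating an
    empty file under the first-tier directory for its expiration interval."""
    expires_at = int(time.time()) + ttl_seconds
    bucket = expires_at - (expires_at % INTERVAL)      # first-tier directory name
    bucket_dir = os.path.join(TRACK_ROOT, str(bucket))
    os.makedirs(bucket_dir, exist_ok=True)
    open(os.path.join(bucket_dir, segment_name), "w").close()


def purge_expired(delete_from_cluster) -> None:
    """Ask the central storage cluster to delete every segment in a bucket
    whose expiration interval has fully elapsed, then drop the bucket."""
    if not os.path.isdir(TRACK_ROOT):
        return
    now = int(time.time())
    for bucket in os.listdir(TRACK_ROOT):
        if bucket.isdigit() and int(bucket) + INTERVAL <= now:
            bucket_dir = os.path.join(TRACK_ROOT, bucket)
            for segment_name in os.listdir(bucket_dir):
                delete_from_cluster(segment_name)      # edge instructs the cluster
            shutil.rmtree(bucket_dir)
```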
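
A minimal sketch of the prefetching and request clustering of patents 9906590 and 10567493 (publications 20170054800 and 20180176297), assuming a threaded cache front end and a fetch_from_origin callback; per-interval segmented caching is omitted and the retry timing is simplified.

```python
# Hypothetical sketch of prefetching with request clustering (patents 9906590
# and 10567493). Threading model, retry timing, and fetch_from_origin are assumed.
import threading
import time


class PredictiveSegmentCache:
    def __init__(self, fetch_from_origin, retry_duration=10.0, retry_delay=0.5):
        self.fetch = fetch_from_origin
        self.retry_duration = retry_duration
        self.retry_delay = retry_delay
        self.cache = {}
        self.pending = {}          # segment_id -> threading.Event (request cluster)
        self.lock = threading.Lock()

    def get(self, segment_id: str) -> bytes:
        with self.lock:
            if segment_id in self.cache:
                return self.cache[segment_id]          # cache hit
            event = self.pending.get(segment_id)
            if event is None:
                # First request for this segment: start prefetching and
                # cluster all later requests behind this event.
                event = self.pending[segment_id] = threading.Event()
                threading.Thread(target=self._prefetch,
                                 args=(segment_id, event), daemon=True).start()
        event.wait()                                   # queued until retrieved
        return self.cache.get(segment_id, b"")

    def _prefetch(self, segment_id: str, event: threading.Event) -> None:
        """Keep reissuing the origin request until the segment arrives or the
        retry duration runs out (a live segment may not exist yet)."""
        deadline = time.monotonic() + self.retry_duration
        while time.monotonic() < deadline:
            data = self.fetch(segment_id)
            if data is not None:
                with self.lock:
                    self.cache[segment_id] = data
                break
            time.sleep(self.retry_delay)
        with self.lock:
            self.pending.pop(segment_id, None)
        event.set()
```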
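
A minimal sketch in the spirit of the monitoring system of patent 9866650 (publication 20160164761), showing a single HLS-style emulation engine; the patent covers multiple protocols and PoPs. The host, paths, and timing metric are illustrative assumptions, and the chunks are only inspected for size and errors, never decoded or rendered.

```python
# Hypothetical sketch of client-side player emulation for stream monitoring
# (patent 9866650). The HLS-only engine and urllib usage are assumptions.
import time
import urllib.request


def monitor_hls_stream(pop_host: str, manifest_path: str, max_chunks: int = 5):
    """Emulate a client: fetch the manifest and chunks from one PoP, record
    latency and errors, but never decode or render the media."""
    results = []
    base = f"https://{pop_host}"
    manifest = urllib.request.urlopen(base + manifest_path, timeout=10).read().decode()
    chunk_paths = [line.strip() for line in manifest.splitlines()
                   if line and not line.startswith("#")][:max_chunks]
    for path in chunk_paths:
        start = time.monotonic()
        try:
            with urllib.request.urlopen(f"{base}/{path}", timeout=10) as resp:
                size = len(resp.read())        # inspect bytes only, no decoding
            results.append({"chunk": path, "bytes": size,
                            "seconds": time.monotonic() - start, "error": None})
        except Exception as exc:               # record server-side delivery errors
            results.append({"chunk": path, "bytes": 0,
                            "seconds": time.monotonic() - start, "error": str(exc)})
    return results
```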
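
Finally, a minimal sketch of the scenario-to-saturation grid of patent 9755945 (publication 20160292058), assuming caller-supplied run_load and is_saturated callbacks and a linear load ramp; the scenario fields and the saturation signal are assumptions for illustration.

```python
# Hypothetical sketch of dynamic capacity testing with a saturation grid
# (patent 9755945). Scenario fields, ramp steps, and callbacks are assumed.
from dataclasses import dataclass


@dataclass(frozen=True)
class Scenario:
    protocol_mix: tuple            # e.g. (("hls", 0.7), ("dash", 0.3))
    stream_count: int
    upload_download_ratio: float


def build_saturation_grid(scenarios, run_load, is_saturated,
                          start_load=100, step=100, max_load=10_000):
    """Ramp each scenario's load until the SUT saturates; map scenario -> load."""
    grid = {}
    for scenario in scenarios:
        load = start_load
        while load <= max_load:
            metrics = run_load(scenario, load)     # drive the SUT at this load
            if is_saturated(metrics):
                grid[scenario] = load              # record the saturation point
                break
            load += step
    return grid


def saturation_imminent(grid, scenario, current_load, headroom=0.9) -> bool:
    """In production, compare live traffic against the recorded grid."""
    limit = grid.get(scenario)
    return limit is not None and current_load >= headroom * limit
```

In production, the recorded grid can then be consulted against current traffic so that a remedial action can be taken before the saturation point is reached.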