Patents by Inventor Robert A'Court

Robert A'Court has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20170155733
    Abstract: A method for caching a request-response transaction is disclosed. In an embodiment, the method involves identifying a predicted request corresponding to a request that a client portion of an application is likely to make of a server portion of the application over a network, assigning a request category to the predicted request based on the extent to which the predicted request may affect the state of the application if the prediction underlying it is in error, assigning an exception to the predicted request based on its request category, modifying the predicted request based on the exception, and using the modified predicted request for predictive caching.
    Type: Application
    Filed: November 28, 2016
    Publication date: June 1, 2017
    Applicant: DATA ACCELERATOR LTD
    Inventors: Matthew P. Clothier, Robert A'Court
  • Publication number: 20170155735
    Abstract: In an embodiment, a method for creating a cache by predicting database requests by an application and storing responses to the database requests is disclosed. The method involves identifying a networked application having a client portion and a server portion coupled to the client portion over a network characterized by a first latency, identifying a database used to store activity related to the networked application, predicting requests the networked application is likely to make using the database, predicting responses to the requests, creating a cache having the requests and/or the responses stored therein, and providing the cache to a predictive cache engine coupled to the client portion of the networked application by a computer-readable medium that has a second latency less than the first latency.
    Type: Application
    Filed: November 28, 2016
    Publication date: June 1, 2017
    Applicant: DATA ACCELERATOR LTD
    Inventors: Matthew P. Clothier, Robert A'Court
  • Publication number: 20170154265
    Abstract: A method for creating a cache by predicting database requests by an application and storing responses to the database requests is disclosed. In an embodiment, the method involves identifying a networked application having a client portion and a server portion coupled to the client portion over a network characterized by a first latency, identifying a database used to store activity related to the networked application, identifying a request-response context of the networked application, using the request-response context to predict requests the networked application is likely to make using the database, using the request-response context to predict responses to the requests, creating a cache having the requests and/or the responses stored therein, and providing the cache to a predictive cache engine coupled to the client portion of the networked application by a computer-readable medium that has a second latency less than the first latency.
    Type: Application
    Filed: November 28, 2016
    Publication date: June 1, 2017
    Applicant: DATA ACCELERATOR LTD
    Inventors: Matthew P. Clothier, Robert A'Court
  • Publication number: 20150271009
    Abstract: A technique involves placing a data acceleration engine between an end user device and a host device. The host device provides data associated with a client application to the data acceleration engine, which provides the data to the end user device. If the data acceleration engine is on the host device, content from a datastore is served to the data acceleration engine as if the data acceleration engine were a client running the client application locally; therefore, latency normally associated with a network between the content datastore and the client device is eliminated. If the data acceleration engine is on the end user device and has received at least some data in advance of a relevant query, responses to the query also do not have latency associated with a network. The data acceleration engine can be implemented as a series of data acceleration engines between end user and host devices.
    Type: Application
    Filed: August 16, 2013
    Publication date: September 24, 2015
    Applicant: DATA ACCELERATOR LTD.
    Inventors: Matthew P. Clothier, Sean P. Corbett, Edward Philip Edwin Elliott, Martin Kirkby, Robert A'Court, Andrew McNeil
  • Publication number: 20140052848
    Abstract: A technique involves placing a data acceleration engine between an end user device and a host device. The host device provides data associated with a client application to the data acceleration engine, which provides the data to the end user device. If the data acceleration engine is on the host device, content from a datastore is served to the data acceleration engine as if the data acceleration engine were a client running the client application locally; therefore, latency normally associated with a network between the content datastore and the client device is eliminated. If the data acceleration engine is on the end user device and has received at least some data in advance of a relevant query, responses to the query also do not have latency associated with a network. The data acceleration engine can be implemented as a series of data acceleration engines between end user and host devices.
    Type: Application
    Filed: May 16, 2013
    Publication date: February 20, 2014
    Applicant: DATA ACCELERATOR LTD.
    Inventors: Matthew Clothier, Sean Corbett, Edward Elliott, Martin Kirkby, Robert A'Court, Andrew McNeil
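The predictive caching abstracts above (publications 20170155733, 20170155735, and 20170154265) can be illustrated with a minimal sketch. This is not code from the patents: the function names, the SELECT-based safety test, and the `fake_db` callback are all illustrative assumptions. It shows only the general idea of categorizing a predicted request by the harm a wrong guess could do, applying an exception to unsafe predictions, and pre-executing the survivors to warm a cache.

```python
# Hypothetical sketch of risk-categorized predictive caching.
# All names and rules here are illustrative, not from the patent text.

SAFE = "safe"      # a mispredicted request cannot change application state
UNSAFE = "unsafe"  # a mispredicted request could mutate state on the server

def categorize(request: str) -> str:
    """Assign a request category by the risk of executing a wrong guess."""
    # A SELECT only reads; an INSERT/UPDATE/DELETE would mutate state.
    return SAFE if request.lstrip().upper().startswith("SELECT") else UNSAFE

def apply_exception(request: str):
    """Modify a predicted request based on its category.

    Safe requests pass through unchanged; unsafe ones are dropped from
    the prediction set so a wrong guess can never corrupt server state.
    """
    return request if categorize(request) == SAFE else None

def build_predictive_cache(predicted_requests, execute):
    """Pre-execute the surviving predictions and store their responses."""
    cache = {}
    for req in predicted_requests:
        modified = apply_exception(req)
        if modified is not None:
            cache[modified] = execute(modified)
    return cache

# Example: only the read-only prediction is pre-fetched into the cache.
fake_db = lambda q: f"rows for: {q}"
cache = build_predictive_cache(
    ["SELECT * FROM orders", "DELETE FROM orders"], fake_db)
```

A real engine would use a richer exception policy than "drop all writes" (for example, rewriting a prediction into a side-effect-free form), but the categorize/except/modify pipeline is the same shape.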
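The data acceleration engine abstracts (publications 20150271009 and 20140052848) describe an engine placed between the end user device and the host device, so that data received in advance of a relevant query is served without network latency. The following sketch simulates that placement under stated assumptions: the class names, the 50 ms simulated latency, and the prefetch API are invented for illustration and do not come from the patents.

```python
# Hypothetical sketch of a data acceleration engine between an end user
# device and a host device. Latency figures and names are illustrative.
import time

NETWORK_LATENCY = 0.05  # 50 ms simulated round trip to the host device

class HostDevice:
    """Stands in for the server-side content datastore."""
    def query(self, q):
        time.sleep(NETWORK_LATENCY)  # simulated network hop
        return f"result:{q}"

class DataAccelerationEngine:
    """Serves pre-fetched responses locally; forwards everything else."""
    def __init__(self, host):
        self.host = host
        self.prefetched = {}

    def prefetch(self, q):
        # Data received in advance of a relevant query.
        self.prefetched[q] = self.host.query(q)

    def query(self, q):
        if q in self.prefetched:      # hit: no network latency
            return self.prefetched[q]
        return self.host.query(q)     # miss: pay the network cost

engine = DataAccelerationEngine(HostDevice())
engine.prefetch("q1")

t0 = time.perf_counter(); hit = engine.query("q1")
hit_time = time.perf_counter() - t0
t0 = time.perf_counter(); miss = engine.query("q2")
miss_time = time.perf_counter() - t0
```

The same class could be chained, as the abstracts note, with one engine's `host` being another `DataAccelerationEngine` closer to the datastore.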