Patents by Inventor Charles D. Ebersol

Charles D. Ebersol has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20200267441
    Abstract: An embodiment of a process for providing a customized composite video feed at a client device includes receiving a background video feed from a remote server, receiving (via a communications interface) content associated with one or more user-specific characteristics, and determining one or more data elements based at least in part on the received content. The process includes generating a composite video feed customized to the one or more user-specific characteristics, including by matching at least a portion of the one or more data elements to corresponding portions of the background video feed, and displaying the composite video feed on a display device of the client device. (A hypothetical code sketch of this process appears after this listing.)
    Type: Application
    Filed: March 13, 2020
    Publication date: August 20, 2020
    Inventors: Erik Schwartz, Michael Naquin, Grygorii Shcherbiak, Kristopher Hanes, Charles D. Ebersol
  • Publication number: 20200236288
    Abstract: A process to partition a video feed to segment live player activity includes receiving, on a first recurring basis, a transmission of a central video feed from a first camera. The central video feed is calibrated against a spatial region, represented in at least two dimensions, that is encompassed by the central video feed. The process includes receiving, on a second recurring basis, respective time-stamped position information from each tracking device in a plurality of tracking devices. Each tracking device is worn by a corresponding subject on the spatial region and transmits positional information that describes a time-stamped position of the corresponding subject in the spatial region. The process uses the received information and the calibration to define a first sub-view of the central video feed associated with a first subject. The first sub-view comprises a corresponding sub-frame associated with the first subject. (A hypothetical code sketch of this process appears after this listing.)
    Type: Application
    Filed: January 20, 2020
    Publication date: July 23, 2020
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Publication number: 20200230502
    Abstract: A process to track and provide positional information for display on a remote device includes receiving, on a first recurring basis, time-stamped position information of participant(s) in a competition. The time-stamped position information is captured by a telemetry tracking system during the competition and describes a time-stamped position of each of one or more corresponding participants in a predetermined spatial region. The process includes communicating at least a subset of the time-stamped position information to the remote device on a second recurring basis. The remote device uses the time-stamped position information to overlay a representation of one or more of the participants onto a first virtual reproduction of at least a relevant portion of the predetermined spatial region, producing and displaying an instance of a compiled virtual scene that shows the respective position of each participant on the first virtual reproduction of the predetermined spatial region. (A hypothetical code sketch of this process appears after this listing.)
    Type: Application
    Filed: January 20, 2020
    Publication date: July 23, 2020
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Publication number: 20200230501
    Abstract: In an embodiment, a process to predict an outcome of a competition includes receiving time-stamped position information of participant(s), the time-stamped position information captured by a telemetry tracking system during the competition. The process includes calculating, while the competition is ongoing, a covariate parameter for each of one or more participants at a point in time, where each respective covariate parameter is derived from the time-stamped position information of a corresponding participant at the point in time. The process includes predicting the outcome of the competition, as of the point in time, based at least in part on (i) a difference between the calculated competitor strengths of a first competitor and a second competitor, derived from historical data associated with the competitors, and (ii) the calculated covariate parameter(s). (A hypothetical code sketch of this process appears after this listing.)
    Type: Application
    Filed: January 20, 2020
    Publication date: July 23, 2020
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Publication number: 20200234543
    Abstract: In an embodiment, a process to predict a probability of a future event occurring in a present competition includes receiving time-stamped position information of one or more participants in the present competition. The time-stamped position information is captured by a telemetry tracking system during the present competition. The process uses the time-stamped position information to determine a first play situation of the present competition. Based on at least the first play situation and on playing data associated with at least a subset of one or both of a first set of one or more participants and a second set of one or more participants, the process determines a prediction of the probability of a first future event occurring at the present competition. (A hypothetical code sketch of this process appears after this listing.)
    Type: Application
    Filed: January 20, 2020
    Publication date: July 23, 2020
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
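
The sketches below are minimal Python illustrations of the processes summarized in the abstracts above. They are not taken from the applications themselves; every module, class, function, and parameter name is a hypothetical stand-in, and video and telemetry handling is reduced to plain Python data structures.

For publication 20200267441, one plausible shape of the client-side process is to keep named overlay slots in each background frame, filter the received content down to the user-specific characteristics, and fill the matching slots. BackgroundFrame, determine_data_elements, and compose_feed are assumed names, not the application's API.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class BackgroundFrame:
    """One frame of the background video feed, with named overlay slots (assumption)."""
    timestamp_ms: int
    slots: Dict[str, str]


def determine_data_elements(content: Dict[str, str],
                            characteristics: List[str]) -> Dict[str, str]:
    """Keep only the received content that matches the user-specific characteristics."""
    return {key: value for key, value in content.items() if key in characteristics}


def compose_feed(background: List[BackgroundFrame],
                 data_elements: Dict[str, str]) -> List[BackgroundFrame]:
    """Match data elements to the corresponding slots of each background frame."""
    composite = []
    for frame in background:
        filled = dict(frame.slots)
        for slot in filled:
            if slot in data_elements:
                filled[slot] = data_elements[slot]
        composite.append(BackgroundFrame(frame.timestamp_ms, filled))
    return composite


if __name__ == "__main__":
    background = [BackgroundFrame(0, {"score_bug": "", "favorite_player": ""})]
    content = {"favorite_player": "Player 12 highlight", "language": "en"}
    feed = compose_feed(background, determine_data_elements(content, ["favorite_player"]))
    for frame in feed:  # stand-in for rendering on the client's display device
        print(frame.timestamp_ms, frame.slots)
```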
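
For publication 20200236288, a calibration that maps field coordinates to pixel coordinates in the central feed, followed by a crop window around a tracked subject, is one simple reading of the sub-view step. The linear Calibration model and the sub_view helper below are assumptions; a production system would use a full camera model and operate on actual frames.

```python
from dataclasses import dataclass
from typing import Tuple


@dataclass
class Calibration:
    """Assumed linear mapping from field coordinates (meters) to central-feed pixels."""
    px_per_m_x: float
    px_per_m_y: float
    origin_px: Tuple[float, float]

    def to_pixels(self, x_m: float, y_m: float) -> Tuple[float, float]:
        ox, oy = self.origin_px
        return ox + x_m * self.px_per_m_x, oy + y_m * self.px_per_m_y


def sub_view(cal: Calibration, position_m: Tuple[float, float],
             frame_size: Tuple[int, int],
             window: Tuple[int, int] = (640, 360)) -> Tuple[int, int, int, int]:
    """Return a (left, top, right, bottom) sub-frame of the central feed
    centered on one tracked subject, clamped to the frame bounds."""
    cx, cy = cal.to_pixels(*position_m)
    w, h = window
    fw, fh = frame_size
    left = min(max(cx - w / 2, 0), fw - w)
    top = min(max(cy - h / 2, 0), fh - h)
    return int(left), int(top), int(left) + w, int(top) + h


if __name__ == "__main__":
    cal = Calibration(px_per_m_x=35.0, px_per_m_y=35.0, origin_px=(100.0, 80.0))
    # Time-stamped position reported by the tracking device worn by subject 17.
    tracked = {"subject_id": 17, "t_ms": 120_500, "x_m": 42.0, "y_m": 18.5}
    print(sub_view(cal, (tracked["x_m"], tracked["y_m"]), frame_size=(3840, 2160)))
```

Clamping the crop to the frame bounds is one simple way to keep the per-subject sub-view a valid region of the central feed as the subject moves toward the edges of the spatial region.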
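
For publication 20200230502, the remote device's virtual reproduction is reduced here to an ASCII grid onto which each participant's latest time-stamped position is projected. render_virtual_scene and the field and grid dimensions are assumptions chosen only to keep the sketch self-contained.

```python
from typing import Dict, List


def render_virtual_scene(positions: List[Dict],
                         field_m=(100.0, 50.0), grid=(50, 20)) -> str:
    """Overlay each participant's time-stamped position onto a coarse
    virtual reproduction of the predetermined spatial region."""
    cols, rows = grid
    field_w, field_h = field_m
    scene = [["." for _ in range(cols)] for _ in range(rows)]
    for p in positions:
        col = min(int(p["x_m"] / field_w * cols), cols - 1)
        row = min(int(p["y_m"] / field_h * rows), rows - 1)
        scene[row][col] = str(p["participant_id"])[-1]  # last digit as a marker
    return "\n".join("".join(row) for row in scene)


if __name__ == "__main__":
    # Subset of time-stamped position information pushed to the remote device.
    snapshot = [
        {"participant_id": 7, "t_ms": 65_000, "x_m": 12.0, "y_m": 30.0},
        {"participant_id": 23, "t_ms": 65_000, "x_m": 55.5, "y_m": 10.0},
    ]
    print(render_virtual_scene(snapshot))
```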
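
For publication 20200230501, the sketch below combines a strength difference derived from historical data with a live covariate computed from tracked positions, using a logistic link. The covariate choice (mean field progress), the weights, and the functional form are all assumptions, not the claimed prediction model.

```python
import math
from typing import Dict, List


def covariate_from_positions(positions: List[Dict]) -> float:
    """Example covariate: mean field progress (x position, meters) of one
    competitor's tracked participants at the current point in time."""
    return sum(p["x_m"] for p in positions) / len(positions)


def win_probability(strength_a: float, strength_b: float, covariate_a: float,
                    w_strength: float = 0.8, w_covariate: float = 0.02) -> float:
    """Combine the historical strength difference with the live covariate
    through a logistic link (an assumed functional form)."""
    score = w_strength * (strength_a - strength_b) + w_covariate * covariate_a
    return 1.0 / (1.0 + math.exp(-score))


if __name__ == "__main__":
    positions_a = [{"participant_id": 1, "t_ms": 1_800_000, "x_m": 60.0},
                   {"participant_id": 2, "t_ms": 1_800_000, "x_m": 48.0}]
    p = win_probability(strength_a=1.4, strength_b=1.1,
                        covariate_a=covariate_from_positions(positions_a))
    print(f"P(competitor A wins) ~ {p:.2f}")
```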
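
For publication 20200234543, the sketch below derives a coarse play situation from tracked position data and estimates the probability of a future event as its historical frequency in comparable situations. The situation encoding (thirds of the field) and the frequency-table estimate stand in for whatever model the application actually claims.

```python
from typing import Dict, List


def play_situation(ball_position: Dict) -> str:
    """Derive a coarse play situation from tracked position data:
    which third of the field the tracked ball (or ball carrier) is in."""
    x = ball_position["x_m"]
    return "own_third" if x < 33 else "middle_third" if x < 66 else "final_third"


def event_probability(situation: str, playing_data: List[Dict],
                      event: str = "scoring_play") -> float:
    """Estimate P(event) as its historical frequency in comparable situations."""
    comparable = [d for d in playing_data if d["situation"] == situation]
    if not comparable:
        return 0.0
    return sum(1 for d in comparable if d["event"] == event) / len(comparable)


if __name__ == "__main__":
    history = [
        {"situation": "final_third", "event": "scoring_play"},
        {"situation": "final_third", "event": "turnover"},
        {"situation": "middle_third", "event": "scoring_play"},
    ]
    situation = play_situation({"t_ms": 905_000, "x_m": 78.0})
    print(situation, event_probability(situation, history))
```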