Patents by Inventor Erik Schwartz

Erik Schwartz has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20240104793
    Abstract: Techniques are disclosed to add augmented reality to a sub-view of a high resolution central video feed. In various embodiments, a central video feed is received from a first camera on a first recurring basis and time-stamped position information is received from a tracking system on a second recurring basis. The central video feed is calibrated against a spatial region encompassed by the central video feed. The received time-stamped position information and a determined plurality of tiles associated with at least one frame of the central video feed are used to define a first sub-view of the central video feed. The first sub-view and a homography defining placement of augmented reality elements on the at least one frame of the central video feed are provided as output to a device configured to use the first sub-view and the homography to display the first sub-view.
    Type: Application
    Filed: October 5, 2023
    Publication date: March 28, 2024
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol, Anne Gerhart
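The sub-view and augmented-reality placement described in the entry above hinge on a calibration homography that maps field-plane coordinates to pixels in the central feed. Below is a minimal sketch of that projection and of clamping a sub-view crop to the frame, assuming a known 3x3 homography H and NumPy; tile selection and AR rendering are outside its scope, and the matrix values are made up for illustration.

```python
import numpy as np

def project_points(H: np.ndarray, field_points: np.ndarray) -> np.ndarray:
    """Map N field-plane points (x, y) to pixel coordinates using a 3x3 homography H."""
    pts = np.hstack([field_points, np.ones((len(field_points), 1))])  # homogeneous coordinates
    projected = pts @ H.T
    return projected[:, :2] / projected[:, 2:3]                       # divide out the scale

def crop_sub_view(frame_shape, anchor_px, size=(1280, 720)):
    """Bounds (x0, y0, x1, y1) of a sub-view centered on an anchor pixel, clamped to the frame."""
    h, w = frame_shape[:2]
    cw, ch = size
    x0 = int(np.clip(anchor_px[0] - cw / 2, 0, w - cw))
    y0 = int(np.clip(anchor_px[1] - ch / 2, 0, h - ch))
    return x0, y0, x0 + cw, y0 + ch

if __name__ == "__main__":
    H = np.array([[12.0, 0.0, 100.0],      # hypothetical calibration, illustration only
                  [0.0, 12.0, 50.0],
                  [0.0, 0.001, 1.0]])
    anchor = project_points(H, np.array([[50.0, 26.6]]))[0]   # e.g. midfield, in field units
    print("AR anchor pixel:", anchor)
    print("Sub-view bounds:", crop_sub_view((4320, 7680), anchor))
```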
  • Publication number: 20240085513
    Abstract: A process to partition a video feed to segment live player activity includes receiving, on a first recurring basis, a transmission of a central video feed from a first camera. The central video feed is calibrated against a spatial region represented in at least two dimensions that is encompassed by the central video feed. The process includes receiving, on a second recurring basis, respective time-stamped position information from each tracking device in a plurality of tracking devices. Each tracking device is worn by a corresponding subject on the spatial region and transmits positional information that describes a time-stamped position of the corresponding subject in the spatial region. The process uses the received information and the calibration to define a first sub-view of the central video feed associated with a first subject. The first sub-view comprises a corresponding sub-frame associated with the first subject.
    Type: Application
    Filed: July 20, 2023
    Publication date: March 14, 2024
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
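The first and second recurring bases in the entry above imply that video frames and tracking samples arrive at different rates, so a subject's position has to be resolved at each frame time before its sub-frame can be defined. A minimal sketch of that time alignment follows, assuming sorted (timestamp, x, y) samples per tracking device and linear interpolation; calibration and cropping would then proceed as in the previous sketch.

```python
from bisect import bisect_left

def position_at(samples, t):
    """Linearly interpolate a subject's (x, y) at time t from time-stamped samples.

    samples: list of (timestamp, x, y) tuples sorted by timestamp.
    """
    times = [s[0] for s in samples]
    i = bisect_left(times, t)
    if i == 0:
        return samples[0][1:]
    if i >= len(samples):
        return samples[-1][1:]
    (t0, x0, y0), (t1, x1, y1) = samples[i - 1], samples[i]
    a = (t - t0) / (t1 - t0)
    return x0 + a * (x1 - x0), y0 + a * (y1 - y0)

if __name__ == "__main__":
    # Hypothetical 10 Hz tracking samples for one subject (seconds, x, y in field units).
    samples = [(0.0, 10.0, 5.0), (0.1, 10.4, 5.1), (0.2, 10.9, 5.3)]
    # Resolve the subject's position at a 30 fps frame timestamp between samples.
    print(position_at(samples, 0.15))   # -> roughly (10.65, 5.2)
```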
  • Patent number: 11918912
    Abstract: A process to track and provide positional information for display on a remote device includes receiving, on a first recurring basis, time-stamped position information of participant(s) in a competition. The time-stamped position information is captured by a telemetry tracking system during the competition and describes a time-stamped position of each of one or more corresponding participants in a predetermined spatial region. The process includes communicating at least a subset of time-stamped position information to the remote device on a second recurring basis. The remote device uses the time-stamped position information to overlay a representation of one or more of said participants onto a first virtual reproduction of at least a relevant portion of the predetermined spatial region to produce and display an instance of a compiled virtual scene that shows a respective position of each of the participant(s) on the first virtual reproduction of the predetermined spatial region.
    Type: Grant
    Filed: March 15, 2022
    Date of Patent: March 5, 2024
    Assignee: Infinite Athlete, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
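On the remote-device side described in the entry above, the received time-stamped positions only need to be scaled onto a virtual reproduction of the spatial region to compile a scene. A minimal sketch, assuming a flat 2D canvas as the virtual reproduction; the field and canvas dimensions are illustrative assumptions.

```python
def to_canvas(pos, field_size=(120.0, 53.3), canvas_size=(960, 426)):
    """Scale a field-plane position (x, y) to pixel coordinates on a virtual field canvas."""
    fx, fy = field_size
    cx, cy = canvas_size
    x, y = pos
    return round(x / fx * cx), round(y / fy * cy)

def compile_scene(positions_by_participant):
    """Build one instance of a compiled virtual scene: participant id -> canvas marker position."""
    return {pid: to_canvas(pos) for pid, pos in positions_by_participant.items()}

if __name__ == "__main__":
    # Hypothetical subset of time-stamped positions received on the second recurring basis.
    latest = {"player_12": (50.0, 26.6), "player_80": (75.0, 10.0)}
    print(compile_scene(latest))
```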
  • Publication number: 20240034878
    Abstract: A polycarbonate composition comprising: 35 to 98 wt % of a poly(carbonate-co-monoarylate ester) comprising aromatic carbonate units, monoaryl carbonate units, or a combination thereof and monoaryl ester units, and optionally aromatic ester units; to less than 50 wt % of a poly(ester) composition comprising greater than 20 to less than 50 wt % of poly(ethylene terephthalate), or 2 to less than 50 wt % of a poly(ester) different from poly(ethylene terephthalate), or a combination of 1-49 wt % of poly(ethylene terephthalate) and 1-49 wt % of a poly(ester) different from poly(ethylene terephthalate); 1 to 50 wt % of a homopolycarbonate, a poly(aliphatic ester-carbonate), or a combination thereof; optionally, 0.001 to 10 wt % of an additive composition; and optionally, 0.5 to 6 wt % of an organophosphorous flame retardant.
    Type: Application
    Filed: December 10, 2021
    Publication date: February 1, 2024
    Inventors: Erik Schwartz, Mark Adrianus Johannes van der Mee
  • Patent number: 11873375
    Abstract: A reinforced polycarbonate composition includes 30-60 wt % of a homopolycarbonate; 5-30 wt % of a poly(carbonate-siloxane); 10-40 wt % of a high heat polycarbonate having a glass transition temperature of 170° C. or higher determined per ASTM D3418 with a 20° C./min heating rate; 1-10 wt % of a phosphorus-containing flame retardant present in an amount effective to provide 0.1-1.5 wt % phosphorus; 0.01-0.5 wt % of an anti-drip agent; 5-30 wt % of a reinforcing fiber; and optionally, up to 10 wt % of an additive composition, wherein each amount is based on the total weight of the reinforced polycarbonate composition, which sums to 100 wt %. A molded sample of the polycarbonate composition has a heat deflection temperature greater than 115° C., preferably greater than 125° C., more preferably greater than 130° C., or a flame test rating of V1, preferably V0 as measured according to UL-94 at a thickness of 0.8 mm, or at a thickness of 0.6 mm, or at a thickness of 0.4 mm.
    Type: Grant
    Filed: June 25, 2020
    Date of Patent: January 16, 2024
    Assignee: SHPP GLOBAL TECHNOLOGIES B.V.
    Inventors: Erik Schwartz, Sascha Jan ter Horst, Mark Adrianus Johannes van der Mee, Johannes Martinus Dina Goossens, Robert Dirk van de Grampel, Tony Farrell
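The claimed ranges in the entry above reduce to a small set of arithmetic constraints: each component inside its wt % window and the formulation summing to 100 wt %. A minimal range-check sketch under those stated windows; the component labels are shorthand and the tolerance is an assumption.

```python
# Claimed wt % windows from the abstract above (min, max), used purely as an arithmetic check.
RANGES = {
    "homopolycarbonate": (30, 60),
    "poly(carbonate-siloxane)": (5, 30),
    "high heat polycarbonate": (10, 40),
    "flame retardant": (1, 10),
    "anti-drip agent": (0.01, 0.5),
    "reinforcing fiber": (5, 30),
    "additive composition": (0, 10),   # optional, up to 10 wt %
}

def within_claim(composition, tol=0.1):
    """Check each component sits in its claimed window and the formulation sums to ~100 wt %."""
    in_range = all(RANGES[k][0] <= v <= RANGES[k][1] for k, v in composition.items())
    return in_range and abs(sum(composition.values()) - 100.0) <= tol

if __name__ == "__main__":
    example = {
        "homopolycarbonate": 40.0, "poly(carbonate-siloxane)": 15.0,
        "high heat polycarbonate": 20.0, "flame retardant": 6.0,
        "anti-drip agent": 0.2, "reinforcing fiber": 15.0, "additive composition": 3.8,
    }
    print(within_claim(example))  # True: components are in range and total 100.0 wt %
```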
  • Patent number: 11813529
    Abstract: In an embodiment, a process to predict an outcome of a competition includes receiving time-stamped position information of participant(s), the time-stamped position information captured by a telemetry tracking system during the competition. The process includes calculating, while the competition is ongoing, a covariate parameter for each of one or more participants at a point in time, where each respective covariate parameter is derived from the time-stamped position information of a corresponding participant at the point in time. The process includes predicting the outcome of the competition, as of the point in time, based at least in part on (i) a difference between a calculated competitor strength of the first competitor and the second competitor, based on historical data associated with the competitors, and (ii) the calculated first covariate parameter(s).
    Type: Grant
    Filed: March 22, 2022
    Date of Patent: November 14, 2023
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
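The prediction step in the entry above combines (i) a competitor-strength difference computed from historical data with (ii) covariate parameters derived from live tracking positions. The abstract does not specify the functional form, so the sketch below assumes a simple logistic combination with made-up weights, purely for illustration.

```python
import math

def win_probability(strength_a, strength_b, covariates, weights, bias=0.0):
    """Illustrative logistic combination of a competitor-strength difference and
    live covariate parameters (e.g. derived from time-stamped positions)."""
    score = bias + (strength_a - strength_b) + sum(w * c for w, c in zip(weights, covariates))
    return 1.0 / (1.0 + math.exp(-score))

if __name__ == "__main__":
    # Hypothetical inputs: historical strengths plus two in-game covariates at time t.
    print(round(win_probability(1.8, 1.2, covariates=[0.4, -0.1], weights=[0.5, 0.3]), 3))
```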
  • Patent number: 11816760
    Abstract: Techniques are disclosed to add augmented reality to a sub-view of a high resolution central video feed. In various embodiments, a central video feed is received from a first camera on a first recurring basis and time-stamped position information is received from a tracking system on a second recurring basis. The central video feed is calibrated against a spatial region encompassed by the central video feed. The received time-stamped position information and a determined plurality of tiles associated with at least one frame of the central video feed are used to define a first sub-view of the central video feed. The first sub-view and a homography defining placement of augmented reality elements on the at least one frame of the central video feed are provided as output to a device configured to use the first sub-view and the homography to display the first sub-view.
    Type: Grant
    Filed: December 15, 2022
    Date of Patent: November 14, 2023
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol, Anne Gerhart
  • Publication number: 20230300400
    Abstract: Techniques are disclosed to synchronize content display across multiple devices, e.g., in a same physical location. In various embodiments, a content feed comprising a representation of an event is received. A content stream comprising or derived from the content feed and a synchronization signal that associates each of a plurality of portions of the content stream with a corresponding real-world time at the event depicted in the content stream are generated. The content stream and the synchronization signal are provided to a location via a communication interface. Each of a plurality of devices at the location is configured to use the synchronization signal to render the content stream in a manner that is synchronized with one or more other of the devices.
    Type: Application
    Filed: March 16, 2022
    Publication date: September 21, 2023
    Inventors: Michael Naquin, Erik Schwartz, Charles D. Ebersol, Anne Gerhart
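The synchronization signal in the entry above ties stream positions to real-world event times. A minimal sketch of how a receiving device might use such a mapping to delay rendering until a shared presentation deadline; the signal format and the fixed-latency policy are assumptions, not the claimed protocol.

```python
import time

def render_delay(sync_signal, stream_position, target_latency=5.0, now=None):
    """Given a sync signal {stream_position_seconds: event_wall_clock_seconds}, return how
    long this device should wait before rendering the frame at stream_position so that all
    devices at the location present the event a fixed target_latency behind real time."""
    now = time.time() if now is None else now
    event_time = sync_signal[stream_position]   # real-world time depicted at this position
    present_at = event_time + target_latency    # common presentation deadline
    return max(0.0, present_at - now)

if __name__ == "__main__":
    t0 = time.time()
    # Hypothetical signal: stream position 12.0 s depicts the event as it was 3 s ago.
    signal = {12.0: t0 - 3.0}
    print(round(render_delay(signal, 12.0, target_latency=5.0, now=t0), 2))  # -> 2.0
```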
  • Patent number: 11754662
    Abstract: A process to partition a video feed to segment live player activity includes receiving, on a first recurring basis, a transmission of a central video feed from a first camera. The central video feed is calibrated against a spatial region represented in at least two dimensions that is encompassed by the central video feed. The process includes receiving, on a second recurring basis, respective time-stamped position information from each tracking device in a plurality of tracking devices. Each tracking device is worn by a corresponding subject on the spatial region and transmits positional information that describes a time-stamped position of the corresponding subject in the spatial region. The process uses the received information and the calibration to define a first sub-view of the central video feed associated with a first subject. The first sub-view comprises a corresponding sub-frame associated with the first subject.
    Type: Grant
    Filed: September 3, 2021
    Date of Patent: September 12, 2023
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Publication number: 20230122102
    Abstract: Techniques are disclosed to add augmented reality to a sub-view of a high resolution central video feed. In various embodiments, a central video feed is received from a first camera on a first recurring basis and time-stamped position information is received from a tracking system on a second recurring basis. The central video feed is calibrated against a spatial region encompassed by the central video feed. The received time-stamped position information and a determined plurality of tiles associated with at least one frame of the central video feed are used to define a first sub-view of the central video feed. The first sub-view and a homography defining placement of augmented reality elements on the at least one frame of the central video feed are provided as output to a device configured to use the first sub-view and the homography to display the first sub-view.
    Type: Application
    Filed: December 15, 2022
    Publication date: April 20, 2023
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol, Anne Gerhart
  • Publication number: 20230085122
    Abstract: In an embodiment, a process to predict a probability of a future event occurring in a present competition includes receiving time-stamped position information of one or more participants in the present competition. The time-stamped position information is captured by a telemetry tracking system during the present competition. The process uses the time-stamped position information to determine a first play situation of the present competition. The process determines, based on at least the first play situation and playing data associated with at least a subset of one or both of a first set of one or more participants and a second set of one or more participants, a prediction of the probability of a first future event occurring at the present competition.
    Type: Application
    Filed: November 17, 2022
    Publication date: March 16, 2023
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
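A minimal sketch of the final step in the entry above: once a play situation has been determined from the time-stamped positions, estimate the probability of a future event from historical playing data for comparable situations. The situation key and the smoothed empirical-frequency estimator are illustrative assumptions.

```python
from collections import defaultdict

class EventModel:
    """Empirical frequency of a future event, keyed by a discrete play situation."""

    def __init__(self):
        self.counts = defaultdict(lambda: [0, 0])   # situation -> [event occurred, total plays]

    def observe(self, situation, event_occurred):
        occurred, total = self.counts[situation]
        self.counts[situation] = [occurred + int(event_occurred), total + 1]

    def probability(self, situation, prior=0.5, prior_weight=2):
        occurred, total = self.counts[situation]
        # Smoothed estimate so unseen situations fall back toward the prior.
        return (occurred + prior * prior_weight) / (total + prior_weight)

if __name__ == "__main__":
    model = EventModel()
    # Hypothetical historical playing data for one situation bucket.
    for outcome in [True, False, True, True]:
        model.observe(("3rd_and_short", "own_half"), outcome)
    print(round(model.probability(("3rd_and_short", "own_half")), 3))  # -> 0.667
```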
  • Patent number: 11587266
    Abstract: Techniques are disclosed to add augmented reality to a sub-view of a high resolution central video feed. In various embodiments, a central video feed is received from a first camera on a first recurring basis and time-stamped position information is received from a tracking system on a second recurring basis. The central video feed is calibrated against a spatial region encompassed by the central video feed. The received time-stamped position information and a determined plurality of tiles associated with at least one frame of the central video feed are used to define a first sub-view of the central video feed. The first sub-view and a homography defining placement of augmented reality elements on the at least one frame of the central video feed are provided as output to a device configured to use the first sub-view and the homography to display the first sub-view.
    Type: Grant
    Filed: July 21, 2021
    Date of Patent: February 21, 2023
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol, Anne Gerhart
  • Publication number: 20230040708
    Abstract: According to some aspects, methods and systems may include receiving, by a computing device, metadata identifying an event occurring in a video program, and determining an expected motion of objects in the identified event. The methods and systems may further include analyzing motion energy in the video program to identify video frames in which the event occurs, and storing information identifying the video frames in which the event occurs.
    Type: Application
    Filed: August 26, 2022
    Publication date: February 9, 2023
    Inventors: Erik Schwartz, Jan Neumann, Hans Sayyadi, Stefan Deichmann
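A minimal sketch of the motion-energy analysis described in the entry above, assuming grayscale frames as NumPy arrays: motion energy is taken as the mean absolute difference between consecutive frames, and frames whose energy matches the expected motion of the identified event are flagged. The matching rule and frame representation are assumptions.

```python
import numpy as np

def motion_energy(frames):
    """Per-transition motion energy: mean absolute difference between consecutive frames."""
    frames = np.asarray(frames, dtype=np.float32)
    return np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))

def frames_matching_event(frames, expected_energy, tolerance=0.5):
    """Indices of frames whose motion energy is within tolerance of the event's expected motion."""
    energy = motion_energy(frames)
    return [i + 1 for i, e in enumerate(energy) if abs(e - expected_energy) <= tolerance]

if __name__ == "__main__":
    # Hypothetical 4-frame clip: a burst of motion between frames 1 and 2.
    rng = np.random.default_rng(0)
    still = rng.integers(0, 10, (32, 32))
    frames = [still, still, still + 40, still + 40]   # large jump = high motion energy
    print(frames_matching_event(frames, expected_energy=40.0, tolerance=1.0))  # -> [2]
```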
  • Patent number: 11568713
    Abstract: In an embodiment, a process to predict a probability of a future event occurring in a present competition includes receiving time-stamped position information of one or more participants in the present competition. The time-stamped position information is captured by a telemetry tracking system during the present competition. The process uses the time-stamped position information to determine a first play situation of the present competition. The process determines, based on at least the first play situation and playing data associated with at least a subset of one or both of a first set of one or more participants and a second set of one or more participants, a prediction of the probability of a first future event occurring at the present competition.
    Type: Grant
    Filed: January 20, 2020
    Date of Patent: January 31, 2023
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Publication number: 20230026625
    Abstract: Techniques are disclosed to add augmented reality to a sub-view of a high resolution central video feed. In various embodiments, a central video feed is received from a first camera on a first recurring basis and time-stamped position information is received from a tracking system on a second recurring basis. The central video feed is calibrated against a spatial region encompassed by the central video feed. The received time-stamped position information and a determined plurality of tiles associated with at least one frame of the central video feed are used to define a first sub-view of the central video feed. The first sub-view and a homography defining placement of augmented reality elements on the at least one frame of the central video feed are provided as output to a device configured to use the first sub-view and the homography to display the first sub-view.
    Type: Application
    Filed: July 21, 2021
    Publication date: January 26, 2023
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol, Anne Gerhart
  • Publication number: 20220337913
    Abstract: Various implementations described herein are directed to determining variations or changes to a predetermined programming schedule. In accordance with one method, a predicted end time of video content may be determined to be later than a scheduled end time of the video content. Also, it may be determined that video content scheduled to be displayed on a first stream or channel has been moved to a second stream or channel. A scheduled recording or transmission time of video content may be altered based on detected changes to the predetermined programming schedule. A program listing such as an electronic program guide may be revised based on detected changes to the predetermined programming schedule.
    Type: Application
    Filed: November 24, 2021
    Publication date: October 20, 2022
    Inventors: Brian Curtis, Erik Schwartz, Stefan Deichmann, Jan Neumann
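A minimal sketch of the recording-adjustment step described in the entry above: when the predicted end time of live content runs past its scheduled end time, the scheduled recording is extended. The data model (plain dicts with epoch seconds) and the padding value are assumptions.

```python
def adjust_recording(recording, predicted_end, padding=60):
    """Extend a scheduled recording when live content is predicted to overrun its slot.

    recording: {"start": epoch_seconds, "end": epoch_seconds, "channel": str}
    """
    if predicted_end > recording["end"]:
        recording = {**recording, "end": predicted_end + padding}
    return recording

if __name__ == "__main__":
    scheduled = {"start": 1_700_000_000, "end": 1_700_010_800, "channel": "7.1"}
    # Hypothetical prediction: the content runs 25 minutes past its scheduled end.
    print(adjust_recording(scheduled, predicted_end=1_700_012_300))
```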
  • Patent number: 11461904
    Abstract: According to some aspects, methods and systems may include receiving, by a computing device, metadata identifying an event occurring in a video program, and determining an expected motion of objects in the identified event. The methods and systems may further include analyzing motion energy in the video program to identify video frames in which the event occurs, and storing information identifying the video frames in which the event occurs.
    Type: Grant
    Filed: April 10, 2020
    Date of Patent: October 4, 2022
    Assignee: Comcast Cable Communications, LLC
    Inventors: Erik Schwartz, Jan Neumann, Hans Sayyadi, Stefan Deichmann
  • Publication number: 20220274022
    Abstract: In an embodiment, a process to predict an outcome of a competition includes receiving time-stamped position information of participant(s), the time-stamped position information captured by a telemetry tracking system during the competition. The process includes calculating, while the competition is ongoing, a covariate parameter for each of one or more participants at a point in time, where each respective covariate parameter is derived from the time-stamped position information of a corresponding participant at the point in time. The process includes predicting the outcome of the competition, as of the point in time, based at least in part on (i) a difference between a calculated competitor strength of the first competitor and the second competitor, based on historical data associated with the competitors, and (ii) the calculated first covariate parameter(s).
    Type: Application
    Filed: March 22, 2022
    Publication date: September 1, 2022
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Publication number: 20220203241
    Abstract: A process to track and provide positional information for display on a remote device includes receiving, on a first recurring basis, time-stamped position information of participant(s) in a competition. The time-stamped position information is captured by a telemetry tracking system during the competition and describes a time-stamped position of each of one or more corresponding participants in a predetermined spatial region. The process includes communicating at least a subset of time-stamped position information to the remote device on a second recurring basis. The remote device uses the time-stamped position information to overlay a representation of one or more of said participants onto a first virtual reproduction of at least a relevant portion of the predetermined spatial region to produce and display an instance of a compiled virtual scene that shows a respective position of each of the participant(s) on the first virtual reproduction of the predetermined spatial region.
    Type: Application
    Filed: March 15, 2022
    Publication date: June 30, 2022
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Patent number: 11311808
    Abstract: In an embodiment, a process to predict an outcome of a competition includes receiving time-stamped position information of participant(s), the time-stamped position information captured by a telemetry tracking system during the competition. The process includes calculating, while the competition is ongoing, a covariate parameter for each of one or more participants at a point in time, where each respective covariate parameter is derived from the time-stamped position information of a corresponding participant at the point in time. The process includes predicting the outcome of the competition, as of the point in time, based at least in part on (i) a difference between a calculated competitor strength of the first competitor and the second competitor, based on historical data associated with the competitors, and (ii) the calculated first covariate parameter(s).
    Type: Grant
    Filed: January 20, 2020
    Date of Patent: April 26, 2022
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol