Patents by Inventor Pawel CZARNECKI

Pawel CZARNECKI has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO). Brief, illustrative Python sketches of the core techniques described in these abstracts appear after the listing.

  • Publication number: 20240104793
    Abstract: Techniques are disclosed to add augmented reality to a sub-view of a high-resolution central video feed. In various embodiments, a central video feed is received from a first camera on a first recurring basis and time-stamped position information is received from a tracking system on a second recurring basis. The central video feed is calibrated against a spatial region encompassed by the central video feed. The received time-stamped position information and a determined plurality of tiles associated with at least one frame of the central video feed are used to define a first sub-view of the central video feed. The first sub-view and a homography defining placement of augmented reality elements on the at least one frame of the central video feed are provided as output to a device configured to use the first sub-view and the homography to display the first sub-view.
    Type: Application
    Filed: October 5, 2023
    Publication date: March 28, 2024
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol, Anne Gerhart
  • Publication number: 20240085513
    Abstract: A process to partition a video feed to segment live player activity includes receiving, on a first recurring basis, a transmission of a central video feed from a first camera. The central video feed is calibrated against a spatial region represented in at least two dimensions that is encompassed by the central video feed. The process includes receiving, on a second recurring basis, respective time-stamped position information from each tracking device in a plurality of tracking devices. Each tracking device is worn by a corresponding subject on the spatial region and transmits positional information that describes a time-stamped position of the corresponding subject in the spatial region. The process uses the received information and the calibration to define a first sub-view of the central video feed associated with a first subject. The first sub-view comprises a corresponding sub-frame associated with the first subject.
    Type: Application
    Filed: July 20, 2023
    Publication date: March 14, 2024
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Patent number: 11918912
    Abstract: A process to track and provide positional information for display on a remote device includes receiving, on a first recurring basis, time-stamped position information of participant(s) in a competition. The time-stamped position information is captured by a telemetry tracking system during the competition and describes a time-stamped position of each of one or more corresponding participants in a predetermined spatial region. The process includes communicating at least a subset of time-stamped position information to the remote device on a second recurring basis. The remote device uses the time-stamped position information to overlay a representation of one or more of said participants onto a first virtual reproduction of at least a relevant portion of the predetermined spatial region to produce and display an instance of a compiled virtual scene that shows a respective position of each of the participant(s) on the first virtual reproduction of the predetermined spatial region.
    Type: Grant
    Filed: March 15, 2022
    Date of Patent: March 5, 2024
    Assignee: Infinite Athlete, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Patent number: 11813529
    Abstract: In an embodiment, a process to predict an outcome of a competition includes receiving time-stamped position information of participant(s), the time-stamped position information captured by a telemetry tracking system during the competition. The process includes calculating, while the competition is ongoing, a covariate parameter for each of one or more participants at a point in time, where each respective covariate parameter is derived from the time-stamped position information of a corresponding participant at the point in time. The process includes predicting the outcome of the competition, as of the point in time, based at least in part on (i) a difference between the calculated competitor strengths of the first competitor and the second competitor, based on historical data associated with the competitors, and (ii) the calculated first covariate parameter(s).
    Type: Grant
    Filed: March 22, 2022
    Date of Patent: November 14, 2023
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Patent number: 11816760
    Abstract: Techniques are disclosed to add augmented reality to a sub-view of a high-resolution central video feed. In various embodiments, a central video feed is received from a first camera on a first recurring basis and time-stamped position information is received from a tracking system on a second recurring basis. The central video feed is calibrated against a spatial region encompassed by the central video feed. The received time-stamped position information and a determined plurality of tiles associated with at least one frame of the central video feed are used to define a first sub-view of the central video feed. The first sub-view and a homography defining placement of augmented reality elements on the at least one frame of the central video feed are provided as output to a device configured to use the first sub-view and the homography to display the first sub-view.
    Type: Grant
    Filed: December 15, 2022
    Date of Patent: November 14, 2023
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol, Anne Gerhart
  • Patent number: 11754662
    Abstract: A process to partition a video feed to segment live player activity includes receiving, on a first recurring basis, a transmission of a central video feed from a first camera. The central video feed is calibrated against a spatial region represented in at least two dimensions that is encompassed by the central video feed. The process includes receiving, on a second recurring basis, respective time-stamped position information from each tracking device in a plurality of tracking devices. Each tracking device is worn by a corresponding subject on the spatial region and transmits positional information that describes a time-stamped position of the corresponding subject in the spatial region. The process uses the received information and the calibration to define a first sub-view of the central video feed associated with a first subject. The first sub-view comprises a corresponding sub-frame associated with the first subject.
    Type: Grant
    Filed: September 3, 2021
    Date of Patent: September 12, 2023
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Publication number: 20230127804
    Abstract: A method of fabricating a near net shape component includes forming a sacrificial shell from a pulverant material using an additive manufacturing process, the shell having an aperture. The method further includes filling the shell with a second pulverant material, subjecting the filled shell to a consolidation process, and removing the shell from the consolidated second pulverant material.
    Type: Application
    Filed: October 13, 2022
    Publication date: April 27, 2023
    Inventors: Sergey Mironets, Pawel Czarnecki
  • Publication number: 20230122102
    Abstract: Techniques are disclosed to add augmented reality to a sub-view of a high-resolution central video feed. In various embodiments, a central video feed is received from a first camera on a first recurring basis and time-stamped position information is received from a tracking system on a second recurring basis. The central video feed is calibrated against a spatial region encompassed by the central video feed. The received time-stamped position information and a determined plurality of tiles associated with at least one frame of the central video feed are used to define a first sub-view of the central video feed. The first sub-view and a homography defining placement of augmented reality elements on the at least one frame of the central video feed are provided as output to a device configured to use the first sub-view and the homography to display the first sub-view.
    Type: Application
    Filed: December 15, 2022
    Publication date: April 20, 2023
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol, Anne Gerhart
  • Publication number: 20230085122
    Abstract: In an embodiment, a process to predict a probability of a future event occurring in a present competition includes receiving time-stamped position information of one or more participants in the present competition. The time-stamped position information is captured by a telemetry tracking system during the present competition. The process uses the time-stamped position information to determine a first play situation of the present competition. The process determines, based on at least the first play situation and playing data associated with at least a subset of one or both of a first set of one or more participants and a second set of one or more participants, a prediction of the probability of a first future event occurring at the present competition.
    Type: Application
    Filed: November 17, 2022
    Publication date: March 16, 2023
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Patent number: 11587266
    Abstract: Techniques are disclosed to add augmented reality to a sub-view of a high-resolution central video feed. In various embodiments, a central video feed is received from a first camera on a first recurring basis and time-stamped position information is received from a tracking system on a second recurring basis. The central video feed is calibrated against a spatial region encompassed by the central video feed. The received time-stamped position information and a determined plurality of tiles associated with at least one frame of the central video feed are used to define a first sub-view of the central video feed. The first sub-view and a homography defining placement of augmented reality elements on the at least one frame of the central video feed are provided as output to a device configured to use the first sub-view and the homography to display the first sub-view.
    Type: Grant
    Filed: July 21, 2021
    Date of Patent: February 21, 2023
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol, Anne Gerhart
  • Patent number: 11568713
    Abstract: In an embodiment, a process to predict a probability of a future event occurring in a present competition includes receiving time-stamped position information of one or more participants in the present competition. The time-stamped position information is captured by a telemetry tracking system during the present competition. The process uses the time-stamped position information to determine a first play situation of the present competition. The process determines, based on at least the first play situation and playing data associated with at least a subset of one or both of a first set of one or more participants and a second set of one or more participants, a prediction of the probability of a first future event occurring at the present competition.
    Type: Grant
    Filed: January 20, 2020
    Date of Patent: January 31, 2023
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Publication number: 20230026625
    Abstract: Techniques are disclosed to add augmented reality to a sub-view of a high-resolution central video feed. In various embodiments, a central video feed is received from a first camera on a first recurring basis and time-stamped position information is received from a tracking system on a second recurring basis. The central video feed is calibrated against a spatial region encompassed by the central video feed. The received time-stamped position information and a determined plurality of tiles associated with at least one frame of the central video feed are used to define a first sub-view of the central video feed. The first sub-view and a homography defining placement of augmented reality elements on the at least one frame of the central video feed are provided as output to a device configured to use the first sub-view and the homography to display the first sub-view.
    Type: Application
    Filed: July 21, 2021
    Publication date: January 26, 2023
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol, Anne Gerhart
  • Patent number: 11498125
    Abstract: A method of fabricating a near net shape component includes forming a sacrificial shell from a pulverant material using an additive manufacturing process, the shell having an aperture. The method further includes filling the shell with a second pulverant material, subjecting the filled shell to a consolidation process, and removing the shell from the consolidated second pulverant material.
    Type: Grant
    Filed: October 31, 2018
    Date of Patent: November 15, 2022
    Assignee: Hamilton Sundstrand Corporation
    Inventors: Sergey Mironets, Pawel Czarnecki
  • Publication number: 20220274022
    Abstract: In an embodiment, a process to predict an outcome of a competition includes receiving time-stamped position information of participant(s), the time-stamped position information captured by a telemetry tracking system during the competition. The process includes calculating, while the competition is ongoing, a covariate parameter for each of one or more participants at a point in time, where each respective covariate parameter is derived from the time-stamped position information of a corresponding participant at the point in time. The process includes predicting the outcome of the competition, as of the point in time, based at least in part on (i) a difference between the calculated competitor strengths of the first competitor and the second competitor, based on historical data associated with the competitors, and (ii) the calculated first covariate parameter(s).
    Type: Application
    Filed: March 22, 2022
    Publication date: September 1, 2022
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Publication number: 20220203241
    Abstract: A process to track and provide positional information for display on a remote device includes receiving, on a first recurring basis, time-stamped position information of participant(s) in a competition. The time-stamped position information is captured by a telemetry tracking system during the competition and describes a time-stamped position of each of one or more corresponding participants in a predetermined spatial region. The process includes communicating at least a subset of time-stamped position information to the remote device on a second recurring basis. The remote device uses the time-stamped position information to overlay a representation of one or more of said participants onto a first virtual reproduction of at least a relevant portion of the predetermined spatial region to produce and display an instance of a compiled virtual scene that shows a respective position of each of the participant(s) on the first virtual reproduction of the predetermined spatial region.
    Type: Application
    Filed: March 15, 2022
    Publication date: June 30, 2022
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Patent number: 11311808
    Abstract: In an embodiment, a process to predict an outcome of a competition includes receiving time-stamped position information of participant(s), the time-stamped position information captured by a telemetry tracking system during the competition. The process includes calculating, while the competition is ongoing, a covariate parameter for each of one or more participants at a point in time, where each respective covariate parameter is derived from the time-stamped position information of a corresponding participant at the point in time. The process includes predicting the outcome of the competition, as of the point in time, based at least in part on (i) a difference between the calculated competitor strengths of the first competitor and the second competitor, based on historical data associated with the competitors, and (ii) the calculated first covariate parameter(s).
    Type: Grant
    Filed: January 20, 2020
    Date of Patent: April 26, 2022
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Patent number: 11305194
    Abstract: A process to track and provide positional information for display on a remote device includes receiving, on a first recurring basis, time-stamped position information of participant(s) in a competition. The time-stamped position information is captured by a telemetry tracking system during the competition and describes a time-stamped position of each of one or more corresponding participants in a predetermined spatial region. The process includes communicating at least a subset of time-stamped position information to the remote device on a second recurring basis. The remote device uses the time-stamped position information to overlay a representation of one or more of said participants onto a first virtual reproduction of at least a relevant portion of the predetermined spatial region to produce and display an instance of a compiled virtual scene that shows a respective position of each of the participant(s) on the first virtual reproduction of the predetermined spatial region.
    Type: Grant
    Filed: January 20, 2020
    Date of Patent: April 19, 2022
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Publication number: 20210400201
    Abstract: A process to partition a video feed to segment live player activity includes receiving, on a first recurring basis, a transmission of a central video feed from a first camera. The central video feed is calibrated against a spatial region represented in at least two dimensions that is encompassed by the central video feed. The process includes receiving, on a second recurring basis, respective time-stamped position information from each tracking device in a plurality of tracking devices. Each tracking device is worn by a corresponding subject on the spatial region and transmits positional information that describes a time-stamped position of the corresponding subject in the spatial region. The process uses the received information and the calibration to define a first sub-view of the central video feed associated with a first subject. The first sub-view comprises a corresponding sub-frame associated with the first subject.
    Type: Application
    Filed: September 3, 2021
    Publication date: December 23, 2021
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Patent number: 11140328
    Abstract: A process to partition a video feed to segment live player activity includes receiving, on a first recurring basis, a transmission of a central video feed from a first camera. The central video feed is calibrated against a spatial region represented in at least two dimensions that is encompassed by the central video feed. The process includes receiving, on a second recurring basis, respective time-stamped position information from each tracking device in a plurality of tracking devices. Each tracking device is worn by a corresponding subject on the spatial region and transmits positional information that describes a time-stamped position of the corresponding subject in the spatial region. The process uses the received information and the calibration to define a first sub-view of the central video feed associated with a first subject. The first sub-view comprises a corresponding sub-frame associated with the first subject.
    Type: Grant
    Filed: January 20, 2020
    Date of Patent: October 5, 2021
    Assignee: Tempus Ex Machina, Inc.
    Inventors: Erik Schwartz, Michael Naquin, Christopher Brown, Steve Xing, Pawel Czarnecki, Charles D. Ebersol
  • Patent number: 11112016
    Abstract: A valve assembly is provided, the valve assembly comprising a body that defines a fluid flow path between an inlet and an outlet, with a frangible member between the inlet and the outlet across the fluid flow path so as to block fluid flow. The valve assembly also includes a member for rupturing the frangible member, a spring biasing the rupturing member into contact with the frangible member, and a member for retaining the rupturing member away from the frangible member. A magnet is spaced apart from the retaining member, and a non-magnetic sleeve is disposed between the magnet and the retaining member. The non-magnetic sleeve blocks a magnetic field provided by the magnet. Also included is a trigger for removing the non-magnetic sleeve, wherein, when the non-magnetic sleeve is removed, in use, the magnetic field attracts the retaining member to release the rupturing member to cause rupture of the frangible member.
    Type: Grant
    Filed: May 16, 2019
    Date of Patent: September 7, 2021
    Assignee: GOODRICH CORPORATION
    Inventors: Ɓukasz Wiktorko, Pawel Czarnecki
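
Several entries above (publication 20240104793 and patents 11816760 and 11587266, among others) describe defining a sub-view of a high-resolution central feed and placing augmented-reality elements with a homography. The following minimal sketch illustrates that general idea only; the fixed crop size, the 20 px/m example homography, and all function names are illustrative assumptions, not details taken from the patents.

```python
import numpy as np

def project(H, xy):
    """Map a field-plane point (x, y) to pixel coordinates using a 3x3 homography H."""
    p = H @ np.array([xy[0], xy[1], 1.0])
    return p[:2] / p[2]

def sub_view(H, subject_xy, frame_w, frame_h, crop_w=1280, crop_h=720):
    """Return a crop rectangle (x0, y0, x1, y1) of the central frame centred on the
    tracked subject's projected pixel position, clipped to the frame bounds."""
    cx, cy = project(H, subject_xy)
    x0 = int(np.clip(cx - crop_w / 2, 0, frame_w - crop_w))
    y0 = int(np.clip(cy - crop_h / 2, 0, frame_h - crop_h))
    return x0, y0, x0 + crop_w, y0 + crop_h

def place_ar_elements(H, element_field_positions, crop):
    """Project AR anchor points given in field coordinates into sub-view pixel
    coordinates by applying the homography and subtracting the crop origin."""
    x0, y0, _, _ = crop
    return [tuple(project(H, xy) - np.array([x0, y0])) for xy in element_field_positions]

if __name__ == "__main__":
    # Illustrative homography: field metres -> pixels at 20 px/m (not from the patents).
    H = np.diag([20.0, 20.0, 1.0])
    crop = sub_view(H, subject_xy=(60.0, 25.0), frame_w=7680, frame_h=4320)
    print(crop, place_ar_elements(H, [(58.0, 25.0), (62.0, 27.0)], crop))
```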
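
Patent 11754662 and its related filings describe partitioning a calibrated central video feed into per-subject sub-views driven by worn tracking devices, using tiles of the frame. The sketch below assumes tracker positions have already been calibrated into pixel coordinates and uses a fixed tile grid and window size chosen purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    """A time-stamped position report from one tracking device (pixel coordinates)."""
    device_id: str
    t: float
    x: float
    y: float

def tiles_for_subject(sample, frame_w, frame_h, tile=256, window=768):
    """Return the set of (col, row) tile indices whose union covers a square
    window centred on the subject's most recent tracked position."""
    half = window // 2
    x0, x1 = max(0, sample.x - half), min(frame_w - 1, sample.x + half)
    y0, y1 = max(0, sample.y - half), min(frame_h - 1, sample.y + half)
    cols = range(int(x0 // tile), int(x1 // tile) + 1)
    rows = range(int(y0 // tile), int(y1 // tile) + 1)
    return {(c, r) for c in cols for r in rows}

def sub_frame(tile_set, tile=256):
    """Bounding box (x0, y0, x1, y1) of the selected tiles: the subject's sub-view."""
    cols = [c for c, _ in tile_set]
    rows = [r for _, r in tile_set]
    return (min(cols) * tile, min(rows) * tile,
            (max(cols) + 1) * tile, (max(rows) + 1) * tile)

if __name__ == "__main__":
    latest = Sample(device_id="player-12", t=103.4, x=3100.0, y=1750.0)
    chosen = tiles_for_subject(latest, frame_w=7680, frame_h=4320)
    print(sorted(chosen), sub_frame(chosen))
```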
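
Patents 11918912 and 11305194 describe streaming time-stamped positions to a remote device that overlays participant representations onto a virtual reproduction of the playing surface. One minimal way such a client could compile a virtual scene is sketched below; the 105 x 68 m field dimensions and the character-grid rendering are assumptions made for illustration only.

```python
def to_canvas(positions, field_w=105.0, field_h=68.0, cols=52, rows=20):
    """Map (label, x, y) field-coordinate positions (metres) onto integer cells
    of a cols x rows virtual canvas."""
    placed = {}
    for label, x, y in positions:
        c = min(cols - 1, max(0, int(x / field_w * cols)))
        r = min(rows - 1, max(0, int(y / field_h * rows)))
        placed[(c, r)] = label[0]  # draw the first character of the label
    return placed

def render(placed, cols=52, rows=20):
    """Render the compiled virtual scene as text, one character per canvas cell."""
    grid = [["." for _ in range(cols)] for _ in range(rows)]
    for (c, r), ch in placed.items():
        grid[r][c] = ch
    return "\n".join("".join(row) for row in grid)

if __name__ == "__main__":
    # Time-stamped positions would arrive on a recurring basis; one snapshot shown.
    snapshot = [("A7", 20.0, 30.0), ("B3", 70.0, 10.0), ("ball", 52.5, 34.0)]
    print(render(to_canvas(snapshot)))
```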
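
Patents 11813529 and 11311808 describe predicting a competition outcome from (i) a difference between historically derived competitor strengths and (ii) live covariate parameters derived from tracked positions. A common way to combine such terms is a logistic model; the sketch below uses that form with made-up weights purely to illustrate the structure, not as the patented method.

```python
import math

def win_probability(strength_a, strength_b, covariates, weights, scale=1.0):
    """Logistic combination of a strength difference and live covariate terms.

    strength_a, strength_b: historically derived competitor strengths.
    covariates: position-derived quantities at the current point in time.
    weights: one illustrative weight per covariate, chosen by the caller.
    Returns the modelled probability that competitor A wins as of this moment.
    """
    z = scale * (strength_a - strength_b)
    z += sum(w * c for w, c in zip(weights, covariates))
    return 1.0 / (1.0 + math.exp(-z))

if __name__ == "__main__":
    # Example covariates: current score margin and territory share (illustrative).
    p = win_probability(strength_a=1.8, strength_b=1.5,
                        covariates=[7.0, 0.62], weights=[0.08, 0.9])
    print(f"P(A wins) = {p:.3f}")
```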
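
Patent 11568713 and publication 20230085122 describe estimating the probability of a future in-game event from the current play situation plus historical playing data for the participants involved. One simple, purely illustrative approach is a smoothed frequency estimate over comparable historical situations, sketched below; the situation keys, event labels, and smoothing constant are assumptions.

```python
from collections import Counter

def event_probability(history, situation, event, alpha=1.0, n_events=2):
    """Estimate P(event | situation) from historical (situation, event) pairs,
    with Laplace smoothing so rarely seen situations still yield a probability."""
    counts = Counter(ev for sit, ev in history if sit == situation)
    total = sum(counts.values())
    return (counts[event] + alpha) / (total + alpha * n_events)

if __name__ == "__main__":
    # Situation key: (down, distance bucket); event: "pass" or "run" (illustrative).
    history = [((3, "long"), "pass"), ((3, "long"), "pass"),
               ((3, "long"), "run"), ((1, "short"), "run")]
    # The current play situation would be derived from the time-stamped positions.
    print(event_probability(history, situation=(3, "long"), event="pass"))
```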