Patents by Inventor Kyle Glowacki

Kyle Glowacki has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10708548
    Abstract: Systems, methods, and computer-readable media for creating and using video analysis rules that are based on map data are disclosed. One or more sensors, such as video cameras, can track and monitor a geographic location, such as a road, pipeline, or other location or installation. A video analytics engine can receive video streams from the sensors and identify the location of the imaged view in a geo-registered map space, such as a latitude-longitude defined map space. A user can operate a graphical user interface to draw, enter, select, or otherwise input on a map a set of rules for detecting events in the monitored scene, such as tripwires and areas of interest. When tripwires, areas of interest, or other features are approached or crossed, the engine can perform responsive actions, such as generating an alert and sending it to a user.
    Type: Grant
    Filed: September 24, 2018
    Date of Patent: July 7, 2020
    Assignee: AVIGILON FORTRESS CORPORATION
    Inventors: Zeeshan Rasheed, Dana Eubanks, Weihong Yin, Zhong Zhang, Kyle Glowacki, Allison Beach
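
The abstract above describes drawing tripwire rules on a map and detecting when a tracked object crosses them in a geo-registered, latitude-longitude map space. The Python sketch below is a rough illustration of that idea only, not the patented implementation: it checks whether an object's movement between two geo-registered positions crosses a map-drawn tripwire segment, and all names, coordinates, and the alert action are hypothetical assumptions.

```python
# Minimal illustrative sketch (assumed names, not the patented system):
# detect whether a tracked object's path between two geo-registered
# positions crosses a tripwire segment drawn on a map, then alert.
from dataclasses import dataclass


@dataclass
class GeoPoint:
    lat: float
    lon: float


def _cross(o: GeoPoint, a: GeoPoint, b: GeoPoint) -> float:
    """Signed area of triangle (o, a, b); the sign gives the turn direction."""
    return (a.lon - o.lon) * (b.lat - o.lat) - (a.lat - o.lat) * (b.lon - o.lon)


def crosses_tripwire(prev_pos: GeoPoint, curr_pos: GeoPoint,
                     wire_start: GeoPoint, wire_end: GeoPoint) -> bool:
    """True if the path prev_pos -> curr_pos strictly intersects the tripwire."""
    d1 = _cross(wire_start, wire_end, prev_pos)
    d2 = _cross(wire_start, wire_end, curr_pos)
    d3 = _cross(prev_pos, curr_pos, wire_start)
    d4 = _cross(prev_pos, curr_pos, wire_end)
    return d1 * d2 < 0 and d3 * d4 < 0


# Example: a track moving north across an east-west tripwire (made-up coordinates).
wire = (GeoPoint(38.9000, -77.0400), GeoPoint(38.9000, -77.0300))
if crosses_tripwire(GeoPoint(38.8999, -77.0350), GeoPoint(38.9001, -77.0350), *wire):
    print("ALERT: tripwire crossed")  # responsive action, e.g. notify a user
```
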
  • Patent number: 10687022
    Abstract: Embodiments relate to systems, devices, and computer-implemented methods for performing automated visual surveillance by obtaining video camera coordinates determined using video data, video camera metadata, and/or digital elevation models; obtaining a surveillance rule associated with rule coordinates; identifying a video camera whose camera coordinates include at least part of the rule coordinates; and transmitting the surveillance rule to a computing device associated with that video camera. The rule coordinates can be determined automatically from the received coordinates of an object. Additionally, the surveillance rule can be generated from instructions a user provides in a natural-language syntax.
    Type: Grant
    Filed: December 4, 2015
    Date of Patent: June 16, 2020
    Assignee: AVIGILON FORTRESS CORPORATION
    Inventors: Zeeshan Rasheed, Dana Eubanks, Weihong Yin, Zhong Zhang, Kyle Glowacki, Allison Beach
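
The abstract above is built around matching a rule's coordinates to the cameras whose coverage includes them and then pushing the rule out to those cameras. Purely as an assumed illustration of that matching step, and not the claimed method, the Python sketch below reduces each camera's coverage to a latitude-longitude bounding box and "transmits" the rule by printing; the names and dispatch step are hypothetical.

```python
# Illustrative sketch only (assumed names): route a surveillance rule to every
# camera whose geo-registered coverage includes part of the rule coordinates.
from dataclasses import dataclass
from typing import List, Tuple

LatLon = Tuple[float, float]


@dataclass
class Camera:
    name: str
    # Coverage simplified to (min_lat, min_lon, max_lat, max_lon); per the abstract
    # it would be derived from video data, camera metadata, and/or elevation models.
    coverage: Tuple[float, float, float, float]


def covers(camera: Camera, point: LatLon) -> bool:
    min_lat, min_lon, max_lat, max_lon = camera.coverage
    lat, lon = point
    return min_lat <= lat <= max_lat and min_lon <= lon <= max_lon


def route_rule(rule_coords: List[LatLon], cameras: List[Camera]) -> List[Camera]:
    """Return cameras whose coverage contains at least one of the rule coordinates."""
    return [cam for cam in cameras if any(covers(cam, p) for p in rule_coords)]


cameras = [
    Camera("gate-cam", (38.89, -77.05, 38.91, -77.03)),
    Camera("lot-cam", (38.95, -77.10, 38.97, -77.08)),
]
rule = [(38.900, -77.040), (38.905, -77.035)]  # e.g. a tripwire drawn on a map
for cam in route_rule(rule, cameras):
    print(f"transmit rule to computing device for {cam.name}")  # stand-in dispatch
```
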
  • Publication number: 20190037179
    Abstract: Systems, methods, and computer-readable media for creating and using video analysis rules that are based on map data are disclosed. One or more sensors, such as video cameras, can track and monitor a geographic location, such as a road, pipeline, or other location or installation. A video analytics engine can receive video streams from the sensors and identify the location of the imaged view in a geo-registered map space, such as a latitude-longitude defined map space. A user can operate a graphical user interface to draw, enter, select, or otherwise input on a map a set of rules for detecting events in the monitored scene, such as tripwires and areas of interest. When tripwires, areas of interest, or other features are approached or crossed, the engine can perform responsive actions, such as generating an alert and sending it to a user.
    Type: Application
    Filed: September 24, 2018
    Publication date: January 31, 2019
    Inventors: Zeeshan Rasheed, Dana Eubanks, Weihong Yin, Zhong Zhang, Kyle Glowacki, Allison Beach
  • Patent number: 10110856
    Abstract: Systems, methods, and computer-readable media for creating and using video analysis rules that are based on map data are disclosed. One or more sensors, such as video cameras, can track and monitor a geographic location, such as a road, pipeline, or other location or installation. A video analytics engine can receive video streams from the sensors and identify the location of the imaged view in a geo-registered map space, such as a latitude-longitude defined map space. A user can operate a graphical user interface to draw, enter, select, or otherwise input on a map a set of rules for detecting events in the monitored scene, such as tripwires and areas of interest. When tripwires, areas of interest, or other features are approached or crossed, the engine can perform responsive actions, such as generating an alert and sending it to a user.
    Type: Grant
    Filed: December 4, 2015
    Date of Patent: October 23, 2018
    Assignee: AVIGILON FORTRESS CORPORATION
    Inventors: Zeeshan Rasheed, Dana Eubanks, Weihong Yin, Zhong Zhang, Kyle Glowacki, Allison Beach
  • Publication number: 20160165193
    Abstract: Systems, methods, and computer-readable media for creating and using video analysis rules that are based on map data are disclosed. One or more sensors, such as video cameras, can track and monitor a geographic location, such as a road, pipeline, or other location or installation. A video analytics engine can receive video streams from the sensors and identify the location of the imaged view in a geo-registered map space, such as a latitude-longitude defined map space. A user can operate a graphical user interface to draw, enter, select, or otherwise input on a map a set of rules for detecting events in the monitored scene, such as tripwires and areas of interest. When tripwires, areas of interest, or other features are approached or crossed, the engine can perform responsive actions, such as generating an alert and sending it to a user.
    Type: Application
    Filed: December 4, 2015
    Publication date: June 9, 2016
    Inventors: Zeeshan Rasheed, Dana Eubanks, Weihong Yin, Zhong Zhang, Kyle Glowacki, Allison Beach
  • Publication number: 20160165187
    Abstract: Embodiments relate to systems, devices, and computer-implemented methods for performing automated visual surveillance by obtaining video camera coordinates determined using video data, video camera metadata, and/or digital elevation models; obtaining a surveillance rule associated with rule coordinates; identifying a video camera whose camera coordinates include at least part of the rule coordinates; and transmitting the surveillance rule to a computing device associated with that video camera. The rule coordinates can be determined automatically from the received coordinates of an object. Additionally, the surveillance rule can be generated from instructions a user provides in a natural-language syntax.
    Type: Application
    Filed: December 4, 2015
    Publication date: June 9, 2016
    Inventors: Zeeshan Rasheed, Dana Eubanks, Weihong Yin, Zhong Zhang, Kyle Glowacki, Allison Beach
  • Publication number: 20160165191
    Abstract: A method for predicting when an object will arrive at a boundary includes receiving visual media captured by a camera and identifying an object in the visual media. One or more parameters related to the object are detected by analyzing the visual media, and those parameters are used to predict when the object will arrive at a boundary. An alert indicating the predicted arrival time is then transmitted to a user.
    Type: Application
    Filed: December 4, 2015
    Publication date: June 9, 2016
    Inventors: Zeeshan Rasheed, Weihong Yin, Zhong Zhang, Kyle Glowacki, Allison Beach
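
The last abstract predicts when an object will reach a boundary from parameters detected in the video and alerts the user. As a minimal sketch of one way such a prediction could look, assuming the detected parameters are the object's distance to the boundary and its closing speed (neither the names nor the threshold come from the filing):

```python
# Hypothetical sketch, not the patented predictor: estimate time-to-boundary
# from distance and closing speed, and alert if arrival is imminent.
from typing import Optional


def eta_to_boundary(distance_m: float, closing_speed_mps: float) -> Optional[float]:
    """Seconds until the object reaches the boundary, or None if it is not approaching."""
    if closing_speed_mps <= 0:
        return None  # stationary or moving away: no predicted arrival
    return distance_m / closing_speed_mps


def maybe_alert(distance_m: float, closing_speed_mps: float, warn_seconds: float = 30.0) -> None:
    eta = eta_to_boundary(distance_m, closing_speed_mps)
    if eta is not None and eta <= warn_seconds:
        # Responsive action: tell the user when the object is predicted to arrive.
        print(f"ALERT: object predicted to reach the boundary in {eta:.1f} s")


maybe_alert(distance_m=45.0, closing_speed_mps=2.5)  # prints an 18.0 s warning
```
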