Patents by Inventor Vijayan K. Asari

Vijayan K. Asari has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11356599
    Abstract: A system includes a control station that enables efficient human collaboration with automated object tracking. The control station is communicatively coupled to an aerial vehicle to receive full motion video of a ground scene taken by an airborne sensor of the aerial vehicle. The control station spatially registers features of a movable object present in the ground scene and determines motion of the movable object relative to the ground scene. The control station predicts a trajectory of the movable object relative to the ground scene. The control station tracks the movable object based on data fusion of: (i) the spatially registered features; (ii) the determined motion; and (iii) the predicted trajectory of the movable object. The control station presents a tracking annotation and a determined confidence indicator for the tracking annotation on a user interface device to facilitate human collaboration with object tracking. A simplified code sketch of this data-fusion idea appears after this listing.
    Type: Grant
    Filed: February 8, 2021
    Date of Patent: June 7, 2022
    Assignee: United States of America as represented by the Secretary of the Air Force
    Inventors: Terry W. Stanard, Theus H. Aspiras, Vijayan K. Asari, Taleri L. Hammack
  • Publication number: 20210258471
    Abstract: A system includes a control station that enables efficient human collaboration with automated object tracking. The control station is communicatively coupled to an aerial vehicle to receive full motion video of a ground scene taken by an airborne sensor of the aerial vehicle. The control station spatially registers features of a movable object present in the ground scene and determines motion of the movable object relative to the ground scene. The control station predicts a trajectory of the movable object relative to the ground scene. The control station tracks the movable object based on data fusion of: (i) the spatially registered features; (ii) the determined motion; and (iii) the predicted trajectory of the movable object. The control station presents a tracking annotation and a determined confidence indicator for the tracking annotation on a user interface device to facilitate human collaboration with object tracking.
    Type: Application
    Filed: February 8, 2021
    Publication date: August 19, 2021
    Inventors: Terry W. Stanard, Theus H. Aspiras, Vijayan K. Asari, Taleri L. Hammack
  • Patent number: 10917557
    Abstract: A system includes a control station that enables efficient human collaboration with automated object tracking. The control station is communicatively coupled to an aerial vehicle to receive full motion video of a ground scene taken by an airborne sensor of the aerial vehicle. The control station spatially registers features of a movable object present in the ground scene and determines motion of the movable object relative to the ground scene. The control station predicts a trajectory of the movable object relative to the ground scene. The control station tracks the movable object based on data fusion of: (i) the spatially registered features; (ii) the determined motion; and (iii) the predicted trajectory of the movable object. The control station presents a tracking annotation and a determined confidence indicator for the tracking annotation on a user interface device to facilitate human collaboration with object tracking.
    Type: Grant
    Filed: April 3, 2019
    Date of Patent: February 9, 2021
    Assignee: United States of America as represented by the Secretary of the Air Force
    Inventors: Terry W. Stanard, Theus H. Aspiras, Vijayan K. Asari, Taleri L. Hammack
  • Publication number: 20200029013
    Abstract: A system includes a control station that enables efficient human collaboration with automated object tracking. The control station is communicatively coupled to an aerial vehicle to receive full motion video of a ground scene taken by an airborne sensor of the aerial vehicle. The control station spatially registers features of a movable object present in the ground scene and determines motion of the movable object relative to the ground scene. The control station predicts a trajectory of the movable object relative to the ground scene. The control station tracks the movable object based on data fusion of: (i) the spatially registered features; (ii) the determined motion; and (iii) the predicted trajectory of the movable object. The control station presents a tracking annotation and a determined confidence indicator for the tracking annotation on a user interface device to facilitate human collaboration with object tracking.
    Type: Application
    Filed: April 3, 2019
    Publication date: January 23, 2020
    Inventors: Terry W. Stanard, Theus H. Aspiras, Vijayan K. Asari, Taleri L. Hammack
  • Publication number: 20150355309
    Abstract: Systems, methods, and computer program products for identifying and tracking an object of interest from an image capturing system based on a plurality of features associated with the object of interest. A center feature associated with the object of interest is designated. The center feature changes location as the object of interest changes location. A plurality of ringlets is generated. Each ringlet is concentrically positioned so that it encircles the center feature and encompasses additional features associated with the object of interest. The object of interest is tracked with feature data extracted by each ringlet as the object of interest changes location and/or orientation. The feature data is associated with each feature of the object of interest that each ringlet encompasses. A simplified sketch of a ringlet-style descriptor appears after this listing.
    Type: Application
    Filed: June 5, 2015
    Publication date: December 10, 2015
    Inventors: Theus Aspiras, Vijayan K. Asari
  • Patent number: 7428333
    Abstract: A new image enhancement process based on an integrated neighborhood-dependent nonlinear approach for color images captured in various environments such as extremely low lighting, fog, or underwater. The new process is a combination of two independent processes: luminance enhancement and contrast enhancement. The luminance enhancement, also regarded as a process of dynamic range compression, is essentially an intensity transformation based on a specifically designed nonlinear transfer function, which can largely increase the luminance of dark regions of the image while only slightly changing the luminance of bright regions. The contrast enhancement transforms each pixel's intensity based on the relationship between the pixel and its surrounding pixels. The output of the contrast enhancement is a power function of the luminance of the input image. A simplified code sketch of this two-stage idea appears after this listing.
    Type: Grant
    Filed: January 24, 2005
    Date of Patent: September 23, 2008
    Assignee: Old Dominion University
    Inventors: Vijayan K. Asari, Li Tao
  • Patent number: 7362910
    Abstract: A method of characterizing and enhancing the pixels in an image captured under any lighting conditions. The method enables dynamic enhancement of images captured under extremely low lighting conditions. Each pixel of the image, represented by I(x, y) where x is the horizontal axis coordinate value and y is the vertical axis coordinate value, contains three color components. Each pixel is characterized by defining a relationship between the three color components of that pixel in its initial state. A nonlinear expansion function, which adaptively adjusts the expansion rate with respect to the statistical properties of the original image, is used to enhance the brightness of the image so that poorly lit pixels receive a greater intensity increase than brighter pixels. The color property of each pixel in the enhanced image is then restored using the characteristic features previously defined by the relationship of the color components. A simplified code sketch of this color-preserving enhancement appears after this listing.
    Type: Grant
    Filed: January 21, 2005
    Date of Patent: April 22, 2008
    Assignee: Old Dominion University Research Foundation
    Inventors: Vijayan K. Asari, Ming-Jung Seow
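
The tracking patents above (11356599 and 10917557 and their published applications) describe fusing three cues, namely spatially registered features, determined motion, and a predicted trajectory, into a single track with a confidence indicator. The sketch below only illustrates that kind of fusion; the weighted average, the 50-pixel agreement scale, and the function name fuse_track_estimates are assumptions, not the patented method.

    import numpy as np

    def fuse_track_estimates(feature_pos, motion_pos, predicted_pos,
                             weights=(0.5, 0.3, 0.2)):
        """Fuse three 2-D position estimates (in pixels) into one track update.

        feature_pos, motion_pos, predicted_pos: (x, y) arrays from the three cues.
        weights: assumed relative trust in each cue; must sum to 1.
        Returns the fused position and a crude confidence based on cue agreement.
        """
        estimates = np.stack([feature_pos, motion_pos, predicted_pos])
        w = np.asarray(weights, dtype=float).reshape(-1, 1)
        fused = (w * estimates).sum(axis=0)

        # Confidence: cues that agree closely give a value near 1; widely
        # scattered cues give a value near 0 (the 50 px scale is an assumption).
        spread = np.linalg.norm(estimates - fused, axis=1).mean()
        confidence = float(np.exp(-spread / 50.0))
        return fused, confidence

    # Example: three cues roughly agreeing on a target near (120, 80).
    position, confidence = fuse_track_estimates(np.array([121.0, 79.0]),
                                                np.array([119.5, 80.5]),
                                                np.array([123.0, 81.0]))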
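
Publication 20150355309 describes extracting feature data from concentric ringlets around a designated center feature. The sketch below assumes the per-ringlet feature data is an intensity histogram; the radii, bin count, and histogram choice are illustrative assumptions rather than the published method. Because the rings are concentric, the descriptor changes little when the object rotates in the image plane.

    import numpy as np

    def ringlet_descriptor(image, center, radii=(5, 10, 15, 20), bins=16):
        """Describe the patch around `center` with one histogram per ringlet.

        image:  2-D grayscale array with values in [0, 255].
        center: (row, col) of the designated center feature.
        radii:  outer radius of each concentric ringlet, in pixels.
        Returns an array of shape (len(radii), bins).
        """
        rows, cols = np.indices(image.shape)
        dist = np.hypot(rows - center[0], cols - center[1])

        descriptor = []
        inner = 0.0
        for outer in radii:
            ring_pixels = image[(dist >= inner) & (dist < outer)]
            hist, _ = np.histogram(ring_pixels, bins=bins, range=(0, 255))
            descriptor.append(hist / max(hist.sum(), 1))   # normalize each ring
            inner = outer
        return np.array(descriptor)

    # Tracking could then compare descriptors between frames (e.g. by histogram
    # distance) and move the center to the best-matching location.
    frame = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(float)
    assert ringlet_descriptor(frame, center=(32, 32)).shape == (4, 16)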
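
Patent 7428333 describes a luminance step that compresses dynamic range with a nonlinear transfer function and a contrast step whose output is a power function of the luminance, driven by each pixel's surroundings. The sketch below captures only the general shape of that idea: the specific curve, the Gaussian surround, and the exponent clipping are assumptions, not the patented transfer functions.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def enhance_luminance(lum):
        """Nonlinear dynamic-range compression on luminance normalized to [0, 1].

        Strongly lifts dark values while leaving bright values nearly unchanged
        (the 0.45 exponent is an assumed curve, not the patented one).
        """
        return (lum ** 0.45 + lum) / 2.0

    def enhance_contrast(lum_enhanced, lum_original, sigma=5.0, p=1.0):
        """Center/surround contrast step: output is a power function of the
        enhanced luminance, with the exponent set by the surround/center ratio."""
        surround = gaussian_filter(lum_original, sigma)
        exponent = (surround / np.clip(lum_original, 1e-3, None)) ** p
        return lum_enhanced ** np.clip(exponent, 0.2, 5.0)

    # Example on a synthetic dark image with luminance in [0, 1].
    dark = 0.2 * np.random.default_rng(1).random((32, 32))
    enhanced = enhance_contrast(enhance_luminance(dark), dark)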
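
Patent 7362910 describes recording the relationship between a pixel's three color components, expanding the intensity with a nonlinear function that adapts to the image statistics, and then restoring the color from the recorded relationship. The sketch below follows that outline under simplifying assumptions: channel ratios stand in for the color characterization, and the adaptive gamma is an assumed expansion function, not the patented one.

    import numpy as np

    def enhance_preserving_color(rgb):
        """rgb: float array of shape (H, W, 3) with values in [0, 1]."""
        intensity = rgb.mean(axis=2, keepdims=True)
        ratios = rgb / np.clip(intensity, 1e-4, None)   # per-pixel color character

        # Darker images get a stronger expansion (smaller exponent) -- an
        # assumed way of adapting the nonlinearity to the image statistics.
        gamma = float(np.clip(intensity.mean() + 0.3, 0.3, 1.0))
        expanded = intensity ** gamma

        # Restore color from the saved channel relationship.
        return np.clip(ratios * expanded, 0.0, 1.0)

    # Example: a dim image is brightened while each pixel keeps its hue.
    dim = 0.15 * np.random.default_rng(2).random((16, 16, 3))
    bright = enhance_preserving_color(dim)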