Patents by Inventor Ian Dewancker

Ian Dewancker has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10579063
    Abstract: The present disclosure provides systems and methods for predicting the future locations of objects that are perceived by autonomous vehicles. An autonomous vehicle can include a prediction system that, for each object perceived by the autonomous vehicle, generates one or more potential goals, selects one or more of the potential goals, and develops one or more trajectories by which the object can achieve the one or more selected goals. The prediction systems and methods described herein can include or leverage one or more machine-learned models that assist in predicting the future locations of the objects. As an example, in some implementations, the prediction system can include a machine-learned static object classifier, a machine-learned goal scoring model, a machine-learned trajectory development model, a machine-learned ballistic quality classifier, and/or other machine-learned models. The use of machine-learned models can improve the speed, quality, and/or accuracy of the generated predictions. (An illustrative sketch of this goal-and-trajectory prediction flow appears after this listing.)
    Type: Grant
    Filed: August 23, 2017
    Date of Patent: March 3, 2020
    Assignee: UATC, LLC
    Inventors: Galen Clark Haynes, Ian Dewancker, Nemanja Djuric, Tzu-Kuo Huang, Tian Lan, Tsung-Han Lin, Micol Marchetti-Bowick, Vladan Radosavljevic, Jeff Schneider, Alexander David Styler, Neil Traft, Huahua Wang, Anthony Joseph Stentz
  • Publication number: 20190025841
    Abstract: The present disclosure provides systems and methods for predicting the future locations of objects that are perceived by autonomous vehicles. An autonomous vehicle can include a prediction system that, for each object perceived by the autonomous vehicle, generates one or more potential goals, selects one or more of the potential goals, and develops one or more trajectories by which the object can achieve the one or more selected goals. The prediction systems and methods described herein can include or leverage one or more machine-learned models that assist in predicting the future locations of the objects. As an example, in some implementations, the prediction system can include a machine-learned static object classifier, a machine-learned goal scoring model, a machine-learned trajectory development model, a machine-learned ballistic quality classifier, and/or other machine-learned models. The use of machine-learned models can improve the speed, quality, and/or accuracy of the generated predictions.
    Type: Application
    Filed: August 23, 2017
    Publication date: January 24, 2019
    Inventors: Clark Haynes, Ian Dewancker, Nemanja Djuric, Tzu-Kuo Huang, Tian Lan, Hank Lin, Micol Marchetti-Bowick, Vladan Radosavljevic, Jeff Schneider, Alex Styler, Neil Traft, Huahua Wang, Tony Stentz
  • Patent number: 8654398
    Abstract: An automated printout inspection system identifies glyphs in an image by calculating a connectedness score for each foreground pixel and comparing this score with a specified threshold. The system further generates training images by simulating printouts from an impact printer, including the specification of particular error types and their magnitudes. The simulated printouts are combined with scan images of real-world printouts to train an automated printout inspection system. The inspection results of the automated system are compared with inspection results from human inspectors, and test parameters of the automated system are adjusted so that it renders inspection results within a specified range of the average human inspector. (An illustrative sketch of the per-pixel connectedness scoring appears after this listing.)
    Type: Grant
    Filed: March 19, 2012
    Date of Patent: February 18, 2014
    Assignee: Seiko Epson Corporation
    Inventors: Ian Dewancker, Arash Abadpour, Eunice Poon, Kyel Ok, Yury Yakubovich
  • Publication number: 20130242354
    Abstract: An automated printout inspection system identifies glyphs in an image by calculating a connectedness score for each foreground pixel and comparing this score with a specified threshold. The system further generates training images by simulating printouts from an impact printer, including the specification of particular error types and their magnitudes. The simulated printouts are combined with scan images of real-world printouts to train an automated printout inspection system. The inspection results of the automated system are compared with inspection results from human inspectors, and test parameters of the automated system are adjusted so that it renders inspection results within a specified range of the average human inspector.
    Type: Application
    Filed: March 19, 2012
    Publication date: September 19, 2013
    Inventors: Ian Dewancker, Arash Abadpour, Eunice Poon, Kyel Ok, Yury Yakubovich
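
The abstract for patent 10579063 (and its published application 20190025841) describes a goal-then-trajectory prediction flow: for each perceived object, generate potential goals, score and select among them, then develop trajectories toward the selected goals. The sketch below is a minimal, hypothetical Python rendering of that flow; the class names, the heuristic goal scorer, and the straight-line trajectory generator are illustrative stand-ins, not the machine-learned models the patent describes.

```python
# Hypothetical sketch of a generate-goals / score-goals / develop-trajectory
# pipeline. All names and heuristics here are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class PerceivedObject:
    position: Tuple[float, float]
    velocity: Tuple[float, float]


@dataclass
class Goal:
    target: Tuple[float, float]
    score: float = 0.0


def generate_potential_goals(obj: PerceivedObject, horizon_s: float = 3.0) -> List[Goal]:
    """Propose candidate end positions: straight ahead plus gentle left/right offsets."""
    x, y = obj.position
    vx, vy = obj.velocity
    ahead = Goal((x + vx * horizon_s, y + vy * horizon_s))
    left = Goal((x + vx * horizon_s - vy, y + vy * horizon_s + vx))
    right = Goal((x + vx * horizon_s + vy, y + vy * horizon_s - vx))
    return [ahead, left, right]


def score_goals(obj: PerceivedObject, goals: List[Goal]) -> List[Goal]:
    """Heuristic stand-in for a goal scoring model: prefer goals near the
    constant-velocity extrapolation of the object."""
    for g in goals:
        dx = g.target[0] - (obj.position[0] + obj.velocity[0] * 3.0)
        dy = g.target[1] - (obj.position[1] + obj.velocity[1] * 3.0)
        g.score = 1.0 / (1.0 + dx * dx + dy * dy)
    return sorted(goals, key=lambda g: g.score, reverse=True)


def develop_trajectory(obj: PerceivedObject, goal: Goal, steps: int = 6) -> List[Tuple[float, float]]:
    """Heuristic stand-in for a trajectory development model: linear interpolation to the goal."""
    x0, y0 = obj.position
    return [(x0 + (goal.target[0] - x0) * t / steps,
             y0 + (goal.target[1] - y0) * t / steps) for t in range(1, steps + 1)]


if __name__ == "__main__":
    obj = PerceivedObject(position=(0.0, 0.0), velocity=(2.0, 0.0))
    best_goal = score_goals(obj, generate_potential_goals(obj))[0]
    print(develop_trajectory(obj, best_goal))
```

In the patented system, the scoring and trajectory steps are performed by machine-learned models (e.g., the goal scoring model and trajectory development model named in the abstract) rather than the simple heuristics shown here.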
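The abstracts for patent 8654398 and publication 20130242354 describe identifying glyphs by computing a connectedness score for each foreground pixel and comparing it with a specified threshold. The sketch below assumes a binary image and uses an 8-neighbour foreground count as the score; both choices are illustrative assumptions, not details taken from the patent.

```python
# Hypothetical sketch of per-pixel connectedness scoring on a binary image
# (foreground = 1). The 8-neighbour window and threshold are assumptions.
import numpy as np


def connectedness_scores(binary_img: np.ndarray) -> np.ndarray:
    """Count foreground neighbours in the 8-connected window of each pixel."""
    padded = np.pad(binary_img, 1, mode="constant")
    h, w = binary_img.shape
    scores = np.zeros_like(binary_img, dtype=int)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            scores += padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
    # Only foreground pixels keep a score; background stays at zero.
    return scores * binary_img


def glyph_pixels(binary_img: np.ndarray, threshold: int = 3) -> np.ndarray:
    """Keep foreground pixels whose connectedness score meets the threshold."""
    return connectedness_scores(binary_img) >= threshold


if __name__ == "__main__":
    img = np.array([
        [0, 1, 1, 0, 0],
        [0, 1, 1, 0, 0],
        [0, 0, 0, 0, 1],  # the isolated pixel is rejected as noise
        [0, 0, 0, 0, 0],
    ])
    print(glyph_pixels(img, threshold=2).astype(int))
```

In practice, the neighbourhood size and threshold would be tuned against the simulated and scanned training printouts the abstract mentions, so that the automated results fall within the specified range of the average human inspector.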