Patents by Inventor Jonathan Tremblay

Jonathan Tremblay has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20200061811
    Abstract: In at least one embodiment, under the control of a robotic control system, a gripper on a robot is positioned to grasp a 3-dimensional object. In at least one embodiment, the relative position of the object and the gripper is determined, at least in part, by using a camera mounted on the gripper.
    Type: Application
    Filed: August 23, 2019
    Publication date: February 27, 2020
    Inventors: Shariq Iqbal, Jonathan Tremblay, Thang Hong To, Jia Cheng, Erik Leitch, Duncan J. McKay, Stanley Thomas Birchfield
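    A minimal sketch of the pose composition implied by the entry above, assuming a known hand-eye calibration (camera pose in the gripper frame) and an object pose already estimated in the camera frame; the helper names and numeric values are illustrative assumptions, not details from the patent:

```python
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Hand-eye calibration: pose of the camera expressed in the gripper frame.
# Placeholder values for illustration only.
T_gripper_camera = make_transform(np.eye(3), np.array([0.0, 0.0, 0.05]))

def object_pose_in_gripper_frame(T_camera_object: np.ndarray) -> np.ndarray:
    """Compose the camera-frame object pose with the camera-to-gripper calibration.

    T_camera_object would come from a pose estimator running on the image from the
    gripper-mounted camera; here it is simply an input.
    """
    return T_gripper_camera @ T_camera_object

# Example: object 30 cm in front of the camera, axis-aligned.
T_camera_object = make_transform(np.eye(3), np.array([0.0, 0.0, 0.30]))
T_gripper_object = object_pose_in_gripper_frame(T_camera_object)
print(T_gripper_object[:3, 3])  # object position relative to the gripper
```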
  • Publication number: 20190355150
    Abstract: An object detection neural network receives an input image including an object and generates belief maps for vertices of a bounding volume that encloses the object. The belief maps are used, along with three-dimensional (3D) coordinates defining the bounding volume, to compute the pose of the object in 3D space during post-processing. When multiple objects are present in the image, the object detection neural network may also generate vector fields for the vertices. A vector field comprises vectors pointing from the vertex to a centroid of the object enclosed by the bounding volume defined by the vertex. The object detection neural network may be trained using images of computer-generated objects rendered in 3D scenes (e.g., photorealistic synthetic data). Automatically labelled training datasets may be easily constructed using the photorealistic synthetic data. The object detection neural network may be trained for object detection using only the photorealistic synthetic data.
    Type: Application
    Filed: May 7, 2019
    Publication date: November 21, 2019
    Inventors: Jonathan Tremblay, Thang Hong To, Stanley Thomas Birchfield
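    A rough sketch of the post-processing step described in the entry above: take the peak of each vertex belief map as a 2D vertex detection and solve a Perspective-n-Point problem against the known 3D bounding-volume vertices. The helper names, the OpenCV-based solver, and the inputs are assumptions for illustration, not the patent's implementation:

```python
import numpy as np
import cv2  # OpenCV's solvePnP is used here as the 2D-3D pose solver

def belief_map_peaks(belief_maps: np.ndarray) -> np.ndarray:
    """Return the (x, y) pixel location of the maximum in each vertex belief map.

    belief_maps: array of shape (num_vertices, H, W), one map per bounding-volume vertex.
    """
    peaks = []
    for bmap in belief_maps:
        y, x = np.unravel_index(np.argmax(bmap), bmap.shape)
        peaks.append((float(x), float(y)))
    return np.asarray(peaks, dtype=np.float64)

def pose_from_belief_maps(belief_maps, cuboid_vertices_3d, camera_matrix):
    """Estimate object pose from belief-map peaks and the known 3D cuboid vertices."""
    image_points = belief_map_peaks(belief_maps)            # (N, 2) detected 2D vertices
    object_points = np.asarray(cuboid_vertices_3d, float)   # (N, 3) vertices in the object frame
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, distCoeffs=None)
    return ok, rvec, tvec  # rotation (Rodrigues vector) and translation of the object
```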
  • Publication number: 20190251397
    Abstract: Training deep neural networks requires a large amount of labeled training data. Conventionally, labeled training data is generated by gathering real images that are manually labelled, which is very time-consuming. Instead of manually labelling a training dataset, a domain randomization technique is used to generate training data that is automatically labeled. The generated training data may be used to train neural networks for object detection and segmentation (labelling) tasks. In an embodiment, the generated training data includes synthetic input images generated by rendering three-dimensional (3D) objects of interest in a 3D scene. In an embodiment, the generated training data includes synthetic input images generated by rendering 3D objects of interest on a 2D background image. The 3D objects of interest are objects that a neural network is trained to detect and/or label.

    Type: Application
    Filed: January 24, 2019
    Publication date: August 15, 2019
    Inventors: Jonathan Tremblay, Aayush Prakash, Mark A. Brophy, Varun Jampani, Cem Anil, Stanley Thomas Birchfield, Thang Hong To, David Jesus Acuna Marrero
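    A minimal sketch of the second variant described above (compositing a rendered object of interest onto a 2D background image at a randomized location and scale, with the bounding-box label produced automatically). The rendering step is replaced here by a pre-rendered RGBA crop, and the randomization is far simpler than a full domain-randomization pipeline; both are assumptions for illustration:

```python
import random
from PIL import Image

def composite_with_label(background_path: str, object_crop_path: str):
    """Paste a rendered object crop onto a background at a random image-plane pose.

    Returns the composited image and an automatically generated bounding-box label
    (class-agnostic here; a real pipeline would also record the object class).
    """
    background = Image.open(background_path).convert("RGB")
    obj = Image.open(object_crop_path).convert("RGBA")  # rendered object with alpha channel

    # Randomize scale and position (stand-ins for full domain randomization, which
    # would also vary lighting, textures, distractor objects, and camera pose).
    scale = random.uniform(0.3, 1.0)
    new_size = (max(1, int(obj.width * scale)), max(1, int(obj.height * scale)))
    obj = obj.resize(new_size)

    max_x = max(0, background.width - obj.width)
    max_y = max(0, background.height - obj.height)
    x, y = random.randint(0, max_x), random.randint(0, max_y)

    background.paste(obj, (x, y), mask=obj)  # alpha-composite the object onto the background

    bbox = (x, y, x + obj.width, y + obj.height)  # the label comes for free
    return background, bbox
```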
  • Publication number: 20190228495
    Abstract: Various embodiments enable a robot, or other autonomous or semi-autonomous device or system, to receive data involving the performance of a task in the physical world. The data can be provided as input to a perception network to infer a set of percepts about the task, which can correspond to relationships between objects observed during the performance. The percepts can be provided as input to a plan generation network, which can infer a set of actions as part of a plan. Each action can correspond to one of the observed relationships. The plan can be reviewed and any corrections made, either manually or through another demonstration of the task. Once the plan is verified as correct, the plan (and any related data) can be provided as input to an execution network that can infer instructions to cause the robot, and/or another robot, to perform the task.
    Type: Application
    Filed: January 23, 2019
    Publication date: July 25, 2019
    Inventors: Jonathan Tremblay, Stan Birchfield, Stephen Tyree, Thang To, Jan Kautz, Artem Molchanov
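    The three-stage pipeline described in the entry above (perception network, plan generation network, execution network) might be wired together roughly as follows. The class and function names, their interfaces, and the review hook are hypothetical placeholders; the patent does not specify them:

```python
from dataclasses import dataclass
from typing import Callable, List, Sequence

@dataclass
class Percept:
    """An inferred relationship between observed objects (e.g. 'block_a on block_b')."""
    subject: str
    relation: str
    target: str

@dataclass
class Action:
    """One step of the inferred plan, corresponding to one observed relationship."""
    verb: str
    subject: str
    target: str

def demonstration_to_execution(
    observation,                                            # raw sensor data from the demonstration
    perception_net: Callable[[object], List[Percept]],      # infers percepts from the observation
    plan_net: Callable[[Sequence[Percept]], List[Action]],  # infers a plan from the percepts
    execution_net: Callable[[Action], object],              # maps each action to robot instructions
    review: Callable[[List[Action]], List[Action]] = lambda plan: plan,
):
    """Sketch of the pipeline: perceive, plan, optionally correct, then execute."""
    percepts = perception_net(observation)
    plan = plan_net(percepts)
    plan = review(plan)          # human review / correction step before execution
    return [execution_net(action) for action in plan]
```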
  • Patent number: 9897569
    Abstract: An electronic device includes a first field effect transistor that includes a first gate electrode, a first drain electrode, and a first source electrode; a second field effect transistor that includes a second gate electrode, a second drain electrode, and a second source electrode, the first and second gate electrodes being at least one of electrically connected or integral, and the first and second source electrodes being at least one of electrically connected or integral; an input electrode electrically connected to the first and second gate electrodes; and an output electrode electrically connected to the first and second source electrodes. The first field effect transistor also includes a first semiconductor material. The second field effect transistor also includes a second semiconductor material.
    Type: Grant
    Filed: July 8, 2011
    Date of Patent: February 20, 2018
    Assignee: The Johns Hopkins University
    Inventors: Howard Edan Katz, Patrick N. Breysse, Bal Mukund Dhar, Noah Jonathan Tremblay
  • Patent number: 8046942
    Abstract: An electronic sign with rear access doors positionable in multiple directions, wherein sliding rear access doors provide access to the inner regions of the electronic sign while maintaining a thin profile. One or more of a plurality of sliding rear access doors can be disengaged from a back panel, urged rearwardly a minimal distance, and then positioned laterally and unobtrusively along the length of a top track opposing a bottom track, allowing the electronic sign to maintain a thin and unobtrusive profile.
    Type: Grant
    Filed: September 28, 2006
    Date of Patent: November 1, 2011
    Assignee: Daktronics, Inc.
    Inventors: Kory D. Kludt, Jonathan Tremblay