Patents by Inventor Kyle G. Freeman

Kyle G. Freeman has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11890747
    Abstract: An interactive autonomous robot is configured for deployment within a social environment. The disclosed robot includes a show subsystem configured to select between different in-character behaviors depending on robot status, thereby allowing the robot to appear in-character despite technical failures. The disclosed robot further includes a safety subsystem configured to intervene with in-character behavior when necessary to enforce safety protocols. The disclosed robot is also configured with a social subsystem that interprets social behaviors of humans and then initiates specific behavior sequences in response.
    Type: Grant
    Filed: June 20, 2019
    Date of Patent: February 6, 2024
    Assignee: Disney Enterprises, Inc.
    Inventors: Jeremy Andrew Mika, Michael Richard Honeck, Kyle G. Freeman, Michael Fusco
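The show/safety/social subsystem split described in the abstract above can be illustrated with a minimal arbitration sketch. This is not the patented implementation; all class names, behaviors, thresholds, and the priority ordering are illustrative assumptions.

```python
# Minimal sketch of a three-subsystem arbitration (show, safety, social).
# Names and behaviors are invented for illustration, not taken from the patent.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RobotStatus:
    battery_ok: bool
    actuators_ok: bool

class ShowSubsystem:
    """Selects an in-character behavior that matches the robot's current status."""
    def select_behavior(self, status: RobotStatus) -> str:
        if not status.actuators_ok:
            return "sleepy_idle"      # stay in character despite a technical failure
        if not status.battery_ok:
            return "yawn_and_wave"
        return "greet_guests"

class SafetySubsystem:
    """May override in-character behavior to enforce safety protocols."""
    def override(self, nearest_person_m: float) -> Optional[str]:
        return "stop_and_hold" if nearest_person_m < 0.5 else None

class SocialSubsystem:
    """Maps an interpreted human social behavior to a response sequence."""
    RESPONSES = {"wave": "wave_back", "approach": "turn_toward_person"}
    def respond(self, observed_behavior: str) -> Optional[str]:
        return self.RESPONSES.get(observed_behavior)

def choose_behavior(status, nearest_person_m, observed_behavior,
                    show=ShowSubsystem(), safety=SafetySubsystem(), social=SocialSubsystem()):
    # Safety takes priority, then social responses, then the default show behavior.
    return (safety.override(nearest_person_m)
            or social.respond(observed_behavior)
            or show.select_behavior(status))

print(choose_behavior(RobotStatus(True, True), 2.0, "wave"))  # -> wave_back
```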
  • Patent number: 11590660
    Abstract: An interactive autonomous robot is configured for deployment within a social environment. The disclosed robot includes a show subsystem configured to select between different in-character behaviors depending on robot status, thereby allowing the robot to appear in-character despite technical failures. The disclosed robot further includes a safety subsystem configured to intervene with in-character behavior when necessary to enforce safety protocols. The disclosed robot is also configured with a social subsystem that interprets social behaviors of humans and then initiates specific behavior sequences in response.
    Type: Grant
    Filed: June 20, 2019
    Date of Patent: February 28, 2023
    Assignee: Disney Enterprises, Inc.
    Inventors: Kyle G. Freeman, Michael Fusco, Jeremy Andrew Mika, Michael Richard Honeck, Jon Hayes Snoddy, Shelley Short, Cory Rouse, Raymond J. Scanlon, Clifford Wong, Lance Duane Updyke, Jeremie Papon, Daniel Pike
  • Patent number: 10814487
    Abstract: A self-guiding automaton includes sensors, a hardware processor, and a memory storing a self-guidance software code. The hardware processor executes the self-guidance software code to detect a location and/or travel path of one or more human beings in a pedestrian environment using the sensors, and identify a self-guided path forward in the pedestrian environment based on the location and/or travel path of each human being. The self-guidance software code also determines a social cue for communicating the self-guided path forward to the human beings, detects any change in the location and/or travel path of each human being using the sensors, determines a pedestrian safety score of the self-guided path forward based on the location and/or travel path and any change in the location and/or travel path of each human being, and moves the automaton along the self-guided path forward if the pedestrian safety score satisfies a safety threshold.
    Type: Grant
    Filed: January 22, 2018
    Date of Patent: October 27, 2020
    Assignee: Disney Enterprises, Inc.
    Inventors: Michael R. Honeck, Shelley O. Short, Jerry Rees, Kyle G. Freeman, Cory J. Rouse, Jeremy A. Mika, Michael Fusco
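The abstract above describes gating motion on a pedestrian safety score. A rough sketch of that idea follows; the scoring formula, prediction horizon, and threshold are assumptions, not the patented method.

```python
# Illustrative sketch: score a candidate path by predicted clearance to humans
# and move only if the score satisfies a safety threshold.
import math

def safety_score(path_points, human_positions, human_velocities, horizon_s=2.0):
    """Score a candidate path by the closest predicted approach to any human."""
    min_clearance = float("inf")
    predicted = (
        (px + vx * horizon_s, py + vy * horizon_s)
        for (px, py), (vx, vy) in zip(human_positions, human_velocities)
    )
    for hx, hy in predicted:
        for x, y in path_points:
            min_clearance = min(min_clearance, math.hypot(x - hx, y - hy))
    return min_clearance  # metres of clearance; larger is safer

def step(path_points, humans, velocities, safety_threshold_m=1.0):
    if safety_score(path_points, humans, velocities) >= safety_threshold_m:
        return "move_along_path"      # e.g. after signalling intent with a social cue
    return "wait_and_replan"

print(step([(0, 0), (1, 0), (2, 0)], [(2.5, 1.5)], [(0.0, -0.2)]))  # -> move_along_path
```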
  • Patent number: 10796195
    Abstract: Embodiments provide for perceptual data association from at least a first and a second sensor disposed at different positions in an environment, in respective series of local scene graphs that identify characteristics of objects in the environment and that are updated asynchronously, and for merging the series of local scene graphs to form a coherent image of the environment from multiple perspectives.
    Type: Grant
    Filed: January 16, 2020
    Date of Patent: October 6, 2020
    Assignee: Disney Enterprises, Inc.
    Inventors: Jeremie A. Papon, Kyle G. Freeman
  • Publication number: 20200311462
    Abstract: Embodiments provide for perceptual data association from at least a first and a second sensor disposed at different positions in an environment, in respective series of local scene graphs that identify characteristics of objects in the environment and that are updated asynchronously, and for merging the series of local scene graphs to form a coherent image of the environment from multiple perspectives.
    Type: Application
    Filed: January 16, 2020
    Publication date: October 1, 2020
    Inventors: Jeremie A. Papon, Kyle G. Freeman
  • Patent number: 10607105
    Abstract: Embodiments provide for perceptual data association received from at least a first and a second sensor disposed at different positions in an environment, in respective time series of local scene graphs that identify several characteristics of at least one object in the environment that are updated at different rates; merging, at an output rate, characteristics for each given object from the several time series of local scene graphs that are updated at the output rate; merging, at the output rate, characteristics for each given object from the time series of local scene graphs that are updated at rates other than the output rate; and outputting, at the output rate, a time series of global scene graphs including the merged characteristics.
    Type: Grant
    Filed: March 27, 2019
    Date of Patent: March 31, 2020
    Assignee: Disney Enterprises, Inc.
    Inventors: Jeremie A. Papon, Kyle G. Freeman
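A rough sketch of the merging step described above, fusing asynchronously updated local scene graphs into global scene graphs emitted at a fixed output rate. The data layout and field names are invented for illustration and are not the patented format.

```python
# Fuse per-object characteristics from asynchronous local scene graph updates,
# emitting one global scene graph at each requested output timestamp.
from collections import defaultdict

def merge_scene_graphs(local_updates, output_times):
    """local_updates: list of (timestamp, sensor_id, object_id, characteristics dict).
    Returns one global scene graph per output timestamp, using the most recent
    characteristics reported for each object by any sensor."""
    updates = sorted(local_updates)            # process in time order
    global_graphs = []
    latest = defaultdict(dict)                 # object_id -> merged characteristics
    i = 0
    for t_out in output_times:
        while i < len(updates) and updates[i][0] <= t_out:
            _, _, obj, chars = updates[i]
            latest[obj].update(chars)          # later observations override older ones
            i += 1
        global_graphs.append((t_out, {obj: dict(c) for obj, c in latest.items()}))
    return global_graphs

updates = [
    (0.00, "cam_A", "person_1", {"position": (1.0, 2.0)}),
    (0.03, "cam_B", "person_1", {"velocity": (0.1, 0.0)}),
    (0.09, "cam_A", "person_1", {"position": (1.1, 2.0)}),
]
for t, graph in merge_scene_graphs(updates, [0.05, 0.10]):
    print(t, graph)
```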
  • Publication number: 20200094417
    Abstract: An interactive autonomous robot is configured for deployment within a social environment. The disclosed robot includes a show subsystem configured to select between different in-character behaviors depending on robot status, thereby allowing the robot to appear in-character despite technical failures. The disclosed robot further includes a safety subsystem configured to intervene with in-character behavior when necessary to enforce safety protocols. The disclosed robot is also configured with a social subsystem that interprets social behaviors of humans and then initiates specific behavior sequences in response.
    Type: Application
    Filed: June 20, 2019
    Publication date: March 26, 2020
    Inventors: Kyle G. Freeman, Michael Fusco, Jeremy Andrew Mika, Michael Richard Honeck, Jon Hayes Snoddy, Shelley Short, Cory Rouse, Raymond J. Scanlon, Clifford Wong, Lance Duane Updyke, Jeremie Papon, Daniel Pike
  • Publication number: 20200097011
    Abstract: An interactive autonomous robot is configured for deployment within a social environment. The disclosed robot includes a show subsystem configured to select between different in-character behaviors depending on robot status, thereby allowing the robot to appear in-character despite technical failures. The disclosed robot further includes a safety subsystem configured to intervene with in-character behavior when necessary to enforce safety protocols. The disclosed robot is also configured with a social subsystem that interprets social behaviors of humans and then initiates specific behavior sequences in response.
    Type: Application
    Filed: June 20, 2019
    Publication date: March 26, 2020
    Inventors: Jeremy Andrew Mika, Michael Richard Honeck, Kyle G. Freeman, Michael Fusco
  • Publication number: 20190224850
    Abstract: A self-guiding automaton includes sensors, a hardware processor, and a memory storing a self-guidance software code. The hardware processor executes the self-guidance software code to detect a location and/or travel path of one or more human beings in a pedestrian environment using the sensors, and identify a self-guided path forward in the pedestrian environment based on the location and/or travel path of each human being. The self-guidance software code also determines a social cue for communicating the self-guided path forward to the human beings, detects any change in the location and/or travel path of each human being using the sensors, determines a pedestrian safety score of the self-guided path forward based on the location and/or travel path and any change in the location and/or travel path of each human being, and moves the automaton along the self-guided path forward if the pedestrian safety score satisfies a safety threshold.
    Type: Application
    Filed: January 22, 2018
    Publication date: July 25, 2019
    Inventors: Michael R. Honeck, Shelley O. Short, Jerry Rees, Kyle G. Freeman, Cory J. Rouse, Jeremy A. Mika, Michael Fusco
  • Patent number: 10012729
    Abstract: Systems and methods for tracking subjects within a three-dimensional physical environment are presented herein. A ranging sensor is mounted at a sensor location in the environment. The ranging sensor generates sensor output. The sensor output includes detected ranges of surfaces present in the environment as a function of orientations of the ranging sensor. Characteristics of the surfaces are determined using the detected ranges and orientations as polar coordinates of the surfaces.
    Type: Grant
    Filed: September 17, 2014
    Date of Patent: July 3, 2018
    Assignee: Disney Enterprises, Inc.
    Inventor: Kyle G. Freeman
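The abstract above treats ranging-sensor output as polar coordinates of surfaces. A small sketch of that conversion follows; the 2-D sensor model, function names, and example wall are assumptions for illustration only.

```python
# Convert (orientation, range) readings from a sensor at a known location into
# Cartesian surface points in the environment.
import math

def ranges_to_points(sensor_xy, readings):
    """readings: iterable of (orientation_rad, range_m) pairs from the sensor.
    Returns the Cartesian (x, y) positions of the detected surfaces."""
    sx, sy = sensor_xy
    return [
        (sx + r * math.cos(theta), sy + r * math.sin(theta))
        for theta, r in readings
    ]

# A sensor at (0, 0) observing a flat wall roughly 2 m in front of it.
readings = [(math.radians(a), 2.0 / math.cos(math.radians(a))) for a in range(-30, 31, 10)]
points = ranges_to_points((0.0, 0.0), readings)
print([(round(x, 2), round(y, 2)) for x, y in points])  # x stays ~2.0 along the wall
```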
  • Patent number: 9943959
    Abstract: A method for localizing robots and other objects when ranging devices are partially obstructed. The method includes retrieving a digital map of the space and then identifying potential locations for the object in the space. The method involves generating a prediction of expected ranges between the object and surfaces in the space at each of the potential locations. The method includes operating a ranging device to measure ranges to the surfaces in the space and comparing, for each of the predictions, the measured ranges with the expected ranges to identify the most accurate prediction. The comparing includes weighting ranges that are too small neutrally as these are strikes on obstructions, weighting ranges that provide matches positively, and weighting ranges that are too large negatively, and the comparing then involves summing the weighted ranges to identify a most accurate one of the predictions and associated current location for the object.
    Type: Grant
    Filed: April 25, 2016
    Date of Patent: April 17, 2018
    Assignee: Disney Enterprises, Inc.
    Inventor: Kyle G. Freeman
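The weighting scheme named in the abstract above (too-short ranges score neutrally, matches positively, too-long ranges negatively) lends itself to a short sketch. The tolerance value, data shapes, and map contents below are illustrative assumptions, not the patented parameters.

```python
# Score each candidate location by comparing measured ranges to the ranges
# expected from the map, then pick the best-scoring location.

def location_score(measured, expected, tolerance=0.2):
    """measured/expected: lists of ranges (metres) for the same sensor directions."""
    score = 0.0
    for m, e in zip(measured, expected):
        if abs(m - e) <= tolerance:
            score += 1.0        # match with the map: evidence for this location
        elif m < e:
            score += 0.0        # shorter than expected: probably hit an obstruction
        else:
            score -= 1.0        # longer than expected: inconsistent with this location
    return score

def localize(measured, predictions):
    """predictions: dict of candidate location -> expected ranges from the map."""
    return max(predictions, key=lambda loc: location_score(measured, predictions[loc]))

predictions = {
    "doorway":  [1.0, 2.0, 3.0, 2.0],
    "corridor": [4.0, 4.0, 4.0, 4.0],
}
measured = [0.4, 2.1, 3.0, 1.9]   # first beam blocked by a person standing nearby
print(localize(measured, predictions))  # -> doorway
```

The neutral weighting of short returns is what lets partially obstructed readings (people, props) avoid penalizing an otherwise correct location hypothesis.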
  • Publication number: 20170305013
    Abstract: A method for localizing robots and other objects when ranging devices are partially obstructed. The method includes retrieving a digital map of the space and then identifying potential locations for the object in the space. The method involves generating a prediction of expected ranges between the object and surfaces in the space at each of the potential locations. The method includes operating a ranging device to measure ranges to the surfaces in the space and comparing, for each of the predictions, the measured ranges with the expected ranges to identify the most accurate prediction. The comparing includes weighting ranges that are too small neutrally as these are strikes on obstructions, weighting ranges that provide matches positively, and weighting ranges that are too large negatively, and the comparing then involves summing the weighted ranges to identify a most accurate one of the predictions and associated current location for the object.
    Type: Application
    Filed: April 25, 2016
    Publication date: October 26, 2017
    Inventor: Kyle G. Freeman
  • Patent number: 9776323
    Abstract: A trained classifier is used with a navigation algorithm for mobile robots to compute safe and efficient trajectories. An offline learning process is used to train a classifier for the navigation algorithm (or motion planner), and the classifier functions, after training is complete, to accurately detect intentions of humans within a space shared with the robot to block the robot from traveling along its current trajectory. At runtime, the trained classifier can be used with regression based on past trajectories of humans (or other tracked, mobile entities) to predict where the humans will move in the future and whether the humans are likely to be blockers. The planning algorithm or motion planner generates trajectories based on predictions of human behavior that allow the robot to navigate amongst crowds of people more safely and efficiently.
    Type: Grant
    Filed: January 6, 2016
    Date of Patent: October 3, 2017
    Assignee: Disney Enterprises, Inc.
    Inventors: Carol Ann O'Sullivan, Chonhyon Park, Max L. Gilbert, Jan Ondrej, Kyle G. Freeman
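The runtime idea in the abstract above, extrapolating each person's recent trajectory and classifying likely blockers, can be sketched as follows. The linear extrapolation and the rule-based stand-in classifier are assumptions; the patent's own classifier is trained offline on labelled data.

```python
# Predict each tracked person's next position from their recent trajectory and
# flag likely blockers of the robot's planned path.

def extrapolate(track, dt=1.0):
    """track: list of (x, y) positions at uniform time steps. Linear prediction."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    return (x1 + (x1 - x0) * dt, y1 + (y1 - y0) * dt)

def likely_blocker(track, robot_path, block_radius=0.75):
    """Stand-in for the trained classifier: flag a person whose predicted
    position falls within block_radius of any waypoint on the robot's path."""
    px, py = extrapolate(track)
    return any((px - x) ** 2 + (py - y) ** 2 <= block_radius ** 2 for x, y in robot_path)

robot_path = [(0, 0), (1, 0), (2, 0), (3, 0)]
person = [(2.0, 2.0), (2.0, 1.2)]          # walking toward the robot's path
print(likely_blocker(person, robot_path))  # -> True
```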
  • Publication number: 20170190051
    Abstract: A trained classifier is used with a navigation algorithm for mobile robots to compute safe and efficient trajectories. An offline learning process is used to train a classifier for the navigation algorithm (or motion planner), and the classifier functions, after training is complete, to accurately detect intentions of humans within a space shared with the robot to block the robot from traveling along its current trajectory. At runtime, the trained classifier can be used with regression based on past trajectories of humans (or other tracked, mobile entities) to predict where the humans will move in the future and whether the humans are likely to be blockers. The planning algorithm or motion planner generates trajectories based on predictions of human behavior that allow the robot to navigate amongst crowds of people more safely and efficiently.
    Type: Application
    Filed: January 6, 2016
    Publication date: July 6, 2017
    Inventors: Carol Ann O'Sullivan, Chonhyon Park, Max L. Gilbert, Jan Ondrej, Kyle G. Freeman
  • Publication number: 20160077207
    Abstract: Systems and methods for tracking subjects within a three-dimensional physical environment are presented herein. A ranging sensor is mounted at a sensor location in the environment. The ranging sensor generates sensor output. The sensor output includes detected ranges of surfaces present in the environment as a function of orientations of the ranging sensor. Characteristics of the surfaces are determined using the detected ranges and orientations as polar coordinates of the surfaces.
    Type: Application
    Filed: September 17, 2014
    Publication date: March 17, 2016
    Inventor: Kyle G. Freeman
  • Patent number: 6700573
    Abstract: A technique for the realistic simulation of visual scenes uses voxels generated from positional and elevational data combined with a stored color map to generate vertex and triangle data that is used by an accelerator card to render a visual scene on a display device. In this manner, the invention provides rendering efficiency and speed for highly detailed visual scenes while preventing overdraw.
    Type: Grant
    Filed: November 7, 2001
    Date of Patent: March 2, 2004
    Assignee: NovaLogic, Inc.
    Inventor: Kyle G. Freeman
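The abstract above describes generating vertex and triangle data from positional/elevation data and a color map for an accelerator card to render. A simplified sketch of that heightfield-to-mesh step follows; the grid layout and output format are assumptions for illustration.

```python
# Build vertex and triangle lists from a small heightfield plus a color map,
# the kind of data a graphics accelerator consumes.

def heightfield_to_mesh(heights, colors, cell_size=1.0):
    """heights/colors: 2-D row-major grids of the same shape.
    Returns (vertices, triangles); vertices are (x, y, z, color)."""
    rows, cols = len(heights), len(heights[0])
    vertices = [
        (c * cell_size, r * cell_size, heights[r][c], colors[r][c])
        for r in range(rows) for c in range(cols)
    ]
    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            triangles.append((i, i + 1, i + cols))            # upper-left triangle
            triangles.append((i + 1, i + cols + 1, i + cols))  # lower-right triangle
    return vertices, triangles

heights = [[0, 1], [1, 2]]
colors = [["green", "green"], ["rock", "snow"]]
verts, tris = heightfield_to_mesh(heights, colors)
print(len(verts), "vertices,", len(tris), "triangles")  # 4 vertices, 2 triangles
```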
  • Publication number: 20030085896
    Abstract: A technique for the realistic simulation of visual scenes uses voxels generated from positional and elevational data combined with a stored color map to generate vertex and triangle data that is used by an accelerator card to render a visual scene on a display device. In this manner, the invention provides rendering efficiency and speed for highly detailed visual scenes while preventing overdraw.
    Type: Application
    Filed: November 7, 2001
    Publication date: May 8, 2003
    Inventor: Kyle G. Freeman
  • Patent number: 6373890
    Abstract: Color video data is compressed by storing only the most frequently occurring colors in a block of pixel data. A minimum number of colors for the block is determined by determining when colors in a block are the same as or comparatively close to colors in the corresponding block of the previous video frame, or are comparatively close to colors in the next previous adjacent block, such that no additional color data needs to be stored, and by consolidating comparatively close colors within the block. Two comparatively close colors are consolidated by substituting the more frequently occurring color for the less frequently occurring color in the block. Colors are comparatively close when the difference in their color values is less than a color threshold value. The color threshold value can be set by the user. If the minimum number of colors determined for the block is greater than four, then the color thresholds are adjusted and a new minimum number of colors is then determined.
    Type: Grant
    Filed: May 5, 1998
    Date of Patent: April 16, 2002
    Assignee: NovaLogic, Inc.
    Inventor: Kyle G. Freeman
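The per-block color consolidation described above can be sketched directly: fold comparatively close colors into their more frequent neighbor, and relax the threshold until the block fits within the color budget. The single-channel color values, starting threshold, and relaxation step are simplifying assumptions.

```python
# Reduce a block of pixel colors to a small palette by substituting the more
# frequent of two comparatively close colors for the less frequent one.
from collections import Counter

def consolidate_block(pixels, threshold=8, max_colors=4):
    """pixels: flat list of color values for one block (e.g. a 4x4 block)."""
    while True:
        counts = Counter(pixels)
        merged = False
        for color in sorted(counts, key=counts.get):                # least frequent first
            for target in sorted(counts, key=counts.get, reverse=True):
                if target != color and abs(target - color) < threshold \
                        and counts[target] >= counts[color]:
                    pixels = [target if p == color else p for p in pixels]
                    merged = True
                    break
            if merged:
                break
        if not merged:
            if len(set(pixels)) <= max_colors:
                return pixels
            threshold += 4      # relax the threshold until few enough colors remain

block = [10, 12, 11, 10, 200, 201, 10, 12, 199, 10, 11, 12, 10, 200, 201, 10]
print(sorted(set(consolidate_block(block))))   # -> [10, 200]
```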
  • Publication number: 20010055021
    Abstract: A technique and system for the realistic simulation of visual scenes reduces the three-dimensional computation to two additions and further reduces the need for three-dimensional computations by displaying several screen pixels per three-dimensional computation. The approach when implemented in hardware or software significantly speeds up scene generation time while improving the resolution and realism of the rendered scene.
    Type: Application
    Filed: January 31, 2000
    Publication date: December 27, 2001
    Inventor: Kyle G. Freeman
  • Patent number: 6020893
    Abstract: A technique and system for the realistic simulation of visual scenes reduces the three-dimensional computation to two additions and further reduces the need for three-dimensional computations by displaying several screen pixels per three-dimensional computation. The approach when implemented in hardware or software significantly speeds up scene generation time while improving the resolution and realism of the rendered scene.
    Type: Grant
    Filed: April 11, 1997
    Date of Patent: February 1, 2000
    Assignee: NovaLogic, Inc.
    Inventor: Kyle G. Freeman