Patents by Inventor Richard C. Roesler

Richard C. Roesler has filed patents protecting the following inventions. The listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20180349012
    Abstract: Technologies described herein provide a mixed environment display of attached control elements. The techniques disclosed herein enable users of a first computing device to interact with a remote computing device configured to control an object, such as a light, appliance, or any other suitable object. Configurations disclosed herein enable the first computing device to cause one or more actions, such as a selection of the object or the display of a user interface, by capturing and analyzing input data defining the performance of one or more gestures, such as a user looking at the object controlled by the remote computing device. Rendered graphical elements configured to enable control of the object can be displayed with a real-world view of the object.
    Type: Application
    Filed: June 1, 2018
    Publication date: December 6, 2018
    Inventors: David M. HILL, Andrew William JEAN, Jeffrey J. EVERTT, Alan M. JONES, Richard C. ROESLER, Charles W. CARLSON, Emiko V. CHARBONNEAU, James DACK
  • Patent number: 10099382
    Abstract: Concepts and technologies are described herein for providing a mixed environment display of robotic actions. In some configurations, techniques disclosed herein can execute a set of instructions, or run a simulation of the set of instructions, to generate model data defining the actions of a robot. Using the model data, one or more computing devices may generate graphical data comprising a graphical representation of the robot performing tasks or actions based on an execution of the instructions. The graphical data can be in the form of an animation showing a robot's actions, which may include movement of the robot and the robot's interaction with other objects. Graphical elements showing the status of the robot or graphical representations of the instructions may be included in the graphical data. The graphical data may be displayed on an interface or communicated to one or more computers for further processing.
    Type: Grant
    Filed: August 11, 2015
    Date of Patent: October 16, 2018
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: David M. Hill, Alan M. Jones, Jeffrey J. Evertt, Richard C. Roesler, Emiko V. Charbonneau, Andrew William Jean
  • Patent number: 10007413
    Abstract: Technologies described herein provide a mixed environment display of attached control elements. The techniques disclosed herein enable users of a first computing device to interact with a remote computing device configured to control an object, such as a light, appliance, or any other suitable object. Configurations disclosed herein enable the first computing device to cause one or more actions, such as a selection of the object or the display of a user interface, by capturing and analyzing input data defining the performance of one or more gestures, such as a user looking at the object controlled by the remote computing device. Rendered graphical elements configured to enable control of the object can be displayed with a real-world view of the object.
    Type: Grant
    Filed: November 19, 2015
    Date of Patent: June 26, 2018
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: David M. Hill, Andrew William Jean, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler, Charles W. Carlson, Emiko V. Charbonneau, James Dack
  • Publication number: 20170297204
    Abstract: Concepts and technologies are described herein for providing enhanced configuration and control of robots. Configurations disclosed herein augment a mobile computing device, such as a robot, with resources for understanding and navigation of an environment surrounding the computing device. The resources can include sensors of a separate computing device, which may be in the form of a head-mounted display. Data produced by the resources can be used to generate instructions for the mobile computing device. The sensors of the separate computing device can also detect a change in an environment or a conflict in the actions of the mobile computing device, and dynamically modify the generated instructions. By the use of the techniques disclosed herein, a simple, low-cost robot can understand and navigate through a complex environment and appropriately interact with obstacles and other objects.
    Type: Application
    Filed: July 5, 2017
    Publication date: October 19, 2017
    Inventors: David M. HILL, Jeffrey J. EVERTT, Alan M. JONES, Richard C. ROESLER, Andrew William JEAN, Emiko V. CHARBONNEAU
  • Patent number: 9713871
    Abstract: Concepts and technologies are described herein for providing enhanced configuration and control of robots. Configurations disclosed herein augment a mobile computing device, such as a robot, with resources for understanding and navigation of an environment surrounding the computing device. The resources can include sensors of a separate computing device, which may be in the form of a head-mounted display. Data produced by the resources can be used to generate instructions for the mobile computing device. The sensors of the separate computing device can also detect a change in an environment or a conflict in the actions of the mobile computing device, and dynamically modify the generated instructions. By the use of the techniques disclosed herein, a simple, low-cost robot can understand and navigate through a complex environment and appropriately interact with obstacles and other objects.
    Type: Grant
    Filed: August 11, 2015
    Date of Patent: July 25, 2017
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: David M. Hill, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler, Andrew William Jean, Emiko V. Charbonneau
  • Publication number: 20160314621
    Abstract: A holographic user interface may display status information in proximity to relevant components of the computing device, such as a robot, allowing a user to readily associate the status information with the relevant components. Arrangements of the graphical displays may utilize graphical elements to show an association between any displayed data and any component of a computing device. Based on data indicating the size, shape, and configuration of a robot's physical parts, techniques disclosed herein can arrange displayed status data, which may involve a holographic UI, in a relevant context. In addition, techniques disclosed herein allow a user to edit data, or provide an input to one or more computing devices in response to the display of any status data.
    Type: Application
    Filed: August 11, 2015
    Publication date: October 27, 2016
    Inventors: David M. Hill, Emiko V. Charbonneau, Andrew William Jean, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler
  • Publication number: 20160311115
    Abstract: Concepts and technologies are described herein for providing enhanced configuration and control of robots. Configurations disclosed herein augment a mobile computing device, such as a robot, with resources for understanding and navigation of an environment surrounding the computing device. The resources can include sensors of a separate computing device, which may be in the form of a head-mounted display. Data produced by the resources can be used to generate instructions for the mobile computing device. The sensors of the separate computing device can also detect a change in an environment or a conflict in the actions of the mobile computing device, and dynamically modify the generated instructions. By the use of the techniques disclosed herein, a simple, low-cost robot can understand and navigate through a complex environment and appropriately interact with obstacles and other objects.
    Type: Application
    Filed: August 11, 2015
    Publication date: October 27, 2016
    Inventors: David M. Hill, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler, Andrew William Jean, Emiko V. Charbonneau
  • Publication number: 20160311116
    Abstract: Concepts and technologies are described herein for providing a mixed environment display of robotic actions. In some configurations, techniques disclosed herein can execute a set of instructions, or run a simulation of the set of instructions, to generate model data defining the actions of a robot. Using the model data, one or more computing devices may generate graphical data comprising a graphical representation of the robot performing tasks or actions based on an execution of the instructions. The graphical data can be in the form of an animation showing a robot's actions, which may include movement of the robot and the robot's interaction with other objects. Graphical elements showing the status of the robot or graphical representations of the instructions may be included in the graphical data. The graphical data may be displayed on an interface or communicated to one or more computers for further processing.
    Type: Application
    Filed: August 11, 2015
    Publication date: October 27, 2016
    Inventors: David M. Hill, Alan M. Jones, Jeffrey J. Evertt, Richard C. Roesler, Emiko V. Charbonneau, Andrew William Jean
  • Publication number: 20160313902
    Abstract: Technologies described herein provide a mixed environment display of attached control elements. The techniques disclosed herein enable users of a first computing device to interact with a remote computing device configured to control an object, such as a light, appliance, or any other suitable object. Configurations disclosed herein enable the first computing device to cause one or more actions, such as a selection of the object or the display of a user interface, by capturing and analyzing input data defining the performance of one or more gestures, such as a user looking at the object controlled by the remote computing device. Rendered graphical elements configured to enable control of the object can be displayed with a real-world view of the object.
    Type: Application
    Filed: November 19, 2015
    Publication date: October 27, 2016
    Inventors: David M. Hill, Andrew William Jean, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler, Charles W. Carlson, Emiko V. Charbonneau, James Dack
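
The "attached control elements" abstracts (publication 20180349012, patent 10007413, publication 20160313902) describe selecting an object by gaze and then rendering control elements for it. The patents do not disclose source code; the following is a minimal illustrative sketch, not the patented implementation, using a hypothetical ray-sphere hit test to stand in for the gesture-analysis step:

```python
import math
from dataclasses import dataclass

@dataclass
class ControlledObject:
    name: str      # e.g. a light controlled by the remote computing device
    center: tuple  # world-space position (x, y, z)
    radius: float  # bounding-sphere radius used for the gaze hit test

def gaze_selects(origin, direction, obj):
    """Return True if a gaze ray from `origin` along the unit vector
    `direction` passes within obj.radius of obj.center."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    cx, cy, cz = obj.center
    # Vector from the ray origin to the sphere center
    vx, vy, vz = cx - ox, cy - oy, cz - oz
    t = vx * dx + vy * dy + vz * dz  # projection onto the gaze direction
    if t < 0:
        return False                 # object is behind the viewer
    # Closest point on the ray to the sphere center
    px, py, pz = ox + t * dx, oy + t * dy, oz + t * dz
    dist = math.sqrt((px - cx) ** 2 + (py - cy) ** 2 + (pz - cz) ** 2)
    return dist <= obj.radius

def select_object(origin, direction, objects):
    """Return the first object the user's gaze selects, or None."""
    for obj in objects:
        if gaze_selects(origin, direction, obj):
            return obj
    return None
```

In the abstracts' terms, a positive hit would trigger "a selection of the object or the display of a user interface"; real systems would also fuse head pose, dwell time, and other gesture data rather than a single ray test.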
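
The "mixed environment display of robotic actions" abstracts (patent 10099382, publication 20160311116) describe running a simulation of an instruction set to produce model data, which is then rendered as an animation. A toy sketch of that idea, with a hypothetical two-instruction vocabulary (`move`, `grip`) that is not taken from the patent:

```python
def simulate(instructions, start=(0.0, 0.0)):
    """Run the instruction set in simulation, producing model data:
    one (x, y, action) frame per instruction, without driving a robot."""
    x, y = start
    frames = [(x, y, "start")]
    for op, *args in instructions:
        if op == "move":
            dx, dy = args
            x, y = x + dx, y + dy
            frames.append((x, y, "move"))
        elif op == "grip":
            frames.append((x, y, "grip"))
        else:
            raise ValueError(f"unknown instruction: {op}")
    return frames
```

The resulting frames are the "model data" in the abstracts' sense: a renderer could interpolate between them to animate the robot's movement and interactions before any instruction is executed on hardware.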
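
The "enhanced configuration and control of robots" abstracts (patent 9713871, publications 20170297204 and 20160311115) describe a separate device's sensors generating instructions for a robot and dynamically modifying them when the environment changes. A minimal sketch of that replanning loop, assuming a small grid world and breadth-first search (the patent does not specify a planning algorithm):

```python
from collections import deque

def plan_path(start, goal, obstacles, size=8):
    """Breadth-first search over a size x size grid: returns a list of
    cells from start to goal avoiding `obstacles`, or None if blocked."""
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the parent links back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if 0 <= nx < size and 0 <= ny < size \
               and nxt not in obstacles and nxt not in came_from:
                came_from[nxt] = cell
                frontier.append(nxt)
    return None

def replan_on_detection(current, goal, obstacles, detected):
    """When the separate device's sensors detect a new obstacle, fold it
    into the map and regenerate the robot's movement instructions."""
    return plan_path(current, goal, obstacles | {detected})
```

This mirrors the abstracts' claim that a simple, low-cost robot can navigate a complex environment by borrowing perception from another device: the robot only follows the cell sequence, while the head-mounted display supplies the map updates.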
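
The holographic-UI abstract (publication 20160314621) describes arranging status data in proximity to the relevant robot components based on each part's size, shape, and configuration. A hypothetical sketch of that layout step, anchoring each label just above its part's bounding box (the offset rule is an assumption, not from the patent):

```python
def place_status_labels(components, margin=0.05):
    """Given each part's position, bounding-box size, and status, anchor
    its holographic label just above the part so the user can readily
    associate the displayed data with the physical component."""
    labels = {}
    for name, (pos, size, status) in components.items():
        x, y, z = pos
        _, height, _ = size
        labels[name] = {
            "anchor": (x, y + height / 2 + margin, z),
            "text": f"{name}: {status}",
        }
    return labels
```

A real system would also reproject the anchors as the user's viewpoint moves and resolve overlaps between neighboring labels; this sketch covers only the part-to-label association the abstract emphasizes.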