Patents by Inventor David M. Hill

David M. Hill has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11941600
    Abstract: In a method for advance identification of a customer, the customer may remotely place an order, intending to later go to a store to pick up the ordered item. The store may have a pick-up area (e.g., at the back of the store) where the customer can go to pick up the ordered item. To save the customer time, the customer may be identified when she enters the store so that an employee can obtain her ordered item and have it ready to pick up by the time the customer walks through the store and arrives at the pick-up area.
    Type: Grant
    Filed: March 29, 2023
    Date of Patent: March 26, 2024
    Assignee: WALGREEN CO.
    Inventors: Nimesh S. Jhaveri, Archana Dhruve, Heather K. Hill, Dejan Kozic, Laura Jean Tebbe, Susan G. Heald, Warit Tulyathorn, Mark A. Jones, Sara B. Frisk, Jennifer M. Levin, Jennifer A. Comiskey, David T. Blanchard
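A minimal sketch of the kind of pick-up workflow the abstract above describes, written in Python. The Order and PickupSystem classes, the customer_entered_store hook, and the notify_employee step are hypothetical illustrations, assuming the store can recognize the customer at the entrance (e.g., via an app check-in); none of these names come from the patent.

```python
# Hypothetical sketch of an advance-identification pick-up flow.
# All class and function names are illustrative; the patent does not
# specify an implementation.
from dataclasses import dataclass, field


@dataclass
class Order:
    order_id: str
    customer_id: str
    item: str
    staged: bool = False  # True once an employee has pulled the item


@dataclass
class PickupSystem:
    pending: dict = field(default_factory=dict)  # customer_id -> Order

    def place_remote_order(self, order: Order) -> None:
        """Customer orders remotely for in-store pick-up."""
        self.pending[order.customer_id] = order

    def customer_entered_store(self, customer_id: str) -> None:
        """Called when the customer is identified at the entrance
        (e.g., via app check-in or loyalty-card scan)."""
        order = self.pending.get(customer_id)
        if order and not order.staged:
            self.notify_employee(order)

    def notify_employee(self, order: Order) -> None:
        # A real system would page staff; here we just print and mark
        # the order as staged at the pick-up area.
        print(f"Stage order {order.order_id} ({order.item}) at pick-up desk")
        order.staged = True


system = PickupSystem()
system.place_remote_order(Order("A100", "cust-42", "prescription refill"))
system.customer_entered_store("cust-42")   # item is staged before arrival
```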
  • Patent number: 10768772
    Abstract: Techniques for context-aware recommendations of relevant presentation content are disclosed. In some configurations, the techniques involve the processing of contextual data from one or more resources to dynamically direct a presentation. A computing device can receive contextual data from one or more resources. The contextual data can be processed to select one or more sections of content data, e.g., a slide of a slide deck or a page of a document. The computing device can then cause a display of the one or more sections on one or more devices. In some configurations, a hardware display surface can be configured to provide a real-world view of an object, e.g., a model or other item that may be related to the presentation, through the hardware display surface while also providing a display of the one or more sections of the presenter's material.
    Type: Grant
    Filed: November 19, 2015
    Date of Patent: September 8, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Travis William Steiner, Shannon Richard Monroe, Emiko Valiane Charbonneau, Michal Hlavac, Brian Murphy, David M. Hill, Jia Wang
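As a rough illustration of context-aware content selection, the sketch below scores each section of a slide deck against keywords extracted from contextual data and picks the best match. The keyword-overlap scoring and the select_section function are assumptions made for illustration, not the technique claimed in the patent.

```python
# Illustrative sketch: pick the slide whose text best matches the current
# context (e.g., keywords from room audio or audience questions).
# The scoring scheme is an assumption, not the patented technique.
def select_section(sections: list[str], context_keywords: set[str]) -> int:
    """Return the index of the section with the most keyword overlap."""
    def score(text: str) -> int:
        words = set(text.lower().split())
        return len(words & context_keywords)

    return max(range(len(sections)), key=lambda i: score(sections[i]))


slides = [
    "Q3 revenue summary and growth targets",
    "Prototype hardware display surface demo",
    "Hiring plan and team structure",
]
context = {"display", "hardware", "demo"}
print(select_section(slides, context))  # -> 1
```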
  • Patent number: 10572133
    Abstract: Technologies described herein provide a mixed environment display of attached control elements. The techniques disclosed herein enable users of a first computing device to interact with a remote computing device configured to control an object, such as a light, appliance, or any other suitable object. Configurations disclosed herein enable the first computing device to cause one or more actions, such as a selection of the object or the display of a user interface, by capturing and analyzing input data defining the performance of one or more gestures, such as a user looking at the object controlled by the second computing device. Rendered graphical elements configured to enable the control of the object can be displayed with a real-world view of the object.
    Type: Grant
    Filed: June 1, 2018
    Date of Patent: February 25, 2020
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: David M. Hill, Andrew William Jean, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler, Charles W. Carlson, Emiko V. Charbonneau, James Dack
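To make the gaze-driven selection concrete, the sketch below checks which registered controllable object a head-mounted display's gaze ray points at and, if one is close enough, triggers display of its control panel. The object positions, the 5-degree threshold, and the gaze_target function are illustrative assumptions, not the patented method.

```python
# Illustrative sketch: map a head-mounted display's gaze ray onto
# registered controllable objects (e.g., a lamp) and open a control UI
# for the one being looked at. Geometry and thresholds are assumptions.
import math

# object name -> (x, y, z) position in the room, in metres
objects = {"lamp": (2.0, 1.0, 0.5), "fan": (-1.0, 1.2, 0.0)}


def gaze_target(origin, direction, max_angle_deg=5.0):
    """Return the object whose direction from `origin` is within
    `max_angle_deg` of the gaze `direction`, or None."""
    def angle_to(pos):
        v = [p - o for p, o in zip(pos, origin)]
        norm_v = math.sqrt(sum(c * c for c in v))
        norm_d = math.sqrt(sum(c * c for c in direction))
        cos = sum(a * b for a, b in zip(v, direction)) / (norm_v * norm_d)
        return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

    best = min(objects, key=lambda name: angle_to(objects[name]))
    return best if angle_to(objects[best]) <= max_angle_deg else None


target = gaze_target(origin=(0.0, 1.6, 0.0), direction=(0.78, -0.23, 0.2))
if target:
    print(f"Render control panel next to the real-world view of '{target}'")
```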
  • Patent number: 10449673
    Abstract: Concepts and technologies are described herein for providing enhanced configuration and control of robots. Configurations disclosed herein augment a mobile computing device, such as a robot, with resources for understanding and navigation of an environment surrounding the computing device. The resources can include sensors of a separate computing device, which may be in the form of a head-mounted display. Data produced by the resources can be used to generate instructions for the mobile computing device. The sensors of the separate computing device can also detect a change in an environment or a conflict in the actions of the mobile computing device, and dynamically modify the generated instructions. By the use of the techniques disclosed herein, a simple, low-cost robot can understand and navigate through a complex environment and appropriately interact with obstacles and other objects.
    Type: Grant
    Filed: July 5, 2017
    Date of Patent: October 22, 2019
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: David M. Hill, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler, Andrew William Jean, Emiko V. Charbonneau
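One way to picture "generating instructions for the mobile computing device" from a separate device's sensors is the sketch below: a 2D occupancy grid (as a head-mounted display's spatial mapping might supply) is searched with breadth-first search to produce step-by-step moves, and the plan is simply recomputed when a new obstacle is detected. The grid encoding and the plan function are assumptions for illustration only.

```python
# Illustrative sketch: turn an occupancy grid (as an HMD's spatial mapping
# might produce) into step-by-step move instructions for a simple robot,
# and re-plan when a new obstacle is detected.
from collections import deque


def plan(grid, start, goal):
    """Breadth-first search over a 2D grid; returns a list of (dr, dc) moves."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            break
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in prev):
                prev[nxt] = (r, c)
                queue.append(nxt)
    if goal not in prev:
        return []                       # no path found
    moves, node = [], goal
    while prev[node] is not None:
        pr, pc = prev[node]
        moves.append((node[0] - pr, node[1] - pc))
        node = prev[node]
    return list(reversed(moves))


grid = [[0, 0, 0],
        [1, 1, 0],                      # 1 = obstacle seen by the HMD
        [0, 0, 0]]
print(plan(grid, start=(0, 0), goal=(2, 0)))
# If the HMD later sees a new obstacle, update `grid` and call plan() again.
```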
  • Publication number: 20180349012
    Abstract: Technologies described herein provide a mixed environment display of attached control elements. The techniques disclosed herein enable users of a first computing device to interact with a remote computing device configured to control an object, such as a light, appliance, or any other suitable object. Configurations disclosed herein enable the first computing device to cause one or more actions, such as a selection of the object or the display of a user interface, by capturing and analyzing input data defining the performance of one or more gestures, such as a user looking at the object controlled by the second computing device. Rendered graphical elements configured to enable the control of the object can be displayed with a real-world view of the object.
    Type: Application
    Filed: June 1, 2018
    Publication date: December 6, 2018
    Inventors: David M. Hill, Andrew William Jean, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler, Charles W. Carlson, Emiko V. Charbonneau, James Dack
  • Patent number: 10099382
    Abstract: Concepts and technologies are described herein for providing a mixed environment display of robotic actions. In some configurations, techniques disclosed herein can execute a set of instructions or run a simulation of the set of instructions to generate model data defining actions of a robot based on a set of instructions. Using the model data, one or more computing devices may generate graphical data comprising a graphical representation of the robot performing tasks or actions based on an execution of the instructions. The graphical data can be in the form of an animation showing a robot's actions, which may include movement of the robot and the robot's interaction with other objects. Graphical elements showing the status of the robot or graphical representations of the instructions may be included in the graphical data. The graphical data may be displayed on an interface or communicated to one or more computers for further processing.
    Type: Grant
    Filed: August 11, 2015
    Date of Patent: October 16, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: David M. Hill, Alan M. Jones, Jeffrey J. Evertt, Richard C. Roesler, Emiko V. Charbonneau, Andrew William Jean
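A simple way to produce "model data defining actions of a robot" from an instruction list is to run the instructions through a kinematic simulation and record a pose per step, as in the sketch below. The turn/forward instruction names and the (x, y, heading) pose format are illustrative assumptions; the patent does not prescribe them.

```python
# Illustrative sketch: "execute" a robot's instruction list in simulation and
# record a pose per step as model data that a renderer could animate.
import math


def simulate(instructions):
    """Return a list of (x, y, heading_deg) poses, one per instruction."""
    x, y, heading = 0.0, 0.0, 0.0
    frames = [(x, y, heading)]
    for op, value in instructions:
        if op == "turn":                # degrees, positive = counter-clockwise
            heading = (heading + value) % 360
        elif op == "forward":           # metres along the current heading
            x += value * math.cos(math.radians(heading))
            y += value * math.sin(math.radians(heading))
        frames.append((x, y, heading))
    return frames


program = [("forward", 1.0), ("turn", 90), ("forward", 0.5)]
for frame in simulate(program):
    print(frame)   # feed these poses to the graphical representation
```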
  • Patent number: 10007413
    Abstract: Technologies described herein provide a mixed environment display of attached control elements. The techniques disclosed herein enable users of a first computing device to interact with a remote computing device configured to control an object, such as a light, appliance, or any other suitable object. Configurations disclosed herein enable the first computing device to cause one or more actions, such as a selection of the object or the display of a user interface, by capturing and analyzing input data defining the performance of one or more gestures, such as a user looking at the object controlled by the second computing device. Rendered graphical elements configured to enable the control of the object can be displayed with a real-world view of the object.
    Type: Grant
    Filed: November 19, 2015
    Date of Patent: June 26, 2018
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: David M. Hill, Andrew William Jean, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler, Charles W. Carlson, Emiko V. Charbonneau, James Dack
  • Publication number: 20170297204
    Abstract: Concepts and technologies are described herein for providing enhanced configuration and control of robots. Configurations disclosed herein augment a mobile computing device, such as a robot, with resources for understanding and navigation of an environment surrounding the computing device. The resources can include sensors of a separate computing device, which may be in the form of a head-mounted display. Data produced by the resources can be used to generate instructions for the mobile computing device. The sensors of the separate computing device can also detect a change in an environment or a conflict in the actions of the mobile computing device, and dynamically modify the generated instructions. By the use of the techniques disclosed herein, a simple, low-cost robot can understand and navigate through a complex environment and appropriately interact with obstacles and other objects.
    Type: Application
    Filed: July 5, 2017
    Publication date: October 19, 2017
    Inventors: David M. Hill, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler, Andrew William Jean, Emiko V. Charbonneau
  • Patent number: 9713871
    Abstract: Concepts and technologies are described herein for providing enhanced configuration and control of robots. Configurations disclosed herein augment a mobile computing device, such as a robot, with resources for understanding and navigation of an environment surrounding the computing device. The resources can include sensors of a separate computing device, which may be in the form of a head-mounted display. Data produced by the resources can be used to generate instructions for the mobile computing device. The sensors of the separate computing device can also detect a change in an environment or a conflict in the actions of the mobile computing device, and dynamically modify the generated instructions. By the use of the techniques disclosed herein, a simple, low-cost robot can understand and navigate through a complex environment and appropriately interact with obstacles and other objects.
    Type: Grant
    Filed: August 11, 2015
    Date of Patent: July 25, 2017
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: David M. Hill, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler, Andrew William Jean, Emiko V. Charbonneau
  • Publication number: 20170147154
    Abstract: Techniques for context-aware recommendations of relevant presentation content are disclosed. In some configurations, the techniques involve the processing of contextual data from one or more resources to dynamically direct a presentation. A computing device can receive contextual data from one or more resources. The contextual data can be processed to select one or more sections of content data, e.g., a slide of a slide deck or a page of a document. The computing device can then cause a display of the one or more sections on one or more devices. In some configurations, a hardware display surface can be configured to provide a real-world view of an object, e.g., a model or other item that may be related to the presentation, through the hardware display surface while also providing a display of the one or more sections of the presenter's material.
    Type: Application
    Filed: November 19, 2015
    Publication date: May 25, 2017
    Inventors: Travis William Steiner, Shannon Richard Monroe, Emiko Valiane Charbonneau, Michal Hlavac, Brian Murphy, David M. Hill, Jia Wang
  • Publication number: 20160314621
    Abstract: A holographic user interface may display status information in proximity to relevant components of the computing device, such as a robot, allowing a user to readily associate the status information with the relevant components. Arrangements of the graphical displays may utilize graphical elements to show an association between any displayed data and any component of a computing device. Based on data indicating the size, shape, and configuration of a robot's physical parts, techniques disclosed herein can arrange displayed status data, which may involve a holographic UI, in a relevant context. In addition, techniques disclosed herein allow a user to edit data, or provide an input to one or more computing devices in response to the display of any status data.
    Type: Application
    Filed: August 11, 2015
    Publication date: October 27, 2016
    Inventors: David M. Hill, Emiko V. Charbonneau, Andrew William Jean, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler
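The sketch below illustrates one plausible placement rule for showing status "in proximity to relevant components": each label is centred horizontally over its component's bounding box, just above it. The component geometry, the LABEL_GAP offset, and the label_positions function are assumptions made for the example, not the patented arrangement.

```python
# Illustrative sketch: place a status label just above each robot component's
# bounding box so a holographic UI can show status next to the part.
# Component data and the offset rule are assumptions.

# component name -> (min_x, min_y, max_x, max_y) in the display's coordinates
components = {
    "left_wheel": (0.0, 0.0, 0.2, 0.2),
    "gripper":    (0.5, 0.6, 0.7, 0.9),
}
status = {"left_wheel": "OK", "gripper": "jammed"}

LABEL_GAP = 0.05  # vertical gap between a part and its label


def label_positions():
    """Centre each label horizontally over its component, just above it."""
    placements = {}
    for name, (min_x, min_y, max_x, max_y) in components.items():
        placements[name] = ((min_x + max_x) / 2, max_y + LABEL_GAP)
    return placements


for name, pos in label_positions().items():
    print(f"{name}: '{status[name]}' at {pos}")
```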
  • Publication number: 20160311115
    Abstract: Concepts and technologies are described herein for providing enhanced configuration and control of robots. Configurations disclosed herein augment a mobile computing device, such as a robot, with resources for understanding and navigation of an environment surrounding the computing device. The resources can include sensors of a separate computing device, which may be in the form of a head-mounted display. Data produced by the resources can be used to generate instructions for the mobile computing device. The sensors of the separate computing device can also detect a change in an environment or a conflict in the actions of the mobile computing device, and dynamically modify the generated instructions. By the use of the techniques disclosed herein, a simple, low-cost robot can understand and navigate through a complex environment and appropriately interact with obstacles and other objects.
    Type: Application
    Filed: August 11, 2015
    Publication date: October 27, 2016
    Inventors: David M. Hill, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler, Andrew William Jean, Emiko V. Charbonneau
  • Publication number: 20160311116
    Abstract: Concepts and technologies are described herein for providing a mixed environment display of robotic actions. In some configurations, techniques disclosed herein can execute a set of instructions or run a simulation of the set of instructions to generate model data defining actions of a robot based on a set of instructions. Using the model data, one or more computing devices may generate graphical data comprising a graphical representation of the robot performing tasks or actions based on an execution of the instructions. The graphical data can be in the form of an animation showing a robot's actions, which may include movement of the robot and the robot's interaction with other objects. Graphical elements showing the status of the robot or graphical representations of the instructions may be included in the graphical data. The graphical data may be displayed on an interface or communicated to one or more computers for further processing.
    Type: Application
    Filed: August 11, 2015
    Publication date: October 27, 2016
    Inventors: David M. Hill, Alan M. Jones, Jeffrey J. Evertt, Richard C. Roesler, Emiko V. Charbonneau, Andrew William Jean
  • Publication number: 20160313902
    Abstract: Technologies described herein provide a mixed environment display of attached control elements. The techniques disclosed herein enable users of a first computing device to interact with a remote computing device configured to control an object, such as a light, appliance, or any other suitable object. Configurations disclosed herein enable the first computing device to cause one or more actions, such as a selection of the object or the display of a user interface, by capturing and analyzing input data defining the performance of one or more gestures, such as a user looking at the object controlled by the second computing device. Rendered graphical elements configured to enable the control of the object can be displayed with a real-world view of the object.
    Type: Application
    Filed: November 19, 2015
    Publication date: October 27, 2016
    Inventors: David M. Hill, Andrew William Jean, Jeffrey J. Evertt, Alan M. Jones, Richard C. Roesler, Charles W. Carlson, Emiko V. Charbonneau, James Dack
  • Publication number: 20150190716
    Abstract: Systems, methods, and computer media for generating an avatar reflecting a player's current appearance. Data describing the player's current appearance is received. The data includes a visible spectrum image of the player, a depth image including both the player and a current background, and skeletal data for the player. The skeletal data indicates an outline of the player's skeleton. Based at least in part on the received data, one or more of the following are captured: a facial appearance of the player; a hair appearance of the player; a clothing appearance of the player; and a skin color of the player. A 3D avatar resembling the player is generated by combining the captured facial appearance, hair appearance, clothing appearance, and/or skin color with predetermined avatar features.
    Type: Application
    Filed: March 18, 2015
    Publication date: July 9, 2015
    Inventors: Jeffrey Jesus Evertt, Justin Avram Clark, Zachary Tyler Middleton, Matthew J. Puls, Mark Thomas Mihelich, Dan Osborn, Andrew R. Campbell, Charles Everett Martin, David M. Hill
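As a toy illustration of combining the three inputs the abstract above names (visible-spectrum image, depth image, and skeletal data), the sketch below segments the player from the background with a depth threshold and samples a skin colour near a skeletal head joint to tint a predetermined avatar. The data shapes, the threshold, and the function names are assumptions, not the patented pipeline.

```python
# Illustrative sketch: segment the player from the background using the depth
# image, then sample a skin colour from the visible-spectrum image near a
# skeletal joint to apply to a predetermined avatar. Data shapes, the depth
# threshold, and the joint name are assumptions.

def player_mask(depth, max_player_depth):
    """True where a depth pixel is close enough to belong to the player."""
    return [[d <= max_player_depth for d in row] for row in depth]


def sample_skin_color(rgb, mask, head_joint):
    """Average the visible-spectrum pixels in the player mask around the
    head joint (given as (row, col))."""
    r0, c0 = head_joint
    samples = [rgb[r][c]
               for r in range(r0 - 1, r0 + 2)
               for c in range(c0 - 1, c0 + 2)
               if mask[r][c]]
    n = len(samples)
    return tuple(sum(px[i] for px in samples) // n for i in range(3))


# Tiny 3x3 example frame: depth in metres, rgb as (r, g, b) tuples.
depth = [[0.9, 0.9, 3.0],
         [0.9, 0.9, 3.0],
         [3.0, 0.9, 3.0]]
rgb = [[(200, 160, 140)] * 3 for _ in range(3)]
mask = player_mask(depth, max_player_depth=1.5)
print(sample_skin_color(rgb, mask, head_joint=(1, 1)))  # -> avatar skin tint
```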
  • Patent number: 9013489
    Abstract: Systems, methods, and computer media for generating an avatar reflecting a player's current appearance. Data describing the player's current appearance is received. The data includes a visible spectrum image of the player, a depth image including both the player and a current background, and skeletal data for the player. The skeletal data indicates an outline of the player's skeleton. Based at least in part on the received data, one or more of the following are captured: a facial appearance of the player; a hair appearance of the player; a clothing appearance of the player; and a skin color of the player. A 3D avatar resembling the player is generated by combining the captured facial appearance, hair appearance, clothing appearance, and/or skin color with predetermined avatar features.
    Type: Grant
    Filed: November 16, 2011
    Date of Patent: April 21, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Jeffrey Jesus Evertt, Justin Avram Clark, Zachary Tyler Middleton, Matthew J. Puls, Mark Thomas Mihelich, Dan Osborn, Andrew R. Campbell, Charles Everett Martin, David M. Hill
  • Publication number: 20140088585
    Abstract: A catheter for ablating target tissue from a location within a body vessel includes an ablation region that is configured to transition from a first substantially straight configuration to a second configuration having a two-dimensional or three-dimensional shape. The ablation region may include a plurality of ablation elements that may be distributed along a length of the ablation region such that when the ablation region is in the second configuration, the ablation elements may be placed in closer proximity to the target tissue. Additionally, when the ablation region is in the second configuration, the ablation elements may achieve circumferential coverage of the body lumen or blood vessel, and as such, may be capable of ablating the target tissue at multiple locations along the length and around a circumference of the body lumen or vessel in a single step.
    Type: Application
    Filed: September 25, 2013
    Publication date: March 27, 2014
    Applicant: BOSTON SCIENTIFIC SCIMED, INC.
    Inventors: David M. Hill, Jason P. Hill
  • Publication number: 20120309520
    Abstract: Systems, methods, and computer media for generating an avatar reflecting a player's current appearance. Data describing the player's current appearance is received. The data includes a visible spectrum image of the player, a depth image including both the player and a current background, and skeletal data for the player. The skeletal data indicates an outline of the player's skeleton. Based at least in part on the received data, one or more of the following are captured: a facial appearance of the player; a hair appearance of the player; a clothing appearance of the player; and a skin color of the player. A 3D avatar resembling the player is generated by combining the captured facial appearance, hair appearance, clothing appearance, and/or skin color with predetermined avatar features.
    Type: Application
    Filed: November 16, 2011
    Publication date: December 6, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Jeffrey Jesus Evertt, Justin Avram Clark, Zachary Tyler Middleton, Matthew J. Puls, Mark Thomas Mihelich, Dan Osborn, Andrew R. Campbell, Charles Everett Martin, David M. Hill
  • Publication number: 20110066181
    Abstract: The present disclosure relates generally to methods and devices for closing and/or sealing an opening in a vessel wall and/or an adjacent tissue tract. In one illustrative embodiment, a device is provided for delivering and deploying an anchor, plug, filament, and locking element adjacent to the opening in the vessel wall and/or tissue tract.
    Type: Application
    Filed: November 17, 2010
    Publication date: March 17, 2011
    Applicant: BOSTON SCIENTIFIC SCIMED, INC.
    Inventors: Mark L. Jenson, Jason P. Hill, Joseph Thielen, Michael J. Pikus, Joel Groff, David M. Hill
  • Patent number: 6634666
    Abstract: An adjustable trailer hitch which provides an operator improved remote control of the lateral position of a trailer hitch, and simplifies the procedure of hooking a trailer to a tow vehicle. The trailer hitch comprises a hitch assembly (such as a ball receptacle) attached to a roller carriage which moves from side-to-side along a bumper-like housing. The roller carriage engages a positioning screw, and an electric motor mounted in a sealed end cap powers the positioning screw. Rotation of the screw translates into rightward or leftward lateral movement of the roller carriage and consequently of the ball receptacle assembly. Motion of the ball receptacle assembly is controlled via a wireless remote control unit. Such direct control over the precise lateral position of the trailer hitch enables the operator to accurately steer the trailer when traveling in reverse.
    Type: Grant
    Filed: December 4, 2001
    Date of Patent: October 21, 2003
    Inventors: David Shilitz, David M. Hill, Louis Toth