Patents by Inventor Mario E. Munich
Mario E. Munich has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20240152705
Abstract: Systems and methods for managing conversations between a robot computing device and a user are disclosed. Exemplary implementations may: initiate a first-time user experience sequence with the user; teach the user the robot computing device's capabilities and/or characteristics; initiate, utilizing a dialog manager, a conversation with the user; receive one or more command files from the user via one or more microphones; and generate conversation response files and communicate the generated conversation files to the dialog manager in response to the one or more received user global command files to initiate an initial conversation exchange.
Type: Application
Filed: January 16, 2024
Publication date: May 9, 2024
Inventors: Stefan A. Scherer, Mario E. Munich, Paolo Pirjanian, Kevin D. Saunders, Wilson Harron, Marissa Kohan
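The conversation flow this abstract outlines can be sketched in a few lines of Python: a first-time experience that teaches the device's capabilities, then a loop that turns received command files into response files. This is a minimal illustration, not the patented implementation; all class and method names are hypothetical.

```python
class DialogManager:
    """Sketch of the described flow: onboard the user once,
    then answer each command file with a response file."""

    def __init__(self, capabilities):
        self.capabilities = capabilities
        self.onboarded = False

    def first_time_experience(self):
        # Teach the user what the robot can do, once.
        self.onboarded = True
        return ["I can " + c for c in self.capabilities]

    def respond(self, command_file):
        if not self.onboarded:
            return self.first_time_experience()
        # A real system would run speech understanding here;
        # we simply acknowledge the command as the "response file".
        return ["ack: " + command_file]
```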
-
Patent number: 11926066
Abstract: Apparatus and methods for carpet drift estimation are disclosed. In certain implementations, a robotic device includes an actuator system to move its body across a surface. A first set of sensors can sense an actuation characteristic of the actuator system. For example, the first set of sensors can include odometry sensors for sensing wheel rotations of the actuator system. A second set of sensors can sense a motion characteristic of the body. The first set of sensors may be a different type of sensor than the second set of sensors. A controller can estimate carpet drift based at least on the actuation characteristic sensed by the first set of sensors and the motion characteristic sensed by the second set of sensors.
Type: Grant
Filed: April 6, 2021
Date of Patent: March 12, 2024
Assignee: iRobot Corporation
Inventors: Dhiraj Goel, Ethan Eade, Philip Fong, Mario E. Munich
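The sensor-fusion idea in this abstract can be sketched simply: odometry predicts a heading change from wheel rotations, a second sensor (assumed here to be a gyroscope) measures the actual change, and the per-distance discrepancy is attributed to carpet drift. The function below is an illustrative sketch under those assumptions, not the patented estimator.

```python
def estimate_carpet_drift(odom_heading_delta, gyro_heading_delta,
                          distance_traveled, prev_drift=0.0, gain=0.1):
    """Estimate per-meter heading drift induced by carpet fibers.

    odom_heading_delta: heading change (rad) predicted from wheel odometry
    gyro_heading_delta: heading change (rad) measured by a gyroscope
    distance_traveled:  distance (m) covered during the interval
    """
    if distance_traveled <= 0:
        return prev_drift  # no travel, nothing new to learn
    # Discrepancy between odometric prediction and sensed motion,
    # normalized by distance: drift is roughly proportional to travel.
    residual = (gyro_heading_delta - odom_heading_delta) / distance_traveled
    # Low-pass filter so a single noisy reading does not dominate.
    return (1.0 - gain) * prev_drift + gain * residual
```

Repeated calls converge toward the true drift rate while smoothing sensor noise.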
-
Publication number: 20230384791
Abstract: A method of operating a mobile robot includes generating a segmentation map defining respective regions of a surface based on occupancy data that is collected by a mobile robot responsive to navigation of the surface, identifying sub-regions of at least one of the respective regions as non-clutter and clutter areas, and computing a coverage pattern based on identification of the sub-regions. The coverage pattern indicates a sequence for navigation of the non-clutter and clutter areas, and is provided to the mobile robot. Responsive to the coverage pattern, the mobile robot sequentially navigates the non-clutter and clutter areas of the at least one of the respective regions of the surface in the sequence indicated by the coverage pattern. Related methods, computing devices, and computer program products are also discussed.
Type: Application
Filed: August 9, 2023
Publication date: November 30, 2023
Inventors: Alexander D. Kleiner, Mario E. Munich
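The segmentation-then-sequencing approach described above can be sketched as follows: for each region of the segmentation map, cells with observed obstacles are classified as clutter and the rest as open area, and the coverage sequence lists open areas before clutter. This is a minimal illustrative sketch; the data layout and the open-first ordering are assumptions, not details from the patent.

```python
def plan_coverage(segmentation, occupancy):
    """Order sub-regions of each region: open (non-clutter) cells
    first, then cluttered ones.

    segmentation: dict mapping region_id -> list of (x, y) cells
    occupancy:    dict mapping (x, y) -> obstacle count at that cell
    """
    sequence = []
    for region_id, cells in segmentation.items():
        # A cell counts as "clutter" if obstacles were observed in it.
        clutter = [c for c in cells if occupancy.get(c, 0) > 0]
        open_area = [c for c in cells if occupancy.get(c, 0) == 0]
        # Cover the easily-ranked open area first, then detail-clean clutter.
        sequence.append((region_id, open_area, clutter))
    return sequence
```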
-
Patent number: 11740634
Abstract: A method of operating a mobile robot includes generating a segmentation map defining respective regions of a surface based on occupancy data that is collected by a mobile robot responsive to navigation of the surface, identifying sub-regions of at least one of the respective regions as non-clutter and clutter areas, and computing a coverage pattern based on identification of the sub-regions. The coverage pattern indicates a sequence for navigation of the non-clutter and clutter areas, and is provided to the mobile robot. Responsive to the coverage pattern, the mobile robot sequentially navigates the non-clutter and clutter areas of the at least one of the respective regions of the surface in the sequence indicated by the coverage pattern. Related methods, computing devices, and computer program products are also discussed.
Type: Grant
Filed: April 11, 2022
Date of Patent: August 29, 2023
Assignee: iRobot Corporation
Inventors: Alexander D. Kleiner, Mario E. Munich
-
Patent number: 11703857
Abstract: A method of operating an autonomous cleaning robot is described. The method includes initiating a training run of the autonomous cleaning robot and receiving, at a mobile device, location data from the autonomous cleaning robot as the autonomous cleaning robot navigates an area. The method also includes presenting, on a display of the mobile device, a training map depicting portions of the area traversed by the autonomous cleaning robot during the training run and presenting, on the display of the mobile device, an interface configured to allow the training map to be stored or deleted. The method also includes initiating additional training runs to produce additional training maps and presenting a master map generated based on a plurality of stored training maps.
Type: Grant
Filed: November 16, 2020
Date of Patent: July 18, 2023
Assignee: iRobot Corporation
Inventors: Stephen O'Dea, Benjamin H. Schriesheim, Qunxi Huang, Kenrick E. Drew, Adam Goss, Mario E. Munich, Alexander D. Kleiner
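One plausible way to generate a master map from multiple stored training maps, as the abstract describes, is a voting scheme: a cell enters the master map only when enough training runs agree it was traversed. The sketch below illustrates that idea; the set-of-cells representation and the agreement threshold are assumptions for illustration, not details from the patent.

```python
def build_master_map(training_maps, min_agreement=0.5):
    """Merge stored training maps into a master map.

    training_maps: list of sets of (x, y) cells traversed in each run
    min_agreement: fraction of runs that must agree on a cell
    """
    if not training_maps:
        return set()
    counts = {}
    for run in training_maps:
        for cell in run:
            counts[cell] = counts.get(cell, 0) + 1
    # Keep cells seen in at least min_agreement of the runs.
    threshold = min_agreement * len(training_maps)
    return {cell for cell, n in counts.items() if n >= threshold}
```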
-
Publication number: 20220317693
Abstract: A method of operating a mobile robot includes generating a segmentation map defining respective regions of a surface based on occupancy data that is collected by a mobile robot responsive to navigation of the surface, identifying sub-regions of at least one of the respective regions as non-clutter and clutter areas, and computing a coverage pattern based on identification of the sub-regions. The coverage pattern indicates a sequence for navigation of the non-clutter and clutter areas, and is provided to the mobile robot. Responsive to the coverage pattern, the mobile robot sequentially navigates the non-clutter and clutter areas of the at least one of the respective regions of the surface in the sequence indicated by the coverage pattern. Related methods, computing devices, and computer program products are also discussed.
Type: Application
Filed: April 11, 2022
Publication date: October 6, 2022
Inventors: Alexander D. Kleiner, Mario E. Munich
-
Publication number: 20220241985
Abstract: Exemplary implementations may: receive one or more inputs including parameters or measurements regarding a physical environment from the one or more input modalities; identify a user based on analyzing the received inputs from the one or more input modalities; determine if the user shows signs of engagement or interest in establishing a communication interaction by analyzing the user's physical actions, visual actions, and/or audio actions, these actions being determined based at least in part on the one or more inputs received from the one or more input modalities; and determine whether the user is interested in an extended communication interaction with the robot computing device by creating visual actions of the robot computing device utilizing the display device or by generating one or more audio files to be reproduced by one or more speakers.
Type: Application
Filed: February 26, 2021
Publication date: August 4, 2022
Inventors: Stefan A. Scherer, Mario E. Munich, Paolo Pirjanian, Caitlyn Clabaugh, Wilson Harron, Asim Naseer, Albert Ike Macoco, Jr.
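The multimodal engagement check the abstract describes could combine visual, audio, and physical cues into a single score and compare it to a threshold before the robot commits to an extended interaction. The sketch below shows one such combination; the particular cues, weights, and threshold are illustrative assumptions, not values from the patent.

```python
def engagement_score(gaze_on_robot, speaking_to_robot, facing_robot,
                     weights=(0.5, 0.3, 0.2)):
    """Fold boolean multimodal cues (visual, audio, physical)
    into a single engagement score in [0, 1]."""
    cues = (gaze_on_robot, speaking_to_robot, facing_robot)
    return sum(w * float(c) for w, c in zip(weights, cues))

def should_extend_interaction(score, threshold=0.6):
    """Only initiate an extended interaction above the threshold."""
    return score >= threshold
```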
-
Publication number: 20220207426
Abstract: Systems and methods for creating a view of an environment are disclosed. Exemplary implementations may: receive parameters and measurements from at least two of one or more microphones, one or more imaging devices, a radar sensor, a lidar sensor, and/or one or more infrared imaging devices located in a computing device; analyze the parameters and measurements received from the one or more multimodal input devices, the one or more multimodal input devices including the one or more microphones, one or more imaging devices, a radar sensor, a lidar sensor, and/or one or more infrared imaging devices; generate a world map of an environment around the computing device; and repeat the receiving of parameters and measurements from the multimodal input devices.
Type: Application
Filed: April 27, 2021
Publication date: June 30, 2022
Inventors: Stefan Scherer, Mario E. Munich, Paolo Pirjanian, Wilson Harron
-
Publication number: 20220176565
Abstract: Systems and methods for authoring and modifying presentation conversation files are disclosed. Exemplary implementations may: receive, at a renderer module, voice files, visual effect files, facial expression files, and/or mobility files; analyze, by the language processor module, whether the voice files, the visual effect files, the facial expression files, and/or mobility files follow guidelines of a multimodal authoring system; generate, by the renderer module, one or more presentation conversation files based at least in part on the received voice files, visual effect files, facial expression files, and/or mobility files; test, at an automatic testing system, the one or more presentation conversation files to verify correct operation of a computing device that receives the one or more presentation conversation files as an input; and identify, by a multimodal review module, changes to be made to the voice files, the visual effect files, the facial expression files, and/or the mobility files.
Type: Application
Filed: February 27, 2021
Publication date: June 9, 2022
Inventors: Mario E. Munich, Stefan Scherer, Paolo Pirjanian, Craig Allen
-
Publication number: 20220180887
Abstract: Systems and methods for creating a view of an environment are disclosed. Exemplary implementations may: receive parameters and measurements from at least two of one or more microphones, one or more imaging devices, a radar sensor, a lidar sensor, and/or one or more infrared imaging devices located in a computing device; analyze the parameters and measurements received from the multimodal input; generate a world map of the environment around the computing device; and repeat the receiving of parameters and measurements from the input devices and the analyzing steps on a periodic basis to maintain a persistent world map of the environment.
Type: Application
Filed: February 28, 2021
Publication date: June 9, 2022
Inventors: Paolo Pirjanian, Stefan Scherer, Mario E. Munich
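The receive-analyze-repeat cycle that maintains a persistent world map can be sketched as a single refresh step applied periodically: each modality's readings are folded into a shared map, tagged with their source so stale or conflicting entries can be reconciled later. The data layout below is an assumption for illustration, not the patented representation.

```python
def update_world_map(world_map, readings):
    """One refresh cycle of a persistent world map.

    world_map: dict mapping an entity key -> its latest record
    readings:  dict mapping modality name -> {entity key: value}
    """
    for modality, observations in readings.items():
        for key, value in observations.items():
            # Later cycles overwrite earlier entries, keeping the
            # map current; the source tag supports reconciliation.
            world_map[key] = {"value": value, "source": modality}
    return world_map
```

Calling this on a timer (or per sensor frame) yields the periodic maintenance the abstract describes.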
-
Patent number: 11314260
Abstract: A method of operating a mobile robot includes generating a segmentation map defining respective regions of a surface based on occupancy data that is collected by a mobile robot responsive to navigation of the surface, identifying sub-regions of at least one of the respective regions as non-clutter and clutter areas, and computing a coverage pattern based on identification of the sub-regions. The coverage pattern indicates a sequence for navigation of the non-clutter and clutter areas, and is provided to the mobile robot. Responsive to the coverage pattern, the mobile robot sequentially navigates the non-clutter and clutter areas of the at least one of the respective regions of the surface in the sequence indicated by the coverage pattern. Related methods, computing devices, and computer program products are also discussed.
Type: Grant
Filed: April 18, 2019
Date of Patent: April 26, 2022
Assignee: iRobot Corporation
Inventors: Alexander D. Kleiner, Mario E. Munich
-
Publication number: 20220093000
Abstract: Systems and methods to process reading articles for a multimodal book application are disclosed. Exemplary implementations may: identify a title of a reading article; store the title of the reading article in a database; scan two or more pages of the reading article and generate text representing content of the reading article; analyze the generated text to identify characteristics of the reading article; store the identified characteristics in the database; associate the identified characteristics with the reading article title; generate augmented content files for one or more portions of the reading article based at least in part on the identified characteristics; and store the augmented content files in the database and associate the augmented content files with different portions of the reading article.
Type: Application
Filed: February 27, 2021
Publication date: March 24, 2022
Inventors: Mario E. Munich, Stefan Scherer, Paolo Pirjanian, Craig Allen
-
Publication number: 20220092270
Abstract: Systems and methods for managing conversations between a robot computing device and a user are disclosed. Exemplary implementations may: initiate a first-time user experience sequence with the user; teach the user the robot computing device's capabilities and/or characteristics; initiate, utilizing a dialog manager, a conversation with the user; receive one or more command files from the user via one or more microphones; and generate conversation response files and communicate the generated conversation files to the dialog manager in response to the one or more received user global command files to initiate an initial conversation exchange.
Type: Application
Filed: February 26, 2021
Publication date: March 24, 2022
Inventors: Stefan A. Scherer, Mario E. Munich, Paolo Pirjanian, Kevin D. Saunders, Wilson Harron, Marissa Kohan
-
Publication number: 20210221003
Abstract: Apparatus and methods for carpet drift estimation are disclosed. In certain implementations, a robotic device includes an actuator system to move its body across a surface. A first set of sensors can sense an actuation characteristic of the actuator system. For example, the first set of sensors can include odometry sensors for sensing wheel rotations of the actuator system. A second set of sensors can sense a motion characteristic of the body. The first set of sensors may be a different type of sensor than the second set of sensors. A controller can estimate carpet drift based at least on the actuation characteristic sensed by the first set of sensors and the motion characteristic sensed by the second set of sensors.
Type: Application
Filed: April 6, 2021
Publication date: July 22, 2021
Inventors: Dhiraj Goel, Ethan Eade, Philip Fong, Mario E. Munich
-
Patent number: 10974391
Abstract: Apparatus and methods for carpet drift estimation are disclosed. In certain implementations, a robotic device includes an actuator system to move its body across a surface. A first set of sensors can sense an actuation characteristic of the actuator system. For example, the first set of sensors can include odometry sensors for sensing wheel rotations of the actuator system. A second set of sensors can sense a motion characteristic of the body. The first set of sensors may be a different type of sensor than the second set of sensors. A controller can estimate carpet drift based at least on the actuation characteristic sensed by the first set of sensors and the motion characteristic sensed by the second set of sensors.
Type: Grant
Filed: April 10, 2018
Date of Patent: April 13, 2021
Assignee: iRobot Corporation
Inventors: Dhiraj Goel, Ethan Eade, Philip Fong, Mario E. Munich
-
Patent number: 10962376
Abstract: A system and method for mapping parameter data acquired by a robot mapping system is disclosed. Parameter data characterizing the environment is collected while the robot localizes itself within the environment using landmarks. Parameter data is recorded in a plurality of local grids, i.e., sub-maps associated with the robot position and orientation when the data was collected. The robot is configured to generate new grids or reuse existing grids depending on the robot's current pose, the pose associated with other grids, and the uncertainty of these relative pose estimates. The pose estimates associated with the grids are updated over time as the robot refines its estimates of the locations of landmarks from which it determines its pose in the environment. Occupancy maps or other global parameter maps may be generated by rendering local grids into a comprehensive map indicating the parameter data in a global reference frame extending the dimensions of the environment.
Type: Grant
Filed: March 14, 2018
Date of Patent: March 30, 2021
Assignee: iRobot Corporation
Inventors: Philip Fong, Ethan Eade, Mario E. Munich
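The final rendering step the abstract describes, combining pose-anchored local grids into one global map, amounts to a rigid-body transform of each local cell into the global frame. The sketch below illustrates that step only; the grid representation and the later-grid-wins overlap rule are assumptions for illustration, not the patented method.

```python
import math

def render_global_map(local_grids):
    """Render pose-anchored local grids into one global map.

    local_grids: list of (pose, cells), where pose is (x, y, theta)
    of the grid origin in the global frame and cells maps local
    (i, j) indices to a parameter value (e.g. occupancy).
    """
    global_map = {}
    for (px, py, theta), cells in local_grids:
        cos_t, sin_t = math.cos(theta), math.sin(theta)
        for (i, j), value in cells.items():
            # Rigid-body transform of the local cell into the global frame.
            gx = round(px + cos_t * i - sin_t * j)
            gy = round(py + sin_t * i + cos_t * j)
            # Grids rendered later overwrite earlier values where they
            # overlap, reflecting refined pose estimates.
            global_map[(gx, gy)] = value
    return global_map
```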
-
Patent number: 10429851
Abstract: A proximity sensor includes first and second sensors disposed on a sensor body adjacent to one another. The first sensor is one of an emitter and a receiver. The second sensor is the other one of an emitter and a receiver. A third sensor is disposed adjacent the second sensor opposite the first sensor. The third sensor is an emitter if the first sensor is an emitter or a receiver if the first sensor is a receiver. Each sensor is positioned at an angle with respect to the other two sensors. Each sensor has a respective field of view. A first field of view intersects a second field of view defining a first volume that detects a floor surface within a first threshold distance. The second field of view intersects a third field of view defining a second volume that detects a floor surface within a second threshold distance.
Type: Grant
Filed: August 23, 2016
Date of Patent: October 1, 2019
Assignee: iRobot Corporation
Inventors: Steven V. Shamlian, Samuel Duffley, Nikolai Romanov, Dhiraj Goel, Frederic D. Hook, Mario E. Munich
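The two intersecting detection volumes give a coarse range classification: a return in the first volume places the floor within the first threshold distance, a return only in the second volume places it within the second, and no return from either suggests a drop-off. The interpretation logic might look like the sketch below; the threshold values and the cliff inference are assumptions for illustration, not claims from the patent.

```python
def classify_floor(near_volume_signal, far_volume_signal,
                   d1=0.02, d2=0.05):
    """Interpret the two field-of-view intersection volumes.

    near_volume_signal: True if a reflection is seen in the first
                        (near) volume, i.e. floor within d1 meters
    far_volume_signal:  True if seen in the second (far) volume,
                        i.e. floor within d2 meters
    """
    if near_volume_signal:
        return "floor within %.0f mm" % (d1 * 1000)
    if far_volume_signal:
        return "floor within %.0f mm" % (d2 * 1000)
    # No reflection in either volume: the floor may have dropped away.
    return "possible cliff"
```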
-
Publication number: 20190250625
Abstract: A method of operating a mobile robot includes generating a segmentation map defining respective regions of a surface based on occupancy data that is collected by a mobile robot responsive to navigation of the surface, identifying sub-regions of at least one of the respective regions as non-clutter and clutter areas, and computing a coverage pattern based on identification of the sub-regions. The coverage pattern indicates a sequence for navigation of the non-clutter and clutter areas, and is provided to the mobile robot. Responsive to the coverage pattern, the mobile robot sequentially navigates the non-clutter and clutter areas of the at least one of the respective regions of the surface in the sequence indicated by the coverage pattern. Related methods, computing devices, and computer program products are also discussed.
Type: Application
Filed: April 18, 2019
Publication date: August 15, 2019
Inventors: Alexander D. Kleiner, Mario E. Munich
-
Patent number: 10335004
Abstract: A mobile robot system is provided that includes a docking station having at least two pose-defining fiducial markers. The pose-defining fiducial markers have a predetermined spatial relationship with respect to one another and/or to a reference point on the docking station such that a docking path to the base station can be determined from one or more observations of the at least two pose-defining fiducial markers. A mobile robot in the system includes a pose sensor assembly. A controller is located on the chassis and is configured to analyze an output signal from the pose sensor assembly. The controller is configured to determine a docking station pose, to locate the docking station pose on a map of a surface traversed by the mobile robot and to path plan a docking trajectory.
Type: Grant
Filed: December 23, 2016
Date of Patent: July 2, 2019
Assignee: iRobot Corporation
Inventors: Philip Fong, Jason Meltzer, Jens-Steffen Gutmann, Vazgen Karapetyan, Mario E. Munich
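Because the two fiducial markers have a known spatial relationship, a single observation of both fixes the station's pose: one simple geometry (assumed here for illustration, not taken from the patent) places the station at the markers' midpoint, facing perpendicular to the line between them.

```python
import math

def docking_station_pose(left_marker, right_marker):
    """Infer a docking station pose from observed positions of two
    fiducial markers, given as (x, y) in the robot's map frame.

    Returns the station position and its heading (rad), taken as
    normal to the baseline between the markers.
    """
    (x1, y1), (x2, y2) = left_marker, right_marker
    position = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
    # Heading perpendicular to the marker baseline.
    heading = math.atan2(y2 - y1, x2 - x1) + math.pi / 2.0
    return position, heading
```

Given this pose on the map, the controller can plan a docking trajectory that approaches the station head-on.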
-
Patent number: 10310507
Abstract: A method of operating a mobile robot includes generating a segmentation map defining respective regions of a surface based on occupancy data that is collected by a mobile robot responsive to navigation of the surface, identifying sub-regions of at least one of the respective regions as non-clutter and clutter areas, and computing a coverage pattern based on identification of the sub-regions. The coverage pattern indicates a sequence for navigation of the non-clutter and clutter areas, and is provided to the mobile robot. Responsive to the coverage pattern, the mobile robot sequentially navigates the non-clutter and clutter areas of the at least one of the respective regions of the surface in the sequence indicated by the coverage pattern. Related methods, computing devices, and computer program products are also discussed.
Type: Grant
Filed: April 2, 2018
Date of Patent: June 4, 2019
Assignee: iRobot Corporation
Inventors: Alexander D. Kleiner, Mario E. Munich