Patents by Inventor Mario Munich
Mario Munich has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20250001615
Abstract: Systems and methods for authoring and modifying presentation conversation files are disclosed. Exemplary implementations may: receive, at a renderer module, voice files, visual effect files, facial expression files, and/or mobility files; analyze, by a language processor module, whether the voice files, the visual effect files, the facial expression files, and/or the mobility files follow the guidelines of a multimodal authoring system; generate, by the renderer module, one or more presentation conversation files based at least in part on the received voice files, visual effect files, facial expression files, and/or mobility files; test, at an automatic testing system, the one or more presentation conversation files to verify correct operation of a computing device that receives the one or more presentation conversation files as an input; and identify, by a multimodal review module, changes to be made to the voice files, the visual effect files, the facial expression files, and/or the mobility files.
Type: Application
Filed: September 10, 2024
Publication date: January 2, 2025
Inventors: Mario Munich, Stefan Scherer, Paolo Pirjanian, Craig Allen
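The abstract above describes a pipeline of cooperating modules (renderer, language processor, automatic testing system, multimodal review). The sketch below is illustrative only and not the patented implementation; the module behavior, guideline checks, and file formats are assumptions made for the example.

```python
# Illustrative sketch only -- not the patented system. Module names follow the
# abstract, but the guideline checks and file formats are assumptions.
from dataclasses import dataclass, field

@dataclass
class ModalityInputs:
    voice: list = field(default_factory=list)              # e.g. audio clip paths
    visual_effects: list = field(default_factory=list)
    facial_expressions: list = field(default_factory=list)
    mobility: list = field(default_factory=list)           # e.g. motion scripts

def follows_guidelines(inputs: ModalityInputs) -> list:
    """Return a list of guideline violations (empty if compliant). Rules are assumed."""
    violations = []
    if not inputs.voice:
        violations.append("at least one voice file is required")
    if len(inputs.facial_expressions) > len(inputs.voice):
        violations.append("more facial expressions than voice segments")
    return violations

def render_presentation(inputs: ModalityInputs) -> dict:
    """Renderer module: merge the modality files into one presentation conversation file."""
    return {"tracks": {"voice": inputs.voice,
                       "visual": inputs.visual_effects,
                       "face": inputs.facial_expressions,
                       "mobility": inputs.mobility}}

def automatic_test(presentation: dict) -> bool:
    """Stand-in for the automatic testing system: verify every track is well-formed."""
    return all(isinstance(track, list) for track in presentation["tracks"].values())

inputs = ModalityInputs(voice=["hello.wav"], facial_expressions=["smile.json"])
issues = follows_guidelines(inputs)
if not issues:
    presentation = render_presentation(inputs)
    print("test passed:", automatic_test(presentation))
else:
    print("review needed:", issues)   # changes identified by the review step
```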
-
Patent number: 11966227
Abstract: A method includes constructing a map of an environment based on mapping data produced by an autonomous cleaning robot in the environment during a first cleaning mission. Constructing the map includes providing a label associated with a portion of the mapping data. The method includes causing a remote computing device to present a visual representation of the environment based on the map, and a visual indicator of the label. The method includes causing the autonomous cleaning robot to initiate a behavior associated with the label during a second cleaning mission.
Type: Grant
Filed: February 14, 2022
Date of Patent: April 23, 2024
Assignee: iRobot Corporation
Inventors: Mario Munich, Andreas Kolling, Manjunath Narayana, Philip Fong
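A minimal sketch of the label-driven behavior described above, assuming a grid map whose cells carry optional labels and a hypothetical label-to-behavior table; it is not iRobot's implementation.

```python
# Illustrative sketch only -- the label names and behaviors are assumptions.
LABEL_BEHAVIORS = {
    "keep_out": "avoid",          # hypothetical label -> behavior table
    "dirty_area": "deep_clean",
}

def build_labeled_map(mapping_data, labels):
    """mapping_data: iterable of (x, y) free cells; labels: {(x, y): label}."""
    return {cell: labels.get(cell) for cell in mapping_data}

def behavior_for_position(labeled_map, position):
    """During a second mission, return the behavior tied to the label (if any)."""
    label = labeled_map.get(position)
    return LABEL_BEHAVIORS.get(label, "default_clean")

# The first mission produces mapping data; a portion of it is labeled.
labeled_map = build_labeled_map(
    mapping_data=[(0, 0), (0, 1), (1, 1)],
    labels={(1, 1): "dirty_area"},
)
print(behavior_for_position(labeled_map, (1, 1)))  # -> deep_clean
```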
-
Publication number: 20230274743
Abstract: Systems and methods for establishing multi-turn communications between a robot device and an individual are disclosed. Implementations may: receive one or more input text files associated with the individual's speech; filter the one or more input text files to verify the one or more input text files are not associated with prohibited subjects; analyze the one or more input text files to determine an intention of the individual's speech; perform actions based on the analyzed intention; generate one or more output text files based on the performed actions; communicate the one or more output text files to a markup module; analyze the received one or more output text files for sentiment; associate, based on the sentiment analysis, an emotion indicator and/or multimodal output actions with the one or more output text files; and verify, by the prohibited speech filter, that the one or more output text files do not include prohibited subjects.
Type: Application
Filed: January 28, 2022
Publication date: August 31, 2023
Inventors: Stefan Scherer, Mario Munich, Paolo Pirjanian, Dave Benson, Justin Beghtol, Rithesh Murthy, Taylor Shin, Catherine Thornton, Erica Gardner, Benjamin Gittelson, Wilson Harron, Caitlyn Clabaugh
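A minimal sketch of the multi-turn flow (input filter, intent, response, sentiment, output filter), using toy keyword lists in place of the patent's language processing; every specific word list, intent, and reply here is an assumption for illustration.

```python
# Illustrative sketch only -- not the patented system.
PROHIBITED = {"violence"}
INTENTS = {"play": "start_game", "story": "tell_story"}

def prohibited(text: str) -> bool:
    return any(word in text.lower() for word in PROHIBITED)

def detect_intent(text: str) -> str:
    return next((intent for key, intent in INTENTS.items() if key in text.lower()),
                "small_talk")

def sentiment(text: str) -> float:
    # Placeholder polarity score: positive minus negative keyword counts.
    positives = sum(w in text.lower() for w in ("great", "fun", "love"))
    negatives = sum(w in text.lower() for w in ("sad", "bad", "hate"))
    return float(positives - negatives)

def respond(user_text: str) -> dict:
    if prohibited(user_text):                       # input-side filter
        return {"text": "Let's talk about something else.", "emotion": "neutral"}
    intent = detect_intent(user_text)
    reply = {"start_game": "Okay, let's play!",
             "tell_story": "Once upon a time...",
             "small_talk": "Tell me more."}[intent]
    if prohibited(reply):                           # output-side filter
        reply = "Let's talk about something else."
    emotion = "happy" if sentiment(user_text) >= 0 else "concerned"
    return {"text": reply, "emotion": emotion}      # emotion drives multimodal output

print(respond("I love this, let's play a game"))
```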
-
Patent number: 11614746
Abstract: A multi-robot system includes a first mobile cleaning robot that has a local storage device to store a persistent map of an environment, at least one sensor to sense the environment, and a control module. The control module is configured to: control the mobile cleaning robot to navigate in the environment using the persistent map and sensing data provided by the at least one sensor, share the persistent map with a second mobile cleaning robot, and coordinate with the second mobile cleaning robot to perform cleaning tasks.
Type: Grant
Filed: January 5, 2018
Date of Patent: March 28, 2023
Assignee: iRobot Corporation
Inventors: Philip Fong, Clifton Eric Smith, Mario Munich, Stephen Ernest O'Dea
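A minimal sketch of map sharing and task coordination between two robots, assuming the persistent map can be reduced to a list of rooms and the coordination to a simple split; the actual map representation and coordination logic are not described by the abstract at this level.

```python
# Illustrative sketch only -- not iRobot's implementation.
class CleaningRobot:
    def __init__(self, name):
        self.name = name
        self.persistent_map = []          # local storage for the shared map

    def share_map(self, other):
        other.persistent_map = list(self.persistent_map)

    def clean(self, rooms):
        return [f"{self.name} cleaned {room}" for room in rooms]

first = CleaningRobot("robot-1")
first.persistent_map = ["kitchen", "hallway", "living room", "bedroom"]
second = CleaningRobot("robot-2")
first.share_map(second)

# Coordinate: split the shared map's rooms between the two robots.
half = len(first.persistent_map) // 2
print(first.clean(first.persistent_map[:half]))
print(second.clean(second.persistent_map[half:]))
```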
-
Publication number: 20220269275
Abstract: A method includes constructing a map of an environment based on mapping data produced by an autonomous cleaning robot in the environment during a first cleaning mission. Constructing the map includes providing a label associated with a portion of the mapping data. The method includes causing a remote computing device to present a visual representation of the environment based on the map, and a visual indicator of the label. The method includes causing the autonomous cleaning robot to initiate a behavior associated with the label during a second cleaning mission.
Type: Application
Filed: February 14, 2022
Publication date: August 25, 2022
Inventors: Mario Munich, Andreas Kolling, Manjunath Narayana, Philip Fong
-
Publication number: 20220183529
Abstract: An autonomous mobile robot includes a drive system to support the robot above a surface, a sensor system configured to generate a signal indicative of a location of the robot on the surface, and a controller operably connected to the drive system and the sensor system. The drive system is operable to navigate the robot about the surface. The controller is configured to execute instructions to perform operations including establishing a behavior control zone on the surface, controlling the drive system, in response to establishing the behavior control zone on the surface, to maneuver the robot to a location of the behavior control zone on the surface, and maneuvering, using the drive system, the robot about the surface and initiating a behavior in response to determining, based on the signal indicative of the location of the robot, that the robot is proximate the behavior control zone.
Type: Application
Filed: March 7, 2022
Publication date: June 16, 2022
Inventors: Mario Munich, Philip Fong, Vazgen Karapetyan, Andreas Kolling
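A minimal sketch of a behavior control zone check, assuming a rectangular zone and a fixed proximity threshold; both assumptions are made for the example and are not taken from the patent.

```python
# Illustrative sketch only -- not iRobot's implementation.
from dataclasses import dataclass

@dataclass
class BehaviorControlZone:
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    behavior: str = "avoid"

    def distance_to(self, x: float, y: float) -> float:
        """Euclidean distance from a point to the zone's rectangle (0 inside)."""
        dx = max(self.x_min - x, 0.0, x - self.x_max)
        dy = max(self.y_min - y, 0.0, y - self.y_max)
        return (dx * dx + dy * dy) ** 0.5

def maybe_trigger(zone: BehaviorControlZone, pose, threshold=0.5):
    """Initiate the zone's behavior when the estimated pose is proximate to it."""
    x, y = pose
    return zone.behavior if zone.distance_to(x, y) <= threshold else None

zone = BehaviorControlZone(1.0, 2.0, 1.0, 2.0, behavior="avoid")
print(maybe_trigger(zone, (0.8, 1.5)))  # within 0.5 m of the zone -> "avoid"
print(maybe_trigger(zone, (4.0, 4.0)))  # far away -> None
```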
-
Patent number: 11266287
Abstract: An autonomous mobile robot includes a drive system to support the robot above a surface, a sensor system configured to generate a signal indicative of a location of the robot on the surface, and a controller operably connected to the drive system and the sensor system. The drive system is operable to navigate the robot about the surface. The controller is configured to execute instructions to perform operations including establishing a behavior control zone on the surface, controlling the drive system, in response to establishing the behavior control zone on the surface, to maneuver the robot to a location of the behavior control zone on the surface, and maneuvering, using the drive system, the robot about the surface and initiating a behavior in response to determining, based on the signal indicative of the location of the robot, that the robot is proximate the behavior control zone.
Type: Grant
Filed: May 29, 2019
Date of Patent: March 8, 2022
Assignee: iRobot Corporation
Inventors: Mario Munich, Philip Fong, Vazgen Karapetyan, Andreas Kolling
-
Patent number: 11249482
Abstract: A method includes constructing a map of an environment based on mapping data produced by an autonomous cleaning robot in the environment during a first cleaning mission. Constructing the map includes providing a label associated with a portion of the mapping data. The method includes causing a remote computing device to present a visual representation of the environment based on the map, and a visual indicator of the label. The method includes causing the autonomous cleaning robot to initiate a behavior associated with the label during a second cleaning mission.
Type: Grant
Filed: August 9, 2019
Date of Patent: February 15, 2022
Assignee: iRobot Corporation
Inventors: Mario Munich, Andreas Kolling, Manjunath Narayana, Philip Fong
-
Publication number: 20210124354
Abstract: A method includes constructing a map of an environment based on mapping data produced by an autonomous cleaning robot in the environment during a first cleaning mission. Constructing the map includes providing a label associated with a portion of the mapping data. The method includes causing a remote computing device to present a visual representation of the environment based on the map, and a visual indicator of the label. The method includes causing the autonomous cleaning robot to initiate a behavior associated with the label during a second cleaning mission.
Type: Application
Filed: August 9, 2019
Publication date: April 29, 2021
Inventors: Mario Munich, Andreas Kolling, Manjunath Narayana, Philip Fong
-
Publication number: 20200375429
Abstract: An autonomous mobile robot includes a drive system to support the robot above a surface, a sensor system configured to generate a signal indicative of a location of the robot on the surface, and a controller operably connected to the drive system and the sensor system. The drive system is operable to navigate the robot about the surface. The controller is configured to execute instructions to perform operations including establishing a behavior control zone on the surface, controlling the drive system, in response to establishing the behavior control zone on the surface, to maneuver the robot to a location of the behavior control zone on the surface, and maneuvering, using the drive system, the robot about the surface and initiating a behavior in response to determining, based on the signal indicative of the location of the robot, that the robot is proximate the behavior control zone.
Type: Application
Filed: May 29, 2019
Publication date: December 3, 2020
Inventors: Mario Munich, Philip Fong, Vazgen Karapetyan, Andreas Kolling
-
Patent number: 10705535
Abstract: The present invention provides a mobile robot configured to navigate an operating environment. The mobile robot includes a controller circuit that directs a drive of the mobile robot to navigate through the environment using a camera-based navigation system, and a camera including optics defining a camera field of view and a camera optical axis. The camera is positioned within a recessed structure and is tilted so that the camera optical axis is aligned at an acute angle above a horizontal plane in line with the top surface and is aimed in a forward drive direction of the robot body, and the camera is configured to capture images of the operating environment of the mobile robot.
Type: Grant
Filed: November 13, 2018
Date of Patent: July 7, 2020
Assignee: iRobot Corporation
Inventors: Mario Munich, Nikolai Romanov, Dhiraj Goel, Philip Fong
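The abstract describes a camera whose optical axis is tilted at an acute angle above horizontal and aimed in the drive direction. The sketch below works through the viewing geometry with assumed numbers for camera height, tilt, and vertical field of view; these figures are illustrative only and do not come from the patent.

```python
# Illustrative sketch only -- not iRobot's camera design; all numbers assumed.
import math

def imaged_height_range(camera_height_m, tilt_deg, vfov_deg, wall_distance_m):
    """Return (low, high) wall heights covered by the upward-tilted camera's view."""
    lower = math.radians(tilt_deg - vfov_deg / 2.0)   # bottom edge of the view
    upper = math.radians(tilt_deg + vfov_deg / 2.0)   # top edge of the view
    low = camera_height_m + wall_distance_m * math.tan(lower)
    high = camera_height_m + wall_distance_m * math.tan(upper)
    return low, high

# Camera 9 cm above the floor, optical axis tilted 30 degrees above horizontal,
# 50-degree vertical field of view, wall 3 m ahead of the robot.
low, high = imaged_height_range(0.09, 30.0, 50.0, 3.0)
print(f"wall coverage: {low:.2f} m to {high:.2f} m")
```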
-
Publication number: 20190212752
Abstract: A multi-robot system includes a first mobile cleaning robot that has a local storage device to store a persistent map of an environment, at least one sensor to sense the environment, and a control module. The control module is configured to: control the mobile cleaning robot to navigate in the environment using the persistent map and sensing data provided by the at least one sensor, share the persistent map with a second mobile cleaning robot, and coordinate with the second mobile cleaning robot to perform cleaning tasks.
Type: Application
Filed: January 5, 2018
Publication date: July 11, 2019
Inventors: Philip Fong, Clifton Eric Smith, Mario Munich, Stephen Ernest O'Dea
-
Publication number: 20190086933
Abstract: The present invention provides a mobile robot configured to navigate an operating environment. The mobile robot includes a controller circuit that directs a drive of the mobile robot to navigate through the environment using a camera-based navigation system, and a camera including optics defining a camera field of view and a camera optical axis. The camera is positioned within a recessed structure and is tilted so that the camera optical axis is aligned at an acute angle above a horizontal plane in line with the top surface and is aimed in a forward drive direction of the robot body, and the camera is configured to capture images of the operating environment of the mobile robot.
Type: Application
Filed: November 13, 2018
Publication date: March 21, 2019
Inventors: Mario Munich, Nikolai Romanov, Dhiraj Goel, Philip Fong
-
Patent number: 10222805
Abstract: The present invention provides a mobile robot configured to navigate an operating environment. The mobile robot includes a controller circuit that directs a drive of the mobile robot to navigate through the environment using a camera-based navigation system, and a camera including optics defining a camera field of view and a camera optical axis. The camera is positioned within a recessed structure and is tilted so that the camera optical axis is aligned at an acute angle above a horizontal plane in line with the top surface and is aimed in a forward drive direction of the robot body, and the camera is configured to capture images of the operating environment of the mobile robot.
Type: Grant
Filed: November 16, 2016
Date of Patent: March 5, 2019
Assignee: iRobot Corporation
Inventors: Mario Munich, Nikolai Romanov, Dhiraj Goel, Philip Fong
-
Patent number: 9623557
Abstract: A robot having a signal sensor configured to measure a signal, a motion sensor configured to measure a relative change in pose, a local correlation component configured to correlate the signal with the position and/or orientation of the robot in a local region including the robot's current position, and a localization component configured to apply a filter to estimate the position and optionally the orientation of the robot based at least on a location reported by the motion sensor, a signal detected by the signal sensor, and the signal predicted by the local correlation component. The local correlation component and/or the localization component may take into account rotational variability of the signal sensor and other parameters related to time and pose dependent variability in how the signal and motion sensor perform. Each estimated pose may be used to formulate new or updated navigational or operational instructions for the robot.
Type: Grant
Filed: August 26, 2016
Date of Patent: April 18, 2017
Assignee: iRobot Corporation
Inventors: Steffen Gutmann, Ethan Eade, Philip Fong, Mario Munich
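A minimal sketch of fusing a motion-sensor step with a predicted signal value, here using a small 1-D particle filter in place of the patent's filter; the signal model, noise values, and dimensionality are assumptions made for the example.

```python
# Illustrative sketch only -- not the patented localization component.
import math
import random

def predicted_signal(position):
    """Stand-in for the local correlation component: expected signal vs. position."""
    return 100.0 - 5.0 * abs(position - 3.0)

def localize(particles, odometry_step, measured_signal, signal_noise=2.0):
    # Motion update: shift every particle by the reported odometry step (plus noise).
    moved = [p + odometry_step + random.gauss(0.0, 0.05) for p in particles]
    # Measurement update: weight particles by agreement with the measured signal.
    weights = [math.exp(-((measured_signal - predicted_signal(p)) ** 2)
                        / (2.0 * signal_noise ** 2)) for p in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Weighted-mean pose estimate, then resample for the next step.
    estimate = sum(p * w for p, w in zip(moved, weights))
    resampled = random.choices(moved, weights=weights, k=len(moved))
    return resampled, estimate

random.seed(0)
particles = [random.uniform(0.0, 6.0) for _ in range(200)]
particles, estimate = localize(particles, odometry_step=0.5, measured_signal=95.0)
print(f"estimated position: {estimate:.2f}")
```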
-
Publication number: 20170097643
Abstract: The present invention provides a mobile robot configured to navigate an operating environment. The mobile robot includes a controller circuit that directs a drive of the mobile robot to navigate through the environment using a camera-based navigation system, and a camera including optics defining a camera field of view and a camera optical axis. The camera is positioned within a recessed structure and is tilted so that the camera optical axis is aligned at an acute angle above a horizontal plane in line with the top surface and is aimed in a forward drive direction of the robot body, and the camera is configured to capture images of the operating environment of the mobile robot.
Type: Application
Filed: November 16, 2016
Publication date: April 6, 2017
Inventors: Mario Munich, Nikolai Romanov, Dhiraj Goel, Philip Fong
-
Publication number: 20170050318
Abstract: A robot having a signal sensor configured to measure a signal, a motion sensor configured to measure a relative change in pose, a local correlation component configured to correlate the signal with the position and/or orientation of the robot in a local region including the robot's current position, and a localization component configured to apply a filter to estimate the position and optionally the orientation of the robot based at least on a location reported by the motion sensor, a signal detected by the signal sensor, and the signal predicted by the local correlation component. The local correlation component and/or the localization component may take into account rotational variability of the signal sensor and other parameters related to time and pose dependent variability in how the signal and motion sensor perform. Each estimated pose may be used to formulate new or updated navigational or operational instructions for the robot.
Type: Application
Filed: August 26, 2016
Publication date: February 23, 2017
Inventors: Steffen Gutmann, Ethan Eade, Philip Fong, Mario Munich
-
Patent number: 9519289
Abstract: The present invention provides a mobile robot configured to navigate an operating environment. The mobile robot includes a controller circuit that directs a drive of the mobile robot to navigate through the environment using a camera-based navigation system, and a camera including optics defining a camera field of view and a camera optical axis. The camera is positioned within a recessed structure and is tilted so that the camera optical axis is aligned at an acute angle above a horizontal plane in line with the top surface and is aimed in a forward drive direction of the robot body, and the camera is configured to capture images of the operating environment of the mobile robot.
Type: Grant
Filed: September 16, 2015
Date of Patent: December 13, 2016
Assignee: iRobot Corporation
Inventors: Mario Munich, Nikolai Romanov, Dhiraj Goel, Philip Fong
-
Patent number: 9468349
Abstract: A mobile robot system is provided that includes a docking station having at least two pose-defining fiducial markers. The pose-defining fiducial markers have a predetermined spatial relationship with respect to one another and/or to a reference point on the docking station such that a docking path to the base station can be determined from one or more observations of the at least two pose-defining fiducial markers. A mobile robot in the system includes a pose sensor assembly. A controller is located on the chassis and is configured to analyze an output signal from the pose sensor assembly. The controller is configured to determine a docking station pose, to locate the docking station pose on a map of a surface traversed by the mobile robot and to path plan a docking trajectory.
Type: Grant
Filed: November 20, 2015
Date of Patent: October 18, 2016
Assignee: iRobot Corporation
Inventors: Philip Fong, Jason Meltzer, Jens-Steffen Gutmann, Vazgen Karapetyan, Mario Munich
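A minimal sketch of recovering a dock pose from two fiducial markers and planning an approach path, assuming the markers' map positions are already known; the marker geometry, standoff distance, and straight-line path are illustrative assumptions, not the patented method.

```python
# Illustrative sketch only -- not iRobot's docking implementation.
import math

def dock_pose_from_fiducials(marker_a, marker_b):
    """Two markers flank the dock; pose = midpoint plus an outward-facing heading."""
    mx = (marker_a[0] + marker_b[0]) / 2.0
    my = (marker_a[1] + marker_b[1]) / 2.0
    # Assume the dock faces perpendicular to the marker baseline.
    baseline = math.atan2(marker_b[1] - marker_a[1], marker_b[0] - marker_a[0])
    heading = baseline + math.pi / 2.0
    return mx, my, heading

def docking_path(robot_xy, dock_pose, standoff=0.5, steps=5):
    """Waypoints from the robot to a standoff point in front of the dock, then the dock."""
    dx, dy, heading = dock_pose
    goal = (dx + standoff * math.cos(heading), dy + standoff * math.sin(heading))
    return [(robot_xy[0] + (goal[0] - robot_xy[0]) * t / steps,
             robot_xy[1] + (goal[1] - robot_xy[1]) * t / steps)
            for t in range(1, steps + 1)] + [(dx, dy)]

pose = dock_pose_from_fiducials(marker_a=(2.0, 1.0), marker_b=(2.4, 1.0))
print("dock pose:", pose)
print("path:", docking_path((0.0, 0.0), pose))
```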
-
Patent number: 9440354
Abstract: A robot having a signal sensor configured to measure a signal, a motion sensor configured to measure a relative change in pose, a local correlation component configured to correlate the signal with the position and/or orientation of the robot in a local region including the robot's current position, and a localization component configured to apply a filter to estimate the position and optionally the orientation of the robot based at least on a location reported by the motion sensor, a signal detected by the signal sensor, and the signal predicted by the local correlation component. The local correlation component and/or the localization component may take into account rotational variability of the signal sensor and other parameters related to time and pose dependent variability in how the signal and motion sensor perform. Each estimated pose may be used to formulate new or updated navigational or operational instructions for the robot.
Type: Grant
Filed: January 5, 2015
Date of Patent: September 13, 2016
Assignee: iRobot Corporation
Inventors: Steffen Gutmann, Ethan Eade, Philip Fong, Mario Munich