CONTROL SYSTEM FOR A MOBILE MANIPULATION DEVICE

A display window control system for a mobile manipulation device includes an interface and a processor. The interface is configured to receive an image from a mobile manipulation device. The processor is configured to determine a display output to display the image in a display window; determine an overlay output to display one or more overlay command icons; determine that an indication has been received to activate an overlay command; and provide the indication to the mobile manipulation device.

Description
CROSS REFERENCE TO OTHER APPLICATIONS

This application claims priority to U.S. Provisional Patent Application No. 62/473,778 entitled LOW COST GENERAL-PURPOSE MOBILE MANIPULATOR FOR INDOOR USE filed Mar. 20, 2017 which is incorporated herein by reference for all purposes. This application also claims priority to U.S. Provisional Patent Application No. 62/474,427 entitled LOW COST GENERAL-PURPOSE MOBILE MANIPULATOR FOR INDOOR USE filed Mar. 21, 2017 which is incorporated herein by reference for all purposes. This application also claims priority to U.S. Provisional Patent Application No. 62/473,778 entitled ADDITIONAL HARDWARE AND SOFTWARE FOR A LOW-COST GENERAL-PURPOSE MOBILE MANIPULATOR FOR INDOOR USE filed Feb. 5, 2018 which is incorporated herein by reference for all purposes.

BACKGROUND OF THE INVENTION

People with limited abilities often need an assistant for performing tasks around the home. For example, moving objects, feeding pets, cleaning up, etc., become very difficult tasks for people with difficulty walking around their home or picking up and carrying objects. Even for the able-bodied it would often be preferable to not have to perform some of these tasks, for example, when traveling, busy with work, relaxing, etc. Hiring an assistant to come into the home is expensive and can bring complications, for example introducing the possibilities of theft, employment law issues, etc. However, the world of humans is very messy and difficult for robots to navigate. Only the simplest home robots have shown themselves to be practical (e.g., robotic vacuum cleaners). This creates a problem where robots for more complicated tasks typically require very sophisticated technology, becoming more expensive than hiring a human assistant.

BRIEF DESCRIPTION OF THE DRAWINGS

Various embodiments of the invention are disclosed in the following detailed description and the accompanying drawings.

FIG. 1 is a block diagram illustrating an embodiment of a network system for mobile manipulation device control.

FIG. 2 is a diagram illustrating an embodiment of a mobile manipulation device.

FIG. 3 is a diagram illustrating an embodiment of a base.

FIG. 4 is a diagram illustrating an embodiment of a belt-driven linear actuator.

FIG. 5 is a diagram illustrating an embodiment of a manipulator end.

FIG. 6 is a diagram illustrating an embodiment of a manipulator end.

FIG. 7 is a diagram illustrating an embodiment of a manipulator end.

FIG. 8 is a diagram illustrating an embodiment of an arm actuator.

FIG. 9 is a diagram illustrating an embodiment of a mobile manipulation device in a room.

FIG. 10 is a diagram illustrating an embodiment of a user interface for a mobile manipulation device.

FIG. 11 is a diagram illustrating an embodiment of a user interface for a mobile manipulation device.

FIG. 12 is a diagram illustrating a set of command icon regions.

FIG. 13 is a diagram illustrating an embodiment of a user interface for a mobile manipulation device.

FIG. 14 is a diagram illustrating an embodiment of a user interface for a mobile manipulation device.

FIG. 15 is a flow diagram illustrating an embodiment of a process for a display window control system for a mobile manipulation device.

FIG. 16 is a flow diagram illustrating an embodiment of a process for determining a display output to display the image in a display window.

FIG. 17 is a flow diagram illustrating an embodiment of a process for determining an overlay output to display a coarse overlay command icon or a fine overlay command icon.

FIG. 18 is a flow diagram illustrating an embodiment of a process for determining an overlay output to display a coarse overlay command icon or a fine overlay command icon.

FIG. 19 is a flow diagram illustrating an embodiment of a process for a remote control interface system for a mobile manipulation device.

FIG. 20 is a flow diagram illustrating an embodiment of a process for determining actuator commands based at least in part on an indication to move.

DETAILED DESCRIPTION

The invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the invention. Unless stated otherwise, a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task. As used herein, the term ‘processor’ refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.

A detailed description of one or more embodiments of the invention is provided below along with accompanying figures that illustrate the principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any embodiment. The scope of the invention is limited only by the claims and the invention encompasses numerous alternatives, modifications and equivalents. Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. These details are provided for the purpose of example and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.

A display window control system is disclosed. The display window control system for a mobile manipulation device comprises an interface configured to receive an image from a mobile manipulation device and a processor configured to determine a display output to display the image in a display window, determine an overlay output to display one or more overlay command icons, determine that an indication has been received to activate an overlay command, and provide the indication to the mobile manipulation device. In some embodiments, the display window control system for the mobile manipulation device comprises a memory coupled to the processor and configured to provide the processor with instructions.

A mobile manipulation device is disclosed. The mobile manipulation device comprises a base, a lift, an arm, and a manipulator. The base is able to move across a surface underneath the base. The lift is coupled to the base. The lift moves the arm vertically. The arm moves the manipulator horizontally along one direction. The base is able to move perpendicular to the one direction.

A mobile manipulation device comprises a base, a lift, an arm, and a manipulator. The base moves the lift across a surface using actuators connected to mechanisms such as wheels, tank treads, or articulated links. The base can rotate and translate the lift across the surface and in some embodiments perform other motions. In some embodiments, the base uses two driven wheels. The lift moves the arm vertically—lifting and lowering. The arm moves horizontally—extending and retracting. The base can move perpendicular to the arm's horizontal motion. The manipulator is attached to the arm. Together, the base, lift, and arm result in Cartesian motion of the manipulator. The base is also responsible for moving the lift to different locations and orientations in the environment.

People with limited mobility, cognitive impairment, perceptual difficulties, or reduced dexterity would benefit from a home robot that retrieves requested objects and performs other manipulation tasks. Here a robot is described that can perform manipulation tasks in human environments, such as homes. The robot emulates advantageous characteristics of the human body in a low-cost and highly-simplified form.

A full humanoid robot would be well adapted to manipulating everyday objects in human environments, however the cost would be prohibitive and the robot would be challenging to control. There are several important characteristics of the human body such a robot should have, including sensors and manipulators that are high above the ground, a small footprint, and stability with respect to perturbations, even when reaching out into the environment in order to perform tasks, such as grasping objects. The disclosed device is a robot that emulates these and other humanoid characteristics in a low cost, simplified form.

Two key capabilities of the design are its ability to reach important locations in human environments and its ability to take advantage of the Cartesian structure of human environments, which tend to consist of many horizontal and vertical planes, such as floors, tables, countertops, and cabinets. Some of the novel concepts include the following:

    • Use of fewer actuators by using the base during manipulation to achieve movement along an additional orthogonal direction; use of a thin and long telescoping mechanism to reach through clutter instead of the typical approach of reaching around clutter with multiple rotary joints; and use of a Cartesian structure matched to the Cartesian structure of indoor human environments to simplify manipulation.
    • Use of smaller actuators by having the arm actuator move a telescoping member orthogonal to gravity to avoid issues of gravitational loading and long moment arms typically associated with a long reach; using a telescoping member rigidly attached to a carriage on the lift and the lift attached to a high-mass mobile base to enable the robot's structure to support the force and moment applied by the telescoping arm to the carriage instead of an actuator; and using a lift actuator that moves a low mass arm assembly up and down, where in some embodiments the arm assembly is low mass due in part to the small arm actuator and the lightweight telescoping structure.
    • Reduced need for coordination among actuators due to Cartesian structure in Cartesian environments, which enables independent, serialized control of the mobile base actuators, lift actuator, and telescoping arm actuator to perform many useful tasks in human environments. For example, when reaching to a location, a useful strategy is to first position the base, then raise or lower the arm so that the arm points at the target location, and then extend the arm until the manipulator reaches the location. Each of these motions can be facilitated by cameras and refined with further motions.
    • Safe and stable reaching to locations due to a lightweight telescoping arm that is long and has a small cross section to reach through clutter; a statically stable base with a small footprint to navigate to locations in human environments; and a low center of mass resulting from a relatively large mass in the base from components such as the batteries and wheel actuators and a lightweight structure above the base due to the lightweight lift, arm, and manipulator.
    • Versatility through specialized manipulators and tools that can be manually or automatically attached to the end of the arm. These manipulators and tools may include simple or complex manipulators and tools—for example, simple unactuated tools like a hook, complex actuated mechanisms like a pitch-roll wrist, and/or actuated tools like a vacuum. Other examples of manipulators and tools include: a two actuator wrist with a compliant gripper; a gripper at a fixed orientation; hooks for opening/closing drawers and doors, operating light switches, and operating elevator buttons; a spoon for feeding; an adjustable mobile phone/tablet holder; a dustpan-like gripper to pick things up from the ground; a vacuum cleaner brush; a small brush tool and dustpan attached to the arm to sweep debris off surfaces; and tools to leave marks, such as a marker holder with a spring to write on surfaces.
    • View of the world from a human perspective without a complex robot head by using cameras with wide fields of view mounted at human eye level.
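The serialized reaching strategy described above (position the base, then raise or lower the arm, then extend it) can be sketched as follows. The robot methods used here are hypothetical names chosen for illustration, not an actual API from this disclosure:

```python
# Illustrative sketch of serialized, per-axis reaching. The move_* method
# names are assumptions; each call completes before the next begins.

def reach_to(robot, target_x, target_y, target_z):
    """Reach a Cartesian target by commanding one actuator group at a time."""
    # 1. Drive the base so the arm's extension axis passes by the target (y).
    robot.move_base_to(target_y)
    # 2. Raise or lower the lift so the arm points at the target height (z).
    robot.move_lift_to(target_z)
    # 3. Extend the telescoping arm until the manipulator reaches it (x).
    robot.extend_arm_to(target_x)
```

Because the axes are orthogonal, each motion can be commanded and visually verified independently before the next one begins.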

The disclosed robot design uses a single long, narrow, lightweight telescoping mechanism to reach locations around a person's body. In conjunction with the robot's wheeled base and vertical lift, the robot can directly move objects in orthogonal directions matched to the planar structure of human environments, simplifying control of the robot. This design reduces the robot's overall weight, number of actuators, and actuator requirements. In addition the robot's motions are easier for people to understand and direct.

In some embodiments, this robot design uses four motors: two for the base wheels, one for a long, narrow, lightweight telescoping mechanism that extends horizontally and serves as the robot's arm, and one motor for a vertical lift that raises and lowers the robot's telescoping arm. The robot's arm extends and retracts forward and backward, the lift moves up and down, and the wheels move the robot sideways or rotate the robot. These four motors provide four-degree-of-freedom control of the end of the arm (X, Y, Z, and theta). Fewer than four motors can be used with a clutched transmission, and more than four motors can be used with a redundant mechanism. An actuated manipulator or a simple tool, such as a hook or duster, can be attached to the end of the arm. For example, the arm moves a manipulator toward and away from a base mounting pole along an x-axis; the arm is moved up and down along the base mounting pole relative to the floor along a z-axis; and the base mounting pole that protrudes vertically from the base is moved perpendicular to the arm along a y-axis. The base is also enabled to rotate around the axis of the base mounting pole. With these motions the end of the robot's arm can move in four directions using the telescoping arm, the lift, and the wheels.
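The mapping from the four degrees of freedom to the four motors can be sketched as follows; the motor names and the sign conventions are assumptions made for this sketch, not specifics from the disclosure:

```python
# Illustrative mapping from a small motion along one axis (x, y, z, theta)
# to commands for the four motors described above.

def axis_to_motors(axis, delta):
    """Return per-motor commands for a small motion along one axis."""
    if axis == "x":        # extend/retract the telescoping arm
        return {"arm": delta}
    if axis == "z":        # raise/lower the lift
        return {"lift": delta}
    if axis == "y":        # drive both wheels in common mode
        return {"left_wheel": delta, "right_wheel": delta}
    if axis == "theta":    # drive the wheels in differential mode
        return {"left_wheel": -delta, "right_wheel": delta}
    raise ValueError(f"unknown axis: {axis}")
```

Note that x and z each use a dedicated motor, while y and theta share the two wheel motors in common and differential modes respectively.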

The use of a long, narrow, lightweight, telescoping mechanism allows the robot to stably reach long distances relative to the footprint size of its wheeled mobile base, and to do so with a relatively small motor. By remaining horizontal, the motion of the arm does not work against gravity when used on the flat ground of indoor human environments. The long reach is especially useful in cluttered human environments, such as when assisting a person by moving an object close to his or her body. A key insight is that a long, thin telescoping arm with Cartesian positioning and a single axis of rotation can reach most locations in human environments and thereby do useful things. In some embodiments, the telescoping arm has a cross section smaller than or similar in size to a cross section of a human arm and a length that is longer than or similar in length to an outstretched human arm. In some embodiments, the telescoping arm is constructed from a series of nested structural elements and sliding elements such as bushings or bearings and is an exoskeleton comprising a hollow structure that provides structural support and also serves as the exterior of the arm.

The drive for the telescoping mechanism uses a large, somewhat stiff cable with electrically conducting wires to push and pull the end of the telescoping mechanism while extending and retracting. The telescoping mechanism can be moved by placing the cable between a high-friction pulley wheel attached to a motor and a passive pulley wheel with both wheels compressed together with springs to maintain contact with the cable. The conducting wires carry power and signals to and from end effectors at the end of the telescoping mechanism. The end effectors can be changed, and the base of the robot can have a tool holder for enabling the robot to automatically change the end effectors. In various embodiments, the end effectors include one or more of the following: an actuated grabber device, a dexterous hand, a robotic wrist, a vacuum, a suction gripper, a dustpan, a duster, a wiping element, a scraper, a hook, a rotary tool, a mop, a mobile phone holder, a tablet holder, a brush, a writing instrument, an eating utensil, a cloth, or any other appropriate tool.

The disclosed mobile manipulator is an improved system because it provides low cost, effective assistance with an intuitive control mechanism as a human helper robot for human spaces.

FIG. 1 is a block diagram illustrating an embodiment of a network system for mobile manipulation device control. In the example shown, mobile manipulation device control system 102 communicates with mobile manipulation device 104 via network 100. Mobile manipulation device control system 102 comprises a computer, a smartphone, custom mobile manipulation device control hardware, an analog mobile manipulation device controller, a digital mobile manipulation device controller, a mobile manipulation device controller including a display, a manual mobile manipulation device controller, an automatic mobile manipulation device controller, a mobile manipulation device controller utilizing artificial intelligence, etc. Mobile manipulation device 104 comprises a mobile manipulation device for performing tasks—for example, picking up objects, carrying objects, cleaning floors or other surfaces, operating machinery, assisting humans, playing with pets, etc. Network 100 comprises one or more of the following: a local area network, a wide area network, a wired network, a wireless network, the Internet, an intranet, a storage area network, or any other appropriate communication network. Mobile manipulation device 104 provides data from sensors (e.g., a top-mounted camera, an arm-mounted camera, a manipulator camera, a grip sensor, a position sensor, microphones, etc.) to mobile manipulation device control system 102. The sensor data is displayed to a user of mobile manipulation device control system 102. A command is received from mobile manipulation device control system 102 and sent to mobile manipulation device 104. The command is translated to cause action in mobile manipulation device 104 (e.g., motion of the arm, manipulator, base, etc.).
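The command round-trip described above can be sketched as follows. The message format and the action-to-actuator mapping are hypothetical, chosen only to illustrate the data flow, not an actual protocol from the disclosure:

```python
# Illustrative sketch of the control-system round-trip: sensor data flows to
# the operator's display, and a user command is translated into an actuator
# message for the device. Message keys are assumptions for this sketch.

def handle_sensor_data(display, packet):
    """Show an incoming sensor frame to the operator."""
    display.show(packet["camera"], packet["frame"])

def handle_user_command(command):
    """Translate a UI command into a device actuator message."""
    actuator = {"drive": "base", "lift": "lift", "extend": "arm"}[command["action"]]
    return {"actuator": actuator, "delta": command["amount"]}
```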

FIG. 2 is a diagram illustrating an embodiment of a mobile manipulation device. In some embodiments, mobile manipulation device 200 comprises mobile manipulation device 104 of FIG. 1. In the example shown, mobile manipulation device 200 comprises base 204, vertical lift 202, arm 216, and manipulator 222. Base 204 comprises a base for carrying and moving the elements of mobile manipulation device 200 (e.g., along y axis or rotating around to change theta). Base 204 includes two or more wheels (e.g., wheel 206). The two or more wheels are driven. For example, the two or more wheels are driven together, independently, in common mode (e.g., in the same direction at the same speed), in differential mode (e.g., in opposite directions at the same speed), etc. In some embodiments, base 204 additionally includes an additional wheel, wherein the additional wheel is not driven. For example, the additional wheel comprises an additional wheel for providing static stability (e.g., allowing the mobile manipulation device to balance without control). The additional wheel comprises a pivoting caster, a ball caster, a mecanum wheel, an omniwheel, etc. In some embodiments, base 204 additionally comprises one or more additional wheels, wherein the additional wheels are or are not driven. In various embodiments, base 204 additionally comprises batteries, a computer, speakers, a vertical lift driver for actuating vertical lift 202, or any other appropriate elements.

Vertical lift 202 comprises a vertical lift for carrying, raising, and lowering arm 216 (e.g., along the z-axis). In some embodiments, vertical lift 202 comprises a fixed mast (e.g., a vertical element of fixed dimensions) coupled to a linear actuator (e.g., an actuator for moving an element along a linear path), and arm 216 is coupled to the moving element of the linear actuator. In various embodiments, the linear actuator comprises a belt drive, a lead screw, a ball screw, a linear actuator including a prismatic joint, etc. In some embodiments, vertical lift 202 comprises a telescoping mast (e.g., a vertical element capable of extending or contracting in length), and arm 216 is coupled to a fixed point on the telescoping mast. In the example shown, coupling 212 comprises an actuated coupling for coupling arm 216 to vertical lift 202. A linear actuator included in vertical lift 202 drives coupling 212 and arm 216 up and down.

Camera mounting structure 208 is attached to the top of vertical lift 202 and holds camera 210. Camera 210 comprises a camera for viewing the environment around mobile manipulation device 200. For example, camera 210 comprises a camera facing the ground. When mounted high above the ground, camera 210 can see the tops of surfaces in human environments, like countertops, tables, desks, and manipulable objects in places people commonly place them. In some embodiments, camera 210 comprises a fisheye lens. When near human eye height with a fisheye lens, camera 210 provides a view of the world comparable to a standing human. Camera mounting structure 208 for camera 210 can increase visibility of the surrounding environment by reducing occlusions. Camera mounting structure 208 places the camera away from the mast to reduce occlusion from the mast. In some embodiments, camera mounting structure 208 arcs behind the camera to reduce occlusion, and its structure can use components with thin cross sections oriented parallel to rays emanating from the optical axis of the camera to reduce occlusion. Mobile manipulation device 200 additionally comprises camera 214 mounted on arm 216 so that it can view the arm and the manipulator. Camera 214 is mounted on arm 216 near vertical lift 202 and is facing gripper 222. Camera 214 comprises a camera for viewing the area where gripper 222 is operating. In some embodiments, camera 214 comprises a fisheye lens.

Arm 216 comprises an arm extending horizontally from vertical lift 202 (e.g., along x axis). In the example shown, arm 216 has a square cross-section. In some embodiments, arm 216 has a round cross-section. Other cross sections can be used—both nesting and non-nesting. In addition, keyed cross sections can be used that prevent rotation of the telescoping tubes relative to one another. Arm 216 is telescoping—for example, capable of a telescoping action for moving gripper 222 towards or away from vertical lift 202. When the wheels of base 204 are driven in common mode such that mobile manipulation device 200 moves in a straight line, mobile manipulation device 200 moves perpendicular to the direction of arm 216. Arm 216 comprises hinge 218—for example, an actuated hinge for bending. In the example shown, when hinge 218 bends, gripper 222 moves toward the ground. Arm 216 additionally comprises camera 220. Camera 220 is mounted on arm 216 near gripper 222. As gripper 222 moves (e.g., due to the motion of hinge 218 or an actuated rotational coupling), camera 220 moves with it, holding gripper 222 static in its frame. Camera 220 allows a close view of the actions of gripper 222. In some embodiments, camera 220 comprises a fisheye lens. Gripper 222 is attached to the end of arm 216. In some embodiments, a different manipulator is attached to the end of arm 216 (e.g., a manipulator for interacting with other objects). For example, the manipulator comprises a grabber, an interchangeable tool holder, a vacuum, or a mop. Mobile manipulation device 200 may include a set of interchangeable tools that can be attached to an interchangeable tool holder manipulator. For example, a set of interchangeable tools can be mounted on base 204 and reached by retracting arm 216, folding hinge 218, and lowering arm 216 using vertical lift 202.
The interchangeable tool holder attaches to a variety of tools, including, but not limited to, a grabber, a dexterous hand, a vacuum, a suction gripper, a dustpan, a duster, a wiping element, a scraper, a hook, a rotary tool, a mop, a mobile phone holder, a tablet holder, a brush, a writing instrument, an eating utensil, and a cloth.

Arm 216 additionally comprises actuated rotational coupling 224 for rotating. Rotational coupling 224 makes grippers and other tools more versatile. For example, it can be used to turn a door knob that a gripper is holding. When hinge 218 makes the gripper point towards the ground, rotational coupling 224 can orient a gripper to pick up an elongated object sitting on a flat surface from above. Rotational coupling 224 can also enable a hook to be oriented to hook onto both vertical and horizontal drawer handles. Angle sensors on rotational coupling 224 can be used to rotate video from camera 220 to make it look as though the camera was not being rotated by rotational coupling 224 and simplify remote operation by a human.

Camera 210, camera 214, and camera 220 provide video data to an interface and a processor disposed in base 204 of mobile manipulator 200. The processor provides the video data to a mobile manipulation device control system. The mobile manipulation device control system may be remotely located from the robot, allowing for remote teleoperation or autonomous control. The mobile manipulation device control system provides a user one or more views from the video data and receives motion commands from a user. The motion commands are provided to mobile manipulator 200. An interface of mobile manipulator 200 (e.g., a communication interface) receives the commands and provides the commands to a processor of mobile manipulator 200, where they are then translated to specific motion commands for actuators of mobile manipulator 200 (e.g., fine or coarse motions for each of base 204, vertical lift 202, arm 216, folding hinge 218, and manipulator 222). In some embodiments, light rings or infrared sources are placed around some or all of the cameras to enable operating in dark rooms and other dark areas like the interior of a container.

In some embodiments, a control system for a mobile manipulation device comprises an interface that enables a remote operator to drive the mobile manipulation device and manipulate objects. The interface is a video-centric interface. The operator directly clicks on or touches one or more video streams to make the mobile manipulation device move. For example, clicking on or touching various parts of the video stream from the top camera makes the mobile manipulation device's wheeled base move. Clicking on or touching various parts of the video stream from the arm camera makes the arm move up and down and extend and retract. Clicking on or touching various parts of the video stream from the wrist camera makes the gripper open and close, bend down, straighten up, roll right, and roll left.

The regions are positioned with respect to the mobile manipulation device to make the mapping between a click or touch and mobile manipulation device motion intuitive. For example, clicking on or touching the video above the arm makes it move up, and clicking or touching in front of the arm makes it extend. The interface also provides feedback on the grip force and motor torques by making relevant regions of the video stream turn red, with darker, less transparent red meaning higher force or torque. The operator can use this to better infer what is happening. The red coloring is displayed in the region that when clicked or touched is likely to increase the torque or grip force. Each region of a video stream that corresponds with a command can have a distinctive cursor icon associated with it as well as a descriptive text tooltip that appears when the cursor is held in the region for an extended duration. All commands are executed with a single click or touch, which results in one of the robot's joints moving a predefined distance/angle or executing a predefined autonomous motion of a short and bounded time. Some clickable or touchable regions in the video produce motions over greater or smaller distances/angles. This is similar to a ‘jog mode’ on a Computer Numerical Control (CNC) machine where coarse and fine motion of individual axes can be attained using a push button interface. The short bounded time of the resulting motion in ‘jog mode’ allows the robot operator to safely test small motions, observe in the video how they change the state of the world, and then adjust the subsequent set of jog commands. It also allows the robot operator to remove their attention from the robot operation at any time without having to consider returning to a robot that is in an unsafe state.
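The single-click jog behavior described above can be sketched as follows. The region names, step sizes, and bounded duration are illustrative assumptions, not values specified in the disclosure:

```python
# Illustrative sketch of 'jog mode': each click in a command region yields
# exactly one bounded motion of one joint. Region layout and step sizes are
# assumptions chosen for this sketch.

JOG_REGIONS = {
    "above_arm":    ("lift", +0.02),   # click above the arm: lift up (meters)
    "below_arm":    ("lift", -0.02),   # click below the arm: lift down
    "ahead_of_arm": ("arm",  +0.02),   # click in front of the arm: extend
    "behind_arm":   ("arm",  -0.02),   # click behind the arm: retract
}

def jog_command(region):
    """Translate one click into a single bounded joint motion."""
    joint, step = JOG_REGIONS[region]
    # The bounded duration lets the operator safely look away at any time.
    return {"joint": joint, "delta": step, "max_duration_s": 1.0}
```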

In some embodiments, the interface rotates the gripper video to make it appear as though the camera is always in the same orientation with respect to gravity (i.e., the ceiling is always at the top of the image and the floor is always at the bottom) in spite of the camera rolling with the gripper. In some embodiments, there is also a microphone mounted to the gripper that provides audio feedback to the operator about objects being manipulated and helps the operator hear what the mobile manipulation device is doing, providing better situational awareness.
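The gravity-aligned gripper video described above amounts to counter-rotating each frame by the measured wrist roll. The sketch below rotates a single image-plane point rather than a full image, and the function names are assumptions made for illustration:

```python
import math

# Illustrative sketch: counter-rotate the displayed wrist video by the
# measured roll angle so the ceiling stays at the top of the image.

def upright_display_angle(wrist_roll):
    """Rotation to apply to the displayed frame to cancel the wrist roll."""
    return -wrist_roll

def rotate_point(x, y, angle):
    """Rotate an image-plane point about the image center by angle (radians)."""
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y)
```

Applying `rotate_point` with `upright_display_angle(roll)` to every pixel (or, in practice, rotating the whole frame once) undoes the camera's roll in the display.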

In some embodiments of the interface, the interface comprises multiple user selectable operator interface modes that map to primary usages of the robot. In some embodiments, a navigation mode, a manipulation mode, a grasp object from the top mode, and a grasp object from the side mode are provided. Each user interface mode presents one or more camera video streams that are zoomed, cropped, translated, and otherwise modified to present a very intuitive mapping between what the operator sees, what the operator clicks on or touches, and what the robot does to accomplish its task. By changing which videos are displayed, how the videos are processed, the mapping from locations on the videos to robot actuator commands, the computer graphics feedback provided to the user, and other aspects of the interface, a user selected interface mode can make a task more intuitive to perform with the robot.

The navigation mode makes driving the robot through an environment more intuitive. The navigation mode shows video from the top camera rotated such that the top of the video display corresponds with the direction of forward motion of the mobile manipulation device (i.e., y axis is vertically aligned with the video display). Clicking on or touching locations in the top middle part of the display results in forward motion of the robot along the y axis and clicking on or touching locations near the bottom middle of the display results in backward motion of the robot along the y axis. Clicking on or touching locations on the sides of the display results in rotation of the robot around the theta axis. In some embodiments, the navigation mode simultaneously shows video from the arm camera to help the user avoid hitting the arm into something while driving the mobile manipulation device. FIG. 13 shows an arm camera interface comparable to what is used by some embodiments of the navigation mode. In some embodiments, video is simultaneously shown from the wrist camera to provide additional context while navigating the robot through an environment. FIG. 14 shows a wrist camera interface comparable to what is used by some embodiments of the navigation mode. In some embodiments, the size of the top camera video display is larger than other video displays to emphasize its importance and provide greater detail from the top camera while driving the robot.
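The navigation-mode click mapping described above can be sketched as follows. The region thresholds (thirds of the width, halves of the height) and the command tuples are assumptions made for this sketch, not the disclosed layout:

```python
# Illustrative sketch of the navigation-mode mapping: clicks in the top/bottom
# middle translate the base along y; clicks on the sides rotate about theta.

def navigation_command(u, v, width, height):
    """Map a click at pixel (u, v) on the top-camera display to a base motion."""
    if u < width / 3:
        return ("rotate", 1)       # left side: rotate one way about theta
    if u > 2 * width / 3:
        return ("rotate", -1)      # right side: rotate the other way
    if v < height / 2:
        return ("translate", 1)    # top middle: forward along y
    return ("translate", -1)       # bottom middle: backward along y
```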

The manipulation mode makes manipulating objects with the robot more intuitive. The manipulation mode shows video from the top camera rotated such that the top of the video display is the direction in which the arm extends (i.e., x axis is vertically aligned with the video display). FIGS. 10, 11, and 12 show a top camera interface comparable to what is used in some embodiments of the manipulation mode. In some embodiments, the manipulation mode can simultaneously show video from the wrist camera with an interface similar to the interface shown in FIG. 14. In some embodiments, the manipulation mode also simultaneously shows video from the arm camera. FIG. 13 shows an arm camera interface comparable to what is used by some embodiments of the manipulation mode. In some embodiments, the size of the wrist camera video display is larger than other video displays to emphasize its importance and provide greater detail from the wrist camera while manipulating objects with the robot.

The grasp object from the top mode makes grasping objects from above more intuitive. The grasp object from the top mode is used when the gripper is pointing down toward the ground (i.e., perpendicular to the arm). The grasp object from the top mode displays video from the wrist camera. Clicking on or touching the video display results in the gripper moving across a flat horizontal plane in order to position the gripper above an object to be grasped. This is achieved by sending commands to the base (y axis) and the arm (x axis). Clicking on or touching the top or bottom of the video display results in motion of the arm (x axis). Clicking on or touching the sides of the video display results in motion of the base (y axis). Once positioned above an object, the gripper can be rotated and the gripper can be lowered (z axis) and then closed to grasp the object.

The grasp object from the side mode makes grasping objects from the side more intuitive. The grasp object from the side mode is used when the gripper is pointing straight out (i.e., parallel to the arm). The grasp object from the side mode displays video from the wrist camera. Clicking on or touching the video display results in the gripper moving across a flat vertical plane in order to position the gripper in front of an object to be grasped. This is achieved by sending commands to the base (y axis) and the lift (z axis). Clicking on or touching the top or bottom of the video display results in motion of the lift (z axis). Clicking on or touching the sides of the video display results in motion of the base (y axis). Once positioned in front of an object, the arm can be extended (x axis) and the gripper closed to grasp the object.
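The two grasp modes can be contrasted with a small sketch: a click on the wrist-camera display always maps its horizontal component to the base (y axis), while its vertical component maps to the arm (x axis) when grasping from the top but to the lift (z axis) when grasping from the side. The function name and pixel-offset encoding are illustrative assumptions.

```python
# Hypothetical sketch of how the two grasp modes remap the same wrist-camera
# click to different actuators, per the description above. Names and the
# pixel-offset convention are assumptions, not taken from the disclosure.

def grasp_click_command(mode, dx, dy):
    """dx, dy: click offset from the display center, in pixels (dy down)."""
    vertical_axis = "x" if mode == "top" else "z"  # arm vs. lift
    return {vertical_axis: -dy, "y": dx}           # sides always move the base
```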

By making the control system for the mobile manipulation system intuitive and easy to use, control of the mobile manipulation system becomes available to a wide range of non-specialized users. The control system allows control of all degrees of freedom of the mobile manipulation system with only a mouse and using a display with minimal distractions. In some embodiments, a user can command the system by touching the display rather than using a mouse. The system includes a plurality of cameras allowing the user to see different context for mobile manipulator system motions and automatically determines a user interface behavior based on the camera currently being displayed. This control system for a mobile manipulation system significantly broadens the ability of manipulator robots to perform useful work in the home.

FIG. 3 is a diagram illustrating an embodiment of a base. In some embodiments, base 300 comprises base 204 of FIG. 2 (e.g., a base of a mobile manipulation device). In the example shown, base 300 includes wheel 304 and wheel 308. Wheel 304 and wheel 308 comprise wheels for driving the mobile manipulation device. Wheel 304 and wheel 308 are capable of driving the mobile manipulation device in a straight line when driven in common mode (e.g., in the same direction at the same rate) or of turning the mobile manipulation device when driven in differential mode (e.g., in opposite directions at the same rate). In the example shown, wheel 304 is driven by driver 306 and wheel 308 is driven by driver 310. In some embodiments, wheel 304 and wheel 308 are driven by a single driver (e.g., including a switchable drive train for driving wheel 304 and wheel 308 in common mode or in differential mode). Base 300 additionally comprises wheel 312. Wheel 312 comprises a third wheel that is not driven. In the example shown, wheel 312 comprises a ball caster. In various embodiments, wheel 312 comprises a pivoting caster, a mecanum wheel, an omniwheel, or any other appropriate wheel type. Wheel 312 comprises a wheel for providing static stability. In some embodiments, base 300 additionally comprises one or more additional wheels, wherein the wheels are driven or not driven.
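The common/differential drive scheme described above can be sketched as follows. This is a minimal sketch under assumed names; the mode labels and rate units are illustrative.

```python
# Hypothetical sketch of the two drive modes described above: in common mode
# both wheels turn in the same direction at the same rate to translate the
# base in a straight line; in differential mode they turn in opposite
# directions at the same rate to rotate it in place (theta axis).

def wheel_speeds(mode, rate):
    """Return (left, right) wheel rates for the commanded base motion."""
    if mode == "common":        # straight-line translation
        return (rate, rate)
    if mode == "differential":  # rotation in place
        return (rate, -rate)
    raise ValueError(f"unknown drive mode: {mode}")
```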

Base 300 additionally comprises computer 302. Computer 302 comprises a computer including an interface system, a memory, a processor, data storage, etc. Computer 302 communicates with a mobile manipulation device control system (e.g., mobile manipulation device control system 102 of FIG. 1) via a network for providing data (e.g., camera data, microphone data, actuator data, sensor data, etc.) and for receiving control information. Computer 302 comprises connections to other systems of the mobile manipulation device, for example, a power connection to battery 316, driver connections to driver 306 and driver 310, a controller connection to vertical lift controller 314, an audio data connection to send audio data to speaker 318, an audio data connection to send audio data from a microphone on the robot to the mobile manipulation device control system, a video data connection to receive data from one or more cameras, an arm controller to cause a manipulator to move toward and away from a vertical lift, a hinge controller to control a hinge motion bending a manipulator up or down, a manipulator controller for opening or closing a manipulator, a manipulator controller for rotating the manipulator, etc. Vertical lift controller 314 comprises a vertical lift controller for holding and actuating a vertical lift to move an arm up and down. Battery 316 comprises a battery pack for powering the mobile manipulation device. Speaker 318 comprises a speaker for providing audio (e.g., providing audio feedback to a user of the mobile manipulation device) and in some implementations enabling a remote human operator or artificial intelligence to communicate with people near the robot.

FIG. 4 is a diagram illustrating an embodiment of a belt-driven linear actuator. In some embodiments, the belt driven linear actuator of FIG. 4 comprises vertical lift 202 and coupling 212 of FIG. 2. In the example shown, mast 400 comprises a vertical mast for holding the linear actuator. Rail 402 is fastened to mast 400. Rail 402 comprises a trapezoidal shape. Slider 404 comprises a slider for sliding linearly on rail 402. Slider 404 comprises a trapezoidal gap that fits the trapezoidal shape of rail 402. The trapezoidal coupling prevents slider 404 from moving in any direction other than up and down. Together, slider 404 and rail 402 form a prismatic joint. Belt 406 drives slider 404 up and down. Belt 406 is driven by a vertical lift actuator (e.g., a vertical lift actuator mounted in a mobile manipulation device base). Arm 410 is mounted to slider 404 and extends horizontally. In the example shown, arm 410 has a round cross-section. Camera 408 is mounted to arm 410 via slider 404 and offset from the arm in order to see arm 410 and a manipulator from the side.

In some embodiments, a compliant grabber device comprises two compliant fingers each with a suction cup fingertip. In some embodiments, each compliant finger has a compliant linkage that comprises two strips of material that behave as springs with one strip of material rigidly affixed to a housing and the other strip of material moved via an actuator to bend the finger. In some embodiments, the actuator pulls on a cable attached to the compliant linkages in order to cause the fingers to close and the return force of the springs causes the fingers to open when the cable tension is released. In some embodiments, a linear actuator rigidly attached to the compliant linkages pulls on the compliant linkages to cause the fingers to close and pushes on the compliant linkages to cause the fingers to open. In some embodiments, bend sensors mounted on the fingers provide signals with information about the kinematics and dynamics of the compliant grabber device. In some embodiments, an actuator used to move the fingers provides voltage, current, and kinematic signals with information about the kinematics and dynamics of the compliant grabber device.

FIG. 5 is a diagram illustrating an embodiment of a manipulator end. Arm 500 comprises an arm of a mobile manipulation device. In some embodiments, arm 500 comprises arm 216 of FIG. 2. In the example shown, arm 500 has a square cross-section. Arm 500 comprises rotational coupler 502. Rotational coupler 502 comprises an actuated rotational coupling for rotating. Arm 500 additionally comprises hinge 504 including hinge actuator 506. Hinge 504 comprises an actuated hinge for bending. Arm 500 additionally comprises arm end 508. Arm end 508 rotates when actuated by rotational coupler 502 and swings when actuated by hinge 504. Camera 510 is mounted on arm end 508. Camera 510 includes ring light 534 to illuminate the area in front of camera 510.

Arm end 508 additionally comprises spring 512 and spring 514, spring 516 and spring 518, and grabber 520 and grabber 522. Spring 516 and spring 518 comprise spring elements capable of being actuated by being drawn into the end of arm end 508. In some embodiments, spring 516 and spring 518 are connected to a cable pulled and released by an actuator. In some embodiments, an actuator is rigidly coupled to spring 516 and spring 518 to push and pull them. In the example shown, linear actuator 532 is rigidly coupled to spring 516 and spring 518. When spring 516 and spring 518 are drawn into the end of arm end 508, spring 512 and spring 514 are pulled together, and grabber 520 and grabber 522 are pulled together. Grabber 520 and grabber 522 can pick up an object in this way. When the actuation is reversed, spring 512 and spring 514 return to their default positions or, in the case of a rigid coupling, can be pushed out to open beyond their default positions. When spring 516 and spring 518 are pushed out of the end of arm end 508, spring 512 and spring 514 are pushed apart, and grabber 520 and grabber 522 are pushed apart. Grabber 520 and grabber 522 can reach around a larger object than would fit between them in their default position in this way. Hook 530 attached to the outside of grabber 522 can be used to hook onto drawer handles, light switches, door handles, and other objects and be used in other manipulation tasks.

In various embodiments, hook 530 is rectangular, the protruding end of an L-shaped attachment to the gripper, or any other appropriate shape.

Arm end 508 additionally comprises laser light source 524 and laser light source 526. For example, laser light source 524 and laser light source 526 comprise laser pointers. Laser light source 524 and laser light source 526 are positioned at a slight angle to arm end 508 such that the light from laser light source 524 converges with the light from laser light source 526 between grabber 520 and grabber 522 (e.g., converging on a point indicating where the manipulator will interact when actuated). When the manipulator is being maneuvered into a position to grab an object using grabber 520 and grabber 522, light from light source 524 and laser light source 526 reflecting from the object can be used to judge where the object is relative to grabber 520 and grabber 522 (e.g., when viewing images captured by camera 510).
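The depth cue provided by the converging lasers can be sketched geometrically: two beams angled inward cross at a known distance, so the separation of their reflected dots in the camera image shrinks as the gripper approaches the convergence point. All constants below (baseline, convergence depth) are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch of the converging-laser depth cue described above. Each
# beam closes half the baseline linearly with depth, so the dots coincide at
# the convergence depth and separate again beyond it.

def dots_separation(depth, baseline=0.06, convergence_depth=0.15):
    """Separation (meters) between the two laser dots at a given depth."""
    return abs(baseline * (1.0 - depth / convergence_depth))
```

An operator watching the wrist-camera image would see the two dots merge as the surface reaches the grasp point between the grabbers.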

Bend sensors (e.g., bend sensor 528 on spring 514) mounted on spring 512 and spring 514 provide information about the state of the gripper. The actuator coupled to spring 516 and spring 518 provides current, voltage, and kinematic information about the state of the gripper. For example, together, this information can be used to detect contact, estimate the applied grip force, estimate the width of the opening of the gripper when grasping an object, and perform other sensing.
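The sensor fusion described above can be sketched as follows. The thresholds and the linear current-to-force model are illustrative assumptions; the disclosure only states that bend-sensor and actuator signals together support contact detection, force estimation, and opening-width estimation.

```python
# Hypothetical sketch of fusing the gripper signals described above: actuator
# current past an assumed stall threshold indicates contact and scales to an
# assumed grip force; averaged bend-sensor readings proxy the opening width.

def grip_state(bend_left, bend_right, motor_current,
               k_force=2.0, contact_current=0.5):
    """Estimate contact, applied grip force, and opening from raw signals."""
    contact = motor_current > contact_current  # current rises on contact
    force = k_force * (motor_current - contact_current) if contact else 0.0
    opening = (bend_left + bend_right) / 2.0   # proxy for gripper width
    return {"contact": contact, "force": force, "opening": opening}
```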

Coupler 534 is used to quickly attach or decouple different tools from the end of arm 500.

FIG. 6 is a diagram illustrating an embodiment of a manipulator end. Arm 600 comprises an arm of a mobile manipulation device. In some embodiments, arm 600 comprises arm 216 of FIG. 2. In the example shown, arm 600 has a square cross-section. Arm 600 additionally comprises arm end 604. Camera 606 is mounted on arm end 604. Camera 606 includes ring light 608 to illuminate the area in front of camera 606.

Arm end 604 additionally comprises springs (e.g., spring 610) and grabbers (e.g., springs and grabbers similar to FIG. 5). The spring elements are capable of being actuated by being drawn into the end of arm end 604. In some embodiments, springs are connected to a cable pulled and released by an actuator. In some embodiments, an actuator is rigidly coupled to springs to push and pull them. When springs are drawn into the end of arm end 604, springs are pulled together, and grabbers are pulled together. Grabbers can pick up an object in this way. When the actuation is reversed, springs return to their default positions or, in the case of a rigid coupling, can be pushed out to open beyond their default positions. When springs are pushed out of the end of arm end 604, springs are pushed apart, and grabbers are pushed apart. Grabbers can reach around a larger object than would fit between them in their default position in this way. Hook 618 attached to the outside of a grabber can be used to hook onto drawer handles, light switches, door handles, and other objects and be used in other manipulation tasks.

Arm end 604 additionally comprises laser light source 614 and laser light source 616. For example, laser light source 614 and laser light source 616 comprise laser pointers. Laser light source 614 and laser light source 616 are positioned at a slight angle to arm end 604 such that the light from laser light source 614 converges with the light from laser light source 616 between grabbers (e.g., converging on a point indicating where the manipulator will interact when actuated). When the manipulator is being maneuvered into a position to grab an object using grabbers, light from light source 614 and laser light source 616 reflecting from the object can be used to judge where the object is relative to grabbers (e.g., when viewing images captured by camera 606).

Bend sensors (e.g., bend sensor 612 on spring 610) mounted on springs provide information about the state of the gripper. The actuator coupled to springs provides current, voltage, and kinematic information about the state of the gripper. For example, together, this information can be used to detect contact, estimate the applied grip force, estimate the width of the opening of the gripper when grasping an object, and perform other sensing.

Coupler 602 is used to quickly attach or decouple different tools from the end of arm 600.

FIG. 7 is a diagram illustrating an embodiment of a manipulator end. In some embodiments, arm 700 comprises arm 216 of FIG. 2. In the example shown, arm 700 includes an actuator internal to arm 700 able to extend and retract coupler 702. When coupler 702 is retracted, outer spring 704 and outer spring 708 move grippers inward (e.g., gripper 706). When coupler 702 is extended, outer spring 704 and outer spring 708 move grippers outward (e.g., gripper 706).

FIG. 8 is a diagram illustrating an embodiment of an arm actuator. In some embodiments, arm 800 is used to implement x axis motion of arm 216. In the example shown, arm 800 is actuated using stiff but flexible cable 808, which causes the arm to extend and retract. Actuator 804 is for moving drive pulley wheel 802. In some embodiments, the cable includes power and signal conductors used by a tool at the end of the arm. Unactuated pulley wheel 806 and actuated pulley wheel 802 are held together with compliant springs to move stiff cable 808 (e.g., a friction drive pulley). Hollow nested elements of arm 800 enable the end of arm 800 to extend and retract.

In some embodiments, the telescoping arm has a cross section smaller than or similar in size to a cross section of a human arm. The small cross section, long reach, and low mass of the telescoping arm are important features of the invention, since they enable the arm to stably reach places in human environments. In some embodiments, the telescoping arm has a length that is longer than or similar in length to an outstretched human arm. In some embodiments, the telescoping arm is constructed from a series of nested structural elements and sliding elements such as bushings or bearings. In some embodiments, the telescoping arm is an exoskeleton comprising a hollow structure that provides structural support and also serves as the exterior of the arm. In some embodiments, the telescoping mechanism is driven by an element that contains power and signal conductors. In some embodiments, the power and signal conductors in the cable used to extend or retract the arm are used by the manipulator or tool attached to the end of the telescoping arm.

FIG. 9 is a diagram illustrating an embodiment of a mobile manipulation device in a room. In some embodiments, mobile manipulation device 900 comprises mobile manipulation device 200 of FIG. 2 located in a room—for example, a living room of an apartment. In the example shown, mobile manipulation device 900 is located in the room to pick up objects, to clean surfaces, to check on pets, to give potential thieves the impression that someone is home, etc. Mobile manipulation device 900 locomotes using wheels located in base 902. Base 902 additionally comprises a computer for communicating with a remote control system, determining actuator commands from control information, providing actuator commands to mobile manipulation device actuators, providing camera images to the control system, etc. Different views of the room are available to mobile manipulation device 900 using top camera 904, arm camera 906, and wrist camera 908.

FIG. 10 is a diagram illustrating an embodiment of a user interface for a mobile manipulation device. In some embodiments, the user interface of FIG. 10 is provided to a user controlling a mobile manipulation device (e.g., via mobile manipulation device control system 102 of FIG. 1). In the example shown, the user interface of FIG. 10 comprises a camera image with an overlay comprising a plurality of command icons. For example, the camera image of FIG. 10 comprises an image captured by top camera 904 of FIG. 9. The camera image of FIG. 10 shows mobile manipulation device 1000. Mobile manipulation device 1000 is capable of moving laterally (e.g., left and right in the image of FIG. 10) and rotating clockwise or counterclockwise. The overlay comprising a plurality of command icons comprises coarse overlay command icon 1002 and fine overlay command icon 1004. Coarse overlay command icon 1002 and fine overlay command icon 1004 are associated with a motion in a theta axis of mobile manipulation device 1000, wherein the theta axis is associated with a rotational movement of a base of mobile manipulation device 1000. In the example shown, coarse overlay command icon 1002 and fine overlay command icon 1004 are associated with a counterclockwise motion in the theta axis, and the overlay additionally comprises a coarse overlay command icon 1014 and a fine overlay command icon 1016 associated with a clockwise motion in the theta axis. Coarse overlay command icon 1002 and fine overlay command icon 1004 are associated with different motion sizes, for example, coarse overlay command icon 1002 is associated with a coarse motion (e.g., 10 degrees of rotation) and fine overlay command icon 1004 is associated with a jog motion (e.g., 1 degree of rotation). 
Coarse overlay command icon 1014 and fine overlay command icon 1016 are associated with different motion sizes, for example, coarse overlay command icon 1014 is associated with a coarse motion (e.g., 10 degrees of rotation) and fine overlay command icon 1016 is associated with a jog motion (e.g., 1 degree of rotation). Other overlay icon functionality beyond coarse and fine jog motions can also be achieved.

The overlay comprising a plurality of command icons additionally comprises coarse overlay command icon 1006 and fine overlay command icon 1008. Coarse overlay command icon 1006 and fine overlay command icon 1008 are associated with a motion in a y axis of mobile manipulation device 1000, wherein the y axis is associated with a horizontal movement of a base of mobile manipulation device 1000. In the example shown, coarse overlay command icon 1006 and fine overlay command icon 1008 are associated with a rightward motion in the y axis, and the overlay additionally comprises a coarse overlay command icon 1020 and a fine overlay command icon 1018 associated with a leftward motion in the y axis. Coarse overlay command icon 1006 and fine overlay command icon 1008 are associated with different motion sizes, for example, coarse overlay command icon 1006 is associated with a coarse motion (e.g., a movement of the base 10 cm to the right) and fine overlay command icon 1008 is associated with a jog motion (e.g., a movement of the base 1 cm to the right). Coarse overlay command icon 1020 and fine overlay command icon 1018 are associated with different motion sizes, for example, coarse overlay command icon 1020 is associated with a coarse motion (e.g., a movement of the base 10 cm to the left) and fine overlay command icon 1018 is associated with a jog motion (e.g., a movement of the base 1 cm to the left).
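The coarse/fine jog scheme above amounts to a lookup from icon to axis, direction, and motion size. A minimal sketch, with icon names assumed and magnitudes following the examples in the text (10 and 1 degrees for theta, 10 and 1 cm for y):

```python
# Hypothetical sketch of the coarse/fine overlay icons described above. Each
# icon maps to an axis and a signed motion size; theta amounts are in degrees,
# y amounts in centimeters, per the examples in the text.

JOG_COMMANDS = {
    "coarse_ccw":   ("theta", +10.0),
    "fine_ccw":     ("theta", +1.0),
    "coarse_cw":    ("theta", -10.0),
    "fine_cw":      ("theta", -1.0),
    "coarse_right": ("y", +10.0),
    "fine_right":   ("y", +1.0),
    "coarse_left":  ("y", -10.0),
    "fine_left":    ("y", -1.0),
}

def jog(icon):
    """Return the motion command associated with an activated overlay icon."""
    axis, amount = JOG_COMMANDS[icon]
    return {"axis": axis, "amount": amount}
```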

The overlay comprising a plurality of command icons additionally comprises arm camera switch indication button 1010 and wrist camera switch indication button 1012. In the event an indication to arm camera switch indication button 1010 is received (e.g., by the mobile manipulation device control system) a user interface including a camera image from an arm camera is shown. In the event an indication to wrist camera switch indication button 1012 is received (e.g., by the mobile manipulation device control system) a user interface including a camera image from a wrist camera is shown. In some embodiments, the interface displays more than one video stream at the same time (e.g., an interface that shows video from the top camera, arm camera, and wrist camera at the same time), each with its own overlay.

In some embodiments, the overlay comprising a plurality of command icons additionally comprises a torque overlay for displaying an actuator torque. For example, a command icon is colored (e.g., colored red) when an associated actuator is delivering a torque above a threshold value, a display region is colored when an associated actuator is delivering a torque above a threshold value, the overlay comprises one or more torque meter displays, etc. The visibility of the color can be proportional to the torque, such as becoming less transparent with higher torques.
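The proportional-transparency behavior can be sketched as a mapping from torque to an alpha value. The threshold and saturation torque below are illustrative assumptions.

```python
# Hypothetical sketch of the torque overlay above: the colored tint is
# invisible below an assumed threshold and becomes fully opaque at an assumed
# saturation torque, so visibility is proportional to torque in between.

def torque_overlay_alpha(torque, threshold=1.0, max_torque=5.0):
    """Return an overlay alpha in [0, 1] for a given actuator torque."""
    if torque <= threshold:
        return 0.0
    return min(1.0, (torque - threshold) / (max_torque - threshold))
```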

FIG. 11 is a diagram illustrating an embodiment of a user interface for a mobile manipulation device. In some embodiments, the user interface of FIG. 11 is provided to a user controlling a mobile manipulation device (e.g., via mobile manipulation device control system 102 of FIG. 1). In the example shown, the user interface of FIG. 11 comprises a camera image with an overlay comprising a command icon. For example, the camera image of FIG. 11 comprises an image captured by top camera 904 of FIG. 9. In the example shown, the camera image of FIG. 11 shows mobile manipulation device 1100. Mobile manipulation device 1100 includes arm 1108 and camera 1112 that points along arm 1108. Mobile manipulation device 1100 has base 1110 that enables rotation of mobile manipulation device 1100. The overlay comprising a command icon comprises fine overlay command icon 1102. Fine overlay command icon 1102 is associated with a jog motion in a counterclockwise direction in a theta axis. Fine overlay command icon 1102 comprises the only overlay command icon shown in the user interface of FIG. 11. Fine overlay command icon 1102 comprises a cursor icon (e.g., a mouse cursor icon or icon that appears when the video display is touched) comprising a shape indicating a jog motion in a counterclockwise direction in a theta axis. The motion axis, direction, and size associated with fine overlay command icon 1102 are determined using a cursor position or touch position. For example, the display of FIG. 11 is divided into a set of regions, wherein each region is associated with a different motion. The displayed cursor icon (e.g., the cursor icon for the mouse cursor or icon that appears when the video display is touched) is determined to be a cursor icon associated with the region the cursor lies within.
In some embodiments, multiple regions representing different sizes of motions (e.g., small, medium, and large) are used, or continuous variability of the motion size is commanded based on the position of the mouse cursor or location of touch and indicated to the user via rendered computer graphics (e.g., variable icon size). The overlay comprising a command icon additionally comprises arm camera switch indication button 1104 and wrist camera switch indication button 1106.
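The continuously variable jog size mentioned above can be sketched as scaling both the commanded motion and the rendered icon with the cursor's distance from the display center. The constants below are illustrative assumptions.

```python
# Hypothetical sketch of the continuously variable jog size described above:
# the commanded rotation grows with cursor distance from the display center
# (clamped at an assumed maximum), and the rendered icon scales to match.

def variable_jog(dx, dy, max_distance=200.0, max_degrees=15.0):
    """dx, dy: cursor offset from the display center, in pixels."""
    distance = min((dx ** 2 + dy ** 2) ** 0.5, max_distance)
    fraction = distance / max_distance
    return {"degrees": fraction * max_degrees,   # commanded motion size
            "icon_scale": 0.5 + 0.5 * fraction}  # icon grows with the motion
```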

FIG. 12 is a diagram illustrating a set of command icon regions. In some embodiments, the diagram of FIG. 12 shows display regions associated with different command icons for an overlay command icon comprising a cursor icon. In the example shown, each region of FIG. 12 is associated with a command icon that is displayed as a cursor icon when the cursor is in that region or a touch occurs in that region. When the cursor is not in that region or the region is not being touched the command icon may be entirely or partially hidden from the user. FIG. 12 comprises a plurality of regions. Region 1200 is associated with coarse counterclockwise motion overlay command icon 1216 associated with a coarse counterclockwise motion in a theta axis. Region 1202 is associated with fine counterclockwise motion overlay command icon 1218 associated with a fine counterclockwise motion in a theta axis. Region 1204 is associated with coarse clockwise motion overlay command icon 1220 associated with a coarse clockwise motion in a theta axis. Region 1206 is associated with fine clockwise motion overlay command icon 1222 associated with a fine clockwise motion in a theta axis. Region 1208 is associated with coarse rightward motion overlay command icon 1224 associated with a coarse rightward motion in a y axis. Region 1210 is associated with fine rightward motion overlay command icon 1226 associated with a fine rightward motion in a y axis. Region 1212 is associated with coarse leftward motion overlay command icon 1228 associated with a coarse leftward motion in a y axis. Region 1214 is associated with fine leftward motion overlay command icon 1230 associated with a fine leftward motion in a y axis.

FIG. 13 is a diagram illustrating an embodiment of a user interface for a mobile manipulation device. In some embodiments, the user interface of FIG. 13 is provided to a user controlling a mobile manipulation device (e.g., via mobile manipulation device control system 102 of FIG. 1). In the example shown, the user interface of FIG. 13 comprises a camera image with an overlay comprising a plurality of command icons. For example, the camera image of FIG. 13 comprises an image captured by arm camera 906 of FIG. 9. In the example shown, the camera image of FIG. 13 shows mobile manipulation device arm 1300. Mobile manipulation device arm 1300 is capable of moving vertically and extending or retracting. The overlay includes a plurality of command icons—for example, coarse vertical up overlay command icon 1302 and fine vertical up overlay command icon 1304. Coarse vertical up overlay command icon 1302 and fine vertical up overlay command icon 1304 are associated with a motion in a z axis of a mobile manipulation device, wherein the z axis is associated with a vertical movement of a vertical lift of the mobile manipulation device. In the example shown, coarse vertical up overlay command icon 1302 and fine vertical up overlay command icon 1304 are associated with an upwards motion in the z axis. Coarse vertical up overlay command icon 1302 and fine vertical up overlay command icon 1304 are associated with different motion sizes—for example, coarse vertical up overlay command icon 1302 is associated with a coarse motion (e.g., 10 cm up) and fine vertical up overlay command icon 1304 is associated with a jog motion (e.g., 1 cm up).

The overlay includes a plurality of command icons—for example, coarse vertical down overlay command icon 1314 and fine vertical down overlay command icon 1316. Coarse vertical down overlay command icon 1314 and fine vertical down overlay command icon 1316 are associated with a motion in a z axis of a mobile manipulation device, wherein the z axis is associated with a vertical movement of a vertical lift of the mobile manipulation device. In the example shown, coarse vertical down overlay command icon 1314 and fine vertical down overlay command icon 1316 are associated with a downwards motion in the z axis. Coarse vertical down overlay command icon 1314 and fine vertical down overlay command icon 1316 are associated with different motion sizes—for example, coarse vertical down overlay command icon 1314 is associated with a coarse motion (e.g., 10 cm down) and fine vertical down overlay command icon 1316 is associated with a jog motion (e.g., 1 cm down).

The overlay comprising a plurality of command icons additionally comprises coarse retracting overlay command icon 1306 and fine retracting overlay command icon 1308. Coarse retracting overlay command icon 1306 and fine retracting overlay command icon 1308 are associated with a motion in an x axis of the mobile manipulation device, wherein the x axis is associated with a retraction movement of an arm of the mobile manipulation device. In the example shown, coarse retracting overlay command icon 1306 and fine retracting overlay command icon 1308 are associated with a retracting motion in the x axis. Coarse retracting overlay command icon 1306 and fine retracting overlay command icon 1308 are associated with different motion sizes—for example, coarse retracting overlay command icon 1306 is associated with a coarse retracting motion (e.g., 10 cm) and fine retracting overlay command icon 1308 is associated with a jog motion (e.g., 1 cm).

The overlay comprising a plurality of command icons additionally comprises coarse extending overlay command icon 1318 and fine extending overlay command icon 1320. Coarse extending overlay command icon 1318 and fine extending overlay command icon 1320 are associated with a motion in an x axis of the mobile manipulation device, wherein the x axis is associated with an extension movement of an arm of the mobile manipulation device. In the example shown, coarse extension overlay command icon 1318 and fine extension overlay command icon 1320 are associated with an extending motion in the x axis. Coarse extension overlay command icon 1318 and fine extension overlay command icon 1320 are associated with different motion sizes—for example, coarse extension overlay command icon 1318 is associated with a coarse extension motion (e.g., 10 cm) and fine extension overlay command icon 1320 is associated with a jog motion (e.g., 1 cm).

The overlay comprising a plurality of command icons additionally comprises top camera switch indication button 1310 and wrist camera switch indication button 1312. In response to an indication to top camera switch indication button 1310 being received (e.g., by the mobile manipulation device control system), a user interface including a camera image from a top camera is shown. In response to an indication to wrist camera switch indication button 1312 being received (e.g., by the mobile manipulation device control system), a user interface including a camera image from a wrist camera is shown.

FIG. 14 is a diagram illustrating an embodiment of a user interface for a mobile manipulation device. In some embodiments, the user interface of FIG. 14 is provided to a user controlling a mobile manipulation device (e.g., via mobile manipulation device control system 102 of FIG. 1). In the example shown, the user interface of FIG. 14 comprises a camera image with an overlay comprising a plurality of command icons. For example, the camera image of FIG. 14 comprises an image captured by wrist camera 908 of FIG. 9. In the example shown, the camera image of FIG. 14 shows mobile manipulation device manipulator 1400. Mobile manipulation device manipulator 1400 is capable of moving in a pitch axis, rotating in a roll axis, and opening or closing. Other types of manipulators can also be controlled using this type of control interface. For example, the manipulator may comprise a hook. The command icons overlaying the camera image would be positioned and presented so as to enable the operator to efficiently operate the hook to pull open a drawer, for example. The overlay comprising a plurality of command icons includes coarse manipulator counterclockwise rotation overlay command icon 1402 and fine manipulator counterclockwise rotation overlay command icon 1404. Coarse manipulator counterclockwise rotation overlay command icon 1402 and fine manipulator counterclockwise rotation overlay command icon 1404 are associated with a motion in a roll axis of the mobile manipulation device, wherein the roll axis is associated with a rotation of a wrist rotational coupling of the mobile manipulation device. 
Coarse manipulator counterclockwise rotation overlay command icon 1402 and fine manipulator counterclockwise rotation overlay command icon 1404 are associated with different motion sizes—for example, coarse manipulator counterclockwise rotation overlay command icon 1402 is associated with a coarse motion (e.g., 10 degrees) and fine manipulator counterclockwise rotation overlay command icon 1404 is associated with a jog motion (e.g., 1 degree).

The overlay comprising a plurality of command icons includes coarse manipulator clockwise rotation overlay command icon 1418 and fine manipulator clockwise rotation overlay command icon 1420. Coarse manipulator clockwise rotation overlay command icon 1418 and fine manipulator clockwise rotation overlay command icon 1420 are associated with a motion in a roll axis of the mobile manipulation device, wherein the roll axis is associated with a rotation of a wrist rotational coupling of the mobile manipulation device. Coarse manipulator clockwise rotation overlay command icon 1418 and fine manipulator clockwise rotation overlay command icon 1420 are associated with different motion sizes—for example, coarse manipulator clockwise rotation overlay command icon 1418 is associated with a coarse motion (e.g., 10 degrees) and fine manipulator clockwise rotation overlay command icon 1420 is associated with a jog motion (e.g., 1 degree).

The overlay comprising a plurality of command icons additionally comprises coarse open overlay command icon 1406 and fine open overlay command icon 1408. Coarse open overlay command icon 1406 and fine open overlay command icon 1408 are associated with an open/close motion of the mobile manipulation device, wherein the open/close motion is associated with an opening and closing of manipulator 1400 of the mobile manipulation device. In the example shown, coarse open overlay command icon 1406 and fine open overlay command icon 1408 are associated with an opening motion. Coarse open overlay command icon 1406 and fine open overlay command icon 1408 are associated with different motion sizes—for example, coarse open overlay command icon 1406 is associated with a coarse motion and fine open overlay command icon 1408 is associated with a jog motion.

The overlay comprising a plurality of command icons additionally comprises coarse closed overlay command icon 1422 and fine closed overlay command icon 1424. Coarse closed overlay command icon 1422 and fine closed overlay command icon 1424 are associated with an open/close motion of the mobile manipulation device, wherein the open/close motion is associated with an opening and closing of manipulator 1400 of the mobile manipulation device. In the example shown, coarse closed overlay command icon 1422 and fine closed overlay command icon 1424 are associated with a closing motion. Coarse closed overlay command icon 1422 and fine closed overlay command icon 1424 are associated with different motion sizes—for example, coarse closed overlay command icon 1422 is associated with a coarse motion and fine closed overlay command icon 1424 is associated with a jog motion.

The overlay comprising a plurality of command icons additionally comprises coarse vertical up overlay command icon 1410 and fine vertical up overlay command icon 1412. Coarse vertical up overlay command icon 1410 and fine vertical up overlay command icon 1412 are associated with a motion in a pitch axis of the mobile manipulation device, wherein the pitch axis is associated with a bending of a wrist hinge of the mobile manipulation device. In the example shown, coarse vertical up overlay command icon 1410 and fine vertical up overlay command icon 1412 are associated with an upwards motion in the pitch axis. Coarse vertical up overlay command icon 1410 and fine vertical up overlay command icon 1412 are associated with different motion sizes—for example, coarse vertical up overlay command icon 1410 is associated with a coarse motion (e.g., 10 cm) and fine vertical up overlay command icon 1412 is associated with a jog motion (e.g., 1 cm).

The overlay comprising a plurality of command icons additionally comprises coarse vertical down overlay command icon 1426 and fine vertical down overlay command icon 1428. Coarse vertical down overlay command icon 1426 and fine vertical down overlay command icon 1428 are associated with a motion in a pitch axis of the mobile manipulation device, wherein the pitch axis is associated with a bending of a wrist hinge of the mobile manipulation device. In the example shown, coarse vertical down overlay command icon 1426 and fine vertical down overlay command icon 1428 are associated with a downwards motion in the pitch axis. Coarse vertical down overlay command icon 1426 and fine vertical down overlay command icon 1428 are associated with different motion sizes—for example, coarse vertical down overlay command icon 1426 is associated with a coarse motion (e.g., 10 cm down) and fine vertical down overlay command icon 1428 is associated with a jog motion (e.g., 1 cm down).

The overlay comprising a plurality of command icons additionally comprises arm camera switch indication button 1414 and top camera switch indication button 1416. In the event an indication to arm camera switch indication button 1414 is received (e.g., by the mobile manipulation device control system) a user interface including a camera image from an arm camera is shown. In the event an indication to top camera switch indication button 1416 is received (e.g., by the mobile manipulation device control system) a user interface including a camera image from a top camera is shown.

FIG. 15 is a flow diagram illustrating an embodiment of a process for a display window control system for a mobile manipulation device. In some embodiments, the process of FIG. 15 is executed by mobile manipulation device control system 102 of FIG. 1. In the example shown, in 1500, an image is received from a mobile manipulation device. For example, an image or a plurality of images are received from one or more of a top camera, an arm camera, a wrist camera, etc. In 1502, a display output to display the image in a display window is determined. For example, determining a display output comprises determining a selected camera or set of cameras, determining images associated with selected cameras, cropping images, zooming images, enhancing images, rotating images, combining images, etc. In 1504, an overlay output to display a coarse overlay command icon or a fine overlay command icon is determined. Other types of overlay command icons can be supported besides coarse and fine overlay command icons. The overlay command icons can be placed in the display at prespecified locations that are intuitively mapped to their function (e.g., an upward motion icon appears at the top of the screen). The overlay command icons can also be placed in the display at locations that are automatically generated based on the current state of the robot (positions of its links in the image, for example). The coarse overlay command icon or the fine overlay command icon comprise command icons for indicating a mobile manipulation device command. For example, one command icon is determined, one coarse overlay command icon and one fine overlay command icon is determined, or a plurality of coarse overlay command icons and fine overlay command icons are determined, each command icon associated with a motion axis and a direction. In 1506, the display output and the overlay output are displayed.
For example, the display output is displayed with the overlay output as an overlay on a display of a mobile manipulation device control system. In 1508, it is determined that an indication has been received to activate the coarse overlay command icon or the fine overlay command icon. In 1510, the indication is provided to the mobile manipulation device.
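As an illustration only (the function and key names here are hypothetical assumptions, not taken from this disclosure), the steps of FIG. 15 can be sketched end to end as:

```python
# Hypothetical sketch of the FIG. 15 process (steps 1500-1510).
# All names are illustrative assumptions.

def determine_display_output(image):
    # 1502: select/process the received image for the display window
    return {"camera": image["camera"], "pixels": image["pixels"]}

def determine_overlay_output(camera):
    # 1504: choose coarse/fine overlay command icons for the selected camera
    return [{"icon": "coarse_up", "size": "coarse"},
            {"icon": "fine_up", "size": "fine"}]

def display_control_process(image, activated_icon):
    display_output = determine_display_output(image)            # 1502
    overlay_output = determine_overlay_output(image["camera"])  # 1504
    # 1506: display output and overlay output are composited for display
    frame = {"display": display_output, "overlay": overlay_output}
    # 1508/1510: an activation indication is detected and provided to the device
    indication = {"command": activated_icon}
    return frame, indication

frame, indication = display_control_process(
    {"camera": "wrist", "pixels": b""}, "fine_up")
```

The point of the sketch is the data flow: one received image yields both a display output and an overlay output, and the only thing sent back to the device is the activation indication.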

FIG. 16 is a flow diagram illustrating an embodiment of a process for determining a display output to display the image in a display window. In some embodiments, the process of FIG. 16 implements 1502 of FIG. 15. In the example shown, in 1600, a camera selected for display is determined. In 1602, an image associated with the selected camera is determined. In 1604, the image is processed. For example, processing the image may comprise scaling the image, translating the image, rotating the image, dewarping the image, cropping the image, or any other appropriate image processing. In 1606, it is determined whether a switch indication is received to switch the display output. In the event it is determined that a switch indication is received, control passes to 1600. In the event it is determined that a switch indication is not received, the process ends.
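A minimal sketch of the FIG. 16 steps, assuming hypothetical function names and representing image processing as metadata rather than actual pixel operations:

```python
# Hypothetical sketch of the FIG. 16 display-output steps (1600-1606);
# names and the metadata-only "processing" are illustrative assumptions.

def process_image(image, scale=1.0, rotate_deg=0, crop=None):
    # 1604: scaling, rotating, cropping, etc. (recorded as metadata here)
    processed = dict(image)
    processed.update(scale=scale, rotate_deg=rotate_deg, crop=crop)
    return processed

def determine_display_output(selected_camera, images, switch_to=None):
    if switch_to is not None:           # 1606: a switch indication was received,
        selected_camera = switch_to     # so control returns to 1600 with a new camera
    image = images[selected_camera]     # 1600/1602: selected camera and its image
    return selected_camera, process_image(image, scale=2.0)

camera, output = determine_display_output(
    "top", {"top": {"id": 1}, "wrist": {"id": 2}}, switch_to="wrist")
```

A usage note: in the flow diagram the switch check loops back to 1600; the sketch collapses that loop into a single re-selection for clarity.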

FIG. 17 is a flow diagram illustrating an embodiment of a process for determining an overlay output to display a coarse overlay command icon or a fine overlay command icon. In some embodiments, the process of FIG. 17 implements 1504 of FIG. 15. In the example shown, in 1700, a camera selected for display is determined. In 1702, one or more axes associated with the camera selected for display are determined. For example, a y axis and a theta axis are associated with a top camera, an x axis and a z axis are associated with an arm camera, and a pitch axis, a roll axis, and an open/close motion are associated with a wrist camera. In 1704, a next axis associated with the camera selected for display is selected. In some embodiments, the first axis associated with the camera selected for display is selected. In 1706, a coarse overlay command icon or a fine overlay command icon associated with the axis are determined for a first direction. In 1708, a coarse overlay command icon or a fine overlay command icon associated with the axis are determined for a second direction. For example, determining a command icon comprises determining a command icon shape, determining a command icon size, determining a command icon orientation, determining a command icon location, etc. In 1710, it is determined whether there are more axes. For example, it is determined whether there are more axes associated with the camera selected for display. In the event it is determined that there are more axes, control passes to 1704. In the event it is determined that there are not more axes, control passes to 1712. In 1712, one or more switch indication icons are determined, each switch indication icon associated with another camera.
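The per-camera axis loop of FIG. 17 can be sketched as follows. The camera-to-axis table follows the example in the description; the function name, icon representation, and switch-icon handling are illustrative assumptions.

```python
# Hypothetical sketch of FIG. 17 (1700-1712): build coarse and fine icons
# for both directions of each axis of the selected camera.

CAMERA_AXES = {
    "top": ["y", "theta"],              # per the description's example
    "arm": ["x", "z"],
    "wrist": ["pitch", "roll", "open_close"],
}

def determine_overlay_output(selected_camera, all_cameras):
    icons = []
    for axis in CAMERA_AXES[selected_camera]:      # 1704/1710: iterate over axes
        for direction in ("first", "second"):      # 1706/1708: two directions
            for size in ("coarse", "fine"):
                icons.append({"axis": axis, "direction": direction, "size": size})
    # 1712: one switch indication icon per other camera
    switches = [c for c in all_cameras if c != selected_camera]
    return icons, switches

icons, switches = determine_overlay_output("arm", ["top", "arm", "wrist"])
```

For the arm camera this yields four icons per axis (coarse and fine in each of two directions), matching the icon layout described for FIG. 13.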

FIG. 18 is a flow diagram illustrating an embodiment of a process for determining that an indication has been received to activate a coarse overlay command icon or a fine overlay command icon. In some embodiments, the process of FIG. 18 implements 1508 of FIG. 15. In the example shown, in 1800, a camera selected for display is determined. In 1802, a set of control regions associated with the camera selected for display is determined. In 1804, a current cursor position or touch position is determined. In 1806, a control region of a set of control regions associated with the current cursor position or touch position is determined. In 1808, an overlay command icon associated with the control region in the current cursor position or touch position is determined. In 1810, one or more switch indication icons are determined, each switch indication icon associated with another camera.
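The hit-test at the core of FIG. 18 (steps 1804-1808) can be sketched as below. The rectangular region layout, bounds format, and names are assumptions for illustration only.

```python
# Hypothetical sketch of the FIG. 18 hit-test: map a cursor or touch
# position to the overlay command icon of its control region.

def find_activated_icon(regions, position):
    x, y = position                          # 1804: current cursor/touch position
    for region in regions:                   # 1806: find the containing region
        x0, y0, x1, y1 = region["bounds"]
        if x0 <= x < x1 and y0 <= y < y1:
            return region["icon"]            # 1808: its associated command icon
    return None                              # position falls outside all regions

regions = [
    {"bounds": (0, 0, 100, 50), "icon": "coarse_up"},
    {"bounds": (0, 50, 100, 100), "icon": "fine_up"},
]
icon = find_activated_icon(regions, (40, 60))
```

Because the control regions are determined per camera (1800/1802), the same screen position can activate different commands depending on which camera view is displayed.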

FIG. 19 is a flow diagram illustrating an embodiment of a process for a remote control interface system for a mobile manipulation device. In some embodiments, the process of FIG. 19 is executed by mobile manipulation device 104 of FIG. 1. In the example shown, in 1900, an image from a mobile manipulation device camera is provided. In 1902, an indication to move is received. For example, the indication to move comprises an indication received to activate a coarse overlay command icon or a fine overlay command icon. In 1904, actuator commands are determined based at least in part on the indication to move. In 1906, the actuator commands are provided to one or more actuators.
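A minimal sketch of the device-side loop of FIG. 19, with all names and the simplified axis/direction representation being illustrative assumptions:

```python
# Hypothetical sketch of the FIG. 19 remote control steps (1900-1906).

def remote_control_step(camera_image, indication, actuator_log):
    provided = {"image": camera_image}        # 1900: image provided to the UI
    # 1902/1904: an indication to move is received and mapped to commands
    commands = [(indication["axis"], indication["direction"])]
    for command in commands:                  # 1906: provided to the actuators
        actuator_log.append(command)          # stand-in for a real drive call
    return provided, actuator_log

log = []
provided, log = remote_control_step(
    {"camera": "top"}, {"axis": "theta", "direction": "cw"}, log)
```

The sketch highlights that the device side is symmetric with the control side: images flow out, move indications flow in.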

FIG. 20 is a flow diagram illustrating an embodiment of a process for determining actuator commands based at least in part on an indication to move. In some embodiments, the process of FIG. 20 implements 1904 of FIG. 19. In the example shown, in 2000, an axis associated with the indication to move is determined. In 2002, a motion size associated with the indication to move is determined. In 2004, a motion direction associated with the indication to move is determined. In 2006, one or more actuators associated with the axis are determined. In 2008, actuator commands for the one or more actuators are determined based at least in part on the motion size and the motion direction.
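The steps of FIG. 20 can be sketched as follows. The axis-to-actuator table and the coarse/fine magnitudes are assumptions chosen to be consistent with the examples in the description; none of the names come from this disclosure.

```python
# Hypothetical sketch of FIG. 20 (2000-2008): translate an indication to
# move into per-actuator commands.

AXIS_ACTUATORS = {"x": ["arm_extend"], "z": ["lift"], "roll": ["wrist_roll"]}
MOTION_SIZES = {"coarse": 10.0, "fine": 1.0}   # e.g., cm or degrees

def determine_actuator_commands(indication):
    axis = indication["axis"]                  # 2000: axis of the indication
    size = MOTION_SIZES[indication["size"]]    # 2002: coarse motion vs fine jog
    sign = 1.0 if indication["direction"] == "positive" else -1.0  # 2004
    # 2006/2008: commands for each actuator associated with the axis
    return {actuator: sign * size for actuator in AXIS_ACTUATORS[axis]}

commands = determine_actuator_commands(
    {"axis": "roll", "size": "fine", "direction": "negative"})
```

A usage note: axes served by more than one actuator would simply list several actuators in the table, and each would receive its own command in 2008.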

Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not limited to the details provided. There are many alternative ways of implementing the invention. The disclosed embodiments are illustrative and not restrictive.

Claims

1. A display window control system for a mobile manipulation device, comprising:

an interface configured to: receive an image from a mobile manipulation device; and
a processor configured to: determine a display output to display the image in a display window; determine an overlay output to display one or more overlay command icons; determine that an indication has been received to activate an overlay command; and provide the indication to the mobile manipulation device.

2. The system of claim 1, wherein the image is received from an overhead camera of the mobile manipulation device, an arm camera of the mobile manipulation device, or a wrist camera of the mobile manipulation device.

3. The system of claim 1, wherein the one or more overlay command icons comprise a coarse overlay command icon and a fine overlay command icon.

4. The system of claim 1, wherein the overlay output is determined based at least in part on a camera associated with the image.

5. The system of claim 3, wherein the coarse overlay command icon is associated with a coarse jog motion, and wherein the fine overlay command icon is associated with a fine jog motion.

6. The system of claim 1, wherein the processor is further configured to determine that a switch indication has been received to switch the display output to display a second image associated with an indicated camera of the mobile manipulation device.

7. The system of claim 1, wherein an axis associated with the overlay command icon is based at least in part on a camera associated with the display output.

8. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with a motion in a y axis of the mobile manipulation device, wherein the y axis is associated with a horizontal movement of a base of the mobile manipulation device.

9. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with a motion in a theta axis of the mobile manipulation device, wherein the theta axis is associated with a rotational movement of a base of the mobile manipulation device.

10. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with a motion in a z axis of the mobile manipulation device, wherein the z axis is associated with a vertical movement of a vertical lift of the mobile manipulation device.

11. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with a motion in a y axis of the mobile manipulation device, wherein the y axis is associated with an extension movement of an arm of the mobile manipulation device.

12. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with a motion in a pitch axis of the mobile manipulation device, wherein the pitch axis is associated with a bending of a wrist hinge of the mobile manipulation device.

13. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with a motion in a roll axis of the mobile manipulation device, wherein the roll axis is associated with a rotation of a wrist rotational coupling of the mobile manipulation device.

14. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with an open/close motion of a device gripper.

15. The system of claim 1, wherein the overlay output additionally comprises a torque overlay for displaying an actuator torque.

16. A remote control interface system for a mobile manipulation device, comprising:

an interface configured to: provide an image from a mobile manipulation device camera; and receive an indication to move; and
a processor configured to: determine actuator commands based at least in part on the indication to move; and provide the actuator commands to one or more actuators.

17. The system of claim 16, wherein the indication to move is associated with a command motion type.

18. The system of claim 17, wherein the command motion type comprises a coarse motion, a fine jog motion, or an autonomous motion that concludes within a bounded amount of time.

19. The system of claim 16, wherein the indication to move is associated with a command axis, wherein the command axis comprises an x axis, a theta axis, a z axis, a y axis, a pitch axis, a roll axis, or an open/close motion.

Patent History
Publication number: 20180267690
Type: Application
Filed: Mar 16, 2018
Publication Date: Sep 20, 2018
Inventors: Charles Clark Kemp (Atlanta, GA), Henry Mandus Clever (Atlanta, GA)
Application Number: 15/924,088
Classifications
International Classification: G06F 3/0481 (20060101); G06F 3/0484 (20060101); B25J 9/16 (20060101); B25J 15/08 (20060101);