CONTROL SYSTEM FOR A MOBILE MANIPULATION DEVICE
A display window control system for a mobile manipulation device includes an interface and a processor. The interface is configured to receive an image from a mobile manipulation device. The processor is configured to determine a display output to display the image in a display window; determine an overlay output to display one or more overlay command icons; determine that an indication has been received to activate an overlay command; and provide the indication to the mobile manipulation device.
This application claims priority to U.S. Provisional Patent Application No. 62/473,778 entitled LOW COST GENERAL-PURPOSE MOBILE MANIPULATOR FOR INDOOR USE filed Mar. 20, 2017, which is incorporated herein by reference for all purposes. This application also claims priority to U.S. Provisional Patent Application No. 62/474,427 entitled LOW COST GENERAL-PURPOSE MOBILE MANIPULATOR FOR INDOOR USE filed Mar. 21, 2017, which is incorporated herein by reference for all purposes. This application also claims priority to U.S. Provisional Patent Application No. 62/473,778 entitled ADDITIONAL HARDWARE AND SOFTWARE FOR A LOW-COST GENERAL-PURPOSE MOBILE MANIPULATOR FOR INDOOR USE filed Feb. 5, 2018, which is incorporated herein by reference for all purposes.
BACKGROUND OF THE INVENTION

People with limited abilities often need an assistant for performing tasks around the home. For example, moving objects, feeding pets, cleaning up, etc., become very difficult tasks for people with difficulty walking around their home or picking up and carrying objects. Even for the able-bodied it would often be preferable to not have to perform some of these tasks, for example, when traveling, busy with work, relaxing, etc. Hiring an assistant to come into the home is expensive and can bring complications, for example introducing the possibility of theft, employment law issues, etc. However, the world of humans is very messy and difficult for robots to navigate. Only the simplest home robots have shown themselves to be practical (e.g., robotic vacuum cleaners). This creates a problem where robots for more complicated tasks typically require very sophisticated technology, becoming more expensive than hiring a human assistant.
Various embodiments of the invention are disclosed in the following detailed description and the accompanying drawings.
The invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the invention. Unless stated otherwise, a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task. As used herein, the term ‘processor’ refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.
A detailed description of one or more embodiments of the invention is provided below along with accompanying figures that illustrate the principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any embodiment. The scope of the invention is limited only by the claims and the invention encompasses numerous alternatives, modifications and equivalents. Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. These details are provided for the purpose of example and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.
A display window control system is disclosed. The display window control system for a mobile manipulation device comprises an interface configured to receive an image from a mobile manipulation device and a processor configured to determine a display output to display the image in a display window, determine an overlay output to display one or more overlay command icons, determine that an indication has been received to activate an overlay command, and provide the indication to the mobile manipulation device. In some embodiments, the display window control system for the mobile manipulation device comprises a memory coupled to the processor and configured to provide the processor with instructions.
A mobile manipulation device is disclosed. The mobile manipulation device comprises a base, a lift, an arm, and a manipulator. The base is able to move across a surface underneath the base. The lift is coupled to the base. The lift moves the arm vertically. The arm moves the manipulator horizontally along one direction. The base is able to move perpendicular to the one direction.
A mobile manipulation device comprises a base, a lift, an arm, and a manipulator. The base moves the lift across a surface using actuators connected to mechanisms such as wheels, tank treads, or articulated links. The base can rotate and translate the lift across the surface and in some embodiments perform other motions. In some embodiments, the base uses two driven wheels. The lift moves the arm vertically—lifting and lowering. The arm moves horizontally—extending and retracting. The base can move perpendicular to the arm's horizontal motion. The manipulator is attached to the arm. Together, the base, lift, and arm result in Cartesian motion of the manipulator. The base is also responsible for moving the lift to different locations and orientations in the environment.
People with limited mobility, cognitive impairment, perceptual difficulties, or reduced dexterity would benefit from a home robot that retrieves requested objects and performs other manipulation tasks. Here a robot is described that can perform manipulation tasks in human environments, such as homes. The robot emulates advantageous characteristics of the human body in a low-cost and highly-simplified form.
A full humanoid robot would be well adapted to manipulating everyday objects in human environments, however the cost would be prohibitive and the robot would be challenging to control. There are several important characteristics of the human body such a robot should have, including sensors and manipulators that are high above the ground, a small footprint, and stability with respect to perturbations, even when reaching out into the environment in order to perform tasks, such as grasping objects. The disclosed device is a robot that emulates these and other humanoid characteristics in a low cost, simplified form.
Two key capabilities of the design are its ability to reach important locations in human environments and its ability to take advantage of the Cartesian structure of human environments, which tend to consist of many horizontal and vertical planes, such as floors, tables, countertops, and cabinets. Some of the novel concepts are as follows:
- Use of fewer actuators by using the base during manipulation to achieve movement along an additional orthogonal direction; use of a thin and long telescoping mechanism to reach through clutter instead of the typical approach of reaching around clutter with multiple rotary joints; and use of a Cartesian structure matched to the Cartesian structure of indoor human environments to simplify manipulation.
- Use of smaller actuators by having the arm actuator move a telescoping member orthogonal to gravity to avoid issues of gravitational loading and long moment arms typically associated with a long reach; using a telescoping member rigidly attached to a carriage on the lift and the lift attached to a high-mass mobile base to enable the robot's structure to support the force and moment applied by the telescoping arm to the carriage instead of an actuator; and using a lift actuator that moves a low mass arm assembly up and down, where in some embodiments the arm assembly is low mass due in part to the small arm actuator and the lightweight telescoping structure.
- Reduced need for coordination among actuators due to Cartesian structure in Cartesian environments, which enables independent, serialized control of the mobile base actuators, lift actuator, and telescoping arm actuator to perform many useful tasks in human environments. For example, when reaching to a location, a useful strategy is to first position the base, then raise or lower the arm so that the arm points at the target location, and then extend the arm until the manipulator reaches the location. Each of these motions can be facilitated by cameras and refined with further motions.
- Safe and stable reaching to locations due to a lightweight telescoping arm that is long and has a small cross section to reach through clutter; a statically stable base with a small footprint to navigate to locations in human environments; and a low center of mass resulting from a relatively large mass in the base from components such as the batteries and wheel actuators and a lightweight structure above the base due to the lightweight lift, arm, and manipulator.
- Versatility through specialized manipulators and tools that can be manually or automatically attached to the end of arm. These manipulators and tools may include simple or complex manipulators and tools—for example, simple unactuated tools like a hook, complex actuated mechanisms like a pitch-roll wrist, and/or actuated tools like a vacuum. Other examples of manipulators and tools include: a two actuator wrist with a compliant gripper; a gripper at a fixed orientation; hooks for opening/closing drawers and doors, operating light switches, and operating elevator buttons; a spoon for feeding; an adjustable mobile phone/tablet holder; a dustpan like gripper to pick things up from the ground; a vacuum cleaner brush; a small brush tool and dustpan attached to the arm to sweep debris off surfaces; and tools to leave marks, such as a marker holder with a spring to write on surfaces.
- View of the world from a human perspective without a complex robot head by using cameras with wide fields of view mounted at human eye level.
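The independent, serialized control strategy noted above (first position the base, then raise or lower the arm so it points at the target, then extend until the manipulator arrives) can be sketched as follows. The class and method names here (MobileBase, Lift, TelescopingArm, reach) are hypothetical illustrations, not an API from this disclosure.

```python
class MobileBase:
    """Wheeled base; only its position along the y-axis matters here."""
    def __init__(self):
        self.y = 0.0
    def translate(self, dy):
        self.y += dy

class Lift:
    """Vertical lift carrying the arm along the z-axis."""
    def __init__(self):
        self.z = 0.0
    def move_to(self, z):
        self.z = z

class TelescopingArm:
    """Telescoping arm extending along the x-axis."""
    def __init__(self):
        self.extension = 0.0
    def extend_to(self, x):
        self.extension = x

def reach(base, lift, arm, target_y, target_z, target_x):
    """Reach a target by commanding each actuator in sequence;
    no coordinated multi-joint motion is required because the
    axes are orthogonal and match the environment's structure."""
    base.translate(target_y - base.y)   # step 1: position the base
    lift.move_to(target_z)              # step 2: point the arm at the target height
    arm.extend_to(target_x)             # step 3: extend until the manipulator arrives
    return (base.y, lift.z, arm.extension)
```

Each step can be checked in camera views and refined with further small motions before the next step begins.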
The disclosed robot design uses a single long, narrow, lightweight telescoping mechanism to reach locations around a person's body. In conjunction with the robot's wheeled base and vertical lift, the robot can directly move objects in orthogonal directions matched to the planar structure of human environments, simplifying control of the robot. This design reduces the robot's overall weight, number of actuators, and actuator requirements. In addition the robot's motions are easier for people to understand and direct.
In some embodiments, this robot design uses four motors: two for the base wheels, one for a long, narrow, lightweight telescoping mechanism that extends horizontally and serves as the robot's arm, and one for a vertical lift that raises and lowers the robot's telescoping arm. The robot's arm extends and retracts forward and backward, the lift moves up and down, and the wheels move the robot sideways or rotate the robot. These four motors provide four-degree-of-freedom control of the end of the arm (X, Y, Z, and theta). Fewer than four motors can be used with a clutched transmission, and more than four motors can be used with a redundant mechanism. An actuated manipulator or a simple tool, such as a hook or duster, can be attached to the end of the arm. For example, the arm moves a manipulator toward and away from a base mounting pole along an x-axis; the arm is moved up and down along the base mounting pole relative to the floor along a z-axis; and the base mounting pole that protrudes vertically from the base is moved perpendicular to the arm along a y-axis. The base is also enabled to rotate around the axis of the base mounting pole. With these motions, the end of the robot's arm can move in four directions using the telescoping arm, the lift, and the wheels.
The use of a long, narrow, lightweight, telescoping mechanism allows the robot to stably reach long distances relative to the footprint size of its wheeled mobile base, and to do so with a relatively small motor. By remaining horizontal, the motion of the arm does not work against gravity when used on the flat ground of indoor human environments. The long reach is especially useful in cluttered human environments, such as when assisting a person by moving an object close to his or her body. A key insight is that a long, thin telescoping arm with Cartesian positioning and a single axis of rotation can reach most locations in human environments and thereby do useful things. In some embodiments, the telescoping arm has a cross section smaller than or similar in size to a cross section of a human arm and a length that is longer than or similar in length to an outstretched human arm. In some embodiments, the telescoping arm is constructed from a series of nested structural elements and sliding elements such as bushings or bearings and is an exoskeleton comprising a hollow structure that provides structural support and also serves as the exterior of the arm.
The drive for the telescoping mechanism uses a large, somewhat stiff cable with electrically conducting wires to push and pull the end of the telescoping mechanism while extending and retracting. The telescoping mechanism can be moved by placing the cable between a high-friction pulley wheel attached to a motor and a passive pulley wheel with both wheels compressed together with springs to maintain contact with the cable. The conducting wires carry power and signals to and from end effectors at the end of the telescoping mechanism. The end effectors can be changed, and the base of the robot can have a tool holder for enabling the robot to automatically change the end effectors. In various embodiments, the end effectors include one or more of the following: an actuated grabber device, a dexterous hand, a robotic wrist, a vacuum, a suction gripper, a dustpan, a duster, a wiping element, a scraper, a hook, a rotary tool, a mop, a mobile phone holder, a tablet holder, a brush, a writing instrument, an eating utensil, a cloth, or any other appropriate tool.
The disclosed mobile manipulator is an improved system because it provides low cost, effective assistance with an intuitive control mechanism as a human helper robot for human spaces.
Vertical lift 202 comprises a vertical lift for carrying, raising, and lowering arm 216 (e.g., along the z-axis). In some embodiments, vertical lift 202 comprises a fixed mast (e.g., a vertical element of fixed dimensions) coupled to a linear actuator (e.g., an actuator for moving an element along a linear path), and arm 216 is coupled to the moving element of the linear actuator. In various embodiments, the linear actuator comprises a belt drive, a lead screw, a ball screw, a linear actuator including a prismatic joint, etc. In some embodiments, vertical lift 202 comprises a telescoping mast (e.g., a vertical element capable of extending or contracting in length), and arm 216 is coupled to a fixed point on the telescoping mast. In the example shown, coupling 212 comprises an actuated coupling for coupling arm 216 to vertical lift 202. A linear actuator included in vertical lift 202 drives coupling 212 and arm 216 up and down.
Camera mounting structure 208 is attached to the top of vertical lift 202 and holds camera 210. Camera 210 comprises a camera for viewing the environment around mobile manipulation device 200. For example, camera 210 comprises a camera facing the ground. When mounted high above the ground, camera 210 can see the tops of surfaces in human environments, like countertops, tables, desks, and manipulable objects in places people commonly place them. In some embodiments, camera 210 comprises a fisheye lens. When near human eye height with a fisheye lens, camera 210 provides a view of the world comparable to a standing human. Camera mounting structure 208 for camera 210 can increase visibility of the surrounding environment by reducing occlusions. Camera mounting structure 208 places the camera away from the mast to reduce occlusion from the mast. In some embodiments, camera mounting structure 208 arcs behind the camera to reduce occlusion, and its structure can use components with thin cross sections oriented parallel to rays emanating from the optical axis of the camera to reduce occlusion. Mobile manipulation device 200 additionally comprises camera 214 mounted on arm 216 so that it can view the arm and the manipulator. Camera 214 is mounted on arm 216 near vertical lift 202 and is facing gripper 222. Camera 214 comprises a camera for viewing the area where gripper 222 is operating. In some embodiments, camera 214 comprises a fisheye lens.
Arm 216 comprises an arm extending horizontally from vertical lift 202 (e.g., along the x-axis). In the example shown, arm 216 has a square cross-section. In some embodiments, arm 216 has a round cross-section. Other cross sections can be used—both nesting and non-nesting. In addition, keyed cross sections can be used that prevent rotation of the telescoping tubes relative to one another. Arm 216 is telescoping—for example, capable of a telescoping action for moving gripper 222 towards or away from vertical lift 202. When the wheels of base 204 are driven in common mode such that mobile manipulation device 200 moves in a straight line, mobile manipulation device 200 moves perpendicular to the direction of arm 216. Arm 216 comprises hinge 218—for example, an actuated hinge for bending. In the example shown, when hinge 218 bends, gripper 222 moves toward the ground. Arm 216 additionally comprises camera 220. Camera 220 is mounted on arm 216 near gripper 222. As gripper 222 moves (e.g., due to the motion of hinge 218 or an actuated rotational coupling), camera 220 moves with it, holding gripper 222 static in its frame. Camera 220 allows a close view of the actions of gripper 222. In some embodiments, camera 220 comprises a fisheye lens. Gripper 222 is attached to the end of arm 216. In some embodiments, a different manipulator is attached to the end of arm 216 (e.g., a manipulator for interacting with other objects). For example, the manipulator comprises a grabber, an interchangeable tool holder, a vacuum, or a mop. Mobile manipulation device 200 may include a set of interchangeable tools that can be attached to an interchangeable tool holder manipulator. For example, a set of interchangeable tools can be mounted on base 204 and reached by retracting arm 216, folding hinge 218, and lowering arm 216 using vertical lift 202.
The interchangeable tool holder attaches to a variety of tools, including, but not limited to, a grabber, a dexterous hand, a vacuum, a suction gripper, a dustpan, a duster, a wiping element, a scraper, a hook, a rotary tool, a mop, a mobile phone holder, a tablet holder, a brush, a writing instrument, an eating utensil, and a cloth.
Arm 216 additionally comprises actuated rotational coupling 224 for rotating. Rotational coupling 224 makes grippers and other tools more versatile. For example, it can be used to turn a door knob that a gripper is holding. When hinge 218 makes the gripper point towards the ground, rotational coupling 224 can orient a gripper to pick up an elongated object sitting on a flat surface from above. Rotational coupling 224 can also enable a hook to be oriented to hook onto both vertical and horizontal drawer handles. Angle sensors on rotational coupling 224 can be used to rotate video from camera 220 to make it look as though the camera was not being rotated by rotational coupling 224 and simplify remote operation by a human.
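The video derotation described above, using the angle sensor on rotational coupling 224 so the camera appears unrotated, amounts to counter-rotating pixel coordinates by the measured roll angle. A minimal geometric sketch, assuming a roll angle in radians and a known image center; a real system would rotate whole frames with an image library rather than pixel by pixel.

```python
import math

def derotate_pixel(u, v, roll_rad, cx, cy):
    """Map a pixel (u, v) of the wrist-camera image to its position in
    a display counter-rotated by the measured roll angle, so the video
    appears not to rotate with the gripper.  (cx, cy) is the image
    center; all names here are illustrative."""
    du, dv = u - cx, v - cy
    c, s = math.cos(-roll_rad), math.sin(-roll_rad)
    return (cx + c * du - s * dv, cy + s * du + c * dv)
```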
Camera 210, camera 214, and camera 220 provide video data to an interface and a processor disposed in base 204 of mobile manipulator 200. The processor provides the video data to a mobile manipulation device control system. The mobile manipulation device control system may be remotely located from the robot, allowing for remote teleoperation or autonomous control. The mobile manipulation device control system provides a user one or more views from the video data and receives motion commands from a user. The motion commands are provided to mobile manipulator 200. An interface of mobile manipulator 200 (e.g., a communication interface) receives the commands and provides them to a processor of mobile manipulator 200, where they are translated into specific motion commands for actuators of mobile manipulator 200 (e.g., fine or coarse motions for each of base 204, vertical lift 202, arm 216, folding hinge 218, and manipulator 222). In some embodiments, light rings or infrared sources are placed around some or all of the cameras to enable operating in dark rooms and other dark areas like the interior of a container.
In some embodiments, a control system for a mobile manipulation device comprises an interface that enables a remote operator to drive the mobile manipulation device and manipulate objects. The interface is a video-centric interface. The operator directly clicks on or touches one or more video streams to make the mobile manipulation device move. For example, clicking on or touching various parts of the video stream from the top camera makes the mobile manipulation device's wheeled base move. Clicking on or touching various parts of the video stream from the arm camera makes the arm move up and down and extend and retract. Clicking on or touching various parts of the video stream from the wrist camera makes the gripper open and close, bend down, straighten up, roll right, and roll left.
The regions are positioned with respect to the mobile manipulation device to make the mapping between a click or touch and mobile manipulation device motion intuitive. For example, clicking on or touching the video above the arm makes it move up, and clicking or touching in front of the arm makes it extend. The interface also provides feedback on the grip force and motor torques by making relevant regions of the video stream turn red, with darker, less transparent red meaning higher force or torque. The operator can use this to better infer what is happening. The red coloring is displayed in the region that, when clicked or touched, is likely to increase the torque or grip force. Each region of a video stream that corresponds with a command can have a distinctive cursor icon associated with it, as well as a descriptive text tooltip that appears when the cursor is held in the region for an extended duration. All commands are executed with a single click or touch, which results in one of the robot's joints moving a predefined distance/angle or executing a predefined autonomous motion of a short and bounded time. Some clickable or touchable regions in the video produce motion over greater or smaller distances/angles. This is similar to a ‘jog mode’ on a Computer Numerical Control (CNC) machine, where coarse and fine motion of individual axes can be attained using a push button interface. The short bounded time of the resulting motion in ‘jog mode’ allows the robot operator to safely test small motions, observe in the video how they change the state of the world, and then adjust the subsequent set of jog commands. It also allows the robot operator to remove their attention from the robot operation at any time without having to consider returning to a robot that is in an unsafe state.
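A sketch of one possible click-region-to-jog-command mapping and the red force-feedback overlay follows. The region boundaries, step sizes, and function names are illustrative assumptions rather than values from this disclosure.

```python
def region_to_jog_command(u, v, width, height):
    """Map a click at pixel (u, v) in the arm-camera view to a bounded
    jog command: clicking above the arm moves it up, below moves it
    down, in front extends it, behind retracts it.  Each command is a
    single fixed step, as in a CNC jog mode."""
    if v < height * 0.33:
        return ("lift", +0.02)      # small fixed upward step (meters; assumed)
    if v > height * 0.66:
        return ("lift", -0.02)      # small fixed downward step
    if u > width * 0.66:
        return ("arm", +0.02)       # extend
    if u < width * 0.33:
        return ("arm", -0.02)       # retract
    return None                     # dead zone: no motion

def force_overlay_alpha(torque, max_torque):
    """Opacity of the red feedback overlay: darker, less transparent
    red for higher torque or grip force, clamped to [0, 1]."""
    return max(0.0, min(1.0, torque / max_torque))
```

Because every command is a single bounded step, the operator can test a motion, watch its effect in the video, and adjust, without risk from an unattended continuous motion.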
In some embodiments, the interface rotates the gripper video to make it appear as though the camera is always in the same orientation with respect to gravity (i.e., the ceiling is always at the top of the image and the floor is always at the bottom) in spite of the camera rolling with the gripper. In some embodiments, there is also a microphone mounted to the gripper that provides audio feedback to the operator about objects being manipulated and helps the operator hear what the mobile manipulation device is doing, providing better situational awareness.
In some embodiments of the interface, the interface comprises multiple user selectable operator interface modes that map to primary usages of the robot. In some embodiments, a navigation mode, a manipulation mode, a grasp object from the top mode, and a grasp object from the side mode are provided. Each user interface mode presents one or more camera video streams that are zoomed, cropped, translated, and otherwise modified to present a very intuitive mapping between what the operator sees, what the operator clicks on or touches, and what the robot does to accomplish its task. By changing which videos are displayed, how the videos are processed, the mapping from locations on the videos to robot actuator commands, the computer graphics feedback provided to the user, and other aspects of the interface, a user selected interface mode can make a task more intuitive to perform with the robot.
The navigation mode makes driving the robot through an environment more intuitive. The navigation mode shows video from the top camera rotated such that the top of the video display corresponds with the direction of forward motion of the mobile manipulation device (i.e., the y axis is vertically aligned with the video display). Clicking on or touching locations in the top middle part of the display results in forward motion of the robot along the y axis, and clicking on or touching locations near the bottom middle of the display results in backward motion of the robot along the y axis. Clicking on or touching locations on the sides of the display results in rotation of the robot around the theta axis. In some embodiments, the navigation mode simultaneously shows video from the arm camera to help the user avoid hitting the arm into something while driving the mobile manipulation device.
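The navigation-mode click mapping (top middle forward, bottom middle backward, sides rotate) might be sketched as below; the thresholds and sign conventions are assumptions for illustration.

```python
def navigation_click(u, v, width, height):
    """Map a click at pixel (u, v) in the navigation-mode view to a
    base command.  Side regions rotate around the theta axis; middle
    regions drive forward or backward along the y axis."""
    if u < width * 0.25:
        return ("rotate", +1)       # rotate one direction (sign assumed)
    if u > width * 0.75:
        return ("rotate", -1)       # rotate the other direction
    if v < height * 0.5:
        return ("drive", +1)        # forward along y (top of display)
    return ("drive", -1)            # backward along y (bottom of display)
```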
The manipulation mode makes manipulating objects with the robot more intuitive. The manipulation mode shows video from the top camera rotated such that the top of the video display is the direction in which the arm extends (i.e., x axis is vertically aligned with the video display).
The grasp object from the top mode makes grasping objects from above more intuitive. The grasp object from the top mode is used when the gripper is pointing down toward the ground (i.e., perpendicular to the arm). The grasp object from the top mode displays video from the wrist camera. Clicking on or touching the video display results in the gripper moving across a flat horizontal plane in order to position the gripper above an object to be grasped. This is achieved by sending commands to the base (y axis) and the arm (x axis). Clicking on or touching the top or bottom of the video display results in motion of the arm (x axis). Clicking on or touching the sides of the video display results in motion of the base (y axis). Once positioned above an object, the gripper can be rotated and the gripper can be lowered (z axis) and then closed to grasp the object.
The grasp object from the side mode makes grasping objects from the side more intuitive. The grasp object from the side mode is used when the gripper is pointing straight out (i.e., parallel to the arm). The grasp object from the side mode displays video from the wrist camera. Clicking on or touching the video display results in the gripper moving across a flat vertical plane in order to position the gripper in front of an object to be grasped. This is achieved by sending commands to the base (y axis) and the lift (z axis). Clicking on or touching the top or bottom of the video display results in motion of the lift (z axis). Clicking on or touching the sides of the video display results in motion of the base (y axis). Once positioned in front of an object, the arm can be extended (x axis) and the gripper closed to grasp the object.
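The two grasp modes share the same click-to-axis structure, differing only in whether vertical clicks drive the arm (grasp from the top) or the lift (grasp from the side). A hedged sketch, with illustrative region boundaries and return values:

```python
def grasp_mode_click(u, v, width, height, mode):
    """Map a wrist-camera click to axis commands for the two grasp
    modes.  In 'top' mode the gripper moves across a horizontal plane
    (arm x, base y); in 'side' mode it moves across a vertical plane
    (lift z, base y).  Signs and thresholds are assumptions."""
    commands = []
    if v < height * 0.33:
        commands.append(("arm", +1) if mode == "top" else ("lift", +1))
    elif v > height * 0.66:
        commands.append(("arm", -1) if mode == "top" else ("lift", -1))
    if u < width * 0.33:
        commands.append(("base", -1))
    elif u > width * 0.66:
        commands.append(("base", +1))
    return commands
```

In both modes the remaining axis (lowering in top mode, extending in side mode) is commanded only after the gripper is positioned over or in front of the object.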
By making the control system for the mobile manipulation system intuitive and easy to use, control of the mobile manipulation system becomes available to a wide range of non-specialized users. The control system allows control of all degrees of freedom of the mobile manipulation system with only a mouse and using a display with minimal distractions. In some embodiments, a user can command the system by touching the display rather than using a mouse. The system includes a plurality of cameras allowing the user to see different contexts for mobile manipulator system motions and automatically determines a user interface behavior based on the camera currently being displayed. This control system for a mobile manipulation system significantly broadens the ability of manipulator robots to perform useful work in the home.
Base 300 additionally comprises computer 302. Computer 302 comprises a computer including an interface system, a memory, a processor, data storage, etc. Computer 302 communicates with a mobile manipulation device control system (e.g., mobile manipulation device control system 102 of
In some embodiments, a compliant grabber device comprises two compliant fingers each with a suction cup fingertip. In some embodiments, each compliant finger has a compliant linkage that comprises two strips of material that behave as springs with one strip of material rigidly affixed to a housing and the other strip of material moved via an actuator to bend the finger. In some embodiments, the actuator pulls on a cable attached to the compliant linkages in order to cause the fingers to close and the return force of the springs causes the fingers to open when the cable tension is released. In some embodiments, a linear actuator rigidly attached to the compliant linkages pulls on the compliant linkages to cause the fingers to close and pushes on the compliant linkages to cause the fingers to open. In some embodiments, bend sensors mounted on the fingers provide signals with information about the kinematics and dynamics of the compliant grabber device. In some embodiments, an actuator used to move the fingers provides voltage, current, and kinematic signals with information about the kinematics and dynamics of the compliant grabber device.
Arm end 508 additionally comprises spring 512 and spring 514, spring 516 and spring 518, and grabber 520 and grabber 522. Spring 516 and spring 518 comprise spring elements capable of being actuated by being drawn into the end of arm end 508. In some embodiments, spring 516 and spring 518 are connected to a cable pulled and released by an actuator. In some embodiments, an actuator is rigidly coupled to spring 516 and spring 518 to push and pull them. In the example shown, linear actuator 532 is rigidly coupled to spring 516 and spring 518. When spring 516 and spring 518 are drawn into the end of arm end 508, spring 512 and spring 514 are pulled together, and grabber 520 and grabber 522 are pulled together. Grabber 520 and grabber 522 can pick up an object in this way. When the actuation is reversed, spring 512 and spring 514 return to their default positions or, in the case of a rigid coupling, can be pushed out to open beyond their default positions. When spring 516 and spring 518 are pushed out of the end of arm end 508, spring 512 and spring 514 are pushed apart, and grabber 520 and grabber 522 are pushed apart. Grabber 520 and grabber 522 can reach around a larger object than would fit between them in their default position in this way. Hook 530 attached to the outside of grabber 522 can be used to hook onto drawer handles, light switches, door handles, and other objects and be used in other manipulation tasks.
In various embodiments, hook 530 is rectangular, the protruding end of an L-shaped attachment to the gripper, or any other appropriate shape.
Arm end 508 additionally comprises laser light source 524 and laser light source 526. For example, laser light source 524 and laser light source 526 comprise laser pointers. Laser light source 524 and laser light source 526 are positioned at a slight angle to arm end 508 such that the light from laser light source 524 converges with the light from laser light source 526 between grabber 520 and grabber 522 (e.g., converging on a point indicating where the manipulator will interact when actuated). When the manipulator is being maneuvered into a position to grab an object using grabber 520 and grabber 522, light from laser light source 524 and laser light source 526 reflecting from the object can be used to judge where the object is relative to grabber 520 and grabber 522 (e.g., when viewing images captured by camera 510).
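The converging-laser arrangement lends itself to a simple range estimate: the farther the target surface, the closer together the two reflected spots appear, until they merge at the convergence point. The following is a minimal geometric sketch, assuming two sources separated by a known baseline and each angled inward by the same angle; the function names, parameters, and values are hypothetical, not taken from the device described above.

```python
import math

def spot_separation(baseline_m, inward_angle_rad, distance_m):
    """Separation between the two laser spots on a surface at distance_m.

    Each beam walks inward by distance_m * tan(inward_angle_rad), so the
    spots merge (separation 0) at the convergence distance
    baseline_m / (2 * tan(inward_angle_rad)).
    """
    return baseline_m - 2.0 * distance_m * math.tan(inward_angle_rad)

def distance_from_separation(baseline_m, inward_angle_rad, separation_m):
    """Invert spot_separation: infer surface distance from the observed
    spot spacing (e.g., as measured in a camera image)."""
    return (baseline_m - separation_m) / (2.0 * math.tan(inward_angle_rad))
```

For example, with a 10 cm baseline and beams angled to converge at 0.5 m, a surface at the convergence distance shows a single merged spot, while a closer surface shows two distinct spots whose spacing encodes the remaining distance.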
Bend sensors (e.g., bend sensor 528 on spring 514) mounted on spring 512 and spring 514 provide information about the state of the gripper. The actuator coupled to spring 516 and spring 518 provides current, voltage, and kinematic information about the state of the gripper. For example, together, this information can be used to detect contact, estimate the applied grip force, estimate the width of the opening of the gripper when grasping an object, and perform other sensing.
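As a rough illustration of how the bend-sensor and actuator signals described above might be combined, the sketch below estimates finger angle, grip force, and contact from a raw bend reading and the motor current. The linear models, constants, and names are illustrative assumptions, not values or methods from the actual device.

```python
def estimate_grip_state(bend_raw, motor_current_amps,
                        bend_gain=0.05, torque_constant=0.02,
                        contact_current_threshold=0.3):
    """Return (finger_angle_rad, grip_force_newtons, in_contact).

    Hypothetical model: a linear map from the raw bend-sensor reading to
    a finger angle, and a current threshold as a contact detector, with
    force taken as roughly proportional to motor current once in contact.
    """
    finger_angle = bend_gain * bend_raw
    in_contact = motor_current_amps > contact_current_threshold
    grip_force = torque_constant * motor_current_amps if in_contact else 0.0
    return finger_angle, grip_force, in_contact
```

In practice such a model would be calibrated per gripper, but it shows how bend and current signals together support the contact detection, grip-force estimation, and opening-width estimation mentioned above.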
Coupler 534 is used to quickly attach or decouple different tools from the end of arm 500.
Arm end 604 additionally comprises springs (e.g., spring 610) and grabbers (e.g., springs and grabbers similar to those of arm end 508, described above).
Arm end 604 additionally comprises laser light source 614 and laser light source 616. For example, laser light source 614 and laser light source 616 comprise laser pointers. Laser light source 614 and laser light source 616 are positioned at a slight angle to arm end 604 such that the light from laser light source 614 converges with the light from laser light source 616 between the grabbers (e.g., converging on a point indicating where the manipulator will interact when actuated). When the manipulator is being maneuvered into a position to grab an object using the grabbers, light from laser light source 614 and laser light source 616 reflecting from the object can be used to judge where the object is relative to the grabbers (e.g., when viewing images captured by camera 606).
Bend sensors (e.g., bend sensor 612 on spring 610) mounted on the springs provide information about the state of the gripper. The actuator coupled to the springs provides current, voltage, and kinematic information about the state of the gripper. For example, together, this information can be used to detect contact, estimate the applied grip force, estimate the width of the opening of the gripper when grasping an object, and perform other sensing.
Coupler 602 is used to quickly attach or decouple different tools from the end of arm 600.
In some embodiments, the telescoping arm has a cross section smaller than or similar in size to a cross section of a human arm. The small cross section, long reach, and low mass of the telescoping arm are important features, since they enable the arm to stably reach places in human environments. In some embodiments, the telescoping arm has a length that is longer than or similar in length to an outstretched human arm. In some embodiments, the telescoping arm is constructed from a series of nested structural elements and sliding elements such as bushings or bearings. In some embodiments, the telescoping arm is an exoskeleton comprising a hollow structure that provides structural support and also serves as the exterior of the arm. In some embodiments, the telescoping mechanism is driven by an element that contains power and signal conductors. In some embodiments, the power and signal conductors in the cable used to extend or retract the arm are used by the manipulator or tool attached to the end of the telescoping arm.
The overlay comprising a plurality of command icons additionally comprises coarse overlay command icon 1006 and fine overlay command icon 1008. Coarse overlay command icon 1006 and fine overlay command icon 1008 are associated with a motion in a y axis of mobile manipulation device 1000, wherein the y axis is associated with a horizontal movement of a base of mobile manipulation device 1000. In the example shown, coarse overlay command icon 1006 and fine overlay command icon 1008 are associated with a rightward motion in the y axis, and the overlay additionally comprises a coarse overlay command icon 1020 and a fine overlay command icon 1018 associated with a leftward motion in the y axis. Coarse overlay command icon 1006 and fine overlay command icon 1008 are associated with different motion sizes, for example, coarse overlay command icon 1006 is associated with a coarse motion (e.g., a movement of the base 10 cm to the right) and fine overlay command icon 1008 is associated with a jog motion (e.g., a movement of the base 1 cm to the right). Coarse overlay command icon 1020 and fine overlay command icon 1018 are associated with different motion sizes, for example, coarse overlay command icon 1020 is associated with a coarse motion (e.g., a movement of the base 10 cm to the left) and fine overlay command icon 1018 is associated with a jog motion (e.g., a movement of the base 1 cm to the left).
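The coarse/fine icon scheme described above amounts to a small lookup from an activated icon to an axis and a step size. A hypothetical sketch follows; the icon identifiers reuse the figure's reference numbers for readability, and the axes and step sizes are assumptions based on the examples in the text.

```python
# Map each overlay command icon to (axis, signed step in meters).
# Positive y is taken as rightward base motion; these pairings are
# illustrative, not the device's actual command table.
COMMAND_TABLE = {
    1006: ("y", +0.10),  # coarse: move base 10 cm right
    1008: ("y", +0.01),  # fine (jog): move base 1 cm right
    1020: ("y", -0.10),  # coarse: move base 10 cm left
    1018: ("y", -0.01),  # fine (jog): move base 1 cm left
}

def icon_to_motion(icon_id):
    """Translate an activated overlay icon into an axis/step command that
    could be provided to the mobile manipulation device."""
    axis, delta_m = COMMAND_TABLE[icon_id]
    return {"axis": axis, "delta": delta_m}
```

The same table shape extends naturally to the x, z, theta, pitch, roll, and open/close commands described in the other figures.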
The overlay comprising a plurality of command icons additionally comprises arm camera switch indication button 1010 and wrist camera switch indication button 1012. In the event an indication to arm camera switch indication button 1010 is received (e.g., by the mobile manipulation device control system) a user interface including a camera image from an arm camera is shown. In the event an indication to wrist camera switch indication button 1012 is received (e.g., by the mobile manipulation device control system) a user interface including a camera image from a wrist camera is shown. In some embodiments, the interface displays more than one video stream at the same time (e.g., an interface that shows video from the top camera, arm camera, and wrist camera at the same time), each with its own overlay.
In some embodiments, the overlay comprising a plurality of command icons additionally comprises a torque overlay for displaying an actuator torque. For example, a command icon is colored (e.g., colored red) when an associated actuator is delivering a torque above a threshold value, a display region is colored when an associated actuator is delivering a torque above a threshold value, the overlay comprises one or more torque meter displays, etc. The visibility of the color can be proportional to the torque, such as becoming less transparent with higher torques.
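One way the torque-proportional transparency mentioned above could be computed is a simple linear mapping from torque to overlay opacity. The threshold and saturation values below are illustrative assumptions.

```python
def torque_overlay_alpha(torque_nm, threshold_nm=1.0, max_nm=5.0):
    """Map an actuator torque to an overlay opacity in [0, 1].

    Below the threshold the colored overlay is fully transparent; above
    it, the tint becomes linearly less transparent with torque, saturating
    at full opacity at max_nm. Both bounds are hypothetical values.
    """
    if torque_nm <= threshold_nm:
        return 0.0
    alpha = (torque_nm - threshold_nm) / (max_nm - threshold_nm)
    return min(alpha, 1.0)
```

The resulting alpha could tint a command icon or a display region red, giving the operator an at-a-glance sense of how hard the associated actuator is working.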
The overlay includes a plurality of command icons—for example, coarse vertical down overlay command icon 1314 and fine vertical down overlay command icon 1316. Coarse vertical down overlay command icon 1314 and fine vertical down overlay command icon 1316 are associated with a motion in a z axis of a mobile manipulation device, wherein the z axis is associated with a vertical movement of a vertical lift of the mobile manipulation device. In the example shown, coarse vertical down overlay command icon 1314 and fine vertical down overlay command icon 1316 are associated with a downwards motion in the z axis. Coarse vertical down overlay command icon 1314 and fine vertical down overlay command icon 1316 are associated with different motion sizes—for example, coarse vertical down overlay command icon 1314 is associated with a coarse motion (e.g., 10 cm down) and fine vertical down overlay command icon 1316 is associated with a jog motion (e.g., 1 cm down).
The overlay comprising a plurality of command icons additionally comprises coarse retracting overlay command icon 1306 and fine retracting overlay command icon 1308. Coarse retracting overlay command icon 1306 and fine retracting overlay command icon 1308 are associated with a motion in an x axis of the mobile manipulation device, wherein the x axis is associated with a retraction movement of an arm of the mobile manipulation device. In the example shown, coarse retracting overlay command icon 1306 and fine retracting overlay command icon 1308 are associated with a retracting motion in the x axis. Coarse retracting overlay command icon 1306 and fine retracting overlay command icon 1308 are associated with different motion sizes—for example, coarse retracting overlay command icon 1306 is associated with a coarse retracting motion (e.g., 10 cm) and fine retracting overlay command icon 1308 is associated with a jog motion (e.g., 1 cm).
The overlay comprising a plurality of command icons additionally comprises coarse extending overlay command icon 1318 and fine extending overlay command icon 1320. Coarse extending overlay command icon 1318 and fine extending overlay command icon 1320 are associated with a motion in an x axis of the mobile manipulation device, wherein the x axis is associated with an extension movement of an arm of the mobile manipulation device. In the example shown, coarse extending overlay command icon 1318 and fine extending overlay command icon 1320 are associated with an extending motion in the x axis. Coarse extending overlay command icon 1318 and fine extending overlay command icon 1320 are associated with different motion sizes—for example, coarse extending overlay command icon 1318 is associated with a coarse extension motion (e.g., 10 cm) and fine extending overlay command icon 1320 is associated with a jog motion (e.g., 1 cm).
The overlay comprising a plurality of command icons additionally comprises top camera switch indication button 1310 and wrist camera switch indication button 1312. In response to an indication to top camera switch indication button 1310 being received (e.g., by the mobile manipulation device control system), a user interface including a camera image from a top camera is shown. In response to an indication to wrist camera switch indication button 1312 being received (e.g., by the mobile manipulation device control system), a user interface including a camera image from a wrist camera is shown.
The overlay comprising a plurality of command icons includes coarse manipulator clockwise rotation overlay command icon 1418 and fine manipulator clockwise rotation overlay command icon 1420. Coarse manipulator clockwise rotation overlay command icon 1418 and fine manipulator clockwise rotation overlay command icon 1420 are associated with a motion in a roll axis of the mobile manipulation device, wherein the roll axis is associated with a rotation of a wrist rotational coupling of the mobile manipulation device. Coarse manipulator clockwise rotation overlay command icon 1418 and fine manipulator clockwise rotation overlay command icon 1420 are associated with different motion sizes—for example, coarse manipulator clockwise rotation overlay command icon 1418 is associated with a coarse motion (e.g., 10 degrees) and fine manipulator clockwise rotation overlay command icon 1420 is associated with a jog motion (e.g., 1 degree).
The overlay comprising a plurality of command icons additionally comprises coarse open overlay command icon 1406 and fine open overlay command icon 1408. Coarse open overlay command icon 1406 and fine open overlay command icon 1408 are associated with an open/close motion of the mobile manipulation device, wherein the open/close motion is associated with an opening and closing of manipulator 1400 of the mobile manipulation device. In the example shown, coarse open overlay command icon 1406 and fine open overlay command icon 1408 are associated with an opening motion. Coarse open overlay command icon 1406 and fine open overlay command icon 1408 are associated with different motion sizes—for example, coarse open overlay command icon 1406 is associated with a coarse motion and fine open overlay command icon 1408 is associated with a jog motion.
The overlay comprising a plurality of command icons additionally comprises coarse closed overlay command icon 1422 and fine closed overlay command icon 1424. Coarse closed overlay command icon 1422 and fine closed overlay command icon 1424 are associated with an open/close motion of the mobile manipulation device, wherein the open/close motion is associated with an opening and closing of manipulator 1400 of the mobile manipulation device. In the example shown, coarse closed overlay command icon 1422 and fine closed overlay command icon 1424 are associated with a closing motion. Coarse closed overlay command icon 1422 and fine closed overlay command icon 1424 are associated with different motion sizes—for example, coarse closed overlay command icon 1422 is associated with a coarse motion and fine closed overlay command icon 1424 is associated with a jog motion.
The overlay comprising a plurality of command icons additionally comprises coarse vertical up overlay command icon 1410 and fine vertical up overlay command icon 1412. Coarse vertical up overlay command icon 1410 and fine vertical up overlay command icon 1412 are associated with a motion in a pitch axis of the mobile manipulation device, wherein the pitch axis is associated with a bending of a wrist hinge of the mobile manipulation device. In the example shown, coarse vertical up overlay command icon 1410 and fine vertical up overlay command icon 1412 are associated with an upwards motion in the pitch axis. Coarse vertical up overlay command icon 1410 and fine vertical up overlay command icon 1412 are associated with different motion sizes—for example, coarse vertical up overlay command icon 1410 is associated with a coarse motion (e.g., 10 cm) and fine vertical up overlay command icon 1412 is associated with a jog motion (e.g., 1 cm).
The overlay comprising a plurality of command icons additionally comprises coarse vertical down overlay command icon 1426 and fine vertical down overlay command icon 1428. Coarse vertical down overlay command icon 1426 and fine vertical down overlay command icon 1428 are associated with a motion in a pitch axis of the mobile manipulation device, wherein the pitch axis is associated with a bending of a wrist hinge of the mobile manipulation device. In the example shown, coarse vertical down overlay command icon 1426 and fine vertical down overlay command icon 1428 are associated with a downwards motion in the pitch axis. Coarse vertical down overlay command icon 1426 and fine vertical down overlay command icon 1428 are associated with different motion sizes—for example, coarse vertical down overlay command icon 1426 is associated with a coarse motion (e.g., 10 cm down) and fine vertical down overlay command icon 1428 is associated with a jog motion (e.g., 1 cm down).
The overlay comprising a plurality of command icons additionally comprises arm camera switch indication button 1414 and top camera switch indication button 1416. In the event an indication to arm camera switch indication button 1414 is received (e.g., by the mobile manipulation device control system) a user interface including a camera image from an arm camera is shown. In the event an indication to top camera switch indication button 1416 is received (e.g., by the mobile manipulation device control system) a user interface including a camera image from a top camera is shown.
Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not limited to the details provided. There are many alternative ways of implementing the invention. The disclosed embodiments are illustrative and not restrictive.
Claims
1. A display window control system for a mobile manipulation device, comprising:
- an interface configured to: receive an image from a mobile manipulation device; and
- a processor configured to: determine a display output to display the image in a display window; determine an overlay output to display one or more overlay command icons; determine that an indication has been received to activate an overlay command; and provide the indication to the mobile manipulation device.
2. The system of claim 1, wherein the image is received from an overhead camera of the mobile manipulation device, an arm camera of the mobile manipulation device, or a wrist camera of the mobile manipulation device.
3. The system of claim 1, wherein the one or more overlay command icons comprise a coarse overlay command icon and a fine overlay command icon.
4. The system of claim 1, wherein the overlay output is determined based at least in part on a camera associated with the image.
5. The system of claim 3, wherein the coarse overlay command icon is associated with a coarse jog motion, and wherein the fine overlay command icon is associated with a fine jog motion.
6. The system of claim 1, wherein the processor is further configured to determine that a switch indication has been received to switch the display output to display a second image associated with an indicated camera of the mobile manipulation device.
7. The system of claim 1, wherein an axis associated with the overlay command icon is based at least in part on a camera associated with the display output.
8. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with a motion in a y axis of the mobile manipulation device, wherein the y axis is associated with a horizontal movement of a base of the mobile manipulation device.
9. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with a motion in a theta axis of the mobile manipulation device, wherein the theta axis is associated with a rotational movement of a base of the mobile manipulation device.
10. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with a motion in a z axis of the mobile manipulation device, wherein the z axis is associated with a vertical movement of a vertical lift of the mobile manipulation device.
11. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with a motion in a y axis of the mobile manipulation device, wherein the y axis is associated with an extension movement of an arm of the mobile manipulation device.
12. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with a motion in a pitch axis of the mobile manipulation device, wherein the pitch axis is associated with a bending of a wrist hinge of the mobile manipulation device.
13. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with a motion in a roll axis of the mobile manipulation device, wherein the roll axis is associated with a rotation of a wrist rotational coupling of the mobile manipulation device.
14. The system of claim 3, wherein the coarse overlay command icon or the fine overlay command icon are associated with an open/close motion of a device gripper.
15. The system of claim 1, wherein the overlay output additionally comprises a torque overlay for displaying an actuator torque.
16. A remote control interface system for a mobile manipulation device, comprising:
- an interface configured to: provide an image from a mobile manipulation device camera; and receive an indication to move; and
- a processor configured to: determine actuator commands based at least in part on the indication to move; and provide the actuator commands to one or more actuators.
17. The system of claim 16, wherein the indication to move is associated with a command motion type.
18. The system of claim 17, wherein the command motion type comprises a coarse motion, a fine jog motion, or an autonomous motion that concludes within a bounded amount of time.
19. The system of claim 16, wherein the indication to move is associated with a command axis, wherein the command axis comprises an x axis, a theta axis, a z axis, a y axis, a pitch axis, a roll axis, or an open/close motion.
Type: Application
Filed: Mar 16, 2018
Publication Date: Sep 20, 2018
Inventors: Charles Clark Kemp (Atlanta, GA), Henry Mandus Clever (Atlanta, GA)
Application Number: 15/924,088