INTELLIGENT BED ROBOT EQUIPPED WITH PRESSURE SENSOR-PROVIDED MATTRESS AND GRIPPER-PROVIDED SUPPORTING ROBOT ARM

Disclosed herein is an intelligent bed robot. The intelligent bed robot includes a pressure sensor-provided mattress, an intelligent robot arm, transfer rails, and at least one gripper. The pressure sensor-provided mattress monitors the position, posture and motion of a user on a bed in real time, and assists the user. The intelligent robot arm includes vertical bars disposed on two opposite sides of the bed and configured to have a predetermined length, a horizontal bar configured to connect the vertical bars to each other, and torque sensors disposed at the lower ends of the vertical bars, and measures the horizontal and vertical forces applied by the user. The transfer rails guide the intelligent robot arm along a path of movement. The gripper is coupled to the horizontal bar of the intelligent robot arm, and is provided with a finger unit capable of picking up an object.

Description
BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates generally to a bed robot that is equipped with a pressure sensor-provided mattress and a supporting robot arm, and, more particularly, to an intelligent bed robot that is equipped with one or more grippers to perform the work of picking up objects.

2. Description of the Related Art

With the extension of the average life expectancy of people today and the development of medical technology, the elderly population is steadily increasing throughout the world. Accordingly, there is an urgent need for a system for assisting the elderly population in their activities.

Electric beds, targeting the elderly, the weak and the disabled, have a function that enables the heights and angles of their back and leg sections to be freely adjusted. However, such electric beds provide only the most basic functions.

The present applicant proposed ‘Intelligent Bed Robot equipped with Pressure Sensor-provided Mattress and Supporting Robot Arm,’ for which a patent application was submitted on Oct. 15, 2004 and Korean Patent No. 10-0605750 was granted. With this, the applicant attempted to provide a bed robot that can provide more familiar and convenient services to users.

FIG. 1 shows a patented intelligent bed robot equipped with a pressure sensor-provided mattress and a supporting robot arm.

Referring to FIG. 1, the robot 100 includes a pressure sensor-provided mattress 104, a supporting robot arm 102, a shelf, and transfer rails 108. The pressure sensor-provided mattress 104 assists a person on a bed while monitoring the location, posture and motion of the person on the bed in real time. The supporting robot arm 102 includes two right and left motors and two torque sensors, and functions to support the person on the bed based on the monitoring information obtained from the mattress. The shelf is attached to the supporting robot arm for use. When not in use, the shelf is moved to and hidden in the space under the intelligent bed robot. The transfer rails 108 function to guide the robot arm along a path of movement so as to provide service to the person on the bed.

The intelligent bed robot according to the patented invention can provide services, such as the support of the upper body and the carriage of the meal shelf. However, there is a need for the provision of an intelligent bed robot that is capable of performing gripping work, such as the folding of bedclothes, in addition to existing services.

SUMMARY OF THE INVENTION

Accordingly, the present invention has been made keeping in mind the above problems occurring in the prior art, and an object of the present invention is to provide an intelligent bed robot in which one or more grippers are attached to a supporting robot arm to enable the work of picking up an object, thereby providing more familiar and convenient services to a user.

In order to accomplish the above object, the present invention provides an intelligent bed robot, including a pressure sensor-provided mattress configured to monitor the position, posture and motion of a user on a bed in real time and to assist the user; an intelligent robot arm configured to comprise vertical bars disposed on two opposite sides of the bed and configured to have a predetermined length, a horizontal bar configured to connect the vertical bars to each other, and torque sensors disposed at the lower ends of the vertical bars, and to measure the horizontal and vertical forces applied by the user; transfer rails configured to guide the intelligent robot arm along a path of movement; and at least one gripper coupled to the horizontal bar of the intelligent robot arm so that it can move horizontally along the horizontal bar and rotate around the horizontal bar, and provided with a finger unit capable of picking up an object.

The intelligent bed robot may further include a control box for reading pressure values of the pressure sensors by sequentially scanning the pressure sensors using a multiplexer, and delivering data about the pressure values to a main computer in a pressure measurement distribution image form.

The robot arm may have two operation modes, including a follow mode, in which the user can control the robot arm using his or her command so as to obtain support means using the robot, and a support mode, in which the user's body is secured and supported using the acquired support means.

The follow mode may conduct work using a fuzzy network.

The horizontal bar may have a central separation element so that a length of the horizontal bar can vary with relative locations of the vertical bars.

The pressure sensor-provided mattress may monitor a person on a bed using a Principal Component Analysis (PCA) algorithm, the PCA algorithm performing a first step of determining a human recognition target pattern and a related pressure sensor data set for pattern classification; a second step of extracting feature points by calculating the covariance of the pressure sensor data; a third step of determining a number of feature points capable of maximizing the classification of information between respective pieces of pattern data; and a fourth step of performing pattern recognition by comparing new incoming pressure sensor data with the predetermined feature points.

The robot arm may work in a command mode of conducting operations in response to the user's command, which includes user's voice, and in an automatic mode of conducting work according to a predetermined work command.

The horizontal bar may have a ball screw disposed therein along a length of the horizontal bar, and the gripper may include a first gear structure provided with threads so that the first gear structure can guide the gripper through linear movement in conjunction with the ball screw; a second gear structure formed of a worm gear that works in conjunction with the first gear structure so that the gripper can rotate around the horizontal bar; and a third gear structure configured such that the finger unit can conduct the work of picking up an object.

The ball screw and the first gear structure may work in conjunction with each other along a longitudinal hole that is formed along the length of the horizontal bar.

The finger unit may have the shape of a human hand, include a stationary first finger and a movable second finger, and perform the work of gripping an object through overlapping of the first and second fingers.

The first finger may be provided with an accommodation part so that the first finger can accommodate the second finger when the first finger and the second finger overlap each other.

The first finger and the second finger may have respective rubber portions that come into contact with an object.

The intelligent bed robot may be controlled using a Controller Area Network (CAN) device.

BRIEF DESCRIPTION OF THE DRAWINGS

The above and other objects, features and advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings, in which:

FIG. 1 is a perspective view of a prior art intelligent bed robot;

FIG. 2 is a view showing a preferred embodiment of an intelligent bed robot according to the present invention;

FIGS. 3A and 3B are a perspective view showing a gripper mounted on the intelligent bed robot according to the present invention, and a view showing the mounting of the gripper on the horizontal bar, respectively;

FIG. 4 is a perspective view showing an example in which grippers are mounted on the robot arm of the intelligent bed robot according to the present invention;

FIGS. 5A and 5B are enlarged photos respectively showing the portion of FIG. 4 in which the gripper is attached to the horizontal bar, and a longitudinal hole which is formed along the horizontal bar;

FIGS. 6A and 6B show torque sensors that are attached to the vertical bar;

FIGS. 7A and 7B show the different locations of the horizontal bar and the vertical bars; and

FIG. 8 is a diagram showing the control system of the intelligent bed robot according to the present invention.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

The construction and operation of the present invention are described in detail with reference to the accompanying drawings below. The same reference numerals, shown in the respective drawings, designate the same elements having the same functions.

FIG. 2 shows a preferred embodiment of an intelligent bed robot 200 according to the present invention. That is, FIG. 2 shows the intelligent bed robot 200 equipped with grippers.

Referring to FIG. 2, the intelligent bed robot 200 according to the present invention includes a mattress 204, an intelligent robot arm 202, 206 and 210, and transfer rails 208. In this case, pressure sensors are attached to the mattress 204 so as to monitor the position, posture and motion of a user 201 on a bed in real time and to assist him or her. The mattress 204 needs to detect the intention of the user 201 when the user 201 lies down on the bed and then sits up, and to assist the user 201 in sitting up using the triple folding structure of the intelligent bed robot.

In the case where the user 201 uses an active guide system, it is necessary to detect the position of the user 201. For this purpose, 14×24 pressure sensors are attached to the top surface of the mattress of the bed. The pressure sensors have advantages over temperature sensors or the like in that they have fast reaction speed and high resolution. In this case, it is preferred that the intelligent bed robot according to the present invention further include a control box, so that the control box can sequentially scan the pressure sensors, attached to the mattress, using a multiplexer, read the values of selected pressure sensors, and deliver pressure value data to a main computer in the form of a pressure measurement distribution image.
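
As a concrete illustration of the scanning scheme described above, the following is a minimal Python sketch of the control-box scan loop for the 14×24 sensor grid. The hardware access functions (select_channel and read_adc) are hypothetical placeholders, since the patent does not specify the multiplexer or analog-to-digital interface.

    import numpy as np

    ROWS, COLS = 14, 24  # pressure-sensor grid attached to the top surface of the mattress

    def select_channel(row: int, col: int) -> None:
        """Hypothetical: drive the multiplexer address lines to select one sensor."""
        pass

    def read_adc() -> int:
        """Hypothetical: read the pressure value of the currently selected sensor."""
        return 0

    def scan_mattress() -> np.ndarray:
        """Sequentially scan all sensors and return one pressure measurement distribution image."""
        frame = np.zeros((ROWS, COLS), dtype=np.uint16)
        for r in range(ROWS):
            for c in range(COLS):
                select_channel(r, c)
                frame[r, c] = read_adc()
        return frame  # this image-like frame is what the control box delivers to the main computer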

Furthermore, since the bed has a triple folding structure, it is preferred that processors be attached to the respective sections so that the hardware is not damaged when the shape of the bed changes. In this case, three sensing processors may be attached to the sections and transmit their data to the main computer.

An algorithm capable of recognizing user information must be applied to the mattress 204. In order to support stochastic real-time signal processing, the Principal Component Analysis (PCA) algorithm is used as the information processing algorithm. The PCA algorithm recognizes sensor information from the bed in the form of images, and simultaneously employs a filtering technique and a stochastic approach, which are commonly used in image processing.

An algorithm with a small computational load is necessary because a lot of noise is included in the input data provided by the sensors and because the trajectory generation, as well as the control of the robot arm, must be processed in real time. The work that must precede the application of the present algorithm is database construction for classifying the incoming bed image input. Accordingly, a database containing information about the positions and postures of the user on the bed must be constructed. Meanwhile, a method of applying the PCA algorithm to the present invention is described below.

First, several primary pressure patterns of the mattress are determined in order to perform pattern classification. That is, a recognition target pattern, such as the lying posture or sitting posture of a human, is determined, and then pressure sensor data are extracted.

Thereafter, feature points are extracted by calculating the covariance of pressure sensor data. The feature points are used as important information that is used to distinguish the patterns of respective postures.

Thereafter, the number of feature points is determined such that the classification success rate among the respective pieces of pattern data reaches a predefined level.

Thereafter, pattern recognition is performed by comparing new incoming pressure sensor data with the predetermined feature points. Through this process, the current posture of the user can be determined, and, using variation in the posture of the user based on recognition results, motion information can be predicted.
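
The four steps above can be illustrated with a short numerical sketch in Python. This is an illustration only, under assumed data: random frames stand in for a real posture database, and a new frame is matched to the nearest class centroid in PCA space, a classifier the patent does not prescribe.

    import numpy as np

    def fit_pca(frames: np.ndarray, n_components: int):
        """frames: (N, 14*24) flattened pressure images collected per posture (step 1)."""
        mean = frames.mean(axis=0)
        centered = frames - mean
        cov = np.cov(centered, rowvar=False)                # step 2: covariance of the sensor data
        eigvals, eigvecs = np.linalg.eigh(cov)
        order = np.argsort(eigvals)[::-1][:n_components]    # step 3: keep the chosen number of feature points
        return mean, eigvecs[:, order]

    def project(frames, mean, components):
        return (frames - mean) @ components

    def classify(frame, mean, components, centroids):
        """Step 4: compare a new incoming frame with the stored feature points."""
        z = project(frame.ravel(), mean, components)
        return min(centroids, key=lambda posture: np.linalg.norm(z - centroids[posture]))

    # Usage with stand-in data (a real database of lying/sitting frames is assumed).
    frames = np.random.rand(20, 14 * 24)
    labels = ["lying"] * 10 + ["sitting"] * 10
    mean, comps = fit_pca(frames, n_components=5)
    centroids = {p: project(frames[[l == p for l in labels]], mean, comps).mean(axis=0)
                 for p in set(labels)}
    print(classify(np.random.rand(14, 24), mean, comps, centroids))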

The transfer rails 208 function to guide the robot arm 202, 206 and 210 through a path so as to provide service to a person on the bed.

The intelligent robot arm 202, 206 and 210 includes vertical bars 202(a) and 202(b) configured to have a predetermined length and placed on two opposite sides of the bed, and a horizontal bar 206 configured to connect the vertical bars 202(a) and 202(b) with each other. Torque sensors are placed at the respective lower ends of the vertical bars 202(a) and 202(b) to measure the horizontal and vertical forces applied by the user (see FIGS. 6A and 6B).

One or more grippers 210(a) and 210(b) move horizontally along the horizontal bar 206, and are mounted on the horizontal bar 206 so that they can rotate around the horizontal bar 206. Furthermore, fingers 212 and 214 are included in the gripper 210 to grip an object. The gripper 210 may move along the horizontal bar 206 to the center of the bed, and may move outside the mattress 204 when the gripper 210 is not necessary. Furthermore, the gripper 210 can rotate 360°, so that it can conduct the work of fetching a shelf from below the bed, or gripping, spreading and removing bedclothes on the bed.

The working procedure of the gripper 210, for example, the procedure of the work of folding bedclothes after a user has woken up, is described below. In this case, the operation of the gripper 210 may be performed both in an automatic mode and in a command mode.

In the case where the command mode is set, the user 201 issues a command using voice or a bed manipulator. In this case, the current position and posture of the user are detected by the pressure sensors attached to the mattress 204. Thereafter, the robot arm moves to a location suitable for the posture and position of the user, and conducts the work of folding the bedclothes.

Meanwhile, in the case where the automatic mode is set, for example, in the case where the user 201 wakes up and sits up on the bed, the variation in the posture of the user 201 is detected by the sensors attached to the mattress 204. In this case, the work of folding the bedclothes can be performed, together with control of the action of moving the folding structure of the bed, without a separate command from the user.

In the work of the intelligent bed robot 200 of the present invention in a command mode, a voice command or a command via the manipulator is received, and then a service is provided. In this case, the user 201 can issue a command, for example, to conduct the action of preparing a meal or going out using a voice or a manipulator. At this time, the user 201 does not issue commands for respective actions that constitute a time series of actions, but transmits only higher control commands (preparation of a meal, and preparation for going out) that are classified according to category.
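
The expansion of such a higher control command into a time series of actions can be sketched as follows. This is purely illustrative: the individual action names and the mapping are hypothetical, since the patent only states that category-level commands such as 'preparation of a meal' are expanded by the robot into their constituent actions.

    ACTION_SEQUENCES = {
        "preparation of a meal": ["detect user posture", "move robot arm clear of the user",
                                  "fetch the shelf from under the bed", "position the shelf for the meal"],
        "preparation for going out": ["fold the bedclothes", "raise the back section of the bed",
                                      "move the horizontal bar to a supporting position"],
    }

    def run_command(command: str) -> None:
        """Expand one higher control command into its time series of actions."""
        for step in ACTION_SEQUENCES.get(command, []):
            print(f"executing: {step}")   # placeholder for the real controller call

    run_command("preparation of a meal")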

FIGS. 3A and 3B are a perspective view showing a gripper mounted on the intelligent bed robot according to the present invention, and a view showing the mounting of the gripper on the horizontal bar, respectively. For the convenience of the user (201 shown in FIG. 2), the gripper 210 is mounted on the horizontal bar to conduct the work of spreading or removing bedclothes in addition to the work of conveying the shelf.

It is preferred that the gripper 210 include two fingers (a first finger 214, and a second finger 212), in which case the first finger 214 is stationary, thereby preventing a joint from bending when a heavy object is raised or pulled. The overall shape of the gripper 210 is curved to be similar to that of a human hand, and thus can reduce the risk of injury to the user (201 shown in FIG. 2) that may be caused upon collision with the user.

It is preferred that the first finger 214 be provided with an accommodation part so that the second finger 212 can pass through the first finger 214 when the first finger 214 and the second finger 212 overlap each other. In this case, the path formed in the first finger 214 may be a through hole that passes through the first finger 214, or a path that is formed by opening a side of the first finger 214, as illustrated in FIG. 3A.

When the second finger 212 is maximally closed, the first finger 214 and the second finger 212 overlap each other. Through this overlapping action, the contact area can be increased when cloth, such as the cloth of bedclothes, is gripped. In this case, it is preferred that the insides of the fingers be formed of rubber or the like, so as to provide a high frictional force.

FIG. 3B shows the mounting of the gripper 210 on the horizontal bar 206. That is, FIG. 3B shows the mounting of the gripper 210 on the horizontal bar 206 via gear structures.

The gripper 210 of the intelligent bed robot according to the present invention includes first, second and third gear structures 217, 219, and 220.

The first gear structure 217 is provided with threads that are engaged with a ball screw 209 mounted inside the horizontal bar 206 along the length thereof, and that guide the gripper 210 through linear movement. The second gear structure 219 is formed of a worm gear, which works in conjunction with the first gear structure 217 so that the gripper 210 can rotate around the horizontal bar 206. The third gear structure 220 is configured such that the fingers 212 and 214 can conduct the work of gripping an object. Since the first gear structure 217 functions as a support when the second gear structure 219 rotates, the first gear structure 217 does not rotate around the horizontal bar 206, but guides the worm gear of the second gear structure 219 as it rotates.

Referring to FIGS. 3A and 3B, a first motor 215 is mounted in the horizontal bar and provides driving force to the ball screw 209. The ball screw 209, which receives driving force from the first motor 215, rotates inside the horizontal bar 206, and the first gear structure 217 engages with spiral threads and moves linearly along the horizontal bar 206. In this case, the first gear structure 217 is provided with threads and thus works in conjunction with the ball screw 209. Meanwhile, in the region where the ball screw 209 is placed, the horizontal bar 206 has longitudinal holes, so that the inside of the horizontal bar 206 communicates with the outside of the horizontal bar 206. Using the above-described connection structure, the ball screw works in conjunction with the first gear structure, so that the rotation of the first gear structure 217 can be prevented. That is, the first gear structure 217 only moves linearly along the horizontal bar 206, does not rotate around the horizontal bar 206, and functions as a support when the second gear structure 219 rotates.

As shown in FIGS. 3 and 4, the second gear structure 219 has a worm gear form, and, as the worm gear works in conjunction with the first gear structure 217, the gripper 210 can rotate around the horizontal bar 206. The second gear structure 219 has a worm gear form so that unnecessary motion can be prevented in a non-control state (an unpowered state, etc.), because the gripper 210 undergoes high loads, for example, while moving various objects.

The third gear structure 220 receives driving force from the third motor 218, and functions to control the motions of the fingers 212 and 214 when the motion of picking up an object is performed.

The sensors attached to the mattress recognize the positional state of the user on the bed, and the vertical bars (202 shown in FIG. 2) move upon the performance of a necessary motion, in which case the moving distances of the right and left vertical bars (202 shown in FIG. 2) may not be the same (see FIG. 7B). According to the present invention, the length of the horizontal bar (206 shown in FIG. 2) is automatically changed in response to a change in the distance between the vertical bars (202 shown in FIG. 2) based on the relative positions of the vertical bars (202 shown in FIG. 2), which have moved. As an example, the horizontal bar (206 shown in FIG. 2) may be configured such that the length thereof can be automatically changed around a central separation element (211 shown in FIG. 4) (see FIG. 4).

FIG. 4 shows an example in which grippers are mounted on the robot arm of the intelligent bed robot according to the present invention. That is, FIG. 4 is a perspective view showing parts of the vertical bars and the horizontal bar equipped with the grippers.

FIG. 4 shows that the present invention provides a structure in which the vertical bars 202 are connected to the two ends of the horizontal bar 206 and the grippers 210 can move linearly along the horizontal bar 206 and rotate around the horizontal bar 206.

FIGS. 5A and 5B are enlarged photos showing the portion of FIG. 4, in which a gripper 210 is attached to the horizontal bar 206, and a longitudinal hole 206-1 or 206-2, which is formed in the horizontal bar 206 along the length thereof, respectively. Referring to FIGS. 5A and 5B, the robot arm of the intelligent bed robot according to the present invention has two grippers 210(a) and 210(b). Each of the grippers has 3 Degrees of Freedom (3 DOF). One degree of freedom is used to move along the horizontal bar 206, another degree of freedom is used to rotate around the horizontal bar 206, and the remaining degree of freedom is used to control the work of picking up. In this case, respective degrees of freedom are controlled by the first, second and third motors (215, 216 and 218 shown in FIGS. 3A and 3B), which have been described in conjunction with FIGS. 3A and 3B.
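
The three degrees of freedom and their motor assignment can be summarized in a short sketch. The command structure below is illustrative only; send_motor_command is a hypothetical stand-in for the actual motor driver, and the units are assumed.

    from dataclasses import dataclass

    @dataclass
    class GripperCommand:
        translation_mm: float     # along the horizontal bar (first motor 215, via the ball screw)
        rotation_deg: float       # rotation around the horizontal bar (second motor 216, via the worm gear)
        finger_opening_mm: float  # opening of the finger unit (third motor 218, via the third gear structure)

    def send_motor_command(motor_id: int, setpoint: float) -> None:
        """Hypothetical low-level driver call."""
        print(f"motor {motor_id} -> {setpoint}")

    def apply_command(cmd: GripperCommand) -> None:
        send_motor_command(215, cmd.translation_mm)
        send_motor_command(216, cmd.rotation_deg)
        send_motor_command(218, cmd.finger_opening_mm)

    apply_command(GripperCommand(translation_mm=300.0, rotation_deg=90.0, finger_opening_mm=0.0))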

Although the existing intelligent bed robots function to support the body weight of users or to provide shelves for meals, the function of picking up objects using one or more grippers is added to the intelligent bed robot of the present invention, so that the intelligent bed robot of the present invention actively assists a user in his or her physical activities.

FIGS. 6A and 6B show torque sensors that are attached to the vertical bar. Referring to FIG. 6A, a transfer platform, which enables the robot arm to move across the entire area of the bed along the transfer rails, is connected to the lower end of a support bar, and two Direct Current (DC) motors, which are installed on both sides of the bed, drive the transfer platform.

Two torque sensors are attached to each transfer platform, and measure horizontal and vertical forces that are applied to the vertical bar by the user.

If the force applied to the vertical bar has only a horizontal component fx, torques τA and τB, which are measured by respective sensors, have the same direction. In contrast, the vertical component fy of a force applied to the vertical bar causes torques τA and τB, which are measured by respective sensors, to have opposite directions.
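
This sign relationship can be expressed compactly. The sketch below assumes that each measured torque is the sum of a common-mode component due to the horizontal force fx and a differential component due to the vertical force fy; unit scale factors are omitted, and only the same-direction/opposite-direction behaviour described above is modeled.

    def decompose(tau_a: float, tau_b: float) -> tuple:
        """Split the two torque readings into same-direction and opposite-direction parts."""
        common = 0.5 * (tau_a + tau_b)        # same direction in both sensors  -> horizontal force fx
        differential = 0.5 * (tau_a - tau_b)  # opposite directions             -> vertical force fy
        return common, differential

    print(decompose(+2.0, +2.0))  # purely horizontal push: (2.0, 0.0)
    print(decompose(+2.0, -2.0))  # purely vertical load:   (0.0, 2.0)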

The robot arm attached to the intelligent bed robot of the present invention can operate both in a follow mode and in a support mode. The follow mode is an operation mode in which a user can control the robot arm using a command so as to secure a support means using the robot, while the support mode is an operation mode in which the user fastens and supports his or her body on the secured support means.

In the follow mode, a user can move the robot arm to a desired position with reference to data detected by the above-described torque sensors. In contrast, in the support mode, the body weight of the user can be supported without the influence of data detected by the torque sensors, so that the user can rest his or her body on a support means that is provided by the robot arm.

As an example, when a user desires to change his or her posture on a bed, how the follow mode and the support mode are performed is described below.

First, a user locates the horizontal bar near his or her chest using a voice command. In this case, the position of the user is detected by the pressure sensors attached to the mattress, and the horizontal bar moves to a location suitable for the user based on the results of the detection (see FIGS. 7A and 7B).

However, the location of the horizontal bar that is determined from the desired position and the measured data varies with the characteristics of the individual user.

Accordingly, in the follow mode, the user can adjust the location and height of the horizontal bar by pushing or pulling the horizontal bar while holding it. In this case, the force applied to the horizontal bar is detected by the torque sensors, and respective vertical bars are moved. Finally, when the user is placed at a desired position, the process may proceed to the support mode. The user may switch the operation mode from the follow mode to the support mode using a voice command.

In the support mode, the robot arm operates based on kinematics control. The bed system estimates the position of the user, and then controls the robot arm based on a scheduled path.

In contrast, in the follow mode, the user's intention is detected through the torque sensors. Referring to FIGS. 6A and 6B, when the user pushes or pulls the horizontal bar, the robot arm measures the variations in the torque sensor readings, and then controls the motor A and the motor B. After the user's intention has been detected through the torque sensors, the position desired by the user is calculated with reference to the combination of the measurement results of the torque sensors.

The relationship between the torque sensor readings and the user's intention is intuitive, while the structure of the robot arm system is not easily modeled mathematically. Accordingly, it is preferable to use a fuzzy network to infer the user's intention. The following Table 1 and Table 2 are rule tables that are used for the fuzzy network, where P indicates ‘positive’, N indicates ‘negative’, L indicates ‘large’, S indicates ‘small’, and ZO indicates ‘zero’. Thus, the motor A and the motor B are controlled according to the rule tables.

TABLE 1
Motor A                      τB
                 PL     PS     ZO     NS     NL
  τA    PL       ZO     ZO     PS     PL     PL
        PS       ZO     ZO     PS     PS     PL
        ZO       NS     NS     ZO     PS     PS
        NS       NL     NS     NS     ZO     ZO
        NL       NL     NL     NS     ZO     ZO

TABLE 2
Motor B                      τB
                 PL     PS     ZO     NS     NL
  τA    PL       PL     PL     PS     ZO     ZO
        PS       PL     PS     PS     ZO     ZO
        ZO       PS     PS     ZO     NS     NS
        NS       ZO     ZO     NS     NS     NL
        NL       ZO     ZO     NS     NL     NL

Using Table 1 and Table 2, the robot arm can be moved to a desired location through a simple computation using the signs and intensities of the signals.
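
For illustration, the rule tables can be applied with a crisp table look-up, as sketched below. A real fuzzy network would use membership functions and defuzzification; here each torque reading is simply binned into one of the five labels with hypothetical thresholds, and the commands for motor A and motor B are read from Table 1 and Table 2.

    TAU_B_LABELS = ["PL", "PS", "ZO", "NS", "NL"]    # column order of the tables

    MOTOR_A = {   # rows indexed by the tau_A label, columns in TAU_B_LABELS order (Table 1)
        "PL": ["ZO", "ZO", "PS", "PL", "PL"],
        "PS": ["ZO", "ZO", "PS", "PS", "PL"],
        "ZO": ["NS", "NS", "ZO", "PS", "PS"],
        "NS": ["NL", "NS", "NS", "ZO", "ZO"],
        "NL": ["NL", "NL", "NS", "ZO", "ZO"],
    }
    MOTOR_B = {   # Table 2
        "PL": ["PL", "PL", "PS", "ZO", "ZO"],
        "PS": ["PL", "PS", "PS", "ZO", "ZO"],
        "ZO": ["PS", "PS", "ZO", "NS", "NS"],
        "NS": ["ZO", "ZO", "NS", "NS", "NL"],
        "NL": ["ZO", "ZO", "NS", "NL", "NL"],
    }

    def to_label(tau: float, small: float = 0.5, large: float = 2.0) -> str:
        """Bin a torque reading into PL/PS/ZO/NS/NL; the thresholds are hypothetical."""
        if tau >= large:
            return "PL"
        if tau >= small:
            return "PS"
        if tau <= -large:
            return "NL"
        if tau <= -small:
            return "NS"
        return "ZO"

    def motor_commands(tau_a: float, tau_b: float) -> tuple:
        la, lb = to_label(tau_a), to_label(tau_b)
        col = TAU_B_LABELS.index(lb)
        return MOTOR_A[la][col], MOTOR_B[la][col]

    print(motor_commands(+3.0, +3.0))  # both torques positive and large
    print(motor_commands(+3.0, -3.0))  # torques in opposite directions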

FIGS. 7A and 7B show the different locations of the horizontal bar and the vertical bars. FIGS. 7A and 7B compare the case where the length of the horizontal bar is equal to the minor axis of the bed with the case where the length of the horizontal bar is longer than the minor axis of the bed.

The intelligent bed robot according to the present invention may be implemented such that the length of the horizontal bar varies with the relative locations of the vertical bars. When data are received from the sensors attached to the mattress and the vertical bars move to locations suitable for the user, the length of the horizontal bar may vary with the relative locations of the vertical bars. As a result of the movement of the robot arm based on the sensor measurements, the horizontal bar has its shortest length in the case of FIG. 7A, while it is longer than the minor axis of the bed in the case of FIG. 7B.

FIG. 8 is a diagram showing the control system of the intelligent bed robot according to the present invention.

The intelligent bed robot according to the present invention has 10 DC motors for operating the robot arm and 2 Alternating Current (AC) motors for raising an object. Accordingly, it is preferred that the control system be based on a Controller Area Network (CAN) so as to control the devices simultaneously. Referring to FIG. 8, the 10 DC motors installed in the vertical bars and the grippers are connected to the CAN, and the 2 AC motors are controlled via RS232 using a microcontroller. The microcontroller stores the data that are detected by the pressure sensors attached to the mattress.
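
A minimal sketch of this split control path is given below, using the python-can and pyserial libraries. The channel names, node identifiers, frame layout and the ASCII command for the microcontroller are all hypothetical; the patent specifies only that the 10 DC motors are on a CAN bus and that the 2 AC motors are reached over RS232 through a microcontroller.

    import can      # python-can
    import serial   # pyserial

    def send_dc_motor_setpoint(bus, node_id: int, setpoint: int) -> None:
        """Send a (hypothetical) 2-byte setpoint frame to one DC motor node on the CAN bus."""
        msg = can.Message(arbitration_id=0x100 + node_id,
                          data=setpoint.to_bytes(2, "big", signed=True),
                          is_extended_id=False)
        bus.send(msg)

    def send_ac_motor_command(port, command: str) -> None:
        """Forward a (hypothetical) ASCII command for the AC motors to the microcontroller over RS232."""
        port.write((command + "\r\n").encode("ascii"))

    if __name__ == "__main__":
        can_bus = can.interface.Bus(channel="can0", bustype="socketcan")
        rs232 = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1)
        for node in range(10):                     # the 10 DC motors in the vertical bars and grippers
            send_dc_motor_setpoint(can_bus, node, 0)
        send_ac_motor_command(rs232, "AC1 RAISE")  # one of the 2 AC motors, addressed via the microcontroller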

The present invention provides an intelligent bed robot equipped with one or more grippers, which analyzes the posture and action of a bed user and is provided with a robot arm having multiple degrees of freedom.

Furthermore, the present invention provides an intelligent robot, in which one or more grippers capable of picking up objects are added to a conventional robot equipped with a pressure sensor-provided mattress and a supporting robot arm, thereby providing more familiar and convenient services to a user.

Moreover, the present invention provides an intelligent bed robot that can be used to perform health monitoring and evaluate a rehabilitation procedure.

Although the preferred embodiments of the present invention have been disclosed for illustrative purposes, those skilled in the art will appreciate that various modifications, additions and substitutions are possible, without departing from the scope and spirit of the invention as disclosed in the accompanying claims.

Claims

1. An intelligent bed robot, comprising:

a pressure sensor-provided mattress configured to monitor a position, posture and motion of a user on a bed in real time, and to assist the user;
an intelligent robot arm configured to comprise vertical bars disposed on two opposite sides of the bed and configured to have a predetermined length, a horizontal bar configured to connect the vertical bars to each other, and torque sensors disposed at lower ends of the vertical bars, and to measure horizontal and vertical forces applied by the user;
transfer rails configured to guide the intelligent robot arm along a path of movement; and
at least one gripper coupled to the horizontal bar of the intelligent robot arm so that it can move horizontally along the horizontal bar and rotate around the horizontal bar, and provided with a finger unit capable of picking up an object.

2. The intelligent bed robot as set forth in claim 1, further comprising a control box for reading pressure values of the pressure sensors by sequentially scanning the pressure sensors using a multiplexer, and delivering data about the pressure values to a main computer in a pressure measurement distribution image form.

3. The intelligent bed robot as set forth in claim 1, wherein the robot arm has two operation modes, including a follow mode, in which the user can control the robot arm using his or her command so as to obtain support means using the robot, and a support mode, in which the user's body is secured and supported using the acquired support means.

4. The intelligent bed robot as set forth in claim 3, wherein the follow mode conducts work using a fuzzy network.

5. The intelligent bed robot as set forth in claim 1, wherein the horizontal bar has a central separation element so that a length of the horizontal bar can vary with relative locations of the vertical bars.

6. The intelligent bed robot as set forth in claim 1, wherein the pressure sensor-provided mattress monitors a person on a bed using a Principal Component Analysis (PCA) algorithm, the PCA algorithm performing:

a first step of determining a human recognition target pattern and a related pressure sensor data set for pattern classification;
a second step of extracting feature points by calculating covariance of the pressure sensor data;
a third step of determining a number of feature points capable of maximizing information classification between respective pieces of pattern data; and
a fourth step of performing pattern recognition by comparing new incoming pressure sensor data with predetermined feature points.

7. The intelligent bed robot as set forth in any one of claims 1 to 6, wherein the robot arm works in a command mode of conducting operations in response to the user's command, which includes user's voice, and in an automatic mode of conducting work according to a predetermined work command.

8. The intelligent bed robot as set forth in claim 1, wherein:

the horizontal bar has a ball screw disposed therein along a length of the horizontal bar; and
the gripper includes:
a first gear structure provided with threads so that the first gear structure can guide the gripper through linear movement in conjunction with the ball screw;
a second gear structure formed of a worm gear that works in conjunction with the first gear structure so that the gripper can rotate around the horizontal bar; and
a third gear structure configured such that the finger unit can conduct work of picking up an object.

9. The intelligent bed robot as set forth in claim 8, wherein the ball screw and the first gear structure work in conjunction with each other along a longitudinal hole that is formed along the length of the horizontal bar.

10. The intelligent bed robot as set forth in claim 1, wherein the finger unit has a shape of a human hand, includes a stationary first finger and a movable second finger, and performs work of gripping an object through overlapping of the first and second fingers.

11. The intelligent bed robot as set forth in claim 10, wherein the first finger is provided with an accommodation part so that the first finger can accommodate the second finger when the first finger and the second finger overlap each other.

12. The intelligent bed robot as set forth in claim 10, wherein the first finger and the second finger have respective rubber portions that come into contact with an object.

13. The intelligent bed robot as set forth in claim 1, wherein the intelligent bed robot is controlled using a Controller Area Network (CAN) device.

Patent History
Publication number: 20080078030
Type: Application
Filed: Aug 3, 2007
Publication Date: Apr 3, 2008
Applicant: KOREA ADVANCED INSTITUTE OF SCIENCE AND TECHNOLOGY (Daejeon)
Inventors: Ju Jang LEE (Daejon), Kap Ho SEO (Daejon), Chang Mok OH (Daegu)
Application Number: 11/833,285