GAME CONTROLLER SIMULATING PARTS OF THE HUMAN ANATOMY

Methods, systems and computer program products of the present invention provide a simulated part of a human anatomy. The simulated human part may be an apparatus associated with a computer program for simulating a part of the human anatomy comprising one or more sensors. The apparatus may be connected to a computing device either through a wired medium or wirelessly. The computing device may receive a data input from at least one of the plurality of sensors that may be located in the vicinity of the apparatus. The sensors may be a pressure sensor, humidity sensor, motion sensor or some other type of sensor. Data received from the apparatus at the computing device may be stored in a database and may be associated with a region of the apparatus having one or more sensors.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of the following commonly owned U.S. Provisional Patent Application, which is incorporated herein by reference in its entirety: App. No. 61/212,567 filed on Apr. 13, 2009 and entitled “Game Controller Simulating Parts of Human Anatomy.”

BACKGROUND

1. Field

This invention relates to video game and video game style controllers or other human interface devices for interaction with a computer and/or game system, and more specifically to controllers that simulate parts of human anatomy to provide user feedback and learning.

2. Description of the Related Art

Currently, there are standard game-style controllers and other human interface devices for interacting with computers and gaming systems; however, there is a need for controllers that simulate parts of the human anatomy, including those providing a learning method and system of human interaction and touch. A controller in the shape of any specific piece of human anatomy, be it a hand, foot, or some other human anatomical part embedded with sensors, could be useful for such applications as massage therapy training, therapy, gaming, or any other purpose that involves use of and interaction with a computer or game system.

SUMMARY

Methods, systems and computer program products of the present invention provide a simulated part of a human anatomy. The simulated human part may be an apparatus associated with a computer program for simulating a part of the human anatomy comprising one or more sensors. The apparatus may be connected to a computing device either through a wired medium or wirelessly. The computing device may receive a data input from at least one of the plurality of sensors that may be located in the vicinity of the apparatus. The sensors may be a pressure sensor, humidity sensor, motion sensor or some other type of sensor. Data received from the apparatus at the computing device may be stored in a database and may be associated with a region of the apparatus having one or more sensors.

In an embodiment, the data received from the apparatus may represent the actuation of the sensor in the areas of touch. In another embodiment, the data may represent change in temperature, pressure, or some other characteristic of the sensor on actuation. The apparatus may be visually presented on the display of the computing device, for example using a graphic user interface. Further, the display may depict the sensor, the response of the sensor on actuation, and the region of the human anatomy associated with the apparatus. The display may be a graphical user interface that may be touch sensitive. A depiction of a part of the human anatomy from which the data input may be received may then be presented within the graphic user interface.

The apparatus representing a part of the human anatomy may be a foot, hand, leg, arm, back, face, torso, head, vagina, vulva, penis, breast, buttock, and the like, without limitations.

In an embodiment, the apparatus representing a part of the human anatomy may be a game controller. The game controller may implement a neural network pattern recognition model, neural network learning model, or some other kind of model. The apparatus representing a part of the human anatomy may be a component of a humanoid robot with human anatomic features. The apparatus representing a part of the human anatomy may include a surface embedded with a three-dimensional sensor array. In embodiments, the surface may be made of silicone, urethane, rubber, an elastomer, or a combination thereof.

In an embodiment, the computing device may be a computer such as a laptop or desktop, a PDA, a cell phone, a smart phone, a gaming console such as a television gaming console, handheld gaming console, or wireless gaming console, an LCD device, an iPad, a television, or some other type of computing device.

Methods, systems, and computer program products of the present invention for receiving a sequence of sensor data inputs from the apparatus are provided. The method may infer a sensor data sequence from the data inputs using one or more statistical techniques such as interpolation, store the sensor data sequence in a database, and subsequently display the data in a graphical user interface. In an embodiment, the computer program product may be capable of being embodied in a computer readable medium and configured to be executed on one or more computers. The computer program product may provide an apparatus representing a part of the human anatomy. The apparatus may be embedded with a plurality of sensors that may actuate on touch. Further, the apparatus may be connected to a computing device. A first data input may be received from a first sensor. Likewise, a second sensor located within the apparatus may provide a second data input. The first and second data inputs may be stored in a database as a sensor data sequence. The sensor data sequence may be associated with the region of the apparatus having at least one of the plurality of sensors. The sensor data sequence may include interpolated sensor data relating to a physical region between the first and second sensors, based on the first and second data inputs.
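
By way of illustration only, the following Python sketch shows one way the interpolation described above might be carried out, assuming two scalar sensors sampled at the same instants and simple linear interpolation; the function and variable names are hypothetical and not part of the disclosed method.

```python
import numpy as np

def interpolate_between_sensors(first_input, second_input, n_points=5):
    """Estimate readings for the physical region lying between two
    sensors by linearly blending their simultaneous samples.

    Returns an (n_points, num_samples) array; row i estimates the
    signal at fractional position i / (n_points - 1) along the line
    from the first sensor to the second.
    """
    first = np.asarray(first_input, dtype=float)
    second = np.asarray(second_input, dtype=float)
    positions = np.linspace(0.0, 1.0, n_points)[:, None]
    return (1.0 - positions) * first + positions * second

# Pressure rising at the first sensor while falling at the second.
sequence = interpolate_between_sensors([0.1, 0.4, 0.9], [0.8, 0.5, 0.2])
print(sequence.round(2))
```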

The part of the apparatus that may represent the human anatomy may be depicted on the display of the computing device using the graphic user interface. A depiction of the part of human anatomy from which the sensor data sequence is received may be subsequently presented within the graphic user interface.

In embodiments, the first and second data inputs may be analyzed based at least in part on a neural network pattern recognition technique. Further, the numeric result obtained by the analysis using neural network pattern recognition may be stored in association with the sensor data sequence.
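
The description leaves the network topology open; purely as a hedged sketch, the toy example below trains a single sigmoid neuron (the smallest possible neural network) to separate two classes of fixed-length sensor sequences and produces the numeric result that would be stored with the sequence. All names, the training scheme, and the two-class setup are assumptions made for illustration.

```python
import numpy as np

def train_neuron(sequences, labels, epochs=500, lr=0.5):
    """Fit a single sigmoid neuron to labeled sensor data sequences
    using plain gradient descent on the cross-entropy loss."""
    X = np.asarray(sequences, dtype=float)
    y = np.asarray(labels, dtype=float)
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid activation
        grad = p - y                            # dLoss/dlogit
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

def numeric_result(sequence, w, b):
    """Recognition score in [0, 1] stored with the sensor data sequence."""
    return float(1.0 / (1.0 + np.exp(-(np.dot(sequence, w) + b))))

# A firm press (label 1) versus a light tap (label 0).
w, b = train_neuron([[0.9, 0.8, 0.9], [0.1, 0.2, 0.0]], [1, 0])
print(numeric_result([0.8, 0.9, 0.7], w, b))
```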

Methods, systems, and computer program products of the present invention for recording sensor data from an expert user may be provided. In an embodiment, the computer program product may be capable of being embodied in a computer readable medium and configured to be executed on one or more computers. The computer program product may perform the step of receiving an expert sensor data sequence from an expert user using a first apparatus representing a part of the human anatomy. One or more sensors may be provided in the first apparatus. Actuation and/or manipulation of one or more sensors of the first apparatus may provide the expert sensor data sequence. The manipulation may be considered an expert maneuver. The apparatus may be connected to a first computing device. Further, the expert sensor data sequence may be recorded and stored in a database.

A novice sensor data sequence may be received from a novice user using a second apparatus representing a similar part of the human anatomy as the first apparatus. A plurality of sensors may be provided in the second apparatus, too. Further, the second apparatus may be connected to a second computing device. The novice sensor data sequence may be recorded and stored in the database.

The novice sensor data sequence may then be compared to the expert sensor data sequence based at least in part on a statistical analysis of the data sequences. The statistical analysis may result in a score associated with a degree of similarity between the novice sensor data sequence and the expert sensor data sequence.
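
The passage does not name a particular statistical analysis; as a minimal sketch, the example below uses Pearson correlation between equal-length sequences and maps it to a 0-100 similarity score. The scoring scale and clamping choice are assumptions for illustration.

```python
import numpy as np

def similarity_score(novice_seq, expert_seq):
    """Score in [0, 100] reflecting how closely the novice sensor data
    sequence tracks the expert sequence (same length assumed)."""
    novice = np.asarray(novice_seq, dtype=float)
    expert = np.asarray(expert_seq, dtype=float)
    r = np.corrcoef(novice, expert)[0, 1]  # Pearson r in [-1, 1]
    return max(0.0, r) * 100.0             # treat anti-correlation as 0

expert = [0.1, 0.5, 0.9, 0.5, 0.1]  # one press-and-release cycle
novice = [0.2, 0.4, 0.8, 0.6, 0.2]
print(round(similarity_score(novice, expert), 1))
```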

The computer program product may further perform the step of representing the part of human anatomy that is represented by the second apparatus on a display of the second computing device, using a graphic user interface. A depiction of a part of the human anatomy from which the novice sensor data sequence is received may be presented within the graphical user interface. The graphical user interface may include a score and a feedback indicator. In an embodiment, the feedback indicator may be a visual comparison of the expert sensor data sequence and the novice sensor data sequence. In another embodiment, the feedback indicator may be a text tutorial relating to the expert maneuver. In yet another embodiment, the feedback indicator may be a multimedia tutorial relating to the expert maneuver. In yet another embodiment, the feedback indicator may be a skill rating for at least one of a plurality of skills relating to the expert maneuver.

In accordance with an embodiment of the present invention, the feedback indicator may be based at least in part on providing feedback using the second apparatus. Further, in another embodiment, the feedback using the second apparatus may be based at least in part on lighting an area of the apparatus. In yet another embodiment, the feedback using the second apparatus may be based at least in part on vibrating an area of the apparatus. In another embodiment, the feedback using the second apparatus may be based at least in part on deforming an area of the apparatus. In yet another embodiment, the feedback using the second apparatus may be based at least in part on heating an area of the apparatus. In still another embodiment, the feedback using the second apparatus may be based at least in part on cooling an area of the apparatus.

In an embodiment, the expert sensor data sequence and the novice sensor data sequence may be stored in one or more databases. In yet another embodiment, the database may be a distributed database.

In accordance with various embodiments of the present invention, the feedback indicator may be provided to the novice user in substantial real-time relative to the receipt of the novice sensor data sequence.

In embodiments, the score after statistical analysis may be stored in an account associated with the novice user. Further, the account may include a plurality of scores based at least in part on a plurality of novice sensor data sequences derived from prior usage sessions with a plurality of apparatuses. The plurality of scores may be statistically combined to create a master score for the novice user summarizing at least in part an overall performance with at least one type of apparatus. In embodiments, the expert maneuver may be a reflexology maneuver, massage maneuver, physical therapy maneuver, occupational therapy maneuver, chiropractic maneuver, medical maneuver, rehabilitation maneuver, sexual maneuver, and the like, without limitations. Further, the massage maneuver may be a back massage maneuver, neck massage maneuver, facial massage maneuver, arm massage maneuver, leg massage maneuver, foot massage maneuver, and the like, without limitations. The sexual maneuver may be an oral sexual maneuver, a digital sexual maneuver, a maneuver related to copulation, and the like, without limitations.
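
One simple way such session scores might be "statistically combined" into a master score, sketched below, is a weighted mean that favors recent sessions; the weighting scheme is an assumption, since the description does not fix one.

```python
def master_score(scores, weights=None):
    """Combine per-session scores into a single master score.
    With no weights given, this is a plain average."""
    if weights is None:
        weights = [1.0] * len(scores)
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# Three prior foot-massage sessions, most recent weighted highest.
print(master_score([62.0, 71.5, 80.0], weights=[1, 2, 3]))
```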

In embodiments, the expert user may be a masseuse, physician, physical therapist, occupational therapist, sex therapist, rehabilitation expert, paid performer, and the like, without limitations. The paid performer may be a sexual performer or the like.

The present invention provides a computer program product that may be embodied in a computer readable medium and may be executed on one or more computers. The computer program product may perform the step of receiving an expert sensor data sequence from an expert user using a first apparatus representing a part of the human anatomy. In an embodiment, a plurality of sensors may be provided in the first apparatus. Further, the expert sensor data sequence may derive at least in part from the expert physically manipulating the device as part of performing an expert maneuver. The apparatus may be connected to a first computing device. In embodiments, the expert sensor data sequence may be recorded and stored in a database. In embodiments, the database may be of different types.

A first novice sensor data sequence may be received from a first novice user using a second apparatus representing a similar part of the human anatomy as the first apparatus. A plurality of sensors may be provided in the second apparatus. Further, the second apparatus may also be connected to a second computing device. The first novice sensor data sequence may also be recorded and stored in the database.

A second novice sensor data sequence may be received from a second novice user using a third apparatus representing a similar part of the human anatomy as the first apparatus. A plurality of sensors may be provided in the third apparatus too. Further, the third apparatus may also be connected to a third computing device. The second novice sensor data sequence may also be recorded and stored in the database.

In embodiments, the first and the second novice sensor data sequences may be compared to the expert sensor data sequence based at least in part on a statistical analysis of the data sequences. The statistical analysis may result in a first score associated with a degree of similarity between the first novice sensor data sequence and the expert sensor data sequence. Similarly, the statistical analysis may also result in a second score associated with a degree of similarity between the second novice sensor data sequence and the expert sensor data sequence.

The computer program product may further perform the step of representing the part of human anatomy that is represented by the second apparatus on a display of at least one of the first and second computing devices, using a graphic user interface. A depiction of a part of the human anatomy from which the first and second novice sensor data sequences are received may be presented within the graphic user interface. The graphic user interface may include a score and at least one feedback indicator.

The present invention provides a computer program product capable of being embodied in a computer readable medium and configured to be executed on one or more computers. The computer program product may perform the step of programming a data sequence representing a human's physical interaction with a part of the human anatomy in a software code. The data sequence may represent at least in part a performance of an anatomical maneuver. The computer program product may further perform the step of storing the programmed data sequence in a database.

In embodiments, a novice sensor data sequence may be received from a novice user using a second apparatus representing a part of the human anatomy. A plurality of sensors may be provided in the apparatus. Further, the apparatus may be connected to a second computing device. The novice sensor data sequence may also be recorded and stored in the database.

In embodiments, the novice sensor data sequence may be compared to the programmed data sequence based at least in part on a statistical analysis of the data sequences. The statistical analysis may result in a score associated with a degree of similarity between the novice sensor data sequence and the programmed data sequence. Further, the computer program product may perform the step of representing the part of human anatomy on a display of the computing device using a graphic user interface. A depiction of a part of the human anatomy from which the novice sensor data sequence is received may then be presented within the graphic user interface. The graphic user interface may include a score and a feedback indicator.

In accordance with various embodiments of the present invention, the feedback indicator may be provided to the novice user in substantial real-time relative to the receipt of the novice sensor data sequence.

In embodiments, the score after statistical analysis may be stored in an account associated with the novice user. Further, the account may include a plurality of scores based at least in part on a plurality of novice sensor data sequences derived from prior usage sessions with a plurality of apparatuses. The plurality of scores may be statistically combined to create a master score for the novice user summarizing at least in part an overall performance with at least one type of apparatus.

While the invention has been described in connection with certain preferred embodiments, other embodiments would be understood by one of ordinary skill in the art and are encompassed herein.

BRIEF DESCRIPTION OF THE FIGURES

The invention and the following detailed descriptions of certain embodiments thereof may be understood by reference to the following figures:

FIG. 1 is a block diagram depicting an apparatus connected with a computing device in accordance with a first embodiment of the present invention;

FIG. 2 is a flowchart depicting method steps in accordance with the first embodiment of the present invention;

FIG. 3 is a block diagram depicting an apparatus connected with a computing device in accordance with another embodiment of the present invention;

FIG. 4 is a flowchart depicting method steps in accordance with another embodiment of the present invention;

FIG. 5 is a block diagram depicting an apparatus connected with a computing device in accordance with yet another embodiment of the present invention;

FIG. 6 is a flowchart depicting method steps in accordance with yet another embodiment of the present invention;

FIG. 7 is a block diagram depicting an apparatus connected with a computing device in accordance with yet another embodiment of the present invention;

FIG. 8 is a flowchart depicting method steps in accordance with yet another embodiment of the present invention;

FIG. 9 is a block diagram depicting an apparatus connected with a computing device in accordance with yet another embodiment of the present invention;

FIG. 10 is a flowchart depicting method steps in accordance with yet another embodiment of the present invention;

FIG. 11 is an exemplary screenshot for a graphic user interface in accordance with various embodiments of the present invention;

FIG. 12 is a screenshot of one embodiment of a screen display of a rhythm game;

FIG. 13 is a diagram of one embodiment of a simulated foot controller in use;

FIG. 14 is a side view of one embodiment of a simulated foot controller;

FIG. 15 is a configuration diagram of one embodiment of a simulated foot controller showing a possible sensor layout;

FIG. 16 is a configuration diagram of one embodiment of a simulated hand controller showing a possible sensor layout;

FIG. 17 depicts a simplified graphic user interface for comparing a novice performance to an expert performance;

FIG. 18 depicts a simplified graphic user interface for presenting a performance score and performance graphics to a user;

FIG. 19 depicts a simplified graphic user interface for presenting data received from a physical apparatus to a user; and

FIG. 20 depicts a simplified graphic user interface for presenting a user with performance prompts.

DETAILED DESCRIPTION

Detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention, which may be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present invention in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting but rather to provide an understandable description of the invention.

The terms “a” or “an,” as used herein, are defined as one or more than one. The term “another,” as used herein, is defined as at least a second or more. The terms “including” and/or “having” as used herein, are defined as comprising (i.e., open transition). The term “coupled” or “operatively coupled,” as used herein, is defined as connected, although not necessarily directly and not necessarily mechanically.

Referring to FIG. 1, an apparatus 102 communicatively coupled with an external device such as a computing device 128 is provided. In embodiments, the corresponding method steps are depicted in FIG. 2. The apparatus 102 may include sensors 104, memory 112, processing unit 114, database 118, analog-to-digital convertor 120, electronic circuitry 122, communication channel 124 and the like. In an embodiment, the apparatus 102 may resemble a part of the human anatomy such as a body part. In accordance with various embodiments, the apparatus 102 representing the human anatomy part may include, without limitations, a foot, hand, leg, arm, back, face, torso, head, vagina, vulva, penis, breast, buttock and the like. In accordance with various embodiments of the present invention, the apparatus 102 representing a part of the human anatomy may be a component of a humanoid robot with human anatomic features. In embodiments, the apparatus 102 representing a part of the human anatomy may include a silicone surface, urethane surface, rubber surface, elastomeric surface or combinations thereof including a three-dimensional sensor array.

The apparatus 102 may be configured to send/receive electronic signals aggregated from a plurality of sensors 104. The sensors 104 may include a pressure sensor 108, a humidity sensor 110 and the like that may be mounted within the apparatus 102. The apparatus 102 may be communicatively coupled to the computing device 128. In embodiments, the external computing device 128 may be a computer such as a laptop, notebook and the like; a cell phone; a PDA; a server; a gaming console such as a television gaming console, handheld gaming console, or wireless gaming console; an iPod; a television; a smart phone and the like. Signals received from the apparatus 102 may be stored in a memory 112. A database 118 may be maintained for storing and recording signals. The databases (118; 138) may reside within the apparatus and computing device, respectively, or be located at a location remote to either the apparatus and/or computing device. For example, one or more databases (118; 138) may reside on an external server, or plurality of servers. The servers may be linked to remotely using a network, such as the Internet or some other type of computing or gaming network. Alternatively, the signals received from the plurality of sensors 104 may be transmitted to the computing device 128 and may be subsequently stored in the memory 130 and/or the database 138. These signals may be analyzed, processed and filtered by the processing unit 132 and converted into data maintained in the database 138. For example, signals aggregated from various devices may be in analog form and may be converted into digital format before being stored in the database 138. In an embodiment, the database 138 may be a buffer, flip-flop, memory, hard disk, optical drive, CD, DVD and the like. Likewise, sensor signals may be stored for processing in a data structure in the associated computing device 128. In this aspect, higher precision may be associated with the received signal, and therefore, data structures may be utilized for storing values of the sensors 104. Examples of such data structures may include, without limitations, linked lists, graphs, trees, flat files, spreadsheets, XML and the like.
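
As a concrete, hedged illustration of storing digitized readings against a region of the apparatus, the sketch below uses a local SQLite table; the schema, file name, and region labels are hypothetical and stand in for whatever database 138 actually is.

```python
import sqlite3

conn = sqlite3.connect("sensor_data.db")  # illustrative file name
conn.execute("""CREATE TABLE IF NOT EXISTS readings (
    region    TEXT,     -- region of the apparatus (e.g., 'heel')
    sensor_id INTEGER,  -- sensor within that region
    ts        REAL,     -- sample time in seconds
    value     REAL      -- digitized sensor value
)""")

def store_reading(region, sensor_id, ts, value):
    conn.execute("INSERT INTO readings VALUES (?, ?, ?, ?)",
                 (region, sensor_id, ts, value))
    conn.commit()

store_reading("heel", 3, 0.016, 0.42)  # one digitized pressure sample
for row in conn.execute("SELECT * FROM readings WHERE region='heel'"):
    print(row)
```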

It may be noted that the terms ‘signal’ and ‘data,’ as used herein, may be used synonymously.

In an embodiment of the present invention, the signal may be processed in electronic circuitry 140 before being stored in the database 138. For example, a noise filter may filter the signal received from the plurality of sensors 104. In this aspect, the noise filter may reduce the noise component, including but not limited to, Gaussian noise, white noise, thermal noise or some other type of noise. Likewise, the signal may be passed through a band pass filter to eliminate frequencies outside a specified range. In yet another embodiment of the present invention, the sensor signals may be converted into digital format before being stored in the database 138. In this aspect, an analog-to-digital convertor 134 may be utilized to convert analog signals into digital signals before storing them in the database 138.
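
For the band-pass stage, a minimal sketch using SciPy's Butterworth filter design is shown below; the sampling rate, band edges, and filter order are assumptions chosen only for illustration.

```python
import numpy as np
from scipy.signal import butter, lfilter

def band_pass(samples, fs, low_hz, high_hz, order=4):
    """Keep frequencies between low_hz and high_hz, rejecting slow
    drift below the band and high-frequency noise above it."""
    b, a = butter(order, [low_hz, high_hz], btype="band", fs=fs)
    return lfilter(b, a, samples)

fs = 100.0                                   # assumed 100 Hz sampling
t = np.arange(0.0, 1.0, 1.0 / fs)
raw = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)
clean = band_pass(raw, fs, low_hz=1.0, high_hz=10.0)  # keep the 5 Hz touch
```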

Moving further, data received at the computing device 128 may be displayed on a graphic user interface (GUI) 148. The GUI 148 may depict information on the display 144 corresponding to the effect of one or more sensors 104. For example, a touch at the center of the foot may initiate a response; the magnitude and accuracy of touch may be analyzed and displayed on the display 144. Instructions corresponding to these parameters may be provided as a feedback to the user. Likewise, a touch on the tip of the toe may initiate a response that may be recorded and the analysis of accuracy and effectiveness of the touch may be displayed on the display 144. The computing device 128 may further provide instructional content for displaying the best way of initiating a touch.
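
A GUI of the kind described could render sensor actuation as a heat map over the apparatus. The sketch below, using matplotlib, stands in for such a display; the coarse grid and pressure values are invented for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

# Latest normalized pressure values on a coarse grid standing in
# for the foot-shaped apparatus (values are illustrative).
pressure = np.array([[0.1, 0.2, 0.1],   # toes
                     [0.3, 0.9, 0.4],   # ball of foot (firm press)
                     [0.1, 0.3, 0.2],   # arch
                     [0.0, 0.1, 0.0]])  # heel

plt.imshow(pressure, cmap="hot", vmin=0.0, vmax=1.0)
plt.colorbar(label="normalized pressure")
plt.title("Sensor actuation on simulated foot")
plt.show()
```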

The GUI may facilitate interaction of a user with the computing device 128. In this aspect, the user may click on the image of apparatus 102 that may be displayed on the display 144. The user may be provided educational content related to that region.

In embodiments, the apparatus 102 may be a game controller, a remote controller, a device carved into the shape of a human part, an artificially crafted human organ, a medical clone of a human organ and the like. The apparatus 102 may be embedded with one or more sensors 104 and/or transducers such as a pressure sensor 108, humidity sensor 110 and the like, without limitations. In other embodiments, various other types of sensors such as, but not limited to, a light sensor, motion sensor, temperature sensor, magnetic sensor, accelerometer, gravity sensor, vibration sensor, electrical sensor, sound sensor and the like, may be employed. In accordance with various embodiments, the sensor 104 may record stretch, motion of an organism, position of an appendage and the like. In yet another embodiment, the sensor 104 may sense fluid activity such as toxins, nutrients, pheromones and the like. Likewise, the sensors 104 may be configured to sense biomolecular interactions or some other kinetic parameters. In an embodiment, the sensors 104 may sense a metabolic milieu, such as glucose level, oxygen level, or osmolality and the like. Likewise, a model for sensing internal signal molecules, such as hormones, neurotransmitters, and cytokines, may be developed with the aid of the simulation model.

The data from the apparatus 102 may be received at the computing device 128. In an embodiment, the information may be received either wirelessly or through a wired medium. For example, the apparatus 102 and the computing device 128 may include communication channels 124 and 142 respectively, such as a wireless facility, that may enable exchange of data wirelessly. Various technologies such as EDGE, infrared, Bluetooth and the like may be employed for wireless transfer of data. In embodiments, data transfer between the apparatus 102 and the computing device 128 may be achieved through a wired medium using different kinds of connectors such as an RS-232 port, serial port, parallel port or some other kind of port.
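
For the wired case, one plausible sketch reads newline-delimited frames over a serial link using pySerial; the port name, baud rate, and "sensor_id,value" frame format are all assumptions, since the disclosure does not specify a protocol.

```python
import serial  # pySerial

# Hypothetical transport: each text line from the apparatus is
# "sensor_id,value".
port = serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1.0)

def read_frame():
    line = port.readline().decode("ascii", errors="ignore").strip()
    if not line:
        return None                      # timed out; no frame
    sensor_id, value = line.split(",")
    return int(sensor_id), float(value)

for _ in range(10):                      # sample a few frames
    frame = read_frame()
    if frame is not None:
        print("sensor %d -> %.3f" % frame)
```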

The apparatus 102 may be displayed on the computing device 128. Further, the apparatus 102, on being displayed on the screen of the computing device 128, may provide information relevant to the user. The computing device 128 may display various locations on the body part and corresponding operations performed on the body part through the graphic user interface 148. For example, the computing device 128 may depict areas on a foot to be pressed, twisted, bent, squeezed, rubbed and the like. Similarly, various other body parts may also be displayed in addition to the foot. Further, the computing device 128 may also depict a region of the part of the human anatomy from which the data input was received on the graphic user interface 148.

In embodiments, the apparatus 102 may be in the form of a game controller to be used by one or more players. In accordance with these embodiments, multiple persons such as a couple may operate the apparatus 102 to practice and exchange training/learning regarding various therapies and treatments such as sex therapies and the like. Further, the game controller for couples may be shaped into respective gender-specific organs. In addition, the computing device 128 may receive sensor inputs governed by the operations of the couple on their apparatuses to recognize acts and degree of accuracy of the acts performed by the couple. In embodiments, the game controller may utilize neural network pattern recognition methods. In other embodiments, the game controller may use neural network learning methods. Similarly various other algorithms and recognition techniques may be utilized by the game controller.

In accordance with an embodiment of the present invention, referring to FIG. 2, a method 200 may be provided for receiving sensor data input from an apparatus 102. The apparatus 102 may represent an anatomical feature. Further, the method 200 may depict storage and display of the sensor data input. The sensor data input may be displayed in a graphic user interface (GUI). The method 200 starts at step 202. At step 204, an apparatus 102 may be provided that may be connected to a computing device 128. The apparatus 102 may represent a part of human anatomy such as a foot, hand, leg, arm, back, face, and the like. Further, the apparatus 102 may include a plurality of sensors 104. At step 208, a data input may be received from at least one of the plurality of sensors 104 located within the apparatus 102. At step 210, the data input may be stored in a database 118. The data input may be stored in association with a region of the apparatus 102 in which the at least one of the plurality of sensors 104 may be located. At step 212, the part of the human anatomy represented by the apparatus 102 may be depicted on a display of the computing device 128. Furthermore, at step 214, a region of the part of the human anatomy from which the data input was received may be presented within the graphic user interface. The method terminates at step 218.
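
Purely as an illustrative sketch of the receive-store-display loop of method 200, the code below maps a sensor to a region, appends the reading to an in-memory store, and prints what a GUI would highlight; the region map and function names are hypothetical.

```python
REGION_OF_SENSOR = {0: "toes", 1: "ball", 2: "arch", 3: "heel"}

database = {}  # region -> list of (timestamp, value); stands in for 118

def render_region(region, value):
    print(f"GUI: highlight '{region}' at intensity {value:.2f}")

def on_data_input(sensor_id, timestamp, value):
    region = REGION_OF_SENSOR[sensor_id]           # step 210: associate region
    database.setdefault(region, []).append((timestamp, value))
    render_region(region, value)                   # steps 212-214: display

on_data_input(sensor_id=2, timestamp=0.05, value=0.73)
```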

In accordance with another embodiment of the present invention, referring to FIG. 3, a sequence of sensor data inputs may be received from the apparatus 302 representing the anatomical feature. The apparatus 302 may include a first set of sensors 304 and a second set of sensors 312. Additionally, the apparatus may include a database 324, an analog-to-digital convertor 328, a memory 320, an electronic circuitry 330, a processing unit 322, and a communication channel 332. The apparatus 302 may be communicatively coupled to the computing device 334.

Likewise, the computing device 334 may include a memory 338, a processing unit 340, a database 344, an electronic circuitry 348, a communication channel 350, a display 352, and a graphical user interface 354.

In an embodiment of the present invention, the computing device 334 may receive a first data input from at least one of the first set of sensors 304 and a second data input from at least one of the second set of sensors 312. In an exemplary scenario, the first data input and the second data input may be received from the same body part. In another scenario, the first data input and the second data input may be associated with distinct body parts. The first and the second data inputs may be stored in the database 344 as a sensor data sequence. Further, the sensor data sequence may be stored in the database 344 along with other details such as proximity of the sensor, region of mounting of the sensor on the anatomical part and the like.

In embodiments, the sensor data sequence may include interpolated sensor data relating to a physical region between the first and the second sensor based at least in part on the first and the second sensor data inputs. In embodiments, the first sensor data input and the second sensor data input may be analyzed based at least in part on neural network pattern recognition methods. Further, a numeric result retrieved after analysis may be stored in association with the sensor data sequence including the first sensor data input and the second sensor data input. The numeric result may be obtained based on an analysis performed by the processing unit 340.

In accordance with an embodiment of the present invention, referring to FIG. 4, a method 400 may be provided for receiving a sequence of sensor data inputs from an apparatus 302. The apparatus 302 may represent an anatomical feature. Further, the method 400 may depict inferring, storing and displaying a sensor data sequence. The method starts at step 402. At step 404, the apparatus 302 may be provided that may be connected to a computing device 334. The apparatus 302 may represent a part of the human anatomy such as a foot, hand, leg, arm, back, face, and the like. Further, the apparatus 302 may include a first set of sensors 304 and a second set of sensors 312. At step 408, a first data input may be received from a first sensor among the first set of sensors 304 located within the apparatus 302. Further, at step 410, a second data input may be received from a second sensor among the second set of sensors 312 located within the apparatus 302. At step 412, the first data input and the second data input may be stored in a database 324 as a sensor data sequence. The sensor data sequence may be stored in association with a region of the apparatus 302 in which at least one of the first set of sensors 304 and the second set of sensors 312 may be located. Furthermore, the sensor data sequence may include interpolated sensor data that may relate to a physical region between the first sensor and the second sensor based at least in part on the first and second data inputs. At step 414, the part of the human anatomy represented by the apparatus 302 may be depicted on a display of the computing device 334. Additionally, at step 418, a part of the human anatomy from which the data input was received may be presented within the graphic user interface. The method terminates at step 420.

Referring to FIG. 5, a plurality of apparatuses such as a first apparatus 502A and a second apparatus 502B may be communicatively coupled to a plurality of computing devices such as a first computing device 522A and a second computing device 522B. The first apparatus may include one or more expert sensors 504A, memory 508A, processing unit 510A, electronic circuitry 512A, analog-to-digital convertor 518A, a communication channel 520A, and a database 514A. The databases may reside within the apparatuses and/or computing devices, or be located at a location remote to either the apparatuses and/or computing devices. For example, one or more databases may reside on an external server, or plurality of servers. The servers may be linked to remotely using a network, such as the Internet or some other type of computing or gaming network. The second apparatus may include one or more novice sensors 504B, memory 508B, processing unit 510B, electronic circuitry 512B, analog-to-digital convertor 518B, a communication channel 520B, and a database 514B. The first computing device 522A may include analog-to-digital convertor 530A, memory 524A, processing unit 528A, electronic circuitry 534A, communication channel 538A, and display 540A with a GUI 542A. Similarly, the second computing device 522B may include analog-to-digital convertor 530B, memory 524B, processing unit 528B, electronic circuitry 534B, communication channel 538B, and display 540B with a GUI 542B.

In an exemplary embodiment, a first user may be an expert user using a first apparatus 502A that may represent a part of the human anatomy. A plurality of sensors may be provided within the first apparatus 502A referred to as expert sensors 504A. The expert sensors 504A may generate an expert sensor data sequence that may indicate at least in part an operation of the expert user. The expert may manipulate the device physically as a part of performing an expert maneuver. The first apparatus 502A may be connected to a first computing device 522A. Various connecting methods and devices have been described in conjunction with FIG. 1 in detail. Similarly, a second user may be associated with a second apparatus 502B. The second user may be a novice user using a second apparatus 502B that may represent a part of the human anatomy similar to the first anatomical part. However, in accordance with various other embodiments of the present invention, the first anatomical part and the second anatomical part may be different from one another. A plurality of sensors may be provided within the second apparatus 502B referred to as novice sensors 504B. The novice sensors 504B may generate a novice sensor data sequence that may indicate at least in part an operation of the novice user. The second apparatus 502B may be connected to a second computing device 522B in a manner similar to the connection of the first apparatus 502A with the first computing device 522A.

The expert sensor data sequence and the novice sensor data sequence may be recorded and stored in a database such as the database 532A or 532B. In an embodiment, multiple databases may be utilized for storing the expert sensor data sequence and the novice sensor data sequence separately. The processing unit such as the processing unit 528A or 528B may further perform a comparison between the expert sensor data sequence and the novice sensor data sequence that may indicate effects of operations performed by the expert user and the novice user. In accordance with an embodiment, the comparison may be based at least in part on statistical analysis of the data sequences. The statistical analysis may generate a score that may associate a degree of similarity between the novice sensor data sequence and the expert sensor data sequence. In accordance with various other embodiments, several other types of comparison models and algorithms may be utilized to generate a score indicating a relationship between the novice sensor data sequence and the expert sensor data sequence.
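
Among the "other types of comparison models" left open here, dynamic time warping is a natural candidate when the novice performs the right pattern at the wrong tempo. The sketch below is a hedged toy implementation; the mapping from warped distance to a 0-100 score is an arbitrary illustrative choice.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences; small
    values mean similar shapes even at different speeds."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    D = np.full((len(a) + 1, len(b) + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[len(a), len(b)]

def dtw_score(novice, expert):
    """Map the warped distance onto a 0-100 similarity score."""
    return 100.0 / (1.0 + dtw_distance(novice, expert))

expert = [0.1, 0.5, 0.9, 0.5, 0.1]
novice = [0.1, 0.2, 0.5, 0.9, 0.8, 0.4, 0.1]  # same shape, slower
print(round(dtw_score(novice, expert), 1))
```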

In embodiments, the score may be stored in an account associated with the novice user. The account may include a plurality of scores based at least in part on a plurality of novice sensor data sequences derived from prior usage sessions with a plurality of apparatuses. This may also assist the novice user in comparing his previous performances. In an embodiment, the plurality of scores may be statistically combined to create a master score for the novice user summarizing at least in part an overall performance with at least one type of apparatus such as 502A or 502B. Similarly, in other embodiments, a cumulative performance for various associated apparatuses representing one specific kind of action, such as sexual therapy, may be generated. This may assist in recognizing one's status, potential areas of expertise and weaknesses, and the like. The score may be utilized to train the novice user to recognize faults and discrepancies and accordingly modify his operations to conform to the expert user's tactics. Such training may be helpful for couples, where one partner may act as an expert user and teach the other preferred methods and actions. The various parts of the human anatomy represented by the second apparatus 502B may be presented on the display 540B of the second computing device 522B through a graphic user interface 542B. Similarly, in other embodiments, various parts of the human anatomy represented by the first apparatus 502A may be presented on the display 540A of the first computing device 522A through a graphic user interface 542A. In yet other embodiments, a single graphic user interface such as 542A or 542B may be utilized to represent relevant parts of the human anatomy.

In embodiments, the expert maneuver may be a reflexology maneuver, massage maneuver, physical therapy maneuver, occupational therapy maneuver, chiropractic maneuver, medical maneuver, rehabilitation maneuver, sexual maneuver, and the like. The massage maneuver may be an arm massage maneuver, a leg massage maneuver, a foot massage maneuver, and the like. The sexual maneuver may be an oral sexual maneuver, a digital sexual maneuver, a maneuver related to copulation, and the like. In embodiments, the expert user may be a masseuse, physician, physical therapist, occupational therapist, sex therapist, rehabilitation expert, paid performer such as a sexual performer, and the like, without limitations.

In embodiments, a depiction of the region of the part of human anatomy from which the novice sensor data sequence is received may be presented within the graphic user interface 542B. Further, the score for the performance of the novice user may be included within the depiction on the graphic user interface 542B.

In an embodiment, a feedback may be provided to the novice user based on his performance in light of the standards set by the operation and performance of the expert user. The feedback may be provided by a feedback indicator that may be included within the graphic user interface such as 542A or 542B. In other embodiments, the feedback may be generated through emails, SMS, or various other means in case the novice user prefers privacy. In an embodiment, the feedback indicator may be a visual comparison of the expert sensor data sequence and the novice sensor data sequence. In another embodiment, the feedback indicator may be a text tutorial relating to the expert maneuver. In yet another embodiment, the feedback indicator may be a multimedia tutorial relating to the expert maneuver. In still another embodiment, the feedback indicator may be a skill rating for at least one of a plurality of skills relating to the expert maneuver. In embodiments, the feedback indicator may be based at least in part on providing feedback using the second apparatus 502B. Further, in an embodiment, the feedback using the second apparatus 502B may be based at least in part on lighting an area of the apparatus 502B. In another embodiment, the feedback using the second apparatus 502B may be based at least in part on vibrating an area of the apparatus 502B. In still another embodiment, the feedback using the second apparatus 502B may be based at least in part on heating an area of the apparatus 502B. In still another embodiment, the feedback using the second apparatus 502B may be based at least in part on deforming an area of the apparatus 502B. In yet another embodiment, the feedback using the second apparatus 502B may be based at least in part on cooling an area of the apparatus 502B.
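
The several feedback modalities above suggest a small dispatch layer on the second apparatus. The sketch below is hypothetical throughout: send_command stands in for whatever transport the apparatus actually uses, which the description does not specify.

```python
FEEDBACK_ACTIONS = {"light", "vibrate", "deform", "heat", "cool"}

def send_command(region, action, level):
    # Placeholder transport; a real apparatus would receive this
    # over its communication channel.
    print(f"apparatus: {action} region '{region}' at level {level}")

def give_feedback(region, action, level=1.0):
    """Actuate one feedback modality on one region of the apparatus."""
    if action not in FEEDBACK_ACTIONS:
        raise ValueError(f"unsupported feedback action: {action}")
    send_command(region, action, level)

# Draw the novice's attention to the heel with a gentle vibration.
give_feedback("heel", "vibrate", level=0.4)
```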

In embodiments, the feedback indicator may be provided to the novice user in substantial real-time relative to the receipt of the novice sensor data sequence.

In accordance with an embodiment of the present invention, referring to FIG. 6, a method 600 may be provided. The method 600 may start at step 602. At step 604, an expert sensor data sequence may be received from an expert user. The expert sensor data sequence may be received using a first apparatus 502A that may represent a part of the human anatomy in which a plurality of sensors may be contained. The expert sensor data sequence may derive at least in part from the expert physically manipulating the device as part of performing an expert maneuver. The apparatus 502A may be connected to a first computing device 522A. At step 608, the expert sensor data sequence may be recorded and stored in a database 514A. Further, at step 610, a novice sensor data sequence may be received from a novice user that may use a second apparatus 502B. The second apparatus may represent a similar part of the human anatomy as the first apparatus 502A and may likewise contain a plurality of sensors. The second apparatus 502B may be connected to a second computing device 522B. At step 612, the novice sensor data sequence may be recorded and stored in the database 514A. Furthermore, at step 614, the novice sensor data sequence may be compared to the expert sensor data sequence based at least in part on a statistical analysis of the data sequences. The statistical analysis may result in a score associated with a degree of similarity between the novice sensor data sequence and the expert sensor data sequence. At step 618, the part of the human anatomy represented by the second apparatus 502B may be depicted on a display of the second computing device 522B. Further, at step 620, a region of the part of the human anatomy from which the novice sensor data sequence was received may be presented within the graphic user interface. The graphic user interface may include the score and a feedback indicator. Finally, the method 600 terminates at 622.

Referring to FIG. 7, a plurality of apparatuses such as a first apparatus 702A, a second apparatus 702B, and a third apparatus 702C may be communicatively coupled to a plurality of computing devices such as a first computing device 722A, a second computing device 722B, and a third computing device 722C.

The first apparatus may include one or more expert sensors 704A, memory 708A, processing unit 710A, electronic circuitry 712A, analog-to-digital convertor 718A, a communication channel 720A, and a database 714A. As described elsewhere herein, databases may be local or remote to the apparatuses and/or computing devices. Similarly, the second apparatus may include one or more novice sensors 704B, memory 708B, processing unit 710B, electronic circuitry 712B, analog-to-digital convertor 718B, a communication channel 720B, and a database 714B. Likewise, the third apparatus may include one or more novice sensors 704C, memory 708C, processing unit 710C, electronic circuitry 712C, analog-to-digital convertor 718C, a communication channel 720C, and a database 714C.

The first computing device 722A may include analog-to-digital convertor 730A, memory 724A, processing unit 728A, electronic circuitry 734A, communication channel 738A, and display 740A with a GUI 742A. Similarly, the second computing device 722B may include analog-to-digital convertor 730B, memory 724B, processing unit 728B, electronic circuitry 734B, communication channel 738B, and display 740B with a GUI 742B. Likewise, the third computing device 722C may include analog-to-digital convertor 730C, memory 724C, processing unit 728C, electronic circuitry 734C, communication channel 738C, and display 740C with a GUI 742C.

In this exemplary scenario, two users may be referred to as a first novice user and a second novice user as represented in FIG. 7. In addition, an expert user may be provided. The expert user may be associated with a first apparatus 702A. The first novice user may be associated with a second apparatus 702B and a second novice user may be associated with a third apparatus 702C as represented by FIG. 7. The first apparatus 702A may be connected with the first computing device 722A, the second apparatus 702B may be connected to the second computing device 722B and the third apparatus 702C may be connected to the third computing device 722C. Various modes of connecting the apparatuses with the computing devices have been described in conjunction with FIG. 1. A plurality of sensors such as 704A, 704B and 704C may be mounted within the apparatuses.

In embodiments, an expert sensor data sequence may be received from an expert user using the first apparatus 702A representing a part of the human anatomy. The expert sensor data sequence may be received with the help of sensors 704A mounted within the first apparatus 702A. Further, the expert sensor data sequence may derive information at least in part from the expert physically manipulating the device as part of performing an expert maneuver. Similarly, a first novice sensor data sequence may be received from the first novice user using the second apparatus 702B representing a similar part of the human anatomy as represented by the first apparatus 702A. The first novice sensor data sequence may be received with the help of sensors 704B mounted within the second apparatus. Likewise, a second novice sensor data sequence may be received from the second novice user using the third apparatus 702C representing a similar part of the human anatomy as represented by the first and second apparatuses. The second novice sensor data sequence may be received with the help of sensors 704C mounted within the third apparatus 702C. The expert sensor data sequence, first novice sensor data sequence and the second novice sensor data sequence may be recorded and stored in a database such as 732A, 732B or 732C. In an embodiment, multiple databases may be utilized for storing the expert sensor data sequence and the novice sensor data sequences separately.

The processing unit such as 728A, 728B or 728C may further compare the first novice sensor data sequence and the second novice sensor data sequence to the expert sensor data sequence, which may indicate effects of operations performed by the expert user, the first novice user and the second novice user. In accordance with an embodiment, the comparison may be performed based at least in part on statistical analysis of the data sequences. The statistical analysis may generate a first score that may associate a degree of similarity between the first novice sensor data sequence and the expert sensor data sequence. Similarly, the statistical analysis may generate a second score that may associate a degree of similarity between the second novice sensor data sequence and the expert sensor data sequence. In accordance with various other embodiments, several other types of comparison models and algorithms may be utilized to generate a score indicating a relationship between the novice sensor data sequences and the expert sensor data sequence.

Various parts of the human anatomy represented by the first, second or third apparatuses 702A, 702B and 702C respectively may be presented on the display of at least one of the first, second and third computing devices 722A, 722B, and 722C through a graphic user interface such as 742A, 742B or 742C. For example, in a scenario, a part of the human anatomy that is represented by the second apparatus 702B may be presented on a display of at least one of the first and second computing devices such as 740A or 740B, using a graphic user interface such as 742A or 742B. Similarly, in other embodiments, various parts of the human anatomy represented by the apparatuses such as 702A, 702B or 702C may be presented on the display such as 740A, 740B or 740C of the computing devices such as 722A, 722B or 722C through a graphic user interface such as 742A, 742B or 742C. In an embodiment, a single graphic user interface such as 742A, 742B or 742C may be utilized to represent relevant parts of the human anatomy. In embodiments, a depiction of the part of human anatomy from which the first novice sensor data sequence and the second novice sensor data sequence is received may be presented within the graphic user interface. Further, the score for the performance of the novice user may be included within the depiction of the graphic user interface. The performance based score has been described previously in detail.

In an embodiment, a feedback may be provided to the first novice user and the second novice user based on their performance in light of the standards set by the operation and performance of the expert user. A feedback may be provided by a feedback indicator that may be included within the graphic user interface. The feedback indicator has been described previously in detail.

In accordance with an embodiment of the present invention, referring to FIG. 8, a method 800 may be provided for associating the apparatuses with an expert user and multiple novice users. In embodiments, a first apparatus such as represented in FIG. 7 as the apparatus 702A may be associated with the expert user. Similarly, the first novice user may be associated with the second apparatus 702B and the second novice user may be associated with the third apparatus 702C. The apparatuses may represent an anatomical feature of the human body such as a foot, hand, leg, arm, back, face, and the like. Expert sensors 704A may be mounted within the first apparatus 702A, and first novice sensors 704B and second novice sensors 704C may be mounted within the second apparatus 702B and the third apparatus 702C, respectively.

The method starts at step 802. At step 804, an expert sensor data sequence determined by the expert sensors 704A may be received from the expert user using the first apparatus 702A. The expert sensor data sequence may be recorded and stored at step 808 in a database such as 732A. At step 810, a first novice sensor data sequence may be received from a first novice user using a second apparatus 702B. The first novice sensor data sequence may be recorded and stored at step 812 in a database such as 732B. At step 814, a second novice sensor data sequence may be received from a second novice user using a third apparatus 702C. The second novice sensor data sequence may be recorded and stored at step 818 in a database such as 732C. The first novice sensor data sequence and the second novice sensor data sequence are compared to the expert sensor data sequence at step 820. In accordance with an embodiment, the comparison may be performed based at least in part on statistical analysis of the data sequences.

At step 822, a part of the human anatomy that is represented by the second apparatus 702B may be presented on a display of at least one of the first and second computing devices such as 740A or 740B, using a graphic user interface such as 742A or 742B. Similarly, in other embodiments, various parts of the human anatomy represented by the apparatuses such as 702A, 702B or 702C may be presented on the display such as 740A, 740B or 740C of the computing devices such as 722A, 722B or 722C through a graphic user interface such as 742A, 742B or 742C.

At step 824, a depiction of the region of the part of human anatomy from which the first novice sensor data sequence and the second novice sensor data sequence is received may be presented within the graphic user interface. Further, the score for the performance of the novice users may be included within the depiction on the graphic user interface.

Referring to FIG. 9, an apparatus 902 may include a plurality of novice sensors 904, memory 912, processing unit 914, database 918, analog-to-digital convertor 920, electronic circuitry 922, and a communication channel 924. The apparatus 902 may be communicatively coupled to the computing device 928, which may include memory 930, processing unit 932, analog-to-digital convertor 934, electronic circuitry 940, communication channel 942, and display 944 with a GUI 948.

In accordance with another embodiment, a data sequence representing a human's physical interaction with a part of the human anatomy may be programmed in software code. The data sequence may represent at least in part a performance of an anatomical maneuver. Further, the programmed data sequence may be stored in a database 938. A novice user may be associated with an apparatus 902 that may represent a part of the human anatomy. In embodiments, the part of the human anatomy represented by the apparatus 902 associated with the novice user may be similar to the part of the human anatomy whose interaction is programmed in the software code.

In embodiments, a novice sensor data sequence may be received from a novice user using the apparatus 902. The novice sensor data sequence may be received with the help of sensors 904 mounted within the apparatus 902. The apparatus 902 may be connected to a computing device 928 through various modes of communications and networking as described in conjunction with FIG. 1 in detail. The novice sensor data sequence may be stored in the database 938. In other embodiments, the novice sensor data sequence and the programmed data sequence may be stored in separate databases.

The processing unit 932 may make a comparison between the programmed data sequence and the novice sensor data sequence. In accordance with an embodiment, the comparison may be based at least in part on statistical analysis of the data sequences. The statistical analysis may generate a score that may associate a degree of similarity between the novice sensor data sequence and the programmed data sequence. In accordance with various other embodiments, several other types of comparison models and algorithms may be utilized to generate a score indicating a relationship between the novice sensor data sequences and the programmed data sequence.
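
As a hedged sketch of one such statistical comparison (the specification mandates no particular method), the following Python fragment resamples the novice sequence to the programmed sequence's length and maps their Pearson correlation onto a 0-100 similarity score:

```python
import numpy as np

def similarity_score(programmed, novice):
    """One possible statistical comparison (illustrative, not the
    specification's mandated method): resample to a common length,
    then map Pearson correlation onto a 0-100 score."""
    programmed = np.asarray(programmed, dtype=float)
    novice = np.asarray(novice, dtype=float)
    # Resample the novice sequence to the programmed sequence's length.
    x_old = np.linspace(0.0, 1.0, len(novice))
    x_new = np.linspace(0.0, 1.0, len(programmed))
    novice = np.interp(x_new, x_old, novice)
    r = np.corrcoef(programmed, novice)[0, 1]
    return max(0.0, r) * 100.0   # clamp anti-correlation to zero

expert = [0.0, 0.4, 0.8, 0.6, 0.2]
novice = [0.0, 0.3, 0.9, 0.5, 0.1, 0.0]
print(round(similarity_score(expert, novice)))  # high score: close match
```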

Various parts of the human anatomy represented by the apparatus 902 may be presented on the display 944 of the computing device 928 through a graphic user interface 948. In an embodiment, a single graphic user interface such as 948 may be utilized to represent relevant parts of the human anatomy. In other embodiments, multiple graphic user interfaces may be utilized to represent the programmed data sequence and the novice sensor data sequence separately.

In embodiments, a depiction of the part of the human anatomy from which the novice sensor data sequence is received may be presented within the graphic user interface such as 948. Further, the score for the performance of the novice user and a feedback indicator may be included within the depiction on the graphic user interface 948. The performance-based score and the feedback indicator have been described previously in detail. In embodiments, the feedback indicator may be provided to the novice user in substantially real time relative to the receipt of the novice sensor data sequence.

In embodiments, the score may be stored in an account associated with the novice user. The account may include a plurality of scores based at least in part on a plurality of novice sensor data sequences derived from prior usage sessions with a plurality of apparatuses. This may also assist the novice user in comparing his or her current performance with previous performances. In an embodiment, the plurality of scores may be statistically combined to create a master score for the novice user summarizing at least in part an overall performance with at least one type of apparatus. Similarly, in other embodiments, a cumulative performance measure may be generated across associated apparatuses representing one specific kind of activity, such as sexual therapy. This may assist in recognizing one's status, potential, areas of expertise, weaknesses, and the like. The score may be utilized to train the novice to recognize faults and discrepancies and to modify his or her operations accordingly, in line with the standards set by the programmed data sequence. Such training may be helpful where individuals are hesitant to attend training sessions led by experts and may therefore find the programmed version advantageous.
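
A minimal sketch of one way the plurality of scores might be statistically combined into a master score; the recency weighting is an assumption chosen for illustration, not a method prescribed by the specification:

```python
def master_score(session_scores, recency_weight=1.25):
    """Hypothetical statistical combination of per-session scores into
    a master score: a weighted mean favoring more recent sessions."""
    weights = [recency_weight ** i for i in range(len(session_scores))]
    total = sum(w * s for w, s in zip(weights, session_scores))
    return total / sum(weights)

# Scores from oldest to newest session; recent sessions count more.
print(round(master_score([55, 62, 70, 78])))  # ~68
```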

In accordance with an embodiment of the present invention, referring to FIG. 10, a method 1000 may be provided for programming and recording a sensor data sequence. The method 1000 starts at 1002. At step 1004, a data sequence may be programmed in software code. The data sequence may represent a human's physical interaction with a part of human anatomy. The data sequence may represent at least in part a performance of an anatomical maneuver. At step 1008, the programmed data sequence may be stored in a database 918. Further, at step 1010, a novice sensor data sequence may be received from a novice user using an apparatus 902. The apparatus 902 may represent the part of the human anatomy, in which a plurality of sensors 904 may be included.

At step 1012, the novice sensor data sequence may be recorded and stored in the database 938. At step 1014, the novice data sequence may be compared with the programmed data sequence. The comparison may be performed using statistical methods as described in conjunction with FIG. 9. At step 1018, a part of the human anatomy represented by the apparatus 902 may be presented on the display 944 of a computing device 928 through a graphic user interface 948.

At step 1020, a depiction of the part of human anatomy from which the novice sensor data sequence is received may be presented within the graphic user interface such as 948. The method terminates at step 1022.

FIG. 11 is an exemplary screenshot of a graphic user interface 1100 in accordance with various embodiments of the present invention. In embodiments, the graphic user interface 1100 may depict information on a display corresponding to the effect of sensors that are mounted on an apparatus. For example, a touch at the center of the foot may initiate a response; the magnitude and accuracy of the touch may be analyzed and displayed within the graphic user interface 1100. Instructions corresponding to these parameters may be provided as feedback to the user. Likewise, a touch on the tip of the toe may initiate a response that may be recorded, and the analysis of the accuracy and effectiveness of the touch may be displayed as well.

In embodiments, the graphic user interface 1100 may, for example, monitor and depict effects from a tactile sensor 1104, pressure sensor 1108, and humidity sensor 1110, as depicted in FIG. 11. The graphic user interface 1100 may further include three portions of the display corresponding to the details monitored and analyzed for the tactile sensor 1104, pressure sensor 1108, and humidity sensor 1110. These may include details of the tactile sensor 1112, details of the humidity sensor 1114, and details of the pressure sensor 1118, covering current status, level, or performance; ideal status, level, or performance; and suggestions for moving from the current level to the ideal level. For example, the humidity sensor 1110 may recognize that the level of humidity is low and may show the current level of humidity. In addition, the details corresponding to the humidity sensor 1114 may offer the user suggestions, such as applying a lubricant to the relevant parts for full effect and enhanced performance. Similarly, various other details for several types of sensors may be depicted on the graphic user interface 1100.

An apparatus 102 simulating a part of human anatomy may be used on a variety of gaming platforms, such as: PLAYSTATION 2, PLAYSTATION 3, PLAYSTATION PORTABLE, manufactured by Sony Corporation; GAMECUBE, GAMEBOY, GAMEBOY ADVANCE, or WII, manufactured by Nintendo Corporation; or XBOX or XBOX 360, manufactured by Microsoft Corporation. The apparatus 102 may also be used on gaming platforms comprising a personal computer or a cellular telephone, smart phone or some other computing device.

Although described below in connection with a simulated foot, the apparatus 102 may simulate any of a variety of human anatomical parts such as: hand, head, face, genitalia, or other body parts. These apparatuses 102 would be similarly outfitted with sensors 104.

Referring now to FIG. 12, a screenshot of one possible embodiment of a game-style environment for use with an apparatus in the shape of a part of human anatomy is provided. In the illustration, the human anatomical part in question is a foot, which is depicted on screen 1202. An on-screen indicator 1204 may show the user where to press on the apparatus and the user may respond accordingly. A score 1208 may be shown to reward the user's adeptness at using the apparatus. Secondary indicators 1210 may be used to show other areas to be touched, sensitive areas to be avoided, or other conditions that the user may be made aware of.

Now referring to FIG. 13, one embodiment of an apparatus 1300 in the shape of human anatomy is shown being held by the user. Again, the anatomical part in question may be a foot in this particular embodiment. The body 1302 of the apparatus may be manipulated by the user by pressing, stroking, pinching, or some other kind of physical interaction. These manipulations may then be read by various sensors and interpreted by a signal processor in the base of the apparatus 1308. The information may then be transmitted to a computer or game system via a cable, wireless connection, or some other means. Although a USB port 1304 is shown in this particular embodiment, the apparatus may transmit data via some other kind of connection as well or wirelessly.

FIG. 14 shows a side view of a similar embodiment of the apparatus 1300 as shown in FIG. 13. The body of the simulated foot 1302 may be made of some kind of safe material. The signal processing electronics that send sensor data to the computer or game system may be housed in an enclosure 1310 at the base of the apparatus. An ergonomic hand-grip 1308 may be used to hold the apparatus. In this particular configuration, signals are passed to the computer or game system via a USB port 1304.

Referring now to FIG. 15, an embodiment of a simulated foot apparatus 1500 for use with a video game or educational tool is depicted. The body of the simulated foot 1302 is made of some kind of safe material and is outfitted with a sensor suite comprising position sensors 1502 and pressure sensors 1504. Although ten sensors are shown in the figure, the apparatus may have any number of sensors. Also, although the apparatus shown has two different types of sensors, the device may contain rotary position sensors, contact sensors, temperature sensors, torsion sensors, capacitance sensors, or some other kind of sensor. The apparatus may also have only one type of sensor. Moreover, the positioning of the sensors may be different from that shown in the figure. Feedback may also be given to the user via a vibrating motor or some other indicator to alert the user that he is performing well or poorly, or to indicate some other game condition. The apparatus shown here has no base, but uses a USB port 1304 on the rear of the apparatus to transmit sensor data to the computer or game system.

Now referring to FIG. 16, an embodiment of a simulated hand apparatus 1600 for use with a video game or educational tool is depicted. The body of the simulated hand 1600 is made of some kind of safe material and is outfitted with a sensor suite comprising position sensors 1502 and pressure sensors 1504. Although twelve sensors are shown in the figure, the apparatus may have any number of sensors. Also, although the apparatus shown has two different types of sensors, the device may contain rotary position sensors, contact sensors, temperature sensors, torsion sensors, capacitance sensors, or some other kind of sensor. The apparatus may also have only one type of sensor. Moreover, the positioning of the sensors may be different from that shown in the figure. The apparatus shown here uses a USB port 1304 to transmit sensor data to the computer or game system.

In any configuration and shape of the apparatus 1600, each sensor feeds a stream of data to the game platform indicating the state of the sensor at that time. The data may be sampled at any frequency within the hardware's limitations, but will most likely be sampled at about standard video rates of 30 Hz. Each sensor's state may be used in conjunction with one or more of the other sensors to provide information on how the user is manipulating the apparatus 1600, and the combined readings may also be used to interpolate information between sensor locations, as sketched below.
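
The following Python sketch illustrates such a polling loop at roughly 30 Hz; the read_sensors stub and the sensor names are hypothetical placeholders for the hardware interface, not part of the specification:

```python
import time

SAMPLE_HZ = 30.0  # roughly standard video rate, per the description

def read_sensors():
    """Placeholder for the hardware read; returns one value per sensor."""
    return {"thumb_tip": 0.1, "thumb_mid": 0.7, "thumb_base": 0.2}

def poll_loop(duration_s=1.0):
    """Sample every sensor at ~30 Hz and derive a combined reading
    (here, the dominant sensor) as a sketch of using each sensor's
    state in conjunction with the others."""
    period = 1.0 / SAMPLE_HZ
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        state = read_sensors()
        dominant = max(state, key=state.get)  # where interaction centers
        # ...stream `state` and `dominant` to the game platform here...
        time.sleep(period)
```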

A mechanical or electrical “tilt sensor” may be included to monitor position and orientation of the apparatus. This sensor may be a mercury reed switch, an accelerometer, or some other kind of device for detecting tilt, position, or orientation. This information may be used in game play to determine how the user is holding the apparatus 1600 or if he is rolling it around as one might do to relieve stress from an ankle or wrist. Other secondary techniques of interaction with the apparatus 1600 may include shaking or slapping the apparatus 1600.
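
As one possible realization of an accelerometer-based tilt sensor, the following sketch estimates pitch and roll from a static accelerometer reading; the axis conventions are assumptions, not drawn from the specification:

```python
import math

def tilt_from_accelerometer(ax, ay, az):
    """Estimate pitch and roll in degrees from a static 3-axis
    accelerometer reading -- one way a tilt sensor could report how
    the apparatus is being held. Axis conventions are assumptions."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Gravity straight down the z-axis: the apparatus is lying flat.
print(tilt_from_accelerometer(0.0, 0.0, 1.0))  # pitch ~ 0, roll ~ 0
```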

The body of the apparatus 1600 may be made of a non-toxic or medical grade compound that will somewhat simulate the elasticity and hardness of human flesh. The compound can be urethane, silicone, latex, or some other compound that meets regulatory safety requirements. Bonelike structures may be embedded in the apparatus 1600 to provide rigidity and a more realistic feel as well as support and backing for the sensors or wire feeds. These simulated bones may be made of metal, plastic, or some other material that provides the necessary characteristics.

In an example embodiment, a foot may be depicted visually on a display within a GUI, such as a screen, that represents, at least in part, a map of the physical apparatus representing the part of human anatomy. The on-screen foot may glow in an area where a user is supposed to press, and the glowing region may travel to indicate a motion that the user must follow in order to score well. The glowing region may change in color or size to cue the user to different techniques or circumstances, such as increasing or decreasing the pressure exerted. The user may have to perform actions involving particular motions, varying pressures, moving or flexing the controller, or some other actions capable of being detected or interpolated by the sensors.

A tilt-sensor of the apparatus may be used during game play. In one specific embodiment, the on-screen body part may move out of center, rotate, or display a change in some other manner and the user must move the apparatus to the position indicated. This could be used to simulate the on-screen massage subject moving his or her foot causing the user to have to move his hands to compensate. Failure to do so could result in loss of points or perhaps a simulated pressure felt by the on-screen subject. Such a result could be harmful or beneficial depending on the particular circumstance.

The tilt sensor may also be used to sense rolling motions, flexure, or other manipulations.

During gameplay or during training, the user may use simple motions or combinations of motions that may then be combined into more complex manipulations. The on-screen instructions may start out simply and progress in difficulty using combinations of these simple or more elaborate manipulations.

The game, therapy, or training tool may include variations of the on-screen subject or the subject's preferences, which would simulate the variety of preferences of human subjects during physical interactions. Variations in the speed of motion, pressure applied, or some other parameter may be introduced into the game or training tool, adding more realism to the simulation. In the case of a training tool or therapy aid, this may be useful in making the user aware of these variations and thus tailor the manipulations, maneuvers, or procedures to the preferences of actual human subjects.

The motions depicted on-screen as cues to the user may be preprogrammed as part of the software or may be “taught” by one user to another. In this scenario, a user may perform certain actions manipulating the apparatus, and these actions may be captured by the sensors and recorded. This series of motions, or a variant thereof, may then be used as a stage in the game during which a user must mimic the motions by following cues from the game, to train the user in these maneuvers, or for some other implementation.
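
A minimal sketch of this capture-and-replay idea, assuming timestamped sensor events and a caller-supplied display callback (both hypothetical):

```python
import time

class ManeuverRecorder:
    """Sketch of 'teaching' a maneuver: capture timestamped sensor
    events while a user manipulates the apparatus, then replay them
    as on-screen cues for another user to mimic."""

    def __init__(self):
        self.events = []          # (elapsed_time, sensor_id, value)
        self._t0 = None

    def start(self):
        self._t0 = time.monotonic()
        self.events.clear()

    def capture(self, sensor_id, value):
        """Record one sensor event relative to the start time."""
        self.events.append((time.monotonic() - self._t0, sensor_id, value))

    def replay(self, show_cue):
        """Drive on-screen cues at the recorded timing; `show_cue` is a
        caller-supplied display callback (hypothetical)."""
        start = time.monotonic()
        for t, sensor_id, value in self.events:
            time.sleep(max(0.0, t - (time.monotonic() - start)))
            show_cue(sensor_id, value)
```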

Another embodiment of the game may be a “free-style” game wherein the user is not given cues but may be shown only a display of the body part simulated by the apparatus, indicating the sensors currently stimulated by the user. The user's performance may be judged according to how closely the user executes a pre-programmed, user-input, or algorithmically generated series of motions. The user's moves may be guided by visual, audio, tactile, or some other form of feedback.

An apparatus simulating a part of the human anatomy may be used as a human interface device for interaction with a computer in a fashion similar to a mouse, track pad, or some other human interface device.

FIG. 17 illustrates a graphic user interface (GUI) 1700 in accordance with an embodiment of the present invention. The GUI 1700 may include a plurality of panels such as a training panel 1702 and a performance panel 1704. The training panel 1702 and the performance panel 1704 may refer to the panels associated with an expert 1708 and a novice user 1710, respectively. Further, the training panel and the performance panel may depict a first apparatus 502A and a second apparatus 502B corresponding to the expert 1708 and the novice user 1710 respectively. It must be appreciated by a person ordinarily skilled in the art that though FIG. 17 and the corresponding description are indicative of two apparatuses and users (expert and novice user), more than two users and apparatuses may be utilized in accordance with various embodiments of the present invention.

The GUI 1700 may further include an expert pressure display facility 1712 and a novice user pressure display facility 1714 corresponding to the training panel 1702 and the performance panel 1704, respectively.

Referring still to FIG. 17, the apparatuses 502A and 502B are represented as a human foot. The expert's hand, representing the expert 1708, is shown on the training panel 1702 manipulating the first apparatus 502A, in the form of a foot, using the technique or maneuver being taught. The expert 1708 may be trained in demonstrating the correct maneuver for initiating a touch on a human organ or body part. A graph of actual pressure shows the timeline of pressure applied by the expert using his or her thumb on the first apparatus 502A. This graphical depiction of the pressure may be provided in the actual pressure graph depicted on the expert pressure display facility 1712. The novice user 1710 may observe the pressure applied to the first apparatus 502A, which demonstrates the maneuver; that is, in this case, massaging the foot. Further, the pressure applied at various instants of time may be depicted in a time-pressure curve as illustrated in the expert pressure display facility 1712. For example, the pressure applied at time T1 may correspond to pressure circle P1. Likewise, the pressure applied at time T2 may correspond to pressure circle P2.

In embodiments, the first pressure circle representing P1 may correspond to the current pressure level applied by the expert 1708 and the second pressure circle representing P2 may correspond to the pressure applied after a specified period of time. In an embodiment, the specified period of time may be input by the user through the GUI 1700.

The novice user's hand, representing the novice user 1710, is shown on the performance panel 1704 manipulating the second apparatus 502B in the form of a foot. A representation of the novice user's hands may be ghosted over the foot image, showing where the computer believes the novice user's hands to be. In accordance with this embodiment, the computer may give the novice user 1710 the benefit of the doubt, assuming that the novice user 1710 is holding the foot properly and using the correct digit. The novice user 1710 may be trained by observing the expert's 1708 demonstration of the correct maneuver for initiating a touch on a human organ or body part.

The novice user 1710 may apply pressure at the same location, in line with the action performed by the expert 1708. The novice user 1710 may apply the pressure at a point where the expert 1708 had previously applied pressure. The novice user pressure display facility 1714 may show the pressure-time graph of the novice user 1710 when pressure is applied at a particular point on the second apparatus 502B. A point on the pressure-time curve may depict the pressure P3 corresponding to a time such as T3. In an embodiment, the pressure P3 may represent the current pressure applied by the novice user 1710. In another embodiment, the pressure P3 may represent the pressure at a particular period of time.

In an embodiment, different pressure distributions may be portrayed on the pressure display units such as 1712 and 1714 using different colors that may link to the pressure in the vicinity of a particular point on the human foot. In other embodiments, users may utilize human parts other than foot such as palm, hand, back, breast, neck, head, vagina, penis, vulva, mouth and the like.

FIG. 18 illustrates a GUI 1800 in accordance with another embodiment of the present invention. The GUI 1800 depicts an apparatus 102 which may be in the form of a foot. Similarly, the apparatus may be crafted into other shapes depicting body parts and/or human organs such as palm, hand, back, breast, neck, head, vagina, penis, vulva, mouth and the like, without limitations. The GUI 1800 may further include a score display facility 1802, a first pressure display facility 1804, a second pressure display facility 1808, one or more status bars 1810, a feedback indicator facility 1812 and the like. The score display facility 1802 may indicate a score to the user; the score may relate to a pressure on the apparatus 102 applied by the user. In another embodiment, the user score may be depicted as a percentage of accuracy or in some other similar manner. The user score may depict the accuracy of the maneuver that is performed by a user at a particular time. In an embodiment, the score may be either qualitative or quantitative. For example, a score of 100 may correspond to 100 percent accuracy; the same score may instead be presented as ‘excellent’ in a qualitative scheme.

The first pressure display facility 1804 may show a moving pressure graph of the pressure that the user must apply on the apparatus 102, plotted with respect to time over the next few seconds of interaction, based on the expert's or trainer's programming of the technique being taught. Further, the first pressure display facility 1804 may also indicate an instructed area of interaction on the apparatus 102. A trail, glowing icon, or other similar indicator may be visibly provided and color-coded, denoting the actual position of the user's primary interaction on the apparatus 102. In a manner similar to the first pressure display facility 1804, the second pressure display facility 1808 may be provided on the GUI 1800. The second pressure display facility 1808 may denote an interaction of the user with the apparatus 102 on a second portion. The user's or expert's hands may be shown ghosted on the apparatus 102, such as a foot in this case, showing a representation of either instructed or actual interaction.

In embodiments, the one or more status bars 1810 may indicate an overall pleasure felt by the user, a gauge of accuracy for each of the user's means of interaction, and the like. In embodiments, the user may also review the overall pleasure previously recorded by the one or more status bars 1810 at a particular time, and may therefore compare his or her performance in terms of overall pleasure at various instances. The one or more status bars 1810 may also indicate various types of arousals and responses, such as certain interactions that may elicit a response of sexual arousal, as is purportedly the case when a reflexologist massages certain areas of the foot that are thought to be connected to erogenous zones. In an embodiment, instructions may be given to the user beforehand in a visual sequence, and the user may have to memorize a series of maneuvers and try to emulate them.

In embodiments, the feedback indicator 1812 may display feedback to the user after an interaction ends. As shown in FIG. 18, feedback indicating GREAT JOB may be shown on the feedback indicator 1812 upon a good performance during the interaction. This may motivate the user for future interactions.

FIG. 19 illustrates a graphic user interface (GUI) 1900 in accordance with yet another embodiment of the present invention. The GUI 1900 may include a first apparatus 502A representing a foot corresponding to an expert, and a second apparatus 502B representing a foot corresponding to a novice user. The GUI 1900 may further include a first expert pressure display facility 1902, a second expert pressure display facility 1904, a first novice user pressure display facility 1908, a second novice user pressure display facility 1910 and the like.

The first expert pressure display facility 1902 and the second expert pressure display facility 1904 may show moving pressure graphs of the pressure that the user must apply on the apparatus 502A, plotted with respect to time over the next few seconds of interaction, based on the expert's or trainer's programming of the technique being taught for two distinct regions of the body, respectively. Similarly, the first novice user pressure display facility 1908 and the second novice user pressure display facility 1910 may show an interaction of the novice user with the apparatus 502B on two distinct portions of the apparatus 502B, respectively.

In embodiments, one or more status bars (not depicted in FIG. 19) may be provided in a manner similar to those illustrated in conjunction with FIG. 18 without limiting the spirit and scope of the present invention.

FIG. 20 illustrates a GUI 2000 in accordance with still another embodiment of the present invention. Referring to FIG. 20, an apparatus 102 is illustrated in a perspective view of a foot for illustrative purposes. The GUI 2000 further depicts one or more icons such as 2002, 2004 and the like, indicating a portion of the apparatus 102 where the user may apply a primary means of interaction such as the left thumb, right thumb, palm, and the like. In embodiments, the expert's position of primary means of interaction, as well as the expert's or user's hand and the like, may be ghosted on the image of the apparatus 102. The one or more icons may move in an appropriate path to indicate user movements, while the ghosted icon may move in a path indicating instructor movements. The one or more icons 2002 and 2004 may vary in size, color, brightness, or other parameters to indicate pressure applied, current temperature, and the like.

In embodiments, the GUI 2000 may also include a score display facility 2008. The score display facility 2008 may indicate the user's current score or accuracy meter or other necessary means of judging the user performance.

Referring again to FIG. 1, in an embodiment, the hardware may be an apparatus 102 that may resemble and simulate the sensitivity of a human anatomical organ such as a human vulva, penis, hand, foot and the like. In another embodiment, the apparatus 102 may resemble the anatomical part of an animal. The apparatus 102 may be equipped with analog and/or digital sensors 104. It may also be equipped with analog and/or digital feedback components that may simulate physiological responses to stimuli and other physiological phenomena. In accordance with various embodiments of the present invention, sensors 104 may be of different types for detecting user inputs. The sensors 104 may be mechanical, electrical, or any other type, or any combination thereof. The user inputs may be positional, pressure-driven, humidity-driven, temperature-driven, capacitive, or some other type of input, or any combination thereof. Further, the feedback from the apparatus 102 may be audible, visual, tactile, or any combination thereof, or of some other type of feedback and/or instruction.

In accordance with various embodiments, the apparatus 102 may be available in a variety of shapes, each resembling a part or parts of the human anatomy. Further, the apparatus 102 may be available in different sizes to represent varying human sizes and forms in accordance with the diversity of the human form. The shape of the apparatus 102 may be designed by an artist, scanned or live cast from an actual person or existing simulated part, or have its form designed in some other way, without limitations. An individual body part may come in many different forms, shapes, colors, sizes and the like to represent the multitude of variations that exist in actual human form. These variations may also be fantastical, or of a type that does not typically occur in humans, such as green skin, a metallic robotic form, cartoonish features, greatly over-sized proportions, etc.

In an embodiment, the apparatus 102 may be in the shape of a human hand. This may be used for various types of training including training in massage therapy, physical therapy, reflexology, or any other application for which such an apparatus 102 may be useful, without limitations. In an embodiment, the apparatus 102 may be incorporated into a robotic system to aid in human-robot interaction.

In another embodiment, the apparatus 102 may be in the shape of a human foot. This may be used for various types of training including training in massage therapy, physical therapy, reflexology, or any other application for which the apparatus 102 may be useful, without limitations. In an embodiment, the apparatus 102 may also be incorporated into a robotic system to aid in human-robot interaction.

In yet another embodiment of the present invention, the apparatus 102 may be in the form of a human penis. The penis may also have the scrotal/testicular region attached. This apparatus 102 may be used as a personal education tool, for entertainment purposes, as a sex therapy aid, as an instructional tool, as a couples therapy tool to facilitate communication about sexual matters, or any other application for which such an apparatus 102 may be useful. Likewise, the apparatus 102 may be incorporated into a robotic system to aid in human-robot interaction.

In still another embodiment of the present invention, the apparatus 102 may be in the form of a human vulva. In this embodiment, the apparatus 102 may be used as a personal education tool, for entertainment purposes, as a sex therapy aid, as an instructional tool, as a couples therapy tool to help facilitate communication about sexual matters, or any other application for which such an apparatus 102 may be useful. Further, the apparatus 102 may also be incorporated into a robotic system to aid in human-robot interaction.

In accordance with various other embodiments, certain other possible forms for the apparatus 102 may include, without limitations, a human finger, toe, thumb, knee, ankle, leg, buttocks, belly, thigh, abdomen, chest, arm, elbow, wrist, back, face, neck, shoulder, or any other region of human anatomy or combinations thereof. The apparatus 102 may even be designed in the form of a liver, kidney, heart, brain, spleen, intestine, lung, eye, sinus, or other human organ or component or combination thereof. Each form of the apparatus 102 may involve other regions of the human anatomy, skeletal structure, or other features; for example, an apparatus 102 in the shape of the human back may include a simulated spine, a region of the buttocks embedded with simulated kidneys, or any other feature necessary for the desired application. Such applications may include, without limitations, training in the chiropractic art, massage therapy, surgical science, entertainment, or any other application for which such an apparatus 102 may be useful. Various shapes of the apparatus 102 could also be used for implantation into new or existing types of medical training dummies or game controllers for medical examination and/or surgical training. In an embodiment, any form of the apparatus 102 may also be used as an element of robots or larger assemblies. The apparatus 102 may also be designed so as to be modular or otherwise easily connectable with other similar apparatuses 102.

In accordance with various embodiments, different types of materials may be utilized for the apparatus 102. The apparatus 102 may be made of non-toxic materials that may be deemed safe for their intended purposes. In an embodiment, medical and/or surgical grade materials may be used if necessary. For example, a part that may come in contact with the mouth may be constructed from materials that are deemed safe for oral use, and parts that may come in contact with the skin may be constructed from materials that are deemed safe for skin contact.

In accordance with an embodiment of the present invention, the outer skin of the apparatus 102 may be made of a non-toxic material with texture, hardness, and elasticity roughly similar to those of human flesh for the part to be simulated during the application. The material may vary in properties across the surface of the part to better conform to variations in the properties of actual human body parts or to better facilitate functionality. The material may be resilient enough to withstand the usage demands for which it is designated. The skin of the apparatus 102 may be provided with embedded sensors 104 to allow proper detection of intended user input. Some materials that may be effective for use as the skin of the apparatus 102 are silicone elastomer, urethane elastomer, Buna-N, Viton, or some other non-toxic elastomer, without limitations.

In another embodiment, the apparatus 102 may also include a skeletal structure to more accurately represent the body part. In an exemplary scenario, the skeletal material may be non-toxic. In another scenario, the skeletal material may be toxic so long as the components made of that material are embedded within the non-toxic skin of the apparatus 102. The skeletal components would most likely possess rigidity and elasticity similar to actual human skeletal parts. In an embodiment, the skeletal components may be connected by rigid connectors. In another embodiment, the skeletal components may be connected with some type of flexible cartilage-simulation material. The skeletal parts may also be equipped with sensors in order to detect stimuli that may cause a positive or negative reaction, such as proper or improper adjustment of a subluxation, or bending of bones to the point of pain or breakage.

In an embodiment, the housing of the apparatus 102 may contain electronic components and connections. The housing may also provide a firm grip for the entire game controller to prevent false readings due to conditions that may be caused by contact with the skin of the game controller and the like. The housing may also be safe and non-toxic; however, these constraints may be relaxed to a certain extent. The housing may contain digital sensors 104 in the form of buttons for purposes such as game menu navigation. The housing may also contain other sensors 104 to detect changes in temperature or humidity.

In accordance with various embodiments, the apparatus 102 may use sensors 104 to detect particular behaviors of the user. These sensors 104 may be analog, digital, and the like, without limitations. Sensors 104 may be embedded within the silicone skin, attached to skeletal or structural elements, or placed on the exterior surface of the housing of the apparatus 102. In another scenario, sensors 104 may be attached in other ways, whether electronic or physical. In an embodiment, a keyboard of the computer or some other external human interface may be used as a sensor 104 in an auxiliary fashion. Sensors 104 may be spaced on the surface of the apparatus 102 in such a way as to detect user interaction with a resolution that is fine enough to properly interact with the software in use.

Different shapes may require different numbers of sensors 104 as well as alternate placements. Further, there may be different grades of the apparatus 102 that may also require different numbers, types, and placements of the sensors 104. For example, a hand-shaped apparatus 102 designed for professional reflexology may include over 100 sensors 104 covering all the various pressure points relevant to the study. By contrast, a model of the same shape that is intended for home use or entertainment purposes may include fewer sensors 104, covering only the major areas of interest. It must be appreciated by a person ordinarily skilled in the art that the apparatus 102 may contain any combination of sensor types, grades, etc. These sensors 104 may be simple buttons or switches, or may sense resistance, force, pressure, humidity, temperature, stress, strain, torsion, capacitance, sound, light or other electromagnetic radiation of any wavelength, or any other condition that may provide data relevant to the intended application, without limitations. The sensors 104 may be analog, digital, mechanical, or any combination thereof.

In an embodiment, the sensors 104 may be potentiometric or resistive. A potentiometer is essentially a variable resistor that alters electrical conductance when acted upon in certain ways. Potentiometers may be used for detection of position, torsion, stress, strain, and similar conditions, without limitations. Several types of potentiometers may be utilized as sensors 104. Thin-film potentiometers in linear, rotary, or some other form may be used to determine position, rotation, or other effects imparted by an object such as a finger, tongue and the like. Knob-style potentiometers may be used to detect twisting, stress, strain, and the like, or to monitor in-game controls, menu navigation, or similar variables, without limitations. Resistive sensing may be employed in potentiometers in the form of resistive arrays, resistive switches, and the like.
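
For illustration, a linear thin-film potentiometer read through an analog-to-digital converter might be mapped to a contact position as follows; the converter resolution and strip length are assumed values, not taken from the specification:

```python
ADC_MAX = 1023           # 10-bit converter, an assumption for illustration
STRIP_LENGTH_MM = 120.0  # active length of a linear thin-film potentiometer

def position_mm(adc_reading):
    """Map a raw ADC reading from a linear thin-film potentiometer to a
    contact position along the strip, assuming a linear taper."""
    return (adc_reading / ADC_MAX) * STRIP_LENGTH_MM

print(position_mm(512))  # contact roughly mid-strip (~60 mm)
```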

In another embodiment, pressure sensors may be employed in the apparatus 102 for detecting the pressure that a user exerts on the apparatus 102, and also as a means of locating the interaction. These sensors 104 may be electrical, mechanical, or combinations thereof. Sensors 104 for pressure detection applications may involve a spring of any conventional design, skin-like material acting as an elastic conformal layer, or other types of mechanical pressure-monitoring devices. These sensors 104 may be electrical in nature, such as a piezoelectric sensor, which generates electricity in proportion to an exerted force, and the like. In embodiments, the pressure sensors may be a combination of two or more types of sensors, such as a strain gauge attached to a membrane or a diaphragm. A pressure sensor may determine the force per unit area exerted on the sensor region by the user. The pressure may be positive or negative, a negative pressure representing a vacuum being created. For example, an application of suction by the user may represent negative pressure. Suction may be caused by the user sucking on an area or a portion with his or her mouth, pulling with fingers or teeth, and the like.
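
A hedged sketch of interpreting a pressure reading relative to ambient, so that suction registers as negative gauge pressure; the threshold values are illustrative assumptions:

```python
def classify_pressure(reading_kpa, ambient_kpa=101.3, deadband_kpa=0.5):
    """Interpret a pressure-sensor reading relative to ambient: positive
    gauge pressure means pressing, negative means suction (a vacuum
    being created). Thresholds are illustrative assumptions."""
    gauge = reading_kpa - ambient_kpa
    if gauge > deadband_kpa:
        return "press", gauge
    if gauge < -deadband_kpa:
        return "suction", gauge
    return "idle", gauge

print(classify_pressure(99.8))  # suction: gauge pressure ~ -1.5 kPa
```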

In yet another embodiment of the present invention, force sensors may be utilized in the apparatus 102. Force sensors are very similar to pressure sensors and may be employed in the apparatus 102 for detecting the intensity of force that the user may exert on the apparatus 102. A force sensor may be employed in ways similar to a pressure sensor, and generally the two sensors operate on similar principles. A force sensor may operate on the same principle as a pressure sensor, except that it may detect the overall force exerted on the sensor as opposed to the force per unit area detected by the pressure sensor. Since the sensor itself may be very small, the force per unit area may be effectively equivalent to the overall force exerted on the apparatus 102. In accordance with various embodiments, these sensors 104 may be electrical, mechanical, or a combination thereof. In embodiments, a mechanical pressure sensor may involve a spring, skin-like material acting as an elastic conformal layer, and similar types of mechanical pressure-monitoring devices, without limitations. Force and pressure sensors may be piezoelectric, capacitive, strain-gauge, photosensitive, and the like, measuring, detecting, or inferring the force or pressure exerted on the apparatus 102.

In accordance with another embodiment of the present invention, humidity sensors may be used that are configured to detect moisture or humidity caused by breath or licking. Information from such sensors 104 may be utilized to recognize the level of moisture on an area of the apparatus 102 or over the entire apparatus 102, or to discern licking from mere touching with the finger, palm, etc. In an exemplary scenario, certain areas may need proper lubrication before certain techniques can be employed to their full effect; humidity sensors may be used to gauge the level of lubrication as well.

In embodiments, temperature sensors may be utilized in the apparatus 102 that may be configured to detect user contact with the surface, blowing on the surface, or similar actions that may alter the temperature of a surface of the apparatus 102. A temperature sensor may be used to determine temperature levels and changes of the apparatus 102. For example, in one scenario, the temperature sensor may determine whether the temperature is held at a constant level, or may serve as a reference for ambient temperature. These temperature levels may be used as variables in game play, such as keeping the room temperature within a desired range, massaging or rubbing an area of the apparatus 102 for warmth, blowing on a surface of the apparatus 102 for cooling, or any other condition that may require a temperature sensor.

In accordance with another embodiment of the present invention, stress and/or strain gauges may be utilized as sensors 104. Stress and/or strain gauges may function like force sensors and may be used to detect stretching or compression. These sensors 104 may be employed in the apparatus 102 as force or pressure sensors, or as a way to discern actions performed by the user, such as pulling on the middle toe while squeezing its sides. Similarly, gentle rolling of the toe may be detected by a strain gauge and a flex sensor placed between the plastic “bone” parts of the toe, together with force sensors placed around the bone in the skin of the apparatus 102, and the like, without limitations.

In an embodiment, arrays of various types of sensors 104 may be utilized in the apparatus 102. The various types of sensors 104 have been described above in detail. The use of sensor arrays may reduce electronic and mechanical complexity in the apparatus 102. In an exemplary scenario, sensor arrays may be printed on the apparatus 102. In an alternative scenario, sensor arrays may be transferred directly to a layer within the skin material of the apparatus 102, to the bone material, to a conformal layer that may then be layered into the apparatus 102, or to some other surface, without limitations. These arrays may be of simple design, such as arrays of small metal sheets connected by wire traces that may be used as a single layer of a capacitive sensor array. In such a configuration of sensor arrays, a middle layer may be made of a skin-like material similar to the skin of the apparatus 102, while the other layer may be formed of a thin metal sheet with wire traces. In embodiments, the thin metal sheet with wire traces may be sprayed or printed on the apparatus 102, fabricated and layered into the apparatus 102, or otherwise mounted on the apparatus 102. Sensors or sensor arrays may be fabricated directly into the skin of the apparatus 102 through other methods and modes as well, without limitations. For example, in accordance with various embodiments, stereolithography, selective laser sintering, extrusion, 3D printing, and other similar types of fabrication methods may be utilized.

In embodiments, various types of light sensors may be employed in the apparatus 102. The light sensors may include, without limitations, infrared sensors, ultraviolet sensors, photodiodes, CCDs, and similar photosensitive sensors. In embodiments, photosensitive sensors may also serve as temperature sensors or proximity sensors, as part of a machine vision/image processing system, or in any other capacity capable of detecting relevant actions performed by the user on the apparatus 102.

In accordance with various embodiments, touch screen technology may be employed for detecting the position of an event, the intensity of an event, or combinations thereof. The touch screen technology may include, without limitations, strain gauge technology, surface acoustic wave technology, capacitive technology, bi-directional technology, and the like. Touch screen technology is typically used as a visual display that reacts to touch contacts. However, the touch screen technology may alternatively be utilized without the visual display, thereby acting as a sensor 104. In accordance with this configuration, the touch screen may resemble a capacitive or tablet-style sensor. In certain scenarios, touch screens may be configured to detect multiple points of contact simultaneously; this is termed multi-touch. In accordance with an embodiment, touch-screen technologies may be configured to detect contact events only. In alternative embodiments, touch-screen technologies may be configured to detect proximity, force, temperature, or other similar parameters, without limitations. In still another embodiment of the present invention, touch-screen technologies may be utilized through other specific traits that they may exhibit. For example, LCD screens may discolor on an application of force; the discoloration may be temporary and proportional to the force exerted on the screen. This phenomenon of discoloration may be exploited with the use of machine vision, image processing, and the like. For example, when a user touches the apparatus 102 at one or more places, a force may be exerted on a flexible LCD screen embedded within the skin of the apparatus 102, thereby causing discoloration at the points of contact proportional to the force exerted on the LCD screen. A sensor 104 or an array of sensors may be disposed on the apparatus 102 in conjunction with software algorithms configured to process this information for determining the points of contact and the forces exerted on the apparatus 102.

In embodiments, image processing and machine vision technologies may be employed in the apparatus 102. These may further include sensors 104 such as cameras, CMOS image sensors, infrared detectors, photodiodes, or other light-based sensors, without limitations. Further, the sensing may be simple, such as detection of light occluded by a user's interaction with the skin of the apparatus 102, or complex, such as the use of stereo cameras to detect actual 3D orientations in space of the user's hand, tongue, etc. In addition, machine vision technologies may be utilized in complex arrangements to detect the identity of the user, the facial expressions of the user, the modes of interaction of the user, detailed depth-mapping, and the like.

In embodiments, capacitive sensors may be fitted within the apparatus 102. The capacitive sensors may utilize the property of capacitance to detect contact with an object, displacement of elements of the sensor, or other similar events. In accordance with various embodiments, this may be achieved by applying an electric charge across two layers of conductive material with a free space or other dielectric material filling the gap between the two layers. The gap may be filled by an elastomeric material similar to the material of the skin of the apparatus 102 or by some other material, without limitations. Based on the value of the capacitance for a particular configuration of the conductive layers and dielectric, the current, voltage, distance between the two plates, and other similar parameters may be measured with extremely high accuracy. Further, based on the hardness and thickness of the constituent layers and the change in the distance between the conductive layers, the force exerted on the apparatus 102 may be determined.
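
The parallel-plate relation C = ε0·εr·A/d underlies this kind of measurement. The following sketch recovers the plate gap from a measured capacitance and, under an assumed linear (spring-like) elastomer stiffness, estimates the applied force; all numeric values are illustrative assumptions:

```python
EPSILON_0 = 8.854e-12  # vacuum permittivity, F/m

def plate_separation(capacitance_f, area_m2, rel_permittivity):
    """For a parallel-plate capacitive sensor, C = eps0*epsr*A/d, so
    the gap d can be recovered from a measured capacitance; a reduced
    gap implies the elastomer layer is being compressed."""
    return EPSILON_0 * rel_permittivity * area_m2 / capacitance_f

def force_estimate(rest_gap_m, gap_m, stiffness_n_per_m):
    """With an assumed linear (spring-like) elastomer, compression
    times stiffness approximates the applied force."""
    return stiffness_n_per_m * max(0.0, rest_gap_m - gap_m)

# Illustrative numbers: ~1 mm gap recovered, ~0.2 mm of compression.
d = plate_separation(2.4e-12, area_m2=1e-4, rel_permittivity=2.7)
print(force_estimate(1.2e-3, d, stiffness_n_per_m=5000.0))  # ~1 N
```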

In accordance with various embodiments of the present invention, multiple-axis sensors may be utilized in the apparatus 102. Most force or pressure sensors detect forces applied in two directions on a single axis, usually positive and negative, substantially orthogonal to the sensor surface. This may correspond to an “in” or “out” force vector, and the sensor may show this effect as an increased or decreased reading of force or pressure. Further, sensors 104 may be configured to detect pressure or force on more than one axis. Sensors 104 configured to detect forces on multiple axes may detect “in” and “out” applications of force that are substantially orthogonal to the surface plane. In addition, these sensors 104 may be configured to detect “left,” “right,” “up,” and “down” forces that are substantially parallel or tangential to the sensor surface, or orthogonal to the “in/out” vector. Such sensors 104 may be employed to detect directional effects, such as the direction in which the user directs force and the magnitude of the respective force. Such sensors 104 may also be utilized to infer rotational motion and related parameters. A multiple-axis sensor may include an array of sensors, each oriented to one axis or another. The sensors 104 comprising such an array may be piezo-resistive, capacitive, optical, or of any other type, without limitations, that may detect conditions where directional orientation may be useful as a data input.

The apparatus 102 as described above may require power to perform its function. Further, sensors, microcontrollers, communication components, and other similar elements may also require power to operate properly. Power may be received from one or more power sources. For example, power may be supplied by batteries that may be internal and/or external to the game controller, by an AC power adapter, through a USB or other computer connection, and the like, without limitations. In addition, power may be harvested completely or partially from motion, heat, light, or other similar sources that may be available or intrinsic to the apparatus 102.

In accordance with an embodiment of the present invention, power may be supplied from batteries. Batteries may include, without limitations, lithium-ion, lead-acid, nickel-cadmium, or any other type. Further, batteries may be disposable or rechargeable. Rechargeable batteries may be charged via an AC connection, USB connection, energy harvested from motion, or any other source that may be available or may be made available. In several embodiments, battery power may be utilized in addition to or in place of external or harvested power or combinations thereof.

In embodiments, the apparatus 102 may be connected with a computing device 128. The computing device 128 may include a computer, cellular telephone, game system, PDA, smart phone, iPad, or other similar device, without limitations. The connection may be achieved in a wired fashion, such as through a USB, FireWire, serial port, or other cable connection to a computing or gaming platform. In another embodiment, the connection may be wireless, such as enabled through Bluetooth, Wi-Fi, ZigBee, or other forms of wireless communication between the apparatus 102 and the computing device 128. In certain embodiments, the data connection may be used as a power connection as well. In addition, the apparatus 102 may also be capable of using any combination of data connection methods. In an embodiment, a combination of wired and wireless modes may be utilized in the apparatus 102. For example, power and data may be transferred via a USB port or other wired connection configured to charge a battery inside the apparatus 102 that may then be used when the apparatus 102 is connected wirelessly.

In an embodiment, software associated with the apparatus 102 may be configured to communicate with and process information received from external interfaces. Further, the software may enable processing and analysis of signals and data received from within the apparatus 102 or from an external source. In embodiments, sensors 104 may provide a signal to the apparatus 102 for interaction. For example, analog sensors may be configured to allow interaction input to the apparatus 102. In embodiments, outputs from potentiometers, pressure sensors, temperature sensors, humidity sensors, or other sensors providing an analog output may be processed by the software to decode an interaction. In this aspect, various algorithms may be implemented to process the signals received from one or more sensors 104.

In embodiments, the computing device 128, such as a computer, may receive an input from the analog sensors 104; the input may first be decoded or processed by electronic hardware components and later interpreted by software. Alternatively, the software and hardware may be coupled to process the input signal simultaneously in order to provide a faster response. In another embodiment, the hardware components interpreting the sensor data may contain embedded software or firmware, which may process data and/or communicate with a computer, gaming platform, or some other type of apparatus 102. The apparatus 102 may be connected to a computing device 128 such as a computer having associated software that may perform various forms of data processing. In embodiments, various algorithms for averaging, interpolation, extrapolation, and the like may be provided for processing the signal.

In embodiments, the software may be configured to differentiate and measure the type and amount of stimulation the user applies to the apparatus 102. For example, the software may include a computation facility that may associate the score of the user in response to a simulation related to a particular technique, maneuver, or instruction. In embodiments, when the user reaches a particular level of competence, the software may provide feedback to the user that the level has been reached. Further, the software may detect the actions of the user and may determine whether the motions of the user can be approximated to the prescribed motions. The software may prescribe a margin of error. An action by the user may be detected, processed by the associated circuitry, and analyzed to see whether it falls within the acceptable margin of error. The disparity in resolution between the apparatus 102 and actual human nerve networks may also result in an error. In embodiments, the software may consider the margin of error introduced by this difference in resolution and compensate for it accordingly. For example, the user may touch 5 mm to the left of where the game has instructed the user to touch; the computer may determine this to be within the margin of error and score the user accordingly. In another example, if the nearest sensor 104 is located 5 mm to the left of where the game has instructed the user to touch, the computer may regard input from that nearest sensor 104, or from a combination of sensors in the region, as being within the margin of error and may score the user accordingly.
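
As a sketch of the spatial margin-of-error check described above, using the 5 mm figure from the example; the coordinate system and units are assumptions for illustration:

```python
import math

MARGIN_MM = 5.0  # acceptable spatial error, per the example above

def within_margin(instructed_xy, touched_xy, margin_mm=MARGIN_MM):
    """Score a touch as correct if it lands within the prescribed
    margin of error of the instructed location."""
    dx = touched_xy[0] - instructed_xy[0]
    dy = touched_xy[1] - instructed_xy[1]
    return math.hypot(dx, dy) <= margin_mm

print(within_margin((40.0, 25.0), (35.0, 25.0)))  # 5 mm left: True
```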

In embodiments, the error compensation may apply to timing. The time resolution may be governed by the sample rate of the processing unit 132, a sample rate inherent to the sensors 104 used, the software, or some other component, and the software logic may account for these errors and delays. Software logic may also need to account for expected or reasonably predictable natural timing variations caused by the user, hardware latency, and the like. The margins of error may be smaller or greater depending on conditions set forth by the software.

The margin of error may be based on the level of difficulty associated with the game. For example, if the user performs a certain instruction against time, the margin of error may be reduced to reflect the higher level of difficulty. In another example, if the user is categorized as a novice or a beginner, the level of difficulty may be low; in this scenario, the margin of error may be relaxed. Likewise, certain on-screen subjects may be programmed to increase or decrease the margin of error in order to account for the accuracy of certain maneuvers.

In embodiments, image processing techniques may be employed for analyzing sensor data. For analyzing images, various algorithms, including line detection, shape recognition, object tracking, and the like, may be employed to infer parameters of a user's actions based on optical data, such as data gathered from a camera embedded in the apparatus 102. For example, a camera may capture the image of the user's finger over one region of the apparatus 102. Image processing software may be programmed to recognize the finger as a finger, lock on to it, and track its motion accordingly.

Statistical techniques such as interpolation and extrapolation may be employed to infer the signal in areas on the surface of the apparatus 102 where there is no sensor 104 to directly detect user interaction, and thereby to infer the current state of user interaction. Previous data may be utilized to infer the speed and direction of a user's motion, variations in force, or other trends, patterns, actions, and the like. In an embodiment, different methods of interpolation and extrapolation between data points may be utilized to fill in information where sensor data is insufficient or missing.
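
For instance, a reading at an uninstrumented point on the surface may be linearly interpolated from the two nearest sensors, as in this illustrative sketch (positions and readings are hypothetical):

```python
def interpolate_reading(pos_mm, sensors):
    """Linearly interpolate the surface signal at a point with no
    sensor of its own from the two nearest instrumented positions.
    `sensors` maps position (mm along the surface) to a reading."""
    xs = sorted(sensors)
    if pos_mm <= xs[0]:
        return sensors[xs[0]]
    if pos_mm >= xs[-1]:
        return sensors[xs[-1]]
    for left, right in zip(xs, xs[1:]):
        if left <= pos_mm <= right:
            frac = (pos_mm - left) / (right - left)
            return sensors[left] + frac * (sensors[right] - sensors[left])

# Reading at 30 mm, between sensors at 0 mm and 40 mm -> 0.7
print(interpolate_reading(30.0, {0.0: 0.1, 40.0: 0.9, 80.0: 0.2}))
```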

In another embodiment, the software may determine a “best guess” as to the action of the user. This may be necessary where limited or no sensor data is being registered. In this aspect, the software may use previous data to detect a general trend and, based on that data, form an assumption about the sensor state at the current time. In an embodiment, the software may make an assumption based on sensor data that may be considered incomplete. For example, the user may be using a hand-shaped apparatus 1600 as illustrated in FIG. 16 and attempting to perform a sweeping motion from the tip of the thumb to the base of the thumb on the palm. The motion in this example may bring the path of the user's fingertip through three force sensors: one at the thumb tip, one midway down the thumb, and one at the thumb's base. The position of the user's finger, the force exerted, or other information may be inferred by analyzing the sensor data. When the user's finger is between two sensors or moving from a first sensor to a second sensor, the position may be inferred as lying between these two sensors, weighted toward the sensor registering the greater force. This inference may further be weighted by other information, such as known data or an assumption that the user is performing actions that the user is expected to perform.
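
The force-weighted inference described in this example might be sketched as follows; this is a non-limiting illustration, and the sensor positions and force values are hypothetical.

    def inferred_position(pos_a, pos_b, force_a, force_b):
        """Infer a fingertip position between two sensors, weighted toward
        the sensor registering the greater force."""
        total = force_a + force_b
        if total == 0:
            return None  # no contact registered on either sensor
        w = force_b / total  # fraction of the way from sensor A to sensor B
        return tuple(a + w * (b - a) for a, b in zip(pos_a, pos_b))

    # Fingertip sliding from the thumb tip (0, 0) toward mid-thumb (30, 0):
    print(inferred_position((0.0, 0.0), (30.0, 0.0), 0.3, 0.7))  # ~(21.0, 0.0)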

In embodiments, a search for the closest points to a specific data point in a metric space may be performed. There are many different algorithms for performing a Nearest Neighbor Search (NNS), including linear space searching, vector approximation, space partitioning, and the like. Some types of NNS may yield an absolute result, such as a specific reference point that the data point is determined to be closest to, while others may yield a more abstract result, such as a “bin” or group to which the data point is said to belong. NNS techniques may facilitate extrapolation of unknown data from known points, interpolation between known points, or other operations that may serve to infer information from known data.
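
As a non-limiting illustration, a space-partitioning NNS over hypothetical sensor coordinates may be performed with a k-d tree, for example using the SciPy library as sketched below.

    import numpy as np
    from scipy.spatial import cKDTree

    # Hypothetical sensor locations on the apparatus surface (mm).
    sensor_xy = np.array([[0, 0], [30, 0], [60, 0], [30, 20]])
    tree = cKDTree(sensor_xy)

    # Nearest sensor to a touch point, then the three nearest sensors.
    dist, idx = tree.query([22.0, 5.0])
    dists, idxs = tree.query([22.0, 5.0], k=3)
    print(idx, idxs)  # index of the closest sensor, then the closest three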

In another embodiment, neural network algorithms may be employed to define the relationship between user action and sensor stimulation. Specifically, the problem of determining particular behaviors made by the user from discrete sensor input may be logically internalized in software, either in the user's computer or implemented in the apparatus 102 circuitry, by using neural network analyses. In embodiments, various neural network techniques and algorithms may be utilized including back propagation with gradient descent, evolutionary computation methods, swarm intelligence techniques, simulated annealing, non-parametric methods, or some other type of neural network training algorithm to train sensor logic. In embodiments, these techniques may be utilized for testing the apparatus 102, recognition of user input patterns, training the apparatus 102 or software with certain techniques or methods for future recognition, or any other application that may facilitate the use of apparatus 102 in the intended application.
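
By way of a non-limiting illustration, the sketch below trains a small backpropagation-based multilayer perceptron, one of the techniques named above, to map frames of sensor readings to gesture labels using the scikit-learn library; the training data and labels are invented for illustration.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # Each row is one frame of readings from four hypothetical sensors;
    # each label names the gesture that produced the frame.
    X = np.array([[0.9, 0.1, 0.0, 0.0],   # press at the thumb tip
                  [0.1, 0.8, 0.1, 0.0],   # press midway down the thumb
                  [0.4, 0.5, 0.4, 0.1],   # sweeping stroke
                  [0.0, 0.0, 0.0, 0.0]])  # no contact
    y = ["tip_press", "mid_press", "sweep", "idle"]

    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                        random_state=0).fit(X, y)
    print(clf.predict([[0.5, 0.5, 0.3, 0.1]]))  # classify a new frame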

Apart from nearest neighbor search and neural networks, other techniques may be employed to analyze sensor data and to infer information about the user's interaction with the apparatus 102. In an embodiment, reverse finite element analysis may be utilized for analyzing the sensor data. Information from the sensors 104 may be utilized to construct, within some margin of error, a virtual model of the surface of the apparatus 102, and this surface shape may be compared with the surface shape of the apparatus 102 when it is not being touched. The gathered information, data corresponding to physical properties such as elasticity, hardness, temperature and the like, and the distribution of materials comprising the apparatus 102 may be utilized to calculate a probability distribution over different modes of interaction. The probability distribution may be employed in association with an algorithm, pattern recognition method, data analysis technique and the like to enhance the interpretation of the sensor data. The current probability distribution may also be compared with earlier distributions to narrow the set of user actions that could have produced the observed readings. This may facilitate accurate recognition of specific user input patterns by the computer and interpretation of the specific user actions being performed on the apparatus 102.
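
The sketch below is only a loose, toy analogue of this idea: instead of a true reverse finite element model, it assumes hypothetical displacement templates for a few interaction modes, compares the measured surface displacement against each, and normalizes the similarities into a probability distribution.

    import numpy as np

    def interaction_mode_probabilities(rest_shape, observed_shape, templates):
        """Return a probability distribution over interaction modes by
        matching the observed surface displacement against templates.
        `templates` maps mode name -> expected displacement field."""
        displacement = observed_shape - rest_shape
        scores = {mode: -float(np.sum((displacement - t) ** 2))
                  for mode, t in templates.items()}
        vals = np.array(list(scores.values()))
        probs = np.exp(vals - vals.max())  # softmax over similarity scores
        probs /= probs.sum()
        return dict(zip(scores.keys(), probs))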

Interaction with the apparatus 102 may usually occur through the sensors 104, and the manner of interaction may take several forms. For example, the apparatus 102 may be held in the user's hand. In other embodiments, the apparatus 102 may be placed on a table or surface, attached to a larger assembly, or otherwise positioned at a particular angle. The user may initiate interaction with the apparatus 102 by pressing a button, licking or touching the surface, or some other means of interaction. The sensors 104 may detect changes or conditions effected by the user. Data from the sensors 104 may be received by a microcontroller, processor, or other computing component enabled to transmit the data to a computer, game console, or other apparatus. The computing apparatus may display on-screen representations of the data, or scenes that in some way represent how the user is interacting with the apparatus 102. The apparatus 102 may also have tactile or other feedback devices on board. The computing apparatus may send signals to the apparatus 102 instructing it to activate feedback, including instructions on how the feedback is to be activated.
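
As a non-limiting illustration of this data path, the sketch below polls a microcontroller over a serial link using the pyserial library. The port name, baud rate, and comma-separated wire format are hypothetical assumptions, not a specification of the apparatus 102.

    import serial  # pyserial

    # Assumed wire format: one sample per line, comma-separated readings,
    # e.g. b"0.12,0.80,0.05\n". Port and baud rate are also assumptions.
    def read_sensor_frames(port="/dev/ttyUSB0", baud=115200):
        """Yield one list of float sensor readings per received line."""
        with serial.Serial(port, baud, timeout=1.0) as link:
            while True:
                line = link.readline()
                if not line:
                    continue  # read timed out with no data; keep polling
                yield [float(v) for v in line.decode().strip().split(",")]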

In an embodiment, the computing apparatus 102 such as a computer may graphically display to the user various techniques, moves, or other actions; the user may copy these actions or use them as learning modules. In another embodiment, the user may improvise actions, which may be analyzed by the software in some way. The software may display an on-screen representation of the subject whose hand, foot, or other body part may be represented by the apparatus 102, the user interacting with the subject, or some other graphical representation of relevant actions.

In an embodiment, the software may be programmed to display a training session for professional massage. In this aspect, the apparatus 102 may be utilized as a tool for professional massage therapy training. The software may display animations of the relevant body part being manipulated in certain ways, representing various techniques that may be employed in the field of massage therapy. The user may then be required to attempt to repeat these actions on the apparatus 102 and may be graded based on how accurately the user was able to replicate the actions. Users may obtain software for the apparatus 102 programmed by fellow users, professional massage therapists, or others. Separate software may be employed in programming the apparatus 102 based on statistical data from a user community, real-life subjects, professional massage therapists, anatomical/physiological information, or some other sources of data. In an embodiment, the apparatus 102 may be utilized to practice various techniques at home. The software may be programmed by the instructor of a class, other professional massage therapists, fellow students, or some other persons to teach the user certain methods of massage therapy.

In an embodiment, the apparatus 102 may provide personal massage training to the user. The apparatus 102 may be utilized to train one or more users to perform various massage therapy techniques and procedures by depicting performance by professional massage therapists.

In an embodiment, the apparatus 102 may be utilized for couples therapy and/or sex therapy. The apparatus 102 may be used to ameliorate certain sexual problems between partners. Further, a couple may be able to satisfy each other better by demonstrating and learning the methods of sexual contact that each partner prefers. The apparatus 102 may be utilized to instruct individuals facing problems with a partner, couples experiencing sexual compatibility issues, couples involved in couples therapy, or other people who wish to improve the sexual relationship with their partners. Likewise, a sex therapist may recommend the apparatus 102 to patients as an aid for communicating sexual preferences. A user may employ the apparatus 102 as a display for educating a partner about the actions or techniques that the user would like to employ. Likewise, the apparatus 102 may be utilized to display to a partner actions or techniques that a user does not enjoy, as a means of opening up communication on a subject that many consider awkward, or in any other way that may facilitate improvement in the couple's sexual relationship.

In another embodiment, the apparatus 102 may be utilized for personal sexual training. In this aspect, the apparatus 102 may be utilized to instruct individuals in the art of pleasing a partner, in methods of safe sex, in basic anatomy, or in other related techniques.

In an embodiment, the apparatus 102 may provide professional physical and/or occupational therapy training. The apparatus 102, its associated software, or both may simulate one or more of a multitude of afflictions or conditions, such as amputation, carpal tunnel syndrome, trauma, or any other affliction or condition that would cause a person to require physical or occupational therapy. The apparatus 102 may aid in training students or professionals in methods of treatment, potential hazards of treatments, and subjects that may be of interest in the study of treatment of these conditions or afflictions. The apparatus 102 and its associated software may be programmed with techniques obtained from expert physical or occupational therapy professionals.

In an embodiment, the apparatus 102 may be utilized for physical/occupational therapy training at home. The apparatus 102 may be utilized by a partner, friend, or caregiver to learn home treatments, beneficial exercises, and other techniques, methods, information and the like, to aid in the care of a person who suffers from an affliction or condition that would benefit from physical or occupational therapy.

In an embodiment, the apparatus 102 may be utilized for chiropractic training. The apparatus 102 may be utilized to train users to perform manipulations of body parts in accordance with the techniques used by chiropractors. In another embodiment, the apparatus 102 may be utilized as a training tool for providing surgical training. Likewise, in another embodiment, the apparatus 102 may be utilized in medical diagnosis; for example, to train medical professionals in the diagnosis of sexual problems.

In another embodiment, the apparatus 102 may provide anatomical training to medical professionals, students, sex professionals and the like. For example, the apparatus 102 may be used to show where the big toe or inner labia is or, in more complex cases, the location of a kidney. The sensors 104 may serve as an aid to help the user properly find the region.

The apparatus 102 may be utilized for entertainment. In an embodiment, the apparatus 102 may be a type of video game controller. The user may interact with the video game controller in different ways. The computing device may display on-screen instructions depicting techniques, methods, or actions that the user may replicate for practice. These inputs may be provided to the user before the user executes the procedure, or may be displayed in real time. The user may observe visual indicators showing where the user is acting on the apparatus 102. In addition, the user may observe relevant representations of conditions such as force or pressure applied, temperature, strain in a particular direction, or some other conditions that may be derived from sensor data of the apparatus 102.

The apparatus 102 may portray different types of human preferences. In this aspect, a number of on-screen subjects, different modes of on-screen subjects and the like may be provided. The user may select from one or more different sets of preferences to more accurately represent preferences of real-life subjects.

In an embodiment, a wide range of forms, shapes, colors, sizes, or other physical parameters may be associated with the apparatus 102 and/or on-screen subjects. The variation in form, color, size and the like may reflect the multitude of variations found in real life.

The apparatus 102 may be configured to display changes in subject preferences over time. The on-screen subject's preferences may vary from interaction to interaction to reflect how a real human subject's preferences may change in real life for one or more reasons. For example, a subject may normally enjoy a very firm foot massage on particular days, while on other days he or she might prefer to have the foot massaged extremely gently. The change of preference in real life may be caused by stress, temperature, working conditions, or some other type of condition associated with a change in preference.

Sometimes the on-screen subject may misrepresent its reaction to an action or actions being performed by the user, or to environmental or other conditions. This may be used as a means of representing a number of real-life situations or factors. For example, a person may fake an orgasm; a faked orgasm may be caused by a number of factors, such as embarrassment, a desire not to hurt a partner's feelings, a desire to finish the current interaction with a partner, tiredness, or some other type of factor.

Likewise, in another scenario, the subject may attempt to conceal feelings of pleasure from a user due to shyness, cultural inhibitions or some other reason.

The apparatus 102 may be used as an instructional tool for displaying wounds, irritations, sensitive areas and the like. The on-screen subject may be shown to have wounds, irritated or sensitive areas, hyper-sensitive areas and the like. These may be chosen by the software randomly. In an embodiment, wounds, irritated or sensitive areas, hyper-sensitive areas and the like may be based on a pattern. In other embodiments, they may be programmed by a user such as a professional massage therapist, a medical professional and the like. The instructional tool may display how to handle these sensitive areas. For example, the apparatus 102 and software may be programmed to provide instructional steps to avoid or to target an area depending on the nature of the condition.

In embodiments, the apparatus 102 may be interfaced with the web, allowing a licensed user to interact in a network mode, including but not limited to interaction with social networks, websites, and the like.

In accordance with various embodiments of the present invention, data may be collected or aggregated in the software for further use. Data aggregation may be performed in several different ways. For example, users may upload different techniques to a server, transmit information to a server directly, or otherwise have their data collected. In embodiments, professionals in massage therapy, sex therapy, or other similar relevant fields may use the apparatus 102 to demonstrate certain techniques and record data of their interaction with the apparatus 102 during demonstration. Further, human subjects may be recorded in the form of a videotape, digital video, motion tracking, and the like while engaging in the specific act intended to be recorded such as a foot massage, cunnilingus, or some other relevant act, without limitations.

Data thus collected or aggregated may be analyzed in various ways. For example, the analysis may be performed manually by a human, computer, or combinations thereof. The analysis may involve various data traits such as tempo of repeated or successive patterns, intensity, variation of intensity, combination of techniques, choice of background music or other environmental factors, or any other recognizable feature or pattern or combinations thereof, without limitations.

The aggregated and analyzed data may be used to predict various preferences, such as the likes and dislikes of a person during an ongoing act related to a foot massage, fellatio, and the like. Further, based on the collected and analyzed information, the software may attempt to predict the likes of a person based on examples of several other related techniques that the person may enjoy. For instance, a user may perform simple techniques on an apparatus 102 to demonstrate his or her favorite moves. On a foot, for example, these might include rubbing the heel with force and then using the fingers to gently stroke the arch from the heel toward the toes. The software may analyze this input in the same manner as other aggregated data. In addition, the software may use the analysis of other aggregated data to predict the user's likes and dislikes next in succession, at a later time, or in some other determination that may be derived from the information gathered and analyzed.

In accordance with various embodiments, aggregated data may be utilized for extrapolation and interpolation. Further, aggregated data and the analysis performed thereon may be used to construct detailed models of a subject's preferences based on initial data programmed for a particular subject. For example, a user may program an on-screen subject to emulate himself by inputting various techniques, methods, actions, or combinations thereof, and rating them based on his degree of enjoyment or dislike. Based on this information and data received from other users or programmed subjects, a statistical analysis may be performed. The statistical analysis may be utilized to inform the user regarding other methods, techniques, or combinations thereof that may be of use to the user, or that should be avoided. Such models may be used to predict various reactions, such as how an on-screen subject may respond to actions performed by the user even if the subject has not been programmed with information regarding these specific actions.
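
A non-limiting sketch of such a statistical analysis appears below; it predicts a rating for a technique the subject has not rated from the most similar other profile, using cosine similarity over shared ratings. The profile structure and rating scale are illustrative assumptions only.

    import numpy as np

    def predict_rating(target_ratings, other_profiles, technique):
        """Predict a rating for `technique` from the profile most similar
        to `target_ratings`. Profiles map technique name -> rating."""
        best_sim, prediction = -1.0, None
        for ratings in other_profiles.values():
            shared = sorted(set(target_ratings) & set(ratings))
            if technique not in ratings or not shared:
                continue
            a = np.array([target_ratings[k] for k in shared], dtype=float)
            b = np.array([ratings[k] for k in shared], dtype=float)
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            sim = float(a @ b / denom) if denom else 0.0
            if sim > best_sim:  # keep the most similar profile's rating
                best_sim, prediction = sim, ratings[technique]
        return prediction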

In embodiments, buttons or sensors 104 may be employed in the apparatus 102 to virtually perform certain actions during an interaction. In an embodiment, buttons or sensors 104 may be a part of the skin-like surface of the apparatus 102. In another embodiment, buttons or sensors 104 may be mounted on an enclosure of the apparatus 102, or somewhere else within the reach of a user. Further, buttons may involve an additional human interface device such as a keyboard, a mouse, and the like. Buttons or sensors 104 may be used as controls to perform various on-screen actions. These actions may involve modification of environmental factors such as music, temperature, and the like; interaction with a part of the on-screen subject's body not represented by the apparatus 102; or any other type of action that may be desirable.

In embodiments, programmable action buttons may be mounted on the apparatus 102. Action buttons may be preset, programmable, or combinations thereof; preset action buttons may already be associated with a designated action. In embodiments, programmable action buttons may be programmed by a user, a remote user over a network, a networked computer, or others, without limitation. Programming of the action buttons may be performed in several ways. For example, the user may select actions from a group of preset actions that may or may not be customizable, choose various combinations of the preset actions, select environmental objects such as a lamp or a radio, or select regions of the subject's body on which preset or customizable actions are to be performed, and the like. As an example, the user may select a region of the subject's body for interaction, such as a breast. A display may then depict a representation of the breast with major features mapped onto a representation of the apparatus 102, such as an apparatus shaped like a vulva in this case. In accordance with this example, the outer labia may represent major portions of the breast, the inner labia may represent the areola, and the clitoris may represent a nipple. The user may then program a specific action using the apparatus 102 to simulate an action on the breast, such as squeezing the breast lightly with the palm while firmly pinching the nipple between the thumb and forefinger. The user may use this action during game play by pressing the respective button. If the button is pressure or displacement sensitive, the intensity of the represented action may be varied accordingly.

In accordance with various embodiments, the apparatus 102 may provide feedback to the user in various forms. These may include tactile, audible, visual, or other forms intended to alert the user to certain conditions. Tactile feedback may be generated in the form of vibrations, temperature, change in size or shape, humidity, or some other form that the user may detect by touch. Vibration feedback may be generated by a motor with an offset-weighted shaft, low-frequency audio, or some other device capable of generating a noticeable vibration. A change in size or shape may resemble the swelling or contraction of a region, regions, or orifices and may be generated by small actuators, inflation or deflation of pneumatic or hydraulic cells, expansion or contraction of memory alloys, or other similar means of creating a variation in the size, shape, or both of a portion of the apparatus 102. Audible feedback may be generated by a speaker, piezoelectric buzzer, or any other instrument capable of generating audible signals.

In embodiments, the software may provide audible feedback to the user. This may be represented by a subject having an accelerated or decelerated audible breathing rate or heart rate; physiological events such as the cracking of a knuckle or subluxation adjustments; voicing of approval or disapproval by the subject; audible moans or groans produced by the subject; or other sounds that a human may use to express feelings and emotions such as pleasure, pain, and the like. In an embodiment, audible feedback may be in the form of audible beeps or music, or variations in sound quality such as timbre, volume, tempo, pitch, and the like.

The apparatus 102 may present handicaps and/or interferences that may restrict the user in various ways. The user may select, or the software may impose, certain handicaps during an interaction; these handicaps may include awkward positioning, unpleasant environmental conditions, or other conditions that may render the subject less responsive or more difficult to please. These handicaps may be represented by certain actions having diminished or reversed efficacy.

The apparatus 102 may be associated with a virtual environment that includes music or other conditions. In this aspect, the user may be provided with an option of setting environmental and/or atmospheric conditions in the virtual environment of the interface. In embodiments, these conditions may include background music, lighting, temperature, location, cleanliness of the location, positioning of furniture, state of windows (open, closed, and the like), use of incense, or some other condition. In an embodiment, the user may change conditions in the virtual environment that the user would not have the authority to change in real life; these conditions may include but may not be limited to weather, time of day, traffic conditions, ambient noises and the like.

An apparatus 102 simulating a part of human anatomy may be used on a variety of gaming platforms, such as: PLAYSTATION 2, PLAYSTATION 3, PLAYSTATION PORTABLE, manufactured by Sony Corporation; GAMECUBE, GAMEBOY, GAMEBOY ADVANCE, or WII, manufactured by Nintendo Corporation; or XBOX or XBOX 360, manufactured by Microsoft Corporation. The apparatus 102 may also be used on gaming platforms comprising a personal computer or a cellular telephone.

Although described below in connection with a simulated foot, the apparatus 102 may simulate any of a variety of human anatomical parts, such as a hand, head, face, genitalia, or other body parts. Such apparatuses would be similarly outfitted with sensors 104.

Referring now to FIG. 12, a screenshot of one possible embodiment of a game-style environment for use with an apparatus in the shape of a part of human anatomy is provided. In the illustration, the human anatomical part in question is a foot, which is depicted on screen 1202. An on-screen indicator 1204 may show the user where to press on the apparatus and the user may respond accordingly. A score 1208 may be shown to reward the user's adeptness at using the apparatus. Secondary indicators 1210 may be used to show other areas to be touched, sensitive areas to be avoided, or other conditions that the user may be made aware of.

Now referring to FIG. 13, one embodiment of an apparatus 1300 in the shape of human anatomy is shown being held by the user. Again, the anatomical part in question may be a foot in this particular embodiment. The body 1302 of the apparatus may be manipulated by the user by pressing, stroking, pinching, or some other kind of physical interaction. These manipulations may then be read by various sensors and interpreted by a signal processor in the base of the apparatus 1308. The information may then be transmitted to a computer or game system via a cable, wireless connection, or some other means. Although a USB port 1304 is shown in this particular embodiment, the apparatus may transmit data via some other kind of connection as well or wirelessly.

FIG. 14 shows a side view of a similar embodiment of the apparatus 1300 as shown in FIG. 13. The body of the simulated foot 1302 is made of some kind of safe material. The signal processing electronics that send sensor data to the computer or game system may be housed in an enclosure 1310 at the base of the apparatus. An ergonomic hand-grip 1308 may be used to hold the apparatus. In this particular configuration, signals are passed to the computer or game system via a USB port 1304.

Referring now to FIG. 15, an embodiment of a simulated foot apparatus 1500 for use with a video game or educational tool is depicted. The body of the simulated foot 1302 is made of some kind of safe material and is outfitted with a sensor suite comprising position sensors 1502 and pressure sensors 1504. Although ten sensors are shown in the figure, the apparatus may have any number of sensors. Also, although the apparatus shown has two different types of sensors, the device may contain rotary position sensors, contact sensors, temperature sensors, torsion sensors, capacitance sensors, or some other kind of sensors. The apparatus may also have only one type of sensor. Moreover, the positioning of the sensors may be different from that shown in the figure. Feedback may also be given to the user via a vibrating motor or some other indicator to alert the user that he is performing well or poorly, or to indicate some other game condition. The apparatus shown here has no base, but uses a USB port 1304 on the rear of the apparatus to transmit sensor data to the computer or game system.

Now referring to FIG. 16, an embodiment of a simulated hand apparatus 1600 for use with a video game or educational tool is depicted. The body of the simulated hand 1600 is made of some kind of safe material and is outfitted with a sensor suite comprising position sensors 1502 and pressure sensors 1504. Although twelve sensors are shown in the figure, the apparatus may have any number of sensors. Also, although the apparatus shown has two different types of sensors, the device may contain rotary position sensors, contact sensors, temperature sensors, torsion sensors, capacitance sensors, or some other kind of sensor. The apparatus may also have only one type of sensor. Moreover, the positioning of the sensors may be different from that shown in the figure. The apparatus shown here uses a USB port 1304 to transmit sensor data to the computer or game system.

In any configuration and shape of the apparatus 1600, each sensor feeds a stream of data to the game platform indicating the state of the sensor at that time. The data may be sampled at any frequency within the hardware's limitations, but will most likely be sampled at about the standard video rate of 30 Hz. Each sensor's state may be used in conjunction with one or more of the other sensors to provide information on how the user is manipulating the apparatus 1600. This method may also be used to interpolate information.
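
By way of a non-limiting illustration, a fixed-rate sampling loop at approximately 30 Hz might be sketched in Python as follows; read_all_sensors and handle_frame are hypothetical placeholders for hardware-specific and game-specific code.

    import time

    SAMPLE_HZ = 30.0           # approximately standard video rate
    PERIOD = 1.0 / SAMPLE_HZ

    def sample_loop(read_all_sensors, handle_frame):
        """Poll every sensor at ~30 Hz and pass each timestamped frame
        (one state per sensor) to the game logic."""
        next_tick = time.monotonic()
        while True:
            frame = read_all_sensors()
            handle_frame(time.monotonic(), frame)
            next_tick += PERIOD
            time.sleep(max(0.0, next_tick - time.monotonic()))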

A mechanical or electrical “tilt sensor” may be included to monitor position and orientation of the apparatus. This sensor may be a mercury reed switch, an accelerometer, or some other kind of device for detecting tilt, position, or orientation. This information may be used in game play to determine how the user is holding the apparatus 1600 or if he is rolling it around as one might do to relieve stress from an ankle or wrist. Other secondary techniques of interaction with the apparatus 1600 may include shaking or slapping the apparatus 1600.

The body of the apparatus 1600 is made of a non-toxic or medical-grade compound that somewhat simulates the elasticity and hardness of human flesh. The compound may be urethane, silicone, latex, or some other compound that meets regulatory safety requirements. Bonelike structures may be embedded in the apparatus 1600 to provide rigidity and a more realistic feel, as well as support and backing for the sensors or wire feeds. These simulated bones may be made of metal, plastic, or some other material that provides the necessary characteristics.

The methods and systems described herein may be deployed in part or in whole through a machine that executes computer software, program codes, and/or instructions on a processor. The processor may be part of a server, computing device, network infrastructure, mobile computing platform, stationary computing platform, or other computing platform. A processor may be any kind of computational or processing device capable of executing program instructions, codes, binary instructions and the like. The processor may be or include a signal processor, digital processor, embedded processor, microprocessor or any variant such as a co-processor (math co-processor, graphic co-processor, communication co-processor and the like) and the like that may directly or indirectly facilitate execution of program code or program instructions stored thereon. In addition, the processor may enable execution of multiple programs, threads, and codes. The threads may be executed simultaneously to enhance the performance of the processor and to facilitate simultaneous operations of the application. By way of implementation, methods, program codes, program instructions and the like described herein may be implemented in one or more threads. A thread may spawn other threads that may have assigned priorities associated with them; the processor may execute these threads based on priority or any other order based on instructions provided in the program code. The processor may include memory that stores methods, codes, instructions and programs as described herein and elsewhere. The processor may access a storage medium through an interface that may store methods, codes, and instructions as described herein and elsewhere. The storage medium associated with the processor for storing methods, programs, codes, program instructions or other types of instructions capable of being executed by the computing or processing device may include but may not be limited to one or more of a CD-ROM, DVD, memory, hard disk, flash drive, RAM, ROM, cache and the like.

A processor may include one or more cores that may enhance the speed and performance of a multiprocessor. In embodiments, the processor may be a dual-core processor, quad-core processor, or other chip-level multiprocessor that combines two or more independent cores on a single die.

The methods and systems described herein may be deployed in part or in whole through a machine that executes computer software on a server, client, firewall, gateway, hub, router, or other such computer and/or networking hardware. The software program may be associated with a server that may include a file server, print server, domain server, internet server, intranet server and other variants such as secondary server, host server, distributed server and the like. The server may include one or more of memories, processors, computer readable media, storage media, ports (physical and virtual), communication devices, and interfaces capable of accessing other servers, clients, machines, and devices through a wired or a wireless medium, and the like. The methods, programs or codes as described herein and elsewhere may be executed by the server. In addition, other devices required for execution of methods as described in this application may be considered as a part of the infrastructure associated with the server.

The server may provide an interface to other devices including, without limitation, clients, other servers, printers, database servers, print servers, file servers, communication servers, distributed servers and the like. Additionally, this coupling and/or connection may facilitate remote execution of a program across the network. The networking of some or all of these devices may facilitate parallel processing of a program or method at one or more locations without deviating from the scope of the invention. In addition, any of the devices attached to the server through an interface may include at least one storage medium capable of storing methods, programs, code and/or instructions. A central repository may provide program instructions to be executed on different devices. In this implementation, the remote repository may act as a storage medium for program code, instructions, and programs.

The software program may be associated with a client that may include a file client, print client, domain client, internet client, intranet client and other variants such as secondary client, host client, distributed client and the like. The client may include one or more of memories, processors, computer readable media, storage media, ports (physical and virtual), communication devices, and interfaces capable of accessing other clients, servers, machines, and devices through a wired or a wireless medium, and the like. The methods, programs or codes as described herein and elsewhere may be executed by the client. In addition, other devices required for execution of methods as described in this application may be considered as a part of the infrastructure associated with the client.

The client may provide an interface to other devices including, without limitation, servers, other clients, printers, database servers, print servers, file servers, communication servers, distributed servers and the like. Additionally, this coupling and/or connection may facilitate remote execution of a program across the network. The networking of some or all of these devices may facilitate parallel processing of a program or method at one or more locations without deviating from the scope of the invention. In addition, any of the devices attached to the client through an interface may include at least one storage medium capable of storing methods, programs, applications, code and/or instructions. A central repository may provide program instructions to be executed on different devices. In this implementation, the remote repository may act as a storage medium for program code, instructions, and programs.

The methods and systems described herein may be deployed in part or in whole through network infrastructures. The network infrastructure may include elements such as computing devices, servers, routers, hubs, firewalls, clients, personal computers, communication devices, routing devices and other active and passive devices, modules and/or components as known in the art. The computing and/or non-computing device(s) associated with the network infrastructure may include, apart from other components, a storage medium such as flash memory, buffer, stack, RAM, ROM and the like. The processes, methods, program codes, instructions described herein and elsewhere may be executed by one or more of the network infrastructural elements.

The methods, program codes, and instructions described herein and elsewhere may be implemented on a cellular network having multiple cells. The cellular network may be either a frequency division multiple access (FDMA) network or a code division multiple access (CDMA) network. The cellular network may include mobile devices, cell sites, base stations, repeaters, antennas, towers, and the like. The cellular network may be a GSM, GPRS, 3G, EVDO, mesh, or other network type.

The methods, program codes, and instructions described herein and elsewhere may be implemented on or through mobile devices. The mobile devices may include navigation devices, cell phones, mobile phones, mobile personal digital assistants, laptops, palmtops, netbooks, pagers, electronic book readers, music players and the like. These devices may include, apart from other components, a storage medium such as a flash memory, buffer, RAM, ROM and one or more computing devices. The computing devices associated with mobile devices may be enabled to execute program codes, methods, and instructions stored thereon. Alternatively, the mobile devices may be configured to execute instructions in collaboration with other devices. The mobile devices may communicate with base stations interfaced with servers and configured to execute program codes. The mobile devices may communicate on a peer-to-peer network, mesh network, or other communications network. The program code may be stored on the storage medium associated with the server and executed by a computing device embedded within the server. The base station may include a computing device and a storage medium. The storage device may store program codes and instructions executed by the computing devices associated with the base station.

The computer software, program codes, and/or instructions may be stored and/or accessed on machine readable media that may include: computer components, devices, and recording media that retain digital data used for computing for some interval of time; semiconductor storage known as random access memory (RAM); mass storage typically for more permanent storage, such as optical discs, forms of magnetic storage like hard disks, tapes, drums, cards and other types; processor registers, cache memory, volatile memory, non-volatile memory; optical storage such as CD, DVD; removable media such as flash memory (e.g. USB sticks or keys), floppy disks, magnetic tape, paper tape, punch cards, standalone RAM disks, Zip drives, removable mass storage, off-line, and the like; other computer memory such as dynamic memory, static memory, read/write storage, mutable storage, read only, random access, sequential access, location addressable, file addressable, content addressable, network attached storage, storage area network, bar codes, magnetic ink, and the like.

The methods and systems described herein may transform physical and/or intangible items from one state to another. The methods and systems described herein may also transform data representing physical and/or intangible items from one state to another.

The elements described and depicted herein, including in flow charts and block diagrams throughout the figures, imply logical boundaries between the elements. However, according to software or hardware engineering practices, the depicted elements and the functions thereof may be implemented on machines through computer executable media having a processor capable of executing program instructions stored thereon as a monolithic software structure, as standalone software modules, or as modules that employ external routines, code, services, and so forth, or any combination of these, and all such implementations may be within the scope of the present disclosure. Examples of such machines may include, but may not be limited to, personal digital assistants, laptops, personal computers, mobile phones, other handheld computing devices, medical equipment, wired or wireless communication devices, transducers, chips, calculators, satellites, tablet PCs, electronic books, gadgets, electronic devices, devices having artificial intelligence, computing devices, networking equipment, servers, routers and the like. Furthermore, the elements depicted in the flow chart and block diagrams or any other logical component may be implemented on a machine capable of executing program instructions. Thus, while the foregoing drawings and descriptions set forth functional aspects of the disclosed systems, no particular arrangement of software for implementing these functional aspects should be inferred from these descriptions unless explicitly stated or otherwise clear from the context. Similarly, it will be appreciated that the various steps identified and described above may be varied, and that the order of steps may be adapted to particular applications of the techniques disclosed herein. All such variations and modifications are intended to fall within the scope of this disclosure. As such, the depiction and/or description of an order for various steps should not be understood to require a particular order of execution for those steps, unless required by a particular application, or explicitly stated or otherwise clear from the context.

The methods and/or processes described above, and steps thereof, may be realized in hardware, software or any combination of hardware and software suitable for a particular application. The hardware may include a general purpose computer and/or dedicated computing device or specific computing device or particular aspect or component of a specific computing device. The processes may be realized in one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors or other programmable device, along with internal and/or external memory. The processes may also, or instead, be embodied in an application specific integrated circuit, a programmable gate array, programmable array logic, or any other device or combination of devices that may be configured to process electronic signals. It will further be appreciated that one or more of the processes may be realized as a computer executable code capable of being executed on a machine readable medium.

The computer executable code may be created using a structured programming language such as C, an object oriented programming language such as C++, or any other high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies) that may be stored, compiled or interpreted to run on one of the above devices, as well as heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software, or any other machine capable of executing program instructions.

Thus, in one aspect, each method described above and combinations thereof may be embodied in computer executable code that, when executing on one or more computing devices, performs the steps thereof. In another aspect, the methods may be embodied in systems that perform the steps thereof, and may be distributed across devices in a number of ways, or all of the functionality may be integrated into a dedicated, standalone device or other hardware. In another aspect, the means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such permutations and combinations are intended to fall within the scope of the present disclosure.

While the invention has been disclosed in connection with the preferred embodiments shown and described in detail, various modifications and improvements thereon will become readily apparent to those skilled in the art. Accordingly, the spirit and scope of the present invention is not to be limited by the foregoing examples, but is to be understood in the broadest sense allowable by law.

All documents referenced herein are hereby incorporated by reference.

Claims

1. A computer program product embodied in a computer readable medium that, when executing on one or more computers, performs the steps of:

a. receiving a data input from at least one of a plurality of sensors located within an apparatus representing a part of the human anatomy, wherein the apparatus is connected to a computing device;
b. storing the data input, wherein the data input is stored in association with a region of the apparatus in which the at least one of the plurality of sensors is located;
c. representing on a display of the computing device, using a graphic user interface, the part of human anatomy that is represented by the apparatus; and
d. presenting within the graphic user interface a depiction of the region of the part of human anatomy from which the data input was received.

2. The computer program product of claim 1, wherein the data input is stored in a temporary memory component.

3. The computer program product of claim 1, wherein the data input is stored in a database.

4. The computer program product of claim 1, wherein the apparatus representing a part of the human anatomy is a game controller.

5. The computer program product of claim 4, wherein the game controller uses neural network pattern recognition methods.

6. The computer program product of claim 4, wherein the game controller uses neural network learning methods.

7. The computer program product of claim 1, wherein the apparatus representing a part of the human anatomy is a component of a humanoid robot that includes a representation of other human anatomic features.

8. The computer program product of claim 1, wherein the sensor is a pressure sensor.

9. The computer program product of claim 1, wherein the sensor is a humidity sensor.

10. The computer program product of claim 1, wherein the sensor is a plurality of sensors.

11. The computer program product of claim 1, wherein the sensor is placed among a three-dimensional sensor array.

12. The computer program product of claim 1, wherein the computing device is a computer.

13. The computer program product of claim 12, wherein the computer is a desktop computer.

14. The computer program product of claim 12, wherein the computer is a laptop computer.

15. The computer program product of claim 12, wherein the computer is a notebook.

16. The computer program product of claim 1, wherein the computing device is a gaming console.

17. The computer program product of claim 1, wherein the computing device is a television.

18. The computer program product of claim 1, wherein the computing device is a smart phone.

19. A computer program product embodied in a computer readable medium that, when executing on one or more computers, performs the steps of:

a. receiving an expert sensor data sequence from an expert user using a first apparatus representing a part of the human anatomy in which a plurality of sensors are contained, wherein the expert sensor data sequence derives at least in part from the expert physically manipulating the device as part of performing an expert maneuver, and wherein the apparatus is connected to a first computing device;
b. recording and storing the expert sensor data sequence in a database;
c. receiving a first novice sensor data sequence from a first novice user using a second apparatus, representing a similar part of the human anatomy as the first apparatus, in which a plurality of sensors are contained, wherein the second apparatus is connected to a second computing device;
d. recording and storing the first novice sensor data sequence;
e. receiving a second novice sensor data sequence from a second novice user using a third apparatus, representing a similar part of the human anatomy as the first apparatus, in which a plurality of sensors are contained, wherein the third apparatus is connected to a third computing device;
f. recording and storing the second novice sensor data sequence;
g. comparing the first and second novice sensor data sequences to the expert sensor data sequence based at least in part on a statistical analysis of the data sequences, wherein the statistical analysis results in a first score associated with a degree of similarity between the first novice sensor data sequence and the expert sensor data sequence, and a second score associated with a degree of similarity between the second novice sensor data sequence and the expert sensor data sequence;
h. representing on a display of at least one of the first and second computing devices, using a graphic user interface, the part of human anatomy that is represented by the apparatus; and
i. presenting within the graphic user interface a depiction of the region of the part of human anatomy from which the first and second novice sensor data sequences were received, and including within the graphic user interface the first and second score and at least one feedback indicator.

20. A computer program product embodied in a computer readable medium that, when executing on one or more computers, performs the steps of:

a. receiving sensor feedback from a plurality of sensors that are embedded in an artificial replica of at least a portion of human anatomy;
b. receiving information relating to a location of each of the plurality of sensors as each is positioned within the artificial replica;
c. performing an analysis on the sensor feedback that defines both (i) a relationship of interactions amongst at least two of the plurality of sensors and (ii) a value from at least each one of the plurality of sensors that was interacted with during a user interaction; and
d. presenting, through a graphical user interface, a graphic depiction of at least a portion of the artificial replica and an indication corresponding to the sensor feedback analysis such that the user can understand how the user's interaction affected the artificial replica and caused a positive or negative action, as assessed through a comparison of the user interaction with actions stored in memory.
Patent History
Publication number: 20100261530
Type: Application
Filed: Apr 12, 2010
Publication Date: Oct 14, 2010
Inventors: David R. Thomas (Somerville, MA), Geoffrey G. Payson (Somerville, MA), Philip M. Bronstad (Somerville, MA)
Application Number: 12/758,606