Patents by Inventor Charles J. Jacobus
Charles J. Jacobus has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20090139359
Abstract: A tactor system transforms cable motion to tactor motion. A housing defining a plane is adapted for placement proximate to the skin of a user. A cable is operative to deform a tactor element in the housing, causing a portion of the element to move outwardly from the plane of the housing, thereby imparting a tactile sensation to the user's skin. For example, tension on the cable may cause a strip of flexible plastic or other suitable material to bend at a living hinge that moves outwardly from the plane of the housing. The cable may be driven by an actuator. Two or more tactor elements may be disposed next to each other or in the same housing, with different elements being activated at different times to enhance the apparent frequency of the stimulus. For example, a reciprocating mechanism may be used to operate a pair of tactor elements out of phase with respect to one another.
Type: Application
Filed: November 18, 2008
Publication date: June 4, 2009
Applicant: Cybernet Systems Corporation
Inventors: Christopher R. Wagner, Amanda Christiana, Charles J. Jacobus
-
Publication number: 20090116692
Abstract: A real-time computer vision system tracks one or more objects moving in a scene using a target location technique which does not involve searching. The imaging hardware includes a color camera, frame grabber and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
Type: Application
Filed: January 14, 2008
Publication date: May 7, 2009
Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
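The abstract above describes combining three per-measure estimates of the target's location with a weighting technique. The patent does not give the formula, so the weights, map representation, and function name below are illustrative assumptions; this is a minimal sketch of fusing per-pixel likelihood maps and taking the peak:

```python
import numpy as np

def fuse_location_measures(color_map, shape_map, motion_map,
                           w_color=0.5, w_shape=0.3, w_motion=0.2):
    """Combine per-pixel likelihood maps for color, shape, and motion
    into one weighted score map and return the most probable target
    location as a (row, col) index.  Weights are hypothetical."""
    score = w_color * color_map + w_shape * shape_map + w_motion * motion_map
    # The peak of the fused map is taken as the target's location.
    return np.unravel_index(np.argmax(score), score.shape)
```

In practice each map would be produced by the color matching, shape, and motion stages the abstract mentions; here they are simply arrays of the same size.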
-
Publication number: 20090076419
Abstract: A wireless monitoring/assessment system identifies and records selected basic activities of daily living, loss of balance, and falls of at-risk elderly using a minimum set of sensors. A wearable system collects data over an extended period from sensors mounted under each shoe/slipper inner sole, and three kinetic measurement sensors, one mounted on the subject's chest, and the other two on each thigh. The resulting data can be processed off line to determine if a loss of balance or fall has occurred and identify the basic activity that the subject was performing when it happened. The two inner sole foot pressure sensors provide data on the loading of the plantar aspect of each fore- and rear foot. Each kinematics measurement sensor module provides 3-axis acceleration and angular rate data from the torso and each thigh. All data is sampled at a high rate using an analog to digital converter, time stamped, and then stored in memory.
Type: Application
Filed: May 23, 2008
Publication date: March 19, 2009
Applicant: Cybernet Systems Corporation
Inventors: Pavan K. Namineni, Charles J. Jacobus
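The off-line fall-detection step is not detailed in the abstract. As an illustrative sketch only (thresholds and logic are hypothetical, not from the patent), one common approach flags a fall when a near-free-fall interval in the 3-axis acceleration magnitude is followed by an impact spike:

```python
import math

def detect_fall(samples, freefall_g=0.4, impact_g=2.5):
    """Flag a possible fall when a near-free-fall interval (magnitude
    well below 1 g) is followed by an impact spike (well above 1 g).
    samples: iterable of (ax, ay, az) accelerations in units of g.
    Thresholds are hypothetical placeholders."""
    saw_freefall = False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag < freefall_g:
            saw_freefall = True            # body briefly in free fall
        elif saw_freefall and mag > impact_g:
            return True                    # impact after free fall
    return False
```

A real implementation would also use the foot-pressure and angular-rate channels the abstract describes to reject false positives and classify the activity at the time of the event.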
-
Publication number: 20090074248
Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
Type: Application
Filed: December 2, 2008
Publication date: March 19, 2009
Applicant: Cybernet Systems Corporation
Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
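A minimal sketch of the two ideas in the abstract, the linear least squares parameter fit and the predictor-bin comparison, assuming a one-dimensional feature trajectory and a simple linear-in-parameters oscillator model x'' = theta1*x + theta2 (the patent's actual models and features are richer; this is not the claimed method):

```python
import numpy as np

def fit_gesture_params(x, dt):
    """Least-squares fit of the linear-in-parameters model
    x'' = theta1 * x + theta2 to an observed 1-D trajectory."""
    v = np.gradient(x, dt)    # numerical first derivative
    a = np.gradient(v, dt)    # numerical second derivative
    A = np.column_stack([x, np.ones_like(x)])
    theta, *_ = np.linalg.lstsq(A, a, rcond=None)
    return theta

def best_bin(x, dt, bins):
    """Pick the predictor bin whose seeded parameters best explain
    the observed motion, i.e. the one with the smallest residual."""
    v = np.gradient(x, dt)
    a = np.gradient(v, dt)
    A = np.column_stack([x, np.ones_like(x)])
    residuals = [np.sum((a - A @ theta) ** 2) for theta in bins]
    return int(np.argmin(residuals))
```

For example, a trajectory x(t) = cos(2t) satisfies x'' = -4x, so the fit recovers theta1 near -4 and a bin seeded with (-4, 0) beats one seeded with (-1, 0).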
-
Publication number: 20090046005
Abstract: This invention is a wireless mobile indoor/outdoor tracking system. It is designed to track the absolute position of all nodes in a network indoors and outdoors. The system uses GPS positioning when a signal is available and RF ranging when it is unavailable. When indoors, a minimum of three network nodes must receive a GPS signal to determine absolute position. A mesh network is used to make the system mobile and to create an avenue for data to be transmitted to a remote base station.
Type: Application
Filed: August 15, 2008
Publication date: February 19, 2009
Applicant: Cybernet Systems Corporation
Inventors: Pavan K. Namineni, Trevor Davey, Gary Siebert, Charles J. Jacobus
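Once three nodes have known (e.g. GPS-derived) positions, a fourth node's position can be recovered from RF range measurements by trilateration. The abstract does not describe the solver, so the 2-D linearized version below is an illustrative sketch:

```python
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Solve for a 2-D position given three anchor positions p1..p3
    and measured ranges r1..r3.  Subtracting the first circle equation
    from the other two yields a linear system A x = b."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    A = 2.0 * np.array([p2 - p1, p3 - p1])
    b = np.array([
        r1**2 - r2**2 + p2 @ p2 - p1 @ p1,
        r1**2 - r3**2 + p3 @ p3 - p1 @ p1,
    ])
    return np.linalg.solve(A, b)
```

With noisy ranges or more than three anchors, the same linear system is solved in the least-squares sense instead of exactly.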
-
Publication number: 20090027334
Abstract: A method for controlling a graphical user interface (GUI) for touchscreen-enabled computer systems provides a variety of software methods (tools) for high-fidelity control of the user interface. The TrackScreen tool provides finger-friendly mouse functions such as scrolling, dragging and clicking. The Magnifier application continuously captures the current screen image and displays a magnified subset of it. Selecting within this magnified area with a pointing device (mouse, touchscreen, digitizer, etc.) causes the application to simulate the action on the portion of the screen corresponding to the point in the magnified image that was selected. With the KeyBoard application, a keyboard is rendered on screen with sufficient size that the individual keys are easily selectable with an unaided finger. The Common Tasks Tool (or CTT) allows common keyboard shortcuts, mouse events, and other user interface events to be specified in a configuration file and represented on screen as a large, easy-to-click button.
Type: Application
Filed: June 2, 2008
Publication date: January 29, 2009
Applicant: Cybernet Systems Corporation
Inventors: Eugene Foulk, Ronald Hay, Katherine Scott, Merrill D. Squiers, Joseph Tesar, Charles J. Cohen, Charles J. Jacobus
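The Magnifier's key step is mapping a selection inside the magnified view back to the screen point it depicts so the event can be replayed there. A minimal sketch of that coordinate mapping (function and parameter names are hypothetical, not from the patent):

```python
def magnified_to_screen(click_x, click_y, region_x, region_y, zoom):
    """Map a click inside the magnified view back to the real screen
    coordinate it depicts.  (region_x, region_y) is the top-left of
    the screen region currently shown magnified; zoom is the
    magnification factor.  All names are illustrative."""
    # A pixel at (click_x, click_y) in the magnified image covers
    # 1/zoom of a screen pixel, offset by the region's origin.
    return (region_x + click_x / zoom, region_y + click_y / zoom)
```

For example, with a 4x zoom of the region whose top-left corner is at screen point (200, 300), a click at (100, 50) in the magnified view maps back to screen point (225.0, 312.5).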
-
Patent number: 7460690
Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
Type: Grant
Filed: September 14, 2005
Date of Patent: December 2, 2008
Assignee: Cybernet Systems Corporation
Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
-
Publication number: 20080216176
Abstract: A hardware-assisted security system for networked computers can detect, prevent, and mitigate rootkits. The solution relies upon an add-on card that monitors the system, alerting administrators when malicious changes are made to a system. The technical detail lies in the techniques needed to detect rootkits, preventing rootkits when possible, and granting administration of protected systems. A beneficial side-effect of the solution is that it allows many other security features, like system auditing, forensic capabilities to determine what happened after an attack, and hardware lock-down of important system resources.
Type: Application
Filed: February 6, 2008
Publication date: September 4, 2008
Applicant: Cybernet Systems Corporation
Inventors: Chris C. Lomont, Charles J. Jacobus
-
Patent number: 7345672
Abstract: A system and method for providing a tactile virtual reality to a user is presented. The position and orientation of the user is utilized to generate a virtual reality force field. Forces are in turn generated on the user as a function of this force field. A six-axis manipulator is presented for providing a user interface to such a system. This manipulator provides a unique kinematic structure with two constant force springs, which provide gravity compensation so that the manipulator effectively floats.
Type: Grant
Filed: February 27, 2004
Date of Patent: March 18, 2008
Assignee: Immersion Corporation
Inventors: Charles J. Jacobus, Alan J. Riggs, Mark J. Taylor
-
Patent number: 7121946
Abstract: A real-time computer vision system tracks the head of a computer user to implement real-time control of games or other applications. The imaging hardware includes a color camera, frame grabber, and processor. The software consists of the low-level image grabbing software and a tracking algorithm. The system tracks objects based on the color, motion and/or shape of the object in the image. A color matching function is used to compute three measures of the target's probable location based on the target color, shape and motion. The method then computes the most probable location of the target using a weighting technique. Once the system is running, a graphical user interface displays the live image from the color camera on the computer screen. The operator can then use the mouse to select a target for tracking. The system will then keep track of the moving target in the scene in real-time.
Type: Grant
Filed: June 29, 2001
Date of Patent: October 17, 2006
Assignee: Cybernet Systems Corporation
Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
-
Patent number: 7073752
Abstract: An electronic automatic reserve or primary parachute activation device incorporates partial or complete capture of freefaller or tethered parachute jumper kinematics to rapidly and reliably determine when to automatically activate deployment of the primary or reserve chute. This device uses means for directly measuring acceleration, velocity and/or position in addition to air pressure change to enable reliable detection of chute deployment conditions earlier than is possible with conventional pressure change activated automatic activation devices. This is important when the activation decision must be made within 5–10 seconds of the initiation of the jump, as is the case for military low altitude parachuting.
Type: Grant
Filed: October 25, 2004
Date of Patent: July 11, 2006
Assignee: Cybernet Systems Corporation
Inventors: Nestor Voronka, Charles J. Jacobus, Derek M. Johnson, Pavan Namineni
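As an illustrative sketch only (the signals, thresholds, and function name below are hypothetical and not taken from the patent), combining a kinematically derived descent rate with altitude can trigger deployment earlier than waiting for a pressure-change signature alone:

```python
def should_deploy(alt_m, vert_speed_mps,
                  arm_alt_m=300.0, max_descent_mps=-35.0):
    """Return True when below the arming altitude while descending
    faster than the threshold.  alt_m: altitude above ground in
    meters; vert_speed_mps: vertical speed (negative = descending),
    e.g. obtained by integrating measured acceleration rather than
    differentiating noisy pressure.  All thresholds are placeholders."""
    return alt_m < arm_alt_m and vert_speed_mps < max_descent_mps
```

Because the descent rate comes directly from inertial measurement, the condition can be evaluated within the first seconds of a jump, which matters for the 5–10 second low-altitude window the abstract mentions.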
-
Patent number: 7050606
Abstract: A system and method tracks the movements of a driver or passenger in a vehicle (ground, water, air, or other) and controls devices in accordance with position, motion, and/or body or hand gestures or movements. According to one embodiment, an operator or passenger uses the invention to control comfort or entertainment features such as the heater, air conditioner, lights, mirror positions or the radio/CD player using hand gestures. An alternative embodiment facilitates the automatic adjustment of car seating restraints based on head position. Yet another embodiment is used to determine when to fire an airbag (and at what velocity or orientation) based on the position of a person in a vehicle seat. The invention may also be used to control systems outside of the vehicle.
Type: Grant
Filed: November 1, 2001
Date of Patent: May 23, 2006
Assignee: Cybernet Systems Corporation
Inventors: George V. Paul, Glenn J. Beach, Charles J. Cohen, Charles J. Jacobus
-
Patent number: 7036094
Abstract: A system for recognizing various human and creature motion gaits and behaviors is presented. These behaviors are defined as combinations of "gestures" identified on various parts of a body in motion. For example, the leg gestures generated when a person runs differ from those generated when a person walks. The system described here can identify such differences and categorize these behaviors. Gestures, as previously defined, are motions generated by humans, animals, or machines. Whereas in the previous patent only one gesture was recognized at a time, in this system multiple gestures on a body (or bodies) are recognized simultaneously and used in determining behaviors. If multiple bodies are tracked by the system, then overall formations and behaviors (such as military goals) can be determined.
Type: Grant
Filed: March 31, 2000
Date of Patent: April 25, 2006
Assignee: Cybernet Systems Corporation
Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
-
Patent number: 6950534
Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
Type: Grant
Filed: January 16, 2004
Date of Patent: September 27, 2005
Assignee: Cybernet Systems Corporation
Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
-
Patent number: 6875174
Abstract: A general-purpose, low-cost system provides comprehensive physiological data collection, with extensive data object oriented programmability and configurability for a variety of medical as well as other analog data collection applications. In a preferred embodiment, programmable input signal acquisition and processing circuits are used so that virtually any analog and/or medical signal can be digitized from a common point of contact to a plurality of sensors. A general-purpose data routing and encapsulation architecture supports input tagging and standardized routing through modern packet switch networks, including the Internet; from one of multiple points of origin or patients, to one or multiple points of data analysis for physician review. The preferred architecture further supports multiple-site data buffering for redundancy and reliability, and real-time data collection, routing, and viewing (or slower than real-time processes when communications infrastructure is slower than the data collection rate).
Type: Grant
Filed: April 18, 2002
Date of Patent: April 5, 2005
Assignee: Cybernet Systems Corporation
Inventors: Jeffrey C. Braun, Charles J. Jacobus, Scott Booth, Michael Suarez, Derek Smith, Jeff Hartnagle, Glenn Leprell
-
Patent number: 6801637
Abstract: An optical system tracks the motion of objects, including the human body or portions thereof, using a plurality of three-dimensional active markers based upon triangulation from data read via multiple linear CCDs through cylindrical lenses. Each marker is lit in sequence so that it is in sync with a frame capture, using the imaging system positioned and oriented so as to provide a basis for computing three-dimensional location. In the preferred embodiment, the imaging system detects an infrared signal which is sent out by the tag controller as part of the tag/marker illumination sequence at the beginning of the first tag position capture time. The controller then traverses through the tags in time sync with each imaging system frame capture cycle. Thus, only one unique tag will be lit during each image capture of the cameras, thereby simplifying identification. Using linear CCD sensors, the frame time (i.e.
Type: Grant
Filed: February 22, 2001
Date of Patent: October 5, 2004
Assignee: Cybernet Systems Corporation
Inventors: Nestor Voronka, Charles J. Jacobus
-
Patent number: 6801008
Abstract: A system and method for providing a tactile virtual reality to a user is presented. The position and orientation of the user is utilized to generate a virtual reality force field. Forces are in turn generated on the user as a function of this force field. A six-axis manipulator is presented for providing a user interface to such a system. This manipulator provides a unique kinematic structure with two constant force springs which provide gravity compensation so that the manipulator effectively floats.
Type: Grant
Filed: August 14, 2000
Date of Patent: October 5, 2004
Assignee: Immersion Corporation
Inventors: Charles J. Jacobus, Alan J. Riggs, Mark J Taylor
-
Publication number: 20040164960
Abstract: A system and method for providing a tactile virtual reality to a user is presented. The position and orientation of the user is utilized to generate a virtual reality force field. Forces are in turn generated on the user as a function of this force field. A six-axis manipulator is presented for providing a user interface to such a system. This manipulator provides a unique kinematic structure with two constant force springs, which provide gravity compensation so that the manipulator effectively floats.
Type: Application
Filed: February 27, 2004
Publication date: August 26, 2004
Inventors: Charles J. Jacobus, Alan J. Riggs, Mark J. Taylor
-
Publication number: 20040161132
Abstract: A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measure is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description.
Type: Application
Filed: January 16, 2004
Publication date: August 19, 2004
Inventors: Charles J. Cohen, Glenn Beach, Brook Cavell, Gene Foulk, Charles J. Jacobus, Jay Obermark, George Paul
-
Patent number: RE39906
Abstract: Force feedback in large, immersive environments is provided by a device which uses gyro-stabilization to generate a fixed point of leverage for the requisite forces and/or torques. In one embodiment, one or more orthogonally oriented rotating gyroscopes are used to provide a stable platform to which a force-reflecting device can be mounted, thereby coupling reaction forces to a user without the need for connection to a fixed frame. In one physical realization, a rigid handle or joystick is directly connected to the three-axis stabilized platform, and an inventive control scheme is used to modulate motor torques so that only the desired forces are felt. In an alternative embodiment, a reaction sphere is used to produce the requisite inertial stabilization. Since the sphere is capable of providing controlled torques about three arbitrary, linearly independent axes, it can be used in place of three reaction wheels to provide three-axis stabilization for a variety of space-based and terrestrial applications.
Type: Grant
Filed: June 21, 2001
Date of Patent: November 6, 2007
Assignee: Immersion Corporation
Inventors: Gerald P. Roston, Charles J. Jacobus