Patents by Inventor Colin Swindells

Colin Swindells has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20200393903
    Abstract: Multi-directional kinesthetic actuation systems are provided. The multi-directional kinesthetic actuation systems are configured to provide kinesthetic effects in multiple directions through both pulling and pushing forces. Multi-directional kinesthetic actuation systems include at least an active linkage, one or more hinges, and a motor. The motor is employed to advance or retract the active linkage. The active linkage is activated to provide increased buckling strength to transfer force to the hinges and deactivated to increase flexibility to facilitate retraction by the motor. The hinges are configured to translate the pushing or pulling force provided by the active linkage into a torque to be provided to a user's finger.
    Type: Application
    Filed: June 11, 2019
    Publication date: December 17, 2020
    Inventors: Vahid KHOSHKAVA, Robert LACROIX, Colin SWINDELLS, Sanya ATTARI
  • Patent number: 10665067
    Abstract: Systems and methods for integrating haptics overlay in augmented reality are disclosed. One illustrative system described herein includes a haptic output device. The system also includes a display configured to output a visual effect. The system also includes a sensor for tracking a position of a proxy object. The system also includes a processor configured to: determine a modified visual effect based in part on data received from the sensor, determine a haptic effect based in part on data received from the sensor, transmit a display signal associated with the modified visual effect to the display, transmit a haptic signal associated with the haptic effect to the haptic output device; and output the haptic effect using the haptic output device.
    Type: Grant
    Filed: June 15, 2018
    Date of Patent: May 26, 2020
    Assignee: Immersion Corporation
    Inventors: Satoshi Araki, Christopher J. Ullrich, Liwen Wu, Juan Manuel Cruz-Hernandez, Danny A. Grant, Sanya Attari, Colin Swindells
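The processor behavior this abstract claims (track a proxy object, derive a modified visual effect and a haptic effect from the sensor data, then dispatch both) can be sketched as a per-frame function. Everything below — `ProxyPose`, `render_frame`, and the distance-based intensity model — is an illustrative assumption, not the patented method:

```python
from dataclasses import dataclass

@dataclass
class ProxyPose:
    """Tracked position of the proxy object (hypothetical structure)."""
    x: float
    y: float
    z: float

def render_frame(pose: ProxyPose):
    """Illustrative pipeline: derive a visual overlay and a haptic
    effect from the tracked proxy-object pose, then return both for
    dispatch to the display and the haptic output device."""
    # Modified visual effect: an overlay anchored at the proxy position.
    visual = {"overlay_at": (pose.x, pose.y, pose.z)}
    # Haptic effect: intensity grows as the proxy nears the origin
    # (a stand-in for whatever mapping the real system uses).
    distance = (pose.x ** 2 + pose.y ** 2 + pose.z ** 2) ** 0.5
    haptic = {"intensity": max(0.0, 1.0 - distance)}
    return visual, haptic

visual, haptic = render_frame(ProxyPose(0.3, 0.0, 0.4))
```

In this sketch the same sensor reading drives both output paths, mirroring the claim that the visual and haptic signals are each determined "based in part on data received from the sensor".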
  • Publication number: 20200159326
    Abstract: A hand-held device for providing haptic feedback includes an elongated housing, a mass, a mass restriction device, a first sensor and a second sensor. The elongated housing includes at least two chambers. The mass is slidably disposed within the chambers and is slidable by gravity. The mass restriction device restricts the mass within at least one of the chambers. The first sensor is configured to sense an orientation of the elongated housing and the second sensor is configured to sense a location of the mass within the elongated housing relative to the chambers. In response to a command signal indicative of a virtual interaction related to manipulating a virtual object, the mass restriction device restricts the mass within at least one of the chambers to effect a perceived change in weight as the virtual object is manipulated by the user.
    Type: Application
    Filed: November 21, 2018
    Publication date: May 21, 2020
    Inventors: William S. RIHN, Colin SWINDELLS
  • Patent number: 10583359
    Abstract: A wearable device for providing haptic effects includes a wearable housing configured to be worn on a portion of a hand of a wearer and an actuator secured to the wearable housing. The wearable housing includes a first digit segment configured to conform to a finger of the hand and a second digit segment configured to conform to another finger or a thumb of the hand. The actuator is configured to receive a command signal indicative of a virtual interaction related to touching or grasping a virtual object. In response to the command signal, the actuator provides a force onto the portion of the hand or provides a force to render a resistance to movement of the first and second digit segments toward each other. The actuator uses the wearable housing to mechanically stabilize the force towards the portion of the hand.
    Type: Grant
    Filed: December 28, 2017
    Date of Patent: March 10, 2020
    Assignee: Immersion Corporation
    Inventors: Robert Heubel, Juan Manuel Cruz-Hernandez, Vahid Khoshkava, Sanya Attari, Colin Swindells, Satoshi Araki
  • Publication number: 20200026354
    Abstract: Systems and methods are provided for generating haptic effects adapted to a dynamic system associated with a user's dynamic interaction with a haptic-enabled apparatus. The systems and methods operate to monitor a dynamic change in the dynamic system and automatically modify haptic rendering in real time so that the user feels consistent haptic effects adapted to the change in the dynamic system.
    Type: Application
    Filed: July 17, 2018
    Publication date: January 23, 2020
    Inventor: Colin SWINDELLS
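The adaptation loop this abstract describes — monitor the dynamic system, then rescale the haptic rendering in real time so the felt effect stays consistent — could be sketched as a proportional correction. The function and its gain model are illustrative assumptions, not the patented algorithm:

```python
def adapt_haptic_drive(base_amplitude: float, measured_response: float,
                       target_response: float) -> float:
    """Illustrative real-time adaptation: scale the actuator drive so the
    effect the user feels stays constant as the dynamic system changes
    (e.g. a firmer grip damping the vibration that reaches the skin)."""
    if measured_response <= 0:
        # No usable measurement this cycle; fall back to the base drive.
        return base_amplitude
    # Simple proportional correction toward the target felt response.
    return base_amplitude * (target_response / measured_response)
```

For example, if the measured response drops to half the target (the system is absorbing more energy), the drive amplitude doubles to compensate.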
  • Publication number: 20190385419
    Abstract: Systems and methods for integrating haptics overlay in augmented reality are disclosed. One illustrative system described herein includes a haptic output device. The system also includes a display configured to output a visual effect. The system also includes a sensor for tracking a position of a proxy object. The system also includes a processor configured to: determine a modified visual effect based in part on data received from the sensor, determine a haptic effect based in part on data received from the sensor, transmit a display signal associated with the modified visual effect to the display, transmit a haptic signal associated with the haptic effect to the haptic output device; and output the haptic effect using the haptic output device.
    Type: Application
    Filed: June 15, 2018
    Publication date: December 19, 2019
    Inventors: Satoshi Araki, Christopher J. Ullrich, Liwen Wu, Juan Manuel Cruz-Hernandez, Danny A. Grant, Sanya Attari, Colin Swindells
  • Publication number: 20190354183
    Abstract: Systems and methods for providing kinesthetic feedback for virtual and augmented reality controllers are disclosed. One illustrative system described herein includes an interface device including a virtual or augmented reality controller configured to receive input from a user and output a controller signal, and a haptic output device coupled to the virtual or augmented reality controller and to a mechanical ground, the haptic output device configured to output haptic effects. The system also includes a processor coupled to the virtual or augmented reality controller and the haptic output device, the processor configured to: receive the controller signal; determine a haptic effect based in part on the controller signal; and transmit a haptic signal associated with the haptic effect to the haptic output device.
    Type: Application
    Filed: May 16, 2018
    Publication date: November 21, 2019
    Inventors: Colin Swindells, William S. Rihn
  • Publication number: 20190204929
    Abstract: Devices and methods for dynamic association of user inputs to mobile device actions are provided. User sensing panels associated with a mobile device may detect the presence or contact of multiple hand parts. A signal from the user sensing panels indicative of the presence or contact may be associated with a mobile device action. To associate the signal with the mobile device action, a processor associated with the mobile device may determine the identities of the hand parts and may recognize user gestures of the identified hand parts. A processor may cause the execution of the mobile device action upon receipt of the signal.
    Type: Application
    Filed: December 29, 2017
    Publication date: July 4, 2019
    Inventors: Sanya ATTARI, Colin SWINDELLS
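The association step in this abstract — identify which hand part touched a sensing panel, recognize its gesture, and resolve the pair to a device action — amounts to a lookup from (hand part, gesture) to action. A minimal sketch; the table entries and function names are hypothetical, not from the patent:

```python
from typing import Optional

# Illustrative gesture-to-action associations; all names are hypothetical.
ACTION_TABLE = {
    ("thumb", "swipe_up"): "volume_up",
    ("index", "double_tap"): "open_camera",
    ("palm", "squeeze"): "mute",
}

def dispatch(hand_part: str, gesture: str) -> Optional[str]:
    """Resolve an identified hand part plus a recognized gesture to a
    mobile-device action, or None when no association exists."""
    return ACTION_TABLE.get((hand_part, gesture))
```

Because the table is data rather than code, the "dynamic association" the abstract mentions could be realized by rewriting entries at runtime, e.g. per application.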
  • Publication number: 20190201785
    Abstract: A wearable device for providing haptic effects includes a wearable housing configured to be worn on a portion of a hand of a wearer and an actuator secured to the wearable housing. The wearable housing includes a first digit segment configured to conform to a finger of the hand and a second digit segment configured to conform to another finger or a thumb of the hand. The actuator is configured to receive a command signal indicative of a virtual interaction related to touching or grasping a virtual object. In response to the command signal, the actuator provides a force onto the portion of the hand or provides a force to render a resistance to movement of the first and second digit segments toward each other. The actuator uses the wearable housing to mechanically stabilize the force towards the portion of the hand.
    Type: Application
    Filed: December 28, 2017
    Publication date: July 4, 2019
    Inventors: Robert HEUBEL, Juan Manuel CRUZ-HERNANDEZ, Vahid KHOSHKAVA, Sanya ATTARI, Colin SWINDELLS, Satoshi ARAKI
  • Patent number: 7736000
    Abstract: An eye tracking apparatus includes a first image detector for capturing scene images, a second image detector fixed relative to the first image detector for capturing eye images including a pupil of a person and a reference object located within an image detection field, a processor for receiving the scene images and the eye images and outputting eye tracking images, each of the eye tracking images including a gaze point corresponding to a line of sight of the person, the processor for determining a location of the gaze point based on a calibration mapping that maps pupil location within the image detection field to the scene images and a correction mapping that corrects for movement of the reference object within the image detection field of the second image detector. The first image detector and the second image detector are coupled to a wearable accessory, and movement of the reference object within the image detection field corresponds to movement of the wearable accessory relative to the person.
    Type: Grant
    Filed: February 25, 2009
    Date of Patent: June 15, 2010
    Assignee: Locarna Systems, Inc.
    Inventors: Mario Enriquez, Colin Swindells, Ricardo Pedrosa
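The two-stage estimate in this abstract — a calibration mapping from pupil coordinates to scene-image coordinates, plus a correction mapping that compensates for movement of the reference object (i.e. the wearable accessory slipping on the head) — can be sketched as follows. The per-axis affine calibration and the simple subtractive drift correction are assumptions for illustration; the patent does not specify these forms:

```python
def estimate_gaze(pupil_xy, ref_xy, ref_home_xy, calib):
    """Illustrative two-stage gaze estimate.

    pupil_xy:    pupil center in the eye-camera image
    ref_xy:      current reference-object position in that image
    ref_home_xy: reference-object position recorded at calibration time
    calib:       ((sx, ox), (sy, oy)) per-axis affine calibration (assumed)
    """
    (sx, ox), (sy, oy) = calib
    # Calibration mapping: pupil location -> scene-image coordinates.
    gx = sx * pupil_xy[0] + ox
    gy = sy * pupil_xy[1] + oy
    # Correction mapping: subtract reference-object drift, which stands
    # in for movement of the headgear relative to the person.
    dx = ref_xy[0] - ref_home_xy[0]
    dy = ref_xy[1] - ref_home_xy[1]
    return gx - dx, gy - dy
```

With no drift (`ref_xy == ref_home_xy`) the correction vanishes and the result is the calibration mapping alone.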
  • Publication number: 20100128118
    Abstract: A method for identifying a visual fixation in an eye tracking video including: locating eye gaze coordinates in a first frame of a video, defining a spatial region surrounding the eye gaze coordinates, identifying and marking consecutive video frames having an eye gaze coordinate location within the spatial region. The consecutive video frames span at least a minimum fixation time and define a visual fixation.
    Type: Application
    Filed: November 25, 2009
    Publication date: May 27, 2010
    Applicant: LOCARNA SYSTEMS, INC.
    Inventors: Colin SWINDELLS, Mario ENRIQUEZ, Ricardo PEDROSA
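The claimed method reads as a region-based fixation detector: anchor a spatial region at a frame's gaze point, extend over consecutive frames whose gaze stays inside that region, and keep runs that meet the minimum fixation time. A minimal sketch, with a pixel radius and a frame-count threshold as assumed stand-ins for the spatial region and the minimum fixation time:

```python
def find_fixations(gaze, radius=25.0, min_frames=6):
    """Illustrative region-based fixation detection.

    gaze:       list of (x, y) gaze coordinates, one per video frame
    radius:     size of the spatial region around the anchor frame (assumed)
    min_frames: minimum fixation time expressed in frames (assumed)

    Returns (start, end) inclusive frame spans for each fixation found.
    """
    fixations = []
    i = 0
    while i < len(gaze):
        cx, cy = gaze[i]  # spatial region anchored at this frame's gaze
        j = i + 1
        while j < len(gaze):
            x, y = gaze[j]
            if (x - cx) ** 2 + (y - cy) ** 2 > radius ** 2:
                break  # gaze left the spatial region
            j += 1
        if j - i >= min_frames:
            fixations.append((i, j - 1))
        i = j
    return fixations
```

At a 30 fps video rate, `min_frames=6` corresponds to a 200 ms minimum fixation time, a common choice in the eye tracking literature.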
  • Publication number: 20100053555
    Abstract: An eye tracking apparatus includes a first image detector for capturing scene images, a second image detector fixed relative to the first image detector for capturing eye images including a pupil of a person and a reference object located within an image detection field, a processor for receiving the scene images and the eye images and outputting eye tracking images, each of the eye tracking images including a gaze point corresponding to a line of sight of the person, the processor for determining a location of the gaze point based on a calibration mapping that maps pupil location within the image detection field to the scene images and a correction mapping that corrects for movement of the reference object within the image detection field of the second image detector. The first image detector and the second image detector are coupled to a wearable accessory, and movement of the reference object within the image detection field corresponds to movement of the wearable accessory relative to the person.
    Type: Application
    Filed: February 25, 2009
    Publication date: March 4, 2010
    Applicant: LOCARNA SYSTEMS, INC.
    Inventors: Mario ENRIQUEZ, Colin SWINDELLS, Ricardo PEDROSA