Patents by Inventor Stephen H. Lane

Stephen H. Lane has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11893696
    Abstract: Methods, systems, and computer readable media for providing an extended reality (XR) user interface. A method for providing an XR user interface occurs at a user device executing an XR application.
    Type: Grant
    Filed: August 25, 2021
    Date of Patent: February 6, 2024
    Assignee: THE TRUSTEES OF THE UNIVERSITY OF PENNSYLVANIA
    Inventors: Stephen H. Lane, Matthew Anthony Boyd-Surka, Yaoyi Bai, Aline Sarah Normoyle
  • Publication number: 20220068029
    Abstract: Methods, systems, and computer readable media for providing an extended reality (XR) user interface. A method for providing an XR user interface occurs at a user device executing an XR application.
    Type: Application
    Filed: August 25, 2021
    Publication date: March 3, 2022
    Inventors: Stephen H. Lane, Matthew Anthony Boyd-Surka, Yaoyi Bai, Aline Sarah Normoyle
  • Patent number: 10387032
    Abstract: Methods, systems, and computer readable media for receiving user input. According to one example method for receiving user input, the method includes identifying gestures from directional movements of a user's fingers on a touch sensor, mapping the gestures to alphanumeric characters, and outputting alphanumeric characters, including defining a set of the gestures as different sequences of at least two of: a contact event, a no contact event, a hold event, a finger movement in a first direction, a finger movement in a second direction. [An illustrative sketch of this gesture-to-character mapping appears after the listing.]
    Type: Grant
    Filed: March 4, 2016
    Date of Patent: August 20, 2019
    Assignee: The Trustees of the University of Pennsylvania
    Inventors: Stephen H. Lane, Ji Hyun Kong
  • Publication number: 20180039402
    Abstract: Methods, systems, and computer readable media for receiving user input. According to one example method for receiving user input, the method includes identifying gestures from directional movements of a user's fingers on a touch sensor, mapping the gestures to alphanumeric characters, and outputting alphanumeric characters, including defining a set of the gestures as different sequences of at least two of: a contact event, a no contact event, a hold event, a finger movement in a first direction, a finger movement in a second direction.
    Type: Application
    Filed: March 4, 2016
    Publication date: February 8, 2018
    Inventors: Stephen H. Lane, Ji Hyun Kong
  • Patent number: 9067097
    Abstract: Apparatus and methods are disclosed for a virtual locomotion controller user interface and system that combines data obtained from various sensor devices to allow users to control the movements of their representation in a virtual world using sensorimotor responses closely resembling the tasks and actions they would physically perform in the real world. As a result, users can specify an avatar's locomotion style by assuming body postures normally associated with that type of movement, while controlling locomotion speed or displacement through foot forces and/or stepping motions and locomotion direction through foot and body orientation. [An illustrative sketch of this sensor-to-locomotion mapping appears after the listing.]
    Type: Grant
    Filed: April 9, 2010
    Date of Patent: June 30, 2015
    Assignee: soVoz, Inc.
    Inventors: Stephen H. Lane, Joseph Schnurr, Hemanth K. Satyanarayana, Liming Zhao
  • Publication number: 20110009241
    Abstract: Apparatus and methods are disclosed for a virtual locomotion controller user interface and system that combines data obtained from various sensor devices to allow users to control the movements of their representation in a virtual world using sensorimotor responses closely resembling the tasks and actions they would physically perform in the real world. As a result, users can specify an avatar's locomotion style by assuming body postures normally associated with that type of movement, while controlling locomotion speed or displacement through foot forces and/or stepping motions and locomotion direction through foot and body orientation.
    Type: Application
    Filed: April 9, 2010
    Publication date: January 13, 2011
    Applicant: SOVOZ, INC.
    Inventors: Stephen H. Lane, Joseph Schnurr, Hemanth K. Satyanarayana, Liming Zhao
  • Patent number: 6191798
    Abstract: A method and apparatus for interactively controlling and coordinating the limb movements of computer-generated articulated characters with an arbitrary number of joints. On-line computational methods are used for animating limb movements of articulated characters by solving associated forward and inverse kinematics problems in real time subject to multiple goals and constraints. The methods provide computer animated characters with fully interactive goal-directed behaviors, such as bipedal walking, through simultaneous satisfaction of position, alignment, posture, balance, obstacle avoidance, and joint limitation constraints. Goal-based motion primitives, called synergies, coordinate sets of joint movements which separately attempt to satisfy each of the above constraints. [An illustrative sketch of synergy blending appears after the listing.]
    Type: Grant
    Filed: March 31, 1997
    Date of Patent: February 20, 2001
    Assignee: Katrix, Inc.
    Inventors: David A. Handelman, Stephen H. Lane, Vijaykumar Gullapalli
  • Patent number: 6088042
    Abstract: Recorded motion data is combined with interactive control techniques to manipulate the animation of articulated figures. The methods enable computer animated characters to produce fully interactive goal-directed behaviors, such as bipedal walking, through simultaneous satisfaction of position, alignment, posture, balance, obstacle avoidance, and joint limitation constraints while retaining qualitative characteristics of the original non-interactive motion data. Goal-based motion primitives, called synergies, are used to coordinate sets of joint movements that attempt to satisfy each of the above constraints. [An illustrative sketch of layering interactive corrections on recorded motion appears after the listing.]
    Type: Grant
    Filed: March 31, 1997
    Date of Patent: July 11, 2000
    Assignee: Katrix, Inc.
    Inventors: David A. Handelman, Stephen H. Lane, Vijaykumar Gullapalli
  • Patent number: 6057859
    Abstract: A method and apparatus for interactively controlling and coordinating the limb movements of computer-generated articulated characters. On-line computational methods are used for animating limb movements of articulated characters by solving associated forward and inverse kinematics problems in real time subject to multiple goals and constraints. The methods provide fully interactive goal-directed behaviors, such as bipedal walking, through simultaneous satisfaction of position, alignment, posture, balance, obstacle avoidance, and joint limitation constraints. Goal-based motion primitives, called synergies, coordinate sets of joint movements which separately attempt to satisfy each of the above constraints. The present methods adapt character movements on-line to accommodate uneven terrain, body modifications, or changes in the environment by automatically transforming and producing joint rotations relative to the instantaneous point of contact of the body with the world. [An illustrative sketch of contact-relative positioning appears after the listing.]
    Type: Grant
    Filed: March 31, 1997
    Date of Patent: May 2, 2000
    Assignee: Katrix, Inc.
    Inventors: David A. Handelman, Stephen H. Lane, Vijaykumar Gullapalli
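
The abstract for patent 10387032 describes identifying gestures as sequences of touch events (contact, no contact, hold, movement in a first direction, movement in a second direction) and mapping those sequences to alphanumeric characters. The patent text does not publish code; the following is a minimal Python sketch of one way such a mapping could be represented. The TouchEvent names, the example gesture table, and the decode_gesture function are illustrative assumptions, not the patented implementation.

```python
"""Illustrative sketch only: a gesture-sequence-to-character mapping in the
spirit of US 10,387,032. Event names, the example table, and the decoding
logic are assumptions for illustration."""
from __future__ import annotations

from enum import Enum, auto


class TouchEvent(Enum):
    """Primitive touch-sensor events named in the abstract."""
    CONTACT = auto()       # finger touches the sensor
    NO_CONTACT = auto()    # finger lifts off the sensor
    HOLD = auto()          # finger dwells in place past a threshold
    MOVE_DIR_1 = auto()    # movement in a first direction
    MOVE_DIR_2 = auto()    # movement in a second direction


# Hypothetical gesture table: each gesture is a sequence of at least two
# events, mapped to an alphanumeric character.
GESTURE_TABLE: dict[tuple[TouchEvent, ...], str] = {
    (TouchEvent.CONTACT, TouchEvent.MOVE_DIR_1): "a",
    (TouchEvent.CONTACT, TouchEvent.MOVE_DIR_2): "b",
    (TouchEvent.CONTACT, TouchEvent.HOLD): "c",
    (TouchEvent.CONTACT, TouchEvent.MOVE_DIR_1, TouchEvent.MOVE_DIR_2): "1",
    (TouchEvent.CONTACT, TouchEvent.HOLD, TouchEvent.NO_CONTACT): "2",
}


def decode_gesture(events: list[TouchEvent]) -> str | None:
    """Map an observed event sequence to a character, or None if unknown."""
    return GESTURE_TABLE.get(tuple(events))


if __name__ == "__main__":
    observed = [TouchEvent.CONTACT, TouchEvent.MOVE_DIR_1]
    print(decode_gesture(observed))  # -> "a"
```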
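
Patent 9067097 describes fusing data from several sensor devices so that body posture selects the avatar's locomotion style, foot forces or stepping motions set speed or displacement, and foot and body orientation set direction. A simplified sketch of that kind of mapping is below; the sensor fields, thresholds, and classification rules are assumptions for illustration only.

```python
"""Illustrative sketch only: combining posture, foot-force, and orientation
readings into an avatar locomotion command, in the spirit of US 9,067,097.
Field names, thresholds, and the style rule are assumptions."""

import math
from dataclasses import dataclass


@dataclass
class SensorFrame:
    torso_pitch_deg: float   # forward lean of the torso, from an IMU
    left_foot_force: float   # normalized foot pressure, 0..1
    right_foot_force: float  # normalized foot pressure, 0..1
    foot_yaw_deg: float      # facing direction of the feet, world frame


@dataclass
class LocomotionCommand:
    style: str               # e.g., "walk", "run", "crouch_walk"
    speed: float             # avatar speed, meters per second
    heading_deg: float       # direction of travel, world frame


def fuse(frame: SensorFrame) -> LocomotionCommand:
    """Map one frame of sensor data to a locomotion command."""
    # Style from body posture: deep forward lean -> run, crouch -> crouch walk.
    if frame.torso_pitch_deg > 20.0:
        style = "run"
    elif frame.torso_pitch_deg < -10.0:
        style = "crouch_walk"
    else:
        style = "walk"

    # Speed from foot forces: harder stepping drives faster motion.
    total_force = frame.left_foot_force + frame.right_foot_force
    max_speed = 4.0 if style == "run" else 1.5
    speed = max_speed * min(1.0, total_force / 2.0)

    # Direction from foot/body orientation.
    heading = math.fmod(frame.foot_yaw_deg, 360.0)

    return LocomotionCommand(style=style, speed=speed, heading_deg=heading)


if __name__ == "__main__":
    print(fuse(SensorFrame(25.0, 0.8, 0.9, 15.0)))
```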
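
Patent 6191798 describes goal-based motion primitives, called synergies, that each coordinate joint movements toward one constraint (position, alignment, posture, balance, obstacle avoidance, joint limits), with the results combined to drive the character in real time. The sketch below shows one simple way such per-constraint proposals could be blended and clamped to joint limits; the chain size, gains, weights, and limits are assumptions for illustration.

```python
"""Illustrative sketch only: blending per-constraint joint-angle proposals
("synergies") into one update, in the spirit of US 6,191,798. The weighted
blend, gains, and limits are assumptions for illustration."""

import numpy as np

# Hypothetical 3-joint chain with per-joint angle limits (radians).
JOINT_LIMITS_LOW = np.radians([-45.0, -90.0, -30.0])
JOINT_LIMITS_HIGH = np.radians([45.0, 10.0, 60.0])


def position_synergy(q: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Propose joint deltas that nudge the joints toward a target pose."""
    return 0.5 * (target - q)


def posture_synergy(q: np.ndarray, rest: np.ndarray) -> np.ndarray:
    """Propose joint deltas that pull the joints back toward a rest posture."""
    return 0.1 * (rest - q)


def blend_synergies(q, proposals, weights):
    """Weighted sum of synergy proposals, clamped to the joint limits."""
    delta = sum(w * p for w, p in zip(weights, proposals))
    return np.clip(q + delta, JOINT_LIMITS_LOW, JOINT_LIMITS_HIGH)


if __name__ == "__main__":
    q = np.zeros(3)                           # current joint angles (radians)
    target = np.radians([30.0, -40.0, 20.0])  # e.g., from an inverse-kinematics goal
    rest = np.radians([0.0, -10.0, 5.0])      # preferred posture

    for _ in range(20):                       # iterate toward the goals
        proposals = [position_synergy(q, target), posture_synergy(q, rest)]
        q = blend_synergies(q, proposals, weights=[1.0, 0.5])

    print(np.degrees(q).round(1))
```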
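
Patent 6088042 combines recorded motion data with the same kind of interactive, goal-directed control so the character remains responsive while keeping the qualitative character of the original clip. Below is a minimal sketch of that idea: play back a recorded pose and layer a small goal-directed correction on top. The clip data, correction rule, and gain are assumptions for illustration, not the patented method.

```python
"""Illustrative sketch only: layering an interactive, goal-directed correction
on top of recorded motion data, in the spirit of US 6,088,042. The clip data,
correction rule, and gain are assumptions for illustration."""

import numpy as np

# Hypothetical recorded clip: joint angles (radians) for a 3-joint chain,
# one row per frame.
RECORDED_CLIP = np.radians([
    [0.0, -10.0, 5.0],
    [5.0, -20.0, 10.0],
    [10.0, -30.0, 15.0],
    [15.0, -35.0, 18.0],
])


def goal_correction(pose: np.ndarray, goal: np.ndarray, gain: float = 0.3) -> np.ndarray:
    """Small per-frame adjustment pulling the pose toward an interactive goal."""
    return gain * (goal - pose)


def play_with_goal(clip: np.ndarray, goal: np.ndarray) -> np.ndarray:
    """Return the clip with the interactive correction layered onto each frame."""
    return np.array([frame + goal_correction(frame, goal) for frame in clip])


if __name__ == "__main__":
    goal = np.radians([20.0, -25.0, 0.0])  # e.g., a reach target set by the user
    adjusted = play_with_goal(RECORDED_CLIP, goal)
    print(np.degrees(adjusted).round(1))   # recorded motion, nudged toward the goal
```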
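
Patent 6057859 additionally adapts movements on-line by producing joint rotations relative to the body's instantaneous point of contact with the world, which lets the same motion follow uneven terrain. The sketch below shows the basic idea in 2-D: position a simple leg chain by working up from whichever foot is currently in contact, so the chain always sits on the ground at that point. The link lengths, angles, and terrain function are assumptions for illustration.

```python
"""Illustrative sketch only: rooting a simple 2-D leg chain at the current
ground-contact point, in the spirit of US 6,057,859. Link lengths, angles,
and the terrain function are assumptions for illustration."""

import math


def terrain_height(x: float) -> float:
    """Hypothetical uneven terrain: ground height as a function of x."""
    return 0.2 * math.sin(x)


def leg_positions_from_contact(contact_x: float, hip_angle: float,
                               knee_angle: float, thigh: float = 0.45,
                               shin: float = 0.45):
    """Compute knee and hip positions by working up from the contact foot.

    Rotations are expressed relative to the instantaneous contact point, so
    the chain automatically rests on the terrain wherever the foot touches.
    """
    foot = (contact_x, terrain_height(contact_x))   # foot pinned to the ground
    knee = (foot[0] + shin * math.sin(knee_angle),
            foot[1] + shin * math.cos(knee_angle))
    hip = (knee[0] + thigh * math.sin(hip_angle),
           knee[1] + thigh * math.cos(hip_angle))
    return foot, knee, hip


if __name__ == "__main__":
    for x in (0.0, 1.0, 2.0):                       # contact points on uneven ground
        foot, knee, hip = leg_positions_from_contact(x, hip_angle=0.2, knee_angle=-0.1)
        print(f"contact x={x:.1f}  foot=({foot[0]:.2f}, {foot[1]:.2f})  "
              f"hip=({hip[0]:.2f}, {hip[1]:.2f})")
```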