Patents by Inventor John P. Pella

John P. Pella has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20200133387
Abstract: A system for detecting facial movements of a user includes: a sensor assembly including: an elastic member configured to couple to a headset and contact a face of a user, proximal to and offset from ocular regions of the face of the user, and a substrate that defines a set of discrete electrical channels and a set of electrode tabs distributed along the elastic member, wherein each electrode tab is configured to proximally contact a muscular region on the face of the user; and a controller configured to sample a set of sense signals from the set of electrode tabs via the set of discrete electrical channels and to interpret changes in the set of sense signals over time as expressions on the face of the user when the headset is worn by the user.
    Type: Application
    Filed: May 24, 2018
    Publication date: April 30, 2020
    Inventors: Kevin Lee, Ivan Roberto Reyes, John P. Pella, Quentin DeWolf
  • Publication number: 20190361519
Abstract: A system for detecting facial movements of a user includes: a sensor assembly including: an elastic member configured to couple to a headset and contact a face of a user, proximal to and offset from ocular regions of the face of the user, and a substrate that defines a set of discrete electrical channels and a set of electrode tabs distributed along the elastic member, wherein each electrode tab is configured to proximally contact a muscular region on the face of the user; and a controller configured to sample a set of sense signals from the set of electrode tabs via the set of discrete electrical channels and to interpret changes in the set of sense signals over time as expressions on the face of the user when the headset is worn by the user.
    Type: Application
    Filed: May 24, 2018
    Publication date: November 28, 2019
    Inventors: Kevin Lee, Ivan Roberto Reyes, John P. Pella, Quentin DeWolf
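
The controller behavior described in the two publications above can be sketched as a simple per-frame comparison of sense signals. This is a minimal illustration only; the channel count, threshold, and frame values are assumptions and do not come from the filings.

```python
# Hypothetical sketch: sample one sense signal per electrode channel and
# flag channels whose signal changed beyond a threshold between frames.

NUM_CHANNELS = 8
CHANGE_THRESHOLD = 0.2  # normalized signal units (assumed)

def detect_changes(previous, current, threshold=CHANGE_THRESHOLD):
    """Indices of channels whose sense signal changed beyond the threshold."""
    return [i for i, (p, c) in enumerate(zip(previous, current))
            if abs(c - p) > threshold]

# Two consecutive frames of sense signals (one value per electrode tab):
baseline = [0.1] * NUM_CHANNELS
frame = [0.1, 0.5, 0.1, 0.1, 0.6, 0.1, 0.1, 0.1]
active = detect_changes(baseline, frame)  # channels 1 and 4 moved
```

A real controller would map which channels move, and how, onto expressions; here the sketch stops at change detection.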
  • Publication number: 20190138096
Abstract: A method for detecting facial emotions includes: recording a set of electromyograph signals through a set of sense electrodes arranged about a viewing window in a virtual reality headset; subtracting a reference signal from each electromyograph signal in the set of electromyograph signals to generate a set of composite signals; for each composite signal in the set of composite signals, transforming the composite signal into a spectrum of electromyograph components; for each facial action unit in a set of facial action units, calculating a score indicating presence of the facial action unit in the user's facial musculature during the sampling interval based on the spectrum of electromyograph components; mapping scores for the set of facial action units to a facial expression of the user during the sampling interval; and transforming the facial expression of the user to an emotion of the user based on an emotion model.
    Type: Application
    Filed: August 22, 2018
    Publication date: May 9, 2019
    Inventors: Kevin Lee, Quentin DeWolf, John P. Pella, Ivan Roberto Reyes
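
The claimed pipeline above can be sketched end to end in a few steps. This is a minimal illustration, assuming a naive DFT for the spectrum step; the frequency bands, action units, and expression/emotion mappings are invented for the example and do not come from the filing.

```python
import cmath

def subtract_reference(emg, reference):
    """Subtract the reference signal to form a composite signal."""
    return [s - r for s, r in zip(emg, reference)]

def dft_magnitudes(signal):
    """Transform a composite signal into a magnitude spectrum (naive DFT)."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n // 2)]

# Score action units per frequency band, then map the strongest action
# unit to an expression and the expression to an emotion. All of the
# bands and mappings below are invented placeholders.
AU_BANDS = {"brow_lowerer": (1, 2), "lip_corner_puller": (2, 4)}
EXPRESSION_OF_AU = {"brow_lowerer": "frown", "lip_corner_puller": "smile"}
EMOTION_OF_EXPRESSION = {"frown": "anger", "smile": "happiness"}

def classify(emg, reference):
    spectrum = dft_magnitudes(subtract_reference(emg, reference))
    scores = {au: sum(spectrum[lo:hi]) for au, (lo, hi) in AU_BANDS.items()}
    expression = EXPRESSION_OF_AU[max(scores, key=scores.get)]
    return expression, EMOTION_OF_EXPRESSION[expression]
```

For example, a composite signal oscillating in the "lip_corner_puller" band would classify as ("smile", "happiness") under these made-up mappings.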
  • Patent number: 9107613
    Abstract: A handheld, cordless scanning device for the three-dimensional image capture of patient anatomy without the use of potentially hazardous lasers, optical reference targets for frame alignment, magnetic reference receivers, or the requirement that the scanning device be plugged in while scanning. The device generally includes a housing having a front end and a rear end. The rear end includes a handle and trigger. The front end includes a pattern projector for projecting a unique pattern onto a target object and a camera for capturing live video of the projected pattern as it is deformed around the object. The front end of the housing also includes a pair of focus beam generators and an indexing beam generator. By utilizing data collected with the present invention, patient anatomy such as anatomical features and residual limbs may be digitized to create accurate three-dimensional representations which may be utilized in combination with computer-aided-drafting programs.
    Type: Grant
    Filed: May 11, 2012
    Date of Patent: August 18, 2015
    Assignee: Provel, Inc.
    Inventors: David G. Firth, Brendan O. Beardsley, John P. Pella
  • Publication number: 20130057652
    Abstract: A handheld, cordless scanning device for the three-dimensional image capture of patient anatomy without the use of potentially hazardous lasers, optical reference targets for frame alignment, magnetic reference receivers, or the requirement that the scanning device be plugged in while scanning. The device generally includes a housing having a front end and a rear end. The rear end includes a handle and trigger. The front end includes a pattern projector for projecting a unique pattern onto a target object and a camera for capturing live video of the projected pattern as it is deformed around the object. The front end of the housing also includes a pair of focus beam generators and an indexing beam generator. By utilizing data collected with the present invention, patient anatomy such as anatomical features and residual limbs may be digitized to create accurate three-dimensional representations which may be utilized in combination with computer-aided-drafting programs.
    Type: Application
    Filed: May 11, 2012
    Publication date: March 7, 2013
    Inventors: David G. Firth, Brendan O. Beardsley, John P. Pella
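
Recovering depth from the deformed pattern typically comes down to triangulation between the projector and camera. The sketch below assumes a calibrated pinhole pair; the focal length and baseline values are invented for illustration and are not taken from the patent.

```python
# Minimal triangulation sketch: depth from the observed shift (disparity)
# of a projected pattern feature. Calibration values are assumed.

FOCAL_LENGTH_PX = 800.0   # camera focal length in pixels (assumed)
BASELINE_MM = 60.0        # projector-to-camera baseline in mm (assumed)

def depth_from_disparity(disparity_px):
    """Depth (mm) of a surface point from the pattern shift in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_LENGTH_PX * BASELINE_MM / disparity_px

# A pattern feature shifted by 40 px maps to a point 1200 mm away.
point_depth = depth_from_disparity(40)
```

Repeating this for every detected pattern feature, frame over frame, yields the point clouds that are merged into the three-dimensional representation.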
  • Patent number: 7187377
Abstract: A method and system that archive a three-dimensional site in a highly compact manner such that real-time, three-dimensional exploration and interaction with the site with high-resolution graphics is enabled. During authoring, information is collected about a site, and processed into a walkmap comprising a number of maps. A visibility map indicates which ones of the many polygons that make up a site are potentially visible from a given region. A collision map establishes where a user can navigate in the site. A ground map tracks the terrain for reproducing camera heights and viewing angles, and a trigger map causes scripts to be fired from locations in the site. During navigation, only the maps relevant to a user's current position are active, whereby rapid rendering of an appropriate image for the user's current perspective is possible in real-time, providing a first person, perspective tour of the site in a perceived three-dimensional environment.
    Type: Grant
    Filed: June 28, 2002
    Date of Patent: March 6, 2007
    Assignee: Microsoft Corporation
    Inventors: John P. Pella, Yaacov Kory Kuriel, Charles A. Hale, Scott A. Jensen
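
The four maps described above can be modeled as per-region records keyed by position. This is a minimal sketch; the field names, grid layout, and values are assumptions for illustration, not the patented format.

```python
from dataclasses import dataclass, field

@dataclass
class Region:
    visible_polygons: set          # visibility map: polygons seen from here
    walkable: bool                 # collision map: can the user stand here?
    ground_height: float           # ground map: camera height at this cell
    triggers: list = field(default_factory=list)  # trigger map: scripts

# A tiny walkmap over a 2x2 grid of regions (values invented):
walkmap = {
    (0, 0): Region({1, 2, 3}, True, 1.7, ["play_ambient"]),
    (0, 1): Region({2, 3}, True, 1.7),
    (1, 0): Region(set(), False, 0.0),  # inside a wall: not walkable
}

def enter(cell):
    """Activate only the maps relevant to the user's current position."""
    region = walkmap[cell]
    if not region.walkable:
        return None  # collision map blocks movement into this cell
    for script in region.triggers:
        pass  # fire the script here
    return region.visible_polygons  # render only these polygons
```

Because each lookup touches only the current cell's record, the renderer never has to consider the full polygon set, which is what makes the real-time first-person tour feasible.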
  • Patent number: 6234802
    Abstract: A method and system for teaching a language and evaluating language comprehension in a digitally synthesized, interactive three-dimensional graphical representation of an environment. Within the environment, the user is given opportunities to practice language skills by interacting with digital videos of people. As the user walks through the environment and encounters various people therein, the people set forth challenges in the form of scripted questions and tasks that require the user to comprehend the language in order to correctly respond. A speech recognition engine interprets verbal responses of the user, thereby further simulating a real-world environment. A set of one or more scripts controls various aspects of the environment while testing and evaluating the user's comprehension of the language.
    Type: Grant
    Filed: January 26, 1999
    Date of Patent: May 22, 2001
    Assignee: Microsoft Corporation
    Inventors: John P. Pella, Quentin DeWolf, Peter C. Acker, Charles A. Hale, Renée Louise April, Jason T. Cortese, Victor J. Bondi
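
The scripted challenge-and-evaluation loop described above can be sketched as matching the recognizer's transcript against phrases a script expects. This is a hypothetical illustration; the challenge content and matching rule are invented, and a real system would use a full speech recognition engine rather than a text transcript.

```python
# Each scripted challenge pairs a prompt with phrases that would indicate
# comprehension. Prompts and expected phrases below are illustrative.
CHALLENGES = [
    {"prompt": "Como te llamas?", "expected": {"me llamo", "mi nombre es"}},
    {"prompt": "Que hora es?", "expected": {"son las", "es la una"}},
]

def evaluate(transcript, challenge):
    """Score 1 if any expected phrase appears in the recognized speech."""
    text = transcript.lower()
    return int(any(phrase in text for phrase in challenge["expected"]))
```

Summing these scores over the scripted encounters gives one simple way a script set could evaluate the user's overall comprehension.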