Patents by Inventor Michael Gassner
Michael Gassner has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11958912
Abstract: Herein is reported a method for determining the binding interaction of an antibody of the human IgG1 subclass, which has at least two binding sites specifically binding the antigen, with a multimeric antigen, comprising the steps of 1) determining the binding affinity of the antibody for the multimeric antigen, and 2) incubating a mixture comprising the antibody and a polypeptide derived from lysine-gingipain of Porphyromonas gingivalis under conditions and for a time sufficient to cleave the antibody into Fabs and Fc-region, and determining the binding affinity of the Fabs of the antibody for the multimeric antigen, whereby the binding of the antibody to the multimeric antigen is determined to be affinity-driven if the binding affinities determined in both steps are comparable and avidity-driven if they differ.
Type: Grant
Filed: October 2, 2020
Date of Patent: April 16, 2024
Assignee: HOFFMANN-LA ROCHE INC.
Inventors: Joerg Moelleken, Michael Molhoj, Christian Gassner, Manuel Endesfelder, Joerg-Thomas Regula
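The decision rule behind this method can be sketched in code. This is only an illustration: the affinity values, the units, and the 3-fold comparability threshold are assumptions for the sketch, not taken from the patent.

```python
def classify_binding(kd_intact_nM: float, kd_fab_nM: float,
                     fold_tolerance: float = 3.0) -> str:
    """Compare the affinity of the intact antibody for the multimeric
    antigen with that of its protease-released Fabs. Comparable
    affinities indicate affinity-driven binding; a large gap between
    intact antibody and monovalent Fab indicates avidity-driven binding.
    """
    ratio = max(kd_intact_nM, kd_fab_nM) / min(kd_intact_nM, kd_fab_nM)
    return "affinity-driven" if ratio <= fold_tolerance else "avidity-driven"
```

In practice the two KD values would come from a binding assay (e.g. surface plasmon resonance) run before and after cleavage; the threshold for "comparable" is an experimental judgment call.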
-
Patent number: 11886637
Abstract: A method and system for rendering an augmented reality scene on a mobile computing device tracks a real-world scene and/or the viewpoint of a user with one or more event-based vision sensors and blends an augmented reality scene displayed on the mobile computing device based on the viewpoint of the user, a scene map of the real-world scene, and the tracking by the one or more event-based vision sensors. Event-based vision sensors offer many advantages, mainly by intrinsically compressing the data stream and thus reducing the amount of data that a processing unit needs to process. Furthermore, the event-based vision sensor pixels continuously sense visual changes in the scene and report them with very low latency. This makes the event-based vision sensor an ideal sensor for always-on tasks such as visual tracking and smart sensor control or data enhancement of secondary sensing modalities.
Type: Grant
Filed: December 13, 2022
Date of Patent: January 30, 2024
Assignee: Sony Advanced Visual Sensing AG
Inventors: Christian Brandli, Raphael Berner, Michael Gassner
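The data-compression property described in this abstract can be illustrated with a toy frame-difference model. The frames and threshold below are hypothetical, and a real event-based sensor fires asynchronously per pixel rather than comparing whole frames; the sketch only shows why unchanged pixels contribute no data.

```python
def frame_to_events(prev, curr, threshold=10):
    """Emit (x, y, polarity) events only where brightness changed by at
    least `threshold`, mimicking how an event-based vision sensor
    intrinsically compresses a mostly-static scene into a sparse stream.
    """
    events = []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(c - p) >= threshold:
                # polarity: +1 for brightness increase, -1 for decrease
                events.append((x, y, 1 if c > p else -1))
    return events
```

For a scene where only one pixel changes, the "stream" is a single event instead of a full frame, which is the compression advantage the abstract refers to.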
-
Publication number: 20230185372
Abstract: A method and system for rendering an augmented reality scene on a mobile computing device tracks a real-world scene and/or the viewpoint of a user with one or more event-based vision sensors and blends an augmented reality scene displayed on the mobile computing device based on the viewpoint of the user, a scene map of the real-world scene, and the tracking by the one or more event-based vision sensors. Event-based vision sensors offer many advantages, mainly by intrinsically compressing the data stream and thus reducing the amount of data that a processing unit needs to process. Furthermore, the event-based vision sensor pixels continuously sense visual changes in the scene and report them with very low latency. This makes the event-based vision sensor an ideal sensor for always-on tasks such as visual tracking and smart sensor control or data enhancement of secondary sensing modalities.
Type: Application
Filed: December 13, 2022
Publication date: June 15, 2023
Inventors: Christian Brandli, Raphael Berner, Michael Gassner
-
Patent number: 11526209
Abstract: A method and system for rendering an augmented reality scene on a mobile computing device tracks a real-world scene and/or the viewpoint of a user with one or more event-based vision sensors and blends an augmented reality scene displayed on the mobile computing device based on the viewpoint of the user, a scene map of the real-world scene, and the tracking by the one or more event-based vision sensors. Event-based vision sensors offer many advantages, mainly by intrinsically compressing the data stream and thus reducing the amount of data that a processing unit needs to process. Furthermore, the event-based vision sensor pixels continuously sense visual changes in the scene and report them with very low latency. This makes the event-based vision sensor an ideal sensor for always-on tasks such as visual tracking and smart sensor control or data enhancement of secondary sensing modalities.
Type: Grant
Filed: January 21, 2020
Date of Patent: December 13, 2022
Assignee: Sony Advanced Visual Sensing AG
Inventors: Christian Brändli, Raphael Berner, Michael Gassner
-
Publication number: 20220066552
Abstract: A method and system for rendering an augmented reality scene on a mobile computing device tracks a real-world scene and/or the viewpoint of a user with one or more event-based vision sensors and blends an augmented reality scene displayed on the mobile computing device based on the viewpoint of the user, a scene map of the real-world scene, and the tracking by the one or more event-based vision sensors. Event-based vision sensors offer many advantages, mainly by intrinsically compressing the data stream and thus reducing the amount of data that a processing unit needs to process. Furthermore, the event-based vision sensor pixels continuously sense visual changes in the scene and report them with very low latency. This makes the event-based vision sensor an ideal sensor for always-on tasks such as visual tracking and smart sensor control or data enhancement of secondary sensing modalities.
Type: Application
Filed: January 21, 2020
Publication date: March 3, 2022
Inventors: Christian Brändli, Raphael Berner, Michael Gassner
-
Publication number: 20070035712
Abstract: A method of using an in-situ aerial image sensor array is disclosed to separate and remove the focal plane variations caused by the image sensor array non-flatness and/or by the exposure tool by collecting sensor image data at various nominal focal planes and by determining best focus at each sampling location by analysis of the through-focus data. In various embodiments, the method provides accurate image data at best focus anywhere in the exposure field, image data covering an exposure-dose based process window area, and a map of effective focal plane distortions. The focus map can be separated into contributions from the exposure tool and contributions due to topography of the image sensor array by suitable calibration or self-calibration procedures. The basic method enables a wide range of applications, including for example qualification testing, process monitoring, and process control by deriving optimum process corrections from analysis of the image sensor data.
Type: Application
Filed: August 2, 2006
Publication date: February 15, 2007
Applicant: BRION TECHNOLOGIES, INC.
Inventors: Michael Gassner, Stefan Hunsche, Yu Cao, Jun Ye, Moshe Preil
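The per-location best-focus determination from through-focus data can be sketched as a parabola fit: sample an image-quality metric at several nominal focal planes and take the vertex of the fitted quadratic as best focus. The fitting model and the contrast metric are assumptions for illustration; the publication does not prescribe a specific analysis.

```python
import numpy as np

def best_focus(z_positions, contrast):
    """Fit contrast = a*z**2 + b*z + c to through-focus samples at one
    sensor location and return the vertex z = -b / (2a), i.e. the
    estimated best-focus position for that location."""
    a, b, _c = np.polyfit(z_positions, contrast, 2)
    return -b / (2.0 * a)
```

Repeating this fit at every sampling location in the exposure field yields the focus map that the method then decomposes into exposure-tool and sensor-topography contributions.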
-
Publication number: 20060273266
Abstract: One embodiment of a method for detecting, sampling, analyzing, and correcting hot spots in an integrated circuit design allows the identification of the weakest patterns within each design layer, the accurate determination of the impact of process drifts upon the patterning performance of the real mask in a real scanner, and the optimum process correction, process monitoring, and RET improvements to optimize integrated circuit device performance and yield. The combination of high-speed simulation coupled with massive data collection capability on actual aerial images and/or resist images at the specific patterns of interest provides a complete methodology for optimum RET implementation and process monitoring.
Type: Application
Filed: May 19, 2006
Publication date: December 7, 2006
Applicant: BRION TECHNOLOGIES, INC.
Inventors: Moshe Preil, Jun Ye, James Wiley, Shauh-Teh Juang, Michael Gassner
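One way to picture "identification of the weakest patterns" is a ranking of simulated critical-dimension (CD) drift across process conditions. Everything in this sketch — the pattern names, the target CD, and the tolerance — is hypothetical; it only illustrates the hot-spot idea, not the patent's actual methodology.

```python
def find_hot_spots(pattern_cd, target_cd=45.0, tol_pct=10.0):
    """Given simulated CDs (nm) for each pattern across several process
    conditions, flag patterns whose worst-case drift from target exceeds
    tol_pct, ranked worst-first — a crude 'weakest pattern' list."""
    hot = []
    for name, cds in pattern_cd.items():
        worst = max(abs(cd - target_cd) for cd in cds)
        if worst / target_cd * 100.0 > tol_pct:
            hot.append((name, worst))
    return sorted(hot, key=lambda item: -item[1])
```

The flagged patterns would then be the candidates for targeted measurement on actual aerial or resist images and for RET correction.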
-
Publication number: 20050288893
Abstract: There are many inventions described and illustrated herein. In one aspect, the present invention is directed to a technique of, and system for, autonomously monitoring fabrication equipment, for example, integrated circuit fabrication equipment. In one embodiment of this aspect of the invention, the present invention is an autonomous monitoring device including one or more event sensors (for example, acceleration, motion, velocity and/or inertial sensing device(s)) to detect a predetermined event of or by the fabrication equipment (for example, an event that is indicative of the onset, commencement, initiation and/or launch of a fabrication process or sub-processes of or by the fabrication equipment). In response thereto, one or more process parameter sensors sample, sense, detect, characterize, analyze and/or inspect one or more parameters of the process in real time (i.e., during the fabrication process).
Type: Application
Filed: June 1, 2004
Publication date: December 29, 2005
Inventor: Michael Gassner
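The event-triggered sampling loop described here can be sketched as follows. The acceleration threshold and the sampling function are hypothetical placeholders; the point is only the control flow: an event sensor arms the monitor, after which process-parameter sampling runs in real time.

```python
def monitor(event_stream, sample_fn, accel_threshold=0.5):
    """Autonomous monitor: once an acceleration reading suggests the
    fabrication tool has started a process step, begin sampling process
    parameters at every subsequent time step."""
    samples = []
    triggered = False
    for t, accel in event_stream:
        if not triggered and abs(accel) >= accel_threshold:
            triggered = True  # predetermined event detected: arm sampling
        if triggered:
            samples.append((t, sample_fn(t)))
    return samples
```

Because the parameter sensors stay idle until the event sensor fires, the device can run autonomously without being wired into the tool's own control system.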