Patents by Inventor Ross N. Luengen
Ross N. Luengen has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20130328775
Abstract: User interface elements positioned for display is described. In various embodiment(s), sensor input can be received from one or more sensors that are integrated with a portable device. A device hold position that corresponds to where the portable device is grasped by a user can be determined based at least in part on the sensor input. A display of user interface element(s) can then be initiated for display on an integrated display of the portable device based on the device hold position that corresponds to where the portable device is grasped.
Type: Application
Filed: August 12, 2013
Publication date: December 12, 2013
Applicant: Microsoft Corporation
Inventors: Gavin M. Gear, Ross N. Luengen, Michael C. Miller
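A minimal sketch of the technique this abstract describes: infer where the device is grasped from edge-sensor input, then position UI elements away from the occluded region. The sensor readings, threshold, and anchor names are illustrative assumptions, not taken from the patent.

```python
# Hypothetical edge-pressure readings (0.0-1.0) stand in for the patent's
# integrated sensors; the threshold is an invented value.

def infer_hold_position(left_pressure: float, right_pressure: float,
                        threshold: float = 0.5) -> str:
    """Classify the grip from edge-pressure readings."""
    left = left_pressure >= threshold
    right = right_pressure >= threshold
    if left and right:
        return "both"
    if left:
        return "left"
    if right:
        return "right"
    return "none"

def ui_anchor_for(hold_position: str) -> str:
    """Anchor interface elements on the side opposite the grip."""
    return {"left": "right", "right": "left",
            "both": "center", "none": "default"}[hold_position]

print(ui_anchor_for(infer_hold_position(0.9, 0.1)))  # grip on the left edge -> "right"
```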
-
Patent number: 8514242
Abstract: Enhanced user interface elements in ambient light is described. In embodiment(s), a sensor input can be received from light sensor(s) that detect ambient light proximate an integrated display of a portable device. A determination can be made that the ambient light detracts from the visibility of user interface elements displayed in a user interface on the integrated display, and graphic components of a user interface element can be modified to enhance the visibility of the user interface element for display in the ambient light.
Type: Grant
Filed: October 24, 2008
Date of Patent: August 20, 2013
Assignee: Microsoft Corporation
Inventors: Ross N. Luengen, Michael H. LaManna, Gavin M. Gear
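A hedged sketch of the visibility-enhancement idea: when an ambient-light reading exceeds what the display comfortably overcomes, raise the contrast of a UI element's graphic components. The lux threshold and contrast scaling below are invented for illustration.

```python
# Rough outdoor-daylight level in lux; an assumption, not a patent value.
BRIGHT_AMBIENT_LUX = 10_000

def enhanced_contrast(ambient_lux: float, base_contrast: float) -> float:
    """Scale an element's contrast up with ambient light, capped at 1.0."""
    if ambient_lux <= BRIGHT_AMBIENT_LUX:
        return base_contrast  # visibility is fine; leave the element alone
    boost = ambient_lux / BRIGHT_AMBIENT_LUX
    return min(1.0, base_contrast * boost)

print(enhanced_contrast(5_000, 0.6))   # indoors: unchanged -> 0.6
print(enhanced_contrast(30_000, 0.6))  # direct sun: boosted and capped -> 1.0
```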
-
Patent number: 8508475
Abstract: User interface elements positioned for display is described. In various embodiment(s), sensor input can be received from one or more sensors that are integrated with a portable device. A device hold position that corresponds to where the portable device is grasped by a user can be determined based at least in part on the sensor input. A display of user interface element(s) can then be initiated for display on an integrated display of the portable device based on the device hold position that corresponds to where the portable device is grasped.
Type: Grant
Filed: October 24, 2008
Date of Patent: August 13, 2013
Assignee: Microsoft Corporation
Inventors: Gavin M. Gear, Ross N. Luengen, Michael C. Miller
-
Publication number: 20130176316
Abstract: Panning animation techniques are described. In one or more implementations, an input is recognized by a computing device as corresponding to a panning animation. A distance is calculated that is to be traveled by the panning animation in a user interface output by the computing device, the distance limited by a predefined maximum distance. The panning animation is output by the computing device to travel the calculated distance.
Type: Application
Filed: January 6, 2012
Publication date: July 11, 2013
Applicant: MICROSOFT CORPORATION
Inventors: Megan A. Bates, Song Zou, Shaojie Zhang, Ross N. Luengen
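A sketch of the clamped panning distance described above: compute how far a decelerating pan would travel from the input's velocity, then limit it to a predefined maximum. The constant-deceleration model and the numbers are assumptions for illustration.

```python
def panning_distance(velocity: float, deceleration: float,
                     max_distance: float) -> float:
    """Distance d = v**2 / (2*a) for constant deceleration, clamped to a maximum."""
    raw = velocity ** 2 / (2 * deceleration)
    return min(raw, max_distance)

# A fast flick would travel 2000 px, but the predefined maximum caps it:
print(panning_distance(velocity=2000.0, deceleration=1000.0, max_distance=1500.0))  # -> 1500.0
```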
-
Publication number: 20130169649
Abstract: Movement endpoint exposure techniques are described. In one or more implementations, an input is received by a computing device to cause output of an animation involving movement in a user interface. Responsive to the receipt of the input, an endpoint is exposed to software of the computing device that is associated with the user interface, such as applications and controls. The endpoint references a particular location in the user interface at which the animation is calculated to end for the input.
Type: Application
Filed: January 4, 2012
Publication date: July 4, 2013
Applicant: MICROSOFT CORPORATION
Inventors: Megan A. Bates, Song Zou, Shaojie Zhang, Ross N. Luengen
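One way the endpoint exposure above could look in code: when an input starts a movement animation, the calculated end location is handed to registered callbacks standing in for the "applications and controls" the abstract mentions. The callback mechanism and linear movement are illustrative assumptions.

```python
class MovementAnimator:
    """Sketch: computes an animation endpoint and exposes it to listeners."""

    def __init__(self):
        self._endpoint_listeners = []

    def on_endpoint(self, callback):
        """Register software that wants the animation's calculated endpoint."""
        self._endpoint_listeners.append(callback)

    def animate(self, start: float, delta: float) -> float:
        endpoint = start + delta  # where the animation is calculated to end
        for listener in self._endpoint_listeners:
            listener(endpoint)    # expose the endpoint to applications/controls
        return endpoint

animator = MovementAnimator()
animator.on_endpoint(lambda p: print(f"content will settle at {p}"))
animator.animate(start=0.0, delta=480.0)  # prints "content will settle at 480.0"
```

A control subscribed this way could, for example, pre-render the content that will be visible at the endpoint before the animation finishes.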
-
Publication number: 20130169648
Abstract: Cumulative movement animation techniques are described. In one or more implementations, output of a first animation is initiated that involves a display of movement in a user interface of a computing device. An input is received by the computing device during the output of the first animation, the input configured to cause a second display of movement in the user interface. Responsive to the receipt of the input, a remaining portion of the movement of the first animation is output along with the movement of the second animation by the computing device.
Type: Application
Filed: January 4, 2012
Publication date: July 4, 2013
Applicant: MICROSOFT CORPORATION
Inventors: Megan A. Bates, Song Zou, Shaojie Zhang, Ross N. Luengen
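The cumulative behavior above can be sketched as simple arithmetic: if a second input arrives while the first animation is still running, the un-traveled remainder of the first movement is folded into the second. The linear-progress model is an assumption for illustration.

```python
def cumulative_target(first_distance: float, first_progress: float,
                      second_distance: float) -> float:
    """Remaining portion of the first movement plus the new movement."""
    remaining = first_distance * (1.0 - first_progress)
    return remaining + second_distance

# A 300 px pan is 40% complete when a second 300 px pan arrives:
print(cumulative_target(300.0, 0.4, 300.0))  # 180 remaining + 300 new -> 480.0
```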
-
Publication number: 20130067420
Abstract: Semantic zoom techniques are described. In one or more implementations, techniques are described that may be utilized by a user to navigate to content of interest. These techniques may also include a variety of different features, such as to support semantic swaps and zooming “in” and “out.” These techniques may also include a variety of different input features, such as to support gestures, cursor-control device, and keyboard inputs. A variety of other features are also supported as further described in the detailed description and figures.
Type: Application
Filed: September 9, 2011
Publication date: March 14, 2013
Inventors: Theresa B. Pittappilly, Rebecca Deutsch, Orry W. Soegiono, Nicholas R. Waggoner, Holger Kuehnle, William D. Carr, Ross N. Luengen, Paul J. Kwiatkowski, Jan-Kristian Markiewicz, Gerrit H. Hofmeester, Robert Disano, Justin S. Myhres
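A rough sketch of the "semantic swap" idea in the abstract: below a zoom threshold the view swaps from individual items to a coarser representation of the same content, rather than merely scaling pixels. The threshold and the first-letter grouping rule are assumptions chosen for illustration.

```python
def semantic_view(items: list[str], zoom: float) -> list[str]:
    """Zoomed in: full items. Zoomed out: swap to first-letter group headers."""
    if zoom >= 1.0:
        return items
    return sorted({name[0].upper() for name in items})

contacts = ["alice", "adam", "bob", "carol"]
print(semantic_view(contacts, zoom=1.0))  # ['alice', 'adam', 'bob', 'carol']
print(semantic_view(contacts, zoom=0.5))  # ['A', 'B', 'C']
```

Selecting a group header in the zoomed-out view would then navigate back into the detailed view at that group, which is how semantic zoom helps a user "navigate to content of interest."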
-
Publication number: 20130067391
Abstract: Semantic zoom techniques are described. In one or more implementations, techniques are described that may be utilized by a user to navigate to content of interest. These techniques may also include a variety of different features, such as to support semantic swaps and zooming “in” and “out.” These techniques may also include a variety of different input features, such as to support gestures, cursor-control device, and keyboard inputs. A variety of other features are also supported as further described in the detailed description and figures.
Type: Application
Filed: September 9, 2011
Publication date: March 14, 2013
Inventors: Theresa B. Pittappilly, Rebecca Deutsch, Orry W. Soegiono, Nicholas R. Waggoner, Holger Kuehnle, William D. Carr, Ross N. Luengen, Paul J. Kwiatkowski, Jan-Kristian Markiewicz, Gerrit H. Hofmeester, Robert Disano
-
Publication number: 20130067398
Abstract: Semantic zoom techniques are described. In one or more implementations, techniques are described that may be utilized by a user to navigate to content of interest. These techniques may also include a variety of different features, such as to support semantic swaps and zooming “in” and “out.” These techniques may also include a variety of different input features, such as to support gestures, cursor-control device, and keyboard inputs. A variety of other features are also supported as further described in the detailed description and figures.
Type: Application
Filed: September 9, 2011
Publication date: March 14, 2013
Inventors: Theresa B. Pittappilly, Rebecca Deutsch, Orry W. Soegiono, Nicholas R. Waggoner, Holger Kuehnle, Moneta Ho Kushner, William D. Carr, Ross N. Luengen, Paul J. Kwiatkowski, Adam George Barlow, Scott D. Hoogerwerf, Aaron W. Cardwell, Benjamin J. Karas, Michael J. Gilmore, Rolf A. Ebeling, Jan-Kristian Markiewicz, Gerrit H. Hofmeester, Robert Disano
-
Publication number: 20120311488
Abstract: This document describes techniques and apparatuses for asynchronous handling of a user interface manipulation. These techniques handle a user interface manipulation with two or more asynchronous processes. One asynchronous process, for example, may determine a position responsive to the user interface manipulation while another asynchronous process determines the pixels to render. By so doing, these techniques enable a quick and/or consistent response to a user interface manipulation.
Type: Application
Filed: June 1, 2011
Publication date: December 6, 2012
Applicant: MICROSOFT CORPORATION
Inventors: Laurent Mouton, Nicolas J. Brun, Ross N. Luengen, Song Zou, Nicholas R. Waggoner
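The split the abstract describes can be sketched with two decoupled processes: a position worker consumes raw input deltas and publishes the latest position, while a separate rendering process (omitted here) would sample that position when it draws. The thread-and-queue layout is an illustrative assumption, not the patent's architecture.

```python
import queue
import threading

deltas: queue.Queue = queue.Queue()  # raw manipulation input (e.g. drag deltas)
position = {"y": 0.0}                # latest position, published for the renderer

def position_process():
    """Asynchronously turn manipulation deltas into an up-to-date position."""
    while True:
        delta = deltas.get()
        if delta is None:  # sentinel: the manipulation has ended
            break
        position["y"] += delta

worker = threading.Thread(target=position_process)
worker.start()
for d in (10.0, 25.0, 5.0):  # deltas arriving from a drag gesture
    deltas.put(d)
deltas.put(None)
worker.join()
print(position["y"])  # the rendering process would sample this -> 40.0
```

Because the position worker never waits on rendering, input keeps being integrated even when a frame is slow, which is the "quick and/or consistent response" the abstract points at.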
-
Publication number: 20120174005
Abstract: This document describes content-based snap points and techniques that use these snap points. In some embodiments, multiple content-based snap points are used to stop at points in content that are convenient, prevent overshooting of important parts in the content, and/or aid users in manipulating and consuming the content.
Type: Application
Filed: December 31, 2010
Publication date: July 5, 2012
Applicant: MICROSOFT CORPORATION
Inventors: Rebecca Deutsch, Bonny P. Lau, Holger Kuehnle, Nicholas R. Waggoner, Ross N. Luengen, Michael A. Nelte
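A minimal sketch of the snap-point idea: predict where a pan or scroll would naturally stop, then adjust the stop to the nearest content-based snap point so the view settles somewhere convenient. The offsets below are invented example values.

```python
def snapped_stop(predicted_stop: float, snap_points: list[float]) -> float:
    """Return the snap point closest to the animation's natural stopping point."""
    return min(snap_points, key=lambda p: abs(p - predicted_stop))

# Hypothetical section offsets in an article acting as snap points:
section_starts = [0.0, 320.0, 640.0, 960.0]
print(snapped_stop(585.0, section_starts))  # settles on 640.0 instead of 585.0
```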
-
Publication number: 20100103186
Abstract: Enhanced user interface elements in ambient light is described. In embodiment(s), a sensor input can be received from light sensor(s) that detect ambient light proximate an integrated display of a portable device. A determination can be made that the ambient light detracts from the visibility of user interface elements displayed in a user interface on the integrated display, and graphic components of a user interface element can be modified to enhance the visibility of the user interface element for display in the ambient light.
Type: Application
Filed: October 24, 2008
Publication date: April 29, 2010
Applicant: MICROSOFT CORPORATION
Inventors: Ross N. Luengen, Michael H. LaManna, Gavin M. Gear
-
Publication number: 20100103098
Abstract: User interface elements positioned for display is described. In various embodiment(s), sensor input can be received from one or more sensors that are integrated with a portable device. A device hold position that corresponds to where the portable device is grasped by a user can be determined based at least in part on the sensor input. A display of user interface element(s) can then be initiated for display on an integrated display of the portable device based on the device hold position that corresponds to where the portable device is grasped.
Type: Application
Filed: October 24, 2008
Publication date: April 29, 2010
Inventors: Gavin M. Gear, Ross N. Luengen, Michael C. Miller
-
Patent number: 7389434
Abstract: A method of controlling power management is provided. In an embodiment, the user provides feedback that the inactivity period before a display blanks is too short. In response to the user feedback, a behavior tracking mode is entered and the inactivity period is adjusted to a period that is more suitable to the user's needs. In an embodiment, the adjustment may be done through incrementing a counter and changing the inactivity period based on the value of the counter.
Type: Grant
Filed: August 30, 2005
Date of Patent: June 17, 2008
Assignee: Microsoft Corporation
Inventors: David Switzer, Geralyn M. Miller, Issa Y. Khoury, Matthew H. Holle, Michael S. Bernstein, Ross N. Luengen
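The counter-based adjustment in the abstract can be sketched as follows: each time the user signals that the display blanked too soon, a counter increments and the inactivity period is recomputed from it. The linear step schedule and cap below are assumptions for illustration.

```python
def adjust_inactivity_period(counter: int, base_seconds: int = 60,
                             step_seconds: int = 60,
                             max_seconds: int = 600) -> tuple[int, int]:
    """Increment the feedback counter and derive a longer timeout from it."""
    counter += 1
    timeout = min(base_seconds + counter * step_seconds, max_seconds)
    return counter, timeout

counter, timeout = 0, 60
for _ in range(3):  # the user reports "too short" three times
    counter, timeout = adjust_inactivity_period(counter)
print(counter, timeout)  # -> 3 240
```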