Patents by Inventor Ross N. Luengen

Ross N. Luengen has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20130328775
    Abstract: Positioning of user interface elements for display is described. In various embodiment(s), sensor input can be received from one or more sensors that are integrated with a portable device. A device hold position that corresponds to where the portable device is grasped by a user can be determined based at least in part on the sensor input. Display of user interface element(s) can then be initiated on an integrated display of the portable device based on the device hold position that corresponds to where the portable device is grasped.
    Type: Application
    Filed: August 12, 2013
    Publication date: December 12, 2013
    Applicant: Microsoft Corporation
    Inventors: Gavin M. Gear, Ross N. Luengen, Michael C. Miller
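    A minimal TypeScript sketch of the hold-position idea in this abstract. The edge-contact sensor model and every name below (GripSample, inferHoldPosition, toolbarAnchor) are illustrative assumptions, not details from the patent:

      type Edge = "left" | "right" | "top" | "bottom";

      interface GripSample {
        edge: Edge; // which bezel edge reports hand contact
      }

      type HoldPosition = "left-hand" | "right-hand" | "two-hand" | "unknown";

      // Infer how the device is held from raw grip-sensor samples.
      function inferHoldPosition(samples: GripSample[]): HoldPosition {
        const left = samples.some((s) => s.edge === "left");
        const right = samples.some((s) => s.edge === "right");
        if (left && right) return "two-hand";
        if (left) return "left-hand";
        if (right) return "right-hand";
        return "unknown";
      }

      // Anchor a toolbar near the gripping thumb so it stays reachable.
      function toolbarAnchor(hold: HoldPosition): "start" | "center" | "end" {
        switch (hold) {
          case "left-hand": return "start"; // controls toward the left edge
          case "right-hand": return "end";  // controls toward the right edge
          default: return "center";         // neutral layout otherwise
        }
      }

      console.log(toolbarAnchor(inferHoldPosition([{ edge: "right" }]))); // "end"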
  • Patent number: 8514242
    Abstract: Enhancing user interface elements in ambient light is described. In embodiment(s), a sensor input can be received from light sensor(s) that detect ambient light proximate an integrated display of a portable device. A determination can be made that the ambient light detracts from the visibility of user interface elements displayed in a user interface on the integrated display, and graphic components of a user interface element can be modified to enhance the visibility of the user interface element for display in the ambient light.
    Type: Grant
    Filed: October 24, 2008
    Date of Patent: August 20, 2013
    Assignee: Microsoft Corporation
    Inventors: Ross N. Luengen, Michael H. LaManna, Gavin M. Gear
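    A minimal TypeScript sketch of the ambient-light enhancement described above. The lux threshold and the specific style changes are assumptions for illustration; the patent does not prescribe them:

      const BRIGHT_AMBIENT_LUX = 10_000; // hypothetical cutoff, roughly daylight

      interface ElementStyle {
        foreground: string;   // text/icon color
        background: string;   // fill color
        outlineWidth: number; // px
      }

      // When ambient light would wash out the UI, boost contrast and edges.
      function enhanceForAmbientLight(style: ElementStyle, ambientLux: number): ElementStyle {
        if (ambientLux < BRIGHT_AMBIENT_LUX) return style; // visibility is fine as-is
        return {
          foreground: "#000000",                         // maximum-contrast colors
          background: "#FFFFFF",
          outlineWidth: Math.max(style.outlineWidth, 2), // thicken edges for legibility
        };
      }

      const subtle = { foreground: "#444444", background: "#EEEEEE", outlineWidth: 0 };
      console.log(enhanceForAmbientLight(subtle, 25_000)); // high-contrast variant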
  • Patent number: 8508475
    Abstract: Positioning of user interface elements for display is described. In various embodiment(s), sensor input can be received from one or more sensors that are integrated with a portable device. A device hold position that corresponds to where the portable device is grasped by a user can be determined based at least in part on the sensor input. Display of user interface element(s) can then be initiated on an integrated display of the portable device based on the device hold position that corresponds to where the portable device is grasped.
    Type: Grant
    Filed: October 24, 2008
    Date of Patent: August 13, 2013
    Assignee: Microsoft Corporation
    Inventors: Gavin M. Gear, Ross N. Luengen, Michael C. Miller
  • Publication number: 20130176316
    Abstract: Panning animation techniques are described. In one or more implementations, an input is recognized by a computing device as corresponding to a panning animation. A distance is calculated that is to be traveled by the panning animation in a user interface output by the computing device, the distance limited by a predefined maximum distance. The panning animation is output by the computing device to travel the calculated distance.
    Type: Application
    Filed: January 6, 2012
    Publication date: July 11, 2013
    Applicant: Microsoft Corporation
    Inventors: Megan A. Bates, Song Zou, Shaojie Zhang, Ross N. Luengen
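    A minimal TypeScript sketch of the distance clamp in this abstract, using standard constant-deceleration kinematics (d = v² / 2a); the maximum distance and deceleration values are assumptions:

      const MAX_PAN_DISTANCE = 2000; // px; stand-in for the predefined maximum

      // Distance a fling with initial velocity v (px/s) travels under constant
      // deceleration a (px/s^2) is v^2 / (2a); clamp it before animating.
      function panDistance(velocity: number, deceleration = 3000): number {
        const unclamped = (velocity * velocity) / (2 * deceleration);
        return Math.min(unclamped, MAX_PAN_DISTANCE) * Math.sign(velocity);
      }

      console.log(panDistance(5000));  // ~4167 px unclamped -> limited to 2000
      console.log(panDistance(-1200)); // -240 px; short flings are unaffected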
  • Publication number: 20130169649
    Abstract: Movement endpoint exposure techniques are described. In one or more implementations, an input is received by a computing device to cause output of an animation involving movement in a user interface. Responsive to the receipt of the input, an endpoint is exposed to software of the computing device that is associated with the user interface, such as applications and controls. The endpoint references a particular location in the user interface at which the animation is calculated to end for the input.
    Type: Application
    Filed: January 4, 2012
    Publication date: July 4, 2013
    Applicant: Microsoft Corporation
    Inventors: Megan A. Bates, Song Zou, Shaojie Zhang, Ross N. Luengen
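    A minimal TypeScript sketch of exposing a movement endpoint before the animation finishes. The class and listener names are hypothetical, and the projection reuses the v²/2a estimate from the panning sketch above:

      type EndpointListener = (endOffset: number) => void;

      class PanAnimator {
        private listeners: EndpointListener[] = [];

        onEndpoint(listener: EndpointListener): void {
          this.listeners.push(listener);
        }

        // Start an inertia pan and publish where it is calculated to stop,
        // so applications and controls can prepare content at that location.
        start(currentOffset: number, velocity: number, deceleration = 3000): void {
          const travel = ((velocity * velocity) / (2 * deceleration)) * Math.sign(velocity);
          const endpoint = currentOffset + travel;
          this.listeners.forEach((l) => l(endpoint)); // exposed up front, not at the end
          // ...animation frames would then move toward `endpoint`...
        }
      }

      const animator = new PanAnimator();
      animator.onEndpoint((end) => console.log(`will stop at ${end}px; prefetch there`));
      animator.start(100, 2400); // logs "will stop at 1060px" immediately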
  • Publication number: 20130169648
    Abstract: Cumulative movement animation techniques are described. In one or more implementations, output of a first animation is initiated that involves a display of movement in a user interface of a computing device. An input is received by the computing device during the output of the first animation, the input configured to cause a second display of movement in the user interface. Responsive to the receipt of the input, a remaining portion of the movement of the first animation is output along with the movement of the second animation by the computing device.
    Type: Application
    Filed: January 4, 2012
    Publication date: July 4, 2013
    Applicant: Microsoft Corporation
    Inventors: Megan A. Bates, Song Zou, Shaojie Zhang, Ross N. Luengen
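    A minimal TypeScript sketch of combining the remaining travel of one animation with a new input's travel, per the abstract; the numbers are illustrative:

      interface Animation {
        target: number; // offset where this animation would stop on its own
      }

      // When a second fling arrives mid-animation, carry the first animation's
      // remaining movement into the new target so travel accumulates.
      function combine(current: Animation, progressOffset: number, newTravel: number): Animation {
        const remaining = current.target - progressOffset; // movement not yet shown
        return { target: progressOffset + remaining + newTravel };
      }

      // First fling aims at 800 px; at 300 px a second fling adds 500 px.
      console.log(combine({ target: 800 }, 300, 500)); // { target: 1300 }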
  • Publication number: 20130067420
    Abstract: Semantic zoom techniques are described. In one or more implementations, techniques are described that may be utilized by a user to navigate to content of interest. These techniques may also include a variety of different features, such as to support semantic swaps and zooming “in” and “out.” These techniques may also include a variety of different input features, such as to support gestures, cursor-control device, and keyboard inputs. A variety of other features are also supported as further described in the detailed description and figures.
    Type: Application
    Filed: September 9, 2011
    Publication date: March 14, 2013
    Inventors: Theresa B. Pittappilly, Rebecca Deutsch, Orry W. Soegiono, Nicholas R. Waggoner, Holger Kuehnle, William D. Carr, Ross N. Luengen, Paul J. Kwiatkowski, Jan-Kristian Markiewicz, Gerrit H. Hofmeester, Robert Disano, Justin S. Myhres
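    A minimal TypeScript sketch of a semantic swap as characterized in this abstract: past a zoom threshold the view switches representation (e.g., from items to group headers) rather than merely scaling pixels. The threshold and view names are assumptions:

      type View = "items" | "groups";

      const SWAP_THRESHOLD = 0.6; // zoom factor at which representations swap

      // Zooming "out" past the threshold swaps to the grouped overview;
      // zooming "in" swaps back to the detailed item list.
      function viewForZoom(zoomFactor: number): View {
        return zoomFactor < SWAP_THRESHOLD ? "groups" : "items";
      }

      console.log(viewForZoom(1.0)); // "items"
      console.log(viewForZoom(0.4)); // "groups" -- the semantic swap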
  • Publication number: 20130067391
    Abstract: Semantic zoom techniques are described. In one or more implementations, techniques are described that may be utilized by a user to navigate to content of interest. These techniques may also include a variety of different features, such as to support semantic swaps and zooming “in” and “out.” These techniques may also include a variety of different input features, such as to support gestures, cursor-control device, and keyboard inputs. A variety of other features are also supported as further described in the detailed description and figures.
    Type: Application
    Filed: September 9, 2011
    Publication date: March 14, 2013
    Inventors: Theresa B. Pittappilly, Rebecca Deutsch, Orry W. Soegiono, Nicholas R. Waggoner, Holger Kuehnle, William D. Carr, Ross N. Luengen, Paul J. Kwiatkowski, Jan-Kristian Markiewicz, Gerrit H. Hofmeester, Robert Disano
  • Publication number: 20130067398
    Abstract: Semantic zoom techniques are described. In one or more implementations, techniques are described that may be utilized by a user to navigate to content of interest. These techniques may also include a variety of different features, such as to support semantic swaps and zooming “in” and “out.” These techniques may also include a variety of different input features, such as to support gestures, cursor-control device, and keyboard inputs. A variety of other features are also supported as further described in the detailed description and figures.
    Type: Application
    Filed: September 9, 2011
    Publication date: March 14, 2013
    Inventors: Theresa B. Pittappilly, Rebecca Deutsch, Orry W. Soegiono, Nicholas R. Waggoner, Holger Kuehnle, Moneta Ho Kushner, William D. Carr, Ross N. Luengen, Paul J. Kwiatkowski, Adam George Barlow, Scott D. Hoogerwerf, Aaron W. Cardwell, Benjamin J. Karas, Michael J. Gilmore, Rolf A. Ebeling, Jan-Kristian Markiewicz, Gerrit H. Hofmeester, Robert Disano
  • Publication number: 20120311488
    Abstract: This document describes techniques and apparatuses for asynchronous handling of a user interface manipulation. These techniques handle a user interface manipulation with two or more asynchronous processes. One asynchronous process, for example, may determine a position responsive to the user interface manipulation while another asynchronous process determines the pixels to render. By so doing, these techniques enable a quick and/or consistent response to a user interface manipulation.
    Type: Application
    Filed: June 1, 2011
    Publication date: December 6, 2012
    Applicant: Microsoft Corporation
    Inventors: Laurent Mouton, Nicolas J. Brun, Ross N. Luengen, Song Zou, Nicholas R. Waggoner
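    A minimal TypeScript sketch of splitting a manipulation across two asynchronous stages, one tracking position and one rendering; the staging and names are assumptions, not the patented pipeline:

      let latestOffset = 0; // position written by one stage, read by the other

      // Simulated stream of input deltas from a user manipulation.
      async function* inputDeltas(): AsyncGenerator<number> {
        for (const d of [12, 30, 18]) yield d;
      }

      // Stage 1: track the manipulation position; never blocked by rendering.
      async function positionStage(): Promise<void> {
        for await (const d of inputDeltas()) latestOffset += d;
      }

      // Stage 2: render at its own cadence using whatever position is newest.
      async function renderStage(frames: number): Promise<void> {
        for (let i = 0; i < frames; i++) {
          await Promise.resolve(); // yield so the position stage can progress
          console.log(`frame ${i}: offset ${latestOffset}`);
        }
      }

      // Run both stages concurrently; neither waits on the other per frame.
      Promise.all([positionStage(), renderStage(3)]);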
  • Publication number: 20120174005
    Abstract: This document describes content-based snap points and techniques that use these snap points. In some embodiments, multiple content-based snap points are used to stop at points in content that are convenient, prevent overshooting of important parts in the content, and/or aid users in manipulating and consuming the content.
    Type: Application
    Filed: December 31, 2010
    Publication date: July 5, 2012
    Applicant: Microsoft Corporation
    Inventors: Rebecca Deutsch, Bonny P. Lau, Holger Kuehnle, Nicholas R. Waggoner, Ross N. Luengen, Michael A. Nelte
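    A minimal TypeScript sketch of content-based snap points: a projected scroll endpoint is pulled to the nearest content-defined offset so a pan never stops mid-section. The offsets are illustrative:

      // Snap a projected endpoint to the nearest content-defined point.
      function snapEndpoint(projected: number, snapPoints: number[]): number {
        return snapPoints.reduce((best, p) =>
          Math.abs(p - projected) < Math.abs(best - projected) ? p : best
        );
      }

      const sectionStarts = [0, 480, 960, 1440]; // e.g., section boundaries
      console.log(snapEndpoint(700, sectionStarts)); // 480: lands on a section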
  • Publication number: 20100103186
    Abstract: Enhancing user interface elements in ambient light is described. In embodiment(s), a sensor input can be received from light sensor(s) that detect ambient light proximate an integrated display of a portable device. A determination can be made that the ambient light detracts from the visibility of user interface elements displayed in a user interface on the integrated display, and graphic components of a user interface element can be modified to enhance the visibility of the user interface element for display in the ambient light.
    Type: Application
    Filed: October 24, 2008
    Publication date: April 29, 2010
    Applicant: Microsoft Corporation
    Inventors: Ross N. Luengen, Michael H. LaManna, Gavin M. Gear
  • Publication number: 20100103098
    Abstract: Positioning of user interface elements for display is described. In various embodiment(s), sensor input can be received from one or more sensors that are integrated with a portable device. A device hold position that corresponds to where the portable device is grasped by a user can be determined based at least in part on the sensor input. Display of user interface element(s) can then be initiated on an integrated display of the portable device based on the device hold position that corresponds to where the portable device is grasped.
    Type: Application
    Filed: October 24, 2008
    Publication date: April 29, 2010
    Inventors: Gavin M. Gear, Ross N. Luengen, Michael C. Miller
  • Patent number: 7389434
    Abstract: A method of controlling power management is provided. In an embodiment, the user provides feedback that the inactivity period before a display blanks is too short. In response to the user feedback, a behavior tracking mode is entered and the inactivity period is adjusted to a period that is more suitable to the user's needs. In an embodiment, the adjustment may be done through incrementing a counter and changing the inactivity period based on the value of the counter.
    Type: Grant
    Filed: August 30, 2005
    Date of Patent: June 17, 2008
    Assignee: Microsoft Corporation
    Inventors: David Switzer, Geralyn M. Miller, Issa Y. Khoury, Matthew H. Holle, Michael S. Bernstein, Ross N. Luengen
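    A minimal TypeScript sketch of the counter-based adjustment in this abstract; the timings and scaling rule are assumptions, since the abstract only states that a counter drives the change:

      class BlankTimeoutPolicy {
        private feedbackCount = 0;            // the counter from the abstract
        constructor(private timeoutSec = 60) {}

        // User signaled the display blanked too soon (e.g., woke it right away).
        reportTooShort(): void {
          this.feedbackCount++;
          // Lengthen the inactivity period based on the counter's value.
          this.timeoutSec = 60 * (1 + this.feedbackCount);
        }

        get inactivityPeriodSec(): number {
          return this.timeoutSec;
        }
      }

      const policy = new BlankTimeoutPolicy();
      policy.reportTooShort();
      console.log(policy.inactivityPeriodSec); // 120 s after one complaint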