Patents by Inventor Ian M. Sands

Ian M. Sands has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20100241348
    Abstract: Navigation information may be provided. First, a destination location may be received at a portable device. Next, a current location of the portable device may be detected. Then, at least one way-point may be calculated based on the current location and the destination location. An orientation and a level of the portable device may be determined and the at least one way-point may then be projected from the portable device.
    Type: Application
    Filed: March 19, 2009
    Publication date: September 23, 2010
    Applicant: Microsoft Corporation
    Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
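    As an illustrative sketch only (not the claimed method), the way-point calculation step of this application could be read as interpolating between the current and destination locations; the function name and the great-circle midpoint choice below are assumptions.

    ```python
    import math

    def midpoint_waypoint(current, destination):
        """Compute one intermediate way-point as the great-circle midpoint
        between two (latitude, longitude) pairs given in degrees.

        This is a hypothetical reading of "calculate at least one way-point
        based on the current location and the destination location"."""
        lat1, lon1 = map(math.radians, current)
        lat2, lon2 = map(math.radians, destination)
        dlon = lon2 - lon1
        # Intermediate Cartesian terms of the standard midpoint formula.
        bx = math.cos(lat2) * math.cos(dlon)
        by = math.cos(lat2) * math.sin(dlon)
        lat_m = math.atan2(math.sin(lat1) + math.sin(lat2),
                           math.sqrt((math.cos(lat1) + bx) ** 2 + by ** 2))
        lon_m = lon1 + math.atan2(by, math.cos(lat1) + bx)
        return math.degrees(lat_m), math.degrees(lon_m)
    ```

    For example, the midpoint of the equator points (0, 0) and (0, 90) comes out at roughly (0, 45); a real implementation would also fold in the device's detected orientation and level before projecting the way-point.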
  • Publication number: 20100241999
    Abstract: User interface manipulation using three-dimensional (3D) spatial gestures may be provided. A two-dimensional (2D) user interface (UI) representation may be displayed. A first gesture may be performed, and, in response to the first gesture's detection, the 2D UI representation may be converted into a 3D UI representation. A second gesture may then be performed, and, in response to the second gesture's detection, the 3D UI representation may be manipulated. Finally, a third gesture may be performed, and, in response to the third gesture's detection, the 3D UI representation may be converted back into the 2D UI representation.
    Type: Application
    Filed: March 19, 2009
    Publication date: September 23, 2010
    Applicant: Microsoft Corporation
    Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
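    The three-gesture flow described in this abstract can be sketched as a small state machine; the gesture names ("expand", "rotate", "collapse") are illustrative placeholders, not terms from the application.

    ```python
    class GestureUI:
        """Minimal state machine for the 2D <-> 3D UI flow: a first gesture
        converts 2D to 3D, a second manipulates the 3D representation, and a
        third converts it back to 2D. Gesture names are assumptions."""

        def __init__(self):
            self.mode = "2D"

        def on_gesture(self, gesture):
            if gesture == "expand" and self.mode == "2D":
                self.mode = "3D"      # first gesture: 2D -> 3D conversion
            elif gesture == "rotate" and self.mode == "3D":
                pass                  # second gesture: manipulate the 3D UI
            elif gesture == "collapse" and self.mode == "3D":
                self.mode = "2D"      # third gesture: 3D -> 2D conversion
            return self.mode
    ```

    A manipulation gesture is only meaningful while the interface is in its 3D representation, which is why "rotate" is a no-op in 2D mode here.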
  • Publication number: 20100241987
    Abstract: A tear-drop way-finding user interface (UI) may be provided. A first UI portion corresponding to a device location may be provided. In addition, an object may be displayed at a first relative position within the first UI portion. Then, upon a detected change in device location, a second UI portion corresponding to the changed device location may be provided. In response to the changed device location, a second relative position of the object may be calculated. Next, a determination may be made as to whether the second relative position of the object is within a displayable range of the second UI portion. If the second relative position of the object is not within the displayable range of the second UI portion, then a tear-drop icon indicative of the second relative position of the object may be displayed at an edge of the second UI portion.
    Type: Application
    Filed: March 19, 2009
    Publication date: September 23, 2010
    Applicant: Microsoft Corporation
    Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
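    The core geometric test in this abstract, deciding whether an object falls within the displayable range and otherwise anchoring a tear-drop icon at the view's edge, might look like the following sketch; the rectangular view model and function name are assumptions, not the claimed design.

    ```python
    def teardrop_anchor(obj, view_center, half_w, half_h):
        """Return (x, y, on_edge). If the object's position lies within the
        displayable rectangle around view_center, it is returned unchanged;
        otherwise the direction vector toward the object is clamped to the
        rectangle's edge, where a tear-drop icon would be drawn."""
        dx = obj[0] - view_center[0]
        dy = obj[1] - view_center[1]
        if abs(dx) <= half_w and abs(dy) <= half_h:
            return obj[0], obj[1], False  # within displayable range
        # Scale the direction vector so it just touches the nearest edge.
        scale = min(half_w / abs(dx) if dx else float("inf"),
                    half_h / abs(dy) if dy else float("inf"))
        return view_center[0] + dx * scale, view_center[1] + dy * scale, True
    ```

    When the device location changes, recomputing this anchor for each off-screen object keeps every tear-drop icon pinned to the edge nearest that object's true relative position.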
  • Publication number: 20100240390
    Abstract: A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module.
    Type: Application
    Filed: March 19, 2009
    Publication date: September 23, 2010
    Applicant: Microsoft Corporation
    Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
  • Publication number: 20100225595
    Abstract: The claimed subject matter provides a system and/or a method that facilitates distinguishing input among one or more users in a surface computing environment. A variety of information can be obtained and analyzed to infer an association between a particular input and a particular user. Touch point information can be acquired from a surface, wherein the touch point information relates to a touch point. In addition, one or more environmental sensors can monitor the surface computing environment and provide environmental information. The touch point information and the environmental information can be analyzed to determine direction of inputs, location of users, movement of users, and so on. Individual analysis results can be correlated and/or aggregated to generate an inference of association between a touch point and a user.
    Type: Application
    Filed: March 3, 2009
    Publication date: September 9, 2010
    Applicant: Microsoft Corporation
    Inventors: Stephen E. Hodges, Hrvoje Benko, Ian M. Sands, David Alexander Butler, Shahram Izadi, William Ben Kunz, Kenneth P. Hinckley
  • Publication number: 20100218249
    Abstract: The claimed subject matter provides a system and/or a method that facilitates authentication of a user in a surface computing environment. A device or authentication object can be carried by a user and employed to retain authentication information. An authentication component can obtain the authentication information from the device and analyze the information to verify an identity of the user. A touch input component can ascertain if a touch input is authentic by associating touch input with the user. In addition, authentication information can be employed to establish a secure communications channel for transfer of user data.
    Type: Application
    Filed: February 25, 2009
    Publication date: August 26, 2010
    Applicant: Microsoft Corporation
    Inventors: Andrew D. Wilson, Stephen E. Hodges, Peter B. Thompson, Meredith June Morris, Paul Armistead Hoover, William J. Westerinen, Steven N. Bathiche, Ian M. Sands, Shahram Izadi, David Alexander Butler, Matthew B. MacLaurin, Arthur T. Whitten, William Ben Kunz, Shawn R. LeProwse, Hrvoje Benko
  • Publication number: 20100149090
    Abstract: Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty or engagement of users through signaling and/or information gathering can occur.
    Type: Application
    Filed: December 15, 2008
    Publication date: June 17, 2010
    Applicant: Microsoft Corporation
    Inventors: Meredith June Morris, Eric J. Horvitz, Andrew David Wilson, F. David Jones, Stephen E. Hodges, Kenneth P. Hinckley, David Alexander Butler, Ian M. Sands, V. Kevin Russ, Hrvoje Benko, Shawn R. LeProwse, Shahram Izadi, William Ben Kunz
  • Publication number: 20100107219
    Abstract: Within a surface computing environment users are provided a seamless and intuitive manner of modifying security levels associated with information. If a modification is to be made the user can perceive the modifications and the result of such modifications, such as on a display. When information is rendered within the surface computing environment and a condition changes, the user can quickly have that information concealed in order to mitigate unauthorized access to the information.
    Type: Application
    Filed: February 26, 2009
    Publication date: April 29, 2010
    Applicant: Microsoft Corporation
    Inventors: Peter B. Thompson, Ian M. Sands, Ali M. Vassigh, Eric I-Chao Chang
  • Publication number: 20090215471
    Abstract: A user of a mobile device is able to display information about objects in the surrounding environment and to optionally interact with those objects. The information may be displayed as a graphical overlay on top of a real-time display of imagery from a camera in the mobile device with the overlay indexed to the real-time display. The graphical overlay may include positional information about an external object and may include navigational information intended to assist the user in moving to the object's location. There may also be a graphical user interface which allows the user to utilize the mobile device to interact with an external object.
    Type: Application
    Filed: February 21, 2008
    Publication date: August 27, 2009
    Applicant: Microsoft Corporation
    Inventors: Ian M. Sands, V. Kevin Russ