Patents by Inventor Kevin Russ

Kevin Russ has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20100306018
    Abstract: Meeting state recall may be provided. A meeting context may be saved at the end of and/or during an event. The meeting context may comprise, for example, a hardware configuration, a software configuration, a recording of the meeting, and/or data associated with a subject of the meeting. The meeting context may be associated with an ongoing project and may be restored at a subsequent meeting associated with the ongoing project.
    Type: Application
    Filed: May 27, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Russ Burtner, Kevin Russ, Ian Sands, John Snavely
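The save-and-restore cycle this abstract describes can be sketched in a few lines of Python. This is a minimal illustration, not the patented implementation: the keyed store, the `project_id` key, and the context fields (`recording`, `layout`) are all assumptions made for the example.

```python
import json

# Hypothetical meeting-context record: serialized at the end of a
# meeting and restored at a later meeting on the same project.
def save_context(store: dict, project_id: str, context: dict) -> None:
    store[project_id] = json.dumps(context)

def restore_context(store: dict, project_id: str):
    raw = store.get(project_id)
    return json.loads(raw) if raw is not None else None

store = {}
save_context(store, "proj-42",
             {"recording": "mtg1.wav", "layout": "dual-screen"})
print(restore_context(store, "proj-42"))
```

In practice the store would be persistent (a database or file), but the shape of the round trip is the same.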
  • Publication number: 20100306004
    Abstract: A computing system causes a plurality of display devices to display user interfaces containing portions of a canvas shared by a plurality of users. The canvas is a graphical space containing discrete graphical elements located at arbitrary locations within the canvas. Each of the discrete graphical elements graphically represents a discrete resource. When a user interacts with a resource in the set of resources, the computing system modifies the canvas to include an interaction element indicating that the user is interacting with the resource. The computing system then causes the display devices to update the user interfaces such that the user interfaces reflect a substantially current state of the canvas. In this way, the users may be able to understand which ones of the users are interacting with which ones of the resources.
    Type: Application
    Filed: May 26, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Russ Burtner, Kevin Russ, Ian Sands, John Snavely
  • Publication number: 20100302144
    Abstract: A virtual mouse input device is created in response to a placement of a card on a touch surface. When the card is placed on the touch surface, the boundaries of the card are captured and a virtual mouse appears around the card. The virtual mouse may be linked with a user through an identifier that is contained on the card. Other controls and actions may be presented in menus that appear with the virtual mouse. For instance, the user may select the type of input (e.g. mouse, keyboard, ink or trackball) driven by the business card. Once created, the virtual mouse is configured to receive user input until the card is removed from the touch surface. The virtual mouse is configured to move a cursor on a display in response to movement of the card on the touch surface.
    Type: Application
    Filed: May 28, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Edwin Russ Burtner, V. Kevin Russ, Ian M. Sands, John A. Snavely
  • Publication number: 20100295794
    Abstract: Embodiments of the present invention provide a dual-sided multi-touch computing device that offers the advantages of a keyboard in addition to the conveniences of a slate device. The dual-sided multi-touch computing device may be utilized in two orientations; one side is a multi-touch slate device, and the other side is a multi-touch display keyboard. The device includes an orientation-recognition component, so that it may configure itself based on its orientation. The present invention may be utilized as a stand-alone personal computer or as a peripheral device in conjunction with other devices.
    Type: Application
    Filed: May 20, 2009
    Publication date: November 25, 2010
    Applicant: Microsoft Corporation
    Inventors: Kevin Russ, Ian Sands, Russ Burtner, John Snavely
  • Publication number: 20100299060
    Abstract: Rule-based location sharing may be provided. A location determining device, such as a Global Positioning System (GPS) enabled device, may receive a request to share the location. A rule may be used to determine whether to share the location with the requestor. If the rule allows the location to be shared, the location may be sent to the requestor. The location may be relayed through a third party server, which may be operative to evaluate the rule before sharing the location with the requestor.
    Type: Application
    Filed: May 22, 2009
    Publication date: November 25, 2010
    Applicant: Microsoft Corporation
    Inventors: John Snavely, Kevin Russ, Ian Sands, Russ Burtner
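The rule-gated sharing flow in this abstract can be sketched as follows. This is a hedged illustration only: the rule shape (an allow-list plus a time window) and all names (`ShareRule`, `share_location`) are assumptions for the example, not terms from the patent.

```python
from dataclasses import dataclass

@dataclass
class ShareRule:
    """Hypothetical rule: share only with listed requesters
    and only during a daily time window."""
    allowed_requesters: set
    start_hour: int  # inclusive, 24-hour clock
    end_hour: int    # exclusive

    def permits(self, requester: str, hour: int) -> bool:
        return (requester in self.allowed_requesters
                and self.start_hour <= hour < self.end_hour)

def share_location(rule: ShareRule, requester: str, hour: int, location):
    """Return the location only if the rule allows sharing; else None."""
    return location if rule.permits(requester, hour) else None

rule = ShareRule({"alice"}, 9, 17)
print(share_location(rule, "alice", 10, (47.6, -122.3)))  # -> (47.6, -122.3)
print(share_location(rule, "bob", 10, (47.6, -122.3)))    # -> None
```

In the patented arrangement this evaluation may also happen on a third-party server that relays the location, so the rule never leaves the owner's control.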
  • Publication number: 20100293501
    Abstract: Embodiments of the present invention are directed toward facilitating multi-user input on large format displays. In situations where multiple users may want to work individually on separate content, or individually on the same content, embodiments of the present invention provide an interface allowing a user or users to segment a display in a way to create isolated areas in which multiple users may manipulate content independently and concurrently.
    Type: Application
    Filed: May 18, 2009
    Publication date: November 18, 2010
    Applicant: Microsoft Corporation
    Inventors: Kevin Russ, John Snavely, Ian Sands, Russ Burtner
  • Publication number: 20100241987
    Abstract: A tear-drop way-finding user interface (UI) may be provided. A first UI portion corresponding to a device location may be provided. In addition, an object may be displayed at a first relative position within the first UI portion. Then, upon a detected change in device location, a second UI portion corresponding to the changed device location may be provided. In response to the changed device location, a second relative position of the object may be calculated. Next, a determination may be made as to whether the second relative position of the object is within a displayable range of the second UI portion. If the second relative position of the object is not within the displayable range of the second UI portion, then a tear-drop icon indicative of the second relative position of the object may be displayed at an edge of the second UI portion.
    Type: Application
    Filed: March 19, 2009
    Publication date: September 23, 2010
    Applicant: Microsoft Corporation
    Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
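The core displayable-range check in this abstract reduces to clamping an off-screen position to the viewport edge. The sketch below assumes a simple rectangular viewport in screen coordinates; the function name and coordinate convention are illustrative, not from the patent.

```python
def edge_indicator(obj_x, obj_y, width, height):
    """If (obj_x, obj_y) lies outside the visible viewport
    0..width x 0..height, return the clamped edge position where a
    tear-drop icon would be drawn; return None if it is displayable."""
    if 0 <= obj_x <= width and 0 <= obj_y <= height:
        return None
    x = min(max(obj_x, 0), width)
    y = min(max(obj_y, 0), height)
    return (x, y)

print(edge_indicator(50, 50, 100, 100))   # -> None (on screen)
print(edge_indicator(150, 40, 100, 100))  # -> (100, 40), right edge
```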
  • Publication number: 20100241348
    Abstract: Navigation information may be provided. First, a destination location may be received at a portable device. Next, a current location of the portable device may be detected. Then, at least one way-point may be calculated based on the current location and the destination location. An orientation and a level of the portable device may be determined and the at least one way-point may then be projected from the portable device.
    Type: Application
    Filed: March 19, 2009
    Publication date: September 23, 2010
    Applicant: Microsoft Corporation
    Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
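Projecting a way-point from the device requires knowing the direction toward it; a standard way to compute that is the initial great-circle bearing between two latitude/longitude points. This is a generic geodesy sketch under that assumption, not the patent's method.

```python
import math

def bearing_to(cur_lat, cur_lon, dest_lat, dest_lon):
    """Initial great-circle bearing in degrees (0 = north, 90 = east)
    from the current location toward a way-point."""
    phi1, phi2 = math.radians(cur_lat), math.radians(dest_lat)
    dlon = math.radians(dest_lon - cur_lon)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360
```

Combined with the device's own orientation (from a compass) and level (from an accelerometer), this bearing tells the projector where to render the way-point relative to the device.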
  • Publication number: 20100240390
    Abstract: A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module.
    Type: Application
    Filed: March 19, 2009
    Publication date: September 23, 2010
    Applicant: Microsoft Corporation
    Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
  • Publication number: 20100241999
    Abstract: User interface manipulation using three-dimensional (3D) spatial gestures may be provided. A two-dimensional (2D) user interface (UI) representation may be displayed. A first gesture may be performed, and, in response to the first gesture's detection, the 2D UI representation may be converted into a 3D UI representation. A second gesture may then be performed, and, in response to the second gesture's detection, the 3D UI representation may be manipulated. Finally, a third gesture may be performed, and, in response to the third gesture's detection, the 3D UI representation may be converted back into the 2D UI representation.
    Type: Application
    Filed: March 19, 2009
    Publication date: September 23, 2010
    Applicant: Microsoft Corporation
    Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
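The three-gesture flow in this abstract is essentially a small state machine. The sketch below assumes gesture names (`expand`, `rotate`, `collapse`) purely for illustration; the patent does not specify which spatial gestures map to which transitions.

```python
class GestureUI:
    """Toy state machine: 2D -> 3D on one gesture, manipulate the
    3D view on another, and convert back to 2D on a third."""
    def __init__(self):
        self.mode = "2D"
        self.rotation = 0.0

    def on_gesture(self, gesture: str) -> None:
        if gesture == "expand" and self.mode == "2D":
            self.mode = "3D"        # first gesture: 2D -> 3D
        elif gesture == "rotate" and self.mode == "3D":
            self.rotation += 90.0   # second gesture: manipulate 3D view
        elif gesture == "collapse" and self.mode == "3D":
            self.mode = "2D"        # third gesture: 3D -> 2D
```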
  • Publication number: 20100149090
    Abstract: Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty or engagement of users through signaling and/or information gathering can occur.
    Type: Application
    Filed: December 15, 2008
    Publication date: June 17, 2010
    Applicant: Microsoft Corporation
    Inventors: Meredith June Morris, Eric J. Horvitz, Andrew David Wilson, F. David Jones, Stephen E. Hodges, Kenneth P. Hinckley, David Alexander Butler, Ian M. Sands, V. Kevin Russ, Hrvoje Benko, Shawn R. LeProwse, Shahram Izadi, William Ben Kunz
  • Publication number: 20090265632
    Abstract: Technologies are described herein for providing a non-linear presentation canvas. A non-linear presentation canvas is provided. The non-linear presentation canvas is a virtual space onto which multimedia files and hyperlinks can be inserted. A content preview for one or more of the multimedia files and hyperlinks inserted onto the non-linear presentation canvas is displayed. The content preview is selectable to display the corresponding multimedia files and hyperlinks during a presentation.
    Type: Application
    Filed: April 21, 2008
    Publication date: October 22, 2009
    Applicant: Microsoft Corporation
    Inventors: Victor Kevin Russ, Ian Michael Sands
  • Publication number: 20090215471
    Abstract: A user of a mobile device is able to display information about objects in the surrounding environment and to optionally interact with those objects. The information may be displayed as a graphical overlay on top of a real-time display of imagery from a camera in the mobile device with the overlay indexed to the real-time display. The graphical overlay may include positional information about an external object and may include navigational information intended to assist the user in moving to the object's location. There may also be a graphical user interface which allows the user to utilize the mobile device to interact with an external object.
    Type: Application
    Filed: February 21, 2008
    Publication date: August 27, 2009
    Applicant: Microsoft Corporation
    Inventors: Ian M. Sands, V. Kevin Russ
  • Patent number: 7542845
    Abstract: The information navigation interface facilitates receiving an indication of a discrete physical contact event with a user input device. The contact event is associated with a user request for navigating an assembly of displayable information. A navigation action based on the received indication is then determined. The navigation action may include either (a) both scrolling and expansion of currently displayed information wherein the scrolling and expansion occur simultaneously or nearly simultaneously, or (b) both scrolling and contraction of currently displayed information wherein the scrolling and contraction occur simultaneously or nearly simultaneously. Once determined, the navigation action is applied, resulting in a display of a second portion of the assembly of information.
    Type: Grant
    Filed: July 28, 2006
    Date of Patent: June 2, 2009
    Assignee: Microsoft Corporation
    Inventors: Ian Michael Sands, Victor Kevin Russ
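The simultaneous scroll-and-scale action this patent describes can be sketched as a single update applied per contact event. The function name, the fixed 1.1 scale factor, and the sign convention are assumptions for the example, not values from the patent.

```python
def navigate(offset: float, scale: float, drag_dy: float, direction: int):
    """Apply one navigation action: the drag distance scrolls the view
    while the same event also scales it (expansion when direction > 0,
    contraction when direction < 0)."""
    new_offset = offset + drag_dy
    factor = 1.1 if direction > 0 else 1 / 1.1
    return new_offset, scale * factor
```

Because both quantities change in the same call, the displayed information scrolls and expands (or contracts) together, matching the "simultaneously or nearly simultaneously" behavior the abstract describes.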
  • Publication number: 20050256786
    Abstract: A system and method for empowering consumers in a retail environment. Among many other features, the present invention provides a number of menu and item list features to enhance a shopper's experience in a retail environment. In addition, the invention provides an item-level location mapping feature. By the use of the user's shopping list and a database storing information describing the location of items on the list, the system dynamically generates routing maps that show a path the user can take to locate the items on the list. The display may include a map with a route graphically displayed on the map, and/or a text description of the route.
    Type: Application
    Filed: August 31, 2004
    Publication date: November 17, 2005
    Inventors: Ian Michael Sands, Andrew Kelly, Victor Kevin Russ