Patents by Inventor John Snavely

John Snavely has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9207806
    Abstract: A virtual mouse input device is created in response to a placement of a card on a touch surface. When the card is placed on the touch surface, the boundaries of the card are captured and a virtual mouse appears around the card. The virtual mouse may be linked with a user through an identifier that is contained on the card. Other controls and actions may be presented in menus that appear with the virtual mouse. For instance, the user may select the type of input (e.g. mouse, keyboard, ink or trackball) driven by the business card. Once created, the virtual mouse is configured to receive user input until the card is removed from the touch surface. The virtual mouse is configured to move a cursor on a display in response to movement of the card on the touch surface.
    Type: Grant
    Filed: May 28, 2009
    Date of Patent: December 8, 2015
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Edwin Russ Burtner, V. Kevin Russ, Ian M. Sands, John A. Snavely
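
A minimal sketch of the virtual-mouse idea in the abstract above, assuming a hypothetical touch framework; Rect, Cursor, and VirtualMouse are names invented for this illustration, not taken from the patent.

```python
# Illustrative sketch only; all class and method names are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

class Cursor:
    def __init__(self):
        self.x, self.y = 0.0, 0.0

    def move_by(self, dx, dy):
        self.x += dx
        self.y += dy

class VirtualMouse:
    def __init__(self, card_bounds: Rect, user_id: Optional[str] = None):
        self.bounds = card_bounds   # boundaries captured when the card was placed
        self.user_id = user_id      # identifier read from the card, if any
        self.active = True          # accepts input only while the card is on the surface

    def on_card_moved(self, new_bounds: Rect, cursor: Cursor):
        # Card movement on the surface drives cursor movement on the display.
        cursor.move_by(new_bounds.x - self.bounds.x, new_bounds.y - self.bounds.y)
        self.bounds = new_bounds

    def on_card_removed(self):
        self.active = False
```
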
  • Patent number: 9141284
    Abstract: An input device is created on a touch screen in response to a user's placement of their hand. When a user places their hand on the touch screen, an input device sized for their hand is dynamically created. Alternatively, some other input device may be created. For example, when the user places two hands on the device a split keyboard input device may be dynamically created on the touch screen that is split between the user's hand locations. Once the input device is determined, the user may enter input through the created device on the input screen. The input devices may be configured for each individual user such that the display of the input device changes based on physical characteristics that are associated with the user.
    Type: Grant
    Filed: May 28, 2009
    Date of Patent: September 22, 2015
    Assignee: MICROSOFT TECHNOLOGY LICENSING, LLC
    Inventors: Ian M. Sands, John A. Snavely, Edwin Russ Burtner, V. Kevin Russ
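
A minimal sketch of how the hand-sized input device above might be chosen; create_input_device and the (x, y, width) contact format are hypothetical assumptions, not from the patent.

```python
# Illustrative sketch only; contact format and device descriptors are hypothetical.
def create_input_device(hand_contacts):
    """hand_contacts: list of (x, y, width) tuples for hands resting on the screen."""
    if len(hand_contacts) == 2:
        # Two hands: split a keyboard between the two hand locations.
        left, right = sorted(hand_contacts, key=lambda h: h[0])
        return {"type": "split_keyboard", "left_at": left[:2], "right_at": right[:2]}
    if len(hand_contacts) == 1:
        x, y, width = hand_contacts[0]
        # One hand: scale the device to the measured hand width (180 is an arbitrary baseline).
        return {"type": "keyboard", "at": (x, y), "scale": width / 180.0}
    return None

print(create_input_device([(100, 600, 170), (700, 620, 175)]))
```
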
  • Publication number: 20140344706
    Abstract: A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module.
    Type: Application
    Filed: August 5, 2014
    Publication date: November 20, 2014
    Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
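
A minimal sketch of the dual-module idea above, assuming a single lateral-acceleration reading stands in for the detected motion; every name here is hypothetical.

```python
# Illustrative sketch only; the motion model and UI regions are hypothetical.
def relative_position(accel_x):
    # Crude rule: lateral acceleration tells us which side the first module ended up on.
    return "left" if accel_x < 0 else "right"

def portion_for(position, ui_regions):
    # ui_regions maps a relative position ("left"/"right") to the UI portion shown there.
    return ui_regions.get(position)

ui_regions = {"left": "navigation pane", "right": "content pane"}
print(portion_for(relative_position(accel_x=-0.4), ui_regions))  # -> navigation pane
```
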
  • Patent number: 8849570
    Abstract: Navigation information may be provided. First, a destination location may be received at a portable device. Next, a current location of the portable device may be detected. Then, at least one way-point may be calculated based on the current location and the destination location. An orientation and a level of the portable device may be determined and the at least one way-point may then be projected from the portable device.
    Type: Grant
    Filed: March 19, 2009
    Date of Patent: September 30, 2014
    Assignee: Microsoft Corporation
    Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
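
A minimal sketch of the way-point calculation described above, assuming simple linear interpolation between (lat, lon) pairs; the waypoints helper and the coordinates are hypothetical, and the projection step is omitted.

```python
# Illustrative sketch only; linear interpolation stands in for the patent's calculation.
def waypoints(current, destination, count=1):
    """Return `count` evenly spaced (lat, lon) way-points between two locations."""
    lat0, lon0 = current
    lat1, lon1 = destination
    return [
        (lat0 + (lat1 - lat0) * i / (count + 1), lon0 + (lon1 - lon0) * i / (count + 1))
        for i in range(1, count + 1)
    ]

print(waypoints((47.6205, -122.3493), (47.6097, -122.3331)))  # single midpoint way-point
```
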
  • Patent number: 8798669
    Abstract: A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module.
    Type: Grant
    Filed: February 15, 2012
    Date of Patent: August 5, 2014
    Assignee: Microsoft Corporation
    Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
  • Publication number: 20140143737
    Abstract: In one embodiment, an apparatus includes one or more processors and a memory coupled to the processors that includes instructions executable by the processors. When executing the instructions, the processors present on a display of the apparatus a first screen that corresponds to a first mode of the apparatus. The first screen is in a first level of a graphical user interface (GUI) hierarchy and occupies a substantial portion of the display when presented. In response to a transition event at the first screen, the processors present a second screen that corresponds to a second mode of the apparatus. The second screen is in the first level of the GUI hierarchy and occupies the substantial portion of the display when presented. In response to a selection event at the first screen, the processors present a third screen that corresponds to a first function of the first mode of the apparatus.
    Type: Application
    Filed: August 30, 2013
    Publication date: May 22, 2014
    Applicant: Samsung Electronics Company, Ltd.
    Inventors: Pranav Mistry, Lining Yao, John Snavely, Eva-Maria Offenberg, Link Chun Huang, Cathy Kim
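
A minimal sketch of the two-level screen hierarchy described above, modeled as a small state machine; the mode and function names, the SCREENS table, and the event labels are hypothetical.

```python
# Illustrative sketch only; screens are keyed by (mode, function) for this example.
SCREENS = {
    ("mode_a", None): "first screen (level 1)",
    ("mode_b", None): "second screen (level 1)",
    ("mode_a", "fn_1"): "third screen (level 2, function of mode A)",
}

def next_screen(current_mode, event):
    if event == "transition":   # move between mode screens at the same level
        return ("mode_b" if current_mode == "mode_a" else "mode_a", None)
    if event == "selection":    # drill into a function of the current mode
        return (current_mode, "fn_1")
    return (current_mode, None)

print(SCREENS[next_screen("mode_a", "transition")])  # second screen (level 1)
print(SCREENS[next_screen("mode_a", "selection")])   # third screen (level 2, ...)
```
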
  • Publication number: 20140139422
    Abstract: In one embodiment, a wearable apparatus includes a sensor, a processor coupled to the sensor, and a memory coupled to the processor that includes instructions executable by the processor. When executing the instructions, the processor detects by the sensor movement of at least a portion of an arm of a user; detects, based at least in part on the movement, a gesture made by the user; and processes the gesture as input to the wearable apparatus.
    Type: Application
    Filed: August 30, 2013
    Publication date: May 22, 2014
    Applicant: Samsung Electronics Company, Ltd.
    Inventors: Pranav Mistry, Sajid Sadi, Lining Yao, John Snavely
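
A minimal sketch of the wearable gesture pipeline described above, assuming raw accelerometer samples and a single threshold-based gesture; the sample format, threshold, and gesture name are hypothetical.

```python
# Illustrative sketch only; a real classifier would be far more involved.
def detect_gesture(samples, threshold=2.5):
    """samples: list of (ax, ay, az) accelerometer readings in g."""
    peak = max(abs(a) for s in samples for a in s)
    return "flick" if peak > threshold else None   # a sharp arm movement

def handle_input(gesture):
    if gesture == "flick":
        print("advancing to next screen")          # gesture processed as device input

handle_input(detect_gesture([(0.1, 0.0, 1.0), (3.2, 0.4, 1.1)]))
```
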
  • Publication number: 20140143678
    Abstract: In one embodiment, an apparatus includes one or more processors and a memory coupled to the processors that includes instructions executable by the processors. When executing the instructions, the processors present on a display of the apparatus a first screen of a graphical user interface. The first screen includes one or more first elements. The processors receive user input indicating a transition in the graphical user interface and, in response to the user input, transition from the first screen to a second screen of the graphical user interface and apply one or more visual transition effects to the transition. The second screen includes one or more second elements.
    Type: Application
    Filed: August 30, 2013
    Publication date: May 22, 2014
    Applicant: Samsung Electronics Company, Ltd.
    Inventors: Pranav Mistry, Sajid Sadi, Lining Yao, John Snavely, Eva-Maria Offenberg, Link Chun Huang, Cathy Kim
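
A minimal sketch of one possible visual transition effect between two screens, using a simple crossfade; the screen representation and easing are hypothetical and not taken from the publication.

```python
# Illustrative sketch only; elements are paired with their opacity per frame.
def crossfade(first_elements, second_elements, steps=5):
    for i in range(steps + 1):
        alpha = i / steps
        # Fade the first screen's elements out while the second screen's fade in.
        yield [(e, 1 - alpha) for e in first_elements] + [(e, alpha) for e in second_elements]

for frame in crossfade(["clock"], ["weather"], steps=2):
    print(frame)
```
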
  • Publication number: 20140139454
    Abstract: In one embodiment, a wearable computing device includes one or more sensors, one or more processors, and a memory coupled to the processors that includes instructions executable by the processors. When executing the instructions, the processors detect, by one or more of the sensors of the wearable computing device when worn on a limb of a user, a gesture-recognition-activation event associated with the wearable computing device; detect, by one or more sensors of the wearable computing device when worn on the limb of the user, a movement of the limb; determine a gesture made by the user based at least in part on the movement; and process the gesture as input to the wearable computing device.
    Type: Application
    Filed: August 30, 2013
    Publication date: May 22, 2014
    Applicant: Samsung Electronics Company, Ltd.
    Inventors: Pranav Mistry, Sajid Sadi, Lining Yao, John Snavely
  • Patent number: 8386963
    Abstract: A virtual inking device is created in response to a touch input device detecting a user's inking gesture. For example, when a user places one of their hands in a pen gesture (i.e. by connecting the index finger with the thumb while holding the other fingers near the palm), the user may perform inking operations. When the user changes the pen gesture to an erase gesture (i.e. making a fist) then the virtual pen may become a virtual eraser. Other inking gestures may also be utilized.
    Type: Grant
    Filed: May 28, 2009
    Date of Patent: February 26, 2013
    Assignee: Microsoft Corporation
    Inventors: V. Kevin Russ, Ian M. Sands, John A. Snavely, Edwin Russ Burtner
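
A minimal sketch of the pen/fist gesture mapping described above, assuming a hand-pose classifier already exists elsewhere; the pose labels and tool names are hypothetical.

```python
# Illustrative sketch only; just the pose-to-tool mapping is shown.
POSE_TO_TOOL = {
    "pen_gesture": "virtual_pen",      # index finger touching thumb, other fingers curled
    "fist_gesture": "virtual_eraser",  # a closed fist switches the pen to an eraser
}

def tool_for(pose):
    return POSE_TO_TOOL.get(pose)      # None for unrecognized poses

print(tool_for("pen_gesture"))   # virtual_pen
print(tool_for("fist_gesture"))  # virtual_eraser
```
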
  • Publication number: 20120139939
    Abstract: A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module.
    Type: Application
    Filed: February 15, 2012
    Publication date: June 7, 2012
    Applicant: Microsoft Corporation
    Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
  • Patent number: 8165799
    Abstract: Rule-based location sharing may be provided. A location determining device, such as a Global Positioning System (GPS) enabled device, may receive a request to share the location. A rule may be used to determine whether to share the location with the requestor. If the rule allows the location to be shared, the location may be sent to the requestor. The location may be relayed through a third party server, which may be operative to evaluate the rule before sharing the location with the requestor.
    Type: Grant
    Filed: May 22, 2009
    Date of Patent: April 24, 2012
    Assignee: Microsoft Corporation
    Inventors: John Snavely, Kevin Russ, Ian Sands, Russ Burtner
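
A minimal sketch of rule-based location sharing as described above, assuming a relay server that holds per-requestor rules; the rule structure and all names are hypothetical.

```python
# Illustrative sketch only; rules are plain predicates over the location.
def share_location(requestor, location, rules):
    """rules: dict mapping requestor id -> predicate over the location."""
    rule = rules.get(requestor)
    if rule is not None and rule(location):
        return location          # the rule allows sharing with this requestor
    return None                  # withheld

rules = {
    "alice": lambda loc: True,                     # always share with alice
    "bob": lambda loc: loc["city"] == "Seattle",   # share with bob only while in Seattle
}
print(share_location("bob", {"city": "Seattle", "lat": 47.6, "lon": -122.3}, rules))
```
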
  • Patent number: 8121640
    Abstract: A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module.
    Type: Grant
    Filed: March 19, 2009
    Date of Patent: February 21, 2012
    Assignee: Microsoft Corporation
    Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
  • Publication number: 20100302144
    Abstract: A virtual mouse input device is created in response to a placement of a card on a touch surface. When the card is placed on the touch surface, the boundaries of the card are captured and a virtual mouse appears around the card. The virtual mouse may be linked with a user through an identifier that is contained on the card. Other controls and actions may be presented in menus that appear with the virtual mouse. For instance, the user may select the type of input (e.g. mouse, keyboard, ink or trackball) driven by the business card. Once created, the virtual mouse is configured to receive user input until the card is removed from the touch surface. The virtual mouse is configured to move a cursor on a display in response to movement of the card on the touch surface.
    Type: Application
    Filed: May 28, 2009
    Publication date: December 2, 2010
    Applicant: MICROSOFT CORPORATION
    Inventors: Edwin Russ Burtner, V. Kevin Russ, Ian M. Sands, John A. Snavely
  • Publication number: 20100306018
    Abstract: Meeting state recall may be provided. A meeting context may be saved at the end of and/or during an event. The meeting context may comprise, for example, a hardware configuration, a software configuration, a recording of the meeting, and/or data associated with a subject of the meeting. The meeting context may be associated with an ongoing project and may be restored at a subsequent meeting associated with the ongoing project.
    Type: Application
    Filed: May 27, 2009
    Publication date: December 2, 2010
    Applicant: Microsoft Corporation
    Inventors: Russ Burtner, Kevin Russ, Ian Sands, John Snavely
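
A minimal sketch of saving and restoring a meeting context keyed by project, as described above; the context fields and the in-memory store are hypothetical.

```python
# Illustrative sketch only; a real system would use persistent storage.
import json

def save_context(project_id, context, store):
    store[project_id] = json.dumps(context)

def restore_context(project_id, store):
    raw = store.get(project_id)
    return json.loads(raw) if raw else None

store = {}
save_context("project-42", {"hardware": "room display A",
                            "software": ["whiteboard app"],
                            "recording": "meeting-01.wav",
                            "notes": "budget review"}, store)
print(restore_context("project-42"), store) if False else print(restore_context("project-42", store))
```
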
  • Publication number: 20100302155
    Abstract: An input device is created on a touch screen in response to a user's placement of their hand. When a user places their hand on the touch screen, an input device sized for their hand is dynamically created. Alternatively, some other input device may be created. For example, when the user places two hands on the device a split keyboard input device may be dynamically created on the touch screen that is split between the user's hand locations. Once the input device is determined, the user may enter input through the created device on the input screen. The input devices may be configured for each individual user such that the display of the input device changes based on physical characteristics that are associated with the user.
    Type: Application
    Filed: May 28, 2009
    Publication date: December 2, 2010
    Applicant: MICROSOFT CORPORATION
    Inventors: Ian M. Sands, John A. Snavely, Edwin Russ Burtner, V. Kevin Russ
  • Publication number: 20100306649
    Abstract: A virtual inking device is created in response to a touch input device detecting a user's inking gesture. For example, when a user places one of their hands in a pen gesture (i.e. by connecting the index finger with the thumb while holding the other fingers near the palm), the user may perform inking operations. When the user changes the pen gesture to an erase gesture (i.e. making a fist) then the virtual pen may become a virtual eraser. Other inking gestures may also be utilized.
    Type: Application
    Filed: May 28, 2009
    Publication date: December 2, 2010
    Applicant: MICROSOFT CORPORATION
    Inventors: V. Kevin Russ, Ian M. Sands, John A. Snavely, Edwin Russ Burtner
  • Publication number: 20100306004
    Abstract: A computing system causes a plurality of display devices to display user interfaces containing portions of a canvas shared by a plurality of users. The canvas is a graphical space containing discrete graphical elements located at arbitrary locations within the canvas. Each of the discrete graphical elements graphically represents a discrete resource. When a user interacts with a resource in the set of resources, the computing system modifies the canvas to include an interaction element indicating that the user is interacting with the resource. The computer system then causes the display devices to update the user interfaces such that the user interfaces reflect a substantially current state of the canvas. In this way, the users may be able to understand which ones of the users are interacting with which ones of the resources.
    Type: Application
    Filed: May 26, 2009
    Publication date: December 2, 2010
    Applicant: MICROSOFT CORPORATION
    Inventors: Russ Burtner, Kevin Russ, Ian Sands, John Snavely
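
A minimal sketch of the shared-canvas interaction indicator described above, assuming a plain dictionary data model; all names are hypothetical.

```python
# Illustrative sketch only; the canvas holds elements plus who is interacting with each.
canvas = {
    "elements": {"doc-1": {"pos": (120, 80)}, "img-7": {"pos": (400, 220)}},
    "interactions": {},   # element id -> user currently interacting with it
}

def begin_interaction(canvas, user, element_id):
    # Recorded so every display can render an interaction element for this resource.
    canvas["interactions"][element_id] = user

def end_interaction(canvas, element_id):
    canvas["interactions"].pop(element_id, None)

begin_interaction(canvas, "kevin", "doc-1")
print(canvas["interactions"])   # {'doc-1': 'kevin'}
```
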
  • Patent number: D763288
    Type: Grant
    Filed: October 7, 2014
    Date of Patent: August 9, 2016
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Pranav Mistry, Sajid Sadi, Lining Yao, John Snavely, Eva-Maria Offenberg, Link Chun Huang, Cathy Kim
  • Patent number: D763289
    Type: Grant
    Filed: October 7, 2014
    Date of Patent: August 9, 2016
    Assignee: SAMSUNG ELECTRONICS CO., LTD.
    Inventors: Pranav Mistry, Sajid Sadi, Lining Yao, John Snavely, Eva-Maria Offenberg, Link Chun Huang, Cathy Kim