Patents by Inventor V. Kevin Russ
V. Kevin Russ has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10409381
Abstract: Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty or engagement of users through signaling and/or information gathering can occur.
Type: Grant
Filed: August 10, 2015
Date of Patent: September 10, 2019
Assignee: Microsoft Technology Licensing, LLC
Inventors: Meredith June Morris, Eric J. Horvitz, Andrew David Wilson, F. David Jones, Stephen E. Hodges, Kenneth P. Hinckley, David Alexander Butler, Ian M. Sands, V. Kevin Russ, Hrvoje Benko, Shawn R. LeProwse, Shahram Izadi, William Ben Kunz
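The uncertainty-handling idea in this abstract can be sketched as toy code: infer an intended action from a gesture plus context, and fall back to engaging the user when the inference is too uncertain. Every function name, candidate action, probability, and threshold below is an illustrative assumption, not the claimed implementation.

```python
def interpret_gesture(gesture: str, context: dict, threshold: float = 0.7):
    """Infer an action from a gesture; ask the user when confidence is low."""
    # Toy likelihoods per candidate action, adjusted by context.
    candidates = {"copy": 0.2, "move": 0.2, "delete": 0.2}
    if gesture == "swipe" and context.get("object_selected"):
        candidates["move"] = 0.9  # context makes "move" the likely intent
    best_action = max(candidates, key=candidates.get)
    confidence = candidates[best_action]
    if confidence < threshold:
        # Uncertain: signal the user / gather more information.
        return ("ask_user", candidates)
    return ("execute", best_action)
```

With a selected object, a swipe resolves confidently to "move"; an ambiguous tap falls below the threshold and triggers user engagement instead of a guess.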
-
Patent number: 9207806
Abstract: A virtual mouse input device is created in response to a placement of a card on a touch surface. When the card is placed on the touch surface, the boundaries of the card are captured and a virtual mouse appears around the card. The virtual mouse may be linked with a user through an identifier that is contained on the card. Other controls and actions may be presented in menus that appear with the virtual mouse. For instance, the user may select the type of input (e.g. mouse, keyboard, ink or trackball) driven by the business card. Once created, the virtual mouse is configured to receive user input until the card is removed from the touch surface. The virtual mouse is configured to move a cursor on a display in response to movement of the card on the touch surface.
Type: Grant
Filed: May 28, 2009
Date of Patent: December 8, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Edwin Russ Burtner, V. Kevin Russ, Ian M. Sands, John A. Snavely
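The card-to-virtual-mouse flow described above can be sketched as follows. The widget layout, the margin size, and the user-registry lookup are all illustrative assumptions, not the patented design.

```python
def create_virtual_mouse(card_bounds, card_id, user_registry):
    """Create a virtual-mouse widget around a card placed on the surface."""
    x, y, w, h = card_bounds
    margin = 40  # widget drawn around the captured card boundary (toy value)
    return {
        "bounds": (x - margin, y - margin, w + 2 * margin, h + 2 * margin),
        "user": user_registry.get(card_id, "guest"),  # link via card identifier
        "input_mode": "mouse",  # user could switch to keyboard/ink/trackball
        "active": True,
    }

def move_card(widget, dx, dy):
    """Card movement drives the cursor while the widget is active."""
    if widget["active"]:
        return (dx, dy)  # cursor delta mirrors the card's delta
    return (0, 0)
```

Removing the card would set `active` to `False`, after which motion no longer moves the cursor.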
-
Publication number: 20150346837
Abstract: Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty or engagement of users through signaling and/or information gathering can occur.
Type: Application
Filed: August 10, 2015
Publication date: December 3, 2015
Inventors: Meredith June Morris, Eric J. Horvitz, Andrew David Wilson, F. David Jones, Stephen E. Hodges, Kenneth P. Hinckley, David Alexander Butler, Ian M. Sands, V. Kevin Russ, Hrvoje Benko, Shawn R. LeProwse, Shahram Izadi, William Ben Kunz
-
Patent number: 9141284
Abstract: An input device is created on a touch screen in response to a user's placement of their hand. When a user places their hand on the touch screen, an input device sized for their hand is dynamically created. Alternatively, some other input device may be created. For example, when the user places two hands on the device, a split keyboard input device may be dynamically created on the touch screen that is split between the user's hand locations. Once the input device is determined, the user may enter input through the created device on the touch screen. The input devices may be configured for each individual user such that the display of the input device changes based on physical characteristics that are associated with the user.
Type: Grant
Filed: May 28, 2009
Date of Patent: September 22, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Ian M. Sands, John A. Snavely, Edwin Russ Burtner, V. Kevin Russ
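The hand-placement logic in this abstract can be sketched as a small dispatcher: one hand yields a keyboard at that location, two hands yield a split keyboard, and a per-user profile scales the widget. The function name, the profile key, and the rules are assumptions for illustration only.

```python
def create_input_device(hand_positions, user_profile=None):
    """Choose an on-screen input device from detected hand placements."""
    scale = (user_profile or {}).get("hand_scale", 1.0)  # per-user sizing
    if len(hand_positions) == 2:
        # Two hands: split keyboard, one half under each hand.
        left, right = sorted(hand_positions)
        return {"type": "split_keyboard",
                "halves": [{"at": left, "scale": scale},
                           {"at": right, "scale": scale}]}
    if len(hand_positions) == 1:
        return {"type": "keyboard", "at": hand_positions[0], "scale": scale}
    return None  # no hands detected: no device created
```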
-
Patent number: 9134798
Abstract: Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty or engagement of users through signaling and/or information gathering can occur.
Type: Grant
Filed: December 15, 2008
Date of Patent: September 15, 2015
Assignee: Microsoft Technology Licensing, LLC
Inventors: Meredith June Morris, Eric J. Horvitz, Andrew David Wilson, F. David Jones, Stephen E. Hodges, Kenneth P. Hinckley, David Alexander Butler, Ian M. Sands, V. Kevin Russ, Hrvoje Benko, Shawn R. LeProwse, Shahram Izadi, William Ben Kunz
-
Patent number: 8903430
Abstract: A user of a mobile device is able to display information about objects in the surrounding environment and to optionally interact with those objects. The information may be displayed as a graphical overlay on top of a real-time display of imagery from a camera in the mobile device with the overlay indexed to the real-time display. The graphical overlay may include positional information about an external object and may include navigational information intended to assist the user in moving to the object's location. There may also be a graphical user interface which allows the user to utilize the mobile device to interact with an external object.
Type: Grant
Filed: February 21, 2008
Date of Patent: December 2, 2014
Assignee: Microsoft Corporation
Inventors: Ian M. Sands, V. Kevin Russ
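The positional-overlay idea can be sketched as computing distance and relative bearing from the device to an external object, which an overlay would then render indexed to the camera view. The flat 2D geometry and field names here are simplifying assumptions, not the patented method.

```python
import math

def overlay_entry(device_pos, device_heading_deg, obj_pos, obj_label):
    """Toy overlay entry: distance and bearing to an object, relative to
    the direction the device is pointing (0 deg = north, flat 2D plane)."""
    dx, dy = obj_pos[0] - device_pos[0], obj_pos[1] - device_pos[1]
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy)) % 360       # absolute bearing
    relative = (bearing - device_heading_deg) % 360        # relative to camera
    return {"label": obj_label,
            "distance_m": round(distance, 1),
            "relative_bearing_deg": round(relative, 1)}
```

An entry with relative bearing near 0 would be drawn at the center of the camera view; larger bearings shift it toward (or off) the edge.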
-
Publication number: 20140344706
Abstract: A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module.
Type: Application
Filed: August 5, 2014
Publication date: November 20, 2014
Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
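The dual-module behavior reads as: integrate detected motion of module 1 into a position relative to module 2, then display the UI portion associated with that relative position. The class below is a toy model of that loop; the portion names and the nearest-axis rule are assumptions, not the claimed mechanism.

```python
class DualModuleDevice:
    """Toy model: accumulate module 1's detected motion into an offset
    relative to module 2, then pick which UI portion module 1 displays."""

    def __init__(self):
        self.offset = [0, 0]  # module 1 position relative to module 2

    def on_motion(self, dx, dy):
        """Detected motion updates the relative position and the UI portion."""
        self.offset[0] += dx
        self.offset[1] += dy
        return self.ui_portion()

    def ui_portion(self):
        """Map the dominant axis of the offset to a UI portion (toy rule)."""
        x, y = self.offset
        if abs(y) >= abs(x):
            return "top-half" if y < 0 else "bottom-half"
        return "left-half" if x < 0 else "right-half"
```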
-
Patent number: 8849570
Abstract: Navigation information may be provided. First, a destination location may be received at a portable device. Next, a current location of the portable device may be detected. Then, at least one way-point may be calculated based on the current location and the destination location. An orientation and a level of the portable device may be determined and the at least one way-point may then be projected from the portable device.
Type: Grant
Filed: March 19, 2009
Date of Patent: September 30, 2014
Assignee: Microsoft Corporation
Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
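The way-point calculation can be sketched as interpolating points between the current and destination fixes, with projection gated on the device being held near level. The interpolation scheme, the tilt threshold, and the return shapes are illustrative assumptions.

```python
def waypoints(current, destination, n=3):
    """Interpolate n evenly spaced way-points between current and destination."""
    (x0, y0), (x1, y1) = current, destination
    return [(x0 + (x1 - x0) * i / (n + 1), y0 + (y1 - y0) * i / (n + 1))
            for i in range(1, n + 1)]

def projection(orientation_deg, level_deg, max_tilt=15):
    """Project the next way-point only when the device is near level
    (an assumed rule); the orientation picks the projection heading."""
    if abs(level_deg) > max_tilt:
        return None  # too tilted: suppress projection
    return {"heading_deg": orientation_deg % 360}
```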
-
Patent number: 8798669
Abstract: A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module.
Type: Grant
Filed: February 15, 2012
Date of Patent: August 5, 2014
Assignee: Microsoft Corporation
Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
-
Patent number: 8386963
Abstract: A virtual inking device is created in response to a touch input device detecting a user's inking gesture. For example, when a user places one of their hands in a pen gesture (i.e. by connecting the index finger with the thumb while holding the other fingers near the palm), the user may perform inking operations. When the user changes the pen gesture to an erase gesture (i.e. making a fist) then the virtual pen may become a virtual eraser. Other inking gestures may also be utilized.
Type: Grant
Filed: May 28, 2009
Date of Patent: February 26, 2013
Assignee: Microsoft Corporation
Inventors: V. Kevin Russ, Ian M. Sands, John A. Snavely, Edwin Russ Burtner
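The pose-to-tool mapping described above can be sketched as a small classifier over detected hand features. The feature names and rules below are assumptions chosen to mirror the abstract's two examples, not the patented recognizer.

```python
def classify_inking_tool(index_touches_thumb, fingers_curled, is_fist):
    """Map a detected hand pose to a virtual inking tool (toy rules)."""
    if is_fist:
        return "eraser"  # fist = erase gesture
    if index_touches_thumb and fingers_curled:
        return "pen"     # index-to-thumb with curled fingers = pen gesture
    return None          # unrecognized pose: no inking tool active
```

Changing the pose from pen to fist mid-session would switch the active tool from virtual pen to virtual eraser without any explicit mode button.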
-
Publication number: 20120139939
Abstract: A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module.
Type: Application
Filed: February 15, 2012
Publication date: June 7, 2012
Applicant: Microsoft Corporation
Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
-
Patent number: 8121640
Abstract: A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module.
Type: Grant
Filed: March 19, 2009
Date of Patent: February 21, 2012
Assignee: Microsoft Corporation
Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
-
Publication number: 20100306649
Abstract: A virtual inking device is created in response to a touch input device detecting a user's inking gesture. For example, when a user places one of their hands in a pen gesture (i.e. by connecting the index finger with the thumb while holding the other fingers near the palm), the user may perform inking operations. When the user changes the pen gesture to an erase gesture (i.e. making a fist) then the virtual pen may become a virtual eraser. Other inking gestures may also be utilized.
Type: Application
Filed: May 28, 2009
Publication date: December 2, 2010
Applicant: Microsoft Corporation
Inventors: V. Kevin Russ, Ian M. Sands, John A. Snavely, Edwin Russ Burtner
-
Publication number: 20100302155
Abstract: An input device is created on a touch screen in response to a user's placement of their hand. When a user places their hand on the touch screen, an input device sized for their hand is dynamically created. Alternatively, some other input device may be created. For example, when the user places two hands on the device, a split keyboard input device may be dynamically created on the touch screen that is split between the user's hand locations. Once the input device is determined, the user may enter input through the created device on the touch screen. The input devices may be configured for each individual user such that the display of the input device changes based on physical characteristics that are associated with the user.
Type: Application
Filed: May 28, 2009
Publication date: December 2, 2010
Applicant: Microsoft Corporation
Inventors: Ian M. Sands, John A. Snavely, Edwin Russ Burtner, V. Kevin Russ
-
Publication number: 20100302144
Abstract: A virtual mouse input device is created in response to a placement of a card on a touch surface. When the card is placed on the touch surface, the boundaries of the card are captured and a virtual mouse appears around the card. The virtual mouse may be linked with a user through an identifier that is contained on the card. Other controls and actions may be presented in menus that appear with the virtual mouse. For instance, the user may select the type of input (e.g. mouse, keyboard, ink or trackball) driven by the business card. Once created, the virtual mouse is configured to receive user input until the card is removed from the touch surface. The virtual mouse is configured to move a cursor on a display in response to movement of the card on the touch surface.
Type: Application
Filed: May 28, 2009
Publication date: December 2, 2010
Applicant: Microsoft Corporation
Inventors: Edwin Russ Burtner, V. Kevin Russ, Ian M. Sands, John A. Snavely
-
Publication number: 20100241999
Abstract: User interface manipulation using three-dimensional (3D) spatial gestures may be provided. A two-dimensional (2D) user interface (UI) representation may be displayed. A first gesture may be performed, and, in response to the first gesture's detection, the 2D UI representation may be converted into a 3D UI representation. A second gesture may then be performed, and, in response to the second gesture's detection, the 3D UI representation may be manipulated. Finally, a third gesture may be performed, and, in response to the third gesture's detection, the 3D UI representation may be converted back into the 2D UI representation.
Type: Application
Filed: March 19, 2009
Publication date: September 23, 2010
Applicant: Microsoft Corporation
Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
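The three-gesture sequence in this abstract is naturally a small state machine: one gesture converts 2D to 3D, one manipulates the 3D representation, and one converts back. The gesture names and the rotation-as-manipulation choice below are illustrative assumptions.

```python
class UIRepresentation:
    """Toy gesture-driven state machine between 2D and 3D UI modes."""

    def __init__(self):
        self.mode = "2D"
        self.rotation = 0  # example manipulation state in 3D mode

    def gesture(self, name):
        if name == "expand" and self.mode == "2D":
            self.mode = "3D"       # first gesture: convert 2D -> 3D
        elif name == "rotate" and self.mode == "3D":
            self.rotation += 90    # second gesture: manipulate the 3D UI
        elif name == "collapse" and self.mode == "3D":
            self.mode = "2D"       # third gesture: convert back to 2D
        return self.mode
```

Gestures that do not match the current mode (e.g. "rotate" while in 2D) are simply ignored, which keeps the sequence well ordered.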
-
Publication number: 20100241348
Abstract: Navigation information may be provided. First, a destination location may be received at a portable device. Next, a current location of the portable device may be detected. Then, at least one way-point may be calculated based on the current location and the destination location. An orientation and a level of the portable device may be determined and the at least one way-point may then be projected from the portable device.
Type: Application
Filed: March 19, 2009
Publication date: September 23, 2010
Applicant: Microsoft Corporation
Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
-
Publication number: 20100240390
Abstract: A dual module portable device may be provided. A motion of a first module of the dual module portable device may be detected. Based at least in part on the detected motion, a position of the first module may be determined relative to the second module of the portable device. Once the relative position of the first module has been determined, a portion of a user interface associated with the relative position may be displayed at the first module.
Type: Application
Filed: March 19, 2009
Publication date: September 23, 2010
Applicant: Microsoft Corporation
Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
-
Publication number: 20100241987
Abstract: A tear-drop way-finding user interface (UI) may be provided. A first UI portion corresponding to a device location may be provided. In addition, an object may be displayed at a first relative position within the first UI portion. Then, upon a detected change in device location, a second UI portion corresponding to the changed device location may be provided. In response to the changed device location, a second relative position of the object may be calculated. Next, a determination may be made as to whether the second relative position of the object is within a displayable range of the second UI portion. If the second relative position of the object is not within the displayable range of the second UI portion, then a tear-drop icon indicative of the second relative position of the object may be displayed at an edge of the second UI portion.Type: Application
Filed: March 19, 2009
Publication date: September 23, 2010
Applicant: Microsoft Corporation
Inventors: V. Kevin Russ, John A. Snavely, Edwin R. Burtner, Ian M. Sands
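The on-screen/off-screen decision in this abstract reduces to a bounds check followed by clamping to the viewport edge. The rectangle representation and marker names below are illustrative assumptions, not the claimed UI.

```python
def place_marker(obj_pos, view):
    """Show the object directly if inside the displayable range of the
    current UI portion, else a tear-drop icon clamped to the nearest edge."""
    x, y = obj_pos
    left, top, width, height = view  # current UI portion as a rectangle
    if left <= x < left + width and top <= y < top + height:
        return ("object", (x, y))
    # Off-screen: clamp the position onto the edge of the visible portion,
    # so the icon indicates the direction toward the object.
    cx = min(max(x, left), left + width - 1)
    cy = min(max(y, top), top + height - 1)
    return ("tear_drop", (cx, cy))
```

As the device location changes, recomputing `place_marker` with the new viewport moves the tear-drop along the edge, or swaps it for the object itself once the object re-enters the displayable range.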
-
Publication number: 20100149090
Abstract: Aspects relate to detecting gestures that relate to a desired action, wherein the detected gestures are common across users and/or devices within a surface computing environment. Inferred intentions and goals based on context, history, affordances, and objects are employed to interpret gestures. Where there is uncertainty in intention of the gestures for a single device or across multiple devices, independent or coordinated communication of uncertainty or engagement of users through signaling and/or information gathering can occur.
Type: Application
Filed: December 15, 2008
Publication date: June 17, 2010
Applicant: Microsoft Corporation
Inventors: Meredith June Morris, Eric J. Horvitz, Andrew David Wilson, F. David Jones, Stephen E. Hodges, Kenneth P. Hinckley, David Alexander Butler, Ian M. Sands, V. Kevin Russ, Hrvoje Benko, Shawn R. LeProwse, Shahram Izadi, William Ben Kunz