Patents by Inventor Joseph H. Matthews

Joseph H. Matthews has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20130227431
    Abstract: In embodiments of private interaction hubs, a mobile device has memory storage to maintain hub data that is associated with a private interaction hub, where the hub data includes multiple types of displayable data that is editable by different types of device applications. The memory storage at the device also maintains private data that is displayable and is viewable with one of the device applications. The mobile device also includes a display device to display the multiple types of the hub data in a hub user interface of a hub application. The display device can also display the private data and a subset of the hub data that are both associated with a device application in a device application user interface.
    Type: Application
    Filed: December 22, 2012
    Publication date: August 29, 2013
    Applicant: MICROSOFT CORPORATION
    Inventors: Lavanya Vasudevan, Shawn M. Thomas, Joseph H. Matthews, III, Joseph A. Schrader, Ted Tai-Yu Chen, Raman K. Sarin
  • Patent number: 8509842
    Abstract: The present disclosure relates to a mobile phone and a method for answering such a phone automatically without user input. In one embodiment, the mobile phone detects that a call is being received. A proximity sensor is then used to detect the presence of a nearby object. For example, this allows a determination to be made whether the mobile phone is within a pocket of the user while the phone is ringing. Then a determination is made whether the proximity sensor changes states. For example, if a user removes the phone from their pocket, the proximity sensor switches from detecting something proximal to detecting that the phone is no longer in the user's pocket. Next, a determination is made whether the proximity sensor is again next to an object, such as an ear. If so, the mobile phone can be automatically answered without further user input.
    Type: Grant
    Filed: February 18, 2011
    Date of Patent: August 13, 2013
    Assignee: Microsoft Corporation
    Inventors: Raman Kumar Sarin, Monica Estela Gonzalez Veron, Kenneth Paul Hinckley, Sumit Kumar, James Kai Yu Lau, Joseph H. Matthews, III, Jae Pum Park
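The auto-answer abstract above describes a three-phase proximity sequence: near (in a pocket) while ringing, then far (taken out), then near again (raised to an ear). A minimal sketch of that state machine, with all names hypothetical and not taken from the patent:

```python
# Hypothetical sketch of the proximity-based auto-answer sequence:
# ringing in pocket -> removed from pocket -> held to ear -> answer.
NEAR, FAR = "near", "far"

def should_auto_answer(proximity_states):
    """Return True when, during an incoming call, the proximity sensor
    reads near (e.g., in a pocket), then far (phone taken out), then
    near again (raised to an ear)."""
    phase = 0  # 0: expect near, 1: expect far, 2: expect near again
    for state in proximity_states:
        if phase == 0 and state == NEAR:
            phase = 1
        elif phase == 1 and state == FAR:
            phase = 2
        elif phase == 2 and state == NEAR:
            return True
    return False
```

For example, `should_auto_answer([NEAR, NEAR, FAR, NEAR])` answers the call, while `[NEAR, FAR]` (phone taken out but never raised) does not.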
  • Patent number: 8490016
    Abstract: Described is distinguishing between input mechanisms to determine which input mechanism was used to activate a start menu. A start menu is selected that corresponds to the input mechanism that was used to activate it. Further data corresponding to start menu interaction is received, and action is taken based on the further interaction. For example, upon detecting activation of a start menu, how the start menu was activated from among activation types is used to present a first start menu/behavior for a first activation type, which may differ from a second start menu/behavior activated via a second activation type. For example, a determination may be made as to whether a start menu was invoked via a pointing device or via keyboard; when via keyboard, a search entry region may be provided, by which a user may directly enter search criteria via the keyboard.
    Type: Grant
    Filed: October 6, 2009
    Date of Patent: July 16, 2013
    Assignee: Microsoft Corporation
    Inventors: Pasquale DeMaio, Matthew R. Lerner, Charles Cummins, Song Zou, Bret P. Anderson, David A. Matthews, Isabelo Valentin de los Reyes, Joseph H. Matthews, III
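The abstract above selects a start-menu variant by activation type (keyboard adds a search entry region). An illustrative dispatch, with hypothetical menu contents not taken from the patent:

```python
# Illustrative dispatch of a start-menu variant based on how it was
# activated (pointing device vs. keyboard). Menu items are made up.
def select_start_menu(activation_type):
    """Return a start-menu configuration keyed by activation type:
    keyboard activation adds a search entry region so the user can
    type search criteria directly."""
    menu = {"items": ["Programs", "Documents", "Settings"]}
    if activation_type == "keyboard":
        menu["search_entry"] = True   # direct text-entry region
    elif activation_type == "pointer":
        menu["search_entry"] = False  # pointer-oriented menu only
    else:
        raise ValueError(f"unknown activation type: {activation_type}")
    return menu
```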
  • Publication number: 20130124207
    Abstract: A computing device (e.g., a smart phone, a tablet computer, digital camera, or other device with image capture functionality) causes an image capture device to capture one or more digital images based on audio input (e.g., a voice command) received by the computing device. For example, a user's voice (e.g., a word or phrase) is converted to audio input data by the computing device, which then compares (e.g., using an audio matching algorithm) the audio input data to an expected voice command associated with an image capture application. In another aspect, a computing device activates an image capture application and captures one or more digital images based on a received voice command. In another aspect, a computing device transitions from a low-power state to an active state, activates an image capture application, and causes a camera device to capture digital images based on a received voice command.
    Type: Application
    Filed: November 15, 2011
    Publication date: May 16, 2013
    Applicant: Microsoft Corporation
    Inventors: Raman Kumar Sarin, Joseph H. Matthews, III, James Kai Yu Lau, Monica Estela Gonzalez Veron, Jae Pum Park
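The voice-capture abstract compares recognized audio input against an expected voice command before triggering the camera. A hypothetical sketch of that flow; a real system would run an audio-matching algorithm on the signal, and a normalized text comparison stands in for it here. The camera class is a stand-in, not a real device API:

```python
# Hypothetical voice-triggered capture flow: recognized audio input is
# compared to an expected command; on a match the camera captures.
class FakeCamera:
    """Stand-in for a device camera API."""
    def __init__(self):
        self.shots = 0

    def capture(self):
        self.shots += 1
        return f"image-{self.shots:04d}"

def matches_command(recognized_text, expected="capture"):
    # Case-insensitive stand-in for fuzzy audio matching.
    return recognized_text.strip().lower() == expected

def on_audio_input(recognized_text, camera):
    """Capture one image if the input matches the expected command."""
    if matches_command(recognized_text):
        return camera.capture()
    return None
```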
  • Publication number: 20130117692
    Abstract: Exemplary methods, apparatus, and systems are disclosed for capturing, organizing, sharing, and/or displaying media. For example, using embodiments of the disclosed technology, a unified playback and browsing experience for a collection of media can be created automatically. For instance, heuristics and metadata can be used to assemble and add narratives to the media data. Furthermore, this representation of media can recompose itself dynamically as more media is added to the collection. While a collection may use a single user's content, sometimes media that is desirable to include in the collection is captured by friends and/or others at the same event. In certain embodiments, media content related to the event can be automatically collected and shared among selected groups. Further, in some embodiments, new media can be automatically incorporated into a media collection associated with the event, and the playback experience dynamically updated.
    Type: Application
    Filed: November 9, 2011
    Publication date: May 9, 2013
    Applicant: Microsoft Corporation
    Inventors: Udiyan Padmanabhan, William Messing, Joseph H. Matthews, III, Martin Shetter, Tatiana Gershanovich, Michael J. Ricker, Jannes Paul Peters
  • Publication number: 20130117365
    Abstract: Exemplary methods, apparatus, and systems are disclosed for capturing, organizing, sharing, and/or displaying media. For example, using embodiments of the disclosed technology, a unified playback and browsing experience for a collection of media can be created automatically. For instance, heuristics and metadata can be used to assemble and add narratives to the media data. Furthermore, this representation of media can recompose itself dynamically as more media is added to the collection. While a collection may use a single user's content, sometimes media that is desirable to include in the collection is captured by friends and/or others at the same event. In certain embodiments, media content related to the event can be automatically collected and shared among selected groups. Further, in some embodiments, new media can be automatically incorporated into a media collection associated with the event, and the playback experience dynamically updated.
    Type: Application
    Filed: November 9, 2011
    Publication date: May 9, 2013
    Applicant: Microsoft Corporation
    Inventors: Udiyan Padmanabhan, William Messing, Martin Shetter, Tatiana Gershanovich, Michael J. Ricker, Jannes Paul Peters, Raman Kumar Sarin, Joseph H. Matthews, III, Monica Gonzalez, Jae Pum Park
  • Publication number: 20120321271
    Abstract: Embodiments are disclosed that relate to providing commentary for video content. For example, one disclosed embodiment provides a method comprising receiving and storing an input of commentary data from each of a plurality of commentary input devices and, for each input of commentary data, receiving and storing identification metadata identifying a commentator and synchronization metadata that synchronizes the commentary data with the associated media content item. The method further comprises receiving a request from a requesting media presentation device for commentary relevant to a specified media content item and a specified user, identifying relevant commentary data based upon social network information for the specified user, and sending the relevant commentary data to the requesting client device.
    Type: Application
    Filed: June 20, 2011
    Publication date: December 20, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: James A. Baldwin, Joseph H. Matthews, III
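The commentary abstract filters stored commentary by the requesting user's social network before sending it. An illustrative filter, with hypothetical data structures not taken from the patent:

```python
# Illustrative filtering of stored commentary by the requesting user's
# social network. Record fields are assumptions.
def relevant_commentary(commentaries, content_id, friends):
    """Return commentary recorded for 'content_id' whose commentator
    is in the requesting user's social network."""
    return [
        c for c in commentaries
        if c["content_id"] == content_id and c["commentator"] in friends
    ]
```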
  • Publication number: 20120324492
    Abstract: Embodiments related to providing video items to a plurality of viewers in a video viewing environment are provided. In one embodiment, the video item is provided by determining identities for each of the viewers from data received from video viewing environment sensors, obtaining the video item based on those identities, and sending the video item for display.
    Type: Application
    Filed: June 20, 2011
    Publication date: December 20, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: David Rogers Treadwell, III, Doug Burger, Steven Bathiche, Joseph H. Matthews, III, Todd Eric Holmdahl, Jay Schiller
  • Publication number: 20120324491
    Abstract: Embodiments related to identifying and displaying portions of video content taken from longer video content are disclosed. In one example embodiment, a portion of a video item is provided by receiving, for a video item, an emotional response profile for each viewer of a plurality of viewers, each emotional response profile comprising a temporal correlation of a particular viewer's emotional response to the video item when viewed by the particular viewer. The method further comprises selecting, using the emotional response profiles, a first portion of the video item judged to be more emotionally stimulating than a second portion of the video item, and sending the first portion of the video item to another computing device in response to a request for the first portion of the video item without sending the second portion of the video item.
    Type: Application
    Filed: June 17, 2011
    Publication date: December 20, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Steven Bathiche, Doug Burger, David Rogers Treadwell, III, Joseph H. Matthews, III
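The abstract above selects the portion of a video judged most emotionally stimulating from per-viewer emotional response profiles. A sketch under the assumption that each profile is a list of (timestamp, intensity) pairs; the data layout and scoring are illustrative, not the patent's method:

```python
# Sketch: average emotional intensity across viewers per timestamp,
# then pick the start of the fixed-length window with the highest mean.
from collections import defaultdict

def most_stimulating_window(profiles, window=10):
    """profiles: list of per-viewer [(timestamp, intensity), ...].
    Returns the start time of the most stimulating window."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for profile in profiles:
        for t, intensity in profile:
            totals[t] += intensity
            counts[t] += 1
    avg = {t: totals[t] / counts[t] for t in totals}
    times = sorted(avg)
    best_start, best_score = times[0], float("-inf")
    for i, start in enumerate(times):
        span = [avg[t] for t in times[i:] if t < start + window]
        score = sum(span) / len(span)
        if score > best_score:
            best_start, best_score = start, score
    return best_start
```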
  • Publication number: 20120324494
    Abstract: Embodiments related to selecting advertisements for display to targeted viewers are disclosed. In one example embodiment, an advertisement is selected by, for each of a plurality of advertisements, aggregating a plurality of emotional response profiles from a corresponding plurality of prior viewers of the advertisement to form an aggregated emotional response profile for the advertisement, wherein each of the emotional response profiles comprises a temporal record of a prior viewer's emotional response to the advertisement. The method further includes identifying a group of potentially positively correlated viewers for the targeted viewer, filtering the aggregated emotional response profiles based on the group of potentially positively correlated viewers, selecting a particular advertisement from the plurality of advertisements based on a correlation of the filtered aggregated emotional response profiles, and sending the particular advertisement for display to the targeted viewer.
    Type: Application
    Filed: June 17, 2011
    Publication date: December 20, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Doug Burger, Todd Eric Holmdahl, Joseph H. Matthews, III, James A. Baldwin, Jay Schiller
  • Publication number: 20120324495
    Abstract: Embodiments related to distributing an identity of a video item being presented on a video presentation device within a video viewing environment to applications configured to obtain content related to the video item are disclosed. In one example embodiment, an identity is transmitted by determining an identity of the video item currently being presented on the video presentation device and responsive to a trigger, transmitting the identity of the video item to a receiving application while the video item is being presented on the video presentation device.
    Type: Application
    Filed: June 17, 2011
    Publication date: December 20, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Joseph H. Matthews, III, James A. Baldwin, David Rogers Treadwell, III
  • Publication number: 20120297284
    Abstract: Annotations can be automatically added to a media presentation during playback of the presentation without a user having to manually interact with the playback device. The playback device determines whether an annotation is to be added to the media presentation based on characteristics of voice input received at the device, such as voice input signal strength or variances in the voice input signal strength. Characteristics of video input received at the device can be used to determine whether a user is speaking to the computing device as well. The device can handle a new annotation overlapping an existing annotation by either removing the existing annotation or by shifting the existing annotation until there is no more overlap. A media presentation can comprise multiple annotation tracks.
    Type: Application
    Filed: May 18, 2011
    Publication date: November 22, 2012
    Applicant: Microsoft Corporation
    Inventors: Joseph H. Matthews, III, Udiyan Padmanabhan, Jannes Paul Peters
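The annotation abstract handles a new annotation overlapping an existing one by removing the existing annotation or shifting it until there is no overlap. A sketch of the shifting behavior, assuming annotations are (start, end) intervals in seconds; the representation is an assumption:

```python
# Sketch of overlap handling: insert a new annotation and push any
# overlapping existing annotations later until no two overlap.
def add_annotation(track, new):
    """track: list of (start, end) tuples; 'new' is inserted and
    later annotations are shifted forward past any overlap."""
    track.append(new)
    track.sort()
    for i in range(1, len(track)):
        prev_start, prev_end = track[i - 1]
        start, end = track[i]
        if start < prev_end:  # overlap: shift this annotation later
            shift = prev_end - start
            track[i] = (start + shift, end + shift)
    return track
```

For example, adding (3, 7) to a track holding (5, 10) shifts the existing annotation to (7, 12) so the two no longer overlap.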
  • Publication number: 20120214542
    Abstract: The present disclosure relates to a mobile phone and a method for answering such a phone automatically without user input. In one embodiment, the mobile phone detects that a call is being received. A proximity sensor is then used to detect the presence of a nearby object. For example, this allows a determination to be made whether the mobile phone is within a pocket of the user while the phone is ringing. Then a determination is made whether the proximity sensor changes states. For example, if a user removes the phone from their pocket, the proximity sensor switches from detecting something proximal to detecting that the phone is no longer in the user's pocket. Next, a determination is made whether the proximity sensor is again next to an object, such as an ear. If so, the mobile phone can be automatically answered without further user input.
    Type: Application
    Filed: February 18, 2011
    Publication date: August 23, 2012
    Applicant: Microsoft Corporation
    Inventors: Raman Kumar Sarin, Monica Estela Gonzalez Veron, Kenneth Paul Hinckley, Sumit Kumar, James Kai Yu Lau, Joseph H. Matthews, III, Jae Pum Park
  • Publication number: 20120169610
    Abstract: Systems and methods are provided for use with a computing device having a touch sensitive display including a touch sensor configured to detect touches of a digit of a user. The method may include detecting an initial digit down position on the display via the touch sensor, and establishing a neutral position for a virtual controller at the digit down position. The method may further include detecting a subsequent movement of the digit relative to the initial digit down position, and determining a controller input parameter based on the subsequent movement of the digit relative to the initial digit down position. The method may further include generating a controller input message indicating the determined controller input parameter.
    Type: Application
    Filed: December 29, 2010
    Publication date: July 5, 2012
    Applicant: MICROSOFT CORPORATION
    Inventors: Otto Berkes, Joseph H. Matthews, III, Avi Geiger
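The virtual-controller abstract establishes a neutral position at the digit-down point and derives a controller input parameter from subsequent movement relative to it. A minimal sketch; the message format and normalization are assumptions:

```python
# Sketch of a touch-based virtual controller: digit-down sets a
# neutral point; movement relative to it yields direction/magnitude.
import math

class VirtualController:
    def __init__(self, max_radius=100.0):
        self.neutral = None
        self.max_radius = max_radius  # movement for full deflection

    def on_digit_down(self, x, y):
        self.neutral = (x, y)  # establish neutral position

    def on_digit_move(self, x, y):
        """Return a controller input message with direction (radians)
        and magnitude normalized to [0, 1]."""
        dx, dy = x - self.neutral[0], y - self.neutral[1]
        magnitude = min(math.hypot(dx, dy) / self.max_radius, 1.0)
        return {"angle": math.atan2(dy, dx), "magnitude": magnitude}
```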
  • Patent number: 8108899
    Abstract: An interactive entertainment system enables presentation of supplemental interactive content alongside traditional broadcast video programs, such as television shows and movies. The programs are broadcast in a conventional manner. The supplemental content is supplied as part of the same program signal over the broadcast network, or separately over another distribution network. A viewer computing unit is located at the viewer's home to present the program and supplemental content to a viewer. When the viewer tunes to a particular channel, the viewer computing unit consults an electronic programming guide (EPG) to determine if the present program carried on the channel is interactive. If it is, the viewer computing unit launches a browser. The browser uses a target specification stored in the EPG to activate a target resource containing the supplemental content for enhancing the broadcast program.
    Type: Grant
    Filed: April 26, 2004
    Date of Patent: January 31, 2012
    Assignee: Microsoft Corporation
    Inventors: Daniel J. Shoff, Valerie L. Bronson, Joseph H. Matthews, III, Frank A. Lawler
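The abstract above has the viewer computing unit consult the EPG on a channel tune and, if the current program is interactive, launch a browser at the guide's target resource. A sketch of that lookup; the EPG schema and field names are hypothetical:

```python
# Sketch of the EPG lookup on a channel tune: find the program airing
# now on the channel and return its interactive target, if any.
def on_tune(epg, channel, now):
    """epg: {channel: [entry, ...]} where each entry has 'start',
    'end', optional 'interactive' flag, and 'target'. Returns the
    target resource to browse, or None if not interactive."""
    for entry in epg.get(channel, []):
        if entry["start"] <= now < entry["end"] and entry.get("interactive"):
            return entry["target"]  # resource with supplemental content
    return None
```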
  • Patent number: 7913157
    Abstract: A method and system is provided for the creation and playback of multiple independently produced and distributed media intended for synchronized playback. One embodiment of the invention overcomes variances in independently produced and distributed media that make accurate synchronization impossible today. The system utilizes both authoring and playback processes. During authoring, a relative virtual time code profile is generated based on the original source media in a defined associated media set. The system employs an extensible framework of multiple synchronization recognizers that analyze the source media to generate a relative virtual time code profile for the associated media set. During playback, the system's client can access the relative virtual time code profile to coordinate the synchronized playback of an associated media set. The system generates an absolute time code using the available associated media and the original relative virtual time code profile.
    Type: Grant
    Filed: April 17, 2007
    Date of Patent: March 22, 2011
    Assignee: Overcast Media Incorporated
    Inventors: Richard Wales Stoakley, Laura Janet Butler, Joseph H. Matthews, III
  • Patent number: 7757254
    Abstract: An interactive entertainment system enables presentation of supplemental interactive content alongside traditional broadcast video programs, such as television shows and movies. The programs are broadcast in a conventional manner. The supplemental content is supplied as part of the same program signal over the broadcast network, or separately over another distribution network. A viewer computing unit is located at the viewer's home to present the program and supplemental content to a viewer. When the viewer tunes to a particular channel, the viewer computing unit consults an electronic programming guide (EPG) to determine if the present program carried on the channel is interactive. If it is, the viewer computing unit launches a browser. The browser uses a target specification stored in the EPG to activate a target resource containing the supplemental content for enhancing the broadcast program.
    Type: Grant
    Filed: July 21, 2004
    Date of Patent: July 13, 2010
    Assignee: Microsoft Corporation
    Inventors: Daniel J. Shoff, Valerie L. Bronson, Joseph H. Matthews, III, Frank Lawler
  • Patent number: 7696953
    Abstract: A portable, interactive display device is disclosed. The device presents to a user the graphical interface of a host computer. The host is separate from the display device and sits in a fixed location. The invention allows a user to carry with him the user interface capability of the host, limited only by the capabilities of a wireless communications channel to the host. The host provides processing, storage, and access to its own peripheral devices. The display device need only provide the amount of processing necessary to communicate with the host, to run the client side of the hosting software, and to provide security functions. The host provides a docking station that accommodates the display device. When in the docking station, the display device continues to operate but communicates with the host through the docking station rather than through the wireless channel. This allows for a higher quality video connection.
    Type: Grant
    Filed: October 4, 2006
    Date of Patent: April 13, 2010
    Assignee: Microsoft Corporation
    Inventors: Joseph H. Matthews, Richard W. Stoakley
  • Publication number: 20100070922
    Abstract: Described is distinguishing between input mechanisms to determine which input mechanism was used to activate a start menu. A start menu is selected that corresponds to the input mechanism that was used to activate it. Further data corresponding to start menu interaction is received, and action is taken based on the further interaction. For example, upon detecting activation of a start menu, how the start menu was activated from among activation types is used to present a first start menu/behavior for a first activation type, which may differ from a second start menu/behavior activated via a second activation type. For example, a determination may be made as to whether a start menu was invoked via a pointing device or via keyboard; when via keyboard, a search entry region may be provided, by which a user may directly enter search criteria via the keyboard.
    Type: Application
    Filed: October 6, 2009
    Publication date: March 18, 2010
    Applicant: MICROSOFT CORPORATION
    Inventors: Pasquale DeMaio, Matthew R. Lerner, Charles Cummins, Song Zou, Bret P. Anderson, David A. Matthews, Isabelo Valentin de los Reyes, Joseph H. Matthews, III
  • Patent number: 7665109
    Abstract: An electronic programming guide (EPG) resides in a user interface unit memory and is executable on the processor of the user interface unit to organize programming information that is descriptive of the programs supplied over an interactive entertainment system. The EPG supports a user interface (UI) which visually correlates programs titles to scheduled viewing times. A hyperlink browser also resides in memory and is executable on the processor. One or more hyperlinks, which reference target resources containing interactive content related to the video programs, are integrated as part of the EPG UI. When a viewer activates a hyperlink within the EPG, the user interface unit launches the browser to activate the target resource specified by the hyperlink. The instruction might cause the visual display unit to tune to the program or channel represented by the particular label, or to initiate procedures to record the program when it begins playing, or to jump to a related target resource, such as a Web site.
    Type: Grant
    Filed: July 15, 2003
    Date of Patent: February 16, 2010
    Assignee: Microsoft Corporation
    Inventors: Joseph H. Matthews, III, Frank Lawler, James O. Robarts, David S. Byrne