Patents by Inventor Jae Pum Park

Jae Pum Park has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 9143601
    Abstract: Exemplary methods, apparatus, and systems are disclosed for capturing, organizing, sharing, and/or displaying media. For example, using embodiments of the disclosed technology, a unified playback and browsing experience for a collection of media can be created automatically. For instance, heuristics and metadata can be used to assemble and add narratives to the media data. Furthermore, this representation of media can recompose itself dynamically as more media is added to the collection. While a collection may use a single user's content, sometimes media that is desirable to include in the collection is captured by friends and/or others at the same event. In certain embodiments, media content related to the event can be automatically collected and shared among selected groups. Further, in some embodiments, new media can be automatically incorporated into a media collection associated with the event, and the playback experience dynamically updated.
    Type: Grant
    Filed: November 9, 2011
    Date of Patent: September 22, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Udiyan Padmanabhan, William Messing, Martin Shetter, Tatiana Gershanovich, Michael J. Ricker, Jannes Paul Peters, Raman Kumar Sarin, Joseph H. Matthews, III, Monica Gonzalez, Jae Pum Park
  • Patent number: 9031847
    Abstract: A computing device (e.g., a smart phone, a tablet computer, digital camera, or other device with image capture functionality) causes an image capture device to capture one or more digital images based on audio input (e.g., a voice command) received by the computing device. For example, a user's voice (e.g., a word or phrase) is converted to audio input data by the computing device, which then compares (e.g., using an audio matching algorithm) the audio input data to an expected voice command associated with an image capture application. In another aspect, a computing device activates an image capture application and captures one or more digital images based on a received voice command. In another aspect, a computing device transitions from a low-power state to an active state, activates an image capture application, and causes a camera device to capture digital images based on a received voice command.
    Type: Grant
    Filed: November 15, 2011
    Date of Patent: May 12, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: Raman Kumar Sarin, Joseph H. Matthews, III, James Kai Yu Lau, Monica Estela Gonzalez Veron, Jae Pum Park
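The matching step this abstract describes can be sketched roughly as follows. This is an illustrative sketch, not the patent's actual method: the function names, the fuzzy-matching approach, the expected command, and the threshold are all assumptions.

```python
# Illustrative sketch: transcribed audio input is compared against an
# expected voice command before the capture application takes a picture.
# EXPECTED_COMMAND, the threshold, and the matcher are all assumptions.
from difflib import SequenceMatcher

EXPECTED_COMMAND = "capture"  # hypothetical expected voice command

def matches_command(transcript: str, expected: str = EXPECTED_COMMAND,
                    threshold: float = 0.8) -> bool:
    """Fuzzy-match a recognized phrase against the expected command."""
    ratio = SequenceMatcher(None, transcript.lower().strip(), expected).ratio()
    return ratio >= threshold

def on_audio_input(transcript: str, camera) -> bool:
    """Capture an image only when the audio input matches the command."""
    if matches_command(transcript):
        camera.capture()
        return True
    return False
```

Any object with a `capture()` method can stand in for the camera device here.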
  • Patent number: 9008639
    Abstract: Techniques and tools are described for controlling an audio signal of a mobile device. For example, information indicative of acceleration of the mobile device can be received and correlation between the information indicative of acceleration and exemplar whack event data can be determined. An audio signal of the mobile device can be controlled based on the correlation.
    Type: Grant
    Filed: March 11, 2011
    Date of Patent: April 14, 2015
    Assignee: Microsoft Technology Licensing, LLC
    Inventors: James M. Lyon, James Kai Yu Lau, Raman Kumar Sarin, Jae Pum Park, Monica Estela Gonzalez Veron
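The correlation step this abstract describes can be sketched as below. The exemplar values, the threshold, and the use of a normalized (Pearson-style) correlation are illustrative assumptions, not the patent's actual signal processing.

```python
# Illustrative sketch: a window of accelerometer samples is correlated
# against exemplar "whack" event data, and the audio signal is silenced
# when the correlation is high enough. Exemplar and threshold are assumed.

def normalized_correlation(samples, exemplar):
    """Pearson-style correlation between two equal-length sequences."""
    n = len(samples)
    mean_s = sum(samples) / n
    mean_e = sum(exemplar) / n
    num = sum((s - mean_s) * (e - mean_e) for s, e in zip(samples, exemplar))
    den_s = sum((s - mean_s) ** 2 for s in samples) ** 0.5
    den_e = sum((e - mean_e) ** 2 for e in exemplar) ** 0.5
    if den_s == 0 or den_e == 0:
        return 0.0  # a flat signal cannot match the exemplar spike
    return num / (den_s * den_e)

WHACK_EXEMPLAR = [0.0, 0.1, 2.5, -2.0, 0.3, 0.0]  # hypothetical spike shape

def should_silence(accel_window, threshold=0.9):
    """Decide whether the acceleration window looks like a whack event."""
    return normalized_correlation(accel_window, WHACK_EXEMPLAR) >= threshold
```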
  • Publication number: 20140333670
    Abstract: Electronic devices, interfaces for electronic devices, and techniques for interacting with such interfaces and electronic devices are described. For instance, this disclosure describes an example electronic device that includes sensors, such as multiple front-facing cameras to detect orientation and/or location of the electronic device relative to an object and one or more inertial sensors. Users of the device may perform gestures on the device by moving the device in-air and/or by moving their head, face, or eyes relative to the device. In response to these gestures, the device may perform operations.
    Type: Application
    Filed: May 9, 2014
    Publication date: November 13, 2014
    Applicant: Amazon Technologies, Inc.
    Inventors: Bryan Todd Agnetta, Venkata Nagesh Babu Balivada, Blair Harold Beebe, Joseph Robert Buchta, Vibhunandan Gavini, Catherine Ann Hendricks, Brian Peter Kralyevich, Santhosh Kumar Paraliyil Krishnankutty, Richard Leigh Mains, Garret Martin Miller Graaf, Jae Pum Park, Sean Anthony Rooney, Marc Anthony Salazar, Nino Yuniardi
  • Publication number: 20140333530
    Abstract: Electronic devices, interfaces for electronic devices, and techniques for interacting with such interfaces and electronic devices are described. For instance, this disclosure describes an example electronic device that includes sensors, such as multiple front-facing cameras to detect orientation and/or location of the electronic device relative to an object and one or more inertial sensors. Users of the device may perform gestures on the device by moving the device in-air and/or by moving their head, face, or eyes relative to the device. In response to these gestures, the device may perform operations.
    Type: Application
    Filed: May 9, 2014
    Publication date: November 13, 2014
    Applicant: Amazon Technologies, Inc.
    Inventors: Bryan Todd Agnetta, Brian Peter Kralyevich, Jason Phillip Kriese, Jae Pum Park, Sean Anthony Rooney
  • Publication number: 20140337800
    Abstract: A computing device can utilize a recognition mode wherein an interface utilizes graphical elements, such as virtual fireflies or other such elements, to indicate objects that are recognized or identified. As objects are recognized, fireflies perform one or more specified actions to indicate recognition. A ribbon or other user-selectable icon is displayed that indicates a specific action that the device can perform with respect to the respective object. As additional objects are recognized, additional ribbons are created and older ribbons can be moved off screen and stored for subsequent retrieval or search. The fireflies disperse when the objects are no longer represented in captured sensor data, and can be animated to move towards representations of new objects as features of those objects are identified as potential object features, in order to communicate a level of recognition for a current scene or environment.
    Type: Application
    Filed: December 20, 2013
    Publication date: November 13, 2014
    Applicant: Amazon Technologies, Inc.
    Inventors: Timothy Thomas Gray, Gray Anthony Salazar, Steven Sommer, Charles Eugene Cummins, Sean Anthony Rooney, Bryan Todd Agnetta, Jae Pum Park, Richard Leigh Mains, Suzan Marashi
  • Publication number: 20140337791
    Abstract: Electronic devices, interfaces for electronic devices, and techniques for interacting with such interfaces and electronic devices are described. For instance, this disclosure describes an example electronic device that includes sensors, such as multiple front-facing cameras to detect orientation and/or location of the electronic device relative to an object and one or more inertial sensors. Users of the device may perform gestures on the device by moving the device in-air and/or by moving their head, face, or eyes relative to the device. In response to these gestures, the device may perform operations.
    Type: Application
    Filed: May 9, 2014
    Publication date: November 13, 2014
    Applicant: Amazon Technologies, Inc.
    Inventors: Bryan Todd Agnetta, Aaron Michael Donsbach, Catherine Ann Hendricks, Brian Peter Kralyevich, Richard Leigh Mains, Jae Pum Park, Sean Anthony Rooney, Marc Anthony Salazar, Jason Glenn Silvis, Nino Yuniardi
  • Publication number: 20130324194
    Abstract: The present disclosure relates to a mobile phone and a method for answering such a phone automatically without user input. In one embodiment, the mobile phone detects that a call is being received. A proximity sensor is then used to detect the presence of a nearby object. For example, this allows a determination to be made whether the mobile phone is within a pocket of the user while the phone is ringing. Then a determination is made whether the proximity sensor changes states. For example, if a user removes the phone from their pocket, the proximity sensor switches from detecting something proximal to detecting that the phone is no longer in the user's pocket. Next, a determination is made whether the proximity sensor is again next to an object, such as an ear. If so, the mobile phone can be automatically answered without further user input.
    Type: Application
    Filed: August 8, 2013
    Publication date: December 5, 2013
    Applicant: Microsoft Corporation
    Inventors: Raman Kumar Sarin, Monica Estela Gonzalez Veron, Kenneth Paul Hinckley, Sumit Kumar, James Kai Yu Lau, Joseph H. Matthews, III, Jae Pum Park
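The proximity-sensor sequence this abstract describes (covered while ringing, then uncovered, then covered again at the ear) amounts to a small state machine. The sketch below is a minimal illustration; the state names and class interface are assumptions.

```python
# Illustrative sketch: while a call rings, the proximity sensor goes from
# "near" (in a pocket) to "far" (taken out) and back to "near" (held to
# the ear), at which point the call is answered automatically.

class AutoAnswer:
    def __init__(self):
        self.state = "idle"

    def on_ring(self, proximity_near: bool):
        # Only arm the feature if the phone starts out covered (in a pocket).
        self.state = "in_pocket" if proximity_near else "idle"

    def on_proximity(self, near: bool) -> bool:
        """Feed a proximity reading; return True when the call should answer."""
        if self.state == "in_pocket" and not near:
            self.state = "removed"        # phone was taken out of the pocket
        elif self.state == "removed" and near:
            self.state = "answered"       # phone is now next to an ear
            return True
        return False
```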
  • Patent number: 8509842
    Abstract: The present disclosure relates to a mobile phone and a method for answering such a phone automatically without user input. In one embodiment, the mobile phone detects that a call is being received. A proximity sensor is then used to detect the presence of a nearby object. For example, this allows a determination to be made whether the mobile phone is within a pocket of the user while the phone is ringing. Then a determination is made whether the proximity sensor changes states. For example, if a user removes the phone from their pocket, the proximity sensor switches from detecting something proximal to detecting that the phone is no longer in the user's pocket. Next, a determination is made whether the proximity sensor is again next to an object, such as an ear. If so, the mobile phone can be automatically answered without further user input.
    Type: Grant
    Filed: February 18, 2011
    Date of Patent: August 13, 2013
    Assignee: Microsoft Corporation
    Inventors: Raman Kumar Sarin, Monica Estela Gonzalez Veron, Kenneth Paul Hinckley, Sumit Kumar, James Kai Yu Lau, Joseph H. Matthews, III, Jae Pum Park
  • Publication number: 20130124207
    Abstract: A computing device (e.g., a smart phone, a tablet computer, digital camera, or other device with image capture functionality) causes an image capture device to capture one or more digital images based on audio input (e.g., a voice command) received by the computing device. For example, a user's voice (e.g., a word or phrase) is converted to audio input data by the computing device, which then compares (e.g., using an audio matching algorithm) the audio input data to an expected voice command associated with an image capture application. In another aspect, a computing device activates an image capture application and captures one or more digital images based on a received voice command. In another aspect, a computing device transitions from a low-power state to an active state, activates an image capture application, and causes a camera device to capture digital images based on a received voice command.
    Type: Application
    Filed: November 15, 2011
    Publication date: May 16, 2013
    Applicant: Microsoft Corporation
    Inventors: Raman Kumar Sarin, Joseph H. Matthews, III, James Kai Yu Lau, Monica Estela Gonzalez Veron, Jae Pum Park
  • Publication number: 20130117365
    Abstract: Exemplary methods, apparatus, and systems are disclosed for capturing, organizing, sharing, and/or displaying media. For example, using embodiments of the disclosed technology, a unified playback and browsing experience for a collection of media can be created automatically. For instance, heuristics and metadata can be used to assemble and add narratives to the media data. Furthermore, this representation of media can recompose itself dynamically as more media is added to the collection. While a collection may use a single user's content, sometimes media that is desirable to include in the collection is captured by friends and/or others at the same event. In certain embodiments, media content related to the event can be automatically collected and shared among selected groups. Further, in some embodiments, new media can be automatically incorporated into a media collection associated with the event, and the playback experience dynamically updated.
    Type: Application
    Filed: November 9, 2011
    Publication date: May 9, 2013
    Applicant: Microsoft Corporation
    Inventors: Udiyan Padmanabhan, William Messing, Martin Shetter, Tatiana Gershanovich, Michael J. Ricker, Jannes Paul Peters, Raman Kumar Sarin, Joseph H. Matthews, III, Monica Gonzalez, Jae Pum Park
  • Patent number: 8375007
    Abstract: A method to expose status information is provided. The status information is associated with metadata extracted from multimedia files and stored in a metadata database. The metadata extracted from the multimedia files is stored in a read queue to allow a background thread to process the metadata and populate the metadata database. Additionally, the metadata database may be updated to include user-defined metadata, which is written back to the multimedia files. The user-defined metadata is included in a write queue and is written to the multimedia files associated with the user-defined metadata. The statuses of the read and write queues are exposed to a user through a graphical user interface. The status may include the list of multimedia files included in the read and write queues, the priority of each multimedia file, and the number of remaining multimedia files.
    Type: Grant
    Filed: June 21, 2011
    Date of Patent: February 12, 2013
    Assignee: Microsoft Corporation
    Inventors: Alexander S. Brodie, Benjamin L. Perry, David R. Parlin, Jae Pum Park, Michael J. Gilmore, Scott E. Dart
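The queue arrangement this abstract describes can be sketched as below: extracted metadata enters a read queue that a background thread drains into the database, user-defined metadata enters a write queue, and pending counts are exposed for a status UI. The class, its method names, and the dict-based database are illustrative assumptions.

```python
# Illustrative sketch of the read/write queue arrangement. The read queue
# would normally be drained by a background thread; here it is drained by
# a direct call so the flow is easy to follow.
import queue

class MetadataStore:
    def __init__(self):
        self.read_queue = queue.Queue()    # metadata extracted from files
        self.write_queue = queue.Queue()   # user-defined metadata to write back
        self.database = {}                 # the metadata database

    def enqueue_extracted(self, filename, metadata):
        self.read_queue.put((filename, metadata))

    def process_read_queue(self):
        """Drain the read queue into the database (the background thread's job)."""
        while not self.read_queue.empty():
            filename, metadata = self.read_queue.get()
            self.database.setdefault(filename, {}).update(metadata)

    def status(self):
        """Expose queue status, as a graphical user interface might display it."""
        return {"pending_reads": self.read_queue.qsize(),
                "pending_writes": self.write_queue.qsize()}
```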
  • Publication number: 20120231838
    Abstract: Techniques and tools are described for controlling an audio signal of a mobile device. For example, information indicative of acceleration of the mobile device can be received and correlation between the information indicative of acceleration and exemplar whack event data can be determined. An audio signal of the mobile device can be controlled based on the correlation.
    Type: Application
    Filed: March 11, 2011
    Publication date: September 13, 2012
    Applicant: Microsoft Corporation
    Inventors: James M. Lyon, James Kai Yu Lau, Raman Kumar Sarin, Jae Pum Park, Monica Estela Gonzalez Veron
  • Publication number: 20120214542
    Abstract: The present disclosure relates to a mobile phone and a method for answering such a phone automatically without user input. In one embodiment, the mobile phone detects that a call is being received. A proximity sensor is then used to detect the presence of a nearby object. For example, this allows a determination to be made whether the mobile phone is within a pocket of the user while the phone is ringing. Then a determination is made whether the proximity sensor changes states. For example, if a user removes the phone from their pocket, the proximity sensor switches from detecting something proximal to detecting that the phone is no longer in the user's pocket. Next, a determination is made whether the proximity sensor is again next to an object, such as an ear. If so, the mobile phone can be automatically answered without further user input.
    Type: Application
    Filed: February 18, 2011
    Publication date: August 23, 2012
    Applicant: Microsoft Corporation
    Inventors: Raman Kumar Sarin, Monica Estela Gonzalez Veron, Kenneth Paul Hinckley, Sumit Kumar, James Kai Yu Lau, Joseph H. Matthews, III, Jae Pum Park
  • Publication number: 20120158290
    Abstract: A navigation user interface displays a route to be navigated over a road view map. If a user selects a particular segment of the displayed route, text-based directions associated with the particular route segment are displayed, with a selectable area to return to the map view. Additionally, as the road view map is zoomed in, when the zoom level reaches a threshold zoom level, the navigation user interface automatically transitions to displaying a satellite view map overlaid with at least a portion of the route to be navigated.
    Type: Application
    Filed: December 17, 2010
    Publication date: June 21, 2012
    Applicant: Microsoft Corporation
    Inventors: Aarti Bharathan, Liang Chen, Jae Pum Park
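The zoom-threshold behavior this abstract describes reduces to a simple comparison: below the threshold the route is drawn over a road map, and at or above it the view switches to satellite imagery. The threshold value and names below are illustrative assumptions.

```python
# Illustrative sketch of the automatic road-to-satellite transition.
SATELLITE_ZOOM_THRESHOLD = 17  # hypothetical threshold zoom level

def map_style_for_zoom(zoom_level: int) -> str:
    """Choose the base map to render the route over at a given zoom level."""
    return "satellite" if zoom_level >= SATELLITE_ZOOM_THRESHOLD else "road"
```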
  • Publication number: 20110252069
    Abstract: A method to expose status information is provided. The status information is associated with metadata extracted from multimedia files and stored in a metadata database. The metadata extracted from the multimedia files is stored in a read queue to allow a background thread to process the metadata and populate the metadata database. Additionally, the metadata database may be updated to include user-defined metadata, which is written back to the multimedia files. The user-defined metadata is included in a write queue and is written to the multimedia files associated with the user-defined metadata. The statuses of the read and write queues are exposed to a user through a graphical user interface. The status may include the list of multimedia files included in the read and write queues, the priority of each multimedia file, and the number of remaining multimedia files.
    Type: Application
    Filed: June 21, 2011
    Publication date: October 13, 2011
    Applicant: Microsoft Corporation
    Inventors: Alexander S. Brodie, Benjamin L. Perry, David R. Parlin, Jae Pum Park, Michael J. Gilmore, Scott E. Dart
  • Patent number: 7987160
    Abstract: A method to expose status information is provided. The status information is associated with metadata extracted from multimedia files and stored in a metadata database. The metadata extracted from the multimedia files is stored in a read queue to allow a background thread to process the metadata and populate the metadata database. Additionally, the metadata database may be updated to include user-defined metadata, which is written back to the multimedia files. The user-defined metadata is included in a write queue and is written to the multimedia files associated with the user-defined metadata. The statuses of the read and write queues are exposed to a user through a graphical user interface. The status may include the list of multimedia files included in the read and write queues, the priority of each multimedia file, and the number of remaining multimedia files.
    Type: Grant
    Filed: January 30, 2006
    Date of Patent: July 26, 2011
    Assignee: Microsoft Corporation
    Inventors: Alexander S. Brodie, Benjamin L. Perry, David R. Parlin, Jae Pum Park, Michael J. Gilmore, Scott E. Dart
  • Patent number: 7979790
    Abstract: A method and system to manage rendering of multimedia content are provided. A theme specifies a collection of layouts defining multimedia content placement. The multimedia content is processed to extract one or more characteristics. The layouts are selected and populated with multimedia content based on the one or more characteristics associated with the multimedia content. The multimedia content is rendered by transitioning through the selected layouts.
    Type: Grant
    Filed: February 28, 2006
    Date of Patent: July 12, 2011
    Assignee: Microsoft Corporation
    Inventors: Benjamin Nicholas Truelove, Chunkit Chan, Jae Pum Park, Shabbir Abbas Shahpurwala, William Mountain Lewis
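The layout-selection step this abstract describes can be sketched as below: each layout in a theme declares how much content it can place, and a layout is chosen and populated based on a characteristic of the content. Using item count as the sole characteristic, and the theme data itself, are illustrative assumptions.

```python
# Illustrative sketch: a theme is a collection of layouts, and the
# smallest layout that fits the content batch is selected and populated.
THEME_LAYOUTS = [
    {"name": "single", "capacity": 1},
    {"name": "pair", "capacity": 2},
    {"name": "grid", "capacity": 4},
]  # hypothetical theme definition

def select_layout(items):
    """Pick the smallest layout that fits the items, else the largest."""
    for layout in THEME_LAYOUTS:
        if len(items) <= layout["capacity"]:
            return layout
    return THEME_LAYOUTS[-1]

def populate(layout, items):
    """Fill the chosen layout's slots with multimedia content items."""
    return {"layout": layout["name"],
            "slots": items[:layout["capacity"]]}
```

Rendering would then transition through a sequence of such populated layouts.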
  • Patent number: 7451407
    Abstract: A system, a method and computer-readable media for presenting groups of items to a user. Items are divided into groups, and a group header is associated with each group. The items and group headers are presented on a screen display, and the displayed content is subject to navigational requests from a user. When one of the group headers is located near an edge of the screen display, its position is fixed to prevent the header from being removed from the screen display.
    Type: Grant
    Filed: November 30, 2005
    Date of Patent: November 11, 2008
    Assignee: Microsoft Corporation
    Inventors: Alexander Brodie, Benjamin Truelove, David Parlin, Jae Pum Park, Scott Dart
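The header-pinning behavior this abstract describes can be sketched as a position clamp: as the list scrolls, the current group's header is fixed at the viewport edge instead of scrolling off, until the next group's header pushes it out. Coordinates, names, and the push-out detail are illustrative assumptions.

```python
# Illustrative sketch: clamp a group header's on-screen y position so it
# stays pinned at the top edge of the screen display while its group is
# visible, and let the next group's header push it off as it arrives.

def header_screen_position(header_y, scroll_offset,
                           next_header_y=None, header_height=20):
    """Return the on-screen y for a group header, pinned at the top edge."""
    y = header_y - scroll_offset
    pinned = max(y, 0)  # fix the position instead of scrolling off-screen
    if next_header_y is not None:
        # The incoming group's header displaces the pinned one.
        pinned = min(pinned, next_header_y - scroll_offset - header_height)
    return pinned
```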
  • Patent number: 7441182
    Abstract: Systems and methods for digital negatives are described. In one aspect, a digital negative is created on a computing device from a digital image. The digital image is linked to the digital negative. In response to a save operation associated with the digital image, a new digital image is generated and bi-directionally connected to the digital negative. In response to a revert operation associated with the new digital image, contents of the new digital image are replaced with contents of the digital negative.
    Type: Grant
    Filed: October 23, 2003
    Date of Patent: October 21, 2008
    Assignee: Microsoft Corporation
    Inventors: Craig Beilinson, Benjamin L. Perry, Christopher A. Evans, Clint Jorgenson, Jae Pum Park, Linda Hong, Pritvinath Obla, Anthony T. Chor, Wei Feng, Alexander Castro
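The digital-negative model this abstract describes can be sketched as below: the original contents are preserved as a linked negative, a save stores new contents while keeping that link, and a revert replaces the current contents with the negative's. The in-memory representation is an illustrative assumption.

```python
# Illustrative sketch of the digital-negative link: the first save
# preserves the original contents as the negative, and revert restores
# the image's contents from it.

class DigitalImage:
    def __init__(self, contents):
        self.contents = contents
        self.negative = None  # link to the preserved original contents

    def save(self, new_contents):
        """Preserve the negative on first save, then store the edit."""
        if self.negative is None:
            self.negative = self.contents  # create the digital negative once
        self.contents = new_contents

    def revert(self):
        """Replace current contents with the negative's contents."""
        if self.negative is not None:
            self.contents = self.negative
```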