Patents by Inventor Mikkel Crone Köser

Mikkel Crone Köser has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 10070170
    Abstract: A content annotation tool is disclosed. In a configuration, a portion of a movie may be obtained from a database. Entities, such as an actor, background music, text, etc., may be automatically identified in the movie. A user, such as a content producer, may associate and/or provide supplemental content for an identified entity to the database. A selection of one or more automatically identified entities may be received. A database entry may be generated that links the identified entity with the supplemental content. The selected one or more automatically identified entities and/or the supplemental content associated therewith may be presented to an end user.
    Type: Grant
    Filed: September 2, 2016
    Date of Patent: September 4, 2018
    Assignee: Google LLC
    Inventors: Henry Will Schneiderman, Michael Andrew Sipe, Marco Paglia, Mikkel Crone Köser
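
For illustration, here is a minimal sketch of the entity-to-supplemental-content linkage described in the abstract above; the Python classes, field names, and example data are assumptions for illustration, not the patented implementation.

```python
# Illustrative sketch (assumed data model): link automatically identified
# entities in a movie segment to producer-supplied supplemental content.
from dataclasses import dataclass, field

@dataclass
class Entity:
    kind: str          # e.g. "actor", "background_music", "text"
    name: str
    start_s: float     # where the entity appears in the movie, in seconds
    end_s: float

@dataclass
class AnnotationStore:
    # Maps an (entity kind, entity name) key to a list of supplemental content items.
    links: dict = field(default_factory=dict)

    def annotate(self, entity: Entity, supplemental: str) -> None:
        """Generate a database-style entry linking the entity with supplemental content."""
        self.links.setdefault((entity.kind, entity.name), []).append(supplemental)

    def supplemental_for(self, entity: Entity) -> list:
        """Return the supplemental content to present to an end user for this entity."""
        return self.links.get((entity.kind, entity.name), [])

# A content producer selects an automatically identified actor and attaches
# a short biography as supplemental content.
store = AnnotationStore()
actor = Entity(kind="actor", name="Jane Doe", start_s=120.0, end_s=185.5)
store.annotate(actor, "Short biography and filmography for Jane Doe")
print(store.supplemental_for(actor))
```
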
  • Patent number: 10038844
    Abstract: The disclosed technology includes switching between a normal or standard-lens UI and a panoramic or wide-angle photography UI responsive to a zoom gesture. In one implementation, a user gesture corresponding to a “zoom-out” command, when received at a mobile computing device associated with a minimum zoom state, may trigger a switch from a standard lens photo capture UI to a wide-angle photography UI. In another implementation, a user gesture corresponding to a “zoom-in” command, when received at a mobile computing device associated with a nominal wide-angle state, may trigger a switch from a wide-angle photography UI to a standard lens photo capture UI.
    Type: Grant
    Filed: December 15, 2016
    Date of Patent: July 31, 2018
    Assignee: Google LLC
    Inventors: Nirav Bipinchandra Mehta, Mikkel Crone Köser, David Singleton, Robert William Hamilton, Henry John Holland, Tony Ferreira, Thomas Weedon Hume
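
A minimal sketch of the mode-switching rule described in the entry above, assuming a simple two-state model; the state names and function signature are illustrative only, not the patented implementation.

```python
# Assumed two-state model: zoom-out at minimum zoom enters the wide-angle UI,
# zoom-in at the nominal wide-angle state returns to the standard-lens UI.
STANDARD_UI = "standard_lens"
WIDE_ANGLE_UI = "wide_angle"

def next_ui(current_ui: str, gesture: str, at_min_zoom: bool, at_nominal_wide: bool) -> str:
    """Return the UI to show after a zoom gesture, per the rule sketched above."""
    if current_ui == STANDARD_UI and gesture == "zoom_out" and at_min_zoom:
        return WIDE_ANGLE_UI          # switch to wide-angle photo capture
    if current_ui == WIDE_ANGLE_UI and gesture == "zoom_in" and at_nominal_wide:
        return STANDARD_UI            # switch back to standard lens capture
    return current_ui                 # otherwise the gesture is handled as an ordinary zoom

# Pinching out while already at minimum zoom switches to the wide-angle UI.
assert next_ui(STANDARD_UI, "zoom_out", at_min_zoom=True, at_nominal_wide=False) == WIDE_ANGLE_UI
```
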
  • Patent number: 9990694
    Abstract: Certain embodiments of this disclosure include methods and devices for outputting a zoom sequence. According to one embodiment, a method is provided. The method may include: (i) determining first location information from first metadata associated with one or more images, wherein the first location information identifies a first location; and (ii) outputting, for display, a first zoom sequence based on the first location information, wherein the first zoom sequence may include a first plurality of mapped images of the first location from a first plurality of zoom levels and the plurality of mapped images are sequentially ordered by a magnitude of the zoom level.
    Type: Grant
    Filed: November 28, 2016
    Date of Patent: June 5, 2018
    Assignee: Google LLC
    Inventors: Thomas Weedon Hume, Mikkel Crone Köser, Tony Ferreira, Jeremy Lyon, Waldemar Ariel Baraldi, Bryan Mawhinney, Christopher James Smith, Lenka Trochtova, Andrei Popescu, David Ingram, Flavio Lerda, Michael Ananin, Vytautas Vaitukaitis, Marc Paulina
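
The zoom-sequence idea in the entry above can be sketched roughly as follows; the metadata layout and function names are assumptions for illustration, not the disclosed method.

```python
# Assumed sketch: read a location from image metadata, then order mapped
# images of that location by the magnitude of their zoom level.
def location_from_metadata(metadata: dict):
    """Determine location information (here a lat/lng pair) from image metadata."""
    return metadata.get("lat"), metadata.get("lng")

def zoom_sequence(mapped_images: list) -> list:
    """Return mapped images sequentially ordered by zoom level."""
    return sorted(mapped_images, key=lambda img: img["zoom_level"])

photo_metadata = {"lat": 55.676, "lng": 12.568}     # e.g. EXIF-style GPS tags
lat, lng = location_from_metadata(photo_metadata)
images = [
    {"zoom_level": 12, "tile": "city"},
    {"zoom_level": 3, "tile": "continent"},
    {"zoom_level": 18, "tile": "street"},
]
for img in zoom_sequence(images):
    print(f"display zoom {img['zoom_level']}: {img['tile']} view of ({lat}, {lng})")
```
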
  • Patent number: 9756260
    Abstract: A method and system are disclosed for simulating different types of camera lenses on a device by guiding a user through a set of images to be captured in connection with one or more desired lens effects. In one aspect, a wide-angle lens may be simulated by capturing a plurality of images at a particular location over a set of camera orientations that are determined based on the selection of the wide-angle lens. The mobile device may provide prompts to the user indicating the camera orientations for which images should be captured in order to generate the simulated camera lens effect.
    Type: Grant
    Filed: September 9, 2016
    Date of Patent: September 5, 2017
    Assignee: Google Inc.
    Inventors: Scott Ettinger, David Lee, Evan Rapoport, Jacob Mintz, Bryan Feldman, Mikkel Crone Köser, Daniel Joseph Filip
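
A rough sketch, under assumed geometry, of how capture orientations might be derived from a selected wide-angle effect; the spacing rule, names, and numbers are illustrative assumptions, not the disclosed method.

```python
import math

# Assumed rule: space the prompted shots evenly so frames from the device's
# native field of view cover the wider simulated lens with some overlap.
def capture_orientations(native_fov_deg, target_fov_deg, overlap_deg=10.0):
    """Yaw angles (degrees, relative to center) at which to prompt the user."""
    step = native_fov_deg - overlap_deg              # maximum yaw advance between shots
    span = target_fov_deg - native_fov_deg           # extra angle to cover beyond one frame
    if span <= 0:
        return [0.0]                                 # one centered shot is enough
    shots = math.ceil(span / step) + 1               # number of prompted captures
    spacing = span / (shots - 1)                     # even spacing across the span
    start = -span / 2
    return [round(start + i * spacing, 1) for i in range(shots)]

# Simulating a ~120 degree wide-angle lens with a ~66 degree phone camera:
print(capture_orientations(66.0, 120.0))   # -> [-27.0, 27.0]
```
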
  • Patent number: 9699488
    Abstract: Systems and techniques are provided for smart snap to interesting points in media content. A position control input may be received from a user to a control interface for a content player being used with a content item. A smart snap point and an associated smart snap area may be determined for the content item based on the received position control input. The smart snap point and the associated smart snap area may be stored. A second position control input to the control interface for the content player being used with the content item may be received. The second position control input may be determined to move a position indicator into the associated smart snap area for the smart snap point. Use of the content item may be resumed with the content player from the smart snap point.
    Type: Grant
    Filed: June 2, 2014
    Date of Patent: July 4, 2017
    Assignee: Google Inc.
    Inventor: Mikkel Crone Köser
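
One way to picture the snap behavior described in the entry above, with an assumed data model; not the claimed implementation.

```python
# Assumed sketch: a scrub position that lands inside a stored snap area
# resolves to the associated smart snap point.
from dataclasses import dataclass

@dataclass
class SmartSnap:
    snap_point_s: float   # interesting point in the content, in seconds
    area_start_s: float   # start of the associated snap area
    area_end_s: float     # end of the associated snap area

def resolve_position(requested_s: float, snaps: list) -> float:
    """Return the playback position to resume from for a scrub/seek input."""
    for snap in snaps:
        if snap.area_start_s <= requested_s <= snap.area_end_s:
            return snap.snap_point_s      # snap to the interesting point
    return requested_s                    # otherwise honor the raw input

# A snap point at 10:00 with a +/- 20 second snap area around it.
snaps = [SmartSnap(snap_point_s=600.0, area_start_s=580.0, area_end_s=620.0)]
print(resolve_position(612.3, snaps))     # -> 600.0, playback resumes at the snap point
```
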
  • Publication number: 20170099437
    Abstract: The disclosed technology includes switching between a normal or standard-lens UI and a panoramic or wide-angle photography UI responsive to a zoom gesture. In one implementation, a user gesture corresponding to a “zoom-out” command, when received at a mobile computing device associated with a minimum zoom state, may trigger a switch from a standard lens photo capture UI to a wide-angle photography UI. In another implementation, a user gesture corresponding to a “zoom-in” command, when received at a mobile computing device associated with a nominal wide-angle state, may trigger a switch from a wide-angle photography UI to a standard lens photo capture UI.
    Type: Application
    Filed: December 15, 2016
    Publication date: April 6, 2017
    Inventors: Nirav Bipinchandra Mehta, Mikkel Crone Köser, David Singleton, Robert William Hamilton, Henry John Holland, Tony Ferreira, Thomas Weedon Hume
  • Publication number: 20170076427
    Abstract: Certain embodiments of this disclosure include methods and devices for outputting a zoom sequence. According to one embodiment, a method is provided. The method may include: (i) determining first location information from first metadata associated with one or more images, wherein the first location information identifies a first location; and (ii) outputting, for display, a first zoom sequence based on the first location information, wherein the first zoom sequence may include a first plurality of mapped images of the first location from a first plurality of zoom levels and the plurality of mapped images are sequentially ordered by a magnitude of the zoom level.
    Type: Application
    Filed: November 28, 2016
    Publication date: March 16, 2017
    Inventors: Thomas Weedon Hume, Mikkel Crone Köser, Tony Ferreira, Jeremy Lyon, Waldemar Ariel Baraldi, Bryan Mawhinney, Christopher James Smith, Lenka Trochtova, Andrei Popescu, David Ingram, Flavio Lerda, Michael Ananin, Vytautas Vaitukaitis, Marc Paulina
  • Patent number: 9589321
    Abstract: Techniques are disclosed for animating a view of a composite image based on metadata related to the capture of the underlying source images. According to certain implementations, the metadata may include timing or sensor data collected or generated during capture of the component source images. For example, the timing data may indicate an order or sequence in which the source images were captured. Accordingly, the corresponding regions of the composite panoramic image may be panned to in sequence, for example, using the Ken Burns Effect. In another example, sensor data from gyroscopes or accelerometers may be used to simulate the movement of the image capture device used to generate the source images. In another implementation, the source images may be associated with varying focal lengths or zoom levels. Accordingly, certain implementations may vary a level of zoom, based on the metadata, while panning between source photos.
    Type: Grant
    Filed: April 24, 2014
    Date of Patent: March 7, 2017
    Assignee: Google Inc.
    Inventors: Thomas Weedon Hume, Mikkel Crone Köser
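
A hedged sketch of ordering pan/zoom keyframes by capture metadata as the entry above describes; the dictionary fields and values are assumptions for illustration, not the patented technique.

```python
# Assumed data model: each source region of the composite panorama carries
# capture metadata; ordering by timestamp yields Ken Burns style keyframes.
def keyframes(source_images: list) -> list:
    """source_images: dicts with 'timestamp', 'center_x', 'center_y', 'zoom'
    describing each region of the composite image."""
    ordered = sorted(source_images, key=lambda img: img["timestamp"])
    return [{"pan_to": (img["center_x"], img["center_y"]), "zoom": img["zoom"]}
            for img in ordered]

shots = [
    {"timestamp": 2.0, "center_x": 1800, "center_y": 600, "zoom": 1.4},
    {"timestamp": 0.0, "center_x": 200,  "center_y": 600, "zoom": 1.0},
    {"timestamp": 1.0, "center_x": 1000, "center_y": 600, "zoom": 1.2},
]
for kf in keyframes(shots):
    print(kf)   # regions are visited in the order the source photos were captured
```
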
  • Patent number: 9538078
    Abstract: The disclosed technology includes switching between a normal or standard-lens UI and a panoramic or wide-angle photography UI responsive to a zoom gesture. In one implementation, a user gesture corresponding to a “zoom-out” command, when received at a mobile computing device associated with a minimum zoom state, may trigger a switch from a standard lens photo capture UI to a wide-angle photography UI. In another implementation, a user gesture corresponding to a “zoom-in” command, when received at a mobile computing device associated with a nominal wide-angle state, may trigger a switch from a wide-angle photography UI to a standard lens photo capture UI.
    Type: Grant
    Filed: March 2, 2014
    Date of Patent: January 3, 2017
    Assignee: Google Inc.
    Inventors: Nirav Bipinchandra Mehta, Mikkel Crone Köser, David Singleton, Robert William Hamilton, Henry John Holland, Tony Ferreira, Thomas Weedon Hume
  • Publication number: 20160373796
    Abstract: A content annotation tool is disclosed. In a configuration, a portion of a movie may be obtained from a database. Entities, such as an actor, background music, text, etc., may be automatically identified in the movie. A user, such as a content producer, may associate and/or provide supplemental content for an identified entity to the database. A selection of one or more automatically identified entities may be received. A database entry may be generated that links the identified entity with the supplemental content. The selected one or more automatically identified entities and/or the supplemental content associated therewith may be presented to an end user.
    Type: Application
    Filed: September 2, 2016
    Publication date: December 22, 2016
    Inventors: Henry Will Schneiderman, Michael Andrew Sipe, Marco Paglia, Mikkel Crone Köser
  • Patent number: 9467620
    Abstract: A method and system are disclosed for simulating different types of camera lenses on a device by guiding a user through a set of images to be captured in connection with one or more desired lens effects. In one aspect, a wide-angle lens may be simulated by capturing a plurality of images at a particular location over a set of camera orientations that are determined based on the selection of the wide-angle lens. The mobile device may provide prompts to the user indicating the camera orientations for which images should be captured in order to generate the simulated camera lens effect.
    Type: Grant
    Filed: January 14, 2015
    Date of Patent: October 11, 2016
    Assignee: Google Inc.
    Inventors: Scott Ettinger, David Lee, Evan Rapoport, Jake Mintz, Bryan Feldman, Mikkel Crone Köser, Daniel Joseph Filip
  • Patent number: 9438947
    Abstract: A content annotation tool is disclosed. In a configuration, a portion of a movie may be obtained from a database. Entities, such as an actor, background music, text, etc. may be automatically identified in the movie. A user, such as a content producer, may associate and/or provide supplemental content for an identified entity to the database. A selection of one or more automatically identified entities may be received. A database entry may be generated that links the identified entity with the supplemental content. The selected automatically identified one or more entities and/or supplemental content associated therewith may be presented to an end user.
    Type: Grant
    Filed: May 1, 2013
    Date of Patent: September 6, 2016
    Assignee: Google Inc.
    Inventors: Henry Will Schneiderman, Michael Andrew Sipe, Marco Paglia, Mikkel Crone Köser
  • Publication number: 20150350735
    Abstract: Systems and techniques are provided for smart snap to interesting points in media content. A position control input may be received from a user to a control interface for a content player being used with a content item. A smart snap point and an associated smart snap area may be determined for the content item based on the received position control input. The smart snap point and the associated smart snap area may be stored. A second position control input to the control interface for the content player being used with the content item may be received. The second position control input may be determined to move a position indicator into the associated smart snap area for the smart snap point. Use of the content item may be resumed with the content player from the smart snap point.
    Type: Application
    Filed: June 2, 2014
    Publication date: December 3, 2015
    Applicant: Google Inc.
    Inventor: Mikkel Crone Köser
  • Publication number: 20150310583
    Abstract: Techniques are disclosed for animating a view of a composite image based on metadata related to the capture of the underlying source images. According to certain implementations, the metadata may include timing or sensor data collected or generated during capture of the component source images. For example, the timing data may indicate an order or sequence in which the source images were captured. Accordingly, the corresponding regions of the composite panoramic image may be panned to in sequence, for example, using the Ken Burns Effect. In another example, sensor data from gyroscopes or accelerometers may be used to simulate the movement of the image capture device used to generate the source images. In another implementation, the source images may be associated with varying focal lengths or zoom levels. Accordingly, certain implementations may vary a level of zoom, based on the metadata, while panning between source photos.
    Type: Application
    Filed: April 24, 2014
    Publication date: October 29, 2015
    Applicant: Google Inc.
    Inventors: Thomas Weedon Hume, Mikkel Crone Köser
  • Publication number: 20150249785
    Abstract: The disclosed technology includes switching between a normal or standard-lens UI and a panoramic or wide-angle photography UI responsive to a zoom gesture. In one implementation, a user gesture corresponding to a “zoom-out” command, when received at a mobile computing device associated with a minimum zoom state, may trigger a switch from a standard lens photo capture UI to a wide-angle photography UI. In another implementation, a user gesture corresponding to a “zoom-in” command, when received at a mobile computing device associated with a nominal wide-angle state, may trigger a switch from a wide-angle photography UI to a standard lens photo capture UI.
    Type: Application
    Filed: March 2, 2014
    Publication date: September 3, 2015
    Applicant: Google Inc.
    Inventors: Nirav Bipinchandra Mehta, Mikkel Crone Köser, David Singleton, Robert William Hamilton, Henry John Holland, Tony Ferreira, Thomas Weedon Hume
  • Publication number: 20150244930
    Abstract: A method and system are disclosed for simulating different types of camera lenses on a device by guiding a user through a set of images to be captured in connection with one or more desired lens effects. In one aspect, a wide-angle lens may be simulated by capturing a plurality of images at a particular location over a set of camera orientations that are determined based on the selection of the wide-angle lens. The mobile device may provide prompts to the user indicating the camera orientations for which images should be captured in order to generate the simulated camera lens effect.
    Type: Application
    Filed: January 14, 2015
    Publication date: August 27, 2015
    Inventors: Scott Ettinger, David Lee, Evan Rapoport, Jake Mintz, Bryan Feldman, Mikkel Crone Köser, Daniel Joseph Filip
  • Publication number: 20140188997
    Abstract: The present disclosure includes systems and methods for creating and sharing inline commentary relating to media within an online community, for example, a social network. The inline commentary can be one or more types of media, for example, text, audio, image, video, URL link, etc. In some implementations, the systems and methods either receive media that is live or pre-recorded, permit viewing by users and receive selective added commentary by users inline. The systems and methods are configured to send one or more notifications regarding the commentary. In some implementations, the systems and methods are configured to receive responses by other users to the initial commentary provided by a particular user.
    Type: Application
    Filed: December 31, 2012
    Publication date: July 3, 2014
    Inventors: Henry Will Schneiderman, Michael Andrew Sipe, Steven James Ross, Brian Ronald Colonna, Danielle Marie Millett, Uriel Gerardo Rodriguez, Michael Christian Nechyba, Mikkel Crone Köser, Ankit Jain
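
A minimal sketch, under assumed names, of attaching inline commentary to a point in a media item and deciding whom to notify; not the disclosed system.

```python
# Assumed data model: a comment anchored to an offset within a media item,
# with followers of the author receiving a notification.
from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    media_id: str
    offset_s: float          # where in the media item the commentary is anchored
    body: str                # text here, but could equally be audio, image, video, or a URL
    replies: list = field(default_factory=list)

def post_comment(comments: list, followers: dict, comment: Comment) -> list:
    """Store the inline comment and return the users who should be notified."""
    comments.append(comment)
    return followers.get(comment.author, [])

comments, followers = [], {"alice": ["bob", "carol"]}
to_notify = post_comment(comments, followers,
                         Comment("alice", "video-42", 95.0, "Great scene here"))
print(to_notify)   # -> ['bob', 'carol'] would receive a notification about the commentary
```
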