Patents by Inventor Mikkel Crone Köser
Mikkel Crone Köser has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10070170
Abstract: A content annotation tool is disclosed. In a configuration, a portion of a movie may be obtained from a database. Entities, such as an actor, background music, or text, may be automatically identified in the movie. A user, such as a content producer, may associate and/or provide supplemental content for an identified entity to the database. A selection of one or more automatically identified entities may be received. A database entry may be generated that links the identified entity with the supplemental content. The selected entities and/or the supplemental content associated with them may be presented to an end user.
Type: Grant
Filed: September 2, 2016
Date of Patent: September 4, 2018
Assignee: Google LLC
Inventors: Henry Will Schneiderman, Michael Andrew Sipe, Marco Paglia, Mikkel Crone Köser
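As a rough, unofficial illustration of the flow this abstract describes, the Kotlin sketch below links automatically identified entities to supplemental content through stored entries and looks them up when an end user selects an entity. Every name in it (Entity, Annotation, AnnotationStore) is a hypothetical stand-in, not the patented implementation.

```kotlin
// Hypothetical sketch in plain Kotlin; Entity, Annotation and AnnotationStore are
// illustrative stand-ins, not the patented implementation.

data class Entity(val id: String, val kind: String, val label: String)           // e.g. actor, music, on-screen text
data class Annotation(val entityId: String, val supplementalContent: String)     // database entry linking the two

class AnnotationStore {
    private val annotations = mutableListOf<Annotation>()

    // Generate a database entry that links an identified entity with supplemental content.
    fun link(entity: Entity, supplementalContent: String): Annotation =
        Annotation(entity.id, supplementalContent).also { annotations.add(it) }

    // Look up the supplemental content for the entities an end user selected.
    fun supplementalFor(entityIds: Set<String>): List<Annotation> =
        annotations.filter { it.entityId in entityIds }
}

fun main() {
    // Entities that were automatically identified in a portion of the movie (stand-in values).
    val actor = Entity("e1", "actor", "Lead actor")
    val track = Entity("e2", "music", "Opening theme")

    val store = AnnotationStore()
    store.link(actor, "Actor biography page")       // supplemental content from a content producer
    store.link(track, "Soundtrack purchase link")

    println(store.supplementalFor(setOf("e1")))     // supplemental content for the selected entity
}
```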
-
Patent number: 10038844
Abstract: The disclosed technology includes switching between a normal or standard-lens UI and a panoramic or wide-angle photography UI responsive to a zoom gesture. In one implementation, a user gesture corresponding to a “zoom-out” command, when received at a mobile computing device associated with a minimum zoom state, may trigger a switch from a standard lens photo capture UI to a wide-angle photography UI. In another implementation, a user gesture corresponding to a “zoom-in” command, when received at a mobile computing device associated with a nominal wide-angle state, may trigger a switch from a wide-angle photography UI to a standard lens photo capture UI.
Type: Grant
Filed: December 15, 2016
Date of Patent: July 31, 2018
Assignee: Google LLC
Inventors: Nirav Bipinchandra Mehta, Mikkel Crone Köser, David Singleton, Robert William Hamilton, Henry John Holland, Tony Ferreira, Thomas Weedon Hume
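The abstract above describes, in effect, a small state check on zoom gestures. The Kotlin sketch below illustrates one way such a check could look; the state model and the minimum-zoom and nominal-wide-angle comparisons are assumptions for illustration only, not the claimed implementation.

```kotlin
// Hypothetical sketch of the gesture-driven switch; the state model and the
// "at minimum zoom" / "at nominal wide-angle" checks are assumptions, not the patented logic.

enum class CaptureUi { STANDARD_LENS, WIDE_ANGLE }
enum class ZoomGesture { ZOOM_IN, ZOOM_OUT }

data class CameraState(
    val ui: CaptureUi,
    val zoomLevel: Float,
    val minZoom: Float,            // lowest zoom the standard lens UI supports
    val nominalWideAngle: Float    // default zoom of the wide-angle UI
)

// Decide whether a zoom gesture should flip between the standard-lens and wide-angle UIs.
fun nextUi(state: CameraState, gesture: ZoomGesture): CaptureUi = when {
    // Zoom-out while already at minimum zoom in the standard UI: enter wide-angle capture.
    gesture == ZoomGesture.ZOOM_OUT &&
        state.ui == CaptureUi.STANDARD_LENS &&
        state.zoomLevel <= state.minZoom -> CaptureUi.WIDE_ANGLE

    // Zoom-in at the nominal wide-angle state: return to the standard lens UI.
    gesture == ZoomGesture.ZOOM_IN &&
        state.ui == CaptureUi.WIDE_ANGLE &&
        state.zoomLevel <= state.nominalWideAngle -> CaptureUi.STANDARD_LENS

    else -> state.ui
}

fun main() {
    val atMinZoom = CameraState(CaptureUi.STANDARD_LENS, zoomLevel = 1.0f, minZoom = 1.0f, nominalWideAngle = 1.0f)
    println(nextUi(atMinZoom, ZoomGesture.ZOOM_OUT))   // WIDE_ANGLE
}
```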
-
Patent number: 9990694
Abstract: Certain embodiments of this disclosure include methods and devices for outputting a zoom sequence. According to one embodiment, a method is provided. The method may include: (i) determining first location information from first metadata associated with one or more images, wherein the first location information identifies a first location; and (ii) outputting, for display, a first zoom sequence based on the first location information, wherein the first zoom sequence may include a first plurality of mapped images of the first location from a first plurality of zoom levels and the plurality of mapped images are sequentially ordered by a magnitude of the zoom level.
Type: Grant
Filed: November 28, 2016
Date of Patent: June 5, 2018
Assignee: Google LLC
Inventors: Thomas Weedon Hume, Mikkel Crone Köser, Tony Ferreira, Jeremy Lyon, Waldemar Ariel Baraldi, Bryan Mawhinney, Christopher James Smith, Lenka Trochtova, Andrei Popescu, David Ingram, Flavio Lerda, Michael Ananin, Vytautas Vaitukaitis, Marc Paulina
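As an informal illustration of the zoom-sequence idea, the Kotlin sketch below filters mapped images for a location and orders them by zoom level. The MappedImage type and the in-memory list are hypothetical stand-ins for whatever image store and metadata source an actual implementation would use.

```kotlin
// Hypothetical sketch; MappedImage and the in-memory list stand in for a real image store.

data class MappedImage(val location: String, val zoomLevel: Int, val uri: String)

// Build a zoom sequence for a location: its mapped images ordered by the magnitude of the zoom level.
fun zoomSequence(images: List<MappedImage>, location: String): List<MappedImage> =
    images.filter { it.location == location }.sortedBy { it.zoomLevel }

fun main() {
    val images = listOf(
        MappedImage("Paris", zoomLevel = 12, uri = "tile_12"),
        MappedImage("Paris", zoomLevel = 3, uri = "tile_3"),
        MappedImage("Paris", zoomLevel = 7, uri = "tile_7")
    )
    // The location would normally come from the photo's metadata; a literal stands in here.
    zoomSequence(images, "Paris").forEach { println("zoom ${it.zoomLevel}: ${it.uri}") }
}
```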
-
Patent number: 9756260
Abstract: A method and system are disclosed for simulating different types of camera lenses on a device by guiding a user through a set of images to be captured in connection with one or more desired lens effects. In one aspect, a wide-angle lens may be simulated by capturing a plurality of images at a particular location over a set of camera orientations that are determined based on the selection of the wide-angle lens. The mobile device may provide prompts to the user indicating the camera orientations for which images should be captured in order to generate the simulated camera lens effect.
Type: Grant
Filed: September 9, 2016
Date of Patent: September 5, 2017
Assignee: Google Inc.
Inventors: Scott Ettinger, David Lee, Evan Rapoport, Jacob Mintz, Bryan Feldman, Mikkel Crone Köser, Daniel Joseph Filip
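To illustrate the orientation-prompt idea informally, the Kotlin sketch below computes a set of capture orientations from a chosen lens's field of view. The field-of-view figures and the half-frame overlap are assumptions, not values from the patent.

```kotlin
// Hypothetical sketch; the field-of-view figures and the half-frame overlap
// are illustrative assumptions, not values taken from the patent.

enum class SimulatedLens(val horizontalFovDegrees: Double) {
    WIDE_ANGLE(120.0),
    FISHEYE(180.0)
}

// Yaw orientations (degrees from the starting heading) at which the user should be
// prompted to capture source images, given the device camera's own field of view.
fun captureOrientations(lens: SimulatedLens, deviceFovDegrees: Double = 60.0): List<Double> {
    val step = deviceFovDegrees / 2.0                                   // overlap neighbouring frames by half a frame
    val half = (lens.horizontalFovDegrees - deviceFovDegrees) / 2.0     // how far the simulated view extends past one frame
    val orientations = generateSequence(-half) { it + step }.takeWhile { it < half }.toMutableList()
    orientations.add(half)                                              // always finish at the far edge of the simulated view
    return orientations
}

fun main() {
    // Prompts the UI could show one by one: "rotate to -30 degrees and capture", and so on.
    println(captureOrientations(SimulatedLens.WIDE_ANGLE))   // [-30.0, 0.0, 30.0]
}
```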
-
Patent number: 9699488
Abstract: Systems and techniques are provided for smart snap to interesting points in media content. A position control input may be received from a user to a control interface for a content player being used with a content item. A smart snap point and an associated smart snap area may be determined for the content item based on the received position control input. The smart snap point and the associated smart snap area may be stored. A second position control input to the control interface for the content player being used with the content item may be received. The second position control input may be determined to move a position indicator into the associated smart snap area for the smart snap point. Use of the content item may be resumed with the content player from the smart snap point.
Type: Grant
Filed: June 2, 2014
Date of Patent: July 4, 2017
Assignee: Google Inc.
Inventor: Mikkel Crone Köser
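As an informal illustration of the snap behaviour described above, the Kotlin sketch below stores a snap point with a surrounding snap area and, on a later seek, resumes from the snap point whenever the requested position falls inside that area. The time units and the 5-second radius are illustrative assumptions.

```kotlin
// Hypothetical sketch; the seconds-based positions and the 5-second snap radius
// are illustrative assumptions, not details from the patent.

import kotlin.math.abs

data class SnapPoint(val positionSeconds: Double, val snapRadiusSeconds: Double) {
    // The "smart snap area" around the point.
    fun covers(candidateSeconds: Double): Boolean = abs(candidateSeconds - positionSeconds) <= snapRadiusSeconds
}

class SmartSnapController {
    private val snapPoints = mutableListOf<SnapPoint>()

    // Store a snap point determined from an earlier position control input.
    fun remember(positionSeconds: Double, snapRadiusSeconds: Double = 5.0) {
        snapPoints.add(SnapPoint(positionSeconds, snapRadiusSeconds))
    }

    // If a later position control input lands inside a stored snap area, resume from the
    // snap point itself; otherwise resume exactly where the user released the control.
    fun resolveSeek(requestedSeconds: Double): Double =
        snapPoints.firstOrNull { it.covers(requestedSeconds) }?.positionSeconds ?: requestedSeconds
}

fun main() {
    val controller = SmartSnapController()
    controller.remember(positionSeconds = 600.0)                   // snap point stored from a first input
    println(controller.resolveSeek(requestedSeconds = 603.2))      // 600.0: snapped to the stored point
    println(controller.resolveSeek(requestedSeconds = 300.0))      // 300.0: outside any snap area
}
```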
-
Publication number: 20170099437
Abstract: The disclosed technology includes switching between a normal or standard-lens UI and a panoramic or wide-angle photography UI responsive to a zoom gesture. In one implementation, a user gesture corresponding to a “zoom-out” command, when received at a mobile computing device associated with a minimum zoom state, may trigger a switch from a standard lens photo capture UI to a wide-angle photography UI. In another implementation, a user gesture corresponding to a “zoom-in” command, when received at a mobile computing device associated with a nominal wide-angle state, may trigger a switch from a wide-angle photography UI to a standard lens photo capture UI.
Type: Application
Filed: December 15, 2016
Publication date: April 6, 2017
Inventors: Nirav Bipinchandra Mehta, Mikkel Crone Köser, David Singleton, Robert William Hamilton, Henry John Holland, Tony Ferreira, Thomas Weedon Hume
-
Publication number: 20170076427
Abstract: Certain embodiments of this disclosure include methods and devices for outputting a zoom sequence. According to one embodiment, a method is provided. The method may include: (i) determining first location information from first metadata associated with one or more images, wherein the first location information identifies a first location; and (ii) outputting, for display, a first zoom sequence based on the first location information, wherein the first zoom sequence may include a first plurality of mapped images of the first location from a first plurality of zoom levels and the plurality of mapped images are sequentially ordered by a magnitude of the zoom level.
Type: Application
Filed: November 28, 2016
Publication date: March 16, 2017
Inventors: Thomas Weedon Hume, Mikkel Crone Köser, Tony Ferreira, Jeremy Lyon, Waldemar Ariel Baraldi, Bryan Mawhinney, Christopher James Smith, Lenka Trochtova, Andrei Popescu, David Ingram, Flavio Lerda, Michael Ananin, Vytautas Vaitukaitis, Marc Paulina
-
Patent number: 9589321
Abstract: Techniques are disclosed for animating a view of a composite image based on metadata related to the capture of the underlying source images. According to certain implementations, the metadata may include timing or sensor data collected or generated during capture of the component source images. For example, the timing data may indicate an order or sequence in which the source images were captured. Accordingly, the corresponding regions of the composite panoramic image may be panned to in sequence, for example, using the Ken Burns Effect. In another example, sensor data from gyroscopes or accelerometers may be used to simulate the movement of the image capture device used to generate the source images. In another implementation, the source images may be associated with varying focal lengths or zoom levels. Accordingly, certain implementations may vary the level of zoom, based on the metadata, while panning between source photos.
Type: Grant
Filed: April 24, 2014
Date of Patent: March 7, 2017
Assignee: Google Inc.
Inventors: Thomas Weedon Hume, Mikkel Crone Köser
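As a loose illustration of the metadata-driven animation, the Kotlin sketch below orders pan/zoom keyframes over a composite image by the capture time of the source frames. The keyframe model and field names are assumptions, and the gyroscope/accelerometer variant mentioned in the abstract is not shown.

```kotlin
// Hypothetical sketch of turning capture metadata into an ordered pan/zoom sequence;
// the keyframe model and field names are assumptions, and the sensor-data variant is not shown.

data class SourceFrameMetadata(
    val captureTimeMillis: Long,     // timing data recorded when the source image was captured
    val regionCenterX: Float,        // where this source image landed in the composite (0..1)
    val regionCenterY: Float,
    val zoomLevel: Float             // derived from the source image's focal length
)

data class Keyframe(val centerX: Float, val centerY: Float, val zoom: Float)

// Pan across the composite in the order the source images were captured,
// varying the zoom level according to each source image's metadata.
fun kenBurnsKeyframes(frames: List<SourceFrameMetadata>): List<Keyframe> =
    frames.sortedBy { it.captureTimeMillis }
        .map { Keyframe(it.regionCenterX, it.regionCenterY, it.zoomLevel) }

fun main() {
    val metadata = listOf(
        SourceFrameMetadata(captureTimeMillis = 2000L, regionCenterX = 0.8f, regionCenterY = 0.5f, zoomLevel = 1.5f),
        SourceFrameMetadata(captureTimeMillis = 1000L, regionCenterX = 0.2f, regionCenterY = 0.5f, zoomLevel = 1.0f)
    )
    kenBurnsKeyframes(metadata).forEach { println(it) }   // left region first, then right, zooming in
}
```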
-
Patent number: 9538078
Abstract: The disclosed technology includes switching between a normal or standard-lens UI and a panoramic or wide-angle photography UI responsive to a zoom gesture. In one implementation, a user gesture corresponding to a “zoom-out” command, when received at a mobile computing device associated with a minimum zoom state, may trigger a switch from a standard lens photo capture UI to a wide-angle photography UI. In another implementation, a user gesture corresponding to a “zoom-in” command, when received at a mobile computing device associated with a nominal wide-angle state, may trigger a switch from a wide-angle photography UI to a standard lens photo capture UI.
Type: Grant
Filed: March 2, 2014
Date of Patent: January 3, 2017
Assignee: Google Inc.
Inventors: Nirav Bipinchandra Mehta, Mikkel Crone Köser, David Singleton, Robert William Hamilton, Henry John Holland, Tony Ferreira, Thomas Weedon Hume
-
Publication number: 20160373796
Abstract: A content annotation tool is disclosed. In a configuration, a portion of a movie may be obtained from a database. Entities, such as an actor, background music, or text, may be automatically identified in the movie. A user, such as a content producer, may associate and/or provide supplemental content for an identified entity to the database. A selection of one or more automatically identified entities may be received. A database entry may be generated that links the identified entity with the supplemental content. The selected entities and/or the supplemental content associated with them may be presented to an end user.
Type: Application
Filed: September 2, 2016
Publication date: December 22, 2016
Inventors: Henry Will Schneiderman, Michael Andrew Sipe, Marco Paglia, Mikkel Crone Köser
-
Patent number: 9467620
Abstract: A method and system are disclosed for simulating different types of camera lenses on a device by guiding a user through a set of images to be captured in connection with one or more desired lens effects. In one aspect, a wide-angle lens may be simulated by capturing a plurality of images at a particular location over a set of camera orientations that are determined based on the selection of the wide-angle lens. The mobile device may provide prompts to the user indicating the camera orientations for which images should be captured in order to generate the simulated camera lens effect.
Type: Grant
Filed: January 14, 2015
Date of Patent: October 11, 2016
Assignee: Google Inc.
Inventors: Scott Ettinger, David Lee, Evan Rapoport, Jake Mintz, Bryan Feldman, Mikkel Crone Köser, Daniel Joseph Filip
-
Patent number: 9438947
Abstract: A content annotation tool is disclosed. In a configuration, a portion of a movie may be obtained from a database. Entities, such as an actor, background music, or text, may be automatically identified in the movie. A user, such as a content producer, may associate and/or provide supplemental content for an identified entity to the database. A selection of one or more automatically identified entities may be received. A database entry may be generated that links the identified entity with the supplemental content. The selected entities and/or the supplemental content associated with them may be presented to an end user.
Type: Grant
Filed: May 1, 2013
Date of Patent: September 6, 2016
Assignee: Google Inc.
Inventors: Henry Will Schneiderman, Michael Andrew Sipe, Marco Paglia, Mikkel Crone Köser
-
Publication number: 20150350735
Abstract: Systems and techniques are provided for smart snap to interesting points in media content. A position control input may be received from a user to a control interface for a content player being used with a content item. A smart snap point and an associated smart snap area may be determined for the content item based on the received position control input. The smart snap point and the associated smart snap area may be stored. A second position control input to the control interface for the content player being used with the content item may be received. The second position control input may be determined to move a position indicator into the associated smart snap area for the smart snap point. Use of the content item may be resumed with the content player from the smart snap point.
Type: Application
Filed: June 2, 2014
Publication date: December 3, 2015
Applicant: Google Inc.
Inventor: Mikkel Crone Köser
-
Publication number: 20150310583
Abstract: Techniques are disclosed for animating a view of a composite image based on metadata related to the capture of the underlying source images. According to certain implementations, the metadata may include timing or sensor data collected or generated during capture of the component source images. For example, the timing data may indicate an order or sequence in which the source images were captured. Accordingly, the corresponding regions of the composite panoramic image may be panned to in sequence, for example, using the Ken Burns Effect. In another example, sensor data from gyroscopes or accelerometers may be used to simulate the movement of the image capture device used to generate the source images. In another implementation, the source images may be associated with varying focal lengths or zoom levels. Accordingly, certain implementations may vary the level of zoom, based on the metadata, while panning between source photos.
Type: Application
Filed: April 24, 2014
Publication date: October 29, 2015
Applicant: Google Inc.
Inventors: Thomas Weedon Hume, Mikkel Crone Köser
-
Publication number: 20150249785
Abstract: The disclosed technology includes switching between a normal or standard-lens UI and a panoramic or wide-angle photography UI responsive to a zoom gesture. In one implementation, a user gesture corresponding to a “zoom-out” command, when received at a mobile computing device associated with a minimum zoom state, may trigger a switch from a standard lens photo capture UI to a wide-angle photography UI. In another implementation, a user gesture corresponding to a “zoom-in” command, when received at a mobile computing device associated with a nominal wide-angle state, may trigger a switch from a wide-angle photography UI to a standard lens photo capture UI.
Type: Application
Filed: March 2, 2014
Publication date: September 3, 2015
Applicant: Google Inc.
Inventors: Nirav Bipinchandra Mehta, Mikkel Crone Köser, David Singleton, Robert William Hamilton, Henry John Holland, Tony Ferreira, Thomas Weedon Hume
-
Publication number: 20150244930
Abstract: A method and system are disclosed for simulating different types of camera lenses on a device by guiding a user through a set of images to be captured in connection with one or more desired lens effects. In one aspect, a wide-angle lens may be simulated by capturing a plurality of images at a particular location over a set of camera orientations that are determined based on the selection of the wide-angle lens. The mobile device may provide prompts to the user indicating the camera orientations for which images should be captured in order to generate the simulated camera lens effect.
Type: Application
Filed: January 14, 2015
Publication date: August 27, 2015
Inventors: Scott Ettinger, David Lee, Evan Rapoport, Jake Mintz, Bryan Feldman, Mikkel Crone Köser, Daniel Joseph Filip
-
Publication number: 20140188997
Abstract: The present disclosure includes systems and methods for creating and sharing inline commentary relating to media within an online community, for example, a social network. The inline commentary can be one or more types of media, for example, text, audio, image, video, a URL link, etc. In some implementations, the systems and methods receive media that is either live or pre-recorded, permit viewing by users, and receive commentary selectively added inline by users. The systems and methods are configured to send one or more notifications regarding the commentary. In some implementations, the systems and methods are configured to receive responses by other users to the initial commentary provided by a particular user.
Type: Application
Filed: December 31, 2012
Publication date: July 3, 2014
Inventors: Henry Will Schneiderman, Michael Andrew Sipe, Steven James Ross, Brian Ronald Colonna, Danielle Marie Millett, Uriel Gerardo Rodriguez, Michael Christian Nechyba, Mikkel Crone Köser, Ankit Jain
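As an informal sketch of the data flow described above, the Kotlin example below models inline commentary anchored at a position in a media item, a reply to that commentary, and a minimal notification step. All names, fields, and the println-based notification are hypothetical illustrations, not the described system.

```kotlin
// Hypothetical data-model sketch; class names, fields and the println-based
// notification are illustrative assumptions, not the described system.

enum class CommentaryKind { TEXT, AUDIO, IMAGE, VIDEO, URL_LINK }

data class Commentary(
    val author: String,
    val mediaId: String,
    val offsetSeconds: Double,            // position in the media where the commentary is anchored
    val kind: CommentaryKind,
    val payload: String,
    val inReplyTo: Commentary? = null     // a response by another user to earlier commentary
)

// Send notifications regarding newly added commentary.
fun notifyFollowers(followers: List<String>, commentary: Commentary) {
    followers.forEach { println("notify $it: ${commentary.author} commented at ${commentary.offsetSeconds}s") }
}

fun main() {
    val comment = Commentary("alice", mediaId = "video42", offsetSeconds = 75.0,
        kind = CommentaryKind.TEXT, payload = "Great scene!")
    notifyFollowers(listOf("bob", "carol"), comment)

    val reply = comment.copy(author = "bob", payload = "Agreed", inReplyTo = comment)
    notifyFollowers(listOf("alice"), reply)
}
```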