Patents by Inventor Su Chuin Leong

Su Chuin Leong has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20150185873
Abstract: The present disclosure relates to devices and methods for automatically moving a cursor within a map viewport. More specifically, the present disclosure relates to devices and methods that determine a location of various objects within a 3D scene displayed within a map viewport and determine a location of a 3D cursor within the 3D scene. When a distance between an object location and the 3D cursor location is less than a threshold distance, the geometric shape data of the 3D cursor is automatically modified to at least partially surround the object. The location of the object may be determined based on data representative of the 3D scene. The location of the 3D cursor may be determined based on data representative of the 3D cursor location.
    Type: Application
    Filed: August 13, 2012
    Publication date: July 2, 2015
    Applicant: GOOGLE INC.
    Inventors: Andrew Ofstad, Su Chuin Leong
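The threshold test described in this abstract can be sketched as follows. This is a minimal illustration, not the patented implementation; the function names, tuple-based 3D points, and the Euclidean distance metric are all assumptions.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def should_snap_cursor(cursor_pos, object_pos, threshold):
    """Return True when the cursor is within the threshold distance of the
    object, i.e. when the cursor geometry should be modified to at least
    partially surround the object."""
    return distance(cursor_pos, object_pos) < threshold

# Cursor ~1.73 units from the object, threshold 2.0: snap.
print(should_snap_cursor((0, 0, 0), (1, 1, 1), 2.0))
```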
  • Publication number: 20150185991
Abstract: A mapping system allows a user to interact with any location on a digital map and present the user with location related information associated with the selected location. The location related information may be in the form of a card, pop-up, image, or other graphic and may be displayed on the map at or near the selected location, around the map, etc. The displayed location related information may include predetermined or pre-stored data about the location or may include location related information collected and generated on the fly in response to the user interaction with the digital map. The displayed location related information may be displayed in the same graphical format on the digital map regardless of whether the location related information is predetermined information about the location that already exists or information that is collected and generated on the fly in response to the user interaction.
    Type: Application
    Filed: May 15, 2013
    Publication date: July 2, 2015
    Inventors: Kelvin Ho, Jonah Jones, Yatin Chawathe, Bernhard Seefeld, Paul Merrell, Alirez Ali, Jonathan Siegel, Daniel Otero, Su Chuin Leong
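The fallback this abstract describes, between pre-stored data and information generated on the fly, can be sketched like this. The cache structure and the stand-in generator function are illustrative assumptions, not details from the patent.

```python
# Hypothetical pre-stored location cards, keyed by (lat, lng).
PRESTORED = {(37.4220, -122.0841): "Googleplex, Mountain View"}

def generate_on_the_fly(lat, lng):
    """Stand-in for collecting information (e.g. reverse geocoding) for a
    location the system has no pre-stored card for."""
    return f"Location at ({lat:.4f}, {lng:.4f})"

def location_card(lat, lng):
    """Return card text in the same format whether the information is
    pre-stored or generated in response to the user's interaction."""
    return PRESTORED.get((lat, lng)) or generate_on_the_fly(lat, lng)
```

Either path yields the same card format, matching the abstract's point that the user sees one consistent presentation.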
  • Publication number: 20140062998
Abstract: The present disclosure relates to devices and user interfaces for orienting a camera view toward surfaces in a 3D map. More specifically, the present disclosure relates to devices and methods that determine a zoom level associated with a 3D scene and 3D geometry of a map feature within the 3D scene and orient a 3D cursor to a surface of the map feature based on the zoom level and the 3D geometry when a user moves the 3D cursor over the map feature. When a user selects a point within the 3D geometry of the map feature, the 3D map is re-oriented with a view of the surface of the map feature.
    Type: Application
    Filed: September 4, 2012
    Publication date: March 6, 2014
    Applicant: GOOGLE INC.
    Inventors: Andrew Ofstad, Su Chuin Leong
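The zoom-dependent cursor orientation this abstract describes might look like the following sketch. The zoom cutoff and the normal-alignment rule are assumptions for illustration, not details from the patent.

```python
def orient_cursor(zoom_level, surface_normal, min_zoom=15):
    """Return the 3D cursor orientation as a normal vector: aligned with
    the map feature's surface at close zoom (so the cursor hugs, e.g., a
    building face), or flat against the ground otherwise."""
    if zoom_level >= min_zoom:
        return surface_normal
    return (0.0, 0.0, 1.0)  # default: cursor faces straight up
```

On selection, the same surface normal could then drive the camera re-orientation toward that face of the feature.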
  • Publication number: 20130332890
    Abstract: A system and method for providing content for a point of interest are provided. One or more two-dimensional content items are provided for display on a user interface of an electronic device, where each of the one or more two-dimensional content items represents a corresponding point of interest. A user selection of one of the one or more two-dimensional content items is received. A three-dimensional content item corresponding to a point of interest that is represented by the selected two-dimensional content item is provided in response to receiving the user selection of the one of the one or more two-dimensional content items.
    Type: Application
    Filed: June 5, 2013
    Publication date: December 12, 2013
    Inventors: Haris Ramic, Su Chuin Leong, Brian Lawrence Ellis
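At its core, the selection flow in this abstract is a lookup from a point of interest's 2D content item to its 3D counterpart, as in this sketch. The registry structure and identifiers are illustrative assumptions.

```python
# Hypothetical registry pairing 2D and 3D content items per point of interest.
CONTENT_BY_POI = {
    "eiffel-tower": {"2d": "tower_thumb.png", "3d": "tower_model.gltf"},
    "golden-gate":  {"2d": "bridge_thumb.png", "3d": "bridge_model.gltf"},
}

def on_select_2d_item(poi_id):
    """Respond to the user's selection of a 2D content item by providing
    the 3D content item for the same point of interest."""
    return CONTENT_BY_POI[poi_id]["3d"]

print(on_select_2d_item("eiffel-tower"))  # → tower_model.gltf
```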
  • Patent number: 8412531
    Abstract: The present invention provides a user interface for providing press-to-talk-interaction via utilization of a touch-anywhere-to-speak module on a mobile computing device. Upon receiving an indication of a touch anywhere on the screen of a touch screen interface, the touch-anywhere-to-speak module activates the listening mechanism of a speech recognition module to accept audible user input and displays dynamic visual feedback of a measured sound level of the received audible input. The touch-anywhere-to-speak module may also provide a user a convenient and more accurate speech recognition experience by utilizing and applying the data relative to a context of the touch (e.g., relative location on the visual interface) in correlation with the spoken audible input.
    Type: Grant
    Filed: June 10, 2009
    Date of Patent: April 2, 2013
    Assignee: Microsoft Corporation
    Inventors: Anne K. Sullivan, Lisa Stifelman, Kathleen J. Lee, Su Chuin Leong
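The touch-anywhere-to-speak flow in this abstract, a touch anywhere activating the listening mechanism while the measured sound level drives visual feedback, can be sketched as below. The class and method names and the linear level meter are assumptions, not the patented implementation.

```python
class TouchAnywhereToSpeak:
    def __init__(self):
        self.listening = False
        self.touch_context = None

    def on_touch(self, x, y):
        """Any touch on the screen activates the speech recognition
        module's listening mechanism and records the touch location so
        its context can be correlated with the spoken input later."""
        self.listening = True
        self.touch_context = (x, y)

    def feedback_level(self, sound_level, max_level=100):
        """Map a measured sound level to a 0.0-1.0 meter value for
        dynamic visual feedback."""
        return min(max(sound_level / max_level, 0.0), 1.0)
```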
  • Publication number: 20100318366
    Abstract: The present invention provides a user interface for providing press-to-talk-interaction via utilization of a touch-anywhere-to-speak module on a mobile computing device. Upon receiving an indication of a touch anywhere on the screen of a touch screen interface, the touch-anywhere-to-speak module activates the listening mechanism of a speech recognition module to accept audible user input and displays dynamic visual feedback of a measured sound level of the received audible input. The touch-anywhere-to-speak module may also provide a user a convenient and more accurate speech recognition experience by utilizing and applying the data relative to a context of the touch (e.g., relative location on the visual interface) in correlation with the spoken audible input.
    Type: Application
    Filed: June 10, 2009
    Publication date: December 16, 2010
    Applicant: Microsoft Corporation
    Inventors: Anne K. Sullivan, Lisa Stifelman, Kathleen J. Lee, Su Chuin Leong