Patents by Inventor Jeffrey Cheng-yao Fong

Jeffrey Cheng-yao Fong has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20110239153
    Abstract: A method for presenting a pointer tool in a touch-screen display is disclosed. The method includes activating the pointer tool in response to contact with an area of the touch screen and persisting the display of the pointer tool after the contact with the touch screen is removed. Once editing data is received, the pointer tool is removed from the touch-screen display.
    Type: Application
    Filed: March 24, 2010
    Publication date: September 29, 2011
    Applicant: Microsoft Corporation
    Inventors: Benjamin F. Carter, Priyanka Singhal, Shawna Julie Davis, Tirthankar Sengupta, Jeffrey Cheng-Yao Fong, Ryan Terry Bickel, Peter Gregory Davis
  • Publication number: 20110225543
    Abstract: A user interface is described that temporarily displays portions of a page that reside outside of the viewable area of the screen. An animated transition creates a brief preview of at least one user interface feature. Additionally, the user interface feature is then animated out of the viewable area in a way to suggest a location of the feature. In one embodiment, a target page that is being opened controls the transition and animates features into and out of the viewable area to create the temporary preview. In another embodiment, the target page includes user interface elements that can asynchronously control the preview animation independent from the main content of the target page. In yet another embodiment, a transition coordinator can coordinate the timing between animating out a foreground application while animating in a target application.
    Type: Application
    Filed: March 10, 2010
    Publication date: September 15, 2011
    Applicant: Microsoft Corporation
    Inventors: Jeff Arnold, Jeffrey Cheng-Yao Fong, Pritesh Bodalia, Jeffrey L. Bogdan, Lee Dicks Clark
  • Publication number: 20110225547
    Abstract: Dynamic icons are described that can employ animations, such as visual effects, audio, and other content that change with time. If multiple animations are scheduled to occur simultaneously, the timing of the animations can be controlled so that timing overlap of the animations is reduced. For example, the starting times of the animations can be staggered so that multiple animations are not initiated too close in time. It has been found that too much motion in the user interface can be distracting and cause confusion amongst users.
    Type: Application
    Filed: March 10, 2010
    Publication date: September 15, 2011
    Applicant: Microsoft Corporation
    Inventors: Jeffrey Cheng-Yao Fong, Jeff Arnold, Chris Glein
  • Publication number: 20110202859
    Abstract: Techniques and tools are described that relate to different aspects of a user interface in which visual distortion effects are presented to provide visual cues to a user to indicate a location in a movable data collection (e.g., a scrollable list, an email message, a content layer, etc.). For example, in response to a user gesture on a touchscreen, a user interface system presents a portion of a list or layer in a visually distorted state, such as a “squished,” squeezed or compressed state in which text, images or other content is shown to be smaller than normal in one or more dimensions, to indicate to a user that the end of a list has been reached.
    Type: Application
    Filed: March 10, 2010
    Publication date: August 18, 2011
    Applicant: Microsoft Corporation
    Inventor: Jeffrey Cheng-Yao Fong
  • Publication number: 20110202834
    Abstract: Aspects of a user interface are described that provide visual feedback in response to user input. For example, boundary effects are presented to provide visual cues to a user to indicate that a boundary in a movable user interface element (e.g., the end of a scrollable list) has been reached. As another example, parallax effects are presented in which multiple parallel or substantially parallel layers in a multi-layer user interface move at different rates, in response to user input. As another example, simulated inertia motion of UI elements is used to provide a more natural feel for touch input. Various combinations of features are described. For example, simulated inertia motion can be used in combination with parallax effects, boundary effects, or other types of visual feedback.
    Type: Application
    Filed: May 4, 2010
    Publication date: August 18, 2011
    Applicant: Microsoft Corporation
    Inventors: Luciano Baretta Mandryk, Jeffrey Cheng-Yao Fong
  • Publication number: 20110202837
    Abstract: A user interface (UI) system calculates movements in a multi-layer graphical user interface. The UI system receives user input corresponding to gestures on a touchscreen. The UI system calculates a movement of a first layer in a first direction (e.g., a horizontal direction) at a first movement rate. The UI system calculates a movement of a second layer substantially parallel to the movement of the first layer, at a second movement rate that differs from the first movement rate. The UI system calculates a movement (e.g., a vertical movement) in a direction substantially orthogonal to the first direction, in a UI element of one of the layers.
    Type: Application
    Filed: June 25, 2010
    Publication date: August 18, 2011
    Applicant: Microsoft Corporation
    Inventors: Jeffrey Cheng-Yao Fong, Eric J. Hull, Sergey Chub
  • Publication number: 20110199318
    Abstract: A user interface (UI) system calculates movements in a multi-layer graphical user interface. The UI system receives user input corresponding to gestures on a touchscreen. The UI system calculates a movement of a first layer in a first direction (e.g., a horizontal direction) at a first movement rate. For example, the first movement rate can be substantially equal to the movement rate of a gesture made by a user's finger or other object on the touchscreen. The UI system calculates movements of other layers substantially parallel to the movement of the first layer, at movement rates that differ from the first movement rate.
    Type: Application
    Filed: June 25, 2010
    Publication date: August 18, 2011
    Applicant: Microsoft Corporation
    Inventors: Jeffrey Cheng-Yao Fong, Eric J. Hull, Sergey Chub
  • Publication number: 20110119619
    Abstract: A user interface can display active and passive content. For example, a camera viewfinder image can be displayed on a screen, as part of a strip, concatenated with one or more other images, for example, images that were previously taken with the camera. A user can cause the viewfinder image and the other images to move together across the screen. This can allow a user to easily examine the other images and the viewfinder image without, for example, switching between different screens in a user interface. Media captured with a device can be associated with a media category by positioning a user interface element near one or more other elements associated with the category.
    Type: Application
    Filed: February 15, 2010
    Publication date: May 19, 2011
    Applicant: Microsoft Corporation
    Inventors: Jeffrey Cheng-yao Fong, Donald Allen Barnett
  • Publication number: 20090004973
    Abstract: A method to indicate that a first device is in communication with a second device is disclosed. The first device may receive an indication activity from the second device. The indication activity may change the display and the illumination object on the first device, such that what is shown on the illumination object and on the display is similar.
    Type: Application
    Filed: June 29, 2007
    Publication date: January 1, 2009
    Applicant: Microsoft Corporation
    Inventors: Anton Oguzhan Alford Andrews, David Walter Proctor, Jeffrey Cheng-Yao Fong, Thamer A. Abanami
  • Publication number: 20080284739
    Abstract: An input device may detect an input. The input may be compared to stored inputs, which can be user defined, to determine whether the input is related to one of them. If the input is related to one of the stored inputs, an action associated with that stored input may be executed. If the input is not related to any of the stored inputs or is not recognized, the steps of the method may be repeated. The actions associated with different gestures may be defined by the user.
    Type: Application
    Filed: May 17, 2007
    Publication date: November 20, 2008
    Applicant: Microsoft Corporation
    Inventors: Anton Oguzhan Alford Andrews, Thamer A. Abanami, Jeffrey Cheng-Yao Fong, Morgan Venable, Thomas J. Misage
  • Patent number: D527375
    Type: Grant
    Filed: September 30, 2003
    Date of Patent: August 29, 2006
    Assignee: Microsoft Corporation
    Inventors: William T. Flora, Jeffrey Cheng-yao Fong
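
Several of the entries above (e.g., publication numbers 20110202837 and 20110199318) describe multi-layer parallax movement, in which parallel UI layers pan at different rates in response to a single touch gesture. As a rough illustration only, and not the patented implementation, one common way to relate the rates is to scale each layer's movement by its scrollable range, so that all layers reach their ends together; the function name and parameters below are hypothetical:

```python
def layer_offsets(gesture_dx, layer_widths, viewport_width):
    """Compute horizontal offsets for parallel UI layers.

    The first (foreground) layer tracks the gesture 1:1; each narrower
    background layer moves proportionally slower, so every layer reaches
    its end of travel at the same gesture position.
    """
    # Scrollable distance of the foreground layer.
    fg_range = layer_widths[0] - viewport_width
    offsets = []
    for width in layer_widths:
        layer_range = width - viewport_width
        # Rate is the ratio of this layer's travel to the foreground's travel.
        rate = layer_range / fg_range if fg_range else 0.0
        offsets.append(gesture_dx * rate)
    return offsets
```

For example, with a 400-pixel viewport, a content layer 2000 pixels wide, a title layer 1200 pixels wide, and a background layer 800 pixels wide, a 100-pixel leftward drag moves the layers 100, 50, and 25 pixels respectively.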