Patents by Inventor Peter Broadwell

Peter Broadwell has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20060238540
    Abstract: The methods and systems for scoring multiple time-based assets and events include declaratively playing a first media sequence; and declaratively initiating a second media sequence at a predetermined time prior to an end point of the first media sequence wherein the predetermined time is independent from an amount of time utilized to play the first media sequence.
    Type: Application
    Filed: March 17, 2006
    Publication date: October 26, 2006
    Inventors: Christopher Marrin, James Kent, Peter Broadwell, Robert Myers
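The scoring idea in this abstract — starting a second sequence a fixed lead time before the first one ends, regardless of the first sequence's total duration — can be sketched as follows. This is a minimal illustration, not the patented implementation; the function name and parameters are assumptions.

```python
# Minimal sketch (not the patented implementation): schedule a second
# media sequence to begin a fixed lead time before the first one ends.
# All names here are illustrative assumptions.

def second_start_time(first_duration: float, lead_time: float) -> float:
    """Return the absolute start time of the second sequence.

    The lead time is measured back from the first sequence's end point,
    so it stays constant no matter how long the first sequence plays.
    """
    if lead_time > first_duration:
        raise ValueError("lead time exceeds first sequence's duration")
    return first_duration - lead_time

# Whether the first clip runs 30 s or 300 s, the overlap stays 5 s:
assert second_start_time(30.0, 5.0) == 25.0
assert second_start_time(300.0, 5.0) == 295.0
```

Keeping the start time anchored to the end point (rather than the start) is what makes the cue independent of the first sequence's playback duration.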
  • Patent number: 6990681
    Abstract: A broadcast of an event is enhanced with synthetic scenes generated from audio visual and supplemental data received in the broadcast. A synthetic scene is integrated into the broadcast in accordance with a depth map that contains depth information for the synthetic scene. The supplemental data may be sensing data from various sensors placed at the event, position and orientation data of particular objects at the event, or environmental data on conditions at the event. The supplemental data may also be camera tracking data from a camera that is used to generate a virtual camera and viewpoints for the synthetic scene.
    Type: Grant
    Filed: March 29, 2002
    Date of Patent: January 24, 2006
    Assignees: Sony Corporation, Sony Electronics Inc.
    Inventors: Sidney Wang, Richter A. Rafey, Hubert Le Van Gong, Peter Broadwell, Simon Gibbs
  • Patent number: 6940538
    Abstract: A method of extracting a depth map from known camera and tracking data. The method includes the steps of positioning a virtual camera at the coordinates of the tracked camera, setting the field of view to that of the tracked camera, positioning and orienting a synthetic tracked object to the coordinates of the tracked object, clearing the depth buffer and rendering the tracked object as a depth map.
    Type: Grant
    Filed: August 29, 2001
    Date of Patent: September 6, 2005
    Assignees: Sony Corporation, Sony Electronics Inc.
    Inventors: Richter A. Rafey, Peter Broadwell, Sidney Wang, Simon Gibbs
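The five steps enumerated in this abstract can be sketched with a toy software renderer. The pinhole-projection model, image size, and coordinate conventions below are illustrative assumptions, not the patent's method; a real system would drive a graphics pipeline with the tracked camera's full pose.

```python
import math

def render_depth_map(cam_pos, fov_deg, obj_points, width=8, height=8):
    """Sketch of the claimed steps, assuming a pinhole camera at
    cam_pos looking down +Z (an illustrative simplification):
      1. position a virtual camera at the tracked camera's coordinates,
      2. set its field of view to the tracked camera's,
      3. position the synthetic stand-in for the tracked object,
      4. clear the depth buffer,
      5. render the object as a depth map.
    """
    # Step 4: clear the depth buffer to "infinitely far".
    depth = [[math.inf] * width for _ in range(height)]
    # Step 2: focal length in pixels from the tracked field of view.
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)
    for px, py, pz in obj_points:
        # Steps 1 and 3: express the object point in the camera's frame.
        x, y, z = px - cam_pos[0], py - cam_pos[1], pz - cam_pos[2]
        if z <= 0:
            continue  # point is behind the camera
        # Step 5: perspective projection, then a nearest-depth write.
        u = int(width / 2 + f * x / z)
        v = int(height / 2 + f * y / z)
        if 0 <= u < width and 0 <= v < height:
            depth[v][u] = min(depth[v][u], z)
    return depth

# A point 4 units in front of the camera lands at the image centre
# with depth 4:
dm = render_depth_map((0, 0, 0), 90.0, [(0, 0, 4)])
assert dm[4][4] == 4
```

The resulting per-pixel depths are what allow a synthetic scene to be composited into broadcast video with correct occlusion.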
  • Publication number: 20050128200
    Abstract: The methods and systems for scoring multiple time-based assets and events include declaratively playing a first media sequence; and declaratively initiating a second media sequence at a predetermined time prior to an end point of the first media sequence wherein the predetermined time is independent from an amount of time utilized to play the first media sequence.
    Type: Application
    Filed: November 16, 2004
    Publication date: June 16, 2005
    Inventors: Christopher Marrin, James Kent, Peter Broadwell, Robert Myers
  • Publication number: 20050128220
    Abstract: In one embodiment, the methods and apparatuses detect hardware associated with a device configured for displaying authored content; set an initial frame rate for the authored content based on the hardware; and play the content at the initial frame rate, wherein the authored content is scripted in a declarative markup language.
    Type: Application
    Filed: November 16, 2004
    Publication date: June 16, 2005
    Inventors: Christopher Marrin, James Kent, Peter Broadwell, Robert Myers
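The hardware-dependent initial frame rate described in this abstract can be illustrated with a simple capability-to-rate mapping. The tiers, thresholds, and parameter names below are assumptions for the sketch, not taken from the application.

```python
# Illustrative sketch only: choose an initial frame rate for declarative
# content from a coarse hardware capability probe. Tier thresholds and
# names are assumptions, not from the patent application.

def initial_frame_rate(cpu_mhz: int, has_gpu: bool) -> int:
    """Map detected hardware to a conservative starting frame rate;
    a player could then adjust up or down once playback begins."""
    if has_gpu and cpu_mhz >= 1000:
        return 60
    if cpu_mhz >= 500:
        return 30
    return 15

assert initial_frame_rate(2000, True) == 60
assert initial_frame_rate(800, False) == 30
assert initial_frame_rate(300, False) == 15
```

Starting from a hardware-appropriate rate avoids the first seconds of playback stuttering while an adaptive controller converges.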
  • Publication number: 20050088458
    Abstract: A system and method for the real-time composition and presentation of a complex, dynamic, and interactive experience by means of an efficient declarative markup language. Using the Surface construct, authors can embed images or full-motion video data anywhere they would use a traditional texture map within their 3D scene. Authors can also use the results of rendering one scene description as an image to be texture mapped into another scene. In particular, the Surface allows the results of any rendering application to be used as a texture within the author's scene. This allows declarative rendering of nested scenes and rendering of scenes having component Surfaces with decoupled rendering rates.
    Type: Application
    Filed: November 16, 2004
    Publication date: April 28, 2005
    Inventors: Christopher Marrin, James Kent, Robert Myers, Peter Broadwell
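The Surface idea — the rendered output of one scene used as a texture in another, with each surface keeping its own rendering rate — can be sketched as below. Class and method names are illustrative assumptions, not the patent's API, and the integer-tick clock is a simplification.

```python
# Hedged sketch of the Surface construct: one scene's rendered output
# becomes a texture in another scene, and each surface re-renders on
# its own frame grid, decoupled from the enclosing scene's rate.
# All names here are illustrative, not the patent's API.

class Surface:
    def __init__(self, render_fn, frame_rate: int):
        self.render_fn = render_fn    # produces this surface's image
        self.frame_rate = frame_rate  # this surface's own rate (Hz)
        self.frame = -1
        self.image = None

    def sample(self, tick: int, clock_rate: int):
        """Return the texture for the caller's clock tick, re-rendering
        only when this surface's own frame index advances."""
        frame = tick * self.frame_rate // clock_rate  # integer maths
        if frame != self.frame:
            self.frame = frame
            self.image = self.render_fn(frame)
        return self.image

inner = Surface(render_fn=lambda f: f"inner-frame-{f}", frame_rate=10)
# An outer scene samples the inner surface 60 times over one second...
samples = [inner.sample(i, clock_rate=60) for i in range(60)]
# ...but the nested scene only rendered 10 distinct frames.
assert len(set(samples)) == 10
```

Decoupling the rates this way lets an expensive nested scene (or full-motion video) update slowly while the enclosing 3D scene stays smooth.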
  • Publication number: 20050035970
    Abstract: In one embodiment, the methods and apparatuses transmit authored content from an authoring device to a remote device; directly play the authored content on the remote device; and monitor a portion of the authored content on the authoring device while simultaneously playing the portion of the authored content on the remote device, wherein the authored content is scripted in a declarative markup language.
    Type: Application
    Filed: September 9, 2004
    Publication date: February 17, 2005
    Inventors: Jenny Wirtschafter, Christopher Marrin, Peter Broadwell
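The transmit-then-monitor workflow in this abstract can be sketched with two toy objects: an authoring tool that pushes content to a remote player and mirrors each played frame locally. Class names and the in-memory "transport" are assumptions for illustration only.

```python
# Illustrative sketch (names and transport are assumptions): an
# authoring tool transmits declarative content to a remote player,
# then mirrors playback so the author monitors locally exactly what
# the remote device plays.

class RemotePlayer:
    def __init__(self):
        self.content = []
        self.frames_played = []

    def receive(self, content):
        self.content = content

    def play_frame(self, index):
        frame = self.content[index]
        self.frames_played.append(frame)
        return frame

class AuthoringTool:
    def __init__(self, player):
        self.player = player
        self.monitor_log = []

    def transmit(self, content):
        self.player.receive(content)  # push the authored content

    def monitor(self, index):
        # Play on the remote device and mirror the same frame locally.
        frame = self.player.play_frame(index)
        self.monitor_log.append(frame)
        return frame

player = RemotePlayer()
tool = AuthoringTool(player)
tool.transmit(["<scene a/>", "<scene b/>"])
tool.monitor(0)
assert tool.monitor_log == player.frames_played == ["<scene a/>"]
```

The point of the mirrored log is that the author sees the remote device's actual playback, not a local re-render that might differ.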
  • Publication number: 20030043270
    Abstract: A method of extracting a depth map from known camera and tracking data is disclosed. The method includes the steps of positioning a virtual camera at the coordinates of the tracked camera, setting the field of view to that of the tracked camera, positioning and orienting a synthetic tracked object to the coordinates of the tracked object, clearing the depth buffer and rendering the tracked object as a depth map.
    Type: Application
    Filed: August 29, 2001
    Publication date: March 6, 2003
    Inventors: Richter A. Rafey, Peter Broadwell, Sidney Wang, Simon Gibbs
  • Publication number: 20030038892
    Abstract: A broadcast of an event is enhanced with synthetic scenes generated from audio visual and supplemental data received in the broadcast. A synthetic scene is integrated into the broadcast in accordance with a depth map that contains depth information for the synthetic scene. The supplemental data may be sensing data from various sensors placed at the event, position and orientation data of particular objects at the event, or environmental data on conditions at the event. The supplemental data may also be camera tracking data from a camera that is used to generate a virtual camera and viewpoints for the synthetic scene.
    Type: Application
    Filed: March 29, 2002
    Publication date: February 27, 2003
    Inventors: Sidney Wang, Richter A. Rafey, Hubert Le Van Gong, Peter Broadwell, Simon Gibbs