Patents by Inventor Jim Marggraff

Jim Marggraff has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20090251337
    Abstract: Embodiments of the invention present a system and method for identifying relationships between different types of data captured by a pen-based computing system, such as a smart pen. The pen-based computing system generates one or more sessions including different types of data that are associated with each other. In one embodiment, the pen-based computing system generates an index file including captured audio data and written data, where the written data is associated with a temporal location of the audio data corresponding to the time the written data was captured. For example, the pen-based computing system applies one or more heuristic processes to the received data to identify relationships between various types of the received data, which are used to associate different types of data with each other.
    Type: Application
    Filed: March 31, 2009
    Publication date: October 8, 2009
    Applicant: LIVESCRIBE, INC.
    Inventors: Jim Marggraff, Erica Leverett, Tracy L. Edgecomb, Alexander Sasha Pesic
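The index described in the abstract above ties each piece of written data to the point in the audio recording at which it was captured. The following is a minimal sketch of that idea only, not the patented implementation; the `Stroke`, `SessionIndex`, and `strokes_near` names, and the 5-second association window, are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Stroke:
    text: str           # recognized handwriting (hypothetical)
    captured_at: float  # seconds since the audio recording started

@dataclass
class SessionIndex:
    """Associates each captured stroke with its position on the audio timeline."""
    strokes: list[Stroke] = field(default_factory=list)

    def add_stroke(self, stroke: Stroke) -> None:
        self.strokes.append(stroke)
        self.strokes.sort(key=lambda s: s.captured_at)

    def audio_offset_for(self, text: str) -> float | None:
        """Return the audio offset at which a given note was written."""
        for s in self.strokes:
            if s.text == text:
                return s.captured_at
        return None

    def strokes_near(self, offset: float, window: float = 5.0) -> list[Stroke]:
        """Heuristic association: strokes written within `window` seconds of `offset`."""
        return [s for s in self.strokes if abs(s.captured_at - offset) <= window]

# Example: jump from a written note back into the recording.
index = SessionIndex()
index.add_stroke(Stroke("action item: send slides", captured_at=312.4))
index.add_stroke(Stroke("budget question", captured_at=784.9))
print(index.audio_offset_for("budget question"))  # 784.9
print(index.strokes_near(315.0))                  # strokes written around 5:15
```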
  • Publication number: 20090253107
    Abstract: An instruction is presented to a user for making a target gesture. The target gesture may be a portion of an exemplary or model symbol. The instruction may be presented in various ways, such as being printed on a writing surface or being played in audio format through a speaker of a smart pen device. A writing gesture made on a writing surface by the user is digitally captured using a smart pen device. The captured writing gesture is compared to the target gesture and feedback is determined based on this comparison. This feedback may indicate the correctness of the user's writing gesture. The feedback may be presented to the user through various means, including through the smart pen device. The comparison may also be used to determine a next instruction for the user to follow.
    Type: Application
    Filed: March 31, 2009
    Publication date: October 8, 2009
    Applicant: LIVESCRIBE, INC.
    Inventor: Jim Marggraff
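The abstract above turns on comparing a captured writing gesture to a target gesture and producing feedback. As an illustrative sketch only (the patent does not specify a comparison algorithm), the code below resamples two point sequences and scores them by mean point-to-point distance; the function names and the 0.1 threshold are assumptions made for the example.

```python
import math

Point = tuple[float, float]

def resample(points: list[Point], n: int = 32) -> list[Point]:
    """Resample a gesture to n evenly spaced points so two gestures can be compared."""
    if len(points) < 2:
        return points * n
    # Cumulative arc length along the stroke.
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out, j = [], 0
    for i in range(n):
        target = total * i / (n - 1)
        while j < len(dists) - 2 and dists[j + 1] < target:
            j += 1
        span = dists[j + 1] - dists[j] or 1.0
        t = (target - dists[j]) / span
        (x0, y0), (x1, y1) = points[j], points[j + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def score(captured: list[Point], target: list[Point]) -> float:
    """Mean distance between corresponding resampled points (lower is better)."""
    a, b = resample(captured), resample(target)
    return sum(math.hypot(ax - bx, ay - by) for (ax, ay), (bx, by) in zip(a, b)) / len(a)

def feedback(captured: list[Point], target: list[Point], threshold: float = 0.1) -> str:
    return "Well done!" if score(captured, target) <= threshold else "Try tracing the stroke again."

# Example: a captured diagonal stroke compared against the model stroke.
model = [(0.0, 0.0), (1.0, 1.0)]
attempt = [(0.0, 0.05), (0.5, 0.52), (1.0, 1.02)]
print(feedback(attempt, model))
```

In a real system the feedback string would be spoken or displayed by the smart pen rather than printed.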
  • Publication number: 20090063492
    Abstract: In a pen-based computing system, the use of paradigms similar to those used with physical paper to organize user-generated content captured by a smart pen is disclosed. Data, such as handwriting gestures, is captured by the smart pen and transferred to a digital domain, such as by being transferred to a computing system. Once in the digital domain, the captured content is organized as virtual pages or virtual notebooks. Hence, content captured from various sources, such as different notebooks or different physical pieces of paper, is assembled into a virtual page or virtual notebook. User input or automatic application of rules can be used to assemble the captured content into a virtual page or virtual notebook.
    Type: Application
    Filed: May 29, 2008
    Publication date: March 5, 2009
    Inventors: Vinaitheerthan Meyyappan, Jim Marggraff, Tracy L. Edgecomb, Andy Van Schaack
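The abstract above describes assembling content captured from different physical sources into one virtual notebook, either by user input or by rule. The sketch below illustrates only the rule-based case under an assumed tagging scheme; `CapturedPage`, the `tag` field, and `assemble_virtual_notebook` are names invented for this example.

```python
from dataclasses import dataclass

@dataclass
class CapturedPage:
    source: str   # e.g. "notebook-A", "loose-sheet"
    page_no: int
    content: str  # recognized or raw ink, simplified to text for the example
    tag: str      # hypothetical keyword an assembly rule can match on

def assemble_virtual_notebook(pages: list[CapturedPage], tag: str) -> list[CapturedPage]:
    """Rule-based assembly: gather pages sharing a tag, regardless of physical source."""
    return sorted((p for p in pages if p.tag == tag), key=lambda p: (p.source, p.page_no))

captured = [
    CapturedPage("notebook-A", 12, "meeting notes", tag="project-x"),
    CapturedPage("loose-sheet", 1, "sketch of UI", tag="project-x"),
    CapturedPage("notebook-B", 3, "grocery list", tag="personal"),
]

virtual_notebook = assemble_virtual_notebook(captured, "project-x")
for page in virtual_notebook:
    print(page.source, page.page_no, page.content)
```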
  • Publication number: 20090052778
    Abstract: In a pen-based computing system, a printed version of a document having preexisting content is annotated using a smart pen. The smart pen captures handwriting gestures to obtain an electronic representation of the annotations. The smart pen computing system identifies a digital version of the document having the preexisting content and stores the electronic representation of the annotations in association with the digital document. The smart pen computing system may overlay the electronic representation of the annotations with the preexisting content to provide a digital representation of the annotated document.
    Type: Application
    Filed: May 29, 2008
    Publication date: February 26, 2009
    Inventors: Tracy L. Edgecomb, Andy Van Schaack, Jim Marggraff, Vinaitheerthan Meyyappan
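The entry above stores captured annotations in association with the identified digital document so they can be overlaid on its preexisting content. Below is a minimal sketch of that association, assuming annotations are keyed by document and page number; the class and method names are hypothetical.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class AnnotationStroke:
    points: list[tuple[float, float]]  # pen coordinates on the printed page

class AnnotatedDocument:
    """Stores annotation strokes against the matching page of the digital document."""

    def __init__(self, document_id: str):
        self.document_id = document_id
        self._annotations: dict[int, list[AnnotationStroke]] = defaultdict(list)

    def add_annotation(self, page: int, stroke: AnnotationStroke) -> None:
        self._annotations[page].append(stroke)

    def overlay(self, page: int) -> dict:
        """A render-ready description: the original page plus its annotation layer."""
        return {
            "document_id": self.document_id,
            "page": page,
            "annotations": self._annotations.get(page, []),
        }

doc = AnnotatedDocument("quarterly-report.pdf")
doc.add_annotation(3, AnnotationStroke([(0.2, 0.4), (0.6, 0.4)]))  # an underline
print(doc.overlay(3))
```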
  • Publication number: 20090027400
    Abstract: In a pen-based computing system, a microphone on the smart pen device records audio to produce audio data and a gesture capture system on the smart pen device records writing gestures to produce writing gesture data. Both the audio data and the writing gesture data include a time component. The audio data and writing gesture data are combined or synchronized according to their time components to create audio ink data. The audio ink data can be uploaded to a computer system attached to the smart pen device and displayed to a user through a user interface. The user makes a selection in the user interface to play the audio ink data, and the audio ink data is played back by animating the captured writing gestures and playing the recorded audio in synchronization.
    Type: Application
    Filed: May 29, 2008
    Publication date: January 29, 2009
    Inventors: Jim Marggraff, Tracy L. Edgecomb, Andy Van Schaack
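The "audio ink" described above combines timestamped gestures with recorded audio so playback can animate the writing in sync with the sound. The sketch below only walks a shared timeline to show which strokes would be drawn at each instant; `GestureSample`, `AudioInk`, and the one-second step are assumptions for illustration, and a real player would render ink while actually playing the audio.

```python
from dataclasses import dataclass

@dataclass
class GestureSample:
    t: float                     # seconds since the recording started
    point: tuple[float, float]   # pen position

@dataclass
class AudioInk:
    """Writing gestures and audio merged on a shared timeline."""
    gestures: list[GestureSample]
    audio_duration: float

    def frames(self, step: float = 1.0):
        """Yield, for each playback instant, the strokes drawn so far."""
        t = 0.0
        while t <= self.audio_duration:
            drawn = [g.point for g in self.gestures if g.t <= t]
            yield t, drawn
            t += step

ink = AudioInk(
    gestures=[GestureSample(0.5, (0, 0)), GestureSample(1.2, (1, 0)), GestureSample(2.8, (1, 1))],
    audio_duration=3.0,
)
for t, drawn in ink.frames():
    print(f"t={t:.1f}s  points drawn so far: {drawn}")
```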
  • Publication number: 20090021493
    Abstract: In a pen-based computing system, multi-modal data is transferred between a paper domain and a digital domain. Data initially generated in the paper domain is captured by a smart pen and a digital file including the captured data is generated. For example, a computing system coupled to the smart pen generates a digital file including the captured data. A paper representation of the digital file is subsequently generated. The digital file can subsequently be modified by editing the paper representation of the digital file using the smart pen. Edits to the paper representation of the digital file are captured by the smart pen and converted to the digital domain where they are used to edit the content of the digital file.
    Type: Application
    Filed: May 29, 2008
    Publication date: January 22, 2009
    Inventors: Jim Marggraff, Tracy L. Edgecomb
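The round trip described above captures data on paper, produces a digital file, prints a representation of that file, and then converts pen edits made on the printout back into changes to the digital file. The sketch below is a toy version of only the last step, under the assumption that each captured edit can be resolved to a line of the printout; all names and the "strike"/"insert" edit kinds are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalFile:
    lines: list[str] = field(default_factory=list)

@dataclass
class PaperEdit:
    """An edit captured by the pen on the printed representation."""
    line_no: int   # line of the printout the gesture was made on
    kind: str      # "insert" or "strike"
    text: str = "" # recognized handwriting for an insertion

def print_representation(doc: DigitalFile) -> list[str]:
    """Stand-in for generating the paper copy: numbered lines the pen can reference."""
    return [f"{i + 1}: {line}" for i, line in enumerate(doc.lines)]

def apply_paper_edits(doc: DigitalFile, edits: list[PaperEdit]) -> DigitalFile:
    """Convert captured pen edits back into changes to the digital file."""
    lines = list(doc.lines)
    for edit in sorted(edits, key=lambda e: e.line_no, reverse=True):
        if edit.kind == "strike":
            del lines[edit.line_no - 1]
        elif edit.kind == "insert":
            lines.insert(edit.line_no, edit.text)
    return DigitalFile(lines)

doc = DigitalFile(["Agenda", "Old item to remove", "Closing remarks"])
print(print_representation(doc))
edits = [PaperEdit(2, "strike"), PaperEdit(1, "insert", "New discussion topic")]
print(apply_paper_edits(doc, edits).lines)
```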
  • Publication number: 20090021494
    Abstract: In a pen-based computing system, a smart pen allows user interaction with the pen-based computing system using multiple modalities. Generally, the modalities are categorized as input (or command) modalities and output (or feedback) modalities. Examples of input modalities for the smart pen include writing with the smart pen to provide written input and/or speaking or otherwise providing sound to give audio input to the smart pen.
    Type: Application
    Filed: May 29, 2008
    Publication date: January 22, 2009
    Inventors: Jim Marggraff, Andy Van Schaack
  • Publication number: 20090024988
    Abstract: In a pen-based computing system, a user-specific smart pen application is created from a template application using customer authoring tools. The template application contains computer program code that is to be executed by a processor of a smart pen. Application content and a representation for printed content are received. The application content, provided by a user or customer, defines functional interactions between the printed content representation and a smart pen. The template application is combined with the application content to generate a user-specific application comprising instructions to be executed by a processor of a smart pen. The user-specific application is stored on a storage medium.
    Type: Application
    Filed: May 29, 2008
    Publication date: January 22, 2009
    Inventors: Tracy L. Edgecomb, Andy Van Schaack, Jim Marggraff
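The entry above combines a template application with customer-supplied content to produce a user-specific application. The sketch below shows the general pattern only, with a text template standing in for the pen's program code; `PrintedRegion`, `APP_TEMPLATE`, and `build_user_specific_app` are hypothetical names, not part of the patent or of Livescribe's tooling.

```python
from dataclasses import dataclass
from string import Template

@dataclass
class PrintedRegion:
    """A tappable area on the printed page, defined by the customer-supplied content."""
    name: str
    response: str  # what the pen should say or show when the region is tapped

# Template application: generic smart-pen behavior with a placeholder for content.
APP_TEMPLATE = Template(
    "def handle_tap(region_name):\n"
    "    responses = $responses\n"
    "    return responses.get(region_name, 'Unrecognized region')\n"
)

def build_user_specific_app(regions: list[PrintedRegion]) -> str:
    """Combine the template with customer content to produce a deployable app body."""
    responses = {r.name: r.response for r in regions}
    return APP_TEMPLATE.substitute(responses=repr(responses))

content = [
    PrintedRegion("capital_of_france", "Paris"),
    PrintedRegion("capital_of_japan", "Tokyo"),
]
print(build_user_specific_app(content))  # source to be stored and later loaded onto a pen
```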
  • Publication number: 20090021495
    Abstract: In a pen-based computing system, audio data is recorded from a microphone on the smart pen, and synchronized writing gesture data is generated from a gesture signal. Unsynchronized writing gesture data may also be received from another gesture signal. The audio data and the synchronized writing gesture data include a time component, enabling them to be synchronized. Message recipient data is received and a message is composed including the unsynchronized writing gesture data, the audio data, and the synchronized writing gesture data. The message is sent to a destination computing system based on the message recipient data. At the destination computing system, a user requests to play the message. The unsynchronized writing gesture data is displayed to the user, and the audio data is played as the synchronized writing gesture data is animated in sync.
    Type: Application
    Filed: May 29, 2008
    Publication date: January 22, 2009
    Inventors: Tracy L. Edgecomb, Jim Marggraff
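The message described above bundles audio, gesture data synchronized to that audio, and unsynchronized gesture data, and delivers it to a destination chosen from the recipient data. Below is a minimal sketch of that bundle and its playback order, with invented names (`PenMessage`, `TimedStroke`, `send`, `play`) and an in-memory dictionary standing in for the destination computing system.

```python
from dataclasses import dataclass, field

@dataclass
class TimedStroke:
    t: float                     # offset into the audio recording, in seconds
    point: tuple[float, float]

@dataclass
class PenMessage:
    recipient: str                # from the message recipient data
    audio: bytes                  # recorded audio payload
    synced_strokes: list[TimedStroke] = field(default_factory=list)            # animated with audio
    unsynced_strokes: list[tuple[float, float]] = field(default_factory=list)  # shown immediately

def send(message: PenMessage, outbox: dict[str, list[PenMessage]]) -> None:
    """Stand-in for delivery to the destination computing system."""
    outbox.setdefault(message.recipient, []).append(message)

def play(message: PenMessage) -> None:
    print(f"Showing static ink: {message.unsynced_strokes}")
    print("Playing audio while animating:")
    for stroke in sorted(message.synced_strokes, key=lambda s: s.t):
        print(f"  at {stroke.t:.1f}s draw {stroke.point}")

outbox: dict[str, list[PenMessage]] = {}
msg = PenMessage(
    recipient="alice@example.com",
    audio=b"...",
    synced_strokes=[TimedStroke(0.4, (0, 0)), TimedStroke(1.1, (1, 1))],
    unsynced_strokes=[(5, 5), (6, 5)],
)
send(msg, outbox)
play(outbox["alice@example.com"][0])
```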
  • Publication number: 20090000832
    Abstract: A “self-addressing” writing surface, such as paper, includes an encoded identifier that is uniquely associated with the recipient or a group of recipients. A pen-based computing system is used to capture writing made on the writing surface. The captured writing and the recipient identifier are sent electronically to a routing system, which identifies the recipient to which the content is to be routed based on the recipient identifier. The routing system forwards the message to the identified recipient, thereby enabling communication from the writer to the recipient associated with the writing surface.
    Type: Application
    Filed: May 29, 2008
    Publication date: January 1, 2009
    Inventors: Jim Marggraff, Alexander Sasha Pesic, Tracy L. Edgecomb
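The "self-addressing" surface above carries an encoded identifier that a routing system resolves to the recipient or group bound to that surface. The sketch below shows only that lookup-and-forward step, with a hard-coded routing table and invented identifiers; a real system would decode the identifier from the surface's dot pattern and deliver over a network.

```python
from dataclasses import dataclass

@dataclass
class CapturedNote:
    surface_id: str   # identifier encoded on the self-addressing writing surface
    ink: str          # captured writing, simplified to recognized text for the example

# Routing table: surface identifier -> recipient(s) registered for that surface.
SURFACE_RECIPIENTS = {
    "surface-0417": ["grandma@example.com"],
    "surface-0911": ["support@example.com", "archive@example.com"],
}

def route(note: CapturedNote, deliver=print) -> None:
    """Look up the recipients bound to the surface and forward the captured writing."""
    recipients = SURFACE_RECIPIENTS.get(note.surface_id)
    if not recipients:
        raise LookupError(f"No recipient registered for {note.surface_id}")
    for recipient in recipients:
        deliver(f"to {recipient}: {note.ink}")

route(CapturedNote("surface-0417", "Happy birthday! See you Sunday."))
```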