Patents by Inventor Tracy L. Edgecomb

Tracy L. Edgecomb has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Publication number: 20160154482
    Abstract: A method of selecting content using a pen-based computing system. Gestures generated by a user with a smart pen on a writing surface are captured and used to select content. The content can be written content or audio content. Optionally, additional content linked to the selected content is also selected.
    Type: Application
    Filed: November 20, 2015
    Publication date: June 2, 2016
    Inventors: Tracy L. Edgecomb, Andrew J. Van Schaack
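    A minimal sketch (illustrative only, not drawn from the patent text) of how such a selection might be resolved in code: strokes whose points fall inside the gestured region are selected, and any audio offsets linked to those strokes are optionally pulled in as well. All class and field names are assumptions.

        from dataclasses import dataclass
        from typing import List, Tuple

        @dataclass
        class Stroke:                      # a captured written stroke (hypothetical model)
            points: List[Tuple[float, float]]
            audio_offsets: List[float]     # offsets (seconds) of linked audio, if any

        def inside(rect, point):
            (x0, y0, x1, y1), (x, y) = rect, point
            return x0 <= x <= x1 and y0 <= y <= y1

        def select_content(strokes, lasso_rect, include_linked=True):
            """Return strokes whose points fall in the gestured region,
            plus the audio offsets linked to them (optional)."""
            selected, linked_audio = [], []
            for s in strokes:
                if any(inside(lasso_rect, p) for p in s.points):
                    selected.append(s)
                    if include_linked:
                        linked_audio.extend(s.audio_offsets)
            return selected, sorted(set(linked_audio))
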
  • Publication number: 20160124702
    Abstract: A system and method for identifying temporal positions in audio data and accessing the identified temporal positions are disclosed. “Audio bookmarks” are created by using various types of input, such as accessing a printed control with a smart pen, providing a voice command to the smart pen or providing a written command to the smart pen to identify temporal positions within the audio data. Alternatively, one or more rules are applied to audio data by a pen-based computing system to identify temporal positions in the audio data. The audio bookmarks are associated with one or more visual, auditory or tactile indicators showing the location of the audio bookmarks in the audio data. When an indicator is accessed, a portion of the audio data is played beginning from the temporal position associated with the accessed indicator. Additional data, such as written data, may also be associated with an indicator.
    Type: Application
    Filed: October 6, 2015
    Publication date: May 5, 2016
    Inventors: Tracy L. Edgecomb, James L. Marggraff, Alexander Sasha Pesic
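    A minimal sketch, assuming hypothetical data structures, of how audio bookmarks like these might be created from a user action or a rule and then used to start playback at the bookmarked position. Nothing here is taken from the patent itself.

        from dataclasses import dataclass

        @dataclass
        class AudioBookmark:
            position_s: float              # temporal position in the recording
            note: str = ""                 # optional written data attached to the bookmark

        def bookmark_from_tap(current_position_s):
            # user tapped a printed control (or spoke/wrote a command) while recording
            return AudioBookmark(position_s=current_position_s)

        def bookmarks_from_rule(silence_spans, min_gap_s=2.0):
            # rule-based alternative: bookmark the end of every long silence
            return [AudioBookmark(position_s=end) for start, end in silence_spans
                    if end - start >= min_gap_s]

        def play_from(bookmark, audio_samples, seconds_per_sample=1 / 16000):
            start = int(bookmark.position_s / seconds_per_sample)
            return audio_samples[start:]   # hand this slice to the playback device
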
  • Patent number: 9250718
    Abstract: A “self-addressing” writing surface, such as paper, includes an encoded identifier that is uniquely associated with the recipient or a group of recipients. A pen-based computing system is used to capture writing made on the writing surface. The captured writing and the recipient identifier are sent electronically to a routing system, which identifies the recipient to which the content is to be routed based on the recipient identifier. The routing system forwards the message to the identified recipient, thereby enabling communication from the writer to the recipient associated with the writing surface.
    Type: Grant
    Filed: May 29, 2008
    Date of Patent: February 2, 2016
    Assignee: Livescribe, Inc.
    Inventors: James L. Marggraff, Alexander Sasha Pesic, Tracy L. Edgecomb
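    A minimal sketch of the routing idea, with an assumed in-memory routing table standing in for the routing system; the identifiers and endpoints are invented for illustration.

        # Hypothetical routing table: encoded recipient IDs printed on the paper
        # map to delivery endpoints.
        ROUTES = {
            "recipient-001": ["alice@example.com"],
            "group-042":     ["team-a@example.com", "team-b@example.com"],
        }

        def deliver(endpoint, payload):
            print(f"forwarding {len(payload)} bytes of ink data to {endpoint}")

        def route_captured_writing(recipient_id, captured_ink):
            endpoints = ROUTES.get(recipient_id)
            if endpoints is None:
                raise KeyError(f"unknown recipient identifier: {recipient_id}")
            for endpoint in endpoints:
                deliver(endpoint, captured_ink)     # e.g. email, upload, or push
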
  • Publication number: 20140347328
    Abstract: A method of selecting content using a pen-based computing system. Gestures generated by a user with a smart pen on a writing surface are captured and used to select content. The content can be written content or audio content. Optionally, additional content linked to the selected content is also selected.
    Type: Application
    Filed: May 23, 2012
    Publication date: November 27, 2014
    Applicant: Livescribe
    Inventors: Tracy L. Edgecomb, Andrew J. Van Schaack
  • Patent number: 8842100
    Abstract: In a pen-based computing system, a user-specific smart pen application is created from a template application using customer authoring tools. The template application contains computer program code that is to be executed by a processor of a smart pen. Application content and a representation for printed content are received. The application content, provided by the user or customer, defines functional interactions between the printed content representation and a smart pen. The template application is combined with the application content to generate a user-specific application comprising instructions to be executed on a processor of a smart pen. The user-specific application is stored on a storage medium.
    Type: Grant
    Filed: December 23, 2013
    Date of Patent: September 23, 2014
    Assignee: Livescribe Inc.
    Inventors: Tracy L. Edgecomb, Andrew J. Van Schaack, James L. Marggraff
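    A minimal sketch, under the assumption that the template and application content can be represented as plain strings and dictionaries, of combining the two into a stored user-specific bundle. Names and the file format are illustrative only.

        import json

        def build_user_app(template_code: str, app_content: dict) -> dict:
            """Combine a generic template with customer-supplied content that maps
            printed regions to behaviors, yielding a deployable app bundle."""
            return {
                "code": template_code,                 # program code executed on the pen
                "regions": app_content["regions"],     # printed-content coordinates
                "actions": app_content["actions"],     # interaction -> behavior mapping
            }

        def store(app: dict, path: str):
            with open(path, "w") as fh:
                json.dump(app, fh, indent=2)

        # usage
        app = build_user_app(
            template_code="def on_tap(region): play(actions[region])",
            app_content={"regions": {"r1": [0, 0, 100, 40]}, "actions": {"r1": "hello.wav"}},
        )
        store(app, "user_app.json")
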
  • Publication number: 20140111489
    Abstract: In a pen-based computing system, a user-specific smart pen application is created from a template application using customer authoring tools. The template application contains computer program code that is to be executed by a processor of a smart pen. Application content and a representation for printed content are received. The application content, provided by the user or customer, defines functional interactions between the printed content representation and a smart pen. The template application is combined with the application content to generate a user-specific application comprising instructions to be executed on a processor of a smart pen. The user-specific application is stored on a storage medium.
    Type: Application
    Filed: December 23, 2013
    Publication date: April 24, 2014
    Applicant: Livescribe Inc.
    Inventors: Tracy L. Edgecomb, Andrew J. Van Schaack, James L. Marggraff
  • Patent number: 8638319
    Abstract: In a pen-based computing system, a user-specific smart pen application is created from a template application using customer authoring tools. The template application contains computer program code that is to be executed by a processor of a smart pen. Application content and a representation for printed content are received. The application content, provided by the user or customer, defines functional interactions between the printed content representation and a smart pen. The template application is combined with the application content to generate a user-specific application comprising instructions to be executed on a processor of a smart pen. The user-specific application is stored on a storage medium.
    Type: Grant
    Filed: May 29, 2008
    Date of Patent: January 28, 2014
    Assignee: Livescribe Inc.
    Inventors: Tracy L. Edgecomb, Andy Van Schaack, Jim Marggraff
  • Patent number: 8446298
    Abstract: Embodiments of the invention present a system and method for controlling audio capture by a smart pen based computing system. An audio capture mechanism that is independent from the gesture capture system is included on the smart pen to control audio capture by one or more microphones included on the smart pen. In one embodiment, the audio capture mechanism comprises a shared function button, such as a power button. For example, a user interaction with the shared function button initiates audio capture by the one or more microphones on the smart pen, and a second user interaction with the shared function button stops audio capture. Alternatively, audio capture is stopped after completion of a predefined time interval following the user interaction with the shared function button.
    Type: Grant
    Filed: March 31, 2009
    Date of Patent: May 21, 2013
    Assignee: LiveScribe, Inc.
    Inventors: Jim Marggraff, Tracy L. Edgecomb
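    A minimal sketch of the shared-button behavior described above: one press starts capture, a second press stops it, and a time-out provides the alternative stop. The class and field names are assumptions, not the patent's implementation.

        import time

        class SharedButtonRecorder:
            """Toggle audio capture with a button that also serves another function
            (e.g. power). A second press stops capture; otherwise capture stops
            after a fixed interval."""
            def __init__(self, max_duration_s=600):
                self.recording = False
                self.started_at = None
                self.max_duration_s = max_duration_s

            def on_button_press(self):
                if not self.recording:
                    self.recording, self.started_at = True, time.monotonic()
                else:
                    self.recording = False             # second press stops capture

            def tick(self):
                # called periodically; enforces the time-out alternative
                if self.recording and time.monotonic() - self.started_at >= self.max_duration_s:
                    self.recording = False
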
  • Patent number: 8446297
    Abstract: Embodiments of the invention present a system and method for identifying relationships between different types of data captured by a pen-based computing system, such as a smart pen. The pen-based computing system generates one or more sessions including different types of data that are associated with each other. In one embodiment, the pen-based computing system generates an index file including captured audio data and written data, where the written data is associated with a temporal location of the audio data corresponding to the time the written data was captured. For example, the pen-based computing system applies one or more heuristic processes to the received data to identify relationships between various types of the received data, which are used to associate different types of data with each other.
    Type: Grant
    Filed: March 31, 2009
    Date of Patent: May 21, 2013
    Assignee: LiveScribe, Inc.
    Inventors: Jim Marggraff, Erica Leverett, Tracy L. Edgecomb, Alexander Sasha Pesic
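    A minimal sketch of one possible heuristic of this kind: time-stamped audio and writing events are grouped into sessions wherever a large temporal gap appears, and written events are then indexed against the session's audio. The threshold and structures are assumptions.

        def build_sessions(events, max_gap_s=120):
            """Group time-stamped events into sessions: a new session starts whenever
            the gap between consecutive events exceeds max_gap_s.
            `events` are (timestamp_s, kind, payload) tuples."""
            sessions, current = [], []
            for event in sorted(events, key=lambda e: e[0]):
                if current and event[0] - current[-1][0] > max_gap_s:
                    sessions.append(current)
                    current = []
                current.append(event)
            if current:
                sessions.append(current)
            return sessions

        def index_writing_against_audio(session, audio_start_s):
            # associate each written event with its offset into the session's audio
            return [(payload, ts - audio_start_s)
                    for ts, kind, payload in session if kind == "write"]
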
  • Patent number: 8427344
    Abstract: Paper-based playback of media can be performed by electronically recording handwritten notes, including pen strokes, using a digital pen on position-coded paper. A plurality of bounding areas is identified, e.g., by generating bounding areas around the pen strokes as they are developed. Each bounding area is provided with a time stamp that indexes a media file recorded simultaneously with the handwritten notes. By placing the digital pen close to a handwritten note on the paper, the part of the media that was recorded when the specific note was written can be recalled. More specifically, the bounding area corresponding to the position of the digital pen is identified and the associated time stamp is used to find the media to recall. This paper-based playback of media can be performed in, e.g., a stand-alone device or by a combination of a digital pen and a mobile phone.
    Type: Grant
    Filed: June 1, 2007
    Date of Patent: April 23, 2013
    Assignee: Anoto AB
    Inventors: James Marggraff, Tracy L. Edgecomb, Gabriel Acosta-Mikulasek, Dan Gärdenfors, Anders Svensson
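    A minimal sketch, with invented data structures, of the lookup step: the bounding area under the pen position yields a time stamp, which is handed to a playback seek function.

        from dataclasses import dataclass
        from typing import List, Tuple

        @dataclass
        class BoundingArea:
            rect: Tuple[float, float, float, float]   # x0, y0, x1, y1 around pen strokes
            timestamp_s: float                         # index into the media recording

        def area_at(pen_x, pen_y, areas: List[BoundingArea]):
            for area in areas:
                x0, y0, x1, y1 = area.rect
                if x0 <= pen_x <= x1 and y0 <= pen_y <= y1:
                    return area
            return None

        def recall_media(pen_x, pen_y, areas, seek):
            hit = area_at(pen_x, pen_y, areas)
            if hit is not None:
                seek(hit.timestamp_s)      # play the media from when the note was written
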
  • Patent number: 8416218
    Abstract: In a pen-based computing system, multi-modal data is transferred between a paper domain and a digital domain. Data initially generated in the paper domain is captured by a smart pen and a digital file including the captured data is generated. For example, a computing system coupled to the smart pen generates a digital file including the captured data. A paper representation of the digital file is subsequently generated. The digital file can subsequently be modified by editing the paper representation of the digital file using the smart pen. Edits to the paper representation of the digital file are captured by the smart pen and converted to the digital domain where they are used to edit the content of the digital file.
    Type: Grant
    Filed: May 29, 2008
    Date of Patent: April 9, 2013
    Assignee: Livescribe, Inc.
    Inventors: Jim Marggraff, Tracy L. Edgecomb
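    A minimal sketch of the round trip, assuming the digital file can be modeled as blocks addressed by the position codes printed on its paper representation; the structures shown are illustrative, not the patent's.

        # Hypothetical digital file: blocks keyed by the page and block codes that
        # appear on the printed representation.
        digital_file = {
            ("page-3", "block-2"): "original captured text",
        }

        def apply_paper_edit(digital_file, page_code, block_code, recognized_edit):
            """An edit captured by the pen on the printed copy is mapped back to the
            digital domain and applied to (or added as) the matching block."""
            digital_file[(page_code, block_code)] = recognized_edit

        apply_paper_edit(digital_file, "page-3", "block-2",
                         "corrected text written on the printed copy")
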
  • Patent number: 8374992
    Abstract: In a pen-based computing system, use of paradigms similar to those used with physical paper to organize user-generated content captured by a smart pen is disclosed. Data, such as handwriting gestures, is captured by the smart pen and transferred to a digital domain, such as by being transferred to a computing system. Once in the digital domain, the captured content is organized as virtual pages or virtual notebooks. Hence, content captured from various sources, such as different notebooks or different physical pieces of paper, is assembled into a virtual page or virtual notebook. User input or automatic application of rules can be used to assemble the captured content into a virtual page or virtual notebook.
    Type: Grant
    Filed: May 29, 2008
    Date of Patent: February 12, 2013
    Assignee: Livescribe, Inc.
    Inventors: Vinaitheerthan Meyyappan, Jim Marggraff, Tracy L. Edgecomb, Andy Van Schaack
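    A minimal sketch of assembling captured pages into virtual notebooks under a simple grouping rule; the rule shown (grouping by a written subject tag) is only an assumed example of the rules or user input the abstract mentions.

        from collections import defaultdict

        def assemble_virtual_notebooks(captured_pages, rule):
            """Group content captured from different physical pages into virtual
            notebooks. `rule` decides the destination notebook for each page and
            could be replaced by explicit user input."""
            notebooks = defaultdict(list)
            for page in captured_pages:
                notebooks[rule(page)].append(page)
            return dict(notebooks)

        # example rule: group by a subject tag written on each physical page
        pages = [
            {"source": "notebook-A", "page": 12, "tag": "chemistry"},
            {"source": "loose-sheet", "page": 1, "tag": "chemistry"},
            {"source": "notebook-B", "page": 3, "tag": "meetings"},
        ]
        virtual = assemble_virtual_notebooks(pages, rule=lambda p: p["tag"])
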
  • Patent number: 8358309
    Abstract: In a pen-based computing system, a microphone on the smart pen device records audio to produce audio data and a gesture capture system on the smart pen device records writing gestures to produce writing gesture data. Both the audio data and the writing gesture data include a time component. The audio data and writing gesture data are combined or synchronized according to their time components to create audio ink data. The audio ink data can be uploaded to a computer system attached to the smart pen device and displayed to a user through a user interface. The user makes a selection in the user interface to play the audio ink data, and the audio ink data is played back by animating the captured writing gestures and playing the recorded audio in synchronization.
    Type: Grant
    Filed: April 4, 2012
    Date of Patent: January 22, 2013
    Assignee: Livescribe, Inc.
    Inventors: Jim Marggraff, Tracy L. Edgecomb, Andy Van Schaack
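    A minimal sketch of combining the two time-stamped streams into "audio ink" and replaying them in order; the stream formats and callback names are assumptions.

        def build_audio_ink(gesture_samples, audio_markers):
            """Merge time-stamped gesture samples and audio markers into one
            timeline ordered by their shared time component ("audio ink")."""
            gestures = [(t, "gesture", g) for t, g in gesture_samples]
            audio = [(t, "audio", a) for t, a in audio_markers]
            return sorted(gestures + audio, key=lambda e: e[0])

        def play(audio_ink, draw, play_audio):
            # replay: animate strokes and trigger audio in the original order
            for t, kind, payload in audio_ink:
                (draw if kind == "gesture" else play_audio)(t, payload)
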
  • Patent number: 8265382
    Abstract: In a pen-based computing system, a printed version of a document having preexisting content is annotated using a smart pen. The smart pen captures handwriting gestures to obtain an electronic representation of the annotations. The smart pen computing system identifies a digital version of the document having the preexisting content and stores the electronic representation of the annotations in association with the digital document. The smart pen computing system may overlay the electronic representation of the annotations with the preexisting content to provide a digital representation of the annotated document.
    Type: Grant
    Filed: May 29, 2008
    Date of Patent: September 11, 2012
    Assignee: Livescribe, Inc.
    Inventors: Tracy L. Edgecomb, Andy Van Schaack, Jim Marggraff, Vinaitheerthan Meyyappan
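    A minimal sketch of overlaying captured annotations on the matching digital document, assuming both are keyed by page number; the data model is invented for illustration.

        def overlay_annotations(digital_doc, annotations):
            """Produce a per-page view combining preexisting content with the
            electronic annotations captured by the pen.
            digital_doc: {page_number: preexisting content}
            annotations: list of (page_number, strokes) pairs."""
            overlay = {}
            for page, content in digital_doc.items():
                page_strokes = [s for p, s in annotations if p == page]
                overlay[page] = {"content": content, "annotations": page_strokes}
            return overlay
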
  • Publication number: 20120194523
    Abstract: In a pen-based computing system, a microphone on the smart pen device records audio to produce audio data and a gesture capture system on the smart pen device records writing gestures to produce writing gesture data. Both the audio data and the writing gesture data include a time component. The audio data and writing gesture data are combined or synchronized according to their time components to create audio ink data. The audio ink data can be uploaded to a computer system attached to the smart pen device and displayed to a user through a user interface. The user makes a selection in the user interface to play the audio ink data, and the audio ink data is played back by animating the captured writing gestures and playing the recorded audio in synchronization.
    Type: Application
    Filed: April 4, 2012
    Publication date: August 2, 2012
    Applicant: Livescribe, Inc.
    Inventors: Jim Marggraff, Tracy L. Edgecomb, Andy Van Schaack
  • Patent number: 8194081
    Abstract: In a pen-based computing system, a microphone on the smart pen device records audio to produce audio data and a gesture capture system on the smart pen device records writing gestures to produce writing gesture data. Both the audio data and the writing gesture data include a time component. The audio data and writing gesture data are combined or synchronized according to their time components to create audio ink data. The audio ink data can be uploaded to a computer system attached to the smart pen device and displayed to a user through a user interface. The user makes a selection in the user interface to play the audio ink data, and the audio ink data is played back by animating the captured writing gestures and playing the recorded audio in synchronization.
    Type: Grant
    Filed: May 29, 2008
    Date of Patent: June 5, 2012
    Assignee: Livescribe, Inc.
    Inventors: Jim Marggraff, Tracy L. Edgecomb, Andy Van Schaack
  • Publication number: 20110279415
    Abstract: A method and system for implementing a user interface for a device through user-created graphical elements. The method includes recognizing a graphical element icon created by a user. Once recognized, a function related to the graphical element icon is accessed and an output in accordance with the function is provided. The function is persistently associated with the graphical element icon. Menu selection and navigation are implemented through interaction with the graphical element icon. A listing of options associated with the graphical element icon is audibly rendered. In response to a selection of one of the options, the selected option is invoked.
    Type: Application
    Filed: November 9, 2010
    Publication date: November 17, 2011
    Applicant: LeapFrog Enterprises, Inc.
    Inventors: James Marggraff, Alexander Chisholm, Tracy L. Edgecomb
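    A minimal sketch of a persistently bound graphical element icon with an audible option menu; the class, its methods, and the example options are assumptions, not the patented implementation.

        class GraphicalElementIcon:
            """A user-drawn icon that stays persistently bound to a function and
            exposes an audible menu of options."""
            def __init__(self, name, options, speak):
                self.name, self.options, self.speak = name, options, speak
                self.cursor = 0

            def announce_next_option(self):       # e.g. repeated taps scroll the menu
                self.speak(self.options[self.cursor])
                self.cursor = (self.cursor + 1) % len(self.options)

            def select_current(self):
                chosen = self.options[(self.cursor - 1) % len(self.options)]
                return f"invoking {self.name}: {chosen}"

        calc = GraphicalElementIcon("calculator", ["add", "subtract"], speak=print)
        calc.announce_next_option()   # audibly renders "add" (here, just printed)
        print(calc.select_current())  # invokes the option that was just announced
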
  • Patent number: 7936339
    Abstract: A device user interface in which computer functionality is invoked by user interaction with dynamically generated interface regions of a writing surface. A computer system identifies a marking written on the writing surface or a user selection of an existing written marking. Adjacent to the marking, the computer system automatically generates one or more interface regions associated with the marking. User interaction with one of these regions automatically invokes computer functionality related to the interacted region. A different function may be invoked by each region. The computer system dynamically positions and may dynamically size the interface regions based on the position (and size) of the marking. Multiple markings yield multiple regions, with different regions associated with respective markings. In one embodiment, the regions are established in front of and/or after a written word. Regions may also be established on top of and/or below the written word, for example.
    Type: Grant
    Filed: November 1, 2005
    Date of Patent: May 3, 2011
    Assignee: LeapFrog Enterprises, Inc.
    Inventors: James Marggraff, Tracy L. Edgecomb, Teresa Cameron, Nicole Wrubel, Steve Baldzikowski
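    A minimal sketch of generating interface regions around a marking's bounding box and dispatching a function per region; the coordinates, padding, and handler names are assumptions.

        def regions_for_marking(bounds, pad=20):
            """Generate tappable regions positioned relative to a written marking's
            bounding box (in front of, after, above, below); each region can carry
            a different function."""
            x0, y0, x1, y1 = bounds
            return {
                "before": (x0 - pad, y0, x0, y1),
                "after":  (x1, y0, x1 + pad, y1),
                "above":  (x0, y0 - pad, x1, y0),
                "below":  (x0, y1, x1, y1 + pad),
            }

        def dispatch(tap_x, tap_y, regions, handlers):
            for name, (rx0, ry0, rx1, ry1) in regions.items():
                if rx0 <= tap_x <= rx1 and ry0 <= tap_y <= ry1:
                    return handlers[name]()        # invoke that region's function
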
  • Patent number: 7831933
    Abstract: A method and system for implementing a user interface for a device through user-created graphical elements. The method includes recognizing a graphical element icon created by a user. Once recognized, a function related to the graphical element icon is accessed and an output in accordance with the function is provided. The function is persistently associated with the graphical element icon. Menu selection and navigation are implemented through interaction with the graphical element icon. A listing of options associated with the graphical element icon is audibly rendered. In response to a selection of one of the options, the selected option is invoked.
    Type: Grant
    Filed: January 12, 2005
    Date of Patent: November 9, 2010
    Assignee: LeapFrog Enterprises, Inc.
    Inventors: James Marggraff, Alexander Chisholm, Tracy L. Edgecomb
  • Publication number: 20100039296
    Abstract: Paper-based playback of media can be performed by electronically recording handwritten notes, including pen strokes, using a digital pen on position-coded paper. A plurality of bounding areas is identified, e.g., by generating bounding areas around the pen strokes as they are developed. Each bounding area is provided with a time stamp that indexes a media file recorded simultaneously with the handwritten notes. By placing the digital pen close to a handwritten note on the paper, the part of the media that was recorded when the specific note was written can be recalled. More specifically, the bounding area corresponding to the position of the digital pen is identified and the associated time stamp is used to find the media to recall. This paper-based playback of media can be performed in, e.g., a stand-alone device or by a combination of a digital pen and a mobile phone.
    Type: Application
    Filed: June 1, 2007
    Publication date: February 18, 2010
    Inventors: James Marggraff, Tracy L. Edgecomb, Gabriel Acosta-Mikulasek, Dan Gärdenfors, Anders Svensson