Patents by Inventor Lucas R. A. Ives

Lucas R. A. Ives has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11586936
    Abstract: Systems and methods for both technical and non-technical users to create content for interactive synthetic characters are provided. In some embodiments, a conversation editor may be configured to create a traversable script for an interactive synthetic character by receiving a set of conversation rules from a user. These rules can be used to match certain words or phrases that a user speaks or types, or to monitor for a physical movement of the user or synthetic character. Each conversation rule can include responses to be performed by the interactive synthetic character. The responses can include, for example, producing audible or textual speech for the synthetic character, performing one or more animations, playing one or more sound effects, retrieving data from one or more data sources, and the like. A traversable script can be generated from the set of conversation rules that, when executed by the synthetic character, allows for dynamic interactions.
    Type: Grant
    Filed: January 18, 2019
    Date of Patent: February 21, 2023
    Assignee: Chatterbox Capital LLC
    Inventors: Martin Reddy, Oren M. Jacob, Robert G. Podesta, Lucas R. Ives, Kathleen Hale
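The conversation-editor mechanism this abstract describes can be sketched briefly. Everything below is an illustrative assumption (class and function names, the tuple-based response format), not the patented implementation: rules match trigger phrases in a user's utterance, and the rule set compiles into a traversable script that maps an utterance to the character's responses.

```python
from dataclasses import dataclass

@dataclass
class ConversationRule:
    """One authored rule: trigger phrases plus character responses.

    A rule fires when any trigger phrase appears in the user's
    normalized utterance; responses are actions such as speech,
    animations, or sound effects.
    """
    triggers: list   # words/phrases to match in user input
    responses: list  # e.g. ("say", ...), ("animate", ...), ("play_sound", ...)

    def matches(self, utterance: str) -> bool:
        text = utterance.lower()
        return any(t.lower() in text for t in self.triggers)

def build_script(rules):
    """Compile a rule set into a traversable script: a callable that
    maps a user utterance to the responses the character performs."""
    def traverse(utterance):
        for rule in rules:
            if rule.matches(utterance):
                return rule.responses
        return [("say", "I'm not sure what you mean.")]  # fallback response
    return traverse

# A non-technical author supplies rules; the character executes the script.
script = build_script([
    ConversationRule(["hello", "hi"], [("say", "Hi there!"), ("animate", "wave")]),
    ConversationRule(["song"], [("play_sound", "theme.wav"), ("animate", "dance")]),
])
assert script("Hello!") == [("say", "Hi there!"), ("animate", "wave")]
```

In this sketch the "script" is just a matching function; the abstracts below add the idea of navigating between script nodes with a state engine.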
  • Publication number: 20190385065
    Abstract: Systems and methods for both technical and non-technical users to create content for interactive synthetic characters are provided. In some embodiments, a conversation editor may be configured to create a traversable script for an interactive synthetic character by receiving a set of conversation rules from a user. These rules can be used to match certain words or phrases that a user speaks or types, or to monitor for a physical movement of the user or synthetic character. Each conversation rule can include responses to be performed by the interactive synthetic character. The responses can include, for example, producing audible or textual speech for the synthetic character, performing one or more animations, playing one or more sound effects, retrieving data from one or more data sources, and the like. A traversable script can be generated from the set of conversation rules that, when executed by the synthetic character, allows for dynamic interactions.
    Type: Application
    Filed: January 18, 2019
    Publication date: December 19, 2019
    Inventors: Martin Reddy, Oren M. Jacob, Robert G. Podesta, Lucas R. A. Ives, Kathleen Hale
  • Patent number: 10223636
    Abstract: Systems and methods to create content for interactive synthetic characters are provided. In some embodiments, a conversation editor may be configured to create a traversable script for an interactive synthetic character by receiving conversation rules from a user. These rules can be used to match words or phrases that a user speaks or types, or to monitor for a physical movement of the user or synthetic character. Each rule can include responses to be performed by the interactive synthetic character. Examples of responses include producing audible or textual speech for the synthetic character, performing animations, playing sound effects, retrieving data, and the like. A traversable script can be generated from the conversation rules that, when executed by the synthetic character, allows for dynamic interactions. In some embodiments, the traversable script can be navigated by a state engine using navigational directives associated with the conversation rules.
    Type: Grant
    Filed: July 25, 2012
    Date of Patent: March 5, 2019
    Assignee: PULLSTRING, INC.
    Inventors: Martin Reddy, Oren M. Jacob, Robert G. Podesta, Lucas R. A. Ives, Kathleen Hale
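This abstract adds a state engine that navigates the traversable script using navigational directives attached to the rules. A minimal sketch, assuming a dictionary-of-nodes layout and a `goto:` directive syntax (both illustrative, not the patented data model):

```python
# Each node holds rules of the form (trigger, responses, directive).
SCRIPT = {
    "start": {
        "rules": [
            ("yes", ["say: Great, let's begin!"], "goto:quiz"),
            ("no",  ["say: Maybe later."],        "stay"),
        ],
    },
    "quiz": {
        "rules": [("red", ["say: Red it is!"], "goto:start")],
    },
}

def step(state, utterance):
    """Run one turn of the state engine: match a rule in the current
    node, collect its responses, and follow its navigational directive
    to determine the next state."""
    for trigger, responses, directive in SCRIPT[state]["rules"]:
        if trigger in utterance.lower():
            next_state = directive[5:] if directive.startswith("goto:") else state
            return next_state, responses
    return state, ["say: Hmm?"]  # no rule matched; remain in place

state, out = step("start", "Yes please")
assert state == "quiz" and out == ["say: Great, let's begin!"]
```

The navigational directive is what turns a flat rule list into a traversable graph: authors write local rules, and the engine carries the conversation from node to node.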
  • Patent number: 9454838
    Abstract: Various of the disclosed embodiments relate to systems and methods for providing animated multimedia, e.g., animated shows, to an audience over a network. Particularly, some embodiments provide systems and methods for generating and providing audio, animation, and other experience-related information so that users may readily experience the content in a seamless manner (e.g., as an audience member watching a show, playing a video game, etc.). Various embodiments animate “to the audience” based, e.g., on what content the audience is consuming. The animations may be generated in real time from constituent components and assets in response to user behavior.
    Type: Grant
    Filed: May 28, 2014
    Date of Patent: September 27, 2016
    Assignee: PULLSTRING, INC.
    Inventors: Michael Chann, Jon Collins, Benjamin Morse, Lucas R. A. Ives, Martin Reddy, Oren M. Jacob
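The core idea in this abstract, animating "to the audience" by assembling frames in real time from constituent components, can be sketched as a simple asset-selection step. The asset table and selection heuristic below are illustrative assumptions only:

```python
# Constituent animation components keyed by audience context.
ASSETS = {
    "sports":  {"gesture": "cheer", "backdrop": "stadium"},
    "cooking": {"gesture": "stir",  "backdrop": "kitchen"},
    "default": {"gesture": "idle",  "backdrop": "stage"},
}

def compose_frame(audience_context: str) -> dict:
    """Assemble a frame description at runtime from components chosen
    to match what the audience is currently consuming, falling back to
    neutral assets for unknown contexts."""
    parts = ASSETS.get(audience_context, ASSETS["default"])
    return {"gesture": parts["gesture"], "backdrop": parts["backdrop"]}

assert compose_frame("sports") == {"gesture": "cheer", "backdrop": "stadium"}
```

The point of composing per-audience at runtime, rather than streaming pre-rendered video, is that the same component library can yield different shows for different viewers.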
  • Patent number: 9443337
    Abstract: Various of the disclosed embodiments relate to systems and methods for providing animated multimedia, e.g., animated shows, to an audience over a network. Particularly, some embodiments provide systems and methods for generating and providing audio, animation, and other experience-related information so that users may readily experience the content in a seamless manner (e.g., as an audience member watching a show, playing a video game, etc.). Various embodiments animate “to the audience” based, e.g., on what content the audience is consuming. The animations may be generated in real time from constituent components and assets in response to user behavior.
    Type: Grant
    Filed: May 28, 2014
    Date of Patent: September 13, 2016
    Assignee: PULLSTRING, INC.
    Inventors: Michael Chann, Jon Collins, Benjamin Morse, Lucas R. A. Ives, Martin Reddy, Oren M. Jacob
  • Publication number: 20150062131
    Abstract: Various of the disclosed embodiments relate to systems and methods for providing animated multimedia, e.g., animated shows, to an audience over a network. Particularly, some embodiments provide systems and methods for generating and providing audio, animation, and other experience-related information so that users may readily experience the content in a seamless manner (e.g., as an audience member watching a show, playing a video game, etc.). Various embodiments animate “to the audience” based, e.g., on what content the audience is consuming. The animations may be generated in real time from constituent components and assets in response to user behavior.
    Type: Application
    Filed: May 28, 2014
    Publication date: March 5, 2015
    Inventors: Michael Chann, Jon Collins, Benjamin Morse, Lucas R. A. Ives, Martin Reddy, Oren M. Jacob
  • Publication number: 20150062132
    Abstract: Various of the disclosed embodiments relate to systems and methods for providing animated multimedia, e.g., animated shows, to an audience over a network. Particularly, some embodiments provide systems and methods for generating and providing audio, animation, and other experience-related information so that users may readily experience the content in a seamless manner (e.g., as an audience member watching a show, playing a video game, etc.). Various embodiments animate “to the audience” based, e.g., on what content the audience is consuming. The animations may be generated in real time from constituent components and assets in response to user behavior.
    Type: Application
    Filed: May 28, 2014
    Publication date: March 5, 2015
    Inventors: Michael Chann, Jon Collins, Benjamin Morse, Lucas R. A. Ives, Martin Reddy, Oren M. Jacob
  • Publication number: 20140278403
    Abstract: Various of the disclosed embodiments concern systems and methods for conversation-based human-computer interactions. In some embodiments, the system includes a plurality of interactive scenes. A user may access each scene and engage in conversation with a synthetic character regarding an activity associated with the active scene. In certain embodiments, a central server may house a plurality of waveforms associated with the synthetic character's speech, and may dynamically deliver the waveforms to a user device in conjunction with the operation of an artificial intelligence. In other embodiments, the character's speech is generated using a text-to-speech system.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 18, 2014
    Applicant: TOYTALK, INC.
    Inventors: Oren M. Jacob, Martin Reddy, Lucas R. A. Ives, Robert G. Podesta
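The speech-delivery choice in this abstract, serve a stored waveform when the server has one, otherwise synthesize the line with text-to-speech, can be sketched as a simple fallback. The in-memory store and the `synthesize()` stub are assumptions for illustration, not the patented system:

```python
# Stand-in for the central server's waveform repository: bytes keyed by line id.
WAVEFORM_STORE = {
    "greeting": b"<riff-bytes-for-greeting>",  # placeholder for real audio data
}

def synthesize(text: str) -> bytes:
    """Stub TTS: a real system would invoke a speech synthesizer here."""
    return ("tts:" + text).encode()

def speech_for(line_id: str, text: str) -> bytes:
    """Return audio for a character line: the pre-recorded waveform if
    the store has one, otherwise dynamically generated TTS audio."""
    return WAVEFORM_STORE.get(line_id) or synthesize(text)

assert speech_for("greeting", "Hello!") == b"<riff-bytes-for-greeting>"
```

The trade-off the two embodiments reflect: recorded waveforms give a directed vocal performance, while TTS covers lines that were never recorded.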
  • Publication number: 20140272827
    Abstract: Various of the disclosed embodiments relate to systems and methods for managing a vocal performance. In some embodiments, a central hosting server may maintain a repository of speech text, waveforms, and metadata supplied by a plurality of development team members. The central hosting server may facilitate modification of the metadata and collaborative commentary procedures so that the development team members may generate higher quality voice assets more efficiently.
    Type: Application
    Filed: March 14, 2013
    Publication date: September 18, 2014
    Applicant: TOYTALK, INC.
    Inventors: Oren M. Jacob, Martin Reddy, Lucas R. A. Ives
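The repository this abstract describes, speech text, waveforms, and metadata held centrally, with collaborative commentary layered on top, can be sketched with a minimal in-memory model. The field names and comment format are illustrative assumptions, not the patented schema:

```python
from dataclasses import dataclass, field

@dataclass
class VoiceAsset:
    """One entry in the voice-asset repository."""
    text: str                                     # the line to be performed
    waveform: bytes = b""                         # recorded take, if any
    metadata: dict = field(default_factory=dict)  # e.g. emotion, take number
    comments: list = field(default_factory=list)  # collaborative commentary

# Team members modify metadata and leave comments on shared assets.
repo = {}
repo["line-001"] = VoiceAsset("Hi there!", metadata={"emotion": "cheerful"})
repo["line-001"].comments.append(("director", "Try a warmer read."))
repo["line-001"].metadata["take"] = 2

assert repo["line-001"].metadata == {"emotion": "cheerful", "take": 2}
```

Centralizing the assets is what makes the commentary loop work: every reviewer annotates the same record, so a voice line converges instead of forking across the team.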
  • Publication number: 20140032471
    Abstract: Systems and methods to create content for interactive synthetic characters are provided. In some embodiments, a conversation editor may be configured to create a traversable script for an interactive synthetic character by receiving conversation rules from a user. These rules can be used to match words or phrases that a user speaks or types, or to monitor for a physical movement of the user or synthetic character. Each rule can include responses to be performed by the interactive synthetic character. Examples of responses include producing audible or textual speech for the synthetic character, performing animations, playing sound effects, retrieving data, and the like. A traversable script can be generated from the conversation rules that, when executed by the synthetic character, allows for dynamic interactions. In some embodiments, the traversable script can be navigated by a state engine using navigational directives associated with the conversation rules.
    Type: Application
    Filed: July 25, 2012
    Publication date: January 30, 2014
    Applicant: TOYTALK, INC.
    Inventors: Martin Reddy, Oren M. Jacob, Robert G. Podesta, Lucas R. A. Ives, Kathleen Hale