Patents by Inventor David Douglas Springgay

David Douglas Springgay has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 11409390
    Abstract: Method, apparatus, and computer-readable media for touch and speech interface, with audio location, includes structure and/or function whereby at least one processor: (i) receives a touch input from a touch device; (ii) establishes a touch-speech time window; (iii) receives a speech input from a speech device; (iv) determines whether the speech input is present in a global dictionary; (v) determines a location of a sound source from the speech device; (vi) determines whether the touch input and the location of the speech input are both within a same region; (vii) if the speech input is in the dictionary, determines whether the speech input has been received within the window; and (viii) if the speech input has been received within the window, and the touch input and the speech input are both within the same region, activates an action corresponding to both the touch input and the speech input.
    Type: Grant
    Filed: October 30, 2020
    Date of Patent: August 9, 2022
    Inventors: David Popovich, David Douglas Springgay, David Frederick Gurnsey
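
The abstract above describes a fusion flow: a touch opens a time window, incoming speech is checked against a global dictionary and localized, and an action fires only when the speech lands inside the window and its sound source shares the touched region. The Python sketch below is a minimal illustration of that flow, not the patented implementation; the class name TouchSpeechInterface, the GLOBAL_DICTIONARY contents, the 3-second window, and the region labels are all assumptions made for the example.

```python
import time

# Hypothetical command dictionary and window length; neither value comes from the patent.
GLOBAL_DICTIONARY = {"delete", "copy", "move"}
TOUCH_SPEECH_WINDOW_S = 3.0


class TouchSpeechInterface:
    """Minimal sketch of a touch-and-speech flow with audio location."""

    def __init__(self, window_s=TOUCH_SPEECH_WINDOW_S):
        self.window_s = window_s
        self.last_touch = None  # (region, timestamp) of the most recent touch

    def on_touch(self, region, timestamp=None):
        # Steps (i)-(ii): record the touch and open a touch-speech time window.
        self.last_touch = (region, timestamp if timestamp is not None else time.time())

    def on_speech(self, phrase, speech_region, timestamp=None):
        # Step (iv): ignore speech that is not in the global dictionary.
        if phrase not in GLOBAL_DICTIONARY or self.last_touch is None:
            return None
        touch_region, touch_time = self.last_touch
        timestamp = timestamp if timestamp is not None else time.time()
        # Step (vii): the speech must arrive inside the open time window...
        in_window = 0 <= timestamp - touch_time <= self.window_s
        # Step (vi): ...and the located sound source must share the touch's region.
        same_region = speech_region == touch_region
        if in_window and same_region:
            # Step (viii): activate an action tied to both the touch and the speech.
            return f"{phrase}:{touch_region}"
        return None


if __name__ == "__main__":
    ui = TouchSpeechInterface()
    ui.on_touch("canvas-left", timestamp=100.0)
    print(ui.on_speech("delete", "canvas-left", timestamp=101.5))   # delete:canvas-left
    print(ui.on_speech("delete", "canvas-right", timestamp=101.5))  # None (wrong region)
```

Tracking only the most recent touch keeps the sketch short; a real system would presumably manage several regions at once and expire windows explicitly.
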
  • Publication number: 20210048915
    Abstract: Method, apparatus, and computer-readable media for touch and speech interface, with audio location, includes structure and/or function whereby at least one processor: (i) receives a touch input from a touch device; (ii) establishes a touch-speech time window; (iii) receives a speech input from a speech device; (iv) determines whether the speech input is present in a global dictionary; (v) determines a location of a sound source from the speech device; (vi) determines whether the touch input and the location of the speech input are both within a same region; (vii) if the speech input is in the dictionary, determines whether the speech input has been received within the window; and (viii) if the speech input has been received within the window, and the touch input and the speech input are both within the same region, activates an action corresponding to both the touch input and the speech input.
    Type: Application
    Filed: October 30, 2020
    Publication date: February 18, 2021
    Inventors: David Popovich, David Douglas Springgay, David Frederick Gurnsey
  • Patent number: 10845909
    Abstract: Method, apparatus, and computer-readable media for touch and speech interface, with audio location, includes structure and/or function whereby at least one processor: (i) receives a touch input from a touch device; (ii) establishes a touch-speech time window; (iii) receives a speech input from a speech device; (iv) determines whether the speech input is present in a global dictionary; (v) determines a location of a sound source from the speech device; (vi) determines whether the touch input and the location of the speech input are both within a same region; (vii) if the speech input is in the dictionary, determines whether the speech input has been received within the window; and (viii) if the speech input has been received within the window, and the touch input and the speech input are both within the same region, activates an action corresponding to both the touch input and the speech input.
    Type: Grant
    Filed: May 30, 2019
    Date of Patent: November 24, 2020
    Inventors: David Popovich, David Douglas Springgay, David Frederick Gurnsey
  • Patent number: 10831297
    Abstract: Method, apparatus, and computer-readable media for touch and speech interface includes structure and/or function whereby at least one processor: (i) receives an input from a touch sensitive input device; (ii) establishes a touch speech time window with respect to the received touch input; (iii) receives an input from a speech input device; (iv) determines whether the received speech input is present in a global dictionary; (v) if the received speech input is present in the global dictionary, determines whether the received speech input has been received within the established touch speech time window; and (vi) if the received speech input has been received within the established touch speech time window, activates an action corresponding to both the received touch input and the received speech input.
    Type: Grant
    Filed: August 15, 2019
    Date of Patent: November 10, 2020
    Assignee: Nureva, Inc.
    Inventors: David Popovich, David Douglas Springgay, David Frederick Gurnsey
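
This grant and the related filings further down drop the audio-location test: in-dictionary speech only has to arrive within the time window opened by the touch. The closure-based sketch below illustrates that reduced flow under the same caveat; COMMANDS, WINDOW_S, and the 'note-42' target are invented for the example.

```python
import time

COMMANDS = {"undo", "group", "share"}  # illustrative global dictionary
WINDOW_S = 2.0                         # assumed window length, not from the patent


def make_touch_speech_handler(window_s=WINDOW_S):
    """Sketch of the simpler flow: a touch opens a window, and in-dictionary
    speech received inside that window triggers an action bound to both inputs."""
    state = {"touch": None}  # (target, timestamp) of the most recent touch

    def on_touch(target, t=None):
        state["touch"] = (target, t if t is not None else time.time())

    def on_speech(phrase, t=None):
        if phrase not in COMMANDS or state["touch"] is None:
            return None
        target, touch_t = state["touch"]
        t = t if t is not None else time.time()
        if 0 <= t - touch_t <= window_s:
            return (phrase, target)  # the action is keyed on both inputs
        return None

    return on_touch, on_speech


if __name__ == "__main__":
    on_touch, on_speech = make_touch_speech_handler()
    on_touch("note-42", t=10.0)
    print(on_speech("group", t=11.0))  # ('group', 'note-42')
    print(on_speech("group", t=20.0))  # None: the window has expired
```
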
  • Publication number: 20200033981
    Abstract: Method, apparatus, and computer-readable media for touch and speech interface includes structure and/or function whereby at least one processor: (i) receives an input from a touch sensitive input device; (ii) establishes a touch speech time window with respect to the received touch input; (iii) receives an input from a speech input device; (iv) determines whether the received speech input is present in a global dictionary; (v) if the received speech input is present in the global dictionary, determines whether the received speech input has been received within the established touch speech time window; and (vi) if the received speech input has been received within the established touch speech time window, activates an action corresponding to both the received touch input and the received speech input.
    Type: Application
    Filed: August 15, 2019
    Publication date: January 30, 2020
    Inventors: David Popovich, David Douglas Springgay, David Frederick Gurnsey
  • Publication number: 20190346955
    Abstract: Method, apparatus, and computer-readable media for touch and speech interface, with audio location, includes structure and/or function whereby at least one processor: (i) receives a touch input from a touch device; (ii) establishes a touch-speech time window; (iii) receives a speech input from a speech device; (iv) determines whether the speech input is present in a global dictionary; (v) determines a location of a sound source from the speech device; (vi) determines whether the touch input and the location of the speech input are both within a same region; (vii) if the speech input is in the dictionary, determines whether the speech input has been received within the window; and (viii) if the speech input has been received within the window, and the touch input and the speech input are both within the same region, activates an action corresponding to both the touch input and the speech input.
    Type: Application
    Filed: May 30, 2019
    Publication date: November 14, 2019
    Inventors: David Popovich, David Douglas Springgay, David Frederick Gurnsey
  • Patent number: 10394358
    Abstract: Method, apparatus, and computer-readable media for touch and speech interface includes structure and/or function whereby at least one processor: (i) receives an input from a touch sensitive input device; (ii) establishes a touch speech time window with respect to the received touch input; (iii) receives an input from a speech input device; (iv) determines whether the received speech input is present in a global dictionary; (v) if the received speech input is present in the global dictionary, determines whether the received speech input has been received within the established touch speech time window; and (vi) if the received speech input has been received within the established touch speech time window, activates an action corresponding to both the received touch input and the received speech input.
    Type: Grant
    Filed: June 6, 2017
    Date of Patent: August 27, 2019
    Assignee: Nureva, Inc.
    Inventors: David Popovich, David Douglas Springgay, David Frederick Gurnsey
  • Patent number: 10338713
    Abstract: Method, apparatus, and computer-readable media for touch and speech interface, with audio location, includes structure and/or function whereby at least one processor: (i) receives a touch input from a touch device; (ii) establishes a touch-speech time window; (iii) receives a speech input from a speech device; (iv) determines whether the speech input is present in a global dictionary; (v) determines a location of a sound source from the speech device; (vi) determines whether the touch input and the location of the speech input are both within a same region; (vii) if the speech input is in the dictionary, determines whether the speech input has been received within the window; and (viii) if the speech input has been received within the window, and the touch input and the speech input are both within the same region, activates an action corresponding to both the touch input and the speech input.
    Type: Grant
    Filed: June 6, 2017
    Date of Patent: July 2, 2019
    Assignee: Nureva, Inc.
    Inventors: David Popovich, David Douglas Springgay, David Frederick Gurnsey
  • Publication number: 20170351367
    Abstract: Method, apparatus, and computer-readable media for touch and speech interface, with audio location, includes structure and/or function whereby at least one processor: (i) receives a touch input from a touch device; (ii) establishes a touch-speech time window; (iii) receives a speech input from a speech device; (iv) determines whether the speech input is present in a global dictionary; (v) determines a location of a sound source from the speech device; (vi) determines whether the touch input and the location of the speech input are both within a same region; (vii) if the speech input is in the dictionary, determines whether the speech input has been received within the window; and (viii) if the speech input has been received within the window, and the touch input and the speech input are both within the same region, activates an action corresponding to both the touch input and the speech input.
    Type: Application
    Filed: June 6, 2017
    Publication date: December 7, 2017
    Inventors: David Popovich, David Douglas Springgay, David Frederick Gurnsey
  • Publication number: 20170351366
    Abstract: Method, apparatus, and computer-readable media for touch and speech interface includes structure and/or function whereby at least one processor: (i) receives an input from a touch sensitive input device; (ii) establishes a touch speech time window with respect to the received touch input; (iii) receives an input from a speech input device; (iv) determines whether the received speech input is present in a global dictionary; (v) if the received speech input is present in the global dictionary, determines whether the received speech input has been received within the established touch speech time window; and (vi) if the received speech input has been received within the established touch speech time window, activates an action corresponding to both the received touch input and the received speech input.
    Type: Application
    Filed: June 6, 2017
    Publication date: December 7, 2017
    Inventors: David Popovich, David Douglas Springgay, David Frederick Gurnsey
  • Patent number: 9348807
    Abstract: An apparatus and method for providing a user interface through which a user may generate a conditional expression are provided. The user interface provides guidance to a user as to the proper parts of the conditional expression to include in the conditional expression as the user constructs the conditional expression. Thus, as the user completes parts of the conditional expression, the guidance that is offered is updated based on the current context of the conditional expression. This guidance may include listings of variables, attributes and/or functions that are most appropriate to be entered next in the conditional expression, help text, and the like.
    Type: Grant
    Filed: January 18, 2011
    Date of Patent: May 24, 2016
    Assignee: International Business Machines Corporation
    Inventors: Kevin T. McGuire, Eduardo Jose Pereira, Nashib Qadri, David Douglas Springgay
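
This patent and the related conditional-expression filings further down describe context-sensitive guidance: as the user assembles an expression, the interface offers only the variables, attributes, functions, or operators that can legally come next, along with help text. The sketch below is a deliberately tiny stand-in for that idea; the token-based grammar and the VARIABLES, ATTRIBUTES, and OPERATORS tables are my own simplifications, not the patented design.

```python
# Hypothetical, highly simplified model of context-sensitive guidance:
# after each token the builder offers only the parts that may legally come next.

VARIABLES = {"order", "customer"}
ATTRIBUTES = {"order": ["total", "status"], "customer": ["age", "region"]}
OPERATORS = [">", "<", "==", "!="]


def suggest_next(tokens):
    """Return the completions that fit the current context of a partial
    conditional expression such as ['order', '.', 'total', '>']."""
    if not tokens or tokens[-1] in ("and", "or"):
        return sorted(VARIABLES)                  # start of a new clause: offer variables
    if tokens[-1] in VARIABLES:
        return ["."]                              # a variable is followed by an attribute access
    if tokens[-1] == ".":
        return ATTRIBUTES.get(tokens[-2], [])     # offer the attributes of that variable
    if tokens[-2:-1] == ["."]:                    # an attribute was just completed
        return OPERATORS
    if tokens[-1] in OPERATORS:
        return ["<value>"]                        # prompt for a literal value
    return ["and", "or"]                          # a complete comparison can be extended


if __name__ == "__main__":
    print(suggest_next([]))                       # ['customer', 'order']
    print(suggest_next(["order", "."]))           # ['total', 'status']
    print(suggest_next(["order", ".", "total"]))  # ['>', '<', '==', '!=']
```

In the real interface the guidance would presumably come from a typed expression model rather than flat token lookups, but the shape of the loop is the same: inspect the partial expression, then narrow the offered completions.
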
  • Publication number: 20130305145
    Abstract: There is provided a computer-implemented method of publishing digital content in a page-based grid format for a device, the method comprising: identifying device attributes; obtaining raw digital content; determining a device-specific font size for the raw content based on the device attributes; determining a column width for page columns within a page grid; determining the number of available column rows within the page grid based on the column width; and automatically populating the page columns with the digital content to generate a device-specific digital publication in a page-based grid format for display on the device.
    Type: Application
    Filed: December 21, 2012
    Publication date: November 14, 2013
    Applicant: NI Group Limited
    Inventors: Paul James Jackson, Alexander Joseph Breuer, Mark Philip James Steyn, David Douglas Springgay
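
The publication above derives a device-specific font size and column geometry from device attributes, then pours raw content into a page-based grid. The sketch below walks through one plausible reading of that pipeline; the Device fields, the density scaling, and the chars_per_line and lines_per_column constants are assumptions for illustration only.

```python
from dataclasses import dataclass


@dataclass
class Device:
    screen_width_px: int
    screen_height_px: int
    dpi: int


def layout_pages(device, words, base_font_pt=10, chars_per_line=38, lines_per_column=40):
    """Pick a device-specific font size, derive a column width, work out how many
    columns fit across a page, then pour the text into those columns page by page."""
    font_pt = round(base_font_pt * device.dpi / 160)            # density-scaled type size
    char_px = font_pt * device.dpi / 72 * 0.5                   # rough average glyph width
    column_px = max(1, int(chars_per_line * char_px))           # column width in pixels
    columns_per_page = max(1, device.screen_width_px // column_px)
    words_per_column = chars_per_line * lines_per_column // 6   # ~6 characters per word

    pages, page = [], []
    for i in range(0, len(words), words_per_column):
        page.append(" ".join(words[i:i + words_per_column]))    # fill one column
        if len(page) == columns_per_page:                       # the page grid is full
            pages.append(page)
            page = []
    if page:
        pages.append(page)
    return font_pt, columns_per_page, pages


if __name__ == "__main__":
    tablet = Device(screen_width_px=2048, screen_height_px=1536, dpi=264)
    words = ("lorem ipsum dolor sit amet " * 400).split()
    font_pt, columns, pages = layout_pages(tablet, words)
    print(font_pt, columns, len(pages))
```
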
  • Patent number: 8332750
    Abstract: A computer implemented method, apparatus, and computer program product for resolving inter-page nodes in flow diagrams is presented. In one embodiment, an inter-page node in a flow diagram is identified. An inter-page node is a node laid out on a page break in a multi-page flow diagram. A set of candidate pages is formed. A bid for each page in a set of candidate pages is requested from each policy in a set of page break policies. A page is selected from the set of candidate pages based on bids received from the set of page break policies. A value of each bid indicates a level of suitability of each page in the set of candidate pages. The inter-page node located on the page break is moved to a new location on the selected page. The new location on the selected page is located off of all page breaks for the flow diagram.
    Type: Grant
    Filed: September 25, 2009
    Date of Patent: December 11, 2012
    Assignee: International Business Machines Corporation
    Inventors: Omid Banyasad, Mark Andrew MacDonald, Siobhan Nearey, Rodrigo Trevizan Peretti, David Douglas Springgay
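
The abstract above resolves a node that straddles a page break by asking each page-break policy to bid on every candidate page and moving the node to the best-scoring page. Here is a toy version of that selection loop; the two example policies, their scoring formulas, and the node and page dictionaries are invented for illustration.

```python
# A minimal sketch of the bidding scheme described above; the policies and
# scoring are illustrative assumptions, not those of the patent.

def whitespace_policy(node, page):
    # Prefer pages with more free area for the relocated node.
    return page["free_area"]


def flow_policy(node, page):
    # Prefer keeping the node near the page its connections favor.
    return 100 - 10 * abs(page["number"] - node["preferred_page"])


POLICIES = [whitespace_policy, flow_policy]


def resolve_inter_page_node(node, candidate_pages, policies=POLICIES):
    """Ask every page-break policy to bid on every candidate page, then move the
    node onto the page with the highest combined bid, off all page breaks."""
    best_page = max(
        candidate_pages,
        key=lambda page: sum(policy(node, page) for policy in policies),
    )
    node["page"] = best_page["number"]   # relocate the node off the page break
    return best_page


if __name__ == "__main__":
    node = {"id": "task-7", "preferred_page": 2, "page": None}
    pages = [
        {"number": 1, "free_area": 40},
        {"number": 2, "free_area": 15},
        {"number": 3, "free_area": 90},
    ]
    chosen = resolve_inter_page_node(node, pages)
    print(chosen["number"], node["page"])  # 3 3
```
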
  • Patent number: 8296723
    Abstract: Illustrative embodiments provide a computer-implemented method for configurable Unified Modeling Language building blocks. The computer-implemented method obtains a Unified Modeling Language specification and generates a set of logical units from the Unified Modeling Language specification to form a set of building blocks. The computer-implemented method further fetches desired blocks from the set of building blocks according to predefined criteria to form a set of desired blocks, and presents the set of desired building blocks to a requestor for execution of functions provided by the set of desired building blocks to complete a defined task.
    Type: Grant
    Filed: December 11, 2008
    Date of Patent: October 23, 2012
    Assignee: International Business Machines Corporation
    Inventors: Michael Hanner, Daniel Donat Joseph Leroux, Dusko Misic, David Douglas Springgay, Mira Vrbaski
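
This grant and the corresponding publication further down cover generating logical building blocks from a Unified Modeling Language specification, fetching the blocks that match predefined criteria, and presenting them to a requestor that executes their functions. The sketch below mirrors that pipeline at a much-reduced scale; the list-of-dicts "specification", the criteria predicate, and the per-block run callables are illustrative assumptions.

```python
# Sketch under assumptions: the "specification" here is just a list of element
# descriptors, and the predefined criteria are a simple predicate.

def generate_building_blocks(uml_spec):
    """Split a UML specification into logical units ("building blocks")."""
    return [
        {"name": element["name"], "kind": element["kind"], "run": element.get("run")}
        for element in uml_spec
    ]


def fetch_desired_blocks(blocks, criteria):
    """Select the blocks that satisfy the predefined criteria."""
    return [block for block in blocks if criteria(block)]


if __name__ == "__main__":
    spec = [
        {"name": "Account", "kind": "class", "run": lambda: "Account created"},
        {"name": "deposits", "kind": "association"},
        {"name": "Audit", "kind": "class", "run": lambda: "Audit created"},
    ]
    blocks = generate_building_blocks(spec)
    desired = fetch_desired_blocks(blocks, criteria=lambda b: b["kind"] == "class")
    # Present the desired blocks to the requestor, which executes their functions.
    for block in desired:
        print(block["name"], "->", block["run"]())
```
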
  • Publication number: 20110119286
    Abstract: An apparatus and method for providing a user interface through which a user may generate a conditional expression are provided. The user interface provides guidance to a user as to the proper parts of the conditional expression to include in the conditional expression as the user constructs the conditional expression. Thus, as the user completes parts of the conditional expression, the guidance that is offered is updated based on the current context of the conditional expression. This guidance may include listings of variables, attributes and/or functions that are most appropriate to be entered next in the conditional expression, help text, and the like.
    Type: Application
    Filed: January 18, 2011
    Publication date: May 19, 2011
    Applicant: International Business Machines Corporation
    Inventors: Kevin T. McGuire, Eduardo Jose Pereira, Nashib Qadri, David Douglas Springgay
  • Patent number: 7899836
    Abstract: An apparatus and method for providing a user interface through which a user may generate a conditional expression are provided. The user interface provides guidance to a user as to the proper parts of the conditional expression to include in the conditional expression as the user constructs the conditional expression. Thus, as the user completes parts of the conditional expression, the guidance that is offered is updated based on the current context of the conditional expression. This guidance may include listings of variables, attributes and/or functions that are most appropriate to be entered next in the conditional expression, help text, and the like.
    Type: Grant
    Filed: July 10, 2008
    Date of Patent: March 1, 2011
    Assignee: International Business Machines Corporation
    Inventors: Kevin T. McGuire, Eduardo Jose Pereira, Nashib Qadri, David Douglas Springgay
  • Publication number: 20100318942
    Abstract: A computer implemented method, apparatus, and computer program product for resolving inter-page nodes in flow diagrams is presented. In one embodiment, an inter-page node in a flow diagram is identified. An inter-page node is a node laid out on a page break in a multi-page flow diagram. A set of candidate pages is formed. A bid for each page in a set of candidate pages is requested from each policy in a set of page break policies. A page is selected from the set of candidate pages based on bids received from the set of page break policies. A value of each bid indicates a level of suitability of each page in the set of candidate pages. The inter-page node located on the page break is moved to a new location on the selected page. The new location on the selected page is located off of all page breaks for the flow diagram.
    Type: Application
    Filed: September 25, 2009
    Publication date: December 16, 2010
    Applicant: International Business Machines Corporation
    Inventors: Omid Banyasad, Mark Andrew MacDonald, Siobhan Nearey, Rodrigo Trevizan Peretti, David Douglas Springgay
  • Publication number: 20100153907
    Abstract: Illustrative embodiments provide a computer-implemented method for configurable Unified Modeling Language building blocks. The computer-implemented method obtains a Unified Modeling Language specification and generates a set of logical units from the Unified Modeling Language specification to form a set of building blocks. The computer-implemented method further fetches desired blocks from the set of building blocks according to predefined criteria to form a set of desired blocks, and presents the set of desired building blocks to a requestor for execution of functions provided by the set of desired building blocks to complete a defined task.
    Type: Application
    Filed: December 11, 2008
    Publication date: June 17, 2010
    Applicant: International Business Machines Corporation
    Inventors: Michael Hanner, Daniel Donat Joseph Leroux, Dusko Misic, David Douglas Springgay, Mira Vrbaski
  • Publication number: 20080276219
    Abstract: An apparatus and method for providing a user interface through which a user may generate a conditional expression are provided. The user interface provides guidance to a user as to the proper parts of the conditional expression to include in the conditional expression as the user constructs the conditional expression. Thus, as the user completes parts of the conditional expression, the guidance that is offered is updated based on the current context of the conditional expression. This guidance may include listings of variables, attributes and/or functions that are most appropriate to be entered next in the conditional expression, help text, and the like.
    Type: Application
    Filed: July 10, 2008
    Publication date: November 6, 2008
    Applicant: International Business Machines Corporation
    Inventors: Kevin T. McGuire, Eduardo Jose Pereira, Nashib Qadri, David Douglas Springgay
  • Patent number: 7428536
    Abstract: An apparatus and method for providing a user interface through which a user may generate a conditional expression are provided. The user interface provides guidance to a user as to the proper parts of the conditional expression to include in the conditional expression as the user constructs the conditional expression. Thus, as the user completes parts of the conditional expression, the guidance that is offered is updated based on the current context of the conditional expression. This guidance may include listings of variables, attributes and/or functions that are most appropriate to be entered next in the conditional expression, help text, and the like.
    Type: Grant
    Filed: April 5, 2005
    Date of Patent: September 23, 2008
    Assignee: International Business Machines Corporation
    Inventors: Kevin T. McGuire, Eduardo Jose Pereira, Nashib Qadri, David Douglas Springgay