Patents by Inventor David C. Martin

David C. Martin has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 12252582
    Abstract: The functionalized 3,4-alkylenedioxythiophene (ADOT+) monomers can be represented by a chemical formula (CR1R2)(CR3R4)(CR5R6)xO2C4H2S, wherein x=0 or 1; wherein each of R1, R2, R3, R4, R5, and R6 is independently selected from hydrogen, a hydrocarbyl moiety, and a heteroatom-containing functional group; and wherein at least one of R1, R2, R3, R4, R5, and R6 comprises the heteroatom-containing functional group selected from an aldehyde, a maleimide, and derivatives thereof. Also disclosed herein are aldehyde derivatives represented by (ADOT-CH2—NH)pY and maleimide derivatives represented by (ADOT-(CH2)q—N)pZ, where p=1-2 and each of Y and Z is a hydrocarbyl moiety or a biofunctional hydrocarbyl moiety. In an embodiment of the ADOT+ monomers, one of R1, R2, R3, R4, R5, and R6 is replaced by a direct bond to an amide group, an azide group, or an ester group of a biofunctional hydrocarbyl moiety. Also disclosed herein are polymers and copolymers made therefrom.
    Type: Grant
    Filed: August 28, 2020
    Date of Patent: March 18, 2025
    Assignee: University of Delaware
    Inventors: David C. Martin, Samadhan Suresh Nagane, Peter Sitarik
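For orientation, a worked reading of the general formula in the abstract above (an interpretation for illustration, not language from the patent): with all substituents R1 through R6 set to hydrogen, the two allowed values of x reduce the formula to the familiar parent monomers EDOT (x = 0) and ProDOT (x = 1).

```latex
% Worked reading of the general ADOT formula (illustrative interpretation,
% not text from the patent): setting every substituent R1..R6 to hydrogen,
% x = 0 recovers EDOT (3,4-ethylenedioxythiophene) and x = 1 recovers
% ProDOT (3,4-propylenedioxythiophene).
\[
  x = 0:\quad (\mathrm{CH_2})(\mathrm{CH_2})\,\mathrm{O_2C_4H_2S} \;=\; \mathrm{C_6H_6O_2S}
\]
\[
  x = 1:\quad (\mathrm{CH_2})(\mathrm{CH_2})(\mathrm{CH_2})\,\mathrm{O_2C_4H_2S} \;=\; \mathrm{C_7H_8O_2S}
\]
```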
  • Patent number: 11882505
    Abstract: A telecommunications system that, after a communication is established by a first electronic communication device and a second electronic communication device, and while the conversation is ongoing between a first person using the first electronic communication device and a second person using the second electronic communication device, responsive to content of converted text based on a plurality of words spoken, routes the content to a cloud-based phone recognition and entity identification, annotation, and relevance processing resource to enable display of information related to the content by at least one of the first electronic communication device and the second electronic communication device.
    Type: Grant
    Filed: December 7, 2022
    Date of Patent: January 23, 2024
    Assignee: Eolas Technologies Inc.
    Inventors: David C. Martin, Michael D. Doyle
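A minimal sketch of the pipeline the telecom abstract above describes: converted text from an ongoing call is routed to a cloud-hosted recognition/annotation/relevance resource, and the results are surfaced on the participants' devices. All names here (Annotation, annotate_remotely, route_transcript) and the stubbed cloud call are hypothetical illustrations, not the patented implementation or any real API.

```python
"""Illustrative sketch only: routing live-call transcript text to a
cloud-hosted recognition/annotation/relevance service and surfacing
the results on the call participants' devices. Names and structure are
assumptions; they are not taken from the patent or from any real API."""

from dataclasses import dataclass
from typing import List


@dataclass
class Annotation:
    entity: str          # entity or phrase spotted in the converted text
    info: str            # related information to display
    relevance: float     # 0.0-1.0 relevance score


def annotate_remotely(transcript_chunk: str) -> List[Annotation]:
    """Stand-in for the cloud-based recognition/annotation/relevance
    resource; a real system would issue a network request here."""
    annotations = []
    if "meeting" in transcript_chunk.lower():
        annotations.append(Annotation("meeting", "Next slot: Tue 10:00", 0.8))
    if "budget" in transcript_chunk.lower():
        annotations.append(Annotation("budget", "Q3 budget doc link", 0.9))
    return annotations


def route_transcript(transcript_chunk: str, displays: List[str]) -> None:
    """Route one chunk of converted text and push relevant results to
    each participating device's display (simulated with print)."""
    for ann in sorted(annotate_remotely(transcript_chunk),
                      key=lambda a: a.relevance, reverse=True):
        for device in displays:
            print(f"[{device}] {ann.entity}: {ann.info} ({ann.relevance:.1f})")


if __name__ == "__main__":
    route_transcript("Let's schedule a meeting to review the budget.",
                     displays=["device-A", "device-B"])
```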
  • Publication number: 20230100171
    Abstract: A telecommunications system that, after a communication is established by a first electronic communication device and a second electronic communication device, and while the conversation is ongoing between a first person using the first electronic communication device and a second person using the second electronic communication device, responsive to content of converted text based on a plurality of words spoken, routes the content to a cloud-based phone recognition and entity identification, annotation, and relevance processing resource to enable display of information related to the content by at least one of the first electronic communication device and the second electronic communication device.
    Type: Application
    Filed: December 7, 2022
    Publication date: March 30, 2023
    Inventors: David C. Martin, Michael D. Doyle
  • Patent number: 11540093
    Abstract: A telecommunications system that, after a communication is established by a first electronic communication device and a second electronic communication device, receives digital auditory data corresponding to an ongoing conversation between a first person using the first electronic communication device and a second person using the second electronic communication device, uses that digital auditory data to enable transformation of an information knowledge model, and utilizes the information knowledge model to suggest information for the ongoing conversation.
    Type: Grant
    Filed: February 1, 2021
    Date of Patent: December 27, 2022
    Assignee: Eolas Technologies Inc.
    Inventors: David C. Martin, Michael D. Doyle
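A minimal sketch of the variant claimed in the entry above: converted conversation data transforms an information knowledge model, and the model then drives suggestions for the ongoing conversation. The dictionary-based "knowledge model" and the frequency-based suggestion rule are illustrative assumptions, not the patented method.

```python
"""Illustrative sketch: updating a tiny 'information knowledge model'
from converted conversation text and using it to suggest information.
The model structure and suggestion heuristic are assumptions for
illustration, not the method claimed in the patent."""

from collections import Counter
from typing import Dict, List

# Toy knowledge model: topic -> pointer to related information (illustrative).
KNOWLEDGE_BASE: Dict[str, str] = {
    "flight": "Airline rebooking policy summary",
    "invoice": "Open invoices dashboard",
    "contract": "Latest signed contract (v3)",
}


class ConversationModel:
    """Accumulates topic mentions over the ongoing conversation."""

    def __init__(self) -> None:
        self.mentions: Counter = Counter()

    def transform(self, converted_text: str) -> None:
        """Update the model from one chunk of converted auditory data."""
        for token in converted_text.lower().split():
            word = token.strip(".,!?")
            if word in KNOWLEDGE_BASE:
                self.mentions[word] += 1

    def suggest(self, limit: int = 2) -> List[str]:
        """Suggest information for the most frequently mentioned topics."""
        return [KNOWLEDGE_BASE[topic]
                for topic, _ in self.mentions.most_common(limit)]


if __name__ == "__main__":
    model = ConversationModel()
    model.transform("Did you get the invoice for the flight?")
    model.transform("The invoice still looks wrong to me.")
    print(model.suggest())  # ['Open invoices dashboard', 'Airline rebooking policy summary']
```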
  • Publication number: 20220289902
    Abstract: Disclosed herein are functionalized 3,4-alkylenedioxythiophene (ADOT+) monomers represented by a chemical formula (CR1R2)(CR3R4)(CR5R6)xO2C4H2S, wherein x=0 or 1; wherein each of R1, R2, R3, R4, R5, and R6 is independently selected from hydrogen, a hydrocarbyl moiety, and a heteroatom-containing functional group; and wherein at least one of R1, R2, R3, R4, R5, and R6 comprises the heteroatom-containing functional group selected from an aldehyde, a maleimide, and derivatives thereof. Also disclosed herein are aldehyde derivatives represented by (ADOT-CH2—NH)pY and maleimide derivatives represented by (ADOT-(CH2)q—N)pZ, where p=1-2 and each of Y and Z is a hydrocarbyl moiety or a biofunctional hydrocarbyl moiety. In an embodiment of the ADOT+ monomers, one of R1, R2, R3, R4, R5, and R6 is replaced by a direct bond to an amide group, an azide group, or an ester group of a biofunctional hydrocarbyl moiety. Also disclosed herein are polymers and copolymers made therefrom.
    Type: Application
    Filed: August 28, 2020
    Publication date: September 15, 2022
    Applicant: University of Delaware
    Inventors: David C. Martin, Samadhan Suresh Nagane, Peter Sitarik
  • Publication number: 20210160667
    Abstract: A telecommunications system that, after a communication is established by a first electronic communication device and a second electronic communication device, receives digital auditory data corresponding to an ongoing conversation between a first person using the first electronic communication device and a second person using the second electronic communication device, uses that digital auditory data to enable transformation of an information knowledge model, and utilizes the information knowledge model to suggest information for the ongoing conversation.
    Type: Application
    Filed: February 1, 2021
    Publication date: May 27, 2021
    Inventors: David C. Martin, Michael D. Doyle
  • Patent number: 10917761
    Abstract: In one embodiment, a mobile device application automatically identifies and annotates auditory data of a conversation between two or more parties. Digital auditory data is processed to identify at least one fact from the auditory data of the conversation. An identified fact is annotated, based on context and relevance, to search remotely-accessible data for information associated with the identified fact. The information is displayed while the conversation is ongoing to enable verification of the identified fact.
    Type: Grant
    Filed: February 3, 2020
    Date of Patent: February 9, 2021
    Assignee: Eolas Technologies Inc.
    Inventors: David C. Martin, Michael D. Doyle
  • Publication number: 20210009815
    Abstract: Curable film-forming sol-gel compositions that are essentially free of inorganic oxide particles are provided. The compositions contain: a tetraalkoxysilane; a solvent component; and non-oxide particles, and further contain either i) a mineral acid or ii) an epoxy functional trialkoxysilane and a metal-containing catalyst. Coated articles demonstrating antiglare properties are also provided, comprising: (a) a substrate having at least one surface; and (b) a cured film-forming composition applied thereon, formed from a curable sol-gel composition comprising a silane and non-oxide particles. A method of forming an antiglare coating on a substrate is also provided.
    Type: Application
    Filed: June 18, 2020
    Publication date: January 14, 2021
    Inventors: Songwei Lu, Noel Vanier, Xiangling Xu, Shanti Swarup, David C. Martin, Kurt G. Olson, Irina Schwendeman
  • Patent number: 10878520
    Abstract: Embodiments of the present invention provide techniques for identifying and quantifying waste in a process. Waste information is input via images and/or natural language. The amount of waste is estimated based on information in images and/or a natural language description. A computer-implemented technique extracts metadata on waste products from the images and/or natural language description. A variety of factors such as social media trends, weather, traffic, and/or sports schedules are evaluated by the computer and used in predicting the amount of waste that will occur.
    Type: Grant
    Filed: May 10, 2019
    Date of Patent: December 29, 2020
    Assignee: International Business Machines Corporation
    Inventors: Timothy T. Bohn, Natalie N. Brooks Powell, Gary W. Crupi, Ho-Yiu Lam, David C. Martin, Sandhya ReddyVeera
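A minimal sketch of the prediction step the waste-quantification abstract above describes: metadata extracted from images or a natural-language description is combined with external factors (weather, traffic or event schedules, social-media trends) to estimate upcoming waste. The feature names and the simple weighted model are assumptions for illustration; the patent does not specify this model.

```python
"""Illustrative sketch: estimating waste from extracted metadata plus
external factors. Feature names and the fixed-weight model are
assumptions for illustration, not the patented technique."""

from typing import Dict


def extract_waste_metadata(description: str) -> Dict[str, float]:
    """Crude stand-in for image/NLP metadata extraction: count
    waste-related keywords in a natural-language description."""
    keywords = ("leftover", "spoiled", "discarded", "expired")
    hits = sum(description.lower().count(k) for k in keywords)
    return {"waste_mentions": float(hits)}


def predict_waste_kg(metadata: Dict[str, float],
                     factors: Dict[str, float]) -> float:
    """Combine extracted metadata with external factors using fixed,
    illustrative weights to produce a waste estimate in kilograms."""
    weights = {
        "waste_mentions": 5.0,     # each mention ~5 kg (illustrative)
        "rain_probability": -2.0,  # bad weather -> lower foot traffic
        "local_event": 8.0,        # nearby sports event -> more waste
        "social_buzz": 3.0,        # trending mentions of the venue
    }
    signal = {**metadata, **factors}
    return max(0.0, sum(weights.get(k, 0.0) * v for k, v in signal.items()))


if __name__ == "__main__":
    meta = extract_waste_metadata("Two trays of leftover pasta, one expired batch of milk.")
    estimate = predict_waste_kg(meta, {"rain_probability": 0.3,
                                       "local_event": 1.0,
                                       "social_buzz": 0.5})
    print(f"Estimated waste: {estimate:.1f} kg")
```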
  • Patent number: 10723890
    Abstract: Curable film-forming sol-gel compositions that are essentially free of inorganic oxide particles are provided. The compositions contain: a tetraalkoxysilane; a solvent component; and non-oxide particles, and further contain either i) a mineral acid or ii) an epoxy functional trialkoxysilane and a metal-containing catalyst. Coated articles demonstrating antiglare properties are also provided, comprising: (a) a substrate having at least one surface; and (b) a cured film-forming composition applied thereon, formed from a curable sol-gel composition comprising a silane and non-oxide particles. A method of forming an antiglare coating on a substrate is also provided.
    Type: Grant
    Filed: November 25, 2015
    Date of Patent: July 28, 2020
    Assignee: PPG Industries Ohio, Inc.
    Inventors: Songwei Lu, Noel Vanier, Xiangling Xu, Shanti Swarup, David C. Martin, Kurt G. Olson, Irina Schwendeman
  • Publication number: 20200178047
    Abstract: In one embodiment, a mobile device application automatically identifies and annotates auditory signals from one or more parties.
    Type: Application
    Filed: February 3, 2020
    Publication date: June 4, 2020
    Inventors: David C. Martin, Michael D. Doyle
  • Patent number: 10582350
    Abstract: In one embodiment, a mobile device application automatically identifies and annotates auditory signals from one or more parties.
    Type: Grant
    Filed: February 20, 2019
    Date of Patent: March 3, 2020
    Assignee: Eolas Technologies Inc.
    Inventors: David C. Martin, Michael D. Doyle
  • Publication number: 20190266680
    Abstract: Embodiments of the present invention provide techniques for identifying and quantifying waste in a process. Waste information is input via images and/or natural language. The amount of waste is estimated based on information in images and/or a natural language description. A computer-implemented technique extracts metadata on waste products from the images and/or natural language description. A variety of factors such as social media trends, weather, traffic, and/or sports schedules are evaluated by the computer and used in predicting the amount of waste that will occur.
    Type: Application
    Filed: May 10, 2019
    Publication date: August 29, 2019
    Inventors: Timothy T. Bohn, Natalie N. Brooks Powell, Gary W. Crupi, Ho-Yiu Lam, David C. Martin, Sandhya ReddyVeera
  • Patent number: 10339614
    Abstract: Embodiments of the present invention provide techniques for identifying and quantifying waste in a process. Waste information is input via images and/or natural language. The amount of waste is estimated based on information in images and/or a natural language description. A computer-implemented technique extracts metadata on waste products from the images and/or natural language description. A variety of factors such as social media trends, weather, traffic, and/or sports schedules are evaluated by the computer and used in predicting the amount of waste that will occur.
    Type: Grant
    Filed: September 25, 2018
    Date of Patent: July 2, 2019
    Assignee: International Business Machines Corporation
    Inventors: Timothy T. Bohn, Natalie N. Brooks Powell, Gary W. Crupi, Ho-Yiu Lam, David C. Martin, Sandhya ReddyVeera
  • Publication number: 20190182636
    Abstract: In one embodiment, a mobile device application automatically identifies and annotates auditory signals from one or more parties.
    Type: Application
    Filed: February 20, 2019
    Publication date: June 13, 2019
    Inventors: David C. Martin, Michael D. Doyle
  • Patent number: 10244368
    Abstract: In one embodiment, a mobile device application automatically identifies and annotates auditory signals from one or more parties.
    Type: Grant
    Filed: March 21, 2018
    Date of Patent: March 26, 2019
    Assignee: Eolas Technologies Inc.
    Inventors: David C. Martin, Michael D. Doyle
  • Publication number: 20190026844
    Abstract: Embodiments of the present invention provide techniques for identifying and quantifying waste in a process. Waste information is input via images and/or natural language. The amount of waste is estimated based on information in images and/or a natural language description. A computer-implemented technique extracts metadata on waste products from the images and/or natural language description. A variety of factors such as social media trends, weather, traffic, and/or sports schedules are evaluated by the computer and used in predicting the amount of waste that will occur.
    Type: Application
    Filed: September 25, 2018
    Publication date: January 24, 2019
    Inventors: Timothy T. Bohn, Natalie N. Brooks Powell, Gary W. Crupi, Ho-Yiu Lam, David C. Martin, Sandhya ReddyVeera
  • Patent number: 10121211
    Abstract: Embodiments of the present invention provide techniques for identifying and quantifying waste in a process. Waste information is input via images and/or natural language. The amount of waste is estimated based on information in images and/or a natural language description. A computer-implemented technique extracts metadata on waste products from the images and/or natural language description. A variety of factors such as social media trends, weather, traffic, and/or sports schedules are evaluated by the computer and used in predicting the amount of waste that will occur.
    Type: Grant
    Filed: February 15, 2017
    Date of Patent: November 6, 2018
    Assignee: International Business Machines Corporation
    Inventors: Timothy T. Bohn, Natalie N. Brooks Powell, Gary W. Crupi, Ho-Yiu Lam, David C. Martin, Sandhya ReddyVeera
  • Patent number: 10070283
    Abstract: In one embodiment, a mobile device application automatically identifies and annotates auditory data of a conversation between two or more parties. Digital auditory data is processed to identify mention of specific entities. An identified entity is annotated, based on context and relevance, to list one or more actions possible to perform with or on the identified entity, and the identified entity is displayed while the conversation is ongoing. An action is selected by signals from an input interface.
    Type: Grant
    Filed: March 13, 2014
    Date of Patent: September 4, 2018
    Assignee: Eolas Technologies Inc.
    Inventors: David C. Martin, Michael D. Doyle
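A minimal sketch of the interaction the abstract above describes: an entity identified in the conversation is annotated with a list of possible actions, and a signal from the input interface selects one. The entity types, action names, and selection callback are illustrative assumptions, not the patented implementation.

```python
"""Illustrative sketch: annotating an entity mentioned in a conversation
with possible actions and letting an input signal select one. Entities,
actions, and selection handling are assumptions for illustration."""

from typing import Callable, Dict, List

# Actions that could be offered for a recognized entity type (illustrative).
ACTIONS_BY_TYPE: Dict[str, List[str]] = {
    "phone_number": ["call", "save to contacts", "send SMS"],
    "address": ["open map", "share"],
    "date": ["create calendar event", "set reminder"],
}


def annotate_entity(entity_text: str, entity_type: str) -> List[str]:
    """Return the actions possible to perform with or on the entity."""
    return [f"{action}: {entity_text}"
            for action in ACTIONS_BY_TYPE.get(entity_type, [])]


def select_action(options: List[str],
                  chooser: Callable[[List[str]], int]) -> str:
    """Select one action via an input-interface callback (e.g. a tap)."""
    return options[chooser(options)]


if __name__ == "__main__":
    options = annotate_entity("555-0137", "phone_number")
    print(options)
    # Simulate the input interface choosing the first listed action.
    print("Selected:", select_action(options, chooser=lambda opts: 0))
```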
  • Publication number: 20180232822
    Abstract: Embodiments of the present invention provide techniques for identifying and quantifying waste in a process. Waste information is input via images and/or natural language. The amount of waste is estimated based on information in images and/or a natural language description. A computer-implemented technique extracts metadata on waste products from the images and/or natural language description. A variety of factors such as social media trends, weather, traffic, and/or sports schedules are evaluated by the computer and used in predicting the amount of waste that will occur.
    Type: Application
    Filed: February 15, 2017
    Publication date: August 16, 2018
    Inventors: Timothy T. Bohn, Natalie N. Brooks Powell, Gary W. Crupi, Ho-Yiu Lam, David C. Martin, Sandhya ReddyVeera