Patents Assigned to HUANENG GROUP TECH INNOVATION CENTER CO., LTD.
  • Patent number: 11842324
    Abstract: A method for extracting dam emergency events based on a dual attention mechanism is provided. The method includes: performing data preprocessing, building a dependency graph, building a dual attention network, and filling document-level arguments. Data preprocessing includes labeling a dam emergency corpus and encoding its sentences. Building the dependency graph helps the model mine syntactic relations from dependency parses. Building the dual attention network includes weighting and fusing attention branches through a graph transformer network (GTN) to capture key semantic information in each sentence. Document-level argument filling detects key sentences and ranks candidate sentences by semantic similarity. By introducing dependency information, the dual attention mechanism overcomes the long-range dependency problem, achieving high identification accuracy and substantially reducing labor costs. (A minimal sketch of the dual-attention fusion appears after these listings.)
    Type: Grant
    Filed: October 14, 2022
    Date of Patent: December 12, 2023
    Assignees: HOHAI UNIVERSITY, HUANENG LANCANG RIVER HYDROPOWER INC., HUANENG GROUP TECH INNOVATION CENTER CO., LTD.
    Inventors: Yingchi Mao, Wei Sun, Haibin Xiao, Fudong Chi, Hao Chen, Weiyong Zhan, Fugang Zhao, Han Fang, Xiaofeng Zhou, Chunrui Zhang, Bin Tan, Wenming Xie, Bingbing Nie, Zhixiang Chen, Chunrui Yang
  • Publication number: 20230119211
    Abstract: A method for extracting dam emergency events based on a dual attention mechanism is provided. The method includes: performing data preprocessing, building a dependency graph, building a dual attention network, and filling document-level arguments. Data preprocessing includes labeling a dam emergency corpus and encoding its sentences. Building the dependency graph helps the model mine syntactic relations from dependency parses. Building the dual attention network includes weighting and fusing attention branches through a graph transformer network (GTN) to capture key semantic information in each sentence. Document-level argument filling detects key sentences and ranks candidate sentences by semantic similarity. By introducing dependency information, the dual attention mechanism overcomes the long-range dependency problem, achieving high identification accuracy and substantially reducing labor costs. (A minimal sketch of the document-level argument filling step appears after these listings.)
    Type: Application
    Filed: October 14, 2022
    Publication date: April 20, 2023
    Applicants: HOHAI UNIVERSITY, HUANENG LANCANG RIVER HYDROPOWER INC., HUANENG GROUP TECH INNOVATION CENTER CO., LTD.
    Inventors: Yong CHENG, Yingchi MAO, Haibin XIAO, Weiyong ZHAN, Hao CHEN, Longbao WANG, Fugang ZHAO, Han FANG, Xiaofeng ZHOU, Chunrui ZHANG, Bin TAN, Wenming XIE, Bingbing NIE, Zhixiang CHEN, Chunrui YANG
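The abstracts above describe the dual attention network only at a high level. The following is a minimal NumPy sketch of that general idea: one attention branch runs over the raw token sequence, a second branch is restricted to dependency-graph edges, and the two outputs are weighted and fused. The layer sizes, the fixed fusion weight `alpha`, the toy dependency arcs, and all function names are illustrative assumptions, not the patented GTN architecture.

```python
# Minimal sketch (assumptions only) of fusing sequence attention with
# dependency-graph attention, as described in the abstract's "dual attention" step.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(tokens, mask=None):
    """Scaled dot-product self-attention; an optional 0/1 mask limits which pairs attend."""
    d = tokens.shape[-1]
    scores = tokens @ tokens.T / np.sqrt(d)
    if mask is not None:
        scores = np.where(mask > 0, scores, -1e9)   # block pairs with no dependency edge
    return softmax(scores, axis=-1) @ tokens

def dual_attention(tokens, dep_edges, alpha=0.5):
    """Weight and fuse the sequence branch with the dependency-graph branch."""
    n = tokens.shape[0]
    adj = np.eye(n)                                 # adjacency mask with self-loops
    for head, dep in dep_edges:
        adj[head, dep] = adj[dep, head] = 1.0
    seq_branch = attention(tokens)                  # semantic branch over the whole sentence
    graph_branch = attention(tokens, mask=adj)      # syntactic branch over the dependency graph
    return alpha * seq_branch + (1 - alpha) * graph_branch

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sent = rng.normal(size=(6, 16))                 # 6 encoded tokens, 16-dim each
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]  # toy dependency arcs
    print(dual_attention(sent, edges).shape)        # (6, 16)
```

Restricting the second branch to dependency edges lets distant but syntactically linked tokens exchange information directly, which matches the abstract's claim that introducing dependencies helps overcome the long-range dependency problem.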
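The document-level argument filling step is likewise only outlined in the abstracts: detect a key sentence and rank the remaining sentences by semantic similarity as candidate sources for missing arguments. Below is a hedged Python sketch of that two-stage idea; the bag-of-words sentence vectors, the keyword-based key-sentence test, and the function names are assumptions for illustration and are not specified by the patent.

```python
# Sketch (assumptions only) of document-level argument filling:
# pick a key sentence, then rank the other sentences by cosine similarity to it.
import numpy as np

def sentence_vector(sentence, vocab):
    """Very simple bag-of-words embedding (illustrative assumption)."""
    vec = np.zeros(len(vocab))
    for word in sentence.lower().split():
        if word in vocab:
            vec[vocab[word]] += 1.0
    return vec

def cosine(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def fill_arguments(sentences, trigger_words):
    """Detect the key sentence (here: first sentence containing a trigger word),
    then rank the remaining sentences by similarity as argument sources."""
    vocab = {w: i for i, w in enumerate({w for s in sentences for w in s.lower().split()})}
    key_idx = next(i for i, s in enumerate(sentences)
                   if any(t in s.lower() for t in trigger_words))
    key_vec = sentence_vector(sentences[key_idx], vocab)
    ranked = sorted((i for i in range(len(sentences)) if i != key_idx),
                    key=lambda i: cosine(sentence_vector(sentences[i], vocab), key_vec),
                    reverse=True)
    return key_idx, ranked   # arguments missing from the key sentence are filled from ranked[0], ranked[1], ...

if __name__ == "__main__":
    doc = ["Heavy rain raised the reservoir level overnight.",
           "The dam gate was opened at 6 a.m. to release flood water.",
           "Monitoring staff reported seepage near the left abutment."]
    print(fill_arguments(doc, trigger_words=["opened", "release"]))
```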