
    A BERT-based Method for Inference in Fault Knowledge Graph of Microwave Components

    • Abstract: This study leverages knowledge inference techniques to improve knowledge completion during the debugging and testing of microwave components, enhancing the completeness and accuracy of fault knowledge graphs and thereby boosting testing efficiency and reliability. To address the problem of incomplete knowledge in the microwave technology domain, this paper proposes MicroReason-BERT, a knowledge inference method based on Bidirectional Encoder Representations from Transformers (BERT). The model is first pre-trained on the fault knowledge graph of microwave components with a masked language modeling (MLM) task, strengthening its comprehension of the existing data. The encoder of the pre-trained model is then used to produce contextual representations of the head entity, relation, and tail entity of each triple. On this basis, MicroReason-BERT combines a deterministic classifier with spatial measurement for representation and structure learning, and multiple loss functions are incorporated to further improve performance. Experimental results show that MicroReason-BERT outperforms other knowledge inference models on the test set and effectively improves the completeness of the knowledge graph.
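The two preparation steps the abstract describes, linearizing a (head entity, relation, tail entity) triple into one token sequence for the BERT encoder, and masking tokens for MLM pre-training, can be sketched roughly as below. This is a minimal illustrative sketch in plain Python: the special-token format, the 15% mask rate, and the example triple are assumptions for illustration, not the authors' actual MicroReason-BERT implementation.

```python
import random

# Illustrative special tokens in the usual BERT style (assumed, not from the paper).
CLS, SEP, MASK = "[CLS]", "[SEP]", "[MASK]"

def serialize_triple(head, relation, tail):
    """Linearize a (head, relation, tail) triple into a single token
    sequence, separating the three parts with [SEP] so the encoder can
    attend across entity and relation contexts."""
    return [CLS, *head.split(), SEP, *relation.split(), SEP, *tail.split(), SEP]

def mask_tokens(tokens, mask_prob=0.15, rng=None):
    """Randomly replace ordinary tokens with [MASK] for MLM pre-training.
    Special tokens are never masked. Returns (masked, labels), where
    labels holds the original token at masked positions and None elsewhere."""
    rng = rng or random.Random(0)
    specials = {CLS, SEP}
    masked, labels = [], []
    for tok in tokens:
        if tok not in specials and rng.random() < mask_prob:
            masked.append(MASK)
            labels.append(tok)   # the MLM objective predicts this token
        else:
            masked.append(tok)
            labels.append(None)  # position not scored by the MLM loss
    return masked, labels

# Hypothetical fault-graph triple, for illustration only.
seq = serialize_triple("power amplifier", "has_fault", "gain drift")
masked, labels = mask_tokens(seq)
```

In a full model, the masked sequence would be fed to the BERT encoder, whose per-part contextual representations then feed the classifier and spatial-measurement components described in the abstract.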
