A contrastive knowledge graph embedding model with hierarchical attention and dynamic completion
Abstract: Recently, multi-head Graph Attention Networks (GATs) have achieved satisfactory performance in Knowledge Graph Embedding (KGE) tasks by imposing an attention mechanism on local information. However, in existing GAT-based KGE approaches, entities with few neighbors struggle to obtain structured semantic information during updates, and these methods use only relations to model the local pairwise importance of entities, which results in missing semantic information in the entity embeddings. Meanwhile, different entities may occupy the same position in vector space, which degrades model performance. To this end, we propose a contrastive knowledge graph embedding model named HADC with a hierarchical attention network and dynamic completion. HADC dynamically adds neighbors to entities to complement their local structural information, incorporates the importance of both entities and relations in any given entity's neighborhood, and introduces a contrastive learning-based loss function to separate the positions of positive and negative samples in vector space. Experiments on three standard datasets confirm the effectiveness of our innovations, and the performance of the proposed HADC improves significantly over state-of-the-art methods.
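The contrastive objective described in the abstract can be illustrated with a minimal sketch. This is not HADC's actual loss (the paper's exact formulation is not given in this record); it assumes a generic InfoNCE-style loss over cosine similarities, with the names `anchor`, `positive`, `negatives`, and `tau` chosen here for illustration:

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_loss(anchor, positive, negatives, tau=0.5):
    # InfoNCE-style loss: pulls the positive sample toward the anchor
    # and pushes negative samples away, so samples become separable
    # by position in the embedding space.
    pos = math.exp(cosine(anchor, positive) / tau)
    neg = sum(math.exp(cosine(anchor, n) / tau) for n in negatives)
    return -math.log(pos / (pos + neg))
```

The loss is small when the positive sample is aligned with the anchor and the negatives are orthogonal to it, and grows as that separation is lost.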
Detailed description
Author: Shang, Bin [author]
Format: Article
Language: English
Published: 2023
Keywords: Knowledge graph completion; Representation learning; Graph attention network; Contrastive learning
Note: © The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
Contained in: Neural computing & applications - Springer London, 1993, 35(2023), no. 20, 03 Apr., pages 15005-15018
Citation details: volume:35 ; year:2023 ; number:20 ; day:03 ; month:04 ; pages:15005-15018
DOI: 10.1007/s00521-023-08514-z
Catalog ID: OLC2143655908
LEADER 01000naa a22002652 4500
001    OLC2143655908
003    DE-627
005    20240118091116.0
007    tu
008    240118s2023 xx ||||| 00| ||eng c
024 7  |a 10.1007/s00521-023-08514-z |2 doi
035    |a (DE-627)OLC2143655908
035    |a (DE-He213)s00521-023-08514-z-p
040    |a DE-627 |b ger |c DE-627 |e rakwb
041    |a eng
082 04 |a 004 |q VZ
100 1  |a Shang, Bin |e verfasserin |4 aut
245 10 |a A contrastive knowledge graph embedding model with hierarchical attention and dynamic completion
264  1 |c 2023
336    |a Text |b txt |2 rdacontent
337    |a ohne Hilfsmittel zu benutzen |b n |2 rdamedia
338    |a Band |b nc |2 rdacarrier
500    |a © The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
520    |a Abstract: Recently, multi-head Graph Attention Networks (GATs) have achieved satisfactory performance in Knowledge Graph Embedding (KGE) tasks by imposing an attention mechanism on local information. However, in existing GAT-based KGE approaches, entities with few neighbors struggle to obtain structured semantic information during updates, and these methods use only relations to model the local pairwise importance of entities, which results in missing semantic information in the entity embeddings. Meanwhile, different entities may occupy the same position in vector space, which degrades model performance. To this end, we propose a contrastive knowledge graph embedding model named HADC with a hierarchical attention network and dynamic completion. HADC dynamically adds neighbors to entities to complement their local structural information, incorporates the importance of both entities and relations in any given entity's neighborhood, and introduces a contrastive learning-based loss function to separate the positions of positive and negative samples in vector space. Experiments on three standard datasets confirm the effectiveness of our innovations, and the performance of the proposed HADC improves significantly over state-of-the-art methods.
650  4 |a Knowledge graph completion
650  4 |a Representation learning
650  4 |a Graph attention network
650  4 |a Contrastive learning
700 1  |a Zhao, Yinliang |4 aut
700 1  |a Liu, Jun |4 aut
700 1  |a Liu, Yifan |4 aut
700 1  |a Wang, Chenxin |4 aut
773 08 |i Enthalten in |t Neural computing & applications |d Springer London, 1993 |g 35(2023), 20 vom: 03. Apr., Seite 15005-15018 |w (DE-627)165669608 |w (DE-600)1136944-9 |w (DE-576)032873050 |x 0941-0643 |7 nnns
773 18 |g volume:35 |g year:2023 |g number:20 |g day:03 |g month:04 |g pages:15005-15018
856 41 |u https://doi.org/10.1007/s00521-023-08514-z |z lizenzpflichtig |3 Volltext
912    |a GBV_USEFLAG_A
912    |a SYSFLAG_A
912    |a GBV_OLC
912    |a SSG-OLC-MAT
912    |a GBV_ILN_2018
912    |a GBV_ILN_4277
951    |a AR
952    |d 35 |j 2023 |e 20 |b 03 |c 04 |h 15005-15018
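The `|a … |2 …` notation in the MARC fields above marks subfield codes followed by their values. A minimal sketch of splitting such a flattened field body into (code, value) pairs, assuming the plain-text display form shown in this record (real MARC 21 is exchanged as binary ISO 2709 or MARCXML; `parse_subfields` is a hypothetical helper, not part of any MARC library):

```python
def parse_subfields(field_body):
    # Split a displayed MARC field body such as
    # "|a 10.1007/s00521-023-08514-z |2 doi"
    # into a list of (subfield_code, value) tuples.
    pairs = []
    for chunk in field_body.split("|")[1:]:
        chunk = chunk.strip()
        if chunk:
            # First character is the one-letter subfield code.
            pairs.append((chunk[0], chunk[1:].strip()))
    return pairs
```

For example, applied to the 024 field above it yields the DOI under code `a` and the scheme identifier `doi` under code `2`.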
code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">© The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Abstract Recently, multi-head Graph Attention Networks (GATs) have achieved satisfactory performance in Knowledge Graph Embedding (KGE) tasks by imposing an attention mechanism on local information. However, existing GAT-based KGE approaches struggle to obtain structured semantic information when updating entities with few neighbors, and these methods use only relations to model the local pairwise importance of entities, which results in missing semantic information in the entity embeddings. Meanwhile, different entities may occupy the same position in vector space, which results in poor model performance. To this end, we propose a contrastive knowledge graph embedding model named HADC, with a hierarchical attention network and dynamic completion. HADC dynamically adds neighbors of entities to complement their local structural information, incorporates both entities’ and relations’ importance in any given entity’s neighborhood, and introduces a contrastive learning-based loss function to separate the positions of positive and negative samples in vector space. Experiments on three standard datasets confirm the effectiveness of our innovations, and our proposed HADC significantly outperforms state-of-the-art methods.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Knowledge graph completion</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Representation learning</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Graph attention network</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Contrastive learning</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Zhao, Yinliang</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Liu, Jun</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Liu, Yifan</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Wang, Chenxin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">Enthalten in</subfield><subfield code="t">Neural computing & applications</subfield><subfield code="d">Springer London, 1993</subfield><subfield code="g">35(2023), 20 vom: 03.
Apr., Seite 15005-15018</subfield><subfield code="w">(DE-627)165669608</subfield><subfield code="w">(DE-600)1136944-9</subfield><subfield code="w">(DE-576)032873050</subfield><subfield code="x">0941-0643</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:35</subfield><subfield code="g">year:2023</subfield><subfield code="g">number:20</subfield><subfield code="g">day:03</subfield><subfield code="g">month:04</subfield><subfield code="g">pages:15005-15018</subfield></datafield><datafield tag="856" ind1="4" ind2="1"><subfield code="u">https://doi.org/10.1007/s00521-023-08514-z</subfield><subfield code="z">lizenzpflichtig</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_OLC</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SSG-OLC-MAT</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2018</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4277</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">35</subfield><subfield code="j">2023</subfield><subfield code="e">20</subfield><subfield code="b">03</subfield><subfield code="c">04</subfield><subfield code="h">15005-15018</subfield></datafield></record></collection>
|
score |
7.3991003 |