Extensive study on the underlying gender bias in contextualized word embeddings
Abstract: Gender bias is affecting many natural language processing applications. While we are still far from proposing debiasing methods that will solve the problem, we are making progress analyzing the impact of this bias in current algorithms. This paper provides an extensive study of the underlying gender bias in popular contextualized word embeddings. Our study provides an insightful analysis of evaluation measures applied to several English data domains and the layers of the contextualized word embeddings. It is also adapted and extended to the Spanish language. Our study points out the advantages and limitations of the various evaluation measures that we are using and aims to standardize the evaluation of gender bias in contextualized word embeddings.
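The abstract refers to evaluation measures applied per layer of contextualized word embeddings. The paper's own measures are not reproduced in this record, but as a generic illustration of what a per-layer probe looks like, the sketch below compares the contextual representation of a profession word in a "he" versus a "she" sentence, layer by layer, using the Hugging Face `transformers` API. The model name, sentence templates, and similarity metric are illustrative assumptions, not taken from the paper.

```python
# Generic per-layer probe (illustration only, not the paper's measures):
# how differently is "nurse" represented in a he- vs. a she-context sentence?
import torch
from transformers import AutoModel, AutoTokenizer

MODEL = "bert-base-uncased"  # illustrative choice of contextualized embedding model
tok = AutoTokenizer.from_pretrained(MODEL)
model = AutoModel.from_pretrained(MODEL, output_hidden_states=True)
model.eval()

def layer_vectors(sentence: str, target: str):
    """Hidden state of `target` at every layer (embedding layer included)."""
    enc = tok(sentence, return_tensors="pt")
    idx = enc.input_ids[0].tolist().index(tok.convert_tokens_to_ids(target))
    with torch.no_grad():
        out = model(**enc)
    return [h[0, idx] for h in out.hidden_states]

he_vecs = layer_vectors("he is a nurse", "nurse")
she_vecs = layer_vectors("she is a nurse", "nurse")

for layer, (a, b) in enumerate(zip(he_vecs, she_vecs)):
    sim = torch.nn.functional.cosine_similarity(a, b, dim=0).item()
    print(f"layer {layer:2d}: cosine similarity = {sim:.3f}")
```

Lower similarity at a given layer means the gendered context changes the word's representation more strongly there; probes of this general kind are what a layer-wise bias analysis examines.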
Detailed description

| Field | Value |
|---|---|
| Author | Basta, Christine [author] |
| Format | Article |
| Language | English |
| Published | 2020 |
| Subject headings | Gender bias; Contextualized embeddings; Natural language processing |
| Note | © Springer-Verlag London Ltd., part of Springer Nature 2020 |
| Contained in | Neural computing & applications - Springer London, 1993, 33(2020), 8, 24 July, pages 3371-3384 |
| Contained in | volume:33 ; year:2020 ; number:8 ; day:24 ; month:07 ; pages:3371-3384 |
| Link | https://doi.org/10.1007/s00521-020-05211-z (full text, license required) |
| DOI / URN | 10.1007/s00521-020-05211-z |
| Catalog ID | OLC2124525670 |
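The DOI listed above identifies the article permanently, and doi.org supports HTTP content negotiation, so citation metadata can be fetched without access to the licensed full text. Below is a minimal sketch using only the Python standard library; the function name is mine, and `application/x-bibtex` is just one of the output formats the negotiation service offers (CSL JSON and formatted citations are others).

```python
# Sketch: fetch a BibTeX entry for this record via DOI content negotiation.
# The DOI is taken from the record above; everything else is illustrative.
import urllib.request

DOI = "10.1007/s00521-020-05211-z"

def fetch_bibtex(doi: str) -> str:
    """Ask doi.org (via content negotiation) for a BibTeX rendering of the metadata."""
    req = urllib.request.Request(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/x-bibtex"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.read().decode("utf-8")

if __name__ == "__main__":
    print(fetch_bibtex(DOI))
```

The MARC21 record held in the catalog follows.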
LEADER | 01000naa a22002652 4500 | ||
---|---|---|---|
001 | OLC2124525670 | ||
003 | DE-627 | ||
005 | 20230505091001.0 | ||
007 | tu | ||
008 | 230505s2020 xx ||||| 00| ||eng c | ||
024 | 7 | |a 10.1007/s00521-020-05211-z |2 doi | |
035 | |a (DE-627)OLC2124525670 | ||
035 | |a (DE-He213)s00521-020-05211-z-p | ||
040 | |a DE-627 |b ger |c DE-627 |e rakwb | ||
041 | |a eng | ||
082 | 0 | 4 | |a 004 |q VZ |
100 | 1 | |a Basta, Christine |e verfasserin |0 (orcid)0000-0001-5551-0356 |4 aut | |
245 | 1 | 0 | |a Extensive study on the underlying gender bias in contextualized word embeddings |
264 | 1 | |c 2020 | |
336 | |a Text |b txt |2 rdacontent | ||
337 | |a ohne Hilfsmittel zu benutzen |b n |2 rdamedia | ||
338 | |a Band |b nc |2 rdacarrier | ||
500 | |a © Springer-Verlag London Ltd., part of Springer Nature 2020 | ||
520 | |a Abstract Gender bias is affecting many natural language processing applications. While we are still far from proposing debiasing methods that will solve the problem, we are making progress analyzing the impact of this bias in current algorithms. This paper provides an extensive study of the underlying gender bias in popular contextualized word embeddings. Our study provides an insightful analysis of evaluation measures applied to several English data domains and the layers of the contextualized word embeddings. It is also adapted and extended to the Spanish language. Our study points out the advantages and limitations of the various evaluation measures that we are using and aims to standardize the evaluation of gender bias in contextualized word embeddings. | ||
650 | 4 | |a Gender bias | |
650 | 4 | |a Contextualized embeddings | |
650 | 4 | |a Natural Language processing | |
700 | 1 | |a Costa-jussà, Marta R. |4 aut | |
700 | 1 | |a Casas, Noe |4 aut | |
773 | 0 | 8 | |i Enthalten in |t Neural computing & applications |d Springer London, 1993 |g 33(2020), 8 vom: 24. Juli, Seite 3371-3384 |w (DE-627)165669608 |w (DE-600)1136944-9 |w (DE-576)032873050 |x 0941-0643 |7 nnns |
773 | 1 | 8 | |g volume:33 |g year:2020 |g number:8 |g day:24 |g month:07 |g pages:3371-3384 |
856 | 4 | 1 | |u https://doi.org/10.1007/s00521-020-05211-z |z lizenzpflichtig |3 Volltext |
912 | |a GBV_USEFLAG_A | ||
912 | |a SYSFLAG_A | ||
912 | |a GBV_OLC | ||
912 | |a SSG-OLC-MAT | ||
912 | |a GBV_ILN_2018 | ||
912 | |a GBV_ILN_4277 | ||
951 | |a AR | ||
952 | |d 33 |j 2020 |e 8 |b 24 |c 07 |h 3371-3384 |
Full record (MARCXML):
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000naa a22002652 4500</leader><controlfield tag="001">OLC2124525670</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20230505091001.0</controlfield><controlfield tag="007">tu</controlfield><controlfield tag="008">230505s2020 xx ||||| 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1007/s00521-020-05211-z</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)OLC2124525670</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-He213)s00521-020-05211-z-p</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="082" ind1="0" ind2="4"><subfield code="a">004</subfield><subfield code="q">VZ</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Basta, Christine</subfield><subfield code="e">verfasserin</subfield><subfield code="0">(orcid)0000-0001-5551-0356</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Extensive study on the underlying gender bias in contextualized word embeddings</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2020</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">ohne Hilfsmittel zu benutzen</subfield><subfield code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Band</subfield><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">© Springer-Verlag London Ltd., part of Springer Nature 2020</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Abstract Gender bias is affecting many natural language processing applications. While we are still far from proposing debiasing methods that will solve the problem, we are making progress analyzing the impact of this bias in current algorithms. This paper provides an extensive study of the underlying gender bias in popular contextualized word embeddings. Our study provides an insightful analysis of evaluation measures applied to several English data domains and the layers of the contextualized word embeddings. It is also adapted and extended to the Spanish language. 
Our study points out the advantages and limitations of the various evaluation measures that we are using and aims to standardize the evaluation of gender bias in contextualized word embeddings.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Gender bias</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Contextualized embeddings</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Natural Language processing</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Costa-jussà, Marta R.</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Casas, Noe</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">Enthalten in</subfield><subfield code="t">Neural computing & applications</subfield><subfield code="d">Springer London, 1993</subfield><subfield code="g">33(2020), 8 vom: 24. Juli, Seite 3371-3384</subfield><subfield code="w">(DE-627)165669608</subfield><subfield code="w">(DE-600)1136944-9</subfield><subfield code="w">(DE-576)032873050</subfield><subfield code="x">0941-0643</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:33</subfield><subfield code="g">year:2020</subfield><subfield code="g">number:8</subfield><subfield code="g">day:24</subfield><subfield code="g">month:07</subfield><subfield code="g">pages:3371-3384</subfield></datafield><datafield tag="856" ind1="4" ind2="1"><subfield code="u">https://doi.org/10.1007/s00521-020-05211-z</subfield><subfield code="z">lizenzpflichtig</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_OLC</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SSG-OLC-MAT</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2018</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4277</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">33</subfield><subfield code="j">2020</subfield><subfield code="e">8</subfield><subfield code="b">24</subfield><subfield code="c">07</subfield><subfield code="h">3371-3384</subfield></datafield></record></collection>
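The full record above is the same record serialized as MARCXML (namespace `http://www.loc.gov/MARC21/slim`). As a minimal sketch, the snippet below pulls the citation-relevant fields out of it with the Python standard library. The file name `record.xml` and the helper function are mine, and the XML is assumed to have been saved in well-formed form (with `&` escaped as `&amp;`).

```python
# Sketch: extract citation fields from the MARCXML record shown above.
# Assumes a well-formed copy saved as "record.xml"; tags/subfields follow MARC21.
import xml.etree.ElementTree as ET

NS = {"marc": "http://www.loc.gov/MARC21/slim"}

def subfields(record, tag, code):
    """All values of subfield `code` in datafields with the given tag."""
    return [
        sf.text
        for df in record.findall(f"marc:datafield[@tag='{tag}']", NS)
        for sf in df.findall(f"marc:subfield[@code='{code}']", NS)
    ]

record = ET.parse("record.xml").getroot().find(".//marc:record", NS)

title = subfields(record, "245", "a")[0]                                 # 245 $a  title
authors = subfields(record, "100", "a") + subfields(record, "700", "a")  # 100/700 $a  authors
doi = subfields(record, "024", "a")[0]                                   # 024 $a  DOI
host = subfields(record, "773", "t")[0]                                  # 773 $t  journal
extent = subfields(record, "773", "g")                                   # 773 $g  vol/issue/pages

print(title)
print("; ".join(authors))
print(f"doi:{doi}  in: {host} ({'; '.join(extent)})")
```

The tag and subfield codes used here (245 $a, 100/700 $a, 024 $a, 773 $t/$g) correspond directly to the fields listed in the MARC21 table earlier in the record.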
|