A Novel Class-Center Vector Model for Text Classification Using Dependencies and a Semantic Dictionary
Automatic text classification is a research focus and core technology in information retrieval and natural language processing. Unlike traditional text classification methods (SVM, Bayesian, KNN), the class-center vector method is an important text classification approach with the advantages of low computational cost and high efficiency. However, the traditional class-center vector method suffers from large, sparse class vectors and limited classification accuracy due to the lack of semantic information. To overcome these problems, this paper proposes a novel class-center vector model for text classification using dependencies and a semantic dictionary. The WordNet English semantic dictionary and the Tongyici Cilin Chinese semantic dictionary are used to cluster the English or Chinese feature words in the class-center vector and to significantly reduce its dimensionality, yielding a new class-center vector model for text classification based on dependencies and a semantic dictionary. Experiments show that, compared with traditional text classification algorithms, the improved class-center vector method has lower time complexity and higher accuracy on the 20Newsgroups English corpus and the Fudan and Sogou Chinese corpora. This paper is an improved version of our NLPCC 2019 conference paper.
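To make the approach concrete, here is a minimal, self-contained Python sketch of the general idea described in the abstract: build one class-center (centroid) vector per class, shrink it by merging synonymous feature words through a semantic dictionary, and assign a new document to the class whose center it is most similar to. It is an illustration only, not the authors' implementation: a small hand-made synonym map stands in for WordNet/Tongyici Cilin, plain term counts stand in for the paper's dependency- and part-of-speech-based weighting, and cosine similarity is assumed as the matching measure.

```python
import math
from collections import Counter

# Toy stand-in for a semantic dictionary (the paper uses WordNet for English
# and Tongyici Cilin for Chinese): mapping synonyms to one cluster head is
# what shrinks the dimensionality of the class-center vector.
SYNONYM_HEAD = {"automobile": "car", "vehicle": "car", "game": "match"}

def features(text):
    """Bag of cluster heads for a document; raw counts stand in for the
    paper's dependency/part-of-speech based term weights."""
    counts = Counter()
    for token in text.lower().split():
        counts[SYNONYM_HEAD.get(token, token)] += 1
    return counts

def class_center(docs):
    """Class-center vector: the average feature vector of a class's documents."""
    total = Counter()
    for doc in docs:
        total.update(features(doc))
    return {term: count / len(docs) for term, count in total.items()}

def cosine(a, b):
    dot = sum(weight * b.get(term, 0.0) for term, weight in a.items())
    norm_a = math.sqrt(sum(w * w for w in a.values()))
    norm_b = math.sqrt(sum(w * w for w in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def train(corpus):
    """corpus: {label: [document, ...]} -> {label: class-center vector}."""
    return {label: class_center(docs) for label, docs in corpus.items()}

def classify(text, centers):
    """Assign the label whose class-center vector is most similar to the text."""
    vec = features(text)
    return max(centers, key=lambda label: cosine(vec, centers[label]))

if __name__ == "__main__":
    centers = train({
        "auto":  ["the car engine", "an automobile on the road", "vehicle repair shop"],
        "sport": ["the football match", "a tennis game today", "ball in the net"],
    })
    print(classify("a new automobile engine", centers))  # -> auto
```

Because classification only compares a document against one center vector per class, the per-document cost stays low, which is the efficiency argument the abstract makes for class-center methods over instance-based methods such as KNN.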
Detailed description

Author(s): Xinhua Zhu [author]; Qingting Xu [author]; Yishan Chen [author]; Hongchao Chen [author]; Tianjun Wu [author]
Format: E-article
Language: English
Published: 2020
Subjects: Text classification; dependencies; weight calculation; semantic dictionary; part-of-speech; class-center vector
Published in: IEEE Access - IEEE, 2014, 8(2020), pages 24990-25000
Published in: volume:8 ; year:2020 ; pages:24990-25000
Links:
DOI / URN: 10.1109/ACCESS.2019.2954106
Catalog ID: DOAJ056266251
LEADER 01000caa a22002652 4500
001 DOAJ056266251
003 DE-627
005 20230308200310.0
007 cr uuu---uuuuu
008 230227s2020 xx |||||o 00| ||eng c
024 7_ |a 10.1109/ACCESS.2019.2954106 |2 doi
035 __ |a (DE-627)DOAJ056266251
035 __ |a (DE-599)DOAJ6d886cff7edb459d97500fd53dfd385b
040 __ |a DE-627 |b ger |c DE-627 |e rakwb
041 __ |a eng
050 _0 |a TK1-9971
100 0_ |a Xinhua Zhu |e verfasserin |4 aut
245 12 |a A Novel Class-Center Vector Model for Text Classification Using Dependencies and a Semantic Dictionary
264 _1 |c 2020
336 __ |a Text |b txt |2 rdacontent
337 __ |a Computermedien |b c |2 rdamedia
338 __ |a Online-Ressource |b cr |2 rdacarrier
520 __ |a Automatic text classification is a research focus and core technology in information retrieval and natural language processing. Different from the traditional text classification methods (SVM, Bayesian, KNN), the class-center vector method is an important text classification method, which has the advantages of less calculation and high efficiency. However, the traditional class-center vector method for text classification has the disadvantages that the class vector is large and sparse, and its classification accuracy is not high because of the lack of semantic information. To overcome these problems, this paper proposes a novel class-center vector model for text classification using dependencies and a semantic dictionary. We respectively use WordNet English semantic dictionary and Tongyici Cilin Chinese semantic dictionary to cluster the English or Chinese feature words in the class-center vector and to significantly reduce the dimension of class-center vector, thereby realizing a new class-center vector for text classification using dependencies and a semantic dictionary. Experiments show that, compared with traditional text classification algorithms, the improved class-center vector method has lower time complexity and higher accuracy on the 20Newsgroups English corpus, Fudan and Sogou Chinese corpus. This paper is an improved version of our NLPCC2019 conference paper.
650 _4 |a Text classification
650 _4 |a dependencies
650 _4 |a weight calculation
650 _4 |a semantic dictionary
650 _4 |a part-of-speech
650 _4 |a class-center vector
653 _0 |a Electrical engineering. Electronics. Nuclear engineering
700 0_ |a Qingting Xu |e verfasserin |4 aut
700 0_ |a Yishan Chen |e verfasserin |4 aut
700 0_ |a Hongchao Chen |e verfasserin |4 aut
700 0_ |a Tianjun Wu |e verfasserin |4 aut
773 08 |i In |t IEEE Access |d IEEE, 2014 |g 8(2020), Seite 24990-25000 |w (DE-627)728440385 |w (DE-600)2687964-5 |x 21693536 |7 nnns
773 18 |g volume:8 |g year:2020 |g pages:24990-25000
856 40 |u https://doi.org/10.1109/ACCESS.2019.2954106 |z kostenfrei
856 40 |u https://doaj.org/article/6d886cff7edb459d97500fd53dfd385b |z kostenfrei
856 40 |u https://ieeexplore.ieee.org/document/8905993/ |z kostenfrei
856 42 |u https://doaj.org/toc/2169-3536 |y Journal toc |z kostenfrei
912 __ |a GBV_USEFLAG_A
912 __ |a SYSFLAG_A
912 __ |a GBV_DOAJ
912 __ |a GBV_ILN_11
912 __ |a GBV_ILN_20
912 __ |a GBV_ILN_22
912 __ |a GBV_ILN_23
912 __ |a GBV_ILN_24
912 __ |a GBV_ILN_31
912 __ |a GBV_ILN_39
912 __ |a GBV_ILN_40
912 __ |a GBV_ILN_60
912 __ |a GBV_ILN_62
912 __ |a GBV_ILN_63
912 __ |a GBV_ILN_65
912 __ |a GBV_ILN_69
912 __ |a GBV_ILN_70
912 __ |a GBV_ILN_73
912 __ |a GBV_ILN_95
912 __ |a GBV_ILN_105
912 __ |a GBV_ILN_110
912 __ |a GBV_ILN_151
912 __ |a GBV_ILN_161
912 __ |a GBV_ILN_170
912 __ |a GBV_ILN_213
912 __ |a GBV_ILN_230
912 __ |a GBV_ILN_285
912 __ |a GBV_ILN_293
912 __ |a GBV_ILN_370
912 __ |a GBV_ILN_602
912 __ |a GBV_ILN_2014
912 __ |a GBV_ILN_4012
912 __ |a GBV_ILN_4037
912 __ |a GBV_ILN_4112
912 __ |a GBV_ILN_4125
912 __ |a GBV_ILN_4126
912 __ |a GBV_ILN_4249
912 __ |a GBV_ILN_4305
912 __ |a GBV_ILN_4306
912 __ |a GBV_ILN_4307
912 __ |a GBV_ILN_4313
912 __ |a GBV_ILN_4322
912 __ |a GBV_ILN_4323
912 __ |a GBV_ILN_4324
912 __ |a GBV_ILN_4325
912 __ |a GBV_ILN_4335
912 __ |a GBV_ILN_4338
912 __ |a GBV_ILN_4367
912 __ |a GBV_ILN_4700
951 __ |a AR
952 __ |d 8 |j 2020 |h 24990-25000
author_variant: x z xz q x qx y c yc h c hc t w tw
matchkey_str: article:21693536:2020----::nvllscnevcomdlotxcasfctouigeednis
hierarchy_sort_str: 2020
callnumber-subject-code: TK
publishDate: 2020
language: English
source: In IEEE Access 8(2020), Seite 24990-25000 volume:8 year:2020 pages:24990-25000
format_phy_str_mv: Article
institution: findex.gbv.de
topic_facet: Text classification dependencies weight calculation semantic dictionary part-of-speech class-center vector Electrical engineering. Electronics. Nuclear engineering
isfreeaccess_bool: true
container_title: IEEE Access
authorswithroles_txt_mv: Xinhua Zhu @@aut@@ Qingting Xu @@aut@@ Yishan Chen @@aut@@ Hongchao Chen @@aut@@ Tianjun Wu @@aut@@
publishDateDaySort_date: 2020-01-01T00:00:00Z
hierarchy_top_id: 728440385
id: DOAJ056266251
language_de: englisch
callnumber-first: T - Technology
author: Xinhua Zhu
spellingShingle: Xinhua Zhu misc TK1-9971 misc Text classification misc dependencies misc weight calculation misc semantic dictionary misc part-of-speech misc class-center vector misc Electrical engineering. Electronics. Nuclear engineering A Novel Class-Center Vector Model for Text Classification Using Dependencies and a Semantic Dictionary
authorStr: Xinhua Zhu
ppnlink_with_tag_str_mv: @@773@@(DE-627)728440385
format: electronic Article
delete_txt_mv: keep
author_role: aut aut aut aut aut
collection: DOAJ
remote_str: true
callnumber-label: TK1-9971
illustrated: Not Illustrated
issn: 21693536
topic_title: TK1-9971 A Novel Class-Center Vector Model for Text Classification Using Dependencies and a Semantic Dictionary Text classification dependencies weight calculation semantic dictionary part-of-speech class-center vector
topic: misc TK1-9971 misc Text classification misc dependencies misc weight calculation misc semantic dictionary misc part-of-speech misc class-center vector misc Electrical engineering. Electronics. Nuclear engineering
format_facet: Elektronische Aufsätze Aufsätze Elektronische Ressource
format_main_str_mv: Text Zeitschrift/Artikel
carriertype_str_mv: cr
hierarchy_parent_title: IEEE Access
hierarchy_parent_id: 728440385
hierarchy_top_title: IEEE Access
isfreeaccess_txt: true
familylinks_str_mv: (DE-627)728440385 (DE-600)2687964-5
title: A Novel Class-Center Vector Model for Text Classification Using Dependencies and a Semantic Dictionary
ctrlnum: (DE-627)DOAJ056266251 (DE-599)DOAJ6d886cff7edb459d97500fd53dfd385b
title_full: A Novel Class-Center Vector Model for Text Classification Using Dependencies and a Semantic Dictionary
author_sort: Xinhua Zhu
journal: IEEE Access
journalStr: IEEE Access
callnumber-first-code: T
lang_code: eng
isOA_bool: true
recordtype: marc
publishDateSort: 2020
contenttype_str_mv: txt
container_start_page: 24990
author_browse: Xinhua Zhu Qingting Xu Yishan Chen Hongchao Chen Tianjun Wu
container_volume: 8
class: TK1-9971
format_se: Elektronische Aufsätze
author-letter: Xinhua Zhu
doi_str_mv: 10.1109/ACCESS.2019.2954106
author2-role: verfasserin
title_sort: novel class-center vector model for text classification using dependencies and a semantic dictionary
callnumber: TK1-9971
title_auth: A Novel Class-Center Vector Model for Text Classification Using Dependencies and a Semantic Dictionary
abstract: Automatic text classification is a research focus and core technology in information retrieval and natural language processing. Different from the traditional text classification methods (SVM, Bayesian, KNN), the class-center vector method is an important text classification method, which has the advantages of less calculation and high efficiency. However, the traditional class-center vector method for text classification has the disadvantages that the class vector is large and sparse, and its classification accuracy is not high because of the lack of semantic information. To overcome these problems, this paper proposes a novel class-center vector model for text classification using dependencies and a semantic dictionary. We respectively use WordNet English semantic dictionary and Tongyici Cilin Chinese semantic dictionary to cluster the English or Chinese feature words in the class-center vector and to significantly reduce the dimension of class-center vector, thereby realizing a new class-center vector for text classification using dependencies and a semantic dictionary. Experiments show that, compared with traditional text classification algorithms, the improved class-center vector method has lower time complexity and higher accuracy on the 20Newsgroups English corpus, Fudan and Sogou Chinese corpus. This paper is an improved version of our NLPCC2019 conference paper.
collection_details: GBV_USEFLAG_A SYSFLAG_A GBV_DOAJ GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_65 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_95 GBV_ILN_105 GBV_ILN_110 GBV_ILN_151 GBV_ILN_161 GBV_ILN_170 GBV_ILN_213 GBV_ILN_230 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_2014 GBV_ILN_4012 GBV_ILN_4037 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4249 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4335 GBV_ILN_4338 GBV_ILN_4367 GBV_ILN_4700
title_short: A Novel Class-Center Vector Model for Text Classification Using Dependencies and a Semantic Dictionary
url: https://doi.org/10.1109/ACCESS.2019.2954106 https://doaj.org/article/6d886cff7edb459d97500fd53dfd385b https://ieeexplore.ieee.org/document/8905993/ https://doaj.org/toc/2169-3536
remote_bool: true
author2: Qingting Xu Yishan Chen Hongchao Chen Tianjun Wu
author2Str: Qingting Xu Yishan Chen Hongchao Chen Tianjun Wu
ppnlink: 728440385
callnumber-subject: TK - Electrical and Nuclear Engineering
mediatype_str_mv: c
isOA_txt: true
hochschulschrift_bool: false
doi_str: 10.1109/ACCESS.2019.2954106
callnumber-a: TK1-9971
up_date: 2024-07-03T19:50:56.358Z
_version_: 1803588737553137664
score: 7.4011583