DeepPatent: patent classification with convolutional neural networks and word embedding
Abstract: Patent classification is an essential task in patent information management and patent knowledge mining. However, this task is still largely done manually due to the unsatisfactory performance of current algorithms. Recently, deep learning methods such as convolutional neural networks (CNN)...
Detailed description

Author(s): Li, Shaobo [author]; Hu, Jie [author]; Cui, Yuxin [author]; Hu, Jianjun [author]
Format: E-article
Language: English
Published: 2018
Subject headings: Patent classification; Text classification; Convolutional neural network; Machine learning; Word embedding
Contained in: Scientometrics - Dordrecht [u.a.] : Springer Science + Business Media B.V., 1978, 117(2018), 2, 06 Sept., pages 721-744
Contained in: volume:117 ; year:2018 ; number:2 ; day:06 ; month:09 ; pages:721-744
Links: https://dx.doi.org/10.1007/s11192-018-2905-5
DOI / URN: 10.1007/s11192-018-2905-5
Catalogue ID: SPR017619327
LEADER | 01000caa a22002652 4500 | ||
---|---|---|---|
001 | SPR017619327 | ||
003 | DE-627 | ||
005 | 20220111052938.0 | ||
007 | cr uuu---uuuuu | ||
008 | 201006s2018 xx |||||o 00| ||eng c | ||
024 | 7 | |a 10.1007/s11192-018-2905-5 |2 doi | |
035 | |a (DE-627)SPR017619327 | ||
035 | |a (SPR)s11192-018-2905-5-e | ||
040 | |a DE-627 |b ger |c DE-627 |e rakwb | ||
041 | |a eng | ||
082 | 0 | 4 | |a 050 |a 370 |q ASE |
084 | |a 31.00 |2 bkl | ||
100 | 1 | |a Li, Shaobo |e verfasserin |4 aut | |
245 | 1 | 0 | |a DeepPatent: patent classification with convolutional neural networks and word embedding |
264 | 1 | |c 2018 | |
336 | |a Text |b txt |2 rdacontent | ||
337 | |a Computermedien |b c |2 rdamedia | ||
338 | |a Online-Ressource |b cr |2 rdacarrier | ||
520 | |a Abstract: Patent classification is an essential task in patent information management and patent knowledge mining. However, this task is still largely done manually due to the unsatisfactory performance of current algorithms. Recently, deep learning methods such as convolutional neural networks (CNN) have led to great progress in image processing, voice recognition, and speech recognition, but they have yet to be applied to patent classification. We proposed DeepPatent, a deep learning algorithm for patent classification based on CNN and word vector embedding. We evaluated the algorithm on the standard patent classification benchmark dataset CLEF-IP and compared it with other algorithms in the CLEF-IP competition. Experiments showed that DeepPatent with automatic feature extraction achieved a classification precision of 83.98%, outperforming all existing algorithms that used the same information for training. It also exceeded the state-of-the-art patent classifier, which reached a precision of 83.50% but relied on 4000 characters from the description section and extensive feature engineering, whereas DeepPatent used only the title and abstract. DeepPatent was further tested on USPTO-2M, a patent classification benchmark dataset that we contributed, containing 2,000,147 records in 637 subclass-level categories after data cleaning of 2,679,443 raw US utility patent documents. Our algorithm achieved a precision of 73.88% on this dataset. | ||
650 | 4 | |a Patent classification |7 (dpeaa)DE-He213 | |
650 | 4 | |a Text classification |7 (dpeaa)DE-He213 | |
650 | 4 | |a Convolutional neural network |7 (dpeaa)DE-He213 | |
650 | 4 | |a Machine learning |7 (dpeaa)DE-He213 | |
650 | 4 | |a Word embedding |7 (dpeaa)DE-He213 | |
700 | 1 | |a Hu, Jie |e verfasserin |4 aut | |
700 | 1 | |a Cui, Yuxin |e verfasserin |4 aut | |
700 | 1 | |a Hu, Jianjun |e verfasserin |4 aut | |
773 | 0 | 8 | |i Enthalten in |t Scientometrics |d Dordrecht [u.a.] : Springer Science + Business Media B.V., 1978 |g 117(2018), 2 vom: 06. Sept., Seite 721-744 |w (DE-627)320589099 |w (DE-600)2018679-4 |x 1588-2861 |7 nnns |
773 | 1 | 8 | |g volume:117 |g year:2018 |g number:2 |g day:06 |g month:09 |g pages:721-744 |
856 | 4 | 0 | |u https://dx.doi.org/10.1007/s11192-018-2905-5 |z lizenzpflichtig |3 Volltext |
912 | |a GBV_USEFLAG_A | ||
912 | |a SYSFLAG_A | ||
912 | |a GBV_SPRINGER | ||
912 | |a SSG-OPC-MAT | ||
912 | |a SSG-OPC-ASE | ||
912 | |a GBV_ILN_11 | ||
912 | |a GBV_ILN_20 | ||
912 | |a GBV_ILN_22 | ||
912 | |a GBV_ILN_23 | ||
912 | |a GBV_ILN_24 | ||
912 | |a GBV_ILN_31 | ||
912 | |a GBV_ILN_32 | ||
912 | |a GBV_ILN_39 | ||
912 | |a GBV_ILN_40 | ||
912 | |a GBV_ILN_60 | ||
912 | |a GBV_ILN_62 | ||
912 | |a GBV_ILN_63 | ||
912 | |a GBV_ILN_69 | ||
912 | |a GBV_ILN_70 | ||
912 | |a GBV_ILN_73 | ||
912 | |a GBV_ILN_74 | ||
912 | |a GBV_ILN_90 | ||
912 | |a GBV_ILN_95 | ||
912 | |a GBV_ILN_100 | ||
912 | |a GBV_ILN_105 | ||
912 | |a GBV_ILN_110 | ||
912 | |a GBV_ILN_120 | ||
912 | |a GBV_ILN_138 | ||
912 | |a GBV_ILN_150 | ||
912 | |a GBV_ILN_151 | ||
912 | |a GBV_ILN_152 | ||
912 | |a GBV_ILN_161 | ||
912 | |a GBV_ILN_170 | ||
912 | |a GBV_ILN_171 | ||
912 | |a GBV_ILN_187 | ||
912 | |a GBV_ILN_213 | ||
912 | |a GBV_ILN_224 | ||
912 | |a GBV_ILN_230 | ||
912 | |a GBV_ILN_250 | ||
912 | |a GBV_ILN_281 | ||
912 | |a GBV_ILN_285 | ||
912 | |a GBV_ILN_293 | ||
912 | |a GBV_ILN_370 | ||
912 | |a GBV_ILN_602 | ||
912 | |a GBV_ILN_636 | ||
912 | |a GBV_ILN_702 | ||
912 | |a GBV_ILN_2001 | ||
912 | |a GBV_ILN_2003 | ||
912 | |a GBV_ILN_2004 | ||
912 | |a GBV_ILN_2005 | ||
912 | |a GBV_ILN_2006 | ||
912 | |a GBV_ILN_2007 | ||
912 | |a GBV_ILN_2009 | ||
912 | |a GBV_ILN_2010 | ||
912 | |a GBV_ILN_2011 | ||
912 | |a GBV_ILN_2014 | ||
912 | |a GBV_ILN_2015 | ||
912 | |a GBV_ILN_2020 | ||
912 | |a GBV_ILN_2021 | ||
912 | |a GBV_ILN_2025 | ||
912 | |a GBV_ILN_2026 | ||
912 | |a GBV_ILN_2027 | ||
912 | |a GBV_ILN_2031 | ||
912 | |a GBV_ILN_2034 | ||
912 | |a GBV_ILN_2037 | ||
912 | |a GBV_ILN_2038 | ||
912 | |a GBV_ILN_2039 | ||
912 | |a GBV_ILN_2044 | ||
912 | |a GBV_ILN_2048 | ||
912 | |a GBV_ILN_2049 | ||
912 | |a GBV_ILN_2050 | ||
912 | |a GBV_ILN_2055 | ||
912 | |a GBV_ILN_2057 | ||
912 | |a GBV_ILN_2059 | ||
912 | |a GBV_ILN_2061 | ||
912 | |a GBV_ILN_2064 | ||
912 | |a GBV_ILN_2065 | ||
912 | |a GBV_ILN_2068 | ||
912 | |a GBV_ILN_2070 | ||
912 | |a GBV_ILN_2086 | ||
912 | |a GBV_ILN_2088 | ||
912 | |a GBV_ILN_2093 | ||
912 | |a GBV_ILN_2106 | ||
912 | |a GBV_ILN_2107 | ||
912 | |a GBV_ILN_2108 | ||
912 | |a GBV_ILN_2110 | ||
912 | |a GBV_ILN_2111 | ||
912 | |a GBV_ILN_2112 | ||
912 | |a GBV_ILN_2113 | ||
912 | |a GBV_ILN_2116 | ||
912 | |a GBV_ILN_2118 | ||
912 | |a GBV_ILN_2119 | ||
912 | |a GBV_ILN_2122 | ||
912 | |a GBV_ILN_2129 | ||
912 | |a GBV_ILN_2143 | ||
912 | |a GBV_ILN_2144 | ||
912 | |a GBV_ILN_2147 | ||
912 | |a GBV_ILN_2148 | ||
912 | |a GBV_ILN_2152 | ||
912 | |a GBV_ILN_2153 | ||
912 | |a GBV_ILN_2188 | ||
912 | |a GBV_ILN_2190 | ||
912 | |a GBV_ILN_2232 | ||
912 | |a GBV_ILN_2336 | ||
912 | |a GBV_ILN_2446 | ||
912 | |a GBV_ILN_2470 | ||
912 | |a GBV_ILN_2472 | ||
912 | |a GBV_ILN_2507 | ||
912 | |a GBV_ILN_2522 | ||
912 | |a GBV_ILN_2548 | ||
912 | |a GBV_ILN_4012 | ||
912 | |a GBV_ILN_4035 | ||
912 | |a GBV_ILN_4037 | ||
912 | |a GBV_ILN_4046 | ||
912 | |a GBV_ILN_4112 | ||
912 | |a GBV_ILN_4125 | ||
912 | |a GBV_ILN_4126 | ||
912 | |a GBV_ILN_4242 | ||
912 | |a GBV_ILN_4246 | ||
912 | |a GBV_ILN_4249 | ||
912 | |a GBV_ILN_4251 | ||
912 | |a GBV_ILN_4305 | ||
912 | |a GBV_ILN_4306 | ||
912 | |a GBV_ILN_4307 | ||
912 | |a GBV_ILN_4313 | ||
912 | |a GBV_ILN_4322 | ||
912 | |a GBV_ILN_4323 | ||
912 | |a GBV_ILN_4324 | ||
912 | |a GBV_ILN_4325 | ||
912 | |a GBV_ILN_4326 | ||
912 | |a GBV_ILN_4328 | ||
912 | |a GBV_ILN_4333 | ||
912 | |a GBV_ILN_4334 | ||
912 | |a GBV_ILN_4335 | ||
912 | |a GBV_ILN_4336 | ||
912 | |a GBV_ILN_4338 | ||
912 | |a GBV_ILN_4393 | ||
912 | |a GBV_ILN_4700 | ||
936 | b | k | |a 31.00 |q ASE |
951 | |a AR | ||
952 | |d 117 |j 2018 |e 2 |b 06 |c 09 |h 721-744 |
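
The abstract above describes the DeepPatent approach only at a high level: word-embedded patent titles and abstracts fed through a convolutional neural network that predicts one of 637 subclass-level categories. Purely as a hypothetical illustration of that kind of model (not the authors' released code; the embedding size, kernel widths, filter count, dropout rate, vocabulary size, and sequence length below are placeholder assumptions), a minimal PyTorch sketch might look like this:

```python
# Hypothetical sketch of a CNN-over-word-embeddings patent classifier.
# NOT the authors' DeepPatent code; hyperparameters are placeholders.
import torch
import torch.nn as nn

class TextCNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=200, num_classes=637,
                 kernel_sizes=(3, 4, 5), num_filters=128, dropout=0.5):
        super().__init__()
        # Token ids from the patent title + abstract are embedded,
        # convolved with several kernel widths, max-pooled over time,
        # and classified into subclass-level categories.
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes
        )
        self.dropout = nn.Dropout(dropout)
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids):                       # (batch, seq_len)
        x = self.embedding(token_ids).transpose(1, 2)   # (batch, embed, seq)
        pooled = [torch.relu(c(x)).max(dim=2).values for c in self.convs]
        features = self.dropout(torch.cat(pooled, dim=1))
        return self.fc(features)                        # (batch, num_classes)

# Toy usage with two padded title+abstract sequences of 300 tokens.
model = TextCNN(vocab_size=50_000)
logits = model(torch.randint(1, 50_000, (2, 300)))
print(logits.shape)  # torch.Size([2, 637])
```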
author_variant |
s l sl j h jh y c yc j h jh |
---|---|
matchkey_str |
article:15882861:2018----::epaetaetlsiiainihovltoanuant |
hierarchy_sort_str |
2018 |
bklnumber |
31.00 |
publishDate |
2018 |
allfields |
10.1007/s11192-018-2905-5 doi (DE-627)SPR017619327 (SPR)s11192-018-2905-5-e DE-627 ger DE-627 rakwb eng 050 370 ASE 31.00 bkl Li, Shaobo verfasserin aut DeepPatent: patent classification with convolutional neural networks and word embedding 2018 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier Abstract Patent classification is an essential task in patent information management and patent knowledge mining. However, this task is still largely done manually due to the unsatisfactory performance of current algorithms. Recently, deep learning methods such as convolutional neural networks (CNN) have led to great progress in image processing, voice recognition, and speech recognition, which has yet to be applied to patent classification. We proposed DeepPatent, a deep learning algorithm for patent classification based on CNN and word vector embedding. We evaluated the algorithm on the standard patent classification benchmark dataset CLEF-IP and compared it with other algorithms in the CLEF-IP competition. Experiments showed that DeepPatent with automatic feature extraction achieved a classification precision of 83.98%, which outperformed all the existing algorithms that used the same information for training. Its performance is better than the state-of-art patent classifier with a precision of 83.50%, whose performance is, however, based on 4000 characters from the description section and a lot of feature engineering while DeepPatent only used the title and abstract information. DeepPatent is further tested on USPTO-2M, a patent classification benchmark data set that we contributed with 2,000,147 records after data cleaning of 2,679,443 USA raw utility patent documents in 637 categories at the subclass level. Our algorithms achieved a precision of 73.88%. Patent classification (dpeaa)DE-He213 Text classification (dpeaa)DE-He213 Convolutional neural network (dpeaa)DE-He213 Machine learning (dpeaa)DE-He213 Word embedding (dpeaa)DE-He213 Hu, Jie verfasserin aut Cui, Yuxin verfasserin aut Hu, Jianjun verfasserin aut Enthalten in Scientometrics Dordrecht [u.a.] : Springer Science + Business Media B.V., 1978 117(2018), 2 vom: 06. 
Sept., Seite 721-744 (DE-627)320589099 (DE-600)2018679-4 1588-2861 nnns volume:117 year:2018 number:2 day:06 month:09 pages:721-744 https://dx.doi.org/10.1007/s11192-018-2905-5 lizenzpflichtig Volltext GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER SSG-OPC-MAT SSG-OPC-ASE GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_105 GBV_ILN_110 GBV_ILN_120 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_152 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2070 GBV_ILN_2086 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2116 GBV_ILN_2118 GBV_ILN_2119 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2188 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_2548 GBV_ILN_4012 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4328 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 31.00 ASE AR 117 2018 2 06 09 721-744 |
spelling |
10.1007/s11192-018-2905-5 doi (DE-627)SPR017619327 (SPR)s11192-018-2905-5-e DE-627 ger DE-627 rakwb eng 050 370 ASE 31.00 bkl Li, Shaobo verfasserin aut DeepPatent: patent classification with convolutional neural networks and word embedding 2018 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier Abstract Patent classification is an essential task in patent information management and patent knowledge mining. However, this task is still largely done manually due to the unsatisfactory performance of current algorithms. Recently, deep learning methods such as convolutional neural networks (CNN) have led to great progress in image processing, voice recognition, and speech recognition, which has yet to be applied to patent classification. We proposed DeepPatent, a deep learning algorithm for patent classification based on CNN and word vector embedding. We evaluated the algorithm on the standard patent classification benchmark dataset CLEF-IP and compared it with other algorithms in the CLEF-IP competition. Experiments showed that DeepPatent with automatic feature extraction achieved a classification precision of 83.98%, which outperformed all the existing algorithms that used the same information for training. Its performance is better than the state-of-art patent classifier with a precision of 83.50%, whose performance is, however, based on 4000 characters from the description section and a lot of feature engineering while DeepPatent only used the title and abstract information. DeepPatent is further tested on USPTO-2M, a patent classification benchmark data set that we contributed with 2,000,147 records after data cleaning of 2,679,443 USA raw utility patent documents in 637 categories at the subclass level. Our algorithms achieved a precision of 73.88%. Patent classification (dpeaa)DE-He213 Text classification (dpeaa)DE-He213 Convolutional neural network (dpeaa)DE-He213 Machine learning (dpeaa)DE-He213 Word embedding (dpeaa)DE-He213 Hu, Jie verfasserin aut Cui, Yuxin verfasserin aut Hu, Jianjun verfasserin aut Enthalten in Scientometrics Dordrecht [u.a.] : Springer Science + Business Media B.V., 1978 117(2018), 2 vom: 06. 
Sept., Seite 721-744 (DE-627)320589099 (DE-600)2018679-4 1588-2861 nnns volume:117 year:2018 number:2 day:06 month:09 pages:721-744 https://dx.doi.org/10.1007/s11192-018-2905-5 lizenzpflichtig Volltext GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER SSG-OPC-MAT SSG-OPC-ASE GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_105 GBV_ILN_110 GBV_ILN_120 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_152 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2070 GBV_ILN_2086 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2116 GBV_ILN_2118 GBV_ILN_2119 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2188 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_2548 GBV_ILN_4012 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4328 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 31.00 ASE AR 117 2018 2 06 09 721-744 |
allfields_unstemmed |
10.1007/s11192-018-2905-5 doi (DE-627)SPR017619327 (SPR)s11192-018-2905-5-e DE-627 ger DE-627 rakwb eng 050 370 ASE 31.00 bkl Li, Shaobo verfasserin aut DeepPatent: patent classification with convolutional neural networks and word embedding 2018 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier Abstract Patent classification is an essential task in patent information management and patent knowledge mining. However, this task is still largely done manually due to the unsatisfactory performance of current algorithms. Recently, deep learning methods such as convolutional neural networks (CNN) have led to great progress in image processing, voice recognition, and speech recognition, which has yet to be applied to patent classification. We proposed DeepPatent, a deep learning algorithm for patent classification based on CNN and word vector embedding. We evaluated the algorithm on the standard patent classification benchmark dataset CLEF-IP and compared it with other algorithms in the CLEF-IP competition. Experiments showed that DeepPatent with automatic feature extraction achieved a classification precision of 83.98%, which outperformed all the existing algorithms that used the same information for training. Its performance is better than the state-of-art patent classifier with a precision of 83.50%, whose performance is, however, based on 4000 characters from the description section and a lot of feature engineering while DeepPatent only used the title and abstract information. DeepPatent is further tested on USPTO-2M, a patent classification benchmark data set that we contributed with 2,000,147 records after data cleaning of 2,679,443 USA raw utility patent documents in 637 categories at the subclass level. Our algorithms achieved a precision of 73.88%. Patent classification (dpeaa)DE-He213 Text classification (dpeaa)DE-He213 Convolutional neural network (dpeaa)DE-He213 Machine learning (dpeaa)DE-He213 Word embedding (dpeaa)DE-He213 Hu, Jie verfasserin aut Cui, Yuxin verfasserin aut Hu, Jianjun verfasserin aut Enthalten in Scientometrics Dordrecht [u.a.] : Springer Science + Business Media B.V., 1978 117(2018), 2 vom: 06. 
Sept., Seite 721-744 (DE-627)320589099 (DE-600)2018679-4 1588-2861 nnns volume:117 year:2018 number:2 day:06 month:09 pages:721-744 https://dx.doi.org/10.1007/s11192-018-2905-5 lizenzpflichtig Volltext GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER SSG-OPC-MAT SSG-OPC-ASE GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_105 GBV_ILN_110 GBV_ILN_120 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_152 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2070 GBV_ILN_2086 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2116 GBV_ILN_2118 GBV_ILN_2119 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2188 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_2548 GBV_ILN_4012 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4328 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 31.00 ASE AR 117 2018 2 06 09 721-744 |
allfieldsGer |
10.1007/s11192-018-2905-5 doi (DE-627)SPR017619327 (SPR)s11192-018-2905-5-e DE-627 ger DE-627 rakwb eng 050 370 ASE 31.00 bkl Li, Shaobo verfasserin aut DeepPatent: patent classification with convolutional neural networks and word embedding 2018 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier Abstract Patent classification is an essential task in patent information management and patent knowledge mining. However, this task is still largely done manually due to the unsatisfactory performance of current algorithms. Recently, deep learning methods such as convolutional neural networks (CNN) have led to great progress in image processing, voice recognition, and speech recognition, which has yet to be applied to patent classification. We proposed DeepPatent, a deep learning algorithm for patent classification based on CNN and word vector embedding. We evaluated the algorithm on the standard patent classification benchmark dataset CLEF-IP and compared it with other algorithms in the CLEF-IP competition. Experiments showed that DeepPatent with automatic feature extraction achieved a classification precision of 83.98%, which outperformed all the existing algorithms that used the same information for training. Its performance is better than the state-of-art patent classifier with a precision of 83.50%, whose performance is, however, based on 4000 characters from the description section and a lot of feature engineering while DeepPatent only used the title and abstract information. DeepPatent is further tested on USPTO-2M, a patent classification benchmark data set that we contributed with 2,000,147 records after data cleaning of 2,679,443 USA raw utility patent documents in 637 categories at the subclass level. Our algorithms achieved a precision of 73.88%. Patent classification (dpeaa)DE-He213 Text classification (dpeaa)DE-He213 Convolutional neural network (dpeaa)DE-He213 Machine learning (dpeaa)DE-He213 Word embedding (dpeaa)DE-He213 Hu, Jie verfasserin aut Cui, Yuxin verfasserin aut Hu, Jianjun verfasserin aut Enthalten in Scientometrics Dordrecht [u.a.] : Springer Science + Business Media B.V., 1978 117(2018), 2 vom: 06. 
Sept., Seite 721-744 (DE-627)320589099 (DE-600)2018679-4 1588-2861 nnns volume:117 year:2018 number:2 day:06 month:09 pages:721-744 https://dx.doi.org/10.1007/s11192-018-2905-5 lizenzpflichtig Volltext GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER SSG-OPC-MAT SSG-OPC-ASE GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_105 GBV_ILN_110 GBV_ILN_120 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_152 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2070 GBV_ILN_2086 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2116 GBV_ILN_2118 GBV_ILN_2119 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2188 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_2548 GBV_ILN_4012 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4328 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 31.00 ASE AR 117 2018 2 06 09 721-744 |
allfieldsSound |
10.1007/s11192-018-2905-5 doi (DE-627)SPR017619327 (SPR)s11192-018-2905-5-e DE-627 ger DE-627 rakwb eng 050 370 ASE 31.00 bkl Li, Shaobo verfasserin aut DeepPatent: patent classification with convolutional neural networks and word embedding 2018 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier Abstract Patent classification is an essential task in patent information management and patent knowledge mining. However, this task is still largely done manually due to the unsatisfactory performance of current algorithms. Recently, deep learning methods such as convolutional neural networks (CNN) have led to great progress in image processing, voice recognition, and speech recognition, which has yet to be applied to patent classification. We proposed DeepPatent, a deep learning algorithm for patent classification based on CNN and word vector embedding. We evaluated the algorithm on the standard patent classification benchmark dataset CLEF-IP and compared it with other algorithms in the CLEF-IP competition. Experiments showed that DeepPatent with automatic feature extraction achieved a classification precision of 83.98%, which outperformed all the existing algorithms that used the same information for training. Its performance is better than the state-of-art patent classifier with a precision of 83.50%, whose performance is, however, based on 4000 characters from the description section and a lot of feature engineering while DeepPatent only used the title and abstract information. DeepPatent is further tested on USPTO-2M, a patent classification benchmark data set that we contributed with 2,000,147 records after data cleaning of 2,679,443 USA raw utility patent documents in 637 categories at the subclass level. Our algorithms achieved a precision of 73.88%. Patent classification (dpeaa)DE-He213 Text classification (dpeaa)DE-He213 Convolutional neural network (dpeaa)DE-He213 Machine learning (dpeaa)DE-He213 Word embedding (dpeaa)DE-He213 Hu, Jie verfasserin aut Cui, Yuxin verfasserin aut Hu, Jianjun verfasserin aut Enthalten in Scientometrics Dordrecht [u.a.] : Springer Science + Business Media B.V., 1978 117(2018), 2 vom: 06. 
Sept., Seite 721-744 (DE-627)320589099 (DE-600)2018679-4 1588-2861 nnns volume:117 year:2018 number:2 day:06 month:09 pages:721-744 https://dx.doi.org/10.1007/s11192-018-2905-5 lizenzpflichtig Volltext GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER SSG-OPC-MAT SSG-OPC-ASE GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_105 GBV_ILN_110 GBV_ILN_120 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_152 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2070 GBV_ILN_2086 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2116 GBV_ILN_2118 GBV_ILN_2119 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2188 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_2548 GBV_ILN_4012 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4328 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 31.00 ASE AR 117 2018 2 06 09 721-744 |
language |
English |
source |
Enthalten in Scientometrics 117(2018), 2 vom: 06. Sept., Seite 721-744 volume:117 year:2018 number:2 day:06 month:09 pages:721-744 |
sourceStr |
Enthalten in Scientometrics 117(2018), 2 vom: 06. Sept., Seite 721-744 volume:117 year:2018 number:2 day:06 month:09 pages:721-744 |
format_phy_str_mv |
Article |
institution |
findex.gbv.de |
topic_facet |
Patent classification Text classification Convolutional neural network Machine learning Word embedding |
dewey-raw |
050 |
isfreeaccess_bool |
false |
container_title |
Scientometrics |
authorswithroles_txt_mv |
Li, Shaobo @@aut@@ Hu, Jie @@aut@@ Cui, Yuxin @@aut@@ Hu, Jianjun @@aut@@ |
publishDateDaySort_date |
2018-09-06T00:00:00Z |
hierarchy_top_id |
320589099 |
dewey-sort |
250 |
id |
SPR017619327 |
language_de |
englisch |
fullrecord |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000caa a22002652 4500</leader><controlfield tag="001">SPR017619327</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20220111052938.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">201006s2018 xx |||||o 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1007/s11192-018-2905-5</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)SPR017619327</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(SPR)s11192-018-2905-5-e</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="082" ind1="0" ind2="4"><subfield code="a">050</subfield><subfield code="a">370</subfield><subfield code="q">ASE</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">31.00</subfield><subfield code="2">bkl</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Li, Shaobo</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">DeepPatent: patent classification with convolutional neural networks and word embedding</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2018</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Abstract Patent classification is an essential task in patent information management and patent knowledge mining. However, this task is still largely done manually due to the unsatisfactory performance of current algorithms. Recently, deep learning methods such as convolutional neural networks (CNN) have led to great progress in image processing, voice recognition, and speech recognition, which has yet to be applied to patent classification. We proposed DeepPatent, a deep learning algorithm for patent classification based on CNN and word vector embedding. We evaluated the algorithm on the standard patent classification benchmark dataset CLEF-IP and compared it with other algorithms in the CLEF-IP competition. Experiments showed that DeepPatent with automatic feature extraction achieved a classification precision of 83.98%, which outperformed all the existing algorithms that used the same information for training. Its performance is better than the state-of-art patent classifier with a precision of 83.50%, whose performance is, however, based on 4000 characters from the description section and a lot of feature engineering while DeepPatent only used the title and abstract information. 
DeepPatent is further tested on USPTO-2M, a patent classification benchmark data set that we contributed with 2,000,147 records after data cleaning of 2,679,443 USA raw utility patent documents in 637 categories at the subclass level. Our algorithms achieved a precision of 73.88%.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Patent classification</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Text classification</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Convolutional neural network</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Machine learning</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Word embedding</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Hu, Jie</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Cui, Yuxin</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Hu, Jianjun</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">Enthalten in</subfield><subfield code="t">Scientometrics</subfield><subfield code="d">Dordrecht [u.a.] : Springer Science + Business Media B.V., 1978</subfield><subfield code="g">117(2018), 2 vom: 06. 
Sept., Seite 721-744</subfield><subfield code="w">(DE-627)320589099</subfield><subfield code="w">(DE-600)2018679-4</subfield><subfield code="x">1588-2861</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:117</subfield><subfield code="g">year:2018</subfield><subfield code="g">number:2</subfield><subfield code="g">day:06</subfield><subfield code="g">month:09</subfield><subfield code="g">pages:721-744</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://dx.doi.org/10.1007/s11192-018-2905-5</subfield><subfield code="z">lizenzpflichtig</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_SPRINGER</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SSG-OPC-MAT</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SSG-OPC-ASE</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_11</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_20</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_22</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_23</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_24</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_31</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_32</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_39</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_40</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_60</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_62</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_63</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_69</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_73</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_74</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_90</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_95</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_100</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_105</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_120</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_138</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_150</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_151</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_152</subfield></datafield><datafield 
tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_161</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_170</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_171</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_187</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_213</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_224</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_230</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_250</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_281</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_285</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_293</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_370</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_602</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_636</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_702</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2001</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2003</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2004</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2005</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2006</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2007</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2009</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2010</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2011</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2014</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2015</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2020</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2021</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2025</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2026</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2027</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2031</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2034</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2038</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2039</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2044</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2048</subfield></datafield><datafield tag="912" ind1=" " ind2=" 
"><subfield code="a">GBV_ILN_2049</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2050</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2055</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2057</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2059</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2061</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2064</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2065</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2068</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2070</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2086</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2088</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2093</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2106</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2107</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2108</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2110</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2111</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2113</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2116</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2118</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2119</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2122</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2129</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2143</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2144</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2147</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2148</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2152</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2153</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2188</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2190</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2232</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2336</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2446</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2470</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2472</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield 
code="a">GBV_ILN_2507</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2522</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_2548</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4012</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4035</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4037</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4046</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4112</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4125</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4126</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4242</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4246</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4249</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4251</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4305</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4306</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4307</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4313</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4322</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4323</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4324</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4325</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4326</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4328</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4333</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4334</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4335</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4336</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4338</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4393</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_4700</subfield></datafield><datafield tag="936" ind1="b" ind2="k"><subfield code="a">31.00</subfield><subfield code="q">ASE</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">117</subfield><subfield code="j">2018</subfield><subfield code="e">2</subfield><subfield code="b">06</subfield><subfield code="c">09</subfield><subfield code="h">721-744</subfield></datafield></record></collection>
|
author |
Li, Shaobo |
spellingShingle |
Li, Shaobo ddc 050 bkl 31.00 misc Patent classification misc Text classification misc Convolutional neural network misc Machine learning misc Word embedding DeepPatent: patent classification with convolutional neural networks and word embedding |
authorStr |
Li, Shaobo |
ppnlink_with_tag_str_mv |
@@773@@(DE-627)320589099 |
format |
electronic Article |
dewey-ones |
050 - General serial publications 370 - Education |
delete_txt_mv |
keep |
author_role |
aut aut aut aut |
collection |
springer |
remote_str |
true |
illustrated |
Not Illustrated |
issn |
1588-2861 |
topic_title |
050 370 ASE 31.00 bkl DeepPatent: patent classification with convolutional neural networks and word embedding Patent classification (dpeaa)DE-He213 Text classification (dpeaa)DE-He213 Convolutional neural network (dpeaa)DE-He213 Machine learning (dpeaa)DE-He213 Word embedding (dpeaa)DE-He213 |
topic |
ddc 050 bkl 31.00 misc Patent classification misc Text classification misc Convolutional neural network misc Machine learning misc Word embedding |
topic_unstemmed |
ddc 050 bkl 31.00 misc Patent classification misc Text classification misc Convolutional neural network misc Machine learning misc Word embedding |
topic_browse |
ddc 050 bkl 31.00 misc Patent classification misc Text classification misc Convolutional neural network misc Machine learning misc Word embedding |
format_facet |
Elektronische Aufsätze Aufsätze Elektronische Ressource |
format_main_str_mv |
Text Zeitschrift/Artikel |
carriertype_str_mv |
cr |
hierarchy_parent_title |
Scientometrics |
hierarchy_parent_id |
320589099 |
dewey-tens |
050 - Magazines, journals & serials 370 - Education |
hierarchy_top_title |
Scientometrics |
isfreeaccess_txt |
false |
familylinks_str_mv |
(DE-627)320589099 (DE-600)2018679-4 |
title |
DeepPatent: patent classification with convolutional neural networks and word embedding |
ctrlnum |
(DE-627)SPR017619327 (SPR)s11192-018-2905-5-e |
title_full |
DeepPatent: patent classification with convolutional neural networks and word embedding |
author_sort |
Li, Shaobo |
journal |
Scientometrics |
journalStr |
Scientometrics |
lang_code |
eng |
isOA_bool |
false |
dewey-hundreds |
000 - Computer science, information & general works 300 - Social sciences |
recordtype |
marc |
publishDateSort |
2018 |
contenttype_str_mv |
txt |
container_start_page |
721 |
author_browse |
Li, Shaobo Hu, Jie Cui, Yuxin Hu, Jianjun |
container_volume |
117 |
class |
050 370 ASE 31.00 bkl |
format_se |
Elektronische Aufsätze |
author-letter |
Li, Shaobo |
doi_str_mv |
10.1007/s11192-018-2905-5 |
dewey-full |
050 370 |
author2-role |
verfasserin |
title_sort |
deeppatent: patent classification with convolutional neural networks and word embedding |
title_auth |
DeepPatent: patent classification with convolutional neural networks and word embedding |
abstract |
Abstract Patent classification is an essential task in patent information management and patent knowledge mining. However, this task is still largely done manually due to the unsatisfactory performance of current algorithms. Recently, deep learning methods such as convolutional neural networks (CNN) have led to great progress in image processing, voice recognition, and speech recognition, which has yet to be applied to patent classification. We proposed DeepPatent, a deep learning algorithm for patent classification based on CNN and word vector embedding. We evaluated the algorithm on the standard patent classification benchmark dataset CLEF-IP and compared it with other algorithms in the CLEF-IP competition. Experiments showed that DeepPatent with automatic feature extraction achieved a classification precision of 83.98%, which outperformed all the existing algorithms that used the same information for training. Its performance is better than the state-of-art patent classifier with a precision of 83.50%, whose performance is, however, based on 4000 characters from the description section and a lot of feature engineering while DeepPatent only used the title and abstract information. DeepPatent is further tested on USPTO-2M, a patent classification benchmark data set that we contributed with 2,000,147 records after data cleaning of 2,679,443 USA raw utility patent documents in 637 categories at the subclass level. Our algorithms achieved a precision of 73.88%. |
abstractGer |
Abstract Patent classification is an essential task in patent information management and patent knowledge mining. However, this task is still largely done manually due to the unsatisfactory performance of current algorithms. Recently, deep learning methods such as convolutional neural networks (CNN) have led to great progress in image processing, voice recognition, and speech recognition, which has yet to be applied to patent classification. We proposed DeepPatent, a deep learning algorithm for patent classification based on CNN and word vector embedding. We evaluated the algorithm on the standard patent classification benchmark dataset CLEF-IP and compared it with other algorithms in the CLEF-IP competition. Experiments showed that DeepPatent with automatic feature extraction achieved a classification precision of 83.98%, which outperformed all the existing algorithms that used the same information for training. Its performance is better than the state-of-art patent classifier with a precision of 83.50%, whose performance is, however, based on 4000 characters from the description section and a lot of feature engineering while DeepPatent only used the title and abstract information. DeepPatent is further tested on USPTO-2M, a patent classification benchmark data set that we contributed with 2,000,147 records after data cleaning of 2,679,443 USA raw utility patent documents in 637 categories at the subclass level. Our algorithms achieved a precision of 73.88%. |
abstract_unstemmed |
Abstract Patent classification is an essential task in patent information management and patent knowledge mining. However, this task is still largely done manually due to the unsatisfactory performance of current algorithms. Recently, deep learning methods such as convolutional neural networks (CNN) have led to great progress in image processing, voice recognition, and speech recognition, which has yet to be applied to patent classification. We proposed DeepPatent, a deep learning algorithm for patent classification based on CNN and word vector embedding. We evaluated the algorithm on the standard patent classification benchmark dataset CLEF-IP and compared it with other algorithms in the CLEF-IP competition. Experiments showed that DeepPatent with automatic feature extraction achieved a classification precision of 83.98%, which outperformed all the existing algorithms that used the same information for training. Its performance is better than the state-of-art patent classifier with a precision of 83.50%, whose performance is, however, based on 4000 characters from the description section and a lot of feature engineering while DeepPatent only used the title and abstract information. DeepPatent is further tested on USPTO-2M, a patent classification benchmark data set that we contributed with 2,000,147 records after data cleaning of 2,679,443 USA raw utility patent documents in 637 categories at the subclass level. Our algorithms achieved a precision of 73.88%. |
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER SSG-OPC-MAT SSG-OPC-ASE GBV_ILN_11 GBV_ILN_20 GBV_ILN_22 GBV_ILN_23 GBV_ILN_24 GBV_ILN_31 GBV_ILN_32 GBV_ILN_39 GBV_ILN_40 GBV_ILN_60 GBV_ILN_62 GBV_ILN_63 GBV_ILN_69 GBV_ILN_70 GBV_ILN_73 GBV_ILN_74 GBV_ILN_90 GBV_ILN_95 GBV_ILN_100 GBV_ILN_105 GBV_ILN_110 GBV_ILN_120 GBV_ILN_138 GBV_ILN_150 GBV_ILN_151 GBV_ILN_152 GBV_ILN_161 GBV_ILN_170 GBV_ILN_171 GBV_ILN_187 GBV_ILN_213 GBV_ILN_224 GBV_ILN_230 GBV_ILN_250 GBV_ILN_281 GBV_ILN_285 GBV_ILN_293 GBV_ILN_370 GBV_ILN_602 GBV_ILN_636 GBV_ILN_702 GBV_ILN_2001 GBV_ILN_2003 GBV_ILN_2004 GBV_ILN_2005 GBV_ILN_2006 GBV_ILN_2007 GBV_ILN_2009 GBV_ILN_2010 GBV_ILN_2011 GBV_ILN_2014 GBV_ILN_2015 GBV_ILN_2020 GBV_ILN_2021 GBV_ILN_2025 GBV_ILN_2026 GBV_ILN_2027 GBV_ILN_2031 GBV_ILN_2034 GBV_ILN_2037 GBV_ILN_2038 GBV_ILN_2039 GBV_ILN_2044 GBV_ILN_2048 GBV_ILN_2049 GBV_ILN_2050 GBV_ILN_2055 GBV_ILN_2057 GBV_ILN_2059 GBV_ILN_2061 GBV_ILN_2064 GBV_ILN_2065 GBV_ILN_2068 GBV_ILN_2070 GBV_ILN_2086 GBV_ILN_2088 GBV_ILN_2093 GBV_ILN_2106 GBV_ILN_2107 GBV_ILN_2108 GBV_ILN_2110 GBV_ILN_2111 GBV_ILN_2112 GBV_ILN_2113 GBV_ILN_2116 GBV_ILN_2118 GBV_ILN_2119 GBV_ILN_2122 GBV_ILN_2129 GBV_ILN_2143 GBV_ILN_2144 GBV_ILN_2147 GBV_ILN_2148 GBV_ILN_2152 GBV_ILN_2153 GBV_ILN_2188 GBV_ILN_2190 GBV_ILN_2232 GBV_ILN_2336 GBV_ILN_2446 GBV_ILN_2470 GBV_ILN_2472 GBV_ILN_2507 GBV_ILN_2522 GBV_ILN_2548 GBV_ILN_4012 GBV_ILN_4035 GBV_ILN_4037 GBV_ILN_4046 GBV_ILN_4112 GBV_ILN_4125 GBV_ILN_4126 GBV_ILN_4242 GBV_ILN_4246 GBV_ILN_4249 GBV_ILN_4251 GBV_ILN_4305 GBV_ILN_4306 GBV_ILN_4307 GBV_ILN_4313 GBV_ILN_4322 GBV_ILN_4323 GBV_ILN_4324 GBV_ILN_4325 GBV_ILN_4326 GBV_ILN_4328 GBV_ILN_4333 GBV_ILN_4334 GBV_ILN_4335 GBV_ILN_4336 GBV_ILN_4338 GBV_ILN_4393 GBV_ILN_4700 |
container_issue |
2 |
title_short |
DeepPatent: patent classification with convolutional neural networks and word embedding |
url |
https://dx.doi.org/10.1007/s11192-018-2905-5 |
remote_bool |
true |
author2 |
Hu, Jie Cui, Yuxin Hu, Jianjun |
author2Str |
Hu, Jie Cui, Yuxin Hu, Jianjun |
ppnlink |
320589099 |
mediatype_str_mv |
c |
isOA_txt |
false |
hochschulschrift_bool |
false |
doi_str |
10.1007/s11192-018-2905-5 |
up_date |
2024-07-03T14:04:42.810Z |
_version_ |
1803566954904027136 |