Trading accuracy for simplicity in decision trees
Abstract: When communicating concepts, it is often convenient or even necessary to define a concept approximately. A simple, although only approximately accurate, concept definition may be more useful than a completely accurate definition which involves a lot of detail. This paper addresses the problem: given a completely accurate, but complex, definition of a concept, simplify the definition, possibly at the expense of accuracy, so that the simplified definition still corresponds to the concept “sufficiently” well. Concepts are represented by decision trees, and the method of simplification is tree pruning. Given a decision tree that accurately specifies a concept, the problem is to find a smallest pruned tree that still represents the concept within some specified accuracy. A pruning algorithm is presented that finds an optimal solution by generating a dense sequence of pruned trees, decreasing in size, such that each tree has the highest accuracy among all the possible pruned trees of the same size. An efficient implementation of the algorithm, based on dynamic programming, is presented and empirically compared with three progressive pruning algorithms using both artificial and real-world data. An interesting empirical finding is that the real-world data generally allow significantly greater simplification at equal loss of accuracy.
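The abstract describes an optimal pruning algorithm that, via dynamic programming, produces a dense sequence of pruned trees, each with the highest accuracy among all pruned trees of its size. The paper itself is not reproduced in this record, so the Python sketch below only illustrates that general dynamic-programming idea under stated assumptions; it is not the authors' implementation, and the `Node` structure, the per-node example counts, and the function names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Node:
    """Hypothetical decision-tree node: pruning a subtree to a leaf predicts
    the majority class of the training examples that reach it."""
    n_examples: int            # training examples reaching this node
    n_majority: int            # examples of the majority class at this node
    children: List["Node"] = field(default_factory=list)


def best_correct_by_size(node: Node) -> Dict[int, int]:
    """Dynamic program over the tree: for each possible leaf count s, the
    maximum number of training examples classified correctly by any pruned
    tree rooted at `node` that has exactly s leaves."""
    # Option 1: prune the whole subtree to a single leaf.
    table = {1: node.n_majority}
    if not node.children:
        return table
    # Option 2: keep this split and combine the children's tables
    # knapsack-style (leaf counts add, correct counts add).
    combined = {0: 0}
    for child in node.children:
        child_table = best_correct_by_size(child)
        merged: Dict[int, int] = {}
        for s1, c1 in combined.items():
            for s2, c2 in child_table.items():
                s, c = s1 + s2, c1 + c2
                if c > merged.get(s, -1):
                    merged[s] = c
        combined = merged
    for s, c in combined.items():
        if c > table.get(s, -1):
            table[s] = c
    return table


def dense_pruning_sequence(root: Node) -> List[tuple]:
    """Dense sequence of (leaf count, best accuracy) pairs, decreasing in size."""
    table = best_correct_by_size(root)
    return [(s, table[s] / root.n_examples) for s in sorted(table, reverse=True)]
```

Given such a sequence for the root, the smallest pruned tree within a specified accuracy is simply the smallest size whose accuracy still meets the threshold, which matches the problem statement in the abstract.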
Detailed description

Author: Bohanec, Marko (author)
Format: Article
Language: English
Published: 1994
Keywords: decision trees; knowledge representation; pruning; dynamic programming
Note: © Kluwer Academic Publishers 1994
Contained in: Machine learning (Kluwer Academic Publishers, 1986), 15(1994), no. 3, June, pages 223-250
Parent work: volume:15; year:1994; number:3; month:06; pages:223-250
DOI/URN: 10.1007/BF00993345
Catalog ID: OLC2026514119
LEADER  01000caa a22002652 4500
001     OLC2026514119
003     DE-627
005     20230503172154.0
007     tu
008     200820s1994 xx ||||| 00| ||eng c
024 7_  |a 10.1007/BF00993345 |2 doi
035 __  |a (DE-627)OLC2026514119
035 __  |a (DE-He213)BF00993345-p
040 __  |a DE-627 |b ger |c DE-627 |e rakwb
041 __  |a eng
082 04  |a 150 |a 004 |q VZ
100 1_  |a Bohanec, Marko |e verfasserin |4 aut
245 10  |a Trading accuracy for simplicity in decision trees
264 _1  |c 1994
336 __  |a Text |b txt |2 rdacontent
337 __  |a ohne Hilfsmittel zu benutzen |b n |2 rdamedia
338 __  |a Band |b nc |2 rdacarrier
500 __  |a © Kluwer Academic Publishers 1994
520 __  |a Abstract: When communicating concepts, it is often convenient or even necessary to define a concept approximately. A simple, although only approximately accurate, concept definition may be more useful than a completely accurate definition which involves a lot of detail. This paper addresses the problem: given a completely accurate, but complex, definition of a concept, simplify the definition, possibly at the expense of accuracy, so that the simplified definition still corresponds to the concept “sufficiently” well. Concepts are represented by decision trees, and the method of simplification is tree pruning. Given a decision tree that accurately specifies a concept, the problem is to find a smallest pruned tree that still represents the concept within some specified accuracy. A pruning algorithm is presented that finds an optimal solution by generating a dense sequence of pruned trees, decreasing in size, such that each tree has the highest accuracy among all the possible pruned trees of the same size. An efficient implementation of the algorithm, based on dynamic programming, is presented and empirically compared with three progressive pruning algorithms using both artificial and real-world data. An interesting empirical finding is that the real-world data generally allow significantly greater simplification at equal loss of accuracy.
650 _4  |a decision trees
650 _4  |a knowledge representation
650 _4  |a pruning
650 _4  |a dynamic programming
700 1_  |a Bratko, Ivan |4 aut
773 08  |i Enthalten in |t Machine learning |d Kluwer Academic Publishers, 1986 |g 15(1994), 3 vom: Juni, Seite 223-250 |w (DE-627)12920403X |w (DE-600)54638-0 |w (DE-576)014457377 |x 0885-6125 |7 nnns
773 18  |g volume:15 |g year:1994 |g number:3 |g month:06 |g pages:223-250
856 41  |u https://doi.org/10.1007/BF00993345 |z lizenzpflichtig |3 Volltext
912 __  |a GBV_USEFLAG_A
912 __  |a SYSFLAG_A
912 __  |a GBV_OLC
912 __  |a SSG-OLC-MAT
912 __  |a GBV_ILN_21
912 __  |a GBV_ILN_22
912 __  |a GBV_ILN_24
912 __  |a GBV_ILN_31
912 __  |a GBV_ILN_70
912 __  |a GBV_ILN_130
912 __  |a GBV_ILN_2006
912 __  |a GBV_ILN_2010
912 __  |a GBV_ILN_2020
912 __  |a GBV_ILN_2021
912 __  |a GBV_ILN_2093
912 __  |a GBV_ILN_2244
912 __  |a GBV_ILN_4012
912 __  |a GBV_ILN_4046
912 __  |a GBV_ILN_4266
912 __  |a GBV_ILN_4306
912 __  |a GBV_ILN_4307
912 __  |a GBV_ILN_4318
951 __  |a AR
952 __  |d 15 |j 1994 |e 3 |c 06 |h 223-250