Constructing $$L_{1}$$-graphs for subspace learning via recurrent neural networks
Abstract: In this paper, we are interested in constructing $$l_{1}$$-graphs for subspace learning via recurrent neural networks. We propose a closed affine subspace learning (CASL) method to do so. CASL is formulated as an optimization problem described by an energy function on the non-negative orthant. Sufficient conditions for keeping the minimum points of the energy function in the non-negative orthant are given. A Lotka–Volterra recurrent neural network (LV RNN) model is constructed to solve the optimization problem under these sufficient conditions. It is shown that the set of stable attractors of the network model equals the set of minimum points of the energy function in the non-negative orthant. Based on this equivalence, the CASL problem can be solved by running the proposed LV RNNs to obtain the stable attractors, from which the $$l_{1}$$-graphs can then be constructed. Experiments on synthetic and real data show that the $$l_{1}$$-graphs constructed in this way are more effective, especially when processing data sampled from multiple manifolds.
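The idea summarized in the abstract, minimizing an energy function over the non-negative orthant with Lotka–Volterra-style dynamics whose stable attractors are the sparse codes, can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact CASL formulation (which involves a closed affine constraint); the dictionary `D`, regularizer `lam`, step size `eta`, and the plain $$\ell_1$$-penalized least-squares energy are assumptions made for illustration.

```python
import numpy as np

def lv_sparse_code(D, y, lam=0.1, eta=0.05, steps=3000):
    """Sketch: minimize E(x) = 0.5*||y - D x||^2 + lam*sum(x) over x >= 0
    using Lotka-Volterra-style dynamics x_i' = -x_i * dE/dx_i.
    The multiplicative discretization keeps the state in the
    non-negative orthant, mirroring the abstract's claim that the
    dynamics stay in (and converge within) that orthant."""
    n = D.shape[1]
    x = np.full(n, 1.0 / n)  # start strictly inside the orthant
    for _ in range(steps):
        # gradient of E; on x >= 0 the l1 term contributes the constant lam
        grad = D.T @ (D @ x - y) + lam
        # Euler step of the LV dynamics; clipping at 0 preserves x >= 0
        x = x * np.maximum(1.0 - eta * grad, 0.0)
    return x

# the sparse codes of all samples would then supply the edge weights
# of the l1-graph used for subspace learning
```

With an identity dictionary the attractor is simply the soft-thresholded input, which makes the behavior easy to check by hand.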
Detailed description
Author: Kuang, Yin [author]
Format: Article
Language: English
Published: 2014
Subjects: Closed affine subspace learning (CASL); Non-negative matrix factorization (NMF); Sparse representation; Multiple manifolds clustering and embedding; Lotka–Volterra recurrent neural networks (LV RNNs)
Note: © Springer-Verlag London 2014
Parent work: Contained in: Pattern analysis and applications - Springer London, 1998, 18(2014), no. 4, 09 May, pages 817-828
Parent work: volume:18 ; year:2014 ; number:4 ; day:09 ; month:05 ; pages:817-828
Links:
DOI / URN: 10.1007/s10044-014-0370-1
Catalog ID: OLC2051700257
LEADER 01000caa a22002652 4500
001 OLC2051700257
003 DE-627
005 20230502161411.0
007 tu
008 200819s2014 xx ||||| 00| ||eng c
024 7 |a 10.1007/s10044-014-0370-1 |2 doi
035 |a (DE-627)OLC2051700257
035 |a (DE-He213)s10044-014-0370-1-p
040 |a DE-627 |b ger |c DE-627 |e rakwb
041 |a eng
082 0 4 |a 004 |a 600 |q VZ
084 |a 54.74$jMaschinelles Sehen |2 bkl
100 1 |a Kuang, Yin |e verfasserin |4 aut
245 1 0 |a Constructing $$L_{1}$$-graphs for subspace learning via recurrent neural networks
264 1 |c 2014
336 |a Text |b txt |2 rdacontent
337 |a ohne Hilfsmittel zu benutzen |b n |2 rdamedia
338 |a Band |b nc |2 rdacarrier
500 |a © Springer-Verlag London 2014
520 |a Abstract In this paper, the problem that we are interested is constructing $$l_{1}$$-graphs for subspace learning via recurrent neural networks. We propose a closed affine subspace learning (CASL) method to do so. The problem of CASL is formulated as an optimization problem described by an energy function in the non-negative orthant. The sufficient conditions for keeping the minimum points of the energy function in the non-negative orthant are given. A model of Lotka–Volterra recurrent neural networks is constructed to solve the optimization problem in terms of these sufficient conditions. It shows that the set of stable attractors of the network model just equals the set of minimum points of the energy function in the non-negative orthant. Based on these equivalences, the problem of CASL can be solved by running the proposed LV RNNs to obtain the stable attractors. The $$l_{1}$$-graphs then can be constructed. Experiments on some synthetic and real data show that the $$l_{1}$$-graphs constructed in this way are more effective, especially when processing the data sampled from multiple manifolds.
650 4 |a Closed affine subspace learning (CASL)
650 4 |a Non-negative matrix factorization (NMF)
650 4 |a Sparse representation
650 4 |a Multiple manifolds clustering and embedding
650 4 |a Lotka–Volterra recurrent neural networks (LV RNNs)
700 1 |a Zhang, Lei |4 aut
700 1 |a Yi, Zhang |4 aut
773 0 8 |i Enthalten in |t Pattern analysis and applications |d Springer London, 1998 |g 18(2014), 4 vom: 09. Mai, Seite 817-828 |w (DE-627)24992921X |w (DE-600)1446989-3 |w (DE-576)27655583X |x 1433-7541 |7 nnns
773 1 8 |g volume:18 |g year:2014 |g number:4 |g day:09 |g month:05 |g pages:817-828
856 4 1 |u https://doi.org/10.1007/s10044-014-0370-1 |z lizenzpflichtig |3 Volltext
912 |a GBV_USEFLAG_A
912 |a SYSFLAG_A
912 |a GBV_OLC
912 |a SSG-OLC-MAT
912 |a GBV_ILN_70
936 b k |a 54.74$jMaschinelles Sehen |q VZ |0 10641030X |0 (DE-625)10641030X
951 |a AR
952 |d 18 |j 2014 |e 4 |b 09 |c 05 |h 817-828
author_variant |
y k yk l z lz z y zy |
matchkey_str |
article:14337541:2014----::osrcig_gahfrusaeerigircr |
hierarchy_sort_str |
2014 |
bklnumber |
54.74$jMaschinelles Sehen |
publishDate |
2014 |
allfields |
10.1007/s10044-014-0370-1 doi (DE-627)OLC2051700257 (DE-He213)s10044-014-0370-1-p DE-627 ger DE-627 rakwb eng 004 600 VZ 54.74$jMaschinelles Sehen bkl Kuang, Yin verfasserin aut Constructing $$L_{1}$$-graphs for subspace learning via recurrent neural networks 2014 Text txt rdacontent ohne Hilfsmittel zu benutzen n rdamedia Band nc rdacarrier © Springer-Verlag London 2014 Abstract In this paper, the problem that we are interested is constructing $$l_{1}$$-graphs for subspace learning via recurrent neural networks. We propose a closed affine subspace learning (CASL) method to do so. The problem of CASL is formulated as an optimization problem described by an energy function in the non-negative orthant. The sufficient conditions for keeping the minimum points of the energy function in the non-negative orthant are given. A model of Lotka–Volterra recurrent neural networks is constructed to solve the optimization problem in terms of these sufficient conditions. It shows that the set of stable attractors of the network model just equals the set of minimum points of the energy function in the non-negative orthant. Based on these equivalences, the problem of CASL can be solved by running the proposed LV RNNs to obtain the stable attractors. The $$l_{1}$$-graphs then can be constructed. Experiments on some synthetic and real data show that the $$l_{1}$$-graphs constructed in this way are more effective, especially when processing the data sampled from multiple manifolds. Closed affine subspace learning (CASL) Non-negative matrix factorization (NMF) Sparse representation Multiple manifolds clustering and embedding Lotka–Volterra recurrent neural networks (LV RNNs) Zhang, Lei aut Yi, Zhang aut Enthalten in Pattern analysis and applications Springer London, 1998 18(2014), 4 vom: 09. 
Mai, Seite 817-828 (DE-627)24992921X (DE-600)1446989-3 (DE-576)27655583X 1433-7541 nnns volume:18 year:2014 number:4 day:09 month:05 pages:817-828 https://doi.org/10.1007/s10044-014-0370-1 lizenzpflichtig Volltext GBV_USEFLAG_A SYSFLAG_A GBV_OLC SSG-OLC-MAT GBV_ILN_70 54.74$jMaschinelles Sehen VZ 10641030X (DE-625)10641030X AR 18 2014 4 09 05 817-828 |
language |
English |
source |
Enthalten in Pattern analysis and applications 18(2014), 4 vom: 09. Mai, Seite 817-828 volume:18 year:2014 number:4 day:09 month:05 pages:817-828 |
sourceStr |
Enthalten in Pattern analysis and applications 18(2014), 4 vom: 09. Mai, Seite 817-828 volume:18 year:2014 number:4 day:09 month:05 pages:817-828 |
format_phy_str_mv |
Article |
institution |
findex.gbv.de |
topic_facet |
Closed affine subspace learning (CASL) Non-negative matrix factorization (NMF) Sparse representation Multiple manifolds clustering and embedding Lotka–Volterra recurrent neural networks (LV RNNs) |
dewey-raw |
004 |
isfreeaccess_bool |
false |
container_title |
Pattern analysis and applications |
authorswithroles_txt_mv |
Kuang, Yin @@aut@@ Zhang, Lei @@aut@@ Yi, Zhang @@aut@@ |
publishDateDaySort_date |
2014-05-09T00:00:00Z |
hierarchy_top_id |
24992921X |
dewey-sort |
14 |
id |
OLC2051700257 |
language_de |
englisch |
fullrecord |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000caa a22002652 4500</leader><controlfield tag="001">OLC2051700257</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20230502161411.0</controlfield><controlfield tag="007">tu</controlfield><controlfield tag="008">200819s2014 xx ||||| 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1007/s10044-014-0370-1</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)OLC2051700257</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-He213)s10044-014-0370-1-p</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="082" ind1="0" ind2="4"><subfield code="a">004</subfield><subfield code="a">600</subfield><subfield code="q">VZ</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">54.74$jMaschinelles Sehen</subfield><subfield code="2">bkl</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Kuang, Yin</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Constructing $$L_{1}$$-graphs for subspace learning via recurrent neural networks</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2014</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">ohne Hilfsmittel zu benutzen</subfield><subfield 
code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Band</subfield><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">© Springer-Verlag London 2014</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Abstract In this paper, the problem that we are interested is constructing $$l_{1}$$-graphs for subspace learning via recurrent neural networks. We propose a closed affine subspace learning (CASL) method to do so. The problem of CASL is formulated as an optimization problem described by an energy function in the non-negative orthant. The sufficient conditions for keeping the minimum points of the energy function in the non-negative orthant are given. A model of Lotka–Volterra recurrent neural networks is constructed to solve the optimization problem in terms of these sufficient conditions. It shows that the set of stable attractors of the network model just equals the set of minimum points of the energy function in the non-negative orthant. Based on these equivalences, the problem of CASL can be solved by running the proposed LV RNNs to obtain the stable attractors. The $$l_{1}$$-graphs then can be constructed. 
Experiments on some synthetic and real data show that the $$l_{1}$$-graphs constructed in this way are more effective, especially when processing the data sampled from multiple manifolds.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Closed affine subspace learning (CASL)</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Non-negative matrix factorization (NMF)</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Sparse representation</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Multiple manifolds clustering and embedding</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Lotka–Volterra recurrent neural networks (LV RNNs)</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Zhang, Lei</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Yi, Zhang</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">Enthalten in</subfield><subfield code="t">Pattern analysis and applications</subfield><subfield code="d">Springer London, 1998</subfield><subfield code="g">18(2014), 4 vom: 09. 
Mai, Seite 817-828</subfield><subfield code="w">(DE-627)24992921X</subfield><subfield code="w">(DE-600)1446989-3</subfield><subfield code="w">(DE-576)27655583X</subfield><subfield code="x">1433-7541</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:18</subfield><subfield code="g">year:2014</subfield><subfield code="g">number:4</subfield><subfield code="g">day:09</subfield><subfield code="g">month:05</subfield><subfield code="g">pages:817-828</subfield></datafield><datafield tag="856" ind1="4" ind2="1"><subfield code="u">https://doi.org/10.1007/s10044-014-0370-1</subfield><subfield code="z">lizenzpflichtig</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_OLC</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SSG-OLC-MAT</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="936" ind1="b" ind2="k"><subfield code="a">54.74$jMaschinelles Sehen</subfield><subfield code="q">VZ</subfield><subfield code="0">10641030X</subfield><subfield code="0">(DE-625)10641030X</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">18</subfield><subfield code="j">2014</subfield><subfield code="e">4</subfield><subfield code="b">09</subfield><subfield code="c">05</subfield><subfield code="h">817-828</subfield></datafield></record></collection>
|
author |
Kuang, Yin |
spellingShingle |
Kuang, Yin ddc 004 bkl 54.74$jMaschinelles Sehen misc Closed affine subspace learning (CASL) misc Non-negative matrix factorization (NMF) misc Sparse representation misc Multiple manifolds clustering and embedding misc Lotka–Volterra recurrent neural networks (LV RNNs) Constructing $$L_{1}$$-graphs for subspace learning via recurrent neural networks |
authorStr |
Kuang, Yin |
ppnlink_with_tag_str_mv |
@@773@@(DE-627)24992921X |
format |
Article |
dewey-ones |
004 - Data processing & computer science 600 - Technology |
delete_txt_mv |
keep |
author_role |
aut aut aut |
collection |
OLC |
remote_str |
false |
illustrated |
Not Illustrated |
issn |
1433-7541 |
topic_title |
004 600 VZ 54.74$jMaschinelles Sehen bkl Constructing $$L_{1}$$-graphs for subspace learning via recurrent neural networks Closed affine subspace learning (CASL) Non-negative matrix factorization (NMF) Sparse representation Multiple manifolds clustering and embedding Lotka–Volterra recurrent neural networks (LV RNNs) |
topic |
ddc 004 bkl 54.74$jMaschinelles Sehen misc Closed affine subspace learning (CASL) misc Non-negative matrix factorization (NMF) misc Sparse representation misc Multiple manifolds clustering and embedding misc Lotka–Volterra recurrent neural networks (LV RNNs) |
format_facet |
Aufsätze Gedruckte Aufsätze |
format_main_str_mv |
Text Zeitschrift/Artikel |
carriertype_str_mv |
nc |
hierarchy_parent_title |
Pattern analysis and applications |
hierarchy_parent_id |
24992921X |
dewey-tens |
000 - Computer science, knowledge & systems 600 - Technology |
hierarchy_top_title |
Pattern analysis and applications |
isfreeaccess_txt |
false |
familylinks_str_mv |
(DE-627)24992921X (DE-600)1446989-3 (DE-576)27655583X |
title |
Constructing $$L_{1}$$-graphs for subspace learning via recurrent neural networks |
ctrlnum |
(DE-627)OLC2051700257 (DE-He213)s10044-014-0370-1-p |
title_full |
Constructing $$L_{1}$$-graphs for subspace learning via recurrent neural networks |
author_sort |
Kuang, Yin |
journal |
Pattern analysis and applications |
journalStr |
Pattern analysis and applications |
lang_code |
eng |
isOA_bool |
false |
dewey-hundreds |
000 - Computer science, information & general works 600 - Technology |
recordtype |
marc |
publishDateSort |
2014 |
contenttype_str_mv |
txt |
container_start_page |
817 |
author_browse |
Kuang, Yin Zhang, Lei Yi, Zhang |
container_volume |
18 |
class |
004 600 VZ 54.74$jMaschinelles Sehen bkl |
format_se |
Aufsätze |
author-letter |
Kuang, Yin |
doi_str_mv |
10.1007/s10044-014-0370-1 |
normlink |
10641030X |
normlink_prefix_str_mv |
10641030X (DE-625)10641030X |
dewey-full |
004 600 |
title_sort |
constructing $$l_{1}$$-graphs for subspace learning via recurrent neural networks |
title_auth |
Constructing $$L_{1}$$-graphs for subspace learning via recurrent neural networks |
abstract |
Abstract In this paper, the problem that we are interested is constructing $$l_{1}$$-graphs for subspace learning via recurrent neural networks. We propose a closed affine subspace learning (CASL) method to do so. The problem of CASL is formulated as an optimization problem described by an energy function in the non-negative orthant. The sufficient conditions for keeping the minimum points of the energy function in the non-negative orthant are given. A model of Lotka–Volterra recurrent neural networks is constructed to solve the optimization problem in terms of these sufficient conditions. It shows that the set of stable attractors of the network model just equals the set of minimum points of the energy function in the non-negative orthant. Based on these equivalences, the problem of CASL can be solved by running the proposed LV RNNs to obtain the stable attractors. The $$l_{1}$$-graphs then can be constructed. Experiments on some synthetic and real data show that the $$l_{1}$$-graphs constructed in this way are more effective, especially when processing the data sampled from multiple manifolds. © Springer-Verlag London 2014 |
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_OLC SSG-OLC-MAT GBV_ILN_70 |
container_issue |
4 |
title_short |
Constructing $$L_{1}$$-graphs for subspace learning via recurrent neural networks |
url |
https://doi.org/10.1007/s10044-014-0370-1 |
remote_bool |
false |
author2 |
Zhang, Lei Yi, Zhang |
author2Str |
Zhang, Lei Yi, Zhang |
ppnlink |
24992921X |
mediatype_str_mv |
n |
isOA_txt |
false |
hochschulschrift_bool |
false |
doi_str |
10.1007/s10044-014-0370-1 |
up_date |
2024-07-04T05:04:17.422Z |
_version_ |
1803623551376293888 |
fullrecord_marcxml |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000caa a22002652 4500</leader><controlfield tag="001">OLC2051700257</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20230502161411.0</controlfield><controlfield tag="007">tu</controlfield><controlfield tag="008">200819s2014 xx ||||| 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1007/s10044-014-0370-1</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)OLC2051700257</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-He213)s10044-014-0370-1-p</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="082" ind1="0" ind2="4"><subfield code="a">004</subfield><subfield code="a">600</subfield><subfield code="q">VZ</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">54.74$jMaschinelles Sehen</subfield><subfield code="2">bkl</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Kuang, Yin</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Constructing $$L_{1}$$-graphs for subspace learning via recurrent neural networks</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2014</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">ohne Hilfsmittel zu benutzen</subfield><subfield 
code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Band</subfield><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">© Springer-Verlag London 2014</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Abstract In this paper, the problem that we are interested is constructing $$l_{1}$$-graphs for subspace learning via recurrent neural networks. We propose a closed affine subspace learning (CASL) method to do so. The problem of CASL is formulated as an optimization problem described by an energy function in the non-negative orthant. The sufficient conditions for keeping the minimum points of the energy function in the non-negative orthant are given. A model of Lotka–Volterra recurrent neural networks is constructed to solve the optimization problem in terms of these sufficient conditions. It shows that the set of stable attractors of the network model just equals the set of minimum points of the energy function in the non-negative orthant. Based on these equivalences, the problem of CASL can be solved by running the proposed LV RNNs to obtain the stable attractors. The $$l_{1}$$-graphs then can be constructed. 
Experiments on synthetic and real data show that the $$l_{1}$$-graphs constructed in this way are more effective, especially when processing data sampled from multiple manifolds.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Closed affine subspace learning (CASL)</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Non-negative matrix factorization (NMF)</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Sparse representation</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Multiple manifolds clustering and embedding</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Lotka–Volterra recurrent neural networks (LV RNNs)</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Zhang, Lei</subfield><subfield code="4">aut</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Yi, Zhang</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">Enthalten in</subfield><subfield code="t">Pattern analysis and applications</subfield><subfield code="d">Springer London, 1998</subfield><subfield code="g">18(2014), 4 vom: 09. 
Mai, Seite 817-828</subfield><subfield code="w">(DE-627)24992921X</subfield><subfield code="w">(DE-600)1446989-3</subfield><subfield code="w">(DE-576)27655583X</subfield><subfield code="x">1433-7541</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:18</subfield><subfield code="g">year:2014</subfield><subfield code="g">number:4</subfield><subfield code="g">day:09</subfield><subfield code="g">month:05</subfield><subfield code="g">pages:817-828</subfield></datafield><datafield tag="856" ind1="4" ind2="1"><subfield code="u">https://doi.org/10.1007/s10044-014-0370-1</subfield><subfield code="z">lizenzpflichtig</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_OLC</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SSG-OLC-MAT</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_ILN_70</subfield></datafield><datafield tag="936" ind1="b" ind2="k"><subfield code="a">54.74$jMaschinelles Sehen</subfield><subfield code="q">VZ</subfield><subfield code="0">10641030X</subfield><subfield code="0">(DE-625)10641030X</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">18</subfield><subfield code="j">2014</subfield><subfield code="e">4</subfield><subfield code="b">09</subfield><subfield code="c">05</subfield><subfield code="h">817-828</subfield></datafield></record></collection>