L1-norm orthogonal neighbourhood preserving projection and its applications
Abstract Dimensionality reduction techniques based on manifold learning are becoming very popular for computer vision tasks like image recognition and image classification. Generally, most of these techniques involve optimizing a cost function in L2-norm and thus they are susceptible to outliers. However, recently, due to capability of handling outliers, L1-norm optimization is drawing the attention of researchers. The work documented here is the first attempt towards the same goal where orthogonal neighbourhood preserving projection (ONPP) technique is performed using optimization in terms of L1-norm to handle data having outliers. In particular, the relationship between ONPP and PCA is established theoretically in the light of L2-norm and then ONPP is optimized using an already proposed mechanism of PCA-L1. Extensive experiments are performed on synthetic as well as real data for applications like classification and recognition. It has been observed that when larger number of training data is available L1-ONPP outperforms its counterpart L2-ONPP.
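For context on the method the abstract summarises: the widely used L2-norm formulation of ONPP (Kokiopoulou and Saad) and one natural L1 analogue of its reconstruction error can be written as below. This is an illustrative sketch of the standard objective, not necessarily the exact formulation used in the paper.

$$
\min_{V \in \mathbb{R}^{D \times d},\; V^\top V = I_d} \;\sum_{i=1}^{N} \Big\| V^\top x_i - \sum_{j} W_{ij}\, V^\top x_j \Big\|_2^2
\;=\; \min_{V^\top V = I_d} \operatorname{tr}\!\left( V^\top X\, (I-W)^\top (I-W)\, X^\top V \right),
$$

where $X=[x_1,\dots,x_N] \in \mathbb{R}^{D\times N}$ stacks the training samples column-wise and $W_{ij}$ are the neighbourhood-reconstruction weights from the LLE-style first step ($W_{ij}=0$ when $x_j$ is not a neighbour of $x_i$); the L2 solution is given by the eigenvectors of $X(I-W)^\top(I-W)X^\top$ belonging to the $d$ smallest eigenvalues. The L1 counterpart suggested by the abstract replaces the squared L2 reconstruction error with $\sum_i \| V^\top x_i - \sum_j W_{ij} V^\top x_j \|_1$, which is what reduces the influence of outlying samples.

The abstract also states that ONPP is optimised "using an already proposed mechanism of PCA-L1", presumably the greedy L1-norm PCA of Kwak (2008), which maximises $\sum_i |w^\top x_i|$ subject to $\|w\|_2 = 1$ one direction at a time with deflation. The NumPy sketch below implements that greedy building block only, under that assumption; it is not the authors' code, and a full L1-ONPP would additionally need the neighbourhood-weight step outlined above. The function name `pca_l1` is illustrative.

```python
import numpy as np

def pca_l1(X, n_components, n_iter=200, seed=0):
    """Greedy L1-norm PCA (Kwak-style): for each direction w, maximise
    sum_i |w^T x_i| subject to ||w||_2 = 1, then deflate the data.

    X : (D, N) array with one (centred) sample per column.
    Returns W : (D, n_components), one projection direction per column.
    """
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float).copy()
    D, _ = X.shape
    W = np.zeros((D, n_components))

    for k in range(n_components):
        if not np.any(X):
            break  # data fully deflated; remaining directions stay zero
        # Initialise with the sample of largest norm (a common heuristic).
        w = X[:, np.argmax(np.linalg.norm(X, axis=0))].copy()
        if not np.any(w):
            w = rng.standard_normal(D)
        w /= np.linalg.norm(w)

        for _ in range(n_iter):
            # Sign step: record which side of the hyperplane each sample lies on,
            s = np.sign(w @ X)
            s[s == 0] = 1.0  # avoid stalling on zero projections
            # then point w at the signed sample sum and renormalise.
            w_new = X @ s
            w_new /= np.linalg.norm(w_new)
            if np.allclose(w_new, w):
                break
            w = w_new

        W[:, k] = w
        # Deflation: remove the recovered direction before the next pass.
        X -= np.outer(w, w @ X)

    return W
```

For example, `pca_l1(np.random.default_rng(1).standard_normal((50, 200)), 5)` returns five unit-norm directions; under heavy-tailed noise such directions are typically less distorted by outlying columns than the leading L2 eigenvectors, which is the motivation the abstract gives for the L1 formulation.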
Detailed description
Author: Koringa, Purvi A. [author]; Mitra, Suman K. [author]
Format: E-Article
Language: English
Published: 2018
Keywords: L1-norm; L2-norm; Outliers; Dimensionality reduction
Host item: Contained in: Pattern Analysis & Applications - Springer-Verlag, 1999, 22(2018), issue 4, 04 Aug., pages 1481-1492
Host item: volume:22 ; year:2018 ; number:4 ; day:04 ; month:08 ; pages:1481-1492
Links: https://dx.doi.org/10.1007/s10044-018-0745-9 (full text, licence required)
DOI / URN: 10.1007/s10044-018-0745-9
Catalogue ID: SPR008218803
LEADER   01000caa a22002652 4500
001      SPR008218803
003      DE-627
005      20201124023812.0
007      cr uuu---uuuuu
008      201005s2018 xx |||||o 00| ||eng c
024 7    |a 10.1007/s10044-018-0745-9 |2 doi
035      |a (DE-627)SPR008218803
035      |a (SPR)s10044-018-0745-9-e
040      |a DE-627 |b ger |c DE-627 |e rakwb
041      |a eng
100 1    |a Koringa, Purvi A. |e verfasserin |4 aut
245 1 0  |a L1-norm orthogonal neighbourhood preserving projection and its applications
264   1  |c 2018
336      |a Text |b txt |2 rdacontent
337      |a Computermedien |b c |2 rdamedia
338      |a Online-Ressource |b cr |2 rdacarrier
520      |a Abstract Dimensionality reduction techniques based on manifold learning are becoming very popular for computer vision tasks like image recognition and image classification. Generally, most of these techniques involve optimizing a cost function in L2-norm and thus they are susceptible to outliers. However, recently, due to capability of handling outliers, L1-norm optimization is drawing the attention of researchers. The work documented here is the first attempt towards the same goal where orthogonal neighbourhood preserving projection (ONPP) technique is performed using optimization in terms of L1-norm to handle data having outliers. In particular, the relationship between ONPP and PCA is established theoretically in the light of L2-norm and then ONPP is optimized using an already proposed mechanism of PCA-L1. Extensive experiments are performed on synthetic as well as real data for applications like classification and recognition. It has been observed that when larger number of training data is available L1-ONPP outperforms its counterpart L2-ONPP.
650   4  |a L1-norm |7 (dpeaa)DE-He213
650   4  |a L2-norm |7 (dpeaa)DE-He213
650   4  |a Outliers |7 (dpeaa)DE-He213
650   4  |a Dimensionality reduction |7 (dpeaa)DE-He213
700 1    |a Mitra, Suman K. |e verfasserin |4 aut
773 0 8  |i Enthalten in |t Pattern Analysis & Applications |d Springer-Verlag, 1999 |g 22(2018), 4 vom: 04. Aug., Seite 1481-1492 |w (DE-627)SPR008209189 |7 nnns
773 1 8  |g volume:22 |g year:2018 |g number:4 |g day:04 |g month:08 |g pages:1481-1492
856 4 0  |u https://dx.doi.org/10.1007/s10044-018-0745-9 |z lizenzpflichtig |3 Volltext
912      |a GBV_USEFLAG_A
912      |a SYSFLAG_A
912      |a GBV_SPRINGER
951      |a AR
952      |d 22 |j 2018 |e 4 |b 04 |c 08 |h 1481-1492
author_variant |
p a k pa pak s k m sk skm |
matchkey_str |
koringapurviamitrasumank:2018----:1omrhgnlegbuhopeevnpoeto |
hierarchy_sort_str |
2018 |
publishDate |
2018 |
allfields |
10.1007/s10044-018-0745-9 doi (DE-627)SPR008218803 (SPR)s10044-018-0745-9-e DE-627 ger DE-627 rakwb eng Koringa, Purvi A. verfasserin aut L1-norm orthogonal neighbourhood preserving projection and its applications 2018 Text txt rdacontent Computermedien c rdamedia Online-Ressource cr rdacarrier Abstract Dimensionality reduction techniques based on manifold learning are becoming very popular for computer vision tasks like image recognition and image classification. Generally, most of these techniques involve optimizing a cost function in L2-norm and thus they are susceptible to outliers. However, recently, due to capability of handling outliers, L1-norm optimization is drawing the attention of researchers. The work documented here is the first attempt towards the same goal where orthogonal neighbourhood preserving projection (ONPP) technique is performed using optimization in terms of L1-norm to handle data having outliers. In particular, the relationship between ONPP and PCA is established theoretically in the light of L2-norm and then ONPP is optimized using an already proposed mechanism of PCA-L1. Extensive experiments are performed on synthetic as well as real data for applications like classification and recognition. It has been observed that when larger number of training data is available L1-ONPP outperforms its counterpart L2-ONPP. L1-norm (dpeaa)DE-He213 L2-norm (dpeaa)DE-He213 Outliers (dpeaa)DE-He213 Dimensionality reduction (dpeaa)DE-He213 Mitra, Suman K. verfasserin aut Enthalten in Pattern Analysis & Applications Springer-Verlag, 1999 22(2018), 4 vom: 04. Aug., Seite 1481-1492 (DE-627)SPR008209189 nnns volume:22 year:2018 number:4 day:04 month:08 pages:1481-1492 https://dx.doi.org/10.1007/s10044-018-0745-9 lizenzpflichtig Volltext GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER AR 22 2018 4 04 08 1481-1492 |
language |
English |
source |
Enthalten in Pattern Analysis & Applications 22(2018), 4 vom: 04. Aug., Seite 1481-1492 volume:22 year:2018 number:4 day:04 month:08 pages:1481-1492 |
sourceStr |
Enthalten in Pattern Analysis & Applications 22(2018), 4 vom: 04. Aug., Seite 1481-1492 volume:22 year:2018 number:4 day:04 month:08 pages:1481-1492 |
format_phy_str_mv |
Article |
institution |
findex.gbv.de |
topic_facet |
L1-norm L2-norm Outliers Dimensionality reduction |
isfreeaccess_bool |
false |
container_title |
Pattern Analysis & Applications |
authorswithroles_txt_mv |
Koringa, Purvi A. @@aut@@ Mitra, Suman K. @@aut@@ |
publishDateDaySort_date |
2018-08-04T00:00:00Z |
hierarchy_top_id |
SPR008209189 |
id |
SPR008218803 |
language_de |
englisch |
fullrecord |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01000caa a22002652 4500</leader><controlfield tag="001">SPR008218803</controlfield><controlfield tag="003">DE-627</controlfield><controlfield tag="005">20201124023812.0</controlfield><controlfield tag="007">cr uuu---uuuuu</controlfield><controlfield tag="008">201005s2018 xx |||||o 00| ||eng c</controlfield><datafield tag="024" ind1="7" ind2=" "><subfield code="a">10.1007/s10044-018-0745-9</subfield><subfield code="2">doi</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-627)SPR008218803</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(SPR)s10044-018-0745-9-e</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-627</subfield><subfield code="b">ger</subfield><subfield code="c">DE-627</subfield><subfield code="e">rakwb</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Koringa, Purvi A.</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">L1-norm orthogonal neighbourhood preserving projection and its applications</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="c">2018</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">Text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">Computermedien</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">Online-Ressource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Abstract Dimensionality reduction techniques based on manifold learning are becoming very popular for computer vision tasks like image recognition and image classification. Generally, most of these techniques involve optimizing a cost function in L2-norm and thus they are susceptible to outliers. However, recently, due to capability of handling outliers, L1-norm optimization is drawing the attention of researchers. The work documented here is the first attempt towards the same goal where orthogonal neighbourhood preserving projection (ONPP) technique is performed using optimization in terms of L1-norm to handle data having outliers. In particular, the relationship between ONPP and PCA is established theoretically in the light of L2-norm and then ONPP is optimized using an already proposed mechanism of PCA-L1. Extensive experiments are performed on synthetic as well as real data for applications like classification and recognition. 
It has been observed that when larger number of training data is available L1-ONPP outperforms its counterpart L2-ONPP.</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">L1-norm</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">L2-norm</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Outliers</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="650" ind1=" " ind2="4"><subfield code="a">Dimensionality reduction</subfield><subfield code="7">(dpeaa)DE-He213</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Mitra, Suman K.</subfield><subfield code="e">verfasserin</subfield><subfield code="4">aut</subfield></datafield><datafield tag="773" ind1="0" ind2="8"><subfield code="i">Enthalten in</subfield><subfield code="t">Pattern Analysis & Applications</subfield><subfield code="d">Springer-Verlag, 1999</subfield><subfield code="g">22(2018), 4 vom: 04. Aug., Seite 1481-1492</subfield><subfield code="w">(DE-627)SPR008209189</subfield><subfield code="7">nnns</subfield></datafield><datafield tag="773" ind1="1" ind2="8"><subfield code="g">volume:22</subfield><subfield code="g">year:2018</subfield><subfield code="g">number:4</subfield><subfield code="g">day:04</subfield><subfield code="g">month:08</subfield><subfield code="g">pages:1481-1492</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://dx.doi.org/10.1007/s10044-018-0745-9</subfield><subfield code="z">lizenzpflichtig</subfield><subfield code="3">Volltext</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_USEFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">SYSFLAG_A</subfield></datafield><datafield tag="912" ind1=" " ind2=" "><subfield code="a">GBV_SPRINGER</subfield></datafield><datafield tag="951" ind1=" " ind2=" "><subfield code="a">AR</subfield></datafield><datafield tag="952" ind1=" " ind2=" "><subfield code="d">22</subfield><subfield code="j">2018</subfield><subfield code="e">4</subfield><subfield code="b">04</subfield><subfield code="c">08</subfield><subfield code="h">1481-1492</subfield></datafield></record></collection>
|
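Since the fullrecord field above embeds the bibliographic data as MARCXML, the subfield values (DOI in 024 |a, keywords in 650 |a, abstract in 520 |a) can be pulled out with Python's standard library; the sketch below is illustrative only, and note that the copy embedded above has its & entities already decoded, so it would need re-escaping to &amp; before parsing. The helper name `marc_subfields` is hypothetical.

```python
import xml.etree.ElementTree as ET

# Namespace declared in the fullrecord MARCXML.
MARC_NS = {"m": "http://www.loc.gov/MARC21/slim"}

def marc_subfields(xml_text, tag, code="a"):
    """Return the |<code> subfield values of every <datafield tag=...> occurrence."""
    # Pass bytes so the embedded encoding declaration is accepted by ElementTree.
    root = ET.fromstring(xml_text.encode("utf-8"))
    return [
        sub.text
        for field in root.iter("{http://www.loc.gov/MARC21/slim}datafield")
        if field.get("tag") == tag
        for sub in field.findall("m:subfield", MARC_NS)
        if sub.get("code") == code
    ]

# With the fullrecord string loaded into `record_xml` (ampersands re-escaped):
# marc_subfields(record_xml, "024")  -> ['10.1007/s10044-018-0745-9']
# marc_subfields(record_xml, "650")  -> ['L1-norm', 'L2-norm', 'Outliers', 'Dimensionality reduction']
```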
author |
Koringa, Purvi A. |
spellingShingle |
Koringa, Purvi A. misc L1-norm misc L2-norm misc Outliers misc Dimensionality reduction L1-norm orthogonal neighbourhood preserving projection and its applications |
authorStr |
Koringa, Purvi A. |
ppnlink_with_tag_str_mv |
@@773@@(DE-627)SPR008209189 |
format |
electronic Article |
delete_txt_mv |
keep |
author_role |
aut aut |
collection |
springer |
remote_str |
true |
illustrated |
Not Illustrated |
topic_title |
L1-norm orthogonal neighbourhood preserving projection and its applications L1-norm (dpeaa)DE-He213 L2-norm (dpeaa)DE-He213 Outliers (dpeaa)DE-He213 Dimensionality reduction (dpeaa)DE-He213 |
topic |
misc L1-norm misc L2-norm misc Outliers misc Dimensionality reduction |
format_facet |
Elektronische Aufsätze Aufsätze Elektronische Ressource |
format_main_str_mv |
Text Zeitschrift/Artikel |
carriertype_str_mv |
cr |
hierarchy_parent_title |
Pattern Analysis & Applications |
hierarchy_parent_id |
SPR008209189 |
hierarchy_top_title |
Pattern Analysis & Applications |
isfreeaccess_txt |
false |
familylinks_str_mv |
(DE-627)SPR008209189 |
title |
L1-norm orthogonal neighbourhood preserving projection and its applications |
ctrlnum |
(DE-627)SPR008218803 (SPR)s10044-018-0745-9-e |
title_full |
L1-norm orthogonal neighbourhood preserving projection and its applications |
author_sort |
Koringa, Purvi A. |
journal |
Pattern Analysis & Applications |
journalStr |
Pattern Analysis & Applications |
lang_code |
eng |
isOA_bool |
false |
recordtype |
marc |
publishDateSort |
2018 |
contenttype_str_mv |
txt |
container_start_page |
1481 |
author_browse |
Koringa, Purvi A. Mitra, Suman K. |
container_volume |
22 |
format_se |
Elektronische Aufsätze |
author-letter |
Koringa, Purvi A. |
doi_str_mv |
10.1007/s10044-018-0745-9 |
author2-role |
verfasserin |
title_sort |
l1-norm orthogonal neighbourhood preserving projection and its applications |
title_auth |
L1-norm orthogonal neighbourhood preserving projection and its applications |
abstract |
Abstract Dimensionality reduction techniques based on manifold learning are becoming very popular for computer vision tasks like image recognition and image classification. Generally, most of these techniques involve optimizing a cost function in L2-norm and thus they are susceptible to outliers. However, recently, due to capability of handling outliers, L1-norm optimization is drawing the attention of researchers. The work documented here is the first attempt towards the same goal where orthogonal neighbourhood preserving projection (ONPP) technique is performed using optimization in terms of L1-norm to handle data having outliers. In particular, the relationship between ONPP and PCA is established theoretically in the light of L2-norm and then ONPP is optimized using an already proposed mechanism of PCA-L1. Extensive experiments are performed on synthetic as well as real data for applications like classification and recognition. It has been observed that when larger number of training data is available L1-ONPP outperforms its counterpart L2-ONPP. |
collection_details |
GBV_USEFLAG_A SYSFLAG_A GBV_SPRINGER |
container_issue |
4 |
title_short |
L1-norm orthogonal neighbourhood preserving projection and its applications |
url |
https://dx.doi.org/10.1007/s10044-018-0745-9 |
remote_bool |
true |
author2 |
Mitra, Suman K. |
author2Str |
Mitra, Suman K. |
ppnlink |
SPR008209189 |
mediatype_str_mv |
c |
isOA_txt |
false |
hochschulschrift_bool |
false |
doi_str |
10.1007/s10044-018-0745-9 |
up_date |
2024-07-03T18:02:03.938Z |
_version_ |
1803581887810109440 |