Information Lower Bounds via Self-Reducibility
Abstract: We use self-reduction methods to prove strong information lower bounds on two of the most studied functions in the communication complexity literature: Gap Hamming Distance (GHD) and Inner Product (IP). In our first result we affirm the conjecture that the information cost of GHD is linear even under the uniform distribution, which strengthens the Ω(n) bound recently shown by Kerenidis et al. (2012) and answers an open problem from Chakrabarti et al. (2012). In our second result we prove that the information cost of IP_n is arbitrarily close to the trivial upper bound n as the permitted error tends to zero, again strengthening the Ω(n) lower bound recently proved by Braverman and Weinstein (Electronic Colloquium on Computational Complexity (ECCC) 18, 164, 2011). Our proofs demonstrate that self-reducibility makes the connection between information complexity and communication complexity lower bounds a two-way connection. Whereas numerous results in the past (Chakrabarti et al. 2001; Bar-Yossef et al., J. Comput. Syst. Sci. 68(4), 702–732, 2004; Barak et al. 2010) used information complexity techniques to derive new communication complexity lower bounds, we explore a generic way in which communication complexity lower bounds imply information complexity lower bounds in a black-box manner.
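For orientation, the two headline bounds of the abstract can be written schematically in LaTeX. This is an editorial sketch, assuming the standard notation IC_μ(f, ε) for the information complexity of f under input distribution μ with permitted error ε; the notation itself is not defined in this record.

% GHD: linear information cost even under the uniform distribution U, for a fixed constant error
\[ \mathrm{IC}_{U}\bigl(\mathrm{GHD}_n, \varepsilon\bigr) \;=\; \Omega(n). \]
% IP: the information cost approaches the trivial upper bound n as the permitted error vanishes
\[ \mathrm{IC}\bigl(\mathrm{IP}_n, \varepsilon\bigr) \;\ge\; \bigl(1 - o(1)\bigr)\, n \qquad \text{as } \varepsilon \to 0. \]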
Detailed description
Author: Braverman, Mark [author]
Format: Article
Language: English
Published: 2015
Keywords: Information complexity; Communication complexity; Self-reducibility; Gap-hamming distance; Inner product
Note: © Springer Science+Business Media New York 2015
Contained in: Theory of computing systems - Springer US, 1997, 59(2015), issue 2, 24 Sept., pages 377-396
Contained in (structured): volume:59 ; year:2015 ; number:2 ; day:24 ; month:09 ; pages:377-396
Links:
DOI / URN: 10.1007/s00224-015-9655-z
Catalog ID: OLC2061923089
LEADER 01000caa a22002652 4500
001    OLC2061923089
003    DE-627
005    20230323225335.0
007    tu
008    200819s2015 xx ||||| 00| ||eng c
024 7  |a 10.1007/s00224-015-9655-z |2 doi
035    |a (DE-627)OLC2061923089
035    |a (DE-He213)s00224-015-9655-z-p
040    |a DE-627 |b ger |c DE-627 |e rakwb
041    |a eng
082 04 |a 004 |a 510 |q VZ
082 04 |a 510 |a 000 |q VZ
100 1  |a Braverman, Mark |e verfasserin |4 aut
245 10 |a Information Lower Bounds via Self-Reducibility
264  1 |c 2015
336    |a Text |b txt |2 rdacontent
337    |a ohne Hilfsmittel zu benutzen |b n |2 rdamedia
338    |a Band |b nc |2 rdacarrier
500    |a © Springer Science+Business Media New York 2015
520    |a Abstract We use self-reduction methods to prove strong information lower bounds on two of the most studied functions in the communication complexity literature: Gap Hamming Distance (GHD) and Inner Product (IP). In our first result we affirm the conjecture that the information cost of GHD is linear even under the uniform distribution, which strengthens the Ω(n) bound recently shown by Kerenidis et al. (2012), and answers an open problem from Chakrabarti et al. (2012). In our second result we prove that the information cost of IPn is arbitrarily close to the trivial upper bound n as the permitted error tends to zero, again strengthening the Ω(n) lower bound recently proved by Braverman and Weinstein (Electronic Colloquium on Computational Complexity (ECCC) 18, 164 2011). Our proofs demonstrate that self-reducibility makes the connection between information complexity and communication complexity lower bounds a two-way connection. Whereas numerous results in the past (Chakrabarti et al. 2001; Bar-Yossef et al. J. Comput. Syst. Sci. 68(4), 702–732 2004; Barak et al. 2010) used information complexity techniques to derive new communication complexity lower bounds, we explore a generic way in which communication complexity lower bounds imply information complexity lower bounds in a black-box manner.
650  4 |a Information complexity
650  4 |a Communication complexity
650  4 |a Self-reducibility
650  4 |a Gap-hamming distance
650  4 |a Inner product
700 1  |a Garg, Ankit |4 aut
700 1  |a Pankratov, Denis |4 aut
700 1  |a Weinstein, Omri |4 aut
773 08 |i Enthalten in |t Theory of computing systems |d Springer US, 1997 |g 59(2015), 2 vom: 24. Sept., Seite 377-396 |w (DE-627)222610387 |w (DE-600)1355722-1 |w (DE-576)056755198 |x 1432-4350 |7 nnns
773 18 |g volume:59 |g year:2015 |g number:2 |g day:24 |g month:09 |g pages:377-396
856 41 |u https://doi.org/10.1007/s00224-015-9655-z |z lizenzpflichtig |3 Volltext
912    |a GBV_USEFLAG_A
912    |a SYSFLAG_A
912    |a GBV_OLC
912    |a SSG-OLC-MAT
912    |a SSG-OLC-BUB
912    |a SSG-OPC-MAT
912    |a GBV_ILN_24
912    |a GBV_ILN_30
912    |a GBV_ILN_70
912    |a GBV_ILN_2088
912    |a GBV_ILN_4126
912    |a GBV_ILN_4266
912    |a GBV_ILN_4318
912    |a GBV_ILN_4319
951    |a AR
952    |d 59 |j 2015 |e 2 |b 24 |c 09 |h 377-396
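The block above is a standard MARC21 bibliographic record (field 245 = title, 100/700 = authors, 024 = DOI, 773 = host journal). As a minimal sketch of how such a record could be read programmatically, the following Python uses only the standard library and assumes the record is available as a MARCXML export saved under the hypothetical file name record.xml; the namespace is the one MARCXML declares (http://www.loc.gov/MARC21/slim), and the helper subfield_values is our own illustration, not part of any catalog API.

    import xml.etree.ElementTree as ET

    NS = {"marc": "http://www.loc.gov/MARC21/slim"}  # MARCXML namespace

    def subfield_values(record, tag, code):
        # Collect every subfield <code> from all datafields carrying the given tag.
        return [
            sf.text
            for df in record.findall(f"marc:datafield[@tag='{tag}']", NS)
            for sf in df.findall(f"marc:subfield[@code='{code}']", NS)
            if sf.text
        ]

    # record.xml: hypothetical MARCXML export of the record shown above.
    root = ET.parse("record.xml").getroot()
    record = root.find(".//marc:record", NS)

    title = subfield_values(record, "245", "a")[0]
    authors = subfield_values(record, "100", "a") + subfield_values(record, "700", "a")
    doi = subfield_values(record, "024", "a")[0]

    print(title)                      # Information Lower Bounds via Self-Reducibility
    print("; ".join(authors))         # first author plus the three co-authors
    print("https://doi.org/" + doi)   # resolves to the publisher's full text (license required)

Run against such an export, this would print the title, the four authors, and a resolvable DOI link for the article.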