TY - JOUR
T1 - Smoothed Analysis of the Komlós Conjecture
T2 - Rademacher Noise
AU - Aigner-Horev, Elad
AU - Hefetz, Dan
AU - Trushkin, Michael
N1 - Publisher Copyright:
© The authors.
PY - 2025
Y1 - 2025
N2 - The discrepancy of a matrix M ∈ R^{d×n} is given by DISC(M) := min_{x ∈ {−1,1}^n} ‖Mx‖_∞. An outstanding conjecture, attributed to Komlós, stipulates that DISC(M) = O(1) whenever M is a Komlós matrix, that is, whenever every column of M lies within the unit sphere. Our main result asserts that DISC(M + R/√d) = O(d^{−1/2}) holds asymptotically almost surely, whenever M ∈ R^{d×n} is Komlós, R ∈ R^{d×n} is a Rademacher random matrix, d = ω(1), and n = ω(d log d). The factor d^{−1/2} normalising R is essentially best possible, and the dependency between n and d is asymptotically best possible. Our main source of inspiration is a result by Bansal, Jiang, Meka, Singla, and Sinha (ICALP 2022), who obtained an assertion similar to the one above in the case that the smoothing matrix is Gaussian. They asked whether their result can be attained with the optimal dependency n = ω(d log d) in the case of Bernoulli random noise or any other type of discretely distributed noise; the latter types are more conducive to Smoothed Analysis in other discrepancy settings, such as the Beck-Fiala problem. For Bernoulli noise, their method works if n = ω(d^2). In the case of Rademacher noise, we answer the question posed by Bansal, Jiang, Meka, Singla, and Sinha. Our proof builds upon their approach in a strong way and provides a discrete version of it.
AB - The discrepancy of a matrix M ∈ R^{d×n} is given by DISC(M) := min_{x ∈ {−1,1}^n} ‖Mx‖_∞. An outstanding conjecture, attributed to Komlós, stipulates that DISC(M) = O(1) whenever M is a Komlós matrix, that is, whenever every column of M lies within the unit sphere. Our main result asserts that DISC(M + R/√d) = O(d^{−1/2}) holds asymptotically almost surely, whenever M ∈ R^{d×n} is Komlós, R ∈ R^{d×n} is a Rademacher random matrix, d = ω(1), and n = ω(d log d). The factor d^{−1/2} normalising R is essentially best possible, and the dependency between n and d is asymptotically best possible. Our main source of inspiration is a result by Bansal, Jiang, Meka, Singla, and Sinha (ICALP 2022), who obtained an assertion similar to the one above in the case that the smoothing matrix is Gaussian. They asked whether their result can be attained with the optimal dependency n = ω(d log d) in the case of Bernoulli random noise or any other type of discretely distributed noise; the latter types are more conducive to Smoothed Analysis in other discrepancy settings, such as the Beck-Fiala problem. For Bernoulli noise, their method works if n = ω(d^2). In the case of Rademacher noise, we answer the question posed by Bansal, Jiang, Meka, Singla, and Sinha. Our proof builds upon their approach in a strong way and provides a discrete version of it.
UR - http://www.scopus.com/inward/record.url?scp=105001717346&partnerID=8YFLogxK
U2 - 10.37236/13213
DO - 10.37236/13213
M3 - Article
AN - SCOPUS:105001717346
SN - 1077-8926
VL - 32
JO - Electronic Journal of Combinatorics
JF - Electronic Journal of Combinatorics
IS - 1
M1 - P1.52
ER -