Greedy algorithm computing Minkowski reduced lattice bases with quadratic bit complexity of input vectors

Hao Chen, Liqing Xu

Chinese Annals of Mathematics, Series B ›› 2011, Vol. 32 ›› Issue (6): 857-862. DOI: 10.1007/s11401-011-0680-1

Abstract

The authors present a modification of the greedy reduction algorithm of Nguyen and Stehle (2009). The modified algorithm computes Minkowski reduced bases for lattices of arbitrary rank, with bit complexity quadratic in the size of the input vectors. The total bit complexity of the algorithm is O(n^2 \cdot (4n!)^n \cdot (\tfrac{n!}{2^n})^{\tfrac{n}{2}} \cdot (\tfrac{4}{3})^{\tfrac{n(n-1)}{4}} \cdot (\tfrac{3}{2})^{\tfrac{n^2(n-1)}{2}} \cdot \log^2 A), where n is the rank of the lattice and A is the maximal norm of the input basis vectors. For lattices of fixed rank, this gives an O(\log^2 A) algorithm for computing Minkowski reduced bases. For comparison, an algorithm of time complexity n! \cdot 3^n \cdot (\log A)^{O(1)} that computes the successive minima with the help of the dual Hermite-Korkin-Zolotarev basis was given by Blomer in 2000, and it was improved to time complexity n! \cdot (\log A)^{O(1)} by Micciancio in 2008. The algorithm in this paper is better suited to computing Minkowski reduced bases of low rank lattices whose basis vectors are very large.
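As an illustration only, the following is a minimal Python sketch of the rank-2 special case of greedy reduction, namely Lagrange-Gauss reduction, which in dimension 2 already yields a Minkowski reduced basis. It is meant to convey the greedy idea that the paper builds on, not the authors' modified algorithm for arbitrary rank; the function names and the integer-vector representation are assumptions made for this sketch.

    def lagrange_reduce(b1, b2):
        """Rank-2 greedy (Lagrange-Gauss) reduction of an integer lattice basis.

        Repeatedly subtracts from the longer vector the nearest integer multiple
        of the shorter one, swapping whenever the new vector becomes the shorter.
        In dimension 2 the resulting basis is Minkowski reduced.
        """
        def dot(u, v):
            return sum(x * y for x, y in zip(u, v))

        def nearest_int(num, den):
            # nearest integer to num/den (den > 0), in exact integer arithmetic
            return (2 * num + den) // (2 * den)

        b1, b2 = list(b1), list(b2)
        if dot(b1, b1) > dot(b2, b2):        # ensure ||b1|| <= ||b2||
            b1, b2 = b2, b1
        while True:
            # subtract the integer multiple of b1 that best shortens b2
            q = nearest_int(dot(b1, b2), dot(b1, b1))
            b2 = [x - q * y for x, y in zip(b2, b1)]
            if dot(b1, b1) <= dot(b2, b2):   # no further shortening: reduced
                return b1, b2
            b1, b2 = b2, b1                  # greedy step: shortest vector first

    # Example: the basis ((1, 0), (4, 1)) reduces to ([1, 0], [0, 1]).
    print(lagrange_reduce((1, 0), (4, 1)))

Exact integer rounding is used instead of floating-point division so that the sketch remains meaningful for the very large basis vectors this paper targets.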

Keywords

Lattice / Successive minima / Minkowski reduced bases / Greedy reduction

Cite this article

Hao Chen, Liqing Xu. Greedy algorithm computing Minkowski reduced lattice bases with quadratic bit complexity of input vectors. Chinese Annals of Mathematics, Series B, 2011, 32(6): 857-862. DOI: 10.1007/s11401-011-0680-1

References

[1]

Ajtai, M., Kumar, R. and Sivakumar, D., A sieve algorithm for the shortest lattice vector problem, Proceedings on 33rd Annual ACM Symposium on Theory of Computing, ACM, Heraklion, Greece, 2001, 601–610.

[2]

Blomer, J., Closest vectors, successive minima and dual HKZ bases of lattices, Proceedings of the 27th International Colloquium on Automata, Languages and Programming, Springer-Verlag, New York, 2000, 248–259.

[3]

Cassels, J. W. S., An Introduction to the Geometry of Numbers, 2nd ed., Springer-Verlag, New York, 1972.

[4]

Delone, D. N., Ryshkov, S. S. and Shtogrin, M. I., A theorem due to Sandakova concerning positive quadratic forms, Matematicheskie Zametki, 1967, 1(3), 253–262.

[5]

Eisenbrand, F. and Rote, G., Fast reduction of ternary quadratic forms, Proceedings of the 2001 Cryptography and Lattice Conference, Springer-Verlag, Heidelberg, 2001, 32–44.

[6]

Helfrich, B., Algorithms to construct Minkowski reduced and Hermite reduced lattice bases, Theor. Comp. Sci., 1985, 41, 125–139.

[7]

Henk, M., Geometry of Numbers. http://fma2.math.uni-magdeburg.de/henk/index.html

[8]

Micciancio, D., Efficient reductions among lattice problems, Proceedings of the 19th Annual ACM-SIAM Symposium on Discrete Algorithms, ACM-SIAM, San Francisco, California, 2008, 84–93.

[9]

Micciancio, D. and Goldwasser, S., Complexity of Lattice Problems: A Cryptographic Perspective, Kluwer Academic Publishers, Boston, 2002.

[10]

Nguyen, P. Q. and Stehle, D., Low dimensional basis reduction revisited, Proceedings of the 6th Algorithmic Number Theory Symposium, Springer-Verlag, New York, 2004, 338–357.

[11]

Nguyen, P. Q. and Stehle, D., An LLL algorithm with quadratic complexity (full version of: Floating-point LLL revisited, Proc. Eurocrypt 2005), SIAM J. Comp., 2009, 39(3), 874–903.

[12]

Tammela, P. P., On the reduction theory of positive quadratic forms, Soviet Math. Doklady, 1973, 14, 651–655.

[13]

Tammela, P. P., Minkowski reduction region for positive quadratic forms in seven variables, J. Soviet Math., 1981, 16, 863–857.
