We shall now provide a few instances where lattices are used in various algorithms. Most of these use the LLL algorithm, as it is quite fast.
The Coppersmith algorithm can be made even more general. There are two main extensions: first to an unknown modulus, then to multivariate polynomials.
This extension of Coppersmith allows one to find small roots modulo some unknown factor of a number. More specifically, suppose that we have some unknown factor $b$ of $N$ such that $b \geq N^\beta$, and some monic polynomial $f$ of degree $d$ such that $f(x_0) \equiv 0 \pmod{b}$ for some $|x_0| \leq N^{\frac{\beta^2}{d}}$. Then we can find $x_0$ in time polynomial in $\log N$ and $d$.
One key reason why this is possible is that we don't need to explicitly know the modulus to determine if a small root exists, i.e.

$$\lVert g(xX)\rVert < \frac{N^{\beta m}}{\sqrt{n}} \leq \frac{b^m}{\sqrt{n}}$$

is sufficient for a root less than $X$ modulo $b^m$ to also be a root over $\mathbb{R}$. The algorithm here is extremely similar to the Coppersmith algorithm, except we add more polynomials into the lattice. The polynomials that we will use are

$$g_{i,j}(x) = x^jN^{m-i}f(x)^i \qquad 0 \leq i < m,\ 0 \leq j < d$$

$$h_i(x) = x^if(x)^m \qquad 0 \leq i < t$$
The lattice $L$ generated by these polynomials (evaluated at $xX$) would have

$$\dim L = n = dm + t, \qquad \det L = X^{\frac{n(n-1)}{2}}N^{\frac{dm(m+1)}{2}}$$
As we require the shortest vector to have length at most $\frac{N^{\beta m}}{\sqrt{n}}$, repeating the computations from the previous section, we obtain

$$X \lessapprox N^{\frac{2\beta m}{n-1} - \frac{dm(m+1)}{n(n-1)}}$$
This algorithm solves for small roots of polynomials modulo any integer: given some polynomial $f$ of degree $d$ and any integer $N$, if $f(x_0) \equiv 0 \pmod{N}$ for some $|x_0| \leq N^{\frac{1}{d}}$, this algorithm can find $x_0$ in time polynomial in $\log N$ and $d$. The key idea behind this algorithm is to construct a polynomial $g$ such that $g(x_0) = 0$ in $\mathbb{R}$. As roots of polynomials over the reals can be found easily, this gives an easy way to find $x_0$. We shall introduce the Coppersmith algorithm in a few iterations, with each iteration approaching the $N^{\frac{1}{d}}$ bound.
We first consider a criterion for a root of a polynomial modulo $N$ to exist over the reals as well. Suppose $g$ is a polynomial of degree $n$. Define the norm of a polynomial $g(x) = \sum_ig_ix^i$ to be $\lVert g\rVert = \sqrt{\sum_ig_i^2}$. Given $g(x_0) \equiv 0 \pmod{N}$ with $|x_0| \leq X$, if

$$\lVert g(xX)\rVert < \frac{N}{\sqrt{n+1}}$$
then $g(x_0) = 0$ in $\mathbb{R}$. The proof is a relatively straightforward chain of inequalities (using Cauchy-Schwarz in the second-to-last step):

$$|g(x_0)| = \left|\sum_ig_ix_0^i\right| \leq \sum_i|g_i||x_0|^i \leq \sum_i|g_i|X^i \leq \sqrt{n+1}\,\lVert g(xX)\rVert < N$$
and since $g(x_0) \equiv 0 \pmod{N}$ implies $g(x_0) = kN$ for some $k \in \mathbb{Z}$, we know that $k$ must be $0$ to satisfy the inequality above.
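This chain of inequalities is easy to verify numerically. The snippet below (helper names are our own, and the example $g(x) = x^2 - 9$, $x_0 = 3$, $X = 3$, $N = 10007$ is a toy instance chosen for illustration) checks each step:

```python
import math

# Criterion check: if g(x0) = 0 (mod N), |x0| <= X and
# ||g(xX)|| < N / sqrt(n + 1), then g(x0) = 0 over the integers.

def poly_eval(coeffs, x):
    # evaluate a polynomial given as a coefficient list (lowest degree first)
    return sum(c * x**i for i, c in enumerate(coeffs))

def scaled_norm(coeffs, X):
    # ||g(xX)|| = sqrt(sum (g_i X^i)^2)
    return math.sqrt(sum((c * X**i) ** 2 for i, c in enumerate(coeffs)))

N, X = 10007, 3
g = [-9, 0, 1]            # g(x) = x^2 - 9, with small root x0 = 3
x0 = 3
n = len(g) - 1            # degree of g

assert poly_eval(g, x0) % N == 0                      # a root modulo N
# the chain of inequalities from the text:
bound1 = sum(abs(c) * X**i for i, c in enumerate(g))  # sum |g_i| X^i
bound2 = math.sqrt(n + 1) * scaled_norm(g, X)         # Cauchy-Schwarz
assert abs(poly_eval(g, x0)) <= bound1 <= bound2 < N
assert poly_eval(g, x0) == 0                          # hence a root over ZZ
```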
With this, suppose we can find some polynomials $g_1, g_2, \dots$ such that $g_i(x_0) \equiv 0 \pmod{N}$ for each $i$. If we can then find some integer combination $g = \sum_ia_ig_i$ such that $\lVert g(xX)\rVert < \frac{N}{\sqrt{n+1}}$, we can find $x_0$ easily. This gives a brief idea as to why lattices would be useful in such a problem.
To use lattices, notice that we can encode the polynomial $g(x) = \sum_{i=0}^ng_ix^i$ as the vector with components $(g_0, g_1X, g_2X^2, \dots, g_nX^n)$. In this way, adding polynomials and multiplying polynomials by numbers still makes sense. Let's suppose that $|x_0| \leq X$ and $f$ is monic, i.e. $f_d = 1$ (otherwise multiply $f$ by $f_d^{-1} \bmod N$, assuming $\gcd(f_d, N) = 1$). Consider the polynomials $g_i(x) = Nx^i$ for $0 \leq i \leq d-1$, and consider the lattice $L$ generated by $g_0(xX), \dots, g_{d-1}(xX)$ and $f(xX)$. As a matrix, the basis vectors of our lattice would look like

$$\begin{pmatrix}N&&&&\\&NX&&&\\&&\ddots&&\\&&&NX^{d-1}&\\f_0&f_1X&\cdots&f_{d-1}X^{d-1}&X^d\end{pmatrix}$$
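A small helper (our own, not from any particular library) that builds this basis for a given monic $f$, modulus $N$ and root bound $X$; the sample values are hypothetical:

```python
# Rows are N*x^i for i < d, plus f itself, with column j scaled by X^j.

def coppersmith_basis(f, N, X):
    d = len(f) - 1                 # f monic of degree d, lowest degree first
    rows = []
    for i in range(d):             # g_i(x) = N * x^i, encoded as a vector
        row = [0] * (d + 1)
        row[i] = N * X**i
        rows.append(row)
    rows.append([c * X**j for j, c in enumerate(f)])   # f(xX)
    return rows

B = coppersmith_basis([2, 3, 1], 101, 4)   # f(x) = x^2 + 3x + 2, N = 101, X = 4
# B = [[101, 0, 0], [0, 404, 0], [2, 12, 16]]
```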
Let

$$e(m,t) = \frac{2\beta m}{n-1} - \frac{dm(m+1)}{n(n-1)}, \qquad n = dm + t$$

be the exponent in the bound above. It turns out that the maxima of $e(m,t)$ is $\frac{\beta^2}{d}$ as $m \to \infty$. One way to achieve this is by setting $t = dm\left(\frac{1}{\beta}-1\right)$, i.e. $n = \frac{dm}{\beta}$, and we obtain

$$X \approx N^{\frac{\beta^2}{d}}$$
and this indeed achieves the bound. Similar to the Coppersmith algorithm, one chooses a sufficiently big $m$ such that the remaining bits can be brute forced in constant time while the algorithm still remains in polynomial time.
1) We show that the maximum of $e(m,t)$ is indeed $\frac{\beta^2}{d}$. We can assume that $m$ and $n$ are large, so that $e(m,t) \approx \frac{2\beta m}{n} - \frac{dm^2}{n^2}$. Since $m$ and $n$ only appear through the ratio $u = \frac{dm}{n}$, and $e \approx \frac{1}{d}\left(2\beta u - u^2\right)$, the maximum occurs when $\frac{de}{du} = 0$, hence we have reduced this to maximizing $2\beta u - u^2$, which achieves its maximum of $\beta^2$ at $u = \beta$. Dividing by $d$ gives $\frac{\beta^2}{d}$.
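The limiting behaviour can be sanity-checked numerically. The script below (our own; it writes out the exponent $e(m,t) = \frac{2\beta m}{n-1} - \frac{dm(m+1)}{n(n-1)}$ with $n = dm+t$ under the assumptions of this section) evaluates it at $t = dm(\frac{1}{\beta}-1)$ and watches it approach $\frac{\beta^2}{d}$:

```python
# Numerical sanity check that the exponent approaches beta^2/d
# for large m when t = d*m*(1/beta - 1).

def exponent(beta, d, m, t):
    n = d * m + t
    return 2 * beta * m / (n - 1) - d * m * (m + 1) / (n * (n - 1))

beta, d = 0.5, 2
for m in (5, 50, 500):
    t = round(d * m * (1 / beta - 1))
    print(m, exponent(beta, d, m, t))
# the printed values approach beta^2/d = 0.125 from below
```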
As every element in this lattice is some polynomial $g(xX)$ with $g = \sum_ia_ig_i + a_df$, if $f(x_0) \equiv 0 \pmod{N}$, then $g(x_0) \equiv 0 \pmod{N}$. Furthermore, if $|x_0| \leq X$ and a short vector $g(xX)$ has length less than $\frac{N}{\sqrt{d+1}}$, then we have $g(x_0) = 0$ in $\mathbb{R}$.
The volume of this lattice is $\det L = N^dX^{\frac{d(d+1)}{2}}$ and the lattice has dimension $d+1$. By using the LLL algorithm, we can find a vector with length at most

$$2^{\frac{d}{4}}\left(N^dX^{\frac{d(d+1)}{2}}\right)^{\frac{1}{d+1}}$$
As long as

$$2^{\frac{d}{4}}\left(N^dX^{\frac{d(d+1)}{2}}\right)^{\frac{1}{d+1}} \leq \frac{N}{\sqrt{d+1}}$$

then by the above criteria we know that this vector has $x_0$ as a root over $\mathbb{R}$. This tells us that

$$X \lessapprox N^{\frac{2}{d(d+1)}}$$
While this isn't the $N^{\frac{1}{d}}$ bound that we want, this gives us an idea of what we can do to achieve this bound, i.e. add more vectors such that the length of the shortest vector decreases.
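As a toy end-to-end sketch of this first iteration, the following script builds the lattice for a hypothetical example $f(x) = x^2 + x + 9977$, $N = 10007$, $X = 5$ (chosen so that $5 < N^{\frac{2}{d(d+1)}} = N^{\frac{1}{3}}$), reduces it with a textbook LLL written here purely for illustration (use a dedicated library such as fpylll or Sage in practice), and recovers the root $x_0 = 5$:

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def lll(B, delta=Fraction(3, 4)):
    """Textbook LLL on integer row vectors, in exact Fraction arithmetic."""
    B = [list(map(Fraction, row)) for row in B]
    n = len(B)

    def gso():  # Gram-Schmidt orthogonalization of the current basis
        Bs, mu = [], [[Fraction(0)] * n for _ in range(n)]
        for i in range(n):
            v = B[i][:]
            for j in range(i):
                mu[i][j] = dot(B[i], Bs[j]) / dot(Bs[j], Bs[j])
                v = [x - mu[i][j] * y for x, y in zip(v, Bs[j])]
            Bs.append(v)
        return Bs, mu

    k = 1
    while k < n:
        for j in range(k - 1, -1, -1):        # size-reduce row k
            _, mu = gso()
            q = round(mu[k][j])
            if q:
                B[k] = [x - q * y for x, y in zip(B[k], B[j])]
        Bs, mu = gso()
        if dot(Bs[k], Bs[k]) >= (delta - mu[k][k - 1] ** 2) * dot(Bs[k - 1], Bs[k - 1]):
            k += 1                            # Lovász condition holds
        else:
            B[k], B[k - 1] = B[k - 1], B[k]   # otherwise swap and step back
            k = max(k - 1, 1)
    return [[int(x) for x in row] for row in B]

# f(x) = x^2 + x + 9977 has the small root x0 = 5 modulo N = 10007.
N, X = 10007, 5
f = [9977, 1, 1]                              # coefficients of 1, x, x^2
B = [[N, 0, 0],                               # N
     [0, N * X, 0],                           # N*x, scaled by X
     [c * X**i for i, c in enumerate(f)]]     # f(xX)
g = lll(B)[0]                                 # short vector, encodes g(xX)
coeffs = [c // X**i for i, c in enumerate(g)] # divide column i by X^i
roots = [x for x in range(-X, X + 1)
         if sum(c * x**i for i, c in enumerate(coeffs)) == 0]
assert 5 in roots
```

Note how the polynomial is read back off the short vector: column $i$ of any lattice vector is divisible by $X^i$, so dividing recovers integer coefficients, and the small root is then found by checking the few candidates in $[-X, X]$.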
One important observation to make is that any constant coefficient in front of $N^{\frac{1}{d}}$ does not matter, as we can simply brute force the top bits of our small root in $O(1)$ time. Hence we only need to get $X = cN^{\frac{1}{d}}$ for some fixed constant $c$.
In order to achieve this, notice that if $f(x_0) \equiv 0 \pmod{N}$, then $N^{m-i}f(x_0)^i \equiv 0 \pmod{N^m}$. This loosens the inequality required for a polynomial to have $x_0$ as a small root as our modulus is now larger. With this in mind, consider the polynomials

$$g_{i,j}(x) = x^jN^{m-i}f(x)^i \qquad 0 \leq i \leq m,\ 0 \leq j < d$$
where we will determine $m$ later. Here $g_{i,j}(x_0) \equiv 0 \pmod{N^m}$, hence we shall consider the lattice $L$ generated by the $g_{i,j}(xX)$. As an example, if $d = 1$, $m = 2$ and $f(x) = x + a$, we have

$$\begin{pmatrix}N^2&&\\aN&NX&\\a^2&2aX&X^2\end{pmatrix}$$
We have the following immediate computations of the dimension and determinant:

$$\dim L = n = d(m+1), \qquad \det L = X^{\frac{n(n-1)}{2}}N^{\frac{dm(m+1)}{2}}$$
hence when using the LLL algorithm, the shortest LLL basis vector $b_1$ has length

$$\lVert b_1\rVert \leq 2^{\frac{n-1}{4}}\left(X^{\frac{n(n-1)}{2}}N^{\frac{dm(m+1)}{2}}\right)^{\frac{1}{n}}$$
and we need $\lVert b_1\rVert \leq \frac{N^m}{\sqrt{n}}$ for $g(x_0) = 0$ to hold over $\mathbb{R}$. Hence we have

$$X \lessapprox N^{\frac{2m}{n-1} - \frac{dm(m+1)}{n(n-1)}}$$
Since $\frac{2m}{n-1} - \frac{dm(m+1)}{n(n-1)} \to \frac{1}{d}$ as $m \to \infty$, this will achieve the bound that we want. However, as the LLL algorithm would take a very long time for big $m$, we typically choose a suitably large $m$ such that the algorithm is still polynomial in $\log N$ and $d$, and brute force the remaining bits.
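The lattice above can be constructed and its determinant checked mechanically: ordered by degree, the matrix is lower triangular (the row for $x^jN^{m-i}f(x)^i$ has degree $di+j$), so the determinant is the product of the diagonal. A short script with our own helper names and hypothetical toy parameters:

```python
def polymul(a, b):
    # multiply two coefficient lists (lowest degree first)
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

def coppersmith_rows(f, N, X, m):
    # rows encoding x^j * N^(m-i) * f(x)^i for 0 <= i <= m, 0 <= j < d
    d = len(f) - 1
    n = d * (m + 1)
    rows, fi = [], [1]                       # fi holds f(x)^i
    for i in range(m + 1):
        for j in range(d):
            g = [0] * j + [c * N**(m - i) for c in fi]
            g += [0] * (n - len(g))
            rows.append([c * X**k for k, c in enumerate(g)])
        fi = polymul(fi, f)
    return rows

f, N, X, m = [9977, 1, 1], 10007, 5, 2       # toy f(x) = x^2 + x + 9977
rows = coppersmith_rows(f, N, X, m)
n, d = len(rows), len(f) - 1
diag = 1
for k in range(n):
    diag *= rows[k][k]
assert n == d * (m + 1)
assert diag == X**(n * (n - 1) // 2) * N**(d * m * (m + 1) // 2)
```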
1) We often see $X = \frac{1}{2}N^{\frac{1}{d}-\epsilon}$ in literature. We shall now show where this mysterious $\epsilon$ comes from. The other term $\frac{1}{2}$ will appear in the next exercise. Typically, one sets $m = \left\lceil\frac{1}{d\epsilon}\right\rceil$ to simplify calculations involving the LLL algorithm as $n = d(m+1)$ then has size $O\left(\frac{1}{\epsilon}\right)$. Suppose we want $\frac{2m}{n-1} - \frac{dm(m+1)}{n(n-1)} \geq \frac{1}{d} - \epsilon$, show that this gives us $m = O\left(\frac{1}{d\epsilon}\right)$.
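A quick way to sanity-check this exercise (our own exact-arithmetic script, not part of the original text): since $dm(m+1) = mn$ when $n = d(m+1)$, the exponent simplifies to $\frac{m}{n-1} = \frac{m}{dm+d-1}$, and the gap to the limit $\frac{1}{d}$ is exactly $\frac{d-1}{d(dm+d-1)}$, which makes the required size of $m$ easy to read off:

```python
from fractions import Fraction

# Exponent 2m/(n-1) - d*m*(m+1)/(n*(n-1)) with n = d*(m+1),
# computed exactly with rational arithmetic.
def exponent(d, m):
    n = d * (m + 1)
    return Fraction(2 * m, n - 1) - Fraction(d * m * (m + 1), n * (n - 1))

for d in (2, 3, 5):
    for m in (1, 10, 100):
        # closed form of the exponent:
        assert exponent(d, m) == Fraction(m, d * m + d - 1)
        # distance from the limit 1/d, so m ~ 1/(d*eps) suffices:
        assert Fraction(1, d) - exponent(d, m) == Fraction(d - 1, d * (d * m + d - 1))
```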
2) We show that we can indeed find small roots less than $N^{\frac{1}{d}}$ in polynomial time. In the worst case, the longest basis vector cannot exceed $N^{2m}$ (up to small factors), so the entries have bit size $O(m\log N)$. Hence the LLL algorithm will run in at most $O\left(n^6m^3\log^3N\right)$ time. Set $m = \left\lceil\frac{1}{d\epsilon}\right\rceil$ and choose $\epsilon = \frac{1}{\log N}$, then $N^\epsilon = 2$ is a constant, hence the number of bits needed to be brute forced is a constant. This gives us the approximate run time of $O\left(\frac{\log^{12}N}{d^3}\right)$.
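The parameter choice can be checked directly (our own arithmetic; base-2 logarithms assumed, and $d = 2$ is an arbitrary sample value):

```python
import math

# eps = 1/log2(N) makes N^eps = 2, a constant, so only a constant
# number of top bits of the root is left to brute force.
N = 10007
eps = 1 / math.log2(N)
assert abs(N**eps - 2) < 1e-9

m = math.ceil(1 / (2 * eps))      # m = O(1/(d*eps)) with d = 2
assert m <= math.log2(N)          # m grows only like log N, so LLL stays polynomial
```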
3) We shall show that this is the best bound we can hope for using lattice methods. Suppose there exists some algorithm that finds roots less than $N^{\frac{1}{d}+\epsilon}$ in polynomial time. Then consider the case when $N = p^d$ and $f(x) = x^d$. Show that this forces the number of small roots to be too big for the algorithm to run in polynomial time, assuming the algorithm finds all small roots.
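The counting argument behind this exercise can be illustrated on a toy instance (our own script; $p = 101$, $d = 2$, $\epsilon = 0.25$ are hypothetical sample values): for $N = p^d$ and $f(x) = x^d$, every multiple of $p$ is a root of $f$ modulo $N$, so about $N^\epsilon$ roots lie below $N^{\frac{1}{d}+\epsilon}$, which is exponential in $\log N$.

```python
# Count roots of f(x) = x^d modulo N = p^d below a given bound.
p, d, eps = 101, 2, 0.25
N = p**d

def roots_below(bound):
    return sum(1 for x in range(bound) if pow(x, d, N) == 0)

assert roots_below(p) == 1              # below N^(1/d) = p: only x = 0
bound = int(N**(1 / d + eps))           # = p^(1 + d*eps)
assert roots_below(bound) > 10          # roughly N^eps roots: too many to
                                        # output in time polynomial in log N
```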