Let’s say that you have two polynomials, and you REALLY need to know if they have a common root. Now, if they’re quadratic, you’re in luck, because we can solve them both completely and just check. In fact, if you’re patient, you can do this whenever the Galois groups are solvable; in particular, you can do it for cubics and quartics. But in general? What if I gave you a degree 100 polynomial and a degree 103 polynomial? Well, you can still do it, without having to solve anything.

The object that solves this problem is called the Resultant. Let f=a_0x^\ell+\ldots+a_\ell and g=b_0x^m+\ldots +b_m with a_0,b_0\neq 0 and \ell,m>0. Then the resultant Res(f,g) is the determinant of the (\ell+m)\times(\ell+m) matrix below, whose left block consists of m shifted columns of the coefficients of f and whose right block consists of \ell shifted columns of the coefficients of g: Res(f,g)=\det\left[\begin{array}{cccccccc}a_0&&&&b_0&&&\\a_1&a_0&&&b_1&b_0&&\\a_2&a_1&\ddots&&b_2&b_1&\ddots&\\\vdots&a_2&\ddots&a_0&\vdots&b_2&\ddots&b_0\\a_\ell&\vdots&\ddots&a_1&b_m&\vdots&\ddots&b_1\\&a_\ell&&a_2&&b_m&&b_2\\&&\ddots&\vdots&&&\ddots&\vdots\\&&&a_\ell&&&&b_m\\\end{array}\right]

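To get a feel for the matrix, here’s the smallest interesting case (just a sanity check, not part of the general theory): take \ell=2 and m=1, so f=a_0x^2+a_1x+a_2 and g=b_0x+b_1. Then Res(f,g)=\det\left[\begin{array}{ccc}a_0&b_0&\\a_1&b_1&b_0\\a_2&&b_1\end{array}\right]=a_0b_1^2-a_1b_0b_1+a_2b_0^2=b_0^2f(-b_1/b_0), which vanishes exactly when the single root -b_1/b_0 of g is also a root of f.
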
Now, this is actually a rather good thing to use for computations: though the determinant gets big, there are a lot of zeros floating around, and computers can handle large determinants. Now, resultants have three really nice properties:

  1. The resultant is an integer polynomial in the coefficients of f,g.
  2. Res(f,g)=0 iff f,g have a common factor of positive degree (equivalently, a common root in an algebraic closure); we’ll check this on a small example below.
  3. There are polynomials a,b such that af+bg=Res(f,g).

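As a quick check of properties 2 and 3 on a small example (using the \ell=2, m=1 formula worked out above): take f=x^2-1 and g=x-1, which share the root 1; then Res(f,g)=1\cdot(-1)^2-0+(-1)\cdot 1^2=0. Swapping in g=x-2, which shares no root with f, gives Res(f,g)=1\cdot 4-0+(-1)\cdot 1=3, and indeed 1\cdot f-(x+2)\cdot g=(x^2-1)-(x^2-4)=3, exhibiting the polynomials a=1, b=-(x+2) promised by property 3.
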
The last property is called the Elimination property, and it ties in with the Elimination Theorem from last time. If f,g\in k[x_1,\ldots,x_n], then we choose a variable x_i and write f,g as polynomials in x_i whose coefficients are polynomials in the other variables. Then Res(f,g,x_i) (written to point out the dependence on the variable chosen) is in the ideal generated by f,g, but also doesn’t involve the variable x_i, and so is in the elimination ideal.

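For a concrete example of how this gets used: take f=xy-1 and g=x^2+y^2-4 in k[x,y] and eliminate x. As polynomials in x, g has coefficients 1,0,y^2-4 and f has coefficients y,-1, so the same small determinant as before gives Res(f,g,x)=1\cdot(-1)^2-0+(y^2-4)\cdot y^2=y^4-4y^2+1. This lies in \langle f,g\rangle\cap k[y], and its roots are exactly the y-coordinates of the points where the hyperbola xy=1 meets the circle x^2+y^2=4.
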
Now, we’ve done resultants for one variable. But algebra in one variable is the same as algebra of homogeneous polynomials in two variables, so we extend the resultant to pairs of homogeneous polynomials in two variables in the most obvious way possible: apply the same determinant to their coefficients. Then the condition that the resultant is zero turns out to be equivalent to the pair having a nontrivial common solution, that is, one other than (0,0). But if they have one, then they have a whole line’s worth, and so this determines whether the two polynomials share any zeros on \mathbb{P}^1.

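For instance (just to see the simplest case), for two linear forms F=ax+by and G=cx+dy, the resultant is Res_{1,1}(F,G)=ad-bc, which vanishes precisely when F and G are proportional, that is, when their zeros in \mathbb{P}^1, the points [-b:a] and [-d:c], coincide.
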
Now, we generalize to get some nice stuff. To start, we denote the above resultants, as integer polynomials, by Res_{\ell,m}. We can handle pairs of polynomials fairly well with these. But what about a whole pile of polynomials of various degrees in more variables? Say, take F_0,\ldots,F_n to be homogeneous polynomials in the variables x_0,\ldots,x_n.

Well, we can fix degrees d_0,\ldots,d_n, the degrees of F_0,\ldots,F_n. Now, there is a unique integer polynomial Res_{d_0,\ldots,d_n} in the coefficients of the F_i satisfying the following properties:

  1. F_0=\ldots=F_n=0 has a nontrivial solution over \mathbb{C} if and only if Res_{d_0,\ldots,d_n}(F_0,\ldots,F_n)=0.
  2. Res_{d_0,\ldots,d_n}(x_0^{d_0},\ldots,x_n^{d_n})=1 (This is just to normalize things, generally we’ll only care if a resultant is zero or not).
  3. Res_{d_0,\ldots,d_n} is an irreducible polynomial over \mathbb{C}.

Now, proving existence is hard and long, but we can still look at these polynomials and see what we can say about them.

For instance, in the case that all the F_i are linear, say F_i=\sum_{j=0}^n c_{ij}x_j, we get Res_{1,\ldots,1}(F_0,\ldots,F_n)=\det(c_{ij}). Also, the resultant Res_{\ell,m} of two polynomials from before is the special case n=1.

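As a sanity check that these pieces fit together: with n=1 and F_0=c_{00}x_0+c_{01}x_1, F_1=c_{10}x_0+c_{11}x_1, both descriptions give c_{00}c_{11}-c_{01}c_{10}, and the normalization in property 2 holds since Res_{1,\ldots,1}(x_0,\ldots,x_n)=\det(I)=1.
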
Now, these general resultants, when computable, are rather nice for working out whether a hypersurface is smooth. Let f be a homogeneous polynomial of degree d+1 on \mathbb{P}^n. Then a singular point is a common solution of the n+1 partial derivatives f_{x_0},\ldots,f_{x_n}, which are n+1 homogeneous polynomials of degree d in the n+1 variables x_0,\ldots,x_n (by the Euler relation, f itself vanishes wherever all of its partials do). So we look at Res_{d,\ldots,d}(f_{x_0},\ldots,f_{x_n}), and it is zero if and only if V(f) is singular.

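For example (sticking to a case where the resultant is just a determinant): take the conic f=x_0^2+x_1^2+x_2^2 on \mathbb{P}^2, so d=1 and the partials are 2x_0,2x_1,2x_2. Then Res_{1,1,1}(2x_0,2x_1,2x_2)=\det(2I)=8\neq 0, so the conic is smooth. On the other hand, for f=x_0x_1 the partials are x_1,x_0,0, the determinant is 0, and indeed V(f) is singular at [0:0:1].
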
As an aside and an application, the hypersurfaces of degree d on \mathbb{P}^n naturally form a projective space, and the zero locus of the resultant as above is precisely the set of singular hypersurfaces. As this locus is cut out by a single polynomial, the collection of singular degree d hypersurfaces is a divisor on the space of all degree d hypersurfaces.

Now, before stopping, we’ll define one more thing in terms of resultants, and we’re even going to go back to the case of just two polynomials in one variable. Let f be a polynomial in x of degree \ell as above. Then we define the discriminant \Delta to be Res\left( f,f'\right). This turns out to be wonderful, as it tells us if f has a multiple root, because that’s the same thing as f and f' having a common root. Well, strictly, we define the discriminant to be \frac{(-1)^{\ell(\ell-1)/2}}{a_0}Res(f,f'), to make the signs all work out. So now, if you plug in a quadratic, you get the expected b^2-4ac.

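To check the sign convention on the smallest case (using the \ell=2, m=1 determinant worked out at the start): for f=ax^2+bx+c we have f'=2ax+b, and the 3\times 3 determinant gives Res(f,f')=a\cdot b^2-b\cdot(2a)\cdot b+c\cdot(2a)^2=-a(b^2-4ac), so \Delta=\frac{(-1)^{2\cdot 1/2}}{a}Res(f,f')=\frac{-1}{a}\left(-a(b^2-4ac)\right)=b^2-4ac, as claimed.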