By this time I’m sure everyone whose curiosity was piqued by the statement of the Grothendieck-Riemann-Roch theorem has read the proof themselves.  Nevertheless, in case you haven’t, I will proceed to outline the steps of the surprisingly “easy” proof.  It is “easy” in the sense that the most is made of a relatively simple computation on projective space.  Last time we saw that it is enough to prove the formula separately for an injection and a projection.  We’ll see here how to carry these two steps through and how the first may be reduced to the inclusion of a divisor.  Though last time I said that I wanted to go into each step in more detail, I realized that 1) probably very few people are (still?) following along, 2) those who are will get more by seeing an outline and reading the paper or looking at Fulton’s Intersection Theory themselves, and 3) this way we can illustrate the power of the theorem with some applications.

Step 1: Projective Space.

Let P be n-dimensional projective space and let F be a coherent sheaf on P.  The first step will be to show that the so-called Hirzebruch-Riemann-Roch formula is valid: deg( ch(F) \cdot td(P) ) = \chi(P, F).  As discussed previously in the series, this is the GRR formula in the special case of a map from projective space to a point.

To verify this formula, let H \subset P be a hyperplane and let x \in A^1(P) be its class.  The Chern polynomial of (the tangent bundle of) P is c_t(P) = (1 + tx)^{n+1} and so td(P) = x^{n+1} / (1 - e^{-x})^{n+1}.
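As a sanity check on the Todd class appearing here, one can expand the rank-one factor x / (1 - e^{-x}) as a power series with SymPy.  This is not part of the argument, just a quick way to see the familiar expansion 1 + x/2 + x^2/12 - x^4/720 + \ldots:

```python
import sympy as sp

x = sp.symbols('x')

# Todd class of a line bundle with first Chern class x:
# td = x / (1 - e^{-x}), expanded as a power series at x = 0.
td_series = sp.series(x / (1 - sp.exp(-x)), x, 0, 5).removeO()
print(sp.expand(td_series))
```

Note in particular that the coefficient of x^3 vanishes, a small quirk of the Todd series.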

The sheaf F corresponds to a graded module M over k[X_0, \ldots, X_n].  By Hilbert’s syzygy theorem, M admits a finite length resolution by free graded modules.  This implies that in K(P), the sheaf F is equal to a linear combination of sheaves of the form \mathcal{O}(m), and by linearity it suffices to show the theorem holds for F = \mathcal{O}(m).  We know everything we could hope to about this sheaf; in particular, ch(F) = e^{mx} and \chi(P, F) = \binom{m + n}{n}.   We are reduced to verifying the formula

deg(e^{mx} \cdot x^{n+1}/ (1 - e^{-x})^{n+1}) = \binom{m + n}{n}

and this may be rewritten in terms of residues as

Res ( e^{mx} dx/ (1 - e^{-x})^{n+1}) =\binom{m + n}{n}.

We’ll leave this statement as an exercise with the hint that the change of variables y = 1 - e^{-x} will be quite helpful.
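For those who would rather see the residue identity checked than do the change of variables, here is a small SymPy verification for several values of m and n (the helper name `hrr_residue` is mine, not from the original argument):

```python
import sympy as sp

x = sp.symbols('x')

def hrr_residue(m, n):
    # Residue at x = 0 of e^{mx} / (1 - e^{-x})^{n+1},
    # which the formula predicts equals binomial(m + n, n).
    return sp.residue(sp.exp(m * x) / (1 - sp.exp(-x))**(n + 1), x, 0)

for n in range(1, 4):
    for m in range(0, 3):
        assert hrr_residue(m, n) == sp.binomial(m + n, n)
print("residue formula checked for small m, n")
```

Of course this checks only finitely many cases; the substitution y = 1 - e^{-x} turns the residue into the coefficient of y^n in (1 - y)^{-(m+1)}, which gives the identity in general.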

Step 2: A projection

Let P be n-dimensional projective space and consider the projection map f: X \times P \rightarrow X.  We have the following fact, which is not too difficult to prove:

The natural map K(X) \otimes K(P) \rightarrow K(X \times P) is surjective.

We won’t prove this fact, but will only say that the proof proceeds by induction on n, and uses a homotopy property of the K-groups that is similar to the corresponding property in the Chow ring (the corresponding properties for divisors are proved in Hartshorne’s Algebraic Geometry, II.6.5 and II.6.6).

The fact that GRR is true for the map f now follows from Step 1 and from the last Lemma stated in the previous post.

Step 3: A closed immersion of a divisor

Suppose that i: Y \rightarrow X is the inclusion of a smooth subvariety.  We will denote by N(Y) the normal bundle of Y \subset X and I(Y) the sheaf of ideals.   We have the exact sequence

0 \rightarrow I(Y) \rightarrow \mathcal{O}_X \rightarrow \mathcal{O}_Y \rightarrow 0

and the identity N(Y)^* = I(Y) / I(Y)^2.

The GRR formula for this case then reads ch(i_! y) = i_* (ch(y) \cdot td(N)^{-1}) for y \in K(Y).  Indeed, this follows by the projection formula and the identity i^*td(X) = td(Y) \cdot td(N(Y)).

We now will prove this formula when Y is a divisor in X and y = i^!(x) for x \in K(X).

The left hand side of the equation becomes ch(i_! i^!x) = ch(x \cdot i_!(1)) = ch(x \cdot (1 - [Y]^{-1})).  Here [Y] denotes the line bundle associated to the divisor (note: restricting [Y] to Y gives the normal bundle N(Y)).  The first equality is an application of the projection formula.  The second equality follows from the exact sequence above: since I(Y) = [Y]^{-1}, the sequence shows i_!(1) = [\mathcal{O}_Y] = 1 - [Y]^{-1} in K(X).  Because ch is a ring homomorphism, this then reads ch(i_! i^!x) = ch(x) \cdot ch(1 - [Y]^{-1}) = ch(x) \cdot (1 - e^{-Y}).

We now analyze the right hand side of the equation.  This reads i_* (ch(i^!x) \cdot td(N)^{-1}) = i_*( i^*ch(x) \cdot i^*td([Y])^{-1}) = ch(x) \cdot td([Y])^{-1}\cdot i_*(1) = ch(x) \cdot td([Y])^{-1} \cdot Y.  Here the second equality is another application of the projection formula and the formula finally follows because td[Y] = Y / (1 - e^{-Y}).
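The key identity used at the end, td([Y])^{-1} \cdot Y = 1 - e^{-Y}, is immediate from the definition of the Todd class; here is a quick symbolic check, treating Y (really c_1([Y])) as a formal variable:

```python
import sympy as sp

Y = sp.symbols('Y')

td = Y / (1 - sp.exp(-Y))   # td([Y]) for the line bundle of the divisor
lhs = 1 - sp.exp(-Y)        # ch(1 - [Y]^{-1}), the factor on the left hand side
rhs = Y / td                # td([Y])^{-1} * i_*(1), with i_*(1) = Y
assert sp.simplify(lhs - rhs) == 0
```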

Corollary 1 GRR is true for the injection Y \rightarrow Y \times P given by y \mapsto (y, p) for a fixed point p \in P.

By the Lemma from last time, the proof of this corollary reduces to the case when Y is a point.  Then the statement follows from the previous argument when dim(P) = 1 and in general by induction.  (Warning: that barely constitutes a proof sketch.)

Corollary 2 If GRR is true for closed immersions i: Y \rightarrow X satisfying 2 \cdot dim(Y) \leq dim(X) - 2, then it is true for all closed immersions.

This follows from the first Lemma from last time and Corollary 1 because we may compose i with an injection X \rightarrow X \times P for a sufficiently large dimensional projective space.

Step 4: Reduction to the case of a divisor; i.e., end of proof.

Part of the beauty of the proof of GRR is that the absolute most is made of the computation on projective space over a point and of the inclusion of a divisor.  We now reduce the general case of the inclusion of a subvariety to that of a divisor using the blow-up construction.

By Corollary 2 above, we may assume that the inclusion i: Y \rightarrow X has codimension p \geq \dim Y + 2.  Let f: X' \rightarrow X be the blowup of X along Y.  Denote by Y' the exceptional divisor f^{-1}(Y), by j: Y' \rightarrow X' the inclusion, and by g: Y' \rightarrow Y the projection.  Recall that if N is the normal bundle of Y in X, and N' = g^*N is its pullback to Y', then N' contains the restriction to Y' of the line bundle associated to Y', which we will denote L.   We set F = N'/L, a bundle of rank p - 1.

We have the following facts about the behavior under such a blowup.

Facts:

A.  f_*(1) = 1 and so f_*f^* is the identity map.

B. g_*(c_{p-1}(F)) = 1.

C. For y \in K(Y) we have f^!i_!(y) = j_!(g^!(y) \cdot \lambda_{-1}F^*).

D. \lambda_{-1} F^* = 0 modulo 1 - L^* as long as p \geq \dim Y +2.

Recall that \lambda_{-1} of a bundle means we take the alternating sum of the wedge products of that bundle, interpreted as an element in K-theory.  We have the following further

Fact E: ch(\lambda_{-1} G) = c_k(G^*) \cdot td(G^*)^{-1} for G a bundle of rank k.
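In the simplest case, when G is a line bundle (k = 1), Fact E reduces to an identity of power series: \lambda_{-1}G = 1 - G, so ch(\lambda_{-1}G) = 1 - e^{g} with g = c_1(G), and the right hand side is c_1(G^*) \cdot td(G^*)^{-1}.  A SymPy check of this rank-one case (illustrative only; the general case follows from the splitting principle):

```python
import sympy as sp

g = sp.symbols('g')  # g stands for c_1(G), G a line bundle

lhs = 1 - sp.exp(g)                         # ch(lambda_{-1} G) = ch(1 - G)
c1_dual = -g                                # c_1(G^*)
td_dual = c1_dual / (1 - sp.exp(-c1_dual))  # td(G^*)
rhs = c1_dual / td_dual                     # c_1(G^*) * td(G^*)^{-1}
assert sp.simplify(lhs - rhs) == 0
```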

We may now conclude the proof of GRR:

By the same computation as in the divisor case, we must verify the formula

(1) ch(i_! y) = i_* (ch(y) \cdot td(N)^{-1}).

Because the codimension of Y is large, we may apply Fact D to know that g^!(y) \cdot \lambda_{-1}(F^*) is equivalent to zero modulo 1 - L^*.  Then this class is in the image of j^! (exercise) and so by the previous case, we may apply GRR to this class and the map j.   This gives:

(2) ch j_! (g^!(y) \cdot \lambda_{-1}(F^*)) = j_*(ch(g^!(y) \cdot \lambda_{-1}(F^*)) \cdot td(L)^{-1}).

We will then verify that pushing forward this equation by f gives equation (1).

By Fact C, we have that f_* ch j_! (g^!(y) \cdot \lambda_{-1}(F^*)) is equal to f_*(ch f^! i_! y) = f_* f^* (ch i_! y), and then the LHS of our verification follows from Fact A.

Using the fact that ch is multiplicative and Fact E, we compute that

ch(g^!(y) \cdot \lambda_{-1}(F^*)) = g^*(ch(y)) \cdot c_{p-1}(F) \cdot td(F)^{-1}.

Since g^* td(N) = td(F) \cdot td(L), we get the formula

ch(g^!(y) \cdot \lambda_{-1}(F^*)) \cdot td(L)^{-1} = g^*(ch(y) \cdot td(N)^{-1}) \cdot c_{p-1}(F).
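The step from the previous two displayed formulas to this one is purely formal fraction algebra; for completeness, here is the bookkeeping in SymPy, with A standing for g^*(ch(y)), c for c_{p-1}(F), and t_F, t_L for td(F), td(L) (all my notation, treated as commuting symbols):

```python
import sympy as sp

# A = g^*(ch(y)), c = c_{p-1}(F), t_F = td(F), t_L = td(L)
A, c, tF, tL = sp.symbols('A c t_F t_L', positive=True)
tN = tF * tL                  # the relation g^* td(N) = td(F) * td(L)
lhs = (A * c / tF) / tL       # ch(g^!(y) . lambda_{-1}(F^*)) . td(L)^{-1}
rhs = (A / tN) * c            # g^*(ch(y) . td(N)^{-1}) . c_{p-1}(F)
assert sp.simplify(lhs - rhs) == 0
```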

By Fact B then, the image of the two sides of this equation by g_* is equal to ch (y) \cdot td(N)^{-1} and the RHS of the verification follows from the fact that f_*j_* = i_*g_*.  Whew!

I’ll leave the Facts as exercises or inspiration to learn more about blowups.  Facts A and B are pretty easy and Fact E is a straightforward computation.  Facts C and D require much more work, but at the moment I can’t see a great reason for going through their proofs.  Applications to follow (more quickly than this post did, I hope).
