We’ve defined the Chern classes now, but what about computing them, and computing with them? We have that long list of properties to help, but those properties still need proofs, and they aren’t completely trivial.  What we need is a clever trick.  Vector bundles generalize line bundles, which we already understand more-or-less, so if we can reduce computations with Chern classes to computations with first Chern classes, that would be wonderful.

Our clever trick is called the splitting principle.  The statement is that for any finite collection \mathscr{S} of vector bundles on our scheme X, there exists a flat morphism f:X'\to X such that

  1. f^*:A_*(X)\to A_*(X') is injective
  2. For all E\in \mathscr{S}, the pullback f^*E admits a filtration f^*E=E_r\supset E_{r-1}\supset\ldots\supset E_0=0 by subbundles such that each quotient E_i/E_{i-1}\cong L_i is a line bundle.

To sketch the construction, we start with a single bundle and induct on r=\mathrm{rank}(E).  Projectivize E: on \mathbb{P}(E), the tautological bundle \mathscr{O}(-1) is a subbundle of the pullback of E, so we can quotient by it to get a bundle of rank r-1.  For a collection of bundles, we just keep going and handle them in sequence.

So, what does this really get us? The big bonus of splitting is that c_t(E)=\prod_{i=1}^r (1+c_1(L_i)t).  So Chern classes can all be written in terms of those of line bundles, just maybe not on your original space.  We call these first Chern classes the Chern roots of our vector bundle, and will write them as \alpha_1,\ldots,\alpha_r.

Specifically, we have that c_i(E)=\sigma_i(\alpha_1,\ldots,\alpha_r), where \sigma_i is the i^{th} elementary symmetric polynomial.  So then anything that we can write as a symmetric polynomial in the Chern roots will, in fact, be a polynomial in the Chern classes! We’re going to make much more use of this fact in the future.
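To see this concretely, here’s a short SymPy sketch (our own illustration, not part of the original argument; the variable names a1, a2, a3 are hypothetical stand-ins for the Chern roots) expanding c_t(E) for a split rank-3 bundle and reading off the elementary symmetric polynomials:

```python
# Expand the total Chern class of a split rank-3 bundle and check that
# the coefficient of t^i is the i-th elementary symmetric polynomial
# of the Chern roots.
from sympy import symbols, expand, Poly, prod

t = symbols('t')
a = symbols('a1 a2 a3')  # stand-ins for the Chern roots alpha_1, alpha_2, alpha_3

# c_t(E) = prod_i (1 + alpha_i t)
ct = expand(prod(1 + ai * t for ai in a))

# Coefficient of t^i is c_i(E) = sigma_i(alpha_1, ..., alpha_r)
coeffs = Poly(ct, t).all_coeffs()[::-1]  # index = degree in t
for i, ci in enumerate(coeffs):
    print(f"c_{i} =", expand(ci))
```

Running this prints c_1 as the sum of the roots, c_2 as the sum of pairwise products, and c_3 as the triple product, exactly the elementary symmetric polynomials.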

The splitting principle lets us prove universal formulas for Chern classes of a finite set of bundles by reducing to the case of a direct sum of line bundles.  It’s a quick exercise to prove all of the following, where E,F are vector bundles with Chern roots \alpha_1,\ldots,\alpha_r and \beta_1,\ldots,\beta_s respectively:

  1. c_i(E^\vee)=(-1)^ic_i(E)
  2. c_t(E\otimes F)=\prod_{i,j}(1+(\alpha_i+\beta_j)t)
  3. c_t(\bigwedge^p E)=\prod_{1\leq i_1<\ldots<i_p\leq r} (1+(\alpha_{i_1}+\ldots+\alpha_{i_p})t).

Before doing some applications, we note that we’re going to abuse notation: for any polynomial p in Chern classes, we’ll write p instead of p\cap [X].  There shouldn’t be any real confusion.

So now, some quick applications of Chern classes to do some classical work:

  1. Adjunction on a Surface: Let C be an effective Cartier divisor on a complete surface X.  Then, by definition, (C^2)_X=\int_C c_1(N), where N is the normal bundle.  Also by definition, we have the exact sequence 0\to T_C\to T_X|_C\to N\to 0, so c_1(N)=c_1(T_X|_C)-c_1(T_C).  Thus, we have C\cdot C=\int_C c_1(T_X|_C)-\int_C c_1(T_C).  Setting K=-c_1(T_X) and using \int_C c_1(T_C)=2-2g, we get C\cdot C=-C\cdot K+(2g-2), and this becomes C\cdot (C+K)=2g-2, the classical adjunction formula for surfaces.
  2. Riemann-Hurwitz: Let f:X\to Y be a map of smooth varieties of dimension n.  We want to look at R(f), the ramification locus, where the differential isn’t an isomorphism.  Well, that’s just the zero locus of \wedge^n df:\bigwedge^n T_X\to \bigwedge^n f^*T_Y (take the determinant; it’s zero if and only if the differential isn’t an isomorphism).  This is a section of the line bundle \bigwedge^n f^*T_Y\otimes(\bigwedge^n T_X)^\vee, which tells us that [R(f)]=(c_1f^*T_Y-c_1T_X)\cap [X].  Now, take n=1 and integrate both sides, and we get \deg R=(2-2g_Y)\deg(f)+(2g_X-2), and so we recover the Riemann-Hurwitz Theorem as a special case.
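Both applications can be sanity-checked numerically in classical special cases.  The sketch below (our own illustration; it assumes the standard facts that on \mathbb{P}^2 we have K=-3H with H^2=1, and that z\mapsto z^d ramifies to order d-1 over 0 and \infty) does exactly that:

```python
# Adjunction on X = P^2: for C = dH with K = -3H and H^2 = 1,
# 2g - 2 = C·(C + K) = d(d - 3), so g = (d - 1)(d - 2)/2,
# the classical genus of a smooth plane curve of degree d.
def plane_curve_genus(d):
    return (d * (d - 3) + 2) // 2

# Riemann-Hurwitz for f: P^1 -> P^1, z -> z^d (g_X = g_Y = 0):
# deg R = (2 - 2*g_Y)·deg(f) + (2*g_X - 2) = 2d - 2, matching the
# ramification of order d - 1 at each of z = 0 and z = infinity.
def ramification_degree(deg_f, g_X=0, g_Y=0):
    return (2 - 2 * g_Y) * deg_f + (2 * g_X - 2)

print(plane_curve_genus(3))    # a smooth plane cubic is elliptic: g = 1
print(ramification_degree(2))  # double cover of P^1 by P^1: deg R = 2
```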

This will actually often be the case: classical theorems turn out to be special cases of far more general results, which, once the correct machinery is in hand, can be proved almost effortlessly.  Note that so far, we’ve managed to recover the intersection form on surfaces, the adjunction formula for surfaces and the Riemann-Hurwitz Theorem, all almost effortlessly from our formalism.  Things get a bit trickier when we start trying to pull Riemann-Roch out of this, however, and before then, we’ll have to talk about K-theory, Chern characters and Todd classes.
