Lecture 20. Upper and lower heat kernel Gaussian bounds

In this short Lecture, as in the previous one, we consider a complete, n-dimensional Riemannian manifold (\mathbb{M},g) with nonnegative Ricci curvature. The volume doubling property proved in the previous Lecture is closely related to the sharp lower and upper Gaussian bounds due to P. Li and S.T. Yau. We first record a basic consequence of the volume doubling property, whose proof is left to the reader.

Theorem: Let C > 0 be the constant such that for every x \in \mathbb{M}, R \ge 0, \mu(B(x,2R)) \le C \mu (B(x,R)). Let Q = \log_2 C. For any x\in \mathbb{M} and r > 0 one has \mu(B(x,tr)) \ge C^{-1} t^{Q} \mu(B(x,r)),\ \ \ 0\le t\le 1.
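Since the proof is left as an exercise, let us at least sketch a possible iteration argument, which uses nothing beyond the doubling inequality itself: given 0 < t \le 1, pick the integer k \ge 0 such that 2^{-k-1} < t \le 2^{-k}. Iterating the doubling inequality k+1 times and using B(x,2^{-k-1}r) \subset B(x,tr), we get
\mu(B(x,r)) \le C^{k+1} \mu(B(x,2^{-k-1}r)) \le C^{k+1} \mu(B(x,tr)),
that is,
\mu(B(x,tr)) \ge C^{-1} C^{-k} \mu(B(x,r)) = C^{-1} \left(2^{-k}\right)^{Q} \mu(B(x,r)) \ge C^{-1} t^{Q} \mu(B(x,r)),
where we used C^{-k} = 2^{-kQ} and t \le 2^{-k}.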

We are now in a position to prove the main result of the Lecture.

Theorem: For any 0 < \varepsilon  < 1 there exists a constant C = C(n,\varepsilon) > 0, which tends to \infty as \varepsilon \to 0^+, such that for every x,y\in \mathbb{M} and t > 0 one has
\frac{C^{-1}}{\mu(B(x,\sqrt t))} \exp \left(-\frac{ d(x,y)^2}{(4-\varepsilon)t}\right)\le p(x,y,t)\le \frac{C}{\mu(B(x,\sqrt t))} \exp \left(-\frac{d(x,y)^2}{(4+\varepsilon)t}\right).

Proof: We begin by establishing the lower bound. First, from the Harnack inequality we obtain, for all x,y \in \mathbb{M}, t > 0, and every 0 < \varepsilon < 1,
p(x,y,t)\ge p(x,x,\varepsilon t)  \varepsilon^\frac{n}{2} \exp\left( -\frac{d(x,y)^2}{(4-\varepsilon)t}\right).
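For the reader's convenience, let us indicate how an estimate of this type follows from the Li-Yau Harnack inequality; we quote the inequality in its standard form, so the constants below are arranged slightly differently from the display above, which only affects the harmless prefactor. For every 0 < s < t,
p(x,x,s)\le p(x,y,t) \left(\frac{t}{s}\right)^{\frac{n}{2}} \exp\left(\frac{d(x,y)^2}{4(t-s)}\right),
and the choice s = \frac{\varepsilon t}{4} gives 4(t-s) = (4-\varepsilon)t, hence
p(x,y,t)\ge \left(\frac{\varepsilon}{4}\right)^{\frac{n}{2}} p\left(x,x,\frac{\varepsilon t}{4}\right) \exp\left(-\frac{d(x,y)^2}{(4-\varepsilon)t}\right),
which is the displayed bound up to a relabeling of \varepsilon and of the multiplicative constant.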
We thus need to estimate p(x,x,\varepsilon t) from below. But this has already been done in the proof of the volume doubling property where we established:
p(x,x,\varepsilon t) \ge \frac{C^*}{\mu(B(x,\sqrt{\varepsilon/2} \sqrt t))},\ \ \ \ \ x\in \mathbb{M},\ t > 0.
On the other hand, since \sqrt{\varepsilon/2} < 1, by the trivial inequality \mu(B(x,\sqrt{\varepsilon/2} \sqrt t)) \le \mu(B(x,\sqrt t)), we conclude
p(x,y,t) \geq \frac{C^*}{ \mu(B(x,\sqrt t))}  \varepsilon^\frac{n}{2} \exp\left( -\frac{d(x,y)^2}{(4-\varepsilon)t}\right).
This proves the Gaussian lower bound.

For the Gaussian upper bound, we first observe that the following estimate, valid for every \varepsilon' > 0, was proved in a previous lecture:
p(x,y,t)\le \frac{C}{\mu(B(x,\sqrt t))^{\frac{1}{2}} \mu(B(y,\sqrt t))^{\frac{1}{2}}} \exp \left(-\frac{d(x,y)^2}{(4+\varepsilon')t}\right).
At this point, by the triangle inequality and the volume doubling property we find
\mu(B(x,\sqrt{ t})) \leq \mu(B(y,d(x,y)+\sqrt{ t})) \leq C_1 \mu(B(y,\sqrt{ t})) \left(\frac {d(x,y)+\sqrt{ t}}{\sqrt t} \right)^Q,
with Q=\log_2 C, where C is the doubling constant.
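The second inequality here is simply the first Theorem of this Lecture applied in the ball B(y,d(x,y)+\sqrt t), with the scaling parameter taken to be \frac{\sqrt t}{d(x,y)+\sqrt t}\le 1: indeed,
\mu(B(y,\sqrt t)) \ge C^{-1} \left(\frac{\sqrt t}{d(x,y)+\sqrt t}\right)^{Q} \mu(B(y,d(x,y)+\sqrt t)),
which, after rearranging, shows that one may in fact take C_1 = C.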
The latter estimate gives
\frac{1}{\mu(B(y,\sqrt{ t}))}\leq \frac{C_1}{\mu(B(x,\sqrt{ t}))} \left(\frac {d(x,y)}{\sqrt{ t}}+1 \right)^Q.
Combining this with the above estimate we obtain
p(x,y,t)\le \frac{C_1^{1/2}C}{\mu(B(x,\sqrt t))}  \left(\frac {d(x,y)}{\sqrt{ t}}+1 \right)^{\frac{Q}{2}} \exp \left(-\frac{d(x,y)^2}{(4+\varepsilon')t}\right).
If now 0 < \varepsilon < 1, it is clear that we can choose 0 < \varepsilon' < \varepsilon such that
\frac{C_1^{1/2}C}{\mu(B(x,\sqrt t))}  \left(\frac {d(x,y)}{\sqrt{ t}}+1 \right)^{\frac{Q}{2}} \exp \left(-\frac{d(x,y)^2}{(4+\varepsilon')t}\right) \le  \frac{C^*}{\mu(B(x,\sqrt t))} \exp \left(-\frac{d(x,y)^2}{(4+\varepsilon)t}\right),
where C^* is a constant which tends to \infty as \varepsilon \to 0^+. The desired conclusion follows by suitably adjusting the values of \varepsilon' and of the constant in the right-hand side of the estimate. \square
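For completeness, here is the elementary estimate behind this last step. Writing u = \frac{d(x,y)}{\sqrt t} and \delta = \frac{1}{4+\varepsilon'}-\frac{1}{4+\varepsilon} > 0, one has
(u+1)^{\frac{Q}{2}} \exp\left(-\frac{u^2}{4+\varepsilon'}\right) = \left[(u+1)^{\frac{Q}{2}} e^{-\delta u^2}\right] \exp\left(-\frac{u^2}{4+\varepsilon}\right) \le \left(\sup_{v\ge 0}\,(v+1)^{\frac{Q}{2}} e^{-\delta v^2}\right) \exp\left(-\frac{u^2}{4+\varepsilon}\right),
and the supremum is finite because \delta > 0; it tends to \infty as \delta \to 0^+, which is consistent with the behavior of C^* as \varepsilon \to 0^+ (choosing, say, \varepsilon' = \varepsilon/2).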

To conclude the lecture, we mention without proof what the previous arguments give in the case where \mathbf{Ric} \ge -K with K \ge 0. We encourage the reader to carry out the proof as an exercise.

Theorem: Let us assume \mathbf{Ric} \ge -K with K \ge 0. For any 0 < \varepsilon < 1 there exist constants C_1 = C_1(n,K,\varepsilon) > 0 and C_2 = C_2(n,K,\varepsilon) > 0 such that for every x,y\in \mathbb{M} and t > 0 one has
p(x,y,t)\le \frac{C_1}{\mu(B(x,\sqrt t))} \exp \left(-\frac{d(x,y)^2}{(4+\varepsilon)t} +KC_2 (t +d(x,y)^2)\right).
p(x,y,t)\ge \frac{C^{-1}_1}{\mu(B(x,\sqrt t))} \exp \left(-\frac{d(x,y)^2}{(4-\varepsilon)t} -KC_2 (t +d(x,y)^2)\right).
