Hey, Mphlee,

I presume you're confused by the term "kernel." As mathematicians are rather terrible at naming things, this is easily confused with the hundred other uses of the word "kernel" in the English language. As I'm kind of bored, I'll take you through the deep history and what I mean.

A "kernel" of a "linear operator" in hilbert spaces/functional analysis/operator theory is loosely related to matrices, but not exactly. Say you have a linear operator,

Where

is a linear functional space. Let's just say Hilbert space for simplicity; that means it has a norm, an inner product, is sequentially complete, and a whole bunch of other nice things. Let's also assume this operator is, so to speak, "nice." There are plenty of operators that are "nice" (Hermitian is a good enough restriction, for example; so let's just say it's Hermitian). Then, there are canonically two ways of representing this operator.

The first is the matrix manner (interestingly, this is Heisenberg's interpretation vs. Schrodinger's interpretation, which is resolved by Von Neumann's martian mathematics). Take a sequence of orthonormal eigenvectors

$$\{\phi_n\}_{n=0}^\infty$$

(since this linear operator is Hermitian, we can always find such a basis if we assume the Hilbert space is of the first order (isomorphic to $\ell^2$/second countable)). This means, for every $f \in \mathcal{H}$,

$$f = \sum_{n=0}^\infty \langle f, \phi_n \rangle \phi_n$$

where the brackets are the inner product. Now, we've assumed that

$$T\phi_n = \lambda_n \phi_n$$

so the first representation of this operator is

$$Tf = \sum_{n=0}^\infty \lambda_n \langle f, \phi_n \rangle \phi_n$$

by just applying the operator termwise. Which is, as I'm sure you're familiar, just changing the coordinate system and diagonalizing the operator $T$. This leads to a whole study of diagonalizing linear operators and changing coordinate systems with matrices of infinite dimension; looking for orthonormal vectors and conjugating by matrices is this realm.
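If you want to poke at this numerically, here's a minimal finite-dimensional sketch of the matrix manner (everything in it, the random $4 \times 4$ matrix included, is just an illustrative toy I made up, not anything from the theory above):

```python
import numpy as np

# Toy finite-dimensional version of the "matrix manner": diagonalize a
# Hermitian operator T and reapply it termwise through its eigenvectors.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
T = (A + A.conj().T) / 2                  # Hermitian by construction

lam, phi = np.linalg.eigh(T)              # real eigenvalues, orthonormal columns

f = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# T f = sum_n lambda_n <f, phi_n> phi_n   (the termwise representation)
Tf_termwise = sum(lam[n] * (phi[:, n].conj() @ f) * phi[:, n] for n in range(4))

print(np.allclose(T @ f, Tf_termwise))    # True
```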

The second realm (I can't remember for the life of me who proved it): there exists a representation of this operator $T$ as an integral. This is usually proved with the spectral theorem, and a bunch of crazy nonsense I need to reread to properly quote. But it certainly exists for Hermitian operators. So permit me to call it the function $k(x,y)$, where every element $f$ of $\mathcal{H}$ takes

$$f \mapsto \int k(x,y) f(y)\,dy.$$

Then, if $k(x,y)$ is the kernel of the operator $T$, we get

$$(Tf)(x) = \int k(x,y) f(y)\,dy.$$

This is known as the integral representation of a linear operator. Particularly, if you ever talk about Hilbert spaces with people, by "kernel of a linear operator" they mean $k(x,y)$. This representation is usually the best representation. I mean, can you imagine the Fourier Transform represented in the above matrix method? Sounds god awful (Heisenberg, what were you thinking!?). So moving forward, what I meant is simple.
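And to see how the two realms meet: discretize the integral and the kernel collapses right back into a matrix acting by matrix-vector product. A rough sketch (the kernel $k(x,y) = \min(x,y)$ and the test function are arbitrary illustrative choices of mine):

```python
import numpy as np
from scipy.integrate import quad

# Discretizing an integral kernel turns (Tf)(x) = int_0^1 k(x,y) f(y) dy
# back into a matrix-vector product.  k(x,y) = min(x,y) is just an example.
k = lambda x, y: np.minimum(x, y)
f = lambda y: np.cos(3 * y)

N = 2000
y = (np.arange(N) + 0.5) / N              # midpoint grid on [0, 1]
K = k(y[:, None], y[None, :])             # the kernel, sampled as an N x N matrix
Tf = K @ f(y) / N                         # one matrix-vector product ~ the integral

# compare one sample of Tf against adaptive quadrature of the true integral
x0 = y[500]
exact, _ = quad(lambda t: k(x0, t) * f(t), 0, 1, points=[x0])
print(abs(Tf[500] - exact) < 1e-4)        # True: the discretized kernel matches
```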

Take a Hermitian operator, whose eigenvalues $\lambda_n$ are (thankfully) real. Let's assume they're countable (our Hilbert space is second countable, so this is okay). Let's assume that $\lambda_n > 0$. And let's let our eigenvectors $\phi_n$ be orthonormal (they're already orthogonal because the operator is Hermitian, so normalizing is just multiplying by a constant). So our operator, again, is

$$Tf = \sum_{n=0}^\infty \lambda_n \langle f, \phi_n \rangle \phi_n$$

But additionally,

$$T^s f = \sum_{n=0}^\infty \lambda_n^s \langle f, \phi_n \rangle \phi_n$$

AHA! We've iterated the operator and turned it into a semi-group (Von Neumann did masterful stuff using the same principle, but he was a master). But how do we do this if the $\lambda_n$ and $\phi_n$ are non-obvious? How do we do this having minimal understanding of the operator, and only an integral representation?
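In finite dimensions, where the eigendata is computable, the semigroup above is almost a one-liner. A sketch (the positive-definite $T$ is a made-up example, chosen so that $\lambda_n > 0$ holds):

```python
import numpy as np

# Finite-dimensional sketch of the semigroup T^s: push the eigenvalues to
# lambda_n^s and recombine.  Needs lambda_n > 0, so build T positive definite.
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
T = A @ A.T + 5 * np.eye(5)               # symmetric, all eigenvalues > 0

def power(T, s):
    lam, phi = np.linalg.eigh(T)
    return phi @ np.diag(lam ** s) @ phi.T   # sum_n lambda_n^s <., phi_n> phi_n

# the semigroup law T^s T^t = T^(s+t), at non-integer exponents:
print(np.allclose(power(T, 0.3) @ power(T, 0.7), T))   # True
```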

Well, what that stupid little matrix thing (which U of T and other professors pointed out; I never really thought it would be that important) kind of says is that we can just use the fractional derivative,

$$\lambda^s = \frac{d^s}{dw^s}\Big|_{w=0} e^{\lambda w}$$

and that

$$T^s f = \frac{d^s}{dw^s}\Big|_{w=0} \sum_{n=0}^\infty T^n f \frac{w^n}{n!}$$

satisfies

$$T^s T^t f = T^{s+t} f.$$

I mean, it's evident; each $\lambda_n$ is in $\mathbb{R}^+$.....
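As a sanity check of that fractional-derivative identity: for $\mathrm{Re}(s) < 0$ and $\lambda > 0$, the differintegral of $e^{\lambda w}$ at $w = 0$ reduces to a convergent Mellin-type integral, $\lambda^s = \frac{1}{\Gamma(-s)} \int_0^\infty e^{-\lambda w} w^{-s-1}\,dw$, which can be verified numerically (the values $\lambda = 2$, $s = -1/2$ are arbitrary picks of mine):

```python
import math
from scipy.integrate import quad

# Check  lambda^s = (1/Gamma(-s)) * int_0^inf e^{-lambda w} w^{-s-1} dw
# for a value of s with Re(s) < 0, where the integral converges directly.
lam, s = 2.0, -0.5

# substitute w = u^2 to smooth the w^{-s-1} singularity at the origin:
# the integrand becomes 2 * e^{-lam u^2} * u^{-2s-1}
integral, _ = quad(lambda u: 2.0 * math.exp(-lam * u * u) * u ** (-2 * s - 1),
                   0, math.inf)
frac_deriv = integral / math.gamma(-s)

print(frac_deriv, lam ** s)               # both ~ 0.70710678
```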

That's more so what the U of T people and other professors were talking about, last I heard. I don't want to say much more, though; I don't really know how priority works when it comes to talking about other people's research before publication. I'm not taking credit for any of that, and I never really thought of it like that. I'm more interested in the cool kids' operators, like hyper-operators/iteration operators. Screw matrix blather.