To D.Nordstrom, editor, Phys Rev D. prd@ridge.aps.org
``Zero point energies, the cosmical constant,
and supersymmetry'' (df8058) by Warren D. Smith (me).
It was rejected by a referee for Phys Rev D.
On reconsideration, I've decided there IS something wrong with
my paper (which you had rejected anyhow). The following describes what is wrong.
So: I'm quitting complaining.
--Warren D. Smith.
-----------------------------
REFUTATION OF MY "ZEROPOINT" REPORT
====Warren D. Smith 30 Nov 2001=======
Well, I hate to admit this, but... it appears that my report
"Zeropoint energies, cosmical constant, and supersymmetry" is WRONG.
This is a pity, since it appeared to be a beautiful argument,
and it appeared to solve (or at least offer the potential to solve)
one of the biggest problems in theoretical physics, the "cosmical
constant crisis."
Unfortunately, the grandest of theories can be laid low
by some simple facts.
I'm still leaving the "Zeropoint" report on my web page even though I
currently believe it is wrong (but it now is accompanied by this here
warning), since it seems to me to contain interesting and useful
ideas which perhaps can be used for another purpose - or which perhaps
can be resuscitated for its originally-intended purpose
(see the bottom for possible ways that might be done).
Incidentally, this has something to do with the referee's
rejection of this paper. The referee was definitely wrong
in the sense that, apparently, he did not even look at over half of my
paper - the part with my new ideas that overcame the problems with old ideas!
He simply saw that my paper had some old ideas in the half he did look
at; those ideas were "known" not to work, for reasons the
ref listed (reasons I had already considered in my paper & shown how to
overcome - but the ref ignored that utterly); hence: reject. So the ref did
an incompetent job. BUT, just because the ref was wrong, does not
mean I was right! It now appears we BOTH did a lousy job...
I will now explain what my paper did and the problem with it.
Essentially, the "cosmical constant problem" is the fact that
space should be filled with "zeropoint modes" of various fields
(e.g. electromagnetic) at all frequencies up to some very high frequency
(perhaps the Planck frequency, but anyway very high).
The existence of these modes has experimental consequences (e.g. the Casimir effect), which have been seen.
The total mass-energies of these modes should be enormous.
The vacuum therefore should weigh (i.e. gravitate) a lot.
But in fact, experimentally, it does not. Experimentally,
the "cosmical constant" (vacuum energy) is tiny, or zero. The disagreement
between theory and experiment here is enormous - that is the problem
that needs to be solved.
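To put a number on "enormous" (my own back-of-envelope figures here, with a Planck-scale cutoff - one common but not unique choice - not numbers taken from the paper): the zeropoint energy density of even a single massless bosonic field overshoots the observed vacuum energy density by roughly 120 orders of magnitude.

```python
import math

# Back-of-envelope size of the "cosmical constant crisis".
# Zeropoint energy density of ONE massless bosonic field with a hard
# momentum cutoff k_c:  rho = hbar * c * k_c^4 / (16 * pi^2)   [J/m^3].
hbar = 1.055e-34           # J*s
c    = 2.998e8             # m/s
l_planck = 1.616e-35       # m (Planck length)
k_cutoff = 1.0 / l_planck  # cutoff at the Planck scale - an assumption

rho_zeropoint = hbar * c * k_cutoff**4 / (16 * math.pi**2)

# Observed vacuum (dark) energy density, about 0.7 of the critical density.
rho_observed = 6e-10       # J/m^3, approximate

print(f"predicted: {rho_zeropoint:.1e} J/m^3")
print(f"observed:  {rho_observed:.1e} J/m^3")
print(f"ratio:     {rho_zeropoint / rho_observed:.1e}")  # ~1e120
```

That ratio of ~10^120 is the disagreement the 9 steps below try to kill off.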
My proposed solution of this problem had 9 steps:
1. According to a proposal popular among theoretical physicists called "supersymmetry"
(SUSY for short), there must be 1 fermionic mode for each bosonic mode.
2. Fermionic modes have NEGATIVE zeropoint energies.
3. I can write an asymptotic expansion, in decreasing powers of
the (assumed high) frequency-cutoff, of the vacuum energy.
4. The coefficient of the first term in this expansion is EXACTLY ZERO
due to (2) - the negatives exactly cancel the positives at first order.
5. The coefficient of the second term in this expansion is ZERO
due to a mathematical identity (called by various names, one of
them is the "supertrace theorem") discovered in the 1970s by Ferrara.
This identity is supposed to hold "at tree level" in all supersymmetric theories
including broken SUSY. ("Tree level" means at the level of
the first term in a different asymptotic expansion in decreasing powers
of some different large numbers - e.g. the supersymmetric breaking energy.)
6. Great. Assuming SUSY, we have now knocked out the first 2 terms of that
expansion of the cosmical constant. But what about the 3rd, 4th, etc terms?
7. Well, it turns out that my asymptotic expansion was based on a certain
approximation of a sum by an integral. If you really do it right, i.e. use
that sum, then a function which (in the integral-based version) is nice and smooth,
turns out (in the better, sum-based version) to be "rough." It behaves
irregularly when you look at it closely. This ultimately is traceable
to irregularities in the distribution of prime numbers, etc. It
can be analysed. There are theorems about it (which I used) by Tsang.
8. So now, when this "roughness" is put into the picture, it turns out
to be BIGGER than the 3rd, 4th, etc terms. In fact, in the limit
as the high-freq cutoff ---> infinity, it is bigger even than the
2nd term would have been (if it had had some other coefficient than 0) -
and bigger by a factor which goes to infinity.
9. This roughness causes the energy of the vacuum to be extremely sensitive to
tiny perturbations in various quantities (such as the precise value of the cutoff, or the
precise size of the universe). I then made an argument that tremendous forces
should automatically arise, which will have a "feedback" effect causing the size
of the universe to self-adjust, tremendously quickly, by a tiny amount, to get
to a location on the "rough" curve, at which the zeropoint energy should be zero almost exactly.
One can crudely estimate the amount by which it still might be nonzero, and that amount
is consistent with experimental evidence.
This solves the cosmical constant problem wonderfully!
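Steps 1-6 can be checked in miniature. The following is my own toy sketch, NOT the paper's calculation: the masses are invented, chosen only so that the supertrace sum(m_boson^2) - sum(m_fermion^2) vanishes. With equal numbers of bosonic and fermionic species the cutoff^4 term cancels (step 4), the supertrace condition kills the cutoff^2 term (step 5), and the signed total is left growing only like log(cutoff) - minuscule next to the cutoff^4 energy of any single species.

```python
import math

# Toy version of steps 1-6: signed zeropoint energy up to a cutoff, for a
# MADE-UP spectrum of 2 bosons and 2 fermions whose squared masses have
# equal sums (so the Ferrara-type supertrace identity holds exactly).
BOSONS   = [1.0, 2.0]                       # m^2 values sum to 5.0
FERMIONS = [1.5, math.sqrt(5.0 - 1.5**2)]   # m^2 values also sum to 5.0

def signed_integrand(k):
    # k^2 * [ sum_bosons sqrt(k^2+m^2)  -  sum_fermions sqrt(k^2+m^2) ]:
    # fermionic modes enter with NEGATIVE zeropoint energy (step 2).
    s  = sum(math.sqrt(k*k + m*m) for m in BOSONS)
    s -= sum(math.sqrt(k*k + m*m) for m in FERMIONS)
    return k * k * s

def vacuum_energy(cutoff, n=100000):
    # Simpson's rule for E(L) = (1/(4 pi^2)) * integral_0^L signed_integrand
    h = cutoff / n
    acc = signed_integrand(0.0) + signed_integrand(cutoff)
    for i in range(1, n):
        acc += (4 if i % 2 else 2) * signed_integrand(i * h)
    return (h / 3.0) * acc / (4.0 * math.pi**2)

L = 1000.0
one_species = L**4 / (16 * math.pi**2)  # the L^4 term of a lone species
print("one species alone:", one_species)       # enormous, ~6e9
print("signed total:     ", vacuum_energy(L))  # ~0.1 in size: log growth
```

The punchline of steps 7-9 is that once the integral is replaced by the honest mode sum, an irregular "roughness" term appears on top of this smooth remainder.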
But now to explain the bug:
(A) In step (8), the "factor which goes to infinity" only does so logarithmically.
Furthermore, the Ferrara identity, despite being a "theorem" (physicists seem
to have a different notion of the word "theorem"), is currently believed, by physicists
who support SUSY, to be wrong ("softly broken SUSY") - and thus my second coefficient
is NOT zero. (If the Ferrara identity held reasonably exactly, supposedly this would
force light particles to exist, which have never been observed experimentally.)
(B) So now the crux question is: we must compare the estimated now-nonzero value
of that 2nd coefficient with the value of log(infinity)!
Is log(infinity) greater?
Well, of course it is. BUT... if the high frequency mode-cutoff is NOT infinite,
but in fact is the Planck frequency (which is large, but finite) then
according to some numerical estimates I just made (which, in retrospect, I should have
made long ago...) it is in fact NOT greater! It is
many orders of magnitude smaller. Oops.
This seems to blow my solution out of the water!
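For what it's worth, the "roughness" of step 7 - which is what everything hinges on - can be seen in a toy form (my illustration, not the paper's analysis, and not Tsang's theorems). The modes of a cubical box live on a 3D integer lattice, so replacing the mode sum by an integral is the classical sphere problem: counting lattice points inside a ball versus taking its volume. The error term jumps around irregularly as the radius varies, but it is vastly smaller than the volume term - the same flavor of size comparison that goes against me in (B).

```python
import math

def lattice_count(R):
    # Number of integer points (a, b, c) with a^2 + b^2 + c^2 <= R^2:
    # the discrete mode count for a cubical box, cut off at "frequency" R.
    R2, m, total = R * R, int(R), 0
    for a in range(-m, m + 1):
        for b in range(-m, m + 1):
            rem = R2 - a * a - b * b
            if rem >= 0:
                # for each (a, b), the allowed c values are -floor(sqrt(rem))..+floor(sqrt(rem))
                total += 2 * int(math.sqrt(rem)) + 1
    return total

# The smooth (integral) approximation is the ball volume (4/3) pi R^3;
# the difference is the number-theoretic "roughness".
for R in [20.0, 20.5, 21.0, 21.5, 22.0]:
    volume = (4.0 / 3.0) * math.pi * R**3
    print(f"R = {R:5.1f}   count - volume = {lattice_count(R) - volume:+9.1f}")
```

The fluctuation is real but tiny relative to the volume - so unless some coefficient in front of the smooth terms is exactly zero, the smooth terms win.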
Can it be resuscitated? I do not know. The two most obvious ways to try are
(I) Resuscitate the Ferrara identity - at least for the present purposes.
(II) Argue that, for some reason, it is legitimate to take
the limit as the cutoff frequency tends to infinity - and I mean infinity,
not merely the Planck frequency.