[Physics] About "logical errors"

Arend Lammertink lamare at gmail.com
Sat Nov 5 14:40:25 CET 2016


On Fri, Nov 4, 2016 at 8:15 AM, Thomas Goodey <thomas at flyingkettle.com> wrote:
> On 3 Nov 2016 at 12:00, Arend Lammertink wrote:
>
>> Let me just quote Nikola Tesla... This logically
>> thinking realist already wiped the floor with the theory
>> of relativity in 1932 and thus proved for the umpteenth
>> time to be far ahead of his time:
>
> No, he didn't. He proved his total ignorance of the
> fundamental elements of the subject.
>

I like to think of it as staying within the logical and reasonable
foundation laid by the likes of Newton and Maxwell. Disagreeing with
relativity is not the same as being ignorant of it.

>> "It might be inferred that I am alluding to the curvature of
>> space supposed to exist according to the teachings of
>> relativity, but nothing could be further from my mind. I
>> hold that space cannot be curved, for the simple reason that
>> it can have no properties.
>
> He had made up his mind already, a priori without
> considering any facts.
>

Well, he was convinced he had observed waves propagating at speeds far
exceeding that of light, waves he believed to be longitudinal rather
than "Hertzian" in nature. Of course, one can argue that he simply
made a measurement error, just as Wheatstone is said to have done in
his 1834 experiment, but there is quite a lot of evidence suggesting
that faster-than-light waves are possible, most notably "fast light"
due to anomalous dispersion. See for example Stenner, Thevenaz and
Wang in my collection:

http://www.tuks.nl/pdf/Reference_Material/Fast_Light/

So, while it is widely considered a "proven fact" that not even
information can be transmitted at a speed exceeding that of light,
there is at the very least evidence which challenges this "fact".

One may not judge this evidence to be convincing, but IMHO it is fair
to say that if someone were to succeed in transmitting information
faster than the speed of light, the whole theoretical body derived
from the Lorentz transform would become untenable.
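
The reason can be read off from the transform itself (standard
textbook form, my addition, not a quote):

    t' = \gamma \left( t - \frac{v x}{c^2} \right), \qquad
    \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}

For a signal covering x = u t with u > c, any observer moving at a
speed v < c such that u v > c^2 finds t' < 0: in that observer's frame
the signal arrives before it was sent, so a single confirmed
faster-than-light transmission of information breaks the causal
ordering on which the whole framework rests.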

In other words: if Tesla indeed measured wave propagation faster than
c, that would be reason enough to be skeptical about relativity.


>> It might as well be said that God
>> has properties. He has not, but only attributes and these
>> are of our own making.
>
> This is religious talk, and sounds like some type of
> Catholic hairsplitting. What is the difference between
> "properties" and "attributes"? Scientifically, we just talk
> about what structural regularities can be identified in
> nature.
>

I wouldn't know, and it's too late to ask him...

>> To say that in the presence of large bodies space becomes
>> curved, is equivalent to stating that something can act
>> upon nothing. I, for one, refuse to subscribe to such a
>> view."
>
> But actually it is the fact. Eddington proved that with the
> eclipse observation.

https://en.wikipedia.org/wiki/Arthur_Eddington

"He and Astronomer Royal Frank Watson Dyson organized two expeditions
to observe a solar eclipse in 1919 to make the first empirical test of
Einstein’s theory: the measurement of the deflection of light by the
sun's gravitational field."

Does this prove that space is indeed curved?

No, of course not!

It proves that light is deflected by the sun, but not what actually
causes this deflection.

In general, this goes for many experiments which are claimed to prove
relativity. All that is proven is that relativity makes correct
predictions in many cases.

In fact, no theory can be proven:

https://en.wikipedia.org/wiki/Scientific_theory

'Stephen Hawking states, "A theory is a good theory if it satisfies
two requirements: It must accurately describe a large class of
observations on the basis of a model that contains only a few
arbitrary elements, and it must make definite predictions about the
results of future observations."

He also discusses the "unprovable but falsifiable" nature of theories,
which is a necessary consequence of inductive logic, and that "you can
disprove a theory by finding even a single observation that disagrees
with the predictions of the theory".'



> Of course it depends upon what you
> mean by "nothing", which depends upon what you mean by
> "thing".
>

That is true, of course.

I share Tesla's fundamental view, which is that "space" is an "empty
room", while all that is "in" it is "physical" and/or has "physical
qualities", as Einstein put it.

In this view, there is a fundamental distinction between "space" and
that which is in it, which makes it very strange to model gravity as a
fictitious force:

https://en.wikipedia.org/wiki/Fictitious_force
"A fictitious force, also called a pseudo force, d'Alembert force or
inertial force, is an apparent force that acts on all masses whose
motion is described using a non-inertial frame of reference, such as a
rotating reference frame. [...] Gravitational force would also be a
fictitious force based upon a field model in which particles distort
spacetime due to their mass."
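
For reference, the standard textbook expression for the apparent force
on a mass m in a frame rotating with angular velocity omega is (my
addition, not part of the quoted article):

    \mathbf{F}_{fict} = -m\,\dot{\boldsymbol\omega} \times \mathbf{r}
                        - 2m\,\boldsymbol\omega \times \mathbf{v}
                        - m\,\boldsymbol\omega \times
                          (\boldsymbol\omega \times \mathbf{r})

i.e. the Euler, Coriolis and centrifugal terms. They appear purely
because the frame is non-inertial; nothing actually pushes on the
mass.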

Of course, you can do such things and work out the math, but it leads
to very strange conclusions, and this way of modelling makes it pretty
much impossible to ever unify gravity with the electromagnetic field.

Also, in this way of modelling, gravity is no longer considered to be
an interaction between two objects, but an interaction between an
object and "spacetime", or, an interaction between "a thing" and "an
empty room" or "no thing".


>> Isn't it just beautiful how Tesla makes perfectly clear that
>> the Emperor of modern physics has no clothes with simple
>> logic?
>
> No, it was very stupid, because he opened his mouth in
> total ignorance.
>

I have to disagree. His simple logic highlights the problems with
relativity in exactly the right place.


>> Think about it. Space is literally no thing, nothing. It is
>> the emptiness, the void, wherein physical stuff exists, but
>> space in and of itself is not part of anything physical.
>
> What do you mean by "physical" here - it's not clearly
> defined?
>

See above.

>> And because space is
>> not physical at all, it can have no physical properties.
>
> But it has metric properties. That is the whole point of
> space. (Let's leave time out of it for simplicity; we won't
> speak of space-time although really we should.)

That's a good point.

> The entire
> idea of space is that it serves as a foundation for
> measurement of distances between points (events). Whatever
> space "is" is not to the point. Read your Korzybski!
>

It is interesting that you connect the distance between points with
events, because that highlights the fundamental difference between our
two points of view:

1) Metric (spatial) "points" and chronological (time) "events" are
best described as independent dimensions in a mathematical "space", in
order to be able to define a meaningful measurement of 'distance';

2) Metric (spatial) "points" and chronological (time) "events" are
best described as interdependent dimensions in a mathematical "space",
as defined by the Lorentz (coordinate) transform. Hence one cannot
define a unique measurement of 'distance' (in spacetime) and therefore
one has to consider 'distance' to be 'relative' to the observer.
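
To make the contrast concrete, a minimal sketch in standard notation
(my addition, not a quote). In view 1, space and time each keep their
own invariant measure:

    ds^2 = dx^2 + dy^2 + dz^2, \qquad d\tau = dt

while in view 2 only the combined Minkowski interval is invariant
under the Lorentz transform:

    ds^2 = dx^2 + dy^2 + dz^2 - c^2\,dt^2

so that neither spatial distance nor elapsed time has an
observer-independent value on its own.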

Both of these views are mathematically correct, and it is the
properties of the Lorentz coordinate transform which define the
relation between the two. As described by the late Dr. C.K.
Thornhill:

http://etherphysics.net/CKT4.pdf

"ABSTRACT The real space-time of Newtonian mechanics and the ether
concept is contrasted with the imaginary space-time of the non-ether
concept and relativity. In real space-time (x, y, z, ct)
characteristic theory shows that Maxwell’s equations and sound waves
in any uniform fluid at rest have identical wave surfaces. Moreover,
without charge or current, Maxwell’s equations reduce to the same
standard wave equation which governs such sound waves. This is not a
general and invariant equation but it becomes so by Galilean
transformation to any other reference-frame. So also do Maxwell’s
equations which are, likewise, not general but unique to one
reference-frame. The mistake of believing that Maxwell’s equations
were invariant led to the Lorentz transformation and to relativity;
and to the misinterpretation of the differential equation for the wave
cone through any point as the quadratic differential form of a
Riemannian metric in imaginary space-time (x, y, z, ict). Mathematics
is then required to tolerate the same equation being transformed in
different ways for different applications. Otherwise, relativity is
untenable and recourse must then be made to real space-time, normal
Galilean transformation and an ether with Maxwellian statistics and
Planck’s energy distribution.

[...]

It was the mistaken idea, that Maxwell’s equations and the standard
wave equation should be invariant, which led, by a mathematical freak,
to the Lorentz transform (which demands the non-ether concept and a
universally constant wave-speed) and to special relativity. The mistake
was further compounded by misinterpreting the differential equation
for the wave hypercone through any point as the quadratic differential
form of a Riemannian metric in imaginary space-time (x, y, z, ict).
Further complications ensued when this imaginary space-time was
generalised to encompass gravitation in general relativity."


Note that Thornhill wrote: "without charge or current, Maxwell’s
equations reduce to the same standard wave equation which governs such
sound waves".

This is exactly the point I refer to as "Maxwell's hole", because with
the current definitions for "charge" and "current", a "recursive"
problem is introduced in the model, whereby electromagnetic radiation
is defined as being caused by "quanta" of electromagnetic radiation:

http://www.tuks.nl/wiki/index.php/Main/OnSpaceTimeAndTheFabricOfNature

"Electromagnetic waves are considered to be produced by moving
"charged particles", while these particles show this "wave particle
duality" behaviour themselves, as does "EM radiation" on it's turn. In
other words: electromagnetic radiation is essentially considered to be
produced by movements of "quanta" of electromagnetic radiation, called
either "photons" or "particles". Kind of a dog chasing it's own tail,
or recursion as software engineers call it."

To me, that is a "logical error", and it is found in the very
equations which, according to Thornhill, led to the Lorentz transform:
Maxwell's equations.

When we remove this recursive definition by deleting the term dA/dt
from the expression for the electric field in terms of the potentials,
[E] = -grad(Phi) - dA/dt, Maxwell's equations reduce to the same
equations which are used in fluid dynamics vector theory. This also
allows us to define both the electric scalar potential field Phi and
the magnetic vector potential field [A] in terms of the textbook fluid
dynamics (aether) flow velocity field [v]:

http://www.tuks.nl/wiki/index.php/Main/AnExceptionallyElegantTheoryOfEverything
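
To sketch what I mean (my own notation, not taken from the linked
page): by the Helmholtz theorem, any sufficiently smooth flow velocity
field splits into an irrotational and a solenoidal part,

    \mathbf{v} = -\nabla \Phi + \nabla \times \mathbf{A}, \qquad
    \nabla \times (\nabla \Phi) = 0, \qquad
    \nabla \cdot (\nabla \times \mathbf{A}) = 0

so with the dA/dt term removed, -grad(Phi) picks out the irrotational
part and curl([A]) the solenoidal part, making [E] and [B] the two
Helmholtz components of a single underlying velocity field.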


Of course, this difference in view should (eventually) be settled by
experiment. I already pointed to experimental data which suggest that
"faster than light" waves are possible, and IMHO these are
longitudinal, "sound-like" waves in nature. However, the applicability
of the Lorentz transform rests on whether or not the speed of light c
is a Universal constant, which can also be seriously questioned:

https://www.sciencenews.org/article/speed-light-not-so-constant-after-all
"Light doesn’t always travel at the speed of light. A new experiment
reveals that focusing or manipulating the structure of light pulses
reduces their speed, even in vacuum conditions.

A paper reporting the research, posted online at arXiv.org and
accepted for publication, describes hard experimental evidence that
the speed of light, one of the most important constants in physics,
should be thought of as a limit rather than an invariable rate for
light zipping through a vacuum."


http://www.wnd.com/2004/07/25852/
"Early in 1979, an Australian undergraduate student named Barry
Setterfield, thought it would be interesting to chart all of the
measurements of the speed of light since a Dutch astronomer named Olaf
Roemer first measured light speed in the late 17th century.
Setterfield acquired data on over 163 measurements using 16 different
methods over 300 years.

The early measurements typically tracked the eclipses of the moons of
Jupiter when the planet was near the Earth and compared it with
observations when the planet was farther away. These observations
were standard, simple and repeatable, and have been measured by
astronomers since the invention of the telescope. These are
demonstrated to astronomy students even today. The early astronomers
kept meticulous notes and sketches, many of which are still available.

Setterfield expected to see the recorded speeds grouped around the
accepted value for light speed, roughly 299,792 kilometers/second. In
simple terms, half of the historic measurements should have been
higher and half should be lower.

What he found defied belief: The derived light speeds from the early
measurements were significantly faster than today. Even more
intriguing, the older the observation, the faster the speed of light.
A sampling of these values is listed below:

In 1738: 303,320 +/- 310 km/second
In 1861: 300,050 +/- 60 km/second
In 1877: 299,921 +/- 13 km/second
In 2004: 299,792 km/second (accepted constant)

Setterfield teamed with statistician Dr. Trevor Norman and
demonstrated that, even allowing for the clumsiness of early
experiments, and correcting for the multiple lenses of early
telescopes and other factors related to technology, the speed of light
was discernibly higher 100 years ago, and as much as 7 percent higher
in the 1700s. Dr. Norman confirmed that the measurements were
statistically significant with a confidence of more than 99 percent.

Setterfield and Norman published their results at SRI in July 1987
after extensive peer review."

A rather interesting background article on this:

http://www.khouse.org/articles/2002/423/
"Barry teamed up with Trevor Norman of Flinders University in
Adelaide, and in 1987 Flinders itself published their paper, "Atomic
Constants, Light, and Time." Their math department had checked it and
approved it and it was published with the Stanford Research Institute
logo as well. What happened next was like something out of a badly
written novel. Gerald Aardsma, a man at another creationist
organization, got wind of the paper and got a copy of it. Having his
own ax to grind on the subject of physics, he called the heads of both
Flinders and SRI and asked them if they knew that Setterfield and
Norman were [gasp] creationists! SRI was undergoing a massive staff
change at the time and since the paper had been published by Flinders,
they disavowed it and requested their logo be taken off. Flinders
University threatened Trevor Norman with his job and informed Barry
Setterfield that he was no longer welcome to use any resources there
but the library. Aardsma then published a paper criticizing the
Norman-Setterfield statistical use of the data. His paper went out
under the auspices of a respected creation institution."

Their paper:

http://www.ldolphin.org/setterfield/report.html
"A systematic, non-linear decay trend is revealed by 163 measurements
of c in dynamical time by 16 methods over 300 years. Confirmatory
trends also appear in 475 measurements of 11 other atomic quantities
by 25 methods in dynamical time. Analysis of the most accurate atomic
data reveals that the trend has a consistent magnitude in all
quantities. Lunar orbital data indicate continuing c decay with
slowing atomic clocks. A decay in c also manifests as a red-shift of
light from distant galaxies. These variations have thus been recorded
at three different levels of measurement: the microscopic world of the
atom, the intermediate level of the c measurements, and finally on an
astronomical scale. Observationally, this implies that the two clocks
measuring cosmic time are running at different rates.

[...]

(A). DYNAMICAL C VARIATION DISCUSSED:

In October 1983 the speed of light, c, was declared a universal
constant of nature defined as 299,792.458 Km/s and as such is now used
in the definition of the meter. However, in a recent article on this
subject, Wilkie points out that many scientists have speculated that
the speed of light might be changing over the lifetime of the universe
and concludes that it is still possible that the speed of light might
vary on a cosmic timescale. Van Flandern agrees. He states that
"Assumptions such as the constancy of the velocity of light ... may be
true in only one set of units (atomic or dynamical), but not the
other."

Historically, the literature, particularly from the 1920's to the
1940's, amplifies this conclusion and indicates that if c is varying
it is doing so in dynamical units, not atomic. Thus, the values for c
obtained by Michelson alone were as follows in Table A (with full
details in Table 5).

TABLE A

DATE     VALUE OF C (km/s)
1879.5   299,910 ±50
1882.8   299,853 ±60
1924.6   299,802 ±30
1926.5   299,798 ±15

These results are not typical of a normal distribution about today's
fixed value. However, the 1882.8 result is confirmed by the values
from two other experiments. One by Newcomb in 1882.7 yielded a c value
of 299,860 ±30 Km/s, while Nyren using another method in 1883 obtained
a definitive value of 299,850 ±90 Km/s (see discussion below for
details). In other words, Michelson's 1882.8 result was completely
consistent with the other values obtained that year. The mean of these
three values (299,854 Km/s) lies above today's value by 61.8 Km/s,
though the standard deviation of these three values is only ±5 Km/s.
The quoted probable errors thus seem to be conservative.

Assuming no c variation, the least squares mean for all these data
show they are distributed about a point 53 Km/s above today's value.
The mean error is ±45.8 Km/s, which places today's value beyond its
lower limit. If the Student's t-distribution is applied to these data,
the hypothesis that c has been constant at its present value from
1879.5 to 1926.5 can be rejected with a confidence interval of 98.2%.
One would expect that other results from this type of experiment would
lie below today's value by a similar amount to restore the normal
distribution. This is not observed."
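
For those who want to check the arithmetic, here is a minimal sketch
in Python of the kind of test described above, applied to just the
four Michelson values from Table A (the paper itself used more data
points and weighted errors, so its 98.2% figure will not come out
exactly):

    from statistics import mean, stdev
    from math import sqrt

    # Michelson's measurements from Table A (km/s), compared
    # against today's defined value of c.
    c_today = 299792.458
    values = [299910.0, 299853.0, 299802.0, 299798.0]

    n = len(values)
    m = mean(values)                    # sample mean
    s = stdev(values)                   # sample standard deviation
    t = (m - c_today) / (s / sqrt(n))   # one-sample t statistic

    print("mean offset: %+.1f km/s" % (m - c_today))
    print("t = %.2f with %d degrees of freedom" % (t, n - 1))

Looking t up in a table of the Student's t-distribution with 3 degrees
of freedom then tells you how confidently the hypothesis of a constant
c at today's value can be rejected on this subset.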


I think that, given these data, it becomes pretty hard to maintain
that the speed of light, c, is a Universal constant, which
fundamentally challenges the Lorentz transform and, by extension,
Einsteinian relativity.


>
>> Saying that space becomes curved by large bodies is the same
>> as saying that a street map becomes curved because the
>> cities and villages that are printed on it are so heavy.
>
> But, in the case of space, this is the actual fact, as
> determined by observation. Therefore all Tesla's polemics
> may safely be ignored.
>

If the historical data analysis by Setterfield and Norman was done
correctly, which is likely since Flinders University's "math
department had checked it and approved it", we are forced by the
observational data to reject the idea of a Universally constant speed
of light.

This, in its turn, invalidates the applicability of the Lorentz
transform in physics, which in its turn invalidates Einsteinian
relativity.

In other words: no, it's not actual fact. It's a matter of which
observational data one chooses to ignore in order to maintain an
a priori conclusion, founded upon the speed of light being a Universal
constant.

Best regards,

Arend.


