[Physics] Do longitudinal FTL "Tesla" waves exist and, if yes, how should they be modelled?

Arend Lammertink lamare at gmail.com
Sat May 2 23:20:08 CEST 2020


On Sat, May 2, 2020 at 9:04 AM Ilja Schmelzer <ilja.schmelzer at gmail.com> wrote:
>
> 2020-05-02 2:19 GMT+06:30, Arend Lammertink <lamare at gmail.com>:
> > On Fri, May 1, 2020 at 5:37 PM Ilja Schmelzer <ilja.schmelzer at gmail.com> wrote:
> > I guess, ultimately, our differences lie mostly in what we trust the
> > most. You seem to primarily trust "hard" data and want to produce
> > equations which produce the right numbers and make as much sense as
> > possible, while I seem to rely much more on my intuition.
>
> Yes. If your intuition is not guided by hard data, you have no chance.

Yep, agree to that.

In that sense, it is rather interesting to consider how one treats David
LaPoint's video, which I like a lot:

https://www.youtube.com/watch?v=siMFfNhn6dk

What I see is that people tend to reject the experiments being shown
because they don't like LaPoint's theory and the claims he makes
alongside his experiments. To me, the experiments themselves are hard
data, because I have no reason to believe he faked anything. I
consider them actual recordings of actual experiments until someone
comes up with a good argument explaining why they are likely to be
fake.

So, IMHO, one is completely free to interpret such hard data any way
one pleases, but one cannot claim it is not hard data, unless one can
come up with a good argument explaining why the recordings are likely
to be fake.

The same goes for more historic data, of which Wheatstone's experiment
is one of the most intriguing ones:

http://www.tuks.nl/wiki/index.php/Main/WheatstoneExperimentsToMeasureTheVelocityOfElectricity

He came up with a speed of 288,000 miles/sec, or about 463,491 km/s,
more than 50% above the currently accepted propagation speed of the
electric field. If one takes the pi/2 factor to be correct, however,
this value is within 2% of the predicted speed, which is quite
remarkable. With the sqrt(3) factor one would come within 11%, also
much better than 50%. I just did these calculations, and to me it is
rather interesting to find that the pi/2 factor matches this
experiment better than Stowe's sqrt(3) term, while the pi/2 term also
seems to match better with other experimental data, such as Dollard's
and Meyl's.
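The arithmetic behind these percentages can be checked in a few lines. This is just a verification of the numbers above (speed of light and mile-to-km conversion are standard values; nothing else is assumed):

```python
import math

c = 299_792.458          # modern speed of light, km/s
mile_km = 1.609344       # km per statute mile

v_wheatstone = 288_000 * mile_km   # Wheatstone's result converted to km/s

def deviation_pct(reference):
    """Percentage deviation of Wheatstone's value from a reference speed."""
    return abs(v_wheatstone - reference) / reference * 100

err_c   = deviation_pct(c)                 # vs. c itself
err_pi2 = deviation_pct(math.pi / 2 * c)   # vs. the pi/2 factor
err_sq3 = deviation_pct(math.sqrt(3) * c)  # vs. Stowe's sqrt(3) factor

print(f"vs c: {err_c:.1f} %, vs (pi/2)c: {err_pi2:.1f} %, vs sqrt(3)c: {err_sq3:.1f} %")
```

This reproduces the figures quoted above: roughly 55% off from c, under 2% off from (pi/2)c, and about 11% off from sqrt(3)c.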

From here, it also becomes interesting to consider Foucault's
experiment; he based it on Wheatstone's revolving-mirror idea to
measure the speed of light:

https://en.wikipedia.org/wiki/Fizeau–Foucault_apparatus
"In addition, unlike the case with Fizeau's experiment (which required
gauging the rotation rate of an adjustable-speed toothed wheel), he
could spin the mirror at a constant, chronometrically determined
speed. Foucault's measurement confirmed le Verrier's estimate. His
1862 figure for the speed of light (298000 km/s) was within 0.6% of
the modern value."

So, to me, Wheatstone's experiment is very significant to consider in
relation to what is being said here:

http://www.tuks.nl/pdf/Reference_Material/Fast_Light/Superluminal%20phase%20and%20group%20velocities-%20A%20tutorial%20on%20Sommerfelds.pdf
"In subsequent work, Sommerfeld and Brillouin [1] showed that the
“front” is accompanied by two kinds of “precursors”, now known as the
“Sommerfeld”, or the “high-frequency”, precursor, and the “Brillouin”,
or the “low-frequency”, precursor. These precursors are weak ringing
waveforms that follow the abrupt onset of the front, but they precede
the gradual onset of the strong main signal."

https://scientists4wiredtech.com/what-are-4g-5g/brillouin-precursors/
"Pulses of radiofrequency microwave (RF/MW) radiation must have
extremely short rise times or very rapid changes in phase in order to
create Brillouin precursors on entering “lossy” materials like soil,
water or living tissue. (Materials that absorb radiation are called
lossy.) Once generated, the new pulses propagate without significant
attenuation."

So, what you see is that Wheatstone's observation seems to match our
theory, and that in order to create these so-called "precursors" one
needs "extremely short rise times", as one can expect from spark gaps
rather than modern solid-state semiconductor switches.

Then there's this article, which talks about using "mercury vapour relays":

http://www.tuks.nl/pdf/Reference_Material/Fast_Light/Pappas%20and%20Obolensky%20-%20Thirty%20six%20nanoseconds%20faster%20than%20light%20-%20Electronics%20and%20Wireless%20World%20-%20Dec%201988.pdf

Turns out the fastest switches one can get are mercury wetted relays:

https://en.wikipedia.org/wiki/Relay#Mercury-wetted_relay
"For high-speed applications, the mercury eliminates contact bounce,
and provides virtually instantaneous circuit closure. [...] The high
speed of switching action of the mercury-wetted relay is a notable
advantage. The mercury globules on each contact coalesce, and the
current rise time through the contacts is generally considered to be a
few picoseconds. However, in a practical circuit it may be limited by
the inductance of the contacts and wiring."

So, I got my hands on one of these a while ago and want to experiment
with it, kind of like repeating what Wheatstone did, in order to see
what I can find out. Is there something there that has been overlooked
by mainstream science?

The way I see it, one can use existing data to arrive at new
conclusions and theoretical insights, to the degree that one becomes
convinced those insights are correct, but one cannot rely on a handful
of historic data points to make one's case to the scientific
community. One also needs to make sure one's theory matches repeatable
experiments, if only to rule out the possibility that there is
something wrong with the data. Experiment must be the final judge, and
if one cannot find conclusive data, one has no choice but to perform
the experiments oneself.

Such things do take quite a lot of time and one has to deal with all
kinds of practical problems, such as how to create a pulse with as
fast a rise time as possible. So far, it seems semiconductors are just
not fast enough to be able to reproduce what Wheatstone saw. With my
experiments with time domain reflections, I did see something like
such a "precursor" (see attached image), but that's just not good
enough for my taste, so I'm curious what will happen when I get to
experimenting with the mercury relay. Much depends on whether or not
the contact bounce is indeed eliminated with mercury relays, because I
need a signal that is repeated exactly the same multiple times in
order to make a complete measurement of the waveform with my scope.


>
> > So far, Einstein's "instinctive attitude" with respect to "the Quantum
> > Theory" has not been proven correct, but like him I have no doubt the
> > day Einstein foresaw will come, one day.
>
> Today, Einstein would probably accept a hidden preferred frame, simply
> because the alternative, to give up realism and causality, would be
> completely unacceptable to him.

Could be. Would he also accept and perhaps even prefer an aether
theory based on fundamental ideas and a correction of Maxwell's
equations?


>
> >> I don't ignore your claim, I openly reject it as plainly wrong.
> >
> > Yep, that much is clear. :)
> >
> > I'm left with the question of "why?"
>
> You have, first, to learn elementary electrodynamics. On the
> elementary school level. Starting with what is observable - the fields
> E and B - and what can be used to measure them.

Are you aware I hold a Master's degree in Electrical Engineering from
the University of Twente?

Actually, the only 10 (a perfect score) I ever achieved at university
was for the course "Electromagnetic Field I", which covered the
"static" electromagnetic field, i.e. Maxwell's equations along with
Coulomb and Biot-Savart.


> Once you have accepted that one can simply measure E and B, then you
> should understand that to distinguish the theory without the dB/dt
> term from the Maxwell theory can be done in a quite elementary way, by
> creating some variable magnetic field (simply a rotating magnet) and
> measuring the E field. And that the result of such measurements was
> quite clear, and in favor of the Maxwell equations.

As I stated before, at high frequencies things become harder to
measure, because then one has to deal with waves, and the analogy that
works well at low frequencies, essentially treating electricity like
water or oil flowing through pipes, no longer holds. Every piece of
wire has resistance, capacitance as well as inductance, and the higher
the frequency, the more these "parasitic" inductances and capacitances
matter and need to be accounted for. And because the same goes for our
measuring equipment, things become complicated.
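To put rough numbers on why the pipe analogy fails: here is a small sketch using common rule-of-thumb values (the ~1 uH/m wire inductance and the 10 pF stray capacitance are my own illustrative assumptions, not measurements):

```python
import math

# Parasitic reactances grow (or shrink) with frequency, which is why the
# water-in-pipes intuition fails at RF. Assumed rule-of-thumb values:
L_WIRE = 1e-6     # H, roughly 1 uH for a meter of wire (assumption)
C_STRAY = 10e-12  # F, ~10 pF of stray capacitance (assumption)

def x_l(f_hz):
    """Inductive reactance X_L = 2*pi*f*L of the wire."""
    return 2 * math.pi * f_hz * L_WIRE

def x_c(f_hz):
    """Capacitive reactance X_C = 1/(2*pi*f*C) of the stray capacitance."""
    return 1 / (2 * math.pi * f_hz * C_STRAY)

for f in (1e3, 1e6, 100e6):
    print(f"{f:>11.0f} Hz: X_L = {x_l(f):10.3f} ohm, X_C = {x_c(f):14.1f} ohm")
```

At 1 kHz the wire's inductive reactance is milliohms and the stray capacitance is essentially invisible; at 100 MHz both are in the hundreds of ohms and dominate the circuit.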

So, perhaps one of the most important things one learns is that there
is no such thing as a "simple measurement" in Electrical Engineering.
Sure, there are accurate meters for all kinds of parameters these
days, but they all have their limits and one has to be aware of those
limits in order to use them properly.

Measuring the B-field is not as simple as it seems. Take a look here,
for example: https://en.wikipedia.org/wiki/Magnetometer

Measuring the electric potential is also not as simple as it may seem:
https://en.wikipedia.org/wiki/Electrometer

Further along, one also learns that, at higher frequencies, the energy
actually flows in the space outside our wires, and only part of it
penetrates a small distance into the wires, the so-called skin effect.
So, in actual fact, at higher frequencies most of the "current" flows
where there are no electrons to carry it.
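How small that penetration distance is can be computed from the standard skin-depth formula delta = sqrt(rho / (pi * f * mu)); a quick sketch for copper (material constants are standard values):

```python
import math

rho_cu = 1.68e-8            # resistivity of copper, ohm*m
mu0 = 4 * math.pi * 1e-7    # vacuum permeability, H/m

def skin_depth(f_hz, rho=rho_cu, mu=mu0):
    """Skin depth delta = sqrt(rho / (pi * f * mu)) in meters."""
    return math.sqrt(rho / (math.pi * f_hz * mu))

for f in (50, 1e3, 1e6, 1e9):
    print(f"{f:>12.0f} Hz -> skin depth {skin_depth(f) * 1e6:10.1f} um")
```

At 50 Hz the current still fills a wire of ordinary thickness (~9 mm depth), but at 1 GHz it is confined to about 2 micrometers of the surface.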

However, at the end of the day, there is no argument that Maxwell's
equations predict the behavior of the fields very well insofar as
applicable(!!). In the design of microwave antennas and transmission
lines, for example, simulators are used which essentially compute
Maxwell's equations over a grid, and the predictions match practice
very well, so one is tempted to conclude there cannot be anything
wrong with them.

However, note the "insofar as applicable", which is an important detail.

As I explained before, IMHO, there are two fundamentally different
types of electricity:

1) The type we are familiar with, which fundamentally involves
circulation and closed loops. This corresponds to the two-wire
transmission line principle as well as the "transverse" wave and is
well predicted by Maxwell's equations. There's no arguing against this
part, as we both agree;

2) The type which is virtually unknown, which fundamentally does not
involve circulation and forms an open loop, typically terminated with
a spherical capacitance at both ends of the transmission line. Here
the current vibrates back and forth rather than circulating in a
closed loop, and the system requires resonance in order to work, which
already makes it much more complicated than the familiar closed-loop
form of electricity. This one corresponds to Tesla's single-wire
transmission line principle as well as the "longitudinal" wave, which
is NOT predicted by Maxwell's equations.

So, when you look at Maxwell's equations as defined by potentials and
compare them with the general case as defined by Laplace / Helmholtz,
which can be applied in the fluid dynamics domain, it is exactly that
term dB/dt which stands out.

So, why is it there?  Well, that's Faraday's law of induction.

So, how did he discover this? What did he measure? What does that mean?

https://en.wikipedia.org/wiki/Faraday%27s_law_of_induction#Maxwell–Faraday_equation
"In Faraday's first experimental demonstration of electromagnetic
induction (August 29, 1831),[7] he wrapped two wires around opposite
sides of an iron ring (torus) (an arrangement similar to a modern
toroidal transformer). Based on his assessment of recently discovered
properties of electromagnets, he expected that when current started to
flow in one wire, a sort of wave would travel through the ring and
cause some electrical effect on the opposite side. He plugged one wire
into a galvanometer, and watched it as he connected the other wire to
a battery. Indeed, he saw a transient current (which he called a "wave
of electricity") when he connected the wire to the battery, and
another when he disconnected it."

When you look at the picture, it's not hard to see that this
measurement involves the use of electricity in the shape of a closed
loop. The transformer is fed by a battery along a closed loop and the
measuring device is also a closed loop device.

So, what does this experiment really tell us?

It tells us that at low frequencies, a changing magnetic field induces
a current in a closed loop electric circuit.

And yes, this relationship continues to hold at higher frequencies as
well, as long as one adheres to the fundamental closed loop /
circulation principle.
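As a minimal numerical illustration of this closed-loop relationship (my own made-up coil numbers, not Faraday's): the EMF induced in an N-turn loop by a sinusoidally varying flux, emf = -N * dPhi/dt:

```python
import math

# Faraday's law for a closed loop: emf = -N * dPhi/dt.
# Illustrative, assumed numbers: a 100-turn coil of area A in a field
# B(t) = B0 * sin(2*pi*f*t) perpendicular to the loop.
N, B0, f, A = 100, 0.1, 50.0, 1e-3   # turns, tesla, hertz, m^2

def emf(t):
    # Phi(t) = A * B0 * sin(2*pi*f*t), so dPhi/dt = A*B0*2*pi*f*cos(2*pi*f*t)
    return -N * A * B0 * 2 * math.pi * f * math.cos(2 * math.pi * f * t)

peak = N * A * B0 * 2 * math.pi * f   # peak induced EMF, volts
print(f"peak induced EMF: {peak:.3f} V")
```

Note that the whole calculation presupposes a closed loop enclosing the flux, which is exactly the point being made here.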


So, the question is: is this really the only possible relationship
between the electric field and the magnetic field?

When you remove the term dB/dt from Maxwell's equations, you are left
with nothing but fluid dynamics, and one obtains a more fundamental
relationship between the [E] and [B] fields that includes a transverse
surface wave as predicted by Maxwell, but also a longitudinal "sound"
wave as well as vortices. So, it does not necessarily break the
relation as currently predicted by Maxwell, but it makes room for
other possibilities that potentially provide a better match with
observations, such as predicting the propagation speeds of the various
wave phenomena that would be possible if the aether really behaves
like a fluid, and predicting a quantized far field rather than only
one continuous transverse wave, which in fact does not match
observations.

To me, taking all of this together, this makes sense, especially
because then the math aligns with the *fundamental* theory of
calculus, "symmetry" returns, we obtain an explanation for quite a lot
of sources which report "faster than light" phenomena in one way or
another, and on top of that we obtain an actual explanation for what
the "near" field and the "far" field are.

>
> > It really beats me how one could possibly reject this, since for me it
> > speaks for itself. Thus, the human mind remains to be mysterious.
>
> Physicists simply rely on the facts which can be easily measured.

In measurements of electromagnetic phenomena, the devil is in the
details. They are never "easy": lots of measurement instruments can be
used with ease, but always within a specific limit of applicability,
and one always has to understand the measuring principle of the device
in order to make sure one actually measures what one thinks one
measures.

In Dutch there is a saying "meten is weten", "measuring is knowing"
and I always add: "provided one knows what one is measuring"...

>
> >> My proposal for cooperation is not about who is right and who is
> >> wrong. Of course, you will continue to think you are right, and I will
> >> continue to think I'm right. Never seen something different.
> >
> > Yep. Remarkable that two rational thinking people cannot come to an
> > agreement about what should be considered to be an absolute truth.
> > Apparently, even math does not have enough power to bridge the gap,
> > which is very unfortunate. I guess we have no other option but to
> > agree to disagree.
>
> We have, you should simply learn elementary electrodynamics from the
> start, not based on your intuitive ether ideas. There are simple hard
> facts, the E and B fields which you can measure. And if you (different
> from me) don't believe the mainstream experimenters of the last
> centuries, ok, then measure yourself what you can measure with your
> $1000.
>
> The point being that the E and B fields are well-defined things one
> can measure in the real world.

Yes, in principle.

> And that such measurements can show you
> that they follow the Maxwell equations - in the form without any use
> of potentials.

Within the limits of applicability, that is.

> That means, you should understand that the Maxwell
> equations are about hard facts about real things, and that you have
> simply no freedom to change them in your theoretical speculations.

Well, as said, the devil is in the details.

They are about real things, that's a fact. Problem is that measurement
data is not necessarily as hard as one may think it is, especially
within the electrodynamic domain and even more especially when one
deals with high frequencies. The higher the frequency, the more vague
things become.

When you talk about antennas, for instance, you are talking about a
whole structure around which a complex multitude of waves can
propagate, and all you have to measure the thing with is a coaxial
feedline with an impedance of 50 Ohms. Yes, with a VNA one can measure
the behavior of the thing from that feedpoint quite accurately, but it
remains a coaxial feedline that itself operates along the double-wire
transmission line principle and is therefore by itself unsuitable for
measuring any kind of signal that would require a single-wire
transmission line principle. And the signals on your feedline do not
give you any information about how the actual waves around your
antenna behave, which is why theory and simulators are also needed in
order to understand what is going on.

>
> > Yep, I see that, too. But at the end of the day, I see no other option
> > but to conclude we are, indeed, divided.
> >
> > And the problem is, we have a fundamental difference of opinion and at
> > this moment I don't see it happen that we can bridge the gap, even
> > though from my point of view that would be entirely possible.
>
> Science does not care, finally, about opinions. First of all, it cares
> about hard observable facts. (Einstein has, BTW, never questioned the
> facts of QT.)

In the end, science is about making quantifiable models with which one
can understand and predict certain phenomena. The more experimental
data is found that matches the model, the more confident one becomes
that the model is correct, until the model itself eventually comes to
be considered fact. Exactly what Einstein warned about:

"Concepts that have proven useful in ordering things easily achieve
such authority over us that we forget their earthly origins and accept
them as unalterable givens. Thus they might come to be stamped as
"necessities of thought," "a priori givens," etc. The path of
scientific progress is often made impassable for a long time by such
errors. Therefore it is by no means an idle game if we become
practiced in analysing long-held commonplace concepts and showing the
circumstances on which their justification and usefulness depend, and
how they have grown up, individually, out of the givens of experience.
Thus their excessive authority will be broken. They will be removed if
they cannot be properly legitimated, corrected if their correlation
with given things be far too superfluous, or replaced if a new system
can be established that we prefer for whatever reason." Obituary for
physicist and philosopher Ernst Mach (Nachruf auf Ernst Mach),
Physikalische Zeitschrift 17 (1916), p. 101

Yes, it is a fact that Maxwell's equations predict the results of what
we are currently able to measure within the electromagnetic domain
well, even very well.

But it is also a fact that other possibilities are conceivable and
that a number of anomalies exist whereby it is observed that even
Maxwell's equations have their limits.

As Einstein suggested, the way forward begins by "showing the
circumstances on which their justification and usefulness depend".  In
Maxwell's case, that is the principle of the circulating current and
the two-wire transmission line. As long as one remains within that
fundamental view on electromagnetics, it matches and works well.

But Tesla's one-wire transmission line system with the principle of a
non-circulating vibrating current is not covered by the current
Maxwell equations and therefore Maxwell's equations are not very
useful for the analysis of systems and phenomena that depend on this
principle.


>
> >> But I see myself in a quite good position to gain a particular
> >> victory, say, taking a town near my border given that I have gained
> >> the control of the mountain near that town. But alone I cannot take
> >> it.
>
> > Problem is I cannot defend a theory which foundation is incompatible
> > with my ideas on a fundamental level, even though I have no reason to
> > doubt your theory is a lot better than what the main stream has to
> > offer.
>
> You don't have to defend it as true. Defend it as better in some
> aspects. And force them to defend their own much weaker theories.

As said, I don't speak with many people, but I truly believe you could
accomplish a lot more if you gave what I'm trying to share with you a
chance and serious consideration.

>
> > I take your word for it that it's much simpler and
> > understandable, but as long as it's fundamentally built upon "gauge
> > freedom", I cannot defend it. Sorry.
>
> My approach here is quite different from the mainstream approach too.
> The mainstream rejects the gauge freedom as unphysical and invents
> complicate constructions to get rid of them (Faddejev-Popov ghosts).

I would obviously agree that gauge freedom is unphysical, but I would
also argue that in order to get rid of it one needs a better
understanding of the electromagnetic domain, which can be accomplished
by correcting Maxwell's equations along the principles I explained.

>
> My approach is much simpler. I start with the Maxwell equations.

Yep, everything starts at Maxwell.

> Then
> I accept that the potentials are the things which describe reality,
> even if I can measure only E and B but not A and Phi.

Agree so far, but bear in mind that measuring E and B is a lot more
complicated than you may think, especially at high frequencies, like
when you want to study wave behavior around an antenna, which is what
you would want if you would really want to understand the dynamics of
the electromagnetic fields.

All one can really do is put some kind of probe (a "capacitor" or a
"coil") somewhere around your antenna and hook it up to a measuring
device, usually by means of a coaxial transmission line. That gives
you the magnitude of some signal, but that's about it, and a scalar
value is not a vector, so in essence all you can measure is a
projection of your 3D wave phenomenon onto a scalar value as a
function of time. Your probe also has parasitic capacitance and/or
inductance, so things become complicated rather quickly, and there's a
lot more room for error and alternative interpretation than you seem
to think. Phi denotes the electric potential and can be measured with
a capacitive probe, and the (scalar) field strength of your wave (some
combination of E and B) can be measured with a probe antenna, but
that's about it. So, in essence, you always look through a 1D peephole,
and that's all you have to figure out what's actually going on in 3D.

> But I can make
> a reasonable guess about their equations, and this reasonable guess is
> they all move with the same c, which is the Lorenz gauge.

All right, now let's consider carefully what we are dealing with. What
we have is a bunch of differential equations and one wave equation. In
differential equations, you work with distances that are taken to the
limit of zero. That is why the propagation speed of the fields
themselves can be assumed to be infinite, or any value one would like:
the propagation speed of the phenomenon you are describing follows
from the solution of the differential equation. In other words: the
propagation speed of the wave is an output of the differential
equation, not an input.
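To make that point concrete, here is a toy sketch of my own (no physics claim beyond the textbook 1D wave equation): the speed c is written into u_tt = c^2 u_xx, the equation is integrated numerically, and the pulse speed is then *measured* from the solution rather than put in by hand:

```python
import numpy as np

# The wave speed is an output of the differential equation: integrate
# u_tt = c^2 u_xx with a leapfrog scheme and measure how fast the pulse moves.
c = 2.0                      # speed appearing in the equation (arbitrary units)
nx, dx = 4000, 0.01
dt = 0.4 * dx / c            # CFL-stable time step
x = np.arange(nx) * dx

f = lambda s: np.exp(-((s - 5.0) / 0.2) ** 2)   # Gaussian pulse centered at x=5
u_prev = f(x + c * dt)       # pulse one step in the past -> right-moving wave
u = f(x)
r2 = (c * dt / dx) ** 2

steps = 4000
for _ in range(steps):
    u_next = 2 * u - u_prev + r2 * (np.roll(u, -1) - 2 * u + np.roll(u, 1))
    u_prev, u = u, u_next

# Measured speed = distance travelled by the peak / elapsed time
measured = (x[np.argmax(u)] - 5.0) / (steps * dt)
print(f"speed in the equation: {c}, speed measured from the solution: {measured:.3f}")
```

The measured speed agrees with the c written into the equation, which is precisely the sense in which the propagation speed is a solution property, not an input parameter.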

Since we only have one wave equation which describes an otherwise
unspecified "transverse" wave, we obviously do not have enough wave
equations to be able to predict the propagation speed of the "static"
fields [E] and [B].

So, yes, one can take a guess and guess it's all c, which is indeed
the Lorenz gauge.

BUT that does not change the fact that we do not have the wave
equations we would need to calculate the actual propagation speed, so
that we would not have to guess.

So, when you remove that dB/dt term from the equations and use the
Laplace / Helmholtz versions instead within the fluid dynamics domain,
from the single hypothesis that the aether behaves like a fluid and
that we know its characteristics, you are looking at something
completely different.

Now we can derive a "sound" wave equation and equate that to what we
measure as the electric field, which would then propagate faster than
c, so we won't have to guess anymore.
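For reference, the speed of such a longitudinal "sound" wave in any fluid follows from the medium's properties as c_s = sqrt(K / rho). The sketch below uses real values for water; whether an aether analogue exists, and what its K and rho would be, is exactly what is in dispute here:

```python
import math

# Longitudinal wave speed in a fluid: c_s = sqrt(K / rho),
# with K the bulk modulus and rho the density.
K_WATER = 2.2e9       # Pa, bulk modulus of water (standard value)
RHO_WATER = 1000.0    # kg/m^3, density of water
c_sound = math.sqrt(K_WATER / RHO_WATER)
print(f"speed of sound in water: {c_sound:.0f} m/s")
```

The point is only that once the medium's two constants are fixed, the longitudinal propagation speed is determined by the equations, not guessed.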

We can also derive a wave equation for a transverse surface wave,
which would propagate along our antennas and describe the "near"
field, and which would propagate at c.

And we can probably derive a wave equation for waves consisting of
expanding vortex rings, whereby the rotation direction alternates
between successive rings. This would result in a "wave" that shares
aspects with both waves and particles, because of the vortex, and
would look somewhat similar to water waves expanding in circles,
except that the rotation direction of the "particles" would alternate
from ring to ring.

Want to guess what propagation speed would be predicted by a wave
equation for such a wave?


> No
> fundamental role for gauge symmetry.  The Lorenz gauge is simply a
> nice guess which gives nice and simple equations for A and Phi which
> are compatible with the Maxwell equations.

Let me see if I get this straight:

If you wanted to connect your model to what I'm suggesting, all you'd
really need to do would be to exchange the Lorenz gauge for the
correct equations for [A] and Phi?


>
> > I'm afraid I can't do much. I'm mostly at home and don't speak many
> > people, especially not scientists. And I really wouldn't know what to
> > say, because of our fundamental difference in point of view.
>
> You can simply ask "what's wrong with that theory" questions.
>
> I do similar things myself. Say, I have not liked many aspects of the
> Bohmian interpretation of QT. But I support it whenever there is a
> discussion between Bohm theory defenders and Copenhagen or many world
> defenders.

So far, I haven't engaged much in discussions, but it seems a good
strategy to do so more often in order to get to know people and
exchange ideas, like we're doing now.

>
> If we cooperate, you can tell me which parts of my pages you can
> understand and which not, I could try to improve them. Then, if you
> have, as a result of this, some pages which you understand well
> enough, then you can try to some scientists in forums or so and ask
> them those "what's wrong" questions pointing to these pages.
>

I've taken a quick look at some of your papers and started reading this one:

https://arxiv.org/abs/0908.0591

Honestly, this is way above my head.


>
> >> I take only the unproblematic parts from gauge fields, not the
> >> ideology that this symmetry is somehow fundamental or similar
> >> stupidity.
> >>
> >> The unproblematic part is that what we can measure - E and B - is
> >> clearly not all, because of Aharonov-Bohm we need the potentials A
> >> Phi.  The rest follows from the Maxwell equations (which are fine).
> >
> > Will have to agree to disagree on this one, too. Can't understand why
> > you apparantly can't follow my arguments, so the gap cannot be
> > bridged. It is what it is.
>
> I have not yet given up, you acknowledge a lot of different points, so
> the discussion seems not hopeless.

Copy that, you acknowledge some things too, so perhaps there's still a
way out. Would be wonderful.

>
> So, let's clarify where exactly you disagree. The first central point
> is here that E and B are some really existing fields which can be
> measured with real well-defined measurement devices.

I agree with the principle, but in practice measuring E and B is a lot
more difficult and limited than in theory.

>
> >> Of course, it does not solve your problem, and has no intention to do
> >> this. Your interest would be a different one: This is a playing field
> >> where alternative physics can win against the mainstream.  All what is
> >> necessary for this victory is already reached.  It remains the
> >> sociological problem of fighting the wall of ignorance.
> >>
> >> If this attack against the mainstream will be successful, you will be
> >> in a much better position to help with your ideas the US to regain Air
> >> superiority.
> >
> > “The supreme art of war is to subdue the enemy without fighting.”
> > ― Sun Tzu, The Art of War
>
> Which is what I have tried many years now. But without fighting, the
> method chosen by the mainstream - complete ignorance - will win
> without a fight.

Well, if we could cooperate, things may change for the better.

>
> >> So you think you can exclude the possibility that human observers are
> >> sometimes unable to distinguish by observation states which are really
> >> different?  I see no base for such claims of human ability to observe
> >> everything.
> >
> > I just don't see the point in consciously creating fields that are by
> > definition unobservable because they cannot have any physical effect
> > according to elemental vector analysis.
>
> The fields A and Phi obviously have observable consequence, they
> define the E and B completely, and those are observable directly.

In theory, yes. In practice, this is a lot more problematic.

> So,
> the only question is if they are better or worse than using E and B
> directly. There are two strong arguments in favor of the potentials,
> namely the Aharonov-Bohm effect as well as the straightforward,
> simplest way to introduce a charge into the Dirac equation.

In principle, they contain the same information, so it doesn't make
any difference which ones one prefers.

However, the way it's done in Maxwell is problematic. Not only do we
have that extra dB/dt term, we also have units of measurement that do
not match one another, which is problematic because when one wishes to
adhere to Laplace / Helmholtz, they need to have the same units of
measurement.


>
> But, I suggest you to think first about the E and B fields as things
> which can be explicitly measured, and that this allows you to measure
> them in situations which allow you to test if and how the dB/dt term
> changes the E field.

In practice, it is problematic to actually measure the fields,
especially at high frequencies. See above.

You can measure a scalar impression of the fields at a number of
"points", yes, and one can relate those to the theory; as long as one
stays within the "transverse" world, those impressions match what the
theory predicts. The only way I know of to get 3D plots of the fields
themselves is by using simulators, which do work really well. I did
quite a lot of simulations with CST Microwave Studio, and so far the
simulator has correctly predicted the radiation patterns of the
antenna designs I actually built.

To illustrate some of the difficulties one encounters if one wishes to
hunt for an FTL signal, let's take a look at time domain
reflectometry. I built one of these:

https://www.epanorama.net/circuits/tdr.html

Want to do this with a 50 Ohm coax cable? No problem. Works as
expected. Here are some pictures of how nice that works:

https://www.allaboutcircuits.com/projects/build-your-own-time-domain-reflectometer/

Want to try and measure a signal along a single wire? Not so much.
There's no return path and therefore no defined impedance.
The damn thing rings and the signals are a lot more difficult to
interpret, as you can see on the attached picture.

In other words: it is quite challenging to measure these kinds of things.
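The textbook part of TDR on a proper two-wire line is simple enough, which is exactly why the single-wire case stands out. A sketch of the basics (the 0.66 velocity factor is a typical value for solid-PE coax, assumed here for illustration):

```python
# Time-domain reflectometry on a 50-ohm coax: reflection coefficient at a
# termination, and distance to a discontinuity from the echo delay.
C_LIGHT = 299_792_458.0      # m/s, speed of light in vacuum
VF = 0.66                    # assumed velocity factor of the cable

def gamma(z_load, z0=50.0):
    """Voltage reflection coefficient at a termination."""
    return (z_load - z0) / (z_load + z0)

def echo_distance(round_trip_s, vf=VF):
    """One-way distance to a discontinuity from the round-trip echo delay."""
    return vf * C_LIGHT * round_trip_s / 2

g_open  = gamma(1e9)    # effectively open end: full positive reflection
g_short = gamma(0.0)    # short circuit: full inverted reflection
g_match = gamma(50.0)   # matched load: no reflection at all
d = echo_distance(100e-9)
print(g_open, g_short, g_match, round(d, 2))
```

All of this presupposes a defined characteristic impedance Z0, which is precisely what a single wire without a return path lacks; that is why the reflections there ring and resist this kind of clean interpretation.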

>
> After this, you can either do such experiments yourself. I restrict
> myself to knowing that this can be done in principle, and with
> sufficiently simple tools.

A VNA and a digital scope are quite complex tools. You have to know
what you're doing and what it is that you're actually measuring.

> Take a rotating magnet, a wire nearby, and
> measure the voltage at the ends of the wire.  The question is if the
> rotation of the magnet somehow influences this voltage and how. As
> well how this voltage depends on the direction of the wire relative to
> the rotating magnet. And then think about the question if all this is
> correctly described by the Maxwell equations for E and B or not.

That's actually a good question, which does not have a simple answer.

First of all, a fundamental problem with respect to the alternative,
the single-wire transmission line principle, is that your voltmeter is
not an ideal voltmeter: it has an internal resistance and therefore
closes a loop.

So, what are you _really_ measuring?

You are measuring a phenomenon we call "current", which we express in
a unit of measurement called Amperes, or Coulombs per second, and
which is usually considered to consist of a number of electrons moving
through the wire. So it seems simple: not much more than counting the
electrons going by. However, a simple voltmeter is actually a
galvanometer (https://en.wikipedia.org/wiki/Galvanometer ), a
micro-amperemeter, in series with a resistor.

So, what you have is a changing magnetic field interacting with a
closed-loop circuit, whereby you read a "voltage" on the display of
what is actually an amperemeter, or better: a micro-amperemeter.

What you actually establish this way is that a changing magnetic field
induces a changing current in a closed loop, whereby a voltage is
developed over the series resistor, which is not actually measured.

So, if we assume the resistance of the wires to be negligible, the
voltage is developed over the resistor, and the actual voltage
developed depends on the value of the series resistor.

Take a voltmeter with a different internal resistance and your mileage
may vary....
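Just how much the mileage varies can be sketched with a plain voltage divider (the 1 V EMF and 10 kOhm source resistance are made-up illustration values, not a real measurement):

```python
# A real voltmeter closes the loop, so its reading depends on its own
# internal resistance. Assumed numbers: 1 V EMF behind 10 kOhm.
def reading(emf, r_source, r_meter):
    """Voltage actually developed across the meter: a voltage divider."""
    return emf * r_meter / (r_source + r_meter)

EMF, R_SOURCE = 1.0, 10_000.0
readings = {r: reading(EMF, R_SOURCE, r) for r in (1e3, 10e3, 1e6, 10e6)}
for r, v in readings.items():
    print(f"{r:>10.0f} ohm meter reads {v:.4f} V of the 1 V source")
```

A cheap 1 kOhm-input meter reads under a tenth of the source EMF, while a 10 MOhm digital meter reads it almost exactly: same circuit, very different "measured voltage".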

>
> This intentionally avoids all references to your ether model or any
> fluid dynamics. That's important, given that you should understand
> that E and B are well-known to exist and their equations are
> well-established without even thinking about such models.
>

Yep, they are well-known to exist and insofar as the "closed loop"
principle can be applied, the equations describe the observations very
well. BUT the devil is in the details and even something as simple as
measuring a voltage is not as simple and straightforward as it seems.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: WhatsApp Image 2019-12-04 at 4.00.27 PM.jpeg
Type: image/jpeg
Size: 112500 bytes
Desc: not available
URL: <http://mail.tuks.nl/pipermail/physics/attachments/20200502/1623fd9f/attachment.jpeg>

