Nothing is impossible ...
... only mathematically Improbable
last updated 05/01/2003
Now for my own views of Engineering versus Science
... and I have a BS and MS in chemical engineering, minored in nuclear
engineering and numerical analysis, and worked 20 years for the 2 biggest
chemical companies in the U.S., both in production facilities and in their
corporate "think tanks".
Behind every machine is a theory that proposes a Mathematical Model
in terms that our intellect and senses can comprehend. It is important to
realize that completely different theories can predict the same results
within certain boundaries or under given conditions. That doesn't help the
"purists" decide which one is the 'correct' one in the absolute sense, and
more "tie-breaking" experiments will have to be designed.
In the meantime the engineer couldn't care less about these math
models. As long as they yield accurate results in his simulations, it is
"overkill" to a nuts-and-bolts type to worry about which one is the absolute 'right-est'.
Now if one is to design machines that will operate outside of the regimes
for which these models were developed, then that's another story. Otherwise it is
perfectly acceptable to use simple curve-fitting regression models when
it is too difficult to experimentally obtain parameters to plug into these
theoretical models. It works GREAT and saves a heck of a lot of time!
I actually started out as a chemistry major before switching. And my
grades suffered in undergraduate school because I was always trying to
understand the fundamental processes behind heat/mass/momentum transport
and kinetics, until I finally realized that this understanding -- though
noble and "pure" -- was absolutely unnecessary and even counter-
productive to 'engineering'.
I'll give you some examples. The most complicated heat exchanger
designs use convective transport models, which themselves are based on
simple laminar flow diagrams. We know this isn't what is taking place.
There are more precise theories ("surface renewal" heat transfer,
for instance). But a crude mathematical model is made and then subjected
to thousands of experiments that produce constants for these semi-
empirical equations that are valid for certain sizes, shapes, flow regimes,
temperature/pressure, and all those parameters like the Reynolds number.
They enable engineers to accurately design commercial-sized units right-
on-the-button with the least possible expense.
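The classic example is the Dittus-Boelter correlation for turbulent flow inside smooth tubes -- its constants came from exactly this kind of mass experimentation, not first principles. A sketch, using rough textbook-style property values (my numbers here are for illustration only):

```python
# Dittus-Boelter correlation for turbulent flow inside smooth tubes:
#   Nu = 0.023 * Re^0.8 * Pr^0.4   (fluid being heated)
# The constants 0.023, 0.8, and 0.4 were fitted to test data -- a
# "crude model + thousands of experiments" result.

def nusselt(re, pr):
    """Dittus-Boelter Nusselt number (heating, turbulent flow)."""
    return 0.023 * re ** 0.8 * pr ** 0.4

# Illustrative values, roughly water in a 25-mm tube:
re = 1.0e4          # Reynolds number (turbulent)
pr = 7.0            # Prandtl number
k = 0.6             # thermal conductivity, W/(m*K)
d = 0.025           # tube inside diameter, m

nu = nusselt(re, pr)
h = nu * k / d      # convective heat-transfer coefficient, W/(m^2*K)
print(f"Nu = {nu:.1f}, h = {h:.0f} W/m^2-K")
```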
The chairman of my chemical engineering department was world-
renowned for his catalytic fluidized-bed expertise. Once an engineer from
industry gave a talk about a particular reactor's performance. Dr. Wen was
interested to know how well it fit his model. But when asked what were
the sizes of the gas bubbles as they percolated through the "bed", the
engineer said they don't take such measurements. Dr. Wen appeared
dumbfounded. (Maybe that company used simple regression techniques
based upon hundreds of lab tests.) The elegant models one sees in
academia are not always used or required in the real world. Now one
could make a case that a more rigorous model would maximize profit by
less energy consumption, greater yields, etc. Maybe yes, maybe no. But
that would be belaboring the point here.
My nuclear engineering courses emphasized 'engineering' and not
physics. I was taught that there were more precise models that described a
nuclear reaction. But engineers consistently used a "diffusion" model
which they knew did not represent the kinetics as well as other theories.
BUT this model lent itself to experimentation very well; it was almost
impossible to determine numerical values of the parameters in these other
more accurate models. Hundreds of fictional diffusion regimes were
created to model a reactor and -- as in the case of heat exchanger
design -- constants and coefficients in these diffusion equations could be
easily determined. And any size reactor could be designed at optimum
cost. No need to know what is REALLY going on inside. From the
standpoint of engineering and the utility power companies -- who cares?!
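For the curious, the flavor of the calculation is simple. In one-group diffusion theory, a bare slab reactor goes critical when its geometric buckling matches the material buckling; the cross-sections below are invented round numbers, not any real reactor's:

```python
import math

# One-group diffusion sketch: a bare critical slab satisfies
#   (pi / a)^2 = B^2 = (nu*Sigma_f - Sigma_a) / D
# so the critical thickness is a = pi / sqrt(B^2).
# The nuclear constants below are illustrative round numbers only.

D = 1.0             # diffusion coefficient, cm
nu_sigma_f = 0.010  # nu * fission cross-section, 1/cm
sigma_a = 0.006     # absorption cross-section, 1/cm

b2 = (nu_sigma_f - sigma_a) / D        # material buckling, 1/cm^2
a_crit = math.pi / math.sqrt(b2)       # critical slab thickness, cm
print(f"critical thickness ~ {a_crit:.1f} cm")
```

The real work -- then as now -- was in measuring constants like D and the cross-sections, which is exactly why the diffusion model won: it lent itself to experiment.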
In the glory days of Heathkit and electronic projects, I did my share of
prototyping and even published some of my designs (in the old Popular
Electronics and Radio-Electronics magazines). One project involved a
new EXAR-2240 chip. (I had the civilian and not the military version.) I
used a solderless breadboard to test my design. And I employed the usual
test instruments like a VOM and oscilloscope. I ran into something weird
where I could not get the chip to "re-trigger" when the scope probe was
removed. It was driving me crazy. As a last resort (after a couple of
days), I guessed the probe's input impedance was about 1-Meg and
substituted a same value resistor for it. The circuit then performed as
expected. If the resistance was even 820K or 1.2-M, it failed to work
under a 9-volt supply. It was just a 1-M resistor going to (plus) and
another one going to (ground) off the pin-2 input. Shouldn't have mattered
in the slightest. It almost seemed like a microamp of current was being
wasted. After some exhaustive cussing subsided, I wrote Forrest Mims --
who authored all those Radio Shack IC books -- and he couldn't figure it
out either. We ended up calling the configuration the "stealthskater
trigger". But in "proper" engineering mentality, I couldn't care less as long
as it worked.
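For what it's worth, a back-of-envelope calculation shows why a ~1-Meg probe is not "invisible" to a pair of 1-M resistors (this is just the divider arithmetic -- it never did explain the re-trigger failure):

```python
# Pin 2 sat between a 1-M resistor to V+ and a 1-M resistor to ground.
# A scope probe with ~1-M input impedance in parallel with the lower
# leg shifts the pin voltage noticeably.  (A back-of-envelope check,
# not a claim about the actual failure mechanism -- which stumped
# everyone, Forrest Mims included.)

def divider(v_supply, r_top, r_bottom):
    """Voltage at the junction of a two-resistor divider."""
    return v_supply * r_bottom / (r_top + r_bottom)

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

V = 9.0       # supply voltage, volts
R = 1.0e6     # each divider resistor, ohms

v_no_probe = divider(V, R, R)                  # probe removed
v_with_probe = divider(V, R, parallel(R, R))   # ~1-M probe attached
print(f"without probe: {v_no_probe:.2f} V, with probe: {v_with_probe:.2f} V")
```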
Once as a process chemical engineer, I was charged with finding out
what was causing color variations in a "catalyst" that was spray-painted
onto walls of portable "self-cleaning" ovens. When I asked the PhD
chemist at corporate headquarters, he said they really didn't know if the
materials acted as a true catalyst (and oxidized the grease) or as a "sponge"
with a tremendous amount of internal surface area. After so many years,
the oven walls would not oxidize (or absorb/adsorb) any more grease and
the customer would have to buy a new oven. Again -- from an engineering
standpoint -- nobody cared what actually was going on. It would be
theoretical overkill and would probably not contribute a penny of profit.
And with the other large corporation, I was modifying a very complex
business model of one of their petrochemical plants in Texas. It started
with the usual kinetic/thermodynamic models and then the economic stuff
was superimposed on top of that. They retained MIT and Case Western
researchers to formulate a customized non-linear global optimization
routine. Anyway, the R&D labs had just produced a new batch of yield
data from a new catalyst formulation in the "cracking" reactors. My job
was to produce a model that could be used in the computer programs.
Although I had taken the standard homogeneous fixed-bed and
heterogeneous fluidized-bed courses in school -- all producing
mathematical models that would tax any PhD on the planet -- we ended up
dividing the reactor into fictitious zones (just like in nuclear reactor
design) and using "cubic splines" to regression-fit the data. More than
sufficient for our purposes. And it worked like a charm. Didn't need any
of those graduate-level courses. But I'm sure this approach would appall
any theoretical chemist.
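To show the flavor of that zones-plus-splines trick (the temperatures and yields below are invented, not the real Texas plant data), a natural cubic spline through a handful of zone points does the job:

```python
# A sketch of the "fictitious zones + cubic splines" trick: represent
# yield versus temperature with a natural cubic spline through a few
# zone points, instead of a rigorous kinetic model.
# The yield data below is invented for illustration.

def spline_second_derivs(xs, ys):
    """Second derivatives for a natural cubic spline (tridiagonal solve)."""
    n = len(xs)
    y2 = [0.0] * n
    u = [0.0] * n
    for i in range(1, n - 1):
        sig = (xs[i] - xs[i - 1]) / (xs[i + 1] - xs[i - 1])
        p = sig * y2[i - 1] + 2.0
        y2[i] = (sig - 1.0) / p
        u[i] = ((ys[i + 1] - ys[i]) / (xs[i + 1] - xs[i])
                - (ys[i] - ys[i - 1]) / (xs[i] - xs[i - 1]))
        u[i] = (6.0 * u[i] / (xs[i + 1] - xs[i - 1]) - sig * u[i - 1]) / p
    for i in range(n - 2, -1, -1):
        y2[i] = y2[i] * y2[i + 1] + u[i]
    return y2

def spline_eval(xs, ys, y2, x):
    """Evaluate the spline at x (xs must be sorted ascending)."""
    k = max(j for j in range(len(xs) - 1) if xs[j] <= x)
    h = xs[k + 1] - xs[k]
    a = (xs[k + 1] - x) / h
    b = (x - xs[k]) / h
    return (a * ys[k] + b * ys[k + 1]
            + ((a ** 3 - a) * y2[k] + (b ** 3 - b) * y2[k + 1]) * h * h / 6.0)

# Hypothetical zone temperatures (deg C) and catalyst yields (%):
temps = [300.0, 350.0, 400.0, 450.0, 500.0]
yields = [12.0, 31.0, 55.0, 63.0, 58.0]

y2 = spline_second_derivs(temps, yields)
print(f"predicted yield at 425 C: {spline_eval(temps, yields, y2, 425.0):.1f} %")
```

No kinetics, no thermodynamics -- just a smooth curve the optimizer can chew on.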
If things proceeded "logically" in this world, physics would give
engineers a hint of the processes to be used to achieve some goal as well as
the energy requirements. Frequently the reverse happens: researchers
discover something which confounds the theorists and the physics books
are patched-up. It even happens in the medical world. It wasn't too long
ago when chiropractic and acupuncture were considered "voodoo" and
taboo. Now doctors grudgingly acknowledge these even though they
still don't know how they work.
Now to the topic of this site. I am not criticizing any of the world-class
theorists who put my intellect to shame. Nor am I going to dwell on the
fact that they're divided into separate "camps" which don't agree with one
another. In the quest to unite General Relativity with Quantum
Mechanics, there seem to be several "camps" with their own world-class
scholars. Each makes a good argument as to why their mathematical
model/theory is the 'correct' one. But each has its shortcomings and is
constantly being "patched up" with wrinkles here-and-there. The Higgs
and graviton particles may not exist in reality but the Standard Model can
be "fudged" to compensate. If the extra dimensions predicted by
superstring and M-brane theories can't be found, however, that may spell
the end for that theory, although to date it is the only one that can contain
both G-R and Q-M under one mathematical "umbrella" (after paying the
"price" of allowing these extra dimensions). Loop Quantum Gravity theories
are strong competitors here. And for decades alternative physicists
have been arguing the merits of not "conveniently" subtracting Heaviside
parameters from equations, believing there is nothing "fictional" about
accessing the Zero-Point Energy field via "scalar time domains".
The Standard Model needs gravitons and Higgs to make it "complete",
and it appears less likely that those particles exist. But the Standard Model
has been "patched" to where it can reliably predict all subatomic events up
to such-and-such energy levels. From an engineering view, that's all that
is important regardless of whether it is based on sound theory, semi-
empiricism, or a plain old regression equation. Nobody cares except the
"purists" who don't have to build working machines.
Critics of the new superstring/M-brane theories say it is a mathematical
"trick". One famous/infamous nuclear lab tech said that "when physicists
don't understand something, they always add another dimension." The
University of Washington just completed one of a series of experiments to
detect these extra dimensions with no success. Maybe they exist and
maybe they don't. But if the model -- which has its roots in Kaluza-Klein
theory -- can predict consistent results, that's all we engineers care about!
We want to BUILD ... not contemplate the underlying
mechanisms. Who knows, maybe 'reality' depends on more than our
normal senses anyway; and the best we can do is "simulate" it based on
what we can measure.
On the flip side, it does irk me when alternative theories are put down by
so-called mainstreamers because they sound too "crackpot-ish". In the
final analysis all of these are just Mathematical Models. Even theirs!
None of them may be "correct" in the purest sense. But it doesn't matter to
an engineer. Whatever works to build a machine is the "right" model
(whether it accurately represents reality or not). Somebody "extrapolates"
a laboratory observation and the theorists are challenged to figure out 'why'
this works. The experimentalists want to have some idea on the necessary
energy and material requirements to take it to the next step. They couldn't
care less if the phenomenon involves black holes, pink holes, red energy,
11 dimensions or 11,000 dimensions. Whatever "model" that accurately
simulates the phenomena is the one to use; "reality" is secondary. Then
the purists can spend the next 50 years fine-tuning while everybody else
is using these fantastic machines. There may be times when theories
are proposed that cannot be currently validated in any laboratory. But
those should not be used to automatically label other experimental claims
as "crackpottery". Perhaps I'm biased, but I always give the edge to a
visual demonstration over a mathematical model no matter how elegant it
is. How many of our discoveries were "prohibited" by the best theories of
their day?
A few theories can explain some UFO behavior. Others lie in the
advanced theoretical realm and are awaiting experimental confirmation to
decide if they are appropriate to describe other UFO performances. And
yet I am entirely convinced -- as written above by Dean and others --
that some of these UFOs exhibit things which are TOTALLY
BEYOND our best science and would reside in the world of "science-
fiction". In my opinion, Corso, Lazar, and others were successful in back-
engineering certain "pieces" of UFO technology based upon present-day
Earth materials-engineering and science. Other things -- such as the
remaining items in Corso's "nut file" -- will have to await advances in
both of these areas.
So when I read about companies like TransDimensions
(http://www.tdimension.com) and UNITEL
(http://www.stealthskater.com/UNITEL_background.htm ) that
propose laser-based propulsion by generating a Bose-Einstein "distortion"
(like a "tractor beam" in reverse, pulling the vehicle toward a black hole of
sorts), I don't dismiss them just because present-day theories don't
understand how it can be done. Nor do I dismiss the (admittedly disinfo-
laden, which doesn't help their cause) claims of "outlandish" things like the
Philadelphia Experiment and Montauk Project, time-travel, remote-
viewing and paranormal phenomena. Or Tom Bearden's stance on scalar
waves and reports of rare successes by Tesla, Priore, Rife, Keely and
others. And we might as well include the way-out stuff like "orgone energy"
and "ORMEs", too. The one thing these have in common is claims of
removing radioactivity; and I suspect it's really more akin to those effects
of atomic detonation that are still classified long after Hiroshima and
which the ETs seem to be concerned about. I have no clue as to what they
are. Only that there is a one-to-one correspondence between UFO sightings
and nuclear weapons production. And what about von Braun's comments
about a psycho-reactive or "organic" hull material? And statements made
by alleged Project Pounce participants about viewing the immense inside
volumes of crashed discs that were impossible given their outside
dimensions? (Maybe the mind-machine interface of these craft and crew
was doing a lot more than just navigating by thought control; maybe the
craft itself was being physically altered inside and out.) And former
ELINT Sgt. Dan Sherman who reported that some of his Project Preserve
Destiny "comms" with the "Greys" (<click> here) alleged they didn't
travel in time but "through and around time", somewhat reminiscent of
Oberth's alleged observation that some of these craft function more like a
"time machine". (Could this be similar to what Corso last talked about in
the above DATELINE interview?)
I myself have been guilty of being so immersed in a course-of-study that
I couldn't see "outside the box" and view the forest because of all the
trees. These incidents may be another example of Theory lagging
Experiment. Whether we live in a superstring universe, or one governed by
LQG, or even reactive to such obscure notions as "phase density shifting"
( <click> here ) is not as important to the engineering task of building
such machines as whatever mathematical MODEL does the best job in
predicting results that can be CONFIRMED by experiment. Although
mainstreamers are always necessary to safeguard precious resources
including money and manpower, they should not automatically ridicule
something just because it challenges their "common sense" and formal
education. Rather put the burden-of-proof on the claimant and always be
prepared to rewrite the text books. "The proof will be in the pudding": an
operating machine. (Since it doesn't appear that gravitons will be found,
and first results of finding "extra" dimensions have been unfruitful, and
LQG theories have their own problems, maybe a good alternative is the
one Ray Kramer suggests in his "The Equation", made famous by his
Missing Person son Philip Taylor Kramer ( <click> here ).)
I'll close this discourse with another personal experience. The Professor
Emeritus of our department would always recount the following true story
to each graduating class as a "humility lesson". A master brewer was
contemplating early retirement. His company had many "apprentice"
brewers but none of them could make as fine a batch of beer as he could.
He wanted to be used as a part-time consultant but the company balked.
So he remained equally stubborn and refused to divulge the secrets he
had discovered over years of trial-and-error.
The company brought in many engineers and chemists who took notes
and measurements while he made batch after batch. But when they tried to
do it themselves, they were unable to make as "good" a batch as he did.
After months of complex thermodynamic calculations, alternative food
chemistry mechanism proposals, and advanced reaction kinetic models, the
company gave in and granted him a consultant contract in addition to his
pension. So he did his part and told them what his "secret" was:
"I let the temperature rise until just the point when it produces so much
steam that I cannot see the nail pounded halfway into that stud across the
room." The moral, of course, is not to look with disdain on the claims of
"amateur experimenters" -- they just might have uncovered something that
doesn't agree with anyone's best theories!
It wouldn't surprise me to see these new patents and inventions work, but
by a different mechanism/model than what their proponents propose. And I
foresee a collaboration where -- like the aforementioned medical
authorities and "old wives remedies" -- present-day theorists revise their
stance to come to grips with what the "crackpot" experimenters are
demonstrating. Then statements like "I cannot create what I don't
understand" (already disproved in the examples cited above as well as in a
million more that take place all-the-time) are reworded to "I created
something which my best theories say is 'impossible' !"
"No door is closed to an OPEN mind !"
"Nothing is impossible ... only mathematically Improbable"