Data Analysis and Systems Integration

What all technical systems and applications have in common is data, and for data to
serve a purpose it must be analyzed toward a goal; that goal, most generally, is
Systems Integration.  While you will encounter many systems in this book--another
way of saying many "Technical Applications"--the focus of the book is really the
subject of DATA ANALYSIS.  In the history of science and technology, the recognized
discipline of education and application closest to what is considered Data Analysis
today is Numerical Analysis.  {Sometimes the discipline is called Numerical Methods,
and years ago, as large scale computers became the tool of technical and scientific
research, the closely allied field of Operations Research developed the knowledge
and skill to mold the discipline of Numerical Analysis into engineering, computer
science, and technology.}

1-1:  Simulation of Space Shuttle with MATLAB programming.
One example speaks louder than words; in the following case it is the space shuttle
in re-entry, being analyzed by way of simulation.  {By the way, the program code follows.}

%calculate table for space shuttle re-entry parameters,
%first calculate Z; 1.8 degrees Rankine equals 1 degree Kelvin
%velocity at re-entry, Ve, is 36 Kft/sec; beta = m/(CD*S); m = 342 slugs;
%CD = 1.3; S = 130 ft^2
clear all
Ve = 36000.;        %velocity at re-entry of 36,000 feet per second
CD = 1.3;           %Coefficient of Drag to be explained
S = 130;            %wing surface area
m = 342;            %mass of space shuttle in slugs
g = 32.2;           %gravity
beta = m/(CD*S);    %beta is the ballistic coefficient, m/(CD*S)
format long g
R = 1716;              %R, the gas constant in English units, is 1716 ft*lbf/(slug*degR)
degR = (288)*(1.8);    %temperature in degrees Rankine
% calculate Z = g/(R*T); g in English units is 32.2 ft/sec^2
Z = g/(R*degR);
% calculate maximum velocity deceleration in g's
VEsq = Ve^2;
MaxDecel = (VEsq*Z*.043)/(2*exp(1));
MaxDecelg = MaxDecel/g;
% calculate density, first at 400,000 feet
h = 150000:10000:400000;  %setup for alt, h, in increments of 10Kfeet
lh = length(h);    %more looping setup with length of h
rhosl = .0023769;        %rhosl is the sea-level density in slug/ft^3
rho = rhosl*exp(-Z*h);
hkft = h/1000;           %converting feet to Kfeet
%plot(hkft,rho)          %used only in initial debugging
%xlabel('altitude in Kft')
%ylabel('density slug/ft3')
%velocity versus altitude, h and rho
VEL = Ve*exp(-rho/((2*beta)*Z*.043));  %Velocity calculation
subplot(2,1,1)             %top subplot shown below
plotyy(hkft,rho,hkft,VEL)  %plot of rho on left and Vel on right
xlabel('altitude in Kft')
%calculate drag from D = 1/2*rho*VEL^2*S*CD
D = (S*CD) .* rho .* VEL.^2 / 2;
%calculate dV/dt = D/m, then convert to g's
dVdT = D/m;
dVdTg = dVdT/g;
plotyy(hkft,D,hkft,dVdTg) %Drag on left axis and dV/dt = a on right

NOTE:  If you already know MATLAB, go ahead and program this script if you
wish.  If not, you can still see some of the interest and excitement that is in store
for you, and the analysis in the plots shown below enhances that excitement.
The comments to the right of the program code, each starting with "%", much like
the comments of other languages such as C, will help you see what is happening.
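For readers more at home in a general-purpose language, the heart of the script can be sketched in Python {a translation of the MATLAB above, keeping its constants and formulas; the plotting is omitted, and this is only an illustration, not a replacement for the MATLAB version}:

```python
import math

# Constants taken from the MATLAB script above
Ve = 36000.0          # velocity at re-entry, ft/sec
CD = 1.3              # coefficient of drag
S = 130.0             # wing surface area, ft^2
m = 342.0             # mass of the shuttle, slugs
g = 32.2              # gravity, ft/sec^2
R = 1716.0            # gas constant, English units, ft*lbf/(slug*degR)
degR = 288 * 1.8      # temperature in degrees Rankine
beta = m / (CD * S)   # ballistic coefficient, m/(CD*S)

Z = g / (R * degR)    # Z = g/(R*T), as in the MATLAB script

# Density, velocity, drag, and deceleration versus altitude
rho_sl = 0.0023769    # sea-level density, slug/ft^3
for h in range(150000, 400001, 10000):   # altitude in ft, 10 Kft steps
    rho = rho_sl * math.exp(-Z * h)                       # density
    vel = Ve * math.exp(-rho / (2 * beta * Z * 0.043))    # velocity
    drag = 0.5 * rho * vel**2 * S * CD                    # D = 1/2 rho V^2 S CD
    decel_g = drag / m / g                                # dV/dt in g's
```

A MATLAB user will notice the vectorized rho and VEL arrays become a simple loop here; either style computes the same table.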

1-2:  Numerical Analysis the Anchoring Discipline.

You might consider Numerical Analysis, along with Advanced Engineering Math, the
anchor during the digital atomic revolution.  The languages of emphasis changed
from FORTRAN to MATLAB, and the computer hardware changed from large scale
computers to mini- and micro-computers; but Numerical Analysis, with the addition of
only a few methods, remained the same.  An Engineer of the 1960s who read a book
on programming in BASIC or FORTRAN IV would essentially recognize and see the
same names for methods, like Runge-Kutta, as the Engineer reading a modern
book on Numerical Methods with MATLAB.  The evolution of modern numerical
analysis {surely both Excel and MATLAB must represent modern analysis} is in
many ways a paradox.  For example, a book written in 1984 as part of the NATO ASI
(Advanced Science Institutes) Series, entitled SIMULATION AND MODEL-BASED
METHODOLOGIES, contains a chapter, "Optimization in Simulation Studies," that
suggests the more modern approach of state-space as used in most current
textbooks on Automatic Control, and techniques of parameter estimation like those
Iliff lectured on at NASA in 1987, supposedly the best of research in Systems ID and
linear theory, with the cost function of C&S introduced as "computational cost"; yet it
goes all the way back to the Newton-Raphson method, and brings up the famous
simplex method for optimization and the numerical method of Nelder and Mead of
1965, calling it the "polytope method" to separate it from the simplex method of
linear programming.  Regardless, this bit of history and perspective is interesting
reading from 26 years ago, as the author talks about the increased availability of
large scale computers and the need for optimization techniques and specially
designed software {ISIS, ACSEL, GEST, COSY, MACKSIM, and FORTSIM} to keep
up with an increased need for optimization:

“The determination of optimal values for parameters is often an important aspect
in both the formulation of mathematical models for systems and in their
subsequent use in simulation studies.  Optimization subproblems are therefore
intimately associated with model-based studies.  The objective of this paper is to
explore some features of the interface between these two problem classes and to
provide an overview of some of the numerical procedures that are available for
solving such parameter optimization problems.”
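As a taste of how little the old methods have changed, here is a minimal Newton-Raphson root finder sketched in Python {the function x^2 - 2 is only an illustrative example of ours, not one from the NATO ASI book}:

```python
# Newton-Raphson: the "old" method the 1984 survey still reaches back to.
# Finds a root of f by iterating x <- x - f(x)/f'(x).
def newton_raphson(f, fprime, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:   # converged when the correction is tiny
            break
    return x

# Illustrative example: root of f(x) = x^2 - 2, i.e. the square root of 2
root = newton_raphson(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

The same iteration, under the same name, appears in the 1960s FORTRAN texts and in modern Numerical Methods with MATLAB books alike.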

We technical workers who lived and worked through technical applications with
computers over the last 50 years--the half century of what will later in the book be
called "The Digital Atomic Age," because digital did for technical applications with
computers what the atomic bomb did for science--also saw Numerical Analysis grow
and develop from applications on large scale vacuum tube and analog computers,
through the minicomputers of transistors, and then into the microcomputers of
integrated circuits (ICs).  This is not to say that some person or group instigated a
plan for this Digital and Digital Computer revolution, often at the highest technical
levels and riding on the back of Numerical Analysis, unlike the changes instigated by
Bill Gates and his Microsoft Empire {we will not now get into the borrowing Microsoft
did from the hard work of the Mac empire}; rather, it just happened as the natural
evolution of science and technology across many diverse fields like physics,
mathematics, engineering, and computer and other electronics hardware and software.
Even the solution of a problem of physics was numerical analysis without being
considered as such.  The system was analyzed, whether it was a simple falling body,
a pendulum, a bouncing ball, or the path of a missile: a drawing was made to
represent the system with all the known data recorded on it; the applicable
physical equations of motion were applied; and then a numerical solution was
calculated.  Quite often a more detailed problem involved some error analysis
between the calculated value and the standard, as for example when in a physics lab
we used the famous oil drop apparatus of Millikan to determine the charge of the
electron as close to the known value of 1.6 X 10^-19 as possible.  {Note that in this
text we will use the notation of the MATLAB language.}  So the error analysis gave us
a percent comparison between our experimental calculations and the known standard.
And what we were really doing, without any goal so noble as what Lord Kelvin
said {"It is not scientific until you attach a number."}, was to understand and integrate
a technical problem by attaching a number that had more significance to us than, for
example, E = m X c^2 or Newton's second law, that force is equal to mass times
acceleration (F = ma).  In fact, we might say, in looking back on the early history of
science and classical physics, that Newton--when the apple fell from the tree on his
head under the acceleration, or force, of gravity, or when he formulated, with much
thought, application, and numbers, the second law of motion--was carrying out the
process of numerical analysis, or Data Analysis.
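That percent comparison is a one-line calculation; a minimal sketch in Python {the measured value here is a hypothetical lab result, used only for illustration}:

```python
# Percent error between an experimental value and a known standard,
# as in the Millikan oil-drop lab described above
known = 1.6e-19        # accepted charge of the electron, coulombs
measured = 1.54e-19    # hypothetical lab result, for illustration only
percent_error = abs(measured - known) / known * 100
```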

1-3:  Excel and MATLAB for Data Analysis.
Most people are at home with Excel, so some comparisons will be made along the
way between using Excel and MATLAB for Data Analysis.  {Yes, the very same
Microsoft Excel on your computer can do Data Analysis.}  MATLAB was originally
developed by Cleve Moler in the 1970s while he was chairman of the Computer
Science Department at UNM; then, with Jack Little, an engineer who specialized in
control system design, and Steve Bangert, he founded MathWorks in 1984, after
rewriting MATLAB in the C language.  It is a little harder to date Excel, especially for
you younger generations who think it existed before computers.  It has been a widely
used spreadsheet since version 5 in 1993.  Microsoft first issued a Windows version
{2.05} in November of 1987, and after 1993, when Microsoft included Visual Basic for
Applications {VBA}, Excel was well on its way to extensive use in Data Analysis.
Even so, Excel remained somewhat limited in technical applications until 2004, when
Robert de Levie wrote ADVANCED EXCEL FOR SCIENTIFIC DATA ANALYSIS; the
emphasis and professional specialty of that author for 34 years was as a professor of
analytical chemistry and electrochemistry at Georgetown.  If the copyright credits de
Levie gives in his excellent book on Data Analysis are any indication of how long
Excel has been used in Data Analysis, you would at first think all the way back to
1974; however, since this is impossible, we might think, without extensive research
on de Levie's references, that VBA was used for the Data Analysis.  MINITAB, a
statistical software package once used mostly by teachers, is also used some for
analysis in this book.  It was developed at Penn State in 1972, although it was rarely
known or used until recently, as it has become popular for quality control work in Six
Sigma.  {What I found it useful for was histograms in Flight Test Reports and
presentations.}  We will want to compare analysis of data between Minitab, Excel,
and MATLAB.

1-4:  Large Modern Systems in the Evolution of the Digital Atomic Age.
You will find in the course of reading and studying this book that two systems
dominate the material--airplanes and missiles; but it is hoped that you will see, up
front and foremost, that these two now very complex systems, as seen in the
Technical Applications to the Boeing 787, the General Dynamics F-16, and the
Space Shuttle {which is now a robotically controlled space airplane}, illustrate
techniques and tools of DATA ANALYSIS.  {Also, we must write about what we know
from experience; having retired from General Dynamics as a Flight Test Engineer
on the F-16 and from Raytheon as a Principal Systems Engineer, testing the missiles
that carried the KW and EKV into the exo-atmosphere to shoot down incoming
ICBMs, the material naturally evolved into a focus on missiles and airplanes.}  Yet
you will also find telemetry, flight test, communications, and other modern systems,
once again as illustrations of DATA ANALYSIS.
Quite often modern and complex electro-mechanical systems consist of many
systems; for example, the F-16 operates centered around over 20 distinct computers
and systems, like the weapons control system, the fire control system, the flight
control system {each having its own quad-redundant computer, or in the case of the
FCS a system of 4 computers--the CADC, the FCC, the ECA, and the PSA}, the
engine warning systems, the engine control system, and on and on.  And almost as
often, the problem in modern design, system integration, and flight test is that well
designed partial systems work quite well independent of the aircraft {that is, in the
integration lab} but do not work as a whole in the total system of the aircraft.  This
was not so much a problem on the F-16, since after design of the first block 1 system
it was further developed through the years in a block system, going from block 1, to
10, all the way up to 40 and beyond; so that any subsequent total systems integration
and development problems--such as engine warning, direct battery power to the flight
controls and an auxiliary generator just for the flight controls, secure voice, and even
newer ECM and weapons--were wisely programmed through the years as large scale
modifications.  This was not the way the Lockheed Martin C-130J, an all computer
controlled aircraft, was designed and developed, so that for approximately one year
after it exited production the bugs of total systems integration {the well designed
parts working together as a whole} were still being worked out.  One obvious case in
point: since LMC had previously sold the wind tunnel at Marietta to Ford Motor
Company and no tunnel model of the C-130J was tested, it was a surprise to all at
the total system flight test when the props, the composite airframe, the engines, and
the aerodynamics of the J model departed from the flight history of many years of
other models of the C-130; in fact, the aircraft departed during a stall with a slip, with
a loss of approximately 5,000 feet.  That is a hard way to learn about a further need
for systems integration based on data analysis of wind tunnel data.
As you can see, our concept of "system" goes far beyond the fundamental definition
of system as used in the Linear System Theory of an engineering course, like the
one written by two professors of the Electrical Engineering Department at UC
Berkeley.  Their definition goes something like this: an abstract system {they do not
apologize for, but support, the abstract concepts and the proofs for them}--but then
they say "or system" {engineering held them down to practical applications, and they
do refer to many typical systems like electrical networks and even the famous
mass-spring system}--is a partially interconnected set of abstract objects termed the
components.  Of course, in something like circuit analysis the components can be
resistors and capacitors.  While you will encounter many of these so-called systems
on a smaller scale, such as in circuit analysis, in sub-systems of the Flight Control
Computer, and yes, in the famous mass-spring system, you have already noted
above that the modern systems of this book and of technical applications of
computers are much larger: airplanes, missiles, and flight control computers.  In fact,
the expertise of this author, for writing and previously for technical work, was to
rapidly "come up on" {a phrase used in Aerospace for getting smart on a system}
new systems.  These, if taken in the order of a career in Aerospace--momentarily
forgetting the 10 years of teaching electronics engineering technology interspersed
in the Aerospace years--would roughly go like this: an Air Early Warning Radar
System; the Atlas Missile System; the Minuteman Missile System; the Athena Missile
System, fired from Green River, Utah, and impacting on White Sands Missile Range,
from which re-entry data were obtained; Telemetry Systems for data acquisition and
intelligence; the C-130 Gunship Forward Looking Infrared System; the F-16, which
was really a system of many systems; and the KW and EKV payloads of the missile
shield around this nation.
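Since the famous mass-spring system comes up repeatedly, a minimal sketch of it as a state-space simulation may make the abstract definition concrete {the mass, spring, and damping values are arbitrary illustrations, and simple Euler integration is used only for brevity}:

```python
import math

# Mass-spring-damper: m*x'' + c*x' + k*x = 0, in state-space form
# with state = [position, velocity]; Euler integration for illustration
m, k, c = 1.0, 4.0, 0.0      # mass, spring constant, damping (arbitrary)
x, v = 1.0, 0.0              # initial position and velocity
dt = 0.0001                  # time step, sec
period = 2 * math.pi / math.sqrt(k / m)   # natural period of the spring
for _ in range(int(period / dt)):         # simulate one full period
    a = (-k * x - c * v) / m              # acceleration from the equation of motion
    x, v = x + v * dt, v + a * dt         # advance the state
```

With no damping, the mass should return close to its starting position after one period, which is an easy check on the simulation.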

This Chapter 1, "Data Analysis and Systems Integration" is continued on the page,
"Systems Integration".