Reverse-engineering a three-axis attitude indicator from the F-4 fighter plane

160 points | posted 14 hours ago
by zdw

49 Comments

genter

14 hours ago

Thanks for including ridiculously high res images.

And it amazes me how many analog tricks they used. Today it would be a couple of lines of code.

syndicatedjelly

13 hours ago

The 1950s were a time in computing when it wasn't a given that digital computing was clearly "better". We still hadn't developed methods of mass-producing reliable, fast, and cheap microelectronics and controllers, so for high-reliability applications, analog computing was THE solution.

In 1954, Rex Rice wrote this piece about preferring a simple plugboard as the means of programming a computer, versus any sort of abstraction with a programming language (https://dl.acm.org/doi/10.1145/1455270.1455272). So it was still very much up for debate, whether high-level programming languages were even the right solution for the problems being faced.

But I agree with you, our forefathers were simply geniuses to have figured out how to manipulate the physical world to produce mathematical computations. Early in his career, my dad had to disassemble and reverse-engineer some Soviet-made aerospace devices, and he still fondly recalls how superbly engineered and precise the Soviet devices were. I wish there was more information out there about Soviet computing, but the winners do write history after all.

somat

8 hours ago

My understanding is that the surprising factor that settled the analog vs. digital debate was that digital computers require far less precise electronics. The voltage in an analog computer has to be exactly 5.2648 volts; every resistor, capacitor, and transistor has to be a high-precision part, and the more complex the computer, the higher the precision required. In a digital computer, "close to 5 volts" is good enough. That makes the components cheaper, more reliable, and smaller, and so the digital computer won.

This is why I have my doubts about existing designs for quantum computers: the quantum algorithms we are trying to implement in hardware require an essentially analog design, and we rejected analog computers before mainly because they were unable to scale. That inability to scale is the same problem we see with quantum computers.
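
A toy illustration of that noise-margin point (my own sketch, not from the article): push a value through a chain of stages whose components are each a couple of percent off. The analog chain accumulates the error at every stage, while the "digital" chain re-thresholds to a clean logic level each time, so the error never compounds.

    import random

    STAGES = 20
    TOLERANCE = 0.02  # each stage's components are up to 2% off

    def analog_chain(v):
        # component errors multiply together and accumulate down the chain
        for _ in range(STAGES):
            v *= 1 + random.uniform(-TOLERANCE, TOLERANCE)
        return v

    def digital_chain(v):
        # each stage is just as sloppy, but the output is snapped back
        # to a clean logic level, so "close to 5 volts" is good enough
        for _ in range(STAGES):
            v *= 1 + random.uniform(-TOLERANCE, TOLERANCE)
            v = 5.0 if v > 2.5 else 0.0
        return v

    print(analog_chain(5.0))   # wanders away from 5.0, worse with more stages
    print(digital_chain(5.0))  # exactly 5.0 no matter how long the chain is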

stavros

6 hours ago

Not only is "close to 5V" "good enough", it's so "good enough" that we consider a CPU running a program practically idealized. When programming a computer, you largely don't think about hardware failures, because they're so vanishingly rare.

This is a huge departure from the physical world, and it speaks to what a massive benefit computers bring.

kfarr

2 hours ago

The best part of a computer is its utter disconnection from reality.

spitfire

10 hours ago

It's interesting that you note the unreliability. I always assumed tubes were unreliable, but thought anything solid state (even those card-based systems) would be "reliable enough" to start taking for granted.

But then you look at it and think: yeah, obviously they're not going to have MTBFs in the millions of hours. It's going to be hundreds of hours - once a week, or maybe every few weeks, between real hard crashes.

How would that change your behaviour?

Scene_Cast2

13 hours ago

I've wanted to add such an indicator to my car's dash (I already added a boat compass, which I find quite useful and aesthetic). Unfortunately, electronic indicators of any kind are much rarer than vacuum-powered ones or all-glass cockpits.

bigiain

5 hours ago

I am currently pondering the idea of building a modern-electronics "replica" of one of these, with 3D-printed sphere halves containing stepper motors, magnetic rotary encoders, and a 6-DOF compass/gyro IMU. If you put an Arduino or ESP32 inside to drive those, you could have simple slip rings that only needed to supply power through the roll and pitch axes.

(Only pondering though, I have had the same idle thoughts about making my own Russian Soyuz mechanical navigation instrument too from this other writeup of Ken's https://www.righto.com/2023/01/inside-globus-ink-mechanical-... but somehow the idea of making replica soviet vintage tech isn't as appealing as it was a few years back...)
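
For what it's worth, the control loop for that kind of replica is simple in outline. A rough sketch of the logic in plain Python (not real ESP32 firmware; read_imu_attitude and the Stepper class are hypothetical stand-ins for whatever IMU fusion library and stepper driver you'd actually use):

    import time

    STEPS_PER_REV = 200 * 16  # assuming a 1.8-degree stepper with 16x microstepping

    class Stepper:
        """Hypothetical stand-in for a real stepper driver."""
        def __init__(self):
            self.position = 0
        def move_to(self, target_steps):
            # a real driver would pulse STEP/DIR pins here
            self.position = target_steps

    def read_imu_attitude():
        """Hypothetical stand-in: (roll, pitch, yaw) in degrees from the IMU's fusion output."""
        return (0.0, 0.0, 0.0)

    def degrees_to_steps(angle_deg):
        return round(angle_deg / 360.0 * STEPS_PER_REV)

    roll_motor, pitch_motor, yaw_motor = Stepper(), Stepper(), Stepper()

    while True:
        roll, pitch, yaw = read_imu_attitude()
        # chase the current attitude on each axis of the ball
        roll_motor.move_to(degrees_to_steps(roll))
        pitch_motor.move_to(degrees_to_steps(pitch))
        yaw_motor.move_to(degrees_to_steps(yaw))
        time.sleep(0.02)  # ~50 Hz update is plenty for a display instrument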

dumbo-octopus

10 hours ago

Boat compass on the dash is awesome, I might have to borrow that. Any issue with interference from the vehicle itself?

Scene_Cast2

9 hours ago

Yeah. My compass (a Ritchie) has two axis calibration at the bottom; I ended up maxing out one of the axes (so it's still a bit off). Also, it tends to shift by a decent amount when the car is pointing up or down steep hills.

Onavo

12 hours ago

What you need my friend is a ring-laser gyro.

liendolucas

11 hours ago

Asking just out of curiosity/ignorance. The author mentions that the F-35 has a completely digital touchscreen to do basically anything on the aircraft (I assume). I can also imagine a powerful gun damaging it, so how does the pilot manage if that screen stops working entirely? Compare the same situation in the F-4: the hit would only break/damage the instruments in that line of fire, correct? So in one case you would be totally screwed, while in the other you would only partially lose some instruments, right? I must obviously not be taking into account something (or many things) for the F-35, but in my mind having a 100% digital aircraft seems pretty scary.

StableAlkyne

11 hours ago

Generally, if the cockpit is getting hit with damage to the instruments, there is a very good chance the pilot has also been injured or killed, and doesn't care about the instruments anymore.

In old gun fights (which just don't happen anymore), shots were likely to come from behind (so, they intersect the pilot) or the top (so, through the canopy if they're hitting the instruments). This has to do with the orientation both planes are probably in if one is shooting at the other. Go back farther and you get shots from the front, not from fighters (head-ons are very difficult to pull off outside of videogames) but from bomber tail gunners - very old planes from WWII even had bulletproof glass in front of the pilot for this reason. If the F35 has gotten into a gunfight, the pilot has fucked up, it's not a dogfighter and wasn't designed to be one.

Even nowadays, if the missile or flak pops next to the cockpit and has managed to damage the instruments, there is a very strong chance the shrapnel has also hurt the pilot to the point that they're not flying home that day. This is the most likely way for the F-35 to be damaged in the modern era.

There are obviously scenarios where the instrument panel gets damaged but the pilot is okay, but it's such a low probability scenario that they likely deemed it to be less harmful than the benefit they foresee in a glass cockpit.

liendolucas

11 hours ago

Thanks for replying! As others mentioned, I was missing/not considering the most important point: that the pilot is assumed to be dead in that case, and that the plane isn't supposed to take that kind of fire in the first place.

zppln

11 hours ago

Can't speak for the F-35, but for the fighter I work on we basically consider the pilot dead if you have shrapnel damage in the cockpit. For instance, the FCS is located behind the pilot. That being said, I would assume the F-35 display is at least dual redundant (think two displays merged together, which can be done seamlessly) for flight safety reasons.

stavros

6 hours ago

If the displays are merged seamlessly, how will you know if one has failed?

EasyMark

5 hours ago

I assume both displays are in use, with failover to a single display if one goes "dead", so the remaining display can still show the most critical information/controls to the pilot; to me that seems like the only logical implementation, given that 1 of the 2 screens can fail. It should be fairly easy to set up a redundancy failover process; I've done that many times in embedded coding, where we failed over to a backup system.
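
Something like this, presumably - just a sketch of that kind of failover selection in Python, nothing to do with the actual F-35 avionics:

    def select_layout(display_a_alive, display_b_alive):
        """Pick a layout based on which display heartbeats are still present."""
        if display_a_alive and display_b_alive:
            return "merged-wide"        # the seamless two-panel layout
        if display_a_alive:
            return "critical-on-a"      # compress critical info onto the survivor
        if display_b_alive:
            return "critical-on-b"
        return "standby-instrument"     # fall back to the independent standby display

    # e.g. display B stops answering heartbeats:
    print(select_layout(True, False))   # -> critical-on-a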

iancmceachern

11 hours ago

They're not just regular screens. They're highly hardened, redundant, specialized displays; it's a whole industry.

There are companies that make displays with transparent conductors over the screen so it can be heated, letting it keep working even on the deck of an aircraft carrier in the Arctic.

There are companies that still make CRTs for specific military purposes.

These screens are safer, more reliable, and more durable than the mechanical systems they replace.

dfox

9 hours ago

The displays aren't that special. Probably the main two things that are special about them are color rendition and contrast, and the rest is just about the certification process. And extrapolating from automotive experience, the color rendition and contrast come down to a team of engineers solely dedicated to simulating various lighting conditions and verifying that the screen remains legible, does not interfere with night vision, and does not cause reflections on other instruments that would make them hard to read. In automotive, these kinds of simulations use multiple terabytes of reflectivity data for various mostly “dull” materials (gigabytes upon gigabytes of data on what the driver might wear…), so extrapolate from that to “most advanced fighter aircraft”.

jeff_vader

11 hours ago

Basic flight instruments almost always have a backup. In the case of the F-35, there's a small square screen in the centre console which shows an attitude indicator and flight parameters. Needless to say, if the main screens are out you are turning around and looking for the nearest airport.

kens

9 hours ago

The backup for the display is an integrated standby instrument system (ISIS), which combines several essential instruments into one small digital display. An ISIS typically has its own sensors and a battery backup, so it should stay operational even if the main display fails. https://en.wikipedia.org/wiki/Integrated_standby_instrument_...

toast0

11 hours ago

I'd imagine the ejection system is going to be activated by traditional handles, and not a screen. Same with the basic flight controls; there's no reason to move to a touch screen throttle or flight stick.

the__alchemist

10 hours ago

Depending on the fighter: redundant systems, i.e. multiple independent ring laser gyros (viewable on multiple independent displays), backed up by analog "round dial" instruments.

akgoel

11 hours ago

The F-35 is not meant to be a dogfighter. If it has been shot such that the control screen is unusable, something else has already gone wrong.

the__alchemist

10 hours ago

It's a high performance fighter with a gun and SRMs, so...

talldayo

7 hours ago

...so you turn around and go home long before you're forced into a gun fight.

Sure, the F-35 is a multirole fighter aircraft with a gun - so is the F-15E, but that doesn't make it a dogfighter. If you're in a position where a guns solution is your only kill option in an F-15 or an F-35, something has gone terribly wrong. Pretty much everything about both jets is designed to operate in a theater with extremely high levels of air support and friendly materiel. It is assumed that they won't get into a dogfight because it is extraordinarily rare for an F-35 to exhaust two AIM-120s in a single sortie, let alone the 6 it can carry in stealth mode or the whopping 12 AMRAAMs that the F-15 can lug along.

Even as far back as the Vietnam War, forcing a missile truck like the F-4 into a dogfight with a MiG-17 was a death sentence. You don't need a very active imagination to suppose how an F-35 fares in a guns-only dogfight against an Su-27.

hydrolox

11 hours ago

To be fair, isn't the purpose of the F-35 fairly different, since it's extremely reliant on stealth and beyond-visual-range engagements? Instead of getting close enough to be gunned down, it is supposed to strike from so far away that the enemy doesn't know it's there.

heavenlyblue

11 hours ago

If you project a line that crosses an aircraft's instrument panel, it's hard to imagine one that doesn't also go through the pilot's body.

Amir6

13 hours ago

Fun fact: these airplanes are still the backbone of the Iranian Air Force, and this very same unit was in use until the avionics on some variants were upgraded a couple of years ago.

kens

14 hours ago

Author here if there are any questions...

farseer

12 hours ago

How accurate do you think this instrument was compared to the IC-based sensors found in a typical smartphone nowadays?

kens

9 hours ago

According to a paper on navigation sensors, commercial grade sensors have gyroscope drift of 0.1º/s (which is consistent with iPhone data), while navigation grade sensors have a drift of <0.01º/hour. I couldn't find specific numbers for the F-4's inertial navigation system, but I assume it is navigation grade. So the aircraft gyroscopes would be orders of magnitude better than a smartphone. For the azimuth, the F-4 used a flux valve compass, which must be much better than the relatively poor compass on a smartphone. Of course, the smartphone sensors are orders of magnitude cheaper and smaller.

[1] https://doi.org/10.1186/s43020-019-0001-5
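
Putting those two figures in the same units shows how big the gap is (just the arithmetic on the numbers above):

    commercial_drift = 0.1 * 3600    # 0.1 deg/s expressed in deg/hour
    navigation_drift = 0.01          # deg/hour

    print(commercial_drift)                     # 360.0 deg/hour for the phone-grade gyro
    print(commercial_drift / navigation_drift)  # 36000x, i.e. 4-5 orders of magnitude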

bigiain

4 hours ago

Drone flight controllers can use 3 axis magnetic field sensors combined with their 3 axis gyros and accelerometers, and use various sensor fusion algorithms to produce higher accuracy compass headings which shouldn't have _any_ long term drift, since the magnetometer readings can compensate over time for any short term gyro drift.

I'm not sure how accurate the magnetometers are, or how accurate the calculated compass heading can be, but I recall a project where I used a pair of 6-DOF accelerometer/magnetometer breakout boards to measure the angle change between the body of my coffee grinder and its grind-setting adjustment ring, to datalog my grind settings. I was getting 10-bit-resolution degree readings which all _seemed_ to accurately track the small sub-1-degree adjustments when dialing in the grind. I'll admit to not taking the time to understand all the math involved to get from the raw 3-axis acceleration and microtesla magnetic measurements to a compass heading; I just copy/pasted sample Arduino code which gave me degrees to one decimal place. But my grinder has 100 "notches" around the 360 degrees of the grind adjustment ring, and my data logger at least looked like it was accurately/repeatably showing changes down in the quarter-to-half-a-notch range.

So while I don't think $50 worth of stuff from Adafruit will get you a <0.01deg/hour gyro drift rate, with magnetic calibration and sensor fusion you could probably hit at least 0.1deg long-term precision/accuracy, so over 10 or more hours you'd have "navigation grade" accuracy. Which is probably "good enough" for anything traveling at less than a couple of hundred MPH. A 0.1deg error is about 1km every 600km - so if you were sailing from Sydney to San Francisco (with no GPS or astro nav equipment), which is ~12,000km, you'd be within 20km or so when you got there. Which seems "close enough" for anything except delivering nukes.
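
A minimal version of that fusion is a complementary filter (gyro short-term, magnetometer long-term), and the cross-track numbers in the last paragraph check out; a sketch of both (my own, not the Arduino sample code mentioned above, and ignoring 0/360 wraparound for brevity):

    import math

    ALPHA = 0.98  # trust the gyro short-term, let the magnetometer pull out the drift

    def fuse_heading(prev_heading, gyro_rate, dt, mag_heading):
        """Complementary filter: integrate the gyro, then nudge toward the compass."""
        gyro_heading = prev_heading + gyro_rate * dt
        return ALPHA * gyro_heading + (1 - ALPHA) * mag_heading

    # cross-track error for a constant 0.1-degree heading error
    for distance_km in (600, 12_000):
        error_km = distance_km * math.tan(math.radians(0.1))
        print(f"{distance_km} km -> off by ~{error_km:.0f} km")
    # 600 km -> ~1 km, 12000 km -> ~21 km, matching the estimates above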

dfox

10 hours ago

This instrument is only a display that shows data coming from another device that does the actual measurements. One of the reasons the three-wire synchro interface is used is that it is surprisingly accurate, as long as you don't care that it is "slow" by modern standards. The same interface was used to direct artillery and similar things that require significant accuracy to be effective.
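
And the three-wire-to-angle math is pleasantly simple once the stator voltages have been demodulated against the rotor reference; a sketch of the usual conversion to sin/cos and then atan2 (sign and phase conventions vary between synchro formats, so treat this as illustrative):

    import math

    def synchro_to_angle(s1, s2, s3):
        """Recover the shaft angle from demodulated synchro stator amplitudes.

        Assumes s1 = sin(theta), s2 = sin(theta - 120 deg), s3 = sin(theta + 120 deg).
        """
        sin_t = s1
        cos_t = (s3 - s2) / math.sqrt(3)  # sin(t+120) - sin(t-120) = sqrt(3)*cos(t)
        return math.degrees(math.atan2(sin_t, cos_t)) % 360

    # round-trip check at 37 degrees
    t = math.radians(37)
    print(synchro_to_angle(math.sin(t),
                           math.sin(t - math.radians(120)),
                           math.sin(t + math.radians(120))))  # -> ~37.0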

echoangle

9 hours ago

Well this is just an indicator, the accuracy from the actual IMU would also need to be considered. The indicator itself probably isn’t the main source of inaccuracy once the IMU has drifted a bit.

29athrowaway

an hour ago

Crazy to think that all that technology was built by people using slide rules.

bargle0

12 hours ago

I bet the engineers responsible for this would be so stoked that someone figured out how they solved all these problems.

dmitrygr

12 hours ago

kens@ is a treasure we do not deserve.

kens

9 hours ago

Thanks!

Wait, you're the Linux/4004 guy, aren't you? That project was truly amazing.

AIorNot

14 hours ago

Pretty awesome to see the engineering details involved, thanks! As a software person, I always wonder how they handled bugs and QA when building complex pieces of hardware like this.

dev_tty01

13 hours ago

The strangest concept for modern software engineers is that it had to ship bug free and it could never be updated with firmware patches. Shipping under those constraints brings a certain level of focus not experienced in modern design.

mpenet

10 hours ago

My dad used to work on certifying, servicing and making custom instruments for planes, subs, and prototypes of all kinds from that era (60s to mid-90s).

His “lab” was basically all about testing and simulating environments for the instruments. He had tons of sayings about there being no room for error in his line of work. This is as close as you can get to “building bridges”, and to this day I don't think I have seen this level of attention to detail/perfection in any other profession.

His job involved electrical engineering, mechanical engineering and programming amongst other things, not to mention a deep knowledge of the physics of these environments.

Back then also the tools or source of information that were available to them were quite crude compared to what we have now.

His spare time was all about flying, pimping his ham radio gear with all kinds of “home made” electronics, building antennas, and messing with computers. I guess he'd qualify as a “Hacker” nowadays.

drtgh

13 hours ago

I think the key is that in those days you didn't launch a product until you were absolutely sure it was going to work well; it was prototyped and debugged before it was launched. At least that is the impression one gets from classical tech: solid reliability.

consp

13 hours ago

So basically like designing and building a bridge?

eschneider

9 hours ago

Umm...If you ship firmware today, sure it _can_ be updated, but almost nobody does update firmware, so yeah, that shit has to work when it ships.

Also, I've never been at a place that tested FW patches as well as full releases, so...do you _really_ want to install somebody else's random FW patch? I don't unless I have some known problem with a fix in the release notes...

syndicatedjelly

13 hours ago

Physical products require "test engineers" to design and run appropriate physical tests of products. It's an entire discipline worthy of study. Design for Six Sigma is a great place to start if you're ever interested in understanding ultra-high reliability applications.

https://www.youtube.com/watch?v=_g6UswiRCF0