Battlefield chips that actually run vehicles definitely don't need small-node silicon. The simple reason is that for most applications not involving big data (which also implies big storage, which implies big data throughput, which implies a stationary datacenter on the ground), you just need to drive things around.
For consumer chips the big computational expense is graphics. A 4K screen has about 8.3 million pixels, each pixel RGB, usually at a minimum of 24-bit color. At 60 refreshes per second, that's roughly 12 Gbit/s (about 1.5 GB/s) of data. That's a ton of data just to drive the screen. Not to render graphics on it... just to make it do anything at all.
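If you want the back-of-envelope arithmetic, here's a minimal sketch in Python (the resolution, color depth, and refresh rate are just the standard 4K/60 Hz figures used above):

```python
# Raw bandwidth needed just to scan out a 4K display at 60 Hz.
WIDTH, HEIGHT = 3840, 2160      # 4K UHD
BITS_PER_PIXEL = 24             # minimum RGB color depth
REFRESH_HZ = 60

pixels = WIDTH * HEIGHT                                    # ~8.3 million
bits_per_second = pixels * BITS_PER_PIXEL * REFRESH_HZ

print(f"pixels:    {pixels / 1e6:.1f} million")            # 8.3 million
print(f"bandwidth: {bits_per_second / 1e9:.1f} Gbit/s")    # ~11.9 Gbit/s
print(f"bandwidth: {bits_per_second / 8 / 1e9:.2f} GB/s")  # ~1.49 GB/s
```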
What does a 4K display driver chip require? Roughly 28-90 nm for smartphones, 110-130 nm for TVs. Dirt cheap for TVs, still moderately cheap for smartphones.
Imagine what you can do with ~1.5 GB/s of data transfer capability and a ~1.5-2 GHz core (early 2000s Intel, 130 nm process). How much data can you store, manage, and display on a medium-resolution display for trained pilots? Do pilots need to be playing Crysis or something on max graphics? No, but they do need to not fall out of the sky because of random cosmic rays.
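For scale, the same arithmetic for a hypothetical medium-resolution cockpit panel (the 1024x768 @ 60 Hz figures below are illustrative assumptions, not the spec of any real avionics display):

```python
# Same arithmetic for a hypothetical medium-resolution display.
# 1024x768 @ 60 Hz and 24-bit color are illustrative assumptions,
# not the spec of any real avionics panel.
WIDTH, HEIGHT = 1024, 768
BITS_PER_PIXEL = 24
REFRESH_HZ = 60

bits_per_second = WIDTH * HEIGHT * BITS_PER_PIXEL * REFRESH_HZ

print(f"{bits_per_second / 1e9:.2f} Gbit/s")    # ~1.13 Gbit/s
print(f"{bits_per_second / 8 / 1e6:.0f} MB/s")  # ~142 MB/s, about a tenth of the 4K case
```

That's roughly a tenth of the 4K bandwidth, which leaves a 130 nm class part with plenty of headroom for actually storing, managing, and displaying flight data.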