No real-world example means there is no proof for your claim that variable-geometry wings are better. Real-world engineering decisions have so far collectively come down against variable-geometry wings for modern fighter aircraft, and that's a fact. Real-world engineering decisions reflect what's better.
The key phrase is "fighter aircraft", which implies requirements to which VG wings are a poor solution due to their weight penalty - that doesn't mean they're not an excellent choice for other jobs with different priorities.
Does high aspect ratio (i.e. VG wings unswept) give better induced drag than a (fixed) low aspect ratio wing? Yes.
Does relaxed stability reduce trim drag on a high aspect ratio configuration just as it does with a low aspect ratio wing? Yes.
These technologies address different drag contributions (trim vs. lift induced), hence one is not a replacement for the other and they can be combined for added effect.
Can the VG wing be swept aft for low aspect ratio to match supersonic wave drag characteristics of the fixed wing? Yes.
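To put rough numbers on the induced drag point, the textbook relation CDi = CL^2 / (pi * e * AR) shows how strongly aspect ratio matters. The values below (lift coefficient, aspect ratios, Oswald factor) are my own illustrative assumptions, not data for any actual aircraft:

```python
# Hedged sketch: lift-induced drag vs. aspect ratio, illustrative numbers only.
import math

def induced_drag_coeff(cl, aspect_ratio, e=0.8):
    """Lift-induced drag coefficient from the classic formula CDi = CL^2 / (pi * e * AR)."""
    return cl**2 / (math.pi * e * aspect_ratio)

cl = 0.5  # assumed loiter lift coefficient
cdi_unswept = induced_drag_coeff(cl, aspect_ratio=7.0)  # VG wing unswept (high AR)
cdi_fixed   = induced_drag_coeff(cl, aspect_ratio=3.0)  # fixed low-AR wing

print(f"CDi, high AR: {cdi_unswept:.4f}")
print(f"CDi, low  AR: {cdi_fixed:.4f}")
# At the same CL, the high-AR configuration has less than half the induced drag.
```

Since CDi scales inversely with AR, the unswept VG configuration carries a fraction 3/7 of the fixed low-AR wing's induced drag at the same lift coefficient - which is exactly why it pays off in subsonic loiter.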
So if you require excellent subsonic loiter while simultaneously needing a Mach 2+ dash capability, VG wings (plus relaxed stability, but in the 21st century that should really go without saying) are best. That the J-20 (and most other fighters) doesn't have subsonic endurance requirements this exacting is beside the point.
All that says is that relaxed stability is a good solution for decreasing trim drag (and enhancing endurance), thus proving my point. It does not support your view that endurance can only be achieved through a variable-geometry wing.
There is no bias in my point: the statement is that the endurance achieved with a variable-geometry wing can also be achieved with relaxed stability; it is a one-versus-the-other comparison. You are trying to attribute the benefit of relaxed stability to the variable-geometry wing to support your faulty assumption, and that's the real bias.
The claim was certainly not that VG wings are the ONLY way to enhance endurance - nice try. I never denied that relaxed stability works - let's not forget, though, that your original point (which I DID disagree with) was that LERX / vortex lift helped in this regard.
Nonetheless, while relaxed stability is effective, it will only get you so far on its own, because it does nothing to improve the inherently poor subsonic lift-induced drag of a low aspect ratio wing. As I said, VG wings solve a completely different type of drag problem - these technologies are complementary, and presenting one as an alternative to the other misses the point.
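To illustrate how the two effects stack, here is a deliberately crude trim-drag sketch: a wing with zero pitching moment about its aerodynamic centre, a tail that trims the moment from the CG offset, and wing lift approximated as equal to weight in the moment balance. Every number below is invented for illustration, not taken from any aircraft:

```python
# Hedged, highly simplified trim-drag sketch. Positive static margin (stable)
# forces a tail download, so the wing must lift more than the weight; relaxing
# stability shrinks (or reverses) the trim load. Illustrative numbers only.
import math

def trim_drag_breakdown(weight, q, s_wing, s_tail, ar_wing, ar_tail,
                        static_margin, chord, tail_arm, e=0.8):
    """Return (wing CDi, tail trim CDi) both referenced to wing area."""
    # Tail load to trim, approximating wing lift ~ weight in the moment balance:
    l_tail = -weight * static_margin * chord / tail_arm  # stable -> download (negative)
    l_wing = weight - l_tail                             # wing makes up the difference
    cl_w = l_wing / (q * s_wing)
    cl_t = l_tail / (q * s_tail)
    cdi_w = cl_w**2 / (math.pi * e * ar_wing)
    cdi_t = cl_t**2 / (math.pi * e * ar_tail) * (s_tail / s_wing)
    return cdi_w, cdi_t

# Stable (SM = +0.10) vs relaxed-stability (SM = -0.02) cases, same airframe:
for sm in (0.10, -0.02):
    w, t = trim_drag_breakdown(weight=150e3, q=10e3, s_wing=60.0, s_tail=12.0,
                               ar_wing=3.0, ar_tail=3.0,
                               static_margin=sm, chord=5.0, tail_arm=8.0)
    print(f"SM = {sm:+.2f}: wing CDi = {w:.5f}, tail trim CDi = {t:.5f}, total = {w + t:.5f}")
```

Even in this toy model the relaxed case wins on trim drag, while the wing's aspect ratio term is untouched - exactly the "different drag contributions" point made above.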
Not true, as the DSI is known to have higher performance than traditional fixed inlets, and that includes better pressure recovery. A simple conical or isentropic inlet model is insufficient to explain the DSI.
Better than historical fixed inlets? Sure, I can believe that - why *wouldn't* a fixed inlet (DSI or conventional) designed with today's methods and tools be superior to, for example, the 1950s F-104 intake? In the context of the Su-57 we are not talking about either a historical or a fixed inlet (given the caret-style shock sweep in two planes, even the "conventional" part is arguable), however.
Let's look at that rather vague statement in more detail:
Reference [3]
Only the abstract is available, so the specifics of the competing ramp inlet are not known to me, but if the number of shocks and inlet flow turning angle are the same as for the flow field used to generate the bump, it's not surprising that the DSI would come out on top. Under these circumstances a conical flow field will have shallower shock angles, meaning lower entropy rise, meaning higher pressure recovery - that's a consequence of using conical shocks though, not some magical property of DSI.
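The shock-angle argument is easy to quantify with the standard normal-shock total pressure relation applied to the normal Mach component Mn = M1*sin(beta). The Mach number and wave angles below are assumed examples, not figures from the paper:

```python
# Hedged sketch: total pressure recovery across a single oblique shock, to
# illustrate "shallower shock angle -> lower entropy rise -> higher pressure
# recovery". Freestream Mach and wave angles are assumed examples.
import math

GAMMA = 1.4  # ratio of specific heats for air

def shock_total_pressure_ratio(m1, beta_deg):
    """p02/p01 across an oblique shock with wave angle beta (deg) at freestream Mach m1."""
    mn = m1 * math.sin(math.radians(beta_deg))  # normal Mach component
    assert mn > 1.0, "no shock: normal Mach component must be supersonic"
    g = GAMMA
    a = ((g + 1) * mn**2 / ((g - 1) * mn**2 + 2)) ** (g / (g - 1))
    b = ((g + 1) / (2 * g * mn**2 - (g - 1))) ** (1 / (g - 1))
    return a * b

m1 = 2.0
for beta in (40.0, 50.0):
    print(f"beta = {beta:.0f} deg: p02/p01 = {shock_total_pressure_ratio(m1, beta):.4f}")
```

The shallower 40-degree shock loses noticeably less total pressure than the 50-degree one - no DSI-specific physics involved, just weaker shocks.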
Reference [4]
Here the bump acts as a "drop in" compression surface replacement for a second ramp, so in this instance we really are replacing a 2D wedge shock with a 3D conical equivalent (though perhaps greater turning angle to maintain shock on lip without increasing length, i.e. higher static pressure rise for same loss). Again, no magical DSI effect required to explain the observed results.
Reference [5]
Same as [4], pretty much. Since the entire paper is available you can even see it by superimposing the two inlets in figure 5:
The first ramp is parallel, while the bump turns the flow through a slightly steeper angle to keep the total length the same, giving a greater pressure rise for the same length (but also a higher projected frontal area). Also, the ramp inlet was not fitted with a boundary layer (BL) bleed system, so it suffers from stronger BL growth due to the adverse pressure gradient on the compression surfaces (the bump pushes most of that BL overboard without a bleed) - figure 6 illustrates this quite clearly.
Corroborates several of my points.
Reference [6] is in Chinese with only the abstract and figure captions in English, so I can't really comment beyond saying that none of the English parts and none of the images deal with conventional intakes. As the Chinese text is unfortunately not text at all but embedded in the pdf as an image, you can't even get a machine translation without great effort; given the experience with the other sources, though, I suspect any comparison would be similarly skewed. That doesn't necessarily make it redundant or even wrong (for the JF-17, a real-world example, the choice really WAS between a DSI and a fixed ramp with a diverter) - it just isn't very applicable to our more general discussion.
For what it's worth:
(in addition to the NASA/USAF research project I posted earlier).
All the streamline tracing method does is explain how the bump is generated. The design method for the bump tells you absolutely nothing about inlet drag and pressure recovery, never mind the performance of other inlet types.
Sure it does - the subsonic drag penalty of a complex, high-flow-turning-angle shock system is due to the increase in wetted inlet area (longer compression surfaces), and the same thing happens with a DSI designed to create a complex, high-turning-angle shock system as the bump size increases. OTOH, the supersonic drag penalty comes from the increased cowl angle, which is directly related to the angle through which the shock system turns the flow, since the cowl should meet that air at low or zero incidence. The DSI is therefore not exempt from either of these effects, and this does follow from an awareness of how the design methodology works.
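The cowl-angle part of that argument can be sketched with the linearized (Ackeret) 2D result for the supersonic wave drag of a surface inclined at angle theta: cd = 4*theta^2 / sqrt(M^2 - 1). This is only a first-order approximation, and the Mach number and angles are chosen purely for illustration:

```python
# Hedged sketch: linearized (Ackeret) 2-D supersonic wave drag coefficient of
# a surface inclined at angle theta to the flow. Illustrative numbers only.
import math

def ackeret_wave_drag_coeff(mach, theta_deg):
    """cd = 4*theta^2 / sqrt(M^2 - 1), theta in degrees, valid for small angles, M > 1."""
    th = math.radians(theta_deg)
    return 4 * th**2 / math.sqrt(mach**2 - 1)

m = 1.8
for theta in (5.0, 10.0):
    print(f"cowl angle {theta:.0f} deg: cd = {ackeret_wave_drag_coeff(m, theta):.4f}")
```

Doubling the flow turning (and hence cowl) angle quadruples the wave drag in this approximation, which is why high-turning-angle shock systems carry a supersonic drag penalty regardless of whether a ramp or a bump generates them.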
Cute, but it doesn't work. A LEVCON is simpler than a canard, as there is no canard downwash to deal with.
As I said, your canard analogy is an oversimplification. And while there is no downwash to deal with, in the specific case of the Su-57 the LEVCONs, due to their location, will interact very significantly with the engine inlets - unlike canards. This adds another layer of complexity which is at least as challenging, since what ordinarily is a purely aerodynamic design task is now coupled to a new set of constraints.
DSI, as you yourself admitted, "is a bit harder to do than with a traditional intake", and is more complex.
Yes, with respect to handling BL diversion and ensuring it really does keep the stagnant air out of the inlet as well as a conventional diverter (in an actual aircraft installation, DSI has to deal with the BL from the entire forebody, unlike the theoretical study in Ref. [5]). So the challenge is in *not worsening* the pressure recovery theoretically achievable from the chosen shock system, not so much improving it over a conventional (fixed - let alone variable) analogue.
Don't get me wrong, DSI is an extremely elegant solution with good pressure recovery at the design point and decent off-design performance spread, while offering advantages in weight and RCS - what's not to like? As I've said several times before, I *am* in fact surprised myself that Sukhoi didn't adopt it.
However, I also realize it just isn't the answer to any and every requirement that you make it out to be, and that different priorities might hence favour another type of intake.