Not 360, just your primary frontal field of view.
The human body is not equipped, and the human mind is not wired, to process a full 360 view; providing one will require either some sort of direct neural interface or using up a portion of your existing natural field of view.
But regardless of how you get the additional imagery to the brain, I think the brain itself will be the main limitation: it would be easy to become confused and disoriented if you could suddenly see all around you in 360 degrees.
I think the best that can be managed is normal resolution within your natural field of view, plus a blurry, less distinct impression of everything your eyes could not normally see, projected into the mind via some form of neural interface. That would help a pilot stay oriented while still letting them keep track of enemies outside their normal field of view without it becoming a distraction.
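Purely to illustrate the "sharp center, vague surround" idea on a conventional display (the neural-interface part is obviously beyond any code sketch), here is a toy composite in Python; the function name, the crude downsample-based blur, and the fixed central window are all my own invention:

```python
import numpy as np

def foveated_composite(center_img, surround_img, blur_factor=8):
    """Toy 'sharp center, vague surround' composite.

    center_img   -- full-resolution forward view (HxWx3 array)
    surround_img -- wraparound imagery, same shape, to be degraded
    blur_factor  -- how aggressively the surround is downsampled
    """
    h, w, _ = surround_img.shape

    # Crude blur: downsample, then upsample with nearest-neighbour repeats.
    small = surround_img[::blur_factor, ::blur_factor]
    blurred = np.repeat(np.repeat(small, blur_factor, axis=0),
                        blur_factor, axis=1)[:h, :w]

    # Keep the central third sharp; everything else gets the vague version.
    out = blurred.copy()
    y0, y1 = h // 3, 2 * h // 3
    x0, x1 = w // 3, 2 * w // 3
    out[y0:y1, x0:x1] = center_img[y0:y1, x0:x1]
    return out
```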
I fear, though, that the first few generations of this kind of tech will probably require implanting physical neural plugs into the head or spine of its subjects, especially for demanding military applications.
But anyway, back to this specific station. Couple it with an HMS, or just a helmet with head tracking, so the camera view pans with your head movements, and you are close to giving ground controllers the same visual situational awareness as pilots in actual cockpits.
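To make that concrete, here is a rough sketch of the head-tracking-to-camera mapping (entirely hypothetical names and tuning constants, not any real GCS API): tracker yaw/pitch becomes a gimbal command, with smoothing and rate limiting so the feed does not jitter with every small head movement.

```python
import time

# Hypothetical constants -- a real system would tune these per gimbal.
PAN_LIMIT_DEG = 170.0   # gimbal pan travel, +/- degrees
TILT_LIMIT_DEG = 60.0   # gimbal tilt travel, +/- degrees
SMOOTHING = 0.2         # exponential smoothing factor (0..1)
MAX_RATE_DEG_S = 120.0  # max slew rate so the video stays watchable

def clamp(value, limit):
    return max(-limit, min(limit, value))

class HeadTrackedGimbal:
    """Maps head-tracker angles to smoothed, rate-limited gimbal commands."""

    def __init__(self):
        self.pan = 0.0
        self.tilt = 0.0
        self.last_time = time.monotonic()

    def update(self, head_yaw_deg, head_pitch_deg):
        now = time.monotonic()
        dt = max(now - self.last_time, 1e-3)
        self.last_time = now

        # Target = head pose, clamped to what the gimbal can physically do.
        target_pan = clamp(head_yaw_deg, PAN_LIMIT_DEG)
        target_tilt = clamp(head_pitch_deg, TILT_LIMIT_DEG)

        # Exponential smoothing toward the target ...
        desired_pan = self.pan + SMOOTHING * (target_pan - self.pan)
        desired_tilt = self.tilt + SMOOTHING * (target_tilt - self.tilt)

        # ... then rate-limit the change so the camera never whips around.
        max_step = MAX_RATE_DEG_S * dt
        self.pan += clamp(desired_pan - self.pan, max_step)
        self.tilt += clamp(desired_tilt - self.tilt, max_step)
        return self.pan, self.tilt  # forward these to the gimbal controller
```

You would feed this the tracker pose at whatever rate the headset reports it and send the returned angles downlink to the camera.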
If they can resolve the lag/latency issues over long distances (sadly not via quantum entanglement, which for all its sci-fi appeal cannot transmit information), they will be well on their way to developing unmanned air-superiority drones.
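A quick back-of-envelope shows why long-haul latency is so hard: the floor is set by the speed of light, not the radios. Assuming a geostationary satellite relay with each leg of roughly GEO altitude (straight-line paths, ignoring processing delays):

```python
C = 299_792_458         # speed of light, m/s
GEO_ALT_M = 35_786_000  # geostationary orbit altitude, m

# Ground station -> satellite -> drone is two legs; the full control
# round trip (stick input to seeing the response on video) is four.
one_way = 2 * GEO_ALT_M / C
round_trip = 2 * one_way

print(f"one-way control delay: {one_way * 1000:.0f} ms")      # ~239 ms
print(f"round trip, input to response: {round_trip * 1000:.0f} ms")  # ~477 ms
```

Nearly half a second before any encoding or processing overhead, which is fine for loitering surveillance but a serious problem for air-to-air work.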