Leaderboard
Popular Content
Showing content with the highest reputation on 03/22/2022 in all areas
-
In today's progress report, we discuss the flight simulation experience from the perspective of the cockpit and how we approach maximizing immersion in our MU2 simulation. While one of the more enjoyable aspects of using a desktop flight simulator is undoubtedly the ability to do things you cannot do in the real world, like get out of your plane and look at it from cool angles as if you were a drone, or do a camera flyby at 5000 feet... it is the simulation of our experiences from the perspective of the cockpit that takes the highest priority for us.

I think it is fair to say that the goal of desktop flight simulation is to simulate reality, but those four syllables belie the complexity of just what constitutes reality. Remember the "...as real as it gets..." marketing slogan? Well, if that was as real as it gets, why is flight simulation continuing to evolve and improve? Obviously it wasn't as real as it gets! So what was missing? The answer lies in noting that the perception of reality is comprised of many co-existing categories of fidelity with respect to how we sense the world. Some examples are:

Geometric fidelity - how accurate is the 3D model
Color fidelity - how accurate are the colors of the 3D model
Lighting fidelity - how accurate is the lighting model
Aural fidelity - how accurate are the sound dynamics
Resolution fidelity - what is the depicted resolution compared to our eyes' abilities
Dynamic fidelity - how accurate are dynamic events, movements, and timings
Tactile fidelity - how accurate is your physical interaction with the sim
Cognitive fidelity - how closely does your brain perceive the sim interaction as realistic?

With so many areas of fidelity for a developer to address, it is inevitable that some areas are addressed more than others, and in differing orders of priority. This is why one flight sim user (user A) might think a product is spectacular, while another (user B) thinks that it is not. It is simply that user A and user B each have differing areas of fidelity that resonate with them as being important to their simulation experience. For many folks, a deficiency in one area, especially one that resonates with them as important, can ruin the whole experience. As to why users A and B have differing areas or interpretations that resonate with them as important, who can say. Let's be thankful we're not all clones! It is also why constant improvements are available to be pursued, and indeed should be.

The first and foremost area for every developer to address is "what you see", that is, the geometric and color fidelity... and X-Plane provides the lighting fidelity. Thanks to the powerful video cards we have today, this means we can model more 3D detail into the geometry, rather than trying to fake it with textures. This is especially important for VR users and, in conjunction with PBR rendering methods, results in more accurate lighting and depth, further contributing to the immersion. The screenshots below show some implementations of the geometric and color fidelity in the MU2 cockpit. Knobs have raised ridges, edges are softened with bevels and roundovers, fasteners have recesses, etc. Such geometric detail is expected nowadays, and the end result is a much more immersive experience in the sim and in VR.

Remaining with the "what you see" fidelity concept, the next area of focus is the dynamic behavior of cockpit elements, which is to say the animations. This area really resonates as important to me.
If something looks realistic statically but doesn't move realistically, then immersion can be lost significantly, though there are caveats in some cases. For example, X-Plane represents many things as "binary state", i.e. a switch position is represented by either a 0 or a 1. When developers animate switches using such values, the switch will just "snap" from one position to the other instantly, which isn't physically realistic. Conversely, if animated fully, switch animations can look overly deliberate and slow, which can also seem unrealistic. Take a simple 2-position toggle switch as an example: in reality, you move the switch about half-way and then it snaps to its opposite position seemingly instantly. To our brains, the act of flipping a 2-position switch is almost autonomic, and so to this day there are many 2-position switch animations in aircraft that just snap between positions, and we really don't give it a second thought or view it as detrimental to our immersion in the simulation.

This gives rise to the idea of "cognitive fidelity", in other words, what do most minds perceive as realistic behavior? This is very subjective and, as it turns out, we give it a fair amount of consideration in our animations and interactions in the MU2 cockpit. The fact is that our eyes and brains can perceive things quite a bit faster than 60 fps for some phenomena, and less so for others. Space/time, as it were. We pick up movement in our peripheral vision that we don't consciously think about, yet we are alerted if it is not quite 'normal' as our experience tells us it should be. Some things move quickly enough that we don't pick up on the physical space they move through to get there, only the result (the 2-position switch).

So how do we approach this, and what does all that mean for the MU2 cockpit experience? For animations, in most cases, it means we utilize a lot of physically based animation code to animate things the way they behave in reality. We don't use default X-Plane datarefs for anything when it comes to animations. The animation below shows what happens when powering the DC busses (inverter switch on too). Each instrument has some power-on behavior, whether a self-test or a power-off/power-on position. Some things move mechanically, others magnetically, each with their own unique behavior governed by their physical design.

For example, the HSI and RMI compass cards are magnetically aligned/synchronized with a 'comparator synchro' behind them. When powered on, if the compass card isn't aligned with the comparator synchro, the card will very quickly "align magnetically" when energized, resulting in overshoot of the compass card and some decaying vibrations before settling into stable alignment. Further, notice that the comparator synchro is itself synchronizing with the magnetic heading from the flux compass, which happens at a fixed "synchronizing rate". On power-on, this results in two dynamic behaviors in reality: 1) the compass card aligns with the comparator synchro's magnetic field, and 2) the comparator moves to align with the magnetic heading. While it is perfectly feasible to simulate a flight in the MU2 without all this animation behavior, doing so takes away from the immersion, and that is the one thing we don't want. For those curious... the RMI / HSI cards can get misaligned whenever the plane is moved around / rotated with power off.

animations_1_opt.mp4
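For readers curious how that kind of behavior can be modeled, here is a minimal, hypothetical sketch of the idea (not the actual MU2 code, and the constants are made up): the comparator synchro slews toward the flux-compass heading at a fixed rate, while the compass card chases the synchro like an under-damped spring, which is what produces the overshoot and decaying vibration at power-on.

```cpp
// Illustrative sketch only: under-damped spring for the compass card,
// fixed-rate slew for the comparator synchro. All values are assumptions.
#include <cmath>
#include <cstdio>

// Shortest signed difference between two headings, in degrees (-180..+180).
static double headingDiff(double target, double current) {
    return std::fmod(target - current + 540.0, 360.0) - 180.0;
}

struct CompassCard {
    double cardDeg    = 250.0;  // where the card was left with power off
    double cardRate   = 0.0;    // deg/s
    double synchroDeg = 250.0;  // comparator synchro position
};

const double kSpring   = 80.0;  // "magnetic" restoring strength (made up)
const double kDamping  = 4.0;   // low enough to leave visible ringing
const double kSlewRate = 3.0;   // deg/s synchronizing rate toward flux heading

void update(CompassCard &c, double fluxHeadingDeg, double dt) {
    // 1) Synchro chases the flux-compass heading at a fixed rate.
    double synErr = headingDiff(fluxHeadingDeg, c.synchroDeg);
    double step   = kSlewRate * dt;
    c.synchroDeg += (std::fabs(synErr) < step) ? synErr : std::copysign(step, synErr);

    // 2) Card is pulled toward the synchro like an under-damped spring,
    //    producing overshoot and a few decaying oscillations at power-on.
    double cardErr = headingDiff(c.synchroDeg, c.cardDeg);
    double accel   = kSpring * cardErr - kDamping * c.cardRate;
    c.cardRate += accel * dt;
    c.cardDeg  += c.cardRate * dt;
}

int main() {
    CompassCard c;
    for (double t = 0.0; t < 5.0; t += 1.0 / 60.0)   // 60 Hz animation loop
        update(c, 235.0, 1.0 / 60.0);
    std::printf("card=%.1f synchro=%.1f\n", c.cardDeg, c.synchroDeg);
}
```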
You may note that previously I said we animate things physically correctly in "most" cases. Why not all?

Have you ever tried to flip a switch in X-Plane and you just couldn't grab the manipulator? Or maybe you try to open some access panel and you can't drag in the correct direction and the panel won't open? Before long you realize you've taken 10 seconds to do something that in reality would have taken 1? This is an example of physical accuracy undermining cognitive fidelity in the simulation. In your mind, the action should be autonomic; in X-Plane it becomes a comedy show. I won't be too hard on X-Plane or developers here, because the mathematics of moving things "into and out of the plane of the screen" is rife with challenges and difficulties. In select cases, we draw the line and depart from physical accuracy for the sake of cognitive fidelity.

When I first developed the MU2 around 2006, I was getting perhaps 5 hours a week in the real thing for the better part of a year. When operating that aircraft, especially during preflight, your mind can get "in a zone" and things flow at a certain speed. I was finding that in X-Plane, this smooth flow of mental operation was impeded in many cases by the nuances of working with X-Plane's manipulators and camera view control, and it just did not capture the feel of operating the MU2 with the mental fluidity I felt in the real thing. I personally found that aligning certain interactions with the sim to 'cognitive fidelity' rather than 'dynamic fidelity' resulted in a more natural experience for me. In other words, the time it took to "effect something" was more important than simulating the physical movement to do it, given the limitations of X-Plane "screen space". Without exception, our first effort is to align the two paradigms, and if we cannot get them to sync up, then we look at alternatives. We cannot escape the fact that we are trying to recreate a very 3D experience in quite a confining way, and some departures from reality are acceptable. The good news is that most things are operated in both physically and cognitively natural ways, with only minor departures.

One practical example is that you cannot move the sun visors all over the place. I'm not here to show off some XY manipulator wizardry. The real MU2 sun visor is a ball/socket joint with quite an impressive range of motion that is limited by the confines of the cockpit. We don't do 'collision detection' of our sun visors with the overhead... sorry! You click spot 1, it moves down; you click spot 2, it moves to the side window, and vice-versa. I didn't think about it, nor fight with it, in the real thing and do not want to do so in the sim either! Again, the overriding goal is immersion, and 95% of the time the pursuit of real, physical fidelity results in natural cognitive fidelity, but in cases where one interferes with the other, we depart from the real to cater to the cognitive, which to the philosophers out there, is the only one that matters.
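As a purely illustrative sketch (assumed names and values, not the shipping MU2 code), that click-spot approach might look something like this: each click target commands one of a few preset visor poses and the animation simply moves toward it at a fixed rate, so the motion still reads as physical without any free-form dragging.

```cpp
// Hypothetical preset-pose interaction instead of a free XY drag manipulator.
#include <algorithm>

enum class VisorPose { Stowed, FrontWindow, SideWindow };

struct SunVisor {
    VisorPose target = VisorPose::Stowed;
    float animPos    = 0.0f;   // 0 = stowed, 1 = front window, 2 = side window
};

// Called from the click-spot handlers: spot 1 toggles front window, spot 2 side.
void onClickSpot(SunVisor &v, int spot) {
    if (spot == 1) v.target = (v.target == VisorPose::FrontWindow)
                                  ? VisorPose::Stowed : VisorPose::FrontWindow;
    else           v.target = (v.target == VisorPose::SideWindow)
                                  ? VisorPose::FrontWindow : VisorPose::SideWindow;
}

// Per-frame: move toward the target pose at a fixed rate so the visor still
// visibly swings, even though the path itself is canned.
void animate(SunVisor &v, float dt) {
    float goal = (v.target == VisorPose::Stowed)      ? 0.0f
               : (v.target == VisorPose::FrontWindow) ? 1.0f : 2.0f;
    float rate = 3.0f * dt;                            // full travel in ~1/3 s
    v.animPos += std::clamp(goal - v.animPos, -rate, rate);
}
```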
Enough with the theory of game interactions. A quick word to wrap up on the cockpit variants all that gibberish above has been applied to. There are probably few, if any, MU2 Marquises out there with original panels; however, it would be a slight injustice not to provide that variant... or, to be completely honest, a variant of that variant. We may have mixed early and later model long-body features to better align with the practicalities and limitations of X-Plane. So the old-school radio fans have their version.

Next up is the GNS 430/530 GPS version. This simulates the most common upgrade to MU2s, and that is the addition of the 430/530 GPS units. A word of warning: an upgraded GPS doesn't mean an upgraded autopilot! It is easy to forget that autopilots and that "magenta line" are not the same thing. There is a bit of finagling to make things work, exactly as in the real thing. It's not unheard of for pilots to use the GPS to plan their flights and then simply adjust the HDG bug manually to keep the MU2 'around' the magenta line.

Finally, for those glass junkies who purchase(d) the G500/600 product by RealSimGear, you get the GLASS version available to you for ultimate situational awareness and the gee-whiz, Captain-Kirk lightshow. Next report, we'll talk a bit about the systems and engine simulation.

2 points
-
Here is how to refuel in non-persistent mode (which is the mode I always use). Select Ground services -> Refuel -> Internal panel, and Ground services -> Refuel -> Refuel truck. I put one window above the other, energize the internal panel, and set the required quantity in USG in the Preset field on the fuel truck. Pressing ENTER on the truck panel seems to be necessary to have the fuel truck stop at the preset quantity.

LBS to USG: a handy rule of thumb to convert lbs to USG is to divide by 10, then divide that by two, and add the two values together. E.g. 1500 lbs uplift required: 150 + 75 = 225 USG.

2 points
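A quick check of that rule of thumb (this snippet is just an illustration, not part of the add-on): lbs/10 + lbs/20 is the same as dividing by about 6.67 lb/gal, which is close to the commonly used ~6.7 lb per US gallon figure for Jet A.

```cpp
// Compare the rule of thumb with a straight density conversion.
#include <cstdio>

double ruleOfThumbUSG(double lbs) { return lbs / 10.0 + lbs / 20.0; }
double exactUSG(double lbs, double lbPerGal = 6.7) { return lbs / lbPerGal; }

int main() {
    double uplift = 1500.0;                                            // lbs, as in the example above
    std::printf("rule of thumb: %.0f USG\n", ruleOfThumbUSG(uplift));  // 225
    std::printf("at 6.7 lb/gal: %.0f USG\n", exactUSG(uplift));        // ~224
}
```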
-
I should elaborate more. There are two ways the autopilot can "work" with the GPS... so that's a bit of a loaded question. The GNS units have analog outputs so that they can drive older autopilots via the "CDI tracking" capability, the same way the autopilots track a VOR/LOC... but of course this means you have to adjust the OBS to the next course leg before every turn before the CDI guidance by the GPS can do its thing. However... I have decided to implement a "GPSS converter" for the GNS variant, which is available within X-Plane (going in right now, in fact)... and so there will be a GPSS/HDG button on the panel for that variant. The G600 variant has GPSS 'built in', as it were... no HDG/GPSS button needed; it's good to go driving the autopilot via GPSS.

-tkyler

1 point
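For anyone unfamiliar with the term, here is a rough, hypothetical sketch of what a GPSS converter does conceptually (assumed names and gains, sign conventions depend on how cross-track error is defined, and this is not the add-on's actual code): rather than the pilot twisting the OBS so the autopilot can chase an analog CDI needle, the GPS's desired track and cross-track error are turned directly into a roll-steering command.

```cpp
// Conceptual GPSS roll-steering sketch; gains and limits are illustrative only.
#include <algorithm>
#include <cmath>

static double angleDiff(double a, double b) {           // shortest diff, deg
    return std::fmod(a - b + 540.0, 360.0) - 180.0;
}

// desiredTrackDeg / crossTrackNM would come from the GPS flight plan.
double gpssBankCommand(double desiredTrackDeg, double currentTrackDeg,
                       double crossTrackNM) {
    // Track-angle error and cross-track error both pull the bank command
    // toward re-intercepting the magenta line.
    double trackErr = angleDiff(desiredTrackDeg, currentTrackDeg);
    double bank     = 1.0 * trackErr + 15.0 * crossTrackNM;
    return std::clamp(bank, -25.0, 25.0);                // respect a bank limit
}
```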
-
Beautiful! Will you be implementing RealityXP GTN/GNS integration?

1 point
-
Unlike in other aircraft types, in the Collins autoflight system VNAV is a modifier on the standard vertical modes (VS, FLC, etc.). It basically instructs the system to respect altitude constraints in the flight plan and, if enabled, use speed targets from the FMS. When descending in VPATH on an arrival and initial approach, similar to an ILS, once the plane is flying towards the final approach point for the RNAV/RNP, one can push the APPR button and VGP will arm/capture. Once active, the plane will descend on the final glide path below the preselected altitude. Hope that helps.

1 point
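To make the mode relationships concrete, here is a minimal, hypothetical sketch of that logic (assumed names and simplifications, not Collins' actual implementation): VNAV is just a modifier flag over the basic vertical modes, APPR on an RNAV/RNP approach arms VGP, and capture switches the active mode so the aircraft can descend below the preselected altitude.

```cpp
// Simplified mode-logic sketch of the behavior described above.
enum class VertMode { PITCH, VS, FLC, ALT, VPATH, VGP };

struct AutoflightState {
    VertMode active  = VertMode::PITCH;
    bool vnavEnabled = false;   // modifier: respect FMS altitude constraints / speeds
    bool vgpArmed    = false;
};

// VNAV button: doesn't select a mode of its own, it modifies VS/FLC/VPATH behavior.
void pressVNAV(AutoflightState &s) { s.vnavEnabled = !s.vnavEnabled; }

// APPR on an RNAV/RNP approach: arm VGP; it captures at glide-path intercept.
void pressAPPR(AutoflightState &s, bool rnavApproachActive) {
    if (rnavApproachActive) s.vgpArmed = true;
}

void onGlidePathIntercept(AutoflightState &s) {
    if (s.vgpArmed) {
        s.active   = VertMode::VGP;   // once active, the aircraft descends on the
        s.vgpArmed = false;           // final glide path below the preselected altitude
    }
}
```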