Is the most GPU-demanding ORBX DLC = Chicago?

  • I noticed last night while trying out Chicago-Meigs that quick turns cause stutters, but when I use the same graphics settings (U,U,U,M,M,M) in any other ORBX areas, quick turns are smooth (no stutters). Is this just because Chicago-Meigs is the only ORBX scenery with a bunch of tall buildings?

    I've flown around NYC a lot, which is not ORBX, and also had to use lower graphics settings (U,H,H,M,L,L). I was hoping tall buildings in ORBX scenery would do better, but maybe not. I have Render Scale @ 1.33 with my Samsung Odyssey.

    Is this because each building is unique, versus the hand-placed houses in ORBX Monterey, which are just duplicates of each other? Or is it because each building is tall / has more volume?

    Maybe someday we'll get autonomous graphics settings management, where the user just selects one of the following: NO STUTTERS --- SOME STUTTERS/BETTER DETAIL --- MORE STUTTERS/MAX DETAIL, and then the software dynamically adjusts to the scenery encountered along the route (take off at a small airport and land at a big-city airport).

  • I would say yes, Orbx Chicago is the most demanding scenery, not only because of the custom buildings and endless autogen, but also because it seems to need to load scenery more often than anywhere else.

    That's just my observation.

  • What with AI in software these days, perhaps we'll see an option like "auto-framerate" on/off, where you can opt to have the software automatically reduce various quality settings (beginning with the least noticeable) to keep the frame rate at a given level. I'm not sure if a bit of this is currently in FS2, but I believe it is in other gaming software. WMR has a bit of auto-framerate built into it, but I still see a small degree of juddering in ORBX Monterey and Innsbruck, in both VR and 2D/monitor, when making higher-rate turns, that is not there on regular IPACS sceneries. If AI framerate can be implemented, perhaps when in VR it could degrade the outside edges first as blurring of the peripheral outer 40% of the image already occurs due to the "sweet spot" nature of Fresnel lenses in HMDs.
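
    To make the idea concrete, here is a minimal sketch of what such an "auto-framerate" controller could look like: a feedback loop that watches smoothed frame time and steps the least-noticeable quality setting down (or back up) to hold a target budget. Nothing here is a real FS2 or WMR API; the class, setting names, and thresholds are all made up for illustration.

    ```cpp
    // Hypothetical auto-framerate controller sketch (not actual FS2/WMR code).
    #include <cstdio>
    #include <string>
    #include <vector>

    struct QualitySetting {
        std::string name;   // e.g. "shadow resolution"
        int level;          // current level, 0 = lowest
        int maxLevel;
    };

    class AutoFramerate {
    public:
        // Settings should be ordered least noticeable first.
        AutoFramerate(std::vector<QualitySetting> settings, double targetMs)
            : settings_(std::move(settings)), targetMs_(targetMs), avgMs_(targetMs) {}

        // Call once per frame with the measured frame time (ms).
        void update(double frameMs) {
            // Smooth the measurement so one bad frame doesn't cause thrashing.
            avgMs_ = 0.9 * avgMs_ + 0.1 * frameMs;

            if (avgMs_ > targetMs_ * 1.05) {          // over budget: degrade
                for (auto& s : settings_) {           // least noticeable first
                    if (s.level > 0) { --s.level; report("lowered", s); return; }
                }
            } else if (avgMs_ < targetMs_ * 0.80) {   // plenty of headroom: restore
                for (auto it = settings_.rbegin(); it != settings_.rend(); ++it) {
                    if (it->level < it->maxLevel) { ++it->level; report("raised", *it); return; }
                }
            }
        }

    private:
        void report(const char* what, const QualitySetting& s) {
            std::printf("%s %s to level %d (avg %.1f ms)\n",
                        what, s.name.c_str(), s.level, avgMs_);
        }
        std::vector<QualitySetting> settings_;
        double targetMs_;
        double avgMs_;
    };

    int main() {
        // Target 11.1 ms per frame (90 fps, a common VR budget).
        AutoFramerate ctrl({{"shadow resolution", 3, 3},
                            {"autogen density",   3, 3},
                            {"render scale",      3, 3}},
                           11.1);

        // Fake frame times: smooth cruise, then a heavy downtown turn, then recovery.
        for (double ms : {9.0, 9.5, 14.0, 15.0, 16.0, 15.5, 10.0, 8.5, 8.0, 7.5})
            ctrl.update(ms);
    }
    ```

    The ordering matters: detail is dropped starting with whatever the user would notice least, and restored in the opposite order once there is headroom again.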

  • ... If AI framerate can be implemented, perhaps when in VR it could degrade the outside edges first as blurring of the peripheral outer 40% of the image already occurs due to the "sweet spot" nature of Fresnel lenses in HMDs.

    This is called foveated rendering and is very likely something we will see in VR HMDs in the future. The idea is that you combine this with eye tracking, so it follows your pupil and only renders the center of your field of vision at max quality, matching the sharpness in your peripheral vision to the eye's capabilities so you're not wasting GPU resources in those areas. This will probably be a necessary technology to enable much higher resolution in future VR devices without sending the system requirements through the roof! :)
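
    At its core it's just a mapping from "angular distance from where the eye is looking" to "how much detail to spend". Real implementations use GPU features like variable rate shading and a headset's eye-tracking SDK; the toy sketch below only shows the eccentricity-to-detail idea, and every number and name in it is illustrative rather than taken from any actual API.

    ```cpp
    // Toy foveated-rendering sketch: full resolution near the gaze point,
    // progressively coarser further out. Purely illustrative values.
    #include <cmath>
    #include <cstdio>

    // Returns a resolution scale for a sample point, given its angular distance
    // (degrees) from where the (hypothetical) eye tracker says the user is looking.
    double shadingScale(double eccentricityDeg) {
        if (eccentricityDeg < 10.0) return 1.0;   // fovea: full detail
        if (eccentricityDeg < 25.0) return 0.5;   // near periphery: half resolution
        return 0.25;                              // far periphery: quarter resolution
    }

    int main() {
        // Gaze direction reported by the eye tracker, in degrees from lens center.
        const double gazeX = 5.0, gazeY = -2.0;

        // Sample a few points across a ~100-degree field of view.
        for (double x = -50.0; x <= 50.0; x += 25.0) {
            for (double y = -50.0; y <= 50.0; y += 25.0) {
                double ecc = std::hypot(x - gazeX, y - gazeY);
                std::printf("(%+6.1f,%+6.1f) deg -> %4.2fx resolution\n",
                            x, y, shadingScale(ecc));
            }
        }
    }
    ```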

  • :thumbup: Thanks! I'm learning more than I ever thought I'd need to know, but it keeps the aging synapses firing, right?

    p.s. I'm looking to exploit my VR hardware with other types of sims, and am checking out a racing sim called "Project Cars 2". Absolutely stunning graphics. Both the car models and the racetracks are created by laser scanning. Perhaps this is a technology that could be exploited for FS2 models at some point. The future of VR is definitely looking good. I'm not sure if IPACS ever got a WMR headset for R&D on FS2 compatibility, but it appears from the racing sims I'm looking at that if it works with VIVE via SteamVR, it will also work with WMR via SteamVR, even though WMR is not called out as supported. Interestingly, I'm seeing quite a few reviews where the author had a Rift or VIVE and switched over to the WMR Odyssey for higher resolution and performance.