• Assuming that a tablet kneeboard would be a ways down the road, I was wondering if there was any way to re-purpose the moving map from the C172.


    Snipping out what seemed relevant from c172.tmd and inserting it into asg29.tmd, I was able to get the moving map to display on the Zander screen.


    [Attachment: asg29 moving map.bmp]


    The Zander definition has a "Page" element, so I thought I might be able to display the moving map as an alternate page. No luck: there does not seem to be a way to associate elements with pages.

    There is also currently no way to switch pages while flying. Whether the moving map displays depends on the precedence order in the RenderList, and it supersedes the Zander output display.
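    To illustrate the precedence problem, the arrangement was roughly as follows. This is only a sketch from memory; the class and property names here are illustrative guesses, not verified TMD syntax:

    ```
    // hypothetical TMD fragment, names are guesses:
    <[instrument_display][ZanderScreen][]
        // both objects draw to the same display; the later entry
        // in the RenderList wins, so MovingMap covers ZanderOutput
        <[string8][RenderList][ ZanderOutput MovingMap ]>
    >
    ```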


    The Zander screen is also rather small and does not appear to have any way to zoom the map.


    An interesting exercise in poking at the aircraft object model, but I don't think this will be a useful approach for a moving map.


    I guess I will just have to wait for the tablet kneeboard. :(

    i7-6700K CPU @ 4.00GHz | ASUS Z170-A | 16Gb DDR4 | Samsung SSD 950 PRO NVME M.2 256GB | Samsung SSD 850 EVO 1TB | GeForce GTX 1080 Ti on GP102-A GPU | Oculus CV1 | Windows 10

  • I like the idea of having a moving map in the ASG29, though to be honest the real-world Zander does not have a colored moving map. It has a basic map showing your track and airspaces.


    The pages in the Zander graphics object are not completed yet, so you can't switch to a moving map from there; that has to happen before the render call. Instead of adding both the moving map and the Zander to the render list, try adding a render switch in between. That render switch has an input select, for which you can use a page id...
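    A sketch of what that might look like. All class and property names below are illustrative placeholders, not confirmed TMD syntax; check the SDK sample aircraft for the real render switch object and its properties:

    ```
    // hypothetical render switch, names are guesses:
    <[graphics_renderswitch][PageSwitch][]
        // Input selects which child object gets rendered;
        // wiring a page id here would toggle the two displays
        <[string8][Input][Zander.ActivePage]>
        // child 0 = Zander output, child 1 = moving map
        <[string8][RenderObjects][ ZanderOutput MovingMap ]>
    >
    ```

    The render switch would then go in the render list in place of the two individual objects, so the page id decides which one reaches the screen.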

  • I realize adding the moving map to the zander is a break with reality, but jumping back to the menu and selecting the Location page kind of breaks the immersion as well.


    A tablet/smartphone kneeboard would be ideal as it could be zoomed and moved closer to the POV to overcome the VR resolution issues (and help my challenged eyesight) but I am not holding my breath.


    I was poking in the dark trying to infer the object interactions.

    Is there any object model documentation that IPACS would be willing to share as part of the SDK?


  • Sorry, I should have prefaced the whole discussion that this is in VR where the popup map does not appear.


  • This same conversation has come up a few times before. We ask that you please be patient while this is developed for VR. It's still going to be some time before it's completed.

    We are aware of this, and it's an important feature that we know needs to be in place for VR users; it's just not ready yet.

    IPACS Development Team Member

    I'm just a cook, I don't own the restaurant.
    On behalf of Torsten, Marc, and the rest of the IPACS team, we would all like to thank you for your continued support.


    Regards,


    Jeff