tilcdc and double buffering

We are attempting to bring up a BeagleBone Black-based design and are rolling our own image. So far this has gone rather well, but we have hit one stumbling point: getting the display driver configured properly. Currently our device tree contains the following entry:

```
&lcdc {
	status = "okay";
};

/ {
	panel {
		status = "okay";
		compatible = "ti,tilcdc,panel";
		pinctrl-names = "default";
		pinctrl-0 = <&nxp_hdmi_bonelt_pins>;
		panel-info {
			ac-bias = <255>;
			ac-bias-intrpt = <0>;
			dma-burst-sz = <16>;
			bpp = <32>;
			fdd = <0x80>;
			sync-edge = <0>;
			sync-ctrl = <0>;
			raster-order = <1>;
			fifo-th = <0>;
		};

		display-timings {
			native-mode = <&timing0>;
			timing0: 1024x768 {
				clock-frequency = <65000000>;
				hactive = <1024>;
				vactive = <768>;
				hfront-porch = <110>;
				hback-porch = <90>;
				hsync-len = <30>;
				vback-porch = <22>;
				vfront-porch = <12>;
				vsync-len = <4>;
				hsync-active = <1>;
				vsync-active = <1>;
				de-active = <1>;
				pixelclk-active = <0>;
			};
		};
	};
};
```

This is a pretty basic sample that I have found referenced in various resources, and we do get output at that resolution on a monitor connected to the controller. The problem is that I cannot figure out how to make the driver allocate enough framebuffer memory by default to support a virtual y resolution of twice the panel height, which would give user space the correct buffer space for double buffering right off the bat. I have example images where this works: a build from Robert C. Nelson appears to be set up for triple buffering, and various TI builds let us adjust the virtual resolution on the fly. I either don't have the source for those builds or can't quite find what change makes that happen. If anyone has any insight on this, that would be fantastic.