1680x1050x60 Resolution via HDMI

I’ve been working for some time on trying to achieve the highest resolution possible for a BeagleBone Black system I’m setting up as a dashboard running Chronograf.

I’ve tried multiple operating systems and builds (including a brief stint with FreeBSD), but what I’m using now is a Debian 10.7 console build (4.19.94-ti-r57) downloaded from Robert Nelson’s server (BBB-blank-debian-10.7-console-armhf-2021-01-11-1gb.img.xz). FWIW, none of the versions I’ve tried (including those with LXQT) seem to make much difference with regard to the issue I’m facing…

The BBB seems to really want to default to 1280x1024x60. I can’t get the resolution I want - which is 1680x1050 (I really want 1920x1080, but I suspect my monitor won’t support 24 or 30 Hz).

I’ve been able to get 1440x900x60 by adding “video=HDMI-A-1:1440x900@60e” to “cmdline” in uEnv.txt (full line shown below). Unfortunately, setting this to 1680x1050@60e seems to cause the resolution to revert to 1280x1024. BTW - what does the “e” suffix do? I’ve tried with and without, and haven’t figured out what its significance is.
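For reference, the full line in /boot/uEnv.txt ends up looking something like this (the other flags are whatever the stock image ships with; mine may differ from yours):

    cmdline=coherent_pool=1M net.ifnames=0 quiet video=HDMI-A-1:1440x900@60e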

I don’t see 1680x1050 in the list at https://elinux.org/Beagleboard:BeagleBoneBlack_HDMI#Supported_Resolutions, but from what I’ve read it should work?

I read somewhere that the available resolutions/bandwidth are further limited when the HDMI audio channel is enabled. So I’ve also tried disabling HDMI audio (I don’t use it) by un-commenting “disable_uboot_overlay_audio=1” in uEnv.txt (snippet below).
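That option lives in the overlay section of the same uEnv.txt - quoting from memory of the stock file, so the neighboring lines may differ on yours:

    ###Disable auto loading of virtual capes (emmc/video/wireless/adc)
    #disable_uboot_overlay_emmc=1
    #disable_uboot_overlay_video=1
    disable_uboot_overlay_audio=1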

My monitor, by the way, is an older HP Compaq LA2405x. I have confirmed that it supports 1680x1050 at 60 Hz. It is connected to the BBB via a micro HDMI to HDMI cable, then an HDMI to DVI cable. I’m nearly certain that the monitor isn’t the issue, as it will report on-screen if an input resolution/frequency is out of range (and I’m not getting that).

I’ve also tried setting the resolution for my Xorg session, but from what I can tell, the driver (fbdev?) doesn’t permit changing resolution. When I run xrandr, it indicates that my minimum and maximum resolutions are both equal to the resolution I’ve already forced via uEnv.txt.
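For completeness, this is roughly the sequence I tried (substitute the output name xrandr actually reports - under fbdev it may just be “default” - and the last two steps are where it fails for me):

    cvt 1680 1050 60
    xrandr --newmode "1680x1050_60.00" 146.25 1680 1784 1960 2240 1050 1053 1059 1089 -hsync +vsync
    xrandr --addmode HDMI-1 "1680x1050_60.00"
    xrandr --output HDMI-1 --mode "1680x1050_60.00"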

Forgive me if I’ve missed important details or this is a bit scattered. Most of this effort occurred over the weekend (before I had access to this group), so I’m trying to re-visit the key steps I took (ignoring many that appeared to have no impact).

I don't remember where this is explained in the u-boot/kernel code,
but the BeagleBone Black's AM335x just doesn't have enough memory
bandwidth to fully support "large" screen sizes.

This is why 1080p maxes out at 30 fps, or 24 fps when audio is also involved.

The 'e' is documented here:

https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/tree/Documentation/fb/modedb.rst?h=v5.11-rc4

"DRM drivers also add options to enable or disable outputs:

'e' will force the display to be enabled, i.e. it will override the detection
if a display is connected. 'D' will force the display to be enabled and use
digital output. This is useful for outputs that have both analog and digital
signals (e.g. HDMI and DVI-I). For other outputs it behaves like 'e'. If 'd'
is specified the output is disabled."...

Regards,

Robert,

Thank you so much for the quick reply and clarification regarding enabling/disabling outputs. Makes sense.

But I’m still not sure why I can’t get 1680 x 1050.

I believe the AM335x pixel clock is limited to 126.5 MHz. If I understand how this works, and the relationship is 1:1 (one pixel per clock; color is communicated via parallel bits), then 1920 x 1080 x 60 Hz = 124.4 MHz - just barely below the limit, with no headroom. So it’s clear why we can’t do 1080p at 60 Hz.

That said, 1680 x 1050 x 60 Hz = 105.8 MHz. That should be comfortably below the limit, right? What am I missing?
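Spelling out that naive arithmetic (visible pixels only, my own math):

    1920 x 1080 x 60 Hz = 124,416,000 ≈ 124.4 MHz
    1680 x 1050 x 60 Hz = 105,840,000 ≈ 105.8 MHz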

BTW - While I didn’t have success with it in prior iterations, I just tried “video=HDMI-A-1:1920x1080@30e”. I didn’t get video, but I did get my monitor indicating “Input Signal Out of Range” - which tells me the BBB is doing its job; my monitor just can’t handle 30 Hz (not unexpected). I’m not sure what’s different this time around; I swear in the past it would just revert to 1280x1024 when I tried that setting, but maybe I’m misremembering some detail.

I’ve screwed up something in my math above…sorry. Reading up on pixel clocks.

My apologies. I think I was over-simplifying my assumptions about the pixel clock: it has to clock out the blanking intervals as well as the visible pixels, so the real rate is the total raster size times the refresh rate. Not knowing the exact timing standards, I think the “cvt” tool in Linux gives me the answers I need.

The resulting pixel clock for each is the “pclk” value in the output below.
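Each was generated with the cvt tool, e.g.:

    cvt 1920 1080 60
    cvt 1920 1080 30
    cvt 1680 1050 60
    cvt 1440 900 60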

For 1920x1080x60:

1920x1080 59.96 Hz (CVT 2.07M9) hsync: 67.16 kHz; pclk: 173.00 MHz

Modeline "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync

For 1920x1080x30:

1920x1080 29.95 Hz (CVT) hsync: 33.01 kHz; pclk: 79.75 MHz

Modeline "1920x1080_30.00" 79.75 1920 1976 2168 2416 1080 1083 1088 1102 -hsync +vsync

For 1680x1050x60:

1680x1050 59.95 Hz (CVT 1.76MA) hsync: 65.29 kHz; pclk: 146.25 MHz

Modeline "1680x1050_60.00" 146.25 1680 1784 1960 2240 1050 1053 1059 1089 -hsync +vsync

For 1440x900x60:

1440x900 59.89 Hz (CVT 1.30MA) hsync: 55.93 kHz; pclk: 106.50 MHz

Modeline "1440x900_60.00" 106.50 1440 1528 1672 1904 900 903 909 934 -hsync +vsync
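Checking that against the 1680x1050 modeline above (my own arithmetic, not cvt’s): the pixel clock covers the full raster, blanking included, times the refresh rate:

    2240 total clocks per line x 1089 total lines x 59.95 Hz ≈ 146.2 MHz

…which matches cvt’s 146.25 MHz pclk, and is well over the 126.5 MHz limit.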

So, assuming I’m reading this right, this clears up what I can and cannot do: 1440x900@60 (106.50 MHz) and 1920x1080@30 (79.75 MHz) fit under the pixel clock limit, while 1680x1050@60 (146.25 MHz) and 1920x1080@60 (173.00 MHz) are well over it…

Apologies for the confusion.