There is no technical difference between supporting 2 screens and 3 or more screens.
Thanks for trying. Could you elaborate on the problems, or is there existing documentation?
I am not really a developer, but since multiple systems do this, I have a hard time grasping the difficulties. My personal workaround is a DP to 2 x HDMI adapter, which presents the two monitors as one double-wide display to the system.
Intel provides a lot of documentation for their video cards. To write a video driver, one has to read it (at least the relevant chapters) and implement it.
For the main display, the BIOS already does a lot of the bringup needed to get a picture. This means our intel_extreme driver can get away with skipping a lot of the steps, as it finds the hardware mostly ready to display things. But if the BIOS doesn't initialize the other displays, they have to be set up with all the steps.
On earlier devices this was relatively simple. But now there is a variety of display protocols (HDMI, DisplayPort, eDP for laptops, VGA, not even mentioning the crazy displays over Thunderbolt or USB). Each of them needs different programming in the driver. And each generation of video card also comes with some changes: more displays get supported (when I started playing with the intel_extreme driver, hardware had support for at most two displays; now I think it goes up to five), more protocols, a different organization of the hardware (we went from a 3-chip design with a CPU, northbridge and southbridge to an all-in-one unit), …
So, for each new generation of cards, the new documentation has to be reviewed, and the driver adjusted to take it into account.
And, as long as you don't have the entire chain of initialization steps done, you get nothing but a black screen. So this is not very easy to debug.
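To make that "entire chain" concrete, here is a rough sketch in C++. This is purely illustrative: the step names are invented for the example (they are not actual intel_extreme symbols), and the stub stands in for real register programming.

```cpp
#include <string>
#include <vector>

// Illustrative only: the step names below are made up for this sketch
// and are not real intel_extreme functions or registers.
struct Display {
    std::vector<std::string> completed;
    bool scanningOut = false;
};

// Stub standing in for the actual register programming of one step.
static bool RunStep(Display& d, const std::string& step)
{
    d.completed.push_back(step);
    return true;
}

// Every step must succeed, in order; if any one is skipped or wrong,
// the result is a black screen with no feedback at all, which is what
// makes bringing up a secondary display so hard to debug.
bool BringUpDisplay(Display& d)
{
    for (const char* step : { "enable_power", "program_pll",
            "configure_pipe_timings", "configure_output_encoder",
            "set_framebuffer_base", "enable_plane" }) {
        if (!RunStep(d, step))
            return false;
    }
    d.scanningOut = true;
    return true;
}
```

The point of the sketch is only the shape of the problem: a strict sequence with no partial success, and every protocol and hardware generation changing what the individual steps must do.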
NVIDIA provides a ready-to-use API: open-gpu-kernel-modules/src/nvidia-modeset/interface/nvkms-api.h at main · NVIDIA/open-gpu-kernel-modules · GitHub
It is already ported to Haiku, but it has some issues with DisplayPort support, probably because of missing timer support.
An NVIDIA GPU is currently the best option for experimenting with advanced display support such as multiple monitors, hot-plug, proper v-sync, etc.
Thank you very much for this answer, it helps with understanding the problems.
And thanks for your great work in general, I very much enjoy Haiku. Hopefully this nut will be cracked one day!
"Never send software to do a hardware's job"
Could you post a screenshot of your Screen preference panel, so we could see what driver is picking this up?
In addition to all the different GPU types (cards, on-chip, …), there's also Haiku's internal support.
Even though I have both DisplayPort and HDMI output on my machine (on-chip GPU) and both work with Haiku, they do not both show a picture at the same time (and I bet if I had a VGA port, that would also work). This is because Haiku currently supports only B_MAIN_SCREEN; so support in Haiku needs to be at least enabled, if already implemented.
There will likely be some applications that are incompatible with multiple displays, because they were written for a single display only, so this is an "external" issue. I can imagine that applications using BDirectWindow might have difficulties, and perhaps some ported software and libraries as well. -So the task might be bigger than expected.
… However, since Linux has great support for multiple displays, showing the same image on all connected displays during boot, U-Boot, GRUB and friends might contain some interesting code. I do not know, however, if they only support text mode (I think Ubuntu shows some splash/animation during boot, IIRC).
Haiku supports basic dual screen, even BeOS does if you install the needed drivers. I have that set up on my old Athlon XP machine from 2003, with a Radeon 9250 card.
It's not perfect, but you can set the video modes and display things on both displays. This is good enough to at least get started with writing drivers for modern cards with the same level of support as the older ones. Once we have that working, we can improve Haiku's support, and make sure the displays are exposed as BScreen objects, that BAlert and other windows are centered on the "main" display and not in the middle of the entire framebuffer (likely just in between two displays), etc.
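The centering problem is simple geometry, so here is a throwaway illustration (plain C++, not the Haiku API): with two 1920x1080 displays side by side, the combined framebuffer is 3840x1080, and "center in the framebuffer" is not the same as "center on the main display".

```cpp
// Plain C++ illustration, not the Haiku API: why centering on the whole
// framebuffer differs from centering on one display when two 1920x1080
// displays sit side by side (combined framebuffer 3840x1080).
struct Rect {
    int left, top, width, height;
};

// Center a w x h window inside the given area.
Rect CenterIn(const Rect& area, int w, int h)
{
    return { area.left + (area.width - w) / 2,
             area.top + (area.height - h) / 2, w, h };
}
```

Centering a 400x300 alert in the 3840x1080 framebuffer puts its left edge at x = 1720, so it straddles the seam between the displays at x = 1920; centering it on the 1920x1080 main display puts it at x = 760, fully on one screen.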
But the blocking point is mainly on the driver side.
Had thought of this tool a long time ago. Possibly something for future dev:
See: PowerDesk for DualHead2Go | Desktop Management Software | Matrox Video
If that is the case, fine. I had expected that handling more than two displays would require more work.
My 2¢ here. KISS KISS KISS KISS KISS
If you want more people using Haiku, you NEED to make things as simple as possible for the majority even if it feels painful programming it.
Most people will realize that if they have two monitors with different physical sizes and different DPIs, a window spanning both monitors is going to look larger on one and smaller on the other. If they want them to match, there should be a KISS way to change the resolution on EITHER monitor to match as closely as possible (the GOOD shouldn't be sacrificed for the PERFECT).
People WILL understand that two different monitors will NOT look the same or have the same resolutions PERFECTLY together. They won't like it, but they WILL understand it. OR, just say in the Preferences that it is impossible for "these two" monitors to look exactly the same, because they are 1) different sizes of monitors and 2) too different in native dots per inch (phrased in the user's native language so they understand it; most don't know what DPI is. Just saying, from 40 years of programming/systems-analysis work. KISS KISS KISS).
That's my 2¢. This post can be redeemed for 1/10th of one cent (not really); coupons used to say something like that on them in the 1980s for some reason or another.
In some cases, it's (almost) impossible for the hardware displaying the image to know the DPI.
Most monitors would know, but how about a projector? It can be moved closer to or farther away from the surface it's projecting onto. No, it's not impossible for a projector to know the DPI, as it could measure the distance to the surface, but such hardware would make the projector more expensive, and I doubt anyone would really care that much about DPI to pay the extra fee.
OK, that aside: now we connect a VGA splitter to our computer, and the splitter is connected to a monitor and two (or more) projectors. Which device's DPI should be used?
-Yes, it could be solved by having a preference panel where you could choose the display that would inform about the DPI (and override it by manually entering the HDPI and VDPI).
-So, if the DPI problem should be solved in an easy way, I'd say: 1: ask the monitor, 2: always allow the user to override (possibly in "advanced settings").
DPI can only be reliably determined by the user physically measuring the screen width and height; everything else is basically guessing. Technically, monitors have this info encoded, but in practice they almost always carry nonsense values.
I agree completely. Itās also necessary to know both horizontal and vertical DPI, as they often differ.
One would have to assume that if only a single value is given, both are the same (it's not always the case, though).
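For what it's worth, the arithmetic for deriving DPI from the physical size a monitor reports (EDID gives it in millimetres) is trivial. The 531 x 299 mm figures used below are just typical values for a 24-inch 1920x1080 panel, not from any specific monitor.

```cpp
// DPI from a monitor's reported physical size: pixels divided by
// inches, where inches = millimetres / 25.4.
double DpiFromSize(int pixels, double sizeMm)
{
    return pixels / (sizeMm / 25.4);
}
```

For a typical 24" 1920x1080 panel (about 531 x 299 mm) this gives roughly 91.8 DPI both horizontally and vertically; the two values happen to be close here, but they are not guaranteed to match in general, which is why tracking HDPI and VDPI separately makes sense.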
Off-topic: And then we have the web browsers, that always report the DPI as being … was it 72?
(What's the purpose of querying a value when it's always wrong anyway? Yes, I know about the fingerprinting junk, but they find ways to fingerprint you anyway, and now all of the great browser features have been smashed to pieces.)
That's where the user has an obligation to use their eyes and not worry about DPI, and HOPE that the people writing the preferences panel have given them enough options to get the best out of the displays they have, no matter what kind they have.
All users NEED is the ability to adjust both monitors until they get the best combination of settings for both of them that they can, without having to figure out what DPI or any other nerd letters mean. Ideally they can see all the video resolutions that the video card and monitor are supposed to be able to do, but like you said, not all monitors and video cards talk to the OS and tell it everything. Sometimes you just don't have everything that you want/need. And it isn't like we can trust AI to search through the history of monitors, gather all the useful specs, and put them in a small enough and fast enough database where the OS is smart enough to figure it out for the user.
PS: I DO NOT trust AI at this point, because Altman, Musk, Zuckerberg and the rest have PROVEN to be unreliable with our trust. That's a FACT! So how can we trust their software? I can't. Not yet. I might not be alive (I'm 66 years old) before they can, if ever, be trusted.
All my monitors have correct physical metric size information in EDID.
One thing I'd actually like very much is to be able to create custom resolutions (including support for pixel-doubling/tripling/quadrupling).
Sometimes on Linux, even standard resolutions that my monitors support are not available (and I haven't figured out how to make modelines for this, though I've made many custom resolutions for other things).
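A modeline is less magic than it looks: it is essentially a pixel clock plus total (visible + blanking) horizontal and vertical dimensions, and the refresh rate follows directly from them. A minimal sketch of that relationship:

```cpp
// A video mode boils down to a pixel clock and total scan dimensions;
// refresh rate = pixel clock / (htotal * vtotal).
struct Mode {
    double pixelClockMHz;
    int hTotal;   // visible width plus horizontal blanking
    int vTotal;   // visible height plus vertical blanking
};

double RefreshHz(const Mode& m)
{
    return m.pixelClockMHz * 1e6 / (double(m.hTotal) * m.vTotal);
}
```

For example, the standard CVT reduced-blanking mode for 1920x1080 at 60 Hz has a 138.5 MHz pixel clock with totals of 2080x1111, which works out to about 59.93 Hz. Pixel-doubling would keep such monitor-side timings unchanged while the framebuffer itself is only 960x540.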