Multi-monitor support

Is it possible to use this for multi-monitor support, with a different screen on each LCD? I mean not multi-user, but one display connected to VGA, another to HDMI, and so on? And all with the current VESA driver?

(I’ve split your question into a new topic because it was unrelated to the one you posted in.)

It is currently not possible as VESA does not support this, no. @PulkoMandy has been slowly working on it in the intel_extreme driver, with the eventual goal of supporting it in app_server, indeed.

Actually, the intel_extreme and radeon_hd drivers already support “mirror” screen mode on some hardware.

Mirror mode is not useful. I, and probably the topic author, want to have a separate desktop on each screen and be able to move windows between screens.

intel_extreme is fine on my Intel-based ThinkPad T60, and LCD brightness works too. This laptop has an optional external VGA port.
My Asus VivoBook only boots Haiku when I use the 64-bit EFI framebuffer.

True, I asked because I saw this Multi-desktop test_app_server, and the first thing that came to mind was multi-monitor support.

I’m an ordinary user :D, not a programmer.

The multi-screen test_app_server is useful for developing and debugging multi-desktop support (one desktop per screen) before the video drivers are ready. Actual multi-monitor support can’t be done without proper video drivers.

Thanks for the answer. I love Haiku how it is now, and I prefer to test it from USB. For now I have this ThinkPad for that, where the Intel drivers work, and Wi-Fi and Ethernet too.

I know, I meant that the drivers themselves know how to set up the second display. Actually I recall kallisti5 saying that the radeon_hd accelerant and driver have all the necessary plumbing for true multi-display support, but the accelerant interface and app_server itself do not have a way to describe and handle that. I think intel_extreme is not quite to that level, but it probably will be soon.

If the driver has everything required for multi-monitor support, why not add an accelerant API for it? If the proposed API turns out to be bad, it can easily be refactored, because it has few clients.

There was a draft, yes: https://review.haiku-os.org/c/haiku/+/329

However, it’s not complete, there are a bunch of unresolved comments. Nobody has gone back to it.

We already have multi-monitor support in the old Radeon (not radeon_hd) driver and, I think, the Matrox driver from rudolfc. However, it was done at a time when BeOS compatibility for the drivers was desirable, so it doesn’t need any change to app_server or the accelerant API. How it works is: there are some extensions that allow the Screen preferences (or a modified version of it, I don’t remember if we have integrated it in Haiku) to configure the screen mode separately for each display. The video card manages one single large framebuffer and presents itself to app_server as one large display. This works reasonably well, with the obvious problems: no way to maximize a window to just one screen, CenterOnScreen puts dialogs in between the two displays, etc.
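
To make that trick concrete, here is a minimal sketch, assuming the standard display_mode structure from Accelerant.h (the numbers are illustrative): two 1024x768 panels side by side are reported to app_server as a single 2048x768 virtual display, and each CRTC scans out one half.

```cpp
#include <Accelerant.h>

// Illustrative only: app_server sees one 2048x768 frame buffer; the
// driver internally programs each CRTC to scan out half of it.
display_mode
make_stretched_mode()
{
	display_mode mode = {};
	mode.virtual_width = 2048;		// both panels together
	mode.virtual_height = 768;
	mode.timing.h_display = 1024;	// what a single CRTC scans out
	mode.timing.v_display = 768;
	mode.space = B_RGB32;
	// The driver would set the second CRTC's scan-out start to
	// x = 1024, so the right half of the frame buffer appears there.
	return mode;
}
```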

The addition of a screen_id as linked by waddlesplash is the way to go, but we should try to add it in a way that doesn’t break the ABI. (I don’t remember: depending on the order in which parameters are pushed on the stack, this would be either the first or the last argument of the function, so that it doesn’t shift the other arguments out of place, and it may or may not be possible depending on the calling convention.) We can also decide to adjust all drivers and ignore that problem, as it is unlikely anyone is currently using Haiku with BeOS graphics drivers.
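
A hedged illustration of the ordering point (hypothetical names, not the actual patch linked above): accelerant hooks are plain C functions, and under a stack-based cdecl convention arguments are pushed right to left with the caller cleaning up, so appending the new parameter at the end leaves the stack offsets of all existing parameters unchanged; an old accelerant simply never reads the extra argument.

```cpp
#include <Accelerant.h>

// Existing hook, as BeOS-era accelerants implement it:
extern "C" status_t set_display_mode(display_mode* modeToSet);

// Hypothetical extended hook: the screen id is appended as the LAST
// parameter. With cdecl, modeToSet keeps its stack offset and the
// caller pops the extra argument, so calling an old-style
// implementation through this signature stays harmless.
extern "C" status_t set_display_mode_multi(display_mode* modeToSet,
	uint32 screenID);
```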

On my side, I’ve indeed been working on the driver side first, because once I have clone mode working, it won’t be too hard to then have each display show a different thing (with the way I’m doing “clone” mode, I already have two independent display pipes running, so it’s literally just changing the pointer to the framebuffer for one of them).
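
As a sketch of what “changing the pointer” means, with helper and register names modeled on the intel_extreme accelerant (the surface register offsets are the documented Intel DSPASURF/DSPBSURF ones, but treat the details as illustrative):

```cpp
#include <SupportDefs.h>

// Assumed helper, modeled on the intel_extreme accelerant:
extern void write32(uint32 reg, uint32 value);

#define INTEL_DISPLAY_A_SURFACE 0x7019c	// illustrative offsets
#define INTEL_DISPLAY_B_SURFACE 0x7119c

void
set_clone_mode(uint32 frameBuffer)
{
	// Clone mode: both pipes scan out the same buffer.
	write32(INTEL_DISPLAY_A_SURFACE, frameBuffer);
	write32(INTEL_DISPLAY_B_SURFACE, frameBuffer);
}

void
set_independent_mode(uint32 bufferA, uint32 bufferB)
{
	// Independent displays: only pipe B's pointer changes.
	write32(INTEL_DISPLAY_A_SURFACE, bufferA);
	write32(INTEL_DISPLAY_B_SURFACE, bufferB);
}
```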

If the ABI needs to be preserved, some global or TLS state could be used to select the display, like in the OpenGL API.
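
For instance (a minimal sketch with hypothetical names, none of which exist in Haiku today), a “current display” selector in the spirit of OpenGL’s bind-then-operate model; existing hook signatures stay untouched, and old accelerants that don’t export the new hook keep working:

```cpp
#include <SupportDefs.h>

// Hypothetical: which display subsequent hook calls operate on.
static uint32 sCurrentDisplay = 0;

// Hypothetical new hook, looked up separately by app_server; an
// accelerant that doesn't export it is treated as single-display.
extern "C" void
select_display(uint32 id)
{
	sCurrentDisplay = id;
}

// Existing hooks (set_display_mode(), etc.) would then read
// sCurrentDisplay internally instead of gaining a new parameter.
```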

Can you point to the place in the code?

Somewhere around here: https://git.haiku-os.org/haiku/tree/src/add-ons/accelerants/intel_extreme/mode.cpp#n471 and here: https://git.haiku-os.org/haiku/tree/src/add-ons/accelerants/intel_extreme/mode.cpp#n114

There is probably a bit more needed to have this work with app_server, of course, but one could already change the “framebuffer base” for one of the displays and see the effect. Then we need to allocate an independent framebuffer for each display (calling intel_allocate_memory twice) and finally have app_server draw on both of them. From the driver side, that’s about it for basic support (then there are more things to do: setting different video modes for each display, making wait_for_retrace wait for the interrupt from just one display instead of both, etc.)
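
A sketch of those driver-side steps; intel_allocate_memory() is named above, but its exact signature here is my assumption (size, flags, returned base), so read this as pseudocode following that pattern:

```cpp
#include <SupportDefs.h>

// Assumed signature, modeled on the intel_extreme accelerant helper:
extern status_t intel_allocate_memory(size_t size, uint32 flags,
	addr_t& base);

status_t
allocate_frame_buffers(addr_t& bufferA, addr_t& bufferB)
{
	size_t size = 1920 * 1080 * 4;	// one 32-bit buffer per display

	status_t status = intel_allocate_memory(size, 0, bufferA);
	if (status == B_OK)
		status = intel_allocate_memory(size, 0, bufferB);

	// Next, point each display pipe at its own buffer (see the clone
	// mode sketch above), and have app_server draw into both.
	return status;
}
```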

So, the Matrox driver supports multi-monitor (stretch, mirror, inverted stretch; the stretch is always horizontal here, not vertical as Thomas’s ATI driver does). The same applies to my nVidia driver.

Matrox: G200…G550
nVidia: TNT2…GF7950

Of course, this is a trick solution, implemented BTW with a different control scheme than Thomas used (his version is copied inside Haiku). If app_server really supports two separate desktops, these drivers cannot use that without being updated.
But what can be done is simply placing multiple graphics cards in one system. I have coldstart support in place for the G200…G550 cards, and also for almost all nVidia cards, though you have to connect the screen via the VGA method (no DVI).
By coldstart support I mean that these drivers have a command interpreter in place that executes the ‘program’ (a sort of script language) inside their card’s BIOS, initializing the cards when the computer’s BIOS does not do this because they are not the primary gfx cards.
This way I had a consumer node running, displaying videos on a separate graphics card (maybe someone remembers :wink: I did this BTW using video overlay, and scaled_filtered_blit (nVidia)).
Anyhow, if app_server supports this, I would be willing to do some testing and possibly some tweaking of my drivers for that.

BTW (update):
Since creating coldstart support for gfx cards is a bitch, I suggest someone try to simply execute each card’s BIOS, since that way you get a perfect startup without much hassle. I am hoping this can be done in much the same way VESA2 calls are done for setting modes (real-mode x86 emulation or so).

That would make all cards in the world usable as secondary gfx cards…
I have a real nice old card here BTW: that one has a quad G200 on it with 2 Y-cables. It’s a PCI card I can plug in next to an nVidia card, for example, and have all 5 cards operational with drivers.

Hmm, one more update: instructions for tweaking my drivers are available on my site. The drivers have settings files that determine their behaviour for, for instance, this coldstart feature. I think it’s disabled by default, since for some cards it can have a negative impact; the default settings are chosen for maximum compatibility.
See here for more info on a lot…: http://rudolfs-place.nl/BeOS/BeOS.html

I think at this point we should ask on the mailing list, but we probably should just break the ABI. We already broke it for kernel drivers (even for GCC2 ones) a few times over the past few years, and more before that, so if anyone has a GCC2 accelerant, it already will not work because the kernel portion won’t.

Well, as a teacher, when I am using my laptop for displaying slides, I like to use mirror mode. However, I do agree: for a programmer, a graphic designer, or anyone who likes to have a big workspace, having a separate desktop on each screen is better.

Yes, that’s why I said “try to”. If it’s possible to add an extra argument to these functions without breaking the ABI, that’s fine. Otherwise, ABI break it is. And maybe we can then consider switching app_server to gcc8 on 32-bit systems; jua had tried this and there is performance to be gained (mostly from a compiler really optimizing the AGG code, I think).

Is there a way to route the video output to the HDMI port of an external monitor? Maybe at boot, something similar to how it is done on Linux via GRUB:
GRUB_CMDLINE_LINUX="video=LVDS-1:d video=VGA-1:e video=HDMI-1:e"

Probably only by editing driver code (in /src/add-ons/accelerants).