Haiku on phones should not have separate mobile and desktop modes

While I like the direction the Maple discussion is going in the other thread, I’ve come to the conclusion that, imho, Maple should NOT distinguish between mobile mode and desktop mode.

So I opened this thread “Maple general”.



The problem with this is that it would mean changing existing Haiku applications to become convergent. Convergent apps generally suffer from having UIs that are either only good for one of the form factors or subpar for all form factors.

The separation of Mobile and Desktop mode is to ensure that applications made specifically with desktop or mobile form factors in mind are not compromised by making them try to adapt to both.


That distinction is the entire point of Maple.

And it was discussed in the other thread; what makes you say that it should not distinguish? Desktop and mobile apps have completely different requirements.


I think that it should be possible to use a shell or any other desktop app with the on-screen keyboard.

And vice versa if I want to use a phone app in desktop mode nothing should prevent me from doing so.

And the screen stays small in desktop mode, so apps have to be designed for mobile even in desktop mode.

I am not sure what you mean. If you connect a screen you get the desktop shell; if you don’t, you get the mobile mode. They are exclusive.

Desktop apps simply don’t work properly on a small touch screen.
There is no reason to use the “mobile” variants in desktop mode; you get the desktop versions, which are more capable.


While the idea of using a phone’s on-screen keyboard as input for a desktop system is good, it has to be wireless and untethered to anything. Having to connect an external display, mouse, and keyboard to the phone while using desktop mode is quite cumbersome.

That sounds like the two different modes don’t make sense (on mobile).

The desktop mode runs on an attached display, with a keyboard and mouse connected. The phone of course needs to run the actual code to display it.


Apple AirPlay 2 supports video streams, and many TVs support it; we could totally stream video for a display, and then have the phone function as a keyboard (and touchpad).

Forgot about AirPlay, that could work. Google Cast can do that as well and is supported by everything running Android TV and the Chromecasts themselves.

What would be necessary to support these protocols?

Wi-Fi Direct supports connecting to displays, too. Are there any Haiku-supported Wi-Fi adapters and drivers that have Wi-Fi Direct support?

Ah, I didn’t know that. I assumed it would use the phone’s display and an attached keyboard.

Then two modes make sense.


Wi-Fi Direct is “just” a direct connection over Wi-Fi; you still need a protocol on top of it.
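To sketch what “a protocol above it” means: Wi-Fi Direct only hands you an IP link between two devices, and everything above that, such as how video frames or key events are packaged, still has to be defined by an application protocol (real ones like AirPlay or Google Cast are of course far more involved). The hypothetical length-prefixed framing below, with names of my own invention, shows the kind of layer that would still need to exist; the demo round-trips through an in-memory stream, where over Wi-Fi Direct it would be a socket to the peer’s IP address.

```java
import java.io.*;

public class Framing {
    // Write one frame: a 4-byte big-endian length followed by the payload.
    static void writeFrame(DataOutputStream out, byte[] payload) throws IOException {
        out.writeInt(payload.length);
        out.write(payload);
        out.flush();
    }

    // Read one frame back, using the length prefix to know where it ends.
    static byte[] readFrame(DataInputStream in) throws IOException {
        byte[] buf = new byte[in.readInt()];
        in.readFully(buf);
        return buf;
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        writeFrame(new DataOutputStream(bytes), "key: F".getBytes("UTF-8"));
        byte[] got = readFrame(new DataInputStream(
                new ByteArrayInputStream(bytes.toByteArray())));
        System.out.println(new String(got, "UTF-8")); // prints "key: F"
    }
}
```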

AirDrop uses Wi-Fi Direct, as do Xbox One controllers. I honestly don’t know whether the Wi-Fi card has to have special support.

I know: you could use leftovers of BeIA, from both the Sony eVilla and the Compaq internet appliance.

Not really. BeIA was very different. It used a browser, based on Opera 4, to render the OS. None of this is open source. It also used its own file system, not BFS. It was also completely incompatible with R5 because API creep happened. The final version used Binder for its IPC/COM-like interface. I have experience with all of this: I had a WebPad at one point, and I still have the SDK for building BeIA images.


You mean touch-screen, small-display applications; there’s nothing about a mobile UI, other than branding, that separates it from desktop.

To me this sounds like a parallel app server and UI need to be developed to accommodate touch and small displays.

Perhaps back when resistive touchscreens were in common use and mobile interfaces were only really usable with a stylus or pointed fingernail, yes.

But nowadays in the era of capacitive touchscreens used most frequently by fingers, not so much. It is outright painful to use a traditional desktop interface on a small screen with fingers, without scaling it to be huge up to the point where significant portions of the screen can’t be used by apps anymore.

I already tried that with Xfce and LXDE on a couple of phones out of curiosity, and it is just a rather horrendous experience that’s unusable for most folks.


And the opposite is also true. Using Android on a large screen with a mouse and physical keyboard is not a great experience. And that’s after they adjusted it somewhat to tablets and modern larger phones. Before that in Android 2.x days it would probably have been even worse still.


I had an Android 2.2 tablet - barring the few apps modified by the vendor, most things were a disaster to use. Play had to be sideloaded as Google didn’t approve of tablet use.

Much more recently, I had a company car with an Android ICE unit. Other than the apps that the car manufacturer had built, most things were still a disaster! The nav appeared to be a standalone Garmin connected to the same monitor, as there was a clunky transfer between the two.

I develop for Android in the day job, and even phones with Play Store-licensed Android versions are completely broken internally. Android is a mess. The BLE implementation in stock Android is horrible and tied to the UI thread, but it at least allows setting the MTU. Samsung has had a bug for the last 3 years where setting an MTU greater than 23 will crash the Bluetooth stack. Bear in mind, an MTU of 23 is effectively 20 bytes per transfer. Ugh. They do allow the MTU to be negotiated higher by the device, but it is always asymmetrical, with 23 in the direction of the device and higher from the device to the phone. Google stock Android will accept an MTU of 517.
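For anyone unfamiliar with the MTU arithmetic above: an ATT write spends 3 bytes on its header (1-byte opcode plus a 2-byte attribute handle), so the usable payload per transfer is MTU − 3, which is why MTU 23 means 20-byte chunks. A hypothetical helper (the class and method names are my own, not part of any Android API) that splits a buffer accordingly:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class AttChunker {
    static final int ATT_HEADER = 3; // opcode (1 byte) + attribute handle (2 bytes)

    // Split data into pieces of at most (mtu - 3) bytes, the usable
    // payload of a single ATT write at the given MTU.
    static List<byte[]> chunk(byte[] data, int mtu) {
        int payload = mtu - ATT_HEADER;
        List<byte[]> out = new ArrayList<>();
        for (int off = 0; off < data.length; off += payload) {
            out.add(Arrays.copyOfRange(data, off, Math.min(off + payload, data.length)));
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] msg = new byte[100];
        System.out.println(chunk(msg, 23).size());  // 5 transfers of 20 bytes
        System.out.println(chunk(msg, 517).size()); // 1 transfer
    }
}
```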

This is ignoring how non standard all the skinned versions of Android are. Samsung’s UI is completely different to stock Android. And even trivial things like Bluetooth pairing and permissions requests look and work differently.

I meant the UI and control logic could be reused in Maple.