I also own a 701, in fact two, so I would like to think that Haiku will continue to exist in a 32-bit version, but I have no desire to use a web browser on such an old and small machine.
I can understand that it's frustrating to keep maintaining an ever-growing beast like WebKit with limited resources on 32-bit.
For old and low-powered devices, a more lightweight alternative like NetSurf will give a better experience anyway (even if that means that some pages aren't usable).
What I really hope, however, is that Haiku itself will keep supporting 32-bit.
My own devices are also mostly 64-bit nowadays, but for the 32-bit ones, Haiku is the only viable remaining solution.
While the BSDs and many Linuxes still support 32-bit, their desktops and programs have grown so big and resource-hungry that they aren't fun to use, especially on low-end hardware.
Haiku runs perfectly smooth and fast there, saving those devices from becoming e-waste.
Also, with many Linuxes now dropping 32-bit, Haiku may catch more users who are searching for a new OS that still works on their device.
Running 32-bit Haiku and BeOS apps on 64-bit is great. Dropping 32-bit, not so much. There are plenty of people who would like to keep old computers running a modern operating system. I have a few. I also have a couple of 64-bit systems running 32-bit for BeOS app purposes. I say Haiku should maintain 32-bit support as long as it maintains amd64 support. If a particular WebKit won't work on 32-bit, stick with the old one. Don't throw out the baby with the bathwater.
At the very least, maintain 32-bit support until Haiku 64-bit can do everything that 32-bit can do, mainly running BeOS apps. After that, I would still maintain 32-bit support, even if you don't update all the same things you update in 64-bit. Haiku native apps should remain working on 32-bit, even if they don't get all the new features. And even if not "supported" as in maintained, definitely keep the files available. One pet peeve of mine is companies that "drop support" for something that was working before, and still works, but they don't let people download it anymore.
All you have to do is not break stuff. I understand that regarding the WWW, other people will break things for you.
There is still a 4GB address space limit for 32-bit processes on 64-bit, possibly even lower.
Maintaining any kind of support for 32-bit hardware today seems a bit silly. I would still like to run 32-bit applications, however.
Is this a concern? Those old applications weren't designed to use gigabytes of memory.
Getting back to the subject in the title: switching off the JIT in the JavaScript engine should save a lot of memory during the build. The last time I built HaikuWebKit, that was the most time- and memory-consuming part.
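For anyone who wants to experiment with this, here is a minimal sketch of such a configure step. It assumes the stock WebKit CMake feature flag `ENABLE_JIT` applies unchanged to the Haiku port; treat it as an assumption, not a verified recipe:

```sh
# Sketch only: disable the JavaScriptCore JIT so the engine falls back to
# its interpreter, trading execution speed for a lighter build.
# Whether the Haiku port honors ENABLE_JIT unchanged is assumed here.
cmake -DPORT=Haiku \
      -DENABLE_JIT=OFF \
      -DCMAKE_BUILD_TYPE=Release \
      ../haikuwebkit
```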
That is the very problem I am hitting with WebKit. To build WebKit you need to run gcc or clang. On 32-bit Haiku, these are restricted to 2 or 3GB of RAM. WebKit has generated files so large that this is not enough.
Possible solutions:
- use a cross compiler, either from a 64-bit version of Haiku (with a 64-bit compiler that generates 32-bit executables) or from Linux (see the sketch after this list)
- modify the WebKit Perl script that generates such large source files to split the generated data into smaller parts
- maybe compiler tricks (disabling optimizations, changing compilers, disabling ASLR, trying newer or older compiler versions, ...). I am tired of doing this; it feels like a waste of my time, and I think I have tried everything, but maybe I missed just the right combination. If someone else wants to try it, please do
- stop shipping new versions of WebKit for 32-bit Haiku (the one option I will go with, personally, without preventing other people from pursuing the others). The last version remains available, and so do other browsers (personally I also package NetSurf; other ones are handled by different people)
Non-solutions, which people are also discussing in this topic:
- write a native Gemini client, or debate whether existing clients are native enough
- run 32-bit gcc on a 64-bit system (this will still run out of memory in exactly the same way)
- completely abandon 32 bit support (no one has plans for that)
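To make the cross-compiler option from the first list a bit more concrete, here is a rough sketch. The toolchain file name is hypothetical and the PORT value is an assumption based on how WebKit ports are usually configured, not a tested command line:

```sh
# Sketch: configure a 32-bit HaikuWebKit build with a cross compiler that
# itself runs as a 64-bit process (on 64-bit Haiku or Linux), so the
# compiler is not constrained by the 32-bit address space.
# "cross-i586-haiku.cmake" is a hypothetical toolchain file.
cmake -B build-x86 -S haikuwebkit \
      -DCMAKE_TOOLCHAIN_FILE=cross-i586-haiku.cmake \
      -DPORT=Haiku \
      -DCMAKE_BUILD_TYPE=Release
cmake --build build-x86
```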
Is the problematic file one of those "unified sources" files? Or what is this referring to?
Right, but this is software that is currently being developed, where you can just choose to develop for 64-bit. Actual legacy applications (especially from BeOS) need nowhere near this amount of memory. If anything, some applications might choke because they see too much memory.
Anyway, I fully support dropping 32-bit WebKit builds.
I'm building a 64-bit build right now and I'll see how big JavaScriptCore is compared to the rest of the library. If it's the JavaScript engine, I'll look into fixing up an embedded JavaScript engine like QuickJS. If I can get that to work, I'll see about replacing the bytecode interpreter in it with a simple JIT.
Update:
Just by watching the build, it appears to be the unified sources in both JavaScriptCore and WebCore. If it were only in JavaScriptCore, I could have done something about it. Not with WebCore.
Update2:
I think Pulkomandy's idea of generating more but smaller sources with the Perl scripts, then linking them, is the only real way to deal with this.
That's what I use on the 701, yeah. It's light and fast, but has some harsh limitations. Been keeping WebPositive around too, for special situations. Not a problem if it stops being updated, but maybe a sign of times to come.
Yes, but no. I of course already disabled unified sources (it is a simple CMake build setting, easy to disable, but a build without unified sources is often broken due to missing includes that normally go undetected because another file in the unified bundle includes the missing header).
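For reference, a sketch of that setting; `ENABLE_UNIFIED_BUILDS` is my best recollection of the option's exact name, so treat it as an assumption:

```sh
# Sketch: configure without unified source bundles. Expect to patch in
# missing #includes afterwards, as described above.
cmake -DPORT=Haiku -DENABLE_UNIFIED_BUILDS=OFF ../haikuwebkit
```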
The problematic file is JSDOMWindow.cpp, which contains some 100,000 lines of templates and macros to generate the JavaScript bindings for the DOM window object (which can do a lot of things in JavaScript). This file is generated from an IDL description file by a Perl script. It is part of WebCore.
I think one of the WebKit developers replied to a post of yours on Mastodon about this not too long ago, suggesting the file should probably be split up anyway? So if someone wants to work on this, that sounds like the thing to do, and the patch can be fully upstreamed for everyone to use.
Just finished building the current 1.9.16 package by (both needed):
- Bundle reduction: setting `maxBundleSize` to 2 in generate-unified-source-bundles.rb (4 didn't work).
- Passing `-ftrack-macro-expansion=0 --param ggc-min-expand=10` via `-DCMAKE_CXX_FLAGS` in the recipe.
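In case it saves someone a search, here is a sketch of what those two changes might look like; the script path and the sed pattern are assumptions based on upstream WebKit's layout, not copied from the actual recipe:

```sh
# 1. Shrink the unified-source bundles (the script's internal variable,
#    per the post above, is maxBundleSize; the path is assumed).
sed -i 's/maxBundleSize = [0-9]*/maxBundleSize = 2/' \
    Source/WTF/Scripts/generate-unified-source-bundles.rb

# 2. Keep GCC's memory use down while compiling the generated files:
#    --param ggc-min-expand=10 makes GCC's internal garbage collector run
#    more eagerly, and -ftrack-macro-expansion=0 drops per-macro
#    diagnostic bookkeeping.
cmake -DPORT=Haiku \
      -DCMAKE_CXX_FLAGS="-ftrack-macro-expansion=0 --param ggc-min-expand=10" \
      ../haikuwebkit
```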
Neat! Really makes me glad it's now a couple of us taking a stab at haikuwebkit : )
I'm trying that on the current sources, but:
- I hit some file cache corruption and compiler crashes that will disappear when I reboot the machine
- disabling or lowering unified sources means fixing several missing includes
- even when it's working fine, the build on this old 32-bit machine is taking several hours, possibly an entire day
so I still feel I could spend my time in more enjoyable ways. I'll see if I can eventually get through the build, push the fixes to the missing includes, and ship a release, but I'd be super happy if someone else did it.
That's not fun at all. No wonder you'd like to abandon it.
IMHO, CI looks like the way to go if we'd like to keep producing 32-bit WebKit builds.
And productive.
Are updated recipes without a bumped REVISION caught and built only for targets that don't have the package yet? In that case, I'd say just do x86_64 with !x86 and let interested parties update that later with an x86-only patch.
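In HaikuPorts recipe terms, that suggestion would amount to something like the following fragment; the exact syntax is my assumption of how the architecture list is usually written:

```sh
# Hypothetical recipe fragment: build for x86_64 only and mark 32-bit x86
# as broken/untested until someone contributes an x86-only patch.
ARCHITECTURES="x86_64 !x86"
```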
I'll try to do that for current master when I finish testing another compile option. What I won't do is upstream the include fixes to WebKit proper.