Haiku is not bloat

We have seen several, actually. The fact that these have been called Betas may not have been ideal for public perception, but objectively speaking, they have been very good and very stable, and each one a lot better than the one before. The lack of an R1 is only a problem if we are debating PR or semantics.

2 Likes

The problem with this is that the GPL mandates that source code be made available through the same distribution method.
If we released a second, source-only CD, we would need to keep it supplied for years, as long as we make physical disks at all. If the sources are on the same disk, there is no need for this.

There is a minimum (or minimal, I can never remember) profile if you build Haiku images yourself. So it is very feasible, to the point that it is already done. We could make it even smaller and even less usable, too. The bootstrap image has even fewer things.

Now that’s also not very nice to me, because my paid job is (partially) working on satellite internet :frowning:

It does not have to be this bad, and there are several companies besides Starlink who can do it too (with far fewer satellites). One of our customers provides video-on-demand services to cruise and container ships, and it works much better than you say.

Please do not give Starlink more credit than they deserve. And remember that they currently run more than half of the entire fleet of satellites orbiting the Earth, with plans for even more. So, in terms of efficiency and bloat (to pretend I’m still on topic), that’s… not great.

First of all, this is not a fact until someone has measured it, especially for boot time and “the system is getting slower”. The other two are easy to check by looking at past release images, even if that’s not many data points.

Anyway, I don’t think we can go back to how Haiku was in 2008 (alpha1), for reasons I have explained already: we have just added A LOT more things to the OS since then, which can’t be done for free. We are already not making very fast progress on reaching R1, and on top of that, we should make sure that whenever we add something to the OS, we optimize something else to keep the memory footprint and image size exactly the same? I don’t think that can work, or at least it conflicts with the other complaint that the project is derailing and not making any progress.

So, let’s have a look at the disk image. The file is 1.5 GB, but there is empty space in there (so you can do some things while using it live); in reality there is just a bit less than 1 GiB of files (I’m checking the 64-bit version here; the 32-bit one has more due to hybrid builds, which need a lot of things to be duplicated), of which:

  • 336 MiB of source code for GPL compliance
  • 125 MiB of “welcome” package + user guides in various languages
  • 117 MiB of Noto Japanese font (needed for our Japanese users to be able to see any text)
  • 84 MiB of GCC and binutils
  • 56 MiB of LLVM (needed by modern Mesa)
  • 33 MiB for HaikuWebKit

(these numbers were obtained by mounting the beta4 release image with “diskimage register” and then looking at the biggest files and directories using DiskUsage).
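If someone wants to reproduce this kind of measurement programmatically instead of clicking through DiskUsage, here is a minimal sketch using the Storage Kit. It is my own simplification: it recurses into subdirectories, does not follow symlinks, ignores attributes and block-size overhead, and just rounds to MiB.

```cpp
// Sketch only, not part of Haiku: recursively sum file sizes under a
// directory using the Storage Kit. Build with: g++ dirsize.cpp -lbe
#include <Directory.h>
#include <Entry.h>

#include <stdio.h>

static off_t
DirectorySize(BDirectory& dir)
{
	off_t total = 0;
	BEntry entry;
	while (dir.GetNextEntry(&entry) == B_OK) {
		if (entry.IsDirectory()) {
			BDirectory subDir(&entry);
			total += DirectorySize(subDir);
		} else {
			off_t size;
			if (entry.GetSize(&size) == B_OK)
				total += size;
		}
	}
	return total;
}

int
main(int argc, char** argv)
{
	if (argc != 2) {
		fprintf(stderr, "usage: %s <directory>\n", argv[0]);
		return 1;
	}
	BDirectory dir(argv[1]);
	if (dir.InitCheck() != B_OK) {
		fprintf(stderr, "cannot open %s\n", argv[1]);
		return 1;
	}
	printf("%s: %lld MiB\n", argv[1],
		(long long)(DirectorySize(dir) / (1024 * 1024)));
	return 0;
}
```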

If we remove all of these things, that’s a whole 751 MiB that can be taken out of the image. What remains is 270 MiB of a core system that will work just fine, but without fast 3D rendering (you could install TinyGL as a replacement for Mesa), without Japanese language support, without a usable web browser (though you can install NetSurf, which is quite a bit smaller than WebKit), and without documentation.

Personally, I think under 300MB for the core system is not bad at all.

So, what is a realistic plan here? Do we make plans to remove Japanese language support, replace WebKit with NetSurf, and stop using any GPL software so we can remove the “sources” directory? If that’s not acceptable, then where, in the 270 MB that remains, is the bloat that we should look for?

Next, there is RAM usage. Let’s look at the minimal requirements.

All alpha releases had a minimum requirement of 128MB.

The first two betas doubled this to 256MB.

Finally, beta3 and 4 made it to 384MB.

First of all, we have to note that this is rounded up to the nearest multiple of 128. So what looks like a 3x increase over the span of 15 years (already not that bad) is in fact a bit less than that. And who is still running Haiku on hardware with less than 512 MB of RAM? Is that something we should worry about? Do we lose many users because they have less than 384 MB of RAM in their system and can’t run Haiku? I don’t think so. Even on 20-year-old hardware, you will find yourself with enough RAM already.

Now, if someone wants to investigate where all this extra RAM went, they are certainly welcome to do so, and we’ll merge the patches, provided they don’t cost too big a regression in other aspects (removal of features, making the system slower by choosing slower but less memory-intensive algorithms, etc.). But I don’t think, at this point, that it is a problem worth spending time on. Apparently no one thinks it is worth spending their own time on; some people do think it’s worth someone else’s time, but that won’t get us anywhere closer to a solution.

This leaves the “boot time” and “system is getting slower” arguments, for which I will wait for someone to provide actual measurements, because I have spent enough time writing this reply already.

18 Likes

At least 100 MiB of that Noto font can be shaved off : )

I’ve already made a ticket for this, for whoever wants to investigate; in the grand scheme of things it is not /that/ hard.

But indeed, complaining is certainly easier than spending one’s own time : D

https://dev.haiku-os.org/ticket/18381

Oh, I was not previously aware of these. I suppose all that would be needed, at minimum, is to directly provide these images, so that people don’t need to build them themselves.

Oh, apologies, I was basing that off the satellite internet solutions available in my area. The non-Starlink options here aren’t great, especially when it rains (frequently). I’ve got a whole bunch of gripes with Starlink too (the router, grrr), but this has gone off-topic as it is, so I won’t go further into it here.

Anyway, for minimal images, would it be possible to have the Installer download additional packages that a user wants but that aren’t locally present?

Are the welcome packages written in HTML or Markdown? Maybe converting them at install time from Markdown to HTML would shave off some megs.

Update: I found the docs directory in the repo.

Edit: Nope, not there. It’s in the userguide translator.

I think the size of the userguide and welcome package is mainly from the screenshots. And I don’t see how Markdown would be significantly smaller than HTML.

A possible solution would be to use BPicture-based screenshots (a vector format; I think x512 prototyped this with a translator, but I don’t remember if there is an easy way to make screenshots using it) instead of PNG. Or just reducing the PNGs to fewer colors, or using a different image format that compresses them better.
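For reference, the recording side of the API is straightforward; here is a minimal sketch, assuming a BView attached to an on-screen BWindow inside a running BApplication. The helper name and the drawing commands are made up, and the hard part that was only prototyped (capturing an existing window’s drawing stream) is not shown here.

```cpp
// Sketch only: record a few drawing commands into a BPicture and
// flatten it to disk. SaveVectorShot is a hypothetical helper.
#include <File.h>
#include <Picture.h>
#include <View.h>

status_t
SaveVectorShot(BView* view, const char* path)
{
	view->BeginPicture(new BPicture());

	// Everything drawn between BeginPicture() and EndPicture() is
	// stored as vector commands instead of pixels.
	view->SetHighColor(0, 0, 0);
	view->StrokeRect(BRect(0, 0, 100, 50));
	view->DrawString("vector screenshot", BPoint(10, 30));

	BPicture* picture = view->EndPicture();
	if (picture == NULL)
		return B_ERROR;

	BFile file(path, B_WRITE_ONLY | B_CREATE_FILE | B_ERASE_FILE);
	status_t result = file.InitCheck();
	if (result == B_OK)
		result = picture->Flatten(&file);

	delete picture;
	return result;
}
```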

3 Likes

What about using JXL? It is one of the newer image file formats that support both lossy and lossless compression. Good thing WebPositive uses WebKit too, since JXL support was added in Safari 17. However, the JXL images should (for now) only be shipped in the on-disk User Guide and Welcome package, at least until wider web browser support arrives (cough Chromium cough).

Side note:
Just found out that neither Windows nor macOS has native JXL support. Haiku could come out ahead of both by supporting and using it.
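On the native side, a JXL migration would be fairly painless: any application that loads images through the Translation Kit picks up new formats automatically once a matching translator package is installed (a JXL translator doesn’t exist yet, as far as I know). A sketch of the loading call, which is identical to the PNG case; the function name is made up:

```cpp
// Sketch: load an image through the Translation Kit. Whether this works
// for .jxl depends only on a JXL translator being installed; the calling
// code is unchanged. Link with -lbe -ltranslation.
#include <Bitmap.h>
#include <TranslationUtils.h>

BBitmap*
LoadGuideImage(const char* path)
{
	// Returns NULL if no installed translator understands the file.
	return BTranslationUtils::GetBitmap(path);
}
```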

3 Likes

I like the sound of BPicture. I’ll check on GitHub if it’s there. If the translator is WonderBrush compatible, I’ll see what works.

Found it

haiku/src/kits/interface/Picture.cpp at master · haiku/haiku · GitHub. It was right in the Haiku source all along.

Found the viewer too.

Yeah, BPicture is just part of the API, but I think pulkomandy was referring to using it to make screenshots - that’s the part that was prototyped and may or may not be easy to do.

It’s not a translator, as far as I can tell. It does view some flattened BPictures.

I was about to suggest JXL too. It can save 25-50% compared to PNG.

Apple has added native JXL support in the latest macOS and iOS versions released last month. So far it seems to be read-only, meaning Haiku still has a chance to be the first OS with full JXL support :wink:

p.s.: after hacking with Haiku ports a few years ago, and following the Haiku forum for a while, I’ve finally decided to join the community. So, hi everyone :wave:

11 Likes

It’s just a point of reference; I could have used old Haiku as well, but that was also much less capable than Linux at the time.

I think there is at least some room for a valid perception of “bloat” in this context, especially as regards the size of the core operating system.

In other words, what is the size of the minimum possible functional install of Haiku? Can it fit on a single CD-R (~700-750 MB) or a single-layer DVD-R disc? The impetus here is not that it must fit onto a burnable optical disc, though that would be nice, but rather that it should be as compact as possible without the install being totally useless. As much as 4 GB after install to the boot/primary drive seems reasonable to me.

Likewise, the “weight” of an OS is an important metric and can be measured in terms of the minimum hardware resources required for a usable experience. We can all have our own ideas on this matter, but just off the cuff I think you should probably be able to run it well on hardware that was considered current in 2010.

Mind you that I would, at this point, consider whether it is usable to browse/access/utilize the internet as an important factor there. Maybe not as a baseline, but definitely as an important indicator.

@PixelLoop

Comparing 32-bit OSes from the 1990s vs. our current 64-bit systems might also skew perceived resource consumption, since pretty much everything is automatically twice the size on a 64-bit system.

Can you explain what you mean by that?

I understand the basic concept that something like an integer-based pointer has a base size of 64 bits (8 bytes) on a 64-bit OS running on 64-bit hardware. But how does that extend to “pretty much everything”? Does it necessarily mean that a 1 MB image is now 2 MB?

Depends on what you mean by image (machine code image, disc image, or image file like an RGB picture). My comment was referring to compiled machine code and common in-memory data structures using integers and pointers, since that’s what I assumed we were talking about when discussing bloat. Raster image files, text strings and other data files of course stay the same.

If it is instead the size of the system installation media that is meant by “bloat”, then I wonder what they are considering the limiting factor here that any kind of optimization should address. “The installer image doesn’t fit on a 4 GB thumb drive” would be a valid practical concern for example. Or “Installation takes twice as long as on Beta 2”.

If that’s our standard, I guess that the availability of an effective in-browser ad blocker or system-level ad-blocking firewall will be much more of a critical factor than space and memory consumption of the underlying OS.

I’d expect the minimal usable system with and without browsing the modern web will have quite different hardware specs. And large parts of the current web’s system requirements are outside the control of Haiku.

You can read my previous post where I analyze this: it is possible to boot from a 300 to 400 MB image. So, if 4 GB is OK, we already do 10 times better than needed. The current release image is 1.5 GB, with 500 MB of free space to allow users to install a few things in live mode. So even there, we are 3 to 4 times under the arbitrary 4 GB limit. And for CDs, it would be OK and not so difficult to provide a 2-CD install, with the live/install CD and a separate one for source code, optional packages and documentation. But is that worth the effort of changing the build system to ship two disks instead of one? Is the requirement of a DVD drive or booting from a USB stick really the one thing that prevents people from installing and using Haiku?

It runs reasonably well on hardware from 2003.

On x86-64, plain integers stay the same size by default. This is already the case on other systems, but even more so in Haiku, since all APIs use explicitly sized integers (such as int32, int16, …) to be sure of what is used. So, that leaves pointers, which are twice as wide (of course, to be able to address more memory). But, on the other hand, there are also some new features in the CPU that improve things that were difficult to do before. I can think of the addition of more CPU registers, which should help optimized code use less stack, and also of some specific instructions for PC-relative addressing, which make position-independent code more efficient. And, in Haiku, everything is compiled as position-independent code.

Also, the x86 instruction set uses variable-width instructions: an instruction can be 1, 2, 3, 4, or more bytes, depending on what needs to be encoded. That is not the case for fixed-width architectures, where the switch to 64-bit could in principle have meant that all instructions suddenly became 8 bytes instead of 4 (but no CPU architecture actually did that, because it would indeed be a massive waste of space, and no one designing a CPU architecture is stupid enough to make such a move).

Finally, in the 64-bit version of Haiku we can assume that some extensions to the 32-bit instruction set (SSE, SSE2, …) are always available. We can’t do this in the 32-bit version, since our minimal platform is still the Pentium MMX from 1997. So, everywhere we want to use the newer instruction sets, we have to test at runtime whether they are available, and provide a fallback path using only instructions available on that baseline. So far, the main places where this happens are in ffmpeg for audio and video decoding, and in some code related to bitmap scaling. In the 64-bit version, none of this legacy code needs to be included.
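The dispatch pattern looks roughly like the sketch below. The function names are invented, this is not our actual code, and it uses GCC’s __builtin_cpu_supports() rather than the hand-rolled CPUID checks real code might use:

```cpp
// Sketch of runtime CPU-feature dispatch (x86-specific).
#include <SupportDefs.h>

// Two stand-in implementations of the same routine; in real code the
// SSE2 one would use intrinsics. Both names are hypothetical.
static void
ScaleRowGeneric(const uint8* src, uint8* dst, int32 width)
{
	for (int32 i = 0; i < width; i++)
		dst[i] = src[i];
}

static void
ScaleRowSSE2(const uint8* src, uint8* dst, int32 width)
{
	// Stand-in body; a real version would use SSE2 intrinsics.
	ScaleRowGeneric(src, dst, width);
}

typedef void (*scale_row_func)(const uint8*, uint8*, int32);

scale_row_func
SelectScaleRow()
{
#if defined(__x86_64__)
	return ScaleRowSSE2;
		// SSE2 is part of the x86-64 baseline, no runtime test needed
#elif defined(__i386__)
	// The 32-bit baseline has no SSE2, so test at runtime (GCC builtin).
	__builtin_cpu_init();
	if (__builtin_cpu_supports("sse2"))
		return ScaleRowSSE2;
	return ScaleRowGeneric;
#else
	return ScaleRowGeneric;
		// non-x86: no SSE at all
#endif
}
```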

So, it’s only the pointers that increase in size, and that is counterbalanced by other changes that allow the code to be a bit smaller. In the end, there is certainly not a 2x increase in memory use, but a much smaller one, if any. And you can indeed verify that our system requirements for the 32-bit and 64-bit versions of Haiku are not different in terms of disk size or RAM, and you can check this experimentally by booting Haiku in a virtual machine and reducing the RAM size until it stops working.
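To make the pointer point concrete, here is a tiny illustration (a hypothetical struct, not Haiku code) that you can compile as both 32-bit and 64-bit and compare:

```cpp
// Only the pointer member (plus resulting alignment padding) grows
// between a 32-bit and a 64-bit build; the explicitly sized integers
// stay the same.
#include <SupportDefs.h>

#include <stdio.h>

struct Node {
	int32	value;	// 4 bytes on both builds
	int16	flags;	// 2 bytes on both builds
	Node*	next;	// 4 bytes on 32-bit, 8 bytes on 64-bit
};

int
main()
{
	printf("sizeof(int32) = %zu\n", sizeof(int32));
	printf("sizeof(void*) = %zu\n", sizeof(void*));
	printf("sizeof(Node)  = %zu\n", sizeof(Node));
		// 12 on 32-bit, 16 on 64-bit (padding before "next")
	return 0;
}
```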

Installation completes in just a few seconds. How can this be considered bloat? No other OS currently on the market can be installed this fast.

I think this thread is just showing more and more that the supposed “bloat” is not really something to worry about at the moment (well, the thread title said as much anyway).

2 Likes

I personally see including more resources on the install image as a good thing; it will theoretically serve the needs of many more people. Include more fonts, more locales, more input methods, more wallpapers. I don’t consider this bloat, rather a convenience.

What I’d consider bloat would be unoptimised code, unoptimised routines, and features that are implemented without care and make the system laggy/slow. And features that are not Haiku-like, for instance Linuxisms. Those can stay away.

3 Likes

Thank you very much for the in-depth explanation, that makes a lot of sense!

Sorry for phrasing that ambiguously; it was purely a hypothetical example of something we could measure and, if we could reproducibly prove it got worse at some point, specifically target and optimize. I meant that this way of thinking is a more useful approach than saying “the boot image got bigger, this means we have some invisible magic problem somewhere that will ruin Haiku”.

Essentially I completely agree with you. This thread has not been able to point to any specific sign of “bloat” beyond the installation image getting slightly larger, and all of the changes can be attributed to very useful features and modernizations. I see that as a good sign that we’re beyond being a mere clone of a 1999 operating system, nothing more.

1 Like

Yep, Haiku runs just fine on my ThinkPad T42 from 2004 and supports pretty much all of the hardware except WiFi (though I could swap the WiFi card for a supported one). It still feels quick even on that old hardware. And I haven’t tried it for a while, but it used to also work completely fine on an X31 from 2003.

2 Likes