Oh, I was not previously aware of these. I suppose all that would be needed is to directly provide these images, so that people don’t need to build them themselves, at minimum.
Oh, apologies, I was basing that off the satellite internet solutions available in my area. The non-Starlink options here aren’t great, especially when it rains (frequently). I’ve got a whole bunch of gripes with Starlink too (the router, grrr), but this has gone off-topic as it is, so I’m not going to go further into it here.
Anyway, for minimal images, would it be possible to have the Installer download additional packages that a user wants but that aren’t locally present?
I think the size of the User Guide and Welcome package is mainly from the screenshots. And I don’t see how Markdown would be significantly smaller than HTML.
A possible solution would be to use BPicture-based screenshots (a vector format; I think x512 prototyped this with a translator, but I don’t remember if there is an easy way to make screenshots using it) instead of PNG. Or just reducing the PNGs to fewer colors, or using a different image format that compresses them better.
What about using JXL? It is one of the newer image file formats that supports both lossy and lossless compression. Good thing WebPositive uses WebKit too, since JXL support was added in whatever WebKit version Safari 17 uses. However, the JXL images should (for now) only be shipped in the on-disk User Guide and Welcome package, at least until wider web browser support arrives (cough Chromium cough).
Side note:
Just found out that neither Windows nor macOS have native JXL support. Haiku could come out ahead of both in supporting and using it.
Yeah, BPicture is just part of the API, but I think pulkomandy was referring to using it to make screenshots - that’s the part that was prototyped and may or may not be easy to do.
Apple has added native JXL support in the latest macOS and iOS versions released last month. So far it seems to be read-only, meaning Haiku still has a chance to be the first OS with full JXL support.
p.s.: after hacking with Haiku ports a few years ago, and following the Haiku forum for a while, I’ve finally decided to join the community. So, hi everyone
I think there is at least some room for a valid perception of “bloat” in this context, especially as regards the size of the core operating system.
In other words, what is the size of the minimum possible functional install of Haiku? Can it fit on a single CD-R (~700-750 MB) or a single-layer DVD-R disc? The impetus here is not that it must fit onto a burnable optical disc, though that would be nice, but rather that it should be as compact as possible without the install being totally useless. — As much as 4 GB after install to the boot/primary drive seems reasonable to me.
Likewise, the “weight” of an OS is an important metric and can be measured in terms of the minimum hardware resources required for a usable experience. We can all have our own ideas on this matter, but just off the cuff I think you should probably be able to run it well on hardware that was considered current in 2010.
Mind you that I would, at this point, consider whether it is usable to browse/access/utilize the internet as an important factor there. Maybe not as a baseline, but definitely as an important indicator.
Comparing 32-bit OSes from the 1990s vs. our current 64-bit systems might also skew perceived resource consumption, since pretty much everything is automatically twice the size on a 64-bit system.
Can you explain what you mean by that?
I understand the basic concept that if you have something like an integer-based pointer, the base size will be 64 bits (8 bytes) for a 64-bit OS running on 64-bit hardware. But how does that extend to "pretty much everything"? Does it necessarily mean that a 1 MB image is now 2 MB?
Depends on what you mean by image (machine code image, disc image, or image file like an RGB picture). My comment is referring to compiled machine code and common data structures in memory using integers and pointers since that’s what I was assuming we are talking about when discussing bloat. Raster image files, text strings and other data files of course stay the same.
If it is instead the size of the system installation media that is meant by “bloat”, then I wonder what they are considering the limiting factor here that any kind of optimization should address. “The installer image doesn’t fit on a 4 GB thumb drive” would be a valid practical concern for example. Or “Installation takes twice as long as on Beta 2”.
If that’s our standard, I guess that the availability of an effective in-browser ad blocker or system-level ad-blocking firewall will be much more of a critical factor than space and memory consumption of the underlying OS.
I’d expect the minimal usable system with and without browsing the modern web will have quite different hardware specs. And large parts of the current web’s system requirements are outside the control of Haiku.
You can read my previous post where I analyze this: it is possible to boot from a 300 to 400 MB image. So, if 4 GB is OK, we already do 10 times better than needed. The current release image is 1.5 GB, with 500 MB of free space to allow users to install a few things in live mode. So, even there, we are 3 to 4 times under the arbitrary 4 GB limit. And for CDs, it would be OK and not so difficult to provide a 2-CD install, with the live/install CD, and a separate one for source code, optional packages and documentation. But is that worth the effort of changing the build system to ship two discs instead of one? Is the requirement of a DVD drive or booting from a USB stick really the one thing that prevents people from installing and using Haiku?
It runs reasonably well on hardware from 2003.
On the x86 architecture, integers also stay the same size by default. This is already the case on other systems, but even more so in Haiku, since all APIs use explicitly sized integers (such as int32, int16, …) to be sure of what is used. So, that leaves pointers, which are twice as wide (of course, to be able to address more memory). But, on the other hand, there are also some new features in the CPU that improve some things that were difficult to do before. I can think of the addition of more CPU registers, which should help optimized code use less stack, and also of some specific instructions for PC-relative addressing, which make position-independent code more efficient. And, in Haiku, everything is compiled as position-independent code.
Also, the x86 instruction set uses variable-width instructions: an instruction can be 1, 2, 3, 4, or more bytes, depending on what needs to be encoded, so the switch to 64-bit did not change instruction sizes. A fixed-width CPU architecture could in principle have doubled its instructions from 4 to 8 bytes when going 64-bit, but in practice none did, because it would indeed be a massive waste of space, and no one designing a CPU architecture is stupid enough to make such a move.
Finally, in the 64-bit version of Haiku we can assume that some extensions to the 32-bit instruction set (SSE, SSE2, …) are always available. We can’t do this on the 32-bit version, since our minimal platform is still the Pentium MMX from 1997. So, every place where we want to use the newer instruction sets, we have to test at runtime whether they are available, and provide a fallback path using only instructions available in the Pentium Pro. So far, the main places where this happens are in ffmpeg for audio and video decoding, and in some code related to bitmap scaling. On the 64-bit version, none of this legacy code needs to be included.
So, it’s only the pointers that increase in size, and that is counterbalanced by other changes that allow the code to be a bit smaller. In the end, there will certainly not be a 2x increase in memory use, but a much smaller one, if any at all. And you can indeed verify that our system requirements for the 32- and 64-bit versions of Haiku are no different in terms of disk size or RAM, and you can check this experimentally by booting Haiku in a virtual machine and reducing the RAM size until it stops working.
Installation completes in just a few seconds. How can this be considered bloat? There is currently no other OS on the market that can be installed this fast.
I think this thread is just showing more and more that the supposed “bloat” is not really something to worry about at the moment (well, it said that in the thread title anyways).
I personally see including more resources on the install image a good thing, it’ll theoretically serve the needs of many more people. Include more fonts, more locales, more input methods, more wallpapers. I don’t consider this bloat, rather a convenience.
What I’d consider bloat would be unoptimised code, unoptimised routines, features that are implemented without care, and make the system laggy/slow. And features that are not Haiku-like, for instance Linuxisms. These can stay away.
Thank you very much for the in-depth explanation, that makes a lot of sense!
Sorry for phrasing that ambiguously, that was purely a hypothetical example of something we could measure and, if we can reproducibly prove that it got worse at some point, we could always specifically target and optimize for. I meant to say that that way of thinking is a more useful approach than saying “the boot image got bigger, this means we have some invisible magic problem somewhere that will ruin Haiku”.
Essentially I completely agree with you. This thread has not been able to point to any specific sign of “bloat” other than that the installation image got slightly larger and all changes can be attributed to very useful features and modernizations. I see that as a good sign that we’re beyond the point of a mere clone of a 1999 operating system, nothing more.
Yep, Haiku is running just fine on my Thinkpad T42 from 2004 and supports pretty much all the hardware except wifi (but I could change the wifi card for a supported one). It still feels quick even on that old hardware. And I haven’t tried it for a while but it used to also work completely fine on an X31 from 2003.
I just think that browsing the modern web is an important utility distinction, but not super critical unless you want to use Haiku on your primary machine for tasks requiring that functionality. Many other tasks can be performed just fine on a computer with no internet access at all, so long as it is up to the job without external assistance.
In fact, most fundamental “productivity tasks” could probably still be done on a Macintosh computer from the mid to late '90s as long as you can either take your data and: (a) print it out, (b) export it into a portable format that is accessible to more recent software, or (c) transfer the critical “data” via a format that’s independent of file types. That last bit is, in many situations, the real problem with using older computers, operating systems, and application software.
I personally see including more resources on the install image a good thing, it’ll theoretically serve the needs of many more people. Include more fonts, more locales, more input methods, more wallpapers. I don’t consider this bloat, rather a convenience.
See, I would consider those to be potential bloat.
In my view, many of those things rightly belong in HaikuDepot (or some other software source/repository); they could even be supplied as an “extras” image for putting on a second disc, or in a moderately sized archive file for download.
Extra wallpapers (more than maybe 3 options plus software-produced solid colors/patterns) don’t need to be included on the install media (certainly not at resolutions larger than 1920x1080). Likewise, extra fonts are fluff as long as the defaults are good choices for most people.
But, to a point, some variety in locales and input methods (like supporting Japanese) is important for testing purposes and extending basic functionality to most users.
No, it doesn’t, actually. Yes, there is an increase in pointer sizes, but it doesn’t double everything by a long shot, at least not on x86.
Even if that were the case, it would at most double memory use (not increase it by factors of 10). Realistically it’s more like 10% or so, since the 64-bit growth is mostly in pointers; image data and other ordinary data remain the same size.
That’s the rub, though… Haiku would be usable on even much lower-end machines if it weren’t using as much RAM as it does today. Also, a lot of the memory use isn’t contributing to system speed: package installation is something you do periodically, and it isn’t something you should optimize for speed over having more RAM available for applications.