POLL: Should release ISOs be zipped?

GCC + binutils, glibc, git, coreutils, ffmpeg, and bash are the 10+ MB ones. Also Haiku’s own sources, I guess due to glibc, libiconv, etc.

The R5 Pro Installer included a whopping 37.2 MB of “GNU sources” among the optional stuff to be installed.

Gentoo/kHAIKU

I think compressing the ISOs is a good idea, but ZIP is completely the wrong tool for it, so I voted no.
It doesn’t make sense to have a whole compressed archive/folder that needs to be completely extracted when it contains only a single file.
Many OSes offer downloads compressed with xz (without tar), which can be piped directly to dd, removing the need to store the uncompressed image.
I’d much prefer going that way, as it can save a lot of bandwidth and download time, without wasting disk space and time on extraction (xz does it on the fly while writing to the USB stick).
For those who don’t know xz images, it’s as easy as: xz -c -d haiku.iso.xz | dd of=/dev/sdXX bs=4M status=progress
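(And if you do want the raw ISO on disk as well, e.g. to attach it to a virtual machine, a one-off decompression still works; the filename below is just a placeholder for whatever the release image would actually be called:

  unxz -k haiku.iso.xz    # writes haiku.iso next to it and keeps the .xz

so nothing is lost by distributing only the compressed form.)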

1 Like

If it is really necessary to keep the sources included in the ISO, they should be compressed with the most powerful algorithm available (if that’s not already the case).

2 Likes

Good idea, nobody uses them anyway.

1 Like

AFAIK, by default, 90% of PC users have zip tools built in.

Another idea: maybe upload the ISOs to archive.org (which has built-in torrent seeding). But I don’t know if this can be done manually.

Fun story: I’ve been working on converting our build pipelines over to using rclone to distribute artifacts. rclone supports archive.org :grin: I’m working toward making the creation and distribution of release artifacts more automated.
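To sketch what that could look like (the remote name and paths here are made up for illustration, not the actual pipeline config), rclone’s internetarchive backend means pushing a release artifact can be as simple as:

  rclone copy haiku-r1beta4-x86_64-anyboot.iso ia:haiku-releases/r1beta4/ --progress

with “ia” being a remote set up once via rclone config.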

2 Likes

This was the original justification for moving to zip across all builds, vs. distributing xz, gz, etc. like we used to.

We used to build “all the architectures” and “multiple image types” in multiple compression formats. The issue was that, as we grew, this model became unsustainable. We were reaching several terabytes of data (even pre-package-management), which was painful to lug around on cheap / commodity server hardware with software RAID. One SLES server upgrade failure put multiple terabytes of data at risk.

These are all the reasons why I migrated us to S3 buckets. We can “toss the data out there” and let the hosting provider worry about it. Wasabi has cut us off a few times now with threats that we’re using too much bandwidth, which is why the IPFS stuff exists as a local backup we could host from in an emergency.

Anyway, these are also all reasons I’m moving our build pipelines to distribute via rclone. I’m hopeful we can nail down “run a build pipeline and have the ISO artifacts distributed to multiple systems and locations”.

2 Likes

(Hello all - I’m a new member just adding my thoughts)

I think most people these days use pendrives, so I would agree with keeping the image uncompressed, which is the normal distribution method employed by most Linux & BSD downloads.

I appreciate that some will find the download size a problem, and they would most likely be users of older 32-bit systems. In that case, may I put forward that perhaps the 32-bit image be compressed, whilst leaving the 64-bit one uncompressed.

Anyway, just my thoughts. :slightly_smiling_face:

Can’t the ISO come with the sources already compressed (with zip, 7z or whatever format) inside it, to save some space?

1 Like

Today, I see most Linux/BSD repos have ISOs uncompressed. The Haiku R1B ISOs are below the miniDVD capacity. You could zip the inactive ISOs to save space for archival purposes.
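Something along these lines would do for cold storage (the filenames are just examples, not the real release names):

  zip -9 haiku-r1beta3-archive.zip haiku-r1beta3-x86_64-anyboot.iso

or, for a better ratio at the cost of needing xz to unpack it later:

  xz -9 -k haiku-r1beta3-x86_64-anyboot.iso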

Note: You can buy a 256GB USB drive for about $10 USD which can fit all your ISOs and the entire BeOS/Haiku software library.

I think the sources are provided as hpkgs, which then use whatever compression method is available/enabled. If I recall correctly, it supports LZMA at least.

1 Like

Yes, but it could be compressed further, because hpkg is designed for quick random access while most compression algorithms are designed for streaming (so accessing data in the middle of the stream requires decompressing everything before it).

The algorithm we currently use could be made more efficient for random access, but we have not yet researched and implemented this.
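To illustrate the idea (this is not what hpkg does internally, just the same principle expressed with xz): if the data is compressed as independent blocks, a reader can jump to the block containing the offset it needs instead of decompressing everything before it.

  # illustration only; the 4 MiB block size and file name are arbitrary
  xz -k --block-size=4MiB haiku_source.tar

The .xz index then records where each block starts, which is what makes block-granular random access possible.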

How much bandwidth do you think the distribution of B4 will take? Similar to B3? OTOH, what was that amount? Maybe rent some space from AWS for a month or two to cover the R1B4 release downloads.

I’d personally offer 2 versions:

  • a fully featured Desktop edition, with LibreOffice, GIMP, wallpapers, fonts, etc. (size be damned), offering a decent out-of-the-box experience.
  • an embedded, absolute-minimum edition, primarily for device builders, with a command-line pkgman to start building up from.
5 Likes

I definitely think this would be good for newcomers.

We technically have quite a few rsync mirrors set up for releases, thanks to the generosity of several groups (check them out on our download page).

Mostly, from a bandwidth perspective, the end user’s bandwidth / time to download is the biggest factor.

As for exact numbers, I really don’t remember. “A few TiB” of bandwidth at minimum. The BitTorrent + IPFS + mirrors really help cut that number down.

2 Likes

I’ll reach out to a hosting service and see if I can set up a mirror for, say, 90 days. No promises; money’s tight right now.

1 Like

No compression, IMHO, is better. In this day and age of fiber and GbE links, a 400 MB difference isn’t as dramatic as in the early/mid 2010s, and a direct ISO is far quicker* to handle (you can directly make a USB key, or virtualize it). I think the extra MBs are worth the little extra wait.

*Note I didn’t say easier, but quicker. The more streamlined it is, the better end users will regard the entire experience. That’s why most Linux distros went that way, I think.