Is 32-bit development going to continue (long term)?

Sure, and that fork will usually be a non-commercial community project then.
I’ve seen that a few times already as a response to a corporate-owned project becoming shit.
It only confirms my point that commercialization is generally a bad thing in the software world.

I don’t see how. You still benefited from the commercial work if you fork it later.

Maybe not made shit, but in theory a project could be forked and forked until only a single dev is working on any particular fork, and development fragments too much for any of them to make appreciable progress.

I mean, some people say that there are already too many “flavours” of Linux, and even of desktop environments (KDE/GNOME/Cinnamon/Enlightenment, etc.), for the average consumer to grasp. What was once perceived as a strength is now a potential barrier to uptake, so yeah, not shit as such, but too disparate to be effective.

Please keep developing the 32-bit version. I run Haiku on P4.

I, for one, would like to see 64-bit Haiku able to run 32-bit Haiku programs.

Investors are bad, but you don’t need to take them on when commercializing something.

Yes, we plan to support 32-bit Haiku until R1 is released. No, we do not plan to add support for 32-bit or BeOS apps to 64-bit Haiku anytime soon, if ever.

There is an incomplete patch set to support 32-bit apps on 64-bit if you’re interested in this: https://review.haiku-os.org/c/haiku/+/2874

About corporate backing: I am no fan of big corporations, but without corporate backing Linux wouldn’t have seen the light of day, and by extension the FOSS world wouldn’t exist as we know it; it’d probably be a small circle of enthusiasts running some GNU/Hurd version capable of a fraction of what GNU/Linux can do today.

Of course those big corporations didn’t do it out of the goodness of their hearts, but because they saw the value (and potential future gains) of having a free Unix clone on the market. They were willing to bet on the long term, and it all worked out.

Since the Linux project succeeded (in creating a free Unix clone) they now have a vested interest in contributing to it for their own goals and that ends up benefiting everyone using the kernel.
(And even if they did try stupid/malicious stuff, serious projects have proper guardrails against crummy features and code slipping in.)

They get more money, we get more features, it’s a win/win situation I wish was more common in computing in general.

Granted, by comparison today’s corporations mostly seem incredibly shortsighted, tone-deaf, out of touch with their consumers and only caring about “green line go up” for their investors.

But they’ll get burned for that behavior sooner or later; a company is worth nothing without its customers. If you want to see what that looks like, just check how the AAA gaming studios have been doing in the last few years.

Also, you can be a greedy corporation even without all that “stocks” nonsense, just look at OpenAI.

I really should have been clearer when I mentioned “commercialized”.

I know a commercial developer of a DAW (MuLab). He is the sole developer, and the only issue I have with him being commercial is that he must follow many of the commercial trends in his product category. He still retains control over his product. Following some of those trends is his personal choice, balancing commercial relevance against having a unique product.

This isn’t wonderful, but it is better than the “commercialized” category I had intended to point out. In that example the product is a closed-source commercial product, but the same could be true of a commercial(ized) open-source product/project.

The issue with projects like Linux is immense commercial influence.

If a capable group of people forked Linux over some undesirable trend, they could end up fighting an increasingly uphill battle. Commercially influenced code you’d like to alter or remove tends to have wide-reaching influence and implementation surrounding it. This is especially true if your direction is a threat to large industry or commercial interests.

Probably one of the best examples is web browsers. I don’t think any of us wanted the mess we have today. Commercial interest made it difficult to even stand a chance. Firefox remained relevant only by itself including the trends that the larger industry forced, for the sake of relevance.

Linux is a little different. I don’t want to dismiss the valuable and strenuous efforts of the developers (and lead members) trying to keep Linux modular. But it must be pretty obvious by now that most efforts to fork the kernel, beyond providing release patches, would require larger and larger efforts.

I have no stake in or against the Rust movement. But you can see that there seems to be a commercial/industry push for it. For or against it, once the inclusion of Rust reaches a certain point, you would have to rework constantly to keep up with a rapidly advancing target.

“Just fork it” is the biggest joke I have ever heard.

In very large projects, “code rot” can be hard to keep out of a fork of an abandoned project that relies on actively progressing dependencies/tooling. I would imagine “code infection” would be even worse.

My question was answered very well. Thank you, Haiku developers, for providing clear insight, especially to a first-time poster.

@MichaelPeppers After looking over the “more formal” statements about the future of the x86 32-bit Linux kernel, I have to agree that i686 will still be kept. It seems the biggest current issue is the highmem changes. Much of the more public discussion seems focused on embedded 32-bit systems. My only issue there is that the “formal” discussions seem to avoid mentioning the less formal ones. LKML is a different world, but not necessarily formal; what sometimes seems evident there doesn’t always come out in the wash.

However, in the talk about 32-bit embedded systems, there seems to be a strong emphasis on keeping support for systems that are still actually being used. The only distinction I would point out here is that this mostly refers to commercial client machines. There are exceptions to that; there was some talk about the Nokia 770 still being used in the wild, and thus still getting support.

Not much is being said about i686 and up, other than the highmem (over 800 MB) changes.

What I would like to point out, only as a kind of premonition, is that the less formal discussions are not so friendly toward x86 32-bit support. The same talk that mentioned Debian making its first release with the complete 64-bit time_t transition this year (Trixie) also mentioned that Linux distributors were on the front line of phasing out 32-bit Linux, and that userland was going to have trouble fixing 32-bit bugs in packages using system calls like futex. Debian did not release x86 32-bit installers or kernels. The 32-bit userland that has been released is intended to run alongside the libraries of a dual-library 64-bit system. The official word is that you risk breakage upgrading a 32-bit system with these libraries. These binaries are also built with less tolerance for legacy CPUs (instruction-set dependencies). It may be that i686 support is still there; I haven’t tested.
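For anyone unfamiliar with why the 64-bit time_t transition matters, here is a quick sketch of the arithmetic (my own illustration, not from the talk): a signed 32-bit time_t counts seconds since 1970-01-01 and runs out in January 2038.

```rust
// Y2038 in rough numbers: a signed 32-bit time_t tops out at i32::MAX
// seconds past the Unix epoch (1970-01-01 00:00:00 UTC).
fn y2038_overflow_year() -> i64 {
    let max_secs = i32::MAX as i64; // 2_147_483_647 seconds
    let secs_per_year = 31_556_952; // average Gregorian year, in seconds
    1970 + max_secs / secs_per_year // ~68 years past the epoch
}

fn main() {
    assert_eq!(y2038_overflow_year(), 2038);
    println!("32-bit time_t overflows in {}", y2038_overflow_year());
    // A 64-bit time_t pushes the limit out by roughly 292 billion years.
}
```

The exact overflow moment is 2038-01-19 03:14:07 UTC; the division above just lands on the right year.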

We will have to wait and see what happens here. This post was intended to provoke questions about 32-bit long-term support from a developer perspective. As x86 32-bit Linux users, even if support ended tomorrow, you and I have options for some time. If the kernel doesn’t give up on 32-bit x86 entirely, some distros may continue to support it. I encourage anyone interested in this to keep an eye on 32-bit package repository commit commentary, as it’s a good place to keep an ear on the heartbeat of this situation.

Thanks, folks.

The equation for Linux is simple: if there is someone keeping it working, it stays in. For 32-bit x86 there seem to be few people willing to actually do the work. That’s what the talks are about: checking whether anyone wants to put in the effort. If no one does, you may cry as much as you want; it will be removed.

In fact, “commercialization” would be good here. Someone could offer to maintain 32-bit support and be paid by the community of 32-bit users. But if there is no money, this will not happen, and the technical decision (making the code simpler) will prevail.

The situation for Rust is similar. The language is just so much better and solves a lot of the problems of C and C++. Of course developers are embracing it, both professionally and personally. Only people with an interest (financial or otherwise) in keeping software broken would push against that: maybe they exploit bugs, maybe they sell tools to analyze code and find bugs, maybe they get paid for finding and fixing bugs. A language with fewer bugs is detrimental to them, but beneficial to everyone else.

There is currently a problem with supporting Rust on less-used platforms. That takes some work, and so far there seems to be little interest in making it happen. But for Haiku (one of these obscure platforms), someone put in the work and we have Rust support. Migrating our own codebase to it may not be worth the effort at this point, but there is no reason to be against other people using Rust.
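As a tiny concrete illustration of the bug class in question (my own sketch, not from this thread): mutating a container while a reference into it is still live is a silent dangling-pointer bug in C++ (iterator invalidation), but a compile error in Rust.

```rust
// Borrow checking in one example: `v` cannot be mutated while a shared
// reference into it is still in use.
fn demo() -> usize {
    let mut v = vec![1, 2, 3];

    let first = &v[0]; // shared borrow of `v` begins here
    // v.push(4);      // error[E0502]: cannot borrow `v` as mutable while
    //                 // `first` is borrowed -- the push could reallocate
    //                 // and leave `first` dangling, which C++ permits.
    println!("first = {first}"); // last use of `first`; the borrow ends
    v.push(4); // fine now that the borrow is over
    v.len()
}

fn main() {
    assert_eq!(demo(), 4);
}
```

The compiler rejects the commented-out line at build time, which is exactly the kind of bug that in C or C++ only shows up (if at all) as a crash or corruption at runtime.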

True in the case of C, less true in the case of C++. Modern C++ is a powerful language; while it’s less safe than Rust, it has other advantages over it.

There’s a reason to be against poorly done, hype-driven rewrites. Good Rust software is usually a new project like Typst or Jujutsu, not a rewrite.

I suspect that x86 will continue to work pretty well in Debian even though it has officially moved to Debian Ports.

I use Debian PPC and it works fine; most of the issues are endianness problems, which won’t affect x86. The only other issue is that package dependencies break frequently because it never has a stable release, so you may have to manually solve package issues locally while they are worked through upstream (e.g. missing dependencies can occur because the package builders have not yet caught up, and so on). PPC Mac hardware has been dead for 20 years at this point and is still supported pretty well by Debian, so x86 will probably be OK for a while.

I also run Debian x86 on a pair of Pentium 2/3 laptops, and there the main problem is graphics drivers: Mesa gradually drops support, particularly for drivers that were never updated to KMS, so my Compaq Armada M700 is no good for running the games it ate up when it was new (25 years ago, haha). But Mesa just doesn’t have enough developers to update drivers, stop bit rot, and keep testing everything on these old systems.

I can’t say too much about Rust. If I had a problem with Rust, it would be reworking large, mature projects established in it. I haven’t done that. But from what I see, it looks like forking large code bases in a different direction would perhaps be more unpleasant than with existing languages (depending on project implementation). Keep in mind, this is never pleasant.

It is just a personal “merging” theory that Rust is good at keeping everything tied together: if you really want to change something, you are going to work harder changing it everywhere. I don’t have experience here. It is also my assumption that this will depend more on how you use the language, just as anywhere else.

I can think of several reasons why this potential could be seen/used as a feature, for large commercial/industry contributors, of open source projects.

I really can’t speak on this, with any confidence.

@PulkoMandy, I can fully agree with paying for your coding interests. I will say that it can be a tricky model, but for obvious reasons that isn’t something I want to get into. I do support it, though, and thanks for bringing it up.

I hope so. And hopefully other distros do well there, as well.

There is a little that can be done on the packaging side of things, but it does require effort on the part of the distribution team. Of course, nothing can be done for some of our older “cherished” devices. I still admire the guy maintaining the VIA and older Nvidia/ATI GPU drivers. I don’t know if he is doing it anymore.
