The graphics acceleration can of worms

I don’t think the number of people gaming in a windowed, multi-window environment is large enough to justify the work to allow multi-window accelerated compositing.

It’s also a feature that can be added after 3D acceleration is fleshed out.

Full-screen, single-window accelerated gaming is where it needs to start. KISS: keep it simple, stupid.

2 Likes

That was a catch-22 situation. It would have been financially viable if the big companies (or smaller ones) had seen the potential and taken a longer outlook, but of course Be didn’t help by changing its focus to internet appliances. Did they change their focus because they realized that the hardware companies had a short outlook? Which came first?

Be started by building their own computers. That didn’t really work, so they started trying to sell themselves to Apple or possibly other computer manufacturers, first as a replacement for MacOS and Windows, but they quickly scaled back their ambitions and started to sell themselves as an alternative OS that you would run side by side with your main one. And that was before even shipping a stable release of the OS, still in the “developer release” days.

Apple decided to go with NextStep instead. PC manufacturers remained with Windows. Microsoft was not willing to sell OEM licenses for Windows to people who would be dual booting it.

That’s what happened first. And then Be gave up the operating system market and tried to do something else. But a large part of their development team wasn’t interested in doing something else, and left the company. Eventually Be was sold to Palm, which could have been a way to get back into the end-user operating system market. But it was too late, the engineers already had left, and not much came out of it.

6 Likes

Be did exactly what Next did.

  • Build hardware and tightly integrated OS (FAILED)
  • Make software available on other computers (FAILED)
  • Try to license OS to other third parties (FAILED)

The main difference was that Next tried to license the API, whereas Be tried to fit their OS into an OEM space for consumer-oriented appliances. Be also made the mistake of going through PowerPC V2 (Mac and clones) before porting to Intel. Had they ported to Intel and skipped the Mac, the story would probably be a little different. I think Gassée really thought he would sell to Apple, and that PowerPC would then be significant.

At the time Apple was a bit more open to such experiments than it is now. There were the licensed clones, there was the Macintosh Application Environment allowing to run Macintosh applications on Solaris and HP-UX, and Apple was indeed in need of a new, more modern OS.

So it wasn’t completely crazy to go with that. It also provided a much less diverse choice of hardware, at a time where Be could not have done a generic OS supporting the dozens of different video and network cards used on PC (even later on, BeOS was known for not-so-great hardware support, until OpenBeOS/Haiku developers started providing drivers).

Of course, looking at it from where we are now and knowing what happened next, it’s easy to say it was the bad decision. But at the time, it certainly wasn’t so obvious.

3 Likes

JLG bet the whole company on a “sure thing” that fell at the last furlong. He didn’t see the Steve Jobs reality distortion field till it was too late. If we believe what is implied in the movies, Jobs created Next as a vehicle to make Apple uncomfortable and take him back.

Yes, but what does this have to do with the “graphics acceleration can of worms”? It appears you’ve opened the wrong can of worms! :stuck_out_tongue_winking_eye:

5 Likes

Probably about as much as the Amiga… BeOS is at least related to Haiku. :crazy_face: This all went off topic a long time ago.

1 Like

In my previous posts I wrote about the AmigaOne series computers. They use modern graphics acceleration unlike the old, familiar 5 bpp planar screen modes on the classic models. Just because the PPC models have been hacked to run a version of the AmigaOS doesn’t mean the hardware has anything to do with Amigas. AmigaOnes are just PPC-based semi-modern computers.

Now, in an attempt to steer this thoroughly derailed thread back on topic, drivers are the bane of third-party operating systems. Graphics drivers are their graveyard. This reopens the correct can of worms. Discuss.

As I’ve stated: build everything as low-level, clean, simple, and clear as possible, use the Vulkan API as the native API, and put OpenGL etc. wrappers above it.

Realistically, Intel has more total integrated GPU hardware in the wild, but I believe AMD Radeon may be easier to do because there are now performant open-source drivers to build from.

2 Likes

Gassée likewise intended Be to prove to Apple that he was right. It’s just that he wasn’t right. His career is an impressive record of failure, losing Apple a tonne of money before he was kicked out, losing money at Be, losing yet more money for PalmSource, and then writing about what prospective “leaders” should do, presumably with the intent not to lose money and destroy everything they touched, although who knows?

The idea was that Be shows Apple that Gassée had the right ideas and they were fools to get rid of him; either they buy the company to get him back, or, if they’re too proud to do that, Be takes on Apple and destroys them, becoming the new top dog for premium desktop computers.

In one respect only, Be was right where Apple was wrong: you need to ship a Unix-type operating system, with protected memory, pre-emptive multi-tasking, all that jazz, and you need it in the 1990s; you can’t wait around and lose money for a decade while you work out which way the wind is blowing.

But in lots of other places Be’s guesses were all wrong. Be’s guess is that raw clock speed tops out by the mid-1990s and multi-socket is the way forward. In fact it takes much longer, faster single-core performance is still a thing into the 21st century, and single socket (but eventually with multiple cores) remains normal. It guesses that everything interesting is embarrassingly parallel, and so you needn’t work on the concurrency problem, trusting that just spinning up more threads will be fine. And it’s more or less entirely blindsided by the Internet.

Much worse, everybody else (except Apple) has already seen the light about protected memory and pre-emptive multitasking. Instead of shipping a finished, polished system years before anybody else has a demo, Be find themselves behind both Microsoft’s product (NT) and the upstart Linux. BeOS R5 would look pretty good if your competition were MacOS 8, Windows 95, and expensive Solaris workstations; but in fact your competitors have Windows 2000, OS X is on the horizon (previews are out), and cheap Linux systems are everywhere.

The story of BeOS is very off-topic here, even if it is very interesting!
It should be moved to a BeOS/Haiku history site!

This thread is about graphics acceleration and how to plan future use for Haiku.

8 Likes

Well, OS X wasn’t on the horizon when Gassée was betting the company on Apple buying them. OS X took a long time to materialise after Apple acquired Next. What is true is that he didn’t see Next as a real opponent.

The massive Linux boom also had yet to materialise. That boom started in that era, but it wasn’t until the mid-2000s that everyone was using Linux for everything.

@brunobastardi the issue was that the OP tried to bring in a legacy OS as a yardstick I think.

1 Like

If you can find another OS that has graphics acceleration that we can use or that shows how much or how little effort it will take to add graphics acceleration, your posts will start to be on-topic.

3 Likes

What about the Magma drivers from Fuchsia OS? Those live in userland.

Fuchsia is 64-bit only. Otherwise it could have been interesting.

1 Like

At the risk of sounding rather presumptuous, would the lack of 32-bit support be a problem? I imagine there must be very few people who need legacy compatibility with BeOS apps (or maybe even hardware) yet are crying out for modern drivers.

Besides, wouldn’t Fuchsia only support recent hardware rather than the sort of system 32-bit Haiku might be running on?

2 Likes

With Magma living outside the kernel, doesn’t it conform better to the philosophy of Haiku, instead of tangling with the nonlinear logic of the Linux DRM drivers?
Am I asking a silly question?

https://fuchsia.dev/fuchsia-src/concepts/graphics/magma/design

Above all, there would be no old software that takes advantage of modern drivers, so nothing legacy would be able to benefit from them.

I currently plan to implement most of the DRM ioctl logic in userland. libdrm has a drmIoctl function that usually performs a regular ioctl call into the kernel, but I put the actual implementation there instead of calling the kernel.

Fuchsia has only a few graphics drivers for now; supporting the DRM-based Linux drivers is more practically useful.
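That interception could look roughly like the following self-contained C sketch. Everything here (`userland_drmIoctl`, the `FAKE_DRM_IOCTL_*` codes, the handler) is a hypothetical stand-in for real libdrm/DRM symbols; it only illustrates the idea of dispatching the request in-process instead of trapping into the kernel:

```c
#include <errno.h>

/* Hypothetical request codes standing in for real DRM_IOCTL_* values. */
#define FAKE_DRM_IOCTL_VERSION 0x01

struct fake_drm_version {
    int major, minor;
};

/* Userland implementation of the "version" request: fill the struct
 * directly instead of trapping into the kernel. */
static int handle_version(void *arg)
{
    struct fake_drm_version *v = arg;
    v->major = 2;
    v->minor = 4;
    return 0;
}

/* Replacement for libdrm's drmIoctl(): dispatch in-process rather than
 * issuing a real ioctl(2) system call. */
int userland_drmIoctl(int fd, unsigned long request, void *arg)
{
    (void)fd;  /* the fd identifies the device; unused in this sketch */
    switch (request) {
    case FAKE_DRM_IOCTL_VERSION:
        return handle_version(arg);
    default:
        errno = ENOTTY;  /* unknown request, as ioctl() would report */
        return -1;
    }
}
```

The point is that code written against libdrm keeps calling the same entry point; only the plumbing underneath changes.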

10 Likes

Some experiments with the ring buffer: I put a NOP command into the DMA ring, incremented the write pointer, and confirmed that the GPU incremented the read pointer, indicating that the command was executed.

```
regs[DMA_CNTL]: 0x8210400
regs[DMA_IB_CNTL]: 0x80000000
regs[DMA_CNTL]: {10, 16, 21, 27}
regs[DMA_RB_CNTL]: {0, 2, 4, 12}
regs[DMA_IB_CNTL]: {31}
ringSize: 0x1000
rptr: 0
DMA_PACKET_NOP
rptr: 0
rptr: 4
```
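The experiment above is the classic producer/consumer handshake between CPU and GPU pointers. Here is a minimal self-contained C sketch of the idea, assuming the same 0x1000-byte ring and byte-counting pointers as in the log; the NOP encoding and the simulated `gpu_consume()` are placeholders, not actual GPU registers or packet formats:

```c
#include <stdint.h>

/* Placeholder encoding: the real DMA_PACKET_NOP value is GPU-specific. */
#define DMA_PACKET_NOP 0x00000000u
#define RING_BYTES     0x1000u  /* ring size from the log above */

static uint32_t ring[RING_BYTES / 4];
static uint32_t rptr, wptr;  /* stand-ins for the MMIO read/write pointers */

/* Host side: append one command dword, then advance the write pointer.
 * Pointers count bytes, matching the rptr 0 -> 4 step in the log. */
static void ring_write(uint32_t cmd)
{
    ring[(wptr % RING_BYTES) / 4] = cmd;
    wptr = (wptr + 4) % RING_BYTES;
}

/* Simulated GPU side: retire pending commands, advancing the read
 * pointer. A NOP needs no work beyond retiring its dword. */
static void gpu_consume(void)
{
    while (rptr != wptr)
        rptr = (rptr + 4) % RING_BYTES;
}
```

On real hardware the consume step happens asynchronously, and the driver polls (or gets an interrupt for) the read pointer to learn that the command completed, which is exactly what the `rptr: 0` / `rptr: 4` lines show.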
15 Likes