GitHub from Haiku browsers

It’d be nice to be able to actually work with Haikuports on GitHub hassle-free, but I’m not finding it that way… I can’t find any browser on Haiku that gives me access to the features I need there! I wanted to add an issue to an app on Haikuports, with a very short patch, so I connected with WebPositive, but found that all of the buttons just have ‘black’ tooltips, and don’t seem to work anyway! There was no way I could enter the patch as ‘code’. And when I submitted and saw it was screwed up, there was no way to edit things. I tried Bezilla and even QupZilla (which works fine here) without success. Eventually I moved over to Linux/Firefox and easily did all the editing needed. Not good. :unamused:

Same here. I’m running haiku-64 in VirtualBox on Windows. To get my changes onto GitHub, I had to transfer them to my 32-bit Haiku machine and then to Windows (using PoorMan) so I could use a browser I could work with. :confounded:

You can use QupZilla?

Hey Pete

It seems GitHub updated its software, and uncovered bugs in ours. I submitted a bug report for the fork button.
https://dev.haiku-os.org/ticket/13661

You can submit your own.

Bye,

I should also mention that the Qt-Haiku team is hard at work updating QupZilla to a much more recent version, and the Haiku team (well, mostly me, with some occasional help from other devs) is working on merging 1.5 years’ worth of commits from WebKit upstream. This should fix at least some of the problems.

It takes some time as I’m careful to not introduce too many regressions, and build times as well as time needed to run the test suite are quite long. But I’ll get there eventually.

Please do report each bug you notice in a separate ticket, so that we can investigate them one at a time and close them.

Yep – QupZilla works fine for me. It’s the only way I can access this forum from Haiku!

Yes, OK, I should report the problems I hit. Which is the proper bug tracker for WebPositive, though?

I think my main irritation is not with Haiku. It’s with this endless “updating” of sites that don’t really provide any major increase in usability. They just make it impossible to access except with the latest browser. It’s a “Red Queen’s Race”, and not what I think Berners-Lee intended! (A typical OT example: a local pub has live music, and publishes a schedule. Until this year even BeZilla was happy with it. This year they’ve gone to SquareSpace, and I can’t get any info off it without going to the latest Firefox on Linux… Grrr…)

dev.haiku-os.org is used for WebPositive and WebKit problems.

“I think my main irritation is not with Haiku. It’s with this endless “updating” of sites that don’t really provide any major increase in usability. They just make it impossible to access except with the latest browser. It’s a “Red Queen’s Race”, and not what I think Berners-Lee intended! (A typical OT example: a local pub has live music, and publishes a schedule. Until this year even BeZilla was happy with it. This year they’ve gone to SquareSpace, and I can’t get any info off it without going to the latest Firefox on Linux… Grrr…)”

Just wait until HTTP/2 lets them convert everything to proprietary binary streams, replete with DRM, for the entire page! The corporations are trying to hog-tie the internet for profit. Berners-Lee is actually on board with some of this (at least the DRM part). So, it won’t be a “Red Queen’s race” - it’ll be a “Red Queen race”!

Proprietary? HTTP/2 is binary mainly for performance reasons. And I don’t think there is any DRM inside the protocol (there is in HTML5, but that’s unrelated).

HTTP/1 is a mess: years of slight changes, and standards that no one actually implements. So it is a good time to think about a new and clean replacement.

Did you ever look at the raw HTTP stream to debug something? Even as the main dev of our HTTP code, I don’t find myself doing that very often. So I have no problem using a binary protocol there. And the designers are careful to provide debugging tools, including Wireshark dissectors, so in fact looking at the protocol is not a problem.
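For what it’s worth, here is roughly what “looking at the raw HTTP/1.x stream” means: the whole exchange is readable ASCII. A minimal Python sketch (example.com is just a stand-in host), speaking the protocol by hand:

```python
import socket

# Open a plain TCP connection and speak HTTP/1.1 by hand.
# With HTTP/1.x the request and response headers are readable ASCII;
# with HTTP/2 the same exchange would be binary frames instead.
with socket.create_connection(("example.com", 80)) as sock:
    sock.sendall(b"GET / HTTP/1.1\r\n"
                 b"Host: example.com\r\n"
                 b"Connection: close\r\n\r\n")
    raw = b""
    while chunk := sock.recv(4096):
        raw += chunk

# The status line and headers come back as plain text.
print(raw.split(b"\r\n\r\n", 1)[0].decode("ascii", "replace"))
```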

We can’t stay in 1990 forever…

By itself, it’s not proprietary. But, it’s an enabler for a proprietary internet. The original HTTP was by nature open and “democratized” - partly because everything was in plain text.

The binary stream of HTTP/2 is a multiplexer. It allows for the packing of binary bitstreams into a “container” connection. This is similar to the way H.264/H.265 video streams are both streams and containers for a proprietary format.
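To make the “container” idea concrete: every HTTP/2 frame starts with a fixed 9-byte header (24-bit length, 8-bit type, 8-bit flags, 31-bit stream ID, per RFC 7540), and the stream ID is what lets many logical streams share one connection. A minimal Python sketch of decoding that header:

```python
import struct

# Decode the fixed 9-byte HTTP/2 frame header (RFC 7540, section 4.1).
# Each frame carries a stream ID, which is how many logical streams
# are multiplexed over a single connection.
def parse_frame_header(data: bytes):
    length_hi, length_lo, frame_type, flags, stream_id = struct.unpack(">BHBBI", data[:9])
    length = (length_hi << 16) | length_lo
    stream_id &= 0x7FFFFFFF  # the top bit is reserved
    return length, frame_type, flags, stream_id

# Example: a HEADERS frame (type 0x1, flags 0x4) of 16 bytes on stream 1.
header = bytes([0x00, 0x00, 0x10, 0x01, 0x04, 0x00, 0x00, 0x00, 0x01])
print(parse_frame_header(header))  # (16, 1, 4, 1)
```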

I can see the creation of a special binary format (there’s nothing to stop an alliance of commercial interests, including browser makers, from doing this) that actually patents a special sort of binary stream used with HTTP/2. So, to use the internet (once everything was converted), you’d need to use patented stuff, and websites would pay royalties.

Now, you’ll say that they could do this now. Yes, they could, but it would be difficult: an easy mechanism that is universally deployed is not in play yet. It will be with HTTP/2, which paves the way for this kind of thing. As I wrote, it’s the mass adoption of an enabling format that will allow for the “hog-tying of the internet.”

Plain-text and proprietary have nothing to do with one another. Plain-text formats can be completely proprietary too; see the many XML-based formats out there. Also, don’t forget that what we call “plain text” is just binary as well anyway – ASCII and the others are binary encodings too. The only thing that makes ASCII special is how ubiquitous it has become. But a different data encoding doesn’t make things any more opaque or proprietary, as long as an open and complete format specification is available.
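To underline that point, a tiny Python sketch showing that “plain text” is just one byte encoding among many:

```python
# "Plain text" is bytes like anything else; ASCII merely assigns
# those byte values to characters we happen to find readable.
text = "GET / HTTP/1.1"
raw = text.encode("ascii")
print(list(raw))            # [71, 69, 84, 32, 47, ...] - just numbers
print(raw.hex(" "))         # 47 45 54 20 2f ... - the same bytes in hex
print(raw.decode("ascii"))  # back to "readable" text
```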

Current HTTP is already being used to transfer all kinds of “binary” data streams all the time. So whatever evil things can be thought up, they can already be done right now by simply encapsulating the proprietary formats into a plain old HTTP connection; see HTTP tunneling. It’s not “difficult”; it’s widely used precisely because HTTP goes so well through firewalls and proxies. Even entire VPNs can be spun up through HTTP(S) connections… In short, I don’t see how HTTP/2 makes evil things more likely than they already are.
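As a sketch of that point (httpbin.org is just a convenient public echo service chosen for the example), arbitrary opaque bytes already ride inside a perfectly ordinary HTTP/1.1 request today:

```python
import urllib.request

# Arbitrary opaque bytes, encapsulated in a plain HTTP/1.1 POST.
# The transport is "plain text", but the payload can be anything,
# including a fully proprietary binary format.
payload = bytes(range(256))
req = urllib.request.Request(
    "https://httpbin.org/post",
    data=payload,
    headers={"Content-Type": "application/octet-stream"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # the server happily accepted the binary body
```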


The thing I have run into is not having the ability to download source code, which requires finding other crafty ways to get it.

You don’t need to “jail-break” an ASCII file. It’s impossible to protect plain text, other than by legal action, which in many countries is also impossible. So all the things that proprietary companies use to protect their IP are binary.

The current use of multiplexed HTTP is related to video streams only – a pretty narrow usage, and not “universal” enough to tie down the internet. Come back to me in five years and tell me how this is working out. I already have the answer.

Yes – this is a major annoyance. I don’t think any of the Haiku browsers works with the green “Clone or Download” button.

I can think of one restriction from a binary HTTP/2… No “View Source” option! I’ve had a few occasions when that has been essential. (Admittedly to overcome roadblocks – intentional or not.) As an instance:

Yesterday I discovered that an early influencer of my tastes – Radio Caroline – is still broadcasting. (Over the net now, though, rather than from a ship in international waters! From a ship again this week, actually, but moored in a river.) Anyway, I decided I’d try to receive it in Haiku, but the “RadioPlayer” that popped up from Web+ was not functional. Looking at the script source, I was able to find the stream URL. Plugged that into SoundPlay, and dug Caroline again all evening [and well into the night :grinning:].
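For anyone wanting to do the same, the trick boils down to fetching the player page and fishing the stream URL out of the markup. A rough Python sketch (the page URL and the regex are made up for illustration, not the real Radio Caroline ones):

```python
import re
import urllib.request

# Fetch the player page and look for anything resembling an audio
# stream URL. The URL below is a hypothetical placeholder.
page_url = "http://example.com/radioplayer.html"
with urllib.request.urlopen(page_url) as resp:
    html = resp.read().decode("utf-8", "replace")

# Grab candidate stream URLs (.mp3/.aac/.m3u8/.pls) from the source.
for match in re.findall(r'https?://[^"\'\s]+\.(?:mp3|aac|m3u8?|pls)', html):
    print(match)  # feed one of these to SoundPlay
```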

Have you ever tested it?

Sorry… failure to connect here. :slight_smile: Have I tested what?

This looks really strange to me.