Hi, I’m working on a lightweight web browser that intentionally doesn’t support JavaScript, but instead has an updatable set of “fix scripts” that fix and even improve various websites so they’re usable without JS.
While the browser is not yet ready for everyday use, there is also FixProxy, which uses the “backend” portion of the browser (everything except the layout/rendering) together with a regular browser. I’ve been using it for multiple years as my primary way of web browsing with good results.
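Since FixProxy sits between a regular browser and the network, using it comes down to pointing the browser (or any HTTP client) at the local proxy. A minimal sketch of the idea in Python (the port here is just a placeholder; the real value is in the FixProxy documentation):

```python
import urllib.request

# Placeholder assumption: FixProxy is running as a local HTTP proxy on port 8080.
proxy = urllib.request.ProxyHandler({
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
})
opener = urllib.request.build_opener(proxy)

# Pages fetched through the opener are rewritten by the fix scripts before
# the client (or a regular browser configured the same way) renders them.
html = opener.open("http://example.com/").read().decode("utf-8", "replace")
print(html[:200])
```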
With the 0.2 version I’ve added initial support for Haiku; there are still some minor input issues, which I’ll fix in a future version.
I was so impressed with Haiku that at some point I decided to officially support it in all of my software, even though I’m not actively using it myself (I already have a working setup that I’m most comfortable with). I do use it from time to time for some tasks (one of them was an installation job where it worked better than Ubuntu, which kept crashing), and I always find it great to use.
One of my projects is a programming language called FixScript that provides easy out-of-the-box support for Haiku, so that other developers can easily support it as well.
I would like to hear what features I should support to make it even more integrated. I’m using a custom GUI library that combines native and custom widgets (with emulation of the look and feel) depending on what makes sense for a particular use case.
I found FixBrowser when 0.1 was only a few days old and already tried 0.2 on Haiku shortly after it was released.
While it’s not yet usable for everyday browsing, it’s already a very impressive project, especially considering the new programming language you invented for it.
I think it aligns very well with Haiku’s goal of being lightweight, while all other browsers are growing bigger and bigger.
Also, I think your solution of writing scripts to unbreak the JavaScript mess rather than supporting it directly is the best way to deal with the worst offenders.
I doubt that it will scale infinitely (you can’t ship a fix script for every single website out there, and there are probably millions), but for those that are awfully slow, especially on low-end hardware, contain tons of tracking bullshit and are unfortunately still successful, that’s probably the best solution to reduce the pain while being forced to browse them.
I’ll keep an eye on the project and wish you the best for it.
I think the scaling problem is not that big in practice. (Un)fortunately there is a lot of centralisation on the web, either by users flocking to a few very big services or by a lot of websites using a common publishing technology.
This assumes that I’m the only one working on the fix scripts. Even alone, I believe I can add support for a very wide range of websites; that’s why I’ve set up the donation system where people can ask for websites to be supported (it also focuses the effort on what is directly relevant to actual users and narrows the scope a bit).
But if others also contribute fix scripts, scaling becomes even less of an issue. This is somewhat similar to how Linux distributions manage to handle such a vast number of packages with good quality.
Of course there will always be websites that remain unfixed; that’s unavoidable. In that case you would open them in a regular browser (or use CEF directly in FixBrowser, maybe even WebKit in Haiku’s case).
Websites also change from time to time, breaking such fix scripts. But I’ve found from long-term experience that this doesn’t happen that often (on the order of years) for most websites (I expect YouTube specifically to be an issue, but that’s a very special case). This will be handled by an automatic test system that will alert on any such breakage so it can be fixed proactively instead of waiting for bug reports.
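As a rough illustration of the idea (not the actual test system), such a check can be as simple as fetching each supported page through the fix-script pipeline and verifying that an expected marker is still present; the proxy port and the marker below are just placeholders:

```python
import urllib.request

# One entry per supported website: the page to fetch through the fix scripts
# and a piece of text that must appear in the rewritten output.
CHECKS = [
    ("http://example.com/", "Example Domain"),  # placeholder entry
]

def run_checks():
    # Placeholder: assumes the fix-script pipeline is exposed as a local proxy.
    proxy = urllib.request.ProxyHandler({"http": "http://127.0.0.1:8080"})
    opener = urllib.request.build_opener(proxy)
    failures = []
    for url, marker in CHECKS:
        try:
            html = opener.open(url, timeout=30).read().decode("utf-8", "replace")
            if marker not in html:
                failures.append(f"{url}: marker missing (site layout changed?)")
        except OSError as exc:
            failures.append(f"{url}: fetch failed ({exc})")
    return failures

if __name__ == "__main__":
    for failure in run_checks():
        print("ALERT:", failure)  # a real system would notify the maintainer
```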
That’s a great idea. One of my favorite browsers to use on Haiku is https://depot.haiku-os.org/#!/pkg/links/haikuports/haikuports_x86_64/2/30/-/-/1/x86_64?bcguid=bc239-KFHA because I like the simplicity of just getting to the content I care to navigate and view. I’d love to see a browser that would snag the relevant content, render it the best it can, and make it easily navigable without JavaScript getting in the way. I’ll check your browser out really soon.
You can try it on sites like AbcLinuxu (in Czech), which is actually quite usable. There are also promising websites like Slashdot or Reddit, and some, such as OSnews, at least show the content.
Otherwise it’s better to use FixProxy to get the benefits of the “fix scripts” and more private browsing.
I think a web browser without JavaScript support is effectively a dead end for anything but a retro computer and very simple websites. Many, many features of the “modern web” require JavaScript, and not all of that is bad.
Your work has merit, but I don’t think building a fully-featured web browser supporting even a modest fraction of websites is going to be viable. Also, if you don’t support anything before HTML5 you will end up breaking stuff on some sites/pages.
Please don’t mistake this for discouragement, I look forward to seeing where this goes.
I’m guessing the problem is a lack of support for the ‘align’ attribute of the ‘img’ tag (HTML 4.x), which was deprecated in HTML5 (see “HTML <img> align Attribute” on GeeksforGeeks).
There are other issues too, like limited or no support for tables, unordered lists, etc. And it would be nice if it conformed to the historical standard of making link text blue and underlined…
The overuse of JavaScript (especially client side) is what’s bad for the WWW. I like the vision of this web browser. Letting companies like Google control the way the WWW works is a very bad thing.
The overuse of JavaScript (especially client side) is what’s bad for the WWW.
Except that it really isn’t the amount of JavaScript being used that poses the biggest problems. And the language itself was originally created for the purpose of adding interactivity to web pages, specifically for use on the client-side.
Letting companies like Google control the way the WWW works is a very bad thing.
I think I know what you are trying to say, but things are far more complicated than that and trying to reset the web to pre-1995 just isn’t a great plan.
And the language itself was originally created for the purpose of adding interactivity to web pages, specifically for use on the client-side.
This much I know. Brendan Eich developed JavaScript for this very reason, and he has also said that how it’s used today goes well beyond what the language was originally designed for.
The following rant speaks volumes about the issues plaguing the WWW today, at least as far as web browser development is concerned.
I think a web browser without JavaScript support is effectively a dead end for anything but a retro computer and very simple websites. Many, many features of the “modern web” require JavaScript, and not all of that is bad.
That’s why a major component of the browser is the “fix scripts”, which fix and even improve individual websites as well as groups of websites.
Even websites that are fully coded in JS can be handled this way (by basically reimplementing them). Imgur, for example, is currently handled that way (and many more will be added in the future).
I totally agree that not all JS is bad. Personally I really like things like WebAssembly, PWAs, WebGL, etc. The point is to make a browser that is lightweight and fast for most websites, except for those you want more interactivity with. This will be achieved by the ability to show the page in full using the CEF engine (or WebKit in Haiku’s case) for a specific tab or website.
That way you pay the price in memory and CPU consumption only where it is needed. Each such tab will also run in a separate process, so once you close it the memory is released.
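To illustrate the idea only (this is not FixBrowser code, and the helper name is made up), isolating a heavy tab in its own process means that closing the tab lets the operating system reclaim all of its memory at once:

```python
import subprocess

class HeavyTab:
    """Sketch: one 'full engine' tab isolated in its own child process."""

    def __init__(self, url, engine_cmd="full-engine-helper"):  # made-up helper binary
        # The heavy engine (CEF or WebKit in the description above) runs
        # out-of-process, so its memory never mixes with the lightweight core.
        self.proc = subprocess.Popen([engine_cmd, url])

    def close(self):
        # Ending the process releases all memory and CPU it was using.
        self.proc.terminate()
        self.proc.wait()
```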
Also, the browser is currently an alpha version. The major foundations are in place, but many features still need to be implemented. For practical usage it’s better to use FixProxy for now.
On top of that, while the “fix scripts” are limited to outputting HTML/CSS, which limits interactivity unless you run the tab in a “full mode” (using the CEF/WebKit engine), there will also be another way to provide additional interactivity for selected websites.
Extensions will allow native code (every version will have to be manually verified, so don’t worry), and there will be trusted extensions that are automatically downloaded and run (configurable). These could be requested by the “fix scripts” to show some part of the webpage with more interactivity (a rough sketch follows the examples below).
Some examples include:
WebAssembly applets - this will be limited to specific websites, as the interface between WASM and the browser is not universal, but there are some common interfaces such as Emscripten (so various cross-compiled SDL games could work out of the box)
ShaderToy - I want the ability to render it without a GPU, even if it means “precaching” it due to slowness; it would then be accessible on any device and without worrying about GPU driver crashes
Archive.org DOS games collection - this will ideally use native DOSBox to run the games without the overhead of WebAssembly
Maps - this will most likely also be supported, as maps are important and do not map well to plain HTML/CSS (both outputs would be provided, though)
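Purely as a hypothetical illustration of how a fix script could hand a region of a page to a trusted extension (none of these names reflect a real API), with a plain HTML fallback when the extension is unavailable:

```python
def render_player(video_id, extension_available):
    """Hypothetical sketch: delegate one page region to a trusted extension."""
    if extension_available("wasm-applet"):  # made-up capability check
        # Made-up placeholder element that the browser would replace with the
        # extension's interactive view of this region.
        return f'<extension name="wasm-applet" arg="{video_id}"/>'
    # Fallback: the static HTML/CSS output produced by the fix script itself.
    return f'<a href="/download/{video_id}">Download video</a>'
```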