

What is the future of Mozilla? Where is Firefox heading?


What is the future of Mozilla? I was the one who originally asked why they didn’t answer Electron. I agree that being funded by Google is not the problem, but Mozilla needs to consider what its “core mission” is. And they are considering it: their focus is shifting toward offering services. They want to provide a VPN, news (Pocket), and possibly even e-mail and other common subscriptions with a “privacy focus”. I think that effort is doomed. Anyone who cares about privacy already has niche providers like Mullvad or Protonmail, and anyone who doesn’t has free webmail and a ton of overpriced VPNs to choose from. Mozilla’s name is meaningless in this space.

Their real “core mission” is to provide a platform-neutral, independent web rendering engine to as many users as possible, so that there is a truly open, free, standards-compliant option. They can’t do that without a meaningful share of the market, especially since they have to play follow-the-leader to Chrome’s idiosyncratic, non-standard implementations. Even if Chrome and every browser downstream of it died today, Electron would have to be maintained for embedded software well into the future. I have a feeling that if Chrome does kill itself, Electron will be the IE6 equivalent for an extra decade afterwards: web standards changes will be halted because “Electron needs it to work this way”.

It’s about market share and projecting dominance over standards, and it would also have allowed income from offering support for the product directly. While I personally like that Thunderbird uses a native GUI, they could probably have rewritten it on a Gecko-based Electron equivalent and pushed it as a stable local e-mail client for webmail. If that took off, they would have had a good reason to open their own privacy-oriented webmail service. The way they’re going about this is all wrong: they want money to fund Firefox forever, when all they had to do in the first place was not lose market share and keep making bundling deals. Maybe if they’d been more aggressive, Edge would be Gecko-based.

Does anybody else consider the possibility of offloading a bunch of the work a web browser usually does to a server? I don’t know why I hadn’t thought of it before, but there is a project I heard of that lets you browse the modern web with old browsers that can’t handle HTML5, or even older standards, depending on how old a browser you use. It’s not very good, obviously, since it is limited to what those old browsers can display, but it does what it says. Here’s the project I’m referring to: https://github.com/tenox7/wrp.
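To make the idea concrete: WRP itself is (as far as I know) a Go program driving a headless Chrome, but the core trick is simple enough to sketch. Below is a hypothetical, minimal Python illustration of the same concept, not WRP’s actual code; it assumes the `playwright` package and its Chromium build are installed, and every name and default in it is my own.

```
# Minimal sketch of the rendering-proxy idea (not WRP itself):
# render a modern page in a headless browser on the server and hand the
# client a plain image that even a very old browser can display.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

from playwright.sync_api import sync_playwright


def render_page_to_png(url: str) -> bytes:
    """Load `url` in headless Chromium and return a full-page screenshot."""
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page(viewport={"width": 1024, "height": 768})
        page.goto(url, wait_until="networkidle")
        png = page.screenshot(full_page=True)
        browser.close()
    return png


class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expect requests like /?url=https://example.com
        query = parse_qs(urlparse(self.path).query)
        target = query.get("url", ["https://example.com"])[0]
        png = render_page_to_png(target)
        self.send_response(200)
        self.send_header("Content-Type", "image/png")
        self.send_header("Content-Length", str(len(png)))
        self.end_headers()
        self.wfile.write(png)


if __name__ == "__main__":
    # Any browser that can show an image can now "view" the modern page.
    HTTPServer(("0.0.0.0", 8080), ProxyHandler).serve_forever()
```

The real WRP goes further, as I understand it, by serving the screenshot as a clickable image map so links still work, but the essential point is the same: all the heavy rendering happens on the server.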

More importantly, I think that idea has so much potential, but I’m not sure exactly what it could grow into. I just know it’s paving the way to the future; I can almost taste it. The freedom to use any browser, on any device of any architecture, running any operating system, and still have all of the privacy features we need. The same user agent for everyone. Everybody connecting through the same server could potentially get exactly the same rendering, so nobody can be fingerprinted individually (strength in numbers). Content blocking would be done upstream, so you could use a browser that doesn’t support extensions if you had to.

Or, if you think having it prerender everything for you is a dead end, or you simply want something a bit more individualized anyhow (not a bad idea: how can you trust one big man in the middle? and there would be less benefit to everyone running their own server in that case), then perhaps it could be something more in-between than a full rendering proxy. Say it still outputs actual HTML, but without the original JavaScript, and updates in real time, so anyone on a modern browser can take advantage of it. The user could still inject custom CSS or run custom JS, and maybe custom JS could optionally be passed through somehow so it can interact with the original, unmodified page. Assets would be served from the server, obviously.
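Here is a rough, hypothetical sketch of that in-between idea: a proxy that fetches the original page, strips its JavaScript, injects the user’s own CSS, and returns plain HTML for the client’s browser to render itself. It assumes the `requests` and `beautifulsoup4` packages; the names, ports, and default styling are illustrative only.

```
# Sketch of a sanitizing HTML proxy: original page, minus its scripts,
# plus user-supplied CSS, delivered as ordinary HTML.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

import requests
from bs4 import BeautifulSoup

USER_CSS = "body { max-width: 60em; margin: auto; font-family: serif; }"


def sanitize(url: str) -> str:
    """Fetch `url`, remove all original scripts, and inject custom CSS."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.find_all("script"):
        tag.decompose()                      # drop the site's JavaScript
    style = soup.new_tag("style")
    style.string = USER_CSS                  # the user's own styling
    (soup.head or soup).append(style)
    return str(soup)


class SanitizingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expect requests like /?url=https://example.com
        target = parse_qs(urlparse(self.path).query).get("url", ["https://example.com"])[0]
        body = sanitize(target).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8081), SanitizingHandler).serve_forever()
```

A real version would also have to rewrite links and asset URLs so they keep going through the proxy, and push live updates to the client, but even this stripped-down form shows how the user keeps control of what actually runs in their browser.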

As for those assets: they could be cached, and the server could even bundle a ridiculous number of fonts and other common files to deliver without ever having to hit the CDNs (Decentraleyes style). It would take a lot of brainstorming to come up with the best plan, but I think it’s an excellent idea, one that could grow into something able to kill these bloated, unmaintainable monstrosities and give niche software and hardware a fighting chance again, without having to abandon most of the modern web or keep extra hardware, or emulation/virtualization software, on hand when needed.
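The Decentraleyes-style part is easy to picture too. Here is a small hypothetical sketch of how the same proxy could map well-known CDN URLs to locally bundled copies before falling back to the network; the specific URLs and file paths are just examples, not a catalogue of what would actually ship.

```
# Sketch of local asset substitution: serve bundled copies of common CDN
# assets (fonts, JS libraries) instead of letting every page load hit the CDN.
from pathlib import Path

# Map well-known CDN URLs to files bundled with the server (example entries).
LOCAL_ASSETS = {
    "https://ajax.googleapis.com/ajax/libs/jquery/3.7.1/jquery.min.js":
        Path("assets/jquery-3.7.1.min.js"),
    "https://fonts.googleapis.com/css2?family=Roboto":
        Path("assets/roboto.css"),
}


def resolve_asset(url: str) -> bytes | None:
    """Return a locally bundled copy of a CDN asset, or None to fetch it remotely."""
    path = LOCAL_ASSETS.get(url)
    if path is not None and path.exists():
        return path.read_bytes()
    return None  # fall back to fetching (and then caching) from the network
```

Every client behind the server gets the same copy of the same asset, which is exactly the strength-in-numbers fingerprinting benefit described above.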

