I recently decided to play around with Mozilla Hubs and see how the project developed. Amazing stuff by the way… if anyone’s into virtual worlds (like Second Life) I recommend giving it a try.
It helped me discover a strange issue: under Firefox my WebGL performance is horrible, I can only reach about 40 FPS, usually even less. Under Chromium, however, I consistently get the full 144 FPS supported by my 144 Hz monitor. Something is HEAVILY degrading my WebGL performance under Firefox, but that something doesn’t seem to affect Chromium at all. Since FF is my main browser, I’d really like to figure out what it is and how to fix it. I already looked at the problem with a Hubs developer on their Discord server last week, but they couldn’t find an answer from the data they asked me to provide, which they said looked in order. What do you suggest I look for?
The general info: my OS is openSUSE Tumbleweed x64 with KDE, and Firefox is installed from the system packages, so it’s always the latest version available. I have an AMD card, so I’m using the amdgpu module, again the stock one shipped by the distro (no custom drivers). Here’s the output of my about:support page from Firefox: https://pastebin.com/qV62xsQ6
Looking through the Firefox info texts, this obviously stands out:
blocked by env: Acceleration blocked by platform
unavailable by default: Hardware compositing is disabled
unavailable by env: Hardware compositing is unavailable.
opt-in by default: WebRender is an opt-in feature
blocked-release-channel-amd by env: Release channel and AMD
disabled by default: Disabled by default
disabled by default: Disabled by default
I have an Nvidia GTX 1050, and my Firefox reports roughly the same as yours, although everything seems to perform flawlessly with Steam and SeaMonkey 2.53.1 on Leap 15.0, using the Nvidia 390.132 drivers.
I usually just do a quick check of the page http://madebyevan.com/webgl-water/ with my installed browsers (SeaMonkey, Firefox, Chromium) and leave it at that. Having watched images load slowly, line by line, over 9600-baud modem connections for years starting in 1994, I just don’t expect marvels of performance or throughput from my web browser. But even I have noticed that Chromium animates Google Maps 3D landscapes and street views far better than Firefox or SeaMonkey. (I install Chromium just for Google Maps, and for testing pages.)
I’d love to get to the bottom of this, though. From line 172 of your info dump…
WebGL 1 Driver Renderer: X.Org -- AMD Radeon (TM) R9 390 Series (HAWAII, DRM 3.36.0, 5.6.2-1-default, LLVM 9.0.1)
… it seems clear that Firefox detects the Radeon driver correctly (as my Firefox detects the Nvidia driver). So what happens then? Web browsers have become such complicated systems.
See the section titled “Still No Default WebRender on GNU/Linux Systems (and that is a good thing)”
Very interesting and insightful info, thanks for sharing! In my case I didn’t have WebRender enabled in Firefox, so out of curiosity I just turned it on: despite the article suggesting it would make performance worse, it fixed part of the issue and is a noticeable improvement!
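For anyone else wanting to try this, the opt-in was a single switch in about:config for me (pref names have shifted between Firefox versions, so treat this as what worked in my build rather than a universal recipe):

```
gfx.webrender.all = true    ; force-enables WebRender, overriding the blocklist
```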
Sadly it still doesn’t fix things all the way: I’m now getting 60 FPS where I’d previously get only 40, on rare occasions even 90 FPS briefly, but it’s very jittery and still doesn’t reach the 144 FPS Chromium manages (for the same demo / scene, of course). Is this simply the best Firefox can do on Linux at the moment, or are there other settings I can hunt for in about:config in the hope of improving this further?
If you set layers.acceleration.force-enabled to true in about:config, does this help?
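If the pref alone doesn’t help, there are also a couple of environment variables that have been used to toggle Firefox’s GPU paths on Linux; launching Firefox from a terminal with them set is an easy test. (Assumption on my part that your Firefox build still honors these; they have changed between versions.)

```
MOZ_X11_EGL=1 firefox       # prefer EGL over GLX for the GL context
MOZ_WEBRENDER=1 firefox     # another way to force WebRender on
```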
No, there doesn’t seem to be any further improvement with that.
About 4 months ago I reported to Mozilla that hardware acceleration (which was enabled by default at the time) was causing my system to stutter and then lock up on a particular site that apparently made heavy use of WebGL (it aggregated at least 6 streaming apps on the page at once and did some fancy tricks to send at least one stream of data back to the server). Disabling hardware acceleration fixed the problem for me. Although the site didn’t state it publicly, I suspect it was also “mining” client CPU cycles to power the app, which is why I won’t name it here. It could be considered similar to bitcoin mining, but it wasn’t actually that, so I don’t think it crossed that line.
You may want to carefully monitor your CPU and GPU resources with hardware acceleration enabled, and of course look for patterns if performance problems crop up.
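For the GPU side on AMD, one rough way to watch the load from a terminal is the gpu_busy_percent counter the amdgpu driver exposes in sysfs. A minimal sketch (the sysfs path and card0 number are assumptions for a single-GPU setup; radeontop gives a nicer view if you have it installed):

```shell
# Print the amdgpu load counter a few times, once per second.
# Arguments are optional: $1 = sysfs file, $2 = number of samples.
gpu_busy() {
    file="${1:-/sys/class/drm/card0/device/gpu_busy_percent}"
    samples="${2:-5}"
    i=0
    while [ "$i" -lt "$samples" ]; do
        printf 'GPU busy: %s%%\n' "$(cat "$file")"
        i=$((i + 1))
        sleep 1
    done
}
```

Run it in a second terminal while the Hubs scene is open, alongside top/htop for the CPU side; if the GPU sits near idle while one Firefox process pegs a core, that points at a software rendering or compositing bottleneck rather than the card itself.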