Problem with fglrx driver

Hi, I have a laptop with an HD6490 graphics card.

I’ve installed the fglrx driver (following the how-to on the openSUSE site), but after rebooting I get a black screen.
I’ve looked in the X log file, and it shows these errors:


(II) fglrx(0): Invalid ATI BIOS from int10, the adapter is not VGA-enable
(EE) fglrx(0): Invalid video BIOS signature!
(EE) fglrx(0): GetBIOSParameter failed
(EE) fglrx(0): PreInitAdapter failed

Could you help me?

Thanks

Hi loverdrive,

#1 an idea for a solution: go into your BIOS and switch your graphics card to “discrete”, as described here: Problems with fglrx - ThinkWiki

#2 as advice from another ATI user: use the open-source radeon driver. It’s superior in every aspect apart from 3D gaming and maybe power consumption. But if you have some experience with Linux and openSUSE, you can update the kernel to version 3.11 and enjoy the full benefits of the newly released power management in the open-source driver.
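In case it helps, upgrading the kernel on openSUSE usually boils down to adding the Kernel:stable repository from the build service and doing a vendor change to it. A rough sketch only, run as root; the alias “kernel-stable” is just an example and the repository URL/layout may differ for your release:

zypper ar -f http://download.opensuse.org/repositories/Kernel:/stable/standard/ kernel-stable
zypper dup --from kernel-stable

Reboot afterwards and check with “uname -r” that the new kernel is actually the one running.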

I did so myself and can confirm that kernel 3.11 is a huge step forward for all ATI users.

Best wishes,
Simon

Maybe still a slight edge to the prop. drivers in some cases, but overall I’d say the differences are pretty much negligible. Once DPM stabilizes (and is possibly tweaked a little further), there is unlikely to be any real difference between the two.

> kernel 3.11 is a huge step forward for all ATI users.
Not all users … only those with r600 and later based hardware … older ASICs will still have to use the older “dynpm”.
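To make that concrete: on kernel 3.11/3.12 the new DPM code is present but not active by default, so it has to be requested on the kernel command line, while older (pre-r600) ASICs keep using the old power_method interface in sysfs. A minimal sketch, assuming the card shows up as card0:

radeon.dpm=1                                              # boot parameter, e.g. appended to the kernel line in the boot loader
echo dynpm > /sys/class/drm/card0/device/power_method     # older ASICs: switch to the old dynamic method at runtime (as root)

Whether DPM actually took effect can be checked (as root) via /sys/kernel/debug/dri/0/radeon_pm_info.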

I hope the future will prove you right. But when it comes down to 3D gaming, the open-source driver doesn’t really stand a chance: [Phoronix] Radeon HD 5000 Series Gallium3D Performance vs. Catalyst (http://www.phoronix.com/scan.php?page=article&item=amd_hd5000_dpm93&num=1) (the only benchmark I know of so far).

I agree. I am still a bit over-excited about the benefits that kernel 3.11 brought me.

To be clear, my comments were strictly in regard to PM. And in that regard, the future is already here.

> But when it comes down to 3D gaming, the open-source driver doesn’t really stand a chance: [Phoronix] Radeon HD 5000 Series Gallium3D Performance vs. Catalyst (the only benchmark I know of so far).

[ul]
[li]Just over three months ago I wrote:
> as for 3D … by the r600g … this driver is getting better and better all the time … bleeding-edge code and the enabling & disabling of certain features (of the driver and the CPU freq governor), by my estimate, now places its OpenGL performance typically within 85-90%, and in some cases above, that of the prop. Catalyst/fglrx driver stack’s … this is/would be, no doubt, news to most … though there may be variation amongst hardware … r600 covers a broad swath of hardware[/li]
[li]Larabel typically tests with stock settings (i.e. the OOTB experience). That was no different in the case of the article you linked to:
> The rest of the platform was left stock, including not forcing the R600 SB back-end, but separate Phoronix articles will deliver the shader optimization back-end benchmarks especially now that it looks like it could soon be enabled by default.[/li]
[li]At the time I wrote the above, sb support was not even built into the driver by default (you had to pass the --enable flag), let alone turned on. Through Mesa 9.2 (and including the 9.3 dev pull Phoronix tested with), sb support was built by default, but you still had to enable it yourself (by simply passing “R600_DEBUG=sb” via any one of the numerous ways available to do so; see the sketch after this list). As indicated, Phoronix did not do the latter. Mesa 9.3 now not only builds sb support into the driver by default, it also enables it.[/li]
[li]The result of using sb generally speaks for itself:
[LIST]
[li]linux + r600 Shader Optimization Back-End vs Windows 7 + Catalyst 13.1 | Gears on Gallium[/li]
[li]11-Way AMD Radeon GPU Comparison On Linux 3.12, Mesa 9.3 - Page 3[/li]
[/LIST][/li]
[/ul]
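For anyone who wants to try those bits on a Mesa where sb is built but not yet enabled by default, the usual route is the R600_DEBUG environment variable (and, for the governor tweak mentioned above, cpupower). A rough sketch; glxgears is just a stand-in for whatever application you actually run:

R600_DEBUG=sb glxgears                   # enable the SB shader back-end for a single application
export R600_DEBUG=sb                     # or for the whole session, e.g. from ~/.profile
cpupower frequency-set -g performance    # optional: pin the CPU frequency governor to performance (as root)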
But, as they say (and as I intimated earlier), YMMV.

In any regard, I think it is good to keep things in perspective (see this comment).

Hi Tyler_K,

thank you for the interesting figures. They’re very impressive. I wasn’t aware that the developers of the radeon driver had made such a huge step forward with the latest releases.

So would you say there isn’t any advantage of fglrx over the free driver nowadays, and would you therefore give the thread starter a clear go-ahead to change?

Simon

One must always be careful to differentiate with regard to the hardware, because there is no one “driver”; it’s plural. I’ll copy part of what I wrote elsewhere recently and then expand on it a bit to make it relevant to this thread:

Here is a very brief (and grossly simplified) description of the key components of that stack for contemporary AMD graphics adapters under the X display server:

[ul]
[li]kernel component
[LIST]
[li]DRM/KMS kernel driver … the radeon driver (radeon.ko) … what you see listed if you use “lsmod”, “lspci”, etc.[/li]
[/LIST][/li]
[li]userspace components
[LIST]
[li]DDX driver … the Xorg driver (radeon_drv.so)[/li]
[li]3D/OpenGL (Mesa) driver … for which there exist, applicable to the particular hardware, r300g (r300_dri.so), r600g (r600_dri.so), and radeonsi (radeonsi_dri.so)[/li]
[/LIST][/li]
[/ul]
As for the 3D drivers, their naming indicates the class of hardware at which the driver’s coverage begins (example: r600g covers r600 adapters up through NI (Northern Islands)). The “g” appended to the names of the first two denotes that they are Gallium-type Mesa drivers, distinguishing them from classic Mesa drivers (there used to be (actually, originally there were only) classic r300 & r600 drivers, but these were removed once the Gallium versions became mature). Support for SI (Southern Islands) class hardware (now extended to cover the future CIK/Sea Islands adapters) was developed as a Gallium driver from the very beginning, so no such naming distinction is used (or needed). Support for SI devices (i.e. the radeonsi driver) is slowly coming along, but is still not as mature or feature-complete as r600g.
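A quick way to see which of those pieces are actually in play on a given box (glxinfo ships in the Mesa-demo-x package on openSUSE, if memory serves; the package name may differ):

lsmod | grep radeon                              # is the radeon DRM/KMS kernel driver loaded?
grep -i radeon /var/log/Xorg.0.log | head        # did X pick up the radeon DDX?
glxinfo | grep -i "opengl renderer"              # which Mesa/Gallium driver is providing 3D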

So, for hardware that falls within the coverage of r600g in the OSS world (which is essentially HD2000 through HD76xx, plus all the APU parts (they may carry HD8xxx GPU names, but they are indeed rebrands, and not of the GCN arch)), the difference between the OSS stack and the prop. stack is, in the general case, no longer meaningful enough (IMO) to warrant use of the prop. driver. The one exception where the prop. drivers will provide a much better experience is OpenCL. Other than that, PM is now on par, accelerated video decode tips in the OSS favour, 2D is much better with OSS, and 3D is slightly better with prop. There will, of course, be corner cases that go one way or the other, perhaps in some cases in a much more meaningful way than in the general case.

In addition, and this is the really important point, you’d have to profile your particular application usage and see if it makes sense for you; I can only advocate what I think makes sense in general terms. And given the OP is using an HD64xx adapter (a very weak GPU series), I’m pretty sure there are no pressing 3D or OpenCL needs to be met, so I would indeed endorse the recommendation to switch to the OSS drivers after upgrading their kernel and graphics stacks (note: they will still have to manually set a few options to take full advantage of the new features and capabilities; a rough sketch follows below).
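For the record, switching back to the OSS stack on openSUSE is roughly: remove whatever fglrx packages the how-to installed, drop the aticonfig-generated xorg.conf so X auto-detects the radeon driver again, and then set the DPM boot option sketched earlier in the thread. A rough sketch only, as root; package names vary by release, so check the search output before removing anything:

zypper se -i fglrx                                        # list the installed fglrx packages first
zypper rm "fglrx*"                                        # remove them (if the wildcard doesn't match, name them explicitly)
mv /etc/X11/xorg.conf /etc/X11/xorg.conf.fglrx-backup     # aticonfig's config would otherwise keep forcing fglrx

After a reboot with radeon.dpm=1 on the kernel command line, the radeon stack should be in use.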

For hardware that falls within the scope of coverage of radeonsi in the OSS world (i.e. HD77xx through current and future parts, less the current APU parts as explained above), if you’re in any way using 3D beyond the basic desktop, then you would currently be much better served by the prop. drivers.

Just posting a correction in case anyone ever reads this old stuff or comes to it via a search (Google or otherwise) …

I’ve seen some recent material that reminded me that what I wrote above isn’t correct: I neglected to consider the new(ish) Kabini & Temash based APUs … they use a Jaguar core and their GPU is indeed GCN/SI based. The Richland APUs, however, are, as I said earlier, rebrands of the older GPU arch (though slightly updated).

Great explanation, Tyler_K.

Thank you!