Hi … I have openSUSE 12.3 running on a Dell OptiPlex 9010. It has an add-in ATI HD 7000 series video card plus the integrated Intel graphics (the Xeon E3-1200 v2/3rd Gen Core processor Graphics Controller). I currently have two monitors, one connected to the ATI card and the other to the Intel video port. By default, openSUSE does not detect this hardware configuration and only enables the Intel graphics. Having been a Linux user for some time, I decided to dig a little deeper. In a terminal I tweaked the default xorg.conf file produced by the command:
It basically has two graphics cards enabled, the Intel on the left and the ATI on the right. When I restart the X server, everything seems fine at the login window. I’m using GNOME 3.6.2 in fallback mode (a.k.a. GNOME Classic). When I log in, all my panel applets appear duplicated, and when I log out and log in again they keep duplicating themselves on the panel. I’ve seen many threads on the web suggesting the proprietary drivers, or GNOME 3, and others that never reach a [SOLVED] state.
What can I do to keep using the free radeon drivers on my openSUSE 12.3 with GNOME Classic and enable multiple-display support?
I tried it now and the effect seems to be the same: the applets start duplicating themselves. Also, the screens are completely independent. I can move the cursor from one to the other, but I have no luck dragging windows or anything else.
I’ve tried other things in the meantime. I tried the proprietary fglrx driver, but nothing new. I just can’t seem to enable hybrid graphics mode (one display on the ATI card, another on the integrated card), at least in GNOME. I don’t know about KDE.
Hi … I have openSUSE 12.3 running on a Dell OptiPlex 9010. It has an add-in ATI HD 7000 series video card plus the integrated Intel graphics (the Xeon E3-1200 v2/3rd Gen Core processor Graphics Controller). I currently have two monitors, one connected to the ATI card and the other to the Intel video port.
Ah, so you have hybrid graphics hardware. (Sorry, I missed that; I thought you had two separate graphics cards.) Both ports should serve the Intel or the ATI chipset, depending on which chipset is active at the time, AFAIU.
We really need a hybrid graphics user to chime in here. In the meantime, Google is your best friend.
OK, sorry for being away for some time. In the meantime I’ve done a lot of googling around, and here I post my solution to the problem. First, it seems to be a cross-distro problem. Something went wrong along the way with multiple-display support and the means each distro and Xorg use to autodetect it, whether you’re using free or proprietary drivers; I’m talking mainly about Ubuntu, Debian and openSUSE, from what I’ve seen.
These conclusions derive from my personal experience. openSUSE 12.3 currently doesn’t automatically detect a dual-display configuration with an integrated and a dedicated video card (using either the proprietary or the free video drivers). I arrived at the solution by manually tweaking the /etc/X11/xorg.conf file. At first, the original configuration ([Xorg config file for multi display issue - Pastebin.com](http://pastebin.com/jmc88UMM)) did give me multiple displays, but with the problem of the second screen’s panel applets stacking onto the first screen’s panels when using GNOME Fallback, and variable behaviour on GNOME 3 ranging from disabling the second screen entirely to showing an X cursor and not being able to do anything at all with the second screen.
The final solution is adding:
**Option "Xinerama" "true"**
to the ServerLayout section of the previous config file. With it I now have a multiple-display desktop: I can drag windows between the screens, and I’m able to post this here for other people. I cannot confirm whether the original problem arises when using both displays on the same dedicated ATI card.
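For anyone landing on this thread later, here is a minimal sketch of what the relevant section ends up looking like. The identifiers ("Layout0", "IntelScreen", "RadeonScreen") are illustrative only, not copied from the pastebin; match them to the Screen sections in your own xorg.conf:

```
Section "ServerLayout"
    Identifier  "Layout0"
    Screen   0  "IntelScreen"  0 0
    Screen   1  "RadeonScreen" RightOf "IntelScreen"
    # Merge the two Screens into one logical desktop so that
    # windows can be dragged between the monitors
    Option      "Xinerama" "true"
EndSection
```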
Now I can see from your xorg.conf file that you are NOT referring to hybrid graphics hardware, just two different GPUs (as I first thought). Your description was a little confusing. So now it makes sense why you’re using Xinerama. Anyway, good that you posted again to announce your success.
I cannot confirm whether the original problem arises when using both displays on the same dedicated ATI card.
This situation would be handled adequately by Xorg with xrandr, without any xorg.conf required.
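To illustrate the single-card case: with both monitors on one GPU, a couple of xrandr commands are all that’s needed. The output names below (DVI-0, HDMI-0) are examples only; they vary by driver, so check the output of the query command first:

```
# List the outputs the driver exposes on the running X Screen
xrandr -q

# Enable both monitors at their preferred modes, with the second
# placed to the right of the first (output names are examples)
xrandr --output DVI-0 --auto --output HDMI-0 --auto --right-of DVI-0
```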
It’s not a problem per se (it is from the user’s perspective, but there is no problem from the software perspective). What you have run up against are limitations in the way that X can work.
Most users are unaware of these limitations, and have formed expectations stemming from “it works this way in Windows” experience. The situation is further complicated by the fact that the relevant technical terms in X are the same words used in common vernacular. For example, you have to differentiate between what a Display and a Screen are in X parlance (I capitalize them intentionally here to make that distinction) and what the average user means by monitor/screen/display/panel …
In brief, as Deano already alluded to, it wouldn’t work because you currently cannot have more than one graphics adapter attached to a Screen within the X Display.
You can have as many monitors driven by the graphics adapter within that Screen, within that X Display, as the hardware allows.
Alternatively, you can have multiple graphics adapters, each attached to its own Screen within the X Display.
Alternatively, you can have multiple X Displays … say, for example, you had 3 graphics adapters: then you could have Display 0 with its Screen 0 (denoted :0.0), Display 1 with its Screen 0 (:1.0), and Display 2 with its Screen 0 (:2.0) … or maybe you set the adapters up running two X Displays, with one of the X Displays having two Screens attached to it, so one possible configuration could be :0.0, :1.0 and :1.1 … and so forth.
You can get even more complicated with the outputs on each of the graphics adapters by (instead of attaching all the outputs to the same Screen) effectively splitting the outputs across several Screens or Displays.
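To make the Screen arrangement above concrete, here is a sketch of a ServerLayout giving two independent Screens on one X Display (the identifiers are illustrative; each Screen line must name a matching Screen section bound to one of the adapters):

```
Section "ServerLayout"
    Identifier "TwoScreens"
    Screen  0  "Screen0" 0 0
    Screen  1  "Screen1" RightOf "Screen0"
    # Without Option "Xinerama" "true", these remain separate Screens:
    # clients address them as :0.0 and :0.1 (e.g. via the DISPLAY
    # environment variable), and windows cannot be dragged between them
EndSection
```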
Now, as to the solution that you discovered: what Xinerama does is effectively combine the Screens (of an X Display) into one Screen. However, it has a number of drawbacks and limitations. The obvious benefit it brings is that you can move windows across the various monitors, whereas ordinarily you cannot move windows across Screens.
[QUOTE]I cannot confirm whether the original problem arises when using both displays on the same dedicated ATI card.
This situation would be handled adequately by Xorg with xrandr, without any xorg.conf required. [/QUOTE]If you followed what I wrote above, it should hopefully be clear that such a configuration would have all monitors contained within one Screen within the X Display. There are no limitations on moving windows across the various monitors attached to a single Screen, for the reasons Deano outlined. It’s when you get into more elaborate configurations, such as what you are attempting, that you run into the inherent and current shortcomings of how things work in X.
So … will a configuration like this be supported in openSUSE in the near future? Maybe something that asks the user which type of configuration he’s using, with all the alternatives you explained above, maybe even with a verbatim copy if they wish to learn more on the subject? As a dedicated Linux user I’m used to searching a bit harder for solutions, even when they involve getting my hands a bit dirty, but for an ordinary user or a Linux newbie, this problem would be enough to make them switch distros or even OS. And of the threads I’ve seen, none seem solved.
Sorry, to which specific configuration do you refer?
Right now, you have to manually configure the more “advanced” setups. And even then, you don’t achieve seamless usability/operation: there are inherent limitations and, on top of that, bugs. This is true regardless of whether you’re running openSUSE, Ubuntu, Fedora, etc. It’s an X server thing, and until that changes, you can’t escape it.
Maybe something that asks the user which type of configuration he’s using, with all the alternatives you explained above, maybe even with a verbatim copy if they wish to learn more on the subject?
I’m sure a GUI could be written, as well as proper documentation.
As a dedicated Linux user I’m used to searching a bit harder for solutions, even when they involve getting my hands a bit dirty, but for an ordinary user or a Linux newbie, this problem would be enough to make them switch distros or even OS.
I agree, it would certainly perplex a newbie, especially if they come to the distro with certain expectations that things should work this way or like that (because they do in Windows). But it is the way that it is … for now.
(In the future, “Shatter” support within RandR will hopefully make all the current complexities and limitations moot and bring functional parity with Windows in these regards … also, Wayland, being a new display server design, is meant to address many of the shortcomings too.)
Wow! Unbeknownst to me (until two days ago), there are some awesome capabilities in RandR 1.4 for multiple-GPU handling … which pretty much set the stage and usher in the beginning of the end for Xinerama.