openSuSE 13.1 (libvirtd/libxl): How to eliminate the framebuffer?

Greetings!

Is there any way to completely eliminate the virtual framebuffer from a VM configuration? Since I’m accessing said VM by means of XDMCP and have directed tty output to a virtual serial port, it’s absolutely unnecessary to retain the graphical fb.
I have already removed the <graphics …> and <video …> sections from the VM definition, but it still presents a fb on 127.0.0.1:5900 - the default config, as I take it.
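
For reference, this is roughly how I check whether the active definition still contains a graphics device (“myvm” is just a placeholder for the domain name). As far as I understand, the XML files under /etc/libvirt must not be edited directly, since libvirtd overwrites them, so I went through virsh instead:

virsh edit myvm                                   # remove the <graphics> and <video> elements here
virsh dumpxml myvm | grep -E '<graphics|<video'   # no output would mean the devices are really gone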

However, since I want to place another VM on port 5900 on every available interface, I would need a way to get rid of the fb altogether.
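
For that second VM I assume the listen address and port can simply be pinned in its <graphics> element; something along these lines (port and address are only illustrative):

<graphics type='vnc' port='5900' autoport='no' listen='0.0.0.0'/>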

Is there any way to get this done?

I may be incorrect because I haven’t looked closely at how XDMCP works recently, but IIRC XDMCP is mainly a protocol definition and doesn’t absolve the Guest from having to create the required video frames.

Admittedly could be 'way off base,
TSU

To be more precise, I’m not talking about the fb required by X to generate its frames. Even a machine without any graphics card can still run X - provided you give it some means of displaying the desktop, like XDMCP.
Instead, I’m attempting to eliminate the virtual graphics card (and thus the fb created by libvirt), because it is superfluous when the display runs via XDMCP.

First, you might request that this thread be moved to the Install/Boot forum, where it would likely be seen by more people who dig deeper into what happens under the hood.

But, you might take a look at what the MAN page for xdm says…
http://www.x.org/releases/X11R7.6/doc/man/man1/xdm.1.xhtml
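
If it helps, my understanding from that page is that xdm only answers XDMCP once its request port is no longer disabled and the remote hosts are allowed in Xaccess; a rough sketch (paths as shipped by X.Org - openSUSE may additionally wrap this in /etc/sysconfig/displaymanager, e.g. DISPLAYMANAGER_REMOTE_ACCESS="yes"):

! In /etc/X11/xdm/xdm-config: comment out the line that disables XDMCP
! DisplayManager.requestPort: 0

# In /etc/X11/xdm/Xaccess: allow the hosts that may request a session ("*" = any host)
*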

Although what you describe might be possible, I cannot think at the moment how it could work. The MAN page also pretty much confirms my understanding that XDMCP is the protocol and not central to your question. Although XDMCP can be used to access remote X managers other than XDM, I’m pretty sure XDM is what is forwarded by default and is most typical. Unless you can specify a replacement for XDM, I don’t see how what you’re describing is likely.

But, as I suggested, maybe someone else can provide a better and more authoritative answer than my “understanding”; my guess is that if your question can be answered differently, that answer will turn up in the Install/Boot forum.

TSU

Here’s a little more information which might help in your investigation. It describes running the X display manager on the client instead of the server, but it still requires an X server running on the target machine. That seems reasonable, because I’m not aware that the X server on the target machine can be eliminated.

http://www.thefullwiki.org/XDMCP
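
If I read that right, the viewing machine runs its own X server and simply points it at the remote display manager via XDMCP, roughly like this (the host name is only an example):

X :1 -query vmhost.example.net        # full-screen X server on the client, login served by the remote machine
Xnest :1 -query vmhost.example.net    # same thing, but nested inside an existing X session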

TSU

What I’m concerned with here isn’t XDMCP (that works absolutely fine), but how to completely get rid of the emulated graphics card.
My problem is that although I have removed both the <graphics> and the <video> section from the domain specification, libvirtd still keeps the virtual display up and running (although it’s only reachable on localhost). What I’m after is a completely headless VM.
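
In case it matters, this is the sequence I have been using (“headless-vm” is a placeholder); as I understand it, device changes only take effect after a full shutdown and fresh start, not after a reboot from inside the guest:

virsh edit headless-vm         # delete the <graphics> and <video> sections
virsh shutdown headless-vm     # edits to the persistent config don't touch the running instance
virsh start headless-vm        # start again from the edited definition
virsh vncdisplay headless-vm   # should now print nothing / report an error if no VNC server is attached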

You might start by finding and posting a reference that supports what you say is possible… an X server not running on the target machine.
The MAN page link I posted suggested that, at least ordinarily, an X server has to be running, which I assume means that an emulated “framebuffer” is also required.

TSU

Why do I have a feeling that we are talking at cross-purposes right now?

So, to take X out of the equation once and for all, let’s consider one of my other VMs, which only has SSH login and no X running, yet libvirt still sets up a virtual graphics adapter for it (if only to display the system’s text console): how do I get rid of the virtual graphical subsystem set up by libvirtd?
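
To illustrate, even on that SSH-only guest the host still shows a VNC listener for it. This is roughly how I check (domain name is a placeholder, and the port may differ on your host):

virsh vncdisplay ssh-only-vm   # still reports a display such as 127.0.0.1:0
ss -tlnp | grep :59            # and the host still has the matching VNC listener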

Although I’d be guessing, it’d be because vm manager under the hood is using VNC to display guests (which of course requires a running X server regardless of whether you’re connecting to a Desktop or console).
That wouldn’t mean that an X server is required to run the guest; it’s only needed for remote viewing within a specific app, vm manager.

You could probably try removing the graphics card, starting the Guest from the command line and then SSHing in. In that case I’d assume the Guest would behave like any physical machine… it should work, but only present a console and not support a running Desktop (in that Guest).
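
Something along these lines is what I had in mind (guest name and address are only examples); since the console is already routed to a serial port, virsh console should work without any graphics at all:

virsh start testguest          # boot the guest without opening any graphical viewer
virsh console testguest        # attach to the serial console configured in the guest
ssh user@192.168.122.10        # or simply log in over the network as usual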

TSU

Thought I might add a few things I ran across since I last posted…

It looks like there <is> a way to implement a remote framebuffer device, and… no surprise, it uses the rfb (remote framebuffer) protocol:
RFC 6143 - The Remote Framebuffer Protocol
The Wikipedia RFB article
And based on that article, VNC clients actually use it to communicate with VNC servers.

Since the X Server runs on top of the Linux framebuffer device, and that framebuffer is remote to the running machine, it stands to reason that the X Server also sits on the remote client.

So, there you probably have it…

  • Implement VNC as the easiest way to get rfb.
  • You should be able to stop any X servers which might be running on the server machine if you wish, or even uninstall them if the machine is never to be accessed locally (IMO a dangerous decision); a rough sketch follows below the list.
  • Once connected through an rfb connection, you can run a command like the following, which prints a message only if an X server is running (no output means none is):
pidof X && echo "X server is running"
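
As for keeping X from starting on the server side, on a systemd-based release such as 13.1 something like the following should keep the machine in a text-only target; this is only a sketch, and only sensible if you really never need a local graphical login:

systemctl set-default multi-user.target   # don't start a display manager / X at boot
systemctl isolate multi-user.target       # leave the graphical target right now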

TSU