Setting up a desktop environment for VNC admin from an iPad/Android tablet

I want to set up an alternative desktop environment that is better suited for touchscreen devices. I followed this thread:

http://unix.stackexchange.com/questions/21515/window-manager-and-desktop-environment-for-touchscreens

and saw that KDE offers alternatives such as Plasma Active and Plasma Netbook, but when I look through the OpenSUSE repositories I cannot find them. One option is to install GNOME 3, but that would introduce software conflicts with KDE, and I do prefer KDE on the desktop.

So what touchscreen-adapted desktop environments (to be chosen at the VNC login screen) are available for OpenSUSE 13.1?

Tablets are getting pretty common now so I think there should be at least something available for remote administration of an OpenSUSE system.
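For reference, with a classic vncserver setup (TigerVNC/TightVNC style) there is no session chooser at all; the desktop environment is picked by the user's ~/.vnc/xstartup script. A minimal sketch, assuming the relevant session packages are installed (the exact session command varies by distro and version):

```shell
#!/bin/sh
# ~/.vnc/xstartup -- sketch only; adjust the exec line to the session you want
unset SESSION_MANAGER
unset DBUS_SESSION_BUS_ADDRESS
# Start GNOME in the VNC session:
exec gnome-session
# ...or, for a KDE 4 session instead, comment the line above and use:
# exec startkde
```

If the xvnc login screen (via xinetd/display manager) is used instead, the available sessions are whatever the display manager offers, so this file only applies to the per-user vncserver approach.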

I’ve played around a bit with the additional touchscreen packages on my touchscreen-equipped laptop and found no real improvement or difference. The OS seems to interpret basic touch gestures accurately as mouse events without any special configuration. Maybe if I wanted to do something unusual I’d have a different experience.
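One way to verify how the system actually sees the touchscreen is to inspect the X input devices directly (a sketch, assuming an X session with the xinput utility installed):

```shell
# List all input devices; a touchscreen that shows up under
# "Virtual core pointer" is already being treated as a pointing device
xinput list

# Watch the raw XI2 event stream while touching the screen --
# plain button/motion events mean touches are delivered as mouse input
xinput test-xi2 --root
```

If the event stream shows only button press/release and motion events for touches, the translation to mouse input is happening below the desktop environment, which would explain why extra touchscreen packages make no visible difference.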

TSU

Well, last time I checked, the ordinary KDE/Plasma desktop is not that well suited for touchscreen swipe gestures or multitouch. I tried GNOME 3.12 in the Tumbleweed release and it seems to work, although I haven’t tried its multitouch/swipe capabilities over VNC. GNOME 3.10 is still very buggy in the 13.1 release.

I do prefer SSH for remote administration and KDE Desktop for desktop work but for portable touchscreen devices, neither solution is optimal.

Unfortunately IBM SPSS 22 intermittently refuses to run in Tumbleweed for some reason, and the GNOME GUI doesn’t sit well with the Java GUI of SPSS. This is a dealbreaker. I also can’t get rid of the “NetworkManager not running” error message in the bottom panel of the KDE desktop, even though the network works just fine. Swapping between ifup and NetworkManager in YaST doesn’t help.
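To see whether NetworkManager is actually running behind the applet's back, you can query systemd directly (a sketch; 13.1 ships systemd, and the KDE applet talks to NetworkManager over D-Bus):

```shell
# Is the NetworkManager service active at all?
systemctl is-active NetworkManager.service

# Full status, including recent log lines:
systemctl status NetworkManager.service

# A restart sometimes clears a stale applet state (run as root):
systemctl restart NetworkManager.service
```

If the service reports active while the panel still claims it is not running, the problem is likely on the applet/D-Bus side rather than with the network backend itself.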

The only machine I’ve run KDE on that’s touch-enabled is my HP Envy 17 Touch. When a machine like this is both a laptop (with a conventional touchpad) and touchscreen-enabled, I’m pretty sure touch events can be translated into click events by the hardware alone. This is likely a different hardware architecture than a dedicated touch device, i.e. a tablet or smartphone (with no mouse device), and it may also differ from hardware that is fundamentally an ordinary PC with a mouse, but with a touchscreen display added.

This could be why I don’t see much difference, because I don’t need software packages to interpret touch screen input. Maybe if I were to install KDE on a tablet instead I would have a different experience.

I’d recommend you consider your own hardware as well to evaluate whether your situation is the same as mine.

As for using VNC to remote into my device, I haven’t tried that, but in theory I doubt it would make a difference on my hardware.

TSU

A touchscreen device generates far more events than just a click. Swiping is tantamount to a click-and-drag with the mouse (e.g. the hand cursor in Adobe Reader). Then there is a series of multitouch events such as pinch-to-zoom, or in more advanced implementations pinch-to-zoom-and-rotate; see e.g. here:

https://www.youtube.com/watch?v=P0Dgqq-93ls

Double-click is very inconvenient on a touchscreen, and although a swipe can be interpreted (by the VNC client) as a drag-and-drop, fingers are not as precise as a mouse pointer, so a thin scrollbar can be quite painful to operate on a touchscreen device.