Using an Android device as a second monitor

Until I can afford a real VR headset like the Oculus Rift, I decided to order a Google Cardboard. With it I hope to use my Android smartphone (Samsung Galaxy S3) as a screen for my computer, and maybe even get some cheap head-tracking support! There’s mainly one problem: how do I use a smartphone or tablet as a screen? Since openSUSE Tumbleweed (KDE) is my OS, and the question is half a Linux one, I took the liberty of asking here.

More precisely, I’d like to know if and how I can use an Android smartphone or tablet as a second monitor, through the USB cable or Bluetooth. Can X11 or whatever manages displays on my distro recognize it as such, and properly mirror my monitor onto the device?

As a bonus, although this isn’t absolutely necessary: would it be possible to use the phone’s gyroscope / accelerometer as an input device on the computer at the same time? If it could be recognized like a mouse or a drawing tablet, positioning the mouse pointer based on the rotation of the phone (starting from the center), I could use it to get head-tracking support too!

Note 1: My Android device isn’t rooted. I couldn’t find any working tools for Linux to do that, nor do I want to risk damaging it now.

Note 2: I’m aware that Micro USB to HDMI cables exist, which would be the most natural way to do this. I want to use the normal USB cable however… both because I hope to use other features of the phone simultaneously (e.g. the gyroscope) and because the phone will need to charge while used as a screen, else it will run out of battery quickly.

Note 3: Please don’t suggest VNC! It’s an idea I’m considering if all else fails, but I don’t believe remote desktop over a Wi-Fi connection is the right way to connect a display to a computer… rather a hack that would be very laggy.

Sorry for the double post, but a short update: I tried both a VR-enabled VNC client and streaming my computer screen over HTTP using VLC… both over local Wi-Fi. In both cases it worked, but the lag makes it impractical: with VNC the image barely updates once a second, whereas with VLC streaming I get a constant frame rate but the image is several seconds behind due to buffering. I want to use this for gaming, and my hope was full, instant screen mirroring at 60 FPS… that’s probably only achievable over the USB cable, or maybe miraculously over Bluetooth. So my question remains open.

Re. your first question, I suppose a remote desktop client would do the trick. The problem is, the app* would have to support 3D.

  • why “app”? what’s wrong with program or application? Or is it a simian thing?

Cross-posted, sorry.

Sorry, but please read my first posts: I already tried remote desktop (VNC) over local Wi-Fi, and although it works, the lag makes using it for gaming impossible. I’d be interested in a VNC solution if it were doable via Bluetooth or the USB cable, not a network connection… so I could get delay-free, uncompressed 60 FPS. If anyone knows of a fast VNC setup, I’d like to hear more!
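For what it’s worth, one avenue I haven’t actually tested: adb can tunnel TCP connections over the USB cable, so a VNC session could in principle run over USB instead of Wi-Fi. A rough sketch, assuming x11vnc on the PC and any VNC client app on the phone (note that `adb reverse` needs Android 5.0 or later, so it may not work on an older device like my S3):

```shell
# Untested sketch: tunnel VNC over the USB cable with adb.

# 1. On the PC, share the current X display over VNC,
#    accepting only local connections (adb will connect locally):
x11vnc -display :0 -localhost -rfbport 5901 &

# 2. Make the phone's localhost:5901 reach the PC's port 5901
#    through the USB cable (requires Android 5.0+):
adb reverse tcp:5901 tcp:5901

# 3. On the phone, point the VNC client at localhost:5901.
```

Whether the VNC protocol itself is fast enough for 60 FPS is a separate question, but at least the USB link would take Wi-Fi latency out of the equation.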

What might still work is using a cvlc / ffmpeg server to stream my desktop to the device… if, compared to last night’s test, I find a way to get buffering down to just a few milliseconds. Since it’s over local Wi-Fi, I should be able to stream 720p (the device’s screen size) @ 60 FPS without much buffer time. I lose gyroscope-based head tracking then, but that didn’t seem to work with PC games anyway, due to the way they expect the mouse pointer to move.
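For reference, the kind of cvlc invocation I have in mind (the exact flags are from memory and will need tuning; the x264 settings are a guess aimed at low latency, not something I’ve verified):

```shell
# Sketch: stream the X desktop over HTTP with VLC's screen input.
# screen-fps and the transcode settings are starting points to tune.
cvlc screen:// --screen-fps 30 \
  --sout '#transcode{vcodec=h264,venc=x264{preset=ultrafast,tune=zerolatency}}:http{mux=ts,dst=:8080/desktop.ts}'

# Then open http://<PC-IP>:8080/desktop.ts on the phone
# (e.g. in VLC for Android).
```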

Years ago I did this (mirrored part of the desktop on my Android and vice versa) but that was between Windows and my Android.

IIRC the app and method were called “screencasting” back then, but today that term more often means something entirely different (recording the desktop).

You can still find the Google project, which has been continually updated over the years:
https://developers.google.com/cast/docs/remote
https://github.com/googlecast/CastRemoteDisplay-android

The only thing I can think of that <might> do something similar on Linux is KDE Connect:
https://community.kde.org/KDEConnect

Note that these solutions aren’t really useful for a HUD or 3D goggles like Cardboard or Oculus, because if you wear your phone that way, you won’t be able to touch the screen to send mouse-like gestures to the remote machine.

You really are getting <only> a display on your phone mirroring a part or all of the remote display. You can’t get a true virtual environment that utilizes your phone’s sensors without software like what might be embedded in a gaming console app.

TSU

Thank you for that info, I might look into it! Although it seems like methods of streaming the screen via USB are sadly limited and complicated, especially on a Linux PC.

I believe I now know how I want to do this: I’ll try using VLC or ffmpeg to stream my screen over HTTP… but with a buffer time under 200 ms, which should be sustainable over local Wi-Fi. I’ll then play the stream on the Android device using a web player that supports looking around the video with the gyroscope (there are a few on Google Play)… meaning I can emulate the VR component locally, at the expense of not being able to see the entire image at once.
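In case VLC’s buffering can’t be tamed, here’s the ffmpeg variant I might try instead (untested; `<phone-IP>` is a placeholder for the device’s address, and the stream would be played in something like VLC for Android):

```shell
# Sketch: capture the X display with ffmpeg and push it to the phone.
# 1280x720 matches the device screen; ultrafast/zerolatency aim to
# keep encoding delay down. MPEG-TS over UDP avoids an HTTP buffer.
ffmpeg -f x11grab -framerate 60 -video_size 1280x720 -i :0.0 \
  -c:v libx264 -preset ultrafast -tune zerolatency \
  -f mpegts udp://<phone-IP>:5000
```

UDP trades reliability for latency, which is the right trade here: a dropped frame matters less than a stream that stalls to wait for it.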

This technique suits me perfectly, since the gyroscope is really not usable as an input device in any normal PC game. I can still control the real view with the mouse, but look around inside that view from the Android side! If all goes well, this should allow for Oculus Rift-equivalent PC gaming with minimal lag and quality loss. I shall see how far I get though.

Realtime streaming using VLC is very, very resource intensive because VLC doesn’t stream raw video packets… VLC encodes the video stream into a standard streaming protocol which can then be viewed on any device which can read that protocol (You don’t need a VLC client to view a video stream from a VLC streaming server). At the very least, unless you have enormous computing power or dedicated encoding hardware you won’t likely be able to stream a full screen desktop without hesitations.

I’d instead probably look at any screencasting projects which can automatically capture and send using something like XBMC, and investigate viewers that run on Android that support that protocol.

TSU