
Writing userspace USB drivers for abandoned devices

Nov 17 2019

A picture of some Epiphan VGA2USB LR

I recently found some USB devices on eBay (Epiphan VGA2USB LR) that could take VGA as input and present the output as a webcam. Given that I was keen on the idea of not needing to lug out a VGA monitor ever again, and there was claimed Linux support, I took the risk and bought the whole job lot for about £20.

When they arrived, I plugged one in, expecting it to come up as a USB UVC device, but it did not. Was I missing something?

After looking through the vendor's site, I discovered that a custom driver was required for them to work. As I normally live the easy life on Linux, never needing to pull in drivers because the distribution kernel I am using already has them, this was a reasonably novel concept.

Sadly, it seems driver support for the devices in question ended at Linux 4.9, meaning none of my systems could run this device anymore (Debian 10 [Linux 4.19] or the latest LTS Ubuntu [Linux 5.0]).

But surely this was something I could patch myself, right? Surely the package files were actually just a DKMS package that built the driver from source code on demand, like a lot of the out-of-tree drivers out there…

Sadly, this was not the case.

Inside the package is just a pre-compiled binary called vga2usb.ko. I started doing some basic investigation into how hard it might be to reverse engineer and found some interesting string table entries:

$ strings vga2usb.ko | grep 'v2uco' | sort | uniq
v2ucom_autofirmware
v2ucom_autofirmware_ezusb
v2ucom_autofirmware_fpga

Is this device actually an FPGA-on-a-stick? What would the process of getting something like that running even look like?

Another amusing, and mildly alarming, find was the strings for DSA private key parameters. This made me wonder whether there was private key material inside this driver and what could be protected with it:

$ strings vga2usb.ko | grep 'epiphan' | sort | uniq
epiphan_dsa_G
epiphan_dsa_P
epiphan_dsa_Q

To observe the driver in its normal operating environment, I made a Debian 9 VM (the last supported release) and did a KVM USB passthrough to give it direct access to the device. I then installed the driver and confirmed that it worked.

After that, I wanted to see what the wire protocol looked like. I was hoping that the device sent raw (or close to raw) frames over the wire, as this would make the task of writing a userspace version of the driver easier.

To do this, I loaded the usbmon module on the VM's host machine and used Wireshark to take a packet capture of the USB traffic to and from the device during startup and whilst capturing video.

Wireshark IO graph of USB traffic over time

I found that on device startup there was a large number of small packets sent to the device before it could capture anything. I assumed this meant the device was, in fact, as described above, an FPGA-based platform with no persistent storage: every time the device is plugged in, its firmware has to be "bitstreamed" from the driver itself.

I confirmed this by opening one of the units up:

The inside of an Epiphan VGA2USB LR

ISL98002 CRZ-170 – acting as an analog-to-digital converter for the VGA signals

XC6SLX16 – Xilinx Spartan-6 FPGA

64 MB of DDR3 RAM

CY7C68013A – Cypress EZ-USB controller / frontend for the device

Given that to "boot" this device I needed the bitstream to send to it, I got to work on the pre-compiled binaries to try to extract the bitstream/firmware. After running binwalk -x and watching it find a few zlib-compressed objects, I wrote a script that would search them for a known hex sequence, picking 3 bytes from the pcap that I knew were from the bitstreaming process to search for:

$ bash " (3f)  " trying 0.elf trying 30020 trying 30020 .zlib trying 30020 .zlib.decompressed ... trying 84 BB0 trying 84 BB0.zlib trying 84 BB0.zlib.decompressed trying AA 240 trying AA 240 .zlib trying AA 240 .zlib.decompressed  (D0) ************************************************************************************************************************************************************** (2f) ***************************************************************************************************************************************************************** (3f)   (7d 7c)  00 00 00 00 00 00 00 | ./.? UP} | ........ | trying C 6860 trying C 6860 .zlib

After decompressing the AA240.zlib file, I found that there was not enough data there to be the full bitstream, so I instead went down the route of extracting the firmware out of the USB packet capture.

I found that while both Tshark and tcpdump can read USB packets inside pcap files, each would only dump part of the information in the capture. Since each program had different parts of the puzzle, I wrote a small program that would unify the output of both programs into Go structs so they could be replayed back to the device.
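The replay structs themselves aren't shown in the post; one plausible minimal shape, holding the standard USB control-transfer fields that tshark and tcpdump each partially expose (the field layout here is my guess, not Epiphan's protocol), could be:

```go
package main

import (
	"encoding/hex"
	"fmt"
)

// controlTransfer merges the fields the two tools each dump
// partially; one struct per captured packet is enough to replay
// the setup sequence back to the device in order.
type controlTransfer struct {
	RequestType uint8  // bmRequestType: direction, type, recipient
	Request     uint8  // bRequest
	Value       uint16 // wValue
	Index       uint16 // wIndex
	Data        []byte // payload (e.g. 64-byte bitstream chunks)
}

func (c controlTransfer) String() string {
	return fmt.Sprintf("ctrl %02x %02x v=%04x i=%04x data=%s",
		c.RequestType, c.Request, c.Value, c.Index, hex.EncodeToString(c.Data))
}

func main() {
	// A hypothetical entry as it might appear in the replay list.
	pkt := controlTransfer{RequestType: 0x40, Request: 0xb0, Data: []byte{0xff, 0x00}}
	fmt.Println(pkt)
}
```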

At this point I noticed that bootstrapping happens in two stages: first for the USB controller, and then for the FPGA itself.

For at least a few days I was stuck on an issue where the whole bitstream would appear to upload, but the device would not start up, despite the packet captures of the real driver and my userspace one looking identical.

This was eventually solved by combing through the pcap, paying attention to the time it took to respond to each packet, and noticing a large difference in one particular packet's timing:

Timing difference shown between two pcaps in wireshark

It turned out a manually entered typo caused a USB control write to go to the wrong area of the device. Serves me right for manually entering a value in…

Regardless, I now had a green blinking LED on the device! A massive achievement!

Since it was relatively trivial to replicate the packets that seemed to start the data streaming, I was able to set up a USB bulk transfer endpoint and have data being dumped to disk in no time!

This is where the real challenge started, because after analysis it appeared that the data was not encoded in any obvious way.

To start with, I used perf to get a general view of what the driver's stack traces looked like while it was running:

perf stack trace from inside the driver

Whilst I made progress with being able to hook functions that had frame data in them, I still didn’t get any closer to figuring out the encoding of the image data itself.

ffmpeg running while showing a kprobe trace running at the same rate

I did try the NSA's Ghidra to get a better idea of what was going on inside the real driver:

a sample of some of the decompiled C from ghidra

While Ghidra is incredible (this was my first time using it compared to IDA Pro) it still wasn’t quite good enough to reasonably help me understand the driver. I needed another path of investigation if I was going to reverse engineer this.

I decided to provision a Windows 7 VM and check if the Windows driver was doing anything different. I also noticed during that time that there was an SDK for the devices, and one of its tools ended up being of particular interest:

PS> ls

    Directory: epiphan_sdk-3.30.3.0007\epiphan\bin

Mode          Name
----          ----
-a---         frmgrab.dll
-a---         v2u.exe
-a---         v2u_avi.exe
-a---         v2u_dec.exe
-a---         v2u_dshow.exe
-a---         v2u_edid.exe
-a---         v2u_kvm.exe
-a---         v2u_libdec.dll

PS> .\v2u_dec.exe
Usage:
    v2u_dec [format] [compression level]
        - sets compression level [1..5],
        - captures and saves compressed frames to a file
    v2u_dec x [format]
        - decompresses frames from the file to separate BMP files

This tool lets you fire "one shot" captures, and notably it doesn't apply compression to the frames, so that the output can be processed on a faster machine later. This was practically perfect: I replicated the USB packet sequence to obtain these uncompressed blobs, and looking at the byte counts, it matched getting around 3 bytes (RGB) per pixel!

Initial processing of these images (just taking the output and writing it as RGB pixels) resulted in something roughly inspired by the input I was giving to the device over VGA:

The mess that bears some resemblance to the input

After some more debugging with a hex editor, I discovered there was some kind of marker every 1028 bytes. It took a slightly embarrassing amount of time to write a watertight filter for that; on the other hand, I ended up producing some modern art in the process.

Is it a programming error or modern art? Who knows.

After realizing that the tilt/shear in the image was caused by me skipping and carrying over a pixel on every line (x=799 != x=800), I finally ended up with an image that was almost spot on, apart from the color:

A very orange or pink AB test pattern

Initially I thought this might have been a calibration issue, caused because I took some sample data when the VGA input was stuck on a solid color. To fix this, I built a new test image that would try to smoke these issues out. In hindsight I should have used something like a Philips PM5544 test card.

The test card I made
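A card along those lines can be generated with Go's image package. This is a stand-in, not a reconstruction of the card from the post: color bars on top to catch channel-order mistakes, a grey ramp below to catch range and gamma issues:

```go
package main

import (
	"image"
	"image/color"
	"image/png"
	"os"
)

// makeTestCard draws a simple card: color bars over a grey ramp,
// enough variety to smoke out channel-order and color-space bugs.
func makeTestCard(w, h int) *image.RGBA {
	img := image.NewRGBA(image.Rect(0, 0, w, h))
	bars := []color.RGBA{
		{255, 255, 255, 255}, {255, 255, 0, 255}, {0, 255, 255, 255},
		{0, 255, 0, 255}, {255, 0, 255, 255}, {255, 0, 0, 255}, {0, 0, 255, 255},
	}
	for y := 0; y < h; y++ {
		for x := 0; x < w; x++ {
			if y < h/2 {
				img.Set(x, y, bars[x*len(bars)/w]) // classic bars on top
			} else {
				g := uint8(x * 255 / (w - 1)) // grey ramp below
				img.Set(x, y, color.RGBA{g, g, g, 255})
			}
		}
	}
	return img
}

func main() {
	f, err := os.Create("testcard.png")
	if err != nil {
		panic(err)
	}
	defer f.Close()
	png.Encode(f, makeTestCard(800, 600))
}
```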

After loading this image onto the VGA-producing laptop, I ended up with an output of:

The test card before correction

At this point I had a flashback to some 3D rendering/shader work I did long ago. This looked a lot like YUV color.

I ended up reading up on YUV and remembered that during my reverse engineering of the official kernel driver I'd found that setting a breakpoint on a function called v2ucom_convertI420toBGR24 would hang the system without the ability to resume. So maybe the input was in I420 encoding (ffmpeg's -pix_fmt yuv420p) and the expected output was blue, green and red as 8-bit bytes?

After using Go's built-in YCbCrToRGB, the image suddenly looked much closer to the original.

yay, a perfectly coded test card
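The conversion boils down to walking the three I420 planes (full-resolution Y, then quarter-resolution Cb and Cr) and feeding each pixel through the standard library's color.YCbCrToRGB. A minimal sketch of that step, not the driver's actual code:

```go
package main

import (
	"fmt"
	"image/color"
)

// i420ToRGB converts a packed I420 (planar YUV 4:2:0) frame to
// interleaved RGB. Plane layout: w*h bytes of Y, then w*h/4 bytes
// each of Cb and Cr, with chroma subsampled in 2x2 blocks.
func i420ToRGB(frame []byte, w, h int) []byte {
	ySize := w * h
	cSize := ySize / 4
	yPlane := frame[:ySize]
	cbPlane := frame[ySize : ySize+cSize]
	crPlane := frame[ySize+cSize : ySize+2*cSize]

	out := make([]byte, 0, ySize*3)
	for y := 0; y < h; y++ {
		for x := 0; x < w; x++ {
			ci := (y/2)*(w/2) + x/2 // one chroma sample per 2x2 block
			r, g, b := color.YCbCrToRGB(yPlane[y*w+x], cbPlane[ci], crPlane[ci])
			out = append(out, r, g, b)
		}
	}
	return out
}

func main() {
	// 2x2 frame: Y=235, Cb=Cr=128 decodes to white.
	frame := []byte{235, 235, 235, 235, 128, 128}
	fmt.Println(i420ToRGB(frame, 2, 2)[:3]) // prints [235 235 235]
}
```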

We did it! Despite the "WIP" quality, we were able to do 7 FPS. Honestly, for me that was good enough, since my use for these is as an emergency VGA screen rather than anything else.

So now we know this device well enough to explain how to operate it from a cold boot:

  1. You first need to initialise the USB controller. Based on the amount of data sent, I assume this is actually uploading code to the USB controller.
  2. Once the USB upload is done, the device will disconnect from the USB bus and come back a moment later with only a single USB endpoint.
  3. You can then start sending the FPGA bitstream, one 64-byte USB control packet at a time.
  4. Once you are finished, the LED on the device will start blinking green. At this point you can send what appears to be a parameter sequence (overscan and other properties).
  5. You can then fire a single control packet to get a frame; the control packet has the resolution embedded in it. If you use a 4:3 request packet on a widescreen input, you will often end up with a corrupted frame.

To make use as easy as possible, I ended up rigging up a small web server inside the driver to make it super easy to use in a rush. Thanks to the MediaRecorder API in browsers, it also allows for an easy way to record the output of the screen to a video file.

Example liveview page

As I'm sure a lot of people can relate when it comes to experimental code, I can't say I'm proud of the code quality, but it's likely in a state where it works well enough for me to use.

You can find the code for this (and pre-built versions for Linux and OSX) on GitHub:

Even if this is never used by anyone else, this was a hell of a roller coaster in USB protocol details, kernel debugging / module reverse engineering, and general video format decoding! If you liked this kind of stuff, you may like the rest of the blog. If you want to stay up to date with what I do next, you can use my blog's RSS feed or follow me on Twitter.

Until next time!
