A great open source 4K camera, with an interesting update about the new AXIOM Beta:
With the AXIOM Beta, we maintained this approach but thought about how we could push the boundaries. Currently, the HDMI plugin module is limited to 1080p60 as the highest possible throughput mode (a frequency limitation of the lanes coming from the Zynq FPGA; a future plugin module with an onboard FPGA featuring gigabit transceivers will be able to output UHD/4K signals directly). So we looked for ways to pack more data into the existing modes, and the result is that we can capture 2160p30 inside a 1080p60 video (or, in the future, 2160p25 inside 1080p50 and 2160p24 inside 1080p48) on an external recorder connected to an AXIOM Beta. So far this mode was called “experimental 4K RAW”, but since it is getting less and less experimental and we now know it works (we shot the entire April Fools joke video this way), we have dropped the “experimental” from the name.
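A quick back-of-envelope check shows why this packing works out exactly: a 1080p 4:2:2 HDMI frame carries one luma sample per pixel plus chroma at half horizontal resolution, and two such frames hold precisely as many samples as a UHD Bayer mosaic (one raw sample per photosite). This is a sketch of the sample budget, not the actual firmware arithmetic:

```python
# Sample budget of one 1080p frame in 4:2:2: full-resolution luma (Y)
# plus Cb and Cr at half horizontal resolution.
y_samples = 1920 * 1080                  # 2,073,600 luma samples
cbcr_samples = 2 * (1920 // 2) * 1080    # 2,073,600 chroma samples (Cb + Cr)
per_frame = y_samples + cbcr_samples     # 4,147,200 samples per frame

# A UHD Bayer mosaic has one raw sample per photosite.
uhd_bayer = 3840 * 2160                  # 8,294,400 samples

# Two 1080p60 frames carry exactly one 2160p30 Bayer frame.
assert 2 * per_frame == uhd_bayer
```

The same ratio holds for the planned 1080p50/2160p25 and 1080p48/2160p24 pairs, since only the frame rate changes.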
Second Generation Raw Mode
To address the downsides of the previous generation, we implemented the following changes:
- Recorded the image at double frame rate (1080p60), so that every two HDMI frames can be combined into a single UHD frame (3840×2160).
- To fix the resolution loss in the green channel, we sent the Bayer G1 channel on even frames, and G2 on odd frames.
- To fix the resolution loss in the red and blue channels, caused by 422 subsampling, we sent the Bayer R/B channels on the even frames from the HDMI stream, and a shifted version (by one pixel) on the odd frames.
- Subtracted the static components of the row/column noise. Ideally, we should have subtracted a complete dark frame, but this is not yet possible in the current FPGA firmware. Therefore, we have corrected the row/column offsets before sending the data to HDMI (because they were the main reason for the codec struggles), and subtracted the rest of the dark frame from the HDMI image in post-production.
- To minimize the image alteration caused by encoding the linear 12-bit data as 8-bit, we have designed an optimal LUT (similar to a log gamma), based on the noise profile of the image, that attempts to throw away only bits of noise, without altering the original signal, if possible.
- In a nutshell, we ended up sending (R,G1,B) on even HDMI frames, and (R’,G2,B’) on odd frames, after applying a LUT. Note: the apostrophe ‘ means “shifted by one pixel”.
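The noise-based LUT described above can be sketched as follows. The noise model and its parameters here are assumptions for illustration, not the actual AXIOM firmware table: the idea is to make the quantization step at each signal level proportional to the sensor noise there, so that rounding to 8 bits discards mostly noise. With shot noise, sigma(x) ≈ sqrt(read_noise² + gain·x), which yields a log/sqrt-like curve:

```python
import math

# Hypothetical noise-profile parameters (12-bit DN), for illustration only.
READ_NOISE = 4.0   # dark-noise standard deviation
GAIN = 0.5         # shot-noise scaling

# Accumulate 1/sigma so the output code spacing tracks the local noise level:
# where noise is large, one 8-bit step covers many 12-bit codes.
acc = 0.0
curve = []
for x in range(4096):
    acc += 1.0 / math.sqrt(READ_NOISE ** 2 + GAIN * x)
    curve.append(acc)

# Normalize to the 8-bit output range and round to integer codes.
lut = [round(c / acc * 255) for c in curve]

assert lut[0] == 0 and lut[-1] == 255
assert all(b >= a for a, b in zip(lut, lut[1:]))  # monotonic, invertible in post
```

Because the curve is monotonic, the recorder-side footage can be mapped back to approximately linear values in post-production with the inverse table.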
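Putting the even/odd scheme together, post-production has to weave each decoded frame pair (after undoing the LUT) back into one UHD Bayer mosaic. This is a minimal sketch of that reassembly; the exact channel-to-plane mapping (which HDMI plane carries which Bayer channel, and the RGGB ordering) is assumed for illustration:

```python
def reassemble_bayer(even, odd):
    """Recombine one even/odd pair of decoded 1080p 4:2:2 frames into a
    full Bayer mosaic (lists of rows, twice the height and width of Y).

    even, odd: dicts with 'Y', 'Cb', 'Cr' planes as lists of rows.
    Y is full resolution (G1 on even frames, G2 on odd frames); Cb/Cr are
    the half-width 4:2:2 R/B samples, with the odd frame holding the
    by-one-pixel-shifted columns.
    """
    h = len(even['Y'])        # 1080 for the real stream
    w = len(even['Y'][0])     # 1920

    def interleave(a, b):
        # Rebuild a full-width plane from its unshifted and shifted halves.
        rows = []
        for ra, rb in zip(a, b):
            row = [0] * (2 * len(ra))
            row[0::2] = ra    # even columns: unshifted samples (even frame)
            row[1::2] = rb    # odd columns: shifted samples (odd frame)
            rows.append(row)
        return rows

    R = interleave(even['Cb'], odd['Cb'])   # full-resolution red plane
    B = interleave(even['Cr'], odd['Cr'])   # full-resolution blue plane

    # Weave the four planes into an RGGB mosaic of size (2h, 2w).
    raw = [[0] * (2 * w) for _ in range(2 * h)]
    for y in range(h):
        raw[2 * y][0::2] = R[y]
        raw[2 * y][1::2] = even['Y'][y]     # G1
        raw[2 * y + 1][0::2] = odd['Y'][y]  # G2
        raw[2 * y + 1][1::2] = B[y]
    return raw
```

Running this on a toy 2×1 pair shows the interleaving: with `even = {'Y': [[10, 11]], 'Cb': [[20]], 'Cr': [[30]]}` and `odd = {'Y': [[12, 13]], 'Cb': [[21]], 'Cr': [[31]]}`, the result is `[[20, 10, 21, 11], [12, 30, 13, 31]]`.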