ToDo

1 Axiom Alpha Prototype

1.1 DNG Converter Software

  • Add other DNG tags such as dead pixels, register settings, black-level calibration data, etc.
  • Implement the matrix as a LUT (see the sketch after the code link below).
  • Eventually port from Python to C/C++.

Code: https://github.com/apertus-open-source-cinema/CMV12000_DNG_Writer
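
As a rough illustration of the LUT point above, here is a minimal numpy sketch that decomposes a 3x3 color matrix into nine 1-D lookup tables, so applying the matrix becomes three table lookups and two additions per output channel instead of a per-pixel matrix multiply. The matrix values and the 12-bit depth are placeholders, not the Alpha's calibrated data; a C/C++ port could hold pre-scaled integers in the same tables.

  import numpy as np

  BITDEPTH = 12
  LEVELS = 1 << BITDEPTH   # 4096 codes for 12-bit data

  # Placeholder matrix, not the Alpha's calibrated one.
  matrix = np.array([[ 1.50, -0.30, -0.20],
                     [-0.25,  1.40, -0.15],
                     [-0.10, -0.35,  1.45]])

  # Nine 1-D LUTs: luts[out, in, code] = matrix[out, in] * code
  codes = np.arange(LEVELS, dtype=np.float64)
  luts = matrix[:, :, np.newaxis] * codes

  def apply_matrix(rgb):
      """rgb: (h, w, 3) uint16 array holding 12-bit values."""
      out = np.empty(rgb.shape, dtype=np.float64)
      for o in range(3):
          out[..., o] = (luts[o, 0][rgb[..., 0]] +
                         luts[o, 1][rgb[..., 1]] +
                         luts[o, 2][rgb[..., 2]])
      return np.clip(out, 0, LEVELS - 1).astype(np.uint16)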

1.2 Linux Bash script/C Code to interpret all image sensor registers

We require a C/C++ command-line tool that takes the 128x16-bit sensor register dump on stdin and displays the registers hex-editor style: addresses and contents in a table, and next to the table a human-readable interpretation of the corresponding sensor parameters (exposure time, etc.).

Sensor Datasheet: https://github.com/apertus-open-source-cinema/alpha-hardware/raw/master/Datasheets/datasheet_CMV12000%20v8.pdf

Bash script example: Axiom_Alpha_Prototype#Reading_and_Writing_Sensor_Register

Note: The current memory mapping is 32-bit aligned, so each 16-bit SPI register shows up as a 32-bit word of which only 16 bits are significant.
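
A quick Python sketch of the parsing and printing logic (the deliverable itself is meant to be C/C++): per the note above it assumes the dump arrives as 128 little-endian 32-bit words, masks each down to 16 bits, prints a hex table, and decodes one example parameter. The byte order and the register numbers are assumptions and must be verified against the datasheet.

  import struct
  import sys

  # 128 registers, each occupying a 32-bit word in the dump (see note).
  raw = sys.stdin.buffer.read(128 * 4)
  regs = [w & 0xFFFF for w in struct.unpack('<128I', raw)]

  # Hex-editor style table, 8 registers per row.
  for base in range(0, 128, 8):
      row = ' '.join('%04x' % v for v in regs[base:base + 8])
      print('%3d: %s' % (base, row))

  # Human-readable interpretation; register numbers are placeholders
  # to be checked against the CMV12000 datasheet.
  EXP_LOW, EXP_HIGH = 71, 72
  print('exposure time (raw): %d' % ((regs[EXP_HIGH] << 16) | regs[EXP_LOW]))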

1.3 RAW image metadata reader

All sensor registers are embedded in the raw image formats. Details about the file format: RAW16

We need code that reads and displays these values in a human-readable way (similar to the tool above, but not necessarily targeted at the ARM platform).
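
A hedged sketch of such a reader, assuming, purely for illustration, that the register block is stored as 128 little-endian 16-bit words at the very end of the file; the actual offset and layout are defined by the RAW16 format page. The decoded list can then be fed to the same interpretation code as the stdin tool above.

  import struct

  def read_registers(path):
      """Assumption for illustration only: 128 little-endian 16-bit
      register values stored at the end of the file (see RAW16)."""
      with open(path, 'rb') as f:
          f.seek(-128 * 2, 2)   # 2 = seek relative to end of file
          return list(struct.unpack('<128H', f.read(128 * 2)))

  regs = read_registers('frame.raw16')
  print('register 0 = 0x%04x' % regs[0])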

1.4 Dead/hot pixel detection software

A program that is supplied with several raw images, detects dead/hot pixels, and outputs the positions of all detected pixels.
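
A minimal numpy sketch of one possible approach: average the supplied frames per pixel and flag photosites whose temporal mean deviates from the global statistics by several sigma. The frame loader (plain 16-bit dumps at the CMV12000's 4096x3072 resolution) and the threshold are assumptions; real input would come via the RAW16 reader.

  import numpy as np

  WIDTH, HEIGHT = 4096, 3072   # CMV12000 full resolution

  def find_defects(frames, sigma=6.0):
      """frames: list of (HEIGHT, WIDTH) arrays. Returns (rows, cols)
      of pixels whose temporal mean is a statistical outlier."""
      mean = np.stack(frames).astype(np.float64).mean(axis=0)
      mu, sd = mean.mean(), mean.std()
      # hot pixels sit far above, dead pixels far below, the global mean
      return np.nonzero(np.abs(mean - mu) > sigma * sd)

  # Assumption: frames stored as plain 16-bit dumps; adapt to RAW16.
  frames = [np.fromfile('frame%d.raw' % i, dtype=np.uint16).reshape(HEIGHT, WIDTH)
            for i in range(5)]
  for y, x in zip(*find_defects(frames)):
      print('defect at x=%d y=%d' % (x, y))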

1.5 Flat-field correction generation software

A program that is supplied with several raw images (around 20 is sufficient) of an out-of-focus 30% grey wall and generates the sensitivity offset of each photosite together with the resulting flat-field correction matrix (https://en.wikipedia.org/wiki/Flat-field_correction). The sample images are preferably shot with a fast 50mm lens stopped down to F4-F5.6, facing an evenly lit surface (a white wall, with the light at a decent distance from the wall). To further smooth the results, the lens should be set out of focus. We need to test whether these measurements work even better with a Lee filter in front of the sensor instead of a lens.
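
A minimal numpy sketch of the generation step, assuming a placeholder black level: average the grey-wall frames to suppress temporal noise, subtract the black level, and divide the global mean by each photosite's mean to obtain a multiplicative gain map.

  import numpy as np

  BLACK_LEVEL = 128.0   # placeholder; use the calibrated black level

  def flat_field_map(frames):
      """frames: grey-wall exposures as 2-D arrays. Returns a per-photosite
      multiplicative gain map (values > 1 where the photosite is less
      sensitive than average)."""
      avg = np.stack(frames).astype(np.float64).mean(axis=0) - BLACK_LEVEL
      avg = np.maximum(avg, 1.0)   # guard against dead photosites
      return avg.mean() / avg

  # Applying the correction to a later frame:
  #   corrected = (raw - BLACK_LEVEL) * gain_map + BLACK_LEVEL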