Cosmic Rays in CCDs
Some old talks on cosmic rays in optical telescope CCDs:
Phone Camera CCDs
Sensor info from iPhone5 teardown
Software APIs for accessing smart-phone cameras:
- What is the expected footprint of a cosmic muon, taking account of angle, energy, and charge diffusion?
  - How to deal with RGB filters?
- What does JPEG compression do to isolated pixels? (possible to get raw RGB instead?)
- Is it feasible to calibrate algorithms with a gamma-ray source, e.g., Cs-137?
  - How similar is expected CCD response for gamma rays and muons?
  - What are expected rates for a typical sealed source? (advanced lab Compton experiment uses 1 milliCurie of Cs-137)
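The footprint question can be roughed out geometrically: a muon crossing a sensitive layer of depth d at zenith angle theta projects a track of length d*tan(theta) onto the pixel grid, before charge diffusion smears it further. The 3 micron depletion depth and 1.4 micron pitch below are illustrative assumptions, not measured sensor parameters:

```python
import math

def track_length_pixels(theta_deg, depletion_depth_um=3.0, pixel_um=1.4):
    # Projected track length in pixels for a muon at zenith angle theta_deg,
    # ignoring charge diffusion (which adds roughly a pixel of blur).
    return depletion_depth_um * math.tan(math.radians(theta_deg)) / pixel_um

print(track_length_pixels(0))    # vertical muon: hits a single pixel
print(track_length_pixels(45))   # ~2 pixels for these assumed parameters
```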
Here is a "back of the envelope" calculation for the rate you should expect in a camera outside at sea level:

rate = flux * area * (1 - f) * eff + fake

where:
- flux ~ 140 cosmics per m^2 per sec
- area = Npix * pixsize^2
- f = dead-time fraction
- eff = algorithm efficiency for genuine cosmics
- fake = fake rate from electronic noise

In the best possible world, with f = 0, eff = 1, fake = 0, a 5MP CCD with 1.5 micron pixels (area = 1.1e-5 m^2) should see event frequencies ~ 16e-4 Hz ~ 5.7/hour.
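The rate formula above is easy to check numerically; this sketch just plugs in the best-case numbers (f = 0, eff = 1, fake = 0):

```python
FLUX = 140.0  # cosmic muons per m^2 per second at sea level

def expected_rate(npix, pixsize_m, deadtime_frac=0.0, eff=1.0, fake=0.0):
    """rate = flux * area * (1 - f) * eff + fake, with area = Npix * pixsize^2."""
    area = npix * pixsize_m ** 2
    return FLUX * area * (1.0 - deadtime_frac) * eff + fake

rate = expected_rate(5e6, 1.5e-6)  # 5MP sensor, 1.5 micron pixels
print(rate, rate * 3600)           # ~1.6e-3 Hz, i.e. roughly 5.7 per hour
```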
1 milliCurie = 3.7e7 decays per second. The iPhone 5 sensor has 8M 1.4-micron pixels, for an area of 1.57e-5 m^2; at 1 m that subtends a solid-angle fraction of area/(4*pi*r^2) = 1.25e-6 of the decays, leading to 46 Hz.
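The same estimate in code, treating the source as an isotropic point emitter and the sensor as a small patch on a sphere of radius 1 m (shielding, attenuation, and detection efficiency are all ignored):

```python
import math

ACTIVITY = 3.7e7        # decays per second for 1 milliCurie
npix, pix = 8e6, 1.4e-6 # iPhone 5 sensor: 8M pixels, 1.4 micron pitch
area = npix * pix ** 2  # ~1.57e-5 m^2

frac = area / (4 * math.pi * 1.0 ** 2)  # solid-angle fraction at r = 1 m
print(ACTIVITY * frac)                  # ~46 gammas per second cross the sensor
```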
There is a discussion of using phone sensors as Geiger counters here.
Assuming that an isolated MIP in an RGB sensor saturates a single sub-pixel, we expect RGB = 0xff0000, 0x007f00, or 0x0000ff. How does JPEG compress this signature, with different levels of noise?
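One way to start answering the JPEG question empirically is to round-trip a synthetic frame with a single saturated pixel through JPEG at several quality settings. This sketch uses Pillow on an idealized noiseless black frame; a real camera pipeline would also demosaic and denoise before compression:

```python
from io import BytesIO
from PIL import Image

def jpeg_roundtrip(quality, color=(255, 0, 0)):
    """Return the post-JPEG RGB value of a single saturated pixel on black."""
    img = Image.new("RGB", (16, 16), (0, 0, 0))
    img.putpixel((8, 8), color)
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf).convert("RGB").getpixel((8, 8))

for q in (50, 75, 95):
    print(q, jpeg_roundtrip(q))  # attenuation of the signature vs. quality
```

Adding Gaussian noise to the frame before saving, and scanning the neighboring pixels for ringing, would extend this to the "different levels of noise" part of the question.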