Signal Processing Details

From the Crayfis group (https://arxiv.org/pdf/1410.2895v2.pdf):

With the camera as the detector element, the phone processor runs an application which functions as the trigger and data acquisition system. To obtain the largest possible integrated exposure time, the first-level trigger captures video frames at 15-30 Hz, depending on the frame-processing speed of the device. Frames which contain any above-threshold pixels are stored and passed to the second stage which examines the stored frames, saving only the pixels above a second, lower threshold. All qualifying pixels, typically a few per frame, are stored as a sparse array in a buffer on the phone, along with their arrival time and the geolocation of the phone. When a wi-fi connection is available, the collected pixels are uploaded to a central server for offline shower reconstruction; most events are between 50 and 200 bytes of data.
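For concreteness, here is a minimal sketch of that two-stage trigger. The thresholds, data types, and sparse-pixel format are assumptions for illustration, not the actual Crayfis implementation:

```swift
import Foundation

// Sparse representation of one above-threshold pixel.
struct HitPixel {
    let x: Int
    let y: Int
    let value: UInt8
}

// One stored event: qualifying pixels plus arrival time and geolocation.
struct Event {
    let pixels: [HitPixel]
    let timestamp: Date
    let latitude: Double
    let longitude: Double
}

// First-level trigger: keep only frames with any pixel above the high threshold.
func levelOnePasses(frame: [[UInt8]], highThreshold: UInt8) -> Bool {
    frame.contains { row in row.contains { $0 > highThreshold } }
}

// Second stage: from a kept frame, save every pixel above the lower threshold.
func levelTwoEvent(frame: [[UInt8]], lowThreshold: UInt8,
                   latitude: Double, longitude: Double) -> Event {
    var hits: [HitPixel] = []
    for (y, row) in frame.enumerated() {
        for (x, v) in row.enumerated() where v > lowThreshold {
            hits.append(HitPixel(x: x, y: y, value: v))
        }
    }
    return Event(pixels: hits, timestamp: Date(), latitude: latitude, longitude: longitude)
}
```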

Cosmic Ray App

This is a single-block event with a score of 610. The score is a device-dependent relative measure.

Cosmic Ray App takes a different approach. Look at Fig. 5 of the arXiv paper: that run used a muon source, so there are many tracks, and the muons were shot at the phone along the edge of the detector, so all the tracks are long. A real event arrives at a random angle, so most events will look like pinpricks, or pinpricks on an angle. Instead of capturing at 30 Hz, we capture a little slower (longer exposure) and then do lots of processing on the phone. Rather than looking for individual hot pixels, Cosmic Ray looks for hot regions, essentially dividing the captured image into many small blocks. These blocks are processed, and standout blocks are promoted to events; a sketch of this search follows below. The events Cosmic Ray captures are thus tiny pictures of the hot parts of the sensor. Usually this is one block, but sometimes two blocks are found in the same frame. These tiny images, along with the supporting data, constitute an event.
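A minimal sketch of that block search, assuming a simple score (summed pixel values above a noise floor) and an arbitrary cut; the app's actual scoring is not specified here:

```swift
// Tile the frame into small blocks, score each one, and promote
// standout blocks to events. The block size, noise floor, and cut
// are illustrative assumptions.
func standoutBlocks(frame: [[UInt8]], blockSize: Int = 20,
                    noiseFloor: Int = 10, cut: Int = 200) -> [(bx: Int, by: Int, score: Int)] {
    let rows = frame.count
    let cols = frame.first?.count ?? 0
    var standouts: [(bx: Int, by: Int, score: Int)] = []
    for y0 in stride(from: 0, to: rows, by: blockSize) {
        for x0 in stride(from: 0, to: cols, by: blockSize) {
            var score = 0
            for y in y0..<min(y0 + blockSize, rows) {
                for x in x0..<min(x0 + blockSize, cols) {
                    score += max(0, Int(frame[y][x]) - noiseFloor) // charge above noise
                }
            }
            if score > cut {
                standouts.append((bx: x0 / blockSize, by: y0 / blockSize, score: score))
            }
        }
    }
    return standouts
}
```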

In a future version we will allow you to share these 60×60-pixel images with the world, effectively combining everyone's phones into one very large detector.

Number of Cosmic Ray muons you can expect to see:

Muons arrive at sea level with an average flux of about 1 muon per square centimeter per minute. (http://cosmic.lbl.gov/SKliewer/Cosmic_Rays/Muons.htm)

An iPhone camera sensor is about 4 mm × 4 mm, i.e. 0.16 cm², so roughly one muon every 6 minutes (1/0.16 ≈ 6.25 min) passes through it. What percentage of those does the camera sensor actually pick up? Time will tell, as a full calibration has not been done, but the rate of decent events does seem to be about that.
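That arithmetic, written out as a sketch:

```swift
// Flux × sensor area -> expected muon rate through the camera sensor.
// The 4 mm × 4 mm sensor size is the rough estimate above.
let fluxPerCm2PerMinute = 1.0               // ~1 muon / cm^2 / minute at sea level
let sensorAreaCm2 = 0.4 * 0.4               // 4 mm = 0.4 cm, so 0.16 cm^2
let muonsPerMinute = fluxPerCm2PerMinute * sensorAreaCm2
let minutesPerMuon = 1.0 / muonsPerMinute   // ≈ 6.25 minutes between muons
print("About one muon every \(minutesPerMuon) minutes")
```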

Comments

  1. I’m getting one photo (detection) every 30 seconds. What does this mean? High radiation? The scores are generally around 250 to 450. I tried saving to iCloud but it doesn’t seem to work; I can’t find them in my iCloud photos 🙁

    • pa-admin says:

      One per 30 seconds is about the target event rate. The app is always looking for small flashes of light, and cosmic rays plus natural background can cause events about every minute or so, so most of those events are ‘real’ but perhaps low energy. As for iCloud: the images won’t be put into your iCloud Photos, but rather onto iCloud Drive, which you can reach from a Mac, from the web, or from the iCloud Drive app on your phone.

  2. Have you considered the orientation of the CCD and iPhone? I’ve placed my iPhone vertically and wonder if this has any effect on sensing cosmic rays, assuming they generally come from outer space. I notice your software still reports point sources in the images with the iPhone in a vertical position. Wouldn’t one expect ‘tracks’ in the vertical position if the source is outer space?

    • pa-admin says:

      Make sure you try ‘Show top’ to see the biggest events. I have tried the phone on its side and seem to get a few more long tracks, but you have to realize just how thin the active layer of an iPhone’s CCD/CMOS sensor is, so most events will simply be dots. A nice project would be to estimate the thickness of the active area by measuring track lengths in a few hundred images and running some stats; a toy version is sketched below.
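      As a starting point for that project, here is a toy estimate; the pixel pitch and incidence angle below are illustrative assumptions:

      ```swift
      import Foundation

      // If a track of length L (in pixels) comes from a muon at zenith angle θ,
      // the active layer is roughly t = L × pixelPitch / tan(θ) thick.
      func thicknessMicrons(trackLengthPixels: Double,
                            pixelPitchMicrons: Double,
                            zenithAngleDegrees: Double) -> Double {
          let theta = zenithAngleDegrees * .pi / 180
          return trackLengthPixels * pixelPitchMicrons / tan(theta)
      }

      // e.g. a 40-pixel track at 85° off vertical with a 1.5 µm pixel pitch:
      // thicknessMicrons(trackLengthPixels: 40, pixelPitchMicrons: 1.5,
      //                  zenithAngleDegrees: 85) ≈ 5.2 µm
      ```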

    • pa-admin says:

      The colours are just the colour of the pixel the particle hit. In reality it’s all just signal, but, for instance, a little green filter covers about half of the pixels, so when a green-filtered pixel is hit, the camera assumes green light. The colours therefore don’t indicate anything about the particle itself. A particle with a lot of energy will light up a bunch of neighbouring pixels and so make a white spot. The usual filter layout is sketched below.
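      For illustration, here is the common RGGB Bayer layout; this particular mosaic is an assumption, as the exact pattern varies by sensor:

      ```swift
      // Which colour filter sits over pixel (x, y) in an RGGB Bayer pattern.
      // Green covers half the pixels, red and blue a quarter each.
      enum FilterColour { case red, green, blue }

      func bayerFilter(x: Int, y: Int) -> FilterColour {
          switch (y % 2, x % 2) {
          case (0, 0):         return .red
          case (0, 1), (1, 0): return .green
          default:             return .blue
          }
      }
      ```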

  3. Braden says:

    Is there any way yet to know for sure the energy of the stuff we’re seeing, or will all our data be analyzed after the fact by you guys?

  4. Don Limuti says:

    I am designing a device to focus muons.
    1. Does “Score” track linearly with the muons/minute/cm² output of lab-grade muon counters?
    2. Can I leave the iPhone in my setup and display the “Score” remotely, simply as a number?

    Thanks,
    Don L.

    • pa-admin says:

      1. Score does not track linearly, but almost. Since every detector is different, the app auto-calibrates each device to roughly one event with a score of ~1 every 30 seconds or so; the calibration takes ~20 minutes to settle in (a toy sketch follows this reply). Muons are so much heavier that they score much higher than typical electron beta tracks. So for a given phone in a stable location with only background sources, the score will be close to linear, although scores still vary considerably with the exact angle at which a muon crosses the CCD.

      2. If your phone is running the app, there is a place to get a URL that leads to a web page showing all uploaded events for your device, with the raw data and image URLs for each uploaded event. There is also raw data that we are working on opening up to the community, but it does not contain much more.
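      A toy sketch of the rate-based calibration from point 1, with the names, starting threshold, and step sizes all invented for illustration:

      ```swift
      import Foundation

      // Nudge the detection threshold so events arrive about once per 30 s:
      // events coming too fast raise the bar, events coming too slowly lower it.
      struct AutoCalibrator {
          var threshold = 100.0                  // arbitrary starting point
          let targetInterval: TimeInterval = 30
          private var lastEvent = Date()

          mutating func recordEvent(at now: Date = Date()) {
              let interval = now.timeIntervalSince(lastEvent)
              lastEvent = now
              threshold *= interval < targetInterval ? 1.05 : 0.95
          }
      }
      ```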

    • pa-admin says:

      If an event is small, just a few pixels in size, I only need to grab the 20×20 block of pixels containing it plus the nearest 8 blocks around it (9 in total). A long event may require several such regions, added in groups of 9, so you see 9, 18, 27, etc. A sketch of the grouping is below.
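      A sketch of that grouping, assuming each hot block simply pulls in its 3×3 neighbourhood of 20×20-pixel blocks:

      ```swift
      // One group of 9 block coordinates per hot block, so the saved region
      // grows in steps of 9 (9, 18, 27, ...).
      func regionGroups(hotBlocks: [(bx: Int, by: Int)]) -> [[(bx: Int, by: Int)]] {
          hotBlocks.map { hot -> [(bx: Int, by: Int)] in
              var group: [(bx: Int, by: Int)] = []
              for dy in -1...1 {
                  for dx in -1...1 {
                      group.append((bx: hot.bx + dx, by: hot.by + dy))
                  }
              }
              return group
          }
      }
      ```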
