Greg.Randall

Long Term Timelapse

June 10th 2020

I’ve been working on a Raspberry Pi system to shoot a year-long timelapse of the garden.

I did a couple of 24-hour test timelapses. The first trial was super simple: I had the camera shoot a photo once a minute. I discovered that the Raspberry Pi camera’s auto exposure didn’t work well at night. The night photos were completely dark, so I cut most of them out.

For the second trial I wrote some Python to check whether the auto exposure gave a correctly exposed result; if not, the program would increase the exposure until it hit the camera’s maximum of 6 seconds at ISO 800. (I was also experimenting with HDR, which is why this clip looks particularly bad.)
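The exposure check boils down to something like this. This is a simplified sketch rather than my actual script; it assumes the picamera library plus Pillow for measuring brightness, and the brightness threshold and exposure steps are placeholder values:

    import time
    from fractions import Fraction

    from picamera import PiCamera
    from PIL import Image, ImageStat

    MIN_BRIGHTNESS = 40  # placeholder "too dark, try again" threshold (0-255)

    def mean_brightness(path):
        """Average luminance of a captured frame, 0-255."""
        with Image.open(path) as img:
            return ImageStat.Stat(img.convert('L')).mean[0]

    with PiCamera() as camera:
        # First attempt: trust the auto exposure.
        time.sleep(2)
        camera.capture('frame.jpg')

        # If that came out too dark, step through longer manual exposures,
        # topping out at the camera's limit of 6 seconds at ISO 800.
        for shutter_us in (250_000, 1_000_000, 3_000_000, 6_000_000):
            if mean_brightness('frame.jpg') >= MIN_BRIGHTNESS:
                break
            camera.framerate = Fraction(1_000_000, shutter_us)  # slow framerate allows a long shutter
            camera.iso = 800
            camera.shutter_speed = shutter_us
            time.sleep(2)                 # let the gains settle
            camera.exposure_mode = 'off'  # lock them so the manual shutter sticks
            camera.capture('frame.jpg')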

Raspberry Pi timelapse camera in the eaves of the shed

After getting things working more or less, I built a basic enclosure for the camera and Raspberry Pi out of a take-out soup container. I spray painted the whole container to protect it from UV, then cut the bottom off and siliconed a lens filter onto the bottom of the container. This all got jammed under the eave of the shed.

Right now the camera shoots a photo every 20 minutes, night and day. That doesn’t sound like a lot until you think about 3 shots an hour, 24 hours a day, times 365 days: 26,280 photos. Each photo is about 5.3 MB, so a year of photos will take up about 140 GB. I’m guessing that processing the photos will take 2-4 times as much space, since there will be several intermediate processing steps.

At this point I have a couple of weeks of photos. Processing the images to make a pleasing result is hard. The images during the day flicker like crazy from the sun going behind clouds, and the night photos are super dark until a car headlight shines into the yard or the moon comes out. The camera shifts very slightly too, I assume the plastic of the container is changing size slightly in response to temperature changes, or maybe the shed is wiggling slightly in the wind.

This is my current processing pipeline:

  1. An ImageMagick command takes each frame and tries to smash its histogram into a theoretically correct exposure, but the result is super harsh, so I average the original image with the corrected image:
    convert input.jpg \( +clone -equalize \) -average output.tif
  2. Then I run each frame through Fred’s ImageMagick script removecolorcast. This gives everything a flat blueish tone, but the images are much more consistent:
    removecolorcast input.tif output.tif
  3. I run all the frames through After Effects’ stabilize motion.
  4. To deal with the remaining flickering I average the images in a sliding window. I’m currently averaging batches of 12 images, so images 1-12 get averaged, then images 2-13, 3-14, 4-15, and so on (a sketch of how these commands get generated is after this list):
    convert 0000.tif 0001.tif 0002.tif 0003.tif 0004.tif 0005.tif 0006.tif 0007.tif 0008.tif 0009.tif 0010.tif 0011.tif -average final_0000.tif
  5. All of the images get tweaked with Adobe Camera Raw. I’m cropping, resizing, fixing the color, sharpening, etc.
  6. Finally, the frames are made into a video with FFmpeg:
    ffmpeg -i %04d.tif -crf 15 final_video.mp4
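Generating those overlapping convert commands from step 4 by hand would be painful, so a small script spits them out. Roughly like this (a simplified sketch, assuming the stabilized frames are numbered 0000.tif, 0001.tif, and so on in the current directory):

    import subprocess
    from pathlib import Path

    WINDOW = 12  # number of frames averaged into each output frame

    # Stabilized frames named 0000.tif, 0001.tif, ... in the current directory.
    frames = sorted(Path('.').glob('[0-9][0-9][0-9][0-9].tif'))

    # Slide the window one frame at a time: 0-11, then 1-12, 2-13, and so on.
    for start in range(len(frames) - WINDOW + 1):
        window = [str(f) for f in frames[start:start + WINDOW]]
        subprocess.run(
            ['convert', *window, '-average', f'final_{start:04d}.tif'],
            check=True,
        )

Stepping the window forward one frame at a time keeps the output at nearly the same frame count while smoothing out the flicker.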

This is the full set of timelapse photos processed into a video:

I like where this video is going, but I find the changes from day to night to be kind of off-putting after a while. Also, the strobing effect of the light moving across the yard is pretty neat the first couple of times, but it gets hard to watch.

I wrote some code to select a few photos from each day about 90 minutes before solar noon and then ran the same processing pipeline. It worked okay, but the light still changed more than I liked. The clip is about a second long, so it might look better when I have more photos.
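For what it’s worth, the selection part boils down to checking each frame’s timestamp against a target window. A simplified sketch, assuming the capture time is baked into each filename (the naming pattern and times here are placeholders rather than what I actually use):

    from datetime import datetime, timedelta
    from pathlib import Path

    # Placeholders: frames named like 20200610-1340.jpg, solar noon around 13:10,
    # so the target is roughly 90 minutes earlier.
    TARGET = timedelta(hours=11, minutes=40)
    WINDOW = timedelta(minutes=30)  # keep shots within half an hour of the target

    def time_of_day(path):
        stamp = datetime.strptime(path.stem, '%Y%m%d-%H%M')
        return timedelta(hours=stamp.hour, minutes=stamp.minute)

    selected = [p for p in sorted(Path('photos').glob('*.jpg'))
                if abs(time_of_day(p) - TARGET) <= WINDOW]
    print('\n'.join(str(p) for p in selected))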

So, I wrote some code that picked images that were relatively low contrast and really close to the same medium brightness. This cut my images down from ~1500 to ~250.
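The selection just measures each frame’s brightness and contrast and keeps the frames that fall inside a narrow band. A simplified sketch using Pillow (the thresholds here are placeholders, not the exact values I settled on):

    from pathlib import Path

    from PIL import Image, ImageStat

    # Placeholder thresholds: keep frames whose mean luminance sits in a narrow
    # medium-brightness band and whose standard deviation (contrast) stays low.
    MIN_MEAN, MAX_MEAN = 110, 140
    MAX_STDDEV = 55

    def luminance_stats(path):
        """Mean and standard deviation of a frame's luminance."""
        with Image.open(path) as img:
            s = ImageStat.Stat(img.convert('L'))
        return s.mean[0], s.stddev[0]

    keepers = []
    for path in sorted(Path('photos').glob('*.jpg')):
        mean, stddev = luminance_stats(path)
        if MIN_MEAN <= mean <= MAX_MEAN and stddev <= MAX_STDDEV:
            keepers.append(path)

    print(f'kept {len(keepers)} frames')

I then ran the same processing pipeline on the selected frames and got this result: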

I’m pleased with this, but it still needs some refinement. Once I have a few months of photos I think I’ll have a much better handle on what the final product will look like in a year.