June 10th 2020
I’ve been working on a Raspberry Pi system to shoot a year-long timelapse of the garden.
I did a couple of 24-hour test timelapses. The first trial was super simple: I had the camera shoot a photo once a minute. I discovered that the Raspberry Pi camera’s auto exposure didn’t work well at night. The night photos were completely dark, so I cut most of them out.
For the second trial I wrote some Python to check whether the auto exposure gave a correctly exposed result; if not, the program would increase the exposure until it hit the camera’s maximum of 6 seconds at ISO 800. (I was also experimenting with HDR, which is why this clip looks particularly bad.)
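The heart of that retry loop is simple. Here’s a stripped-down sketch — the actual camera calls are left out, and the brightness threshold and doubling factor are stand-ins rather than the exact values from my script:

```python
MAX_EXPOSURE_US = 6_000_000  # the camera's ceiling: 6 seconds, in microseconds

def is_underexposed(mean_brightness, threshold=30):
    """Treat a frame whose average pixel value (0-255) falls below the
    threshold as too dark to keep. The threshold here is a placeholder."""
    return mean_brightness < threshold

def next_exposure_us(current_us, factor=2):
    """Bump the shutter time up, clamped to the camera's 6-second maximum."""
    return min(current_us * factor, MAX_EXPOSURE_US)
```

The loop is: shoot, measure the frame’s mean brightness, and if `is_underexposed` says it’s too dark, reshoot at `next_exposure_us` until the frame looks okay or the shutter hits the 6-second cap.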
After getting things working more or less, I built a basic enclosure for the camera and Raspberry Pi out of a take-out soup container. I spray painted the whole container to protect it from UV, then cut the bottom off and siliconed a lens filter onto the bottom of the container. This all got jammed under the eave of the shed.
Right now the camera shoots a photo every 20 minutes, night and day. That doesn’t sound like a lot until you do the math: 3 shots an hour, 24 hours a day, times 365 days is 26,280 photos. Each photo is about 5.3 MB, so a year of photos will take up about 140 GB. I’m guessing that processing the photos will take 2-4 times as much space, since there will be several intermediate processing steps.
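The arithmetic, for anyone who wants to check it:

```python
# Back-of-the-envelope storage math for the year of photos.
shots_per_hour = 3                 # one photo every 20 minutes
photos_per_year = shots_per_hour * 24 * 365
mb_per_photo = 5.3
total_gb = photos_per_year * mb_per_photo / 1000

print(photos_per_year)   # 26280
print(round(total_gb))   # 139 -- call it 140 GB
```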
At this point I have a couple of weeks of photos. Processing the images into a pleasing result is hard. The daytime images flicker like crazy from the sun going behind clouds, and the night photos are super dark until a car headlight shines into the yard or the moon comes out. The camera shifts very slightly too; I assume the plastic of the container is expanding and contracting with the temperature, or maybe the shed is wiggling slightly in the wind.
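One simple way to tame flicker like that is to rescale every frame toward a common mean brightness. A minimal NumPy sketch of the idea — not necessarily what my pipeline does, just the basic concept:

```python
import numpy as np

def deflicker(frames, target_mean=None):
    """Scale each grayscale frame so its mean brightness matches a common
    target. This is a crude global-gain approach; fancier deflickering
    smooths the per-frame means over time instead of snapping them all
    to one value."""
    means = [f.mean() for f in frames]
    if target_mean is None:
        target_mean = float(np.median(means))
    out = []
    for f, m in zip(frames, means):
        gain = target_mean / m if m > 0 else 1.0
        out.append(np.clip(f * gain, 0, 255).astype(np.uint8))
    return out
```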
This is my current processing pipeline:
This is the full set of timelapse photos processed into a video:
I like where this video is going, but I find the changes from day to night kind of off-putting after a while. Also, the strobing effect of the light moving across the yard is pretty neat the first couple of times, but it gets hard to watch.
I wrote some code to select a few photos from each day, about 90 minutes before solar noon, and then ran the same processing pipeline. It worked okay, but the light still changed more than I liked. The clip is only about a second long, so it might look better once I have more photos.
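The selection logic is roughly this. In the sketch below I’ve hard-coded an assumed solar-noon clock time; the real thing could compute it per day for the camera’s location with an astronomy library:

```python
from datetime import datetime, time, timedelta

SOLAR_NOON = time(13, 10)       # assumed local solar noon (DST pushes it past 12:00)
WINDOW = timedelta(minutes=20)  # keep shots within +/- 20 minutes of the target

def select_frames(timestamps):
    """Keep photos taken around 90 minutes before (approximate) solar noon."""
    selected = []
    for ts in timestamps:
        noon = datetime.combine(ts.date(), SOLAR_NOON)
        target = noon - timedelta(minutes=90)
        if abs(ts - target) <= WINDOW:
            selected.append(ts)
    return selected
```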
So I wrote some code that picked images that were relatively low contrast and close to the same medium brightness. This cut the set down from ~1500 to ~250 images. I then ran the same processing pipeline on them and got this result:
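The filter boils down to two statistics per frame: mean pixel value for brightness, and standard deviation as a crude stand-in for contrast. A sketch with made-up thresholds (the frames here are grayscale arrays rather than files on disk):

```python
import numpy as np

def keep_frame(frame, brightness=(110, 145), max_contrast=50):
    """Keep frames whose mean falls inside a narrow medium-brightness band
    and whose standard deviation (rough contrast) is low. The numeric
    thresholds are illustrative, not the ones from my script."""
    mean = float(frame.mean())
    std = float(frame.std())
    return brightness[0] <= mean <= brightness[1] and std <= max_contrast
```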
I’m pleased with this, but it still needs some refinement. Once I have a few months of photos I think I’ll have a much better handle on what the final product will look like in a year.