Human in Photo?

April 13th 2021

I shot a timelapse of Sarah binding her thesis, using the timelapse system described in a previous post. I was left with about 12,000 frames, but only some of them actually showed Sarah working.

I needed to build a timelapse of Sarah working. I tried several libraries that look for human shapes in photos, but they ended up being inaccurate, slow, or overly complicated.

I had tested out Imagga for an unrelated tagging project a few years back, and it looks like at some point in the interim they added some face detection tools. With a basic account you get 1000 API calls per month, and after some testing it looked like Imagga would be fast and easy to use.

My script passes an image to Imagga, asking it to find a human, and then just counts the number of characters in the returned data. The data is pretty short if there are no humans, and longer if there is one.
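The original script isn't reproduced here, but the length check is simple enough to sketch. The sample responses below are illustrative, not real Imagga output, and the 100-character threshold is a guess:

```python
# Illustrative Imagga-style face-detection responses (not real API
# output; the exact JSON shape is an assumption):
NO_FACE = '{"result": {"faces": []}, "status": {"text": "", "type": "success"}}'
FACE = ('{"result": {"faces": [{"confidence": 99.9, "coordinates": '
        '{"height": 244, "width": 218, "xmax": 710, "xmin": 492, '
        '"ymax": 528, "ymin": 284}}]}, "status": {"text": "", "type": "success"}}')

def looks_like_human(response_text, threshold=100):
    # No faces -> short JSON; face coordinates make the response much
    # longer, so raw character count works as a crude classifier.
    return len(response_text) > threshold
```

It's a blunt instrument, but for splitting 12,000 frames into "Sarah" and "no Sarah" piles it doesn't need to be anything fancier.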

Fast Thumbnails from Raws

February 17th 2021

I use cloud storage for my work-related photography. I make a folder of thumbnails of all the raw images from each shoot so that I don’t have to download gigs of raws to look back through the photos, just a few tens of megs of thumbnail jpegs.

This is by far the fastest way I've found to generate a set of thumbnails for raw photos. dcraw doesn't actually decode the raw for this; it just extracts the thumbnail that's embedded in the raw. ImageMagick's mogrify resizes the images, tweaks the brightness/contrast/color a bit, and saves them down.
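A sketch of that two-step pipeline in Python, shelling out to both tools. The `.CR2` extension, thumbnail size, and mogrify switches here are assumptions, not the originals; dcraw writes each embedded preview next to the raw as `*.thumb.jpg`:

```python
import glob
import subprocess

def dcraw_cmd(raws):
    # `dcraw -e` extracts the embedded JPEG preview without decoding
    # the raw sensor data, which is why it's fast.
    return ["dcraw", "-e", *raws]

def mogrify_cmd(thumbs):
    # Resize in place and auto-stretch the levels. These switches are
    # placeholders for the author's actual brightness/contrast tweaks.
    return ["mogrify", "-resize", "1200x1200", "-auto-level",
            "-quality", "85", *thumbs]

def make_thumbs(raw_dir):
    subprocess.run(dcraw_cmd(sorted(glob.glob(f"{raw_dir}/*.CR2"))), check=True)
    subprocess.run(mogrify_cmd(sorted(glob.glob(f"{raw_dir}/*.thumb.jpg"))), check=True)
```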

Long Term Timelapse v2 & v3

February 8th 2021

I have been working on and off on building out better timelapse systems. v2 used the updated Raspberry Pi HQ Camera, a more serious plastic case, and a battery backup among other things.

So, the v2 setup is better by far than the first setup. The enclosure is actually waterproof, the images are higher quality, and the system runs even if the power goes out.

v3 is a slight upgrade adding a high dynamic range light sensor to speed up figuring out proper exposure for the photos.

[Photo: timelapse camera]

v3 Parts List

Item Cost Supplier
Raspberry Pi Zero Camera Cable $2.99
Raspberry Pi HQ Camera Wide Angle Lens $30.00
Raspberry Pi HQ Camera $50.00
Light Sensor – Adafruit TSL2591 $6.95
Raspberry Pi Zero W $10.00
Power Bank – RavPower 10000mAh 5V/3A* $33.27
MicroSD – SanDisk 128GB Extreme $23.99
USB Wall Charger – Ailkin $4.50
6″ USB A to Micro USB cable $4.00
6″ USB A to USB C cable $4.00
Power Cord $5.00
Lens Filter – Marumi 46mm EXUS** $34.68
Project Case Hammond – RZ0269C $17.30

*This power bank can do pass-through charging, so acts like a UPS for the Pi. In initial testing it seems to run the Pi and camera for around 24 hours.

**Good filters are really expensive. This filter seems to shed water and self-clean much better than others I’ve tried.

Annoyingly, the software has gotten complicated. It seems like you could just tell the camera to shoot a photo every 10 minutes and be good to go, but the camera's auto exposure only works during well-lit parts of the day. Once you hit dusk the photos are solid black, so I had to write an auto-exposure system.

Also, there are a couple of bugs in the newer Raspberry Pi HQ Camera software, and long exposures take 4x the shutter speed to complete. I think I've found workarounds for most of the bugs, and with a non-obvious set of options you can reduce the long exposure time to 2x the shutter speed.

Hopefully in the next few weeks I'll have some long-term samples from the garden timelapse. Sarah & I moved late December, so the final garden timelapse will run 05/19/2020 – 12/21/2020.

Long Term Timelapse

June 10th 2020

I’ve been working on a Raspberry Pi system to shoot a year long timelapse of the garden.

I did a couple of 24 hour test timelapses. The first trial was super simple: I had the camera shoot a photo once a minute. I discovered that the Raspberry Pi camera's auto exposure didn't work well at night. The night photos were completely dark, so I cut most of them out.

For the second trial I wrote some python to check if the auto exposure gave a correctly exposed result, and if not the program would increase the exposure until it hit the camera's maximum of 6 seconds at ISO 800. (I was also experimenting with HDR, which is why this clip looks particularly bad.)
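The exposure code isn't shown in the post, but the idea can be sketched as a ladder of shutter speeds that gets walked until a test frame is bright enough. The ladder values and the brightness target below are made up; on the real rig `capture` would shoot with the Pi camera and return the frame's pixels:

```python
# Exposure ladder in microseconds, topping out at the camera's
# 6-second limit. These steps and the target are assumptions.
SHUTTERS_US = [10_000, 100_000, 1_000_000, 3_000_000, 6_000_000]

def mean_brightness(pixels):
    """Average 0-255 value of a test frame's (grayscale) pixels."""
    return sum(pixels) / len(pixels)

def find_exposure(capture, target=40):
    """Walk up the ladder until a frame comes back bright enough.
    `capture(shutter_us)` returns a frame's pixel values; in practice
    it would drive the camera and load the resulting JPEG."""
    for shutter in SHUTTERS_US:
        if mean_brightness(capture(shutter)) >= target:
            return shutter
    return SHUTTERS_US[-1]  # give up at the 6 s / ISO 800 ceiling
```

The same loop could also step ISO, but bumping shutter speed first keeps the night frames as clean as possible.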

[Photo: Raspberry Pi timelapse camera in the eaves of the shed]

After getting things working more or less, I built a basic enclosure for the camera and Raspberry Pi out of a take-out soup container. I spray painted the whole container to protect it from UV, then cut the bottom off and siliconed a lens filter onto the bottom of the container. This all got jammed under the eave of the shed.

Right now the camera shoots a photo every 20 minutes, night and day. That doesn't sound like a lot until you do the math: 3 shots an hour, 24 hours a day, times 365 days is 26,280 photos. Each photo is about 5.3 MB, so a year of photos will take up about 140 GB. I'm guessing that processing the photos will take 2-4 times as much space since there will be several intermediate processing steps.
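As a quick check, the back-of-the-envelope math works out:

```python
frames_per_day = 3 * 24                 # one shot every 20 minutes
frames_per_year = frames_per_day * 365  # 26,280 photos
mb_per_frame = 5.3
total_gb = frames_per_year * mb_per_frame / 1000
print(frames_per_year, round(total_gb))  # 26280 139
```

139 GB of raws, before any of the intermediate processing copies.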

At this point I have a couple of weeks of photos. Processing the images to make a pleasing result is hard. The images during the day flicker like crazy from the sun going behind clouds, and the night photos are super dark until a car headlight shines into the yard or the moon comes out. The camera shifts very slightly too; I assume the plastic of the container is changing size slightly in response to temperature changes, or maybe the shed is wiggling slightly in the wind.

This is my current processing pipeline:

  1. An ImageMagick command takes each frame and tries to smash its histogram into a theoretically correct exposure, but the result is super harsh, so I average the original image with the corrected image:
    convert input.jpg ( +clone -equalize ) -average output.tif
  2. Then I run each frame through Fred’s ImageMagick script removecolorcast. This gives everything a flat blueish tone, but the images are much more consistent:
    removecolorcast input.tif output.tif
  3. I run all the frames through After Effects’ stabilize motion.
  4. To deal with the remaining flickering I average the images. I’m currently averaging batches of 12 images, so images 1-12 get averaged, then images 2-13, 3-14, 4-15, and so on:
    convert 0000.tif 0001.tif 0002.tif 0003.tif 0004.tif 0005.tif 0006.tif 0007.tif 0008.tif 0009.tif 0010.tif 0011.tif -average final_0000.tif
  5. All of the images get tweaked with Adobe Camera Raw. I’m cropping, resizing, fixing the color, sharpening, etc.
  6. Finally frames are made into a video with FFmpeg:
    ffmpeg -i %04d.tif -crf 15 final_video.mp4
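Step 4 above means one `convert ... -average` invocation per output frame. A sketch that generates those sliding-window commands (the filename pattern is taken from the example command in step 4):

```python
def window_cmds(n_frames, window=12):
    # One command per output frame; the 12-frame window slides
    # forward by a single frame each time.
    cmds = []
    for start in range(n_frames - window + 1):
        inputs = [f"{i:04d}.tif" for i in range(start, start + window)]
        cmds.append(["convert", *inputs, "-average", f"final_{start:04d}.tif"])
    return cmds
```

Each command could then be run with `subprocess.run(cmd, check=True)`, or printed out as a shell script.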

This is the full set of timelapse photos processed into a video:

I like where this video is going, but I find the changes from day to night kind of off-putting after a while. Also, the strobing effect of the light moving across the yard is pretty neat the first couple of times, but gets hard to watch.

I wrote some code to select a few photos from each day about 90 minutes before solar noon and then ran the same processing pipeline. It worked okay, but the light still changed more than I liked. The clip is about a second long, so it might look better when I have more photos.

So, I wrote some code that picked images that were relatively low contrast and really close to the same medium brightness. This cut my images down from ~1500 to ~250. I then ran the same processing pipeline on it, and got this result:
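That selection code isn't reproduced here; a sketch of the filter, working from precomputed per-frame stats where the mean is brightness and the standard deviation stands in for contrast. All the thresholds are guesses:

```python
def select_frames(stats, target_mean=110.0, mean_tol=8.0, max_stddev=45.0):
    # stats: list of (filename, mean_brightness, stddev) tuples on a
    # 0-255 scale, e.g. gathered beforehand with ImageMagick's
    # `identify -format` (which reports mean and standard deviation).
    keep = []
    for name, mean, stddev in stats:
        if abs(mean - target_mean) <= mean_tol and stddev <= max_stddev:
            keep.append(name)
    return keep
```

Tightening `mean_tol` trades frame count for consistency, which is roughly the ~1500 to ~250 cut described above.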

I’m pleased with this, but it still needs some refinement. Once I have a few months of photos I think I’ll have a much better handle on what the final product will look like in a year.

Reverse Engineering Oatly: Part 2

May 14th 2020

I read all the Oatly patents (Sarah translated the Swedish one) and watched videos and read the whole Oatly website. The key piece of information missing from the previous recipe I made is that Oatly uses a couple of enzymes to convert starch in oats to sugars.

[Photo: homemade oatmilk in coffee]

All of this oat research excited Sarah, and we've been working together to design the perfect oatmilk. We want something slightly sweet, with the thickness of whole milk or cream, and it has to be great in coffee — no splitting!

The American Oatly patent has a sample recipe with huge quantities — in summary:

  1. Steam dehulled oats.
  2. Wet grind the oats at 129F (54C).
  3. Add water plus alpha amylase, beta amylase, and protein-glutaminase enzymes.
  4. Cook at 133F (56C) for 2 hours.
  5. Heat to 203F (95C) to deactivate enzymes.
  6. Cool to room temperature and decant.
  7. Dilute with water, then add oil, vitamins, salt, and various calciums.
  8. Pasteurize and package.

With the ingredients list from my previous trials and the patents we started to design a recipe.

We looked at using enzymes directly, but decided to try using malted barley. Malting grain produces alpha and beta amylase, and malt is easy to get from brewing suppliers. (We do have some enzymes on order for testing too!)

Data from brewing charts suggests that we do a one hour cook at 148F (64C) for the beta-amylase, a second cook at 158F (70C) for the alpha-amylase, and at the end we bump the temperature up to 197F (92C) which denatures the enzymes. (We are using 197F/92C because that’s the max temperature of the immersion circulator we own.)

After a few trials we had a breakthrough when we found out about toasting the oats before processing them. It really lowers the oat smell and gives the resulting oat milk a light pleasant roasted flavor.

Our Current Recipe

This recipe compares favorably to Oatly, and is better than the other commercial oat milks that we have tried. We met all of our oat milk goals, but it could always be better, so we’re still experimenting. Watch out for updates.

This recipe makes a batch of about 17oz (500ml) after filtering and takes approximately 3 hours (which sounds long, but it’s mostly waiting).

Ingredient Amount
Malted Barley 6
Canola/Rapeseed Oil 18
  1. Pre-heat the immersion circulator bath to 148F (64C).
  2. Toast oats in the oven at 250F (121C) for 8 minutes.*
  3. Add oats and malted barley to water.
  4. Blend until fine; add mixture to a 1 quart (~1 liter) Mason jar.
  5. Put the jar in the water bath at 148F (64C) for 1 hour; shake the jar every 15 minutes.
  6. Increase the water bath temperature to 158F (70C) and cook the oat mixture for an additional 1 hour; shake the jar every 15 minutes.
  7. Increase the water bath temperature to 197F (92C). Once the water bath reaches temperature, wait 10 minutes to ensure that the oat mixture has come up to temperature too.
  8. Remove the jar from the water bath and allow it to cool to ~110F (43C). The temperature isn't critical here; just cool enough that you don't burn yourself.
  9. Filter the oat mixture through a mesh kitchen strainer, and then through a reusable gold coffee filter.
  10. Blend the salt and oil into the filtered oat mixture. If you see oil floating on top of your oat mixture, blend more.
  11. Chill & drink!

*Depending on the kind of oats the toasting time might be different. We suggest doing a test toast of the oats that you're using. Preheat your oven to 250F (121C), put some oats on a cookie sheet, and set a timer for 4 minutes. At 4 minutes, open the oven, grab a few oats, close the oven, and set your timer for another 4 minutes. Taste the oats; when they're done they will have a hint of roastiness with no bitter/burnt flavor. Repeat until you figure out the perfect roasting time. Different brands of oats have required between 4 and 12 minutes. Instant oats seem to need longer, while fancier non-instant oats need shorter times.

This filter combination seems to be sufficient; it doesn't leave sediment in the oat milk and is much faster than paper coffee filters or kitchen towels. Protip: for faster filtering, slowly run a spoon over the inside of the gold coffee filter to move the filtered material out of the way.
