Warning: this post is not about programming. Andy suggested I should sneak some homebrewing material onto my blog, and I thought hmm, why not? If you violently object, let me know in the comments and I won't do it again :-)
Yesterday I brewed a traditional German hefeweizen, which was the first time I tried a decoction mash. This got me thinking about how incredibly accurate traditional brewers managed to be, long before the arrival of chemistry, biology, or scientific measuring tools. This particular trick might be obvious to people with a stronger background in physics, but I thought it was pretty amazing. So, a puzzle...
Hans is a 17th-century German brewer. He wants to attract a loyal customer base who will seek out his beer in preference to that of competitors. To achieve this, he must brew beer of consistently high quality, batch after batch.
One of the key steps in brewing is heating a mix of water and crushed malted barley (called the "mash") to 152 °F, where amylase enzymes convert starch molecules into sugars that yeast can later ferment into alcohol. But Hans does not have access to a thermometer, so he has no reliable way to hit the right temperature. He can estimate by sticking his finger in the mash, but even with lots of practice this is horribly inaccurate. Some days he ends up at 148 °F, which produces a dry, thin beer; other days he ends up at 158 °F, which gives a thick, syrupy end result. One time he overshot all the way to 170 °F, which denatured the enzymes entirely, leaving him with a couple of tons of watery grain that could no longer be used to make beer at all!
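Since this blog is nominally about programming anyway, here's a quick back-of-the-envelope sketch of why guesswork fails so badly. When you blend two masses at different temperatures, the result is roughly a mass-weighted average of their temperatures, so any error in the temperature of hot water stirred into a mash lands directly in the mash itself. The numbers below are invented purely for illustration, and the model assumes grain and water have equal specific heats and that no heat is lost to the kettle or the air; real mashes honor neither assumption.

```python
def mix_temp(mass_a, temp_a, mass_b, temp_b):
    """Equilibrium temperature of two bodies with equal specific heat:
    a simple mass-weighted average (ignoring heat losses)."""
    return (mass_a * temp_a + mass_b * temp_b) / (mass_a + mass_b)

# Hypothetical numbers: 10 lb of mash resting at 70 °F, dosed with
# 17 lb of hot water that Hans *believes* is at 200 °F.
for water_temp in (195, 200, 205):
    result = mix_temp(10, 70, 17, water_temp)
    print(f"water at {water_temp} °F -> mash at {result:.1f} °F")
```

Stirring in hot water is just one way to raise a mash's temperature, but the same averaging arithmetic governs any approach. A five-degree misjudgment in water Hans cannot measure is the difference between thin (149 °F), on target (152 °F), and syrupy (155 °F): exactly the spread of failure modes described above.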
Is it possible to do better? How can Hans heat his mash to exactly 152 °F without the use of a thermometer?