It doesn't really work out that cleanly. If each hive has a 30% chance of loss, and each hive is an independent trial, it's like flipping a coin. We know that if we flip a coin an infinite number of times, half will come up heads and half tails. But if I flip it only 10 times, each combination of counts has its own probability. I could get all tails; that's possible, and it has a probability, albeit a low one. I have a probability of getting 3 tails and 7 heads; that's closer to half-and-half, so that probability is a little higher. When I start asking about things like "at least 3 tails," there are formulas with lots of sums that tell us how to calculate this. The sums are so annoying and long to work out that before computers there were tables where you'd look up these probabilities.
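Those "formulas with lots of sums" are the binomial distribution, and the sums are painless to do by machine. Here's a small sketch (my own example, not from those old tables) that computes the "at least 3 tails in 10 flips" case:

```python
from math import comb

def prob_at_least(n, p, k):
    """P(at least k successes in n independent trials, each with probability p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Chance of at least 3 tails in 10 fair coin flips:
# 1 minus the chance of 0, 1, or 2 tails = 1 - 56/1024
print(prob_at_least(10, 0.5, 3))  # → 0.9453125
```

So "at least 3 tails" is almost a sure thing, about 94.5%, even though any one exact count is fairly unlikely.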

So think of the hives the same way. If each hive has a 36.7% chance of loss and you have an infinite number of them, your loss will tend toward 36.7%. But if you have 10 or 20 hives, each combination of dead/alive has a probability. My numbers work out so that, with a 36.7% chance of death for each hive, if I want less than a 5% chance of losing more than half my hives, I need 26 hives to make that happen. With 26 hives, the chance of having more than 13 die is less than 5%. You can calculate any percentage you'd like, though. With those 26 hives and a 36.7% chance of loss for each one, what's my chance of losing at least 10 of them? 50%. You've got a 50% chance of losing 10 hives or more, because that includes the probability of losing 10, 11, 12, 13, 14... all the way to 26. If I lower the chance of loss to 29.5%, what's the chance of losing 10 hives or more? 21%. Now I only have a 21% chance of losing 10 hives or more. My point is that I believe every point you shave off the chance of loss makes a fairly large difference. It isn't just the difference of a fraction of a hive.
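The hive numbers above come from the same binomial tail sum, assuming each hive lives or dies independently. A quick sketch to check them (the thresholds and loss rates are the ones quoted above):

```python
from math import comb

def prob_at_least(n, p, k):
    """P(at least k losses among n hives, each lost with probability p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 26 hives at a 36.7% loss rate: chance of losing more than half
# (i.e. at least 14 of them)
print(prob_at_least(26, 0.367, 14))  # ≈ 5%

# 26 hives at 36.7%: chance of losing at least 10
print(prob_at_least(26, 0.367, 10))  # ≈ 50%

# Same 26 hives, loss rate lowered to 29.5%: at least 10
print(prob_at_least(26, 0.295, 10))  # ≈ 21%
```

Dropping the per-hive loss rate by about 7 points cuts the chance of a 10-hive disaster from roughly half to roughly a fifth, which is the whole point: small improvements in the per-hive rate compound across the tail of the distribution.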

## Bookmarks