Canary deployments of IIS using Octopus & AWS
I'm writing this more to clear my head than anything else; if it helps someone, great.

We have measured significant 'release impact' when deploying one of our core applications. The main problem is initialization/warmup of the appPool. We've tried the built-in methods, but for whatever reason we are always stuck with ~25s of dead time while the first request warms things up (we assume it's warming things up; we're not really sure what is happening). After that 25s wait, things are very snappy and fast. So the question is: how do we prevent all of our web servers from going into 25s of dead time with production traffic inbound?

Starting point - why do this?

We care about our customers, and we want to help drive our business forward with as much quality, safety, and speed as possible. Because we want to drive our business forward, we are pushing to do more and more deploys (currently we do a daily deploy, but we want to see 5x that). (If you have to ask why we want 5x, read this ...
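For anyone following along, the built-in method we tried is IIS Application Initialization. A minimal sketch of the web.config side looks roughly like this (the /warmup path is a placeholder for whatever endpoint exercises your startup code):

```xml
<configuration>
  <system.webServer>
    <!-- Tell IIS to fire these requests itself after a restart/recycle,
         before real traffic is allowed to hit the cold appPool -->
    <applicationInitialization doAppInitAfterRestart="true">
      <add initializationPage="/warmup" />
    </applicationInitialization>
  </system.webServer>
</configuration>
```

This only helps if the appPool is also configured with startMode="AlwaysRunning" and the site with preloadEnabled="true" (both set in applicationHost.config or via the IIS manager); without those, the first external request still pays the warmup cost.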