I’d like to share an Amazon blog entry detailing Georgia Tech’s use of AWS to guarantee emergency communications.
In the event of an emergency, Georgia Tech’s communications team directs clients to their emergency website, a quick-loading static site hosted on Amazon S3. By using Amazon’s services, Georgia Tech can be confident that they have the capacity to communicate with an audience of any size at any time.
S3’s standard tier stores a minimum of three copies of data across multiple physical locations within a region.
S3 is built to be scalable, spreading load across a huge number of systems. Adding thousands of additional requests per minute is inconsequential.
Since Amazon bills for actual usage, the main steady-state cost is storing the web content. The site the blog describes probably totals a few megabytes. At $0.03 per gigabyte per month, that’s going to round up to a penny.
Even during a crisis, with a large number of clients repeatedly reloading the page, Georgia Tech estimates their costs will max out at twenty-some dollars.
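The arithmetic is easy to sanity-check. The sketch below uses assumed figures (site size, page size, request volume) and approximate historical S3 prices; none of these numbers come from the blog itself, and AWS pricing changes over time:

```python
# Rough cost check with assumed prices: storage at $0.03/GB-month,
# GET requests at $0.004 per 10,000, data transfer out at $0.09/GB.
# SITE_MB, PAGE_KB, and the request count are illustrative guesses.
SITE_MB = 5                      # assumed total size of the static site
PAGE_KB = 50                     # assumed size of the emergency page

storage = (SITE_MB / 1024) * 0.03            # monthly storage cost
print(f"storage: ${storage:.5f}")            # a fraction of a penny

# A hypothetical crisis month: two million page loads
requests = 2_000_000
request_cost = requests / 10_000 * 0.004
transfer_cost = requests * PAGE_KB / 1024 / 1024 * 0.09
print(f"crisis total: ${request_cost + transfer_cost:.2f}")
```

Even at two million page loads, the total lands around ten dollars, comfortably inside the estimate above.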
Instant updates via a simple API
Content is managed through a simple web application running on EC2 instances. When new information is available, the communications team can log into that application, update the content, and publish a new page to S3. The updated content is immediately available to clients.
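The blog doesn’t show the publishing code, but the S3 side of a publish step is a single PutObject call. Here’s a minimal sketch of what that might look like with boto3; the bucket name and the `Cache-Control` choice are my assumptions, not details from the article:

```python
# Hypothetical sketch of publishing an updated page to S3.
# BUCKET is a placeholder; the real app's names are not public.
import mimetypes

BUCKET = "emergency-site-example"  # placeholder bucket name

def build_put_request(key: str, body: bytes) -> dict:
    """Build the parameters for an S3 PutObject call."""
    content_type, _ = mimetypes.guess_type(key)
    return {
        "Bucket": BUCKET,
        "Key": key,
        "Body": body,
        "ContentType": content_type or "application/octet-stream",
        # Ask clients to re-fetch rather than serve a stale cached copy
        "CacheControl": "no-cache",
    }

def publish(key: str, body: bytes) -> None:
    import boto3  # requires AWS credentials to be configured
    boto3.client("s3").put_object(**build_put_request(key, body))
```

Because S3 serves the object directly, there is no cache layer between the publish call and the client: the next page load gets the new content.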
I’d initially thought that CloudFront might make a good addition to this design, since it speeds up content delivery and lowers costs even further. But CloudFront requires 10-15 minutes to process a cache invalidation request, which could delay critical communication in an emergency situation.
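For context, here is what that extra step would look like: every publish would have to be followed by an invalidation request like the sketch below (the distribution ID is a placeholder), and the content would only be reliably fresh once CloudFront finished processing it:

```python
# Hypothetical sketch of the CloudFront invalidation a publish would
# require; DISTRIBUTION_ID is a placeholder. The processing delay on
# this request is why CloudFront was left out of the design.
import time

DISTRIBUTION_ID = "EDFDVBD6EXAMPLE"  # placeholder distribution ID

def build_invalidation_batch(paths: list[str]) -> dict:
    return {
        "Paths": {"Quantity": len(paths), "Items": paths},
        # CallerReference must be unique per invalidation request
        "CallerReference": str(time.time()),
    }

def invalidate(paths: list[str]) -> None:
    import boto3  # requires AWS credentials to be configured
    boto3.client("cloudfront").create_invalidation(
        DistributionId=DISTRIBUTION_ID,
        InvalidationBatch=build_invalidation_batch(paths),
    )
```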