Amazon Web Services Cloud Credits Available

The Technology Services Amazon Web Services (AWS) team is positioned to assist you with the AWS Cloud Credits for Research Program. Through this program, Amazon directly awards researchers credits for the use of AWS to enhance cloud-based research.

If you apply for credits, you are encouraged to inform the Tech Services AWS team by emailing aws-support@illinois.edu. Doing so allows the Tech Services AWS team and its Amazon Account Managers to assist with the AWS process and improves the chances that credits will be awarded.

The AWS Cloud Credits for Research Program is specifically designed to award credits to those researchers who:

  1. Build cloud-hosted publicly available science-as-a-service applications, software, or tools to facilitate their future research and the research of their community.
  2. Perform proof of concept or benchmark tests evaluating the efficacy of moving research workloads or open data sets to the cloud.
  3. Train a broader community on the usage of cloud for research workloads via workshops or tutorials.

The program is not oriented towards providing credits in support of ongoing operations.

Proposals are reviewed once a quarter by research experts at AWS, and awards take the form of promotional credits to be used on AWS services. The quarterly deadlines for submitting grant applications are:

  • March 31
  • June 30
  • September 30
  • December 31

Decisions are typically communicated 2-3 months following the respective quarterly deadline.

To apply, visit https://aws.amazon.com/research-credits/.

MRI takes to the Cloud

Illinois is deeply tied to the development of MRI technology: Paul Lauterbur, the Nobel Prize-winning inventor of the technology, served on the faculty at Illinois for 22 years. Now Illinois researchers are innovating again, turning to the power of the cloud to process MRI data faster and more cost-effectively than ever.

Brain imaging. Image provided by Dr. Sutton.

Dr. Brad Sutton talks about how moving MRI data processing to Amazon Web Services (AWS) is helping researchers at Illinois contribute to our evolving understanding of the human brain.

Technology Services (Tech Services): What does a typical project at the Beckman Institute’s Biomedical Imaging Center (BIC) look like?

Brad Sutton (BPS): Our most common imaging projects are neuroimaging, where research groups are trying to identify brain-based biomarkers of physiological differences in the brain, either between two groups of subjects (such as younger and older adults) or before and after an intervention. Interventions could include a wide variety of things, including aerobic exercise, cognitive training, or nutritional supplements.

We collect 1-2 hours of neuroimaging data on our MRI scanners, and then a variety of post-acquisition image processing and statistical steps need to be done to determine what is different in the brain measures between the two groups. Our MRI acquisitions provide information about brain structure, anatomy, blood flow, connectivity between different regions of the brain, and brain function during particular tasks or at rest. We also have methods that measure the mechanical properties of the brain or other aspects of brain physiology.

Image processing steps can be very computationally intensive. One typical measure that we may get (which is the one I did on AWS) is a structural connectivity map, looking at the white matter (the cabling in the brain) to see the likelihood that different regions of the brain are wired together. A typical workflow to get structural connectivity requires about 16 hours of processing to segment an individual’s brain into distinct, labeled regions; about 16 hours to process diffusion-weighted data to determine which direction the cabling is going at every region in the brain; and then about 12 hours to determine which regions are connected to which other regions. When a typical study includes 50-100 subjects, this can exceed the computational capabilities of a particular lab. And this is for only one type of measure that we will extract from a neuroimaging session. Other types of measures, such as functional connectivity, may take similar amounts of time.
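For a rough sense of scale, the per-subject hours Dr. Sutton quotes can be totaled in a few lines of Python. The figures below simply reuse his approximate numbers; they are illustrative, not measured benchmarks.

    # Back-of-the-envelope estimate using the approximate per-subject hours
    # quoted above; rough figures, not measured benchmarks.
    SEGMENTATION_HOURS = 16   # parcellate the brain into distinct, labeled regions
    DIFFUSION_HOURS    = 16   # fit fiber directions from diffusion-weighted data
    CONNECTIVITY_HOURS = 12   # determine which regions connect to which

    per_subject = SEGMENTATION_HOURS + DIFFUSION_HOURS + CONNECTIVITY_HOURS  # 44 hours

    for subjects in (50, 100):
        total = subjects * per_subject
        print(f"{subjects} subjects: {total:,} compute-hours "
              f"(~{total / 24:.0f} days if run back-to-back on one machine)")

Run serially, a single structural connectivity measure for a 50-100 subject study works out to roughly 2,200-4,400 compute-hours, which is why a lone lab workstation quickly becomes the bottleneck.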

Tech Services: Why did you want to use AWS? In particular, what features of AWS make your work/BIC’s work easier?

BPS: The project that I ran on AWS had ~230 subjects who were part of an intervention. For each subject, we had measures taken pre-intervention and post-intervention, meaning 460 data sets from which to extract the structural connectivity measures described above. We had run some test analyses on our own private cloud at BIC using the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC, www.nitrc.org) computational images, which have all the common neuroimaging software installed. Our private cloud runs Eucalyptus and is similar to AWS, but it can only launch 30 dual-core workstations. The data in this project required analysis to be done quickly in order to meet grant-sponsor deadlines and to address our hypotheses before the data set had to be shared with other neuroimaging researchers. Given that we had processing pipelines tested on the NITRC images, and that these NITRC images are available for use on AWS, we decided to use the scaling capabilities of AWS to run all 460 analysis runs in parallel, potentially getting our entire processing pipeline done on all subjects in about 2 days.
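The pattern Dr. Sutton describes maps naturally onto launching one compute instance per analysis run. The sketch below illustrates that idea with boto3, the AWS SDK for Python; the AMI ID, instance type, bucket name, and pipeline script are placeholders invented for the example, not details of the BIC project.

    # Illustrative sketch: launch one EC2 instance per subject/session run.
    # AMI ID, instance type, bucket, and script path are hypothetical.
    import boto3

    ec2 = boto3.client("ec2")

    NITRC_AMI = "ami-0123456789abcdef0"   # placeholder for a NITRC-CE image
    sessions = [f"sub-{i:03d}_{visit}" for i in range(1, 231)
                for visit in ("pre", "post")]            # 460 analysis runs

    for session in sessions:
        # The user-data script runs at boot: pull inputs from S3, run the
        # connectivity pipeline, push results back, then shut down.
        user_data = "\n".join([
            "#!/bin/bash",
            f"aws s3 cp s3://example-bic-study/input/{session}/ /data/ --recursive",
            "/opt/pipeline/run_connectivity.sh /data",
            f"aws s3 cp /data/results/ s3://example-bic-study/results/{session}/ --recursive",
            "shutdown -h now",
        ])
        ec2.run_instances(
            ImageId=NITRC_AMI,
            InstanceType="m4.xlarge",
            MinCount=1,
            MaxCount=1,
            UserData=user_data,
            InstanceInitiatedShutdownBehavior="terminate",
        )

Because each run gets its own instance, the whole batch finishes in roughly the wall-clock time of the slowest single subject, which is what makes an "about 2 days" turnaround plausible (subject to account instance limits).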

Tech Services: What were you using before AWS?

BPS: The Biomedical Imaging Center has a Neuroimaging Compute Cloud, called BICNICC, that runs Eucalyptus and enables users of BIC to access properly configured machines to process their data. Some disk images on BICNICC include custom image reconstruction software that enables users to collect advanced acquisitions for which the MRI scanner cannot produce images on its own. BICNICC lets us provide a flexible environment for distributing tested workflows, though with limited scaling. The NITRC image is the most popular software on BICNICC, as it includes the main neuroimaging processing software packages. Many psychology and neuroscience users only have Windows workstations in their own labs, but this neuroimaging software must be run on a Linux workstation. The BICNICC resource, coupled with the NITRC image, gives users access to a Linux workstation through their web browser.

Tech Services: Can you describe the MRI processing workflow before and after AWS? What has changed?

BPS: AWS has enabled us to scale our processing capabilities. It is an ideal setup: users can test something locally without additional costs, on software similar to what they will have access to on AWS. They can run small pilot studies for minimal cost. When they are ready to scale up and run their full analysis, the transition requires minimal changes to their processing scripts: data is accessed through S3 on AWS instead of through a network drive mount. An additional benefit of running this large INSIGHT data set on AWS is that it has provided useful information on how much this scaling costs. We were surprised by the low costs associated with storing a very large dataset. We were also surprised to be able to get this large computing power at a cost of basically $5 per subject for analysis. Compared to the costs of acquiring the data, this is a very small price to get all the results almost immediately.
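To illustrate how small that script change can be, the sketch below swaps a network-mount path for S3 calls using boto3. The bucket and key names are invented for the example and do not reflect the study's actual layout.

    # Illustrative sketch of the change described above: fetch inputs from S3
    # rather than from a network drive mount. Bucket and key names are invented.
    import boto3

    s3 = boto3.client("s3")

    # Before (local private cloud): the pipeline read directly from a mount,
    # e.g. /mnt/bicnicc/sub-001_pre/dwi.nii.gz
    # After (AWS): the same file is first pulled down from an S3 bucket.
    s3.download_file("example-bic-study",
                     "input/sub-001_pre/dwi.nii.gz",
                     "/data/sub-001_pre/dwi.nii.gz")

    # ... the processing pipeline itself runs unchanged against /data ...

    # Results are copied back to S3 when the run completes.
    s3.upload_file("/data/sub-001_pre/connectivity_matrix.csv",
                   "example-bic-study",
                   "results/sub-001_pre/connectivity_matrix.csv")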

Tech Services: What does this increased processing capacity allow you to do?

BPS: There are quite a few benefits to this scaling. First, we can try several different analysis parameters and see how sensitive our results are to the specific parameters we use. Previously, the computational requirements could often only just be met for a single analysis run. Now, we can explore how a finer-scale parcellation of the brain into more regions impacts the specificity of the structural connections in the brain.

Second, we can do large intermediate runs in order to be ready for a grant-sponsor site visit or to have preliminary results for a conference presentation. Previously, we would either make an analysis decision based on the software available at the start of a multi-year project and try to keep up with the data acquisition, or we would wait until the end and then spend significant focused time applying the analysis to all the data. The AWS workflow enables us to update pipelines as new software becomes available.

Tech Services: Can you name a few specific projects that have really benefitted from the switch to AWS?

BPS: Since we just did this run in the last couple of months, other users are just now starting to include the modest costs of AWS in their grant submissions. We will be able to roll in wide-scale use of AWS resources as the new projects are started.

If you are a University of Illinois researcher interested in using AWS in your work, please visit https://aws.illinois.edu/ or email aws-support@illinois.edu for more information.

AWS S3 Outage

Amazon has posted their summary of this week’s S3 disruption in us-east-1. While this was just 1 of 60 services in 1 of 16 regions, it had an outsized impact on operations. A number of AWS components and third-party services depend on S3 in us-east-1, and the outage caused widespread service disruptions across the internet.

S3 was the first publicly available Amazon service, and us-east-1 was the first AWS region, which helps explain why so many services were built on this particular instance of the service.

In the summary, Amazon transparently details what went wrong as well as the measures they’re taking to ensure that this class of mistake cannot reoccur. The lesson I’m taking from this is to expect failures, but ensure that you never fail the same way twice.

Public Cloud Services Comparison

Have you ever gotten the feeling that cloud providers are trying to confuse you with all their vendor-specific names for commodity services? Ever wondered what major services are available on which clouds? Have I got a resource for you!

Public Cloud Services Comparison

While we don’t yet have a way to buy Microsoft Azure or Google Cloud services on campus, it’s good to know what’s out there and how things compare.