Even though RedLine13 uses AWS to spin up load agents for testing, our service also plays well with other cloud platforms. In this article we will explain some of the ways RedLine13 can work in a mixed environment, including real-world examples from our customers.
The simplicity of RedLine13 comes from its ability to abstract nearly every aspect of performance testing. We handle everything from provisioning load agent EC2 instances and installing test suites such as JMeter to fully managing test execution. When a test completes, we even collect the results, clean up, and power down those instances.
However, many architectures are hybrids, assembled from offerings sourced from multiple cloud providers. There are often good logistical reasons for this, and it is increasingly difficult to find architectures built exclusively under a single cloud provider's roof. That can be a good thing, since it makes it possible to combine the best aspects of each offering.
There are a few reasons why we use Amazon EC2 instances as the sole platform for load agent worker machines. The most obvious is cost, which allows us to pass savings along to our customers. Another practical reason is simplicity of configuration. This indirectly supports cost savings, but more importantly it confers reliability and stability. Load tests in their purest form are experiments, and their value is directly tied to reproducibility. There is little difference between virtual machines running in Azure vs AWS vs Google Cloud Platform; by delivering a core feature from a single platform, however, we can focus on making our service as solid as possible.
RedLine13 is Compatible
Even though we use EC2 instances exclusively as load agents, nothing inherently makes our architecture incompatible with any other cloud provider. In fact, it is often advantageous to generate load from an origin different from where your application is hosted, since doing so better simulates organic user traffic. Whether that traffic flows from one AWS region to another, or from AWS to Azure, is inconsequential: RedLine13 will facilitate it without any special configuration.
Occasionally we receive a request for the ability to spin up load agents in an Azure account. Some of the reasons for not doing this are described above; another is cost. Using Linux-based EC2 agents in AWS is by far the least costly option, and we pass those cost savings on to our customers. We add no markup whatsoever, regardless of how many load agents, or of what type, are used in a test.
Recently, one RedLine13 customer had a large enterprise deployment primarily set up as an Azure classic Cloud Service, with several components in AWS as well. They needed to test various components of this architecture. Using RedLine13, their performance test engineer was able to easily devise test plans that not only tested each stack independently, but also spanned the gap between these service offerings.
Testing was done in two phases. The first phase involved testing the Azure side of the architecture, focusing on both capacity testing of the routing architecture and throughput testing of the application instances. By scaling up load tests from RedLine13, they tested the primary application architecture to tens of millions of requests per hour.
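To give a feel for what a test at that scale involves, here is a back-of-the-envelope sizing sketch. The per-thread rate and per-agent thread count below are illustrative assumptions for the example, not the customer's actual figures.

```python
import math

# Assumed target: "tens of millions of requests per hour" -- we use
# 20 million here purely for illustration.
target_per_hour = 20_000_000
target_per_second = target_per_hour / 3600

# Assumed sustained rate a single JMeter thread can drive (varies
# widely with response times and think time).
req_per_thread = 2.0
threads_needed = math.ceil(target_per_second / req_per_thread)

# Assumed thread capacity of one EC2 load agent (depends on instance
# size and test plan complexity).
threads_per_agent = 500
agents_needed = math.ceil(threads_needed / threads_per_agent)

print(round(target_per_second))       # ~5556 requests/second
print(threads_needed, agents_needed)  # 2778 threads across 6 agents
```

Under these assumptions, a handful of load agents is enough; RedLine13 handles spinning them up and tearing them down automatically.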
The second phase of testing involved a load test from RedLine13 against the complex supplementary API based in AWS. This particular system worked by aggregating "big data" feeds from the application and storing them in S3, where Athena exposed the data to a Lambda function accessible through an API Gateway.
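A minimal sketch of that pipeline's entry point might look like the following. This is a hypothetical handler pattern, not the customer's code: the database, query, and bucket names are invented, and the Athena client is injectable so the logic can be exercised without an AWS account.

```python
import json
import time

def run_athena_query(athena, database, query, output_location, poll_interval=1.0):
    """Start an Athena query over the S3-backed data and wait for it to finish."""
    execution = athena.start_query_execution(
        QueryString=query,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_location},
    )
    query_id = execution["QueryExecutionId"]
    while True:
        status = athena.get_query_execution(QueryExecutionId=query_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return query_id, state
        time.sleep(poll_interval)

def lambda_handler(event, context, athena=None):
    """API Gateway entry point; `athena` is injectable for local testing."""
    if athena is None:
        import boto3  # real deployments use the boto3 Athena client
        athena = boto3.client("athena")
    query_id, state = run_athena_query(
        athena,
        database="analytics",                           # hypothetical name
        query="SELECT count(*) FROM events",            # hypothetical query
        output_location="s3://example-results/athena/", # hypothetical bucket
    )
    return {
        "statusCode": 200 if state == "SUCCEEDED" else 502,
        "body": json.dumps({"queryExecutionId": query_id, "state": state}),
    }
```

Because Athena queries are asynchronous and can take seconds, a load test against an endpoint like this exercises the polling latency as much as the query itself, which is exactly the kind of behavior this second phase was designed to measure.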
Of the multitude of third-party performance monitoring services out there, New Relic is arguably among the most popular. In this case, New Relic metrics were used to supplement performance metrics from both Azure and AWS in the final aggregated results. Post-test analysis of the data produced graphs that helped the business team make informed decisions about cloud resource choices. With a system this large, those savings amounted to hundreds of thousands of dollars per year.
Given current trends, odds are your next performance testing project will span multiple cloud providers. RedLine13 makes perfect sense as a framework for testing across AWS and other cloud platforms.
Did you know that we offer a free trial which allows you to try almost all the features of RedLine13 for yourself? Sign up today and get started!