AWS Simple Storage Service (S3 for short) is a cheap, secure, durable, highly available object storage service for storing an unlimited amount of data. S3 is simple to use and, in practical terms, scales without limit. In S3, data is stored redundantly across multiple devices in a minimum of three Availability Zones. S3 is highly flexible and can store any type of data; it is a key-based object store built on a global infrastructure, and using it is as simple as making standard HTTPS calls.
With its ever-growing popularity and ease of use, S3 provides many features and keeps evolving to enable a large number of data storage and management use cases across the industry, big or small. But there is something that has everyone worried: S3 usage costs. Costs are not a worry on their own, since S3 is indeed quite an inexpensive storage option on the cloud. What bothers people is the limited visibility into the various factors that affect cost calculations; these factors are too detailed to be taken lightly. In this article, we will take a scenario, neither trivial nor very complex, and calculate its usage costs, which will hopefully make us aware of the fine details.
S3 supports an incredibly wide range of data storage and management scenarios, and a large number of its technical features and usage patterns carry attendant costs. Given all these features, it becomes quite challenging to visualize and evaluate how costs add up. Typically, S3 cost is calculated from data storage, S3 API usage (requests and data retrievals), network data transfer, and the other S3 data-management features and operations (management and replication). There are different data storage classes in S3, each priced differently; pricing also varies across API calls for data usage such as uploads, listing, downloads, and retrievals. S3 pricing is listed at https://aws.amazon.com/s3/pricing/. We will refer to this page for all calculations.
We have our S3 bucket in the Mumbai region. Suppose that, starting 1st March 2020, we upload 10,240 GB (10 TB) of data per day, every day, until 31st March 2020, as 10 million objects per day corresponding to each day's 10 TB. This data is uploaded from application servers to the Standard storage class of the S3 bucket using S3 Object PUT requests. We will assume that each object is larger than the S3 Intelligent Tier minimum object size criterion of 128 KB. Also, we will assume that objects in the S3 Intelligent Tier spend their first 30 days in the Frequent Access Tier and the next 30 days in the Infrequent Access Tier. The next assumption is that each day's upload occurs at 00:00 (midnight), so new data is billed for the full 24 hours of its upload day. In addition, we assume that this is a single AWS account billed individually, that it is not part of any AWS Organization with other member accounts, and that the free tier coverage is over. We will use the S3 pricing page for all the price rates and units for calculations pertaining to the Mumbai region.
We can create Lifecycle Transitions to move data between storage classes and ensure lower storage cost. Here, our Lifecycle configuration works as follows: objects transition from S3 Standard to the S3 Intelligent Tier 30 days after creation, then to S3 Glacier Deep Archive 90 days after creation, and finally expire (are deleted) 120 days after creation.
Now, without further theoretical talk, let’s begin the journey of cost calculations.
Cost of Uploads
We calculate the price of uploads, i.e., PUT API calls as shown below:
(A) So, the cost of Total PUTs = 10,000,000 * $0.005 / 1000 * 31 = $1550
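As a quick sanity check, the PUT arithmetic can be sketched in Python, using the Mumbai-region PUT rate quoted above:

```python
# PUT request pricing sketch; $0.005 per 1,000 requests is the
# Mumbai-region S3 Standard PUT rate used in the article.
PUT_RATE_PER_1000 = 0.005
objects_per_day = 10_000_000
days = 31  # 1st March to 31st March

put_cost = objects_per_day * days * PUT_RATE_PER_1000 / 1000
print(f"PUT cost for March: ${put_cost:.2f}")
```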
Storage cost
Now, we will calculate the price of the total data stored in S3 Standard for the month of March (1st to 31st March). We must calculate storage in GB-Month because data moves to other tiers during the month, so each object adds to the monthly price only for the amount of time it is stored in S3 Standard. Accordingly, the price for data storage for this one-month period is as follows:
Since data is changing every day, we will calculate the effective storage for March in terms of GB-Month. This calculation will signify that the changing data size, on a cumulative basis, was in total that much GB per month for the month of March. Notably, any new data uploaded to S3 Standard will remain there for 30 days. So, the calculation of cost for March would be as follows:
(B) The cost of storage for the month of March is 50 TB-Month at $0.025 per GB + 109.677 TB-Month at $0.024 per GB = 50 * 1024 * $0.025 + 109.677 * 1024 * $0.024, amounting to $3975.42
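The March GB-Month figure can be reproduced with a short loop; the tier split (first 50 TB at $0.025/GB, the rest at $0.024/GB) follows the Mumbai rates used above:

```python
daily_upload_gb = 10 * 1024     # 10 TB uploaded at midnight every day
days_in_month = 31
standard_days = 30              # lifecycle moves data to S3 IT after 30 days

# GB-days each daily batch spends in S3 Standard during March; the batch
# uploaded on day d stays for up to 30 days, truncated at month end
gb_days = sum(
    daily_upload_gb * min(standard_days, days_in_month - day + 1)
    for day in range(1, days_in_month + 1)
)
gb_month = gb_days / days_in_month              # ~159.677 TB-Month

first_tier_gb = min(gb_month, 50 * 1024)        # first 50 TB at $0.025/GB
cost = first_tier_gb * 0.025 + (gb_month - first_tier_gb) * 0.024
print(f"{gb_month / 1024:.3f} TB-Month, ${cost:.2f}")
```

The result, about $3,975.43, matches (B) up to rounding of the 109.677 TB-Month figure.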
Transition to Intelligent Tier
After 30 days from creation, data would be moved to the S3 Intelligent Tier and remain there for the next 60 days. For the first 30 days, it would be in Frequent Access Tier; then it will move to the Infrequent Access Tier for the remaining days. For simplicity, we assume that there is no access while data is stored in S3 IT. If some data is accessed frequently beyond the first 30 days in S3 IT, such data would remain in the Frequent Access Tier; but we would need to dig deeper to find out which data subsets have been accessed frequently and at what times; this is a difficult task. However, we can make estimations using the Bucket’s CloudWatch Storage Metrics Graph.
Objects uploaded to S3 on 1st March would move to S3 IT on 31st March and remain there till 29th May. Similarly, data uploaded to S3 on other days would move to S3 IT after 30 days, so those transition dates fall in April. For the days in April before its transition date, such data would be in S3 Standard; from the transition date onwards, it would be in S3 IT and remain there for a total of 60 days. Finally, data uploaded to S3 on 31st March would be in S3 Standard from 1st April to 29th April, move to S3 IT on 30th April, and remain there till 29th June.
Transition Cost
(C) Thus, the cost of Transition to S3 IT on 31st March is 10,000,000 * $0.01 / 1000, which works out to $100
Storage Cost
(D) Thus, the cost of data storage for 31st March is 10240GB * 1 day / 31 days per month * $0.025 per GB, which equates to (10240 * 0.025) / 31, totaling $8.258
Monitoring and Automation Cost
(E) Monitoring and Automation Cost for March = 10,000,000 * $0.0025 / 1000 = $25
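Putting together the three 31st-March Intelligent Tier charges (C), (D), and (E), following the article's figures (with the monitoring fee charged at the flat monthly rate, as in (E)):

```python
objects = 10_000_000
transition_cost = objects * 0.01 / 1000    # (C) lifecycle transition requests
storage_cost = 10240 * (1 / 31) * 0.025    # (D) 1 day of 10 TB, prorated
monitoring_cost = objects * 0.0025 / 1000  # (E) monitoring and automation

total = transition_cost + storage_cost + monitoring_cost
print(f"Intelligent Tier charges for 31st March: ${total:.2f}")
```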
Storage Cost
Since the batch uploaded on 2nd March leaves S3 Standard on 1st April, the batches uploaded from 3rd to 31st March spend 1 to 29 days of April in S3 Standard. So, the actual calculations are as follows:
(10TB * 1 day / 30 days per month) + (10TB * 2 days / 30 days per month) + … + (10TB * 29 days / 30 days per month)
= 10 * (1+2+…+29) / 30 TB-Month
= 10 * 435 / 30 TB-Month
= 145 TB-Month
(F) The cost of storage for the Standard Tier in April = ((50 * 1024) GB * $0.025 per GB) + ((95 * 1024) GB * $0.024 per GB), which is $3,614.72
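The same loop approach reproduces April's 145 TB-Month in S3 Standard:

```python
daily_upload_gb = 10 * 1024
days_in_april = 30

# the batch uploaded on March d spends max(0, d - 2) days of April in
# S3 Standard before its 30-day lifecycle transition to S3 IT
gb_days = sum(daily_upload_gb * max(0, d - 2) for d in range(1, 32))
gb_month = gb_days / days_in_april              # 145 TB-Month

first_tier_gb = min(gb_month, 50 * 1024)
cost = first_tier_gb * 0.025 + (gb_month - first_tier_gb) * 0.024
print(f"{gb_month / 1024:.0f} TB-Month, ${cost:.2f}")
```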
The objects uploaded on 1st March transitioned to the S3 Intelligent Tier on 31st March; these objects will be in the S3 Intelligent Tier (S3 IT) in April from 1st April to 29th April in the Frequent access Tier; then in the Infrequent access Tier on 30th April. Objects uploaded from 2nd March to 31st March will get transitioned to S3 Intelligent Tier (S3 IT) in April from 1st April till 30th April and remain there accordingly for 60 days.
Transition Cost
(G) So, the cost for Lifecycle Transition in April is 300,000,000 * $0.01 / 1000, which amounts to $3000
Storage Cost
(10TB * 29 days / 30 days per month) + (10TB * 30 days / 30 days per month) + (10TB * (30-1) days / 30 days per month) + (10TB * (30-2) days / 30 days per month) + … + (10TB * (30-29) days / 30 days per month) (Frequent access)
+ (10TB * 1 day / 30 days per month) (Infrequent access)
This equates to 10 * (29 + 30 + (1+2+…+29)) / 30 TB-Month (Frequent access) + 10 * 1 / 30 TB-Month (Infrequent access), that is, 164.67 TB-Month (Frequent access) and 0.33 TB-Month (Infrequent access)
(H) Thus, the cost for S3 IT storage = (50 * 1024 * $0.025) + (114.67 * 1024 * $0.024) + (1/3 * 1024 * $0.019), which works out to $4104.533
Monitoring and Automation Cost
(I) So, the cost of S3 Intelligent Tier Management is ($0.0025 / 1000) * (
(10Million * 29 days / 30 days per month)
+ (10Million * 30 days / 30 days per month)
+ (10Million * (30-1) days / 30 days per month)
+ (10Million * (30-2) days / 30 days per month)
+ (10Million * (30-3) days / 30 days per month)
+ … + (10Million * (30-29) days / 30 days per month) (Frequent access)
+ (10Million * 1 day / 30 days per month)) (Infrequent access)
Hence, the actual calculation is ($0.0025 / 1000) * (10 * (29 + 30 + (1+2+…+29) + 1) / 30) MillionObject-Month = ($0.0025 / 1000) * 165 MillionObject-Month, totaling $412.50
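The April object-month sum works out to 165 million object-months, which the monitoring rate of $0.0025 per 1,000 objects turns into dollars:

```python
m_objects = 10_000_000
days_in_april = 30

# frequent-access object-days per batch: Mar 1 -> 29, Mar 2 -> 30,
# Mar 3..Mar 31 -> 29 down to 1; plus 1 infrequent day (Mar 1 batch on Apr 30)
frequent_days = 29 + 30 + sum(range(1, 30))
infrequent_days = 1
object_months = m_objects * (frequent_days + infrequent_days) / days_in_april

cost = object_months * 0.0025 / 1000
print(f"{object_months / 1e6:.0f} million object-months, ${cost:.2f}")
```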
Transition Cost
Since there is no transition from the S3 standard to the S3 Intelligent Tier in May, and S3 Intelligent Tier’s sub-Tiers (i.e., Frequent/Infrequent Access Tiers) don’t count for Transition, there is zero cost of Lifecycle transition this month.
Storage Cost
For Frequent Access Tier,
(J) So, the cost for S3 IT storage (Frequent access) = (50 * 1024 * $0.025) + (90.322 * 1024 * $0.024), which is $3499.753472 For Infrequent Access Tier,
(K) The cost for S3 IT storage (Infrequent access) is then 168.7096 * 1024 * $0.019, totaling $3282.41398
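Both May Intelligent Tier storage figures follow the same pattern of prorating each daily batch:

```python
gb = 10 * 1024
days_in_may = 31

# frequent: batches from Mar 3..Mar 31 spend 1..29 days of May
# in the Frequent Access Tier
frequent_gb_month = gb * sum(range(1, 30)) / days_in_may

# infrequent: Mar 1 batch 29 days, Mar 2 batch 30 days,
# Mar 3..Mar 31 batches 30 down to 2 days
infrequent_gb_month = gb * (29 + 30 + sum(31 - d for d in range(1, 30))) / days_in_may

first_tier_gb = min(frequent_gb_month, 50 * 1024)
frequent_cost = first_tier_gb * 0.025 + (frequent_gb_month - first_tier_gb) * 0.024
infrequent_cost = infrequent_gb_month * 0.019
print(f"Frequent: ${frequent_cost:.2f}, Infrequent: ${infrequent_cost:.2f}")
```

These match (J) and (K) to within rounding of the TB-Month figures.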
Monitoring and Automation Cost
(L) The cost of S3 Intelligent Tier Management is ($0.0025 / 1000) * (
Frequent Access Tier: (10Million * 1 day / 31 days per month) + (10Million * 2 days / 31 days per month) + (10Million * 3 days / 31 days per month) + … + (10Million * 29 days / 31 days per month)
+ Infrequent Access Tier: (10Million * 29 days / 31 days per month) + (10Million * 30 days / 31 days per month) + (10Million * (31-1) days / 31 days per month) + (10Million * (31-2) days / 31 days per month) + … + (10Million * (31-29) days / 31 days per month))
This translates to ($0.0025 / 1000) * (10Million * (1+2+3+…+29) days / 31 days per month + 10Million * (29 + 30 + (29 * 31) − (1+2+3+…+29)) days / 31 days per month)
= ($0.0025 / 1000) * ((10 * (29 + 30 + (29 * 31)) / 31) MillionObject-Month)
= ($0.0025 / 1000) * (309.032258065 MillionObject-Month)
= $772.58
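At $0.0025 per 1,000 object-months (i.e., $2.50 per million object-months), the roughly 309 million object-months monitored in May cost about $772.58, which a few lines of Python confirm:

```python
m_objects = 10_000_000
days_in_may = 31

frequent = sum(range(1, 30))         # batches Mar 3..Mar 31: 1..29 frequent days
infrequent = 29 + 30 + sum(31 - d for d in range(1, 30))
object_months = m_objects * (frequent + infrequent) / days_in_may

cost = object_months * 0.0025 / 1000
print(f"{object_months / 1e6:.4f} million object-months, ${cost:.2f}")
```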
Objects uploaded to S3 will finally transition to the S3 Glacier Deep Archive Tier 90 days after creation and remain there for the remaining 30 days of their 120-day lifecycle before being completely deleted from S3. The batch uploaded on 1st March transitions to Deep Archive on 30th May, remains there on 30th and 31st May, and then stays in Deep Archive from 1st June for its remaining days. Similarly, the batch uploaded on 2nd March transitions to Deep Archive on 31st May, remains there on 31st May, and then stays in Deep Archive from 1st June onwards.
Transition Cost
(M) Thus, the cost of Transition to the Deep Archive in May is 2 * 10,000,000 * $0.07 / 1000 (for the 1st and 2nd March batches), that is, $1400
Storage Cost
The storage price for objects in Deep Archive is affected by 3 factors: the object data itself, charged at the Deep Archive rate ($0.002 per GB-Month); an overhead of 32 KB per object for index and related metadata, also charged at the Deep Archive rate; and an overhead of 8 KB per object stored in S3 Standard for the object's name and metadata, charged at the S3 Standard rate ($0.025 per GB-Month).
Calculations are as follows:
(N) Thus, the cost of Storage in Deep Archive in May = 1000.81212194 * $0.002 + 2.461095 * $0.025, totaling $2.06315
S3 IT Storage Cost in June
(O) Then, the cost for S3 IT storage (Infrequent access) is (10 * (1+2+3+…+28) / 30 * 1024 * $0.019), totaling $2633.04533
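The June Infrequent Access figure comes from the batches still waiting for their 90-day Deep Archive transition:

```python
gb = 10 * 1024
days_in_june = 30

# batches from Mar 4..Mar 31 spend 1..28 days of June in the
# Infrequent Access Tier before their 90-day transition to Deep Archive
gb_month = gb * sum(range(1, 29)) / days_in_june
cost = gb_month * 0.019
print(f"June S3 IT (Infrequent access) storage: ${cost:.2f}")
```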
Remember, objects will be transitioned to S3 Glacier Deep Archive after 90 days from creation and deleted after 120 days of creation.
The minimum storage duration in the Deep Archive is 180 days. So, any object in Deep Archive that is deleted before 180 days from the day of transition to Deep Archive will be charged as if it were stored for 180 days after transitioning there.
Here is information from AWS on S3 Pricing:
Objects that are archived to S3 Glacier and S3 Glacier Deep Archive have a minimum 90 days and 180 days of storage, respectively. Objects deleted before 90 days and 180 days incur a pro-rated charge equal to the storage charge for the remaining days. Objects that are deleted, overwritten, or transitioned to a different storage class before the minimum storage duration will incur the normal storage usage charge plus a pro-rated request charge for the remainder of the minimum storage duration.
This applies to the current scenario; thus, we need to calculate cost for 180 days in Deep Archive, instead of 30 days.
Calculations
Transition Cost
(P) So, the cost of Transition to Deep Archive in June is 29 * 10,000,000 * $0.07 / 1000, which works out to $20300. Again, the minimum Deep Archive storage period is 180 days.
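The June transition count covers the 29 remaining daily batches:

```python
batches = 29                 # batches uploaded 3rd..31st March transition in June
objects_per_batch = 10_000_000
transition_cost = batches * objects_per_batch * 0.07 / 1000
print(f"Deep Archive transition cost in June: ${transition_cost:.2f}")
```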
All data transitioned in between fills the intervening days in chronological order between these two extremes: the 1st March batch enters Deep Archive on 30th May, and the 31st March batch enters on 29th June.
Storage Cost
As stated earlier, there are 3 components for Deep Archive storage: the object data itself at the Deep Archive rate, a 32 KB per-object index/metadata overhead also at the Deep Archive rate, and an 8 KB per-object metadata overhead charged at the S3 Standard rate.
(For early deletion, I will be adding this component too. If anybody finds it incorrect, please provide your comment; I would be happy to correct.) In case of early deletion, as quoted above from the pricing page, the storage cost equals storage for 180 days from the date of transition to Deep Archive. There is also a pro-rated request charge for early-deletion cases, but I am not able to find the pricing unit for that, so I will skip such request charges. (I will try to figure it out and update later; if anybody has any information, please provide it in a comment below. I would be happy to update and appreciate your inputs.)
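A generic helper captures all three components; the 32 KB and 8 KB per-object overheads are consistent with the monthly figures calculated below (for example, the July–October plateau, when all 31 batches are fully resident):

```python
def deep_archive_month_cost(data_gb_days, object_days, days_in_month):
    """Deep Archive monthly cost from prorated GB-days and object-days.

    Three components: object data and a 32 KB per-object overhead at the
    Deep Archive rate ($0.002/GB-Month), plus an 8 KB per-object overhead
    at the S3 Standard rate ($0.025/GB-Month).
    """
    kb_to_gb = 1 / (1024 * 1024)
    da_gb_month = (data_gb_days + object_days * 32 * kb_to_gb) / days_in_month
    std_gb_month = object_days * 8 * kb_to_gb / days_in_month
    return da_gb_month * 0.002 + std_gb_month * 0.025

# plateau month: 310 TB of data and 310 million objects resident all 31 days
plateau = deep_archive_month_cost(310 * 1024 * 31, 310_000_000 * 31, 31)
print(f"Deep Archive cost in a plateau month: ${plateau:.2f}")
```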
Deep Archive Storage Costs in June
(Q) Thus, the total Costs of Storage in June are ((154.666666667 * 1024) + 4720.05208333) * $0.002 + (1180.01302083 * $0.025); this amounts to $355.697763
Deep Archive Storage Costs in July
(R) So, total Costs of Storage in July = (((310 * 1024) + 9460.44921875) * $0.002) + (2365.11230469 * $0.025), totaling $712.928706
Deep Archive Storage Costs in August
(S) So, the total Costs of Storage in August are calculated (((310 * 1024) + 9460.44921875) * $0.002) + (2365.11230469 * $0.025); this amounts to $712.928706
Deep Archive Storage Costs in September
(T) So, the total Cost of Storage in September is (((310 * 1024) + 9460.44921875) * $0.002) + (2365.11230469 * $0.025), totaling $712.928706
Deep Archive Storage Costs in October
(U) So, the total Cost of Storage in October is (((310 * 1024) + 9460.44921875) * $0.002) + (2365.11230469 * $0.025); the total for October is $712.928706
Deep Archive Storage Costs in November
AWS S3 is a highly available and fault-tolerant object storage service. However, its pricing is complex: there are many features and options that must be used judiciously, and improper usage can cause significant cost leakage. Contact us if you need any help in building a storage and archive solution that can scale.