Blog

Interesting Reads

We are a cloud-native services company with an eye on the latest trends in the cloud industry.



Blogs

Execute AWS Lambda Locally in PyCharm and Deploy to AWS
Well, as software developers we generally prefer to use tools of our own choice while coding. Priorities vary from developer to developer: some start directly in the terminal, while others work from an IDE. Development is required everywhere, and the earnestness of serverless technology can't be ignored; development plays a crucial role there too. AWS Lambda is a prime example. It's fun automating things with Lambda, but it becomes challenging when we need to do everything online via the AWS console. To get rid of that muddle, let's frame up an environment that gives us the same feel without compromising our preferences, i.e. coding from our favourite IDE on the local computer.

This recipe provides a modus operandi to set up and manage the PyCharm configuration in a local development environment, and shows how to deploy your first Python serverless function directly to AWS Lambda. Believe me, it's fun!

I am using macOS Catalina; the setup remains practically the same on all recent macOS versions, and instructions for other operating systems are also mentioned. Here are the steps to set up PyCharm locally:

1. Install the prerequisites.

Ensure that Python is installed on your system. If not, click here to download the Python version of your choice (preferably Python 3.x). Check that everything is in order using the command below from the terminal:

python --version
# Python 3.7.3

2. Install the AWS Serverless Application Model (SAM) command-line interface using brew (Windows users can try this).

Before installing brew there are a few prerequisites; to complete them, click here. Note: brew is not required on Windows or Linux. After a successful installation of brew, check the version:

brew --version

The output should look like this:

Homebrew 2.1.6
Homebrew/homebrew-core (git revision ef21; last commit 2019-06-19)

Let's install the SAM CLI now:

brew tap aws/tap
brew install aws-sam-cli

To upgrade the AWS SAM CLI later, you still use Homebrew, but replace install with upgrade as follows:

brew upgrade aws-sam-cli

3. Install Docker. Running a Lambda function locally needs a Docker container:

brew cask install docker
open /Applications/Docker.app

Windows and Linux users can follow this link to get Docker installed on their systems. As soon as Docker has launched, check that it is running:

docker ps

4. Once done with the above steps, install the IDE of your choice, in this case PyCharm (you can use another tool, such as Visual Studio, as well). Follow this link to install PyCharm and pick the package you need. Here we are installing the Community edition.

5. Install the AWS Toolkit plugin and set up the project structure. Navigate to PyCharm Preferences > Plugins > Marketplace, search for AWS Toolkit and install it. Note: the way to install a plugin on Windows and Linux may differ; you may click here to get it done on your system. Once done, restart PyCharm to get the plugin into action.

6. Set up the project in PyCharm. Create a new project from File > Create New Project and choose AWS Serverless Application as the project type.
Be sure to select the correct settings (you can choose a "Hello World" template from the SAM templates). The project setup is done automatically, and a function called lambda_handler, which returns a basic "Hello World" response, is created in app.py.

7. The directory structure will look like this. Note that there are two files of major concern for our purposes: i) app.py and ii) template.yaml. Open app.py and write a sample Lambda function. You may keep the default that was created, or simply copy-paste this:

import json

def lambda_handler(event, context):
    return {
        "statusCode": 200,
        "body": json.dumps({
            "message": "hello world",
        }),
    }

8. Now ensure that the correct AWS credentials are configured on your system. To install the AWS CLI on Mac, click here; on Windows, click here; on Linux, click here. Once installed, configure it with the credentials generated using the IAM service. To configure the AWS CLI, click here.

9. Configure PyCharm to connect to your AWS account. Click on AWS Connection and select the credentials along with the region. (Screenshot: Selecting AWS credentials)

10. Once done, select the JSON input to trigger the Lambda function. You may select the Hello World template or write your own input file. (Screenshot: Selecting the input JSON for the Lambda function)

11. Test the function locally now! Simply go to your function and click Run. (Screenshot: Run Lambda locally) It will take a while the first time, as the Docker container image is downloaded. Once done, something like this will appear. (Screenshot: Output after execution) (See also the plain-Python sanity check at the end of this article.)

12. You will need to create an AWS CloudFormation stack and an Amazon S3 bucket (if needed) in order to proceed with the deployment. To deploy your function to AWS Lambda, right-click on your project folder in the sidebar and choose Deploy Serverless Application. Give the CloudFormation stack a name and select the S3 bucket. The deployment will go live in a short while!

You have now successfully set up PyCharm with the AWS Toolkit to test your functions locally and subsequently deploy them live to AWS Lambda (via AWS CloudFormation).

Note: Feel free to reach us via a comment in case of any issue; we will solve it together.
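If you want a quick sanity check of the handler before involving Docker or SAM at all, you can also call it as plain Python. Below is a minimal sketch that assumes the app.py shown in step 7; the file name test_app_local.py and the sample event payload are illustrative placeholders, not an official AWS event shape.

```python
# test_app_local.py (hypothetical helper script, not part of the SAM template)
# Quick sanity check: call the handler from step 7 as an ordinary Python function.
import json

from app import lambda_handler  # app.py generated by the AWS Toolkit project


def main():
    sample_event = {"httpMethod": "GET", "path": "/hello"}  # placeholder event
    response = lambda_handler(sample_event, None)  # the handler ignores the context

    assert response["statusCode"] == 200
    body = json.loads(response["body"])
    print("Handler says:", body["message"])  # -> Handler says: hello world


if __name__ == "__main__":
    main()
```

This does not replace the Docker-based run in step 11, which mirrors the real Lambda runtime, but it is a fast way to catch syntax errors and broken imports before you deploy.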
Running NBFC Workload on AWS
Digital Transformation Challenges

Digital transformation in financial services is impossible without modernizing your core system. Amazon Web Services (AWS) is built to handle the complexity, rigor, and regulatory requirements unique to the financial services industry. By running your core system on AWS, you can access the agility and speed you need at any time, break down the silos that hold your data hostage, and drive innovation at the enterprise level, all while reducing IT costs. Integrate your most valuable data with the cloud to automate manual processes, improve customer experiences, and launch new market-facing applications more quickly. Axcess.io, an AWS Advanced partner, has deep industry expertise, solutions that align to AWS best practices, and AWS-certified staff.

AWS as the Trusted Choice for the FSI Industry

What is AWS? Amazon Web Services (AWS) is a cloud platform that provides a secure and resilient cloud infrastructure that financial services firms can use to innovate, build, and safely handle, process, and analyze sensitive financial information.

AWS is Highly Available
Globally, AWS has 76 Availability Zones, with three Availability Zones in India. AWS also delivers the highest network availability, with 7 times fewer downtime hours than the next largest cloud provider.

AWS is Easily Scalable
With AWS, companies don't have to provision resources to handle peak levels of activity. Instead, they can scale up or down based on their business needs and pay only for what they use.

AWS is Highly Secure
The AWS infrastructure is built to satisfy the security requirements of global banks and other high-sensitivity organizations and is monitored 24/7 to ensure the confidentiality, integrity, and availability of your data. AWS Security Hub works with solutions such as Amazon Macie, Amazon Inspector, and Amazon GuardDuty that not only protect your infrastructure and data but also perform compliance monitoring.

Effective and Efficient Infrastructure
The AWS cloud infrastructure is equipped to meet rising customer expectations and to contain, process, and analyze massive amounts of financial data without technical glitches, while remaining highly secure and easily adaptable.

Managing Compliance on AWS
That said, while operating on the cloud offers immense opportunities for business growth and the ability to process and analyze hundreds of terabytes of financial data in very little time, it also has its own list of setbacks. Migrating to a cloud infrastructure that doesn't address these setbacks could turn into a disaster for the finance sector. In addition, NBFCs are highly regulated: the RBI has published guidelines regarding the outsourcing of IT infrastructure. These requirements are stringent and have been designed to ensure business continuity in the event of a disaster or geopolitical problem. The good news is that AWS, through its local legal entity Amazon Internet Services Private Limited (AISPL), meets the compliance requirements from the RBI. We have published a couple of whitepapers in this regard:

Whitepaper - Running NBFC Workload on AWS

Please do not hesitate to reach us at ciso@axcess.io to discuss how we can help you with your cloud journey.
What is Cloud Managed Services?
Cloud management is the process of exercising administrative control over public, private, and hybrid cloud services. A well-implemented cloud strategy enables users to maintain control over these dynamic and scalable cloud computing environments.

Cloud Managed Services

Cloud managed services refers to outsourcing the daily IT management of cloud-based resources, together with technical support, to automate and enhance your business operations. These services provide skilled resources that improve in-house capabilities and allow the IT infrastructure to be managed in association with a third-party Managed Service Provider (MSP) via cloud platforms. Cloud managed services include managed network operations, managed security operations, cloud management, IT lifecycle management, and managed mobility applications.

The cloud managed services market has seen accelerated growth in recent years due to the increased usage of cloud computing, big data, mobility services, and more. It is estimated that cloud managed services will become a $120 billion business by the year 2020.

How can a Third-Party Managed Services Provider help?

Because IT infrastructure is now available as a service, many new-age, technology-product-based companies have adopted the cloud. They focus on their core products and partner with a third-party Managed Services Provider to help them maintain service availability and continuous business operations. An MSP helps in two broad categories:

24/7 Monitoring and Management
An MSP takes care of the continual monitoring and associated management of the IT infrastructure. Proactive monitoring minimizes downtime by analyzing key metrics on a single dashboard, performing trend analysis, and using machine learning algorithms to detect failures before they occur (a short illustrative script follows at the end of this article). An MSP offers a team of cloud engineers skilled at automating routine tasks, removing single points of failure, and assessing and auditing infrastructure vulnerabilities against industry best practices. They are experts in this field and understand the nuances of the business. Moreover, it is a cost-effective proposition considering the professional expertise brought to the table.

DevOps Automation
Cloud managed services providers simplify your release and deployment processes by automating them. Adopting the cloud and harnessing its true power should be the objective of IT leaders, and a Continuous Integration and Continuous Deployment (CI/CD) process is the path to get there. A good MSP has DevOps experts who help businesses plan and strategize their cloud roadmap, so that developers can collaborate better and deliver results faster.
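As a concrete flavour of the routine monitoring automation mentioned above, here is a minimal boto3 sketch that creates a CloudWatch CPU alarm for an EC2 instance. It is only an illustration of the kind of task an MSP typically scripts; the region, instance ID, SNS topic ARN, and threshold are placeholder assumptions.

```python
# Illustration only: one routine monitoring task an MSP might automate.
# The region, instance ID, SNS topic ARN, and threshold below are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")

cloudwatch.put_metric_alarm(
    AlarmName="web-server-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,                # evaluate in 5-minute windows
    EvaluationPeriods=3,       # three consecutive breaches before alarming
    Threshold=80.0,            # alert when average CPU stays above 80%
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:ap-south-1:123456789012:ops-alerts"],
)
```

In practice an MSP would template dozens of such alarms (CPU, disk, latency, error rates) and wire them into a single dashboard and escalation workflow.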
Transform Legacy Application using Amazon Q Developer (Transform)
The Shift Towards Assisted Coding

The world of software development is rapidly evolving, with developers increasingly relying on AI-powered tools to streamline workflows, reduce manual effort, and accelerate project delivery. Assisted coding tools are becoming indispensable in modern software engineering, enabling teams to tackle complex challenges with greater efficiency and precision.

In this era of digital transformation, legacy systems, particularly those built on older frameworks like ASP.NET, pose significant challenges. Modernizing these systems requires not only technical expertise but also innovative tools that can simplify the process. Enter Amazon Q, AWS's generative-AI-powered conversational assistant, which is designed not only to build cloud-native AWS applications but is also proving to be a game-changer in the modernization of legacy Windows workloads.

Strengths of Amazon Q in Modernization

Amazon Q stands out as a powerful tool for modernizing legacy systems, thanks to its ability to provide actionable insights, automate repetitive tasks, and guide developers through complex technical challenges. Its strengths lie in:
- Intelligent Recommendations: offering tailored suggestions for containerization, dependency management, and deployment strategies.
- Seamless Integration: working effortlessly with AWS services like EKS, making it an ideal choice for cloud migrations.
- Contextual Understanding: interpreting legacy codebases and providing solutions that align with the existing architecture.

These capabilities make Amazon Q an invaluable partner for teams looking to modernize legacy workloads without compromising on stability or performance.

Case Study: Modernizing Legacy ASP.NET Applications with Amazon Q

In a recent project, we partnered with a leading financial services provider to modernize their legacy ASP.NET applications by containerizing them and migrating them to AWS Elastic Kubernetes Service (EKS). Amazon Q played a pivotal role in overcoming the challenges associated with this transformation. Here are three key highlights of how Amazon Q contributed to the project's success.

Simplifying Containerization of Legacy ASP.NET Applications
Legacy ASP.NET applications, built on the .NET Framework, are not inherently designed for containerization. Amazon Q provided step-by-step guidance on how to containerize these applications effectively. As a first but vital step, it recommended using a Windows container image as the base image and ensured compatibility with Internet Information Services (IIS), a critical component of the legacy applications. By automating much of the containerization process, Amazon Q significantly reduced the time and effort required to prepare the applications for migration.

Resolving Complex Dependency Challenges
One of the major hurdles in the project was managing dependencies like the Oracle Client, which was essential for the application's functionality. Amazon Q provided detailed instructions on:
- installing the Oracle Client within the container;
- handling scenarios where both 32-bit and 64-bit Oracle Client versions were required;
- configuring IIS to accommodate these dependencies seamlessly.
This level of granular guidance ensured that the applications ran smoothly in the new containerized environment.

Streamlining Build and Deployment Pipelines
Amazon Q also played a crucial role in optimizing the build and deployment process. It helped the team determine whether the applications could be built using a standard pipeline or required additional steps.
For applications lacking .csproj and .sln files, Amazon Q suggested building them using Visual Studio, while the Dockerfile handled the publishing and deployment processes. This approach minimized manual intervention and ensured a smooth transition to AWS EKS.

Final Thoughts

Amazon Q and Q Transform (currently in preview) have emerged as go-to tools for modernizing legacy Windows workloads, as demonstrated by their pivotal role in the successful migration of legacy ASP.NET applications to AWS EKS. By simplifying containerization, resolving complex dependencies, and streamlining build pipelines, Amazon Q enabled us to deliver a robust and scalable solution. As organizations continue to embrace cloud technologies, tools like Amazon Q will play an increasingly critical role in bridging the gap between legacy systems and modern infrastructure.

Key Takeaways:
- Amazon Q is a powerful ally for modernizing legacy Windows workloads.
- Its intelligent recommendations and contextual understanding simplify complex challenges.
- By leveraging Amazon Q, teams can accelerate modernization efforts while maintaining stability and performance.
How to Understand and Evaluate S3 Costs over the Data Lifecycle
AWS Simple Storage Service, S3 for short, is a cheap, secure, durable, highly available object storage service for storing unlimited amounts of data. S3 is simple to use and scales infinitely in terms of data storage; this is true at least in theory, and remarkably nobody has ever complained about hitting its limits, nor are they likely to in the foreseeable future. In S3, data is stored in a highly redundant manner across multiple devices in a minimum of three Availability Zones. S3 is flexible enough to store any type of data; it is a key-based object store. S3 is built on a global infrastructure, and using it is as simple as making standard HTTPS calls.

With its ever-growing popularity and ease of use, it provides many features and is still growing and evolving to enable a large number of data storage and management use cases across the industry, big or small. But there is something that has everyone worried, namely S3 usage costs. Costs are not a matter of worry on their own, given that S3 is indeed quite an inexpensive storage option on the cloud. What bothers people is the lack of visibility into the various factors that affect cost calculations; these factors are too detailed to be taken lightly. In this article, we will take a scenario, neither overly simple nor very complex, as an example to calculate usage costs, which hopefully will make us aware of the fine details.

AWS S3 Pricing

S3 supports an incredibly wide range of data storage and management scenarios, and a large number of technical features and usage criteria carry attendant costs. Given all these features, it becomes quite challenging to visualize and evaluate how costs add up. Typically, the cost of S3 is calculated based on data storage, S3 API usage (that is, requests and data retrievals), network data transfer, and the other S3 data management and operations features (management and replication). There are different data storage classes in S3, and storage pricing differs between them; pricing also varies for the various API calls for data usage such as uploads, listing, downloads, or retrievals. S3 pricing is listed at https://aws.amazon.com/s3/pricing/. We will refer to this page for all calculations.

Data Scenario and Assumptions

We have our S3 bucket in the Mumbai region. Suppose we upload 10240 GB (10 TB) of data per day, every day until 31st March 2020, with 10 million objects corresponding to each day's 10 TB. This data is uploaded to the S3 bucket from application servers into the Standard storage class, using S3 Object PUT requests, starting 1st March 2020. We will assume that each object is larger than the S3 Intelligent Tier minimum object size criterion, i.e., 128 KB. We will also assume that objects in the S3 Intelligent Tier sit in the Frequent Access Tier for the first 30 days and in the Infrequent Access Tier for the next 30 days. The next assumption is that new uploads on each day occur at 00:00 and that the new data uses the full 24 hours on the day of the upload. In addition, we assume that this is a single AWS account billed individually, that it is not part of any AWS Organization with other member accounts, and that the free tier coverage is over. (The short sketch below captures the scenario and the GB-month proration used throughout the calculations.)
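To keep these assumptions easy to follow, here is a minimal Python sketch of the scenario constants and of the GB-month proration convention used in every calculation that follows. It is just a reading aid, not an official cost tool; the helper name gb_month is ours.

```python
# Scenario constants and the GB-month proration used in the calculations below.
DAILY_UPLOAD_GB = 10 * 1024        # 10 TB uploaded each day
OBJECTS_PER_DAY = 10_000_000       # 10 million objects per day


def gb_month(size_gb: float, days_stored: int, days_in_month: int) -> float:
    """Data kept for part of a month bills as a fraction of a GB-month."""
    return size_gb * days_stored / days_in_month


# Example: the batch uploaded on 1st March stays in S3 Standard for 30 of March's 31 days.
print(round(gb_month(DAILY_UPLOAD_GB, 30, 31), 3))   # 9909.677 GB-month for that one batch
```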
We will use the S3 pricing page for all the price rates and units, for calculations pertaining to the Mumbai region.

The S3 Lifecycle Transition: moving data to reduce costs

We can create lifecycle transitions to move data and ensure lower storage cost. Here, we will create a lifecycle configuration with the following rules (a boto3 sketch of this configuration appears at the end of this article):
- Keep data in S3 Standard for 30 days after uploading.
- After 30 days from creation, move the data to the S3 Intelligent Tier.
- After 90 days from creation, move the data to the Glacier Deep Archive Tier.
- After 120 days from creation, delete the data.
Now, without further theoretical talk, let's begin the journey of cost calculations.

The S3 Standard Cost for March

Cost of Uploads
We calculate the price of uploads, i.e., PUT API calls, as shown below:
PUT requests in S3 Standard cost $0.005 per 1000 requests.
(A) So, the cost of total PUTs = 10,000,000 * $0.005 / 1000 * 31 = $1550

Storage Cost
Now, we will calculate the price for the total data stored in S3 Standard for the month of 1st March to 31st March. We calculate data in terms of GB-Month only, because data is moving to other tiers and each object adds to the monthly price only for the amount of time it is stored in S3 Standard during the month. Accordingly, the price for data storage for this one-month period is as follows:
Total number of days of storage = 31 days
Price of data storage in S3 Standard for the first 50 TB per month = $0.025 per GB
Price of data storage in S3 Standard for the next 450 TB per month = $0.024 per GB
Since data is changing every day, we will calculate the effective storage for March in terms of GB-Month. This figure represents, on a cumulative basis, how many GB-months the changing data size amounted to for the month of March. Note that any new data uploaded to S3 Standard remains there for 30 days. So the calculation for March is as follows:
Total GB-Month = (data uploaded on 1st March, stored until 30th March) + (data uploaded on 2nd March, stored until 31st March) + (data uploaded on 3rd March, counted for its GB-Month until 31st March) + … + (data uploaded on 31st March)
Hence, total GB-Month for March = (10240 GB * 30 days / 31 days per month) + (10240 GB * (31-1) days / 31 days per month) + (10240 GB * (31-2) days / 31 days per month) + … + (10240 GB * (31-30) days / 31 days per month) = (10240 GB / 31 days per month) * (30 + (31-1) + (31-2) + … + (31-30)) days
This works out to (10240/31) * ((31*31) - (1+1+2+…+30)) GB-Month, further simplified as (10240 * (961 - 466) / 31) GB-Month = (10240 * 495 / 31) GB-Month.
The final figure amounts to 163509.677 GB-Month, or 159.677 TB-Month.
(B) The cost of storage for the month of March is 50 TB-Month at $0.025 per GB + 109.677 TB-Month at $0.024 per GB = 50 * 1024 * $0.025 + 109.677 * 1024 * $0.024, amounting to $3975.42

Transition to Intelligent Tier
After 30 days from creation, data is moved to the S3 Intelligent Tier and remains there for the next 60 days. For the first 30 days it is in the Frequent Access Tier; then it moves to the Infrequent Access Tier for the remaining days. For simplicity, we assume that there is no access while data is stored in S3 IT. If some data were accessed frequently beyond the first 30 days in S3 IT, that data would remain in the Frequent Access Tier; but we would need to dig deeper to find out which data subsets have been accessed frequently and at what times, and this is a difficult task.
However, we can make estimations using the bucket's CloudWatch storage metrics graph.
- Objects uploaded to S3 on 1st March would move to S3 IT on 31st March and remain there till 29th May.
- Objects uploaded to S3 on 2nd March would move to S3 IT on 1st April and remain there till 30th May.
- Objects uploaded to S3 on 3rd March would remain in S3 Standard on 1st April; then they move to S3 IT on 2nd April and remain there till 31st May.
- Objects uploaded to S3 on 4th March would remain in S3 Standard from 1st to 2nd April; these objects then move to S3 IT on 3rd April and remain there till 1st June.
Similarly, data uploaded to S3 on other days would move to S3 IT after 30 days. Hence, the date of this transition falls somewhere in April. For the days in April before that date, the data is in S3 Standard; from that date onwards, the data is in S3 IT, where it remains for a total of 60 days. Finally, data uploaded to S3 on 31st March would be in S3 Standard from 1st April to 29th April, then move to S3 IT on 30th April and remain there till 29th June.

The S3 Intelligent Tier Cost for March

Objects uploaded to S3 on 1st March would move to S3 IT on 31st March and remain there till 29th May.

Transition Cost
The lifecycle transition price to S3 IT is $0.01 per 1000 lifecycle transition requests (objects).
(C) Thus, the cost of transition to S3 IT on 31st March is 10,000,000 * $0.01 / 1000, which works out to $100

Storage Cost
The price for storage in the Frequent Access Tier, first 50 TB / month, is $0.025 per GB.
(D) Thus, the cost of data storage for 31st March is 10240 GB * 1 day / 31 days per month * $0.025 per GB, which equates to (10240 * 0.025) / 31, totaling $8.258

Monitoring and Automation Cost
Monitoring and automation, all storage / month, is $0.0025 per 1,000 objects.
(E) Monitoring and automation cost for March = 10,000,000 * $0.0025 / 1000 = $25

The S3 Standard Tier Cost for April

Storage Cost
The total GB-Month in April is 10 TB of data on 1st April for the 3rd March upload + 10 TB of data on 1st and 2nd April for the 4th March upload + … + 10 TB of data from 1st to 29th April for the 31st March upload.
So the actual calculation is: (10TB * 1 day / 30 days per month) + (10TB * 2 days / 30 days per month) + … + (10TB * 29 days / 30 days per month)
This equates to 10 * (1+2+…+29) / 30 TB-Month = 10 * 435 / 30 TB-Month, totaling 145 TB-Month
(F) The cost of storage for the Standard Tier in April = ((50 * 1024) GB * $0.025 per GB) + ((95 * 1024) GB * $0.024 per GB), which is $3614.72

S3 Intelligent Tier Cost for April

The objects uploaded on 1st March transitioned to the S3 Intelligent Tier on 31st March; these objects will be in the S3 Intelligent Tier (S3 IT) in April from 1st April to 29th April in the Frequent Access Tier, then in the Infrequent Access Tier on 30th April.
Objects uploaded from 2nd March to 31st March will get transitioned to the S3 Intelligent Tier (S3 IT) in April, from 1st April till 30th April, and remain there accordingly for 60 days.

Transition Cost
The lifecycle transition price to S3 IT is $0.01 per 1000 lifecycle transition requests (objects).
Total objects transitioned to S3 IT in April are 10,000,000 * 30, that is, 300,000,000.
(G) So, the cost for lifecycle transitions in April is 300,000,000 * $0.01 / 1000, which amounts to $3000

Storage Cost
The total GB-Month of data in S3 IT for the month of April is ((10TB * 29 days / 30 days per month) + (10TB * 30 days / 30 days per month) + (10TB * (30-1) days / 30 days per month) + (10TB * (30-2) days / 30 days per month) + … + (10TB * (30-29) days / 30 days per month)) (Frequent Access) + (10TB * 1 day / 30 days per month) (Infrequent Access).
This translates to 10TB * (29 + 30 + (30-1) + (30-2) + … + (30-29)) days / 30 days per month + (10TB * 1 day / 30 days per month);
that is, 10 * (29 + (30*30) - (1+2+…+29)) / 30 TB-Month = 164.67 TB-Month (Frequent Access) + 1/3 TB-Month (Infrequent Access)
(H) Thus, the cost of S3 IT storage = (50 * 1024 * $0.025) + (114.67 * 1024 * $0.024) + (1/3 * 1024 * $0.019), which works out to $4104.533

Monitoring and Automation Cost
Monitoring and automation, all storage / month, is $0.0025 per 1,000 objects.
(I) So, the cost of S3 Intelligent Tier management is ($0.0025 / 1000) * (((10 Million * 29 days / 30 days per month) + (10 Million * 30 days / 30 days per month) + (10 Million * (30-1) days / 30 days per month) + (10 Million * (30-2) days / 30 days per month) + (10 Million * (30-3) days / 30 days per month) + … + (10 Million * (30-29) days / 30 days per month)) (Frequent Access) + (10 Million * 1 day / 30 days per month) (Infrequent Access)).
Hence, the actual calculation is ($0.0025 / 1000) * (10 Million * (29 + 30 + (30-1) + (30-2) + (30-3) + … + (30-29) + 1) days / 30 days per month),
which translates to ($0.0025 / 1000) * (10 Million * (30 + 30 + (30-1) + (30-2) + (30-3) + … + (30-29)) days / 30 days per month);
that is, ($0.0025 / 1000) * (10 Million * 495 days / 30 days per month) = ($0.0025 / 1000) * 165 Million object-months, amounting to $412.50

S3 Intelligent Tier Cost for May

Objects uploaded on 1st March will get transitioned to the S3 Intelligent Tier (S3 IT) Infrequent Access Tier on 30th April; they will remain there for 30 days, i.e., on 30th April and from 1st May until 29th May. Objects uploaded from 2nd March to 31st March will get transitioned to the S3 Intelligent Tier (S3 IT) in April, from 1st April till 30th April. They will be in the Frequent Access Tier for 30 days.
Then these objects will be automatically moved to the S3 Intelligent Tier (S3 IT) Infrequent Access Tier for another 30 days.
- An object transitioned to S3 IT on 1st April will be in the Infrequent Access Tier from 1st May to 30th May.
- An object transitioned to S3 IT on 2nd April will be in the Frequent Access Tier on 1st May. Then it will be in the Infrequent Access Tier from 2nd May to 31st May.
- An object transitioned to S3 IT on 3rd April will be in the Frequent Access Tier on 1st and 2nd May. Then it will be in the Infrequent Access Tier from 3rd May to 31st May and 1st June.
- Similarly, transitions will occur for data transitioned on other dates.
- An object transitioned to S3 IT on 30th April will be in the Frequent Access Tier from 1st May up to 29th May. Then it will be in the Infrequent Access Tier from 30th May to 31st May and from 1st June to 28th June.

Transition Cost
Since there is no transition from S3 Standard to the S3 Intelligent Tier in May, and the S3 Intelligent Tier's sub-tiers (i.e., Frequent/Infrequent Access Tiers) don't count as transitions, the cost of lifecycle transitions this month is zero.

Storage Cost
For the Frequent Access Tier,
the total GB-Month of data in S3 IT for the month of May = (10TB * 1 day / 31 days per month) + (10TB * 2 days / 31 days per month) + (10TB * 3 days / 31 days per month) + … + (10TB * 29 days / 31 days per month) (Frequent Access);
this translates to 10TB * (1+2+3+…+29) days / 31 days per month;
that is, 10 * (1+2+3+…+29) / 31 TB-Month, resulting in 140.322 TB-Month
(J) So, the cost of S3 IT storage (Frequent Access) = (50 * 1024 * $0.025) + (90.322 * 1024 * $0.024), which is $3499.753472

For the Infrequent Access Tier,
the total GB-Month of data in S3 IT for the month of May is (10TB * 29 days / 31 days per month) + (10TB * 30 days / 31 days per month) + (10TB * (31-1) days / 31 days per month) + (10TB * (31-2) days / 31 days per month) + … + (10TB * (31-29) days / 31 days per month) (Infrequent Access);
this would be 10TB * (29 + 30) days / 31 days per month + 10TB * (31*29 - (1+2+3+…+29)) days / 31 days per month;
which equates to 10 * (29 + 30 + 31*29 - (1+2+3+…+29)) / 31 TB-Month, i.e., 168.7096 TB-Month
(K) The cost of S3 IT storage (Infrequent Access) is then 168.7096 * 1024 * $0.019, totaling $3282.41398

Monitoring and Automation Cost
Monitoring and automation, all storage / month, is $0.0025 per 1,000 objects.
(L) The cost of S3 Intelligent Tier management is ($0.0025 / 1000) * (Frequent Access Tier: (10 Million * 1 day / 31 days per month) + (10 Million * 2 days / 31 days per month) + (10 Million * 3 days / 31 days per month) + … + (10 Million * 29 days / 31 days per month) + Infrequent Access Tier: (10 Million * 29 days / 31 days per month) + (10 Million * 30 days / 31 days per month) + (10 Million * (31-1) days / 31 days per month) + (10 Million * (31-2) days / 31 days per month) + … + (10 Million * (31-29) days / 31 days per month));
this would translate to ($0.0025 / 1000) * (10 Million * (1+2+3+…+29) days / 31 days per month + 10 Million * (29+30+(31-1)+(31-2)+…+(31-29)) days / 31 days per month);
that is, ($0.0025 / 1000) * ((10 * ((1+2+3+…+29) + (29 + 30 + (29 * 31) - (1+2+3+…+29))) / 31) Million object-months);
further, ($0.0025 / 1000) * ((10 * (29 + 30 + (29 * 31)) / 31) Million object-months);
thus, ($0.0025 / 1000) * (309.032258 Million object-months), amounting to $772.58

S3 Glacier Deep Archive Tier Cost for May

Objects uploaded to S3 finally get transitioned to the S3 Glacier Deep Archive Tier after 90 days from creation and remain there for the remaining period of 30 days before being completely deleted from S3. The object uploaded on 1st March will get transitioned to Deep Archive on 30th May and remain there on 30th and 31st May; subsequently, it will remain in Deep Archive from 1st June for the remaining 30 days. Similarly, the object uploaded on 2nd March will get transitioned to Deep Archive on 31st May and remain there on 31st May; then, from 1st June, it will remain in Deep Archive for 30 days.

Transition Cost
The price of lifecycle transition requests to Deep Archive is $0.07 per 1000 requests.
(M) Thus, the cost of transition to Deep Archive in May is 2 * 10,000,000 * $0.07 / 1000, that is, $1400

Storage Cost
The storage price for objects in Deep Archive is affected by 3 factors:
- the total data size stored;
- the total S3 Standard index size, at 8 KB per object;
- the total S3 Glacier Deep Archive index size, at 32 KB per object.
The calculations are as follows:
The total object count transitioned to Deep Archive in May is 20,000,000, so the total data size = 2 * 10TB.
The total GB-Month of data size in May is (10TB * 2 days / 31 days per month) + (10TB * 1 day / 31 days per month) + (32KB * 10,000,000 * 2 days / 31 days per month) + (8KB * 10,000,000 * 1 day / 31 days per month) (S3 Standard);
that is, (990.967741935 GB-Month + 9.84438 GB-Month) (Deep Archive) + (2.461095 GB-Month) (S3 Standard);
so, 1000.81212194 GB-Month (Deep Archive) + 2.461095 GB-Month (S3 Standard)
The Deep Archive storage price, all storage / month, is $0.002 per GB.
The S3 Standard price, first 50 TB / month, is $0.025 per GB.
(N) Thus, the cost of storage in Deep Archive in May = 1000.81212194 * $0.002 + 2.461095 * $0.025, totaling $2.06315

S3 Intelligent Tier Cost for June

- An object transitioned to S3 IT on 2nd April will be in the Frequent Access Tier up to 1st May, then in the Infrequent Access Tier from 2nd May to 31st May.
- An object transitioned to S3 IT on 3rd April will be in the Frequent Access Tier up to 2nd May, then in the Infrequent Access Tier from 3rd May to 31st May and 1st June.
- Similarly, an object transitioned to S3 IT on 4th April will be in the Frequent Access Tier on 1st, 2nd and 3rd May, then in the Infrequent Access Tier from 4th May to 31st May and then on 1st and 2nd June.
- These considerations apply similarly to objects transitioned on later dates.
- An object transitioned to S3 IT on 30th April will be in the Frequent Access Tier from 1st May up to 29th May, then in the Infrequent Access Tier from 30th May to 31st May and subsequently from 1st June to 28th June.

Storage Cost
The total TB-Month in June can be calculated as (10TB * 1 day / 30 days per month) + (10TB * 2 days / 30 days per month) + (10TB * 3 days / 30 days per month) + … + (10TB * 28 days / 30 days per month);
which is 10 * (1+2+3+…+28) / 30 TB-Month, totaling 135.33333 TB-Month
(O) Then, the cost of S3 IT storage (Infrequent Access) is (10 * (1+2+3+…+28) / 30 * 1024 * $0.019), totaling $2633.04533

S3 Glacier Deep Archive Tier Cost for the remaining period

Remember, objects are transitioned to S3 Glacier Deep Archive after 90 days from creation and deleted after 120 days from creation.
- The price of lifecycle transition requests is $0.07 per 1000 requests.
- The price of lifecycle (expiration) requests for deletion is $0.
- The Deep Archive storage price, all storage / month, is $0.002 per GB.
- The S3 Standard price is $0.025 per GB for the first 50 TB / month and $0.024 per GB for the next 450 TB / month.
The minimum storage duration in Deep Archive is 180 days.
So, any object in Deep Archive that is deleted before 180 days from the day of transition to Deep Archive is charged as if it were stored for 180 days after transitioning there. Here is the relevant information from AWS on S3 pricing: "Objects that are archived to S3 Glacier and S3 Glacier Deep Archive have a minimum of 90 days and 180 days of storage, respectively. Objects deleted before 90 days and 180 days incur a pro-rated charge equal to the storage charge for the remaining days. Objects that are deleted, overwritten, or transitioned to a different storage class before the minimum storage duration will incur the normal storage usage charge plus a pro-rated request charge for the remainder of the minimum storage duration." This applies to the current scenario; thus, we need to calculate the Deep Archive cost for 180 days instead of 30 days.

Calculations
- The object transitioned to Deep Archive on 30th May will remain in Deep Archive in June from 1st up to 28th June.
- The object transitioned to Deep Archive on 31st May will remain in Deep Archive in June from 1st up to 29th June.
- The object transitioned to the S3 IT Infrequent Access Tier on 2nd May will be transitioned to Deep Archive on 1st June; then it will remain there up to 30th June.
- The object transitioned to the S3 IT Infrequent Access Tier on 3rd May will be transitioned to Deep Archive on 2nd June; then it will remain there up to 30th June and on 1st July.
- This logic applies similarly to later objects.
- The object transitioned to the S3 IT Infrequent Access Tier on 30th May will be transitioned to Deep Archive after 28th June; then it will remain there on 29th and 30th June and from 1st July to 28th July.

Transition Cost
The price of lifecycle transition requests is $0.07 per 1000 requests.
(P) So, the cost of transition to Deep Archive in June is 29 * 10,000,000 * $0.07 / 1000, which works out to $20300

Again, the minimum Deep Archive storage period is 180 days.
- The object uploaded to S3 on 1st March transitioned to Deep Archive on 30th May. The month-wise days it should have been in Deep Archive, if not deleted early, would be as follows: 30th and 31st May; the full months of June, July, August, September, and October; then the days from 1st November until 25th November.
- The object uploaded to S3 on 31st March transitioned to Deep Archive on 29th June. The month-wise days it should have been in Deep Archive, if not deleted early, would be: 29th and 30th June; the full months of July, August, September, October, and November; then the days from 1st December up to 25th December.
- All data transitioned in between fills up the months and days, in chronological order, between these two extremes.

Storage Cost
As stated earlier, there are 3 components of Deep Archive storage:
- the total data size stored;
- a total S3 Standard index size of 8 KB per object (for early deletion, I will be adding this component too; if anybody finds it incorrect, please provide a comment and I would be happy to correct it);
- a total S3 Glacier Deep Archive index size of 32 KB per object (for early deletion, I will be adding this component too; if anybody finds it incorrect, please provide a comment and I would be happy to correct it).
In the case of early deletion, as quoted above from the pricing page, the storage cost is equal to storage for 180 days from the date of transition to Deep Archive. There is also a pro-rated request charge for early-deletion cases, but I am not able to find the pricing unit for that, so I will be skipping such request charges.
(I will try to figure it out and update later; if anybody has any information, please provide it in a comment below. I would be happy to update and would appreciate your inputs.)

Deep Archive Storage Costs in June
The total TB-Month for data objects is calculated as (10TB * 30 days / 30 days per month) + (10TB * (30-1) days / 30 days per month) + (10TB * (30-2) days / 30 days per month) + … + (10TB * (30-27) days / 30 days per month) + (10TB * (30-28) days / 30 days per month);
this equates to 10TB * ((30-0) + (30-1) + (30-2) + … + (30-27) + (30-28)) days / 30 days per month;
that is, 10TB * (30*29 - (1+2+…+28)) days / 30 days per month = 10 * ((30*29) - (14*29)) / 30 TB-Month;
which amounts to 154.666666667 TB-Month
The total GB-Month for Deep Archive index objects is (32KB * 10,000,000 * 30 days / 30 days per month) + (32KB * 10,000,000 * (30-1) days / 30 days per month) + (32KB * 10,000,000 * (30-2) days / 30 days per month) + … + (32KB * 10,000,000 * (30-27) days / 30 days per month) + (32KB * 10,000,000 * (30-28) days / 30 days per month);
so, 32KB * 10,000,000 * ((30*29) - (14*29)) / 30 KB-Month, which amounts to 4720.05208333 GB-Month
The total GB-Month for S3 index objects is (8KB * 10,000,000 * 30 days / 30 days per month) + (8KB * 10,000,000 * (30-1) days / 30 days per month) + (8KB * 10,000,000 * (30-2) days / 30 days per month) + … + (8KB * 10,000,000 * (30-27) days / 30 days per month) + (8KB * 10,000,000 * (30-28) days / 30 days per month);
so, 8KB * 10,000,000 * ((30*29) - (14*29)) / 30 KB-Month, totaling 1180.01302083 GB-Month
(Q) Thus, the total cost of storage in June is ((154.666666667 * 1024) + 4720.05208333) * $0.002 + (1180.01302083 * $0.025); this amounts to $355.697763

Deep Archive Storage Costs in July
The total TB-Month for data objects is 31 * (10TB * 31 days / 31 days per month), amounting to 310 TB-Month
The total GB-Month for Deep Archive index objects is 31 * (32KB * 10,000,000 * 31 days / 31 days per month), that is, 9460.44921875 GB-Month
The total GB-Month for S3 index objects is 31 * (8KB * 10,000,000 * 31 days / 31 days per month), which is 2365.11230469 GB-Month
(R) So, the total cost of storage in July = (((310 * 1024) + 9460.44921875) * $0.002) + (2365.11230469 * $0.025), totaling $712.928706

Deep Archive Storage Costs in August
The total TB-Month for data objects is 31 * (10TB * 31 days / 31 days per month), that is, 310 TB-Month
The total GB-Month for Deep Archive index objects is 31 * (32KB * 10,000,000 * 31 days / 31 days per month), which is 9460.44921875 GB-Month
The total GB-Month for S3 index objects is 31 * (8KB * 10,000,000 * 31 days / 31 days per month), amounting to 2365.11230469 GB-Month
(S) So, the total cost of storage in August is (((310 * 1024) + 9460.44921875) * $0.002) + (2365.11230469 * $0.025); this amounts to $712.928706

Deep Archive Storage Costs in September
The total TB-Month for data objects is 31 * (10TB * 30 days / 30 days per month), that is, 310 TB-Month
The total GB-Month for Deep Archive index objects is 31 * (32KB * 10,000,000 * 30 days / 30 days per month), which is 9460.44921875 GB-Month
The total GB-Month for S3 index objects is 31 * (8KB * 10,000,000 * 30 days / 30 days per month), which equals 2365.11230469 GB-Month
(T) So, the total cost of storage in September is (((310 * 1024) + 9460.44921875) * $0.002) + (2365.11230469 * $0.025), totaling $712.928706

Deep Archive Storage Costs in October
The total TB-Month for data objects is 31 * (10TB * 31 days / 31 days per month), that is, 310 TB-Month
The total GB-Month for Deep Archive index objects is 31 * (32KB * 10,000,000 * 31 days / 31 days per month), amounting to 9460.44921875 GB-Month
The total GB-Month for S3 index objects is 31 * (8KB * 10,000,000 * 31 days / 31 days per month), which is 2365.11230469 GB-Month
(U) So, the total cost of storage in October is (((310 * 1024) + 9460.44921875) * $0.002) + (2365.11230469 * $0.025); the total for October is $712.928706

Deep Archive Storage Costs in November
The total TB-Month for data objects = 31 * (10TB * 25 days / 30 days per month) + 30 * (10TB * 1 day / 30 days per month) + 29 * (10TB * 1 day / 30 days per month) + 28 * (10TB * 1 day / 30 days per month) + 27 * (10TB * 1 day / 30 days per month) + 26 * (10TB * 1 day / 30 days per month);
that is, 10 * ((31*25) + 30 + 29 + 28 + 27 + 26) / 30 TB-Month, which equals 305 TB-Month
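For readers who want to apply the lifecycle described at the beginning of these calculations to their own bucket, here is a minimal boto3 sketch of the rule (30 days to the Intelligent Tier, 90 days to Glacier Deep Archive, expiry at 120 days). The bucket name is a placeholder and the rule covers every object; remember from the pricing note above that expiring at 120 days still incurs Deep Archive's 180-day minimum storage charge.

```python
# Minimal sketch of the lifecycle rule assumed throughout the calculations above.
# The bucket name is a placeholder; the rule covers every object in the bucket.
import boto3

s3 = boto3.client("s3", region_name="ap-south-1")  # Mumbai region, as in the scenario

s3.put_bucket_lifecycle_configuration(
    Bucket="example-data-bucket",  # placeholder
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "standard-30d-it-90d-deep-archive-120d-expire",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to the whole bucket
                "Transitions": [
                    {"Days": 30, "StorageClass": "INTELLIGENT_TIERING"},
                    {"Days": 90, "StorageClass": "DEEP_ARCHIVE"},
                ],
                # Delete 120 days after creation; Deep Archive's 180-day minimum
                # still applies to the billing, as discussed above.
                "Expiration": {"Days": 120},
            }
        ]
    },
)
```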



