
Managing EKS Clusters Using AWS Lambda: A Step-by-Step Approach

By: Ragul.M
20 December 2024 at 12:20

Efficiently managing Amazon Elastic Kubernetes Service (EKS) clusters is critical for maintaining cost-effectiveness and performance. Automating the process of starting and stopping EKS clusters using AWS Lambda ensures optimal utilization and reduces manual intervention. Below is a structured approach to achieve this.

1. Define the Requirements

  • Identify the clusters that need automated start/stop operations.
  • Determine the dependencies among clusters, if any, to ensure smooth transitions.
  • Establish the scaling logic, such as leveraging tags to specify operational states (e.g., auto-start, auto-stop).

2. Prepare the Environment

  • AWS CLI Configuration: Ensure the AWS CLI is set up with appropriate credentials and access.
  • IAM Role for Lambda:
    • Create a role with permissions to manage EKS clusters (eks:DescribeCluster, eks:UpdateNodegroupConfig, etc.).
    • Include logging permissions for CloudWatch Logs to monitor the Lambda function execution.

3. Tag EKS Clusters

  • Use resource tagging to identify clusters for automation.
  • Example tags:
    • auto-start=true: Indicates clusters that should be started by the Lambda function.
    • dependency=<cluster-name>: Specifies any inter-cluster dependencies.

4. Design the Lambda Function

  • Trigger Setup:
    • Use Amazon EventBridge (formerly CloudWatch Events) scheduled rules (e.g., daily or weekly) to invoke the function.
  • Environment Variables: Configure the function with environment variables for managing cluster names and dependency details.
  • Scaling Configuration: Ensure the function dynamically retrieves scaling logic via tags to handle operational states.

5. Define the Workflow

  • Fetch Cluster Information: Use AWS APIs to retrieve cluster details, including their tags and states.
  • Check Dependencies:
    • Identify dependent clusters and validate their status before initiating operations on others.
  • Start/Stop Clusters:
    • EKS has no API to stop the control plane itself, so "stopping" a cluster in practice means scaling its node groups down to zero, and "starting" means restoring the desired node group size via UpdateNodegroupConfig.
  • Implement Logging and Alerts: Capture the execution details and errors in CloudWatch Logs.

(If you want my code, just comment "ease-py-code" on my blog and I'll share it. 🫢)

6. Test and Validate

  • Dry Runs: Perform simulations to ensure the function executes as expected without making actual changes.
  • Dependency Scenarios: Test different scenarios involving dependencies to validate the logic.
  • Error Handling: Verify retries and exception handling for potential API failures.

7. Deploy and Monitor

  • Deploy the Function: Once validated, deploy the Lambda function in the desired region.
  • Set Up Monitoring:
    • Use CloudWatch Metrics to monitor function executions and errors.
    • Configure alarms for failure scenarios to take corrective actions.

By automating the start and stop operations for EKS clusters, organizations can significantly enhance resource management and optimize costs. This approach provides scalability and ensures that inter-cluster dependencies are handled efficiently.

Follow for more and happy learning :)

How to Create a Lambda Function to Export IAM Users to S3 as a CSV File

By: Ragul.M
16 December 2024 at 15:36

Managing AWS resources efficiently often requires automation. One common task is exporting a list of IAM users into a CSV file for auditing or reporting purposes. AWS Lambda is an excellent tool to achieve this, combined with the power of S3 for storage. Here's a step-by-step guide:

Step 1: Understand the Requirements
Before starting, ensure you have the following:

  • IAM permissions to list users (iam:ListUsers) and access S3 (s3:PutObject).
  • An existing S3 bucket to store the generated CSV file.
  • A basic understanding of AWS Lambda and its environment.

Step 2: Create an S3 Bucket

  1. Log in to the AWS Management Console.
  2. Navigate to S3 and create a new bucket or use an existing one.
  3. Note the bucket name for use in the Lambda function.

Step 3: Set Up a Lambda Function

  1. Go to the Lambda service in the AWS Console.
  2. Click on Create Function and choose the option to create a function from scratch.
  3. Configure the runtime environment (e.g., Python or Node.js).
  4. Assign an appropriate IAM role to the Lambda function with permissions for IAM and S3 operations. (If you want my code, just comment "ease-py-code" on my blog and I'll share it. 🫢)

Step 4: Implement Logic for IAM and S3

  • The Lambda function will:
    • Retrieve a list of IAM users using the AWS SDK.
    • Format the list into a CSV structure.
    • Upload the file to the specified S3 bucket.

Step 5: Test the Function

  1. Use the AWS Lambda testing tools to trigger the function.
  2. Verify that the CSV file is successfully uploaded to the S3 bucket.

Step 6: Monitor and Review

  • Check the S3 bucket for the uploaded CSV files.
  • Review the Lambda logs in CloudWatch to ensure the function runs successfully.

By following these steps, you can automate the task of exporting IAM user information into a CSV file and store it securely in S3, making it easier to track and manage your AWS users.

Follow for more and happy learning :)

Automating AWS Cost Management Reports with Lambda

By: Ragul.M
11 December 2024 at 16:08

Monitoring AWS costs is essential for keeping budgets in check. In this guide, we’ll walk through creating an AWS Lambda function to retrieve cost details and send them to email (via SES) and Slack.
Prerequisites

  1. An AWS account with IAM permissions for Lambda, SES, and Cost Explorer.
  2. A Slack webhook URL to send messages.
  3. A configured SES email for notifications.
  4. An S3 bucket for storing cost reports as CSV files.

Step 1: Enable Cost Explorer

  • Go to AWS Billing Dashboard > Cost Explorer.
  • Enable Cost Explorer to access detailed cost data.

Step 2: Create an S3 Bucket

  • Create an S3 bucket (e.g., aws-cost-reports) to store cost reports.
  • Ensure the bucket has appropriate read/write permissions for Lambda.

Step 3: Write the Lambda Code

  1. Create a Lambda Function
    • Go to AWS Lambda > Create Function.
    • Select a Python runtime (e.g., Python 3.9).
  2. Add Dependencies
    • Use a Lambda layer or package libraries like boto3 and slack_sdk.
  3. Write your Python code and execute it. (If you want my code, just comment "ease-py-code" on my blog and I'll share it. 🫢)

Step 4: Add S3 Permissions
Update the Lambda execution role to allow s3:PutObject, ses:SendEmail, and ce:GetCostAndUsage.

Step 5: Test the Lambda

  1. Trigger the Lambda manually using a test event.
  2. Verify the cost report is:
    • Uploaded to the S3 bucket.
    • Emailed via SES.
    • Posted to Slack.

Conclusion
With this setup, AWS cost reports are automatically delivered to your inbox and Slack, keeping you updated on spending trends. Fine-tune this solution by customizing the report frequency or grouping costs by other dimensions.

Follow for more and happy learning :)
