Category: Cloud Architecting

  • Automatically Transcribing Audio Files with Amazon Web Services

    I wrote this Lambda function to automatically transcribe audio files that are uploaded to an S3 bucket. It is written in Python 3 and uses the Boto3 library.

    You will need to give your Lambda function permissions to access S3, Transcribe and CloudWatch.

    The script will create an AWS Transcribe job named in the format: 'filetranscription'+YYYYMMDD-HHMMSS
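
    As a rough illustration of the idea (the real code is in the repository linked below; the language code, media format handling, and output bucket here are my assumptions), the handler looks something like this:

    import boto3
    from datetime import datetime
    from urllib.parse import unquote_plus

    transcribe = boto3.client('transcribe')

    def lambda_handler(event, context):
        # Triggered by an S3 PUT event; grab the bucket and object key
        record = event['Records'][0]['s3']
        bucket = record['bucket']['name']
        key = unquote_plus(record['object']['key'])

        # Job name format: 'filetranscription' + YYYYMMDD-HHMMSS
        job_name = 'filetranscription' + datetime.utcnow().strftime('%Y%m%d-%H%M%S')

        transcribe.start_transcription_job(
            TranscriptionJobName=job_name,
            Media={'MediaFileUri': f's3://{bucket}/{key}'},
            MediaFormat=key.split('.')[-1].lower(),  # assumes the file extension is a supported format, e.g. 'mp3' or 'wav'
            LanguageCode='en-US',                    # assumption; change for other languages
            OutputBucketName=bucket                  # assumption: write the transcript JSON back to the same bucket
        )
        return {'statusCode': 200, 'body': f'Started transcription job {job_name}'}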

    I will keep iterating on the script, hopefully adding a web front end and potentially branching out to transcribe voice calls from phone systems and Amazon Connect.

    You can view the code here

    If you have questions or comments feel free to reach out to me here or on any Social Media.

  • Monitoring Disk Space with CloudWatch

    I recently had a request to monitor disk space. Since I don’t use a traditional monitoring platform and instead send all of my alerting to Slack, I wondered how this would work.

    There is no built-in disk space metric in CloudWatch, so we will use the monitoring scripts available in this guide.

    You can follow along with the Amazon guide or follow the simple steps here, which are geared towards Ubuntu-based Linux distributions.

    First, let’s install some dependencies:

    sudo apt-get install unzip

    sudo apt-get install libwww-perl libdatetime-perl

    Next, we will download the scripts from Amazon:

    curl https://aws-cloudwatch.s3.amazonaws.com/downloads/CloudWatchMonitoringScripts-1.2.2.zip -O

    Once downloaded you can unpack the ZIP file:

    unzip CloudWatchMonitoringScripts-1.2.2.zip && \
    rm CloudWatchMonitoringScripts-1.2.2.zip && \
    cd aws-scripts-mon

    This will put the scripts into a directory called aws-scripts-mon inside of whatever directory you are currently in. I recommend doing this inside of /home/your-user.

    There are a few ways to give the scripts permissions to CloudWatch. I prefer the awscreds.conf method, but you can also give your instance an IAM role or specify the credentials inline. If you are unsure of how to create IAM policies or roles feel free to message me and we can chat more about that.

    Inside the directory there is a template file that you can utilize to generate your awscreds.conf file.

    cp awscreds.template awscreds.conf && vi awscreds.conf

    Modify the file as needed and save and close it.
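
    For reference, the file only needs your access key ID and secret key (the values below are placeholders):

    AWSAccessKeyId=YOUR_ACCESS_KEY_ID
    AWSSecretKey=YOUR_SECRET_ACCESS_KEY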

    Now let’s test the scripts to ensure functionality:

    ./mon-put-instance-data.pl --disk-space-util --disk-path=/ --verify --verbose

    You should see a “Completed Successfully” message. If not, troubleshoot as needed.

    The scripts have a lot of functionality but we are specifically looking at disk usage. I added the following line as a Cron Job:

    0 * * * * /home/ubuntu/aws-scripts-mon/mon-put-instance-data.pl --disk-space-util --disk-path=/

    This runs the script every hour on the hour and reports the data to CloudWatch.

    Now that our data is being put into CloudWatch we need to alert on any issues. For the purpose of testing I created an alarm with a threshold below my current usage so I could verify the alerting worked. You can adjust this as you need to.

    Log in to your AWS Management Console and navigate to the CloudWatch console. Your data will be placed into the “Metrics” tab. Once the Metrics tab is open you will see a section called “Linux Systems”. Navigate to this and you should see metrics called “Filesystem, InstanceId, MountPath”. This is where your metrics live. You can navigate around here and view your metrics in the graphing utility. Once you have verified that the data is accurate you can create an alarm based on this metric.

    Navigate to the Alarms section of CloudWatch. Click “Create alarm” in the top right corner. Follow the steps to create your Alarm. For Metric navigate to the metric we found in the previous step. For Conditions, I chose the following:

    Threshold Type: Static
    Whenever DiskSpaceUtilization is…: Greater than the threshold
    than…: 45% (Note: this value will change based on your actual usage. For testing I recommend setting it to a value lower than your actual usage percentage so that your alarm will fire.)

    Click Next to continue. On the following page you can set up your notifications. I covered creating an AWS Chatbot here. I have all of my CloudWatch alarms sent to an SNS topic called aws-alerts. You can create something similar and have your AWS Chatbot monitor that topic as well. Once the alarm fires you should get an alert in your specified Slack channel.

    Once your alarm is firing you can fine tune your thresholds to notify you as you need!
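
    If you would rather script the alarm than click through the console, a rough boto3 sketch might look like this (the namespace and dimensions are the ones the monitoring scripts publish; the alarm name, instance ID, filesystem, and SNS topic ARN are placeholders):

    import boto3

    cloudwatch = boto3.client('cloudwatch')

    cloudwatch.put_metric_alarm(
        AlarmName='root-volume-disk-space',        # placeholder name
        AlarmDescription='Alarm when root volume usage exceeds 45%',
        Namespace='System/Linux',                  # custom namespace used by mon-put-instance-data.pl
        MetricName='DiskSpaceUtilization',
        Dimensions=[
            {'Name': 'InstanceId', 'Value': 'i-0123456789abcdef0'},  # placeholder instance
            {'Name': 'MountPath', 'Value': '/'},
            {'Name': 'Filesystem', 'Value': '/dev/xvda1'},           # placeholder filesystem
        ],
        Statistic='Average',
        Period=3600,                               # matches the hourly cron schedule above
        EvaluationPeriods=1,
        Threshold=45.0,
        ComparisonOperator='GreaterThanThreshold',
        AlarmActions=['arn:aws:sns:us-east-1:123456789012:aws-alerts'],  # placeholder SNS topic ARN
    )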

  • AWS CLI For CPU Credit Balance

    Here is how to create a CloudWatch alarm that fires when an instance’s CPU credit balance drops below a certain amount:

    aws cloudwatch put-metric-alarm \
      --alarm-name "YOUR-NAME-HERE" \
      --alarm-description "Alarm when CPU Credits is below 200" \
      --metric-name CPUCreditBalance \
      --namespace AWS/EC2 \
      --statistic Average \
      --period 300 \
      --threshold 200 \
      --comparison-operator LessThanThreshold \
      --dimensions Name=InstanceId,Value=INSTANCEIDHERE \
      --evaluation-periods 2 \
      --alarm-actions ARN:YOURSNSTOPIC

    CloudFormation Template:
    https://github.com/avansledright/CloudFormation-CPU-CREDIT-BALANCE

  • Encrypt an Existing EBS Volume

    Say you have an existing EBS volume on Amazon Web Services that you want to encrypt. How would you do that? The following guide shows you how via the AWS Management Console.

    1. Login to your console.
    2. Navigate to the EBS volume you would like to encrypt.

    3. Right click on your volume and create a snapshot.

    4. I always give my snapshots descriptions. But we are going to end up deleting this one.

    5. Make a copy of the snapshot you created in step 3.

    6. In the copy settings you simply need to choose to encrypt the snapshot copy. You can specify the encryption key to use; for this guide we will just use the default EBS encryption key.

    Once you have your new encrypted snapshot you can easily create a volume from that snapshot and then re-attach it to your instance!
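
    If you would rather script the same flow, here is a rough boto3 sketch (the volume ID, region, availability zone, instance ID, and device name are placeholders; the waiters just keep the steps in order):

    import boto3

    ec2 = boto3.client('ec2')

    volume_id = 'vol-0123456789abcdef0'  # placeholder: the existing unencrypted volume

    # 1. Snapshot the existing volume
    snapshot = ec2.create_snapshot(VolumeId=volume_id, Description='Pre-encryption snapshot')
    ec2.get_waiter('snapshot_completed').wait(SnapshotIds=[snapshot['SnapshotId']])

    # 2. Copy the snapshot with encryption enabled (uses the default EBS key)
    copy = ec2.copy_snapshot(
        SourceSnapshotId=snapshot['SnapshotId'],
        SourceRegion='us-east-1',        # placeholder region
        Encrypted=True,
    )
    ec2.get_waiter('snapshot_completed').wait(SnapshotIds=[copy['SnapshotId']])

    # 3. Create a new, encrypted volume from the copied snapshot
    new_volume = ec2.create_volume(
        SnapshotId=copy['SnapshotId'],
        AvailabilityZone='us-east-1a',   # placeholder: must match the instance's AZ
    )
    ec2.get_waiter('volume_available').wait(VolumeIds=[new_volume['VolumeId']])

    # 4. Detach the old volume and attach the encrypted one in its place
    ec2.detach_volume(VolumeId=volume_id)
    ec2.get_waiter('volume_available').wait(VolumeIds=[volume_id])
    ec2.attach_volume(
        VolumeId=new_volume['VolumeId'],
        InstanceId='i-0123456789abcdef0',  # placeholder instance
        Device='/dev/xvdf',                # placeholder device name
    )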

  • AWS Backup

    Recently Amazon Web Services announced its new service called AWS Backup. The goal is to create a simple, automated backup solution for resources within the AWS Cloud.

    There have been plenty of other solutions out there for backups but most are quite costly. Here is a look at the pricing for the AWS Backup solution:

    AWS Backup Pricing Snapshot

    The pricing for an EBS snapshot through AWS Backup is the same as the pricing for manual snapshots, which makes a compelling argument for setting this up.

    Let’s look at a quick example of how to setup a simple recurring EBS Snapshot. In this example I have a Linux EC2 instance with a single EBS volume attached to it.

    Log in to your AWS console and search for “Backup” in the services menu. You will see AWS Backup.

    AWS Console Menu – AWS Backup

    Once you are in the console for AWS Backup, choose “Manage Backup Plans”

    Manage AWS Backup Plans

    To get the full experience of AWS Backup I chose to make my own plan. You could also choose to use one of their existing plans.

    AWS Backup Options

    Give your backup plan a name. Something so you can remember what the plan is going to be doing. For my example I named my plan “7Day-Snapshot”. My plan will take a snapshot of the EBS volume and store it for 7 days before discarding it.

    Inside of your plan you are going to create a rule. In the example we only need one rule.


    I filled the fields out as follows:

    Rule Name: 7DayRetention

    Frequency: Daily

    Backup Window: Use Backup Window Defaults

    Transition to Cold Storage: Never

    Expire: 7 Days

    Backup Vault: Default – You can create different vaults with various options. I would suggest this if you are wanting to separate your projects or customers.

    Tags: You can add various tags but I didn’t set any up for this example.

    Once you have all the options filled out hit “Create Plan” to save your new plan. You can now assign resources to your plan which is how you actually choose what is going to be backed up!

    In Resource Assignments click “Assign resources”

    You will need to define a few things in the next step which is choosing your resources.

    Resource assignment name: I used the hostname of my Linux Server

    IAM Role: I used default

    Assign Resources: This is where you can get creative. One thing I am going to set up going forward is tagging every EBS volume with Key: Backup and Value: Yes so that it fits this resource assignment. Then I don’t have to add each volume individually. Feel free to explore. For this example I chose “Assign by” Resource ID, set the resource type to EBS Volume, and then found my volume in the list.

    Hit Assign Resources when you are done.

    That’s it! You now have a backup plan that will take a snapshot of your EBS volume during the backup window each day. It will then store each snapshot for one week and then delete it.
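
    If you would rather script the same plan instead of clicking through the console, here is a rough boto3 sketch that mirrors the walkthrough above (the schedule expression, account ID, IAM role, and volume ARN are placeholders):

    import boto3

    backup = boto3.client('backup')

    # Create a plan with a single daily rule and 7-day retention ("7Day-Snapshot" / "7DayRetention")
    plan = backup.create_backup_plan(
        BackupPlan={
            'BackupPlanName': '7Day-Snapshot',
            'Rules': [{
                'RuleName': '7DayRetention',
                'TargetBackupVaultName': 'Default',
                'ScheduleExpression': 'cron(0 5 ? * * *)',  # placeholder: daily; adjust to your backup window
                'Lifecycle': {'DeleteAfterDays': 7},
            }],
        }
    )

    # Assign a resource to the plan -- here a single EBS volume by ARN
    backup.create_backup_selection(
        BackupPlanId=plan['BackupPlanId'],
        BackupSelection={
            'SelectionName': 'my-linux-server',  # placeholder selection name
            'IamRoleArn': 'arn:aws:iam::123456789012:role/service-role/AWSBackupDefaultServiceRole',  # placeholder
            'Resources': [
                'arn:aws:ec2:us-east-1:123456789012:volume/vol-0123456789abcdef0',  # placeholder volume ARN
            ],
        }
    )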

    This service by AWS should solve a myriad of problems for many organizations.

    If you have questions feel free to reach out!