Category: Python

  • AWS Tag Checker

    I wrote this script this morning as I was creating a new web server. I realized that I had been forgetting to add my “Backup” tag to my instances so that they would automatically be backed up via AWS Backup.

    This one is pretty straightforward. Using Boto3, the script iterates over all of your instances and checks each one for the tag named in the tag_to_check variable. If the tag is not present, it adds the tag defined in the Tags list passed to create_tags.

    Once that is done, it iterates over the instances a second time to verify that the tag was added. If a new instance appeared in the meantime, or the tag failed to apply, it prints the instance IDs that still do not have the tag.

    Here is the script:

    import boto3

    ec2 = boto3.resource('ec2')
    inst_describe = ec2.instances.all()
    tag_to_check = 'Backup'

    for instance in inst_describe:
        # instance.tags is None when an instance has no tags at all
        tags = instance.tags or []
        if tag_to_check not in [t['Key'] for t in tags]:
            print("This instance is not tagged: ", instance.instance_id)
            response = ec2.create_tags(
                Resources=[instance.instance_id],
                Tags=[
                    {
                        'Key': tag_to_check,
                        'Value': 'Yes'
                    }
                ]
            )

    # Double-check that there are no other instances without the tag
    for instance in inst_describe:
        tags = instance.tags or []
        if tag_to_check not in [t['Key'] for t in tags]:
            print("Failed to assign tag, or new instance: ", instance.instance_id)

    The script is also available on GitHub here:
    https://github.com/avansledright/awsTagCheck

    If you find this script helpful feel free to share it with your friends and let me know in the comments!

  • Lambda Function Post to Slack

    I wrote this script out of a need to practice my Python skills. The idea is that if a file gets uploaded to an S3 bucket then the function will trigger and a message with that file name will be posted to a Slack channel of your choosing.

    To use this you will need to bundle the slack and slackclient pip packages with the function when you upload it to the AWS Console.

    You will also need to create an OAuth key for a Slack application. If you are unfamiliar with this process, feel free to drop a comment below and/or shoot me a message and I can walk you through the process or write a second part of the guide.
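    The full function is in the repo below. As a rough sketch of the shape of such a handler (this version posts through a Slack incoming webhook with the standard library instead of the slackclient package, and the webhook URL is a placeholder, not from the project):

```python
import json
import urllib.request

# Placeholder -- substitute your own Slack incoming-webhook URL.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def build_message(event):
    """Pull the bucket and file name out of the S3 event record."""
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]
    return f"New file uploaded to {bucket}: {key}"

def lambda_handler(event, context):
    # Post the message to the channel the webhook is bound to.
    payload = json.dumps({"text": build_message(event)}).encode("utf-8")
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return {"statusCode": resp.status}
```

    Wiring the S3 bucket's "ObjectCreated" event notification to this Lambda is what makes it fire on every upload.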

    Here is a link to the project:
    https://github.com/avansledright/posttoSlackLambda

    If this helps you please share this post on your favorite social media platform!

  • Automatically Transcribing Audio Files with Amazon Web Services

    I wrote this Lambda function to automatically transcribe audio files that are uploaded to an S3 bucket. This is written in Python3 and utilizes the Boto3 library.

    You will need to give your Lambda function permissions to access S3, Transcribe and CloudWatch.

    The script will create an AWS Transcribe job with the format: 'filetranscription'+YYYYMMDD-HHMMSS
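    A minimal sketch of that job-creation step (function names are mine, and the MediaFormat-from-extension guess assumes the uploads are formats Transcribe accepts, such as mp3 or wav):

```python
import time

def job_name(prefix="filetranscription"):
    # Matches the post's format: 'filetranscription'+YYYYMMDD-HHMMSS
    return prefix + time.strftime("%Y%m%d-%H%M%S")

def lambda_handler(event, context):
    import boto3  # imported here so job_name() stays usable without AWS installed

    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    transcribe = boto3.client("transcribe")
    transcribe.start_transcription_job(
        TranscriptionJobName=job_name(),
        Media={"MediaFileUri": f"s3://{bucket}/{key}"},
        MediaFormat=key.rsplit(".", 1)[-1],  # assumes the extension names the format
        LanguageCode="en-US",
    )
```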

    I will be iterating over the script to hopefully add in a web front end as well as potentially branching to do voice call transcriptions for phone calls and Amazon Connect.

    You can view the code here.

    If you have questions or comments feel free to reach out to me here or on any Social Media.

  • Amazon S3 Backup from FreeNAS

    I was chatting with my Dad about storage for his documents. He mentioned wanting to store them on my home NAS. I chuckled and stated that I would just push them up to the cloud because it would be cheaper and more reliable. When I got home that day I thought to myself how I would actually complete this task.

    There are plenty of obvious tools to accomplish offsite backup. I want to push all of my home videos and pictures to an S3 bucket in my AWS environment. I could:

    1. Mount the S3 bucket using the drivers provided by AWS and then rsync the data across on a cron job.
    2. Utilize a FreeNAS plugin to drive the backup.
    3. Build my own custom solution to the problem and re-invent the wheel!

    It is clear the choice is going to be 3.

    With the help of the Internet, I put together a simple Python script that backs up my data. I can then run it on a cron job to upload the files periodically. OR! I could Dockerize the script and run it as a container! Cue more overkill.

    The result is somewhat complicated for a simple backup task. But I like it, and it works in my environments. Most importantly, I can point the script at one directory that houses many symlinks to other directories, so I only have to manage a single backup point.
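    The real script is in the repo below; the core idea can be sketched roughly like this (the function names and bucket argument are mine, not from the project):

```python
import os

def iter_files(root):
    """Walk root, following symlinks so one backup point can fan
    out to many directories, yielding (local_path, s3_key) pairs."""
    for dirpath, _dirnames, filenames in os.walk(root, followlinks=True):
        for name in filenames:
            full = os.path.join(dirpath, name)
            yield full, os.path.relpath(full, root)

def backup(root, bucket):
    import boto3  # deferred so iter_files() stays usable without AWS credentials

    s3 = boto3.client("s3")
    for path, key in iter_files(root):
        s3.upload_file(path, bucket, key)
```

    Running backup("/mnt/tank/backup-point", "my-backup-bucket") from cron, or from inside a container, covers both of the deployment options mentioned above.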

    Take a look at the GitHub link below and let me know your thoughts!

    [GitHub]