Category: Amazon Web Services

  • Setting Up AWS Chatbot

    Amazon Web Services recently pushed their new Chatbot into beta. This simple bot lets you send alerts and notifications to either Slack or Amazon Chime. Because I use Slack for alerting, I thought this would be a great tool. Previously I used Marbot for a similar purpose. Marbot is a great product for teams as it allows a user to acknowledge or pass an incident. I am a team of one, so that feature is nice but ultimately not useful for me at this time.

    Let’s get started!

    Navigate to the new AWS Chatbot in the console

    On the right-hand side click the drop-down menu to choose your chat client. I am going to choose Slack because that is what I use. I assume the process would be the same for Chime. You will be prompted by Slack to authorize the application. Go ahead and hit “Install”.

    On the next screen, we get to our configuration options. The first is choosing our Slack channel:

    I chose the public channel I had already created for Marbot, #aws-alerts. You can do what you want here. Maybe you want a private channel so only you can see alerts for your development environment!

    The next section is IAM Permissions

    I chose to create an IAM role using the predefined template and just made up a role name, “aws-chatbot-alerts”.

    The last configuration option is for SNS topics

    You can have the bot subscribe to SNS topics so that notifications published there are sent to your channel as well. I don’t currently use any, so I skipped this section, but this could be super useful in the future! Look for future posts about this idea!

    I will update this post soon with how to create the chatbot using the CLI and/or CloudFormation.
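
    In the meantime, here is a minimal sketch of what the configuration above could look like in a CloudFormation template, assuming the AWS::Chatbot::SlackChannelConfiguration resource type; the workspace ID, channel ID, role ARN, and topic ARN are placeholders for your own values:

    Resources:
      ChatbotSlackAlerts:
        Type: AWS::Chatbot::SlackChannelConfiguration
        Properties:
          ConfigurationName: aws-chatbot-alerts
          IamRoleArn: YOUR_CHATBOT_ROLE_ARN          # e.g. the "aws-chatbot-alerts" role created above
          SlackWorkspaceId: YOUR_SLACK_WORKSPACE_ID  # shown in the Chatbot console after authorizing Slack
          SlackChannelId: YOUR_SLACK_CHANNEL_ID      # the channel ID of #aws-alerts, not its name
          SnsTopicArns:                              # optional: topics whose notifications get forwarded
            - YOUR_SNS_TOPIC_ARN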

  • Copying Files To & From an AWS S3 Bucket

    Recently I needed to download an entire bucket’s worth of data for an offsite backup. Easy, right? Go to the Amazon Web Services Console and hit download! WRONG.

    You can download individual files but not an entire bucket. Seems silly. Luckily there is an easy way to do it via the Amazon Web Services CLI. Enter a simple command:

    $ aws s3 cp s3://YOUR_BUCKET/ /LOCAL_DIRECTORY --recursive

    Let’s dissect this just a little bit. The first couple of options in the command should be pretty self-explanatory: we are using the AWS CLI, we chose S3 as our service, and ‘cp’ means we are going to copy. There are a bunch of other options you can use here; I suggest taking a look at the AWS CLI S3 documentation to learn more. After that, you simply add your bucket name, note the trailing forward slash, and then where you want to put the files on your local machine. Finally, I added the --recursive flag so that it would read through all the lower directories.
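
    The copy works in the other direction too, and the related ‘sync’ command only transfers files that have changed, which is handy for repeat backups. A quick sketch with the same placeholder names:

    # Upload a local directory to the bucket (the reverse of the command above)
    $ aws s3 cp /LOCAL_DIRECTORY s3://YOUR_BUCKET/ --recursive

    # Only copy new or changed files, in either direction
    $ aws s3 sync s3://YOUR_BUCKET/ /LOCAL_DIRECTORY
    $ aws s3 sync /LOCAL_DIRECTORY s3://YOUR_BUCKET/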

    Ultimately a very simple solution to transfer some data quickly! The AWS S3 CLI behaves very much like your standard directory commands (ls, mv, rm, and so on), so feel free to poke around and see what it can do!

  • AWS CLI For CPU Credit Balance

    Here is how you create a CloudWatch alarm that alerts when an instance’s CPU credit balance drops below a certain amount. With the values below, the alarm fires when the average CPUCreditBalance stays under 200 for two consecutive five-minute periods:

    $ aws cloudwatch put-metric-alarm \
        --alarm-name YOUR_ALARM_NAME \
        --alarm-description "Alarm when CPU Credits is below 200" \
        --metric-name CPUCreditBalance \
        --namespace AWS/EC2 \
        --statistic Average \
        --period 300 \
        --threshold 200 \
        --comparison-operator LessThanThreshold \
        --dimensions Name=InstanceId,Value=YOUR_INSTANCE_ID \
        --evaluation-periods 2 \
        --alarm-actions YOUR_SNS_TOPIC_ARN
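
    The --alarm-actions flag expects the ARN of an SNS topic to notify when the alarm fires. If you don’t have one yet, a quick sketch (the topic name and email address here are placeholders):

    $ aws sns create-topic --name cpu-credit-alerts
    $ aws sns subscribe --topic-arn YOUR_SNS_TOPIC_ARN --protocol email --notification-endpoint you@example.com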

    CloudFormation Template:
    https://github.com/avansledright/CloudFormation-CPU-CREDIT-BALANCE

  • Encrypt an Existing EBS Volume

    Say you have an existing EBS volume on Amazon Web Services that you want to encrypt. How would you do that? The following guide shows you how via the AWS Management Console.

    1. Log in to your console.
    2. Navigate to the EBS volume you would like to encrypt.
    3. Right-click on your volume and create a snapshot.
    4. I always give my snapshots a description, but we are going to end up deleting this one anyway.
    5. Make a copy of the snapshot you just created.
    6. In the copy settings, you simply need to choose to encrypt the copy. You can specify the encryption key to use; for this guide we will just use the default EBS encryption key.

    Once you have your new encrypted snapshot you can easily create a volume from that snapshot and then re-attach it to your instance!
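
    If you prefer the command line, the same flow could look roughly like this with the AWS CLI; the volume, snapshot, and instance IDs, the region, and the device name are all placeholders:

    # 1. Snapshot the existing (unencrypted) volume
    $ aws ec2 create-snapshot --volume-id vol-UNENCRYPTED --description "temporary snapshot for encryption"

    # 2. Copy the snapshot, encrypting the copy (uses the default EBS key unless --kms-key-id is given)
    $ aws ec2 copy-snapshot --source-region us-east-1 --source-snapshot-id snap-TEMPORARY --encrypted

    # 3. Create a new volume from the encrypted snapshot, in the same Availability Zone as the instance
    $ aws ec2 create-volume --snapshot-id snap-ENCRYPTED --availability-zone us-east-1a

    # 4. Swap the volumes (stop the instance first if this is a root volume)
    $ aws ec2 detach-volume --volume-id vol-UNENCRYPTED
    $ aws ec2 attach-volume --volume-id vol-ENCRYPTED --instance-id i-YOURINSTANCE --device /dev/xvdf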

  • AWS Backup

    Recently Amazon Web Services announced its new service called AWS Backup. The goal is to create a simple, automated backup solution for resources within the AWS Cloud.

    There have been plenty of other solutions out there for backups but most are quite costly. Here is a look at the pricing for the AWS Backup solution:

    AWS Backup Pricing Snapshot

    The pricing for an EBS snapshot through AWS Backup is the same as the pricing for manual snapshots, so it is quite a compelling argument to set this up.

    Let’s look at a quick example of how to set up a simple recurring EBS snapshot. In this example I have a Linux EC2 instance with a single EBS volume attached to it.

    Log in to your AWS console and search for “Backup” in the services menu. You will see AWS Backup.

    AWS Console Menu – AWS Backup

    Once you are in the console for AWS Backup, choose “Manage Backup Plans”

    Manage AWS Backup Plans

    To get the full experience of AWS Backup I chose to make my own plan. You could also choose to use one of their existing plans.

    AWS Backup Options

    Give your backup plan a name, something that will remind you what the plan is going to do. For my example I named my plan “7Day-Snapshot”. My plan will take a snapshot of the EBS volume and store it for 7 days before discarding it.

    Inside of your plan you are going to create a rule. In this example we only need one rule.

    I filled the fields out as follows:

    Rule Name: 7DayRetention

    Frequency: Daily

    Backup Window: Use Backup Window Defaults

    Transition to Cold Storage: Never

    Expire: 7 Days

    Backup Vault: Default. You can create different vaults with various options; I would suggest that if you want to separate your projects or customers.

    Tags: You can add various tags but I didn’t set any up for this example.

    Once you have all the options filled out, hit “Create Plan” to save your new plan. You can now assign resources to your plan, which is how you actually choose what is going to be backed up!

    In Resource Assignments click “Assign resources”

    You will need to define a few things in the next step, which is choosing your resources.

    Resource assignment name: I used the hostname of my Linux Server

    IAM Role: I used default

    Assign Resources: This is where you can get creative. One thing I am going to set up going forward is that every EBS volume with the tag key “Backup” and value “Yes” will fall under this resource assignment; then I don’t have to add each volume individually. Feel free to explore. What I did here was choose “Assign By” Resource ID, then a Resource Type of EBS Volume, and then found my resource in the list.

    Hit Assign Resources when you are done.
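
    If you would rather script this than click through the console, a rough AWS CLI equivalent of the plan and resource assignment above could look like the sketch below. The role ARN, account ID, and plan ID are placeholders, and the JSON shapes are my reading of the AWS Backup API, so double-check them against the current documentation.

    backup-plan.json:

    {
      "BackupPlanName": "7Day-Snapshot",
      "Rules": [
        {
          "RuleName": "7DayRetention",
          "TargetBackupVaultName": "Default",
          "ScheduleExpression": "cron(0 5 ? * * *)",
          "Lifecycle": { "DeleteAfterDays": 7 }
        }
      ]
    }

    backup-selection.json:

    {
      "SelectionName": "my-linux-server",
      "IamRoleArn": "arn:aws:iam::ACCOUNT_ID:role/service-role/AWSBackupDefaultServiceRole",
      "ListOfTags": [
        { "ConditionType": "STRINGEQUALS", "ConditionKey": "Backup", "ConditionValue": "Yes" }
      ]
    }

    $ aws backup create-backup-plan --backup-plan file://backup-plan.json
    $ aws backup create-backup-selection --backup-plan-id YOUR_BACKUP_PLAN_ID --backup-selection file://backup-selection.json

    The create-backup-plan call returns the BackupPlanId used in the second command. The ListOfTags condition mirrors the “Backup = Yes” tag idea mentioned above; you could list explicit volume ARNs under “Resources” instead.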

    That’s it! You now have a backup plan that will take a snapshot of your EBS volume during its backup window every day, store each snapshot for one week, and then delete it.

    This service by AWS should solve a myriad of problems for many organizations.

    If you have questions feel free to reach out!

  • The Security Specialty Certification

    Today I sat the AWS Security Specialty exam. While I didn’t pass, I thought I would provide some commentary on the experience in relation to the training I sought out to help me prepare.

    I have been a big fan of ACloudGuru. They helped me pass my Solutions Architect exam last year, so naturally I returned to train and learn from them again. Much of the content in this course turned out to be a repeat of what I saw in the Solutions Architect material. I didn’t think much of it because I assumed this was the correct curriculum.

    Boy was I wrong.

    Upon sitting down at the exam center I used my standard method of test taking: answer the questions you know first, then go back and hammer out the harder ones using process of elimination and your knowledge.

    While Ryan Kroonenburg does a great job of explaining all the features of AWS and how to use them in a lab environment, the course misses the actual application level that AWS is asking for in the exam. Now, I’m not saying that Ryan doesn’t know what he is talking about. Quite the contrary. Nor am I blaming my failure on ACloudGuru.

    Advice

    On top of learning all the content outlined in ACloudGuru or LinuxAcademy or whichever training resource you want to use, you really need to seek out real-life applications of these topics.

    I will be going back over all the labs in the training material and applying them to my production environments (after testing). I think that this is the only way to truly learn what is needed.

    Current Exam Rankings

    Hardest to Easiest (based on what I’ve taken):

    1. Security Specialty
    2. Solutions Architect Associate
    3. SysOps Associate

    If you have any questions regarding the exams feel free to reach out!

  • AWS Summit 2018 – Recap

    This was my second year attending Amazon Web Services Summit. Both times I have headed down to Chicago for a few days to network, learn, and get excited about new AWS developments.

    This year, the summit was scheduled for only one day. Because the summit started early in the morning, I decided to head down early. By happenstance, I was invited to attend a workshop put on by Dynatrace.

    Dynatrace is a logging and monitoring platform that runs on AWS and integrates with nearly any piece of technology you can think of. For me, monitoring is important for the web servers that I manage for my customers. In this workshop, we learned how to create a continuous delivery pipeline: we deployed an application across staging and production environments, with Dynatrace monitoring each stage to help ensure successful deployments.

    After the workshop, Dynatrace hosted a lovely rooftop cocktail party. Thanks again for the invitation!

    Quick lookout shot from the rooftop!

    The summit began early the next morning. I spent the morning visiting some vendor booths and getting the lay of the land before attending the keynote.

    This year’s keynote was centered around the concept of “Builders”. Amazon wants all of its customers to be builders. By that, they mean they want us to explore and be curious with their platform. When we see a problem, they want us to solve it within Amazon Web Services. While this concept is great fundamentally, I do believe it caters more towards developers and people who code than infrastructure gurus like myself. Nevertheless, I still found the concept compelling in my adventures.

    4th row for the keynote!

    The day continued with various sessions. I spent a good amount of time working through the business executive track, which focuses on migrations and security.

    Large scale migrations – one of my favorite sessions

    Overall the summit was good, though I did miss the two-day format. By the end, it was a very long day of travel and learning.

    If you or someone you know is interested in cloud computing, AWS Summit is a great place to get excited about all the possibilities!