Categories
Amazon Web Services Cloud Architecting Technology

Encrypt an Existing EBS Volume

Say you have an existing EBS volume on Amazon Web Services that you want to encrypt. How would you do that? The following guide shows you how via the AWS Management Console.

  1. Log in to your console.
  2. Navigate to the EBS volume you would like to encrypt.
  3. Right-click on your volume and create a snapshot.
  4. Give the snapshot a description. I always do, but we are going to end up deleting this one.
  5. Make a copy of the snapshot you created in step 3.
  6. In the copy settings, simply choose to encrypt the volume. You can specify the encryption key to use. For this guide we will just use the default EBS encryption key.

Once you have your new encrypted snapshot, you can easily create a volume from it and then attach that volume to your instance in place of the original!
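The console steps above can also be scripted. Here is a minimal sketch of the same flow using boto3-style EC2 calls (the function name and IDs are my own; in a real run you would also wait on each snapshot with boto3's waiters, then detach the old volume before attaching the new one). The client is passed in so you can point it at a real or stubbed EC2 client:

```python
def encrypt_volume(ec2, volume_id, availability_zone, region):
    """Re-create an unencrypted EBS volume as an encrypted one via
    snapshot -> encrypted snapshot copy -> new volume, mirroring the
    console steps above. `ec2` is a boto3-style EC2 client."""
    # Step 3: snapshot the existing volume (temporary, deleted below).
    snap = ec2.create_snapshot(
        VolumeId=volume_id, Description="temporary - will be deleted")
    # Steps 5-6: copy the snapshot with encryption enabled.
    # With no KmsKeyId given, the default EBS encryption key is used.
    enc = ec2.copy_snapshot(
        SourceSnapshotId=snap["SnapshotId"], SourceRegion=region,
        Encrypted=True, Description="encrypted copy")
    # Create an encrypted volume from the encrypted snapshot.
    vol = ec2.create_volume(
        SnapshotId=enc["SnapshotId"], AvailabilityZone=availability_zone)
    # Clean up the temporary unencrypted snapshot.
    ec2.delete_snapshot(SnapshotId=snap["SnapshotId"])
    return vol["VolumeId"]
```

With real credentials you would pass `boto3.client("ec2")` as the first argument.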

Categories
Technology

Fixing Unadoptable Unifi Devices

I wrote an article about this before that utilizes Robo 3T. I figured I should also have a version for those of you who use SSH and the command line.

DISCLAIMER: I am not responsible if you break anything. If you need help let me know before you create a big mess!

EDIT: I wrote a Python Script that can handle all of this for you just enter in your MAC address. Grab it here: https://github.com/avansledright/unifideletedevice

SSH into your Unifi Controller utilizing whatever means you typically use.

Connect to MongoDB by issuing the command:
mongo --port 27117

If you are utilizing a different port then change the port flag.

Once connected select the Unifi Database:

use ace

Then you can utilize the following queries to perform actions:

Find device:
db.device.find({ 'mac' : 'XX:XX:XX:XX:XX:XX' })
Remove device:
db.device.remove({ 'mac' : 'XX:XX:XX:XX:XX:XX' })

Should you want to find what site a device is registered to you can utilize the “Find Device” query from above. In the JSON output locate the Site ID. Then utilize the query below and replace the X’s with your found site ID. The result should be a nice JSON output with the name of the site.

Find site query:
db.site.find({ '_id' : ObjectId('XXXXXX') })

Categories
Technology

Counting Web Requests

I manage a ton of web servers. Occasionally I see attempts at flooding the servers with traffic, typically in a malicious way. Generally these are just small attacks and nothing to write home about. But I wanted a way to see how many times a server was getting a request from a specific IP address.

Obviously this would be very challenging to accomplish by just looking at the logs. So, I put together a small Linux command that will read and count Apache requests based on unique IP addresses.

cat access.* | awk '{ print $1 }' | sort | uniq -c | sort -n

Try it out and let me know what you think!
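If you would rather have this in a script, here is a Python equivalent of the pipeline (my own sketch, assuming the common/combined Apache log format where the client IP is the first whitespace-separated field on each line):

```python
from collections import Counter

def count_requests_by_ip(log_paths):
    """Count Apache access-log requests per client IP, like
    `cat access.* | awk '{ print $1 }' | sort | uniq -c | sort -n`."""
    counts = Counter()
    for path in log_paths:
        with open(path) as fh:
            for line in fh:
                fields = line.split()
                if fields:
                    counts[fields[0]] += 1
    # Ascending by count, matching the trailing `sort -n`.
    return sorted(counts.items(), key=lambda kv: kv[1])
```

Pass it a list of paths, e.g. `count_requests_by_ip(glob.glob("access.*"))`.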

Categories
Amazon Web Services Cloud Architecting

AWS Backup

Recently Amazon Web Services announced its new service called AWS Backup. The goal is to create a simple, automated backup solution for resources within the AWS Cloud.

There have been plenty of other solutions out there for backups but most are quite costly. Here is a look at the pricing for the AWS Backup solution:

AWS Backup Pricing Snapshot

The pricing for an EBS snapshot through AWS Backup is the same as for a manual snapshot, so it is quite a compelling argument to set this up.

Let’s look at a quick example of how to setup a simple recurring EBS Snapshot. In this example I have a Linux EC2 instance with a single EBS volume attached to it.

Log in to your AWS console and search for “Backup” in the services menu. You will see AWS Backup.

AWS Console Menu – AWS Backup

Once you are in the console for AWS Backup, choose “Manage Backup Plans”

Manage AWS Backup Plans

To get the full experience of AWS Backups I chose to make my own plan. You could also choose to use one of their existing plans.

AWS Backup Options

Give your backup plan a name. Something so you can remember what the plan is going to be doing. For my example I named my plan “7Day-Snapshot”. My plan will take a snapshot of the EBS volume and store it for 7 days before discarding it.

Inside of your plan you are going to create a rule. In the example we only need one rule.


I filled the fields out as follows:

Rule Name: 7DayRetention

Frequency: Daily

Backup Window: Use Backup Window Defaults

Transition to Cold Storage: Never

Expire: 7 Days

Backup Vault: Default – You can create different vaults with various options. I would suggest this if you want to separate your projects or customers.

Tags: You can add various tags but I didn’t set any up for this example.

Once you have all the options filled out hit “Create Plan” to save your new plan. You can now assign resources to your plan which is how you actually choose what is going to be backed up!
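The same plan can be expressed through the AWS Backup API. Here is a sketch of the request body for a boto3 `backup` client `create_backup_plan` call, mirroring the rule above. The schedule expression (daily at 05:00 UTC) and the start window value are assumed examples, not something the console showed me:

```python
# Request body for boto3's backup client `create_backup_plan`,
# mirroring the "7DayRetention" rule configured in the console.
backup_plan = {
    "BackupPlanName": "7Day-Snapshot",
    "Rules": [
        {
            "RuleName": "7DayRetention",
            "TargetBackupVaultName": "Default",
            # AWS Backup uses cron-style schedule expressions;
            # this example fires daily at 05:00 UTC.
            "ScheduleExpression": "cron(0 5 * * ? *)",
            "StartWindowMinutes": 60,  # backup window default-ish value
            "Lifecycle": {"DeleteAfterDays": 7},  # Expire: 7 days
        }
    ],
}
# With real credentials you would then run:
# boto3.client("backup").create_backup_plan(BackupPlan=backup_plan)
```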

In Resource Assignments click “Assign resources”

You will need to define a few things in the next step which is choosing your resources.

Resource assignment name: I used the hostname of my Linux Server

IAM Role: I used default

Assign Resources: This is where you can get creative. One thing I am going to set up going forward is that every EBS volume tagged with Key: Backup and Value: Yes will fit this resource, so I don’t have to add each volume individually. Feel free to explore. What I did was choose “Assign By” Resource ID, then Resource Type of EBS Volume, and then found my resource in the list.

Hit Assign Resources when you are done.
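The tag-based idea above can likewise be done through the API. Here is a sketch of a boto3 `backup` client `create_backup_selection` request body; the selection name, account ID in the role ARN, and `plan_id` are placeholders:

```python
# Sketch of a `create_backup_selection` body that picks up every
# resource tagged Backup=Yes, so volumes never need adding one by one.
# The role ARN below is a placeholder for your backup service role.
backup_selection = {
    "SelectionName": "tagged-ebs-volumes",
    "IamRoleArn": (
        "arn:aws:iam::123456789012:role/service-role/"
        "AWSBackupDefaultServiceRole"
    ),
    "ListOfTags": [
        {
            "ConditionType": "STRINGEQUALS",
            "ConditionKey": "Backup",
            "ConditionValue": "Yes",
        }
    ],
}
# With real credentials and a plan ID you would then run:
# boto3.client("backup").create_backup_selection(
#     BackupPlanId=plan_id, BackupSelection=backup_selection)
```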

That’s it! You now have a backup plan that will take a snapshot of your EBS volume during each maintenance window every day. It will then store them for one week and then delete them.

This service by AWS should solve a myriad of problems for many organizations.

If you have questions feel free to reach out!

Categories
Technology

Fixing Unifi Controller Errors

Recently I was working on a device that, for the life of me, I could not get to attach to my Unifi Controller. Repeatedly I would get

used default key in INFORM_ERROR state, reject it!

error on my server. The other error that I kept getting on the device itself was

Decrypt Error

when running the Inform Command.

Quite frustrated I spent a lot of time removing and adding my SSL certificate thinking that had something to do with it. I was wrong.

The real issue resides when someone deletes a whole site without removing the devices that are inside the site first. What happens is that the devices stay in the database and have a site associated with them that no longer exists. This results in me not being able to adopt them into a new site.

So Let’s Fix It

To resolve this issue we need to delete the device out of the controller by accessing the MongoDB that stores all of our information. While most of you are probably more fluent in writing Mongo queries and thus could do it from the command line, I preferred to find a GUI solution so that I could understand what I was doing.

Enter Robo 3T. This is a GUI client for MongoDB. Depending on your setup you will need to modify your connection type. I used SSH with my private key.

Once connected you should see a list of your databases in the left column.

The Unifi database (unless you changed it) will be called ace. Go ahead and expand out ace and then Collections to display all your sites’ information. You will see a collection called “device”. This collection stores all the specific information about our devices and how they are programmed.

We now need to find our specific device, so using the built-in shell in Robo 3T, run the following query, replacing the X’s with your MAC address.

db.device.find({ 'mac' : 'XX:XX:XX:XX:XX:XX' })

The MAC address string must be all lower case.
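Since the database expects the lower-case, colon-separated form, a small helper (my own convention, not part of the controller) can normalize whatever format you copied the address in before you paste it into the query:

```python
import re

def normalize_mac(mac):
    """Normalize a MAC address to the lower-case, colon-separated
    form the UniFi database stores, e.g. 'AA-BB-CC-DD-EE-FF' ->
    'aa:bb:cc:dd:ee:ff'. Raises ValueError on malformed input."""
    digits = re.sub(r"[^0-9a-fA-F]", "", mac)
    if len(digits) != 12:
        raise ValueError(f"not a MAC address: {mac!r}")
    digits = digits.lower()
    return ":".join(digits[i:i + 2] for i in range(0, 12, 2))
```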

NOTE: Please backup your database before you do any of the following!

Once you find your device, verify that the MAC address does, in fact, match your device.

Right-click on the ObjectID block. It should look something like this:

In the right click menu you can choose to Delete the document. This will permanently remove the device from your controllers database.

Once you have deleted the document, run your inform command again and your device should populate into your controller like normal!

If you have any questions let me know!

Categories
Amazon Web Services

The Security Specialty Certification

Today I sat the AWS Security Specialty exam. While I didn’t pass, I thought I would provide some commentary on the experience in relation to the training that I sought out to prepare.

I have been a big fan of ACloudGuru. They helped me pass my Solutions Architect exam last year, so naturally I returned to train and learn from them again. Much of the content in this course turned out to be a repeat of what I saw in the Solutions Architect material. I didn’t think much of it because I assumed this was the correct curriculum.

Boy was I wrong.

Upon sitting down at the exam center I used my standard method of test taking: answer the questions you know first, then go back and hammer out the harder ones using your knowledge and the process of elimination.

While Ryan Kroonenburg does a great job of explaining all the features of AWS and how to utilize them in a lab environment, the course misses the actual application level that AWS is asking for in the exam. Now, I’m not saying that Ryan doesn’t know what he is talking about. Quite the contrary. Nor am I blaming my failure on ACloudGuru.

Advice

On top of learning all the content outlined in ACloudGuru or LinuxAcademy or whichever training resource you choose, you really need to seek out real-life applications of these topics.

I will be going back over all the labs in the training material and applying them to my production environments (after testing). I think that this is the only way to truly learn what is needed.

Current Exam Rankings

Hardest to Easiest (based on what I’ve taken):

  1. Security Specialty
  2. Solutions Architect Associate
  3. SysOps Associate

If you have any questions regarding the exams feel free to reach out!

Categories
Amazon Web Services Travel

AWS Summit 2018 – Recap

This was my second year attending Amazon Web Services Summit. Both times I have headed down to Chicago for a few days to network, learn, and get excited about new AWS developments.

This year, the summit was scheduled for only one day. Being that the summit started early in the morning I decided I was going to head down early. By happenstance, I was invited to attend a workshop put on by Dynatrace. 

Dynatrace is a logging and monitoring platform built inside AWS. It integrates with nearly any piece of technology you can think of. For me, monitoring is important for the web servers that I manage for my customers. In this workshop, we learned how to create a continuous development pipeline. Essentially what this means is that we deployed our application which had various staging and production environments that Dynatrace was able to monitor and ensure successful deployments.

After the workshop, Dynatrace hosted a lovely rooftop cocktail party. Thanks again for the invitation!

Quick lookout shot from the rooftop!

The summit began early the next morning. I spent the morning visiting some vendor booths and getting the lay of the land before attending the keynote.

This year’s keynote was centered around the concept of “Builders”. Amazon wants all of its customers to be builders. By that, they mean that they want us to explore and be curious with their platform. When we see a problem, they want us to solve it within Amazon Web Services. While this concept is great fundamentally, I do believe it is catered more towards developers and people who code rather than infrastructure gurus like myself. Nevertheless, I still found the concept compelling in my adventures.

4th row for the keynote!

The day continued with various sessions. I spent a good amount of time working through the business executive track which focuses on migrations and security. 

Large scale migrations – one of my favorite sessions

Overall the summit was good. I did miss the two-day format. By the end, it was a very long day of travel and learning.

If you or someone you know is interested in cloud computing, AWS Summit is a great place to get excited about all the possibilities!