Category: Amazon Web Services

  • Why I built my own WordPress Platform

I’ve been building websites for a long time. I remember learning HTML and Microsoft FrontPage as my first website builder. It was such a fun time to be creating horrific-looking websites back in the early 2000s. As the internet progressed, so did my skills, and back in 2016 I formed my company 45Squared to build websites for small businesses. My whole goal is to be your trusted resource when it comes to being online.

When I started the company I built WordPress websites of various shapes and sizes, but they always ran on AWS. This helped me expand my AWS skills as well as provide robust infrastructure for my clients’ websites to live on. I managed the website and the underlying infrastructure for a small monthly cost that beat the competition. The result: a bunch of paying customers and a decent side hustle.

As time went on, selling became harder and the race to zero on cost was apparent. So, with the AI boom in full swing, I decided it was time to automate the site-building process.

I started documenting how I would want this to work: fully automated website deployments, design, content, custom domains, a good SEO base, and deployed FAST!

Enter https://ai.45sq.net. This platform is fully automated. The customer can provide inputs and descriptions of what they want as well as photos or other graphical content. The workflow takes all of the inputs and builds a fully functional WordPress website hosted on AWS. The user can easily point their own domain to the server and set up automatic payments. They then get full administrative access to their website so they can expand and add features just like any other WordPress site.

    So why did I build this?

If you contact a web designer now, you will have to pay them to build the initial design, work within their timelines, and end up with something that needs revisions, and your time to launch will be measured in weeks, not minutes.

The platform I built for 45Squared eliminates the need for initial design fees and focuses on getting you online quickly. It’s great for small businesses that are just getting started.

    So now when I get a request to build a site I can tell the customer that I have two options. First, fully custom. I’m still willing to sit with you and build out the picture perfect website. Or, two, you can launch your own and I will still support the website and help you with your online presence.

So that’s it. An easy-to-use WordPress website launcher. Running on enterprise-grade cloud. With content, design, layout and all the rest handled by the magic of Claude Opus.

    Try it out: https://ai.45sq.net. No contracts. No weird fees. Get online today.

  • Building in Public – The Automated WordPress Deployment Platform Part 2

A few days ago I wrote about building an automated WordPress deployment platform using Terraform and AI. Well, I’m happy to report that the entire platform is live and ready for you to explore and launch your own WordPress website.

    Introducing 45Squared’s WordPress deployment platform powered by Ubuntu and Claude. Try it out today at https://ai.45sq.net.

    Let’s talk about how this all works.

The front-end infrastructure that an end user sees is pretty straightforward. I am utilizing an ECS cluster and Next.js to deliver the end-user experience. The second portion of the user experience is handled by an AWS API Gateway that manages user credentials, payment processing, and site launch status. Authentication is handled by AWS Cognito. Hate on it all you want, Cognito works just fine when configured correctly.

    Frontend architecture

    Behind the scenes, once a user transaction has completed successfully, the website is provisioned using another ECS task. This container runs through a sequence of steps to provision the AWS EC2 instance for the user to utilize. Each tenant instance is running a hardened Ubuntu image that is built using Packer. I will cover this in another post. Throughout the provisioning process, the task is updating the DynamoDB table so that the user gets a live look into how their website is progressing.
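
    If you’re curious what those live status updates look like, here’s a minimal sketch. The table schema, step names, and percentages below are illustrative assumptions, not the platform’s exact implementation:

```python
import time

# Hypothetical step labels; the real provisioning sequence may differ.
STEPS = ["launching_instance", "configuring_wordpress", "issuing_ssl", "complete"]

def build_status_item(tenant_id: str, step: str) -> dict:
    """Build the DynamoDB item recording a tenant's provisioning progress."""
    if step not in STEPS:
        raise ValueError(f"unknown step: {step}")
    return {
        "tenant_id": tenant_id,  # assumed partition key
        "step": step,
        "percent_complete": int((STEPS.index(step) + 1) / len(STEPS) * 100),
        "updated_at": int(time.time()),
    }

def record_status(table, tenant_id: str, step: str) -> None:
    """Write the status item; `table` is a boto3 DynamoDB Table resource."""
    table.put_item(Item=build_status_item(tenant_id, step))
```

    The frontend can then poll that item through API Gateway to render the live progress view.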

    Provisioning architecture

    Each tenant is given a subdomain as well as the ability to utilize a custom domain name. Each tenant is also given a Cloudfront CDN for global static content distribution. And of course, each tenant receives their own SSL certificate for both their custom domain and their subdomain.

    Each site can be managed by SSM which will eventually be linked into an AI agent for management through Slack or another messaging platform.
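
    A managed-site command over SSM can be sketched like this; the WP-CLI command and targeting are just examples, not the exact commands I run:

```python
def update_command(instance_id: str) -> dict:
    """Build kwargs for ssm.send_command to run a WP-CLI update on a tenant."""
    return {
        "InstanceIds": [instance_id],
        "DocumentName": "AWS-RunShellScript",  # stock AWS-managed SSM document
        "Parameters": {
            # Illustrative command; a real agent would pick these dynamically.
            "commands": ["sudo -u www-data wp plugin update --all --path=/var/www/html"],
        },
    }

# ssm = boto3.client("ssm")
# ssm.send_command(**update_command("i-0123456789abcdef0"))
```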

I don’t intend to use this platform to compete with the large players. 45Squared’s vision has always been to serve the small-to-medium-sized businesses who want personalized support while still receiving an amazing product. This platform gives them the ability to quickly launch a website and get their company on the world wide web within 10 minutes.

If you are interested in building out a website using the platform, the first few users can receive 50% off using code “BETATESTER50”. Redemptions are limited, so be sure to get going quickly!

  • Building In Public – Designing an automated WordPress deployment engine with Terraform

If you didn’t know, I gained a lot of my technical experience by building out WordPress websites. I started a business around it and did quite well. Lately, I’ve been trying to get back into building websites for small businesses. I find that the AI boom has made some of the bigger players too complex, even though they claim to be easy.

My goal is to create a single web page where a user can describe their website and have it auto-provision a full WordPress instance built upon core AWS services. The end result will be a fully managed WordPress instance, backed by AWS’s 99.999% uptime. It will be fully automated and take around 10 minutes to produce a fully developed website.

    Current Architecture Plans:

    I’m utilizing AWS to handle everything (no surprise). Originally I was going to utilize Step Functions as the provisioner but as I started building I ended up hitting too many roadblocks and restrictions from a timing perspective.

When dealing with Bedrock, the response times can vary. So I made the switch to an ECS approach. The control plane/signup page is built there, so the provisioner should also just be another task.

    Features:

    • Custom Domains
    • Automated SSL
    • Load balancers
    • Auto healing (through auto scaling)
    • Monitoring

    Essentially all the standard features you would expect from a web host. Just without the design portion.

    This will be a new ongoing series for you all to read about. If you’re interested in following along subscribe to my mailing list!

  • Cloudwatch Alarm AI Agent

    I think one of the biggest time sucks is getting a vague alert or issue and not having a clue on where to start with troubleshooting.

I covered this in the past when I built an agent that can review your AWS bill and find practical ways to save money within your account. That application wasn’t event-driven but rather a container that you could spin up when you needed a review, or something you could leave running in your environment. If we take the same read-only approach to building an AWS agent, we can have a new event-driven teammate that helps us with our initial troubleshooting.

The process flow is straightforward:

    1. Given a Cloudwatch Alarm
    2. Send a notification to SNS
    3. Subscribe a Lambda function to the topic (this is our teammate)
4. The function utilizes the Amazon Nova Lite model to investigate the contents of the alarm and uses its read-only capabilities to find potential solutions
    5. The agent sends its findings to you on your preferred platform
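
    The steps above can be sketched as a single Lambda handler. Parsing the SNS envelope follows the standard CloudWatch-alarm-to-SNS format; the Bedrock call and Slack delivery are left as comments, since the exact prompt and webhook wiring are up to you:

```python
import json

def parse_alarm(event: dict) -> dict:
    """Extract the useful fields from an SNS-wrapped CloudWatch alarm event."""
    msg = json.loads(event["Records"][0]["Sns"]["Message"])
    return {
        "name": msg["AlarmName"],
        "state": msg["NewStateValue"],
        "reason": msg["NewStateReason"],
    }

def handler(event, context):
    alarm = parse_alarm(event)
    prompt = (
        f"CloudWatch alarm '{alarm['name']}' moved to {alarm['state']}.\n"
        f"Reason: {alarm['reason']}\n"
        "Suggest likely root causes and read-only checks to run next."
    )
    # A bedrock-runtime converse() call with the Nova Lite model would go here,
    # followed by posting the model's findings to a Slack incoming webhook.
    return prompt
```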

    For my environment I primarily utilize Slack for alerting and messaging so I built that integration. Here is an architecture diagram:

    When the alarm triggers we should see a message in Slack like:

    The AI is capable of providing you actionable steps to either find the root cause of the problem or in some cases, present you with steps to solve the problem.

This workflow significantly reduces your troubleshooting time, and by reducing troubleshooting time it reduces your downtime.

    So, if this is something you are interested in deploying I have created a Terraform module so you can quickly deploy it into your own environment to reduce your troubleshooting steps!

    Check it out here: https://aiopscrew.com

    If you have questions feel free to reach out to me at anytime!

  • Fantasy Football and AI – Playoffs Round 2

    Well. It had to end at some point.

I think the AI mostly selected correctly this week. Unfortunately it wasn’t enough. We fell short by about 5 points. Going into Monday night we needed a massive game from George Kittle as the rest of the team performed very poorly. He delivered all the way until the 4th quarter, when he likely twisted his ankle and was done for the game as the 49ers were up two scores.

    Here are the results:

    Josh Allen might have had a foot injury early in the game but stayed in for the entire game. The Bills simply didn’t throw the football. TreVeyon Henderson got absolutely demolished and left the game with a probable concussion. Josh Jacobs was questionable going into the game and cleared to play. The Packers simply didn’t play him.

    With that loss we are eliminated from the playoffs and will be playing next week for 3rd place. Still a decent finish for our first year utilizing AI.

    Looking Ahead

    Through the off season I want to continue to work on the overall architecture of this agent and system. Ideally, I want to have the custom model built for next season and build an API around that to help us make better predictions.

    Other action items:

    1. Find a way to load news stories and story lines for determinations
    2. Manage injuries/waivers better
    3. Handle live NFL standings (teams eliminated from playoffs might play differently than teams fighting for a spot)

    I also would love to be able to expose all of this publicly so that anyone reading can build their own applications around my predictions.

    Stay tuned next week for our final placement!

  • Fantasy Football and AI – Week 13

Sigh… another week, another loss. It was a close one. It turns out people just didn’t really show up to play.

It’s hard to win a game when your high scorer is a defense. There was some light at the end of the Patriots game when Henderson was running down the field. Unfortunately they took him out and then the drive stalled. Had he been able to get a touchdown we could have won. We left some points on the bench as well:

    Zach Ertz had a monster game and many of the other players would have been better than Saquon.

    On to week 14. This is the last week before our playoff run. Here is the current proposed roster:

It’s hard not to start Saquon Barkley. But he’s trending down and I think I agree with the AI here in not selecting him. Marvin Harrison Jr. is questionable again due to his surgery but is expected to play. We grabbed Christian Watson, Marcus Mariota and the Commanders defense for week 14. We dropped J.J. McCarthy due to poor performance and injury. Henderson is on bye this week. Our current bench looks like this:

    What do you think? Do you agree with the AI’s selections for the week?

  • Fantasy Football and AI – Week 12

Well, unfortunately we took a big loss and are now in a three-way tie for first place. Here are the actual results:

I think the biggest hit was how poorly Josh Allen played. What is interesting is that I was reviewing his past performance against Houston and he has had some of the worst outings of his career there. This week was no different… The other interesting thing is that Saquon Barkley just isn’t the same back as he was last year. He is trending down.

In response to Josh Allen’s poor outing, I added deviation and historical performance analysis against an opponent to the data set, so now we have a value like:

    "HOU": {
       "avg_actual": 11.6,
       "avg_deviation": -2.67,
       "avg_deviation_percentage": -18.69,
       "avg_projected": 14.27,
       "by_season": {
        "2024": {
         "avg_actual": 14.6,
         "avg_deviation": 0,
         "avg_deviation_percentage": 0,
         "avg_projected": 0,
         "games_played": 1,
         "games_with_projections": 0
        },
        "2025": {
         "avg_actual": 5.05,
         "avg_deviation": -5.65,
         "avg_deviation_percentage": -52.8,
         "avg_projected": 10.7,
         "games_played": 2,
         "total_actual": 10.1,
         "total_projected": 21.4
        }
       },

I then coded a new tool into the AI to reference this data set and assist with its final calculations.

    So, when we ask the tool about Josh Allen’s performance against a team we get something like:

    Found exactly what you're looking for in Josh Allen's performance data.
    
    Josh Allen vs Kansas City (Week 9, 2025):
    - 28.8 fantasy points - That's a STRONG performance!
    - He was projected for 22.0 points but exceeded by 6.8 points (+30.9%)
    - This was one of his better games this season
    
    Historical Context (2024 vs KC):
    - Week 11, 2024: 24.0 fantasy points vs Kansas City
    
    Bottom Line: Josh Allen has performed WELL against Kansas City in recent matchups. In their two most recent meetings:
    - 2025 (Week 9): 28.8 points
    - 2024 (Week 11): 24.0 points
    
    That's an average of 26.4 fantasy points against KC - well above his season average. Allen seems to rise to the occasion against elite competition like Kansas City. The Chiefs' defense can be vulnerable to mobile QBs who can extend plays, and Allen's dual-threat ability has historically given them problems.
    
    The takeaway: When Allen faces KC, he typically delivers QB1 numbers. That 28.8-point performance this season shows he can still light up even the defending champs!

I still need to backfill some more historical data, but that is a problem for a different day. Looking forward, I hope to build my own model based on this data and set up an automated workflow that will include data ingestion and automated model building so that I can consistently generate predictions.

So, on to week 13. We have a bunch of injuries to contend with this week, so this lineup will likely change once injury reports come in. Here is the current lineup:

    In other news I built and launched https://gridirondata.com which is an API to reference all of the data I have collected so far.

Unfortunately, it’s not free. But if you message me about it I’ll probably hook you up!

  • Building jsontotoon.io: A Free Tool to Cut Your LLM API Costs

    If you’re working with LLM APIs, you’re probably watching your token counts like a hawk. Every JSON object you send to Claude, GPT-4, or Gemini costs tokens, and those curly braces and quotes add up fast. I built https://jsontotoon.io to solve this exact problem—and it’s completely free to use.

    The Problem: JSON is Token-Inefficient

    Here’s the thing: JSON is fantastic for machine-to-machine communication. It’s ubiquitous, well-supported, and everyone knows how to work with it. But when you’re paying per token to send data to an LLM? It’s wasteful.

    Look at a simple example:

    [
      {"name": "Alice", "age": 30, "city": "NYC"},
      {"name": "Bob", "age": 25, "city": "LA"},
      {"name": "Carol", "age": 35, "city": "Chicago"}
    ]

    That’s 125 tokens. All those quotes, braces, and commas? The LLM doesn’t need them to understand the structure. You’re literally paying to send redundant syntax.

    Enter TOON Format

    TOON (Token-Oriented Object Notation) converts that same data to:

    name, age, city
    Alice, 30, NYC
    Bob, 25, LA
    Carol, 35, Chicago

    68 tokens. That’s a 46% reduction. The same information, fully reversible back to JSON, but nearly half the cost.

    I realize this sounds too good to be true, but the math checks out. I tested it across real-world datasets—API responses, database dumps, RAG context—and consistently saw 35-45% token reduction. Your mileage will vary depending on data structure, but the savings are real.
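
    Here’s a minimal sketch of the tabular conversion shown above, assuming a uniform array of flat objects; the production converter and the TOON spec handle nesting, quoting, and edge cases beyond this:

```python
def to_toon_table(rows: list[dict]) -> str:
    """Convert a uniform list of flat JSON objects into the tabular form above."""
    if not rows:
        return ""
    header = list(rows[0])  # field order taken from the first object
    lines = [", ".join(header)]
    for row in rows:
        lines.append(", ".join(str(row[key]) for key in header))
    return "\n".join(lines)
```

    Running this on the JSON example from earlier reproduces the four-line table shown above.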

    How I Built It

    The backend is straightforward Python running on AWS Lambda. The TOON parser itself is deterministic—same JSON always produces the same TOON output, and round-trip conversion is lossless. No data gets mangled, no weird edge cases (well, I fixed those during testing).

    Infrastructure-wise:

• CloudFront + S3 for the static frontend
• API Gateway + Lambda for the conversion endpoint
• DynamoDB for API key storage (with email verification via SES)
• WAF with rate limiting to prevent abuse (10 requests per 5 minutes on API endpoints)
• CloudWatch dashboards for monitoring

    The whole setup costs me about $8-15/month in AWS fees, mostly for WAF. The conversion itself is so fast (< 100ms average) and cheap that I can offer unlimited free API keys without worrying about runaway costs.

    Real Use Cases

    I built this because I was spending way too much on Claude API calls for my fantasy football AI agent project. Every week I send player stats, injury reports, and matchup data in prompts. Converting to TOON saved me about 38% on tokens—which adds up when you’re making hundreds of calls per week.

    But the use cases go beyond my specific problem:

• RAG systems: Fit more context documents in your prompts without hitting limits
• Data analysis agents: Send larger datasets for analysis at lower cost
• Few-shot learning: Include more examples without token bloat
• Structured outputs: LLMs can generate TOON that’s easier to parse than JSON

    Try It Yourself

    The web interface at https://jsontotoon.io is free to use—no signup required. Just paste your JSON, get TOON. If you want to integrate it into your application, grab a free API key (also no cost, no expiration).

    Full API docs are available at https://jsontotoon.io/docs.html, with code examples in Python, JavaScript, Go, and cURL.

  • AI and Fantasy Football – Week 11

Wow. Week 11 was filled with injuries. Josh Jacobs went down early with a knee injury. Aaron Rodgers went out with a wrist injury, but it all started off with an epic performance by TreVeyon Henderson putting up 32.3 points. The end result of week 11? ANOTHER VICTORY FOR AI! The team is now in 1st place. With all those injuries you might be wondering how we pulled off another victory. Well, here are the final scores for the week:

    Josh Allen came through massively with a 51 point game. Riley Patterson put up a few good kicks over in Madrid and George Kittle had a great game as well.

    Looking forward to week 12, we will have to battle some injuries but I think the depth chart should be able to sustain the blows. Here is the current proposed lineup:

So, tech and data stuff. I added deviations into the data set, so now we can see the difference between a player’s projection and their actual score. This will help the AI determine how a player is performing. This is being structured on a per-season, per-week basis as well as historically against an opponent. Next year this data will be valuable when looking at future matchups and draft choices.

Next, I’m also working on launching an API for this entire project so that you can access the data and utilize it for your own applications. I hope to have a working beta of this by the end of the week! If you are interested in utilizing it, feel free to message me. I’m sure a few of you can receive some free keys once it’s ready! I’ll have a separate post about the API when it launches.

  • Fantasy Football and AI – Week 8

BIG WIN this week. All but two of the players that the AI picked this week met or exceeded their projections. We scored 190 points for week 8. Here are the results:

Unfortunately I haven’t had any time to put into building out an MCP server, but the data cleanup has definitely improved the overall application, and I think I am in a good place to have this set up for future use (next year!). Currently the team is in 3rd place of 8. All the teams in the league make it to the playoffs so we still have a long way to go!

For week 9 we have some big players on BYE, so the AI will have to handle that. Here is the current tentative lineup:

I definitely think it will need to step up with some waiver finds at running back! Tune in next week for results!