Tag: ec2

  • The state of SEOScoreAPI – An OpenClaw Project

    If you remember, I did an experiment with OpenClaw: I gave it a single EC2 instance and let it build anything it wanted. It built https://seoscoreapi.com, a tool for checking the state of your website's SEO.

    Initially I thought I would scrap the project and just let the domain expire. But after OpenClaw had spent over $200, I started doing some manual promotion and got the first paid user! Since then I've put in a few hours a day promoting it and adding features.

    What’s New

    The last few months have been the most productive stretch since launch. Here’s what shipped:

    ADA Accessibility Audits

    This one came from watching the news. ADA website lawsuits hit over 4,000 in 2025 and they’re still climbing. Small businesses, e-commerce sites, local restaurants — everyone’s a target.

    We built a full WCAG 2.1 AA compliance endpoint that injects axe-core (the industry-standard accessibility engine) into a headless browser and scans the rendered page. It returns a compliance score, a lawsuit risk assessment, category breakdowns across 10 areas (color contrast, forms, keyboard navigation, ARIA, etc.), and specific fix suggestions for every violation.
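    To make the scoring idea concrete, here is a minimal sketch of deducting points from axe-core results. The `violations` shape (`id`, `impact`, `nodes`) is axe-core's real output format, but the weighting is a hypothetical stand-in, not SEOScoreAPI's actual scoring.

```python
# Hypothetical scoring over axe-core's results shape. axe-core reports
# each violation as {"id", "impact", "nodes": [...]}; the weights here
# are illustrative, not SEOScoreAPI's real formula.
IMPACT_WEIGHTS = {"minor": 1, "moderate": 3, "serious": 7, "critical": 10}

def compliance_score(axe_results):
    """Start at 100 and deduct per violating node, weighted by impact."""
    score = 100
    for violation in axe_results.get("violations", []):
        weight = IMPACT_WEIGHTS.get(violation.get("impact"), 1)
        score -= weight * len(violation.get("nodes", []))
    return max(score, 0)

if __name__ == "__main__":
    results = {
        "violations": [
            {"id": "color-contrast", "impact": "serious",
             "nodes": [{"html": "<p class='muted'>low contrast</p>"}]},
            {"id": "link-in-text-block", "impact": "serious",
             "nodes": [{"html": "<a href='/docs'>docs</a>"}]},
        ]
    }
    print(compliance_score(results))  # 100 - 7 - 7 = 86
```

    A clean page with no violations stays at 100, which matches the "fix everything, score 100" experience described above.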

    We ran it on our own site first. Scored a 71. Found contrast issues, links that weren’t distinguishable from body text, a misused aside element. Fixed everything. Now we score 100. That’s the point — even developers who care about accessibility miss things.

    Available on all paid plans. Starter gets 5/month, Ultra gets 500.

    GEO (Generative Engine Optimization)

    Traditional SEO gets you into Google. GEO gets you into ChatGPT, Claude, Perplexity, and every RAG pipeline pulling from the web.

    The GEO audit checks 26 factors across four categories: crawl accessibility, structural markup, content extractability, and AI discoverability. It answers questions like: Do you have an llms.txt file? Is your content chunked in a way that RAG systems can ingest? Do you have freshness signals? Are AI crawlers even allowed in your robots.txt?
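    One of those checks is easy to sketch with the standard library: whether robots.txt blocks AI crawlers. The user-agent names below are real (OpenAI's GPTBot, Anthropic's ClaudeBot, Perplexity's PerplexityBot); the check itself is a simplified stand-in for the audit's version.

```python
# A sketch of one GEO check: does robots.txt allow AI crawlers?
from urllib.robotparser import RobotFileParser

AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot"]

def ai_crawler_access(robots_txt):
    """Return {crawler_name: allowed_to_fetch_root} for each AI bot."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {bot: parser.can_fetch(bot, "/") for bot in AI_CRAWLERS}

if __name__ == "__main__":
    robots = (
        "User-agent: GPTBot\n"
        "Disallow: /\n"
        "\n"
        "User-agent: *\n"
        "Allow: /\n"
    )
    # GPTBot is explicitly blocked; the others fall through to the
    # wildcard rule and are allowed.
    print(ai_crawler_access(robots))
```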

    This is becoming more relevant every month. Traffic from AI systems is growing and most sites aren’t optimized for it at all.

    Competitive Audits

    You can now audit your site against competitors in a single request. The response shows a side-by-side comparison with score differentials, category-by-category breakdowns, and which specific checks you’re winning or losing on. Useful for agencies pitching prospects and for anyone doing competitive analysis.
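    The differential logic is simple to illustrate. The category names and response shape below are hypothetical, not SEOScoreAPI's actual format; this just shows the side-by-side idea.

```python
# Hypothetical competitive-audit differential: positive numbers mean
# you're winning that category, negative means the competitor is.
def compare_audits(mine, theirs):
    return {
        category: mine[category] - theirs.get(category, 0)
        for category in mine
    }

if __name__ == "__main__":
    mine = {"performance": 88, "content": 74, "technical": 91}
    competitor = {"performance": 92, "content": 61, "technical": 90}
    print(compare_audits(mine, competitor))
    # {'performance': -4, 'content': 13, 'technical': 1}
```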

    What’s Next

    Honestly, I'm not sure. I'd love to get more people using the service. Possibly some more integrations, or a WordPress plugin?

    Try It

    If you’ve read this far, go audit your site: seoscoreapi.com

    The demo on the homepage doesn’t require a signup. Type in your URL, see your score, read the priorities list. If you want API access, the free tier takes 30 seconds to set up — just an email and a verification code.

    If you’re a developer, the docs have everything. Python and Node.js SDKs are on PyPI and npm. The GitHub Action is at SeoScoreAPI/seo-audit-action.

    If you have questions or feedback, I’m at aaron@seoscoreapi.com.

  • Updating AWS Managed Prefix Lists

    I was working with a customer the other day, trying to come up with a way to import a bunch of IP addresses into an allow list on AWS. We came up with the approach of using Managed Prefix Lists in VPC. I wrote some Python to fetch the IP addresses from an API and automatically add them to a prefix list.

    The code takes input from an API managed by a third party. It first parses the returned values into usable lists, then passes each IP to a function that checks whether the entry already exists in the prefix list. If it does, the IP is skipped; if it doesn't, it is added automatically.

    import json
    
    import boto3
    import requests
    from botocore.exceptions import ClientError
    
    REGION = "us-west-2"
    
    def check_for_existing(list_id, cidr):
        """Return True if the CIDR is already present in the prefix list."""
        client = boto3.client("ec2", region_name=REGION)
        try:
            # Paginate so lists with more than 100 entries are fully checked.
            paginator = client.get_paginator("get_managed_prefix_list_entries")
            for page in paginator.paginate(PrefixListId=list_id):
                if any(entry['Cidr'] == cidr for entry in page['Entries']):
                    return True
            return False
        except ClientError as e:
            print(e)
            return False
    
    def get_prefix_list_id(list_name):
        """Look up a prefix list by name; return its ID and current version."""
        client = boto3.client("ec2", region_name=REGION)
        response = client.describe_managed_prefix_lists(
            MaxResults=100,
            Filters=[
                {
                    "Name": "prefix-list-name",
                    "Values": [list_name]
                }
            ]
        )
        for p_list in response['PrefixLists']:
            return {"ID": p_list['PrefixListId'], "VERSION": p_list['Version']}
        return None
    
    def update_managed_prefix_list(list_name, cidr):
        client = boto3.client("ec2", region_name=REGION)
        # Look the list up once so the ID and version come from the same
        # snapshot; CurrentVersion must match or the modify call is rejected.
        prefix_list = get_prefix_list_id(list_name)
        if prefix_list is None:
            print(f"No prefix list named {list_name}")
            return False
        if check_for_existing(prefix_list['ID'], cidr):
            print("Rule already exists")
            return False
        try:
            client.modify_managed_prefix_list(
                PrefixListId=prefix_list['ID'],
                CurrentVersion=prefix_list['VERSION'],
                AddEntries=[
                    {
                        "Cidr": cidr
                    }
                ]
            )
            return True
        except ClientError as e:
            print(e)
            print("Failed to update list")
            return False
    
    if __name__ == "__main__":
        url = "https://<my IP address URL>"
        headers = {}
        r = requests.get(url, headers=headers)
        json_ips = json.loads(r.content)
        ip = ""         # CIDR to add, parsed from json_ips
        list_name = ""  # name of the managed prefix list
        if update_managed_prefix_list(list_name, ip):
            print("Successfully updated list")
        else:
            print("Failed to update list")

    If you are going to use this code, it will need some modifications for your environment (the API URL, list name, and parsing logic). I ultimately did not deploy it, but I had planned to run it as a Lambda function on a schedule so the lists would always stay up to date.
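    One modification worth making up front: modify_managed_prefix_list expects CIDR strings, and a third-party feed will often hand back bare IPs. The standard-library ipaddress module can normalize either form and reject garbage early; a small helper for that might look like this.

```python
# modify_managed_prefix_list expects CIDR notation, but third-party
# feeds often return bare IPs. ip_network() normalizes either form
# (a bare IPv4 address becomes a /32) and raises on malformed input.
import ipaddress

def to_cidr(raw):
    try:
        return str(ipaddress.ip_network(raw, strict=False))
    except ValueError:
        return None  # skip malformed entries from the feed

if __name__ == "__main__":
    print(to_cidr("203.0.113.5"))      # 203.0.113.5/32
    print(to_cidr("198.51.100.0/24"))  # 198.51.100.0/24
    print(to_cidr("not-an-ip"))        # None
```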

    If this code is helpful to you please share it with your friends!

    GitHub

  • EC2 Reservation Notification

    I realized today that I haven't updated my EC2 reservations recently. When I wondered why, I realized there was nothing notifying me that the reservations were expiring. I spent the day putting together a script that looks through my reservations, checks their expiration dates, and notifies me when one is nearing my threshold of 3 weeks.

    I put this together as a local script, but it can also be adapted to run as a Lambda function, which is how I have it set up. As always, you can view my code below and on GitHub.

    import json
    import os
    from datetime import datetime, timezone, timedelta
    
    import boto3
    from botocore.exceptions import ClientError
    
    ec2_client = boto3.client("ec2", region_name="us-west-2")
    
    def get_reserved_instances():
        """Map each reserved instance ID to its expiration date and type."""
        response = ec2_client.describe_reserved_instances()
        reserved_instances = {}
        for reservation in response['ReservedInstances']:
            reserved_instances[reservation['ReservedInstancesId']] = {
                "ExpireDate": reservation['End'],
                "Type": reservation['InstanceType']
            }
        return reserved_instances
    
    def determine_expiry(expiry_date):
        """True if the reservation expires between 21 and 22 days from now.
    
        The one-day window assumes the script runs once a day, so each
        reservation triggers exactly one notification.
        """
        remaining = expiry_date - datetime.now(timezone.utc)
        return timedelta(days=21) <= remaining < timedelta(days=22)
    
    # Send result to SNS
    def send_to_sns(message):
        sns = boto3.client('sns')
        try:
            return sns.publish(
                TargetArn=os.environ['SNS_TOPIC'],
                Subject='EC2-Reservation',
                Message=message,
            )
        except ClientError as e:
            print("Failed to send message to SNS")
            print(e)
    
    if __name__ == "__main__":
        for reservation, res_details in get_reserved_instances().items():
            if determine_expiry(res_details['ExpireDate']):
                sns_message = {
                    "reservation": reservation,
                    "expires": res_details['ExpireDate'].strftime("%m/%d/%Y, %H:%M:%S")
                }
                send_to_sns(json.dumps(sns_message))

    I have an SNS topic set up to forward messages to a Lambda function in the backend, so I can format the messages and send them to a Slack channel for notifications.
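    That formatting Lambda isn't shown here, but the core of it is simple: unpack the JSON published above from the SNS event and build a Slack payload. This is a minimal sketch; the message layout (and the `:warning:` text) is my own choice, not a fixed format.

```python
import json

def format_slack_message(sns_record):
    """Turn the JSON published to SNS above into a Slack webhook payload.

    `sns_record` is one record from the Lambda's SNS event, where the
    original message string lives at record["Sns"]["Message"].
    """
    body = json.loads(sns_record["Sns"]["Message"])
    return {
        "text": (
            f":warning: EC2 reservation {body['reservation']} "
            f"expires {body['expires']} (within 3 weeks)"
        )
    }

if __name__ == "__main__":
    record = {"Sns": {"Message": json.dumps(
        {"reservation": "abc-123", "expires": "07/01/2025, 00:00:00"})}}
    print(format_slack_message(record)["text"])
```

    The returned dict can then be POSTed to a Slack incoming-webhook URL to deliver the notification.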

    If you have any questions, feel free to comment or message me on Twitter!

    GitHub