Portable Power Pricing (or how I built something with AI)

In my recent blog post Surviving the Great PSPS of 2025, I outlined my interest in portable solar generators, and I've been looking for a side project to test out the latest AI capabilities. I found the answer in PortablePowerPricing.com.

I was recently looking to buy an additional battery for my EcoFlow Delta 2 Max, but since these products are nearly always sold at a significant discount to their MSRP, I wasn't really sure how good a deal it was. There are existing price trackers, but there was another interesting aspect here: I wasn't just looking for the best price, but for the best price per watt-hour (Wh). That's where I saw an opportunity to build a price comparison site specifically for this product category.
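To make the metric concrete, here's a minimal sketch of the $/Wh calculation the site is built around. The prices and type names below are hypothetical examples, not actual listings from the site.

```typescript
// Price per watt-hour: the metric the site ranks deals by.
// The numbers below are hypothetical, not real listings.
interface Listing {
  name: string;
  priceUsd: number;   // current sale price
  capacityWh: number; // rated capacity in watt-hours
}

function pricePerWh(listing: Listing): number {
  return listing.priceUsd / listing.capacityWh;
}

const listings: Listing[] = [
  { name: "EcoFlow Delta 2 Max (2048 Wh)", priceUsd: 1099, capacityWh: 2048 },
  { name: "Hypothetical 1 kWh unit", priceUsd: 649, capacityWh: 1024 },
];

for (const l of listings) {
  console.log(`${l.name}: $${pricePerWh(l).toFixed(2)}/Wh`);
}
```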

I primarily develop in VS Code using devcontainers, so that's what I did for this project. I ended up with two projects: one to crawl the sites for prices, and one for the public website, each in its own devcontainer and git repo. I used Claude Code (Opus 4.5 and 4.6) via the VS Code plugin on the Claude Code Max (5x, $100/month) subscription, and I only ran out of quota once during the project.

Throughout this project, I didn't write any code or build any config files myself. I made some very minor edits by hand on occasion, but Claude Code wrote almost every line of every file in both projects.

I won't go through every step in the process, but I did want to highlight a few experiences and learnings...

Price Discovery

Clearly, a key aspect of this project is capturing the prices for each product and tracking their changes over time. I started by asking Claude about potential tools for this, and it ended up recommending PriceGhost, a pretty new project that was also built almost entirely with AI. I liked the pitch that it could use AI to figure out the right price instead of leveraging the 'old school' approach of parsing ever-changing HTML pages.

It worked fine for a little while, but as I added more sites and more pages I ran into issues. One was that it's a new project and the code isn't really battle-tested. I had to fork the project and make some changes (I submitted a PR) to get it to work for pages where the title was too long, and for others it just wouldn't work at all. I also found that the AI aspect wasn't really effective: I was running a local model through Ollama, and it would not reliably find the right price on the page.

I went back to Claude and discussed the challenges, and we ended up settling on ChangeDetection.io. This is a more established and traditional tool, but Claude pointed out that most (all?) of the sites I would be scraping use Shopify, so we could often just pull the JSON file for each product and get the price data without parsing HTML.
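For anyone unfamiliar with the trick: Shopify stores generally expose product data at the product URL with ".json" appended. Here's a rough sketch of what that lookup looks like; the store URL is a placeholder, and the exact JSON shape can vary by store, so treat this as illustrative rather than the site's actual code.

```typescript
// Sketch: pull a price from a Shopify store's product JSON endpoint
// instead of parsing HTML. The URL below is a placeholder.
interface ShopifyVariant {
  title: string;
  price: string; // Shopify typically returns prices as strings, e.g. "1099.00"
}

interface ShopifyProductResponse {
  product: {
    title: string;
    variants: ShopifyVariant[];
  };
}

async function fetchShopifyPrice(productUrl: string): Promise<number> {
  // Appending ".json" to a Shopify product URL returns structured product data.
  const res = await fetch(`${productUrl}.json`);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const data = (await res.json()) as ShopifyProductResponse;
  return parseFloat(data.product.variants[0].price);
}

// Usage (placeholder URL):
// const price = await fetchShopifyPrice("https://example-store.com/products/some-battery");
```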

But the real magic here was having Claude write a script that would automatically set up the watches for me. It queried my production database, found the URL of each product, used the ChangeDetection API to create a watch, and then updated the production database with the watch's ID so I could sync the prices.
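The actual script isn't in this post, but the glue looks roughly like the sketch below. It assumes a Supabase "products" table with "id", "url", and "watch_id" columns (those names are my guesses) and uses ChangeDetection.io's create-watch API endpoint; the response handling is an assumption as well.

```typescript
// Sketch of the watch-setup script. Table/column names and the response
// shape are assumptions, not the project's actual code.
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_SERVICE_KEY!);
const CD_BASE = process.env.CHANGEDETECTION_URL!; // e.g. http://localhost:5000
const CD_KEY = process.env.CHANGEDETECTION_API_KEY!;

async function createWatches() {
  // 1. Pull every product URL from the production database.
  const { data: products, error } = await supabase.from("products").select("id, url");
  if (error) throw error;

  for (const product of products ?? []) {
    // 2. Create a watch for the URL via the ChangeDetection.io API.
    const res = await fetch(`${CD_BASE}/api/v1/watch`, {
      method: "POST",
      headers: { "x-api-key": CD_KEY, "Content-Type": "application/json" },
      body: JSON.stringify({ url: product.url, tag: "portable-power" }),
    });
    const { uuid } = await res.json(); // assumes the API returns the new watch's uuid

    // 3. Store the watch ID back on the product so prices can be synced later.
    await supabase.from("products").update({ watch_id: uuid }).eq("id", product.id);
  }
}

createWatches().catch(console.error);
```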

This is the kind of script I could always have written myself, but Claude Code did it in ~30 seconds and could make any changes needed just as quickly.

But the real long-term question is how difficult the price-scraping functionality will be to maintain. As a non-revenue-generating side project, I will only have so much patience, and it isn't quite so automated that I can just ignore it. That was the promise of PriceGhost, but it isn't really there yet.

Product Discovery

The most surprising discovery in this project was using Claude to build up the list of brands, products, and product relationships (which expansion batteries worked with which base stations). This is where the whole project would have fallen apart without AI. Realistically, I would not have spent the time to collect and format this information. It would have taken quite a few hours, and it is boring, mostly mindless work that is not justified for a side project.
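For context, the data being assembled is roughly the shape sketched below. These type and field names are illustrative guesses, not the site's actual schema.

```typescript
// Rough shape of the catalog data Claude was asked to research and fill in.
// All names here are illustrative, not the site's real schema.
interface Brand {
  id: string;
  name: string; // e.g. "EcoFlow"
}

interface Product {
  id: string;
  brandId: string;
  name: string; // e.g. "Delta 2 Max"
  kind: "base_station" | "expansion_battery";
  capacityWh: number;
}

// Which expansion batteries work with which base stations.
interface Compatibility {
  baseStationId: string;
  expansionBatteryId: string;
}
```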

However, it was not perfect. I did discover a few minor mistakes it made, but it was able to self-correct and do additional research instead of me having to go fix them manually.

But this is where I really burned the tokens. Having Claude research websites and generate the product listings was very token-intensive, and initially I would only add one brand per coding session. It often took >25% of my quota to research a single brand. In hindsight, I wonder if I should have used Sonnet for these tasks.

The one time I did run out of tokens was when I had it go back through all of the brands and fix any mistakes it had made. It finished the job, but it burned through the full quota.

Coding

Writing the code for the project was really the least surprising and interesting part. It is a pretty vanilla Next.js app that uses Server Side Rendering (SSR) to query the data and render the content. I had no real issues getting it to build the pages I wanted, and each feature I added pretty much worked without issue.
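To give a feel for how vanilla it is, a server-rendered page in this kind of app looks roughly like the sketch below. The file path, table, and column names are assumptions on my part, not the site's actual code.

```tsx
// app/products/page.tsx -- sketch of a server-rendered product list.
// Table and column names here are guesses, used only for illustration.
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_ANON_KEY!);

export default async function ProductsPage() {
  // Runs on the server at render time; the browser just receives HTML.
  const { data: products } = await supabase
    .from("products")
    .select("name, price_usd, capacity_wh")
    .order("name");

  return (
    <table>
      <tbody>
        {(products ?? []).map((p) => (
          <tr key={p.name}>
            <td>{p.name}</td>
            <td>${p.price_usd}</td>
            <td>${(p.price_usd / p.capacity_wh).toFixed(2)}/Wh</td>
          </tr>
        ))}
      </tbody>
    </table>
  );
}
```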

While I expected this to be the largest section of learnings and revelations, it was really just straightforward. I used Plan Mode to outline and iterate on each feature I wanted to add, and once the plan was right, Claude Code just built it out cleanly.

I could have used projects like Spec Kit to manage the requirements and features, and I'd like to explore something like that for a future project, but for the scope of this project it was unnecessary.

Deployment

As a side project, I didn't want to spend a lot of money hosting this, but I did want to put it out in the world, so I asked Claude about the best options for 'free' plans and settled on Supabase and Vercel. Claude walked me through the whole process of setting up the accounts and deploying the projects. Going from running locally to fully deployed in the cloud took less than 30 minutes.

I did pony up some cash for a real domain, in case it actually becomes something.

Conclusion

While I embarked on this project to get more hands-on experience using Claude Code to write custom software, the biggest surprise was how useful it was for automating data collection and data entry tasks. Those are the kinds of things that cause a project like this to be abandoned, because the 'fun part' is over and there is just 'work' left. Instead, the project remained interesting and enjoyable all the way through.

That said, I was still pretty impressed with its ability to write code. For greenfield side projects like this, I really don't see myself hand-writing code anymore.

I also discovered another similar project called WhichWatts, which seems like a more established and more feature-complete version. It has also gone through the effort to commercialize via affiliate links. That said, I do think my addition of bundle comparisons (base station + additional battery combinations) makes it easier to compare a target capacity by $/Wh.

That brings up an interesting final point. I think the cost to build a site like this is going to drop dramatically, but the effort to maintain and commercialize it will still exist. I expect we'll see a lot of cool software get built and then abandoned because it will be too niche to justify the ongoing effort. That's always been true to a degree; I think it will just accelerate.

Note: This post was written by a human, with editing feedback from Claude.