Before You Start
Open pota-ai-assistant-live.html in a browser before you begin. Click your callsign badge and set your callsign and park reference. Verify the live POTA spots are loading (green POTA LIVE indicator in header). If the API is unavailable, the app falls back to demo data automatically. The AI Assistant tab requires internet access to reach the Anthropic API.
AI & Ham Radio (LIVE) Documentation
Here we go — this is the live application. Let me orient you.
The first thing you’ll notice at the top is the callsign badge. That’s configurable — if you click it, a settings panel opens where you can enter your own callsign, your primary park reference, your grid square, and preferred mode. Everything in the application is then personalized to your station.
Let me set mine: [enter callsign] and now you can see it’s updated throughout the interface.
The Dashboard tab is your mission control view. In the top row we have four key stats: the number of active POTA spots right now, how many QSOs you’ve logged in this session, current solar conditions — SFI, A-index, and K-index — pulled live from the HamQSL solar data feed, and your parks-hunted count.
Below that on the left you have the AI Insights panel. This is where the system synthesizes the solar data and current spot activity into actionable recommendations. It’s telling us what bands look best and why — not just raw numbers, but what they mean for your activation.
On the right is the live spots panel — real activators, real frequencies. That data comes directly from the POTA API and refreshes automatically.
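The app's source isn't shown in this talk, but the refresh loop it describes could be sketched like this — a hypothetical example that assumes the public POTA spots endpoint and field names like `activator` and `spotTime`, and dedupes to the newest spot per activator before rendering:

```javascript
// Hypothetical sketch: refresh live POTA spots and keep only the
// newest spot per activator (endpoint and field names assumed).
const POTA_SPOTS_URL = "https://api.pota.app/spot/activator"; // assumed endpoint

function latestSpotPerActivator(spots) {
  const byCall = new Map();
  for (const spot of spots) {
    const prev = byCall.get(spot.activator);
    if (!prev || new Date(spot.spotTime) > new Date(prev.spotTime)) {
      byCall.set(spot.activator, spot);
    }
  }
  return [...byCall.values()];
}

// In the app this would run on a timer, e.g.:
// setInterval(async () => {
//   const spots = await (await fetch(POTA_SPOTS_URL)).json();
//   renderSpots(latestSpotPerActivator(spots));  // renderSpots is hypothetical
// }, 60_000);
```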
This is your “walk in the door and know immediately what’s happening” view.
Live Spots & Band Activity
(Live Spots tab)
Moving to the Live Spots tab.
On the left you have the spot map — a visual representation of where active POTA stations are operating right now, color-coded by band. Blue dots are 20m, orange is 40m, green is 17m. You can filter the map by band using the buttons at the top.
Below the map is the band activity chart, which tells you at a glance where the action is. If 20m has 18 spots and 40m has 7, that tells you something about where other activators are having success — and where the hunters are listening.
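Under the hood, a chart like that is just a tally of spots per band. A minimal sketch, assuming each spot record carries a `band` field:

```javascript
// Sketch: tally live spots per band for the band-activity chart.
// Assumes each spot object has a `band` property like "20m".
function bandCounts(spots) {
  const counts = {};
  for (const { band } of spots) {
    counts[band] = (counts[band] || 0) + 1;
  }
  return counts;
}
```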
On the right is the full spot table with the time, callsign, park reference, park name, frequency, band, mode, and any comment the spotter added.
What makes this interesting for hunters is that you can scan this list looking for parks you haven’t worked yet. If I see K-6782, Zion National Park, on 17m SSB right now — and I’ve never worked Zion — I know exactly where to tune. That contact could count toward my 500-parks award.
For activators, watching this list tells you what bands are producing. If you see four of your local parks, all on 20m SSB and none on 17m, that might mean 17m is quiet — or it might mean it’s wide open and nobody’s tried it yet.
QSO Logger & ADIF Export
(Live Logger tab)
The Logger tab is where you track your contacts during an activation.
The entry form is streamlined for speed. You type a callsign and press Enter — focus automatically moves to the frequency field. Type the frequency and press Enter again — QSO logged. That’s just two Enter presses per contact beyond the data itself, once you’ve set your band and mode.
Here’s something clever: as you type a callsign, the system checks it against the live POTA spots. If that callsign is currently spotted as an activator, it immediately tells you — their park reference, their frequency, and their mode. So, if you’re a hunter and you just heard someone calling CQ, you can verify they’re in the POTA system without ever leaving your logging screen.
Watch what happens as I log contacts — the QSO counter updates in real time, and the system tracks your progress toward a valid activation. When you hit ten QSOs, it confirms your activation will count.
Below the log is the ADIF preview. Every QSO is automatically formatted in standard ADIF, the file format that POTA.app accepts for log uploads. When you’re done, you click Export and it downloads a properly formatted file with your callsign, park reference, dates, and times all filled in correctly.
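ADIF itself is a simple tagged format: each field is written as `<NAME:length>value`, and a record ends with `<EOR>`. A minimal formatting sketch, using the MY_SIG/MY_SIG_INFO fields commonly used for POTA logs (the app's actual field set may differ):

```javascript
// Sketch of ADIF record formatting: <NAME:length>value per field, <EOR> to end.
function adifField(name, value) {
  const v = String(value);
  return `<${name}:${v.length}>${v}`;
}

function qsoToAdif(qso) {
  return [
    adifField("CALL", qso.call),
    adifField("QSO_DATE", qso.date),   // YYYYMMDD
    adifField("TIME_ON", qso.time),    // HHMM, UTC
    adifField("BAND", qso.band),
    adifField("MODE", qso.mode),
    adifField("MY_SIG", "POTA"),       // marks this as a POTA log
    adifField("MY_SIG_INFO", qso.myPark),
    "<EOR>",
  ].join(" ");
}
```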
The N-fer splitter button is for multi-park activations — if you tagged contacts with multiple park references, it splits the log and generates separate ADIF files for each park automatically.
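The splitting logic is straightforward to picture: each QSO tagged with multiple park references gets copied into every park's log. A sketch, assuming a comma-separated `myParks` tag per QSO:

```javascript
// Sketch: split an N-fer log into one QSO list per park reference.
// A QSO tagged "K-1234,K-5678" lands in both parks' logs.
function splitByPark(qsos) {
  const perPark = {};
  for (const qso of qsos) {
    for (const park of qso.myParks.split(",").map(p => p.trim())) {
      (perPark[park] = perPark[park] || []).push(qso);
    }
  }
  return perPark;
}
```

Each list then goes through the same ADIF export path as a single-park log.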
Propagation Analysis
(Live Propagation tab)
The Propagation tab brings together solar data and AI analysis to help you make smarter band decisions.
At the top you have the three numbers every ham cares about: SFI, K-index, and A-index — pulled live from the HamQSL data feed. But rather than leaving you to interpret those numbers yourself, the AI does it for you.
The band matrix shows every amateur band with a score from 1 to 10 based on current conditions. Green cells are excellent, blue is good, yellow is fair, and red is poor. You can see at a glance that 20m and 17m are the sweet spot right now, while the low bands are marginal during the day.
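A matrix like that can come from a simple heuristic. This is an illustration only, not the app's actual formula: higher solar flux helps the high bands, and a rising K-index degrades everything, clamped to the 1–10 scale:

```javascript
// Illustrative heuristic only -- not the app's real scoring formula.
// Higher SFI favors the high bands; a high K-index degrades all bands.
function bandScore(band, sfi, kIndex) {
  const base = { "80m": 4, "40m": 5, "20m": 6, "17m": 5, "15m": 4, "10m": 3 }[band] ?? 3;
  const highBand = ["20m", "17m", "15m", "10m"].includes(band);
  const sfiBoost = highBand ? (sfi - 70) / 30 : 0; // flux mostly helps high bands
  const score = base + sfiBoost - kIndex;
  return Math.max(1, Math.min(10, Math.round(score))); // clamp to 1..10
}
```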
Below that is the gray line map — the terminator line between day and night, which represents a zone of excellent propagation for long-distance contacts. If you’re near the gray line right now, 40m can be exceptional.
The AI Propagation Analysis on the right synthesizes all of this into specific, actionable advice — not just “20m is good” but “start on 20m SSB, expect solid coast-to-coast paths, try 14.245 MHz, shift to 40m around 17:00 UTC when 20m starts to fade.”
And the Activation Windows panel shows the four six-hour windows of the day ranked by propagation quality — so you can pick the best time for your activation even before you leave the house.
AI Voice Transcription
(Live Voice Log tab)
This one is a favorite feature, because it addresses a real problem that every activator has faced: keeping up with a pileup while also logging accurately.
When you’re running a pileup — especially in marginal conditions with QRM — you’re processing callsigns as fast as you can respond to them. Logging at the same time is a distraction. Mistakes happen. You miss calls. You log the wrong callsign.
What the Voice Transcription feature does is use AI to listen to the audio from your session and automatically identify callsigns, frequencies, and signal reports from natural speech. You hear “Kilo Delta Nine X-Ray Yankee Zulu, you’re fifty-nine in the log” — the AI catches that, highlights the callsign in green, and queues it for logging.
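Part of that decode is just mapping spoken phonetics back to characters. A sketch of that step, using the standard ITU alphabet plus digits spoken as words (how the app actually tokenizes audio is not shown here):

```javascript
// Sketch: collapse spoken phonetics into a callsign string.
// ITU phonetic alphabet plus spoken digits; unknown words are dropped.
const PHONETIC = {
  alfa: "A", alpha: "A", bravo: "B", charlie: "C", delta: "D", echo: "E",
  foxtrot: "F", golf: "G", hotel: "H", india: "I", juliett: "J", kilo: "K",
  lima: "L", mike: "M", november: "N", oscar: "O", papa: "P", quebec: "Q",
  romeo: "R", sierra: "S", tango: "T", uniform: "U", victor: "V",
  whiskey: "W", xray: "X", "x-ray": "X", yankee: "Y", zulu: "Z",
  zero: "0", one: "1", two: "2", three: "3", four: "4", five: "5",
  six: "6", seven: "7", eight: "8", nine: "9", niner: "9",
};

function phoneticsToCall(words) {
  return words.map(w => PHONETIC[w.toLowerCase()] ?? "").join("");
}
```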
You can see the transcript appearing in real time, with callsigns highlighted in green — those are identifications the AI decoded with high confidence. The yellow one — the “KI? or N8?” — is a partial the AI flagged for review because the audio was too noisy to decode confidently.
Each identified callsign appears in the table on the right with a confidence percentage. Anything above 70% gets a “+ Log” button that adds it to your QSO log with one click. Low-confidence calls get a “Review” button so you can manually verify before logging.
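The routing rule just described can be sketched in a couple of lines, using the 70% threshold mentioned above:

```javascript
// Sketch: route a decoded callsign by confidence, per the 70% threshold.
// >= 0.7 gets a one-click "+ Log" action; below that, manual "Review".
function routeDecode(decode) {
  return decode.confidence >= 0.7
    ? { ...decode, action: "log" }
    : { ...decode, action: "review" };
}
```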
This doesn’t replace good operating practice — you still confirm every callsign over the air. But as a second set of ears that never gets tired, it’s a genuine help.
AI Activation Planner
(Live AI Planner tab)
The AI Activation Planner is your pre-activation preparation tool.
You enter your park reference, the date and time you’re planning to go, your primary mode, and — if you’re doing an N-fer — the additional parks you’ll be activating simultaneously.
Click “Generate AI Activation Plan” and it builds a complete timeline from T-minus 48 hours through your wrap-up, with specific guidance at each stage.
You can see it’s pulling in real data: it knows what park you’re activating, it knows the current propagation forecast for your planned time, and it gives you specific frequency recommendations based on what’s historically active at that park and what the current conditions support.
For the N-fer activation — let me add a second park and regenerate. See how it now includes the N-fer note in the timeline, reminding you to tag contacts with both park references so the ADIF splitter can separate them later? It’s connecting the planning phase to the logging phase to the export phase as one workflow.
The park analytics panel on the right shows historical data for the park you’re targeting — average QSO counts by time of day, top bands, and visitor peak periods. That last one matters because in busy parks, visitor traffic on weekends can make it hard to find a quiet spot for your antenna. The AI notes that.
And the awards tracker shows how your session QSOs are accumulating toward active awards. As you log contacts in the Logger tab, these progress bars update in real time.
AI Assistant (Live Claude Chat)
(Live AI Assistant tab)
And finally — the AI Assistant tab. This is where a live, connected instance of Claude is available to answer any POTA question in real time.
This is not a FAQ database or a scripted chatbot. This is the actual Claude AI, with full POTA context built into the system prompt — including your callsign, your park reference, your grid square, current solar conditions, and how many QSOs you’ve logged this session. Every response is personalized to your situation.
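Conceptually, that context is assembled into the system prompt before each API request. A sketch with illustrative field names — the app's actual prompt wording is not shown in this talk:

```javascript
// Sketch: assemble a station-context system prompt for the Anthropic API.
// Field names and wording are illustrative, not the app's actual prompt.
function buildSystemPrompt(station, solar, sessionQsos) {
  return [
    "You are a POTA activation assistant.",
    `Operator: ${station.callsign}, grid ${station.grid}, home park ${station.park}.`,
    `Current solar conditions: SFI ${solar.sfi}, A-index ${solar.a}, K-index ${solar.k}.`,
    `QSOs logged this session: ${sessionQsos}.`,
    "Personalize every answer to this station and these conditions.",
  ].join("\n");
}
```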
Let me ask it a question. [type: “What are the best bands for activation right now?”] Watch it think for a second… and there’s the answer. It tells me specifically about 20m and 17m based on the current SFI and K-index values we saw in the propagation tab. It knows those numbers because the application passed them automatically.
That’s the key difference between this and just opening a browser and asking Claude. The application has given Claude context it would otherwise have to ask you for.
Let me try the quick action buttons on the right — “Find Rare Parks Near Me.” [click] This is asking Claude to find rarely activated parks in Ohio — and it gives me actual recommendations with reasoning.
One more: “Plan N-fer Route.” [click] Full multi-park strategy, specific to my location and current conditions.
The conversation maintains context — Claude remembers what you asked earlier in the session. So you can build on previous answers without starting over.
This is what AI-assisted ham radio looks like in practice. It’s not replacing you as the operator — it’s just making you better informed, better prepared, and more efficient.
Key Takeaways & How to Get Started
(Closing summary slide)
So let’s bring this all together.
What I’ve shown you today is a fully functional POTA assistant application built entirely by AI from a plain English description. No programming required. No technical knowledge beyond knowing what I wanted to build.
Three things I want you to take home:
Number one: AI is a tool, not magic. It works best when you give it clear, specific instructions. The quality of your prompt is the quality of your output. Practice writing better prompts — be specific about what you want, why you want it, and what format you expect the result in.
Number two: Start simple. Don’t try to build the whole application at once. Start with one feature. “Build me a QSO logger that exports ADIF.” Get that working. Then add to it. Iteration is the key to success with AI development.
Number three: The ham radio community has a natural advantage here. We’re used to technical details. We’re used to describing things precisely — frequency, mode, band, RST, park reference. That precision is exactly what makes for effective AI prompts.
To get started: go to claude dot ai. There’s a free tier. Try describing something you wish existed for your shack — a logging helper, a propagation summary, a contest score tracker. See what comes back. You cannot break it, and you might just surprise yourself.
The POTA assistant I showed you today is available to download and use. I’ll share the link in the club newsletter.
Seventy-three, and good luck on your next activation.
