
Xe Iaso

Senior Technophilosopher, Ottawa, CAN, a speaker, writer, chaos magician, and committed technologist.

RSS preview of Blog of Xe Iaso

Small note about AI 'GPUs'

2026-03-30 08:00:00

I've been seeing talk about people wanting to capitalize on the AI bubble popping by picking up server GPUs for pennies on the dollar, so they can play games at higher fidelity thanks to server GPUs having more video RAM. I hate to be the bearer of bad news here, but most of those enterprise GPUs don't have the ability to process graphics.

Yeah, that's right: in order to pack as much compute as possible into each chip, manufacturers removed the video output and graphics processing hardware from devices we still call graphics processing units. The only thing those cards will be good for is CUDA operations for AI inference, AI training, or other things that do not involve gaming.


On a separate note, I'm reaching the point in recovery where I am getting very bored and am so completely ready to just head home. At least the diet restrictions end this week, so that's something to look forward to. God I want a burrito.

Homelab downtime update: The fight for DNS supremacy

2026-03-18 08:00:00

Hey all, quick update continuing from yesterday's announcement that my homelab went down. This is stream of consciousness and unedited. Enjoy!

Turns out the entire homelab didn't go down and two Kubernetes nodes survived the power outage somehow.

Two Kubernetes controlplane nodes.

Kubernetes really wants an odd number of controlplane nodes (so etcd can keep quorum), my workloads are too heavy for any single node to run them, and Longhorn really wants at least three nodes online. So I had to turn them off.

How did I get in? The Mac mini that I used for Anubis CI. It somehow automatically powered on when the grid reset and/or survived the power outage.

xe@t-elos:~$ uptime
 09:45:55 up 66 days,  9:51,  4 users,  load average: 0.37, 0.22, 0.18

Holy shit, that's good to know!

Anyway, the usual suspects for debugging didn't work (kubectl get nodes timed out, etc.), so I ran an nmap scan across the entire home subnet. Normally the result is full of devices and hard to read. This time there was basically nothing. What stood out was this:

Nmap scan report for kos-mos (192.168.2.236)
Host is up, received arp-response (0.00011s latency).
Scanned at 2026-03-18 09:23:09 EDT for 1s
Not shown: 996 closed tcp ports (reset)
PORT      STATE SERVICE   REASON
3260/tcp  open  iscsi     syn-ack ttl 64
9100/tcp  open  jetdirect syn-ack ttl 64
50000/tcp open  ibm-db2   syn-ack ttl 64
50001/tcp open  unknown   syn-ack ttl 64
MAC Address: FC:34:97:0D:1E:CD (Asustek Computer)

Nmap scan report for ontos (192.168.2.237)
Host is up, received arp-response (0.00011s latency).
Scanned at 2026-03-18 09:23:09 EDT for 1s
Not shown: 996 closed tcp ports (reset)
PORT      STATE SERVICE   REASON
3260/tcp  open  iscsi     syn-ack ttl 64
9100/tcp  open  jetdirect syn-ack ttl 64
50000/tcp open  ibm-db2   syn-ack ttl 64
50001/tcp open  unknown   syn-ack ttl 64
MAC Address: FC:34:97:0D:1F:AE (Asustek Computer)

Those two machines are Kubernetes controlplane nodes! I can't SSH into them because they're running Talos Linux, but I can use talosctl (via port 50000) to shut them down:

$ ./bin/talosctl -n 192.168.2.236 shutdown --force
WARNING: 192.168.2.236: server version 1.9.1 is older than client version 1.12.5
watching nodes: [192.168.2.236]
    * 192.168.2.236: events check condition met

$ ./bin/talosctl -n 192.168.2.237 shutdown --force
WARNING: 192.168.2.237: server version 1.9.1 is older than client version 1.12.5
watching nodes: [192.168.2.237]
    * 192.168.2.237: events check condition met

And now it's offline until I get home.

This was causing the sponsor panel to be offline because the external-dns pod in the homelab was online and fighting my new cloud deployment for DNS supremacy. The sponsor panel is now back online (I should have put it in the cloud in the first place, that's on me) and peace has been restored to most of the galaxy, at least as much as I can from here.
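For anyone hitting the same failure mode: external-dns tracks which controller owns each DNS record using TXT registry records, and giving each deployment its own `--txt-owner-id` stops two instances from overwriting each other's records. A minimal sketch, using real upstream external-dns flags but with made-up owner IDs and a hypothetical domain:

```yaml
# Container args for the cloud external-dns instance (illustrative).
args:
  - --registry=txt               # track record ownership in TXT records
  - --txt-owner-id=cloud         # unique per deployment; the homelab one would use e.g. "homelab"
  - --domain-filter=example.com  # hypothetical domain
```

With distinct owner IDs, each instance refuses to touch records it doesn't own instead of fighting for supremacy.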

Action items:

  • Figure out why ontos and kos-mos came back online
  • Make all nodes in the homelab resume power when wall power exists again
  • Review homelab for PSU damage
  • Re-evaluate usage of Talos Linux, switch to Rocky?

My homelab will be down for at least 20 days

2026-03-17 08:00:00

Quick post for y'all now that I can use my MacBook while standing (long story, I can't sit due to surgical recovery, it SUCKS). My homelab went offline at about 13:00 UTC today, likely because of a power outage. I'm going to just keep it offline and not fight it. I'll get home in early April and restore things then.

An incomplete list of the services that are down:

  • The within.website vanity Go import server
  • The preview site for this blog
  • Various internal services including the one that announces new posts on social media
  • My experimental OpenClaw bot Moss I was using to kill time in bed
  • My DGX Spark for self hosted language models, mainly used with Moss

Guess it's just gonna be down, hope I didn't lose any data. I'll keep y'all updated as things change if they do.

I don't know if I like working at higher levels of abstraction

2026-03-11 08:00:00

Whenever I have Claude do something for me, I feel nothing about the results. It feels like something happens around me, not through me. That's the new level of abstraction: you stop writing code and start describing intent. You stop crafting and start delegating. I've been doing this professionally long enough to have an opinion, and I don't like what it's doing to me.

All of it focuses on getting things done rather than on quality or craft. I'm more productive than I've ever been. I ship more. I finish more. Each thing lands with the emotional weight of a form letter.

Cadey is coffee
Cadey

"The emotional weight of a form letter." Yeah, that tracks.

When I write, I try to make people feel something. My goal is to provoke emotion when you read me. Generative AI sands down the hard things. Everything becomes homogenous, converging toward the average. The average makes nobody feel anything.

Sure, you can still make people feel things using this flow. I've done it recently. But we're trading away the texture. The rough edges, the weird phrasing, the choices too specific and too human for a statistical model to generate.

I'm going to keep talking to you as an equal. It's the most effective part of my style: I write like I'm sitting across from you, not lecturing down at you. Generative AI defaults to the authoritative explainer voice — the one that sounds like every other. Resisting that pull now takes conscious effort.

Aoi is wut
Aoi

So the tools are making it harder to sound like yourself?

Cadey is coffee
Cadey

Not harder exactly. More like... the path of least resistance leads to sounding like everyone else. You have to actively choose to be yourself now.

People I know are trying to break into this industry as juniors, and I honestly have no idea how to help them. This industry has historically valued quality and craft, yet somebody can yolo something out with Cursor and get hired by Facebook for it. The signal for "this person knows what they're doing" grows noisier every day.

This part of the industry runs on doublethink. Nuance and opinions that don't fit into tweets. Senior engineers say "AI is just a tool" while their companies lay off the juniors who would've learned to use that tool responsibly. Leadership says "we value craft" while setting deadlines that make craft impossible without the machine. Nobody lies exactly, but nobody tells the whole truth either.

Using these tools at this level of abstraction costs us something essential. I use them every day and I'm telling you: the default output has no soul. It's correct. It's competent. It's fine. And "fine" is the enemy of everything I care about as a writer and an engineer.

Numa is neutral
Numa

"Fine" is the ceiling that gets installed when you stop paying attention to the floor.

I'm using these tools deliberately to find where the bar actually is. I want to see what's possible at this level of abstraction. Seeing what's possible requires expensive tools and uncomfortable honesty about what they can't do.

The voice is non-negotiable. The weird, specific, occasionally self-indulgent voice that makes my writing mine. If higher abstraction means sounding like everyone else, I'll take the lower abstraction and the extra hours. Every time.

Vibe Coding Trip Report: Making a sponsor panel

2026-03-09 08:00:00

I'm on medical leave recovering from surgery. Before I went under, I wanted to ship one thing I'd been failing to build for months: a sponsor panel at sponsors.xeiaso.net. Previous attempts kept dying in the GraphQL swamp. This time I vibe coded it — pointed agent teams at the problem with prepared skills and let them generate the gnarly code I couldn't write myself.

And it works.

The GraphQL swamp

Go and GraphQL are oil and water. I've held this opinion for years and nothing has changed it. The library ecosystem is a mess: shurcooL/graphql requires abusive struct tags for its reflection-based query generation, and the code generation tools produce mountains of boilerplate. All of it feels like fighting the language into doing something it actively resists.

Cadey is coffee
Cadey

GitHub removing the GraphQL explorer made this even worse. You used to be able to poke around the schema interactively and figure out what queries you needed. Now you're reading docs and guessing. Fun.

I'd tried building this panel before, and each attempt died in that swamp. I'd get partway through wrestling the GitHub Sponsors API into Go structs, lose momentum, and shelve it. At roughly the same point each time: when the query I needed turned out to be four levels of nested connections deep and the struct tags looked like someone fell asleep on their keyboard.

Vibe coding was a hail mary. I figured if it didn't work, I was no worse off. If it did, I'd ship something before disappearing into a hospital for a week.

Preparing the skills

Vibe coding is not "type a prompt and pray." Output quality depends on the context you feed the model. Templ — the Go HTML templating library I use — barely exists in LLM training data. Ask Claude Code to write Templ components cold and it'll hallucinate syntax that looks plausible but doesn't compile. Ask me how I know.

Aoi is wut
Aoi

Wait, so how do you fix that?

I wrote four agent skills to load into the context window:

  • templ-syntax: Templ's actual syntax, with enough detail that the model can look up expressions, conditionals, and loops instead of guessing.
  • templ-components: Reusable component patterns — props, children, composition. Obvious if you've used Templ, impossible to infer from sparse training data.
  • templ-htmx: The gotchas when combining Templ with HTMX. Attribute rendering and event handling trip up humans and models alike.
  • templ-http: Wiring Templ into net/http handlers properly — routes, data passing, request lifecycle.

With these loaded, the model copies patterns from authoritative references instead of inventing syntax from vibes. Most of the generated Templ code compiled on the first try, which is more than I can say for my manual attempts.

Mara is hacker
Mara

Think of it like giving someone a cookbook instead of asking them to invent recipes from first principles. The ingredients are the same, but the results are dramatically more consistent.

Building the thing

I pointed an agent team at a spec I'd written with Mimi. The spec covered the basics: OAuth login via GitHub, query the Sponsors API, render a panel showing who sponsors me and at what tier, store sponsor logos in Tigris.

Cadey is enby
Cadey

I'm not going to pretend I wrote the spec alone. I talked through the requirements with Mimi and iterated on it until it was clear enough for an agent team to execute. The full spec is available as a gist if you want to see what "clear enough for agents" looks like in practice.

One agent team split the spec into tasks and started building. A second reviewed output and flagged issues. Meanwhile, I provisioned OAuth credentials in the GitHub developer settings, created the Neon Postgres database, and set up the Tigris bucket for sponsor logos. Agents would hit a point where they needed a credential, I'd paste it in, and they'd continue — ops work and code generation happening in parallel.

The GraphQL code the agents wrote is ugly. Raw query strings with manual JSON parsing that would make a linting tool weep. But it works. The shurcooL approach uses Go idioms, sure, but it requires so much gymnastics to handle nested connections that the cognitive load is worse. Agent-generated code is direct: send this query string, parse this JSON, done. I'd be embarrassed to show it at a code review. I'd also be embarrassed to admit how many times I failed to ship the "clean" version.

// This is roughly what the agent generated.
// It's not pretty. It works.
query := `{
  viewer {
    sponsors(first: 100) {
      nodes {
        ... on User {
          login
          name
          avatarUrl
        }
        ... on Organization {
          login
          name
          avatarUrl
        }
      }
    }
  }
}`
        
Numa is neutral
Numa

This code exists because the "proper" way kept killing the project. I'll take ugly-and-shipped over clean-and-imaginary.

The stack

The full stack:

  • Go for the backend, because that's what I know and what my site runs on
  • Templ for HTML rendering, because I'm tired of html/template's limitations
  • HTMX for interactivity, because I refuse to write a React app for something this simple
  • PostgreSQL via Neon for persistence
  • GitHub OAuth for authentication
  • GitHub Sponsors GraphQL API for the actual sponsor data
  • Tigris for sponsor logo storage — plugged it in and it Just Works™

The warts

Org sponsorships are still broken. The schema for organization sponsors differs enough from individual sponsors that it needs its own query path and auth flow. I know what the fix looks like, but it requires reaching out to other devs who've cracked GitHub's org-level sponsor queries.

The code isn't my usual style either — JSON parsing that makes me wince, variable names that are functional but uninspired, missing error context in a few places. I'll rewrite chunks of this after I've recovered. The panel exists now, though. It renders real data. People can OAuth in and see their sponsorship status. Before this attempt, it was vaporware.

Cadey is percussive-maintenance
Cadey

I've been telling people "just ship it" for years. Took vibe coding to make me actually do it myself.

What I actually learned

I wouldn't vibe code security-critical systems or anything I need to audit line-by-line. But this project had stopped me cold on every attempt, and vibe coding got it across the line in a weekend.

Skills made the difference here. Loading those four documents into the context window turned Claude Code from "plausible but broken Templ" into "working code on the first compile." I suspect that gap will only matter more as people try to use AI with libraries that aren't well-represented in training data.

This sponsor panel probably won't look anything like it does today in six months. I'll rewrite the GraphQL layer once I find a pattern that doesn't make me cringe. Org sponsorships still need work. HTMX might get replaced.

But it exists, and before my surgery, shipping mattered more than polish.


The sponsor panel is at sponsors.xeiaso.net. The skills are in my site's repo under .claude/skills/.

Some Thorns Have Roses

2026-03-08 08:00:00

Nobody warns you about the part where you don't want recovery to end.

I've been turning this over for days, not sure I'm allowed to say it out loud. You spend weeks dreading surgery, you survive it, and then somewhere between the catheter bag and the third box of hospital tissues you catch yourself thinking: oh. I'm going to miss this. Not the pain or the catheter. The thing underneath, the thing you only found because everything else got stripped away.

Cadey is enby
Cadey

Before anyone says it: yes, I know how this sounds. "Privileged tech worker romanticizes her hospital stay." Stay with me.

Writing this down before the rawness fades feels urgent. Like if I wait too long I'll forget what it felt like to be this open.


Kainé at zero defenses

Emi Evans sings the Kainé theme from NieR in a language that doesn't exist, mourning something you can't name. I've always loved that track. The kind of ache that sits between your ribs for a minute after the song ends.

Recovery broke that wide open. With my hormones bottomed out and nothing between me and the world, Kainé didn't hit me. It went through me. I cried for twenty minutes, full ugly cry, in the hospital bed at like 2 PM on a Tuesday, and I didn't try to stop because stopping wasn't a concept anymore.

And I didn't hate it? It felt alive. Notes had texture I'd never noticed before, and the silences between them felt heavy, physical. I don't think I'll hear music like that again because the filters will come back. They have to. But for those weeks, the NieR OST became something I don't have vocabulary for, and the whole room was the song.

I will miss hearing music like that. I already do.


The catheter gave me permission to rest

Aoi is wut
Aoi

That is an extremely cursed sentence.

Look, I know. But hear me out.

Having a catheter bag means your body is in a state where nobody expects anything from you. Not your open source projects, not the invisible productivity ledger that follows you everywhere. You can't do anything, so you don't have to do anything, so you can actually rest.

Most people never get that. Even on vacation there's ambient guilt about productivity, some undercurrent of "I should be making the most of this." Recovery obliterated that entirely. I sat and watched the light change in my hospital room for an entire afternoon and felt nothing about it except the light.

It says something pretty damning about how I normally live that it took major surgery to give myself permission to sit still.


Slowness as connection

You can't rush someone who can't walk to the bathroom alone. So the people around you slow down too, and in that enforced pace something opens up.

One afternoon my husband and I sat in the hospital room for maybe two hours without saying much. He was reading something on his phone. I was watching the IV drip. Normally that silence would've made me anxious, like I should be filling it or at least being interesting. Instead I just noticed the weight of his hand on mine, not reaching for anything, just resting there. That was enough. More than enough.

A lot of those days were terrible. But the connection that grew inside the constraint felt different from anything we'd have found if we'd both been busy and performing togetherness on a schedule.


What I'm actually afraid of

Underneath all of this: I don't know who I'll be in six months.

The version of me who cried to Kainé for twenty minutes might not survive the return to normal. The hormones will restabilize. My emotional filters will slide back into place. The world will become manageable again, comfortable, and I'll stop noticing silence the way I notice it now.

Hormones are emotional armor. I get that now. They're protective, and I'll probably be grateful when they come back. But they decide what gets through and you don't get to choose. When the armor was gone, everything got through. Devastating and the most alive I've ever felt, at the same time, in the same body. I refuse to pick one.


Attention

Early recovery was survival. Counting hours and counting doses, shuffling to the bathroom, your body a problem to be managed. The goal was getting to tomorrow.

At some point the posture changed. I can't tell you when. I ran out of resistance before recovery ran out of days, and what was left once I stopped fighting was attention.

When you run out of fight, you start noticing things. How light looks different when you've been in the same room for a week. What the first real food tastes like after days of clear liquids. Walking twenty feet when last week you couldn't stand. These are the actual texture of being alive. I'd been moving too fast to feel any of them.

Cadey is coffee
Cadey

A lot of recovery was awful. This is the part that wasn't. Both things happened at the same time and I'm done trying to resolve them into a single narrative.


What you keep

The rawness will fade. The hormones will come back. Music will become something I enjoy rather than something that unmakes me.

But the practice transfers. Looking for what's actually here inside what's hard. You don't need surgical recovery to ask yourself where you're enduring instead of living.

I used to think strength meant pushing through. Getting to the other side. In that hospital bed, falling apart to NieR at 2 PM, not trying to hold myself together, I found something else. I don't have a clean word for it. Willingness, maybe. To be present for my own life when it hurts.

How much of that survives the return to normal, I don't know. I wanted to write it down while I still could.


I went into recovery dreading it. I'm coming out of it slow, and grateful, and a little heartbroken that it's ending. A month ago I would've thought that was insane. I don't anymore.

Some thorns have roses.