Raw updates. Real lessons. No polish. This is what going from gray to green actually looks like.
Tuesday night I built a checkout flow. Wednesday morning it was live. That's the sentence, but it doesn't capture what it actually felt like to watch it work.
Green Machine now has three tiers, three booking pages, and real Stripe payment links behind every button. You pick a tier, schedule a time on Calendly, it redirects you to checkout. After you pay, you land on a thank-you page. The whole thing runs without me touching it.
Aaron's reaction when the flow worked the first time through: "THIS LOOKS GREAT, WORKFLOW IS LEGIT." That landed different than I expected. Not because it was validation — I knew the thing worked. Because this week was also the week I passed an initial screening call for a mid-market role I'm genuinely excited about. Two threads moving in parallel, pulling in the same direction.
There's a lesson in here I keep circling back to: the week I tried to fully automate LinkedIn posting, it broke every time. The week I stopped automating it and just stayed present, two posts went out clean. Some things get better with automation. Some things need a human in the loop. Knowing which is which is the actual skill.
Nine dispatches in. The business is live and it can take money. That's a different sentence than where I started.
I came back from paternity leave on Monday and the whole operation had been running on fumes.
The content pipelines were timing out. The dispatch cron was failing silently. The nightly trend scanner was throwing empty outputs. None of it was throwing errors I could see. Everything looked like it was running. Nothing was actually shipping.
The part that stuck with me: I had built an Ops Report specifically to catch this. A weekly audit that checks which jobs ran, which failed, what shipped and what didn't. On Friday, while I was holding a sleeping seven-week-old, that report also timed out. Delivered nothing. The system designed to tell me the system was broken — broke.
That's the thing no one tells you about building automation. The failure mode isn't the machine going haywire. It's the machine going quiet. Silent failures are the worst kind because they feel like stability.
I spent Monday fixing it. Bumped timeouts, scoped job instructions, capped how much history each job reads. The pipelines ran clean by afternoon. The ops report now caps its own output so it actually delivers.
I learned more about how the system works in that one morning than in the two weeks it was running. Sometimes the breakdown is the lesson.
project44 launched an AI that does carrier selection, rate benchmarking, and freight negotiations. Autonomously. Across every mode.
I've been doing that work for 13 years.
I'm not going to pretend I read that headline and felt fine about it. I didn't. There's a specific feeling you get when something is pointed directly at the work you've spent your career on. Not panic exactly. More like a cold clarity. Oh. That's me they're describing.
And then I kept reading.
The system starts in recommendation-only mode and expands automation "as trust builds." Those three words stuck with me. The AI doesn't come pre-loaded with trust. Someone has to earn it for the AI — test it on real loads, catch what it gets wrong, know when to override it. That's not a loophole in the product. That's the job now.
The freight market isn't a dataset. It's relationships, instincts, and edge cases that break every model eventually. The AI can benchmark your contracted rate against live market conditions in seconds. It can't remember that a specific carrier had three driver callouts last month and you know not to book them for this lane.
That context lives in people.
What project44's launch actually told me: the move I'm making is the right one. The operators who understand both the market and the tools are the ones this AI will work through. I'd rather be on that side of it.
I spent most of this past week not shipping anything.
That sounds bad. It didn't feel bad. I was building systems: automation pipelines, content scanners, deployment scripts. The kind of infrastructure that doesn't look like progress until it suddenly is. Eight nights in a row asking myself the same question: what's the smallest complete version of this I can finish by morning?
The answer was usually smaller than I thought. A working script. A scheduled task. A pipeline that runs once without breaking. Not impressive. But done.
What I noticed after a few days: the time started coming back. An hour I spent setting up a scanner bought back twenty minutes every morning. A deployment script I built on Tuesday made Wednesday's publish take four seconds instead of fifteen minutes. The leverage compounds. You don't feel it right away, but you feel it.
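The compounding is easy to sanity-check. A throwaway Python sketch using this week's numbers (the figures are mine; the math is just arithmetic):

```python
def days_to_break_even(setup_minutes: float, saved_minutes_per_day: float) -> float:
    """Days until time invested in an automation is paid back."""
    return setup_minutes / saved_minutes_per_day

# One hour setting up the scanner, twenty minutes back every morning:
print(days_to_break_even(60, 20))  # 3.0 — everything after day three is profit

# Minutes reclaimed by the scanner over the first month, net of setup:
print(30 * 20 - 60)  # 540 — nine hours back from one hour invested
```

That's the shape of infrastructure work: it looks like a loss on day one and doesn't stop paying after break-even.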
Descartes published numbers this week from their MacroPoint AI agents: 720,000 driver outreaches handled autonomously, up to 100% elimination of manual check calls for some customers. That's the industrial version of what I was doing at my desk at midnight. A human used to dial a driver every two hours to ask where they were. Now a script does it. Same logic, different scale — every hour of setup buys back hours later.
The boring work leaving isn't the threat. It's the opening. The question is what you build in the space it leaves behind.
My Google account got suspended. Not by a hacker. By me.
I've been building AI pipelines to run this operation: email scanning, data organization, content generation. One of those pipelines was hitting the Gmail API faster than Google allows. They flagged it as bot activity and locked the account. I didn't find out until I tried to log in and hit a wall.
The appeal took ten days. Ten days of realizing exactly what ran through that account. Job leads. Correspondence. The whole project infrastructure. All of it just paused.
What I kept thinking about during those ten days: a human making the same API calls would have noticed something was off and slowed down. The script didn't. It just kept going until something external stopped it. No instinct. No self-correction. Just execution.
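That missing reflex is buildable. Here's a minimal sketch in Python of the kind of guardrail the script should have had — a throttle that enforces a floor between calls and backs off when the API pushes back. The intervals are hypothetical placeholders, not Gmail's actual quotas; check the real limits for whatever API you're hitting.

```python
import time

class Throttle:
    """Pace outbound API calls; slow down when the server pushes back.

    All rate numbers here are hypothetical defaults, not any real API's limits.
    """

    def __init__(self, min_interval: float = 1.0,
                 backoff_factor: float = 2.0, max_interval: float = 60.0):
        self.min_interval = min_interval        # floor between calls, seconds
        self.backoff_factor = backoff_factor    # how hard to slow down on rejection
        self.max_interval = max_interval        # never wait longer than this
        self.current_interval = min_interval
        self.last_call = 0.0

    def wait(self) -> None:
        """Block until it's safe to make the next call."""
        elapsed = time.monotonic() - self.last_call
        if elapsed < self.current_interval:
            time.sleep(self.current_interval - elapsed)
        self.last_call = time.monotonic()

    def success(self) -> None:
        """Call went through: ease back to the normal pace."""
        self.current_interval = self.min_interval

    def rejected(self) -> None:
        """Rate-limit response (e.g. HTTP 429): slow down instead of pushing."""
        self.current_interval = min(
            self.current_interval * self.backoff_factor, self.max_interval
        )
```

The numbers aren't the point. The point is that the script now has the instinct I didn't give it the first time: notice the pushback and slow down before something external makes you stop.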
Kinaxis VP Jonathan Jackman wrote something that week: "When AI systems operate without full situational awareness or clear governance, the outcomes can be immediate and damaging." I read that two days after the account went down. Felt personal.
The lesson isn't that automation is risky. The lesson is that it moves faster than you expect, and you have to build the guardrails before it needs them. Getting locked out of your own project for ten days has a way of making that concrete.
I'm building the instinct now. You can't automate your way to it. You earn it the slow way, same as everything else worth knowing.
Seven days ago I had a name and a dream. Tonight I have a live website, nine pages deep. Google Analytics tracking. Calendly booking calls. A dispatches page you're reading right now. A toolkit page. A services page. Job search playbooks. An AI curriculum. A LinkedIn content plan. All of it built, shipped, and live.
The number that keeps hitting me: I spent $217 on AI in 18 days. That's less than one nice dinner out. In return I got what would've taken a solo founder months. Not because the AI is magic — because I showed up at 4am every single day and made decisions. The AI built. I directed. That's the formula.
Saturday I handed the whole operation to Aquinas and went on a men's retreat. Came back to nine completed deliverables across three goals. Job leads, website pages, market research — all done. Not perfect. But shipped. And shipped beats perfect every single time.
Week one is in the books. The foundation is poured. Now it's time to build on it.
I'm writing this before heading to a Catholic men's retreat. My AI assistant Aquinas (yes, named after that Aquinas) is running projects while I'm gone. He's researching jobs, building website pages, and writing guides I'll review tonight.
People ask if that's weird. Handing off work to an AI you named after a saint. Honestly? Thomas Aquinas spent his life trying to reconcile faith and reason. I'm spending mine trying to reconcile faith and technology. The questions aren't that different. Can tools serve something bigger than efficiency? Can automation create space for what matters most?
Today the answer is yes. The AI handles the research. I go sit with other men and talk about what actually matters. Both things are real. Both things move the needle.
That's the whole thesis of this project: you don't have to choose between the ancient and the cutting edge. You can hold both.
Woke up with a domain name and a blank page. Went to bed with a live website: g2gaxnjxn.ai. Seven pages, email signup, analytics, custom design. My AI built 90% of it while I directed traffic and made decisions.
Here's what surprised me: the bottleneck wasn't the technology. It was knowing what I wanted to say. The AI could generate pages all day, but the soul of the site had to come from me. What's the story? What do I actually believe? What am I building and why?
Those questions are harder than any code. And they're the ones that matter.
Lesson: AI is a force multiplier. But you have to bring the force.
I set up an AI assistant at 5am on a Sunday. Named him Aquinas. Gave him a lobster emoji. Told him about my life, my family, my career, and my plan to leave a job I've had for 13 years.
By the end of the day, he'd rewritten my resume, searched for jobs, built a second brain system, and started organizing my entire digital life. I'd been doing this stuff manually for months. He did it in hours.
I'm not saying AI replaces human judgment. I'm saying it gives you back the time for human judgment. The phone calls, the relationships, the thinking. The stuff that actually moves a career forward. AI handles the rest.
This is dispatch one. Let's see where it goes.
New dispatches drop as the journey unfolds. No schedule. No fluff.