

Jay Long
Software Engineer & Founder
Published April 2, 2026
There's something happening right now that I don't think enough people are talking about. The no-code automation crowd and the SaaS startup engineering crowd are converging, and agentic AI is the force pulling them together.
For years, these were completely separate ecosystems. The automation folks were building workflows in n8n, Make.com, Zapier. They were working with scrappy mom-and-pop businesses, brick-and-mortar shops with real physical goods and services. Landing pages, lead funnels, digital marketing. That was their world. Meanwhile, engineers like me were over here building custom SaaS, bankrolled by VC money, completely disconnected from those small business clients. We didn't have to be scrappy. We just had to be good.
Now both sides are getting disrupted by the same thing, and the overlap is agents.
Here's the honest truth about what it was like working in VC-funded startups: these people had so much money that nobody ever asked about cost optimization. I actually learned about AWS cost optimization through certification coursework because I thought it would help me attract better-funded startups. Ironically, it did. But in the real world, nobody cared. I would spin up services, forget to shut them down, and we'd just light cash on fire. It was like burning dollar bills at a campsite to save yourself the trouble of gathering sticks in the woods. That's how much money was flowing.
So while billionaires poured cash on us, we spent all day playing with the latest frameworks, learning new coding patterns, new cloud architecture. We had so much job security and such high hourly rates that we could just nerd out on technology all day. And we completely atrophied the muscle to work with the kinds of clients that we're all depending on more and more over time.
Because the hourly rate of a straight coder is in freefall. It's hard to imagine it'll be worth a dollar an hour a year from now. Maybe six months from now.
There's a concept here that's actually a familiar pattern wearing new clothes. When you're planning a new project or a major feature, you always look at what dependencies have hit the community. What packages exist. You don't roll your own authentication because there's always a project doing it better. You don't build your own rich text editor. You pick TinyMCE or whatever the leading solution is. You cut down development time by saying "we're not going to build that, use open source" for as many things as possible.
The new version of that pattern is AI. You look at a project and think, yeah, I can do this one-to-one with an agent. Like if you're digging a hole, you put the shovel down and jump in the excavator. But if you have many holes to dig, you need to ask yourself: is this worth automating? And you need to keep going back and experimenting, because something that's not worth automating today might make total sense tomorrow when a new model drops or a new feature lands in your agentic operating system.
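One way to make that "is this worth automating?" question concrete is a quick break-even estimate: hours the task costs done by hand over some horizon versus hours spent building and maintaining the automation. Here's a minimal sketch of that arithmetic; the function name, parameters, and numbers are mine, purely illustrative, and a real decision would also weigh reliability and how fast the underlying models are improving:

```python
def worth_automating(manual_minutes: float,
                     runs_per_month: float,
                     build_hours: float,
                     upkeep_hours_per_month: float = 0.0,
                     horizon_months: int = 6) -> bool:
    """Rough break-even test: does automating save time over the horizon?

    Illustrative only -- compares total manual hours against the cost
    of building and maintaining the automation.
    """
    manual_cost = manual_minutes / 60 * runs_per_month * horizon_months
    automation_cost = build_hours + upkeep_hours_per_month * horizon_months
    return manual_cost > automation_cost

# One hole to dig: just grab the shovel (or the excavator) and do it.
print(worth_automating(manual_minutes=30, runs_per_month=1, build_hours=20))   # False
# Many holes: the build cost amortizes quickly.
print(worth_automating(manual_minutes=30, runs_per_month=40, build_hours=20))  # True
```

The point of re-running a calculation like this is that `build_hours` keeps shrinking as models improve, so yesterday's "not worth it" can flip to "worth it" overnight.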
People in the no-code automation space know things engineers need to know now. They understand the logic behind automating things. How do you identify what brings real value to real customers out in the wild? How do you put it all together in an automation that makes sense? Lead capture, lead funnels, lead nurturing. Hearing that talked about as one of the most important automation pipelines you need to be ready to offer was eye-opening for me. That's not something we were thinking about in startup land.
On the other side, now that these automators have been handed this heavy equipment of coding through AI, they need to know: what's the proper way to deploy these things? How do you architect them? How do you maintain them? Having a software engineer piloting your AI is still a huge advantage over someone who's just vibe coding with the AI as the lead engineer. Put your lead engineer on autopilot and you're at a real disadvantage against someone with an experienced engineer orchestrating and overseeing the AI. You know what bad coding patterns look like. You know what bad database architecture looks like. You know that an acceptable pattern on one project might not be the best approach on another.
People who aren't coming from an engineering background tend to throw a one-liner at Claude and say "figure it out." Don't do that. Open up a text document, write a whole markdown file, dump all your thoughts, all your ideas, all your objectives in there. Describe what you really want to achieve. Maybe suggest some ways you might approach it, but mainly let Claude decide on the how. The more time you spend up front, the better. There is a point of diminishing returns, but most people aren't anywhere near it.
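A brief like that doesn't need to be fancy. Something in this shape works; the project, headings, and details here are entirely made up, just an illustration of the level of detail worth writing down before you hand it off:

```markdown
# Project brief: lead-capture form for the landing page

## Objective
Collect name, email, and service interest from the landing page and
get each submission into our CRM within a minute.

## Context
- Static site, no backend yet.
- The CRM has a REST API and we already have an API key.
- Low volume: under 100 submissions a day.

## Constraints
- No new paid services if we can avoid it.
- Handle duplicate submissions gracefully.

## Open questions (you decide, but explain your reasoning)
- Serverless function or a small always-on service?
- Where should retries and failure alerts live?
```

Notice the last section: you're not pretending to know the answers, you're giving the AI room to propose them, which is exactly the "let Claude decide on the how" part.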
It's not entirely laziness. A lot of it is just not knowing the right questions to ask. When the AI asks clarifying questions, you want to be generous with your answers. An engineer can say "oh, that's a good point," and have a real engineering conversation, dig deeper, ask follow-up questions. Someone without that background is more likely to say "dude, I don't know, you're the engineer."
But I am going to say it's mostly laziness, because there's no reason in the world why you can't apply this same technology to learning. Some of my best custom SaaS clients right now started out just checking my work. They'd get AI to look over what I was doing and ask "is this person making good decisions? Is this legit or is this a line of bullshit?" And what started as verification of my knowledge and skill has turned into those clients having the beginnings of real engineering conversations with me. There's no excuse not to go down that path.
I've been following some influencers in the automation space. One guy, Nate Herk, puts out a lot of content on YouTube. I initially kind of overlooked him because he was all geared towards n8n, and because there was no code involved, he didn't hit my radar as much. But as agentic automations started to disrupt that whole ecosystem, he came more and more into focus. Watching how he's adapting to these changes, coming from the automation side while I'm coming from the engineering side, it's been genuinely useful. We complement each other well.
And this is already shaping what I build next. I've got Upwork job scraping building up a solid database of market trends. I've got a quiz engine that started as a way to fill in my own technical gaps from years of self-taught, thrown-to-the-wolves learning. The next logical step is combining those into a lead recovery pipeline for Upwork. When someone messages me or invites me to a job, I can have an immediate, intelligent response. That's straight out of the automation playbook, applied with engineering rigor.
The other thing I want to build is a proactive content pipeline. Right now my blog articles come from raw transcripts where I talk off the cuff about whatever pops into my head. But every time I publish through the automated pipeline, I'm also gathering data on engagement across social channels, Google Analytics, all of it. The next step is having an agent go out, do web searches, look at what people are talking about related to my topics, and then come back and interview me. Like how a podcast host preps questions for a guest. Have my agents prep questions for me, come at me with a list of things to respond to. It's a way to learn and share at the same time, and it would have a real impact on the direction and quality of everything I put out.