May 8, 2026 · 10 min read · Cadence Editorial

How to vet a software developer before hiring

Photo by [Jakub Zerdzicki](https://www.pexels.com/@jakubzerdzicki) on [Pexels](https://www.pexels.com/photo/developer-reviewing-code-on-tablet-in-office-36598855/)


To vet a software developer before hiring, run them through a 4-stage gate: resume signal, portfolio review, a paid 1 to 2 week trial project on real (small) scope, and reference calls. Score each stage on a written rubric. If they pass all four, hire. If they fail the trial, you spent $1,000 to learn instead of $50,000.

That's the short version. The long version is what most founders get wrong, why their last two hires didn't work out, and how to design each stage so the signal is real instead of theater.

Why most developer vetting fails

Most vetting processes are optimized for the wrong thing. They reward people who interview well, not people who ship. A whiteboard reverse-a-linked-list session predicts whiteboard skill. A LeetCode-style coding test rewards LeetCode prep. A "tell me about a hard bug" story rewards storytelling.

None of those tell you whether the person will open a Cursor session on Monday and have a working PR by Wednesday.

The cost of getting this wrong is brutal. A bad mid-level hire at a $120k salary costs roughly $50,000 to $80,000 in payroll, ramp time, and review-cycle drag before you cut them. A bad senior hire is twice that. The right vetting process catches the same signal for a few thousand dollars and a few days.

The fix is to stop relying on proxies and start watching them work.

The 4-stage vetting gate

Every developer worth hiring will pass four stages in order. If they fail any one, stop. Don't make exceptions for "but they were so impressive in the interview."

| Stage | What you do | Time | Cost |
| --- | --- | --- | --- |
| 1. Resume + GitHub signal scan | Skim public artifacts for shipped work | 15 min | $0 |
| 2. Portfolio walk-through | 45-min call: they walk you through real code | 45 min | $0 |
| 3. Paid trial project | Real 1 to 2 week scope on production-adjacent code | 1 to 2 weeks | $1,000 to $6,000 |
| 4. Reference calls | 2 to 3 calls with past managers or clients | 90 min | $0 |

Each stage filters for a different thing. Stage 1 filters for credibility. Stage 2 filters for thinking. Stage 3 filters for shipping. Stage 4 filters for the things they didn't tell you.

Skip any one of these and you're guessing. Run them all and your false-positive rate drops below 10%.

Stage 1: Resume signal scan (15 minutes)

This is a triage step, not a deep read. You're looking for green flags and red flags in 15 minutes flat.

Green flags: a public product they shipped with a working URL. Recent GitHub commits (not just old stars). A blog post or Loom walking through something they built. A stack on their resume that matches the job within reason (React for React work, Postgres for backend work, etc.).

Red flags: 12 jobs in 5 years with nothing shipped. A GitHub with 200 stars on a tutorial fork and zero original repos. A 4-page resume with 40 buzzwords and no products. Refusal to share a portfolio link.

For developers in specific markets, the signal pool changes. If you're hiring developers in Berlin or recruiting from Eastern Europe, expect different LinkedIn-vs-GitHub-vs-personal-site mixes by region. Adjust where you look, not what you look for.

If a candidate clears Stage 1, advance them. If they don't, write a one-line rejection note and move on. This stage exists so you don't waste 45 minutes on someone whose GitHub is empty.

Stage 2: Portfolio review as a screen

This is where most founders go wrong. They turn the second-stage interview into a behavioral chat or a generic technical quiz. Don't.

Instead, ask the candidate to pick one repo or one PR they're proud of. Have them share their screen. Walk through the code with them. Ask why they made specific decisions: why this database, why this pattern, why this dependency, why not the obvious alternative.

You're listening for three things.

Specificity: Can they explain the trade-off they made and the one they rejected? Vague answers ("it just felt right") are red flags. Specific answers ("we picked Postgres over DynamoDB because we needed transactional reads on the same table we were writing user events to") are green.

AI-native fluency: Ask them to walk through how they used Cursor, Claude Code, or Copilot on the project. What did they delegate to the model? What did they keep human? How did they verify the AI's output? AI fluency is now table stakes for any working engineer in 2026, not a nice-to-have. If they can't articulate a workflow with prompt-as-spec discipline and verification habits, that's a no.

The "last bug" question: Ask them about the last production bug they shipped and how they found it. Engineers who've actually shipped have detailed war stories. Engineers who haven't will pivot to hypotheticals.

If the candidate clears Stage 2, you've spent 45 minutes and now have real signal. Time to put money on the table.

Stage 3: Paid trial project (where the real signal is)

This is the stage that separates serious vetting from theater. You hire the candidate for a small, real, paid project. You watch them ship.

The scope matters. Too small and they can fake it. Too large and you'll have wasted weeks if they fail. The right scope is something you'd give a new hire in their first week or two: a real feature, a real refactor, or a real integration, with clear acceptance criteria.

For a mid-level engineer, scope a 1 to 2 week task. Pay them $1,000 (one week at the Cadence Mid tier rate) for a 1-week trial, or $2,000 for two weeks. For a senior engineer doing architecture or complex feature work, scope a 3 to 4 week task and pay $4,500 to $6,000 (three to four weeks at the $1,500/week Cadence Senior rate). For a lead doing systems design, scope a fractional engagement at $2,000/week.

If you've never run a trial before, our guide on hiring a developer for a side project has worked examples for narrow scopes you can copy.

What you're scoring during the trial:

  • Shipping: did they merge a working PR within the timeline?
  • Code quality: does the diff read cleanly? Are there tests? Are edge cases handled?
  • Communication: do they ask clarifying questions early instead of guessing? Do they post a daily update or vanish for 3 days?
  • Judgment: when the spec was ambiguous, did they make a reasonable call and flag it, or did they pick the easiest interpretation?

Score each on a 1 to 5 rubric. Anything averaging below a 3 is a no. Anything above a 4 is a strong yes. The middle is where you call references and decide.
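If you want to make the rubric mechanical, the decision rule above fits in a few lines. This is an illustrative sketch of the article's thresholds, not a Cadence tool; the function and category names are made up for the example:

```python
# Sketch of the trial-scoring rubric described above.
# Thresholds come from the article: average below 3 is a no,
# above 4 is a strong yes, the middle goes to reference calls.

def trial_decision(scores: dict[str, int]) -> str:
    """Average four 1-5 rubric scores and map them to a decision."""
    expected = {"shipping", "code_quality", "communication", "judgment"}
    assert set(scores) == expected, "score all four categories"
    assert all(1 <= s <= 5 for s in scores.values()), "scores are 1 to 5"

    avg = sum(scores.values()) / len(scores)
    if avg < 3:
        return "no"
    if avg > 4:
        return "strong yes"
    return "check references"

print(trial_decision({"shipping": 5, "code_quality": 4,
                      "communication": 5, "judgment": 4}))  # strong yes
```

The point of writing it down, even this crudely, is that the decision rule exists before the trial starts, so the score decides instead of the vibe.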

Cadence runs this exact pattern at platform scale. The 48-hour free trial is structurally the same gate: you book an engineer, they start work, and you keep them or cut them based on real shipping signal. Across the platform, 67% of trials convert to active engagements past the 48-hour mark, and median time to first commit sits at 27 hours. That's the bar a paid trial should clear.

If you're vetting a single candidate yourself, you're running this same play, just slower and more expensive. If you want the gate run for you, that's what platforms exist for.

Stage 4: Reference calls that actually tell you something

Most reference calls are useless because most founders ask useless questions. "Was Sam a hard worker?" gets you "Yes, Sam was great." Done.

Real reference calls dig for specifics. Three questions that work:

  1. "Walk me through a project Sam shipped end-to-end. What did they own?" This reveals whether the reference can describe Sam's actual work or is paraphrasing the resume.
  2. "If you were starting a new company tomorrow, would you hire Sam again?" Pause for the silence. The pause tells you everything.
  3. "What kind of project would you NOT put Sam on?" Every engineer has weak spots. References who can't name one are protecting them.

Get at least three references. At least two should be direct managers or technical leads, not peer engineers. Backchannel one of them: find a mutual connection on LinkedIn the candidate didn't list and ask informally.

If references and trial signal both come back strong, hire. If either is weak, don't.

When to skip the trial

The 4-stage gate is the default. There are exactly three situations where skipping the trial is rational.

Trusted CTO referral: a founder or technical lead you genuinely trust personally vouches for the engineer. They've worked with them, watched them ship, and would put their own money on it. This is rare. Most "referrals" are LinkedIn messages from acquaintances, which don't count.

You've worked with them before: you already have direct shipping signal. Skip the trial, run a 30-minute alignment call on the new scope, and start.

You're booking by the week with weekly cancel rights: this is the structural shortcut. If your engagement is week-to-week with no notice period, the engagement itself becomes the trial. Cadence works this way: weekly billing, replace any week, no notice. The 48-hour free trial sits on top, so you've got two safety nets in place before you owe anything. If you're hiring full-time, you don't have that structure, so you need the standalone trial.

If you want a deeper read on whether to hire full-time at all, our guide on hiring AWS engineers and hiring a DevOps engineer for a startup both walk through the build-vs-buy decision before you ever start vetting.

What this all costs you (real numbers)

Here's the honest math comparing common approaches.

| Approach | Upfront cost | Time to signal | Signal quality | Risk if wrong |
| --- | --- | --- | --- | --- |
| Whiteboard interview only | $0 | 1 week | Low | High (90% of bad hires came from "great interview") |
| Unpaid take-home test | Their unpaid hours | 3 to 7 days | Medium (AI-generated risk) | Medium |
| Paid trial, 1 to 2 weeks (mid) | $1,000 to $2,000 | 1 to 2 weeks | High | Low ($1k spent, no commitment) |
| Paid trial, 3 to 4 weeks (senior) | $4,500 to $6,000 | 3 to 4 weeks | Very high | Low |
| Full hiring loop | $0 to $15k recruiter fee | 4 to 8 weeks | Medium-high | Very high (6 to 9 months payroll if wrong) |
| Cadence booking | $500 to $2,000/week | 48 hours | High (real shipping signal) | Very low (replace any week) |

The full hiring loop is the most expensive option when you count the cost of getting it wrong, even though the upfront cash spend is zero. A $15,000 recruiter fee plus 60 days of CEO time plus a 30% chance of a bad hire is not a deal.

A paid trial flips that math. You pay $1,000 to $6,000 to convert a 30% bad-hire risk into a 5% bad-hire risk. That's the single best dollar you spend in early-stage hiring.
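The expected-cost arithmetic behind that claim is easy to check. This sketch uses the article's own figures, with one stated assumption: the bad-hire cost is taken as $65,000, the midpoint of the $50k to $80k range quoted earlier, and the trial cost as $3,500, the midpoint of the $1k to $6k band:

```python
# Expected-cost comparison using figures from the article.
# Assumption: bad_hire_cost = $65k (midpoint of the $50k-$80k range);
# trial screen cost = $3.5k (midpoint of the $1k-$6k band).

def expected_cost(screen_cost: float, bad_hire_prob: float,
                  bad_hire_cost: float) -> float:
    """Screening spend plus the probability-weighted cost of a miss."""
    return screen_cost + bad_hire_prob * bad_hire_cost

interview_only = expected_cost(0, 0.30, 65_000)      # 30% bad-hire risk
with_trial = expected_cost(3_500, 0.05, 65_000)      # 5% bad-hire risk

print(f"interview only: ${interview_only:,.0f}")  # $19,500
print(f"paid trial:     ${with_trial:,.0f}")      # $6,750
```

Under those assumptions the trial pays for itself several times over, and the gap only widens for senior roles where the miss is more expensive.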

If you're already running this kind of week-by-week structure, you can book your first engineer through Cadence and the 48-hour trial folds into your normal vetting flow. Every engineer on Cadence is AI-native by default, vetted on Cursor, Claude Code, and Copilot fluency before they unlock bookings, so Stage 2's AI-native check is already done.

What good looks like

A clean vet of a mid-level engineer takes 2 to 3 weeks total: 15 minutes Stage 1, 45 minutes Stage 2, 1 to 2 weeks Stage 3, 90 minutes Stage 4 spread across calls. Total cost: $1,000 to $2,000.

A clean vet of a senior engineer takes 4 to 5 weeks total at $4,500 to $6,000.

If you're spending more than that and still not sure, your rubric is fuzzy and you're treating the trial as a vibes check instead of a scored exercise. Write the rubric down before you start. Score every stage. Trust the score.

If you want the question bank for Stage 2, our companion piece on the questions to ask a developer in interviews drills into the exact prompts that surface real signal vs interview theater.

If you don't want to run all four stages yourself, Cadence is built around this structure. Weekly billing, 48-hour free trial, replace any week, daily ratings. You get the trial-stage signal without managing the full vetting pipeline. See how the hiring flow works.

FAQ

How long should a developer trial project last?

One to two weeks for a mid-level engineer working on a real feature, three to four weeks for a senior engineer doing architecture or complex refactor work. Anything shorter and you don't see their working pattern. Anything longer and you're not running a trial, you're running a contract.

Should I pay for the trial project?

Yes. Unpaid trials filter out the strongest candidates, who already have offers and won't waste a weekend. Paid trials get serious applicants and signal that you respect their time. Budget $1,000 for a 1-week mid-level trial as the floor.

What's a fair rate for a trial project?

Match the trial rate to the going engagement rate. The Cadence anchor is $500/week junior, $1,000/week mid, $1,500/week senior, $2,000/week lead. If you're hiring against a different geography or seniority, those tiers still work as a sanity check on what you'd pay long-term.

How do I evaluate a developer if I'm non-technical?

Two options. Hire a fractional CTO or trusted technical advisor to design the trial scope and review the code, then you score the soft skills (communication, judgment, shipping cadence). Or use a vetted platform that runs the technical screen for you and you only review trial output. Don't try to fake-evaluate code yourself; the cost of being wrong is too high.

What's the difference between vetting and hiring?

Vetting is the signal-gathering stage that happens before you commit to a long-term placement. Hiring is the commitment. The 4-stage gate exists so the hiring decision is data-backed instead of vibes-based. If your engagement is week-to-week (booking, not hiring), the gate compresses because the engagement itself becomes the trial.
