How to Run a Deal Review That Actually Improves Margin (Template Included)
The Meeting That Moves the Margin Needle Most
Every mid-market VAR runs deal reviews. Weekly, bi-weekly, sometimes daily for end-of-quarter pushes. The agenda is always the same: walk the pipeline, update close dates, flag deals that are stuck, pressure-test commit numbers. Will it close? When? Is procurement engaged? Did we get the PO?
These are fine questions. They help you forecast. They do absolutely nothing for your margin.
Here's what never gets asked in a typical VAR deal review: "Why did you price this at 8%?" Or: "Do you know who else is bidding?" Or: "This account accepted 13% on a similar Cisco deal six months ago — why are you quoting 9% this time?"
Those questions — the pricing questions — are where the money is. Not the pipeline questions. The 30 seconds before a rep submits a quote, when someone with context and authority asks "why this markup?" and the rep has to justify the number out loud, is the single highest-leverage moment in your entire sales process. It's where millions of dollars of gross profit are won or lost every year.
And at most VARs, that moment doesn't exist. The rep picks a number, types it into the quoting tool, and sends it. No challenge. No review. No second opinion. The deal review happens afterward — once the price is already in the customer's hands and the margin is locked in.
You're running the meeting. You're just running it too late.
Here's what we've found working with VARs who've adopted margin-focused deal reviews: the margin improvement doesn't come from the review itself. It comes from the preparation. When reps know they'll face six specific questions before a quote ships, they gather the competitive intelligence, pull the historical data, and think through the pricing rationale before they ever walk into the room. The review is the enforcement mechanism. The behavior change is the product. VARs that implement this process typically see 1.5 to 3 points of GP% improvement within two quarters — not because the review catches bad pricing, but because reps stop submitting bad pricing in the first place.
What's Wrong With Pipeline-Only Deal Reviews
A $2M Palo Alto deal at 6% GP and a $2M Palo Alto deal at 12% GP look identical on a pipeline report. Same stage, same close date, same revenue. But one generates $120K in gross profit and the other generates $240K. The pipeline review can't tell the difference, and because it can't, it doesn't try.
The result: reps optimize for win rate, which means they optimize for low prices. And by the time a deal hits "Stage 4 — Proposal Sent" and shows up in the review, the quote is already out. The pricing decision was made days ago, by the rep, alone. You can't raise a price once the customer has seen a lower number. A margin-focused deal review flips the sequence. It happens before the quote goes out, not after. The price isn't sacred. It's a hypothesis that should survive scrutiny.
What a Margin-Focused Deal Review Looks Like
The mechanics are simple. Before any deal above your threshold gets quoted, the rep presents the deal to a review group — typically the sales manager, a pricing lead or sales ops analyst if you have one, and optionally a solutions architect who can speak to the services mix. The presentation takes 3–5 minutes per deal and follows a fixed template.
The rep doesn't present a slide deck. They answer six questions. The same six questions, every time, for every deal. The consistency is the point — it forces the rep to gather the information that drives good pricing before they sit down to build the quote. When a rep knows they'll be asked "who are we competing against?" they go find out. When they know they'll be asked "what did we price last time at this account?" they look it up. The review creates the behavior.
Here are the six questions.
These six questions are the operational core of what we call the VAR Pricing Decision Tree — an eight-question framework for pricing any deal. The deal review uses six of the eight because two of them (deal type classification and strategic account position) are typically established before the review. If you want the full framework your reps can use when pricing independently, read The VAR Pricing Decision Tree.
Question 1: Who Are We Competing Against on This Deal?
This is the single most important input to a pricing decision, and it's the one reps most often skip. "I think it's competitive" is not an answer. "The customer mentioned they're also talking to SHI and possibly CDW" is an answer. "This is sole-source — we hold the SEWP contract vehicle (NASA's government-wide acquisition contract) and no other reseller is authorized" is a better one.
The review group's job is to calibrate the markup range based on the competitive landscape. If SHI is in the deal, the range compresses. If it's sole-source, the range expands significantly. If the rep doesn't know who's bidding, that's a data gap that should be closed before the quote goes out — a quick call to the customer's procurement contact or the OEM channel rep can usually surface it.
Question 2: What Did We Price Last Time on This Account/OEM Combination?
Past pricing is the best predictor of future pricing tolerance. If you won a Dell server deal at this account last quarter at 11%, you have a real data point. If you lost a Palo Alto deal at 14%, you have a ceiling. If this is the first deal at a new account, you say so — that's useful information too, because it tells the review group to be more conservative with the estimate.
This question forces reps to look backward before they look forward. It takes 90 seconds to pull up the last two or three deals in Salesforce. The number of reps who don't do this before pricing a deal is staggering.
Question 3: What Is the Deal Registration Status?
Deal reg changes the math fundamentally. If you have approved deal registration with Cisco, Palo Alto, Dell, or F5, your cost basis is 2–8 points lower than competitors who don't. That means you can price at market and make significantly better margin, or price slightly below market and still earn more than you would without deal reg.
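To make the deal reg arithmetic concrete, here's a small sketch. All the numbers — list price, the standard discount, the size of the reg benefit, and the market-clearing price — are hypothetical, chosen only to illustrate the shape of the math:

```python
# Illustrative only: how deal registration changes the margin math.
# Every input below is an assumption, not a quote from any OEM program.
list_price = 500_000
standard_cost = list_price * 0.80        # assume 20% off list without deal reg
reg_cost = standard_cost * 0.95          # assume deal reg is worth ~5 points off cost

market_price = 440_000                   # assumed competitive clearing price

gp_without_reg = market_price - standard_cost   # ~$40K, about 9% GP
gp_with_reg = market_price - reg_cost           # ~$60K, about 14% GP

print(gp_without_reg, gp_with_reg)
```

Same customer-facing price, roughly five extra points of GP — which is why "Approved / Pending / Denied / Not submitted" is a mandatory field in the review, not a nice-to-have.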
The review group needs to know: Do we have deal reg? Did we apply for it and get denied (which means a competitor probably has it)? Did we not apply at all (which is a missed opportunity that should be flagged)? If a competitor has deal reg and we don't, that changes the competitive posture of the entire deal and should push the markup strategy toward services differentiation rather than product price competition.
Question 4: What Is the Margin Floor for This Deal Type?
Your reps should state the floor, their proposed markup, and the gap between the two. The signal you're listening for: deals priced within 1–2 points of the floor on sole-source opportunities. That's not competitive pricing — that's a rep defaulting to the lowest comfortable number. A competitive hardware refresh near the floor? Fine — the market set the price. A sole-source renewal priced near the floor? Push back. If the customer isn't shopping the deal, your rep shouldn't be pricing like they are.
The margin floor itself should flex by OEM and deal type. A 6% floor on competitive Cisco networking is realistic. A 6% floor on sole-source Palo Alto security is leaving 8–10 points on the table. If your floors don't vary by context, they're functioning as targets — and your reps are pricing to them.
Question 5: What Is the Customer's Historical Price Sensitivity?
Some customers negotiate every line item down to the penny. Others review the total, check it against budget, and sign. The rep should know which type this customer is — and if they don't, they should say so.
Historical price sensitivity can often be inferred from past deals. If the customer accepted a 12% markup on the last three Cisco deals without a single pushback email, the rep doesn't need to drop to 8% "just in case." If the customer ran a reverse auction last time and drove the price down by 6 points, the rep should price near the floor and focus energy on the services attach.
This question also surfaces a common failure mode: reps who assume price sensitivity based on the customer's industry or size rather than actual behavior. "They're a bank, so they're going to be price-sensitive" is a guess. "Their procurement team has pushed back on price in three of the last four deals, averaging a 2.5-point reduction from our initial quote" is an insight. The review should demand the latter.
Question 6: What Is the Product Mix and What Margin Range Is Typical for That Mix?
A deal that's 90% hardware and 10% services has a different margin profile than one that's 50/50. The rep should present the rough product mix and the expected margin on each component.
The review group should push on three specifics: Is there a deployment or migration services engagement the customer hasn't been offered? Is there a managed services wrapper that converts one-time hardware revenue into recurring margin? And has the rep scoped a health-check or assessment engagement that often leads to Phase 2 purchases?
Most reps default to quoting the hardware the customer asked for. They don't proactively scope services because it requires a different conversation — one about outcomes, not products. The deal review is where that conversation gets forced. When someone in the room asks "what's the services play on this deal?" on every deal, reps start preparing for the question. Within two quarters, services attach rates typically climb 8–15 percentage points.
The margin math is transformative. A $500K Cisco Catalyst refresh at 9% GP generates $45K in gross profit. Add a $120K deployment and migration services engagement at 35% GP ($42K) and the deal becomes $620K at roughly 14% blended — $87K in gross profit. The hardware margin didn't change. The services turned a thin deal into a healthy one.
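The blended-margin calculation above, spelled out step by step (the deal sizes and GP percentages are the same illustrative figures used in the paragraph):

```python
# Blended GP when a services engagement is attached to a hardware deal.
hw_revenue, hw_gp_pct = 500_000, 0.09    # Catalyst refresh at 9% GP
svc_revenue, svc_gp_pct = 120_000, 0.35  # deployment/migration services at 35% GP

hw_gp = hw_revenue * hw_gp_pct           # $45,000
svc_gp = svc_revenue * svc_gp_pct        # $42,000
total_gp = hw_gp + svc_gp                # $87,000
blended_pct = total_gp / (hw_revenue + svc_revenue)  # ~14% blended GP

print(total_gp, blended_pct)
```

Note that the blended percentage is a revenue-weighted average, not a simple average of the two GP rates — the services line punches above its weight because its margin is nearly four times the hardware's.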
This is also where the review group can catch the most common services miss in the channel: the customer who's about to buy hardware from you and professional services from someone else. If a customer is buying 400 access points, someone is going to deploy them. If your rep doesn't quote the deployment, a local integrator or the OEM's own services arm will. That's not just lost services revenue — it's a competitor who now has a relationship inside your account.
The Deal Review Template
Margin-Focused Deal Review Template
Complete before every deal review session

| Field | Guidance |
| --- | --- |
| **Deal Overview** | |
| Deal Name & Customer | |
| Deal Amount | |
| OEM / Vendor | |
| Deal Type | Hardware refresh / Net-new / Displacement / ELA (Enterprise License Agreement) renewal / Services-only |
| **Competitive & Historical Context** | |
| Competitive Landscape | Sole-source / Competitive — list known competitors |
| Deal Registration Status | Approved / Pending / Denied / Not submitted |
| Last Price at This Account + OEM | GP% on most recent comparable deal, or "New account" |
| Customer Price Sensitivity | High / Moderate / Low — with supporting evidence |
| **Pricing & Mix** | |
| Product Mix | % hardware / % software-licensing / % services |
| Margin Floor for This Deal Type | Per company pricing policy |
| Proposed Markup (Hardware) | |
| Proposed Markup (Services) | |
| Blended GP% | |
| Blended GP$ | |
| **Rationale & Decision** | |
| Rep's Pricing Rationale | 2–3 sentences: why this markup, given the above inputs |
| Review Group Decision | Approved / Approved with adjustment / Requires re-pricing |
| Adjusted Markup (if changed) | |
| Notes | |
Download the Deal Review Template (PDF)
If your team quotes 40 deals per quarter and this process reprices even a quarter of them upward by 2 margin points, that's $200K–$400K in annual GP you're currently leaving on the table. The math depends on your deal volume and average deal size, but for a $200M VAR, the typical impact is $300K–$600K in the first year. Download the template, run it for one quarter, and measure the difference.
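The closing estimate can be sanity-checked with a rough uplift model. Every input here is an assumption — plug in your own quarterly deal count, average deal size, repriced share, and margin lift:

```python
# Rough annual GP-uplift model for a margin-focused deal review process.
# All inputs are hypothetical planning assumptions, not benchmarks.
deals_per_quarter = 40
avg_deal_size = 350_000        # assumed mid-market average deal size
repriced_share = 0.25          # a quarter of deals repriced upward
margin_lift = 0.02             # 2 GP points of improvement on those deals

annual_gp_uplift = (deals_per_quarter * 4) * repriced_share * avg_deal_size * margin_lift
print(annual_gp_uplift)        # ~$280K at these assumptions
```

At a $250K average deal size the same model gives ~$200K; at $500K it gives ~$400K — which is where the $200K–$400K range in the paragraph above comes from.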