The Five Core Formats
The IPMR (and its predecessor, the CPR) organizes program performance data into five core formats. Each serves a distinct purpose and audience. Together, they provide a complete picture of program health — from the highest WBS summary down to individual variance narratives.
| Format | Name | Content | Primary Audience |
|---|---|---|---|
| Format 1 | WBS Report | PV, EV, AC, BAC, EAC by WBS element; cumulative and current period; cost and schedule variances | Program Manager, customer PM |
| Format 2 | Organizational Report | Same metrics as Format 1, organized by OBS (performing organization) instead of WBS | Functional managers, resource planners |
| Format 3 | Baseline Summary | Time-phased PMB, management reserve, undistributed budget, and total allocated budget by month | Baseline analysts, change control |
| Format 4 | Staffing Forecast | Planned vs. actual staffing by month, by functional category (engineering, manufacturing, etc.) | Resource managers, HR planning |
| Format 5 | Explanatory Notes | Variance narratives, EAC discussion, program issues, corrective actions, and management assessment | Everyone — the most-read format |
Formats 6 (IMS data extract) and 7 (historical/summary data) were added with the IPMR transition. Format 6 provides the government with machine-readable schedule data for independent analysis. Format 7 provides trending data across the program’s life.
Data Reconciliation: The Foundation of a Clean Report
The single most common cause of IPMR rejection is data inconsistency. The report draws from three independent systems — the scheduling tool, the accounting system, and the EV management tool — and if they disagree, the report is wrong. Reconciliation must happen before report generation, not after.
| Reconciliation Check | System A | System B | Common Mismatch |
|---|---|---|---|
| EV vs. Schedule Status | EV tool (% complete) | IMS (task status) | Task marked complete in IMS but EV not claimed; task in-progress in IMS but 0% EV |
| AC vs. Accounting | EV tool (actual cost) | General ledger | Costs booked to wrong WBS; accruals missing; indirect rate adjustments not applied |
| PV vs. Schedule | EV tool (time-phased budget) | IMS (resource loading) | Schedule shifted but budget not re-phased; resource rates updated in one system but not the other |
| BAC vs. Contract Value | EV tool (total budget) | Contract documents | Approved changes added to contract but not yet distributed to CAs; MR log out of sync |
Step 1: Pull the EV status report and the IMS status report side by side. For every control account, verify that tasks marked “complete” in the IMS have corresponding 100% EV claimed. Flag discrepancies.
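In code, Step 1 amounts to joining the two status extracts on task ID and flagging disagreements. A minimal sketch, assuming each system can export a per-task table; the dict-based extracts and field values are illustrative, not any specific tool's API:

```python
# Step 1 check: tasks marked complete in the IMS should have 100% EV
# claimed, and in-progress tasks should have some EV claimed.
# Status values and task IDs are illustrative.

def find_ev_schedule_mismatches(ims_status, ev_status):
    """Return (task_id, reason) pairs where IMS status and claimed EV disagree.

    ims_status: {task_id: "complete" | "in_progress" | "not_started"}
    ev_status:  {task_id: percent EV claimed, 0-100}
    """
    flags = []
    for task_id, status in ims_status.items():
        pct = ev_status.get(task_id, 0)
        if status == "complete" and pct < 100:
            flags.append((task_id, "complete in IMS, EV not fully claimed"))
        elif status == "in_progress" and pct == 0:
            flags.append((task_id, "in progress in IMS, 0% EV claimed"))
        elif status == "not_started" and pct > 0:
            flags.append((task_id, "not started in IMS, EV already claimed"))
    return flags

ims = {"1.2.1": "complete", "1.2.2": "in_progress", "1.2.3": "not_started"}
ev = {"1.2.1": 80, "1.2.2": 0, "1.2.3": 0}
for task_id, reason in find_ev_schedule_mismatches(ims, ev):
    print(task_id, "-", reason)
```

Both mismatch directions from the reconciliation table above are covered: complete-but-unclaimed and in-progress-but-zero-EV.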
Step 2: Compare AC in the EV tool to the general ledger cost report. Totals must match within $1. If they do not, trace the difference: missing accruals, cost transfers, or indirect rate true-ups are the usual suspects.
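Step 2 is a per-WBS subtraction with a total-level tolerance. A sketch, with illustrative WBS keys and amounts:

```python
# Step 2 check: AC in the EV tool vs. the general ledger. The total
# must match within $1; per-WBS deltas localize the discrepancy.

def reconcile_actuals(ev_ac, gl_ac, tolerance=1.00):
    """Compare per-WBS actual cost between the EV tool and the GL.

    Returns (total_delta, suspects) where suspects maps each WBS
    element whose difference exceeds the tolerance to its delta.
    """
    all_wbs = set(ev_ac) | set(gl_ac)
    deltas = {w: ev_ac.get(w, 0.0) - gl_ac.get(w, 0.0) for w in all_wbs}
    total_delta = sum(deltas.values())
    suspects = {w: d for w, d in deltas.items() if abs(d) > tolerance}
    return total_delta, suspects

ev = {"1.1": 120_450.00, "1.2": 88_210.50}
gl = {"1.1": 120_450.00, "1.2": 90_210.50}   # $2,000 in the GL only
total, suspects = reconcile_actuals(ev, gl)
print(f"total delta: {total:+,.2f}")
print("investigate:", suspects)
```

A nonzero delta localized to one WBS element usually points to a cost transfer or missing accrual; a delta spread across all elements usually points to an indirect rate true-up.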
Step 3: Verify that the sum of all CA BACs plus Undistributed Budget plus Management Reserve equals the Contract Budget Base (CBB). If a contract modification was processed this period, confirm the new budget has been allocated.
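Step 3 is a single budget identity, checked every month. A sketch with illustrative control-account values:

```python
# Step 3 identity check:
#   sum(CA BACs) + Undistributed Budget + Management Reserve == CBB

def check_cbb(ca_bacs, ub, mr, cbb, tolerance=1.00):
    """Return (reconciles, allocated_total) for the CBB identity."""
    allocated = sum(ca_bacs.values()) + ub + mr
    return abs(allocated - cbb) <= tolerance, allocated

ca_bacs = {"CA-100": 4_200_000, "CA-200": 3_100_000, "CA-300": 1_450_000}
ok, allocated = check_cbb(ca_bacs, ub=250_000, mr=500_000, cbb=9_500_000)
print("allocated:", allocated, "| reconciles to CBB:", ok)
```

When this check fails right after a contract modification, the usual cause from the reconciliation table applies: the mod raised the CBB but the new budget is still sitting outside Undistributed Budget or has not yet been distributed to control accounts.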
Step 4: Spot-check 5–10 control accounts: does the time-phased PV curve in the EV tool match the resource-loaded schedule in the IMS? If a schedule change was made this month, was the budget re-phased to match?
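The Step 4 spot check compares two time-phased curves month by month. A sketch for one control account; month keys and dollar values are illustrative:

```python
# Step 4 spot check: monthly time-phased PV in the EV tool vs. the
# resource-loaded IMS for one control account.

def compare_pv_phasing(ev_pv, ims_pv, tolerance=1.00):
    """Return {month: delta} for months where the curves disagree."""
    months = sorted(set(ev_pv) | set(ims_pv))
    return {m: ev_pv.get(m, 0.0) - ims_pv.get(m, 0.0)
            for m in months
            if abs(ev_pv.get(m, 0.0) - ims_pv.get(m, 0.0)) > tolerance}

ev_pv  = {"2025-01": 50_000, "2025-02": 75_000, "2025-03": 60_000}
ims_pv = {"2025-01": 50_000, "2025-02": 60_000, "2025-03": 75_000}
print(compare_pv_phasing(ev_pv, ims_pv))
```

Note the signature of an un-re-phased budget: offsetting deltas in adjacent months (work slipped from February to March in the IMS while the PV curve stayed put) with the totals still matching.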
Result: If all four checks pass, the data is clean and report generation can proceed. If any check fails, resolve the discrepancy before generating Format 1 — never submit a report with known data errors.
Format 5 Narrative Writing
Format 5 is the only part of the IPMR that most executives and government program managers actually read. It is the narrative layer that transforms data into insight. A well-written Format 5 tells the program’s story: what happened this month, why it matters, and what the team is doing about it.
Format 5 typically contains the following sections:
| Section | Content | Guidance |
|---|---|---|
| 1. Executive Summary | Program health overview in 2–3 paragraphs | Lead with the bottom line: is the program on track? State CPI, SPI, EAC, and key accomplishments or issues |
| 2. Variance Analysis | Narratives for each WBS element exceeding thresholds | Follow the 5-part structure: variance statement → root cause → impact → corrective action → recovery timeline |
| 3. EAC Discussion | Management EAC rationale; comparison to statistical EAC | Explain why management EAC differs from BAC ÷ CPI; list specific drivers of the difference |
| 4. Schedule Assessment | Critical path status, milestone performance, schedule risk | Identify any milestones missed or at risk; explain schedule recovery plan if SPI ≤ 0.95 |
| 5. Staffing Assessment | Actual vs. planned staffing; key hires, departures, or gaps | Highlight any resource constraints affecting performance; tie staffing issues to specific variances |
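The statistical EAC that Section 3 of the narrative must be reconciled against is a one-line formula. A minimal sketch with illustrative numbers:

```python
# The indices and statistical EAC referenced in the Format 5
# sections above. Dollar values are illustrative.

def evm_metrics(pv, ev, ac, bac):
    cpi = ev / ac          # cost efficiency: earned per dollar spent
    spi = ev / pv          # schedule efficiency: earned vs. planned
    eac_stat = bac / cpi   # statistical EAC, assuming CPI holds
    return cpi, spi, eac_stat

cpi, spi, eac_stat = evm_metrics(pv=1_000_000, ev=900_000,
                                 ac=1_000_000, bac=10_000_000)
print(f"CPI {cpi:.2f}, SPI {spi:.2f}, statistical EAC ${eac_stat:,.0f}")
```

If management's EAC were, say, $10.5M against the ~$11.1M statistical EAC here, the Format 5 EAC discussion must explain that gap driver by driver, not simply assert that performance will improve.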
💡 Write Format 5 for the Decision-Maker
The reader of Format 5 is a busy government PM who oversees multiple programs. They need to know three things: (1) Is this program in trouble? (2) Does the contractor understand the problems? (3) Is the contractor taking credible action? Write for that audience. Lead with conclusions, not background. Quantify everything. Avoid jargon that does not add precision. A Format 5 that requires the reader to cross-reference Format 1 to understand the narrative has failed its purpose.
Common Submission Errors and How to Avoid Them
DCMA and government program offices track submission quality. Repeated errors erode confidence in the contractor’s EVMS and can trigger a formal system surveillance review. The following errors account for the majority of report rejections:
| Error | Frequency | Impact | Prevention |
|---|---|---|---|
| Arithmetic errors in Format 1 | Very common | Report rejected; credibility damaged | Automated cross-footing checks before submission; never manually adjust cells |
| BAC ≠ CBB | Common | Suggests baseline integrity problem | Reconcile CBB to contract value every month; trace every contract mod to CA distribution |
| EAC inconsistency | Common | Format 1 EAC differs from the EAC discussed in Format 5; reader cannot tell which is authoritative | Generate both from the same data extract; never manually key EAC values into narratives |
| Stale narratives | Very common | Identical text copied from prior month | Require CAMs to update root cause, impact, and timeline each month; flag unchanged text |
| Missing period dates | Occasional | Government cannot file report correctly | Automate header generation from reporting calendar |
| Format 3 does not reflect approved changes | Occasional | Baseline change history is incomplete | Link Format 3 generation to the change control log; verify monthly |
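The "automated cross-footing checks" in the first row of the table above can be as simple as verifying each derived column against its formula. A sketch; the row layout is illustrative:

```python
# Cross-footing check for one Format 1 row: the derived variance
# columns must equal their formulas within rounding tolerance.

def crossfoot_row(row, tol=1.00):
    """Check CV = EV - AC, SV = EV - PV, VAC = BAC - EAC for one row.

    Returns the list of columns that fail to cross-foot.
    """
    errors = []
    if abs(row["CV"] - (row["EV"] - row["AC"])) > tol:
        errors.append("CV")
    if abs(row["SV"] - (row["EV"] - row["PV"])) > tol:
        errors.append("SV")
    if abs(row["VAC"] - (row["BAC"] - row["EAC"])) > tol:
        errors.append("VAC")
    return errors

row = {"PV": 500, "EV": 480, "AC": 510, "BAC": 2_000, "EAC": 2_100,
       "CV": -30, "SV": -20, "VAC": -100}
print(crossfoot_row(row))  # [] -> row cross-foots cleanly
```

Run this over every row before submission, and also verify that child WBS elements sum to their parents; a manually adjusted cell fails one of these checks immediately.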
The Monthly Close Process
Generating a clean IPMR requires a disciplined monthly close process. Every activity must happen in sequence — you cannot write narratives before data is reconciled, and you cannot reconcile before actuals are posted. The timeline is compressed: most contracts require submission within 15 business days of the reporting period end.
| Phase | Days | Activities | Deliverable |
|---|---|---|---|
| 1. Data Close | 1–3 | Post actuals, status schedule, claim EV, close accounting period | Raw data in all three systems |
| 2. Reconciliation | 4–5 | Cross-system reconciliation, resolve discrepancies, validate EACs | Clean, reconciled dataset |
| 3. Analysis | 6–8 | CAMs write variance narratives; program-level rollup; EAC review | Draft Format 5; updated EACs |
| 4. Report Generation | 9–11 | Generate Formats 1–4 from tool; assemble Format 5; cross-check | Draft IPMR package |
| 5. Review & Submit | 12–15 | PM review, quality check, final corrections, electronic submission | Submitted IPMR |
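The phase gates above can be derived mechanically from the reporting calendar, which is also how the header dates in the "Missing period dates" error are best automated. A sketch that counts business days from the period end; holidays are omitted for brevity, and a real reporting calendar would include them:

```python
# Derive the submission deadline (15 business days after period end)
# and each phase gate from the monthly close table above.

from datetime import date, timedelta

def add_business_days(start, n):
    """Return the date n business days (Mon-Fri) after start."""
    d = start
    while n > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:      # 0-4 are Mon-Fri
            n -= 1
    return d

period_end = date(2025, 1, 31)   # a Friday
gates = {"data close": 3, "reconciliation": 5, "analysis": 8,
         "report generation": 11, "submit": 15}
for phase, day in gates.items():
    print(f"{phase:>17}: day {day:2d} -> {add_business_days(period_end, day)}")
```

Publishing these calendar dates at the start of each period gives CAMs and Finance a concrete target, which makes the "firm accounting close date" below enforceable.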
⚠️ Late Actuals Kill the Timeline
The most common reason for late IPMR submission is late cost data. If the accounting system does not close on time, every downstream activity shifts. Programs should negotiate a firm accounting close date with Finance and escalate immediately when it slips. A one-day delay in cost close typically produces a two-day delay in submission — because reconciliation and analysis are compressed rather than skipped, creating errors that require rework.
🎯 The Bottom Line
The monthly IPMR is the primary deliverable of the EVMS process — it is where data becomes information and information becomes decisions. Master the five formats, especially Format 5 where narrative meets numbers. Reconcile before you report — never after. Automate arithmetic checks to eliminate the errors that destroy credibility. And protect the monthly close timeline ruthlessly, because late data produces late reports and compressed analysis produces bad narratives. The IPMR is not paperwork — it is the program’s monthly opportunity to demonstrate command of its own performance.