Purpose of This Document
This document records the policy decisions and operational rationale behind the ATFA CE review process. It is the companion to the BEN Review Engine system prompt — where the system prompt tells BEN what to do, this document explains why.
It is written for people, not machines. The primary audiences are the ATFA Director, the CE Review Committee, and anyone who may step into this role in the future. The goal is that a successor could read this document and understand not just the rules, but the thinking behind them — making the process defensible, transferable, and improvable over time.
CE Credit Calculation Policies
Why not round up spare minutes? A 55-minute session is 1.0 credit, not 1.1. Partial-credit accounting creates administrative complexity and opens the door to manipulation. The clean threshold approach is easier to apply consistently and harder to game. Spare minutes are simply lost, which is standard practice across CE programs.
Future review: This policy should be revisited as ATFA's certificate holder base matures. Stricter thresholds (e.g., requiring 60 minutes minimum) may be appropriate in later years.
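The threshold rules from the decision log (50–89 min = 1.0 credit, 90–119 min = 1.5, spare minutes lost) can be sketched as a simple step function. This is a minimal illustration, not an implementation: the function name `ce_credits` and any behavior past 119 minutes are assumptions, not recorded policy.

```python
def ce_credits(minutes: int) -> float:
    """Convert session length to CE credits; spare minutes are lost.

    Thresholds per the March 2026 decision log: 50-89 min = 1.0,
    90-119 min = 1.5. Behavior beyond 119 minutes is an assumed
    continuation of the pattern, not stated policy.
    """
    if minutes < 50:
        return 0.0   # below the lenient 50-minute floor: no credit
    if minutes < 90:
        return 1.0   # e.g. a 55-minute session earns 1.0, not 1.1
    if minutes < 120:
        return 1.5
    # Assumed extension: +0.5 credit per full 30 minutes past the
    # first academic hour. Not in the decision log.
    return 1.0 + 0.5 * ((minutes - 60) // 30)
```

Note that spare minutes never carry over: 89 minutes yields exactly the same credit as 50, which is what makes the rule easy to apply and hard to game.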
Program Classification Policies
Tier 4 Research Methodology
The Feedback Loop
The feedback loop is the most strategically important feature of the BEN system. It transforms each review cycle from a one-time event into a compounding investment in the Known Programs database.
- Year 2: Every program approved in Year 1 is now Tier 3. Only genuinely new programs hit Tier 4. Research volume drops meaningfully.
- Year 3+: The database compounds. The most common programs in any given submission batch are likely ones that have been approved before. The marginal cost of each new cycle decreases over time.
This is why maintaining the Known Programs database rigorously — with full session metadata, not just program names — is worth the upfront effort. The database is the system's primary asset.
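The tier lookup and the feedback loop it feeds can be sketched as follows. This is a minimal sketch under stated assumptions: the function names, data structures, and the example provider name are hypothetical; the tier membership (TAF and TAI as sole Tier 1, ABA/Cannon/FIRMA as initial Tier 2) follows the March 2026 decisions.

```python
# Sketch of tier classification plus the feedback loop. Names and data
# structures are assumptions; tier membership follows the decision log.

TIER_1 = {"TAF", "TAI"}              # Auto-Approved
TIER_2 = {"ABA", "Cannon", "FIRMA"}  # Reputable

def classify(provider: str, known_programs: set) -> int:
    """Return the review tier for a submitted program's provider."""
    if provider in TIER_1:
        return 1
    if provider in TIER_2:
        return 2
    if provider in known_programs:
        return 3  # approved in a prior cycle: Known
    return 4      # Unknown: triggers the full research protocol

def record_approval(provider: str, known_programs: set) -> None:
    """The feedback loop: once approved, a Tier 4 program resolves
    to Tier 3 in every future review cycle."""
    known_programs.add(provider)

# A new provider hits Tier 4 exactly once ("Example Provider" is a
# hypothetical name, not a real submission).
known = set()
assert classify("Example Provider", known) == 4
record_approval("Example Provider", known)
assert classify("Example Provider", known) == 3
```

The usage lines at the bottom show the compounding effect in miniature: the expensive Tier 4 path is paid once per program, after which the database answers for it.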
Decision Log
A running log of significant process decisions — what was decided, when, and by whom. This creates an audit trail for governance purposes and helps future successors understand how the process evolved.
| Date | Decision | Made By | Notes |
|---|---|---|---|
| March 2026 | Established CE credit rounding rules: 50–89 min = 1.0, 90–119 min = 1.5. Spare minutes lost, not rounded up. | Ben Hopf | Adopted lenient 50-min floor for Year 1 to accommodate programs using academic-hour conventions. To be reviewed in Year 3. |
| March 2026 | Established four-tier program classification framework (Auto-Approved, Reputable, Known, Unknown). | Ben Hopf | TAF and TAI designated as sole Tier 1 programs. ABA, Cannon, FIRMA designated as initial Tier 2. |
| March 2026 | Selected Google Sheets as v1 database and Committee review surface. CE App retained for holder-facing submission only. | Ben Hopf | CE App API capabilities pending assessment. May migrate if API supports bulk operations. |
| March 2026 | Established BEN Review Engine as name for the process system, powered by Claude (Anthropic). | Ben Hopf | Final BEN acronym TBD — shortlist: Batch Evaluation Navigator, Bureau of Evaluation & Notation, Benchmark Evaluation Navigator. |
| March 2026 | Established four-layer Tier 4 research protocol: web search → LinkedIn → YouTube → Facebook/social. | Ben Hopf | BEN produces research summary cards; Director and Committee make final approval decisions. |
| March 2026 | All Committee communications via Google Sheets View+Comment. Comments serve as formal async decision record. | Ben Hopf | Synchronous call reserved for contested submissions only. |
ATFA Working Documents — Site Map
All ATFA CE process documentation is published across the following subdomains. Each page serves a distinct audience and purpose.