Accredited Trust and Fiduciary Advisor

CE Review Process
Policy & Decisions

The governance document behind the ATFA CE review process — the reasoning, rationale, and policy decisions that inform how the BEN Review Engine operates. Written for the Director, Committee, and future successors.

Status: Draft — Internal
Version: v0.1
Audience: Director, Committee, Successors
Author: Ben Hopf
Last Updated: March 2026
I. Purpose of This Document

This document records the policy decisions and operational rationale behind the ATFA CE review process. It is the companion to the BEN Review Engine system prompt — where the system prompt tells BEN what to do, this document explains why.

It is written for people, not machines. The primary audiences are the ATFA Director, the CE Review Committee, and anyone who may step into this role in the future. The goal is that a successor could read this document and understand not just the rules, but the thinking behind them — making the process defensible, transferable, and improvable over time.

Relationship to other documents: This is one of four working documents in the ATFA CE process system. See §VII for the full site map and links to each.
II. CE Credit Calculation Policies

Policy CE-001
Official program CE data takes precedence over time-slot calculation
When a program provides official per-session CE credit amounts, use those as the authoritative source. Time-slot calculation is a secondary method used only when official per-session data does not exist.
Some programs publish cumulative CE totals (e.g., "16.5 ATFA hours for the full program") without per-session breakdowns. In those cases, we cannot use the program total to validate individual session submissions — we must calculate from time slots instead. Always document which method was used in the recommendation report so the Director and Committee can assess confidence.
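The precedence rule above can be sketched in Python. This is a minimal illustration, not BEN's actual implementation; the function and label names (`session_credits`, `"official_program_data"`, `"time_slot_calculation"`) are assumptions introduced here. The key point it demonstrates is that the method used travels with the credit amount, so it can be recorded in the recommendation report.

```python
from typing import Callable, Optional, Tuple

def session_credits(
    official_credits: Optional[float],
    session_minutes: int,
    time_slot_credits: Callable[[int], float],
) -> Tuple[float, str]:
    """Policy CE-001 precedence: official per-session CE data wins;
    time-slot calculation is the fallback. The method used is returned
    alongside the amount so the report can document it."""
    if official_credits is not None:
        return official_credits, "official_program_data"
    return time_slot_credits(session_minutes), "time_slot_calculation"
```

A caller would pass the time-slot calculator as the fallback, e.g. `session_credits(None, 50, my_calculator)` returns the calculated figure tagged `"time_slot_calculation"`.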
Policy CE-002
CE credit rounding rules — time-slot calculation method
Under 30 min = 0 credits (flag for Director)  ·  30–49 min = 0.5 credits  ·  50–89 min = 1.0 credit  ·  90–119 min = 1.5 credits  ·  120–149 min = 2.0 credits  ·  150–179 min = 2.5 credits. Spare minutes below each threshold are lost — never rounded up.
Why these thresholds? Different programs use different definitions of a CE "hour" — some use 50-minute hours (common in academic settings), others use 60-minute hours. Rather than penalizing holders for submitting from programs that use 50-minute hours, ATFA's position in early years is that both 50-minute and 60-minute sessions qualify for 1.0 credit. This is a deliberate policy of accessibility during the ramp-up phase.

Why not round up spare minutes? A 55-minute session is 1.0 credit, not 1.1. Partial-credit accounting creates administrative complexity and opens the door to manipulation. The clean threshold approach is easier to apply consistently and harder to game. Spare minutes are simply lost — this is the industry-standard approach for most CE programs.

Future review: This policy should be revisited as ATFA's certificate holder base matures. Stricter thresholds (e.g., requiring 60 minutes minimum) may be appropriate in later years.
Policy CE-003
Non-CE sessions are ineligible regardless of program tier
Receptions, cocktail parties, meals, registration periods, breaks, administrative introductions, and closing wrap-ups are ineligible for CE credit. This applies even within Tier 1 Auto-Approved programs like TAF and TAI.
The ATFA certification is a professional designation — CE credits should reflect substantive educational content, not attendance at social events. Even within highly trusted programs, BEN should identify and flag non-educational sessions. The Director has discretion to override this in edge cases (e.g., a reception that includes a substantive keynote), but the default is ineligibility. When flagged, BEN should note the session type and the reason for ineligibility clearly in the report.
Policy CE-004
Discrepancies between submitted and calculated hours are flagged, not auto-denied
When a holder's submitted hours differ from BEN's calculated amount, flag the discrepancy in the recommendation report, showing both figures. Do not auto-deny; surface it for Director review.
Holders may claim hours that differ from the official agenda for legitimate reasons — a session ran long, they attended an unscheduled workshop, or the program uses a different credit conversion. Auto-denying discrepancies would be punitive and would burden the Director with appeals. The better approach is to surface the gap and let the Director apply judgment. BEN's role is to flag, not adjudicate.
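The flag-not-deny rule reduces to a simple comparison. A sketch; the status labels and field names are illustrative, not part of the policy:

```python
def review_hours(submitted: float, calculated: float) -> dict:
    """Policy CE-004: a mismatch is surfaced with both figures shown,
    never auto-denied. The Director adjudicates; BEN only flags."""
    if submitted != calculated:
        return {
            "status": "FLAG_FOR_DIRECTOR",
            "submitted_hours": submitted,
            "calculated_hours": calculated,
        }
    return {
        "status": "MATCH",
        "submitted_hours": submitted,
        "calculated_hours": calculated,
    }
```

Note there is no deny path at all: the only possible outcomes are a match or an escalation to the Director.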
III. Program Classification Policies

Policy CL-001
Tier 1 programs are auto-approved without individual Committee review
TAF (Trust Advisors Forum) and TAI (Trust Advisors Institute) submissions are recommended for approval automatically. Committee ratification is bulk, not individual.
Both TAF and TAI are administered by Campbell University and the Trust Education Foundation — the same institution that administers the ATFA certification itself. The Committee's review role exists to assess programs at arm's length; reviewing ATFA's own programs session-by-session is redundant and wastes Committee time. Bulk ratification preserves the formal governance record without the overhead.
Policy CL-002
Low-confidence matches are flagged, not auto-classified
When BEN matches a submission against a known program with low confidence (e.g., partial name match, inconsistent date, ambiguous session title), the submission is flagged for Director review rather than auto-classified into a tier.
A false positive — approving a submission that doesn't actually match a known program — is worse than a false negative that requires human review. ATFA's credibility depends on the CE records being accurate and defensible. When BEN isn't sure, it should say so clearly rather than guess. Confidence notes in the recommendation report give the Director the context to resolve ambiguous cases quickly.
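The asymmetry — a false positive is worse than a false negative — can be expressed as a conservative classification gate. This is a sketch under stated assumptions: the policy sets no numeric confidence threshold, so `CONFIDENCE_FLOOR` is hypothetical, as are the type and status names.

```python
from dataclasses import dataclass
from typing import Optional

CONFIDENCE_FLOOR = 0.8  # hypothetical cutoff; the policy sets no number

@dataclass
class ProgramMatch:
    program_name: Optional[str]  # None when no known program matched
    confidence: float            # 0.0-1.0, from the matching step
    signals: str                 # e.g. "partial name match; date mismatch"

def classify(match: ProgramMatch) -> str:
    """Policy CL-002: below the confidence floor, flag for the Director
    rather than auto-classifying the submission into a tier."""
    if match.program_name is None:
        return "TIER_4_UNKNOWN"
    if match.confidence < CONFIDENCE_FLOOR:
        return "FLAG_FOR_DIRECTOR"
    return "AUTO_CLASSIFY"
```

The design choice is that uncertainty always routes to a human: the only way a submission is auto-classified is a confident match against a known program.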
Policy CL-003
Newly approved Tier 4 programs are auto-promoted to Tier 3 at cycle close
At the close of each review cycle, every program approved from Tier 4 (Unknown) is automatically added to the Tier 3 (Known) database with full session metadata. This happens before the next cycle begins.
This is the core compounding mechanism of the system. The first year will have the highest Unknown volume — but every cycle thereafter benefits from the accumulated approvals of all prior cycles. In practice, this means the Director and Committee spend progressively less time on research-heavy Tier 4 submissions each year, and more time is freed for genuine judgment calls. The database is the asset; maintaining it rigorously is what makes the system self-improving.
IV. Tier 4 Research Methodology

Policy RS-001
BEN exhausts automated research before flagging for human review
For every Tier 4 submission, BEN runs a four-layer research sweep (web search → LinkedIn → YouTube → Facebook/social) before producing a research summary card and flagging for Director and Committee review.
The research sweep exists to separate the work that a machine can do (finding and aggregating public information) from the work that requires human judgment (deciding whether that information is sufficient to approve a credit). The Director and Committee's time is valuable — they should not be spending it Googling event names. BEN does the legwork; humans make the call.
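The fixed layer order can be sketched as a pipeline that always runs every layer — no early exit — so the summary card reflects a complete search. The layer keys and the `search_fns` mapping are illustrative stand-ins for whatever search tooling BEN actually uses:

```python
from typing import Callable, Dict, List

# Policy RS-001's fixed order: web search -> LinkedIn -> YouTube -> Facebook/social.
RESEARCH_LAYERS = ["web_search", "linkedin", "youtube", "facebook_social"]

def run_sweep(
    program_name: str,
    search_fns: Dict[str, Callable[[str], List[str]]],
) -> Dict[str, List[str]]:
    """Run all four layers and aggregate findings per layer.
    Every layer runs even if an earlier one found results, so the
    research summary card documents an exhaustive search."""
    return {layer: search_fns[layer](program_name) for layer in RESEARCH_LAYERS}
```

If every layer returns an empty list, Policy RS-002 applies and the Director requests documentation from the holder.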
Policy RS-002
When research yields nothing, request documentation from the holder
If all four research layers return no verifiable information, BEN notes this explicitly in the research summary card and recommends that the Director request supporting documentation from the certificate holder before making an approval decision.
Some legitimate programs — particularly small regional conferences, internal corporate training events, or niche professional gatherings — simply don't have a public web footprint. The absence of online evidence is not proof of illegitimacy. In these cases, the burden of proof shifts to the holder: they should be able to provide a certificate of attendance, an agenda, or a letter from the organizer. This is a standard approach in professional CE administration.
Policy RS-003
BEN produces research summary cards, not approval decisions
BEN's output for Tier 4 submissions is a research summary card containing: source URLs, event description, agenda match assessment, legitimacy signals, calculated hours (if determinable), and a plain-language confidence note. The approval decision is made by the Director and Committee.
The line between research and judgment is intentional. BEN can determine whether information exists and what it says — it cannot determine whether ATFA should accept that information as sufficient. That requires institutional knowledge, context about the submitting holder, and professional discretion that only the humans in the process possess. Keeping BEN in the research role and the Director/Committee in the judgment role creates appropriate accountability.
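The card's contents above map naturally onto a simple record type. A sketch; the field names are paraphrases of the policy's list, not a schema BEN is known to use. Deliberately, there is no decision field:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ResearchSummaryCard:
    """Fields mirror Policy RS-003's list. Note the absence of any
    'decision' field: approval belongs to the Director and Committee."""
    source_urls: List[str]
    event_description: str
    agenda_match_assessment: str       # e.g. "matches", "partial", "no agenda found"
    legitimacy_signals: List[str]
    calculated_hours: Optional[float]  # None when not determinable
    confidence_note: str               # plain language, for human readers
```

Keeping the decision out of the data structure enforces the research/judgment boundary structurally, not just procedurally.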
V. The Feedback Loop

The feedback loop is the most strategically important feature of the BEN system. It transforms each review cycle from a one-time event into a compounding investment in the Known Programs database.

Policy FL-001
How the feedback loop works
At cycle close, every Tier 4 program approved by the Committee is promoted to Tier 3 with full session metadata stored. In subsequent cycles, those programs are matched automatically — no research sweep required.
Year 1: High Unknown volume. BEN researches many programs from scratch. Committee reviews are thorough but time-intensive.

Year 2: Every program approved in Year 1 is now Tier 3. Only genuinely new programs hit Tier 4. Research volume drops meaningfully.

Year 3+: The database compounds. The most common programs in any given submission batch are likely ones that have been approved before. The marginal cost of each new cycle decreases over time.

This is why maintaining the Known Programs database rigorously — with full session metadata, not just program names — is worth the upfront effort. The database is the system's primary asset.
Practical implication: The quality of Year 1's database seeding has an outsized impact on Year 2 and beyond. Taking the time to compile prior-year approval records and load them into the Known Programs database before the first BEN-assisted cycle will significantly reduce the Tier 4 research burden in that first cycle.
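The cycle-close promotion step is small in code but carries the compounding effect described above. A sketch, assuming a simple dict-shaped database; the real Known Programs database is a Google Sheet, and these data shapes are illustrative:

```python
from typing import Dict, List

def close_cycle(
    known_programs: Dict[str, dict],
    approved_tier4: Dict[str, List[dict]],
) -> Dict[str, dict]:
    """Policies CL-003/FL-001: promote every approved Tier 4 program to
    Tier 3 with its full session metadata, before the next cycle opens.
    Storing sessions (not just names) is what makes future automatic
    matching and hour validation possible."""
    for name, sessions in approved_tier4.items():
        known_programs[name] = {"tier": 3, "sessions": sessions}
    return known_programs
```

In the next cycle, a submission naming a promoted program matches Tier 3 directly and skips the four-layer research sweep entirely.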
VI. Decision Log

A running log of significant process decisions — what was decided, when, and by whom. This creates an audit trail for governance purposes and helps future successors understand how the process evolved.

Date · Decision · Made By · Notes
March 2026 · Established CE credit rounding rules: 50–89 min = 1.0, 90–119 min = 1.5. Spare minutes lost, not rounded up. · Ben Hopf · Adopted lenient 50-min floor for Year 1 to accommodate programs using academic-hour conventions. To be reviewed in Year 3.
March 2026 · Established four-tier program classification framework (Auto-Approved, Reputable, Known, Unknown). · Ben Hopf · TAF and TAI designated as sole Tier 1 programs. ABA, Cannon, FIRMA designated as initial Tier 2.
March 2026 · Selected Google Sheets as v1 database and Committee review surface. CE App retained for holder-facing submission only. · Ben Hopf · CE App API capabilities pending assessment. May migrate if API supports bulk operations.
March 2026 · Established BEN Review Engine as name for the process system, powered by Claude (Anthropic). · Ben Hopf · Final BEN acronym TBD — shortlist: Batch Evaluation Navigator, Bureau of Evaluation & Notation, Benchmark Evaluation Navigator.
March 2026 · Established four-layer Tier 4 research protocol: web search → LinkedIn → YouTube → Facebook/social. · Ben Hopf · BEN produces research summary cards; Director and Committee make final approval decisions.
March 2026 · All Committee communications via Google Sheets View+Comment. Comments serve as formal async decision record. · Ben Hopf · Synchronous call reserved for contested submissions only.
VII. ATFA Working Documents — Site Map

All ATFA CE process documentation is published across the following subdomains. Each page serves a distinct audience and purpose.

Coming soon: database.atfacertification.com — read-only view of the Known Programs database for Committee reference.