$CTR Distribution Guide
Version 1.0
🎯 Purpose
This framework defines the standards, criteria, and process for allocating CTR (Contribution Tokens) to contributors within Matou DAO. CTR is a non-transferable, reputation-based token awarded to individuals who have completed peer-verified contributions that support the governance, development, and operation of Matou DAO.
The goal of this framework is to ensure:
- Consistent and fair recognition of effort
- Peer-reviewed validation of contributions
- A transparent reputation system that rewards meaningful participation
🪙 What is CTR?
CTR (Contribution Token) is issued to recognize verified contributions to the DAO. Unlike UTIL, which is used to access DAO services, CTR reflects the contributor's reputation and standing within the DAO.
✅ Key Characteristics:
- Non-transferable: Cannot be bought, sold, or traded.
- Earned only: Must be issued through a verified contribution process.
- Used for: Eligibility for roles, voting weight in some governance processes, or access to advanced features and decision-making privileges (based on DAO policy).
🔁 Allocation Process
| Step | Description |
|---|---|
| 1. Contribution Framed | A steward publishes a contribution request with clear outputs, roles, and peer reviewers. |
| 2. Work Completed | The contributor completes the task as defined in the request. |
| 3. Peer Review | A designated peer (or peers) evaluates the contribution based on quality, completeness, and adherence to the scope. |
| 4. Steward Review | The responsible steward confirms peer feedback and final output. |
| 5. CTR Allocated | CTR is awarded according to the Issuance Formula. |
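The five steps above can be sketched as a simple status progression. This is a minimal illustration only; the enum and function names are hypothetical, and a real DAO implementation would also track reviewers, evidence links, and scores:

```python
from enum import Enum

class ContributionStatus(Enum):
    FRAMED = "framed"                       # Step 1: steward publishes the request
    COMPLETED = "completed"                 # Step 2: contributor delivers the work
    PEER_REVIEWED = "peer_reviewed"         # Step 3: peers score the contribution
    STEWARD_APPROVED = "steward_approved"   # Step 4: steward confirms peer feedback
    CTR_ALLOCATED = "ctr_allocated"         # Step 5: CTR is issued

# Transitions are allowed only in table order; CTR_ALLOCATED is terminal.
NEXT = {
    ContributionStatus.FRAMED: ContributionStatus.COMPLETED,
    ContributionStatus.COMPLETED: ContributionStatus.PEER_REVIEWED,
    ContributionStatus.PEER_REVIEWED: ContributionStatus.STEWARD_APPROVED,
    ContributionStatus.STEWARD_APPROVED: ContributionStatus.CTR_ALLOCATED,
}

def advance(status: ContributionStatus) -> ContributionStatus:
    """Move a contribution to the next stage; raise at the terminal stage."""
    if status not in NEXT:
        raise ValueError("contribution already at terminal stage")
    return NEXT[status]
```

The strict linear order mirrors the table: no CTR can be allocated before both peer review and steward review have completed.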
🧩 Peer Review Guidelines
All CTR-earning contributions must go through peer review. This ensures accountability and community alignment. A peer review should include:
- Confirmation that the defined outputs were delivered
- Assessment of quality and cultural alignment
- Constructive feedback (if relevant)
- Recommendation for CTR tier level
| Score | Definition |
|---|---|
| 0 | Somewhat complete; low effort, late, or over budget |
| 1 | Basic task completion; meets criteria, completed on time and on budget |
| 2 | Complete and accurate; exceeds completion criteria; completed early |
| 3 | High-impact; greatly exceeds expectations; under budget |
Peer reviewers may be assigned in advance or nominated post-completion by the steward or community.
🧮 CTR Issuance Formula
CTR is issued proportionally to delivery quality, impact, complexity, and verified effort.
- Formula:
CTR issued = (Delivery + Impact + Complexity) × Effort
- Delivery: 0–3 score for timeliness, completeness, and budget adherence.
- Impact: 0–3 score for realized value and significance of outcomes.
- Complexity: 0–3 score for technical/organizational difficulty and novelty.
- Effort: `clamp(Actual Hours / 8, 1, 20)` — minimum 1.0, maximum 20.0. `Actual Hours` is `contribution.metadata.actual_duration`.
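The formula above can be sketched in Python. This is a minimal illustration; the function name and signature are not part of any DAO codebase:

```python
def ctr_issued(delivery: int, impact: int, complexity: int, actual_hours: float) -> float:
    """CTR = (Delivery + Impact + Complexity) × Effort,
    where Effort = clamp(Actual Hours / 8, 1.0, 20.0)."""
    for score in (delivery, impact, complexity):
        if score not in (0, 1, 2, 3):
            raise ValueError("each score must be an integer from 0 to 3")
    effort = min(max(actual_hours / 8.0, 1.0), 20.0)
    return (delivery + impact + complexity) * effort

print(ctr_issued(1, 1, 1, 5))   # 3.0 — effort clamped up to the 1.0 minimum
print(ctr_issued(3, 2, 2, 24))  # 21.0 — effort = 24 / 8 = 3.0
```

The clamp keeps very short tasks from rounding to zero effort and caps very long engagements at 20 effort units (160 hours).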
Scoring charts (0–3)
| Delivery | Definition |
|---|---|
| 0 | Issues with delivery; incomplete, late, or over budget |
| 1 | Good delivery; Meets criteria, on time and/or on budget |
| 2 | Great delivery; Exceeds criteria; early and/or under budget |
| 3 | Outstanding delivery; materially exceeds expectations and efficiency |
| Impact | Definition |
|---|---|
| 0 | Negligible; no impact of note |
| 1 | Useful, limited scope value; local benefit |
| 2 | Clear, material value across the project/workstream |
| 3 | High, organization‑wide or strategic value |
| Complexity | Definition |
|---|---|
| 0 | Trivial; well‑known patterns; minimal coordination |
| 1 | Standard; moderate scope; normal coordination |
| 2 | Non‑trivial; multiple unknowns; cross‑team coordination |
| 3 | High novelty or ambiguity; significant technical/organizational risk |
Examples
| Actual Hours | Delivery | Impact | Complexity | Effort | CTR Issued |
|---|---|---|---|---|---|
| 5 | 1 | 1 | 1 | 1.0 | 3.0 |
| 8 | 1 | 2 | 1 | 1.0 | 4.0 |
| 12 | 2 | 2 | 1 | 1.5 | 7.5 |
| 24 | 3 | 2 | 2 | 3.0 | 21.0 |
Notes:
- If hours or score are disputed, stewards resolve and record the final values before issuance.
Policy safeguards:
- Caps: Optional per‑contribution and per‑period caps; default none.
- Evidence: Verify hours via commits/PRs, meeting notes, or timesheets; stewards may audit and adjust before issuance.
- Rounding: Round to the nearest 0.5 CTR; minimum issuance 0.5 CTR unless the total score (Delivery + Impact + Complexity) is 0.
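The rounding safeguard can be sketched as follows. The helper name is illustrative; note that Python's built-in `round` uses banker's rounding at exact .25/.75 ties, so a production policy would need to pin down tie-breaking explicitly:

```python
def round_ctr(raw: float) -> float:
    """Apply the rounding safeguard: nearest 0.5 CTR,
    floored at 0.5 unless nothing was earned (raw == 0)."""
    if raw == 0:
        return 0.0
    nearest_half = round(raw * 2) / 2  # e.g. 7.3 -> 7.5, 7.2 -> 7.0
    return max(nearest_half, 0.5)
```

For example, a tiny contribution scoring 0.1 raw CTR still receives the 0.5 minimum, while a zero-score contribution receives nothing.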
Note on fairness: Do not apply contributor tier/role multipliers to CTR. Reserve such multipliers for UTIL to avoid compounding reputation.
🧠 Steward Responsibilities
Governance or project stewards are responsible for:
- Publishing clear, well-scoped contribution requests
- Assigning peer reviewers
- Reviewing and finalizing CTR allocations
- Updating contributor records
Stewards may consult the Incentivisation Registry and past CTR allocations to guide decisions.
📘 Documentation Requirements
Each verified contribution should be archived with the following:
- Contribution description and outcome
- Peer review summary (review form results)
- CTR allocation amount and tier
- Linked outputs (GitHub repo, doc, video, etc.)
This ensures transparency and builds a reliable contribution history.
🛡️ Principles
- Fairness: Recognition should match the value and effort of each contribution.
- Transparency: Clear standards and traceable decisions.
- Cultural integrity: Contributions should reflect kaupapa Māori and other Indigenous values where applicable.
- Meritocracy + Solidarity: Both excellence and collective contribution are celebrated.
🧭 Summary
| Feature | Description |
|---|---|
| Assessment Method | Score-based, reviewer-approved |
| Quality Dimensions | Completeness, usefulness, cultural fit |
| Token Formula | CTR = (Delivery + Impact + Complexity) × Effort |
| Governance Control | Governance Stewards validate awards |