$CTR Distribution Guide


Version 1.0


🎯 Purpose

This framework defines the standards, criteria, and process for allocating CTR (Contribution Tokens) to contributors within Matou DAO. CTR is a non-transferable, reputation-based token awarded to individuals who have completed peer-verified contributions that support the governance, development, and operation of Matou DAO.

The goal of this framework is to ensure:

  • Consistent and fair recognition of effort
  • Peer-reviewed validation of contributions
  • A transparent reputation system that rewards meaningful participation

🪙 What is CTR?

CTR (Contribution Token) is issued to recognize verified contributions to the DAO. Unlike UTIL, which is used to access DAO services, CTR reflects the contributor's reputation and standing within the DAO.

✅ Key Characteristics:

  • Non-transferable: Cannot be bought, sold, or traded.
  • Earned only: Must be issued through a verified contribution process.
  • Used for: Eligibility for roles, voting weight in some governance processes, or access to advanced features and decision-making privileges (based on DAO policy).

🔁 Allocation Process

Step | Description
--- | ---
1. Contribution Framed | A steward publishes a contribution request with clear outputs, roles, and peer reviewers.
2. Work Completed | The contributor completes the task as defined in the request.
3. Peer Review | A designated peer (or peers) evaluates the contribution based on quality, completeness, and adherence to the scope.
4. Steward Review | The responsible steward confirms peer feedback and the final output.
5. CTR Allocated | CTR is awarded using the Issuance Formula.

🧩 Peer Review Guidelines

All CTR-earning contributions must go through peer review. This ensures accountability and community alignment. A peer review should include:

  • Confirmation that the defined outputs were delivered

  • Assessment of quality and cultural alignment

  • Constructive feedback (if relevant)

  • Recommendation for CTR tier level

Score | Definition
--- | ---
0 | Somewhat complete; low effort, late, or over budget
1 | Basic task completion; meets criteria, completed on time and on budget
2 | Complete and accurate; exceeds completion criteria; completed early
3 | High impact; highly exceeds expectations; under budget

Peer reviewers may be assigned in advance or nominated post-completion by the steward or community.


🧮 CTR Issuance Formula

CTR is issued proportionally to delivery quality, impact, complexity, and verified effort.

  • Formula: CTR issued = (Delivery + Impact + Complexity) × Effort
  • Delivery: 0–3 score for timeliness, completeness, and budget adherence.
  • Impact: 0–3 score for realized value and significance of outcomes.
  • Complexity: 0–3 score for technical/organizational difficulty and novelty.
  • Effort: clamp(Actual Hours / 8, 1, 20) — minimum 1.0, maximum 20.0. Actual Hours is contribution.metadata.actual_duration.
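The formula above can be sketched in Python. This is a minimal illustration, not an official implementation; the function name `ctr_issued` and the score validation are assumptions for clarity:

```python
def ctr_issued(delivery: int, impact: int, complexity: int, actual_hours: float) -> float:
    """CTR issued = (Delivery + Impact + Complexity) x Effort.

    Each score is 0-3; Effort = clamp(Actual Hours / 8, 1, 20),
    where Actual Hours comes from contribution.metadata.actual_duration.
    """
    for score in (delivery, impact, complexity):
        if not 0 <= score <= 3:
            raise ValueError("each score must be in the 0-3 range")
    # Effort floor of 1.0 and cap of 20.0, per the clamp definition above.
    effort = min(max(actual_hours / 8, 1.0), 20.0)
    return (delivery + impact + complexity) * effort

# e.g. 24 hours with scores 3, 2, 2: effort = 3.0, CTR = (3 + 2 + 2) * 3.0 = 21.0
```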

Scoring charts (0–3)

Delivery | Definition
--- | ---
0 | Issues with delivery; incomplete, late, or over budget
1 | Good delivery; meets criteria, on time and/or on budget
2 | Great delivery; exceeds criteria; early and/or under budget
3 | Outstanding delivery; materially exceeds expectations and efficiency

Impact | Definition
--- | ---
0 | Negligible; no significant impact of note
1 | Useful; limited-scope value; local benefit
2 | Clear, material value across the project/workstream
3 | High, organization-wide or strategic value

Complexity | Definition
--- | ---
0 | Trivial; well-known patterns; minimal coordination
1 | Standard; moderate scope; normal coordination
2 | Non-trivial; multiple unknowns; cross-team coordination
3 | High novelty or ambiguity; significant technical/organizational risk

Examples

Actual Hours | Delivery | Impact | Complexity | Effort | CTR Issued
--- | --- | --- | --- | --- | ---
5 | 1 | 1 | 1 | 1.0 | 3.0
8 | 1 | 2 | 1 | 1.0 | 4.0
12 | 2 | 2 | 1 | 1.5 | 7.5
24 | 3 | 2 | 2 | 3.0 | 21.0

Notes:

  • If hours or score are disputed, stewards resolve and record the final values before issuance.

Policy safeguards:

  • Caps: Optional per‑contribution and per‑period caps; default none.
  • Evidence: Verify hours via commits/PRs, meeting notes, or timesheets; stewards may audit and adjust before issuance.
  • Rounding: Round to the nearest 0.5 CTR; minimum issuance 0.5 CTR unless Score = 0.
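The rounding safeguard can be sketched as a small helper. This is an illustrative assumption: the helper name `round_ctr` is made up, and "Score = 0" is read here as the combined Delivery + Impact + Complexity score being zero:

```python
def round_ctr(raw_ctr: float, total_score: int) -> float:
    """Round to the nearest 0.5 CTR; floor at 0.5 unless the combined score is 0."""
    if total_score == 0:
        # A zero combined score earns no CTR, so the 0.5 floor does not apply.
        return 0.0
    # Nearest 0.5: double, round to the nearest integer, then halve.
    rounded = round(raw_ctr * 2) / 2
    return max(rounded, 0.5)
```

Note that Python's built-in `round` uses round-half-to-even; if exact-halfway cases (e.g. a raw 1.25) must always round up, a policy decision and a different rounding function would be needed.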

Note on fairness: Do not apply contributor tier/role multipliers to CTR. Reserve such multipliers for UTIL to avoid compounding reputation.


🧠 Steward Responsibilities

Governance or project stewards are responsible for:

  • Publishing clear, well-scoped contribution requests
  • Assigning peer reviewers
  • Reviewing and finalizing CTR allocations
  • Updating contributor records

Stewards may consult the Incentivisation Registry and past CTR allocations to guide decisions.


📘 Documentation Requirements

Each verified contribution should be archived with the following:

  • Contribution description and outcome
  • Peer review summary (review form results)
  • CTR allocation amount and tier
  • Linked outputs (GitHub repo, doc, video, etc.)

This ensures transparency and builds a reliable contribution history.


🛡️ Principles

  • Fairness: Recognition should match the value and effort of each contribution.
  • Transparency: Clear standards and traceable decisions.
  • Cultural integrity: Contributions should reflect kaupapa Māori and other Indigenous values where applicable.
  • Meritocracy + Solidarity: Both excellence and collective contribution are celebrated.

🧭 Summary

Feature | Description
--- | ---
Assessment Method | Score-based, reviewer-approved
Quality Dimensions | Completeness, usefulness, cultural fit
Token Formula | CTR = (Delivery + Impact + Complexity) × Effort
Governance Control | Governance Stewards validate awards