R20 Dreamcatcher Simulation Toolbox
Following from https://ncase.me/trust/ we should build a simulation tool for the Dreamcatcher.
All actors in the Dreamcatcher have a discrete set of actions, and each action can carry different amounts of error and bad behaviour, so we should be able to build a similar game for the Dreamcatcher: one where we can play with the knobs and dials and see what the outcome will be.
Further, we should be able to take a snapshot of the current system and model it forward with more actors and more rounds of play, so we can predict where the system is going. We can then show how the simulation changes if we add more or fewer of the different types of actor, introduce a new system or a new piece of code, change the QA threshold, and so on.
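A minimal sketch of such a simulator, assuming a simple cooperate/cheat game in the style of the Evolution of Trust. The payoff numbers, strategy names, and the `simulate` API are all illustrative assumptions, not Dreamcatcher specifics:

```python
import random

# Classic trust-game payoffs: both cooperate → +2 each; a lone cheater
# gets +3 while the cooperator loses 1; mutual cheating pays nothing.
PAYOFFS = {
    ("coop", "coop"): (2, 2),
    ("coop", "cheat"): (-1, 3),
    ("cheat", "coop"): (3, -1),
    ("cheat", "cheat"): (0, 0),
}

def always_coop(history):
    return "coop"

def always_cheat(history):
    return "cheat"

def copycat(history):
    # Repeat the opponent's last move; cooperate on the first round.
    return history[-1] if history else "coop"

def noisy(move, error_rate):
    # One of the "knobs": with probability error_rate the intended move flips.
    if random.random() < error_rate:
        return "cheat" if move == "coop" else "coop"
    return move

def play_match(strat_a, strat_b, rounds, error_rate):
    hist_a, hist_b = [], []  # each side sees only the opponent's moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = noisy(strat_a(hist_a), error_rate)
        move_b = noisy(strat_b(hist_b), error_rate)
        pa, pb = PAYOFFS[(move_a, move_b)]
        score_a += pa
        score_b += pb
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

def simulate(population, rounds=10, error_rate=0.05):
    # Round-robin: every pair plays one repeated match; totals per
    # archetype show which behaviours the current rules reward.
    scores = {name: 0 for name, _ in population}
    for i, (name_a, strat_a) in enumerate(population):
        for name_b, strat_b in population[i + 1:]:
            sa, sb = play_match(strat_a, strat_b, rounds, error_rate)
            scores[name_a] += sa
            scores[name_b] += sb
    return scores

# A snapshot of one population mix; vary this, the rounds, and the
# error rate to model the system forward.
population = ([("copycat", copycat)] * 3
              + [("cheater", always_cheat)] * 2
              + [("cooperator", always_coop)] * 2)
```

Re-running `simulate(population, rounds=..., error_rate=...)` with different mixes is the knobs-and-dials loop described above: more rounds of play, more or fewer of each actor type, more noise.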
I think this is the part the CERN group was getting really excited about: given someone's behaviour and the current system, you can predict their future outcomes, and also guide them toward more desirable outcomes through introductions. We could also actively "juice" the system by introducing a "good citizen" coin, earned by taking a less immediately rewarding path in favour of something that benefits the long-term network. For example, favouring repeat interactions, since repeat play is better for the trust metrics.
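One way the good-citizen coin could work is a ledger that pays a bonus whenever a pair of actors chooses a repeat interaction over a fresh one. This is a hypothetical sketch; the `Ledger` class, the bonus size, and the pair-counting rule are all assumptions to be tuned:

```python
from collections import Counter

GOOD_CITIZEN_BONUS = 1  # coins per repeat interaction (a tunable knob)

class Ledger:
    def __init__(self):
        self.interactions = Counter()  # how often each pair has met
        self.coins = Counter()         # good-citizen coin balances

    def record(self, actor, partner):
        pair = tuple(sorted((actor, partner)))
        self.interactions[pair] += 1
        if self.interactions[pair] > 1:
            # A repeat interaction: both sides earn the bonus, so the
            # long-term-friendly path pays even when the immediate
            # reward elsewhere would have been bigger.
            self.coins[actor] += GOOD_CITIZEN_BONUS
            self.coins[partner] += GOOD_CITIZEN_BONUS

ledger = Ledger()
ledger.record("alice", "bob")    # first meeting: no coins yet
ledger.record("alice", "bob")    # repeat: both earn a coin
ledger.record("alice", "carol")  # new partner: no coins
```

Since the blockchain already executes reliably, a rule like this could live in the core code, making the reward for good behaviour as cheap and automatic as the rest of the regulation.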
The smallest form of regulation is the blockchain's reliable execution, so this alone is an increase in regulation at near-zero cost. We need to tune what this does to favour the simulation, so that changes to the core code can be shown to have effects on the future, and the core code can give out rewards for good behaviour.
Apply this to the Securities Guardian (https://dreamcatcher-tech.github.io/dust/Requests/R04) by detecting which archetype an actor matches based on their behaviour, and triggering the guardian if they are acting like they aren't genuinely participating but are just copying others or always saying yes.
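A hypothetical detector for that trigger, classifying an actor from their interaction history. The function names, archetype labels, and history threshold are assumptions for illustration, not part of R04:

```python
def classify(own_moves, opponent_moves):
    # Flag the two non-participating archetypes described above:
    # always saying yes, or merely echoing what the actor last saw.
    if all(m == "yes" for m in own_moves):
        return "always-yes"
    if own_moves[1:] == opponent_moves[:-1]:
        return "copier"
    return "participant"

def should_trigger_guardian(own_moves, opponent_moves, min_history=5):
    # Only judge actors once there is enough history to be fair.
    if len(own_moves) < min_history:
        return False
    return classify(own_moves, opponent_moves) != "participant"
```

The same classifier feeds the prediction side: once you know an actor's archetype, the simulator above can be seeded with their behaviour to model their likely future outcomes.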
To quote the game:
If there's one big takeaway from all of game theory, it's this:
What the game is, defines what the players do. Our problem today isn't just that people are losing trust, it's that our environment acts against the evolution of trust.
That may seem cynical or naive -- that we're "merely" products of our environment -- but as game theory reminds us, we are each other's environment. In the short run, the game defines the players. But in the long run, it's us players who define the game.
So, do what you can do, to create the conditions necessary to evolve trust. Build relationships. Find win-wins. Communicate clearly. Maybe then, we can stop firing at each other, get out of our own trenches, cross No Man's Land to come together...