(Reputation Economy) Forgiveness and the Elastic Freeing of Opportunity Cost
Just three years ago I met my friend Trudy, and soon enough we were speaking on the phone for hours about the most varied topics, mostly relating to how to use lightweight technology to help people through the next recession (because recessions are inevitable).
We soon found ourselves discussing something called “theory of mind,” which is “…necessary to understand that others have beliefs, desires, intentions, and perspectives that are different from one’s own.”
Push came to shove, and I went out and purchased a stack of 4” square cardboard boxes and about 1,000 feet of decorative elastic string.
Why? Because I wanted to explore “in the real world” the kind of relationships one might model in a multi-dimensional graph database.
Because I come from the technology world, I’d been around some wild applications of graph databases for over 20 years.
In 2000–2001 I managed the design of a graph database that helped broadcast news outlets such as CNN make good use of their vast stores of video, audio, and images; these databases were vast in both scale and scope.
I spent many years in the information security industry (anti-fraud and anti-hacking), and oversaw a graph database that actively monitored 7,000,000,000 devices and billions of transactions.
As an interesting aside, these graph databases are the domain of machine learning algorithms that are frequently described as examples of artificial intelligence (AI) by people who work in marketing.
You can construct an array of algorithmic “profilers” which traverse the relationships between nodes (which sometimes represent people) to promote and support certain applications.
Here’s one that’s really interesting:
There are a lot of machine learning algorithms that are designed to identify the likelihood that somebody will engage in fraud or other untrustworthy behavior.
These algorithms are designed in such a way that they assume everybody is inherently untrustworthy. This is why, for example, your bank will disable your credit card as soon as you travel outside your normal geographical region.
In fact, that assumption is false.
We ran a series of experiments that changed the assumption, leveraging the billions of transactions compiled over the past 15 years, from hundreds of different industries and thousands of vendors, modifying the baseline assumption in the following manner:
We assumed that everybody is inherently trustworthy, but some people might periodically bend the rules, and we factored for an organizational cultural embrace of redemption and forgiveness.
Long story short, the accuracy and effectiveness of these machine learning algorithms skyrocketed, because guess why?
Most people are inherently trustworthy. Duh.
This work helped me better understand the nature of how tightly we are connected to one another, but I wanted to explore these connections in my son’s recently-vacated basement bedroom, so I ran lengths of elastic string and connected them to the boxes.
Graph models can simulate the nature of how we are connected, and at the time I wanted to better understand Trudy’s contribution to theory of mind (ToM), which she’d received as a gestalt a few years prior.
Luckily for you, I don’t have a photo to share. The thing looked crazy, but in the years since I’ve ruminated upon the experience as I’ve explored certain ideas about the economic byproducts of forgiveness and redemption.
The stewards of our relationships with one another are our attorneys, and they are financially incentivized to perpetuate conflict because they charge by the hour to deliver a “win.”
They represent a significant vector for entropy, and this entropic tendency completely bogs down the architecture of our institutions, over time making us more and more ossified and biased towards collapse.
All of us know somebody who can’t seem to extricate themselves from an impossibly lengthy and complicated conflict with someone else.
Think about the nature of their entangled relationship from the perspective of a graph model which might capture the energy invested in such a way that it could be expressed with the elasticity of a rubber band.
Like I said, I purchased 1,000 feet of decorative elastic string, and I used that to suspend a series of small cardboard boxes.
It doesn’t take long to completely bog the model down when trying to simulate the entangled nature of conflict between two or more people.
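For readers who think in code, here is a minimal sketch of that model: people as nodes, relationships as edges that carry tension like stretched elastic string. The class and numbers are my own illustration, not 214 Alpha’s implementation.

```python
# Hypothetical sketch: each relationship is an edge storing "tension,"
# analogous to the stretch in a length of decorative elastic string.

class RelationshipGraph:
    def __init__(self):
        self.edges = {}  # (person_a, person_b) -> accumulated tension

    def connect(self, a, b, tension=0.0):
        self.edges[tuple(sorted((a, b)))] = tension

    def add_tension(self, a, b, amount):
        # Every unresolved grievance stretches the band a little further.
        key = tuple(sorted((a, b)))
        self.edges[key] = self.edges.get(key, 0.0) + amount

    def total_tension(self):
        # The energy trapped across the whole community.
        return sum(self.edges.values())

g = RelationshipGraph()
g.connect("Alice", "Bob")
g.add_tension("Alice", "Bob", 2.5)  # a lingering conflict
print(g.total_tension())            # 2.5
```

Even a toy like this bogs down quickly once you try to represent every grudge between every pair of people, which is exactly what the string-and-boxes model demonstrated physically.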
Think about this.
Have you ever seen the videos where people wrap an increasing number of rubber bands around a watermelon?
Eventually the aggregate energetic potential crosses a certain threshold and the rubber bands cause the watermelon to explode.
Now imagine an option to unlock that energetic potential by embracing forgiveness and redemption.
Don’t try to figure out how to do this within our current society; the model is too large, there’s too much entropy, and there are too many organizational inhibitions to introducing any such innovation.
My company, 214 Alpha, has built a mobile app that enables people to pay for just enough governance to get the job done, and not a single bit more.
Our explicit emphasis is upon micro-economies, such as a 24/7 virtual farmers market, so local people can purchase locally-sourced produce from local producers.
Our app features eight pre-integrated features necessary for self-governance: digital identity, identity verification, communication, commerce, banking, reputation, arbitration, and governance.
The reputation economy enables communities to provide financial incentives for members to embrace values which might be important to them.
Not only do people earn a reputation score, but they are actually paid in community vouchers as a bonus above and beyond the profits that they might generate within the exclusive marketplace.
Technically, this is referred to as gamification, which is how video games provide incentives to players to learn and explore the unique qualities of the gaming environment.
My design for the reputation economy was inspired by one of the oldest living codes on earth: Pashtunwali, the code of the Pashtun people, who live in the mountains between Afghanistan and Pakistan.
There are two dominant pivot points within our reputation economy, and exploration of the model described within this article was performed in pursuit of defining a third:
Selvage: a person is paid a bonus if they can innovate a way to reduce the rates of arbitration within our system.
The economics behind this are pretty simple: by reducing the rates of arbitration, they optimize our system.
In this manner, a skilled mediator could better compete head-to-head with attorneys who are ordinarily incentivized to perpetuate conflict, as described previously.
Weft: a person is financially rewarded if they can innovate a way to elevate the reputation of others, and the economic justification is similarly straightforward: it helps optimize our system.
Let’s assume there’s a person in one of our micro-economies who has the highest possible reputation score.
This represents one end of the rubber band, so to speak, and the other end is connected to one or more members that have zero or even a negative reputation score.
The person with the highest reputation score has an elastic (sliding scale) reward, relative to how well and how broadly they leverage their reputation to elevate the reputation of others.
They don’t have to do it. It’s entirely voluntary, which ensures that our solution remains consistent with a nonaggression principle, or NAP.
However, a person might choose to elevate their return on investment by applying some creativity, because this is not just a feel-good philosophy; it is actually true within our software.
The person with the highest reputation score can’t really improve it much more until he or she helps elevate the reputation of others, because the rewards for triggering the reputation economy approach diminishing returns for those who are orders of magnitude better represented than others.
So, like I said. It’s entirely voluntary, but if a person really wants to continue earning an elevated return on investment, they’re going to need to help elevate the reputation of others.
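A minimal sketch of such a diminishing-returns curve follows; the function shape and constants are my own hypothetical illustration, not 214 Alpha’s actual formula.

```python
import math

# Hypothetical sketch: the higher your own reputation, the smaller the
# bonus for the same action, so top scorers earn more by elevating
# others than by polishing their own score.

def bonus(base_reward, own_reputation):
    # Bonus decays logarithmically as reputation grows (illustrative curve).
    return base_reward / (1.0 + math.log1p(own_reputation))

print(round(bonus(10.0, 0), 2))     # 10.0 -- newcomers get the full reward
print(round(bonus(10.0, 1000), 2))  # 1.26 -- near the top, returns diminish
```

Any monotonically decaying curve produces the same incentive: once your own score saturates, the only way to keep earning is to lift others.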
A technical aside: one cannot earn more in reputation bonuses than they paid for the transaction in the first place, and 80% of the transaction fees are kept by the community itself as a form of voluntary tax that makes the entire endeavor financially self-sufficient.
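The cap and the community’s share can be illustrated with some simple hypothetical arithmetic (the function and variable names are mine, not the app’s):

```python
# Hypothetical fee split: the reputation bonus is capped at what the
# member paid in transaction fees, and 80% of fees stay with the
# community as a voluntary tax.

COMMUNITY_SHARE = 0.80  # fraction of fees retained by the community

def settle(fee_paid, earned_bonus):
    bonus = min(earned_bonus, fee_paid)  # can't earn back more than you paid
    community_fund = fee_paid * COMMUNITY_SHARE
    return bonus, community_fund

print(settle(5.00, 7.50))  # (5.0, 4.0) -- bonus capped at the fee paid
```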
In our discussions with indigenous communities, members have been shocked and excited to recognize that our reputation system helps them model values their “keepers of the fire” have kept alive for millennia.
This third dimension (which we’ve not publicly named) rewards people for embracing forgiveness and redemption, and it works in a similar manner.
Opportunity cost is “the loss of potential gain from other alternatives when one alternative is chosen.”
That’s the official definition, but let’s go back for a moment and consider a watermelon under the pressure of 1,000 rubber bands.
In our solution, parties are presented the option to simply forgive one another, seek and deliver public redemption, and just walk away.
Of course there’s a reputation score within our software, but when two or more parties are in conflict it’s likely that one or both parties are diminishing the perceived reputation of others behind their backs.
In order to secure forgiveness and redemption, both parties would have to make amends for any damage which might have occurred during the period of conflict, and if both parties are satisfied, and the community’s governance model agrees, they can “cut the bands” and unlock the trapped opportunity cost.
This means that both parties are actually financially incentivized to embrace the philosophy of forgiveness and redemption, but only if they accrue increased Selvage and Weft reputation above and beyond the statistical norm.
In other words: blessed (and rewarded) are the peacemakers, provided they invest in improving their own demonstrative ability to forgive and forget, and help their fellow community.
How do we know that people aren’t going to cheat the system?
Remember: most people are inherently trustworthy, so this isn’t the problem you might think it is.
The word community has been misused by large companies, so I should emphasize that we’re talking about 30, 80, 150, maybe 1,000 people at most.
This could be at your church parish, your neighborhood, vegans who live on the West Coast, etc.
When you re-define what it means to be part of a community, it removes any inhibitions you might have to actually just get started.
When people interact with one another within a proper community of their peers, they’re less biased towards trying to “cheat the system.”
214 Alpha doesn’t support large multi-national corporations, so it’s less likely that the relationships between people in the community are launched from a place of cynicism.
In fact, most people are shocked to recognize that their primary social relationships exist within one or more overlapping communities that are extremely local.
If there’s an elastic string that connects you to your community, that community probably fills a room; it is not as global as marketers would lead you to believe, which is not to suggest that you don’t know people in other countries.
When marketing departments talk about a global community, they’re really talking about their market capitalization, and they’re not referring to the overwhelming majority of interactions you have with the people closest to you.
Start small, start “local,” and extrapolate.
Modify the nature of your interpersonal relationships, and you might find that your quality of living substantially improves.