
THE WORLD NEEDS A PLATFORM FOR TRUST, COMMUNICATION AND COMMUNITY BUILDING


Truth, Trust, Communication, and Community.

Truth is boring. It doesn’t sell ads; it doesn’t attract audiences; it doesn’t excite us; and nowadays it doesn’t win elections. Truth is just what it is – and in today’s world, truth is easy to ignore.

Trust too is boring. We only miss it when we don’t have it. Trust also doesn’t attract audiences, largely because it doesn’t come easy and it doesn’t come fast. If trust won elections, Washington and London would be very different places.

Truth and trust form the basis for civilization because they are the underpinnings of communication and community. It is difficult to communicate with someone who does nothing but lie to you. It is impossible to have a real community based on mistruths and mistrust. Without community and civilization, we have only anarchy – a combination of turmoil, disorder, chaos, mayhem, and lawlessness that leads to revolution or insurrection – but in this case, we are faced not with the conflict between tribes or communities but with a conflict of all against all.

It is easy, in this “post-truth” world, to surround ourselves with preferred answers to difficult questions. We can also surround ourselves with people who believe our version of the truth. We can constantly reinforce that version of the truth through our choice of newspaper, television channel, blogs, Internet newsfeeds, Facebook friends and more.

But our behavior toward each other tells us we don’t trust the version of the world we have built for ourselves. We aren’t sure whether there is anything in today’s world we can rely on as being 100% true or 100% trustworthy. Can we not call almost anything and anyone into question? Do we even know whether the source of a communication we trust is human rather than a computerized bot programmed to mislead us?

We live in a world that gives us access to a million times more facts than previous generations ever had. So many facts that we have developed sophisticated tools to reduce them to subsets we can deal with. But that very filtering process involves making choices. How do we choose which facts are the relevant ones? Who makes those choices for us? How do we (or they) do that without ascribing some desired meaning to those facts? Whom do we trust to do that for us?

Truth is the meaning we derive from facts. Trust is confidence in that meaning. But filtering (choosing which subset of facts we deem relevant) can lead to differing interpretations and therefore to different meanings. When we apply it to large fact sets, the filtering process can unintentionally introduce bias. Lies attempt to get us to derive exactly the opposite meaning. Trust derives from patterns of behavior that reinforce that we are getting truth and not lies.

Increasingly, we are getting neither. We are simply being fed information for which its provider takes little or no responsibility. We are left alone to determine whether it is truth or lie and whether to trust its source. We have little or nothing that tells us where that information really came from.

We have learned how to deal with those trust issues in commerce, whether through trust in currencies, in markets, or in buyer and seller reputations; whether at our bank, in the stock market, on eBay, or on Amazon. Those with poor trust ratings do less business than those with higher ratings.

Why, then, have we failed to impose similar standards of trust on the rest of the information we receive? Are not the markets of governance and politics just as reliant upon trust between the sides? Can any communication have real meaning, can any compromise or agreement be lasting, unless it is built on trust?

The Impacts of Filtering.

Web sites like Google and Facebook are tools that allow us to search through seemingly unlimited amounts of data. They are also filters that apply bias in selecting which data we receive, and both have allowed us to vastly expand the universe of people with whom we can interact daily. Facebook and LinkedIn are similarly filters that allow us to choose which subsets of that population we continue to interact with.

At first, these tools enriched our world. They provided us access to ever greater numbers of facts and people and, in so doing, broadened our perspective, expanded our networks, and deepened our insights. They allowed us to engage with information and cultures far beyond our physical reach and to build communities not bounded by geography.

But the volume of data, facts, and “truths” kept growing, forcing us to apply ever more aggressive filters to reduce the inbound flood to levels we could absorb. Some virtual communities remain small enough that individuals are recognized and have online personalities and reputations that effectively govern interactions within them. But the larger online media and social networks have expanded to the point where software assigns reputation and relevance, where we not only don’t know the person with whom we are communicating, we don’t really even know whether it is a person. All of which leads to, and feeds, the rise and influence of things like “fake news” sites.

In these cases, the software filters are failing us; instead of giving us greater access to fact, they are drowning us in untruths. Trust, if it exists, is an illusion. When the filter becomes the truth, we are left in a state of constant “buyer beware.” Humans love data confirming what they believe and love to ignore data that conflicts with their beliefs. By feeding these tendencies, the software filters enable large communities, even entire countries, to separate into camps that believe vastly different things, simply because they choose filters that reinforce their pre-existing bias.
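To make that feedback loop concrete, here is a minimal sketch (invented purely for illustration, not a description of any real platform’s algorithm) of a filter that ranks items by agreement with a user’s current stance and, in doing so, gradually narrows what the user sees.

```python
# Illustrative sketch only: a filter that ranks items by how closely they
# match a user's prior stance ends up showing the user mostly what the
# user already believes.

def rank_for_user(items, user_stance, keep=3):
    """Return the `keep` items whose stance is closest to the user's.

    items: list of (headline, stance) pairs, stance in [-1.0, 1.0]
    user_stance: the user's current position in [-1.0, 1.0]
    """
    by_agreement = sorted(items, key=lambda item: abs(item[1] - user_stance))
    return by_agreement[:keep]

def updated_stance(user_stance, shown_items, weight=0.1):
    """Nudge the user's stance toward the average of what they were shown."""
    avg_shown = sum(stance for _, stance in shown_items) / len(shown_items)
    return (1 - weight) * user_stance + weight * avg_shown

# Hypothetical feed: stances range from -1 (one extreme) to +1 (the other).
feed = [("A", -0.9), ("B", -0.4), ("C", 0.0), ("D", 0.5), ("E", 0.9)]
stance = 0.4
for day in range(5):
    shown = rank_for_user(feed, stance)
    stance = updated_stance(stance, shown)
    print(day, round(stance, 2), [headline for headline, _ in shown])

# Items A and B (the other side of the spectrum) are never shown, and the
# user's stance drifts toward the average of what the filter selects,
# which keeps selecting items near the user's stance: a feedback loop.
```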

Society has relied on some type of filter for centuries. Some of the words we use to describe those age-old filters are belief, faith, and religion. Our religion, faith and beliefs have long provided a means for us to live comfortably in a world otherwise too complex for our minds to comprehend. Obviously, religion and faith have other higher purposes; but their ability to provide meaning to the incomprehensible is both a strength and a weakness. In the world that existed before global data and communications systems, it was easier to support a limited and manageable number of religions. Limiting the number limited religious conflict. But forcing multiple religions into a smaller community also required understanding and tolerance, ultimately through communication and building trust, for those communities to survive.

The United States, being a nation of immigrants, was forced into a system that tolerated many religions. To avoid conflict, the framers adopted the constitutional principle that anyone may believe anything – religious freedom and freedom of expression. That principle also gave us both the freedom to trust anything and anyone and the responsibility to choose wisely.

Our cities have become such global, religious, and cultural melting pots that it becomes difficult to create a filter that isn’t drowned out by the realities of daily life – largely because we are confronted daily with evidence to the contrary. We can create virtual communities that are echo chambers for our chosen sets of beliefs, but we are constantly confronted with alternate realities that make it difficult to maintain any bias. Instead, the biggest danger in urban populations is simply overload – there is so much stimulus that we risk becoming apathetic and passive, even in the face of situations that call for active involvement. By staying at home, watching television, and getting our information through the Internet, we can filter out the dissonances that overload our senses and our ability to cope.

Smaller and more rural communities, by contrast, make it easier to regulate the speed at which new ideas are assimilated. This makes it easier for these communities to keep out conflicting values or ideas, but it also makes them more vulnerable to filters with a large degree of bias. The counter-balance is ultimately face-to-face communication that builds trust and understanding and allows for more gradual assimilation of new “truths.” Done right, these communities become important centering forces. Done wrong, they can allow small groups to drift even further from reality.

Other nations, by employing a chosen set of filters, have attempted to maintain communities in which the desired faiths and beliefs are supported by appropriately filtered facts. Careful control of television and news media, and strong limitations on access to Internet media, have allowed a biased set of truths to be maintained in tightly controlled geographies. In such places, conflict occurs at the edges, wherever the controlled truth comes into contact with people exposed to a different truth. We have enabled individuals and nations to expand their wealth or power at the expense of others by using the filtering process to get millions to follow them down a path that cannot be justified by its true purposes, but that is often successfully clothed in the support of a culture, religion, faith, or belief seeking to give it a higher purpose. As intelligent creatures, we humans seem unduly susceptible to being duped into supporting causes unsupported by fact but driven by a political, religious, cultural, or belief-driven bias that disables our ability to see truth. We often trust in things we fail to comprehend, and we do so at our peril.

War, Revolution, and Anarchy.

War and terrorism are the human pinnacle of trust in one truth running headlong into equally strong trust in another. Wars don’t mean that truth prevails – typically they mean only that one truth, not fully supportable by fact, wins out over another equally unsupportable interpretation or filtering of fact. Anarchy is the state in which there is no shared truth and no shared trust – it is the virtual opposite of community.

Nature itself has fact-revealing wars. We know them as fire, flood and drought, extreme heat and cold, and disease. Like war, they have ways of devastating populations that ignore facts – facts such as the overpopulation of a species or its overexpansion into territories incapable of supporting it. In nature, everything is always at war with its surroundings, a war for survival and growth waged against the forces of balance. Lots of constant little battles weed out the weak and retain a sustainable balance. But sometimes a species can progress far out of balance. In nature’s way, the greater the transgression, the greater the damage that nature’s forces of truth inflict upon it.

Put differently, small fires, floods, droughts, temperature extremes and diseases make species stronger and serve as course corrections that minimize further damage. But the more we ignore nature’s attempted course corrections, the greater the devastation the ultimate correction imposes – just ask the U.S. Forest Service what years of preventing small fires did to build conditions for major conflagrations in the Western U.S.

The degree to which our data filters today control our lives puts us at ever-increasing risk of a similar human conflagration. But this conflagration differs from any humankind has seen before. It doesn’t exist at the edge, where two differing cultures or religions come face-to-face over their differences. This one exists everywhere. It separates us from the person living next door and the person next to us on the street. It means we don’t know whom and what we can trust.

At first this new and different world excites us. Discovering new truths and being challenged on a few of our old ones keeps us on our toes. These new truths sell ads. They attract audiences. But our comfort level with these new truths is very much tied to the ratio between the old and comfortable and the new and uncomfortable. When the ratio of old to new is high, the new attracts us. As the share of the new increases, so does our level of discomfort. At some point, we become overwhelmed by the new and seek ever greater comfort in filters that tell us the old still holds true. To drown out the now overwhelming amounts of new, we increase the filtering – whether on Facebook, through Google, in our choice of television station or our choice of newspaper, we seek to hide behind the comfort of that filter.

But, facts are still facts and truths still exist. Eventually those facts and truths will win out. The real question is how much damage will we inflict upon ourselves clinging to the old, before being forced to accept the new? Unfortunately, these are also the moments when humanity is at its weakest and most influenceable by biases and untruths that seem comforting at the time. We become highly capable of trusting exactly those we should not place our trust in.

The smartest and most benevolent of the governments that strictly control their filtering processes (their media), have also learned that they can only filter so much – that too strong a filter will clash too frequently with the truth and that revolts will ensue. So, they refresh the filter to where it still serves their purposes but bends without breaking the attachment to fact and truth.

What about those parts of the world that have chosen democracy? Those places where our individual access to fact and truth was supposed to allow us to choose the correct path? Places where trust resulted from open communication between people who disagreed, but also knew each other and knew that once resolved, they could live with their differences?

How do we repair our filtering processes so they do not allow us to wander so far from the truth that only a conflagration can shatter the illusion? What do we do when the level of filtering has become so extreme that we have no grounding, clinging only to the barest shreds of our former beliefs? Can we still repair the filter without also solving the communication and community problem? Must we not somehow tie fact and meaning back together to provide us with some set of basic truths? How much harder is this to do when dealing with entire nations and a global media rather than local communities where we can fall back on person-to-person discourse and communication in rebuilding a sense of community? Can there be trust without community?

The Solution We Need, But Will Avoid at All Costs; Perhaps to Our Demise.

The ultimate communication occurs only when we are standing face to face, naked, with everything about us revealed to our opposite in dialog. Only then are the words being said revealed in full context; only then must we deal directly with the consequences of our words. We may still not trust each other, but we are given an obvious choice as to whether to join in a community with that opposite in dialog. Universities are increasingly reviewing the Facebook profiles of their applicants as a justifiable means of determining whom they will and will not admit to their student community. Suddenly, the realities of posts long forgotten may have real and unpleasant consequences.

What if every post or action we took on the web had those same consequences? What if participating in the benefits of the World Wide Web meant that we had to submit to absolute transparency about who we are and what we do – almost like a family member living in the same house? You could still post whatever you wanted, express whatever opinions you desired, but you could no longer hide from those posts or expressions. Those who interacted with you, who read your posts, who believed you and decided to join in a community with you would actually know the full you, for good or for bad. A bot would be revealed as just that. A teenager posting fake news from Macedonia for profit or mischief would also be revealed as just that.

The reality is that the modern world is increasingly giving governments the ability to see all of us in that light, fully exposed by our activities on and interactions with the online world. Why not take things a step further and reveal that same reality to all who might want to be part of a community with us?

Must this be the price we pay to reap the real benefits of ubiquitous world-wide communications and communities? Is there a better alternative? Can we really trust anyone whom we don’t really know? Can we assign consequences unless we can assign responsibility?

These are questions we must answer and we must answer them very quickly. Participation in that type of an “open” community would still be voluntary – you could choose not to be on the web and thus reveal nothing about yourself, but your ability to influence others would similarly be extremely limited. For those who chose to participate, the degree of common knowledge of who they were would be directly proportional to how influential they wanted to be. How different might the world be if we knew the true makeup, backgrounds, motivations, biases, etc. of those whom we chose to lead us?

Rebuilding the Internet for Truth, Trust, Communication, and Community.

Google (or Alphabet as it is now known – a name appropriate to the repository of more facts than the world has ever before known and therefore the owner of the most powerful filtering process the world has ever known) has a business unit today simply known as X. X has as its mission the invention and launching of “moonshot” technologies it hopes could someday make the world a better place.

What moonshot could possibly be a) more difficult, b) more meaningful in making the world a better place, or c) more appropriate for X as a company than to rebuild the Internet into a place that allows for lots of discourse (small fires) but prevents war (the conflagrations), simply by providing us the same level of trust in a broad range of information we now demand of any site we send money to? What if instead of just spreading information, we communicated with people whom we both really knew and thus really trusted in the same way as members of our family? What if by communicating in those ways, we built trust and community rather than destroying it?

By building trust, I do not mean giving us some centrist view, as many media firms have attempted to do. The center point between an extreme and a slightly less extreme perspective is even further from the truth than the less extreme perspective. Truth is not compromise, and trust does not derive from being told what to believe.

Truth is the meaning most supported by fact. As facts change, the truth must change. Separating fact from error is itself not an easy process. Because of the balance that nature has achieved over thousands of years, its “fact pattern” can be described as a bell-shaped curve. There are extremes and outliers (and each is highly important in the processes we call evolution and creative destruction), but the clear majority of things happen in the middle and adjustments only slowly deviate from the mean. Understanding where the true “mean” is allows us to absorb outlier facts on both extremes and not be swayed by them.

But our current world seems to progress ever more toward an inverted bell shape, where opinions, events, and risks seem ever more concentrated at the extremes, and where small adjustments seem ever less likely than a wild ride between bubble and burst. When the mean is just the middle but the weight of the data lies at the extremes, there is no basis for communication or community.
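A small numeric illustration of that point, using invented numbers: when opinion clusters at two extremes, the arithmetic mean lands in a middle where almost nobody actually stands, so the “mean” describes no one.

```python
# Invented numbers, purely to illustrate the point about means and extremes.
from statistics import mean

# A "bell-shaped" community: most opinions near the middle of a -10..10 scale.
centered = [-2, -1, -1, 0, 0, 0, 0, 1, 1, 2]

# A polarized community: the same mean, but the weight sits at the extremes.
polarized = [-9, -9, -8, -8, -8, 8, 8, 8, 9, 9]

for name, opinions in (("centered", centered), ("polarized", polarized)):
    m = mean(opinions)
    nearby = sum(1 for o in opinions if abs(o - m) <= 2)
    print(f"{name}: mean={m:.1f}, people within 2 points of the mean: {nearby}/10")

# centered:  mean=0.0, and all 10 people sit near that mean.
# polarized: mean=0.0, and 0 of 10 people sit near it; the "middle" represents no one.
```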

We are betting on events that threaten our very existence as a species on this planet, but we behave as though some safety net will allow us time after time to go “all-in” on our bets and still not lose. The individual version of that is a suicide bomber – the collective version is where we seem headed.

So how do we rebuild trust? How do we have fact-based discourse about our differences? How do we build rather than destroy communities? And can any organization much smaller than a Google even dare to take on such a challenge? Addressing truth and trust may be the biggest and most important moonshot that X could ever undertake.

At its core, such an effort would require each of us to agree to a simple rule: you may have access to all of the benefits the web can offer, but only in exchange for complete openness about who you are and how you use the web, and a willingness to face the consequences of your actions on that web.

Building Veritela.

Let’s refer to this moonshot as Veritela. Why Veritela? Because the Latin word for truth is veritas, and the goddess of truth, the reputed daughter of Saturn and mother of Virtue, is often depicted as a young virgin dressed in white – for whom a name like Veritela seems much more appropriate. In addition, the Latin word “tela” refers to a web or fabric – that web signifies the importance of interconnectedness, of communication and community. A web of truth seems highly appropriate. What is Veritela? Veritela reveals the truth about those with whom we communicate, gives us the basis to determine whom we trust, and allows us to make rational choices about whom we include in our communities. It tells us the truth about stuff – a truth teller on an astronomical scale.

Imagine Veritela not as some boring statue carved in stone, telling us the truth. Imagine her instead as a highly attractive flirt, one who teases us with what we want to see, but somehow ultimately forces us to reconcile that flirtation with fact. Imagine her as inciting thousands of tiny fires, fires that enliven us, but that also teach us the danger of fire and correct our course in many small but important ways. Even as she eggs us on, she grounds us in reality. She is the daily exercise that makes us stronger rather than breaking us down. She tests us and ultimately allows us to begin trusting each other.

She allows us to build our world of Facebook friends, but she also reveals exactly who they are. She allows us to choose our news, our data feeds, our television shows, but she gives them the same trust scoring we demand of our e-commerce sites.

She holds us individually accountable by scoring us on criteria like credibility, integrity, and trustworthiness. The greater the circulation our posts receive, the higher the degree of credibility verification we face – liars without influence are a nuisance; liars with a global following lead us to war.
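One hedged sketch of what such scoring might look like (the scaling rule, weights, and thresholds below are invented for illustration, not a specification of Veritela): the number of independent credibility checks a post must pass could grow with its circulation, and the poster’s trust score could drift up or down based on how the post fares.

```python
import math

# Purely illustrative sketch of the idea that verification should scale
# with reach. The weights and thresholds here are invented, not a spec.

def required_checks(circulation):
    """More circulation -> more independent credibility checks required.

    Under this toy rule, a post seen by 10 people needs 1 check; one seen
    by a million needs 6. Logarithmic scaling is just one plausible choice.
    """
    return max(1, math.ceil(math.log10(max(circulation, 1))))

def trust_score(checks_passed, checks_required, prior_score=0.5, weight=0.3):
    """Blend a poster's prior reputation with how their latest post fared.

    Returns a value in [0, 1]; passing every required check pulls the
    score up, failing checks pulls it down.
    """
    outcome = min(checks_passed, checks_required) / checks_required
    return (1 - weight) * prior_score + weight * outcome

# A small-circulation post that passes its checks: the score drifts up.
print(trust_score(checks_passed=2, checks_required=required_checks(100)))        # roughly 0.65

# A viral post that fails most of its checks: the score drifts down.
print(trust_score(checks_passed=2, checks_required=required_checks(1_000_000)))  # roughly 0.45
```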

She cannot be advertiser-supported network television. Nor can she be supported by government or by corporations; she must be supported by us, the people. She is more like subscriber-supported Netflix, striving for a far higher quality of content, but again driven by fact rather than by the influence of dollars.

Veritela won’t be easy to live with. She will be like the 5-year-old that keeps asking us questions, questions that at first are easy to answer, but that over time become deeper, more probing and require us to refine our own thinking. She will also be difficult to escape, as every action we take will be reflected in our scoring – as in communities of 100 years ago, it will be difficult or impossible to escape the reputation and image of yourself that you build.

Can we build Veritela in time to save ourselves? Will X take on this gargantuan task?
