Laying a Shared Foundation for a Decentralized Web
In this guest post, Richard Whitt builds on his prepared remarks from the “Defining Our Terms” conversations at the Decentralized Web Summit in 2018. The remarks have been modified to account for some excellent contemporaneous feedback from the session.
Some Widespread Confusion on Openness
So, we all claim to love the concept of openness — from our economic markets, to our political systems, to our trade policies, even to our “mindedness.” And yet, we don’t appear to have a consistent, well-grounded way of defining what it means to be open, and why it is a good thing in any particular situation. This is especially the case in the tech sector, where we talk at great length about the virtues of open standards, open source, open APIs, open data, open AI, open science, open government, and of course open Internet.
Even over the past few months, there have been obvious signs of this rampant and growing confusion in the tech world.
Recently, former FCC Chairman and current cable lobbyist Michael Powell gave a television interview in which he decried what he calls “the mythology… of Nirvana-like openness.” He claims this mythology includes the notion that information always wants to be free, and that openness is always good. Interestingly, he also claims that the Internet is moving towards what he called “narrow tunnels,” meaning only a relative few entities control the ways that users interact with the Web.
Here are but three recent examples of tech openness under scrutiny:
- Facebook and the data mining practices of Cambridge Analytica: was the company too open with its data sharing practices and its APIs?
- Google and the $5B fine on Android in the EU: was the company not open enough with its open source OS?
- Network neutrality at the FCC, which has risen, fallen, risen again, and now fallen again. Proponents of network neutrality claim they are seeking to save an open Internet; opponents of network neutrality insist they are seeking to restore an open Internet. Can both be right, or perhaps neither?
The concept of openness seems to share some commonalities with decentralization and federation, and with the related edge/core dichotomy. To be open, to be decentralized, to be on the edge, generally is believed to be good. To be closed, to be centralized, to be from the core, is bad. Or at least, that appears to be the natural bias of many of the folks at the Summit.
Whether the decentralized Web, however we define it, is synonymous with openness, or merely related in some fashion, is an excellent question.
The Roots of Openness
First, at a very basic level, openness is an ancient thing. In the Beginning, everything was outside. Or inside. Or both.
A foundational aspect of systems is the notion that they have boundaries. But that is only the beginning of the story. A boundary is merely a convenient demarcation between what is deemed the inner and what is deemed the outer. Determining where one system ends and another begins is not such a straightforward task.
It turns out that in nature, the boundaries between the inner and the outer are not nearly as firm and well-defined as many assume. Many systems display what are known as semi-permeable boundaries. Even a human being, in its physical body, in its mental and emotional “spaces,” can be viewed as having extensible selves, reaching far into the environment. And in turn, that environment reaches deep into the self. Technologies can be seen as one form of extension of the properties of the physical and mental self.
The world is made up of all types of systems, from simple to complex, natural to human-made. Most systems are not static, but constantly changing patterns of interactions. They exist to survive, and even to flourish, so long as they gain useful energy in their interactions with their environments.
“Homeostasis” is a term describing the tendency of a system to seek a stable state by adapting and tweaking and adjusting to its internal and external environments. There is no set path to achieving that stability, however — no happy medium, no golden mean, no end goal. In fact, the second law of thermodynamics tells us that systems constantly are holding off the universe’s relentless drive towards maximum entropy. Only in the outright death of the system do the inner and the outer conjoin once again.
Human beings too are systems, a matrix of cells organized into organs, organized into the systems of life. The most complex of these systems might be the neurological system, which in turn regulates our own openness to experience of the outside world. According to psychologists, openness to experience is one of the Big Five personality traits. This is considered a crucial element in each of us because it often cuts against the grain of our DNA and our upbringing. Surprise and newness can be perceived as a threat, based on how our brains are wired. After all, as someone once said, we are all descendants of nervous monkeys. Many of our more adventurous, braver, more open forebears probably died out along the way. Some of them, however, discovered and helped create the world we live in today.
From a systems perspective, the trick is to discover and create the conditions that optimize how the particular complex system functions. Whether a marketplace, a political system, a community, the Internet — or an individual.
Second, it may be useful to include networks and platforms in our consideration of a systems approach to openness.
There is no firm consensus here. But for many, a network is a subset of a system, while a platform is a subset of a network. All share some common elements of complex adaptive systems, including emergence, tipping points, and the difficulty of controlling the resource, versus managing it.
The subsets also have their own attributes. So, networks show network effects, while platforms show platform economic effects.
From the tech business context, openness may well look different — and more or less beneficial — depending on which of these systems structures is in play, and where we place the boundaries. An open social network premised on acquiring data for selling advertising may not be situated the same as an open source mobile operating system ecosystem, or Internet traffic over an open broadband access network. The context of the underlying resource is all-important and as such, changes the value (and even the meaning) of openness.
Third, as Michael Powell correctly calls out, this talk about being open or closed cannot come down to simple black and white dichotomies. In fact, using complex systems analysis, these two concepts amount to what is called a polarity. Neither pole is an absolute unto itself, but in fact exists, and can only be defined, in terms of its apparent opposite.
And this makes sense, right? After all, there is no such thing as a completely open system. At some point, such a system loses its very functional integrity, and eventually dissipates into nothingness. Nor is there such a thing as a completely closed system. At some point it becomes a sterile, desiccated wasteland, and simply dies from lack of oxygen.
So, what we tend to think of as the open/closed dichotomy is in fact a set of systems polarities which constitute a continuum. Perhaps the decentralized Web could be seen in a similar way, with some elements of centralization — like naming conventions — useful and even necessary for the proper functioning of the Web’s more decentralized components.
Fourth, the continuum between the more open and the more closed changes and shifts with time. Being open is a relative concept. It depends for meaning on what is going on around it. This means there is no such thing as a fixed point, or an ending equilibrium. That is one reason to prefer the term “openness” to “open,” as it suggests a moving property, an endless becoming, rather than a final resting place, a being. Again, more decentralized networks may have similar attributes of relative tradeoffs. This suggests as well that the benefits we see from a certain degree of openness are not fixed in stone.
Relatedly, a system is seen as open as compared to its less open counterpart. In this regard, the openness that is sought can be seen as reactive, a direct response to the more closed system it has been, or could be. Open source is so because it is not proprietary. Open APIs are so because they are not private. Could it be that openness actually is a reflexive, even reactionary concept? And can we explore its many aspects free from the constraints of the thing we don’t wish for it to be?
Fifth, openness as a concept seems not to be isolated, but spread all across the Internet, as well as all the various markets and technologies that underlie and overlay the Internet. Even if it is often poorly understood and sometimes misused, openness is still a pervasive subject.
Just on the code (software) and computational sides, the relevant domains include:
- Open standards, such as the Internet Protocol
- Open source, such as Android
- Open APIs, such as Venmo
- Open data, such as EU Open Data Portal
- Open AI, such as RoboSumo
How we approach openness in each of these domains potentially has wide-ranging implications for the others.
One quick example is open source.
The Mozilla Foundation recently published a report acknowledging the obvious: there is no such thing as a single “open source” model. Instead, the report highlights no fewer than 10 different types of open source archetypes, from “B2B” to “Rocket Ship to Mars” to “Bathwater.” A variety of factors are at play in each of the ten proposed archetypes, including component coupling, development speed, governance structure, community standards, types of participants, and measurements of success.
Obviously, if the smart folks at Mozilla have concluded that open source can and does mean many different things, it must be true. And that same nuanced thinking probably is suitable as well for all the other openness domains.
So, in sum, openness is a systems polarity, a relational and contextual concept, a reflexive move, and a pervasive aspect of the technology world.
Possible Openness Taxonomies
Finally, here are a few proposed taxonomies that would be useful to explore further:
Means versus End
Is openness a tool (a means), or an outcome (an end)? Or both? And if both, when is it best employed in one way compared to another? There are different implications for what we want to accomplish.
The three Fs
Generally speaking, openness can refer to one of three things: a resource, a process, or an entity.
- The resource is the virtual or physical thing subject to being open.
- The process is the chosen way for people to create and maintain openness, and which itself can be more or less open.
- The entity is the body of individuals responsible for the resource and the process. Again, the entity can be more or less open.
Perhaps a more alliterative way of putting this is that the resource is the function, the chosen process is the form, and the chosen entity is the forum.
For example, in terms of the Internet, the Internet Protocol and other design elements constitute the function, the RFC process is the form, and the IETF is the forum. Note that all these are relatively open, but obviously in different ways. Also note that a relatively closed form and forum can yield a more open function, or vice versa.
Form and forum need not follow function. But the end result is probably better if it does.
So, in all cases of openness, we should ask: What is the Form, what is the Forum, and what is the Function?
Scope of Openness
Openness can also be broken down by scope, or the different degrees of access provided.
This can run the gamut, from the bare minimum of awareness that a resource or process or entity even exists, to transparency about what it entails, then to accessing and utilizing the resource, having a reasonable ability to provide input into it, influencing its operation, controlling its powers, and ultimately owning it outright. One can see it as the steps involved in identifying and approaching a house, and eventually possessing the treasure buried inside, or even forging that treasure into new being.
Think about the Android OS, for example, and how its labelling as an open source platform does, or does not, comport with the full scope of openness, and perhaps how those degrees have shifted over time. Clearly it matches up to one of Mozilla’s ten open source archetypes — but what are the tradeoffs, who has made them and why, and what is the full range of implications for the ecosystem? That would be worth a conversation.
Interestingly, many of these degrees of openness seem to be rooted in traditional common carrier law and regulation, going back decades if not centuries.
- Visibility and Transparency: the duty to convey information about practices and services
- Access: the norms of interconnection and interoperability
- Reasonable treatment: the expectation of fair, reasonable, and nondiscriminatory terms and conditions
- Control: the essential facilities and common carriage classifications
In fact, in late July 2018, US Senator Mark Warner (D-VA) released a legislative proposal to regulate the tech platforms, with provisions that utilize many of these same concepts.
Openness as Safeguards Taxonomy
Finally, openness has been invoked over the years by policymakers, such as the Congress and the FCC in the United States. Often it has been employed as a means of “opening up” a particular market, such as the local telecommunications network, to benefit another market sector, like the information services and OTT industries.
Over time, these market entry rules and safeguards have fallen into certain buckets. The debates over broadband access networks are one interesting example:
- definitional — basic/enhanced dichotomy
- structural — Computer II structural separation
- functional — Computer III modular interfaces
- behavioral — network neutrality
- informational — transparency
In each case, it would be useful if stakeholders engaged in a thorough analysis of the scope and tradeoffs of openness, as defined from the vantage points of the telecom network owners, the online services, and the ultimate end users.
The larger point, however, is that openness is a potentially robust topic that will influence the ways all of us think about the decentralized Web.
Richard S. Whitt is an experienced corporate strategist and technology policy attorney. Currently he serves as Fellow in Residence with the Mozilla Foundation, and Senior Fellow with the Georgetown Institute for Technology Law and Policy. As head of NetsEdge LLC, he advises companies on the complex governance challenges at the intersection of market, technology, and policy systems. He is also president of the GLIA Foundation, and founder of the GLIAnet Project.