Boundary Logic

Here’s my dilemma, folks. There’s an order of magnitude more material for this page than on the entire iconicmath site. The application of iconic techniques to logic yields astonishing results. The image shows the container representation of the logical concept of equivalence. Iconic logic is unary; there is no true/false duality. When applied, logical results come more efficiently, and iconic deduction is a clearer process. However, the newfound clarity requires that we reconsider what is meant by thinking logically. Unit-ensembles and depth-value notation retained a visceral connection to our current place-value numbers, although iconic arithmetic moves substantively away from the group theoretic concepts of modern algebra. In contrast, iconic logic is radically different from conventional logic. Conventional logic is conceptual; its ANDs and ORs and NOTs are famously without referents in the actual world. Iconic logic is grounded; it is constructed physically rather than conceptually. Just like arithmetic, logic is putting stuff into containers and then simplifying the result. Like the Merge and Group operations of iconic numbers, iconic logic’s Cross and Call manage all facets of simple logical inference.

The iconic patterns of arithmetic and logic are presented below. For logic, assume that the cross mark is the same as ( ). One fundamental difference between arithmetic and logic is that the units of arithmetic are full while the units of logic are empty. The iconic arithmetic rules, if read from left to right, are Ungroup and Unmerge. Since equality allows both directions of transformation, this change is solely visual, intended to allow an easier comparison between arithmetic and logic. Similarly, Cross and Call have been written with explicit containers and with dots for “units” (the unit for logic is the value True).

These sets of equations define simple arithmetic and simple logic, and they define the difference between arithmetic and logic. In order to count, arithmetic requires that units accumulate. Logical truth does not accumulate; the Call rule illustrates what in arithmetic would amount to 1+1=1. The Cross rule also assures non-accumulation by defining containment as cancellation rather than grouping. This visual information is better contemplated than described in words.
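
To make the two logic rules concrete, here is a minimal sketch in Python. The representation is an illustrative assumption, not part of the articles below: a container is modeled as a tuple of its contents, the empty container ( ) is the empty tuple, and a space is a list of containers.

    # Call:  ( ) ( ) = ( )    truth does not accumulate
    # Cross: (( ))   = void   containment as cancellation
    def reduce_space(space):
        """Apply Cross and Call until the space is stable."""
        out = []
        for form in space:
            inner = reduce_space(list(form))   # reduce contents first
            if inner == [()]:                  # Cross: (( )) vanishes
                continue
            out.append(tuple(inner))
        units = [f for f in out if f == ()]    # Call: many ( ) become one ( )
        rest = [f for f in out if f != ()]
        return ([()] if units else []) + rest

    # Example: the space (( )) ( ) ( ) reduces to ( )
    print(reduce_space([((),), (), ()]))       # -> [()]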

Charles Sanders Peirce invented this particular approach to iconic logic in the 1890s, which he called Entitative Graphs. Other iconic approaches during the same time period included Venn’s diagrams and Frege’s diagrammatic logic. George Spencer Brown published Laws of Form in 1969, which placed Peirce’s graphs in the algebraic context of equations. Cross and Call are from Laws of Form. My work elaborates and extends Spencer Brown’s seminal work. To distinguish each system, I’ve called my variations on Spencer Brown’s theme Boundary Logic.

What follows is a dozen articles on iconic logic. I’ve organized them roughly in order of technical complexity, and included short summaries of the content of each. For convenience, and since some articles are monographs, the length of each is indicated in the title.


Common Sense (5 pages)

The Advantages of Boundary Logic — A Common Sense Approach presents the basic ideas of iconic logic anchored to common sense physical examples.


Class Notes (5 pages)

The Boundary Logic Class Notes also present the basic ideas of iconic logic, in bullet form. This is the version I usually give to students.


From the Beginning (32 pages)

Boundary Logic from the Beginning steps carefully through the origins, concepts, and applications of boundary logic. This is the last paragraph of the article.

Conventional logic has always harbored a contradiction. It is said to be “how we think rationally”, yet symbolic logic is notoriously difficult. Simple problems of inference are known to confuse the majority of people. Perhaps the most penetrating idea introduced by the iconic approach to deduction is that what we know as rationality is not the formal symbolic manipulation of a complex and challenging logical formula. Rather it is a simple and inevitable consequence of making marks in empty space.


Take Nothing Seriously (50 pages)

Taking Nothing Seriously: A Foundational Diagrammatic Formalism covers the same ground as From the Beginning, with more of a focus on simple mathematical ideas than on conceptual origins. This evolutionary document has a fundamental conceptual error: it attributes a meaning, that of sharing, to empty space. The concept of sharing space should have been stated as sharing the same container. Here is the (uncorrected) abstract.

We explore the consequences of constructing a diagrammatic formalism from scratch. Mathematical interpretation and application of this construction are carefully avoided, in favor of simply pointing toward potential mathematical anchors for formal diagrammatic structure. Diagrams are drawn in a two-dimensional space, while conventional mathematics emphasizes the one-dimensional linguistic space of strings. A theme is that the type of REPRESENTATIONAL SPACE interacts with the expressive power of a formal system.

Starting at the simplest beginning, we examine only two aspects of diagrammatic representation: forms that SHARE the representational space, and forms that enclose, or BOUND, portions of that space. The formalization of SHARING and BOUNDING is called boundary mathematics.

A pure boundary mathematics is a set of formal transformation rules on configurations of spatial boundaries. Due to the desirability of implementing spatial transformation rules by pattern-matching, we introduce boundary algebra, that is, boundary mathematics with equality defined by valid substitutions. This step connects us firmly to the known mathematical structure of partial orderings, algebraic monoids, Boolean algebras, and Peirce’s Alpha graphs.

SHARING and BOUNDING do not form an algebraic group, leading to the surprising conclusion that boundary algebra is not isomorphic to Boolean algebra although it is equally expressive. This result is a formal consequence of Shin’s observation that Alpha graphs have multiple readings for logic, providing formal evidence that, at least for propositional calculus, diagrammatic formalism is more powerful than linguistic formalism.


Peirce (18 pages)

Boundary Logic and Alpha Existential Graphs reiterates the conceptual foundations of iconic logic found in Taking Nothing Seriously, comparing Spencer Brown’s approach to Peirce’s approach. Here is the abstract.

Peirce’s Alpha Existential Graphs (AEG) is combined with Spencer Brown’s Laws of Form to create an algebraic diagrammatic formalism called boundary logic. First, the intuitive properties of configurations of planar nonoverlapping closed curves are viewed as a pure boundary mathematics, without conventional interpretation. Void representational space provides a featureless substrate for boundary forms. Pattern-equations impose constraints on forms to define semantic interpretation. Patterns emphasize void-equivalence, deletion of structure that is syntactically irrelevant and semantically inert. Boundary logic maps one-to-many to propositional calculus. However, the three simple pattern-equations of boundary logic provide capabilities that are unavailable in token-based systems. Void-substitution replaces collection and rearrangement of forms. Patterns incorporate transparent boundaries that ignore the arity and scope of logical connectives. The algebra is isomorphic to AEG but eliminates difficulties with reading and with use by substituting a purely diagrammatic formalism for the logical mechanisms incorporated within AEG.
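
As a preview, the three pattern-equations can be sketched in flat container notation, with capital letters standing for arbitrary, possibly empty, contents. Read left to right, each deletes structure that is semantically inert:

    Occlusion:    (( ) A)  =  void
    Involution:   ((A))    =  A
    Pervasion:    A (A B)  =  A (B)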


Conventional Interpretations (30 pages)

Conventional Interpretations of Boundary Logic Tools again establishes the conceptual foundations and then goes on to describe the algebraic theorems that are of pragmatic importance to the implementation of boundary logic in software. Here’s the abstract.

The concepts of conventional logic can all be represented in the BL formalism. In so doing, conventional logic becomes simpler and more efficient. Similarly, the concepts of BL can be represented in conventional logic, but only by adding new tools and perspectives to conventional approaches.

Thus, BL improves and extends the power of conventional logic. Using BL concepts generally and implementing BL concepts in software data structures and algorithms and in semiconductor and other hardware designs (among other applications of BL) avoids the representational and computational complexities of conventional logic approaches. It is, however, possible to replicate the transformational rules and mechanisms of BL using a vocabulary of conventional logic. This might have the effect of creating new ideas for conventional logic that appear to be novel and unique, when they would be, in fact, derivative of BL innovations.

To make the relationship between BL and conventional logic clear, this document includes many comparisons of techniques in representation, in the form of theorems, in proofs, and in hardware architectures.
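
As a small taste of those comparisons, here is a hedged sketch of one common translation from conventional connectives into container forms, under the reading in which ( ) is TRUE, enclosure is negation, and forms sharing a space are ORed. The helper names are mine, not the article’s:

    # One common reading: ( ) is TRUE, void is FALSE,
    # (a) is NOT a, and forms sharing a space are ORed.
    def NOT(a):        return "(" + a + ")"
    def OR(a, b):      return a + " " + b
    def AND(a, b):     return "((" + a + ") (" + b + "))"
    def IMPLIES(a, b): return "(" + a + ") " + b

    print(IMPLIES("a", "b"))   # (a) b
    print(AND("a", "b"))       # ((a) (b))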


What’s the Difference? (53 pages)

We are now heading into the more technical articles that presume experience with iconic formalisms. What’s the Difference? Contrasting Boundary and Boolean Algebras addresses a fundamental confusion about Spencer Brown’s work: the belief that the formalism of an iconic logic is isomorphic (identical up to the choice of symbolic names) to the symbolic system of Boolean algebra. That this is a complete misinterpretation is obvious; just count the number of ground symbols. That is, compare {TRUE, FALSE} to { O }. Here’s the abstract.

There is a common misconception that boundary algebra is isomorphic with Boolean algebra. The paper describes a dozen deep structural differences that are incompatible with isomorphism. Boundary algebra does not support functional morphisms of any kind because it does not support functions. It does support a partial order relation. A one-to-many mapping between boundary and Boolean algebras emphasizes the difference between the two systems. This mapping shows that boundary algebra subsumes Boolean algebra within a formally smaller structure. Boundary algebra applies to n-element Boolean algebras, and does not have group theoretic structure.
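
The one-to-many character of the mapping is easy to demonstrate with a translator like the hypothetical one sketched above: distinct Boolean formulas land on a single container form.

    def NOT(a):        return "(" + a + ")"
    def OR(a, b):      return a + " " + b
    def IMPLIES(a, b): return "(" + a + ") " + b

    print(IMPLIES("a", "b"))     # (a) b
    print(OR(NOT("a"), "b"))     # (a) b -- a different formula, the same form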


Equality (23 pages)

The main point of Equality Is Not Free is that when an equal sign, =, is interpreted logically as each side of the equation implying the other, logic and algebra get confused at a foundational level. This article explores the consequences. Here is the introduction.

Our conceptualization of mathematical expressions, definitions, and proofs is formulated in the language of logic, using AND and NOT and IMPLIES and IF-AND-ONLY-IF. This same language maps to boundary algebra, so that the way we describe and address problems logically can also be formulated as the structural transformation of algebraic boundary forms.

Boundary algebra provides a different way of thinking about deduction and rationality. The language of boundary algebra consists only of SHARING and BOUNDING and EQUALS.

In the sequel, we use boundary algebra tools to analyze and to deconstruct the structure of logic itself, with an emphasis on the relationship between logical connectives and algebraic EQUALS.


Insertion (17 pages)

Generalized Insertion explores an innovation in the concept of logical proof. Like many iconic techniques, it has no parallel in conventional symbolic logic. Both logical proof and Boolean optimization can be conducted by querying parts of the fact base, without engaging in deduction. This is the opening paragraph of the article.

The Boundary Insertion algorithm is a general proof technique for logic, functionally equivalent to the four other known techniques (truth tables, natural deduction, resolution, and algebraic substitution). Unlike the other techniques, polynomially bounded Insertion is quite comprehensive.
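
For orientation, the underlying idea can be sketched in container notation (my paraphrase, not a quotation from the article). Insertion is the right-to-left reading of the pervasion equation, which lets a form be copied into, or deleted from, any container that shares its space:

    A (A B)  =  A (B)

For example, the tautology a OR NOT a is the form a (a). Deleting the inner a by pervasion leaves a ( ), and the dominant ( ) renders the entire space TRUE.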


Complexity (66 pages)

Computational Complexity and Boundary Logic explores the classic P=NP computational problem: whether a general proof procedure for logic must always, at some point of complexity, make a guess. The Virtual Insertion technique also encounters this barrier; the article explores when. Here is the summary.

Logic has evolved over thousands of years within language. Mathematical logic is young, a creation of the 20th century. One of the most outstanding problems for mathematical logic is determining whether or not there exist tractable algorithms for exponential problems. An answer may lie in one of the simplest exponential problems, that of determining if a given logic expression is a tautology. Boundary logic (BL) provides a new set of computational tools which are geometric and algebraic, most definitely not conventional logic. These container-based tools are simpler than those of both mathematical and natural logic. Can boundary logic shed light on tautology identification?

After the nature of algorithmic complexity and BL are introduced, we explore conventional rules of inference and deduction from a BL perspective. Proofs of all but the distributive law are close to trivial using BL algorithms. Certainly none of the explicit structure of modern logic identifies complexity. We explore significantly difficult tautological problems which can be constructed using the rules of logic. None of these are complex, although some are non-trivial. It appears that compound logical rules do not identify complex problems, yet logical proof systems rapidly require exponential effort.

We then describe the central reduction algorithm of BL, called virtual insertion, and apply it to known tractable and intractable problems. The low-degree polynomial virtual insertion algorithm is not complete; however, the tautologies it cannot reduce may be constrained to intractable problems. Thus, virtual insertion is an efficient decision procedure for tractable tautologies. This is valuable since almost all pragmatic problems are tractable. Using virtual insertion recursively produces a complete decision procedure for elementary logic, but one with the expected exponential complexity.

The boundary logic representations and algorithms described herein have been fully implemented in software and applied to computationally difficult practical problems, such as circuit design, tautology detection, and computer program optimization.
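
To make the complexity claims tangible, here is a toy complete decision procedure in Python under the same ( ) = TRUE reading sketched earlier. It is an illustration, not the article’s virtual insertion algorithm: it evaluates ground forms directly and case-splits on each variable, exhibiting exactly the expected exponential behavior.

    def parse(s):
        """Parse container notation such as '(a) b' into nested lists,
        keeping only the container structure of a ground form."""
        stack = [[]]
        for ch in s:
            if ch == "(":
                stack.append([])
            elif ch == ")":
                top = stack.pop()
                stack[-1].append(top)
        return stack[0]

    def is_true(space):
        # A space is TRUE iff some container in it holds FALSE contents,
        # since (x) reads as NOT x and sharing a space reads as OR.
        return any(not is_true(c) for c in space)

    def tautology(form, variables):
        """Complete but exponential: 2^n ground cases for n variables."""
        if not variables:
            return is_true(parse(form))
        v, rest = variables[0], variables[1:]
        # Case split: TRUE is the mark "()", FALSE is the void "".
        return (tautology(form.replace(v, "()"), rest)
                and tautology(form.replace(v, ""), rest))

    print(tautology("a (a)", ["a"]))   # a OR NOT a -> True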


Containment (166 pages)

Iconic and Symbolic Containment in Laws of Form is a recent (2010) monograph that thoroughly explores the structure of Spencer Brown’s iconic logic and its relationship to other mathematical approaches. The goal is to decisively demonstrate that Laws of Form is a calculus of one relationship, that of containment. The interpretation of sharing space as a function is an error that violates a fundamental iconic rule: empty space has no interpretation. This is the overview from the monograph.

At the turn of the twentieth century, the mathematical community adopted a radical plan to put mathematics on a firm foundation. The idea was symbolic formalization, the representation of concepts using encoded symbols that bear no resemblance to what they mean. At the same time, C.S. Peirce developed Existential Graphs, an iconic representation of logic that is drawn rather than written [ref]. Iconic forms are images that look like what they mean. In 1969, G. Spencer Brown reintroduced Peirce’s iconic system in a more general algebraic style as Laws of Form [ref]. Today, mathematics remains symbolic, while other communication media have evolved into visual and interactive experiences. This essay explores the possibility of an iconic mathematics.

Laws of Form (LoF) is an algebraic system expressed in an iconic notation. We interpret LoF as a calculus of containment relations, and develop intuitive, symbolic, and iconic descriptions of the Contains relation. The representation of the LoF calculus is analyzed from both the iconic and the symbolic formal perspectives, with a focus on the descriptive structures necessary to align the two representational techniques. An iconic notation resembles its interpretation, which is at variance with modern mathematical conventions that strictly separate syntax from semantics. Iconic form is displayed in two and three dimensions, while symbolic form unfolds as encoded one-dimensional strings of tokens. Iconic variables stand in place of arbitrary patterns of zero, one, or many objects, while symbolic variables stand in place of single expressions. Symbols conform to speech, while icons conform to vision.

The iconic mathematics of LoF provides a semantic model for both a relational and a functional calculus. The operation of putting an object into a container forms an algebraic quasigroup. When further constrained by LoF rules, the functional calculus provides a new perspective on Boolean algebra, an equally expressive calculus that is both non-associative and non-commutative. LoF can be interpreted as propositional logic; iconic containment also provides a new, non-symbolic deductive method.

We show four varieties of iconic notation to illustrate the interdependence of representation and meaning. Iconic formal systems extend the ability of symbolic notations to express parallelism, transformation across nesting, structure sharing, and void-based transformation. The expressive neutrality of several symbolic conventions (labeling, grouping, arity, null objects, variables) is examined in light of the LoF iconic calculus. Computational examples in both symbolic and iconic notations are compared side-by-side. We show that inducing symbolic representation on essentially spatial form has led to widespread publication of erroneous information.


Five Liars (6 pages)

Lewis Carroll’s Five Liars Puzzle, published in 1897, was considered one of the most difficult logic puzzles of its day. This article solves the puzzle with iconic techniques. Interestingly, Carroll got the puzzle wrong, thinking there was only one solution when in fact there are two.