Church-Rosser in Morphogrammatics

Lambda calculus application of kenomic and morphic abstractions

Rudolf Kaehr, Dr. phil.

Copyright ThinkArt Lab ISSN 2041-4358

 

Abstract

Morphogrammatics does not presuppose a multitude of contextures but creates a polyverse of contextures in each situation in which an operation is realized. This paper continues the studies proposed in “Notes on semi-Thue Systems in a Context of Morphogrammatics".

1.  Further mimicry: Reduction theorems

1.1.  Iteration vs. accretion

1.1.1.  Motivation

If term X can be reduced to both Y and Z, then there must be a further term V (possibly equal to either Y or Z) to which both Y and Z can be reduced.

Kenomic inversion
"If there are two morphogrammatically equivalent but semiotically different terms t1 and t2, i.e. t1 =MG t2 and t1 !=sem t2, then there is a prior term t0 from which both t1 and t2 can be produced."

Two representations, one solution
λX, Y, Z :
CONS((X Y), Z) :
CONS((X Y), Z)((ab), a).

cons((ab), a) = (aba) | (abb) | (abc),
(aba) !=MG (abb) !=MG (abc).

Many solutions, one representation:
cons((ab), a) --> (aba) ∐ (abb) ∐ (abc).

Deconstruction, step one
Church-Rosser: two representations, one solution.
Morphogrammatics: two solutions, one representation.

Example I, Church-Rosser: two representations t1, t2 of one solution:
t0 --> ... --> t1 and t0 --> ... --> t2 imply t1 =ID t2.

(br) : (λv.t) s => t[v/s].

Example II, Morphogrammatics: two different solutions t1, t2 of a formula t0 with one representation t0:
t0 --> ... --> t1 and t0 --> ... --> t2 imply t1 !=ID t2.

Example III: (t1 !=ID t2 | t1 =KENO t2 | ...), with t01 =SYNT t02.

β-equivalence

"Finally, β-equivalence is obtained by allowing reduction steps as well as inverse reduction steps, i.e., by making -->β symmetric:
Definition. We write M =β M’ if M can be transformed into M’ by zero or more reduction steps and/or inverse reduction steps.
Formally, =β is defined to be the reflexive symmetric transitive closure of -->β, i.e., the smallest equivalence relation containing -->β.” (Selinger, p.17)

Nonetheless, the β-equivalence is based on the principle “One representation, one solution” and is therefore not to be confused with the given examples of the concept “Many solutions, one representation".

1.1.2.  Polycontexturality and morphogrammatics

What is the difference from the polycontextural approach as it was presented under the motto “Lambda calculi in polycontextural situations"? As the title suggests, polycontexturality of situations is presupposed in order to place, i.e. distribute and mediate, lambda calculi. The study of such distributed lambda calculi was partly elaborated and has delivered some interesting results. Nevertheless, polycontexturality wasn’t generated by such a dissemination of lambda calculi but presupposed.
Polycontexturality has its own calculus of generating distributed contextures in a polyverse.

On the other hand, a justification for the choice of polycontexturality isn’t necessary. The reason is simple: there is no generally acceptable reason to opt for the classical monocontextural approach. The fact that monocontexturality is world-wide accepted and technically in many respects highly successful doesn’t mean that this monocontextural approach has found a secure foundation. Its legitimacy is pragmatic, its limits more and more obvious.
Therefore, an option for polycontexturality isn’t less arbitrary than an option for the established monocontexturality.

In contrast to the contextural decision, keno- and morphogrammatic constructions as sketched in the paper “Notes on semi-Thue Systems in a Context of Morphogrammatics” are not presupposing but generating their polycontexturality on the basis of their own operations. Each repetition and iteration has per se its retrograde recursive double role as iteration and as accretion.

Morphogrammatics is not presuming a multitude of contextures but is creating a polyverse of contextures in each situation of a realization of an operation.

1.1.3.  Modi of substitution

Definition. The (capture-avoiding) substitution of N for free occurrences of x in M, in symbols M[N/x], is defined as follows:

x[N/x] ≡ N
y[N/x] ≡ y, if x != y
(M P)[N/x] ≡ (M[N/x])(P[N/x])
(λx.M)[N/x] ≡ λx.M
(λy.M)[N/x] ≡ λy.(M[N/x]), if x != y and y ∉ FV(N)
(λy.M)[N/x] ≡ λy'.(M{y'/y}[N/x]), if x != y, y ∈ FV(N), and y' fresh.
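As a reading aid, the following is a minimal sketch, in the Scheme used later in this paper, of the capture-avoiding substitution M[N/x] just defined. Terms are represented as s-expressions, and the helper names subst, free-in? and fresh-var are illustrative assumptions, not part of the source.

;; Capture-avoiding substitution M[N/x] on terms written as s-expressions:
;; a term is a symbol, (lambda (v) body), or an application (rator rand).
(define (free-in? x m)                       ; does x occur free in m?
  (cond ((symbol? m) (eq? x m))
        ((eq? (car m) 'lambda)
         (and (not (eq? x (caadr m))) (free-in? x (caddr m))))
        (else (or (free-in? x (car m)) (free-in? x (cadr m))))))

(define (fresh-var) (gensym))                ; a fresh y' (gensym: common, but not standard R7RS)

(define (subst m n x)                        ; computes M[N/x]
  (cond ((symbol? m) (if (eq? m x) n m))     ; x[N/x] ≡ N, y[N/x] ≡ y
        ((eq? (car m) 'lambda)
         (let ((y (caadr m)) (body (caddr m)))
           (cond ((eq? y x) m)                                   ; (λx.M)[N/x] ≡ λx.M
                 ((not (free-in? y n))                           ; y ∉ FV(N): descend
                  (list 'lambda (list y) (subst body n x)))
                 (else                                           ; y ∈ FV(N): rename y to a fresh y'
                  (let ((y* (fresh-var)))
                    (list 'lambda (list y*)
                          (subst (subst body y* y) n x)))))))
        (else (list (subst (car m) n x) (subst (cadr m) n x))))) ; (MP)[N/x]

;; Example: substituting (y z) for x under a binder y forces the renaming case:
;; (subst '(lambda (y) (x y)) '(y z) 'x)  =>  (lambda (g42) ((y z) g42))   (fresh name varies)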


This opens up the possibility of introducing different kinds of abstraction into the process of substitution. The general table of the different kinds of substitution, as introduced for morphogrammatic semi-Thue systems, shall be applied here.
http://memristors.memristics.com/semi-Thue/Notes%20on%20semi-Thue%20systems.pdf

Example

Identity (equality)
ID = (x1 ≡ x2 ≡ x3)

Table: u =>ID v

Equivalence
(x1 =keno x2 =keno x3)

Table: u =>keno v

Similarity
(x1 =sim x2 =sim x3)

Table: u =>sim v

Bisimilarity
(x1 =bis x2 =bis x3)

Table: u =>bis v

Table of the modi of substitution for the different types of lambda calculi LC.

Different version
Reversing the beta-reduction rule produces the beta-abstraction rule.
Kenneth Slonneger, Formal syntax and semantics for programming languages, §5, p.149, 1995

Syntax of Lambda Expressions
1. t = x, x ∈ Var
2. t = (λx.M), x ∈ Var and M an expression
3. t = (M N), M, N expressions

β-Reduction Rules
(br)   (λv.t) s => t[v/s]
(a1)   congruence: a reduction step may be applied to a sub-term of an application
(rps)  if t1 --> t2 then t1 = t2
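A minimal sketch, not taken from Slonneger, of how the (br) rule acts on such terms in Scheme; it reuses the subst procedure sketched above and only contracts a redex at the root of a term.

;; One (br) step at the root: ((lambda (v) t) s)  =>  t[v/s].
;; Assumes the subst helper from the substitution sketch above.
(define (beta-step term)
  (if (and (pair? term)
           (pair? (car term))
           (eq? (caar term) 'lambda))
      (let ((v (car (cadr (car term))))      ; bound variable v
            (body (caddr (car term)))        ; body t
            (s (cadr term)))                 ; argument s
        (subst body s v))
      term))                                 ; no redex at the root: term unchanged

;; (beta-step '((lambda (v) (v v)) a))  =>  (a a)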

2.  Church-Rosser reductions for morphogrammatics

2.1.  Lambda Calculus modus=ID

"In fact, when a form contains more than one lambda that can be reduced, it does not matter which one is reduced first, the result will be the same. This is known as the Church-Rosser property, or, informally, as the diamond property.” (Barker)

t  _ 0 --> ... --> t _ 1 and t  _ 0 --> ... --> t _ 2 imply t  ...  2 =  _ sem w  _ 3 =  _ sem w  _ 4 .     : [++ ++]

Example I: ... : (br) ; (a1), (br) : t1 = t2.

General remarks to the modi of abstractions

Obviously this says that both branches are terminal and that both terminal results of the branches are semantically equal; therefore the branches meet in a common end. This corresponds to the graphematical equality or identity case.
Thus, we have:
                  branch1              branch2
                     |                    |
                   t3.1        =        t3.2

and t3.1 =SEM t3.2, hence the final result is t3 = t3.1 = t3.2.

Hence, for equality of terms we get: t3.1=SYNT t3.2.
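As a check of the identity case, a minimal sketch (not from the paper) of the two branches for t0 = (λx. x+x)((λy. y+y)1) in Scheme; the two reduction orders are imitated by hand and both end in the same value t3.

;; Branch 1: contract the outer redex first, duplicating the argument.
(define t1 (+ ((lambda (y) (+ y y)) 1)
              ((lambda (y) (+ y y)) 1)))     ; (1+1) + (1+1)

;; Branch 2: contract the inner redex first, then the outer one.
(define t2 ((lambda (x) (+ x x))
            ((lambda (y) (+ y y)) 1)))       ; (λx. x+x)(1+1)

(display (list t1 t2))                       ; => (4 4): both branches meet in t3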

Again, this result is supposing a common arithmetic of the example, given by the term t0. But what happens if we allow a game where arithmetics might be colored, and thus the results might still be syntactically equivalent but semantically different, because, say, blue is not green? Or if the number systems are running differently? This situation could result in a kenogrammatical calculus if we accept that the difference is not just in color but in the syntactical terms, with results t3.1 = (1 + 1) + (1 + 1) and t3.2 = (2 + 2) + (2 + 2).

Hence, for equivalence of terms we get: t3.1 !=SEM t3.2 but t3.1 =KENO t3.2.

More interesting deviations happen in the case where the terms also differ in their sub-terms.
Say, if we get a result with different operators “+" and “x”.
t3.1 = (1 + 1) + (1 + 1) and t3.2 = (2 x 2) +(2 x 2).
In this case, there is no equality (identity) or kenogrammatical equivalence but similarity. Both terms are similar in their abstract structure which conserves the complexity of the terms, i.e. the length of the formula.

Hence, for similarity of terms we get: t3.1 !=SEM t3.2, t3.1 !=KENO t3.2, but t3.1 =SIM t3.2.

Although similar terms are still of the same length, the number of their elements might differ. Only the metamorphic abstraction over terms introduces a bisimilarity between terms of different complexity and therefore different length of the involved terms.
Bisimilarity happens if we get structurally different answers from the ‘same’ constellation t0.
Say, t0 = (λx.x + x)((λy.y + y)1) delivers the answers t3.1 = (1 + 1) + (1 + 1) and t3.2 = (2 x 2) + (2 x 2) x (1 + 1).

Hence, for bisimilarity of terms we get: t3.1 !=SEM t3.2, t3.1 !=KENO t3.2, t3.1 !=SIM t3.2 but  t3.1 =BIS t3.2.

All cases, equality, equivalence, similarity and bisimilarity, are still accepting the main scheme of term development, here the branching of t0 into t1 and t2, terminating in t3. Therefore, there is not yet any retrograde redefinition of the scheme involved during the term development.
This retrograde redefinition is attempted with interactional and interventional metamorphosis of general polycontextural formal systems. •    

Example II: (br) : subst : (λv.t) s -> t[v/s]ID ...

“In these computations, the final values are syntactically equal, so we know that they are semantically equal. This is interesting in light of the fact that in the first computation the top-level function was applied to the value of its arguments, while in the second computation, it was applied to the syntactic expressions defining its arguments!” (Starck, p. 203)

2.2.  Lambda Calculus modus=KENO

Conditions: w2 =sem w4, w2 =MG w4.

t1 !=SEM t2, t1 =keno t2.

Example III: (br) : subst : (λv.t) s -> t[v/s]keno ...

Further examples
Example 1: ... : 'a'/t1 (br) ; 'b'/t2 (a1, br)

Example 1 + accr + BIF: ... 1.2 ... 2.2

Counter-Example 3: ... (b + c) : t1 !=KENO t2.

2.3.  Lambda Calculus modus=similarity

Substitutions have been quite harmonious for the identity (equality) and the equivalence case.
Substitution in the mode of identity, which is the case for classical lambda calculi, is a prototype of a transformation or rewriting system without any deviation from identity.
Kenomic substitutions happen in the mode of equivalence, in contrast to the equality of identity, enabled by the kenomic abstraction or the Stirling Turn.
Nevertheless, both abstractions are balanced in respect of the number of occurrences of their terms.
This presumption is abandoned with the abstraction of similarity. Although similar terms are still of the same length, the number of their elements might differ.
Only the metamorphic abstraction over terms introduces a bisimilarity between terms of different numbers of elements and different length of the involved terms.

Similarity
(u =>SIM v) => ((w1 u =>SIM w2 v) ∧ (w3 u =>SIM w4 v)),
with ..., w2 !=sem w4, w2 =MG w4.

Table: u =>SIM v

Example: u =>SIM v

u = [aab], v = [bba]
u =>MG v, u ¬=>SEM v
w1 = [cc], w2 = [dd]
w3 = [ee], w4 = [ff]

wi, i = 1,2,3,4:

length(w1) = length(w2)
sem(wi) ∩ sem(u) = ∅, i = 1,2

length(w3) = length(w4)
sem(wi) ∩ sem(v) = ∅, i = 3,4

length(w1) = length(w3)
sem(wi) ∩ sem(v) = ∅, i = 1,3

length(w2) = length(w4)
sem(wi) ∩ sem(v) = ∅, i = 2,4

[aab] =>SIM [bba]
[cc][aab] =>SIM [dd][bba],
[ee][aab] =>SIM [ff][bba]

Example I - SIM: Terms ... : t1 =SIM t2.

Example II - SIM: Terms ... : t1 =SIM t2.

Example III - SIM: Terms ... : t1 =SIM t2.

2.4.  Lambda Calculus modus=bisimilarity

2.4.1.  First steps to LC-bisimilarity

Table: u =>BIS v

Conditions: u =>BIS v : wu ¬=>SEM wv, ..., w2 !=sem w4, w2 =MG w4.

Bisimilarity: w1 =>BIS w2 ...

Example: u =>BIS v, ...

Example - BIS: Terms = ... : t1 =BIS t2.


http://memristors.memristics.com/semi-Thue/Notes%20on%20semi-Thue%20systems.pdf

2.4.2.  Linguistic example for LC-bisimilarity

A simple application  “Fruit flies like a banana.”
What’s the meaning of an ambiguous sentence like “Fruit flies like a banana.”?

As it is well known, the sentence has, at least, two meanings:
1. “The insects called fruit flies are positively disposed towards bananas.”
2. “Something called fruit is capable of the same type of trajectory as a banana.”
"These two potential meanings are partly based on the (at least) two ways in which the phrase can be parsed.” (Alan P. Parkes, Introduction to Languages, Machines and Logic, 2002, p. 42)

Ambiguity in languages is reflected in the existence of more than one parse tree for one sentence of that language.

Fruit flies like a banana.                  Fruit flies like a banana.
               ↙↘                                                ↙       ↘
  Fruit flies    like a banana                      Fruit flies   like a banana
                     ↙↘                                    ↙↘                   ↙↘             
                 like  a banana                   Fruit   flies           like  a banana

                                                        
                                                     Fruit flies like a banana.
Fruit flies like a banana.                              ↗↖ a-banana
           ↗↖ a banana                                ↗↖ like
         ↗↖ like                                      ↗↖flies
Fruit-flies                                         Fruit

Compositionality
The sentence X = "Fruit flies like a banana.” is de/composable into two different term-trees.
X' = (x1, x2, x3) and X” = (x1, x2, x3, x4) with

X’ =  like(banana, fruit-flies), length(X’) = 3
X” = flies((like, banana), fruit), length(X") = 4.
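A small sketch (not part of the source) that encodes the two parses as s-expressions in Scheme, so that the differing decomposition lengths 3 and 4 can be computed directly; the helper term-length is an assumption for illustration.

(define X1 '(like banana fruit-flies))       ; X' : like(banana, fruit-flies)
(define X2 '(flies (like banana) fruit))     ; X'': flies((like, banana), fruit)

(define (term-length t)                      ; number of atomic sub-terms
  (if (pair? t) (apply + (map term-length t)) 1))

(display (list (term-length X1) (term-length X2)))   ; => (3 4)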

Because the meaning of the two sentences is not decomposable into the same number of sub-terms, the relation between the two interpretations of the full sentence is neither equal, equivalent nor similar; therefore, the concept of polysemy is not properly applicable. The two interpretations are also not in an asymmetric meaning relation of one-many-mappings.

The ‘ambiguous’ meaning of the sentence X ‘as such’ is therefore understandable as a bisimilar interplay between its two different realizations X’ and X”. The morphogram of X, i.e. its deep-structural meaning prior to any phonological interpretation, is thus the morphogram of the whole situation MG = (X, X’, X").

It is a reasonable option to interpret this linguistic example, which is an overlapping of two sentences of different structural ‘length’, with the techniques of morphogrammatic bisimilarity. With such an interpretation, the former interpretation with the help of morphogrammatic bifunctoriality based on fusion gets a further and technically more unifying treatment.

Morphogrammatic interpretation based on “fusion” and concatenation.

[(ab), (c, d)] ∐ [(a, b), (c, d)] : ... <a> <b> <c> d

http://memristors.memristics.com/Memristics%20LISPs/Memristics%20LISPs.html

Bisimilarity approximation model: ... t1 and t2 as X' and X" as answers to t0.

3.  Types of iterability

3.1.  Identity

"For instance, if f = λx.x is the identity function, then we have f(x) = x for any x.
In particular, we can take x = f , and we get
f(f) = (λx.x)(f) = f .
Note that the equation f(f) = f never makes sense in ordinary mathematics, since it is not possible (for set-theoretic reasons) for a function to be included in its own domain.” (Selinger, p. 7)
http://www.mscs.dal.ca/~selinger/papers/lambdanotes.pdf

The trick to produce circularity is simple, well accepted and has lost any strangeness:
"In particular, we can take x = f, and we get f(f) = (λx.x)(f) = f.”

But it is guided by logical decisions and not by the means of the lambda calculus per se.  
What holds in general, holds in particular too. But all that belongs to innocent hierarchical systems of total governance.

From a quadralectic point of view such naivety is not worth following. Identity, especially in its disguise as beginnings, archai, is not unified by a simple umbrella of identity as it appears with the formula “f = λx.x” and its consequences.

3.2.  Terminal terms

"For example, the lambda term (λx.y)((λz .z z )(λw.w)) can be reduced as follows.
Here, we underline each redex just before reducing it:

(λx.y)((λz .z z )(λw.w) ) ->β (λx.y)((λw.w)(λw.w))
                                   ->β (λx.y)(λw.w)
                                   ->β y.

The last term, y, has no redexes and is thus in normal form. We could reduce the
same term differently, by choosing the redexes in a different order:

    (λx.y)((λz .z z )(λw.w)) ->β y.

"This example also shows that the size of a lambda term need not decrease during reduction it can increase,or remain the same.” (Selinger)

3.3.  Non-terminal terms

Not every term evaluates to something; some terms can be reduced forever without reaching a normal form. The following is an example:

(λx.xx)(λx.xx) :
(λx.xx)(λx.xx) -->β (λx.xx)(λx.xx) -->β (λx.xx)(λx.xx) -->β ...
In fact, most terms don't terminate.
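A minimal sketch (not from the source) of this term Ω in Scheme; it is wrapped in a thunk so the definition itself can be loaded, while calling it reproduces the infinite reduction above.

(define omega
  (lambda () ((lambda (x) (x x)) (lambda (x) (x x)))))   ; Ω = (λx.xx)(λx.xx)

;; (omega)   ; uncommenting this call loops forever, never reaching a normal form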

3.4.  Chiastic terms

Two infinite developments with a finite inter-relation between the two loops are called chiastic terms. Chiastic terms are a new kind of terminal, i.e. finite, interactions between non-terminal terms. Chiasms themselves might have non-terminal developments too.
Chiastic terms belong to a class of new term configurations and are motivated by polycontextural considerations.

(λx.xx)(λx.xx) -->βkeno (λx.xx)(λx.xx) -->βkeno ...

4.  From Y to WHY

4.1.  Transitions to morphogrammatic fixed-points

Identity:  F =ID F'

Y ≡ (λf. (λx.f(xx)) (λx. f(xx)))
YF ≡ (λf. (λx.f(xx)) (λx. f(xx)))F
=>  (λx. F'(xx)) (λx. F(xx))
=>  F'((λx. F'(xx)) (λx. F(xx)))
=>  F'(YF) = F(YF).

Equivalence:  F !=ID F’, F =keno F'

Y ≡ (λf. (λx.f(xx)) (λx. f(xx)))
YF ≡ (λf. (λx.f(xx)) (λx. f(xx)))F
=>  (λx. F'(xx)) (λx. F(xx))
=>  F'((λx. F'(xx)) (λx. F(xx)))
=>  F'(YF).

Some motivation
If the meaning of a term is defined by its use (Wittgenstein), then we should analyze the use of all crucial terms in the construction of the Fixed Point theorem.

Because this theorem is of crucial importance for formal systems in general and for questions of computability in particular, it is not exaggerated to test all the cases of the use of the main terms involved.

With Barendregt’s formula for the construction of the Fixed Point Theorem, the term “W” is used 6 times, and therefore we have to check all its possible ways of use. The comfortable excuse of using the distinction between a syntactic and a semantic use, or between an object- and meta-language use of the terms, in order to avoid further analysis isn’t of leading importance in this case.

There are certainly better arguments to motivate the decision for a morphogrammatic turn towards a new understanding of the use of signs. For the purpose of a simplified introduction to the techniques of morphogrammatics, the desire for justification should suffice.

From Y functor to WHY interaction
A similar constellation appeared in “Lambda calculi in polycontextural situations” with the modeling of a 3-contextural Y and WHY function.



Chiasm of operator and operand
"In YF, the term Y is an operator and F is an operand of the application YF. Because of the highly abstract definition of the Lambda Calculus it is possible to change the operand, step by step, to an operator.
Now, F is an operator to (YF) and also an operand to Y in (YF). This double-functionality of Y is saved in the mind of the reader, the difference is nullified by notional abstraction.
Polycontextural strategy tries, in contrast and additionally, to inscribe such a notational abstraction into a graphematic play. Because there is no trust in mental representations we have to write it down.” (polyLC, p. 63)

4.2.  Morphogrammatic Fixed Point scheme

4.2.1.  Barendregt’s  Fixed Point

The ingenious construction of the Fixed Point Theorem is given by Barendregt’s formula. It is a simplified definition of the Turing operator “by recognizing that we don’t need to delay the evaluation of the first expression”.
The following analysis of the Fixed Point Theorem does not consider the computational aspects of “eager” and “lazy” programming.

Proof of the theorem by Barendregt:

W ≡ λx. F(xx), X ≡ WW.
Then X ≡ WW ≡ (λx. F(xx))W ≡ F(WW) ≡ FX.

To analyze this “lazy” construction of the Y combinator we have to consider all occurrences of the main term “W”. Hence, we count 6 different uses of the term W. Therefore we give each use of the term W a number: Wi, with i ∈ {1, ..., 6}.

Thus, the full scheme of the Fixed Point theorem is now marked by the numbers of occurrence of the term “W” in the application.

W1 ≡ λx. F(xx), X ≡ W2W3.
Then X ≡ W2W3 ≡ (λx. F(xx))W4 ≡ F(W5W6) ≡ FX.

There is in fact also a difference of use between “define X ≡ WW”, W2W3, and the logical application “Then X ≡ WW”, W2W3. This difference is not specially marked in this discussion.

Fixed Point scheme for W

W1 ≡ λx. F(xx),  X ≡ W2W3,  (λx. F(xx))W4,  F(W5W6).

Eager Fixed Point Combinator (Selinger)

"The lambda calculus contrasts with arithmetic in that every lambda term has a fixpoint. This is perhaps the first surprising fact about the lambda calculus we learn in this course.”

"Theorem 3.1. In the untyped lambda calculus, every term F has a fixpoint.

Proof. Let A = λxy.y(xxy), and define Θ = AA. Now suppose F is any lambda term, and let     N = Θ F. We claim that N is a fixpoint of F. This is shown by the following calculation:

  N = Θ F
      = AAF
      = (λxy.y(xxy))AF
      -->β  F(AAF)      : (β) (λx.M)N  -->β  M[N/x]
      = F(ΘF)
      = FN.

The term Θ used in the proof is called Turing’s fixpoint combinator.” (Selinger)

Analysis
A = λxy.y(xxy), Θ = AA,
N = Θ F = AAF = (λxy.y(xxy))AF -->β  F(AAF) = F(ΘF) = FN.

A1 = λxy.y(xxy), Θ = A2A3,
N = Θ F = A4A5F = (λxy.y(xxy))A6F -->β  F(A7A8F) = F(ΘF) = FN.
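For comparison with the Scheme code of the next section, a minimal sketch (not in Selinger's notes) of Turing's fixpoint combinator Θ = AA in Scheme; the recursive call is eta-delayed so that eager evaluation does not diverge, and the names A-delayed, theta and fact-gen are illustrative assumptions.

(define A-delayed                                   ; A = λxy. y(xxy), eta-delayed
  (lambda (x)
    (lambda (y)
      (y (lambda (v) (((x x) y) v))))))

(define theta (A-delayed A-delayed))                ; Θ = AA

(define fact-gen                                    ; F such that F(fact) behaves like fact
  (lambda (f)
    (lambda (n)
      (if (zero? n) 1 (* n (f (- n 1)))))))

(display ((theta fact-gen) 5))                      ; => 120, since Θ F = F(Θ F)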

4.2.2.  Distributed Fixed points

Eager Fixed Point combinator in Scheme

(define Y
  (lambda (f)
    ((lambda (x) (f (lambda (y) ((x x) y))))
     (lambda (x) (f (lambda (y) ((x x) y))))))).

Recursive FACT with Y:
(define fact
  (Y (lambda (f)
        (lambda (n)
          (if (zero? n) 1 (* n (f (- n 1)))))))).
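A quick check (not part of the source) that the eager Y above yields the expected recursion:

(display (fact 5))    ; => 120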

Y^(3)-KENO

(define Y3                    ; Y^(3): three self-application clauses
  (lambda (f)
    ((lambda (x) (f (lambda (y) ((x x) y))))
     (lambda (x) (f (lambda (y) ((x x) y))))
     (lambda (x) (f (lambda (y) ((x x) y))))))).

Heterarchic distribution of Y^(3)
Y^(1.2.3) ≡ [λf^1. (λx. f^1(xx)) (λx. f^1(xx))
           ∐ λf^2. (λx. f^2(xx)) (λx. f^2(xx))
           ∐ λf^3. (λx. f^3(xx)) (λx. f^3(xx))]

Recursive 3-FACT with Y^(3)

λfact^(1.2.3). ((Y^1 (λ(f^1) λ(n^1) (if (zero? n^1) 1 (* n^1 (f^1 (- n^1 1))))))
             ∐ (Y^2 (λ(f^2) λ(n^2) (if (zero? n^2) 1 (* n^2 (f^2 (- n^2 1))))))
             ∐ (Y^3 (λ(f^3) λ(n^3) (if (zero? n^3) 1 (* n^3 (f^3 (- n^3 1))))))) .

4.3.  Morphogrammatics of Fixed Point Theorems

4.3.1.  Semiotic equality

Identity : [1,1,1,1,1,1]

The most obvious use of “W” is the use in the mode of identity:
Thus, for all i, j Wi, Wj: Wi ≡ Wj, with i,j∈{1, ...,6}.

For all i, j Wi, Wj: Wi ≡ Wj, i,j∈{1, ...,6}
=>
X ≡ W1W1 ≡ (λx. F(xx))W1 ≡ F(W1W1) ≡ FX.

That is the classical case of identity:
X ≡ WW ≡ (λx. F(xx))W ≡ F(WW) ≡ FX.
=> X ≡ FX.

Logical interpretation
X ≡ WW ≡ (λx. non(xx))W ≡ non(WW) ≡ nonX.
=> X ≡ nonX.
{X , nonX}∈Contradiction.

4.3.2.  Kenomic equivalence

The kenomic equivalence takes the Stirling turn. What counts in this case is no longer the identity of the signs under consideration but the structure of their distribution. Although the complexity of distribution is minimal in this application of the fixed point construction, there are nevertheless enough distinctions available to differentiate between an identity and a kenogrammatic use of signs.

Example1: [1,1,1; 4,4,4]

W1≡ W2 ≡ W3, W4 ≡ W5 ≡ W6

W1 ≡ λx. F(xx), X ≡ W1W1, (λx. F(xx))W4, F(W4W4):

W1 ≡ λx. F(xx)
X1 ≡ W1W1 ≅keno (λx. F(xx))W4 ≡ F(W4W4) ≅keno FX4

=> X1 ≅keno FX4.

Example 2: [1,1,1,4,1,1]

W1 ≡ W2 ≡ W3 ≡ W5 ≡ W6, W4 != W1

W1 ≡ λx. F(xx)
X1 ≡ W1W1 ≅keno (λx. F(xx))W4 ≅keno F(W1W1) ≅keno FX1

=> X1 ≅keno FX1.

Example 2-par: [1, 1, 1, 1||4, 1||4, 1||4]
W1 ≡ ...
=> X1 ≅keno (FX1 || FX4).


Example3: [1,1,1; 4; 5,5]

W1≡ W2 ≡ W3, W4, W5 ≡ W6

W1 ≡ λx. F(xx)
X1 ≡ W1W1 ≅keno (λx. F(xx))W4 ≅keno F(W5W5) ≅keno FX5

=> X1 ≅keno FX5.

Logical interpretation
a)
X1 ≡ W1W1 ≅keno (λx. non(xx))W1 ≅keno non(W1W1) ≅keno nonX1

=> X1 ≅keno non1X1 => {X1, non1X1} ∈ Noncontradiction.

b) X1 ≡ W1W1 ≅keno (λx. non(xx))W4 ≅keno non(W5W5) ≅keno nonX5

=> X1 ≅keno non5X5.

=> {X1, non5X5}∈Incompatible.

4.3.3.  Morphic similarity

While the kenogrammatic use in the case of equivalence still accepts the symmetry of substitutions, the case of morphic similarity abandons this restriction too. Hence, the necessity to replace the same term in a function like “f(xx)" is liberated into the possibility of replacing the repetition of “x” in “f(xx)" with different occurrences of the term “W”.

Example4: [1,1,1; 4; 5; 6]

W1≡ W2≡ W3, W4, W5, W6

W1 ≡ λx. F(xx)
X1 ≡ W1W1 ≅keno (λx. F(xx))W4 ≅sim F(W5W6) ≅sim FX5.6

=> X1 ≅sim FX5.6.

Example5: [1,2,3; 4; 2; 3]

W1, W2, W3, W4, W2, W3

W1 ≡ λx. F(xx)
X1 ≡ W2W3 ≅sim (λx. F(xx))W4 ≅sim F(W2W3) ≅sim FX2.3.

=> X1 ≅sim FX2.3.

Logical interpretation
X1 ≡ W2W3 ≅sim (λx. non(xx))W4 ≅sim non(W2W3) ≅sim nonX2.3.

=> {X1, non(X2|X3)}∈Incompatible.

4.3.4.  Morphic bisimilarity

Morphic bisimilarity abandons the security of the process of substitution that was established, for the identity, the equivalence and the similarity option, by the presumption of the same “length” of the substituted terms.
Hence, the situation “W1 ≡ λx. F(xx)" might be replaced by the bisimilar term “λx. F(xxx)", delivering not WW but WWW, with the abstraction W1W1 ≅bis W5W6W7.

Example6: [1,1,1; 4; 5; 6; 7]

W1≡ W2 ≡ W3, W4, W5, W6, W7

W1 ≡ λx. F(xx)
X1 ≡ W1W1 ≅bis (λx. F(xxx))W4 ≅sim F(W5W6W7) ≅sim FX5.6.7

=> X1 ≅bis FX5.6.7.

Example7: [1,2,3,3’; 4; 4; 5]

W1, W2, W3,W3,  W4, W2, W3

W1 ≡ λx. F(xxx)
X1 ≡ W2W3W3' ≅bis (λx. F(xx))W5 ≅sim F(W6W7) ≅sim FX6.7.
=> X1 ≅bis FX6.7.