Rudolf Kaehr, Dr. phil.
Copyright ThinkArt Lab ISSN 2041-4358
Fragment (Nov. 2011)
The concept, methods and strategies of blending conceptual spaces are well framed by the general framework of institutions as elaborated by Joseph Goguen and Rod Burstall at the end of the 1970s.
"Institutions arose at the end of the 1970s in response to the enormous expansion of logical systems used in computer science, and the need for a uniform way to structure theories in them. Institutions are managing diversity and blending of different conceptual spaces, domains and formalisms.” (Goguen, )
Dissemination is a polycontextural strategy to distribute and mediate diversity, especially the diversity of multiple institutions. The idea of multiple institutions runs contrary to the claim of the theory of institutions to be unique and universal. Institutions are to a high degree independent of specifications like logics, ontologies, semiotics, and more. But with this exceptional generality they claim to frame any possible reasoning and modeling as such.
Dissemination takes the risk of disseminating even this general constitution of rationality by an act of subversion and dislocation of the hierarchical order as such between institutions and heterogeneous logical spaces.
Institutions are seen as localized, having neighbor institutions of different institutionality, and being in interactional, reflectional and interventional interactions and cooperations.
The papers “Catching Transjunctions” and “The Tale of Transjunctions” give a highly condensed report on the history of the development of the concept and the mechanism of transjunctions.
This paper, Dissemination, offers some insight into the mechanism of dissemination as distribution and mediation by connecting it to the more traditional concept of Blending.
"Heterogeneous speciﬁcation becomes more and more important because complex systems are often speciﬁed using multiple viewpoints, involving multiple formalisms. Moreover, a formal software development process may lead to a change of formalism during the development. However, current research in integrated formal methods only deals with ad-hoc integrations of different formalisms.”
Dissemination in the sense of polycontexturality does not depend on heterogeneous logical spaces and their different logics. Dissemination might just as well distribute homogeneous logical spaces ruled by the same logical system. The difference is this: such logics are disseminated and take a place in a kenomic matrix that defines the kenomic difference of the same logics. Hence, differentness here is determined not by different properties of logical or institutional systems alone but by the difference of their localization.
Hence, blending in a polycontextural context might blend not only heterogeneity but also homogeneity, i.e. the structurally same but kenomically different logics (entailment systems) and institutions. Then the creative aspect of blending gets a new distinction.
A dissemination of a Brownian and a classical 1-D CA, reflected by a classical 1-D CA, is a simple example of how different types of calculi and automata might interact in a complexion.
Mediation of 3 distributed automata functions:
f1 = classical 1-D CA,
f2 = Brownian 1-D CA,
f3 = classical 1-D CA
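The mediation of the three automata can be sketched as a toy computation. Two details are not specified in the text and are rendered here as assumptions: the “Brownian” character of f2 is modeled as asynchronous, randomly ordered updating, and “mediation” as an exchange of boundary cells between neighboring automata. All names are illustrative.

```python
import random

def ca_step(cells, rule=110):
    """One synchronous step of a classical elementary 1-D CA (wrap-around)."""
    n = len(cells)
    table = [(rule >> i) & 1 for i in range(8)]
    return [table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)]

def brownian_step(cells, rule=110, rng=random):
    """Asynchronous step: cells fire one at a time in random order,
    a simple stand-in for the 'Brownian' randomness of f2."""
    cells = list(cells)
    n = len(cells)
    table = [(rule >> i) & 1 for i in range(8)]
    for i in rng.sample(range(n), n):
        cells[i] = table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
    return cells

def mediate(f1, f2, f3):
    """Mediation, modeled here as the exchange of boundary cells."""
    f1[-1], f2[0] = f2[0], f1[-1]
    f2[-1], f3[0] = f3[0], f2[-1]
    return f1, f2, f3

def run(width=16, steps=8, seed=0):
    rng = random.Random(seed)
    f1 = [rng.randint(0, 1) for _ in range(width)]  # classical 1-D CA
    f2 = [rng.randint(0, 1) for _ in range(width)]  # Brownian 1-D CA
    f3 = [rng.randint(0, 1) for _ in range(width)]  # classical 1-D CA
    for _ in range(steps):
        f1, f2, f3 = ca_step(f1), brownian_step(f2, rng=rng), ca_step(f3)
        f1, f2, f3 = mediate(f1, f2, f3)
    return f1, f2, f3
```

The sketch only shows the shape of the complexion: three automata of different type, each running its own rule, coupled by a mediating exchange after each step.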
"Concept expansions are thus a structured type of cognitive blend, comparable to the idea of closure of Lakoff and Núez (2000, p. 21), and Buzaglo sees them as essential to scientific advances, so even a system of logic, if it is to reflect human cognition, must have scope for blends.” (Alexander, Mathematical Blending, 2008, p. 3)
With the application of cellular automata constructs to morphogrammatics, a new chapter for the understanding of transjunctions is opened.
Just as an understanding of Chinese characters is given not by supposing a meaning of the characters, hinting at semantic and ontological domains, but by their action-related, evocative and pragmatically demanding use, so neither a semantic nor a syntactic understanding of morphograms catches their nature. Morphograms, in the sense of Gotthard Gunther, have been well understood as the results of a new abstraction from logical, i.e. semantic and syntactic, features. This new kind of abstractness from logical operators, also conceived as the “inscription of the operativity of (logical) operators” or the “inscription of the process of semiosis of signs”, has opened up interesting philosophical interpretations towards a new understanding of Plato’s concept of ideas as “Number and Logos” (Gunther, Oehler). Numbers in this new Platonic sense had been conceived by Gunther from their numeric character as ordinal, cardinal and mixed numbers, but their operational, dynamic and computational aspects did not get a specific thematization.
Morphogrammatics was defined as transformation theory of morphograms (Umformungstheorie) ruled by operators like the reflector R, composition by mediators, and more.
The idea, introduced quite early, to interpret the kenogrammatic successor operation as a retro-grade operation, inscribing retro-grade recursivity, was an important step forward to an operative understanding of morphograms, but it did not produce enough technicality to fascinate people from the more traditional mathematical and program-theoretical disciplines.
Kenomic cellular automata, and all their further developments, have left the strictly theoretical level of the formalization of morphogrammatics towards a more applicative, or applicable, device.
Now, with the automata-theoretical turn in the understanding of morphograms, morphograms are not just abstract characters of a transformational grammar, say objects of manipulations, but themselves rules of calculation.
A morphogram is a morphé and a rule (dynamis) at once.
Morphogrammatics, therefore, is a “transformational” grammar of “cellular automata” of morphic “objects” (CAs) and dynamic, “self-modifying”, rules of CAs.
What does it mean? In purely technical terms of combinatory logic it means the sameness and unity of the concepts of operator and operand. Hence it marks a domain (contexture) that, defined by strictly logical means, appears as contradictory, as in logical antinomies (paradoxes). In contrast to the singular and unstructured sameness of the combinatory-logic combinators S, K, I, which are able to function as operators and operands, as in I(K), morphogrammatic operators/operands are highly structured and able to evolve/emanate their immanent structuration.
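The combinatory-logic side of this remark can be made concrete in a few lines: every combinator is both operator and operand. A minimal sketch in Python (the encoding as curried lambdas is illustrative):

```python
# S, K, I as curried functions; each value is callable (operator)
# and can itself be passed as an argument (operand).
I = lambda x: x
K = lambda x: lambda y: x
S = lambda f: lambda g: lambda x: f(x)(g(x))

# I(K): the combinator K used as an operand; the result is again an operator.
assert I(K) is K
assert I(K)(1)(2) == 1

# S K K behaves like I: identity derived from S and K, not primitive.
assert S(K)(K)(42) == 42
```

Note that the sameness of operator and operand here is, as the text says, singular and unstructured: S, K and I carry no internal structuration of their own.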
Although this step from a logical and mathematical domain, with its fundamental distinction of “operator and operand”, towards a realm beyond such distinctions is of great importance, the question of what the new features of such a radical manoeuvre are has to be answered.
What doesn’t exist within and between the systems of morphogrammatics is the feature and apparatus of interactionality as it is achieved by “transjunctional operators” (Gunther, 1962).
Understanding morphograms as operative rules and taking the new feature of transjunctionality into account, the achieved morphogrammatically based cellular automata have to incorporate transjunctional properties.
The new feature of disseminated morphogrammatic cellular automata is transjunctionality.
Transjunctionality is the general mechanism by which morphic events of one contexture have, at once, neighbor events in other contextures of distributed cellular automata.
Hence, transjunctionality covers the mechanisms of interactionality, reflectionality and interventionality between disseminated cellular automata.
These are totally new features, in no way covered by the well-recognized immanent parallel processes of CAs.
Transjunctionality is not just parallelism but the interaction between distributed contextures of CAs and logics.
Hence, transjunctionality happens between CAs of distributed CAs in compounds of kenomic CAs.
A deeper understanding of Gunther’s concept of transjunction has been achieved by a step-wise decomposition (de-sedimentation) of the covered and hidden layers of the simple formalization of the concept of transjunction by a function-theoretic approach.
Now that the decomposition and de-sedimentation have reached some saturation, the inverse procedure of composing the parts back together has to be considered.
An interesting mechanism has been developed by Joseph Goguen for a computational and algebraic understanding of semantic processes and linguistic-aesthetic figures like polysemy, oxymoron and metaphor. Goguen's method is proposed as the process of blending concepts, i.e. as conceptual integration. Blending, or conceptual integration (CI), might be a more technical term for the old, ambiguous and probably overused term “mediation”.
Before we can blend things together we have to separate them.
Before we can separate things we have to blend them.
In the case of transjunctions, the ingredients were not lying freely on the table. Only with some deeper insight into the philosophical concept of transjunction as introduced and propagated by the philosopher and theoretical cybernetician Gotthard Gunther was it possible to separate its ingredients and to get access to its properties. This separation happened within an interplay between the notional attempt and the rudimentary formalization of transjunction. The process of separation was in fact a “de-sedimentation” of different overlapping layers of the concept and its step-wise formalization.
The narrative of this step-wise de-sedimentation and deconstruction of the concept “transjunction” is told, albeit as a tour de force, in “The Tale of Transjunction” and “Catching Transjunctions”.
A first reassembly of the deconstructed parts was supported by the polycontextural interpretation of the category-theoretic concept of “bifunctoriality”. Bifunctoriality makes it possible to handle the interplay of “parallel”, i.e. polycontextural, and “serial”, i.e. intra-contextural, constructions.
This highly abstract bifunctorial approach is naturally concretized in the framework of “blending” as elaborated by the computer scientist Joseph Goguen.
In contrast to existing models of conceptual integration (blending) proposed by Goguen, Guhe, Smaill and others, the polycontextural and morphogrammatic approach is not based on ontologies that are related to logical semantics.
What is the functional meaning of elementary kenomic rules with more than 2 elements, such as rule R5 and rules R10 to R15? Following the experiences made and constructions developed for morphogrammatics and polycontextural logics, and the sketches for contextural programming languages, the decision to implement kenomic CA rules with more than one element as “transjunctional” kenomic CA rules follows quite naturally.
Transitions inside CAs describe a path. Bifunctorial paths in kenomic CAs are understood as journeys between contextures, i.e. between contexturally distributed CAs, and not just as successions of paths.
Blending of concepts
"Conceptual blending is a central, powerful and productive aspect of human cognition, allowing, for example, to conceptualise time in terms of space. However, cognitive modelling has not yet seriously addressed this issue. We outlined in broad terms a way to transfer Goguen’s notion of conceptual blending into the cognitive architecture ACT-R as a first step to include conceptual blending in cognitive models of scientific creativity, in particular mathematical thinking.” (Guhe)
Blending of logics
Blending has not yet been directly addressed in formal logic as a genuine topic for logic as such. Neither a logic of blending nor a blending of logics is recognized as a topic of logic. There might be logical and model-theoretic applications of the concept of blending within the context of logical theories, but not as a self-application to the primary axiomatics of logics as such.
“Mathematicians talk of ‘proofs’ as real things. But the only things that can actually happen in the real world are proof-events, or provings, which are actual experiences, each occurring at a particular time and place, and involving particular people, who have particular skills as members of an appropriate mathematical community.” (Goguen, 2001).
Creativity and blending
The process of elaborating the details of blending might lead to emergent properties and other surprises.
"In contrast to other combination techniques that aim at integrating or assimilating categories and relations of thematically closely related ontologies, blending aims at ‘creatively’ generating new categories and ontological definitions on the basis of input ontologies whose domains are thematically distinct but whose specifications share structural or logical properties. As a result, ontological blending can generate new ontologies and concepts and allows a more flexible technique for ontology combination than existing methods.” (Hois et al)
Although structural blending is understood as a new formal technique to analyze logical proof systems, it is not yet applied to the very definition of the structure and architecture of logic.
What is blending anyway?
"Blending is an inference method operating on spaces. There may be more than two input space for a blend operation. Generic space contains the common input elements of the input spaces as well as the general rules and templates for the inputs. The elements of generic space can be mapped onto input spaces. Blend space is the place where the emergent structure occurs. The projected elements from each input space and generic space create an emergent structure in the blend space, possibly something not in the input space. The structure in the blend space may be an input for another blend operation as controlled by an Integration network. A new emergent structure may contain not only elements from the input spaces and generic space but also new emergent elements that do not exist in either space .
Blending involves three operations:
Composition involves relating an element of one input space to another. These relations are called “vital relations”. This matching generally occurs under a "frame".
Completion is pattern completion in which generic space is involved in the blending operation. If the elements from both input spaces match the information stored in the generic space, a more sophisticated type of inference can be made, a generalization of reasoning by analogy. This is the place where we use long-term memory and increase our experiences.
Elaboration is an operation that creates an emergent structure in the blend space after composition and completion. It is also called running the blend.” (Baris E. Ozkan, Autonomous Agent-based Simulation of a Model Simulating the Human Air-Threat Assessment Process, 2004)
A possibility to fill this gap is given with an intended application of the mechanism of structural blending to the morphogrammatic and logical systems necessary to define the new concept of transjunctions and transjunctional operations.
Blending (blend, mélange, mixture, integration) brings “approaching” and “coming” together.
This strategy is well introduced by Goguen’s “Semiotic Morphisms” approach to complex and polysemic systems. Blends are structured by categorical morphisms and composition with the property of commutativity.
Blending is conceptualized on the basis of a single external observer, for whom blending offers a kind of holistic unification of both tendencies, suggesting something new: a blend of both, here with “intra-contextural” and “trans-contextural” mélanged together. But to add emergent features to the blending process, some additional ingredients have to be added.
One reason why blending is not an immanent feature of logic, semiotics and computation might be the fact that it happens only on the level of semantics. The terms “houseboat” and “boathouse”, for example, are syntactically produced not by blending but by concatenation. The meaning of the composites is achieved by blending, and there is no simple concatenation of the meanings “boat” and “house” into “houseboat” and “boathouse” and, super-additively, “amphibious vehicle”.
"A classic example for this is the blending of the concepts house and boat, yielding as most straightforward blends the concepts of a houseboat and a boathouse, but also an amphibious vehicle.” (Kutz, Hois)
In contrast, Chinese writing is fundamentally based on blending, “semantically” and “syntactically” at once.
"In contrast to other combination techniques that aim at integrating or assimilating categories and relations of thematically closely related ontologies, blending aims at `creatively' generating new categories and ontological de nitions on the basis of input ontologies whose domains are thematically distinct but whose speci cations share structural or logical properties. As a result, ontological blending can generate new ontologies and concepts and it allows a more exible technique for ontology combination than existing methods.” (Kutz, Hois, p. 1)
"Blending two conceptual spaces yields a new space that combines parts of the given spaces, and may also have emergent structure.” (Goguen)
"We will refer to any diagram having the shape of the one below as a diamond diagram, and in such a diagram, refer to the composition of the two morphisms on its left as its left morphism, to the composition of the two morphism on its right as its right morphism, to the middle upward morphism as its center morphism, to the triangle on its left as its left triangle, and to the triangle on its right as its right triangle. The extent to which these two triangles commute will be an important part of our analysis.” (Goguen)
Analogy to diamond strategies
It seems to be quite obvious that the strategy of blending has an analogy in the diamond strategies of the concepts “position”, with, say, Input I1,
“opposition”, with, say, Input I2,
“neither-nor”, with, say, space G, and
“both at once”, with, say, blend B.
Introducing a blendoid
Generic space G : Dissemination of contextures (polycontexturality)
Input I1 : Junctional parts (space)
Input I2 : Transjunctional parts (space)
Blend B : Transjunction as a unity of I1 and I2.
"Blending two conceptual spaces yields a new space that combines parts of the given spaces, and may also have emergent structure.” (Goguen)
This version of the blending construction is important for the polycontextural bifunctorial modeling where the identity of the blendoid and the base system have to be constructed and are not obvious. Hence, the construction or realization of a morphism or isomorphism for equality, equivalence or similarity between the contexturally different blendoids and base ontologies has to be established. For the sake of simplicity, this intriguing manoeuvre is omitted in favor of a more direct presentation of the construction of polycontextural blending.
Mediation is defined by the proemial relationship which has a category-theoretic explication as a composition of morphisms of different contextural domains.
Concatenation of logical functions is an internal operation and assumes a homogeneous space of combination. The rules of composition, in respect of syntax and semantics, are well defined.
Combining logics is another mechanism to study complex logical systems. An overview is given by the Stanford article.
For an early (1988) connection between fibered and polycontextural logics, check Jochen Pfalzgraf as a forerunner of the Combining Logics project.
Jochen Pfalzgraf: Logical fiberings and polycontextural systems. In: Proceedings of FAIR'1991, pp. 170–184.
Category-theoretic bifunctoriality and its polycontextural extensions brings all the aspects of merging, blending, combining and mediation together into a working formalism that is open for further modeling and operative applications. Bifunctoriality might be applied to the heterogeneity of different logics and ontologies framed in a common institution. But additionally to this intra-contextural modeling, an emphasis on a polycontextural understanding of the strength of differentness of heterogeneity of logics and ontologies might be applied too.
Polycontexturality is disseminating, i.e. distributing and mediating, institutions as such and not just intra-institutional aspects of heterogeneity.
The sketched formulas for the bifunctoriality of the logical functions conjunction, disjunction and transjunction are set in a polycontextural framework with its composition (o), mediation (∐) and transposition (◊) functions.
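The bifunctoriality at stake is the category-theoretic interchange law: composing serially inside each contexture and then running the contextures in parallel gives the same result as pairing first and composing afterwards. A minimal sketch with plain Python functions, where pairing stands in for mediation (all names are illustrative):

```python
def compose(f, g):
    """Serial, intra-contextural composition: first g, then f."""
    return lambda x: f(g(x))

def parallel(f, h):
    """Parallel, polycontextural pairing: f acts on the first component,
    h on the second."""
    return lambda p: (f(p[0]), h(p[1]))

f, g = (lambda x: x + 1), (lambda x: 2 * x)   # contexture 1
h, k = (lambda x: x - 3), (lambda x: x * x)   # contexture 2

# Interchange law: (f o g) || (h o k)  ==  (f || h) o (g || k)
lhs = parallel(compose(f, g), compose(h, k))
rhs = compose(parallel(f, h), parallel(g, k))
assert lhs((5, 4)) == rhs((5, 4)) == (11, 13)
```

Transposition between contextures goes beyond this intra-pairing law; the sketch covers only the composition/mediation interplay.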
What do we learn from the comparison of the different combining methods?
Blending as a method of merging conceptual spaces gives a workable framework for combining logical spaces (ontologies, contextures, conceptual domains) but does not yet define the formal and operational aspects of that process of blending.
The elaboration of the concept of mediation as a proemial relation led to the idea of indexing logical systems for distribution and applying the concept of matching conditions, known for the composition of morphisms, for the combination of the distributed logics.
The elaboration of the concepts of indexing, distribution and matching inspired the advent of fibered logics (J. Pfalzgraf, Dov Gabbay) as an emergent property of blending logical systems, and not just conceptual and ontological domains.
Further elaborations of the process of mediation enabled the emergence of logical architectures of interactional, i.e. transjunctional, and reflectional topics on a logic-architectural level that demands the introduction of a kenomic matrix as the grid of the distributed systems.
A strategy of subversion between institutions and domains (models, spaces) becomes accessible with the application of the proemial relation, which has the structure of a chiasm between the (mono-)institution and multiple domains. Multiple domains, becoming institutions by ‘elaboration’, get blended as emergent institutions into the concept of a poly-institution, i.e. a distribution and mediation of the multiple institutions derived by ‘subversion’ (creativity) from the multiple domains.
A more formal approach to a modeling of transjunctional logical systems might be achieved with a dissemination of formal entailment systems, institutions and logics in the sense of Joseph Goguen’s general framework for programming languages.
The example for transjunctional and for replicational dissemination shows clearly the scheme of the interactivity of formal systems as a structural base for any programming language, and for the programming of poly-layered memristive systems.
"Coming to proofs, a logic extends an institution with proof-theoretic entailment relations that are compatible with semantic entailment.” (habil, p. 26)
All that gives a general dissemination scheme only. What has to be elaborated are the corresponding matching and mediating conditions for the constituents of the disseminated general entailment systems E, institutions I and logics L.
Dissemination has a vague connection to the parametrization (Goguen, Burstall) and fibering (Gabbay, Pfalzgraf) of theories. This concretization might be realized step-wise for entailment systems, institutions and logics. The mediation conditions for concrete logics then deliver the concrete matching conditions for the whole construction, i.e. for entailment (provability) systems, institutions (models) and polycontextural logics. Soundness is defined between an entailment system and an institution for each distribution. A kind of harmony is defined between disseminated sound systems.
What is the environment, i.e. "Umwelt”, of a cellular automaton? Is there any chance to define the "Umwelt" (environment) of CAs?
Cellular automata are well accepted as potent tools to model complex natural, biological and sociological systems. It is equally well accepted that complex real-world systems, especially living systems, are defined by the difference of system and environment.
Therefore, it is highly surprising to learn that the mathematical theory and modeling tools of cellular automata offer nothing like a possibility to define environments of cellular automata.
Cellular automata are defined by the topology of their states and by mapping functions defining their rules.
Each neighbor of a state in the topology is a state of the topology. Each application of a rule to states produces again states of the topology. Hence CAs, in this sense, are functionally closed systems.
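This closedness is visible in the very type of the transition function: it takes a global state and returns a global state, with no parameter for an environment. A minimal elementary-CA sketch:

```python
def step(state, rule=110):
    """One synchronous update of a 1-D binary CA with wrap-around.
    Note the signature: state -> state, with no environment input."""
    n = len(state)
    table = [(rule >> i) & 1 for i in range(8)]
    return tuple(table[(state[(i - 1) % n] << 2) | (state[i] << 1) | state[(i + 1) % n]]
                 for i in range(n))

state = (0, 0, 0, 1, 0, 0, 0)
for _ in range(5):
    state = step(state)          # the only input is the previous state
    assert set(state) <= {0, 1}  # every produced cell is again a state value
```

An “Umwelt” would require the transition function to take a second, external argument; nothing in the standard definition provides a slot for it.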
Gunther Teubner - The private/public dichotomy: After the critique?
"My argument starts with the obvious observation that the public/private distinction is an oversimplified account of contemporary society. More controversially, my argument continues that any idea of a fusion of the public and private spheres is equally inadequate. As an alternative conceptualisation, I propose that the public/private divide should be replaced by polycontexturality. The claim is this: Contemporary social practices can no longer be analysed by a single binary distinction; the fragmentation of society into a multitude of social sectors requires a multitude of perspectives of self-description.”
The second, as exemplified by Teubner, rejects fusion, arguing for the replacement of the distinction with a concept capturing the multi-dimensional complexity of law in multiple social contexts: `polycontexturality'.
Anna Grear (2003). Theorising the Rainbow? The Puzzle of the Public-Private Divide. Res Publica 9 (2).
Triarchy refers to the three fundamental ways of getting things done in organizations: hierarchy, heterarchy and responsible autonomy. (Wikipedia)
Blending as generalization
"As for generalizations, the most powerful ones are those which transcend specific cognitive domains. In our work on conceptual blending, we see as a strong generalization the discovery that the same principles apply to framing, metaphor, action and design, and grammatical constructions. This is not an internal generalization about language, it is an external one relating linguistic phenomena to non-linguistic ones. Such generalizations seem primordial to the understanding of how language relates to general cognition, but they are precluded in principle by the autonomous approach evoked above. It is no surprise, then, if that approach finds no connection between language and the rest of cognition, for that autonomy is built into the very method that serves to build up the field of inquiry and the theories that are its by-products.” Gilles Fauconnier, Introduction to Methods and Generalizations
Blending as mediation
Mediation of cognitive domains is not a case of generalizing or ‘gluing’ cognitive domains.