Nonclassical computation: formal models, mathematical structures and logical foundations (Collected papers)
NO FULL TEXT AVAILABLE. This thesis contains third-party copyright material.

This is a collection of the author’s representative papers, selected from his more than 120 publications. The collection consists of 17 papers: two of them are invited book chapters, and the others have been published in refereed international journals. All of these papers are devoted to the thorough study of a central theme, namely, the formal models, mathematical structures and logical foundations of nonclassical computation. The modes of nonclassical computation dealt with in this collection are: (1) quantum computation, (2) probabilistic computation, (3) concurrent computation, and (4) intelligent computation. The author has been conducting systematic research on these computational modes and their relationships for more than 25 years, and his discoveries have added to the existing knowledge base of the research area. The author’s main contributions are summarized as follows:

• Quantum Computation: As a logical foundation of quantum computation, the author proposed a theory of automata based on Birkhoff-von Neumann quantum logic, and discovered some essential differences between classical automata theory and the theory of automata based on quantum logic. He designed an algebraic language for formally specifying quantum circuits in distributed quantum computing. To provide formal tools for rigorously reasoning about concurrent quantum computing and quantum communication protocols, the author introduced qCCS, an algebra of quantum processes in which communication by physically moving quantum states is allowed and computations are modeled by super-operators. He also developed a quite complete predicate transformer semantics of quantum programs based on Birkhoff-von Neumann quantum logic, modeling quantum predicates as projection operators.
• Probabilistic Computation: The author introduced the novel notion of a probabilistic refinement index, which can be used to describe more flexible refinement relations between probabilistic programs. He established normal form theorems for monotone, conjunctive, disjunctive and continuous probabilistic predicate transformers. He also gave a game semantics of probabilistic programs and proved a winning strategy theorem in probabilistic game semantics. To overcome the difficulty of randomizing asynchronous parallel composition, the author proposed an additive model of probabilistic processes. He also proposed a pi-calculus with noisy channels, which provides a uniform framework for reasoning about both the correctness properties and the reliability of mobile systems.

• Concurrent Computation: The author’s principal contribution in this area is a systematic theory of topological structures in process algebras, in particular in CCS, describing the approximate correctness and infinite evolution of concurrent programs. His other contributions are: (1) a much shorter proof of the uniqueness theorem for recursive equations in CCS; (2) a proof of the uniqueness theorem for recursive equations in a higher-order process calculus; and (3) the notion of weak confluence in labeled transition systems. This notion has been used by other researchers in designing algorithms for state-space generation and for model checking transition systems.

• Intelligent Computation: Reasoning is a core technology in intelligent systems. The author proposed a first-order logic with approximate proofs, which can serve as a logical foundation of approximate reasoning. He introduced the notion of a diagnostic specification morphism and some operations on diagnostic specifications; these can be used to model knowledge transformation and fusion, respectively, in diagnostic reasoning.
In high-level knowledge representation and reasoning, quantifiers make it possible to summarize the properties of a class of objects without enumerating them. The author presented a new framework for modeling quantifiers in natural languages in which the truth value of a quantified proposition is evaluated using Sugeno’s integral. Another of the author’s contributions to this field is the establishment of a formal model of computing with words, a key component of intelligent computation.
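To make the idea of "quantum predicates as projection operators" concrete, here is a minimal single-qubit sketch. It is only an illustration of the general notion from Birkhoff-von Neumann quantum logic, not the author's semantics; all names and the 2x2 example are chosen for exposition.

```python
from math import sqrt

# In Birkhoff-von Neumann quantum logic, a predicate is a closed subspace of
# the state space, represented by its projection operator P (P = P* = P^2).
# A pure state |psi> satisfies P exactly when P|psi> = |psi>; for a mixed
# state rho, the degree to which rho satisfies P is tr(P rho).
# Toy 2x2 (single-qubit) case; matrices are lists of rows of real numbers.

P0 = [[1.0, 0.0], [0.0, 0.0]]  # projector onto span{|0>}: the predicate "is |0>"

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def satisfies(P, psi, tol=1e-12):
    """True iff the pure state psi lies in the subspace projected onto by P."""
    Pp = matvec(P, psi)
    return all(abs(Pp[i] - psi[i]) < tol for i in range(2))

def degree(P, rho):
    """tr(P rho): probability that measuring predicate P on rho yields 'true'."""
    PR = matmul(P, rho)
    return PR[0][0] + PR[1][1]

ket0 = [1.0, 0.0]                              # |0> satisfies P0
ket_plus = [1 / sqrt(2), 1 / sqrt(2)]          # |+> does not
rho_plus = [[a * b for b in ket_plus] for a in ket_plus]
# degree(P0, rho_plus) = 0.5: |+> satisfies "is |0>" with probability 1/2.
```

The key contrast with classical logic is visible even here: |+> neither satisfies P0 nor its orthocomplement, so quantum predicates do not obey the classical law of excluded middle.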
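The idea behind probabilistic predicate transformers can also be sketched in a few lines: predicates generalize to expectations (functions from states to [0,1]), and a probabilistic choice transforms a post-expectation by the weighted average of its branches. The combinator names below are illustrative, not the author's notation.

```python
# Toy expectation-transformer (probabilistic weakest-precondition) semantics.
# A "predicate" is an expectation: a function from program states to [0,1].
# A command denotes a transformer mapping post-expectations to pre-expectations.

def wp_assign(var, expr):
    """Transformer for 'var := expr(state)': substitute into the post-expectation."""
    def transformer(post):
        def pre(state):
            new_state = dict(state)
            new_state[var] = expr(state)
            return post(new_state)
        return pre
    return transformer

def wp_choice(p, wp1, wp2):
    """Transformer for probabilistic choice c1 [p] c2:
    the p-weighted combination of the branch transformers."""
    def transformer(post):
        pre1, pre2 = wp1(post), wp2(post)
        return lambda state: p * pre1(state) + (1 - p) * pre2(state)
    return transformer

# Program: x := 1 [0.7] x := 0, with post-expectation [x = 1].
prog = wp_choice(0.7, wp_assign('x', lambda s: 1), wp_assign('x', lambda s: 0))
post = lambda s: 1.0 if s['x'] == 1 else 0.0
pre = prog(post)
# pre(state) = 0.7 for any initial state: the probability of establishing x = 1.
```

In this style, refinement between probabilistic programs compares pre-expectations pointwise, which is where a graded notion such as a refinement index becomes natural.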
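Evaluating a quantified proposition with Sugeno's integral can likewise be made concrete. Assuming a finite universe, a membership function h giving each object's degree of satisfying the predicate, and a fuzzy measure mu modeling the quantifier (all names and the sample data here are hypothetical, not taken from the author's framework), the Sugeno integral is the maximum over thresholds alpha of min(alpha, mu({x : h(x) >= alpha})).

```python
def sugeno_integral(h, mu):
    """Sugeno integral of a membership function h (dict: object -> [0,1])
    with respect to a fuzzy measure mu (callable on frozensets of objects)."""
    value = 0.0
    for alpha in sorted(set(h.values())):
        level_set = frozenset(x for x, v in h.items() if v >= alpha)
        value = max(value, min(alpha, mu(level_set)))
    return value

# Toy reading of "most people here are tall": degrees of tallness per person,
# and a crude proportional measure standing in for the quantifier "most".
heights = {'ann': 0.9, 'bob': 0.7, 'cal': 0.3}
proportion = lambda s: len(s) / len(heights)
truth = sugeno_integral(heights, proportion)  # evaluates to 2/3
```

A richer quantifier would replace `proportion` with a nonlinear fuzzy measure; the evaluation scheme itself is unchanged.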