Languages & Linguistics

Formal Language

Formal language refers to a precise, structured system of communication with well-defined rules and syntax. It is often used in mathematics, computer science, and linguistics for expressing ideas and algorithms in a clear and unambiguous manner. Formal languages are characterized by their rigor and lack of ambiguity, making them suitable for use in technical and scientific contexts.

Written by Perlego with AI-assistance

6 Key excerpts on "Formal Language"

Index pages curate the most relevant extracts from our library of academic textbooks. They’ve been created using an in-house natural language model (NLM), each adding context and meaning to key research topics.
  • Psychoanalytic Theory and Clinical Relevance

    What Makes a Theory Consequential for Practice?

    • Louis S. Berger (Author)
    • 2019 (Publication Date)
    • Routledge (Publisher)
    For science, the interesting internal properties of language are “the formal devices of language … studied independently of their use” (Chomsky, 1972, p. 198). This is particularly true of “formal” languages. According to Weizenbaum (1976),
    a Formal Language is a game. That is not a mere metaphor but a statement asserting a formal correspondence. But if that statement is true, we should, when talking about a language, be able to easily move back and forth between a game-like vocabulary and a corresponding language-like vocabulary. Precisely that can be done [p. 50].
    Weizenbaum goes on to show that in the way he uses the term a “game” is logically equivalent to, isomorphic with, a Turing machine and thus with a state space representational system (pp. 39–62). It displays all the necessary and sufficient characteristics. Language is seen as composed of “units”—its alphabet, “what things there are in language” (Chao, 1968, p. 57), the elementary particles or building blocks; these are usually categorized (e.g., into parts of speech). Then there is a set of formal rules that defines what is admissible as a “well-formed formula” (Hunter, 1971), that is, “what goes with what, and how, in the language” (Chao, 1968, p. 57); these rules define, and enable one to construct, those basic assemblages of building blocks deemed, by fiat, to be “correct.” Finally, to the alphabet and the formation rules one adds the “deductive apparatus,” the logical procedures that provide an axiomatic base and license the derivation of new well-formed formulas from the axiomatic base set. These procedures and rules are analogous to what in the context of state processes are called transition or translation rules; we recall that these latter rules prescribe how one moves from one state description to another.
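    The three ingredients described in this passage (an alphabet of units, formation rules that pick out well-formed formulas, and a deductive apparatus that licenses new formulas) can be made concrete in a few lines of code. The sketch below is illustrative only; the toy connectives, the names is_well_formed and derive, and the single conjunction-elimination rule are assumptions for the example, not anything drawn from the excerpt.

```python
# A toy "formal language" with the three parts named in the passage:
# an alphabet of units, formation rules for well-formed formulas,
# and a deductive apparatus deriving new formulas from an axiomatic base.
# All symbols and rules here are invented for illustration.

ALPHABET = {"p", "q", "r", "~", "&", "(", ")"}

def split_conjunction(s):
    """If s has the shape (X & Y) at the top level, return (X, Y), else None."""
    if not (s.startswith("(") and s.endswith(")")):
        return None
    depth = 0
    for i, ch in enumerate(s):
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
        elif ch == "&" and depth == 1:
            return s[1:i].strip(), s[i + 1:-1].strip()
    return None

def is_well_formed(s):
    """Formation rules: atoms p/q/r, negation ~X, conjunction (X & Y)."""
    if s in {"p", "q", "r"}:
        return True
    if s.startswith("~"):
        return is_well_formed(s[1:])
    parts = split_conjunction(s)
    return parts is not None and all(is_well_formed(p) for p in parts)

# Deductive apparatus: one rule (conjunction elimination) applied to the axioms.
AXIOMS = {"(p & ~q)"}

def derive(theorems):
    """From every (X & Y) already derived, add X and Y as new theorems."""
    new = set(theorems)
    for t in theorems:
        parts = split_conjunction(t)
        if parts:
            new.update(parts)
    return new

print(is_well_formed("~(p & q)"))  # True
print(derive(AXIOMS))              # {'(p & ~q)', 'p', '~q'}
```

    Repeated calls to derive play the role of the transition rules mentioned in the passage: each application moves the system from one set of derived formulas (one state description) to the next.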
    An important aspect of logical analysis applied to language is grammatical analysis, a special case of formation and transformation rules. Typically, the excised corpus is the sentence, its units are words, and the grammatical form displayed by the “string” is the subject-predicate, propositional form. More generally, in grammatical analyses one studies the rules of organization (e.g., morphology and syntax) at the level of morphemes, words, phrases, or sentences. Other representative topics of logical analyses are transformations (e.g., Chomsky), levels of abstraction (e.g., “observational” versus “theoretical” language), or grammatical versus “logical” form (Russell—see Hacking, 1975, p. 74). A limiting case of this kind of approach is exemplified by the work of Russell and Whitehead. Their Principia Mathematica studies language by means of an elaborate system of symbolic logic that is presumed to express “the logical core of all languages” (Barrett, 1978, p. 5).
  • Language and Philosophical Problems
    • Sören Stenlund (Author)
    • 2013 (Publication Date)
    • Routledge (Publisher)
    1  It manifests itself in a characteristic use of the term ‘natural language’. Natural languages are conceived as being ‘in principle’ Formal Languages. (This is explicit, for instance, in Davidson’s and Montague’s writings.)
    2  Separation of form and content, expression and meaning. It is supposed to be possible to give (at least in principle) a specification of all the external features of the expressions of a language that are relevant to their meaning without referring to or presupposing the meaning or the use of the expressions in this specification. Considerations of meaning and use may be necessary to finding the specification, to isolating the relevant features of an expression, but once they are found the specification can be stated and understood ‘in abstraction from content and use’. (I call this the external or mechanical notion of the form of an expression and its use, and I shall contrast it below with what I call the logical form of an expression and its use. This notion of logical form is not to be confused with the form of an expression as represented by means of the methods of formal logic, by means of formalization, which is an instance of external form.)
    3  The relation between a language and its use in real-life situations is taken to be of the same kind as the relation between a calculus or theory (such as the probability calculus) and its application. Language is exhaustively defined as language through its syntactical and semantical rules. The pragmatic rules, the rules for the use of a language in real situations, are determined on the basis of its syntax and semantics, which are therefore supposed to be conceptually prior to and independent of the pragmatic rules. This could be stated more generally as follows: the logical grammar of the expressions of a language are supposed to be formally
  • Encyclopedia of Computer Science and Technology

    Volume 2 - AN/FSQ-7 Computer to Bivalent Programming by Implicit Enumeration

    • Jack Belzer (Author)
    • 2020 (Publication Date)
    • CRC Press (Publisher)
    Modern linguistic theory does not exclude semantics from linguistics, as the structuralist school attempted to do. It does not fall, however, into the other extreme of trying to interface linguistics with such disciplines as ethnology or sociology, which would require making the subject matter of these sciences the direct concern of the linguist. Some contemporary linguists hold that this is the way linguistics should go. Between the two extremes modern linguistics still gives priority to syntax, but its attitude is to discover how far syntax can go in laying the foundation of meaning interpretation. Somewhere, somehow, a borderline must exist beyond which meaning cannot be captured anymore by linguistic means; the factual, extralinguistic knowledge of language users intervenes at that point. Beyond that line meaning interpretation phenomena are of no direct concern to the theoretical linguist.
    Modern linguistic theory also discards Carnap-type logical syntax (and the artificial logical languages they generate) for metalinguistic purposes. The most such metalanguages could accomplish is to map a tiny subpart of natural language onto formal logic. This subpart is the one closest to the reasoning capability expressible by natural language; it essentially comprises narrowly interpreted and disambiguated conjunctions, predicates, and so forth—in other words, that part of natural language which historically motivated the creation of formal logic at all because of ambiguity and imprecision.
    In the case of artificial programming languages, most language designers proceed in the way natural language linguists do, avoiding the identification of the language with structures determined by formal logic. The one exception in this respect is the philosophy of Professor J. McCarthy of Stanford University (formerly of Massachusetts Institute of Technology) in connection with the programming language LISP [4].
    Modern linguistics does rely, however, on some other results of formal logic—not to identify subparts of languages with logical structures, but to specify and define the syntactic mechanism incorporated into metalinguistic systems. According to this view, the notion “language” is defined as follows: Given a vocabulary of words, form the set of all possible strings obtainable through concatenation (juxtaposition) of the words. A language is a distinguished subset of this set. Each string of words entering the subset is a sentence of the language. The problem is now how to distinguish this subset, i.e., what makes a string of words a sentence in the language.
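    The definition in the previous paragraph translates almost directly into code. The following sketch is an assumed illustration only; the tiny vocabulary and the is_sentence criterion are invented for the example, not taken from the encyclopedia entry.

```python
# Language as a distinguished subset of all concatenations of vocabulary words.

from itertools import product

VOCABULARY = ["the", "dog", "barks", "sleeps"]

def all_strings(max_len):
    """Every possible concatenation (juxtaposition) of words, up to max_len words."""
    for n in range(1, max_len + 1):
        for combo in product(VOCABULARY, repeat=n):
            yield " ".join(combo)

def is_sentence(s):
    """A toy criterion that singles out the distinguished subset (the 'language')."""
    words = s.split()
    return (len(words) == 3
            and words[0] == "the"
            and words[1] == "dog"
            and words[2] in {"barks", "sleeps"})

language = [s for s in all_strings(3) if is_sentence(s)]
print(language)   # ['the dog barks', 'the dog sleeps']
```

    The hard part, as the passage points out, is stating a general criterion like is_sentence for a natural language rather than listing its sentences by hand.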
    A person, once he learns a language, can usually distinguish immediately sentences from mere strings of words. Thus an English speaker immediately recognizes that the string
  • Semantics - Foundations, History and Methods
    • Klaus Heusinger, Claudia Maienborn, Paul Portner (Authors)
    • 2019 (Publication Date)
    A formal system must consist of four components:
    • (i) a lexicon specifying the terminal expressions or words, and a set of non-terminal symbols or categories,
    • (ii) a set of production rules which determine how the well-formed expressions of any category of the Formal Language may be generated,
    • (iii) a set of axioms or expressions of the lexicon that are considered primitive,
    • (iv) a set of inference rules, determining how expressions may be manipulated.
    A formal system may be formulated purely abstractly without being intended as representation of anything, or it may be designed to serve as a description or simulation of some domain of real phenomena or, as intended in linguistics, modeling aspects of empirical, linguistic data.
    A formal proof is the product of a formal system, consisting of (i) axioms, expressions of the language that serve as intuitively obvious or in any case unquestionable first principles, assumed to be true or taken for granted no matter what, and (ii) applications of the rules of inference that generate sequences of steps in the proof, resulting in its conclusion, the final step, also called a theorem. The grammar of a language, whether logical or natural, is a system of rules that generates all and only the grammatical or well-formed sentences of the language. But this does not mean that we can always get a definite answer to the general question whether an arbitrary string belongs to a particular language, something a child learning its first language may actually need. There is no general decision procedure determining for any arbitrary given expression whether it is or is not derivable in any particular formal system (Arbib 1969; Davis 1965; Savitch, Bach & Marsh 1987). However, this question is provably decidable for sizable fragments of natural language, even if some form of higher order quantification is permitted (Nishihara, Morita & Iwata 1990; Pratt-Hartmann 2003; Pratt-Hartmann & Third 2006
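    The four components listed above can be illustrated with a very small rewriting system. The lexicon, production rules, and start symbol below are assumptions made up for this sketch; rewrite plays the role of the inference rule, and derive prints a complete derivation, a "formal proof" ending in a terminal string.

```python
# A toy formal system with the four components (i)-(iv) named above.
import random

LEXICON = {                 # (i) terminal expressions and non-terminal categories
    "terminals": {"the", "cat", "dog", "sleeps", "barks"},
    "categories": {"S", "NP", "VP", "Det", "N", "V"},
}

PRODUCTIONS = {             # (ii) production rules for each category
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V"]],
    "Det": [["the"]],
    "N":   [["cat"], ["dog"]],
    "V":   [["sleeps"], ["barks"]],
}

AXIOM = ["S"]               # (iii) the primitive expression the derivation starts from

def rewrite(string):
    """(iv) Inference rule: expand the leftmost category using one of its productions."""
    for i, sym in enumerate(string):
        if sym in PRODUCTIONS:
            return string[:i] + random.choice(PRODUCTIONS[sym]) + string[i + 1:]
    return string           # only terminals left: a finished sentence

def derive(start):
    """A 'formal proof': the sequence of steps from the axiom to a terminal string."""
    steps = [start]
    while any(sym in PRODUCTIONS for sym in steps[-1]):
        steps.append(rewrite(steps[-1]))
    return steps

for step in derive(AXIOM):
    print(" ".join(step))
# e.g.  S | NP VP | Det N VP | the N VP | the cat VP | the cat V | the cat sleeps
```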
  • Routledge Dictionary of Language and Linguistics
    • Hadumod Bussmann, Kerstin Kazzazi, Gregory Trauth (Authors)
    • 2006 (Publication Date)
    • Routledge (Publisher)
    Logic in linguistics. Cambridge.
    Cresswell, M.J. 1973. Logics and languages. London.
    Feys, R. and F. Fitch. 1969. Dictionary of symbols of mathematical logic. Amsterdam.
    Gabbay, D.M. and F. Guenthner (eds) 1983–9. Handbook of philosophical logic, 4 vols. Dordrecht.
    Guttenplan, S. 1986. The languages of logic. Oxford.
    Hodges, W. 1983. Elementary logic. In D.M. Gabbay and F. Guenthner (eds), Handbook of philosophical logic. Dordrecht. Vol. 2, 1–131.
    Marciszewski, W. (ed.) 1981. Dictionary of logic as applied in the study of language: concepts, methods, theories. The Hague.
    McCawley, J.D. 1981. Everything that linguists have always wanted to know about logic but were ashamed to ask. Oxford.
    Moore, R.C. 1993. Logic and representation. Chicago, IL.
    Quine, W.V.O. 1950. Methods of logic. New York.
    Reichenbach, H. 1947. Elements of symbolic logic. New York (5th edn 1956.)
    Van Fraassen, B. 1971. Formal semantics and logic. New York.
    Wall, R. 1972. Introduction to mathematical linguistics. Englewood Cliffs, NJ.
    Zierer, E. 1972. Formal logic and linguistics. The Hague.

    Bibliographies

    Partee, B., S. Sabsay, and J. Soper. 1971. Bibliography: logic and language. Bloomington, IN.
    Petöfi, J.S. (ed.) 1978. Logic and the formal theory of natural language: selective bibliography. Hamburg.

    formal meaning ⇒ lexical meaning vs grammatical meaning

    formal semantics ⇒ logical semantics

    formalization

    Use of Formal Languages of mathematics and formal logic to describe natural languages. The advantage of formalization as opposed to nonformalized descriptions is the greater explicitness of the vocabulary (=terminology), precision and economy, as well as simpler verification of argumentation.

    References

    Chomsky, N. and G.A. Miller. 1963. Introduction to the formal analysis of natural languages. In R.D. Luce et al. (eds), Handbook of mathematical psychology. New York. Vol. 2, 269–321.
    Salomaa, A. 1973. Formal Languages
  • Biolinguistic Investigations and the Formal Language Hierarchy
    • Juan Uriagereka (Author)
    • 2018 (Publication Date)
    • Routledge (Publisher)
    1 The Formal Language Hierarchy

    1.1. Formal Languages

    A current Google search for “Chomsky Hierarchy” yields some 60,000 entries. They go beyond linguistics into computer science, algebra, statistics, communication theory, molecular biology, and even biophysics. Wikipedia reminds us that, within “the area of Formal Languages, the Chomsky Hierarchy (occasionally referred to as Chomsky-Schützenberger hierarchy) is a containment hierarchy of classes of formal grammars … described by Noam Chomsky in 1956.” It then cites Chomsky’s famous 1956 “Three Models for the Description of Language,” and acknowledges Marco Schützenberger for having played a pivotal role in the development of the theory of Formal Languages. So as not to personalize matters, let’s refer to this notion as the Formal Language Hierarchy (FLH).1
    Let’s examine examples of Formal Languages that are progressively more complex. Consider mere vowel combinations, starting with the English five in (1a):
    (1) a. a, e, i, o, u, …
        b. a, aa, aaa, aaaa, …, a^n
        c. a^n, e^m, i^l, o^k, u^j, …
    Should English have more vowels (say, seven) or fewer (e.g., three), we could just list that as well, since a list is as good as any other. (1b) is slightly more nuanced, in that we repeat token vowels of type a any number of times. The notation a^n is meant to represent just that: a number n of identical vowels. Of course, we can repeat the other vowels too, arbitrarily, as in (1c). It is easy to show, as we do below, that grammars capable of describing strings as in (1a) are also good at generating strings as in (1b) or (1c).
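    As an assumed illustration (not from the chapter) of how simple the grammars for (1a)-(1c) can be, each of those string sets is captured by a regular expression, i.e. by something a finite-state machine can recognize, the lowest class in the hierarchy under discussion:

```python
import re

single_vowel   = re.compile(r"^[aeiou]$")            # (1a): a, e, i, o, u
repeated_a     = re.compile(r"^a+$")                 # (1b): a, aa, aaa, ..., a^n
repeated_vowel = re.compile(r"^(a+|e+|i+|o+|u+)$")   # (1c): a^n, e^m, i^l, o^k, u^j

print(bool(single_vowel.match("i")))        # True
print(bool(repeated_a.match("aaaa")))       # True
print(bool(repeated_vowel.match("eeee")))   # True
print(bool(repeated_vowel.match("aaee")))   # False: mixed vowels are not in (1c)
```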
    Next consider (2), where the number of a’s in strings in the relevant language is supposed to be the same as the number of e’s in other strings in the language:
    (2) … a^n, e^n …
    It is not possible to generate Formal Languages with the format in (2) with the simple grammars responsible for generating structures as in (1). This is the type of game we are interested in: What sorts of strings can a given grammar generate? And if it can only generate those as in (1), what sort of grammar is necessary to generate strings as in (2)?
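    A small assumed sketch of the contrast: strings of the shape a^n e^n can be accepted by keeping a single running count, something no fixed finite-state device can do for unbounded n, which is why a more powerful (context-free) grammar is needed for (2). The function name and the test strings below are illustrative, not taken from the book.

```python
def in_language_2(s):
    """Accept strings of the shape a^n e^n: n a's followed by exactly n e's."""
    count = 0
    seen_e = False
    for ch in s:
        if ch == "a":
            if seen_e:
                return False   # an 'a' after an 'e' breaks the a...e shape
            count += 1
        elif ch == "e":
            seen_e = True
            count -= 1
            if count < 0:
                return False   # more e's than a's so far
        else:
            return False       # only a's and e's belong to this language
    return count == 0          # the two counts must match exactly

print(in_language_2("aaee"))   # True
print(in_language_2("aaeee"))  # False
print(in_language_2("eaea"))   # False
```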