6-Linguistics-Grammar-Kinds

categorical grammar

Grammars {categorical grammar} {categorial grammar} can relate speech categories by how categories modify other categories. Predicates, including verbs and adjectives, modify nouns. Adverbs modify verbs, and adverbs can modify other adverbs.
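
For illustration, categorial grammars assign each category a functor type stating what it combines with; the type assignments below are a standard textbook sketch, not from the source. With N for noun, NP for noun phrase, and S for sentence:

  intransitive verb: NP\S ("birds sing" is NP + NP\S, yielding S)
  adjective: N/N ("small bird" is N/N + N, yielding N)
  adverb: (NP\S)\(NP\S) ("sing sweetly" is NP\S + (NP\S)\(NP\S), yielding NP\S)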

constituent grammar

Sentence words form a hierarchy of word groups {constituent grammar} {immediate constituent grammar}, in which larger groups concatenate smaller groups.

parts of speech

Constituent grammar uses subject and predicate as fundamental categories. Other grammatical categories derive from them.

rules

Rule sequences build word-group hierarchies. Sentence types have grammar rules, which speakers use for sentence generation, and which hearers use for sentence analysis. Sentence-type rules put high-level word groups in sequences. Sentence-type rules can have necessary parts, optional parts, and branching parts. Second-level grammar rules order words in word groups. Third-level grammar rules order words in subgroups, and so on.
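
As a minimal sketch of such rule hierarchies (the rule set, symbol names, and branch choices below are illustrative assumptions, not from the source), a sentence-type rule sequences high-level word groups, second-level rules order words in groups, and third-level rules pick words:

  import random

  # Illustrative three-level rule hierarchy: sentence -> word groups -> words.
  RULES = {
      "S":   [["NP", "VP"]],                       # sentence-type rule
      "NP":  [["Det", "N"], ["Det", "Adj", "N"]],  # second level: words in groups
      "VP":  [["V"], ["V", "NP"]],
      "Det": [["the"], ["a"]],                     # third level: word choices
      "N":   [["bird"], ["song"]],
      "Adj": [["small"]],
      "V":   [["hears"], ["sings"]],
  }

  def generate(symbol="S"):
      """Expand a symbol by picking one rule branch and recursing."""
      if symbol not in RULES:                 # a word, not a word group
          return [symbol]
      branch = random.choice(RULES[symbol])   # branching part of the rule
      words = []
      for part in branch:
          words.extend(generate(part))
      return words

  print(" ".join(generate()))  # e.g., "the small bird sings"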

transformational grammar

Grammars {transformational grammar} {generative grammar} can generate sentence patterns by mappings between levels. Language comprehension uses the same levels and rules.

structures

Sentences have fundamental structure {deep structure} and derived structure {surface structure} mapped from deep structure. Deep structure relates grammatical units of simple, representative sentences, sometimes using semantic rules. Surface structure relates word categories, as constituent grammar does, and phonological rules affect it. Deep structure resolves ambiguities left open by surface structure. Predicate calculus is a transformational grammar.

innate

Perhaps children learn language using innate brain functions that decode syntax.

universal grammar

Super-rules can relate particular rules {universal grammar}, so all children can learn any language.

6-Linguistics-Grammar-Kinds-Context

Backus-Naur form grammar

Grammars {Backus-Naur form grammar} can allow symbol replacement with strings of symbols or letters; Backus-Naur form is the standard notation for writing context-free rules.
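
A minimal illustration of the notation (this particular grammar is an assumed example, not from the source): each rule replaces the symbol on the left with one of the alternatives on the right.

  <sentence>    ::= <noun-phrase> <verb-phrase>
  <noun-phrase> ::= <article> <noun>
  <article>     ::= "the" | "a"
  <noun>        ::= "bird" | "song"
  <verb-phrase> ::= "sings" | "hears" <noun-phrase>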

context-free grammar

Previous input and output have no effect on output {context-free grammar}. Only the current state determines output.

context-sensitive grammar

Current state and previous input and output can determine output {context-sensitive grammar}.

finite state machine

Input string and current state can determine output {finite state machine}.
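
As a minimal sketch (the states, alphabet, and recognized pattern are illustrative assumptions), a finite state machine maps current state plus next input symbol to a next state, and accepts a string if it ends in an accepting state:

  # Finite state machine recognizing strings over {a, b} that end in "ab".
  TRANSITIONS = {
      ("start", "a"): "saw_a",
      ("start", "b"): "start",
      ("saw_a", "a"): "saw_a",
      ("saw_a", "b"): "saw_ab",
      ("saw_ab", "a"): "saw_a",
      ("saw_ab", "b"): "start",
  }
  ACCEPTING = {"saw_ab"}

  def accepts(string):
      """Current state and input symbol alone determine the next state."""
      state = "start"
      for symbol in string:
          state = TRANSITIONS[(state, symbol)]
      return state in ACCEPTING

  print(accepts("abab"))  # True
  print(accepts("abba"))  # False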

linear grammar

Previous input can determine output, with no need to go backward or forward to find context or current state {linear grammar}.

regular grammar

Previous input about the sentence and the input string can determine output, with no need to go backward or forward to find context or current state {regular grammar}.

6-Linguistics-Grammar-Kinds-Substitution

contracting grammar

Shorter strings can substitute for longer strings {contracting grammar}, as when a rule AB → C shortens a string. Shorter strings cannot replace longer strings {non-contracting grammar}: every rule's output is at least as long as its input.

normal-form grammar

Grammars {normal-form grammar} can allow symbols to become two symbols, for example, replacing sentence symbols with noun and verb symbols.

standard-form grammar

Grammars {standard-form grammar} can allow symbols to become one or two strings, for example, replacing speech-part symbols with letter strings.

well-formed grammar

Grammars {well-formed grammar} can allow substitutions from simpler to more-complex strings.

6-Linguistics-Grammar-Kinds-Quantitative

quantitative grammar

Grammar rules can be strings and functions, which take input strings and make output strings {quantitative grammar}. Strings can have unique integers, using Gödel numbering. String sets {enumerable set} can have unique integers. Quantitative grammars involve enumerable sets of input sentences, output sentences, and grammar rules. Recursive functions, algorithms, Turing machines, and Post axiomatic systems are mathematically equivalent ways to model quantitative grammars.
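
A minimal sketch of Gödel numbering (the alphabet coding is an assumed example): give each symbol a positive integer code, then encode a string as a product of prime powers; unique factorization guarantees each string gets a unique integer:

  # Gödel numbering: encode a string as a product of prime powers.
  SYMBOL_CODES = {"a": 1, "b": 2, "c": 3}  # assumed alphabet coding

  def primes(n):
      """Return the first n primes by trial division (small n only)."""
      found = []
      candidate = 2
      while len(found) < n:
          if all(candidate % p for p in found):
              found.append(candidate)
          candidate += 1
      return found

  def godel_number(string):
      """Exponent of the i-th prime is the code of the i-th symbol."""
      number = 1
      for p, symbol in zip(primes(len(string)), string):
          number *= p ** SYMBOL_CODES[symbol]
      return number

  print(godel_number("ab"))  # 2**1 * 3**2 = 18
  print(godel_number("ba"))  # 2**2 * 3**1 = 12: different string, different number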

Post system

Because quantitative grammars involve only integers, quantitative grammars {Post system} can be axiomatic systems, with axioms about strings and inference/grammar rules about speech parts.
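
A minimal illustration, not from the source: a Post system with the single axiom 1 and the single production rule "from any string x, infer x1" derives exactly the unary numerals 1, 11, 111, and so on, so deriving strings works like proving theorems from axioms.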

6-Linguistics-Grammar-Kinds-Quantitative-Formal

formal grammar

Grammars {formal grammar} can be quantitative.

Chomsky hierarchy

Language processing and computer coding rely on grammar, which specifies sentence-structure rules. Parsing finds syntactic structure. Generating uses rules to create valid sentences. Syntax can have typical sentence structures {regular syntax}, into which words can substitute and which can be recursive {productive syntax}.

grammars

Noam Chomsky defined four generative-grammar classes, with different complexities {Chomsky hierarchy}. Type 0, with highest complexity, is General Grammars, which Turing Machines recognize. Type 0 grammars are unrestricted: rules can contract and freely substitute. Turing machines read input and write output anywhere on tape. Type 1 is Context-Sensitive Grammars, which Linear Bounded Automata recognize. Type 1 grammars are context-sensitive and non-contracting. Linear bounded automata are Turing machines with tape bounded by input length, which go backward and forward on tape until they find the input-string context that tells them what to do next. Type 2 is Context-Free Grammars, which Pushdown Automata recognize. Type 2 grammars are context-free and non-contracting. Pushdown machines read input once, without going backward on the input, and store symbols on a stack, acting on current state, input symbol, and stack top. Type 3, with lowest complexity, is Regular Grammars, which Finite State Automata recognize. Type 3 grammars are regular, linear, and non-contracting. Finite-state machines read the input tape, with no storage, moving from state to state as each input symbol tells them what to do next.
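
As a standard illustration of the hierarchy levels (these example languages are textbook assumptions, not from the source): the strings a, aa, aaa, ... form a regular (Type 3) language, generated by S → aS | a; the strings ab, aabb, aaabbb, ... (equal a's then equal b's) need a context-free (Type 2) grammar, such as S → aSb | ab; and the strings abc, aabbcc, aaabbbccc, ... (equal a's, b's, and c's) need a context-sensitive (Type 1) grammar, because no pushdown stack can match three counts.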

computer language

Computer languages must be deterministic, so parsing look-ahead is finite. Parsing non-deterministic languages requires trying all rules and/or guessing. Most recursive transition networks, whether unambiguous or ambiguous, are non-deterministic and cannot map to deterministic recursive transition networks. Non-deterministic finite state automata can always map to deterministic finite state automata.
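
A minimal sketch of that last mapping, the subset construction (the example automaton and names are illustrative assumptions): each deterministic state is the set of non-deterministic states the machine could be in after the input so far:

  from collections import deque

  # NFA over {a, b} accepting strings that end in "ab".
  # delta maps (state, symbol) -> set of possible next states.
  NFA_DELTA = {
      (0, "a"): {0, 1},
      (0, "b"): {0},
      (1, "b"): {2},
  }
  NFA_START, NFA_ACCEPT = 0, {2}

  def determinize():
      """Subset construction: each DFA state is a set of NFA states."""
      start = frozenset({NFA_START})
      dfa = {}
      queue = deque([start])
      while queue:
          state = queue.popleft()
          if state in dfa:
              continue
          dfa[state] = {}
          for symbol in "ab":
              nxt = frozenset(s2 for s in state
                              for s2 in NFA_DELTA.get((s, symbol), set()))
              dfa[state][symbol] = nxt
              if nxt not in dfa:
                  queue.append(nxt)
      return dfa, start

  def accepts(string):
      dfa, state = determinize()
      for symbol in string:
          state = dfa[state][symbol]
      return bool(state & NFA_ACCEPT)

  print(accepts("aab"))  # True
  print(accepts("aba"))  # False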

generative grammar

Grammars can have variables, which can have actions and include start symbols. Grammars can have constants, which have no actions. Grammatical symbols are either variables or constants. Grammars can have rules that map existing variable series to new variable/constant series. Generative grammars use finite sets of variables, constants, and rules.
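
A minimal sketch of this definition (the names and rule set are illustrative assumptions): a generative grammar packaged as finite sets of variables, constants, and rules, plus a start symbol:

  # A generative grammar: variables, constants, rules, and a start symbol.
  VARIABLES = {"S", "NP", "VP"}
  CONSTANTS = {"the", "bird", "sings"}
  RULES = [
      ("S",  ["NP", "VP"]),      # variable series -> variable/constant series
      ("NP", ["the", "bird"]),
      ("VP", ["sings"]),
  ]
  START = "S"

  def derive():
      """Leftmost derivation: rewrite the first variable until none remain."""
      form = [START]
      while any(symbol in VARIABLES for symbol in form):
          i = next(i for i, s in enumerate(form) if s in VARIABLES)
          head, body = next(r for r in RULES if r[0] == form[i])
          form[i:i + 1] = body
      return " ".join(form)

  print(derive())  # "the bird sings"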

relation

Grammar relations involve persons or things {direct object} and time. Relations can be states or events. States include knowing, believing, and having, and states have experiencing subjects. Events, such as moving and communicating, involve agents {subject, relation}, instruments "with", beneficiaries "for", and places "on, in, at, above, below". States and events determine subject phrases. Events determine verb phrases. To communicate, write, or speak involves recipient "with or to", language "in", and/or topic "on or about". To move or walk involves source "from" and destination "to".

General Grammar

In type 0 {General Grammar}, rules start with variables, and productions can be unbounded and context-sensitive. General Grammars generate the recursively enumerable languages. General Grammars are equivalent to Turing Machines.

Context Sensitive Grammar

In type 1 {Context-Sensitive Grammar}, rules start with variables, and productions are the same length or longer. Rules depend on nearby symbols. Context-sensitive grammars are equivalent to Linear Bounded Automata {non-deterministic Turing Machine}, non-deterministic Turing machines whose left and right end markers have no replacements and so bound strings. Context-sensitive languages are recursive. Context-sensitive grammar recognition is PSPACE-complete and so is intractable in general. Context-free grammars plus symbol tables can model context-sensitive constraints, as when a parser uses a symbol table to check that each identifier is declared before use.

Context-free Grammar of Chomsky

In type 2 {Context-free Grammar}, rules start with variables and produce variable-and-constant series. Variables are start symbols for grammar subsets. Context-free grammars can accommodate nested structures, such as balanced parentheses. Rules do not depend on nearby symbols. Context-free grammars are equivalent to Recursive Transition Networks, which can refer to other transition networks.
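
For illustration (a standard textbook example, not from the source): the context-free rules S → ( S ) S and S → empty string generate exactly the balanced-parenthesis strings, such as (), (()), and ()(); each rule application nests one pair, and variables acting as start symbols for subsets let parentheses nest to any depth.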

parsing

Top-down parsers start from rules with variables and predict the words that match the rules. Bottom-up parsers start with constants and build variables based on rules. Tree structures {parse tree, grammar} show how rules apply. Diagrams {sentence diagram, grammar} show sentence structure. Sentences can have more than one parse tree.
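
A minimal sketch of top-down parsing (the grammar and function names are illustrative assumptions): a recursive-descent parser dedicates one function to each variable and returns a parse tree showing how rules apply:

  # Recursive-descent (top-down) parser for: S -> NP VP, NP -> "the" N,
  # N -> "bird" | "song", VP -> "sings".
  def parse_sentence(words, i=0):
      np, i = parse_np(words, i)
      vp, i = parse_vp(words, i)
      if i != len(words):
          raise SyntaxError("extra words after sentence")
      return ("S", np, vp)

  def parse_np(words, i):
      if words[i] != "the":
          raise SyntaxError("expected 'the'")
      if words[i + 1] not in ("bird", "song"):
          raise SyntaxError("expected a noun")
      return ("NP", "the", ("N", words[i + 1])), i + 2

  def parse_vp(words, i):
      if words[i] != "sings":
          raise SyntaxError("expected 'sings'")
      return ("VP", "sings"), i + 1

  print(parse_sentence(["the", "bird", "sings"]))
  # ('S', ('NP', 'the', ('N', 'bird')), ('VP', 'sings'))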

ambiguity

No universal algorithm can determine whether a context-free grammar is ambiguous, or make an ambiguous grammar unambiguous.

number

Languages can have more than one context-free grammar.

normal form

Context-free grammars can have special forms {normal form, grammar}. Normal forms {Chomsky normal form} can have rules that make one variable into two variables or one constant, with no empty strings. Normal forms {Greibach normal form} can have rules that make one variable into one constant followed by zero or more variables.
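
For illustration (a standard textbook example, not from the source), the language of strings ab, aabb, aaabbb, ... has grammar S → aSb | ab; in Chomsky normal form it becomes S → AX | AB, X → SB, A → a, B → b, and in Greibach normal form it becomes S → aSB | aB, B → b.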

Regular Grammar of Chomsky

In type 3 {Regular Grammar}, rules start with variables and produce constant-then-variable series {right linear grammar} or variable-then-constant series {left linear grammar}. A rule's right side has at most one variable, at its right or left end; all other symbols are constants. Simple transition networks are equivalent to regular grammars. Finite state automata (FSA) model regular grammars, because they have a start state, a finite number of states, rules that move from state to state on reading a constant, and a finite set of terminal states. Regular grammars correspond to regular expressions, which build from empty strings and constants by concatenation, alternation, and repetition.
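
A minimal illustration (the grammar and pattern are assumed examples): the right-linear rules S → aS | bS | aA and A → b generate exactly the strings over {a, b} that end in "ab", which the regular expression (a|b)*ab also describes:

  import re

  # Regular expression equivalent to the right-linear grammar above:
  # (a|b)*ab describes all strings over {a, b} ending in "ab".
  PATTERN = re.compile(r"(a|b)*ab")

  print(bool(PATTERN.fullmatch("abab")))  # True
  print(bool(PATTERN.fullmatch("aba")))   # False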
