In linguistics, syntax is the study of the rules, or "patterned relations," that govern the way the words in a sentence are arranged. The term originates from the Greek syn- ("together") and taxis ("arrangement"). It concerns how different words (which, going back to Dionysius Thrax, are categorized as nouns, adjectives, verbs, etc.) are combined into clauses, which, in turn, are combined into sentences.

In Semiotics

In the earliest framework of semiotics, established by C. W. Morris in his 1938 book Foundations of the Theory of Signs, syntax is defined as the first of the three subfields of the study of signs: syntax, the study of the interrelation of signs; semantics, the study of the relation between signs and the objects to which they apply; and pragmatics, the study of the relation between the sign system and its users.

In Other Grammars

Dependency grammar is a class of syntactic theories separate from generative grammar in which structure is determined by the relation between a word (a head) and its dependents. One difference from phrase structure grammar is that dependency grammar does not have phrasal categories. Algebraic syntax is a type of dependency grammar.
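
To make the head-dependent relation concrete, here is a minimal sketch in Python; the Word class, the attach_to helper, and the example sentence are illustrative inventions, not part of any particular dependency formalism. Each word records its head and its dependents directly, and no phrasal nodes appear anywhere in the structure.

    from dataclasses import dataclass, field

    @dataclass
    class Word:
        """A word in a dependency analysis: it points to its head (None for
        the root) and lists its dependents; there are no phrasal nodes."""
        form: str
        head: "Word | None" = None
        dependents: "list[Word]" = field(default_factory=list)

        def attach_to(self, head: "Word") -> None:
            """Make this word a dependent of the given head."""
            self.head = head
            head.dependents.append(self)

    # "The dog barks": the verb heads the sentence, "dog" depends on
    # "barks", and "The" depends on "dog" -- words relate directly to
    # words, with no NP or VP nodes in between.
    barks, dog, the = Word("barks"), Word("dog"), Word("The")
    dog.attach_to(barks)
    the.attach_to(dog)
    print([d.form for d in barks.dependents])  # ['dog']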

Tree-adjoining grammar is a grammar formalism that has been used as the basis for a number of syntactic theories. Head-driven phrase structure grammar is a phrase structure grammar that uses additional constraints to simplify the necessary rules. Together with Lexical Functional Grammar, it is a lexicalist grammar, meaning that the lexicon is hierarchically structured and contains more information than in other approaches.

Moreover, in monostratal grammars such as construction grammar and, to a certain degree, Head-Driven Phrase Structure Grammar, syntax is not governed by dynamic derivational rules but by schemas that pair a specific formal configuration with a semantic configuration and that are then filled out, or specified, by lexemes. Thus, syntax itself is semiotic.
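
As an illustration of the schema idea, the following toy Python sketch pairs a formal pattern with a semantic template; the Construction class and the ditransitive example are hypothetical simplifications, not any theory's actual formalism. Filling the slots with lexemes specifies both configurations at once.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Construction:
        """A toy form-meaning pairing: a formal pattern with open slots
        and a semantic template instantiated by the same slot fillers."""
        form: str     # formal configuration with named slots
        meaning: str  # paired semantic configuration

        def fill(self, **lexemes: str) -> "tuple[str, str]":
            # Filling the slots with lexemes specifies both sides at once.
            return self.form.format(**lexemes), self.meaning.format(**lexemes)

    # A toy ditransitive schema, filled by concrete lexemes.
    ditransitive = Construction(
        form="{agent} {verb} {recipient} {theme}",
        meaning="{agent} causes {recipient} to receive {theme}",
    )
    form, meaning = ditransitive.fill(
        agent="Kim", verb="hands", recipient="Lee", theme="the book")
    print(form)     # Kim hands Lee the book
    print(meaning)  # Kim causes Lee to receive the book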

See also: Phrase, Phrase structure rules, X-bar syntax, Syntactic categories, Grammar, Algebraic syntax, Construction grammar, Head-Driven Phrase Structure Grammar, Lexical Functional Grammar.

In Computer Science

The usage of syntax in computer science has evolved from its related usage in linguistics, especially in the subfield of programming language design. The set of allowed reserved words, the parameters they take, and the correct ordering of words within an expression together constitute the syntax of the language. The ubiquitous syntax error that various programming languages generate results when the computer cannot find a valid interpretation, according to its preprogrammed rules of syntax, for the code it has been asked to run, frequently because of a typographical error.
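
For a concrete illustration (Python is used here only as a convenient example language), compiling a source string that violates the language's grammar raises a syntax error before any code runs:

    # A missing colon after the condition violates Python's grammar.
    source = "if True print('hello')"
    try:
        compile(source, "<example>", "exec")
    except SyntaxError as err:
        # The parser rejects the code without executing it.
        print(f"Syntax error at line {err.lineno}: {err.msg}")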

In computer languages, syntax can be extremely rigid, as in the case of most assembly languages, or less rigid, as in languages that make use of "keyword" parameters that can be stated in any order.
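
Python's keyword arguments show the less rigid case: the parameters of a call may be stated in any order, and the two calls below are equivalent (the connect function is a made-up example):

    def connect(host, port, timeout=30):
        return f"{host}:{port} (timeout={timeout}s)"

    # Keyword parameters may appear in any order; both calls are equivalent.
    print(connect(host="example.com", port=80, timeout=10))
    print(connect(timeout=10, port=80, host="example.com"))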

The analysis of programming language syntax usually entails the transformation of a linear sequence of tokens (a token is akin to an individual word or punctuation mark in a natural language) into a hierarchical syntax tree (abstract syntax trees are one convenient form of syntax tree).
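
Python's standard-library ast module makes this transformation visible: the expression 1 + 2 * 3 is parsed from a linear token sequence into a hierarchical abstract syntax tree whose nesting encodes operator precedence.

    import ast

    # Parse a linear token sequence into a hierarchical abstract syntax tree.
    tree = ast.parse("1 + 2 * 3", mode="eval")
    # The multiplication is nested under the addition, reflecting precedence.
    print(ast.dump(tree, indent=2))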

This process, called parsing, is in some respects analogous to syntactic analysis in linguistics; certain concepts, such as the Chomsky hierarchy and context-free grammars, are common to the study of syntax in both linguistics and computer science.
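
To show how a context-free grammar drives parsing, here is a minimal recursive-descent sketch in Python; the two-rule grammar, the function names, and the nested-tuple tree representation are all illustrative choices:

    import re

    # A tiny context-free grammar:
    #   Expr -> Term ('+' Term)*
    #   Term -> NUMBER ('*' NUMBER)*
    TOKEN = re.compile(r"\s*(\d+|[+*])")

    def tokenize(source):
        """Turn the source string into a linear sequence of tokens."""
        tokens, pos = [], 0
        while pos < len(source):
            match = TOKEN.match(source, pos)
            if not match:
                raise SyntaxError(f"unexpected character at position {pos}")
            tokens.append(match.group(1))
            pos = match.end()
        return tokens

    def parse_expr(tokens):
        """Expr -> Term ('+' Term)* -- returns a nested-tuple parse tree."""
        node = parse_term(tokens)
        while tokens and tokens[0] == "+":
            tokens.pop(0)
            node = ("+", node, parse_term(tokens))
        return node

    def parse_term(tokens):
        """Term -> NUMBER ('*' NUMBER)*"""
        node = ("num", int(tokens.pop(0)))
        while tokens and tokens[0] == "*":
            tokens.pop(0)
            node = ("*", node, ("num", int(tokens.pop(0))))
        return node

    print(parse_expr(tokenize("1 + 2 * 3")))
    # ('+', ('num', 1), ('*', ('num', 2), ('num', 3)))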