Игорь Волков – Hardware and software of the brain (page 9)


Metaphor

This is similar to the theory of analogy, which is well developed in physics. The principle: if one phenomenon is thoroughly studied and another is barely known but shares some analogy with the first, we can transfer knowledge from the source to the target. For example, Hinduism teaches that our world is not the first: before it was created, Shiva destroyed the previous one with a swift dance.

Fig. 15. Shiva as Lord of Dance.

The study of metaphors reveals how we process information. The human brain is very efficient at comparison by similarity: neural nets perform it immediately, in associative memory, at the hardware level. Accordingly, we have various types of metaphor.

Allegory

This is an extended form, such as a fable. Fables usually employ animals or other non-human creatures to illustrate some moral principle. The story is full of detail and stimulates profound thinking.

Hyperbole

Hyperbole underlines some feature by means of exaggeration. A typical example is "a million reasons".

Parable

Parables are sample stories for educational purposes; they were actively used by Jesus Christ. In contrast to a fable, a parable excludes animals or inanimate objects acting as speaking beings.

Antithesis

Natural language works with fuzzy objects. In such conditions it is useful to underline not only certain features but also their negations. In the following example, antithesis underlines a paradox: "The better – the worse; the worse – the better."

Metonymy

In this case, a concept is designated by a name that evokes a close association. For example, a crown is a well-decorated piece of headwear, but the word is also used for a royal house or royal power.

Connotation

As you know, songs convey two components, text and music, which correspond to information and emotions. Both are well represented in common speech. In pure text the second is limited but still present. For this purpose we use various synonyms: words whose meanings are similar but not exactly identical. This second-order difference may convey our attitude toward what we write.

Implicature

When we communicate useful information, formulating everything explicitly would be tedious, especially in dynamic circumstances. For this reason, we usually state the essentials and take the rest for granted. For example, when explaining a soup recipe, one may enumerate the necessary vegetables but not mention salt.

Theory of language

At present there are many different theories, both for parts of language such as syntax and for language as a whole. No consensus is even in sight, yet most of them are, in fact, variants of the same or different solutions of some particular question, such as which word is the main word of the sentence. Hence we can formulate a generalized theory, taking the best ideas from different approaches or choosing the most practical method for each particular problem.

When you create a science from scratch, finding solutions to the key problems is the second step. The first is formulating those problems and defining appropriate terms for them.

Natural language as a whole is a communication system for the transmission of 2D cortical images over a 1D sequential channel. Basically, one sentence = one image. The next sentence either adds details to the existing image or creates a new one. The main problem is how to group words inside the sentence; for this purpose, civilization adds grammar. The part of speech (POS) is an entirely artificial concept. In fact, syntax and punctuation introduce an intermediate level of processing: POS plus syntax rules define the grouping.

The standard pipeline of language processing is

POS -> Syntax -> Semantics -> Pragmatics

The question is how the different steps (levels) interact with each other. A popular principle is that syntax should be self-sufficient, that is, independent of the adjacent levels on both sides. Word grouping should proceed without the words themselves; only their POS is needed. Likewise, parsing of the sentence should be completed before proceeding to semantic analysis. It is at this next stage that words fill the parse frame. The final meaning is created at the last, pragmatic level, taking into account the other sentences of the text.

Unfortunately, this is wishful thinking, realizable only for very strict artificial languages. Even for a moderately restricted natural language it is impossible. Most words can serve as several different parts of speech; accordingly, several parse structures are possible for each sentence. Humans choose one by meaning. For this purpose, the system of analysis must have full backtracking through the whole pipeline.
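The need for backtracking can be sketched in a few lines of Python. Everything here is a toy assumption: the lexicon, the set of accepted POS sequences, and the plausibility scores are invented for illustration, not taken from any real grammar. The point is only the control flow: enumerate POS assignments, keep the syntactically valid ones, and let semantics pick the winner.

```python
# Toy sketch of full-pipeline backtracking: try every POS assignment,
# keep the syntactically valid ones, let a semantic score choose.
from itertools import product

LEXICON = {            # word -> possible parts of speech (assumed)
    "time":  ["noun", "verb"],
    "flies": ["verb", "noun"],
    "fast":  ["adverb", "adjective"],
}

# Accepted POS sequences for a three-word sentence (assumed toy syntax).
SYNTAX = {("noun", "verb", "adverb"), ("verb", "noun", "adjective")}

# How plausible each reading is (assumed toy semantics).
PLAUSIBILITY = {("noun", "verb", "adverb"): 0.9,
                ("verb", "noun", "adjective"): 0.2}

def parse(words):
    """Enumerate POS assignments, filter by syntax, rank by semantics."""
    candidates = [tags for tags in product(*(LEXICON[w] for w in words))
                  if tags in SYNTAX]
    return max(candidates, key=PLAUSIBILITY.get) if candidates else None

print(parse(["time", "flies", "fast"]))
# ('noun', 'verb', 'adverb') -- "time" is read as a noun by meaning
```

Without the semantic score, both surviving parses would be equally valid to the syntax level alone; this is exactly why the pipeline cannot be a one-way street.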

Another problem is the workability of syntax itself. Standard English has a wide variety of phrases at the sub-sentence level. Even when filled with real words, such phrases may create ambiguous constructs; the probability is only higher if we consider POS alone. The longer a sentence, the more phrases it contains, and the less reliable its parse.

Finally, alongside generalized syntax, humans widely use various expressions based on concrete words. Such an expression may form a whole sentence or only a phrase within it.

Human language is like a programming language without automatic error checking: it is up to the users to reduce ambiguity. Human language is very redundant, so there are plenty of options. Don't attach too many POS to a single word; say, English syntax allows a noun as an attribute to another noun, so there is no need to declare it as an adjective in the dictionary. Don't use long sentences with several clauses; break them down into simple sentences. If you see that some construct is ambiguous, replace it. Usually there are several ways to say the same thing.
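The "don't over-tag" advice can be made concrete with a minimal sketch. Here "stone" is listed only as a noun in an assumed toy dictionary, and a single noun+noun grouping rule handles its attributive use, so no separate adjective entry is needed:

```python
# Toy dictionary: "stone" deliberately has only one POS.
DICTIONARY = {"stone": ["noun"], "wall": ["noun"]}

def group_noun_phrase(words):
    """Group a 'noun noun' pair as (attribute, head) without ever
    declaring the first noun as an adjective in the dictionary."""
    tags = [DICTIONARY[w][0] for w in words]
    if tags == ["noun", "noun"]:
        return {"head": words[1], "attribute": words[0]}
    return None

print(group_noun_phrase(["stone", "wall"]))
# {'head': 'wall', 'attribute': 'stone'}
```

One syntax rule instead of an extra dictionary entry: the word keeps a single POS, and the parser gains one fewer source of ambiguity.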

Computing

A text supplies us with knowledge that we use later for practical purposes. Which purposes, specifically? The term intelligence may be defined as the ability to solve problems; that is why knowledge accumulation is needed. Let's analyze in detail how it happens.

Syntax

Separate words group into phrases, clauses, and sentences; then into paragraphs, chapters, books, and whole libraries. The structure above the sentence is less standardized. An encyclopedia is like a library, except that the latter may contain several books by different authors on the same topic, while in the encyclopedia they are concentrated into a single article.

Syntax is a completely artificial formal system. Ideally, it should be detached from both lexicon and semantics, and word grouping should depend only on the part of speech. In real languages there are many exceptions to this principle, but even without them formal grammars are problematic. Let's look at these problems in detail.

1. How should the structure of the sentence be represented at the very top level? The popular answer is

sentence(subject phrase, predicate phrase)

Both arguments are equal here. Alternatively, one of them is considered the main one. If that is the predicate, human sentences become compatible with formal logic:

predicate(subject phrase, predicate phrase)

Here the subject is just one argument of the predicate (in the logical sense), alongside the direct object and the other members of the sentence. The semantic load of this representation is also clear: each sentence represents some action, and the whole text answers the question "What happens around?"
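The predicate-centred representation can be sketched directly. The sentence and the formatting below are illustrative assumptions; the point is that the verb becomes the logical predicate and the subject is demoted to its first argument, on a par with the objects:

```python
# Render a sentence as predicate(subject, object, ...), the
# formal-logic-compatible form described above.
def to_predicate(subject, verb, *objects):
    """The verb is the predicate; subject and objects are its arguments."""
    return f"{verb}({', '.join([subject, *objects])})"

# "The cat eats fish" -> one action answering "what happens around?"
print(to_predicate("cat", "eats", "fish"))  # eats(cat, fish)
```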

2. During word grouping, the process passes through a hierarchy. Different textbooks present it differently, and some levels may be absent. At the first step, various phrases of the lexical level are recognized: the noun phrase, verb phrase, adjective phrase, adverb phrase, and prepositional phrase.

At the second step, these form phrases of the sentence level: the subject phrase and the predicate phrase. Secondary members of the sentence, such as the direct object, are usually not single words but whole lexical phrases. Note that the same noun phrase may become either a direct object or a subject.

A clause is like a simple sentence, only it is part of a complex or compound sentence.
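The two-step hierarchy can be sketched as a pair of passes. The bracketing rule (determiner + noun makes a noun phrase) and the role names are simplifying assumptions; note how the same NP shape ends up as subject in one slot and inside the predicate in another:

```python
# Step 1: fold lexical-level phrases (here only det+noun -> NP).
def lexical_phrases(tagged):
    """tagged: list of (word, pos). Returns a list of (label, text)."""
    phrases, i = [], 0
    while i < len(tagged):
        if (tagged[i][1] == "det" and i + 1 < len(tagged)
                and tagged[i + 1][1] == "noun"):
            phrases.append(("NP", tagged[i][0] + " " + tagged[i + 1][0]))
            i += 2
        else:
            phrases.append((tagged[i][1].upper(), tagged[i][0]))
            i += 1
    return phrases

# Step 2: sentence-level roles; the first NP becomes the subject,
# the verb and everything after it form the predicate phrase.
def sentence_phrases(phrases):
    return {"subject": phrases[0], "predicate": phrases[1:]}

tagged = [("the", "det"), ("dog", "noun"), ("chased", "verb"),
          ("the", "det"), ("cat", "noun")]
print(sentence_phrases(lexical_phrases(tagged)))
```

The second NP ("the cat") is structurally identical to the first but lands inside the predicate as a direct object, which is exactly the interchangeability the text points out.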

Semantics

When syntactic analysis is complete, semantic analysis becomes available: it takes into account not only parts of speech but the words themselves. Meanwhile, some part of the meaning may already be recovered from the syntactic structure. Usually a noun corresponds to some object, a verb to an action. Of course, there are other variants too, but all of them may be explicitly enumerated. Then we will have a general description of semantics in as much detail as possible. If such a description is implemented programmatically, it is enough to supply a dictionary, and the program will correctly understand any text composed of those words.
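How much meaning the structure alone carries can be shown with a tiny POS-to-type table. The table below is an assumed toy; a real system would also need the explicitly enumerated exceptions the text mentions:

```python
# Coarse semantic types recoverable from POS alone, before the
# words themselves are consulted (assumed toy mapping).
POS_TO_TYPE = {"noun": "object", "verb": "action",
               "adjective": "property", "adverb": "manner"}

def coarse_semantics(tagged):
    """Attach a semantic type to each (word, pos) pair."""
    return [(word, POS_TO_TYPE.get(pos, "unknown")) for word, pos in tagged]

print(coarse_semantics([("dog", "noun"), ("runs", "verb")]))
# [('dog', 'object'), ('runs', 'action')]
```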

An addition to the previous section, Formal semantics

While conjunctions represent relations between clauses, that is, between actions, prepositions represent relations between objects expressed by noun phrases. Generally, there are two such objects. The second immediately follows the preposition, but where is the first? If the prepositional phrase stands after the subject, it is linked to that subject. If it is part of the predicate phrase, there are several variants. It may be a prepositional object: "We spoke about computers." An adverbial modifier: "He lives in this town." It may also be an attribute to some noun in the predicate phrase: "He moved to the table in the corner," that is, the table which stands in the corner.
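The attachment choices inside the predicate phrase can be sketched as a small decision routine. The three role labels come from the text; the lookup table and the fallback rule that picks among them are an assumed toy, not a real disambiguator:

```python
# Assumed toy table: (verb, preposition) -> role of the PP.
ATTACHMENT = {
    ("spoke", "about"): "prepositional object",
    ("lives", "in"):    "adverbial modifier",
}

def attach(verb, prep, prev_noun=None):
    """Decide what the PP inside the predicate phrase attaches to.
    If the (verb, prep) pair is unknown but a noun precedes the PP,
    assume the PP is an attribute of that noun."""
    role = ATTACHMENT.get((verb, prep))
    if role is None and prev_noun is not None:
        return f"attribute of '{prev_noun}'"  # "the table in the corner"
    return role or "unresolved"

print(attach("spoke", "about"))                  # prepositional object
print(attach("moved", "in", prev_noun="table"))  # attribute of 'table'
```

Real PP attachment is a classic ambiguity in parsing, which is why the text's advice to rephrase ambiguous constructs applies here with full force.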