Part I – Chapter 7

About Cognitive Development

1. The Fourth R

2. Empirical Studies

3. Piaget’s Model

4. Piaget’s Experiments

5. Lines of Inquiry

6. Experimental Techniques

7. Private Languages


1. The Fourth R

Logic is essential to human cognitive and psychological development and to successful living, and should begin to be taught from an early age. We speak of The Three R’s – reading, (w)riting and (a)rithmetic – as being the fundaments of schooling. But a Fourth R should be added, viz. – reasoning, i.e. awareness and use of logic.[1]

Exactly when such educational effort should be carried out is, of course, open to debate. We have to understand the natural development of logical abilities and skills in the absence of interference, before we can determine when best to try and apply some artificial improvements. It is no use trying to impose skills on a child that the child is not biologically ready for; it may even be counterproductive to do that.

Of course, the notion of ‘natural’ development is a bit idealistic – since our individual skills are in practice affected not only by purely biological factors, but also by the thinking abilities and habits of the surrounding society we personally grew up in (although some social currents may affect some individuals more than others). So rather, we should distinguish between subconscious absorption of logical skills, and their more conscious training.

When we speak of ‘cognitive development’, we refer to a wide, varied field of study – which ranges from the sensory, intuitive and rational (purely cognitive) functions, on to emotional, psychological and social factors.

Clearly, our interest here is the former domain, the purely cognitive aspects. Moreover, we are interested in experience (sensations and self-intuitions) only insofar as it is ‘processed’ by reason; i.e. the experiential as raw data for logical treatment. With this in mind, we should perhaps consider our present object of study as more precisely: ‘development of the faculty of reason’; i.e. it concerns our rational powers and their use, or our logical abilities and skills.

Note: I understand the term ‘cognition’ very broadly, as including perception of sense data and their mental equivalents, intuition of self and the functioning of self (including volition), as well as conception and proposition, logical insight and argumentation. Moreover, it is often taken to include physical, mental and volitional processes preparatory to such cognitive acts, made to position the Subject for cognition; e.g. turning his attention in some direction.

In its initial sense, the term ‘cognition’ is as wide in extension as ‘consciousness’, differing from it only in intension. Consciousness refers more to the relation between subject and object, or the eventual ‘substance’ of such relation that connects the two; whereas cognition stresses the impact of such relation on the subject, an intuited event of knowing within the self or soul. Both also imply a state of ‘awareness’ in the subject – a readiness to receive information, or alertness.


2. Empirical Studies

It is important for logicians to empirically study the development of logic in people’s minds, from birth to maturation and onward. Obviously, the use and understanding of logic varies greatly from individual to individual (extensional variation), and within the life of any individual (natural modality change).

Like most formal logicians in history, who work in an ivory tower of sorts, I have not personally studied the matter greatly; but from the examples given by Jean Piaget (Swiss, 1896-1980) and his successors (some of whom, of course, did not agree with all his viewpoints), I have become convinced of the value of such studies. Armchair logicians like myself do of course resort to introspection and personal memories, as well as to casual observation and to written history (the histories of popular beliefs and statements, of philosophy and of science); but Piaget ranged more widely or at least in a new direction, studying real children in a purposeful and structured manner, under laboratory conditions.

Logicians had until then tended to concern themselves with the setup of mature minds, almost totally ignoring the fact that logical skills are acquired over time. Such acquisition presumably depends on both nature and nurture:

(a) In part on physiological and neurological maturation (which may vary from one person to the next); and:

(b) In part on cultural osmosis and educational offering (which varies from culture to culture, geographically and across history); and finally:

(c) In part on the efforts of each individual to study logic and train himself or herself in it, and if need be to engage in independent research and thought on the issues involved.

With regard to the development of our organs of cognition, a distinction ought to be made between the time of purely physical maturation – and the time needed to learn how to properly use already mature organs. Some children manage to make use of their organs more readily than others, due to different volitional dispositions as well as family and social contexts. Organ development per se refers to a potential; the latter must still in turn be actualized.

Logic has evidently got a geography – different peoples, in different cultures, rely on different logical beliefs and skills. To give a common example, East and West are thought to have very different logical paradigms. But marked differences are possible more narrowly, even within cohabiting ethnic, social or family groups. In some societies, males and females may display considerable differences. Even though they do occur, such differences should not be overrated: being all members of the same species our minds are basically similarly constituted and operative. Naturally, people who inhabit different ‘worlds’ throughout their lives will exhibit different cognitive emphases.

Logic of course also has a history, which logicians would also be wise to take into consideration. I have given some guidelines for such consideration, stressing the need to distinguish between (a) the mere practice of some logical skill, (b) the self-awareness of such practice, and finally (c) the theoretical assimilation of it (formalization and validation, and integration into the larger context of theoretical knowledge). An example I gave in some detail[2] was the a-fortiori argument.

While acknowledging cultural and historical differences in emphasis, logicians should not relativize and withhold judgment. They may pronounce some cultural or historical prejudice or method as inadequate to the task of knowing. For example, with regard to geography, we may pronounce judgment against the anti-rationalism of certain oriental or western logical practices or systems, such as those of Nagarjuna or of Greek sophism. Or again, with regard to history, we may marvel at the twists and turns of medieval and early renaissance thinking as described in Michel Foucault’s Archeology[3].

In empirical studies like Piaget’s, the methodology used must be rigorous. This depends in large part on the understanding of logic by the experimenters themselves – if such knowledge is lacking the wrong questions will be asked, and the answers will surely be misinterpreted.


3. Piaget’s Model

It should be stressed that Piaget’s work is not only about child psychology or cognitive development – many of his observations and concepts may be considered as pure logic theory, equally relevant to adult cognitive processes. For example, his distinction between assimilation and accommodation is very apt.

“As modeled by Piaget, the child explores the world and observes regularities and makes generalizations, much as a scientist does. Piaget… recognizes two fundamental cognitive processes that work in somewhat reciprocal fashion. The first is what Piaget called assimilation, a process that involves incorporating new information into an already existing cognitive structure [or “schema”]… The second process, accommodation, [serves] to form a new cognitive structure that can incorporate the new information… Cognitive development, according to Piaget, represents a dynamic equilibrium between the two processes of assimilation and accommodation.”

(Encyclopaedia Britannica, 2004. Emphases mine.)

These two processes are not limited to developing minds, but continue to be used throughout our lives. They are not limited to one area of logic, but can be adapted to many different fields. For this reason, this terminology is well worth adopting.

For example, I have proposed a distinction between concept-formation by means of similes and that by means of metaphors. Both are ultimately analogical modes of thought, but the latter is less obvious and more creative than the former. The former may be classified as assimilation, the latter as accommodation. In class-logic terms, assimilation is classifying a particular into an already existing class, whereas accommodation is proposing a new class for it. The same distinction could be applied to theory-formation. If one resorts to pre-existing ideas, it is assimilation. If one finds no adequate solution to the problem at hand that way, one is forced to invent something quite new – this is accommodation.


4. Piaget’s Experiments

Some of Piaget’s experiments, or his conclusions from them, strike me as absurd, or at least unclear.

In one experiment, for example, Piaget seemingly examined whether or when children realized the Lever Principle, i.e. that ‘weight times length’ is equal on both sides of a balance in equilibrium. Now I ask – did the experimenter expect children to intuitively know what was not known in the history of mankind till quite late, i.e. until a genius called Archimedes discovered it? Surely, each of us remembers having been taught this principle at school (although it is not totally inconceivable that some children guessed it before).

More to the point, there is nothing ‘innate’ or inherently ‘logical’ about this principle. It is a physical truth, which is empirically evident but far from obvious; one can well imagine a world in which matter would behave differently. So, what was Piaget looking for? All he could hope to find out, at best, is when children are able to understand this principle, i.e. at what age they can be taught it. For, if they already knew it, it was probably due to having learnt it from adults somehow.
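For concreteness, the Lever Principle at issue can be stated and checked in a few lines; the particular weights and arm lengths below are made up for illustration.

```python
# A worked illustration of the Lever Principle the experiment presupposes:
# a balance is in equilibrium when weight times length is equal on both sides.

def in_equilibrium(w_left, l_left, w_right, l_right):
    """True if the two products (weight times arm length) balance."""
    return w_left * l_left == w_right * l_right

# A 6 kg weight at 2 units from the pivot balances a 3 kg weight at 4 units...
print(in_equilibrium(6, 2, 3, 4))   # True  (6*2 == 3*4 == 12)
# ...but not a 3 kg weight at 3 units.
print(in_equilibrium(6, 2, 3, 3))   # False (12 != 9)
```

Nothing in this rule is self-evident to untutored intuition, which is precisely the point made above: it is an empirical regularity, not a law of thought.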

Again, in another experiment, liquid in a short, wide container is poured out into a tall, narrow one, and the child is asked which of the two containers holds more liquid. Piaget found that children younger than about seven tended to regard the taller (though narrower) container as holding more liquid. He apparently considers this as informing us on when children acquire understanding of the Law of Conservation of Matter.

But it seems to me that this is only one possible interpretation of events. It could be that the younger children wonder whether ‘liquids expand or contract like gases, to fit the shape and size of their containers’. This hypothesis concerning physical law is not unthinkable; it is a fair alternative to the ‘liquids have constant volume’ hypothesis. The latter is not a ‘logical’ absolute – it is a mere physical law, which happens to be true, but whose truth it has taken mankind a long time to realize. Why should children be expected to have the genius to go straight for the correct alternative? And if they did so, would that be a measure of intelligence, or of narrow-mindedness?[4]
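The geometry behind the experiment can be made explicit with a small sketch; the container dimensions below are invented for illustration. The same volume of liquid simply reaches a much greater height in a narrower cylinder, which is what the ‘constant volume’ hypothesis asserts and the younger children apparently doubt.

```python
import math

# Sketch of the point at issue: the same volume of liquid looks 'taller'
# in a narrower cylinder. The dimensions below are made up.

def height_in_cylinder(volume, radius):
    """Height the liquid reaches in a cylinder: V = pi * r^2 * h."""
    return volume / (math.pi * radius ** 2)

volume = 500.0                                      # same quantity of liquid (cm^3)
h_wide = height_in_cylinder(volume, radius=6.0)     # short, wide container
h_tall = height_in_cylinder(volume, radius=2.0)     # tall, narrow container

print(round(h_wide, 1), round(h_tall, 1))  # 4.4 39.8 - volume unchanged throughout
```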

Here again, the onlooker might be tempted to think that knowledge of the law of permanence of matter is considered by Piaget to be a natural development (either innate in human brains or logically inevitable), whereas the principle is more probably learned from others (whether by osmosis[5] or by being explicitly taught it), and maybe in very rare cases arrived at by personal efforts of inductive logic (i.e. by observation, formulation of hypotheses and elimination of inappropriate ones).

Perhaps I do Piaget an injustice, but regarding instances like these, it seems to me that his experimental goals or the conclusions he drew from his experiments were not always clear. Perhaps his intentions or inferences were clear in his mind but he did not express them clearly enough, making it possible for other people to misread them.

Be all that as it may, we could formulate the point made as a methodological principle: everything must be made clear, so that other people do not misinterpret what was sought or what conclusions were drawn. Scientists should be careful to specify their interpretations, explicitly stating what should and what should not be read into them.

With regard to these experiments, it seems to me (without having looked at Piaget’s actual research notes) that the issues of when and why are not clearly distinguished and correlated. First, we should list the various ways an item of knowledge or a skill might make its appearance in the subjects – instinct (innate tendencies), just logic (application of the laws of thought), personal observation combined with logic (induction), learning through examples, hints or explicit lessons (from peers, parents, teachers or the media). Secondly, when such knowledge or skill does make its appearance, we would want to devise (if possible) some test that would reveal to us which of the aforesaid sources was the operative one in the individual case at hand. Otherwise, the information obtained is too vague and confused to allow any conclusion to be drawn (other than an age range for the apparition of the knowledge or skill in the children examined).

Piaget, in his experiments (both the transverse and the longitudinal), often seems (to me) to confuse actual development with potential. The conclusion one can draw from them is that the child happens to have reached this or that level at age so and so; but the tests do not trace the exact genesis of such attainment (explaining why different individuals vary slightly), and they do not make clear whether the child could have done better with a bit of training.

Note, however, with regard to the latter issue, I have been told that Piaget (and others) have indeed found that children can often be trained to improve their performance, but what they thus learn remains rather localized to the precise notion or skill concerned and is not readily passed on to other, analogous items. This could be taken to imply that the potentiality and actuality occur pretty much in tandem, and the causes of actualization of potentiality are of little significance. I do not know how far this general conclusion may be relied on. The fact is some children are average, some are precocious, and some are retarded – the questions remain: why and what can be done about it?

The above examples make clear another important criticism (already hinted at) I would put to Piaget, and some other researchers on the ground: some of this research is billed to be about the development of logical skills, when in fact it is nothing of the sort, but rather about the acquisition of knowledge of basic physical principles. Now, this is a criticism only insofar as the two topics are confused. The acquisition of knowledge is of course not denied to be an interesting topic; but, though all knowledge acquisition implies a logical process of some sort (which it would be interesting to pinpoint for each item of knowledge in each individual subject studied), this topic is not identical with the issue of logical development.

The latter research is the one most interesting to us in the present context. It is the use of empirical techniques to study the development of logical skills in humans. To engage in such research, one must have a pretty clear idea as to what is meant by ‘logical’ skills. Clearly, this term must refer to the whole science of logic, and more broadly epistemology; i.e. to all the inductive and deductive notions, acts and processes armchair logicians have identified as used in the acquisition of knowledge. Ideally, each and every notion, act or process should be studied in turn, although in practice this may be hard to do.

Indeed, such finely tuned investigation may be out of our reach in many instances, judging by the way Piaget (and others) have tended to prefer general conclusions such as the following (which are extremely interesting anyway):

“Piaget saw the child as constantly creating and recreating his own model of reality, achieving mental growth by integrating simpler concepts into higher level concepts at each stage. He argued for a “genetic epistemology,” a timetable established by nature for the development of the child’s ability to think, and he traced four stages in that development.

He described the child during the first two years of life as being in a sensorimotor stage, chiefly concerned with mastering his own innate physical reflexes and extending them into pleasurable or interesting actions. During the same period, the child first becomes aware of himself as a separate physical entity and then realizes that the objects around him also have a separate and permanent existence.

In the second, or preoperational, stage, roughly from age two to age six or seven, the child learns to manipulate his environment symbolically through inner representations, or thoughts, about the external world. During this stage, he learns to represent objects by words and to manipulate the words mentally, just as he earlier manipulated the physical objects themselves.

In the third, or concrete operational, stage, from age 7 to age 11 or 12, occurs the beginning of logic in the child’s thought processes and the beginning of the classification of objects by their similarities and differences. During this period, the child also begins to grasp concepts of time and number.

The fourth stage, the period of formal operations, begins at age 12 and extends into adulthood. It is characterized by an orderliness of thinking and a mastery of logical thought, allowing a more flexible kind of mental experimentation. The child learns in this final stage to manipulate abstract ideas, make hypotheses, and see the implications of his own thinking and that of others.”

(Encyclopaedia Britannica, 2004. Emphases mine.)

Although Piaget’s theories were later challenged in various respects – for instance, other researchers considered that he “tended to overestimate the ages at which children could first perform certain cognitive tasks” (op. cit.) – his work rightly deserves to have remained the main reference in this field.


5. Lines of Inquiry

Assuming some ingenious experimenters can come up with appropriate setups, which can indeed yield definite conclusions, I (as a theoretical logician) would suggest the following as some important lines of inquiry that they should pursue. Some of these questions have (I acknowledge) already been asked and answered; but many (I submit) have not. For each topic listed, the questions to ask are:

(a) As of what age, or in what age range, perhaps within a given historical and social context, do we acquire the potential for these specific logical skills?

(b) Under what kinds of favorable conditions and triggering circumstances are these potential logical skills actualized?

My wish-list of topics would (offhand) include, though not be limited to, the following:

· When and how do we get to understand pointing, negating, abstracting, naming, and other such fundamental acts of reason?

· When and how can the laws of thought be said to become operative in thought and action?

· When and how do the basic logical notions of sameness or difference, consistency or contradiction, exhaustiveness or incompleteness, as well as derivative notions like implication and disjunction come into use?

· When and how do the notions of truth vs. falsehood, reality vs. illusion, and related modal notions like uncertainty, necessity, possibility, appearance, and so on – come into use?

· When and how do we begin to distinguish between our sensory perceptions and our imaginations?

· What of introspection, intuition of self and one’s own cognitions, volitions and valuations?

· When and how are different places and times respectively distinguished; and when and how are the larger abstractions of space and time generated?

· When and how do children begin to conceptualize, to classify, to formulate categorical propositions, to formulate hypothetical propositions, and so forth?

· When and how do we begin using adductive processes, formulating hypotheses and then testing them, and then confirming or weakening them, rejecting or adopting them?

· When and how do we start engaging in syllogistic and other deductive practices?

· When and how do we start using causal logic – resorting to logical, extensional or natural explanations, or identifying things or agents as causes of events?

· When and how do the ideas of formal logic – e.g. symbolizing terms (with X or Y), making general statements about reasoning, distinguishing valid from invalid arguments, and so forth – become understandable to youths?

· As of what age does the logic of paradoxical propositions become comprehensible?

And so forth – we can in this way range throughout the science of logic, and ask the same question of each known logical notion, act or process.

As far as I can tell, experimenters have far from completed the work of empirically tracing our cognitive development. The questions they have been asking so far have not always been pertinent and systematic enough, because their knowledge of logical science has been rather limited and scattered. For instance, there has been insufficient emphasis on the ‘laws of thought’ as the basic instruments of logic (although it has been found, for instance, that children before age 6-8 years tend to juxtapose rather than confront conflicting statements, i.e. they accept them successively without comparing them and seeking to harmonize them).

Although in my view there is yet a lot of research to be done in cognitive development, I do of course admit that much work of great value has already been done. I have no desire to belittle anyone’s achievements. For example, I was interested to learn that a child begins to understand designation, i.e. the intent of pointing at things, and even the intent of simple word-sounds, as of nine months of age! Or again, the association of different sensations and their consideration as different aspects of one and the same physical object, is a gradual process, which may take till age 8-9 months or even as late as 18 months.

Differentiation and integration (or analysis and synthesis of percepts), and classification (grouping and subdividing, concept formation), have also been studied. The child at first views objects (e.g. its mother) as a totality, then (till age 3-4 years) distinguishes their various components (e.g. mother’s smell or face), and later still (till age 10) is able to reconstruct wholes from parts. Children become able to classify in two stages: first (at age 4-6.5 years), they group things in single classes, e.g. “red” or “round”; and later (at 7-8 years), they can handle compound classifications, e.g. “red circles”, and subdivisions, e.g. “circles may be red or green”. All such findings are, of course, of logical significance.


6. Experimental Techniques

With regard to experimental techniques, researchers no doubt do keep in mind – and if not, ought to keep in mind – certain guidelines like the following (very offhand):

· It is important for researchers to ensure they do not project their own thoughts onto the child’s. Does the child understand the questions asked by the experimenters; or are these tricky[6], ambiguously stated or stated in terms still unknown to the child, so that the answers are unreliable? Does the mere asking of a certain question teach the child something it did not till then know, and so skew the experiment? Does asking a question in a certain way insinuate a certain answer, or reduce the probability of a correct answer? Can the child be intentionally taught some relevant notions in such a way that it can answer more questions, more precisely; i.e. reveal more about itself?

· How far can one generalize results from one or two children in one place and time, to all children? Clearly, researchers should test the limits of their generalizations; e.g. in different cultures. In some cases, the tests used on children should be tried on adults; we might well find many adults (as well as children) failing them. For example, one test found that children (I did not note their ages or other experimental details) tend to regard “If–then” statements as exclusive, i.e. as meaning “If and only if – then” (what modern logic has labeled “iff–then”). I think that is kind of funny, because in my experience many adults are still not clear as to the difference between these two forms!

· Experimental queries should be clearly formulated. To avoid all ambiguity, logical statements should be expressed in formal terms, rather than merely descriptively. Some examples[7]:

Ø “Awareness that one changes opinion over time” may be formally stated as “I used to believe X, but now I believe Y (or more vaguely, not X)”.

Ø “Awareness of the differences in perspective by different people” = “I think X, but my friend thinks Y (or more specifically, not X)”.

Ø “Awareness that some opinions are false” = “Someone (I or another) believes X, but in fact Y is true (or at least, X is false)”.

Ø “Awareness of difference between appearance and reality” = “It seems that X, but in fact Y (or at least, not X)”[8].

Ø “Prediction of belief changes in different contexts” = “If I saw X, I would get to believe Y”.[9]
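The ‘if–then’ vs ‘iff–then’ confusion mentioned above can be made explicit by tabulating the two forms; the short sketch below prints their truth tables side by side, showing they differ in exactly one row.

```python
from itertools import product

# Material implication ('if P then Q') vs the biconditional ('if and only if'):
# they agree everywhere except where P is false and Q is true.

implies = lambda p, q: (not p) or q     # 'if P then Q'
iff     = lambda p, q: p == q           # 'iff P then Q'

for p, q in product([True, False], repeat=2):
    print(p, q, implies(p, q), iff(p, q))
# Output rows: (T,T): T T | (T,F): F F | (F,T): T F | (F,F): T T
# The two forms disagree only in the (False, True) row:
# there 'if-then' holds but 'iff' does not.
```

A subject who treats every conditional as exclusive is, in effect, collapsing the third row of the implication table into that of the biconditional.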


7. Private Languages

The following is mere speculation on my part, but I wonder if a child might not, at some stage in its cognitive development, before discovering and adopting all the language(s) of parents and neighbors, have a temporary, private language – a simple, personal invention, consisting of a small number of words, which might be variable (i.e. a word might be formed ad hoc and soon forgotten, to be later replaced by another word for the same thing as needed).

I would not be surprised if this were the case, given my theory that language is a tool of personal thought, before it is one of interpersonal communication. I include in the term ‘language’ not only (imagined or spoken) sounds, but (imagined or played-out) gestures[10]; for the essence of language is intending meanings for symbols, intention being an act of volition of the subject (or soul), or more precisely one of velleity.

Note however that, if this hypothesis is correct, the private language of early thought should not necessarily be identified with the sounds uttered out loud by the child or its gestures, because it may be that at this stage the coordination between thought and speech or gestures is not yet perfect. Nevertheless, we should not exclude that some ‘baby talk’ may be part of the child’s private language, while some of it (like papa, mama) may be inspired by the public language overheard in the family environment.

I gather that Piaget considered that children do not grasp language before they are about 18 months old, although they already manifest considerable pre-linguistic intelligence. Granting this, it would be at about that age that a transitional private language might appear. This would allow the child to begin personally engaging in some verbal thoughts (those it is already able to have), even before having received corresponding generally accepted words from its environment.

Thus, I am suggesting that a child instinctively learns a little about language use within himself or herself, before learning the specific, much more developed language taught by the surrounding segment of humanity. The child may first invent a private word, then in an intermediate stage substitute for it a word more or less resembling a public word for the same object, then finally master the public word; in that way, the transition from private to public language would be gradual, with some overlap.

Moreover, it may well be the development of this internal ability that makes possible the subsequent development of the external one. This ability to invent language certainly exists at a later age, in older children and adults. Without such an ability, humanity would not be able to develop language further as need arose. And of course, language had to have a beginning, somewhere in the depths of our history. Therefore, the issue here is not whether private language occurs at all, but only at what stage of development it occurs.

I have heard of an experiment, which a prince ordered made, in which a child was isolated from other people from birth to ensure it did not learn from anyone how to speak. The experiment aimed to discover whether mankind has a natural language. Not surprisingly, as it grew up the child did not seem capable of any meaningful speech.

This story may be true, but it proves nothing, because the language the child might have invented for personal use would be incomprehensible to others. It was silly to expect the child to naturally speak English, or whatever it was the ruler spoke. Furthermore, the child’s internal use of language may never have translated into external speech or gesture.

Moreover, judging by modern findings, a child so imprisoned, one deprived of affection and of sensory and intellectual stimulation, would grow up as an idiot, if it survived the ordeal at all. Therefore, such an experiment would be distortive and not answer the question asked.

As I said, the hypothesis advanced here is speculative. I do not know what modern experiment we could devise that would settle the issue. Not that it is very important to do so; just a matter of curiosity.

[1] The following reflections on cognitive development are intended to put forward some ideas and recommendations of a logician – they are not the work of an early childhood expert or of a cognitive development experimenter. Information I give here is based on notes I took during a series of lectures on related subjects I attended at Geneva University a few years ago to put myself in the picture.

[2] In Judaic Logic and related works.

[3] See also my essays on this.

[4] Indeed, I would expect children nowadays, at least those raised on daily cartoon watching on TV, to rather first imagine matter to be almost infinitely elastic!

[5] This causality, cultural “osmosis”, is worth studying in detail. For example, the idea of class-inclusion is inherent in the very use of words; therefore, one could say that the moment a child begins to learn a language, it is simultaneously absorbing the notion of class-inclusion. Of course, that lesson may not be immediately clear, but may become clearer with time.

[6] For example, a child is shown 5 apples and 3 oranges and asked whether there are more apples or fruits. The child tends to answer “more apples”, confusing the subclass of oranges with the genus fruits. This is a tricky question, because it compounds a mathematical operation (5 > 3) with a classificatory act (realizing that apples as well as oranges are fruits). Some adults might be misled by such a question. Of course, the point is that the older the child, the less likely it is to be tricked! (Whence the expression “it’s like taking candy from a baby”.)

[7] These examples are taken from actual experiments, through which children of about two were found to have the described abilities. I do not remember the details of the experiments, and so cannot comment on their accuracy. All I noted was “refer to article by Gopnik and Astington”.

[8] E.g. “this looks like a stone, but is in fact a sponge” – though note that this could also be viewed as an example of the act of reclassification.

[9] This is a beginning of hypothetical thinking, note.

[10] Later, of course, a third form of language is developed, namely written language, the use of visual symbols.
