Brain. Seeing. Not speaking.

Photo by Cameron Casey from Pexels

A brain region has been discovered that processes only spoken, not written, words. Patients in a new study could comprehend words that were written but not said aloud, and could write the names of things they saw but not verbalize them. For instance, if a patient in the study saw the word ‘hippopotamus’ written on a piece of paper, they could pick out the picture of a hippopotamus from a set of flashcards. But when that patient heard someone say ‘hippopotamus,’ they could not point to the picture of the animal.

“They had trouble naming it aloud but did not have trouble with visual cues,” said senior author Sandra Weintraub, professor of psychiatry and behavioral sciences and neurology at Northwestern University Feinberg School of Medicine. “We always think of these degenerative diseases as causing widespread impairment, but in early stages, we’re learning that neurodegenerative disease can be selective with which areas of the brain it attacks.” (1)

Spoken words.

Written words.

Mute.

Words expressed can never convey any message.

It is this silence which holds the dearest secrets.

Within its mist you rediscover yourself.

Staying silent.

Holding still.

Outside the realm of words.

Staying speechless.

And yet feeling full.

For this is the only place where things which cannot be expressed…

Can ever be expressed…

Listening to words…

Photo by Dave Meckler from Pexels

For humans to achieve accurate speech recognition and communicate with one another, the auditory system must recognize distinct categories of sounds – such as words – from a continuous incoming stream of sounds. This task becomes complicated when considering the variability in sounds produced by individuals with different accents, pitches, or intonations. In a new paper, researchers detail a computational model that explores how the auditory system tackles this complex task. (1)

In the beginning there was silence.

And then… noise.

Noise cancelling everything out.

With time, we managed to get used to it.

In time, we managed to recognize words.

And we thought we discovered Logos.

Meaning out of nothingness.

Order out of chaos.

But there can be no such thing.

For chaos is chaos.

And noise is noise.

Listen carefully.

Beyond the words.

And you will see the void.

Don’t be afraid of that void.

For it is you.

Unique.

Alone.

Complete.

Staying silent.

Listening to everything…

Before it was ever spoken…

Μπορείς να με καταλάβεις;

Photo by Daniel Maforte from Pexels

As two people speak, their brains begin to work simultaneously, synchronizing and establishing a unique bond. This is what neuroscience calls brain synchronization.

New research by the Basque Center on Cognition, Brain and Language (BCBL) in San Sebastián, published in the journal Cortex, confirms that this phenomenon depends on the language we use to communicate.

“When a conversation takes place in one’s native language, both interlocutors pay attention to it in a more global way, focusing on the sentences and the global content of the message,” stresses Jon Andoni Duñabeitia, co-author of the study. However, when the conversation takes place in a foreign language, attentional resources focus primarily on other linguistic levels that are more demanding for non-native speakers, such as sounds and words.

“In the latter communicative context we need to reconfigure our attention strategies so that we can understand each other, and this may be directly related to the difference in the areas synchronised during the conversation,” suggests Duñabeitia. (1)

Language.

Portrayed as a facilitator of communication.

But it is actually a barrier we must overcome.

Only when this barrier is lifted can we actually speak to each other.

Because communication and understanding never stem from logos.

But Logos is the result of the understanding we already have.

Speak to me.

And I will understand you…

Only if I already do…

Note: “Μπορείς να με καταλάβεις;” = “Can you understand me?” in Greek…

Language. Thought. Time. Dasein.

Photo by Maria Orlova from Pexels

The relationship between language and thought is controversial. One hypothesis is that language fosters habits of processing information that are retained even in non-linguistic domains.

Languages, for instance, vary in their branching direction. In typical right-branching (RB) languages, like Italian, the head of the sentence usually comes first, followed by a sequence of modifiers that provide additional information about the head (e.g. “the man who was sitting at the bus stop”). In contrast, in left-branching (LB) languages, like Japanese, modifiers generally precede heads (e.g. “who was sitting at the bus stop, the man”). In RB languages, speakers could process information incrementally, given that heads are presented first and modifiers rarely affect previous parsing decisions. In contrast, LB structures can be highly ambiguous until the end, because initial modifiers often acquire a clear meaning only after the head has been parsed. Therefore, LB speakers may need to retain initial modifiers in working memory until the head is encountered to comprehend the sentence.
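The contrast between head-first and head-last processing described above can be illustrated with a toy simulation. This is a hypothetical sketch, not the model used in the study; the function names and the memory measure are illustrative only.

```python
# Toy illustration of head-first (right-branching) vs head-last
# (left-branching) sentence processing. Hypothetical sketch, not
# the computational model from the study.

def process_rb(tokens):
    """Head comes first: interpret it immediately, then attach
    each modifier as it arrives. Working-memory load stays low."""
    head, *modifiers = tokens
    interpretation = head
    peak_memory = 1  # only the current word needs holding
    for mod in modifiers:
        interpretation = f"{interpretation} [{mod}]"
    return interpretation, peak_memory

def process_lb(tokens):
    """Head comes last: every modifier must be buffered in working
    memory until the head arrives and resolves them."""
    *modifiers, head = tokens
    buffer = list(modifiers)       # held unresolved
    peak_memory = len(buffer) + 1  # buffered modifiers + incoming head
    interpretation = head
    for mod in reversed(buffer):
        interpretation = f"[{mod}] {interpretation}"
    return interpretation, peak_memory

# English-like (RB): "the man | who was sitting | at the bus stop"
rb = process_rb(["the man", "who was sitting", "at the bus stop"])
# Japanese-like (LB): "at the bus stop | who was sitting | the man"
lb = process_lb(["at the bus stop", "who was sitting", "the man"])
print(rb)  # peak memory stays at 1
print(lb)  # peak memory grows with the number of initial modifiers
```

The sketch makes the asymmetry concrete: the LB path must buffer every initial modifier, which is exactly the retention demand the study links to better recall of initial stimuli.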

Studies show that the link between language and thought might not be confined to conceptual representations and semantic biases, but may extend to syntax and its role in how we process sequential information, and to how working memory operates in speakers of languages with mixed branching or free word order. “[…] left-branching speakers were better at remembering initial stimuli across verbal and non-verbal working memory tasks, probably because real-time sentence comprehension heavily relies on retaining initial information in LB languages, but not in RB languages”, says Alejandro Sánchez Amaro, from the Department of Cognitive Science at the University of California, San Diego. (1)

Thinking in a sequence based on your language.

Languages based on the way you think.

A cosmos structured in the way you see.

People seeing based on how their brain is structured.

In a universe where things can go either right or left, there is only one correct way to go… (Nowhere!)

In a cosmos where thinking can be done in various ways, there is only one way to think… (Don’t think!)

Listen to the forest whispering in your ear…

Watch the dim light of existence cast shadows under the light…

Listen to the silence between the words…

There is a structure in the cosmos. And there is chaos in this structure. There is logos governing the universe. And inside logos, the deep darkness of stillness. Any structure imposes structures. Any way of thinking destroys other ways, equally possible and correct.

There is a unity in the clatter of phenomena.

You cannot see this unity from left and go right. Neither if you observe from right to left. You cannot know everything if you already know things. You cannot understand it all if you start by claiming that you understand something.

This unity you can only watch by watching everything.

And the only way to do that, is by watching nothing…

Is the man sitting at the bus stop?

Search inside…

What is a man?

And you will be astonished by the lack of any plausible answer…

Sign language. Spoken language limitations.

Photo by Sergei Akulich from Pexels

Sign languages are considered by linguists to be full-fledged and grammatically very sophisticated languages. But they also have unique insights to offer on how meaning works in language in general.

Sign languages can help reveal hidden aspects of the logical structure of spoken language, but they also highlight its limitations because speech lacks the rich iconic resources that sign language uses on top of its sophisticated grammar.

For instance, the logical structure of the English sentence “Sarkozy told Obama that he would be elected” is conveyed more transparently in sign language. The English sentence is ambiguous, the linguist Philippe Schlenker explains, as “he” can refer to Sarkozy or to Obama. Linguists have postulated that this is because the sentence contains some unpronounced – but cognitively real – logical variables like x and y.

If the sentence is understood as “Sarkozy (x) told Obama (y) that he (x) would be elected”, with the same variable x on Sarkozy and on “he”, the pronoun refers to Sarkozy; if instead “he” carries the variable y, it refers to Obama. Remarkably, in sign language the variables x and y can be visibly realized by positions in space, e.g. by signing Sarkozy on the left and Obama on the right. (1)
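The role of the logical variables can be sketched explicitly: the pronoun carries a variable, and the reading follows from which referent that variable is bound to. A minimal illustration (the dictionary and function names are hypothetical, not linguistic notation from the source):

```python
# Minimal sketch of the ambiguity: the pronoun 'he' carries a
# variable (x or y) bound to one of the two referents. In sign
# language this binding is made visible as a position in space.

referents = {"x": "Sarkozy", "y": "Obama"}

def reading(pronoun_var):
    """Resolve 'he' according to the variable it carries."""
    he = referents[pronoun_var]
    return f"Sarkozy told Obama that {he} would be elected"

print(reading("x"))  # 'he' = Sarkozy (e.g. signed on the left)
print(reading("y"))  # 'he' = Obama   (e.g. signed on the right)
```

Speech leaves the variable unpronounced, so both readings surface as the same string; signing the referents at distinct locations pronounces the variable, and the ambiguity disappears.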

See.

Now you know that it was about Sarkozy.

Listen.

Now you know what the other guy meant.

Feel.

Now you understand why the other one is even speaking to you.

Reach out with your senses.

It is all the same in the end.

Ideas may sometimes be conveyed better with images.

But blind people cannot see.

Ideas may sometimes be conveyed better with words.

But deaf people cannot hear.

In the end, you will need to reach out to understand what is said.

But not to the person talking to you.

But to the person inside you.

Listen carefully.

Do you hear anything?

See.

Listen.

Feel.

Why are you even listening?