Jupiter’s new moons. Silent foundations…

Photo by Spiros Kakos from Pexels

The public is often called upon to name newly discovered moons, as happened in the case of Jupiter a few years ago. (1)

We believe we can escape the past, but we cannot.

Jupiter’s moons will always have names based on Greek mythology.

Because they used to.

Everything we do, speak and write is based on things we used to do, speak and write.

Go back to the beginning.

At a time when we couldn’t speak or write.

And you will be astounded to discover that everything you speak of is based on silence…

Speaking AI… Silent logos…

Photo by 鑫 王 from Pexels

North Carolina State University researchers have developed a framework for building deep neural networks via grammar-guided network generators. In experimental testing, the new networks (called AOGNets) have outperformed existing state-of-the-art frameworks, including the widely-used ResNet and DenseNet systems, in visual recognition tasks.

“AOGNets have better prediction accuracy than any of the networks we’ve compared it to,” says Tianfu Wu, an assistant professor of electrical and computer engineering at NC State and corresponding author of a paper on the work. “AOGNets are also more interpretable, meaning users can see how the system reaches its conclusions.” (1)
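
As a rough sketch of what a “grammar-guided network generator” might mean, a tiny AND-OR grammar can be expanded recursively into a network of ordinary building blocks. This is an illustration only, not the actual AOGNet code; the node types, channel sizes and expansion rules below are assumptions made for the example.

```python
# Minimal sketch of a grammar-guided network generator (illustrative, not AOGNet itself).
import torch
import torch.nn as nn

class TerminalNode(nn.Module):
    """Leaf of the grammar: an ordinary conv-BN-ReLU building block."""
    def __init__(self, channels):
        super().__init__()
        self.op = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
    def forward(self, x):
        return self.op(x)

class AndNode(nn.Module):
    """AND node: compose its child sub-networks sequentially."""
    def __init__(self, parts):
        super().__init__()
        self.parts = nn.ModuleList(parts)
    def forward(self, x):
        for part in self.parts:
            x = part(x)
        return x

class OrNode(nn.Module):
    """OR node: alternative structures; here their outputs are simply averaged."""
    def __init__(self, alternatives):
        super().__init__()
        self.alternatives = nn.ModuleList(alternatives)
    def forward(self, x):
        outs = [alt(x) for alt in self.alternatives]
        return torch.stack(outs, dim=0).mean(dim=0)

def generate_block(depth, channels):
    """Recursively expand a toy AND-OR grammar into a network block."""
    if depth == 0:
        return TerminalNode(channels)
    # AND of two parts, each an OR over a deeper expansion and a terminal block.
    def part():
        return OrNode([generate_block(depth - 1, channels), TerminalNode(channels)])
    return AndNode([part(), part()])

if __name__ == "__main__":
    block = generate_block(depth=2, channels=16)
    x = torch.randn(1, 16, 32, 32)   # dummy feature map
    print(block(x).shape)            # torch.Size([1, 16, 32, 32])
```

The point of the sketch is only the mechanism: AND nodes compose parts, OR nodes offer alternatives, and the grammar, rather than a hand-designed blueprint, decides how the blocks fit together.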

Speak.

And you will think.

Think.

And words will come out of your mind.

We believe in Logos.

And we train our children accordingly.

But there is a secret we fail to grasp.

And in our endless chattering we choose to forget.

In the beginning there was not Logos.

Something gave birth to Logos.

In every phrase uttered, the same secret cries out loudly…

There is nothing you can say that hasn’t been said before…

For it has been the veil of endless aeons…

Beyond the stars and the darkness…

In the beginning, there was silence…

Brain. Seeing. Not speaking.

Photo by Cameron Casey from Pexels

Brain region discovered that only processes spoken, not written words. Patients in a new study were able to comprehend words that were written but not said aloud. They could write the names of things they saw but not verbalize them. For instance, if a patient in the study saw the word ‘hippopotamus’ written on a piece of paper, they could identify a hippopotamus in flashcards. But when that patient heard someone say ‘hippopotamus,’ they could not point to the picture of the animal.

“They had trouble naming it aloud but did not have trouble with visual cues,” said senior author Sandra Weintraub, professor of psychiatry and behavioral sciences and neurology at Northwestern University Feinberg School of Medicine. “We always think of these degenerative diseases as causing widespread impairment, but in early stages, we’re learning that neurodegenerative disease can be selective with which areas of the brain it attacks.” (1)

Spoken words.

Written words.

Mute.

Words expressed can never convey any message.

It is this silence which holds the dearest secrets.

Within its mist you rediscover yourself.

Staying silent.

Holding still.

Outside the realm of words.

Staying speechless.

And yet feeling full.

For this is the only place where things which cannot be expressed…

Can ever be expressed…

Listening to words…

Photo by Dave Meckler from Pexels

For humans to achieve accurate speech recognition and communicate with one another, the auditory system must recognize distinct categories of sounds – such as words – from a continuous incoming stream of sounds. This task becomes complicated when considering the variability in sounds produced by individuals with different accents, pitches, or intonations. In a new paper, researchers detail a computational model that explores how the auditory system tackles this complex task. (1)
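
A toy sketch of the problem the excerpt describes, not of the researchers’ model: the same word never arrives twice as the same signal, yet it must land in the same category. The word prototypes, noise level and nearest-prototype rule below are invented purely for illustration.

```python
# Toy illustration of mapping variable "sounds" onto stable word categories.
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical word categories, each defined by an idealized feature vector.
prototypes = {
    "yes": np.array([1.0, 0.2, 0.8]),
    "no":  np.array([0.1, 0.9, 0.3]),
}

def speak(word, variability=0.15):
    """Produce a noisy realization of a word (a different 'accent' each time)."""
    return prototypes[word] + rng.normal(0.0, variability, size=3)

def categorize(sound):
    """Assign the incoming sound to the nearest prototype category."""
    return min(prototypes, key=lambda w: np.linalg.norm(sound - prototypes[w]))

# A continuous stream: the same words, each realization slightly different.
stream = ["yes", "no", "no", "yes"]
for word in stream:
    heard = speak(word)
    print(f"intended={word:<4} recognized={categorize(heard)}")
```

Even this crude rule exposes the tension the study addresses: the categories stay fixed, the sounds never do.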

In the beginning there was silence.

And then… noise.

Noise cancelling everything out.

With time, we managed to get used to it.

In time, we managed to recognize words.

And we thought we discovered Logos.

Meaning out of nothingness.

Order out of chaos.

But there can be no such thing.

For chaos is chaos.

And noise is noise.

Listen carefully.

Beyond the words.

And you will see the void.

Don’t be afraid of that void.

For it is you.

Unique.

Alone.

Complete.

Staying silent.

Listening to everything…

Before it was ever spoken…

Μπορείς να με καταλάβεις;

Photo by Daniel Maforte from Pexels

As two people speak, their brains begin to work simultaneously, synchronizing and establishing a unique bond. This is what in neuroscience is called brain synchronization.

New research from the Basque Center on Cognition, Brain and Language (BCBL) in San Sebastián, published in the journal Cortex, confirms that this phenomenon depends on the language we use to communicate.

“When a conversation takes place in one’s native language, both interlocutors pay attention to it in a more global way, focusing on the sentences and the global content of the message,” stresses Jon Andoni Duñabeitia, co-author of the study. When the conversation takes place in a foreign language, however, attentional resources focus primarily on other linguistic levels that are more demanding for non-native speakers, such as sounds and words.

“In the latter communicative context we need to reconfigure our attention strategies so that we can understand each other, and this may be directly related to the difference in the areas synchronised during the conversation,” suggests Duñabeitia. (1)

Language.

Portrayed as a facilitator of communication.

But it is actually a barrier we must overcome.

Only when this barrier is lifted can we actually speak to each other.

Because communication and understanding never stem from Logos.

But Logos is the result of the understanding we already have.

Speak to me.

And I will understand you…

Only if I already do…

Note: “Μπορείς να με καταλάβεις;” = “Can you understand me?” in Greek…
