Listen.

One drone, four microphones and a loudspeaker: nothing more is needed to determine the position of walls and other flat surfaces within a room. This has been mathematically proved by Prof. Gregor Kemper of the Technical University of Munich and Prof. Mireille Boutin of Purdue University in Indiana, USA. (1)
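The method rests on first-order echoes: a loudspeaker emits a pulse, each microphone records the delay of the reflection off a wall, and that delay converts to a distance. A minimal sketch of the conversion, assuming a co-located speaker and microphone (the function name and numbers are illustrative, not from the paper):

```python
# Estimate the distance to a reflecting wall from a first-order echo.
# With the loudspeaker and microphone co-located, the sound travels to
# the wall and back, so: distance = speed_of_sound * delay / 2.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def wall_distance(echo_delay_s: float) -> float:
    """Distance (m) to a wall, given the round-trip echo delay (s)."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# A 20 ms round-trip delay puts the wall about 3.43 m away.
print(wall_distance(0.020))  # 3.43
```

With microphones that are not co-located with the speaker, each echo delay instead constrains the wall to an ellipsoid; combining the delays from four microphones pins down the wall plane. The contribution of Kemper and Boutin is the mathematical proof that this reconstruction is, in almost all configurations, unique.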

The only way to see is to speak.

The only way to hear is to see.

The only way to taste is to cook.

Don’t you see?

There is nothing to see…

There is everything to create.

Your senses do not connect you with the cosmos.

They connect the cosmos with you!

Back in time…

Photo by Spiros Kakos from Pexels

Time, as far as we know, moves only in one direction. But in 2018, researchers found events in some gamma-ray burst pulses that seemed to repeat themselves as though they were going backwards in time.

Recent research suggests a potential answer for what might be causing this time reversibility effect. If waves within the relativistic jets that produce gamma-ray bursts travel faster than light – at ‘superluminal’ speeds – one of the effects could be time reversibility. (1)

Going back in time.

To speak to our self.

And to warn Him.

That His kids will go astray.

That He should not create the cosmos.

You will never listen to you.

Because you know.

What is here now already was.

What will be has already been.

The past affects the future.

Only because the future had already affected the past.

The only thing that lies between both is Now.

The moment of birth.

The moment of death.

The moment of silence.

Standing still.

Can you hear yourself crying yesterday?

Can you feel yourself smiling tomorrow?

The cosmos should not exist.

And yet it does.

Only because you didn’t listen.

If only you never spoke…

White noise. Dark fear.

Photo by Spiros Kakos from Pexels

White noise is not the same as other noise – and even a quiet environment does not have the same effect as white noise. With a background of continuous white noise, hearing pure sounds becomes even more precise, as researchers have shown. Their findings could be applied to the further development of cochlear implants. (1)

Being cast alone in a dark forest.

How can we hear anything?

If not because we already hear everything?

How can we distinguish any sound?

If not for the constant sound you are in?

Can you feel wet if you haven’t ever dived into the ocean?

Listen to your heart.

You wouldn’t be full of fear.

If you weren’t already afraid…

Can’t you see?

That there is nothing you haven’t already seen?

Listen. So that you touch…

Photo by Brett Sayles from Pexels

Our eyes, ears and skin are responsible for different senses. Moreover, our brain assigns these senses to different regions: the visual cortex, auditory cortex and somatosensory cortex. However, it is clear that there are anatomical connections between these different cortices, such that brain activation to one sense can influence brain activation to another. A study by the laboratory of Associate Professor Shoji Komai at the Nara Institute of Science and Technology (NAIST), Japan, published in PLOS ONE, explains how auditory stimulation of the barrel cortex influences responses to tactile stimulation in mice and rats. Komai considered the barrel cortex a good model for seeing how sound can affect the perception of touch.

“We think our senses are distinct, but there are many studies that show multisensory responses, mainly through audio-visual interactions or audio-tactile interactions,” explains Komai.

His group found that mouse and rat neurons in the barrel cortex were unresponsive to light, but that a strong majority responded to sound. These neurons showed electrical responses to sound that could be categorized as regular spiking or fast spiking. Further, the barrel cortex appeared to treat tactile and auditory stimuli separately. “These responses indicate that tactile and auditory information is processed in parallel in the barrel cortex,” says Komai.

Additional analysis showed that the electrophysiological properties of the responses were different, with sound causing longer postsynaptic potentials with long latency, almost priming the animal to sense touch. This would be like the shuddering one does when hearing a loud boom. According to Komai, this reaction would be an evolutionary advantage for nocturnal animals such as rats and mice.

“In a nocturnal environment, sound may act as an alarm to detect prey or predators. The combination of auditory and tactile cues may yield an effective response. It will be interesting to learn how the same system is advantageous in humans,” he says. (1)

Listening. Tasting. Seeing. Touching. Smelling.

Distinct senses and yet so interconnected.

Interlinked.

But don’t be too dazzled by the light.

It usually hides the deepest shadows.

Senses do not let us sense the world as it is.

They help us break that world apart.

Every path in the dark forest of perception is connected with the others. And there is no way to tread one of them without crossing the others. The more you walk, the deeper you enter the forest. The more you walk, the more familiar everything seems. The deeper you enter the forest, the more difficult it becomes to see the forest.

Tracing back your steps.

To the time when you started walking.

Remember…

As you entered that first path…

Well before the path had a name…

Did you see any paths?

Listen…

Sign language. Spoken language limitations.

Linguists consider sign languages to be full-fledged and grammatically very sophisticated languages. But they also have unique insights to offer on how meaning works in language in general.

Sign languages can help reveal hidden aspects of the logical structure of spoken language, but they also highlight its limitations because speech lacks the rich iconic resources that sign language uses on top of its sophisticated grammar.

For instance, the logical structure of the English sentence Sarkozy told Obama that he would be elected is conveyed more transparently in sign language. The English sentence is ambiguous, Schlenker explains, as he can refer to Sarkozy or to Obama. Linguists have postulated that this is because the sentence contains some unpronounced – but cognitively real – logical variables like x and y.

If the sentence is understood as Sarkozy (x) told Obama (y) that he (x) would be elected, with the same variable x on Sarkozy and on he, the pronoun refers to Sarkozy; if instead he carries the variable y, it refers to Obama. Remarkably, in sign language the variables x and y can be visibly realized by positions in space, e.g. by signing Sarkozy on the left and Obama on the right. (1)
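The two readings can be made explicit by treating the pronoun as a logical variable resolved against an assignment, just like the linguists' x and y. A toy sketch of that idea (the names and structure are illustrative only, not a linguistic formalism):

```python
# Toy model of pronoun resolution via logical variables.
# "Sarkozy (x) told Obama (y) that he (?) would be elected" is ambiguous
# in speech; fixing which variable the pronoun carries fixes the reading.

referents = {"x": "Sarkozy", "y": "Obama"}

def reading(pronoun_var: str) -> str:
    """Spell out the reading for a given binding of 'he'."""
    return (f"{referents['x']} told {referents['y']} that "
            f"{referents[pronoun_var]} would be elected")

print(reading("x"))  # Sarkozy told Obama that Sarkozy would be elected
print(reading("y"))  # Sarkozy told Obama that Obama would be elected
```

In sign language this binding is visible rather than hidden: x and y are literal positions in signing space, and the pronoun points at one of them.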

See.

Now you know that it was about Sarkozy.

Listen.

Now you know what the other guy meant.

Feel.

Now you understand why the other one is even speaking to you.

Reach out with your senses.

It is all the same in the end.

Ideas may sometimes be conveyed better with images.

But blind people cannot see.

Ideas may sometimes be conveyed better with words.

But deaf people cannot hear.

In the end, you will need to reach out to understand what is being said.

But not to the person talking to you.

But to the person inside you.

Listen carefully.

Do you hear anything?

See.

Listen.

Feel.

Why are you even listening?
