Knowledge. Destruction.

Photo by Spiros Kakos from Pexels

Powerful DNA-sequencing techniques have spurred an avalanche of discoveries about ancient humans, but each one comes at a price: the partial destruction of the specimens from which the DNA was taken. Anthropologists Keolu Fox and John Hawks call for researchers to think harder about safeguarding. “Unless some ground rules are established, future scientists, armed with better, potentially less-invasive methods for extracting DNA from ancient samples, could well look back on this era as a time of heedless destruction, fuelled by the relentless pressure to publish,” say Fox and Hawks. (1)

We should not be alarmed or surprised though.

Knowledge IS destruction.

Every time we understand something, we dissolve it into pieces.

Every time we get to know something, we forget something else.

The cosmos was once at our fingertips.

Until we tried to touch it.

And it became real…

Dementia. Dying. Being born!

Photo by Ryanniel Masucol from Pexels

It happens unexpectedly: a person long thought lost to the ravages of dementia, unable to recall the events of their lives or even recognize those closest to them, will suddenly wake up and exhibit surprisingly normal behavior, only to pass away shortly thereafter. This phenomenon, which experts refer to as terminal or paradoxical lucidity, has been reported since antiquity, yet there have been very few scientific studies of it. That may be about to change.

In an article published in the August issue of Alzheimer’s & Dementia, an interdisciplinary workgroup convened by the National Institutes of Health’s (NIH) National Institute on Aging and led by Michigan Medicine’s George A. Mashour, M.D., Ph.D., outlines what is known and unknown about paradoxical lucidity, considers its potential mechanisms, and details how a thorough scientific analysis could help shed light on the pathophysiology of dementia. (1)

Plato said it a long time ago.

What you see are just reflections.

Of a world beyond our own.

There is no way to prove that.

Unless you stop seeing outside.

And see inside yourself.

You are dying now. And you see things so clearly.

And yet, all of a sudden, you start remembering.

Things you knew and had forgotten.

But nothing which is worth knowing can be forgotten.

Nothing which is worth knowing can be learnt.

Look! He speaks so clearly now…

No, this is not a sign of hope.

But the last signs of decay fading away…

Predict. What you can never understand…

Photo by Jackson Jorvan from Pexels

Artificial intelligence can predict premature death, according to a study.

Computers which are capable of teaching themselves to predict premature death could greatly improve preventative healthcare in the future, suggests a new study by experts at the University of Nottingham.

The team of healthcare data scientists and doctors has developed and tested a system of computer-based ‘machine learning’ algorithms to predict the risk of early death due to chronic disease in a large middle-aged population. (1)

Computers predicting what they can never understand.

Is there any other way?

We can only predict what we do not know.

Look at the flower.

Smell the wind.

Feel the rain falling…

You will never predict them.

And yet, you smile.

Only because you know all there is to know about them…

The birth of consciousness…

Photo by ritesh arya from Pexels

Think about consciousness for long enough, and you’ll drive yourself to distraction. To psychologist Julian Jaynes, the question of consciousness was big enough to last a lifetime. His answer? Consciousness is much smaller, much rarer, and much younger than we tend to think. Forget about wondering if a dog, cat, or earthworm has consciousness — Jaynes hypothesized that even the ancient Greeks failed to achieve it. “Now, hold on,” you might be saying. “Ancient Greeks wrote some of the most enduring literature of all time — ‘The Iliad’ and ‘The Odyssey’ were written by non-conscious creatures?” To which Jaynes would reply, “Of course not. A conscious mind wrote The Odyssey.” An analysis of these two texts inspired the foundation of Jaynes’ metaphysical beliefs — the bicameral mind.

The bicameral mind (which may sound familiar to “Westworld” fans) is essentially a consciousness split in half. One half takes care of execution: When it receives the message that the body is hungry, it seeks and consumes food; when it gets the message that it has been wronged and insulted, it seeks vengeance. The other half is the one that sends those messages. Back before we had developed any sort of introspection, those messages would have hit the brain like the word of the gods. After all, where else could it have come from? The breakdown of the bicameral mind happens when that executive half starts really asking that question and finding the answer is “nowhere.” In other words, Jaynes says, consciousness didn’t arise until we stopped attributing our inner monologue to the gods. (1)

Trying to answer the big questions.

Trying to understand.

This is what started everything.

In the beginning we just accepted the cosmos.

Being an integral and active part of it.

But at one point we decided to leave home.

And deny our Father.

We wanted to “know”.

And the only way to do that was by defining everything else as “different” from us; thus, open to analysis and examination. We used to be part of the cosmos. Defining the universe while the universe defined us. Now we still see the stars. But as something distant. Longing to go there, even though we used to be walking on the Sun. Afraid that we will die if we touch them, while we used to play with them as kids.

Lying down on a forest clearing.

Listening to nothing.

Thinking of nothing.

Alone in the cosmos.

Who is talking?

Wanting to “see”. Missing the obvious. (Open your heart to the darkness)

Photo by Engin Akyurt from Pexels

Inspired by the human eye, researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed an adaptive metalens that is essentially a flat, electronically controlled artificial eye. The adaptive metalens simultaneously controls for three of the major contributors to blurry images: focus, astigmatism, and image shift.

The research was published in Science Advances.

“This research combines breakthroughs in artificial muscle technology with metalens technology to create a tunable metalens that can change its focus in real time, just like the human eye,” said Alan She, a graduate student at SEAS and first author of the paper. “We go one step further to build the capability of dynamically correcting for aberrations such as astigmatism and image shift, which the human eye cannot naturally do”. (1)

We arrogantly celebrate the creation of an “eye”. But we have forgotten that it is not an eye that we need in order to see. We used to know that there was no river at all, until the moment we stepped inside it. And ever since, we have been dragged away from home by that nonexistent cold river’s current…

We used to see the stars.

Well before the invention of telescopes.

We used to examine our inner self.

Well before the invention of microscopes.

We used to know the universe.

Well before philosophy even had a name…
