See your face… Move your hand… Break the mirror…

Given the limited capacity of our attention, we process only a small fraction of the sights, sounds, and sensations that reach our senses at any given moment. Research suggests that certain stimuli – specifically, your own face – can influence how you respond without you even being aware of it.

In an experiment, participants looked at a cross symbol displayed in the center of a computer screen while a picture of a face appeared on each side of it. The face on one side was the participant’s own, while the face on the other side belonged to a stranger. The participants were told to focus their attention on the cross and ignore anything else that might appear.

The findings showed that participants automatically attended to their own faces when they appeared on screen, despite the fact that they were instructed not to do so. Importantly, the findings also showed that participants automatically attended to their own faces even when they weren’t aware of them. (1)

We know our self.

We sense our self.

Some only see their self.

Everywhere.

Even when we are told not to.

Yet, these people will not see what they look for.

For you need to look to others in order to see you.

Look closer.

They are not obstructing you from seeing better.

Instead, they provide the only window to yourself.

These are not ‘other’ people.

They are you.

You are them.

Mirrors of existence, mirroring what cannot exist.

Look at the mirror.

Move your hand.

No, the mirror does not reflect you.

You ARE the mirror…

Seeing better. Through not seeing. Descartes and the wisdom of Silenus…

Visual acuity is normally thought to be dictated by the shape and condition of the eye, but these new findings suggest that it may also be influenced by perceptual processes in the brain. “We discovered that visual acuity – the ability to see fine detail – can be enhanced by an illusion known as the ‘expanding motion aftereffect’ – while under its spell, viewers can read letters that are too small for them to read normally”, says psychological scientist Martin Lages of the University of Glasgow. (1)

We train our eyes to see things which do not exist.

In order to see things which do.

This reminds me of Descartes in his Principiorum Philosophicorum, Part 3, Article 47: whatever we have set as a starting point, whatever assumptions or axioms we use to begin examining something, we will always end up at the same conclusion one way or another.

Leibniz disagreed with that idea in his letter to Philipp in January 1680. His objections were mainly related to the nature of God and the discussion of good and evil. I believe this discussion must be held on a more abstract philosophical level: do we need to start from a specific starting point in order to see something? Or will we see it someday, somehow, no matter how wrong the path we have chosen is?

Seeing small letters. Because your eyes do not work as they did.

Not seeing anything. Because your eyes work as they used to work.

Will you finally see the letters?

Will you finally go blind?

Does it matter?

Sitting down in the forest. Touching the earth below. What you feel is you. Going back and forth from that cursed meeting with Silenus. The purpose of this world lies not within this world. There is nothing to read in these letters…

Filling in the gaps (of the blind spot). Believing (what is not there). Lies. Truth.

To make sense of the world, humans and animals need to combine information from multiple sources. This is usually done according to how reliable each piece of information is. For example, to know when to cross the street, we usually rely more on what we see than what we hear – but this can change on a foggy day.
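
This weighting by reliability is commonly modelled as inverse-variance weighting: each cue contributes in proportion to how little it varies. The sketch below only illustrates that general idea; the function and the numbers are made up for this example and are not taken from the study.

```python
# Reliability-weighted cue combination: each cue is weighted by the inverse
# of its variance, so the less noisy (more reliable) cue dominates.
# All numbers here are illustrative only.

def combine(estimates, variances):
    """Fuse noisy estimates using inverse-variance (reliability) weights."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # the fused estimate is more reliable than either cue alone
    return fused, fused_variance

# Example: judging how many seconds until a car reaches the crossing,
# from a visual cue and an auditory cue.
visual, auditory = 2.0, 3.0
clear_day = combine([visual, auditory], variances=[0.1, 1.0])  # vision reliable
foggy_day = combine([visual, auditory], variances=[2.0, 1.0])  # vision degraded by fog
print(clear_day)  # dominated by vision  -> about 2.1 s
print(foggy_day)  # dominated by hearing -> about 2.7 s
```

Under such a scheme, the finding described next is the strange part: the filled-in percept, which carries no external information at all, behaves as if it had been given the highest weight.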

“In such situations with the blind spot, the brain ‘fills in’ the missing information from its surroundings, resulting in no apparent difference in what we see,” says senior author Professor Peter König, from the University of Osnabrück’s Institute of Cognitive Science. “While this fill-in is normally accurate enough, it is mostly unreliable because no actual information from the real world ever reaches the brain.”

The scientists wanted to find out whether we typically handle this filled-in information differently from real, direct sensory information, or whether we treat the two as equal.

To do this, König and his team asked study participants to choose between two striped visual images, both of which were displayed to them using shutter glasses. Each image was displayed either partially inside or completely outside the visual blind spot. Both were perceived as identical and ‘continuous’ due to the filling-in effect, and participants were asked to select the image they thought represented the real, continuous stimulus.

It seemed that people treat ‘inferred’ visual objects generated by the brain as more reliable than external images from the real world. (1)

What is real is just a representation in our mind.

And the more “pure” the representation, the more “real” it feels.

We see what we want. And the more interference we get from our senses, the more fake the world seems to be. That should not make us doubt the validity of our mind, but the validity of our senses instead. If their input does not imply anything regarding the validity of our perception (or, what is more, if their input makes our perception less related to “reality”), then perhaps our senses are simply not related to the… validity of our perception.

This is the obvious and simplest conclusion of them all. And we should not be afraid of any conclusion, no matter how much it opposes our beliefs.

Look out for the fake.

It does not carry any notion of ‘reality’ and, thus, is more pure. (and thus, more real)

Look out for the lies.

That is where veracity is hidden…

Seeing via an app… Not seeing…

A mobile phone application from Microsoft is designed to help people with color blindness see the world a little bit more clearly. Color Binoculars, created by two Microsoft software engineers (one of whom is colorblind), applies a filter to incoming images, changing the colors on the screen to ones that are easier to distinguish.

Looking at red and green objects through the application (which uses your phone’s camera) will make the reds brighter and more pink and the greens darker, making the differences between the two more obvious. It won’t correct the colors: an individual with color blindness will not suddenly be able to see red or green by using the application, but they will be able to distinguish between red and green objects. They might more easily be able to identify that a red and green sweater has a striped pattern, for example, even though they still wouldn’t see red or green as most of us do. (1)
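
The article does not say how Color Binoculars computes its filter, but the effect it describes (reds pushed brighter and toward pink, greens pushed darker) can be approximated with a simple per-pixel adjustment. The sketch below is purely illustrative and is not Microsoft’s implementation; it assumes an RGB image stored as a NumPy array with values in [0, 1].

```python
import numpy as np

def enhance_red_green(image):
    """Exaggerate the red/green difference in an RGB image (floats in [0, 1]).

    Illustrative only: pushes reddish pixels brighter and toward pink and
    darkens greenish pixels, loosely mimicking the effect described above.
    """
    img = image.astype(float).copy()
    r, g, b = img[..., 0], img[..., 1], img[..., 2]

    redness = np.clip(r - g, 0.0, 1.0)    # how much redder than green each pixel is
    greenness = np.clip(g - r, 0.0, 1.0)  # how much greener than red

    img[..., 0] = np.clip(r + 0.4 * redness, 0.0, 1.0)    # brighten reds
    img[..., 2] = np.clip(b + 0.4 * redness, 0.0, 1.0)    # add blue, shifting reds toward pink
    img[..., 1] = np.clip(g - 0.4 * greenness, 0.0, 1.0)  # darken greens
    return img

# A toy 1x2 "image": one red pixel, one green pixel.
pixels = np.array([[[0.8, 0.1, 0.1], [0.1, 0.8, 0.1]]])
print(enhance_red_green(pixels))
```

The particular coefficients do not matter; the principle does: the filter does not restore the missing color dimension, it only exaggerates a difference the viewer can already detect.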

People seeing the world through an application.

People seeing the world through telescopes.

People seeing the world through binoculars.

People seeing the world through glasses.

People seeing the world through their eyes.

People not seeing anything…

Asymmetries. Sensing. Motionless.

“Did something move over there?” Everyone has experienced this situation. One looks towards the source of a sound but, try as one might, cannot make out any object. Only its sudden movement, however minimal, allows it to be perceived immediately.

Scientists at the Ruhr-University Bochum have investigated this phenomenon and show for the first time how simultaneous counterchange of luminance at the borders between object and background triggers activity waves in the visual brain. These waves may constitute a sensitive signal for motion detection. In their study, the scientists presented small gray squares on a monitor screen. The squares then either turned bright or dark with identical luminance intensities, and the scientists recorded the subsequent brain activity. The surprising result was that the darkening squares were represented considerably earlier in the brain than the squares that brightened. “This shows that simultaneous changes in luminance occurring in the outer world were time-shifted in the brain,” says Sascha Rekauzke, first author of the study.

A small temporal offset of a few milliseconds between the processing of darks and lights was already known. Within the eyes, retinal ganglion cells that signal light “OFF” open their ion channels directly upon transmitter release. In contrast, light “ON” signals are conveyed indirectly, via further intracellular cascades. The RUB scientists now showed that the resulting time difference is further amplified within the brain, to the range of about ten milliseconds. As a consequence, simultaneous counterchange of luminance at neighboring locations leads to a spatiotemporal offset of activation in the brain. This offset triggers a motion signal in the form of a wave of activity spreading asymmetrically in one direction.

Asymmetry is also used for sound localization: acoustic waves from laterally displaced sources reach the two ears with a minimal temporal offset. From this interaural time difference, neuronal networks compute time delays, and our brain infers from them the presumed direction of the sound source. As Dirk Jancke says: “Our brain is a giant comparison machine based on self-generated asymmetries. Our study further substantiates this and shows that this is true even for elementary steps in perception”. (1)
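
For a rough sense of the numbers: for a distant source at angle θ from straight ahead, the extra path to the far ear is about d·sin(θ), where d is the distance between the ears. The sketch below works through that textbook model; the constants are illustrative assumptions, not values from the study.

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second in air (assumption for illustration)
EAR_DISTANCE = 0.21     # metres between the ears (assumption for illustration)

def itd_from_angle(angle_deg):
    """Interaural time difference (seconds) for a distant source at the given
    angle from straight ahead, using the path-difference model d*sin(theta)/c."""
    return EAR_DISTANCE * math.sin(math.radians(angle_deg)) / SPEED_OF_SOUND

def angle_from_itd(itd_seconds):
    """Invert the model: estimate the source direction from a measured ITD."""
    ratio = max(-1.0, min(1.0, itd_seconds * SPEED_OF_SOUND / EAR_DISTANCE))
    return math.degrees(math.asin(ratio))

# A source 30 degrees to the right arrives at the far ear only ~0.3 ms later,
# yet that tiny asymmetry is enough to recover the direction.
delay = itd_from_angle(30.0)
print(f"{delay * 1e6:.0f} microseconds")       # ~306
print(f"{angle_from_itd(delay):.1f} degrees")  # 30.0
```

The same logic as in the visual case above: a small, systematic temporal asymmetry is turned into a spatial judgement.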

So we detect motion due to asymmetries. Asymmetries in our sensory organs. Asymmetries in our brain. The whole world is in motion. And we sense it.

But could the world be stable? Could the universe and the cosmos be completely symmetrical, thus motionless? Could the asymmetries in our sensors be the CAUSE of the illusion of motion?

Stop sensing.

And you will see everything.

In one place.

Motionless.

Whole.
