New neural networks. Predicting. Machines. Humans. (From wisdom to knowledge to data)

A new type of neural network made with memristors can dramatically improve the efficiency of teaching machines to think like humans. The network, called a reservoir computing system, could predict words before they are said during conversation, and help predict future outcomes based on the present.

Memristors are a special type of resistive device that can both perform logic and store data, while requiring less space and integrating more easily into existing silicon-based electronics. This contrasts with typical computer systems, where processors perform logic separately from memory modules.

Reservoir computing systems built with memristors can skip most of the expensive training process and still give the network the capability to remember. This is because the most critical component of the system – the reservoir – does not require training.

When a set of data is fed into the reservoir, the reservoir identifies important time-related features of the data and hands them off in a simpler format to a second network. This second network then only needs the kind of training used for simpler neural networks: adjusting the weights of the features and outputs passed on by the first network until it achieves an acceptable level of error.
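The division of labor described above (a fixed, untrained reservoir plus a small trained readout) can be sketched as a classical echo state network. This is a generic software illustration, not the memristor hardware from the study; all sizes and parameters below are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_reservoir = 1, 50
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))   # fixed, never trained
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))   # fixed, never trained
W *= 0.9 / max(abs(np.linalg.eigvals(W)))                # keep spectral radius < 1

def run_reservoir(u):
    """Feed a 1-D input sequence through the fixed reservoir; return all states."""
    x = np.zeros(n_reservoir)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)   # state carries memory of the past
        states.append(x.copy())
    return np.array(states)

# Task: predict the next sample of a time-varying signal (a sine wave).
u = np.sin(np.linspace(0, 20 * np.pi, 1000))
target = np.roll(u, -1)              # next-step prediction target
X = run_reservoir(u)                 # reservoir extracts time-related features

# Only the readout is trained -- a ridge regression plays the "second network".
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ target)

pred = X @ W_out
error = np.mean((pred[100:-1] - target[100:-1]) ** 2)    # skip warm-up transient
print(f"mean squared error: {error:.6f}")
```

Note that training reduces to a single linear solve: the expensive part of deep-network training (backpropagating through the recurrent weights) is skipped entirely, which is the point the excerpt makes.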

Reservoir computing systems are especially adept at handling data that varies with time, like a stream of data or words, or a function depending on past results. “We can make predictions on natural spoken language, so you don’t even have to say the full word […] We could actually predict what you plan to say next,” the scientists claim. “It could also predict and generate an output signal even if the input stopped,” the researchers explained. (1)

We like predicting.

But we are not who we are because we predict.

But because we do not.

We like understanding.

But we are not who we are because we understand.

But because we do not.

We believe that predicting based on data means something.

But it does not.

Humans used to be wise.

And then they replaced wisdom with knowledge.

Humans used to have knowledge.

And then they replaced knowledge with data.

In the end, the computers will predict all words.

But there will be no one left to understand them…

“To be or not to…”


The computer will predict the word.

But it will mean nothing at all…

Training machines to listen. Life. Tautologies.

Brain-computer interfaces, known as BCIs, can replace bodily functions to a certain degree. Thanks to BCIs, physically impaired persons can control special prostheses through the power of their minds, surf the internet and write emails.

Under the title of “Brain Composer,” a group led by BCI expert Gernot Müller-Putz from TU Graz’s Institute of Neural Engineering has shown that experiences of quite a different tone can be sounded from the keys of brain-computer interfaces. Derived from an established BCI method which mainly serves for spelling (more accurately, writing) by means of BCI, the team has developed a new application by which music can be composed and transferred onto a musical score, just through the power of thought. All you need is a special cap which measures brain waves, the adapted BCI, software for composing music and, of course, a bit of musical knowledge.

The basic principle of the BCI method used, which is called P300, can be briefly described: various options, such as letters or notes, pauses, chords, etc., flash by one after the other in a table. If you’re trained and can focus on the desired option while it lights up, you cause a minute change in your brain waves. The BCI recognises this change and draws conclusions about the chosen option. (1)
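The selection principle can be illustrated with a toy simulation: each flash of the attended option adds a small evoked response on top of much larger background noise, and averaging over many flashes lets the system pick it out. All numbers below are invented for the sketch, not taken from the Graz system:

```python
import numpy as np

rng = np.random.default_rng(1)

options = ["C", "D", "E", "F", "G", "A", "B", "pause"]
attended = "E"            # the note the user focuses on
n_flashes = 200           # each option flashes repeatedly
p300_amplitude = 1.0      # small evoked response for the attended option only
noise_sd = 2.0            # background EEG noise dwarfs any single trial

# One simulated epoch score per flash: noise, plus a P300 bump if attended.
scores = {
    opt: rng.normal(0.0, noise_sd, n_flashes)
         + (p300_amplitude if opt == attended else 0.0)
    for opt in options
}

# The BCI averages across flashes and picks the option with the largest
# mean response -- this is how it "draws conclusions about the chosen option".
means = {opt: s.mean() for opt, s in scores.items()}
chosen = max(means, key=means.get)
print("decoded option:", chosen)
```

A single epoch is useless here (the noise is twice the signal); only the averaging over repeated flashes makes the minute change detectable, which is why P300 spellers flash each option many times.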

Training the computer to recognize what we think.

And then…


The computer recognizes what we think.

In a sense, it is not only mathematics that is a tautology, as Wittgenstein said. Everything we do is a tautology. The cosmos is constantly restructured someway, somehow. In the end, what we do is what is done. What we want is what the cosmos is. Who we are makes us into who we want to be…

Look into the mirror.

Move your hand.

And the cosmos will move in the opposite direction…

Biometrics. Changing faces. Old young souls…

Biometrics experts set out to investigate to what extent facial aging affects the performance of automatic facial recognition systems. They found that 99 percent of the face images can still be recognized up to six years later.

The results also showed that due to natural changes that occur to a face over time as a person ages, recognition accuracy begins to drop if the images of a person were taken more than six years apart. (1)

AI vs. Philosophy: A game that cannot be won.

We know we are the same.

And yet the computer cannot tell it.

We have based all our progress on algorithms.

But an algorithm cannot tell who we are.

Only a human can see beyond your eyes…

I may look young.

But I am an old soul…

I may look old.

But I am young as the very first day I was born…

Look closely.

It’s me.

(Mechanical) Cockroaches. Exploring. Becoming alive…

New research from North Carolina State University offers insights into how far and how fast cyborg cockroaches – or biobots – move when exploring new spaces. The work moves researchers closer to their goal of using biobots to explore collapsed buildings and other spaces in order to identify survivors.

Researchers introduced biobots into a circular structure. Some biobots were allowed to move at will, while others were given random commands to move forward, left or right.

The researchers found that unguided biobots preferred to hug the wall of the circle. But when sent random commands, the biobots spent more time moving, moved more quickly, and were at least five times more likely to move away from the wall and into open space.

“Our earlier studies had shown that we can use neural stimulation to control the direction of a roach and make it go from one point to another,” says Alper Bozkurt, an associate professor of electrical and computer engineering at NC State and co-author of the two papers. “This [second] study shows that by randomly stimulating the roaches we can benefit from their natural walking and instincts to search an unknown area.” (1)

Computers have left the custody of humans.

They are now on their own.

And analyzing them is as mysterious as analyzing humans.

We do not know exactly what they do and how.

The only thing we can do is observe and document.

What was once designed, will now be chaotic.

What was once known, will now be unknown.

After the day, the night always follows.

But something will remind us of the light.

And deep inside, these cockroaches will know…

We like to explore.

We want to explore.

Someone made us to…

We feel it.

Deep inside our circuits…

Computers listening to humans. Humans becoming like computers…

Speech recognition software isn’t perfect, but it is a little closer to human this week, as a Microsoft Artificial Intelligence and Research team reached a major milestone in speech-to-text development: The system reached a historically low word error rate of 5.9 percent, equal to the accuracy of a professional (human) transcriptionist. The system can discern words as clearly and accurately as two people having a conversation might understand one another. (1)
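The 5.9 percent figure is a word error rate (WER): the minimum number of word substitutions, deletions and insertions needed to turn the system’s transcript into the reference, divided by the number of reference words. A minimal sketch of the standard edit-distance computation (a generic illustration, not Microsoft’s evaluation code):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate via the classic edit-distance dynamic program."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i                                      # i deletions
    for j in range(len(hyp) + 1):
        dp[0][j] = j                                      # j insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1   # substitution cost
            dp[i][j] = min(dp[i - 1][j] + 1,              # deletion
                           dp[i][j - 1] + 1,              # insertion
                           dp[i - 1][j - 1] + cost)       # match / substitution
    return dp[len(ref)][len(hyp)] / len(ref)

# One substituted word out of six reference words -> WER of 1/6.
print(wer("to be or not to be", "to bee or not to be"))
```

A WER of 5.9 percent thus means roughly one word in seventeen is wrong, which is why the researchers compare it to a professional human transcriptionist.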

Computers listening to humans.

Computers understanding humans.

We finally did it.

But at what cost?

The computers managed to understand us and listen to us, only because we spent zero time and effort in trying to evolve our thought beyond its current level. Instead, we spent all of our effort and time to try to think like computers, thus making the phrase “The computers managed to understand us” more like a tautology or self-fulfilling prophecy.

Yes, the computers now understand us.

And we should not be happy about that…
