Is there a relationship between honey and computer components? The answer that springs to mind is no. Yet this seemingly strange idea has become reality at Washington State University, where engineers demonstrated a way to use honey to build components for neuromorphic computers, systems designed to mimic the neurons and synapses of the human brain. In a study published on April 5 in the Journal of Physics D: Applied Physics, the researchers showed that honey can be used to make a memristor, a component similar to a transistor that can not only process data but also store it in memory.
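To make the "process and store" claim concrete, here is a minimal sketch of what a memristor does conceptually: a resistor whose conductance depends on the history of voltages applied to it. The `ToyMemristor` class and all its parameter values are invented for illustration; they do not model the honey device's actual physics.

```python
class ToyMemristor:
    """Toy memristor: a resistor whose conductance (its 'memory')
    depends on the history of applied voltages. Illustrative only;
    not the measured behavior of the honey device."""

    def __init__(self, g_min=1e-6, g_max=1e-3):
        self.g_min, self.g_max = g_min, g_max
        self.g = g_min  # conductance in siemens: the stored state

    def apply(self, voltage):
        """One voltage pulse: returns the current (processing) and
        nudges the conductance (storage), within physical bounds."""
        current = self.g * voltage      # Ohm's law: I = G * V
        self.g += 1e-4 * voltage        # history-dependent drift (assumed rate)
        self.g = min(self.g_max, max(self.g_min, self.g))
        return current

m = ToyMemristor()
for _ in range(5):
    m.apply(1.0)       # repeated positive pulses raise the conductance
print(m.g > m.g_min)   # True: the device "remembers" past pulses
```

The key point the sketch captures is that computation (the current response) and storage (the conductance state) happen in the same element, unlike separate CPU and memory.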
Feng Zhao, an associate professor in the College of Engineering and Computer Science at Washington State University and co-author of the study, stated in a report published on the university's website alongside the study: "This is a very small device with a simple structure, but it has very similar functions to human neurons. This means that if we can integrate millions or billions of honey memristors together, they could be turned into a neuromorphic system that works much like the human brain."
The honey memristor was developed by Brandon Sueoka, the study's lead researcher, by processing honey into a solid form and sandwiching it between two metal electrodes, giving it a structure similar to a human synapse. The researchers then tested its ability to emulate synaptic behavior, finding that it switched on and off at fast speeds of 100 and 500 nanoseconds, respectively. The memristor also emulated the synaptic functions known as spike-timing-dependent plasticity and spike-rate-dependent plasticity, which underlie learning and the retention of new information in neurons.
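Spike-timing-dependent plasticity can be stated precisely: a connection strengthens when the presynaptic neuron fires just before the postsynaptic one, and weakens in the reverse order. The sketch below uses the textbook exponential pair-based rule, not the device's measured curve; `a_plus`, `a_minus`, and `tau` are assumed values.

```python
import math

def stdp_dw(pre_spike_t, post_spike_t, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change from one pre/post spike pair (times in ms).
    Pre before post -> potentiation; post before pre -> depression.
    Parameter values are illustrative assumptions."""
    dt = post_spike_t - pre_spike_t
    if dt > 0:
        return a_plus * math.exp(-dt / tau)    # causal pair: strengthen
    elif dt < 0:
        return -a_minus * math.exp(dt / tau)   # anti-causal pair: weaken
    return 0.0

print(stdp_dw(10.0, 15.0) > 0)   # True: pre leads post, weight grows
print(stdp_dw(15.0, 10.0) < 0)   # True: post leads pre, weight shrinks
```

The closer the two spikes are in time, the larger the change, which is how timing alone can encode learning in a network of such devices.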
The Washington State University engineers created the honey memristors at a small scale, roughly the width of a human hair. The research team, led by Zhao, plans to develop them at the nanoscale, about 1/1000 the width of a human hair, and to assemble millions or even billions of them into a complete neuromorphic computing system. Today's conventional computers rely on what is called the von Neumann architecture, named after its creator, which comprises input (usually from a keyboard and mouse), output (such as a screen), a central processing unit, and separate memory storage.
Zhao explains that moving data through all of these stages, from input to processing to memory to output, consumes far more energy than the human brain does. The Fugaku supercomputer, for example, draws more than 28 megawatts (28 million watts) to operate, while the brain runs on only about 10 to 20 watts. The human brain contains more than 100 billion neurons linked by over 1,000 trillion synaptic connections, and each neuron can both process and store data, making the brain far more efficient than a conventional computer. Developers of neuromorphic computing systems aim to mimic this structure.
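The scale of the gap in the figures above is worth spelling out with a quick back-of-the-envelope calculation, using the upper end of the brain's 10 to 20 watt estimate:

```python
# Power figures quoted above: Fugaku vs. the human brain.
fugaku_w = 28e6   # 28 megawatts = 28 million watts
brain_w = 20.0    # upper end of the 10-20 W estimate
ratio = fugaku_w / brain_w
print(f"{ratio:,.0f}x")  # 1,400,000x
```

By this rough measure, the brain operates on over a million times less power, which is the efficiency target neuromorphic designs are chasing.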