Intel has created the world's largest neuromorphic computer, a device intended to imitate the operation of the human brain. The firm hopes it will be able to run more complicated AI models than is possible on conventional computers.
A regular computer uses its processor to carry out operations and stores data in separate memory, but a neuromorphic device uses artificial neurons to both store and compute, just as our brains do. This removes the need to shuttle data back and forth between components, a bottleneck that limits the speed of current computers.
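The idea of computing where the data lives can be illustrated with a toy sketch. The class below is a hypothetical, highly simplified spiking neuron, not Intel's actual design: its "memory" (synaptic weights and membrane potential) sits inside the same unit that performs the computation, so nothing has to travel to a separate processor.

```python
# Toy sketch (not Intel's design): a leaky integrate-and-fire neuron whose
# state is stored in the same unit that computes, mimicking how neuromorphic
# hardware avoids moving data between memory and processor.

class SpikingNeuron:
    def __init__(self, weights, threshold=1.0, leak=0.9):
        self.weights = weights      # stored locally, like synapses
        self.potential = 0.0        # stored locally, like a membrane voltage
        self.threshold = threshold
        self.leak = leak

    def step(self, spikes):
        """Integrate incoming spikes and fire if the threshold is crossed."""
        self.potential = self.leak * self.potential + sum(
            w * s for w, s in zip(self.weights, spikes)
        )
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return 1                # emit a spike
        return 0

neuron = SpikingNeuron(weights=[0.6, 0.5])
for t, spikes in enumerate([[1, 0], [1, 1], [0, 0]]):
    print(t, neuron.step(spikes))
```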
It also means "brain-like" computers can be more energy efficient. Intel claims its new Hala Point neuromorphic computer uses 1% of the energy a conventional machine takes when running optimization problems, which involve finding the best solution to a problem given certain restrictions. Mike Davies at Intel says that, despite its computing power, Hala Point is only about the size of a microwave oven. Intel suggests that a machine like Hala Point could support AI models that learn continuously, rather than needing to be retrained for each new task, as is the case with current models.
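For readers unfamiliar with the term, here is a hypothetical example of the kind of constrained optimization problem the article refers to: pick items to maximize total value without exceeding a weight limit. The items, weights and values below are invented for illustration, and the brute-force search says nothing about how Hala Point itself would tackle such a problem.

```python
# A tiny knapsack instance: maximize value subject to a weight limit.
from itertools import combinations

items = {"a": (3, 4), "b": (2, 3), "c": (4, 5)}   # name: (weight, value)
limit = 5

best = max(
    (subset for r in range(len(items) + 1)
     for subset in combinations(items, r)
     if sum(items[i][0] for i in subset) <= limit),
    key=lambda subset: sum(items[i][1] for i in subset),
)
print(best)  # the most valuable combination within the weight limit
```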
But James Knight at the University of Sussex, UK, says current models like ChatGPT are trained using graphics cards operating in parallel, meaning that many chips can be put to work on training the same model at once. Because neuromorphic computers work with a single input and can't be trained in parallel, it would likely take decades to train something like ChatGPT on such hardware, let alone make it learn continually, he says.
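The parallelism Knight describes can be sketched in a few lines. The toy model, loss and learning rate below are invented for illustration and bear no relation to how ChatGPT or neuromorphic chips are actually trained; the point is only that each simulated "chip" computes a gradient on its own slice of the data and the results are averaged.

```python
# Rough sketch of data-parallel training on a toy one-parameter model
# with a squared-error loss: each shard is processed "in parallel",
# then the gradients are averaged into a single update.
def gradient(w, batch):
    # d/dw of mean (w*x - y)^2 over the batch
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]
shards = [data[:2], data[2:]]            # one shard per simulated chip

w = 0.0
for _ in range(50):
    grads = [gradient(w, shard) for shard in shards]   # per-chip gradients
    w -= 0.05 * sum(grads) / len(grads)                # averaged update
print(round(w, 3))  # approaches 2.0, the true slope of the data
```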
While today's neuromorphic hardware still needs to be improved, says Davies, it could one day take pre-trained models and let them learn new tasks. "This is the kind of continual learning problem that we believe large-scale neuromorphic systems like Hala Point can solve in the future," he says.