Artificial neural networks (ANNs) have been around for a long time. They have become popular recently because they run very efficiently on GPUs. GPUs were not invented to run ANNs; they were invented to provide realistic graphics for videogames. ANNs run well on GPUs because the linear algebra behind 3-D graphics is essentially the same as the linear algebra behind ANNs: both boil down to large batches of matrix multiplications.
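A minimal sketch of that shared math, using made-up numbers: the same matrix-vector product that rotates a point in a graphics pipeline also computes a dense layer's pre-activations in a neural network.

```python
import math

def matvec(M, v):
    # Matrix-vector product: the core operation shared by
    # graphics transforms and neural-network layers.
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

# Graphics: rotate a 2-D point 90 degrees counterclockwise.
theta = math.pi / 2
R = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]
point = [1.0, 0.0]
rotated = matvec(R, point)  # approximately [0.0, 1.0]

# Neural network: a dense layer's forward pass is the same matvec,
# followed by an elementwise nonlinearity (weights here are arbitrary).
W = [[0.5, -0.2],
     [0.1,  0.9]]
x = [1.0, 2.0]
activations = [max(0.0, z) for z in matvec(W, x)]  # ReLU(Wx)
```

A GPU does nothing conceptually different; it just performs thousands of these multiply-accumulate operations in parallel, which is exactly what both workloads need.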
Consider an alternate history where the math behind videogame graphics differs significantly from the math behind ANNs. In the alternate history, ANNs run just fine on CPUs but won't run at all on GPUs.
We know from historical hindsight that inventing GPUs just to run ANNs would be worthwhile. But that would only be obvious in the alternate history if ANNs produced useful results running on modern CPUs. Do they?
This article suggests GPUs run faster by a factor of about 10×. I think that gap is small enough that we would still be using ANNs even if we didn't have GPUs, and that usage would incentivize the invention of custom hardware for ANNs, not unlike what is happening in the world right now. There are even some circumstances where training on a CPU is faster.
Would we invent custom hardware just to run ANNs? I think the answer is yes, but the hardware would be several years behind where it is right now.