I followed a Scientific American article in 1992 as a high schooler and got digit recognition and basic arithmetic working on a 386. What the pop-sci press said at the time was that we were limited by memory bandwidth (cache size), training data, and to some extent pointer-chasing (and other inefficiencies) in graph algos.