It's also often just not true. A lot of the tools we use to solve problems today are miles better than what we had in the past. Imagine trying to process terabytes of data with a parallel compute engine before Spark - you'd have to write tons of custom scheduling and load-balancing code, whereas today I can spin up a Databricks instance on Azure and start writing a massively parallel stream-processing demo in an hour.
Now, a lot of companies do over-engineer, and cutting-edge tools aren't necessary in a lot of cases, but that's a separate problem.