Yes, we went to the moon on this, one of NASA's IBM computers.
My point is that the more we think we're doing something new in computing, the more likely it is that we've already done something similar, and already worked out the bugs, long before.
A great article I read maybe 20 years ago was right on the mark, and still is 20 years later. The author complained that everything in computing has to be invented three times. Take multitasking operating systems. Mainframes could multitask in the '60s. Then minicomputers were invented, which could not multitask, but everyone hated the mainframes, so they moved to minicomputers anyway. Then people complained that they couldn't multitask, so multitasking operating systems like UNIX had to be written for them. Then PCs were invented, and everyone hated mainframes and minicomputers and wanted to work on PCs. PCs didn't multitask either, and at first that was OK. Then people started complaining, and companies had to write multitasking PC operating systems. Each time it was the same thing all over again, but people treated it as some great new discovery.
You can look at almost any area of computing and see the same thing. Granted, I love the extra speed and capacity of modern machines, but we just keep reinventing the wheel and slapping new buzzwords on it.
It does crack me up what people think is new. I was emailing in the '80s. I was "using" part of the ARPANET in the early '90s when it opened up to a more public purpose. It was pretty clunky: archaic, command-driven, and there wasn't much out there. The ARPANET itself was quite functional, though, once you were inside the government-sponsored world.
It also makes me laugh what people thought was multitasking. "No, your processor is really just concentrating on another thread." At the same time, it is pretty amazing how much processing capability now fits in 3 cubic inches.
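For anyone who never looked behind the curtain, here is a rough Python sketch of that illusion (my own toy example, nothing historical; the worker function and LOG list are just names I made up): two threads take turns on the CPU, and the interleaved output is what passed for "multitasking" on a single processor.

    import threading
    import time

    LOG = []  # shared list; both threads append to it

    def worker(name, steps=5):
        # Each loop iteration is one "slice" of this thread's work.
        for i in range(steps):
            LOG.append(f"{name} step {i}")
            time.sleep(0.001)  # give up the CPU so the other thread gets a turn

    threads = [threading.Thread(target=worker, args=(n,)) for n in ("A", "B")]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # The interleaved A/B lines are the whole trick: one core, rapidly
    # switching its attention between two threads.
    print("\n".join(LOG))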