Back in 1973 I made it to my first NCC (the AFIPS National Computer Conference - the big annual industry shindig in those days). At that time Moore's law was quite the buzz. Memory chips were still following it, but complex-function chips were starting to fall off from the straight line on the log graph.
At that time there were a few microprocessors out. But it was well before the stage where you could put a microprocessor on every device control card. Most such functions - including the "glue" around the microprocessors themselves - were constructed of small-scale integration chips. Support chips were starting to graduate from things like four independent gates, a couple of flip-flops, or a multiplexer per package. But chips were essentially all still being designed by silicon manufacturers. A few might have been done under contract with companies designing boxes. But most were based on the semiconductor companies' marketing departments' guesses at what would be wanted a couple of years in the future.
I realized that one explanation for the shortfall might be that, as the complex-function chips became larger, the engineering of more of the circuitry was moving from the system designers - including the garage and venture-financed startups - to the semiconductor manufacturers. This reduced the number of engineers on the job and weakened their connection to the needs of the final products. Further, it changed the incentives on the engineers, making them more conservative (since they needed to keep an established company in business rather than take risks to establish a new venture or product).
There was a panel with several of the silicon companies that discussed the problem. Come the Q and A session I brought up the above, and proposed a solution: That the silicon companies license their design tools to the system designers and build the chips THEY design. That way the complex-function engineering, along with its risks and costs, could be moved back to the ventures, while the silicon companies could concentrate their engineering on what they do well - improving the process. And I asked whether any of their companies would consider such an approach. (I thought of it as a "silicon breadboard", but I don't recall actually using the term in the question.)
At least three of the companies' representatives - Motorola, Intel, I forget who else - said that there was no way they would ever do such a thing. (The Motorola guy was quite emphatic about it.)
And the guy beside me gave me his card and suggested I interview with him. (He was from Signetics, which was already doing a mask-programmed gate array chip which the customer could customize. I DID interview with him - and to this day I kick myself for not taking a job there. It would have gotten me out to Silicon Valley 12 years earlier, two years before both the release of the Altair 8800 and the founding of the Homebrew Computer Club. B-b )
A few months later, IBM announced they'd make their design tools available to customers and would fabricate chips under contract. Over the next couple of years several other manufacturers followed suit. One of them transitioned from custom silicon design to tool licensing as a business, and several others started up just to do tools. For a while it was known as the "silicon foundry" system. Now it's ASIC (application-specific integrated circuit) design; there are standards for the major design languages, and a whole ecosystem of manufacturers of chips and of computer-aided design tools for all stages of the process.
And ASIC design is what I've done for a living since I went back over to the hard side of the force in the early 1990s.