Yes -- transistors do degrade over time, and that means CPUs do too. But keep a couple of things in mind. CPU manufacturers rate clock speeds conservatively, leaving enough headroom that chips keep working within spec for a long time. Also, those older computers don't run as hot as newer ones, because they do far less processing than modern machines operating at clock speeds that were inconceivable just a couple of decades ago. So you might not notice any performance hit in an older machine even after 20 or 30 years.
Here's a question -- has anyone developed an effective way to measure transistor degradation? For people running multi-core, data-crunching monsters, that's an important question.
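As far as I know there's no consumer tool that measures transistor wear directly, but you can at least track one symptom indirectly: run the exact same deterministic, CPU-bound workload periodically and compare the timing against a baseline recorded when the machine was new. Here's a minimal sketch of that idea (the workload, run counts, and function names are all just illustrative choices, not any standard benchmark):

```python
import time
import statistics

def fixed_workload():
    # Deterministic, CPU-bound work: the same instruction mix every run,
    # so timing differences reflect the machine, not the input.
    total = 0
    for i in range(200_000):
        total += (i * i) % 97
    return total

def benchmark(runs=5):
    # Take the median of several runs to reduce OS scheduler noise.
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        fixed_workload()
        times.append(time.perf_counter() - start)
    return statistics.median(times)

def drift_percent(baseline, current):
    # Positive result means the machine is now slower than the baseline.
    return (current - baseline) / baseline * 100.0

# In practice you'd store the baseline on disk when the machine is new,
# then re-run benchmark() years later and compare.
baseline = benchmark()
current = benchmark()
print(f"drift: {drift_percent(baseline, current):+.1f}%")
```

One big caveat: since the clock rate is fixed, real silicon wear tends to show up as instability or crashes at the rated speed rather than as a gradual slowdown, so a timing test like this mostly catches thermal throttling and software changes, not the transistors themselves.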
Anyway, one of the great things about older computers is that they use very inexpensive CPUs, and a lot of those are still available. If worst comes to worst, you can find replacement parts easily. Heck, a lot of them are still in use in embedded designs and are still being manufactured.