The US played no part in what happened in Germany after WWI. It was entirely the Europeans, who have always been worthless allies and brutal victors; they put the squeeze on Germany and then lacked the courage to take action when Hitler rose to power.
Okay, good. SOMEBODY around here reads history!!
Caveat: the USA called in its loans to Germany after the Crash of 1929 and the onset of the Great Depression -- we simply didn't have the dollars -- and that threw Germany into a serious depression, much worse than England's or France's. Did that support the rise of Hitler? Probably not really the point -- nationalism was the point -- but I guess it didn't help. And we were in the worst shape of all, so whatever. The remilitarization of the Rhineland (March 1936) was the moment of truth -- stop Hitler or not -- and it was Europe that let Hitler get away with it, but we would never have backed them if they had gone in with tanks.
Not that we should have, of course. Darn, can't these feckless Europeans do ANYthing on their own??