The Truth about Germany: Football
March 16, 2008
https://p.dw.com/p/DPM9

Germany is one of the greatest football nations on earth. The men's team won the World Cup in 1954, 1974 and 1990. Even the German women have won the World Cup! But is the passion for football shared by all Germans?