This is something that has been eating at me for quite some time. Given today's political climate and anti-war sentiment, I have to wonder: if the current views on war had existed in the '30s and '40s, would the Allied forces have won WWII? Or would we be living in the "German World" Hitler wanted? I think these are valid questions, because the world faces the same kind of threat today, with the Islamofascists wanting an Islamic world.

P.S. Just to clarify my position: I support the US troops, as well as Canada's.