Dumb post.

By: Dr. Spock

Hitler hated America and would have attacked the USA if given the chance. 

Hitler declared war on the USA on December 11, 1941, four days after Pearl Harbor, not vice versa. The USA only declared war on Nazi Germany later that same day, in response to the German declaration.
