Of course, yes. During the First World War, it indirectly helped the Allies with weapons and economic support to defeat the Central Powers, which included Germany, before formally declaring war on Germany in April 1917. During the Second World War, the US was 'dragged' into the conflict and, from the word go, was keen on defeating Germany, though it only declared war after Germany's own declaration on December 11, 1941. Before entering each war, it either took sides when required or simply worked behind the scenes.
Did the United States declare war on Germany first, yes or no?
December 11, 1941, only hours after Germany declared war on the US.
Germany and Italy declared war on the US on Dec. 11, 1941, after the US declared war on Japan because of the bombing of Pearl Harbor on the 7th. The United States did not declare war on Germany or Italy first; it declared war on them later that same day.
Because the United States had declared war on Japan, Germany's ally.
The US did not declare war on Germany first. Germany declared war on the US on December 11, 1941. This was four days after the Japanese attacked Pearl Harbor, and three days after the US declared war on Japan. Shortly afterward, Italy also declared war on the US. The procedure is that the president asks Congress for a declaration of war, and Congress then votes on the question.
The US did not declare war on Germany first. Germany declared war on the US shortly after the Japanese attack on Hawaii.
Germany
in 1942
1917
Besides the obvious, because Germany declared war on the US hours before.
Germany
December 11, 1941, immediately after Germany declared war on the US.
Germany declared war on the US after Japan carried out its attack on Pearl Harbor.
That's what I want to know!