The United States in World War I

The United States entered World War I in 1917, fighting on the side of the Allies. Although the US contribution to the fighting was small relative to that of the European powers, the war's effects on American life would persist into the 1930s.