The history of German foreign policy covers diplomatic developments and international history since 1871.
Before 1866, Habsburg Austria and its German Confederation held nominal leadership in German affairs, but the Hohenzollern Kingdom of Prussia exercised increasingly dominant influence, owing partly to its ability to participate in Confederation politics through its Brandenburg holding and to its ability to shape trade through its Zollverein network. The question of whether Austria's influence would be included or excluded was settled by the Prussian victory in the Austro-Prussian War of 1866. The unification of Germany was made possible by the Franco-Prussian War of 1870–71, in which the smaller German states joined behind Prussia in a crushing victory over France. The German Empire was created in 1871 by Otto von Bismarck, who dominated German, and indeed European, diplomatic history until he was forced to resign in 1890.
The new German Empire immediately became the dominant diplomatic, political, military, and economic force in Continental Europe, although it never had as large a population as the Russian Empire. Great Britain continued to dominate the world in naval affairs, international trade, and finance. The Germans tried to catch up in empire building but suffered from a sense of inferiority. Bismarck felt a strong need to keep France isolated, lest its desire for revenge frustrate his goals, which after 1871 were European peace and stability. When Kaiser Wilhelm II removed Bismarck in 1890, German foreign policy became erratic and the country increasingly isolated, with only Austria-Hungary as a serious ally and partner.[1]
During the July Crisis, Germany played a major role in starting World War I in 1914. The Allies defeated Germany in 1918. The Versailles Peace Treaty imposed punishing terms on the new Weimar Republic.
By the mid-1920s, Germany had largely recovered its role as a great power, thanks to astute diplomacy of its own, the willingness of the British and Americans to compromise, and financial aid from New York. Internal German politics grew frenzied after 1929 under the impact of the Great Depression, leading to the takeover by Adolf Hitler and the Nazis in 1933. They introduced a highly aggressive foreign policy in alliance with Italy and Japan. The British and French attempted appeasement in 1938, which only whetted Hitler's appetite for more territory, especially in the East. Nazi Germany had by far the most decisive role in starting World War II in 1939.
Since 1945, Germany has recovered from massive wartime destruction to become once again the richest and most powerful country in Europe, this time fully integrated into European affairs. Its major conflict was that between West Germany and East Germany, with East Germany a client state of the Soviet Union until the latter's collapse. Since the 1970s, (West) Germany has also sought to play a more important role internationally.[2] After the collapse of Communism in 1989–1991, East Germany was merged into Germany, and Berlin became the capital of the united country. NATO expanded to include the former East Germany and most of the East European countries that had been Soviet satellites. Relations with Russia worsened after the seizure of Crimea from Ukraine in 2014, although Germany depends on Russia for much of its energy supply and Russia needs the cash payments for oil and gas. Relations with the United States were tense during the Presidency of Donald Trump (2017–2021) but improved at the start of the Presidency of Joe Biden (2021– ).