
What is a Peace Treaty?

Alan Rankin

A peace treaty is a formal agreement between two or more countries that have been at war. The treaty is a contract designed to bring hostilities to an immediate end. It can also lay out the conditions for surrender, reparations, or other requirements meant to ensure a lasting peace. Many of history's most famous wars ended with a peace treaty, including the Napoleonic Wars, the American Revolution, and World War I. In modern times, complicated conflicts often call for a more detailed resolution than a single treaty can provide.

Wars are conducted by governments or other opposing bodies for a variety of reasons. When one force has superior combat strength, the war often ends with bloody conquest. When forces are evenly matched, or nearly so, fighting can last for months or years, with tremendous losses on both sides. In these cases, diplomacy often triumphs where brute force fails to resolve the conflict. A peace treaty is a common way for opposing forces to end hostilities and begin the process of rebuilding.

The United Nations, which is headquartered in New York City, was designed as a forum where international conflicts could be resolved.

The earliest recorded peace treaty, the Treaty of Kadesh, ended a war between the Hittite people and the Egyptian empire in roughly the 13th century BC. The treaty has been preserved on clay tablets; a replica of the agreement is on display at the United Nations headquarters in New York. In modern history, Paris has long been a favored site for the signing of peace treaties. The American Revolution ended with the 1783 Treaty of Paris, while Napoleon's final defeat was sealed by the Treaty of Paris in 1815.

The Treaty of Versailles marked the end of World War I.

A peace treaty does not always mean the end of hostilities. During the Indian Wars of the 18th and 19th centuries, the American government was notorious for disregarding treaties it had previously signed with Native American tribes, and further fighting resulted. The Treaty of Versailles, which officially ended World War I in 1919, imposed harsh sanctions on Germany, the defeated party. These sanctions crippled the German economy and caused bitter resentment among the German people. Many historians believe these conditions contributed to the rise of the Nazi Party and the Second World War.

Many historians say the Treaty of Versailles gave rise to Nazi Germany.

In modern times, war rarely ends with a comprehensive peace treaty. The Vietnam War, the Korean War, and France's Algerian War ended with accords or armistices that wound down combat gradually over a period of years rather than bringing an immediate peace. Modern war is a messy business that is difficult to control through such civilized measures as a signed contract. Nevertheless, diplomats work diligently to ensure that conflicts are resolved, one way or another, by eventual peace.
