The end of the First World War resulted in the outcomes listed among the possible answers: harsh peace conditions, a radicalization of political parties, and worldwide depression and unemployment.
It did not lead to a more peaceful world; even after World War I ended, fighting continued, mainly in parts of Asia. In general, World War I left many scars and opened many wounds between nations.