Wednesday, December 21, 2011

What did Germany get after WWI?

They didn't GET anything; instead, many rights were taken away from them, such as the right to maintain a full military and to produce weapons in large quantities, and they were assigned complete blame for World War I. These terms are laid out in the Treaty of Versailles.
