Ted Santi
Wednesday, December 21, 2011
What did Germany get after WWI?
Germany didn't get anything; instead, it had many rights taken away. Under the Treaty of Versailles, Germany lost the right to maintain a full military and to produce weapons in any significant quantity, and it was forced to accept complete blame for World War I.