Definition of Deutschland

  • 1. (noun) A republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990
