West Germany
West Germany Definition
(n) a republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990
West Germany Synonyms
Federal Republic of Germany
West Germany