Western Christianity
English
Proper noun
- The form of Christianity traditionally practiced in Western Europe, consisting essentially of the Roman Catholic, Protestant, Anglican, and Old Catholic traditions.