
Concept information

Preferred term

tropical medicine  

Definition

  • Tropical medicine is the branch of medical science that addresses health problems, chiefly diseases, prevalent in tropical regions. It developed during the age of European colonialism, especially in the 19th century, when large-scale colonization of tropical areas occurred. [Source: Encyclopedia of Environment and Society; Tropical Medicine]

Broader concept

Belongs to group

URI

https://concepts.sagepub.com/social-science/concept/tropical_medicine
