Concept information

Preferred term

workplace violence in the United States  

Definition

  • Workplace violence has always existed in the United States—indeed, during some periods of our history, fear, intimidation, and physical violence were commonplace in work settings. Contemporary expectations in industrialized democracies, however, are that all workers are entitled to a workplace free from recognized hazards. [Source: Encyclopedia of Victimology and Crime Prevention; Workplace Violence, United States]

URI

https://concepts.sagepub.com/social-science/concept/workplace_violence_in_the_United_States
