Concept information
Preferred term
workplace violence in the United States
Definition
- Workplace violence has always existed in the United States; indeed, during some periods of its history, fear, intimidation, and physical violence were commonplace in work settings. Contemporary expectations in industrialized democracies, however, are that all workers are entitled to a workplace free from recognized hazards. [Source: Encyclopedia of Victimology and Crime Prevention; Workplace Violence, United States]
Broader concept
Belongs to group
URI
https://concepts.sagepub.com/social-science/concept/workplace_violence_in_the_United_States