Interventionism refers to the belief that a state should intervene in the affairs of other states to promote its own interests or values.