As shots ring out at schools across America, one cannot help but assume that America is more violent now than it has been in the past. But has America really become more violent? Violence has always been present in the world, and in America, but in the twentieth century American violence seems to have taken on a new form. Crime has become an epidemic that affects every age group, every ethnic group, and every person in a different way, but in each case the effect is detrimental. In many ways, the violence present in American culture today has desensitized society. In movies, on television, and even on the nightly news, Americans can watch brutality and gore whenever they want. It is therefore no surprise that crime has become more commonplace in American society. Does this mean that, because American society has grown accustomed to crime, violence is on the rise? Throughout America's history there have been times of increasing violence as well as times when violence seems to be on the decline. According to the latest U.S. Department of Justice statistics, American society is beginning to enter another period in which violent crime is decreasing. It is evident that social and economic conditions can bring about these periods of relative peace. America may seem to be growing more violent as it moves toward the twenty-first century, but in actuality, violence is decreasing across the country thanks to favorable social and economic conditions.
Even a brief look at America's history shows that certain periods have been more violent than others. America has been moving through a sort of "violence cycle," in which violence rises under certain political, social, or economic circumstances and then falls under others. America is currently in one of the downswings of this historical cycle. From 1984 to 1993, violent crime was on the rise. Murder and other violent crimes were being commit...