I don't want to start an argument. But this is a question I've had ever since the first time I heard the sentiment.
I have a hard time believing there are Southerners who are still bitter that the Union won the Civil War. I was born and raised above the Mason-Dixon line, so geographically I guess I'm a Northerner. But I've never felt like I "won" anything. I don't have any animosity toward the South, I don't think of anyone as a Reb, and I rarely think about the war's effects on our country.
I just don't get it. Does anyone have an explanation?