I think it is necessary for Americans to know the truth of the Vietnam War. Many Americans still believe that the United States was victorious, and this belief shapes how they perceive the ongoing conflicts the U.S. is involved in. There is media spin coming from all directions. Vietnam taught us that the truth about a conflict can remain elusive unless it is dealt with honestly.