In 1942, it became clear that World War II would drastically change the United States forever. The war would help bring racial equality to the American workforce and give Black soldiers the chance to serve in high-ranking military positions. This is...