When America was first “discovered” by Europeans, every country wanted a foothold on the new continent. It offered a whole new way to expand their territories and grow stronger than their rivals; at this point in European history, each nation was vying to become the most powerful in Europe. […]