How did Hawaii become a US State?
The state of Hawaii is the only US state located entirely in the tropics. It is also a product of late 19th-century expansionism, when the United States competed with other major Western powers for influence around the world, particularly in the Pacific. Hawaii was a kingdom, and its monarchy was the first sovereign government the US overthrew in order to gain possession of the islands.