
In History / Middle School | 2014-09-13

The colonies that became the original United States were part of which European nation's land claims?

Asked by eloghene3

Answers (2)

England. The land was under English control, but the land itself was often owned by individuals to whom the king owed money, by proprietors (businessmen), and by the king himself.

Answered by Geekhawk | 2024-06-10

The original United States colonies were primarily under English land claims, starting with the establishment of Jamestown in 1607. By the mid-eighteenth century, thirteen colonies had developed, all governed under British charters. Other European nations, such as France and Spain, also held territories in North America, but England exerted the greatest influence over the colonies that became the United States.

Answered by Geekhawk | 2024-12-26