Definitions for "Postcolonial"
Referring to interactions between European nations and the societies they colonized (mainly after 1800); more generally, "postcolonial" may be used to signify a position against imperialism and Eurocentrism
Postcolonial refers to the period after the formal withdrawal of colonial rule in the developing world. The timing varies considerably; in the case of the former British colonies, it refers to the period after the Second World War. Postcolonialism is a term that refers to the working through of the effects of colonization on a society or culture. The study of postcolonial culture examines the various mechanisms of colonialism (e.g., political rule, economic exploitation, colonial education systems) and their long-term, embedded cultural and social implications. While many former colonies are now independent states, postcolonial studies insists on the need to recognize and understand the ways in which colonialism's effects persist in the social, cultural, and political life of those states today.