What does it mean to “Decolonize”?
I love that we're talking about decolonizing everything these days. I'm seeing calls to decolonize education, decolonize birthing, decolonize social work, decolonize healthcare. Yes, yes, yes, a million times yes.
And, what are we actually talking about?
Does decolonizing mean centering indigeneity? Are we talking about abolitionism? Is this addressing white supremacy? Are we dismantling systemic power-over relationships? Where do we begin?
I can't speak for these movements. Just as there are many ways to be anti-racist, there are many approaches to decolonization.
What I can speak to is what I mean when I say Decolonize.
In my work with women of color, Decolonizing the Body means being in the practice of unlearning embodied white dominance -- how our bodies have learned to contort in order to survive under systemic oppression.
Whiteness is a fallacy, a mirage seeded by imperialist, colonialist value systems and the Christianization of the West. It requires an erasure of the self to earn a kind of "conditional" belonging. More on this here.
We learn to disconnect from our power
Take up less space
Doubt our abilities
"Prove" ourselves through hard work
Be nice, etc., etc.
We often know when we're doing this. We see how we discount ourselves. But, try as we might to reclaim confidence, we still feel deficient.
Feel is the key word here.
We can't think ourselves out of our embodied patterns. In fact, an over-reliance on our heads and analytical brains is itself a symptom of colonization.
Reclaiming ourselves means being in a practice of feeling for and beginning to trust the knowing of the body. This is where the decolonized self lives. It is the realm of intuition, creativity, spirit, ritual and interconnection. It is where we contact the undiminishable ancestral inheritance that lives in our bones. Join me here?
In Belonging,
Kelsey