Category Archives: Op-Ed

Is College Ready for Me?

Why does college readiness inevitably refer to the students’ readiness to grapple with systems that are insufficient to support them? It should refer instead to the readiness of a college to be held accountable for the success of students to whom it has marketed itself as a supportive and safe environment. In the past few years, Wellesley has made an effort to support POC, low-income, and first-generation students. At their core, though, all of these efforts place a large burden on the students themselves to spend the time, effort, and energy to find help in navigating academia. But don’t worry—administrators will serve you cake to celebrate diversity and visibility.

The problem with most of the efforts to promote diversity at Wellesley is that they are focused on mentorship. I’m all for mentorship as a component of a robust effort to bring about inclusion, but on its own, it’s only a band-aid solution, and a taxing one at that.

I’m told to find a mentor who understands my experiences. So I go to my Latina mentor. Then I go to my gay mentor. And my physics mentor. My mentor who was a first-generation college student. One who grew up low income. The one who descended from immigrants. There aren’t enough hours in a day to go to every mentor I need to fully realize the complexity of my identity and cobble together some sort of strategy for propelling me through a system that was fundamentally not designed to enable a person like me to succeed. Of course I’m grateful for all of my mentors and everything they do for me, but after spending so many hours getting mentored and designing a strategy, I’m exhausted. Except that it’s not time to be exhausted; it’s time for me to put forth my best effort to execute the strategy. And even then, there’s a huge chance that that strategy won’t work. It’s not a well-trodden path, so we’re shooting in the dark.

Mentor me all you want, the system’s still screwed.

Mentorship has taught me how to file a Title IX complaint when I’m facing harassment from my professors. But mentorship doesn’t teach the professors not to harass me. Mentorship has taught me how to respond to a professor who refuses to believe me when I say I’m sick. But mentorship doesn’t teach professors not to treat me like a serial liar. Mentorship has taught me to fight and fight to prove wrong the people who don’t believe that I’m capable of succeeding. But the fighting is tiring, especially when the people who are supposed to be on my side are actively undercutting me behind my back.

Of course, there are professors who try to support me, but they’re so far outside my experience that making their support effective requires training on their part. The way academia is set up, though, jobs are highly competitive. For a professor who has not yet received tenure, taking time off from research to participate in inclusivity trainings can negatively affect a career. And for a professor who has received tenure, there are virtually no repercussions for being problematic toward students, no matter how many complaints are made.

To the administrators: as for the cake, shove it up your ass. Congrats on “seeing and hearing” POC, low-income, and first-generation students, I guess? Congrats on “seeing and hearing” the struggles while doing very little about them. Congrats, but until I see equitable treatment of all students, I’ll go ahead and assume this means you acknowledge the struggles, but you don’t care enough to effect any real change.

I’m calling for college readiness, and by that I mean the college’s readiness to deal with the type of students it has never had to consider before. Train all of your professors, not just the ones who express interest. Critically examine your practices to see who isn’t benefiting and why. Engage with the problem instead of just hiring a few diversity representatives to engage for you. To hell with “that’s the way academia has always been.” If you’re looking for a change in the people represented in your system, you have to change the system to accommodate those people. That’s not negotiable.

Op-Ed: Resisting the Couch Party

The president of Egypt can now legally stay in power for 11 more years. But rest assured, he will not be in office forever, because, according to President Abdel Fattah ElSisi, “there’s no such thing as ruling for eternity. We all die at some point.”

On the 23rd of April 2019, Egypt passed the most dangerous constitutional referendum in its history, with an alleged 88.8% of voters saying YES to the proposed changes. Admittedly, it’s a slightly more creative percentage than the presidential election of last year, which ElSisi won with 97% of the vote after jailing all but one of his opponents.

Faced with a sham referendum that, among other changes, extends the presidential term until 2030, some Egyptian opponents of the government decided to go vote No, despite the predictable result. By doing so, they hold on to a fleeting sense of hope and raise a voice of dissent against the regime, albeit a muffled one.

Photo of government posters in Tahrir Square. Cairo, April 2019. [MOHAMED EL-SHAHED/AFP/Getty Images]

No one, of course, was surprised by the April results. In fact, we Egyptians have grown accustomed to these types of elections, complete with DJs and government-commissioned music videos accompanied by dancing supporters in matching T-shirts. Not to mention the subsidy boxes exploiting the poor and buying their votes in exchange for two packets of flour and a month’s supply of cooking oil. In the past, people in the opposition have called for a boycott of elections in an effort to expose the illegitimacy of the vote and to refuse to participate in what many want the world to know is simply a travesty of democracy, staged by a dictatorial military regime.

What was interesting this time around is the online movement by regime opponents to actively participate in the referendum and vote No, rather than the usual boycott. For days – and it was only days between the announcement of the amendments and the start of voting – my social media feeds were filled with friends in the opposition debating the vote, asking themselves and each other the common question: “What’s the point?” The tired and the cynical made fun of those supporting the movement to vote No and popularizing the hashtag #انزل_قول_لأ (Go say no); the latter responded with impassioned posts about our country’s future and our people’s lost revolution.

When the revolution erupted in January of 2011, I was only 14 years old. Politics were not really a topic of discussion in our house, but I knew my parents didn’t like then-president Mubarak. For an Egyptian, self-censorship is a skill passed down at an early age out of fear of the reach of the police state. In fact, I remember one of my first lessons. One day in primary school I was singing a jingle from a famous advertisement while walking home with my father. The song was for the popular cheese product La Vache qui rit, The Laughing Cow. And to my father’s surprise, I turned to him and asked loudly, “Why do we call Mubarak the laughing cow!?” … Unknowingly, I was referring to a derogatory caricature of Mubarak, a tacit form of humorous resistance commonly used as a tool of popular dissent. At the time, I was not old enough to be let in on the joke, but this was the day my father explained to me what a plainclothes informant is and why we don’t talk about politics in public. Or at all.

Former Egyptian president Hosni Mubarak. The Laughing Cow cheese logo. PC: Getty Images / La Vache Qui Rit Arabia

My parents’ generation is one that experienced an era of disillusionment and political disengagement, brought about by the military’s dominance of the political process since 1952, a period that gave rise to what is now known as the “Couch Party”, a term coined during the revolution to refer to the silent majority of politically detached Egyptians, whose participation in public life does not extend beyond the couches of their homes. Many in the Vote No campaign this year warned that boycotting the referendum would be a return to the same political stalemate; a return to the proverbial couch.

The 2011 revolution broke through a barrier of fear and political paralysis. When millions of Egyptians took to the streets and ousted Mubarak, not only did politics become a topic of discussion in every household, but the people actually saw themselves as participants in the political process, agents for change after years of oppression, suffering and inequality.

That sense of agency was short-lived. When people mobilized again in 2013 to protest the Muslim Brotherhood’s push for more power, their movement was hijacked by a military coup, masked as a response to popular demand. Today, many call the series of 2011 uprisings, dubbed the Arab Spring, failed revolutions. Apart from Tunisia, all the countries that saw massive protests in 2011 are today far from meeting any of the revolutionaries’ demands, the removal of the president simply being the tip of the iceberg in most of these cases. Just to name a few, Yemen is entangled in an ugly proxy war, Libya has become a failed state, Syria is committing war crimes against its own people in its torture facilities, and Egypt has come full circle back to brutal military dictatorship.

ElSisi’s is the most oppressive and relentless regime Egypt has ever seen. Since his rise to power in 2014, we have witnessed countless executions without due process, waves of forced disappearances, detention of young people and activists, and torture, which, as in the case of Giulio Regeni, an Italian graduate student researching labor rights, can end in murder. Most recently, we were traumatized by the images of people burning to death after a railway accident in Cairo’s main station, which exposed the level of neglect to which the regime has allowed the basic infrastructure to descend. Repairs to the railway were proposed last year, but in a televised conference the president brushed off the proposals, claiming they were not worth the investment.

Indeed, in ElSisi’s Egypt, human lives are not worth the investment and human rights have no place. Neither does the rule of law. The constitutional amendments extend the presidential term to six years and allow for two consecutive terms. The changes will be applied retroactively to the president’s current term, extending it until 2024, but they also disregard the president’s first term, which ended in 2018, making it possible for him to stay in power until 2030… or until the next referendum. The amendments also extend presidential powers over the judiciary and put the military in charge of “protecting democracy”. The final text was never made available to voters and was not posted at voting stations.

In the few days between the announcement of the referendum and the window to vote while abroad, I tried to arrange a trip to NYC in order to vote, but it was impossible at such short notice. I had spent the entire week arguing online as I made up my mind about whether to Vote No or to Boycott, and had eventually decided I wanted to join the efforts to Vote No simply in order to fight passivity.

The entire operation was illegitimate and undemocratic, and the turnout figures were clearly inflated, claiming a 44% participation rate (~27 million people) despite empty polling stations. Those pushing for a No vote were fully aware of that. In fact, this movement actually has nothing to do with the results, but only with the act of participation. It is a movement against passivity, against defeatism, and against the reemergence of the politically unengaged Couch Party. It is a movement of hope to remind the people, but also those in power, that once the barrier of fear is broken, nothing can reimpose it. Despite the brutality of the current regime, the movement to Vote No in this referendum served as an attempt to hold on to this glimmer of hope, to the promise of the revolution, and to the memory of those who died fighting for it and those who are still in prison for believing in it.

Burning Churches Can’t Burn Down Their Structures

I opened my phone on April 15th to find a barrage of news notifications about a tragedy in France. Notre-Dame Cathedral had burned down. I felt mournful for this gorgeous piece of architecture and part of French culture. I began reading the news articles wondering about the magnitude of the tragedy and the number of lives lost. What I learned shocked me. No one lost their life, and the main structure of the cathedral and its two iconic towers were still intact. A measured tragedy, I thought, but a tragedy nonetheless.

What I learned in the following hours was more astonishing. Massive donations had poured into the rebuilding of the cathedral within hours of the inferno. Hundreds of millions of euros came in from French billionaires. First, 100 million euros from François-Henri Pinault, head of the luxury goods group Kering that owns brands like Gucci and Saint-Laurent. Hours later, not wanting to be outdone, a rival billionaire, Bernard Arnault, France’s richest man and CEO of LVMH, donated 200 million euros. More donations from the ultra-wealthy in France continued to pour in. The cause gained international attention too; even the United States government pledged its support. In just a day and a half, 880 million euros (995 million US dollars) had been raised.

All right, great! This landmark cathedral will get rebuilt. People were able to mobilize money rapidly and put it to use. The French State accepted millions in funding for Notre-Dame, in spite of the fact that it is a global symbol of the Catholic Church — one of the world’s wealthiest organizations. People will be able to keep travelling to simply revel in the glory of this cultural epicenter and post glamour shots to Instagram. Christians, who claim to follow a religion of giving to the poor and caring for thy neighbor, will get their ornate roof fixed with money that seemed to materialize overnight. Super. This structure will remain intact. But it makes me wonder. What else could this instant billion have gone to?

What about the three historically black churches that burned down – not by accident – in Louisiana less than two weeks before? Do those who worship at these places deserve compassion any less? These churches put out an ask for a mere $1.8 million between them, and although they were able to raise it, for a while they were hard-pressed to get even that much. And what about the atrocities in Sri Lanka on Easter Sunday, the hundreds of people who lost their lives in the bombing attacks on churches? Do they not deserve donations to rebuild their places of worship? Where is the outpouring of international support for the hundreds of lives lost there? What about the families in Flint, Michigan, who have not seen clean tap water in nearly five years? It is estimated this crisis could be resolved with only $55 million. Perhaps instead of the U.S. government pledging support to France, it could have helped the families poisoned on American soil. What about Puerto Rico? Where was the money needed to rebuild in the wake of Hurricane Maria in September 2017, a tragedy that not only destroyed much of the island physically, but took out nearly 4% of the population? Experts say it would only take $139 million to recover from this disaster. And finally, what about our planet? Did these billionaires think about the fact that we may not even be here in 50 years to see this rebuilding if we don’t start making rapid progress towards dealing with climate change? Where is the money for that?

Let’s step back for a moment. The total amount needed to fix the burnt-down black churches, provide clean water to Flint, and aid Puerto Rico is only $195.8 million. Even if this money were sent to these other crises, there would still be plenty left to fix Notre-Dame. So why is it that donations flooded the Cathedral while the money needed by poor and marginalized communities is nowhere to be found? It’s because the same structures of power that built the Catholic Church and keep it running are the ones keeping these communities down.

The huge donations to Notre-Dame reflect a legacy of colonial and white supremacist power structures that determine far too many events in today’s society. The Catholic Church exists to perpetuate the supremacy of white Catholics, and the church’s imperialistic history cannot be ignored. The colonization of disadvantaged or marginalized communities and cultures brought the church to power, and in order to retain this power these communities must be kept down. In the same vein, the ultra-wealthy are in positions of power due to a history of systematic oppression and power grabbing. More often than not their wealth is the product of a legacy of exploitation, and this legacy has not changed, only morphed with the times. Today, we see an outpouring of support coming from those who benefit from these structures of power to the very structures of power that have put them in their place of privilege and continue to do so.

This is a dangerous, self-perpetuating system, one reflected in nearly every aspect of the Notre-Dame fire, even down to how the donations are structured. There is a 60% tax break on Notre-Dame donations — allowing those who have profited most from our late-stage capitalist society to continue to reap the benefits. With this tax break, for each 100 million euros donated, the French billionaires get 60 million back, plus all the publicity they enjoy for being generous supporters of French culture. And they get to do this while still holding amounts of wealth that, if donated or redistributed a bit more thoughtfully, could do some serious good. And by serious good I don’t mean fixing the roof on a building that is symbolic of the oppressive colonial behavior that condemned these disadvantaged communities to where they are today.

While people die, the planet burns, natural disasters wreak havoc, and racist attacks take place, the ultra-wealthy spend their money in ways that serve only to keep their power structures intact. Ironically, in this case that means keeping a physical structure intact. The power systems that brought the Catholic Church to prominence and lined the pockets of these billionaires are the same ones that kept the burned-down black churches in Louisiana and bombed churches in Sri Lanka from seeing massive publicity and support, poisoned the people of Flint, and let the people of Puerto Rico go without reparations. Maybe it’s time to turn the tables.

Take Better Care: Is Self-Care Truly The Ideal?

“Self-care is not selfish.” Since I first stepped onto the Wellesley College campus in the Fall of 2015, I have encountered this phrase everywhere I go. I’ve seen it in Orientation programming and on dining hall posters–I’ve even heard it from the mouths of my friends. Eventually, I began to say it myself. As a Resident Assistant, I talked endlessly about how to maintain a certain level of self-care, handing out advice and posting literal “how-to” sheets, as if there were some kind of magical formula. “Hey, you! Yeah, great, take care of yourself! Go take a nap,” one of my own posters basically read. How nice and encouraging.

And this obsession with superficial self-care is not just a Wellesley phenomenon. It’s recognized by numerous sources on the national level, including US News, Psychology Today, and Forbes. I agree that the practices the “self-care” mantra promotes are important. Everyone needs to take care of themselves in order to survive.

But here’s the catch: Self-care as an ideal mode of living is premised on an intrinsic mistrust of the community around you. Go take care of yourself, because no one else is going to do it! That will obviously make you feel loved.

This messaging becomes even more problematic when its source is a system of power. For example, when it comes from the college administration itself, the liability for students’ wellbeing is then shifted onto the students. Does the administration really believe that it’s enough to want the best for its students while not following through with serious action? Or is the college just too ill-equipped to be effective or even strategic in the first place? The school is basically telling students that they have to deal with their issues on their own, except it’s packaged in such a friendly manner that it hides the institution’s unwillingness to take responsibility for its students. This self-protective approach is exposed by the shallowness of #wellness events on campus. Have you ever had all your problems taken care of by going to a night of bingo? I sure haven’t.

It’s an unacceptable societal problem that the only care you’re receiving is from yourself. When life gets hard, you won’t necessarily be able to keep it up. Maybe you’ve done it–pulled yourself up by your bootstraps. If so, you probably don’t see it as such a happy, low-stress feat. It’s heavily taxing, because if we were meant to solve all our problems on our own, we wouldn’t be living in communities in the first place.

I think a lot of this extreme focus on the self comes from living in our supremely individualistic society. Specifically in college, I feel like I’m constantly confronting the American ideal of lone-ranger independence. A white American friend of mine from the Midwest told me how there’s no way her grandma is ever going to live with her parents–it’s a retirement village or nothing. What! Coming from a family with members who immigrated in later waves, I could never say these words to my parents, and they could never ever say them to theirs. In less individualistic, non-American cultures, telling people to practice self-care is inherently selfish–on the part of the person doing the well-wishing. It’s basically saying this community you have will cheer you on, but you’re the one who has to land the routine. However, if you love them enough to give them (unsolicited) advice, why aren’t you doing the routine with them?

Also, the term “self-care” originally referred to therapy for people who either weren’t being sufficiently cared for or were so dependent that they needed to feel some autonomy. It’s about surviving. Not thriving. As Slate points out in an article on the history of the term, self-care was historically given as medical advice to dependent patients, as a coping mechanism for those in trauma-related professions, and later as a resistance effort for marginalized groups. Nowadays it’s tossed around casually, as in on-campus posters–“Self-care and Face Masks”, “How to Self-Care During Exams”, “Take Care of Yourself and Pet a Puppy”. These are fun things! But for me, it’s never been these events that have gotten me through. It’s the people around me who have actively made sacrifices to care for me, who have sat with me until 3 AM, who have never given up on me even when I gave up on myself.

I believe that in order to truly thrive we need to support and pour life into one another. That’s the world I want to live in–one where a college orientation doesn’t have to emphasize self-care, because it already consists of a community of people who will radically love each other without being instructed to do so. One where no one is using curt hashtags and administration waivers to shirk the responsibility of actually caring for one another, because the needs of the administration and the students no longer diverge. Instead, we’re all members of humanity, and we no longer need to rely on our singular abilities. With everyone working together, we can all help each other take better care.

I am neither African nor American

I often wonder why some people find the term “African American” so comforting. Does the repeated vowel sound have alliterative appeal? Has history made people afraid to say ‘the B-word’? Is it too harsh? Jarring? Abrupt? I have encountered an alarming number of instances in which people like me have been tossed into the African American box, as though we B-words were some sort of monolithic group whose members could all be referred to by the same name. People like me aren’t a uniform group. Just ask the 1 in 10 people who are too often identified as African American when they are in fact foreign-born. This Jamaican woman is tired of being called African American. Here’s why.

Calling someone African American is reductionist. The complex reality of the personal significance of space and place is reduced to a label that is casually thrown around by persons who do not take the time to examine the words they are using.

Please tell me, dear white journalist/survey writer/commenter/friend, what is so scary about the B-word. Historically, yes, the B-word was considered offensive in the United States. So was interracial marriage. There are people alive today who were around when “negro” surpassed “colored” as the accepted term. “Negro” was socially acceptable for a very long time—in fact, until the Black Power Movement of the 1960s. The point is that this is a complex discussion that history cannot explain away. Historical context is no excuse for a lack of precision in the language we use. If the B-word is taboo, it shouldn’t be. No one should hesitate to say it, as though it’s something unkind or forbidden. There needs to be a paradigm shift in the way that we talk about the members of the incredible melting pot that is the United States of America.

Tell me why, when I fill in a demographic survey, there are still places in which the B-word is associated with that “African American” modifier. Tell me why my white friend from South Africa is less African American than I am. Consider for a moment the conflation of race, culture, historical context, and geographical location that has resulted in the absurd fact that the term “African American” is associated with skin color and not country of origin. I am sensing a double standard here.

Yes, I know, it could be argued that this is a simple question of usage or verbal habit that has nothing to do with semantics. Why does it matter what we’re called if the intent isn’t racist, bigoted, or ill-meaning? Answer: it matters because “African American” is not who I am. Even if I did have an American passport—which I don’t—what gives you the right to label me as African? What about all the uniquely Caribbean aspects of my culture that are distinct from those of my African-identifying counterparts? If your point is that my ancient ancestors came from Africa, then I have news for you: if you go far enough into the past, yours did too.

The fact of my African ancestry should not determine the term by which I am to be permanently identified. Don’t call me a negro. Don’t call me colored. Hell, you don’t even have to call me a person of color. Please, don’t call me African American. For crying out loud, just call me black.

If You Want Diversity, Open the Gates

Computer science is hard. That’s what everyone says, and they’re right. It’s not just because you’re learning an entirely new language with its own rules, syntax, and semantics, and it’s not because programming doesn’t click right away for everyone. It’s not even because it can be a struggle to find teachers to help you learn it. The real reason that computer science is so hard isn’t even related to the study itself; it’s because if you aren’t a White or Asian man, you’re in a field where the odds are stacked against you from the very beginning.

I walked into my first computer science classroom my junior year of high school and was one of three women in a class of thirty people. Then I went to my first computer science class at Wellesley, with no cisgender men in sight. At last, no working with just select, trusted male friends, carefully vetted before the class; at last, no men trying to explain something that I already knew better than they did; and at last, no more enduring the endless comments about how easy something must be if a girl can do it. It was a breath of fresh air.

But one thing didn’t change: the gatekeepers were ready and waiting.

The majority of students who graduate from college with a computer science degree are White or Asian, and the majority of those students are male. Studies point out numerous reasons for this, from the cultural stereotypes that surround computer science to the sexism that pervades the field in companies and universities. But the crucial element that discourages diversity in computer science? Gatekeeping—that is, controlling and limiting access to something. It’s not just the big companies that do this—it starts with schools failing to reach those who need the most help.

You’d think a place like Wellesley College wouldn’t have this problem. Yet last year, a CS professor at Wellesley accidentally sent out to the CS student body a document meant for faculty. It contained a list of “problem students” who, according to a subsequent faculty explanation, were struggling in class and needed to have a watchful eye kept on their progress. While we’ll never know for sure the list’s exact purpose, some students correctly identified it for what it was: a key component of gatekeeping. Just by seeing their names on it, the listed students were discouraged from continuing their classes because they were taking more time to program than they supposedly should. This document heightened awareness of how professors and lab instructors keep each other informed of which students are doing poorly. Ostensibly, instructional faculty do this so that those students can receive more help, but it’s unclear how many students received that help and how many dropped their major after the list was circulated.

CS faculty are the first to welcome anyone into the department, but the gatekeeping at Wellesley is much more overt than the professors think. The first two introductory CS classes have long been known for “weeding out” students who want to go into CS; discouraged by their difficulty and the time sink required for a passing grade, many drop the courses and pursue other interests. Even as the department grows with rising interest in coding, these two classes prevent too many students from experiencing the fascinating aspects of computer science.

It shouldn’t be this way. Programming and the knowledge surrounding it should be accessible, especially since the world around us is becoming increasingly reliant on computers—your smartphone has artificial intelligence built into all of its core systems, and in the last ten years, virtual reality has left clunky machines behind and moved to affordable headsets. But gatekeeping keeps computer science out of reach for students who don’t meet the minimum requirement (read: White or Asian cisgender male). The assignments for many CS classes list the number of hours it takes to complete them; go over that hour limit and you are supposed to seek help from faculty or peers. Imagine how demoralizing it is to see some peers breeze through assignments when you have to constantly go to the help room and office hours. For some, it’s easier to just give up and take different courses. Who wants to spend fifteen hours on a single problem set?  Who wouldn’t rather do anything else?

Help room and office hours can be intimidating for a first-year student, particularly for first-generation students and students of non-White and non-Asian descent. No one should be ashamed of how long it takes to learn to program. It’s time for colleges in general—and Wellesley in particular—to have a more robust program in place to catch students who slip through the cracks. It’s time for them to promote the diverse environment they claim to support.

When foreign language instruction is cultural appropriation

I remember the first Spanish class I took in middle school. The teachers gave us a handout outlining the benefits of learning a foreign language. Enhancing memory, preventing cognitive decline, and improving job prospects were all listed on the sheet. Determined to learn something useful, I felt eager to learn Spanish, the first language I would acquire that wasn’t native to me, after Urdu and English. In reality, I was enticed to learn a new language primarily for the sake of advancing my own prospects. I’m sure I wasn’t alone. When this happens, it isn’t the learner who is to blame. Certain approaches to foreign language instruction are in fact based on principles of cultural appropriation.

Cultural appropriation is the adoption of another culture’s elements, practices, or customs in the absence of an understanding of the context behind them and with the intent of benefiting an interest that is often unconnected with the culture. We’ve seen cultural appropriation before. Take Beyoncé and Coldplay’s music video “Hymn for the Weekend,” a clip that depicts India as an exotic land and features Beyoncé appropriating traditional Indian dance moves while dressed in Indian-style garments. Produced more for revenue than for spreading cultural awareness, the video takes advantage of another culture and uses it to attract more video views. It also demonstrates a lack of understanding of Indian culture by trading on stereotypes in the form of levitating holy men and people dancing in the streets.

Though acquiring a new language is advertised as the key to connecting with people from other cultures, cultural appropriation occupies this domain too, and its claims are just as seductive. That middle school handout reappeared when I was taking Spanish in eleventh grade, and yet again in twelfth grade. These constant invitations to learn a language for solipsistic benefits, a passport to personal advancement, prompt the question: are we really teaching students how to understand and appreciate other cultures? Or are we encouraging them to take advantage of another culture and language in order to flaunt multilingualism on a résumé?

Emphasis on learning a language solely to enhance one’s prospects isn’t the only way cultural appropriation can affect teaching approaches. Any pedagogy that devalues the relationship between culture and language in the service of some other goal is evidence of cultural appropriation as well. In early March, April Rose, a state delegate in Maryland, proposed a bill authorizing county boards of education to allow computer programming to fulfill students’ foreign language graduation requirements. Rose told the Carroll County Times that allowing computer programming to satisfy foreign language requirements would “provide more access to … classes that really provide true workforce skills.” While coding does involve gaining fluency in programming vocabulary and syntax, it doesn’t replace learning about another human culture. Languages are not just for communicating; they are also for understanding. Stripping students of exposure to another form of human connection gives the impression that it’s not necessary to learn how to exchange ideas with people from other cultures. Furthermore, while learning to code may prepare students for getting a job after they complete their education, there’s no guarantee that they will thrive in that workplace if they do not know how to establish relationships with people from other cultures.

Does this mean that all foreign language instruction programs are guilty of cultural appropriation? Not necessarily. Programs that teach learners to appreciate the cultures associated with a language are not examples of cultural appropriation; on the contrary, they incorporate forms of cultural education in order to teach language. It isn’t enough to simply learn the vocabulary and grammar rules of a language; it’s also essential to learn cultural context. Take the “you” pronoun in Urdu, a language commonly spoken in Pakistan, as well as parts of India, Bangladesh, and the Middle East. In Urdu, there are three forms of “you” — one extremely informal form reserved for animals or those who are “inferior”, another relatively informal form used to address children or close members of a family, and a very formal form that signifies respect. A speaker ignorant of the cultural context that dictates when to use each form could easily misuse it and offend someone.

Promoting cultural education goes quite naturally with teaching a language. Teaching students about the countries where a language is spoken, incorporating literature and film into instruction, and offering travel experiences are all ways that foreign language programs can help learners appreciate and acknowledge the culture of the language they are trying to learn. This value and respect for other cultures is what will enable them to connect more easily with others, understand and accept cultural differences, and become global citizens.

If You’re Going to Have a Multicultural Requirement, Do It Right

Liberal arts colleges have recently struggled with issues of diversity on their campuses. Minority students are often underrepresented and under-supported, and they face micro-aggressions on a daily basis. On some campuses, tensions around issues like race have exploded, as at Middlebury College, where a talk by the libertarian political scientist Charles Murray caused intense controversy in 2017. Liberal arts colleges, like Middlebury, have begun putting into place various programs to deal with their diversity issues. Students at many of them must now fulfill multicultural credits to graduate—credits added to the traditional distribution requirements in the fields of mathematical reasoning, natural sciences, social sciences, history, and foreign language.

Unfortunately, schools are putting little effort into creating and enforcing these new requirements. If schools are going to mandate multicultural coursework—which they should—they need to put in the work to make sure that it is truly opening students’ eyes. It is often said that the purpose of college is to expose students to broader perspectives and new experiences, and push them outside their comfort zones. If this is true, isn’t learning about cultures other than their own an intrinsic part of this education? Take liberal arts schools, which aim to give students a well-rounded education that makes them knowledgeable about many subjects, as opposed to the career-focused approach they might find at non-liberal arts schools. This well-roundedness is meant to challenge students’ beliefs, make them critical thinkers and writers, and prepare them to become global citizens. In this light, it’s obvious that multicultural requirements are as important as any other area that liberal arts schools might require.

These requirements are often constructed in ways that defeat their purpose. While many schools have caught on and made multicultural credits necessary for graduation, they are not always doing so thoughtfully or developing the new requirements in ways that will effectively challenge students—a deficiency that is particularly relevant to the many liberal arts colleges that remain majority white and have low populations of international students. Colleges must take special care in developing multicultural requirements because what constitutes “multiculturalism” is not as clear-cut as natural science or literature. Multiculturalism can take many forms–learning about cultures foreign to one’s own experience, learning about minority cultures in one’s own country, coursework that takes an intersectional approach to issues within the student’s culture. The central focus of multiculturalism must be learning about experiences outside of the student’s own. Some schools are doing this better than others; some are doing it worse. At Middlebury, for example, multicultural requirements have met resistance in the form of student outcry—students must have coursework in two multicultural areas: one course about Europe, and one course from AAL—Asia, Africa, and Latin America. This is a problematic division of mandatory coursework because it treats European/Western civilization as equal in importance to Asia, Africa, and Latin America combined. In the wake of recent negative feedback from the students, the school’s administration is considering revising this curriculum decision. It is right to do so.

Another issue at some schools is allowing classes about minority cultures in the United States to count for their multicultural requirements. Now, this is not inherently a bad thing, and in some cases it is done well—at Mount Holyoke College, if a class about a Western country or North America is to count, it must be about people of color in that country or region, or about people in North America whose primary language is not English. On the other hand, at Wellesley College, classes about queer culture in the United States are deemed adequate—even though some that count toward this requirement do not include the study of queer people of color, which is essential to a class that is supposed to be “multicultural.” These classes have the potential to be multicultural, but as currently constructed are not fully so. In many cases, it all comes down to what is being studied in these classes and how the curriculum is structured. For example, French language and culture classes that don’t address people of color in Paris or countries other than France that have large French-speaking populations, such as the many African nations once colonized by France, don’t count. In most cases, unfortunately, it can be assumed that schools are not taking the right approach. Some colleges are conspicuous offenders, like Columbia University: classes such as Intro to Geography and Intro Biology can satisfy Columbia’s multicultural requirement. It goes without saying that these courses should not count.

Colleges just aren’t trying hard enough to diversify the perspectives their students encounter in the classroom. It’s not just multicultural requirements: it’s literature classes that don’t include any authors of color and women’s and gender studies classes that don’t include the intersections of class, race, sexuality, and gender. College multicultural requirements are an opportunity for academic institutions to challenge themselves and their students. Any college that allows an intro to geography course to satisfy a multicultural requirement is definitely not trying hard enough–even if it has plenty of company.

The Boys’ Club: An Antiquated, Entitled System of Oppression

No matter where you look it seems like there’s a new headline emerging about admissions fraud or gender discrimination, as elites manipulate the situation to their advantage. As if it weren’t bad enough that the less fortunate, myself included, have to compete with affluent students whose parents can pay for private school, tutors, test preparation courses, coaches, campus visits, and more to sharpen their academic skills and burnish their résumés. Innate ability can only take a student so far without the opportunity to actually take advantage of it. Money is a large enough hurdle without our sex also being held against us.

Now proof has emerged that parents have taken the extra step of bribing officials to get their kids into college, paying for test results to be manipulated, and having experts write their children’s entrance essays?! Not that any of us is surprised to hear this, but I feel outrage nevertheless. Scandals such as these are, unfortunately, found in most countries. However, having lived in Japan for over a year, I was surprised to discover that Japan was no exception to the scandalous trend.

You think it’s hard getting into medical school here in the U.S.? It could be worse. Getting into medical school in Japan is a hugely challenging process for two reasons. First, the difficulty level of the entrance exam is extremely high. The necessary knowledge for the exam is not covered in high school, which means just preparing for the test already requires that you attend an additional prep school every day after regular classes for as long as four years—and these classes themselves aren’t cheap either. Second, even if you do pass the exam and the interview, private medical school can cost from $180,000 to $270,000. This is 5 to 7 times the regular cost of a college education in Japan. Furthermore, they don’t have financial aid there like we do, so this is money your family is expected to pay out of pocket. It’s not uncommon for students to have to take the entrance exam multiple times, and each medical school has a separate exam that students have to pay to take. Thus, just taking the exam already involves a significant financial, physical, and psychological cost. Imagine all the hours of sacrifice and study, only to be cheated out of a place as less-qualified applicants circumvent the system.

Last year, an investigation into the medical school acceptance of an education ministry bureaucrat’s son in exchange for backdoor promises of research funds revealed more than expected. It brought to light widespread score manipulation based on donations and connections—and on gender. Women’s scores were being purposely decreased across the board at multiple top medical schools in order to keep their acceptance rate around 30%, so men would remain the majority. The investigation revealed this had been going on for more than a decade, and for more than two decades at some of the schools. The guilty have claimed a variety of justifications, the main one being that women cannot be “real doctors” and will just leave their profession if they have a child or get married. Considering that women are traditionally expected to quit their jobs if they marry or have a child, is it any wonder? Given Japan’s current birthrate plight (its population is shrinking: of the 32 countries with a population of 40+ million, Japan ranks at the bottom, with just 12.3% of the overall population being children), you would think they would be taking this more seriously. How hard would it really be to expand child care options and support these women so they can do their jobs? Prime Minister Shinzo Abe, who was recently re-elected, promised to fix the daycare shortage and put women in positions of power. Like the head-bent apologies of those responsible for the med school admissions scandal, Abe’s promises seem likely to be nothing more than empty words.

In the 2017 Global Gender Gap Report, the World Economic Forum ranked Japan 114th out of 144 countries in terms of economic participation and opportunity and 123rd in terms of political empowerment. Approximately 50% of Japanese women are college-educated, one of the world’s highest levels, yet rampant sexism and discrimination against women make it difficult for them to find high-level or full-time positions. Only 4% of managerial positions in Japan are held by women, and on average women earn just 70% of what a man with the same job and experience would receive. This boys’ club should have long since faded into the annals of history. That this antiquated, entitled system of oppression is still such a systemic problem is absolutely unforgivable no matter where you live.

Here in the U.S., women are similarly shortchanged on pay and advancement because our reproductive capacity makes us a “liability” in the workforce. Plenty of memes pop up on the internet every day about how hot it is to find a man who offers to wash the dishes or pick the kids up from school. That’s because it is not expected of them. Women are expected to marry, have children, and take care of the home. Sure, we’re “allowed” to work, but we are still expected to do everything else.

America or Japan, getting into school or making it in the workplace—discrimination and unfair practices seem to be everywhere you look. Officials bow their heads to apologize or are replaced, but it’s all window dressing. Nothing really changes. We need to start taking this seriously and level the playing field.

On the Primacy of Geography

Americans’ lack of knowledge about geography and world events has become a bit of a joke that, in today’s political climate, has stopped being funny. Last year, the crew of a late-night talk show went around New York City, asking Americans to label a country on a blank world map. Not a specific country, just any country out of 195 recognized states. There were more than a few awkward silences. This is one of many examples that, especially since the 2016 election, have given the rest of the world something to laugh about. These people on TV may not represent the average American. However, recent events have shone a harsh light on the degree to which Americans are uninformed about world events, and the response has been the equivalent of an indifferent shrug.

The picture doesn’t improve when you turn to statistics. Two surveys conducted in 2006 and 2017 by National Geographic and the Council on Foreign Relations found that in 2006 three-quarters of young Americans (aged 18-24) thought English was the most widely spoken native language in the world. It’s not. It ranks behind both Mandarin Chinese and Spanish. Three years after the beginning of the war in Iraq, only 37 percent of young Americans could locate that country on a map despite the attention it was getting in the media. In 2017, a survey of 75 geography-related questions showed an average score of 55 percent among American college students, who would be scurrying to office hours with their tails between their legs if that score counted towards their GPA.

The problem isn’t that Americans can’t properly label a blank map or don’t know the difference between temperate and continental climates, since a policy of forcing middle schoolers to do just this hasn’t resulted in a more aware public. The problem is geographic illiteracy: a lack of knowledge that signals, more troublingly, a lack of interest.

If you don’t know where Guatemala is on a map, chances are you’re unaware that in 1954 the United States backed a coup in Guatemala that overthrew a democratically elected president with left-leaning policies. The violence and civil war that followed in Guatemala have had huge implications for immigration flows from Central America today. Knowing the location of Guatemala is the first step in grappling with complex issues like immigration that will be at the center of the 2020 debates. But you wouldn’t know this if you didn’t know where Guatemala was.

The truth is, politics don’t stop at the border; having regional context is an increasingly integral part of understanding an issue. Take Yemen, which is currently the site of the largest humanitarian crisis in the world. The same Americans who couldn’t locate Iraq when we were fighting a war there probably can’t locate Yemen now. This makes it unlikely that the average American is aware that the civil conflict in Yemen and the resulting scourges of malnutrition and cholera are exacerbated by the actions of other governments vying for power in the region. It’s equally unlikely that these same Americans know that the U.S. supplied weapons to Saudi Arabia that played a role in this crisis until the Senate voted against the policy in March.

This lack of understanding of basic geography becomes increasingly dangerous in a democracy where we vote for leaders who are tasked with responding to emerging situations. But instead of embracing their democratic responsibilities, it seems as if middle-class Americans, many of whom were central to the outcome of the 2016 election, are willing to evade this duty and to avoid grappling with the implications of bringing an individual to power who is equally indifferent. Engaging with the context and geography that shapes the realities of millions of people who have no say in the outcomes of these elections has become merely a nuisance. The danger isn’t limited to citizens: if you’re indifferent to the location of Syria, you’re liable to accept political and economic analysis from politicians who themselves can’t locate Syria.

Americans can afford to be ignorant, because geopolitics has little discernible effect on our daily lives. We don’t see our lives as directly impacted by a failure to engage in a larger international conversation. We are largely buffered from many of the effects of crises happening in the far-off amorphous regions that lie beyond our borders. Why bother? We can pawn off this engagement on other people.

This is a shame, because geography is a subject with no age limit and no prerequisites. It doesn’t require enrollment at a top university or late nights hitting the books. It requires only curiosity, and perhaps a sense of global citizenship. We are living in a world rife with big, complex problems, but turning our backs on these issues gets us no closer to solutions. There are areas for incremental progress in our own lives and communities. We need not bear the burden of these global issues alone, but in choosing to engage, we are taking one small step towards understanding the issues and forcing our leaders to understand them too. So if you find that you’re ready to reengage, feel free to start with the survey mentioned above. Don’t worry about your score. The bar is exceptionally low. There are no failing grades.