
Cruelty to the Enemy – Operation Cast Lead in review


Thirteen years ago Israel launched what has been politely referred to as a “major military offensive” against the Gaza Strip. The offensive, called Operation Cast Lead, was neither the first nor the last time Israel assaulted the Palestinians of Gaza. The Operation was merely a continuation, and a punctuation (almost an exclamation), of Israel’s policy toward Palestinians, and toward Gazans in particular.

It’s not clear how many people died during those three weeks; Robert Fisk puts the number of Palestinians killed at 1,417, including 313 children, with more than 5,500 Palestinians injured – many permanently. B’tselem, an Israeli human rights group, differs, saying that 1,387 Palestinians died, 773 of whom were civilians. B’tselem also reminds us that nine Israelis died, three of whom were civilians. It’s clear that during the twenty-two days of the Operation Israeli soldiers followed the rabbinical injunction to inflict “cruelty to the enemy,” as paraphrased by Maayan Lubell.

Because Israel denied B’tselem the right to “enter the Gaza Strip to supplement the work of field-researchers” in Gaza following the Operation, in some instances the “organization was not able to establish whether the person killed took part in the hostilities.” Human Rights Watch and other agencies were also in large part denied access. Both organizations agree, along with other human rights organizations, that “[c]ivilians lose their protection from attack only during the time they directly participate in hostilities” (HRW).

People familiar with other assaults on Palestinians in Gaza, the West Bank, and elsewhere over the years would not be surprised that war crimes were committed during Operation Cast Lead. It’s true that both the Israeli army and Hamas in Gaza could be accused of war crimes, but it would be hasty to lay blame equally. The HRW report on Operation Cast Lead, published in August 2009, reminds us that the laws of war “oblige states to conduct impartial investigations into credible allegations of serious laws-of-war violations, and to hold accountable anyone found responsible for war crimes, regardless of rank.” Yet at that point, several months later, “the Israeli government and IDF have failed to conduct serious investigations into many of the credible allegations of laws-of-war violations by Israeli forces during Operation Cast Lead,” and the “soldiers who fought in the operation” who “spoke publicly about attacks on civilians and other violations” were accused of “hearsay and exaggerations” by the Israeli army, which “criticized the soldiers for speaking out.”

Israeli tanks standing on the Israel-Gaza border while smoke billows from Gaza during Operation Cast Lead, January 14, 2009. (Yossi Zamir/Flash90) – found at https://www.972mag.com/10-years-since-cast-lead/

The law of proportionality in armed conflict “forbids suffering that is caused in no direct relation to a concrete military advantage and in disproportion with it.” Although both sides have correctly been accused of launching indiscriminate rockets and mortars, the casualty figures listed above tell us that Israel was looking to inflict cruelty on the enemy, and those enemies were in large part non-combatants, including emergency personnel.

Part of the background for Operation Cast Lead, provided by the Institute for Middle East Understanding (IMEU), was that on the day Obama was elected – November 4, 2008 – Israeli soldiers staged a raid into the Strip, killing six members of Hamas (The Guardian, cited by IMEU). This ended a ceasefire that had been in place between Israel and Hamas for the previous six months. Obama had made Middle East peace an issue he intended to address as president; Netanyahu, who never had the best relations with Obama, probably saw an assault on Gaza as a great way to thwart an effort toward peace.

The Operation ended just before Obama took office in 2009. During those weeks – and both before and after – Gaza’s borders were closed, “enforced mostly by Israel but also by Egypt in the south.” Israel has its own reasons, such as cultural dominance, to close the borders, but this has always been supported by the United States.

As we look back at Operation Cast Lead thirteen years later, it’s clear that the Operation was just another event in an almost normal day in the life of the Palestinians of Gaza. The borders remain closed, enforced by Israel, Egypt, and the U.S., and in several other Operations since 2008 hundreds of civilians have died with no accountability.

West Side Stories


This week I saw Spielberg’s remake of West Side Story.

America’s classic love story in the manner of Romeo and Juliet, the original West Side Story won awards and led to a long-running Broadway production. The original West Side Story on the silver screen (1961), which tells the story of New York’s white population fighting the Puerto Rican population over land that is being gentrified, wasn’t the original plan for the story-line. Richard Brody, in the New Yorker, reminds us that

The story of the original ‘West Side Story’ is that of white Jewish artists (Leonard Bernstein, Arthur Laurents, and Jerome Robbins, later joined by Stephen Sondheim) who planned to make a musical play about Jewish and Irish gangs and then, worrying that they were heading for cliché, shifted their focus to people they knew nothing about.

The 1961 production had a mostly white cast, with the exception of Rita Moreno, who starred in both the 1961 and 2021 productions. Spielberg did what he was supposed to do by having Latinos play the roles of Puerto Ricans, and by having Spanish dialogue in the show. Rachel Zegler, the lead who plays Maria, has a mother from Colombia.

The praise for the remake ends there. If you avoid suspending disbelief and look at the performance, you’ll see that, as Richard Brody points out, Spielberg has Zegler “act like a Disney princess” with “oversimplified facial and vocal expressions.” In the 1961 film Maria had “recently arrived from Puerto Rico for an arranged marriage to Chino.” In Spielberg’s version she has been in the city for years with no mention of an arranged marriage; Chino is in night school and working as a mechanic. Brody points out that “nothing comes of these new practical emphases; the characters have no richer inner lives, cultural substance, or range of experience than they do in the first film.”

There’s a big difference between an arranged marriage and the expectation of parents that a daughter will marry within her culture, and Maria’s parents were pushing her to like – and perhaps marry – Chino.

The best things about Spielberg’s version, Brody says, were “the songs, their acerbity, the view of racial discrimination and class privilege,” which were already in the original production. He left out the important things from the 1961 version while trying to set the story in the 1950s: “there is no police lieutenant’s open insulting of white kids, or openly racist and threatening rant against Puerto Ricans.”

If you want to see “West Side Story” you might as well see the original. Or maybe you should just read Romeo and Juliet, and imagine that it’s set in the 1950s during a period of gentrification and immigration.

Kronos Questions


The ransomware attack on Kronos is a big deal in the sense that millions of people and hundreds of entities use it to track hours and issue payments to people.


Retail stores, organizations, cities, and other groups are going back to basic bookkeeping and time charts to keep track of hours now.

The questions are: will the hand-written time-charts result in an honest system that pays people what they should be paid? If not, will people demand to be paid what they’re worth? Does the temporary lack of access to the “cloud” that keeps track of hours mean that checks or direct deposit payments can’t be submitted to HR to send to employees? And perhaps most importantly, do we want a system where millions of people depend on whether a “cloud” provider functions or not?

Tales from the (retail) front – Holiday Edition


It’s Holiday season. For some reason, this is a big deal in the world of retail.

I wrote about how on the first day of retail “they throw you in at the deep end, and at the same time expect you to know your way out.” The interesting part is that more than two months later, it feels the same way.

I described how it’s doable to greet a customer and ask them “can I help you find anything?”. After about fifty days at work I’ve learned many of the products that we sell, but with seasonal changes things always move around. I know what we sell, but I have no idea where to find it.

Even on a good day, it seems that there’s always something that goes wrong. The shoes aren’t where they should be, so I ask the computer if we have them. The computer says no, but I find the shoes the customer wanted in the wrong place after they’ve left the store. Boxes seem eager to break, and price tags have a habit of not being on the items.

The number of staff providing excellent customer service has at least doubled since I was hired two months ago. My hours will stay in flux, I know, although I’m mostly getting the hours per week I’m asking for. Even though the number of staff has increased, because it’s The Holidays the number of hours I work is likely to increase (I could have said no, but that doesn’t seem to be the proper response).

For some reason, during The Holidays people want to buy stuff. If that’s you, please remember to be nice to the people providing you excellent customer service.


Partition


On November 15, 1884, “an international conference was opened by the chancellor of the newly-created German Empire at his official residence on Wilhelmstrasse, in Berlin,” Patrick Gathara tells us in “Berlin 1884: Remembering the conference that divided Africa.” The purpose of the West Africa Conference, which lasted 104 days, was to determine the future of Africa.

The Conference (also referred to as the Berlin Conference) “established the rules for the conquest and partition of Africa, in the process legitimising the ideas of Africa as a playground for outsiders, its mineral wealth as a resource for the outside world not for Africans and its fate as a matter not to be left to Africans.”

Image from Monty Python’s “Meaning of Life”

The Powers at the Conference – which included the United States but refused to include African nations – stated that they had three goals:

that of the commercial and industrial nations, which a common necessity compels to the research of new outlets. That of the States and of the Powers summoned to exercise over the regions of the Congo an authority which will have burdens corresponding to their rights. And, lastly, that which some generous voices have already commended to your solicitude – the interests of the native populations.

The Conference “resolutely refused to consider the question of sovereignty, and the legitimacy of laying claim to someone else’s land and resources,” Gathara informs us.

I would suggest that the Conference was a continuation of, and an expansion of, “The Great Game” between European powers. European states had long only had “influence on the coast,” in Gathara’s words. Following the Conference, he says, “they started grabbing chunks of land inland, ultimately creating a hodgepodge of geometric boundaries that was superimposed over indigenous cultures and regions of Africa.”

The Conference has been described as ‘Diplomatic in form, it was economic in fact.’ Although it was “dressed up as a humanitarian summit to look at the welfare of locals, its agenda was almost purely economic.” A newspaper in Africa a few years after the Conference said it was just replacing the theft of African people as slaves with the theft of African resources and land. As I wrote in Never-ending Profit-Making “the capitalist is always looking to expand his capital and he looks to do this through the labor of other people, who works under the capitalist and who has no claim to the product of their labor under this system.”

“Today,” Gathara wrote in 2019, “Africa is still seen primarily as a source for raw materials for the outside world and an arena for them to compete over. Conferences about the continent are rarely held on the continent itself and rarely care about the views of ordinary Africans.” Ethnicity and tribalism continue to be the bane of African politics, he tells us, despite the achievement of “independence” by most African nations by the 1960s.

Africa was not the first or last region partitioned by European powers without the consent, and against the will, of its people. The Americas – all 35 countries that are part of the OAS – are still trying to come to terms with a colonial past. In 1915, 1916, and 1917 Britain, France, and Russia made conflicting promises that still reverberate for Arabs in what is now Saudi Arabia, Lebanon, Jordan, Israel, and Syria, while at the same time the European states negotiated among themselves how to partition these lands. Thirty years later, the United Nations decided – against the interest of the indigenous people of what is now Israel – that the land should be partitioned.

The Berlin Conference that began in 1884 was just part of a pattern by European powers – plus the United States – of making statements about humanitarian efforts while grabbing chunks of land in an endless search for raw materials.

Can I help you find anything?


Three and a half years after getting a master’s degree in peace and sustainable development studies, and dozens of applications later, I got a job training to be a manager in a retail store.

Before I critique the job and retail in general, I should say more about the application process.

Most jobs ask for an extensive work history. Sometimes you can “upload resume” and it will populate the fields of your work history for you (this function never works properly – it ends up with some of the work history in place and mashes the rest into fields where it doesn’t belong, like the next job title falling into the contact section of the previous one). Fine – so you, the employer, want to know that I can perform mundane tasks. This job application asked for the last ten years of employment, including any volunteer experience! I listed what I’d done in the last ten years, focusing on things I’d talked about in my cover letter. I am thankful to myself that I didn’t list everything I’d done in the last ten years.

After going to an interview, I was asked to submit letters from the five different places I’ve worked or volunteered in the last ten years (at least, the ones I listed on the application). I’m grateful to the five people, most of whom I haven’t exchanged any words with in more than five years, who promptly responded to me and acknowledged that I do, or did, work for them. I’m not sure whether I was being asked to submit a letter of reference or merely an acknowledgement that I worked for these people or organizations, but I’m not sure there’s much of a difference. A letter saying I was there and that I put in effort is a letter regardless of what you call it, and five of them for a retail job is (just a little) excessive.

I wasn’t expecting to be working in retail. With a master’s degree I thought I’d be working on policy. Trying to enter or reenter the job market is a grueling process, though; I’ve heard that algorithms will look at recent experience, and ignore people who may have education but not recent experience. I’ve lost track of the number of applications I’ve submitted over the years, and to some extent the number of types of jobs I’ve applied for. Nonetheless, I try to only apply to jobs that I would actually want to do.

Part of my master’s-level education is peace through sports. This point of discussion during the interview may have been the reason I now have a job training to be a manager at Big 5 Sporting Goods, although I feel it’s hardly relevant to what I do. I’m making something close to minimum wage to learn to oversee everything at the store.

Nobody knows how to do their job on the first day, of course. The interesting thing about retail is that they throw you in at the deep end, and at the same time expect you to know your way out. This is a bad analogy, perhaps, but there is an established pattern of how clothes are hung, or which socks go on each rack. Yet there is little instruction except to do the job, without instruction on how to do it.


It’s not hard to pick up the lesson of how to greet customers when they walk in and ask them “can I help you find anything?” The hard part is that we who work in retail are told to do this on the first day, without having the faintest clue where things are in the store.

The structure of the job is another issue. Although I could ask for some days or time off, I have no real control over my schedule. I’m scheduled at different hours on different days each week, and I never know my schedule more than a week ahead of time (unless I say in advance that I’m not available on some day at some hour). The law says that if you work more than five hours (most shifts are either 5 or 8 hours) you get thirty minutes to eat. On evening shifts, that means eating at 5pm – if we remember to have lunch before work – or having dinner after 9pm once we get home. For some lucky mortals, this isn’t a big deal; for others food is essential even if not desirable. The law also says that employers don’t have to provide part-time workers with benefits, but my experience is that as a part-time worker the hours are about as long as a full-time shift. Part-time employees are often asked to work 35 or 38 hours in a week, but because that’s not 40 hours, benefits don’t need to be provided.

The takeaway from all this is that the application process for jobs (minimum wage jobs in retail in particular) is ridiculous, that the expectation of knowing how to do a job without being taught how to do it (even a retail job) is ridiculous, and that the laws regarding who gets benefits and who gets basic needs like food while on the job are ridiculous. Perhaps one day we’ll value both workers and non-workers.

I will continue to post on this blog. Working takes an amazing amount of energy, though, and may impede even my best ideas.

Green (In)Action


Richard Young’s article, published September 23 by Carnegie Europe, is eye-opening. It says nothing surprising, really, but puts into words the message that many climate activists and woke members of the public following the climate crisis have been trying to say.

Image from Young’s article

Young discusses different kinds of government, and how they might respond to the climate crisis and demands of climate activists. He observes, about panels and commissions that

Top-down, expert-led initiatives like the UN Intergovernmental Panel on Climate Change, the UN Framework Convention on Climate Change and the Conference of Parties, and the Paris Agreement have in practice fallen short. They appear to drive not comprehensive solutions but rather changes that work within current systems—namely technological fixes without the necessary systemic changes to social and political organization.

(A lot of climate activists and scientists believe that COP26, despite its admirable goals, will in practice fall short and accomplish little or nothing.)

Commissions, panels, conventions, and conferences are set up by democracies. Although Young’s article focuses on Europe, it’s safe to say that his conclusion that Europeans are “acting to address climate change through an uneasy combination of depoliticized democracy, climate assemblies, and protest movements” applies to democracies beyond Europe.

He provides several steps that would help “democracies become more fully attuned to the imperatives of ecological transformation: Root climate expertise in popular support; Foster localized conceptions of citizenship; Harness the power of mass engagement productively; Balance the climate movement’s localism and transnationalism; and Couch climate action in full-fledged democratic renewal.”

He concludes that “Rather than pitting different kinds of democratic engagement against one another, countries need a more comprehensive conception of democracy to deal with the ecological transition for the long haul. Reshaping democracy by fixing these limitations to current approaches for achieving a greener future is the crucial task ahead.”

What we should take away from the article is that not only is “ecological transformation” essential to mitigate climate change, but there must be a transformation of democracy and citizen engagement.

Twenty Years Later – A 9/11 Edition


After twenty years, many Americans can still tell you exactly where they were when the terrorist attacks happened on September 11, 2001.

The collapse of the Twin Towers and the crash of Flight 93 in ‘a field in Pennsylvania’ – along with the mysterious hole American Airlines Flight 77 left in the Pentagon – devastated thousands of families.

As devastating as the deaths of three thousand Americans in one day are, in both the short term and the long term, the effects of 9/11 on domestic and global affairs are far more tragic.

With every action there is an equally strong reaction, and the response to September 11 was to create the unwinnable, unending, War On Terror.

The U.S. government responded to 9/11 by invading Afghanistan in 2001 in an attempt to destroy Al-Qaeda and the Taliban, ignoring the fact that none of the nineteen men the U.S. identified as the terrorists involved in 9/11 came from Afghanistan (they came from Egypt, Saudi Arabia, the UAE, and Lebanon). The U.S. relatively quickly mitigated the power of the Taliban in Afghanistan and over time installed what appeared to be a stable puppet government. Still, the U.S. stayed in Afghanistan.

It also became apparent that the U.S. occupation was about more than defeating the Taliban. A couple weeks ago Yadullah Hussain reminded us in the Financial Times that the ‘Graveyard of Empires’ – Afghanistan – is “resource rich, with an abundance of coal, natural gas, copper, lithium, gold, iron ore, bauxite and prized rare-earth mineral reserves”.

Hussain’s article was published five days before the deadline the U.S. had imposed on itself to bring its troops home from Afghanistan. For some, it almost appeared as if the U.S. was being pushed to stay in Afghanistan in order to control those resources. Instead, years after it first promised it would do so, most U.S. troops were out of Afghanistan by August 30, 2021. Within days the Taliban had control of the government, which has created a huge ongoing humanitarian crisis. After years of trying, the U.S. replaced the Taliban with the Taliban.

In 2003 the United States invaded Iraq under the pretense that somewhere in Iraq there were “weapons of mass destruction”. That wasn’t the reason for invading Iraq, and neither was “saving the Iraqi people” from Saddam Hussein, according to Ahsan Butt’s article in Al Jazeera. The invasion happened because “a quick and decisive victory in the heart of the Arab world would send a message to all countries, especially to recalcitrant regimes such as Syria, Libya, Iran, or North Korea, that American hegemony was here to stay. Put simply, the Iraq war was motivated by a desire to (re)establish American standing as the world’s leading power”.

BBC describes Tower 7 collapse while it’s still standing.

The collapse of Building 7 at the World Trade Center left many people, who were able to think beyond the state-sponsored narrative, wondering who planned what when.

The United States has long been a police state, at least against its “minority” non-white populations. September 11, 2001, turned the U.S. into a security state and led to an Islamophobia that’s never disappeared.

As the ACLU informs us, the USA Patriot Act (Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism) “was the first of many changes to surveillance laws” passed after 9/11. “Hastily passed 45 days after 9/11 in the name of national security,” the Patriot Act “made it easier for the government to spy on ordinary Americans by expanding the authority to monitor phone and email communications, collect bank and credit reporting records, and track the activity of innocent Americans on the Internet.” Instead of being aimed at catching terrorists, the Patriot Act “turns regular citizens into suspects”.

A year later, in November, 2002, President George W. Bush signed the Homeland Security Act, which created the Department of Homeland Security. This pretty much ensured that the U.S. would remain a genteel-looking police state and security state.

The iconic sarcastic shirt about the TSA

It seems that most Americans were okay with giving up the relative ease of air travel that existed before 9/11 and happily submitted themselves to the existence of the TSA.

It seems that most Americans gave little thought to the Patriot Act’s violations of basic civil liberties. The existence of what is nicely called a “detention camp” at Guantánamo Bay, where many detainees were tortured and “thousands of Iraqis, Afghans, and other suspect foreigners were held without charge and without the legal means to challenge their detentions,” seemed never to cross the minds of Americans, especially after Obama said in 2009 that he’d close the camp; twenty years after 9/11 William Roberts wants to know why the camp is still open.

The Patriot Act is still law, and the Department of Homeland Security seems secure only in its continued existence, despite the fact that we have theoretically ended the War On Terror (or at least made it look like we’ve brought the troops home). These civil liberties violations are a part of America that few question, except for some people the government would label “radical”.

Two years ago a study concluded that the U.S. had spent $6.4 trillion on war in the Middle East and Asia since 2001. Sarah Lazare, for CommonDreams, suggests that we spent twenty years fighting the unwinnable, un-endable War on Terror instead of twenty years using the same money to fight climate change. She makes a strong case that the money spent on the military could have been used for other projects (“A sum of $1.7 trillion could eliminate all student debt, $200 billion could cover 10 years of free preschool for all three and four year olds in the country. And, crucially, $4.5 trillion could cover the full cost of decarbonizing the U.S. electric grid”) and reminds us that “dismantling the carbon-intensive U.S. military apparatus must be part of the equation”.

To not continue making the same mistakes, the well-informed and connected Kathy Kelly says, the U.S. “must express true sorrow, seek forgiveness,” collectively recognize the horrors of the policies resulting from 9/11, and acknowledge that in order to counter terror we must abolish war. CodePink made a similar suggestion.

Ending war and making meaningful reparations would be a start toward undoing the legacy of 9/11. Over the last twenty years it’s become increasingly obvious that maintaining “hegemony” is important to U.S. policy-makers, and that doing so matters more to them than keeping even a semblance of respect for civil and human rights in the process. It’s time to support people instead of corporations, and peace instead of conflict.

The Human Element


As we begin watching the US Open at Flushing Meadows, and start contemplating an end to another season of America’s pastime, it’s a good time to reflect on what these sports mean and the relation between humans and technology.

Tennis has been using a camera-generated review system for a long time, and in this pandemic-stricken year there will be no human calling the ball in or out. This might prevent some outbursts – “I swear to God I’ll f***ing take the ball and shove it down your f***ing throat” (Serena Williams) – or a player, unhappy that he’s losing, hitting the ball into the neck of a linesman (Novak Djokovic). It will also take the human element out of the game.

Hawk-Eye

Because of the pandemic, last year’s baseball season was shortened to sixty games. This year America’s pastime returned to the regular 162-game season, with some new rules to try to shorten games and prevent them from going much over nine innings. In the last few years, baseball has implemented a review system that helps prevent brawls (sort of) and helps keep sportsmanlike conduct in our great game. Using replay to determine whether a runner beats a throw to a base, or whether a ball is fair or foul, can be aggravating to some fans – and doubtless to some players and coaches. It doesn’t entirely remove the human element from the game, but it does make the game more about technology and less about humans.

There is no review of balls and strikes, which means tempers still flare, with pitchers, batters, and coaches yelling at umpires over calls they don’t like. Baseball, along with other sports, has been called a game of inches. (If you look at some commentary online, some people think the umpires have been worse with balls and strikes than ever before.) Forbes, in a 2019 article, suggests that a ball-and-strike review system “could be on the horizon in baseball” and just needs some technological tinkering. If baseball does go to a ball-and-strike review system, it further takes the human element out of the game.

Pitch number six on the FOXTRAX was the strike three call in this bad beat for the Tampa Bay Rays and Ben Zobrist. / FOX, MLB on YouTube from https://www.fanduel.com/theduel/posts/video-remembering-when-ben-zobrist-struck-out-to-lose-on-one-of-the-worst-strike-calls-in-mlb-history-01ecz644fwfv

If we are looking to take the human element out of sports, perhaps we should all play baseball or tennis on XBOX, where the game determines whether the shot was in or out, or whether the pitch was a ball or a strike.

That’s not a viable option because sports are about humans and human judgement. A self-reflective article about the press treatment of Naomi Osaka and other tennis players pointed out that we value athletes simply because these are people “who have been elevated to prominence by dint of their hand-eye coordination and superior cardiovascular fitness.” We watch sports for the skills and for the human element. There must be a limit to technology.

Living Alone – a shared post


Shared from another blog. The interesting experience of living alone for the first time:

This week is week 3 of officially living in my condo and living on my own for the first time in my life. I was nervous as my moving date crept up because I’ve never been on my own before. Sure, I lived at college for 3 years, but I had roommates. Now I’m completely […]