JOURNAL
Fat Shaming vs. Fat Acceptance: Is It Okay to be Fat?
Fat: it’s big news. In today’s world, everyone wants to talk about every body, be it big, little, or oddly shaped, and fat is right there at the top of the agenda. There are the fat shamers (those whose purpose is to shame ‘fatties’ into becoming ‘thinnies’) and the fatosphere (people who write blogs for and in support of fat people). Then there is everybody in between, and it seems that no one wants to be left out of the debate. So with all this going on, and with fat acceptance activists doing daily battle with fat shamers, the real question remains: is it okay to be fat?
Fat Shaming
Fat shaming (heckling and harassing obese people) is becoming increasingly popular. The idea is that shame will motivate overweight and obese people to do something about their situation. It has been suggested, too, that the whole concept of fat shaming stems from the idea that people don’t lose weight because they are lazy, lack willpower, and have little or no self-discipline[1]. These ‘shamers’ come from all walks of life, from government campaigns designed to encourage people to lose weight, to media outlets that stigmatize fat people, right down to members of the general public who use things like Twitter’s hashtag #fatshamingweek in a way that makes fat shaming seem almost like a hobby, something to have fun with rather than anything productive. But does fat shaming actually work? And perhaps more importantly, is it morally acceptable?
The Millennial Generation: Their Attitudes, Social Behavior, & Religious Independence
A lot of thought and effort has been expended on pinpointing what makes up the character of the millennial generation, and how they became that way. The one point that almost everyone agrees on is that they are very different from their parents and grandparents before them. Of course, some change in attitudes is inevitable with the passage of time, and millennials do still retain some aspects of their forebears’ perspective and values. Still, there is one area in particular where millennials break quite radically with their predecessors: as a whole, they are dramatically less religious than the rest of the country.
A recent Pew Research Center study[i] on millennial attitudes and social behavior examined how they compare to previous generations. Many of the findings are unsurprising: for instance, millennials are much more likely than any other group to have posted a selfie – a picture of themselves, usually taken with a phone camera – on social media, indicating how thoroughly they have integrated recent technology into their lives. Other findings are less immediately apparent. For instance, millennials tend to be more liberal than their parents, but they are also far more likely to identify as politically independent than previous generations were at their age, regardless of whether they lean right or left; essentially, they don’t identify as members of the parties they vote for.
Baby Boomers and the Elder Orphans That They Become
The Aging Population
The population of America is aging. We’re getting older. That may seem self-evident, but the problem is that it’s not just us as individuals. Rather, the proportion of older people within our society is increasing and the ratio of young to old is shrinking. In 2012, there were 43 million people aged 65 or over in the US, compared to just 35 million ten years earlier, in 2002[1]. It is estimated that by 2029, 20% of the US population will be 65 or over, and that by 2056 the population of over-65s will be bigger than the population of under-18s[2]. This is due, in part, to the so-called baby boom generation – those born in the fertile post-war years between 1946 and 1964. The oldest of this group turned 65 back in 2011, and the youngest will probably need health care right through to 2060. Many chose to remain child-free, which in itself isn’t a problem, but as the population continues to age, the difficulties are beginning to show.
The Baby-Boomers and What They Become
The baby-boomers are now facing a new, and perhaps less sprightly, name: the elder orphans. The term, coined recently, refers to older people who need care yet have no relatives at all, or none living nearby. Dr. Maria Torroella Carny, the chief of geriatric and palliative medicine at North Shore Health System, released a paper last week discussing just that issue. These elderly people, who are often divorced or widowed and have no children, have no support system and are effectively ‘orphaned’ during a particularly vulnerable time in their lives. She uses case studies to demonstrate just how serious this can be and how devastating the potential consequences are[3].
Lead Poisoning and Criminal Behavior: Can there really be a link?
The recent death of Baltimore man Freddie Gray, whose spinal cord was apparently snapped while in police custody, has sparked not only an investigation into police practices but has also reinvigorated the discussion around the potential effects of lead poisoning in children. Although lead paint has long been known to be harmful, its effects on the children who grow up around it are still being discovered and, surprisingly, still being felt. The question of the moment, though, is this: can lead poisoning in children ultimately lead to criminal behavior in adults?
Lead Paint and its Effects
Although the practice of putting lead into paint was banned in 1978, many homes, especially in poorer socio-economic areas, still have lead paint on their interior and exterior walls. In time, this paint deteriorates and will chip or release dust that can be breathed in or, more likely, swallowed by children. It wasn’t so long ago that 10 micrograms of lead per deciliter of blood was considered a safe level. However, in 2012, following a 30-year study into the effects of lead poisoning, the Centers for Disease Control and Prevention (CDC) reduced that number to just five micrograms per deciliter. Now, many argue that there is no safe level at all[1]. What’s even scarier is that in 2007, an estimated 25% of homes still contained deteriorating lead paint, and currently more than four percent of children in the US have some level of lead poisoning. These figures rise in big cities and poor urban areas.
Sex during pregnancy: Is it safe for the baby?
Many expectant couples have a lot of questions when it comes to their sexual relationship during pregnancy. A woman can experience her sexuality in many different ways while pregnant, and it can sometimes differ greatly from how she feels before and after pregnancy. Couples want to know whether having sex is safe for the baby, how their sexual desire might change throughout the pregnancy, which positions work best, and much more.
Is having sex safe during pregnancy?
This is one of the top questions that couples have. The simple answer is yes, it is safe. Your baby is protected within the amniotic sac, the uterus, and the surrounding muscles. Apart from that, the mucus plug in your cervix will protect the baby from any infections (although if you have concerns about sexually transmitted infections, you should always use protection). Unless a doctor has specifically told you that you can’t have sex because of some kind of complication with your pregnancy, it is safe to have sex right up until you go into labor. Such complications include placenta previa, premature labor, unexplained vaginal bleeding or abnormal discharge, cervical insufficiency, a dilated cervix, broken waters, an active or impending outbreak of genital herpes in either partner, and other sexually transmitted infections. If you ever have any doubts about whether or not it’s safe to have sex during your pregnancy, your safest bet is to check with your doctor. And if your doctor says that sex is off limits for you, make sure to have him or her define what sex is. You might still be able to have oral sex, engage in mutual masturbation, or enjoy other forms of intimacy even if vaginal intercourse is off the table for the time being.
Burlington Vermont: A Road Map to Becoming a Renewable City
In September of 2014 the city of Burlington, Vermont became the first major city in the nation to source 100% of its electricity from renewable energy sources. Though the achievement received scant media attention, it marked a major milestone for renewable energy in the United States. Across the country, coal plants are being retired and renewable energy is grabbing an increasing percentage of the electricity market share. Slowly but surely Big Oil and Coal are losing their stranglehold on the American power grid.
Cities looking to make their own contributions toward greening the U.S. power grid can learn a lot from Burlington's example. So what does it take to become the country's first renewable city?
What Powers Burlington?
Burlington's landmark achievement was a long time in the making. The city has been growing its renewable energy portfolio since 1984, when it completed construction of the Joseph C. McNeil Generating Station, a wood-burning power plant. When rising energy costs in the 1970s prompted Burlington's utility, the Burlington Electric Department (BED), to investigate alternative approaches to powering the city, the utility concluded that “[u]sing wood fuel would put money back into the Vermont economy, improve the condition of Vermont’s forests and provide jobs for Vermonters.” The facility is powered almost entirely by wood chips generated as a byproduct of the state's lumber industry. BED's most recent figures indicate that, in addition to generating power by burning wood, the facility now recovers the methane gas produced as a byproduct (biogas) and uses it to generate additional power.
The McNeil plant has been an important source of renewable energy for Burlington since its construction; as of 2013 energy generated at McNeil accounted for about 45% of Burlington's total energy consumption. In the intervening years, however, BED has invested in a variety of additional renewable resources, including several wind farms and hydroelectric plants. By 2013, these combined resources had allowed the city to reduce its fossil fuel usage to less than 6% of its total electricity consumption.
The tipping point for Burlington came in the fall of 2014. The city's declaration of independence from fossil fuels coincided with its purchase of the nearby Winooski One Hydroelectric Facility, which city officials had long had their eye on. Ownership of Winooski One ensured Burlington a long-term, reliable source of renewable energy and allowed the city to close the resource gap that had until then been filled by fossil fuels. While exact figures have not yet been made available, according to a 2014 article in the Burlington Free Press the city now anticipates that wind, hydropower, and biomass will each supply roughly one third of its power. Small-scale resources, such as locally generated solar power and agricultural biogas plants, also contribute a small percentage of the city's supply.
The marginalization of the transgender community
The LGBTQ community is often seen as a monolithic entity, with everyone working towards the same end goal. It’s certainly easy to think that way, since they’re united as a community by the marginalization they experience for their sexuality and gender expression. But that acronym itself shows the inaccuracy of that assumption. This community includes gay men and lesbians, who are linked by their homosexuality but often experience different, gender-specific forms of homophobia – for instance, while a gay man might be greeted with simple disgust and even violence by a straight man, a lesbian might instead be told that her homosexuality is “sexy” by the same man and find that he is sexually aggressive towards her despite her orientation or even because of it. Bisexuals often face marginalization in the LGBTQ community because their homosexual peers resent their option to “pass” for straight, or find that potential partners outright reject them for fear of not being able to fully satisfy them. The “queer” label that rounds out the acronym is itself an umbrella term for several other disparate groups who face similar problems in how they are treated by their society for their sexuality and gender identity; often “queer” is used as shorthand for these groups or even for the entire LGBTQ community, due to how many terms would need to be rattled off to mention all of them.
Does Being Vegetarian Actually Save Any Animals?
There are lots of reasons that people become vegetarians or vegans – health, sustainability, upbringing – but by far the most common explanation given is a moral one: that the unnecessary suffering and killing of billions of animals per year is unethical. It’s not a surprising conclusion, given the massive number of animals slaughtered for food alone in the US. In 2013, 8.1 billion animals died to feed Americans, and the average meat eater will consume 2,088 animals in their lifetime[1]. Surely, then, it stands to reason that abstaining from eating meat will save the lives and prevent the suffering of animals. Whether this is true, however, is under some debate – and if it is true, just how many animals does vegetarianism actually save?
Calculating Saved Lives
There have been numerous studies and calculations of just how many animals are saved each year by a vegetarian diet – and the numbers vary wildly, from a couple of dozen to several hundred. Noam Mohr, of the animal charity PETA, suggests that the average meat-eater in the US consumes 26.5 animals per year, made up of fractions of a cow, a pig, and a turkey, plus 25 chickens (a figure that includes an allowance for eggs)[2]. At the other end of the scale, some argue that the average meat-eater consumes 406 animals per year, made up of 30 land animals, 225 fish, and 151 shellfish[3]. It is then assumed that a vegetarian, by abstaining from meat, saves the same number of animals that a meat-eater kills.
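To see how those figures translate into the “animals saved” claim, here is a minimal back-of-the-envelope sketch (written in Python purely for illustration; the 79-year lifespan is an assumption introduced here, not a number from either source) that reproduces the two per-year estimates above and checks the lower one against the 2,088-animal lifetime figure quoted earlier.

# Minimal sketch of the arithmetic behind the two estimates cited above.
# The per-year figures come from the text; the 79-year lifespan is an
# assumption chosen only to compare against the 2,088-animal lifetime figure.

LIFESPAN_YEARS = 79  # assumed average lifespan (not from either source)

estimates = {
    "Mohr / PETA (land animals only)": 26.5,                        # cow, pig, turkey fractions + 25 chickens
    "Higher estimate (incl. fish and shellfish)": 30 + 225 + 151,   # = 406 animals per year
}

for name, per_year in estimates.items():
    lifetime = per_year * LIFESPAN_YEARS
    print(f"{name}: {per_year:g} animals/year, roughly {lifetime:,.0f} over {LIFESPAN_YEARS} years")

# Under the usual assumption that a vegetarian "spares" exactly what a
# meat-eater would otherwise consume, the per-year figures above are also
# the claimed number of animals saved annually by one vegetarian.

Multiplied out, the lower estimate comes to roughly 2,100 animals over a lifetime, which is consistent with the 2,088 figure cited above; the higher estimate implies tens of thousands.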
The Childfree Life
Do we have an obligation to reproduce or is it okay to not want kids?
The choice to live your life childfree is still surprisingly taboo, even given the modern availability of contraception and increasing reproductive freedom. It has to be said, there is a clear distinction between being childless and being childfree. While the former would like to have children but cannot, whether due to infertility, illness, or circumstance, this piece deals primarily with the latter – the childfree, those who are able but choose not to procreate. The choice to not have children often shocks people, and the proclamation is, more often than not, met with insidious comments like “there must be something wrong with you,” “that’s just selfish,” “you were a child once,” and, worse, “you’ll change your mind when you get older/meet the right man/your biological clock starts ticking”. It’s surprising, primarily, because in an age when we pride ourselves on freedom and choice, we still ultimately treat reproduction as an obligation.
The ‘unnaturalness’ of it all
As anthropologist Sarah Blaffer Hrdy points out, women are associated with the ideals of nurturing and child-rearing, and so when a woman decides that she wants to remain childfree – or worse, declares after the fact that motherhood was a mistake – she is seen as unnatural[1], as though something is wrong with her. Jessica Valenti argues that all women are separated into two distinct categories, mothers and non-mothers[2], and that in this way parenthood defines us. Not whether we are or will be good parents, of course, just whether or not we are parents – and the idea that we will be, and that we want to be, is still seen as our ‘default setting’.
The Good News About Edible Insects
If your life depended on it, would you eat a bug?
If you're like most Westerners, this might be the only circumstance under which you could imagine voluntarily eating a bug – stranded in dire straits, desperate for any source of nutrition you can get your hands on, doing any disgusting thing you have to in the name of survival. But elsewhere in the world, entomophagy – the practice of eating insects – is a part of everyday life. It's not just food-insecure communities, either: the Food and Agriculture Organization of the United Nations (FAO) estimates that roughly 2 billion people worldwide – over a quarter of the population – consume insects as a regular part of their diet. In regions of Asia, Africa, and South America, these critters range from standard street fare to sought-after delicacies. According to Julieta Ramos-Elorduy, author of the book Creepy Crawly Cuisine: The Gourmet Guide to Edible Insects, in Mexico City, “[a] pound of ants costs ten times more than a pound of meat, and the white agave worm fourteen times more. Grasshoppers, the red agave worm, and water boatman eggs all cost twice the price of beef.”
Here in the West we’ve been conditioned to regard insects – particularly in the context of food – as, well, icky. But it turns out there are several excellent reasons why we should all at least try to get over our collective hang-up. Among those with an eye toward sustainability, there's a growing consensus that cultivating edible insects offers a viable solution to the daunting problem of feeding a growing population. With chronic malnutrition affecting some 805 million people worldwide and current livestock practices already straining the earth’s resources, bugs could very well be the key to ensuring both food security and environmental sustainability for the planet’s future.
So ask yourself: if the world depended on it, would you eat a bug?
Bugs: The Better Livestock
In 2006 the FAO published a report titled Livestock's Long Shadow, in which it explored the sobering environmental consequences of modern-day agriculture. The findings cover a host of environmental problems, ranging from air and water pollution and greenhouse gas emissions to rainforest destruction and soil degradation. According to the report, livestock activities account for 18% of total global greenhouse gas (GHG) emissions – more than the entire transportation sector. When it comes to individual greenhouse gases, livestock rearing accounts for a full 35–40% of global methane emissions and up to 65% of nitrous oxide emissions – both gases with significantly higher global warming potential than the oft-demonized CO2. The livestock sector currently occupies around 30% of the planet's ice-free land, and the expansion of livestock grazing is considered the leading driver of rainforest destruction.
Reducing livestock's environmental impact, then, is of critical importance to addressing the effects of global climate change. But with a global population expected to reach 9 billion by 2050, scaling back on food production simply isn't an option. In fact, the FAO predicts that agricultural production will have to as much as double to keep up with demand, effectively exceeding the planet's supply of arable land. With pollution, population growth, and nutritional needs all poised for collision, the word “unsustainable” doesn't even begin to cover the challenges ahead.