Understanding the Differences between Palliative Care and Hospice Care

Regardless of whether it’s you or a family member who is suffering from an illness or disease, it can be an extremely difficult and trying time, especially when you don’t have support around you. Curative treatments are of course important, but comfort and care are equally vital—both during the illness and as death approaches. With a huge 30% of all Medicare dollars being spent during the last twelve months of a patient’s life[1], it’s time we re-evaluated our approach to illness and put a greater focus on palliative and hospice care.

The terms ‘palliative care’ and ‘hospice care’ are often used interchangeably, but they are in fact different, and they’re used at different times during the course of an illness. Understanding the importance of these treatments and the differences between the two is vital if you or your loved one wants to get the right care at the end of life.

Palliative Care

The term ‘palliative’ comes from the Latin word ‘palliare’, meaning to cloak, and the purpose of palliative care is to provide as much comfort, pain relief, and support as possible. It’s designed to work in tandem with curative treatments and often begins at the point of diagnosis[2]. It is offered to those suffering serious and potentially life-threatening illnesses such as cancer, heart disease, AIDS, and dementia, and is used to prevent or treat the side effects and symptoms of both the original illness and of the curative treatment itself. As well as offering pain relief and help with any resulting subsidiary illnesses, palliative care provides emotional and practical support for patients and their families.


Non-Profit Organization Awards Grant to Student of Chicana and Chicano Studies

Non-profit organization OKOLOGIE.ORG is delighted to announce that a grant of $500 has been offered to student Jessica Orozco, after her successful research into sanctioned residential segregation and its effects on educational success in the Los Angeles area. Orozco, who received the grant in February this year, will use the money to help towards costs associated with attending the NACCS Conference in the Chicana and Chicano Studies field. 

With around 2 million Chicano/Latino students in California, and only around 25,500 Chicano/Latino teachers to serve them, children are often only exposed to their ethnic culture twice a year – during Hispanic Heritage Month and on Cinco de Mayo. OKOLOGIE.ORG believes that Orozco's field of study is important, as it addresses the often-overlooked social, political, cultural, and economic conditions of the Chicano people.

Happy Earth Day

Join Earth Day Network on Earth Day 2018 - April 22 - to help end plastic pollution. Plastic is threatening our planet's survival, from poisoning and injuring marine life to disrupting human hormones, from littering our beaches and landscapes to clogging our streams and landfills. Together, we can make a difference.

Is Overpopulation Still a Problem?

Overpopulation has been a topic of discussion for centuries. From Plato and Aristotle to modern-day scientists and philosophers, the questions of population control – whether we need to impose controls and what those controls should be – have been hotly debated. The world’s population currently stands at approximately 7.6 billion people, and every day around 360,000 babies are born. That’s roughly 15,000 new mouths to feed every single hour. On the flip side, only around 150,000 people die each day[1]. The disparity is obvious, and at this rate population growth is inevitable – but is it a problem? Fifty years after Paul Ehrlich terrified the world with his vision of starvation and death in his book The Population Bomb, do we still need to worry about the effects of overpopulation?

From Before Christ Onwards

As far back as the fourth century BC, overpopulation has been a concern. Plato and Aristotle recommended imposing strict birth controls to ensure that the population didn’t rise above 200 million people worldwide[2] – a stark contrast to today’s 7.6 billion! Later, Thomas Malthus famously warned about population growth in 1798, and by 1968 Paul Ehrlich argued that it was too late – we’d surpassed a sustainable level and control was no longer an option. Instead, he argued, we needed to actively reduce the population through enforced, compulsory methods. Skip forward to 2018, however, and the fiery conversations about overpopulation have been somewhat dampened. The urgency around reducing the birth rate or even reducing the population itself seems to have fizzled out. Does that mean that it’s no longer a problem, though, or have we simply become apathetic?

Nadine Burke Harris, M.D., is the author of "The Deepest Well: Healing the Long-Term Effects of Childhood Adversity."

In an interview on NPR, Dr. Harris discusses what negative experiences can do to a growing child’s health. Children’s exposure to adverse childhood experiences – such as physical, emotional, or sexual abuse; physical or emotional neglect; parental mental illness; substance dependence; incarceration; parental separation or divorce; or domestic violence – can negatively affect health outcomes.

Are We Failing Our Elders? The Rise of Senior Poverty in the US

One in every six elderly Americans now lives below the federal poverty line, and over half of all ‘baby boomers’ have reported a deterioration in their quality of life over the past few years[1]. Some 21% of married couples and 43% of single people over the age of 65 depend on Social Security for a massive 90% of their income, and almost 62% of households headed by someone over the age of 60 are in debt[2]. 2.9 million older households in the US suffer from food insecurity[3], and between 1991 and 2007 the number of 65- to 74-year-olds filing for bankruptcy increased by a huge 178%[4].

Now, “25 million Americans aged 60 and above are economically insecure”[5], leaving them to struggle with housing, rising healthcare costs, nutrition, transportation, and savings. To make matters worse, they are often left isolated and alone, living in suburbs surrounded by other elderly people whose families have grown up and flown the nest[6].

An Aging Labor Force – Working Until You Drop

With the increasing financial difficulties faced by the older population, many continue working long after the traditional retirement age, with 40% of ‘baby boomers’ claiming that they plan to “work until they drop”[7]. Of course, for some, this is a choice – a desire to continue in their current roles or even explore alternative career options – but for too many it’s a financial necessity. In 2000, 4 million senior citizens continued to work past retirement age, but by 2017 that number had jumped to 9 million – the highest senior employment rate in 55 years[8]! In fact, it’s the highest senior employment rate since before retirees earned the right to healthcare and Social Security benefits in the 1960s.

To put further strain on older employees, the workforce is changing. Technology is becoming more and more prevalent, and senior employees are left to learn new skills or change the way they work[9]. This leaves some forced out of their jobs and as a result, many sell their properties, give up their lifestyles, and travel the country in search of seasonal work[10]. The Bureau of Labor Statistics (BLS) estimates that by the year 2022, workers aged 55 and above will make up a huge 25% of the labor force[11] and it doesn’t look like things are going to get any better any time soon. We’re failing our elders, and we need to do something about it.

Why Some People Choose to Amputate, and the Implications that Has for the Rest of Us

For the majority of us, the amputation of a limb is something from our nightmares – the result of a horrific accident or debilitating illness, perhaps a birth defect – something to be avoided at all costs. It's certainly not something we would choose to do. Approximately 185,000[1] people in the US experience the amputation of a limb every single year, but surprisingly, more and more people are opting for an elective amputation – an amputation that is not the result of a life-threatening defect but is voluntary, and one that is undertaken for a variety of different reasons. But why do people choose to have such a life-changing procedure? And what implications does that have for the rest of us?

Elective Amputation of Problematic Limbs

Currently, the majority of elective amputees are those suffering from problematic – though not life-threatening – issues, such as a damaged foot or a mangled knee. Although many doctors and surgeons still oppose the idea, these patients choose amputation as a way of improving their lives. It often ends painful suffering, can improve mobility, and can offer a higher quality of life in general. Hugh Herr, a double amputee and biophysicist in the biomechatronics department of the Massachusetts Institute of Technology (MIT), is one such person[2], and he believes that as time goes on, people will increasingly choose to amputate in order to replace their "heavy and stupid"[3] legs with high-concept, technological prostheses.

Should We Be Afraid of AIs?

As technology continues to improve and the development of artificial intelligence hurtles forward, it’s no wonder we find ourselves asking whether we should be worried. What will artificial intelligence be like? Will we retain control, or will something go wrong? Will it help us or hinder us? There is certainly enough science fiction to suggest the latter. Books and movies are littered with examples of rogue robots and AIs gone bad – machines that take over the world and enslave the human population or, worse, kill us all off; but that’s just fiction. And besides, we’re a long way from walking, talking AIs who are part of our everyday lives, right?

A Present Problem

Wrong. We already encounter artificial intelligence in much of the technology that we use every single day. Apple’s Siri and Microsoft’s Cortana are, after all, forms of AI – albeit more basic than those we see roaming the streets in the latest sci-fi movies. Then there are smart cars that can drive themselves, iRobot Roomba vacuum cleaners that guide themselves around your room and then return to their charging stations, and security surveillance systems that can track potential crime without human control. There are fraud detectors, predictors for retailers, recommendation services like those you find on Amazon, and even automated online customer service support[1].

And that’s just the beginning of it. In Japan, the tech firm SoftBank has released a best-selling humanoid-style robot named Pepper, who can recognize emotions and respond accordingly[2]. Sex robots are currently in production (alongside a sweep of controversy), automated weapons are being developed, and more and more AIs are taking the jobs of human beings. So what seems like a future problem is actually something that we are very much embroiled in already.

The Addiction of Pleasure and All its Damning Consequences

Hedonists have long argued that the path to happiness is the pursuit of pleasure – the more pleasurable activities you participate in, the happier you are likely to be. It seems somewhat self-evident too – pleasure makes you happy, albeit for a limited time, so if you bunch together a number of happiness-inducing pleasure activities, you will ultimately be happy. However, evidence suggests the opposite. In a world abounding with pleasure activities – alcohol, drugs, sugar, sex, pornography, wealth on a scale the general populace has never seen before, even social media and smartphones – we seem to be unhappier than ever. In a world of increased privilege, we are increasingly discontent, and that in itself has negative consequences we could never have foreseen. As we become addicted to the pursuit of pleasure, are we actually ruining our chances of genuine happiness? And could we potentially be sending ourselves to an early grave?

The Increase of Pleasure Activities

The drive for pleasure is evident within our culture, and it's becoming easier and easier to attain as our lives become less fraught with worries such as war and famine. The average American now consumes 94 grams of sugar per day – almost double the government's recommended limit of 50 grams per day, and up from 87 grams per day in 1970[1]. Not only is the availability of this 'feel-good food' increasing, the desire for it is sky-rocketing too, suggesting an addictive tendency in this pleasure-seeking habit. It's not just sugar either. Drug use has increased by almost six percent since 2007[2], and the use of smartphones has shot up from approximately 62 million people in 2010 to 224 million people in 2017[3]. Pornography, narcotics, social media use, and alcohol intake are all on the rise too. What's more, the average annual household income has increased from $49,354 in 2007 (ranging from $36,338 in Mississippi to $62,369 in New Hampshire) to $57,856 in 2015 (ranging from $40,037 to $75,675)[4], meaning that we can now pursue pleasure more quickly and easily than ever before.

Cheat Meals: Sin or Savior?

Dieting can be really tough, especially when there is so much conflicting information out there – eat this, not that; do that, not this. The topic of cheat meals – or even cheat days – is no exception, and the debate it engenders can get a little heated. After all, both food and health are passionate subjects. What’s it all about, though? Is it really possible to have one meal a week, where you eat whatever you want, and still maintain – or even lose – weight? Cheat meal advocates say yes, and they say they’ve got the science to back them up. 

A Psychological Boost

One of the biggest benefits of cheating, advocates claim, is actually a psychological one rather than a physical one. Everyone knows how tough it can be to stick to a strict regime, and the idea is that a cheat meal will allow you to relax your regime once a week, helping you to stick to it the rest of the time. It provides that added incentive to be ‘good’, because you know that you’re earning a splurge on the weekend[1]. That’s a dangerous road though, and only works for some. It can, potentially, lead to that famous slippery slope. Joe Vennare, creator of The Hybrid Athlete, warns that “some people can’t make the switch from healthy to unhealthy. Once they get a taste of sweets, they binge and can’t go back. It throws off their entire diet plan, serving as a setback instead of a small break from the rules”[2].
