Entitlements: Ageism and Ableism at Work
Aging is an extraordinary process where you become the person you always should have been.
— David Bowie
Poppy was eighty-six years old when he was forced into retirement. Up until that point, he maintained a small office at the hospital and diligently (and happily) participated in rounds once a week. Poppy was in excellent health; sustaining his professional practice of medicine meant the world to him and was an integral part of his identity. He was devastated when the powers that be decided abruptly to end his professional career. They said they needed his office space. I understand the pressures the decision makers were under; structurally and institutionally, our society has been built around retirement. This is why retired is so widely used as another term for older adulthood.
We grow through infancy, childhood, adolescence, adulthood, and then — we retire. But what does being retired even mean? Merriam-Webster defines retirement as “withdrawal from one’s position or occupation or active working life.” In my view, it is highly problematic that we use a term that connotes a complete withdrawal from an activity that we used to do to generally describe one of life’s stages. Describing someone as retired says nothing about them other than they used to work. It egregiously fails to capture any relevant information about interests, passions, goals, and ambitions in the present or for the future.
Poppy’s retirement was the precursor to a depression that lasted the rest of his life. The man who loved ice cream and chocolate no longer wanted either. The classical music aficionado turned down offers to listen to a CD with “No thanks, I’m retired.” The professor emeritus of medicine got stripped of his identity unwillingly, and there was no structure in place to support him in an effort to find a renewed sense of purpose and meaning. Nonny was exasperated with Poppy’s depression and would reach out to my father for advice. What could she do to get him to change out of his pajamas and engage with the things he loved? We all felt stuck and helpless.
Societally, we lack the understanding and the language needed to promote a path toward a meaningful elderhood. Retirement, instead, has become the primary developmental stage equated with later life. I use the term developmental purposefully to illustrate the absurdity. Unlike other developmental stages of life that contain meaningful milestones and markers of growth and expansion, retirement revolves around a mechanism to downshift older workers out of the workforce — sometimes with a thank-you party and a gold watch; sometimes not — and that is a big problem. Retirement is not a developmental stage; it is a social institution. The social institution of retirement was designed to provide younger workers opportunities by reducing unemployment, containing costs, and tying workers to jobs through pensions. The concept of retirement also addressed older workers’ needs by rewarding them for long and loyal service and providing a mechanism for older-age income security. However, the formation of retirement as a social institution structurally cemented two critical notions: Being a productive and profitable worker has a time limit, and post-working life is an inevitable period of inactivity and idleness that is earned and should be desired.
Retirement as “Old Age”
It is reported that 70 percent of men sixty-five and older remained employed in America through the time of the Civil War. Landowners, predominantly men at the time, retained a great deal of authority, power, and esteem and often deeded their sons the homestead with the understanding they would receive financial and other support through their older age. However, a shift away from traditional elder authority toward control based on achievement and wealth accumulation began to get under way in the first half of the nineteenth century, and numerous documents from the period reflect a growing ambivalence toward the aged.
Historians credit Augustus Caesar with conceiving the idea of retirement in 13 BCE when he created a system offering a lump-sum payment to soldiers after twenty years in a legion and five years in the military reserves. Yet the social institution of retirement did not get established until the mid-1800s. The emergence of labor unions and mandatory retirement policies between 1865 and 1900 in Europe, North America, and elsewhere provided the building blocks for a retirement system. And the formal adoption of such a system, under which retirement income would be provided, first coalesced in Germany in 1889 when Chancellor Otto von Bismarck initiated the first-ever state pension system. Bismarck designated seventy as the age of eligibility for benefits, yet life expectancy hovered around forty-five for women and was even lower for men. So retirement was designed to be a relatively small amount of money for relatively few people.
In the United States, the earliest formalized pension program was passed in 1862, but exclusively for disabled Civil War veterans or the surviving widows and orphans of soldiers who had died or been killed during active duty. In 1906, old age was added as a sufficient qualification for benefits, and more than 90 percent of remaining Civil War veterans became beneficiaries. An ugly side effect of this generous system was a growing view of older men as prey for younger women looking to cash in, planting the seeds for ageist attitudes and stereotypes of older men as gullible and weak.
The American Express Company created the first private pension plan in the United States in 1875. It applied to employees who had twenty years of service, had reached the age of sixty, and had been recommended by a manager and approved by a committee and also by the board of directors. This concept caught on as more and more employers started to promote efficiency and mobility and were eager to pave the way for younger workers to replace older ones. Over the next fifty years, hundreds of corporations followed suit and adopted the practice of providing an economic incentive for exiting the workforce. The Internal Revenue Act of 1921 spurred this practice further by changing the tax code to exempt employers’ contributions to pensions from federal corporate income tax.
In 1910, the Massachusetts Commission on Old Age Pensions defined old as sixty-five and older simply because sixty-five was the age already being used in most pension schemes at that time. In 1916, Germany reduced its state-supported retirement age from seventy to sixty-five. In 1934, Congress passed the new federal Railroad Retirement System using sixty-five as the age for eligibility. The Committee on Economic Security (CES) then endorsed sixty-five as a marker for retirement based on actuarial studies demonstrating that it would result in a system that could be self-sustaining with modest levels of payroll taxation. The Social Security Act of 1935 formally established a system for paying retired workers age sixty-five and older. And that is how sixty-five came to be regarded as the beginning of “old age.”
With his support for a state pension system as a well-earned provision for retired workers, Bismarck simultaneously sent an ominous message signaling old age as a time of incapacity:
The State must take the matter into its own hands, not as alms giving but as the right that men have to be taken care of when, from no fault of their own, they have become unfit for work. Why should regular soldiers and officials have old-age pensions and not the soldier of labor? This thing will make its own way: it has a future.
Superannuated, a commonly used term to describe people eligible for retirement during this period, provides another indication of the growing disdain for old age. It is defined as “disqualified or incapacitated by age; old and infirm…too old; worn out, antiquated; made out of date or obsolete, esp. by age or new developments.” The roots of ageist and ableist language and thought were officially planted.
Shifting demographic trends and mounting scientific studies shaped increasingly negative attitudes toward older workers. A scientific study released in 1882 postulated that with time, parts of the body simply wore out due to age and repeated usage. This wear-and-tear theory of aging, introduced by German biologist Dr. August Weismann, was taken by many as evidence that people had a fixed capacity and time limitation for work. Over the next few decades, ageism was exacerbated further as scholars such as statistician Frederick Hoffman claimed that ending work at sixty-five would maximize productive potential. British economist William Beveridge argued that older workers lacked adaptability. Professor of medicine William Osler made the ageist decree that there was a “fixed period” for productive years from age twenty-five to forty, followed by the “uncreative years” from forty to sixty, followed further by the unproductive, useless years, which commenced after sixty. Osler was also one of the first to classify retirement as a time of leisure earned in the world but separate from the rest of life. All the while, the population of older people in the United States was beginning to experience a dramatic expansion. In 1900, 4.1 percent of the population was sixty-five and older. By 1950, the figure had almost doubled, to 8.1 percent.
Age Discrimination: Ageism and Ableism at Work
There is an important caveat worth mentioning that speaks to the complexity of the aging experience. We do experience physical decline over time, and a high proportion of jobs during the nineteenth and the early twentieth centuries required exhaustive physical labor. So this pervasive attitude associating productivity with age was not wholly unjustified. But the phenomenon of aging was now solidly defined as a process of decline and inability, and the societal solution deemed most efficacious was to create a normalized system of withdrawal based on age. This broad misunderstanding persists to this day, and we continue to conflate age and ability.
Two terms illustrate the difficulty of using age as a sole indicator of ability and thereby occupational qualification: chronological age and functional age. Chronological age is age measured from birth. Functional age surmises an individual’s age based on their functional ability. While the notion of functional age moves conceptually in a positive direction by tying age to actual ability, it is still problematic in that it attempts to explain what it means to be of a certain age. Defining what it means to be X years old is impossible given the complexity and multidirectionality of the aging experience. Think about it: What does it mean to be forty years old? Sixty years old? Eighty years old? Is there a universal way to categorize and classify this? The answer is indisputably no. Someone at the age of eighty-five can demonstrate physical strength and prowess, while someone at the age of forty-five can demonstrate physical weakness and frailty. Age simply cannot be a sole indicator of health and potential, no matter how it is defined.
Older age, unlike childhood, has very few normed processes that can be used as developmental milestones. In childhood, we have a plethora of checklists and charts that outline social, emotional, communication, cognitive, and physical developmental tasks, like taking the first step, mimicking sounds, and pedaling a bicycle. In childhood, these are normed, meaning that we expect these abilities to happen in a progressive and relatively predictable way. In many ways, older age is completely counter to this predictable progression. In reality, there is evidence that as people get older they become less alike, more heterogeneous, and frankly more individual and unique to themselves. Accordingly, using any definition of age as the lone indicator of one’s suitability for employment is misguided and flawed; no definition should be considered a barometer of one’s ability to perform any kind of work.
The Social Security Act of 1935 cemented the practice of retiring at sixty-five, which became mandatory under many private pension plans. Concurrently, the Great Depression altered the financial landscape by drastically increasing the number of older people that were poor and providing significantly fewer employment opportunities for job seekers of all ages. A combination of factors including the need to reduce the size of the labor force, the low level of benefits provided under the initial legislation, and the “retirement test” (which penalized beneficiaries for having too much earned income) created a prejudicial system designed to move older people out of the workforce. Social Security benefits, which were touted as a social advancement, reinforced rather than eliminated discriminatory practices and had the practical effect of imposing enforced poverty on many older people.
There has been a precipitous decline in men’s labor force participation rates for those over sixty-five ever since:
● 78 percent in 1880
● 65 percent in 1900
● 58 percent in 1930
● Less than 20 percent in 1990
Very clearly, “in a society obsessed with youth and productivity, there was no place for the older worker. Though the poor and minorities were the most heavily burdened, no class, race, ethnic group or sex were untouched by the pernicious effect of age discrimination.” The process of mandatory retirement became justified by the view of the aged as disabled, less efficient, less able to maintain production standards, intellectually declining, suffering decreasing stamina and strength, unable to adjust to new work situations, inflexible, and frequently absent due to illness. All of which happen to be inaccurate. Of note, it wasn’t until 1967 that the Age Discrimination in Employment Act (ADEA) was passed to protect workers forty to sixty-four who wished to remain in the workforce. Furthermore, it took until 1978 for the ADEA to be amended to cover workers up to age sixty-nine, effectively abolishing the practice of mandatory retirement before age seventy.
Let’s clear up the misperceptions about older people at work with the facts:
Myth: Older workers are less productive and unreliable.
Reality: The majority of older workers can work as effectively as younger workers, with some studies showing older workers as having higher productivity.
Myth: Older workers are less motivated with less ambitious career goals.
Reality: Older workers care about their jobs, have goals and ambitions, and are interested in taking on new challenges.
Myth: Older workers are not as mentally sharp.
Reality: Older workers bring valuable knowledge and skills to the job, including higher verbal ability and knowledge from lived experience (known as crystallized intelligence).
Myth: The risk of dementia is increasing among working older Americans.
Reality: There is evidence that the incidence of dementia is declining and has been over the past three decades.
Myth: Older workers are resistant to technology and not digitally savvy.
Reality: Countless surveys show that most older people use the internet and are more digitally connected than ever before. Research also shows that older adults are skilled at multitasking with media and technology.
Myth: Older workers cost more.
Reality: Among other cost-saving benefits, older workers are more likely to remain with their employers longer, which enhances profitability.
Although these myths have been debunked, negative stereotypes about older people at work have persisted, and ageist and ableist thought and practice are now profoundly institutionalized. Age discrimination in the workplace today often goes completely unnoticed. There are many ubiquitous and seemingly innocent ways it seeps into everyday organizational culture. To take stock of your organization, it can be helpful to critically examine who the learning opportunities are offered to, who is being given the challenging assignments and who is being passed over, and who is being left out of meetings or activities. If you happen to notice any age-related trends, pay attention.
If I asked you outright if age discrimination exists at your workplace, you would likely instinctively respond “no!” and offer justifiable explanations for why the types of decisions described above are made. But it is quite possible, if not likely, that underlying, unconscious motivations are driving these decisions. It is also quite possible, if not likely, that these behaviors are perceived as discriminatory by some people. Sometimes it is even the most innocent, well-intentioned, good-humored ageist behaviors that slip right by our awareness.
Have you ever been part of or witnessed a birthday party at work that has an over-the-hill theme? You might not have consciously realized it, but of course this is poking at and making fun of getting older. Celebrating another year of life with jokes about the aches, pains, and declining experiences that we equate with being “over the hill” is an ageist and ableist microaggression. A microaggression is a statement, action, or incident that is a subtle or unintentional zinger of discrimination against a person perceived as different. Ageist and ableist microaggressions are often disguised as innocuous quips, jokes, memes, emails, and offhand remarks like the following:
“At your age, this is probably a trend that you haven’t heard of…”
“You are too young to remember this, but…”
Saying a job applicant isn’t a good “cultural fit.”
Making jokes about silly texts that your parents or grandparents sent.
“Your generation doesn’t appreciate this…”
“Happy birthday — you are old as dirt!”
“I can’t believe you have worked here for twenty years! You’re a dinosaur!”
Language matters, and what seems benign can be deeply offensive; at the very least, it perpetuates ageist and ableist thoughts and behavior. It is not difficult to find words to use in your everyday conversation that don’t reflect a worldview in which differently aged or differently abled people are less valued. Our quick, automatic thinking keeps us from recognizing our language as biased or prejudicial. It is so much easier to say what is easily understood than to say what you actually mean. A simple way to check yourself is to substitute another form of identity (such as race, class, or gender) and see if you find the comment offensive. How do you feel about saying “someone of your race [or economic class or gender identity] wouldn’t appreciate this…”? If that makes you uncomfortable, which it should, it is offensive in relation to age, too.
While we are on the topic of offensive language, it was also during the period when retirement was becoming formalized and pervasive that terms like senior citizens (coined in the 1930s) and the elderly became euphemisms for older people. These terms stigmatize older people by implying that they are a homogeneous group with common attributes like weakness, frailty, and senility. These terms condemn being old; if they didn’t, why wouldn’t we call younger people “junior citizens”?
Most troubling is that these ageist and ableist ideologies embedded in our language are internalized and become self-fulfilling prophecies. Becca Levy’s theory of stereotype embodiment illustrates how stereotypes from the surrounding culture become internalized as negative, self-directed ageism. At work, older employees are primed to internalize the underlying beliefs and judgments expressed as microaggressions and prejudicial behaviors, such as age-based comments on their performance and potential. This internalization creates a vicious cycle whereby the myths and stereotypes of older workers that provoke age discrimination in the workplace become driving forces that perpetuate the very same behavior — a phenomenon called stereotype threat.
Stereotype threat occurs when members of a marginalized group are aware that a negative stereotype exists in reference to their group and demonstrate apprehension about confirming the stereotype. That feeling of dread makes us more likely to internalize the stereotype and then display that exact behavior! Essentially, we become the stereotype, and in turn confirm it. A research study demonstrated this when older people who were told that age leads to poorer memory performed worse on a memory test than those who were told that older people perform as well as younger people. We tend to care about what other people think of us, and when we feel we are being labeled and judged, we “give up and give in.” This is a part of the process of relational ageism and is why we never get off the proverbial treadmill — we absorb the stereotypes, we feel devalued or unvalued, we perpetuate the stereotype, the stereotype is then confirmed and absorbed by others as truth, and the loop goes on and on.
There is an easy strategy to neutralize the impact of stereotype threat and relational ageism in the workplace: Promote positive stereotypes by encouraging diversity in teams based on age. Creating the infrastructure for people of different ages to work together and learn together is the single most cost-effective way to create an anti-ageist workplace. When people have the chance to genuinely connect with each other as authentic human beings, influential relationships develop organically and bias is challenged. An extra pro tip: Start this endeavor by having age-diverse teams spend some time discovering what they have in common with one another. You will find that, generally speaking, we tend to have a lot in common that can serve as a point of relationship bonding — no matter what ages we are.
When there is talk of “managing diversity” in the workplace, we typically focus on the much-needed work surrounding race, gender, and sexual identity and orientation, but we have yet to prioritize age and ability as important factors related to intersectionality. Yet it is undeniable from looking at history that age and ability exclusion have been interwoven into the fabric of work environments.
A Note About Generations
Another point of conflict in work environments stems from the perceived discordance among people of different generations. Until the eighteenth century, the term generation was used to refer to a familial generation, describing sets of relatives similar to each other in age within a line of descendants and ancestors. However, the social change resulting from industrialization and modernization sparked a youthful rebellion against the established social order. In 1863, a lexicographer named Emile Littré redefined generation to refer to a social construct describing all people living in society at a given point in time. The concept of social generations gained in popularity, and societal divisions based on age became more prevalent. This was accompanied by political forces and movements encouraging the idea that progress and change should be driven by the power of youth. Thus, the term generations began to be equated with the concept of youth enfranchisement and liberation.
Karl Mannheim was the first social scientist to investigate a theory of generations and postulate that birth cohorts shared values and experiences. But it was William Strauss and Neil Howe who developed our modern-day understanding of generational archetypes in their 1991 book Generations. Generational theory posited that generational cohorts were produced by specific social and biographical experiences that created common trends in values, attitudes, and preferences. In essence, historical events and the zeitgeist of the era shaped a collective personality. Generational tension, then, arises when there is discord between perceived values among people of differing generations. It has also been said that generational language and jargon contribute to misunderstanding among age cohorts.
Generational nicknames are sticky labels used to describe an entire group of people born during a specific period. Current names of generations are Traditionalists or Silent Generation (born 1945 and before), Baby Boomers (born 1946–1964), Generation X (born 1965–1976), Millennials or Gen Y (born 1977–1995), and Gen Z or iGen (born 1996–TBD). These generational labels are now commonly used to describe and predict behavioral traits.
We lump together millions upon millions of people and believe that they all have a similar blueprint of behavioral characteristics in common. For example, we regard Baby Boomers as competitive, self-disciplined, safety seeking, and good team players. We believe that the millions of individuals born in the Millennial generation feel entitled and are politically passive. I believe the concept of tying generational blueprints to behavior is total and absolute bullshit. Historical influences do not have a blanket effect on individual outcomes. Historical and cultural events shared by similarly aged people are experienced and internalized in different ways. Not only is it impossible to predict behaviors of any one individual based on date of birth, but the labeling of people within a somewhat arbitrary twenty- to thirty-year span makes no logical sense. Let me illustrate with an example. As a group, the Baby Boomers were born from 1946 to 1964. Baby Boomer A born in 1946 experienced their formative years in the 1960s, while Baby Boomer B born in 1964 lived their formative years in the 1980s. Baby Boomer A grew up in a time of the Civil Rights Movement, hippie culture, the Vietnam War, and the assassination of JFK. These were powerful cultural forces that had a strong influence on personality and identity. Baby Boomer B grew up in the 1980s and experienced the first launching of the space shuttle, the end of the Cold War, and a rise in technology, the internet, and conservative politics. Quite clearly Baby Boomer A and Baby Boomer B had very different historical events and experiences shape their development — yet we refer to them both as Baby Boomers as if they fit one mold. The same homogeneous classification system is used for every generation, and it makes absolutely no sense.
People in different generations are not as different as you might think. All generations undergo development patterns that follow similar ebbs and flows. A common head-shaking euphemism is “kids these days,” but in reality, it’s normal for teenagers to feel invincible and to exhibit recklessness no matter what decade they are born into. We also commonly hear “[insert generation here] is responsible for [insert grievance here],” yet the truth is that every generation has people who dedicate their time, energy, and passion to bettering the world. Regardless of the period into which people are born, they have more in common with individuals born into other generations than we are led to believe.
Pitting generations against each other kindles internalized ageism. It is no surprise that if we see older generations as wildly outdated and inept, then we are likely to fear our own process of growing older. Generational labeling and stereotyping also promote other-directed ageism by perpetuating an us-versus-them mentality. Ageism results when generations see each other as them. Need I say more than the catchphrase and meme “OK boomer”? If you aren’t familiar, “OK boomer” popped up online in 2019 and went viral as a slang phrase used to dismiss or mock attitudes ascribed to Baby Boomers. The phrase was used as commentary on political and social issues like climate change, and not surprisingly it morphed into an ageist catchall. Another form of ageism — adultism — also results from generational labels. During my ageism presentations with groups of elders, I invariably hear derogatory and dismissive comments about “those irresponsible Millennials.” Remember, ageism is bidirectional. As a myth-busting side note, despite what is commonly believed about Millennials as a selfish generation, one-fourth of people providing care to elders right now are Millennials, and that number is expected to rise. Just saying.