
Pandemic Spurs a Migration

Immigration.  wnd.com image

 

By Casey Bukro

Ethics AdviceLine for Journalists

 

The covid-19 pandemic is spurring a migration, one of many in human history, from crowded big cities where death tolls are rising to smaller cities, suburbs and rural areas thought to be safer from the scourge.

Humans are a restless species even in the best of times, spreading across nations or the globe, moving sometimes voluntarily and sometimes involuntarily, prodded by war, famine, disease, conquest, wanderlust, living conditions, the pursuit of religious freedom and prejudice.

Today the fear of a plague is the driving force.

“A record 27% of home searchers looked to move to another metro area in April and May 2020, a new high in the share of Redfin.com users searching for homes outside their area,” reported Redfin, a residential real estate company. It’s up from 25.2% in the second quarter of 2019 and 26.0% in the first quarter of this year.

The migration analysis is based on a sample of more than one million Redfin.com users searching for homes across 87 metro areas in April and May. Overall, searches for homes in small towns surged. Pageviews of homes in towns with fewer than 50,000 residents were up 87% from a year ago, more than triple the 22% increase in pageviews of homes in cities with more than a million residents.

They want to leave the Bay Area, Washington, D.C. and Seattle for places like Sacramento, Las Vegas and Nashville.

“While there has been a huge increase in the number of people looking online at homes in small towns, the long-term impact of the pandemic on people actually moving from one part of the country to another remains to be seen,” said Redfin economist Taylor Marr. Most of them probably were already considering a lifestyle change. “The pandemic and the work-from-home opportunities that come with it is accelerating migration patterns that were already in place toward relatively affordable parts of the country. But for many people, the lure of large homes in wide open spaces will be a passing dream fueled by coronavirus-induced isolation.”

                                                      Past migrations

Judging from past migrations, isolation hardly seemed like a goal. And pandemics have a way of upending and diverting plans. Moving might not be that easy.

New York, New Jersey and Connecticut enacted 14-day traveler quarantines, trying to check the spread of the coronavirus. Governors hope to preserve their hard-won recoveries by making travelers from more than a half-dozen virus hot-spot states isolate themselves.

The United States leads the world in coronavirus cases and deaths. While the U.S. wrestles with easing restrictions safely, the European Union declared that Americans will not be allowed to travel to the bloc of 27 countries when it reopens to some foreign travel. Nearly 10.3 million coronavirus cases have been detected worldwide, with roughly 2.6 million infections reported in the United States. At least 124,000 people have died of the disease in the U.S., and the global death count is near 505,000.

It’s too early to tell how this plays out nationally and globally on migration, the movement of people from one place to another, often between countries, with intentions of settling temporarily or permanently in a new location.

Early human migration took people into regions where no humans had lived. Colonialism involves expansion of populations into sparsely settled territories or territories with no permanent settlements. In modern times, humans have migrated within and between nations, legally or illegally. Migration can be voluntary or involuntary, as with deportation, the slave trade, trafficking in human beings and flight from wars or ethnic cleansing.

The history of immigration to the United States starts with the first European settlements around 1600. In 1607, the first successful English colony settled in Jamestown, Virginia. In 1619, Africans were imported as slaves. So began the first and longest era of immigration, lasting until the American Revolution in 1775. It brought Northern European immigrants, mainly from Britain, Germany and the Netherlands. The British were the largest group of arrivals. Ninety percent of these early immigrants became farmers.

                                                    Later in history

A later chapter of this history is chronicled by the University of Washington’s “America’s Great Migrations Project,” derived in large part from the work of James Gregory, professor of history. Recognizing that “Americans have always been a moving people, coming from other places, moving to new places,” the project focused on four historic migrations:

The Great Migration – Upwards of seven million African Americans left the South between 1916 and 1970 to settle mostly in the big cities of the North, Midwest and West. They transformed cities and laid the foundations for the remaking of race relations, politics and even the regional balance of the U.S.

Driven from their homes by meager economic opportunities, they took advantage of the need for industrial workers during the First World War. To the dismay of white Southerners, black newspapers, particularly the widely read Chicago Defender, published advertisements touting the opportunities available in Northern and Western cities, along with first-person accounts of success.

By the end of 1919, some one million blacks had left the South, traveling by train, boat or bus. Some had autos and even horse-drawn carts. In the decade between 1910 and 1920, the black population of major Northern cities grew fast, including 66% in New York, 148% in Chicago, 500% in Philadelphia and 611% in Detroit.

Many new arrivals found jobs in factories, slaughterhouses and foundries, where working conditions were hard and sometimes dangerous. Female migrants had a harder time finding work. Aside from competition for employment, there was competition for living space in the crowded cities. Though segregation was not legal in the North, as it was in the South, racism and prejudice were widespread.

During the Great Migration, African Americans began building a new place for themselves in public life. They confronted racial prejudice as well as economic, political and social challenges to create a black urban culture with enormous influence in decades to come.

                                             Latinx migration

Latinx American Great Migrations – Spanish-speaking people were living in what is now the United States decades before English-speaking people crossed the Atlantic seeking colonies. Centuries later, the United States annexed Florida, Louisiana and the northern half of Mexico. More than 100,000 Spanish-speaking residents became U.S. citizens.

Though the University of Washington project calls this a migration, it seems more like a population capture.

The 1850 U.S. census counted more than 80,000 former Mexicans, 2,000 Cubans and Puerto Ricans and 20,000 people from Central and South America. Today the descendants of those 1850 citizens are part of a Latinx American population that has grown enormously. As of 2017, more than 58 million Americans claimed Latin American heritage.

Conventional wisdom says Latin American migrants continue coming to the U.S. seeking a better life and the “American Dream,” writes Roque Planas in huffingtonpost.com. That’s true, but there’s another part of the story, he writes. “People leave Latin America because life there can be very hard. Poverty, political instability and recurring financial crises often conspire to make Latin American life more challenging than in the U.S., a wealthy country with lots of job opportunities.”

Southern Diaspora – More than 20 million whites left the South during the 20th century, vastly outnumbering the 7-8 million African Americans who left, according to the University of Washington project. They were joined by nearly one million Latinx who moved west to California and into the Midwest.

                                                              Dust Bowl

Dust Bowl Migration – Close to 400,000 people fled Oklahoma, Texas, Arkansas and Missouri during a period of severe dust storms that struck during the Great Depression. Known as the Dust Bowl migration, it was the most publicized mass migration of that decade.

The drought came in three waves, in 1934, 1936 and 1939-40, but some regions of the High Plains had drought conditions for up to eight years. Though early explorers called the region the Great American Desert, the federal government encouraged settlement and development of the plains for agriculture with the Homestead Act of 1862, offering settlers 160-acre plots. With the end of the Civil War in 1865 and the completion of the First Transcontinental Railroad in 1869, waves of migrants and immigrants reached the Great Plains.

Without understanding the ecology of the plains, farmers deep-plowed the virgin topsoil, displacing the native, deep-rooted grasses that normally trapped soil and moisture even during droughts and high winds. During the droughts of the 1930s, soil deprived of the anchoring grass roots turned to choking clouds of dust that blackened the sky.

The Dust Bowl forced tens of thousands of poverty-stricken families to abandon their farms, unable to pay mortgages or grow crops. Losses reached $25 million a day by 1936.

                                                 Native Americans

Native American Forced Relocations – Although this migration does not appear in the University of Washington’s migration project, it deserves recognition as the kind where people are moved unwillingly.

At the beginning of the 1830s, nearly 125,000 Native Americans lived on millions of acres of land in Georgia, Tennessee, Alabama, North Carolina and Florida — lands their ancestors occupied and cultivated for generations. By the end of the decade, few natives remained anywhere in the southeastern United States. Working on behalf of white settlers who wanted to grow cotton on Indian land, the federal government forced Native Americans to leave their homelands and walk hundreds of miles to designated “Indian Territory” across the Mississippi River.

Relocated people suffered from exposure, disease and starvation on their way to reservations, and thousands died. The term “Trail of Tears” springs from a description of the forced removal of Native American tribes, including the Cherokee Nation relocation in 1838. The Cherokee removal was the last forced removal east of the Mississippi, resulting in an estimated 2,000 to 8,000 deaths among the 16,543 people relocated. By some estimates, about 60,000 Native Americans were forced from their homelands.

The Indian Removal Act of 1830 forced all eastern tribes to move to southwestern reservations, to land considered useless. Twenty-six tribes were removed or assigned reservations in the new territory between 1830 and 1862. The end of the Civil War allowed another surge of Anglo-American settlement in the West, forcing 28 tribes to move to Indian Territory between 1867 and 1892.

Andrew Jackson long advocated what he called “Indian removal.” As an army general, he led brutal campaigns against the Creeks in Georgia and Alabama and the Seminoles in Florida. As president, he signed the Indian Removal Act in 1830. Indian removal was Jackson’s top legislative priority upon taking office. The Choctaw became the first nation expelled from their land. The Choctaw, Chickasaw, Seminole, Creek and Cherokee were known as the “Five Civilized Tribes” for learning to speak and write English and following European-style customs such as land and property ownership.

                                               Lincoln no friend

Though Abraham Lincoln is known as “The Great Emancipator,” he was no friend to Native Americans. Beginning in 1863, the Lincoln administration oversaw the removal of 10,000 Navajos and Mescalero Apaches, forcing them to walk 450 miles to the Bosque Redondo reservation in New Mexico, where they were held as military prisoners. More than 2,000 died.

Rampant corruption in the Indian Office, later known as the Bureau of Indian Affairs, continued throughout Lincoln’s term and beyond. In many cases, government-appointed Indian agents stole resources that were supposed to go to the tribes.

Several massacres of Indians happened on Lincoln’s watch. The Dakota War in Minnesota in 1862 led to the hanging of 38 Indian men. Although 303 Indian men were sentenced to hang, Lincoln pardoned the rest. The Sand Creek Massacre in southeastern Colorado in 1864 led to the deaths of hundreds of Cheyenne and Arapaho.

A historian said Lincoln made no revolutionary change in relations with Native Americans as he did for African Americans with the Emancipation Proclamation. Though Lincoln called for reforming the Indian system in his last two annual messages to Congress, he gave no specifics and continued the policy of confining Indians to reservations as wards of the government after negotiating treaties.

Migrations often are seen as paths to a better life. In the age of the coronavirus, we are still trying to figure out what that might look like. Judging from the past, acts of kindness toward others would help. Social distancing and wearing face coverings might become icons of our generation, as well as our salvation, on our covid-19 migration to an unknown destination.

********************************************

The Ethics AdviceLine for Journalists was founded in 2001 by the Chicago Headline Club (Chicago professional chapter of the Society of Professional Journalists) and Loyola University Chicago Center for Ethics and Social Justice. It partnered with the Medill School of Journalism at Northwestern University in 2013. It is a free service.

Professional journalists are invited to contact the Ethics AdviceLine for Journalists for guidance on ethics. Call 866-DILEMMA or ethicsadvicelineforjournalists.org.

 

Stop Cringeworthy Clichés

Messiahnetwork.org image

 

By Casey Bukro

Ethics AdviceLine for Journalists

 

“Thanks for having me!”

You hear that over and over again, on radio and television. People express their gratitude for being invited to speak or appear as guests. They are trying to be polite, but they are trite. And look so proud of themselves.

Over and over again, you hear it. “Thanks for having me.” I cringe every time I hear it. And I’ve been cringing a lot lately. It’s getting on my nerves. It’s a linguistic epidemic during the covid-19 pandemic.

“Thanks for having me!” Okay, I think. You’ve been had, and you liked it. It sounds indecent. Shame on you. Not only for appearing to talk dirty, but also because you can’t think of something more original to say. Make something up, rather than repeating something you hear other people say, like sheep following sheep.

                                   Dignified expressions

How about something more dignified, like, “Thanks for inviting me.” Or, “Thanks for your invitation.” Or, “Thanks for your interest.” Or, “Glad to be here.” Or “What would you like to know?” Or, “How can I help you?” Or, “What can I do for you?” Or, “Glad to be with you.” Or, “Thanks for asking me to participate today.”

Anything, anything but that tired cliché heard dozens of times every day ad nauseam across the country because everyone else is doing it. Stop!

Why do people resort to clichés? They are the mark of lazy thinking and lazy writing. But they survive, even when their original meanings are sometimes lost or used incorrectly.

By definition, a cliché is an expression, idea or element of an artistic work that has become overused to the point of losing its original meaning or effect, even to the point of being trite or irritating (see that!), when at some earlier time it was considered meaningful or novel.

The French poet Gérard de Nerval once said: “The first man who compared a woman to a rose was a poet. The second, an imbecile.”

And that’s the point. Saying something over and over again because others are saying it doesn’t make you look smart. You look like an imbecile.

But, unfortunately, we are not all poets or original thinkers.

                                   Expressing commonalities

Scholars say people use clichés and repeat tired phrases because they are expressing a commonality, showing that they share certain values with others, even though they sound like parrots.

Psychologists say clichés serve a purpose, as stale and tiresome as they might be. They can be seen as life’s signposts.

“Clichés are not simply tired bromides,” writes Dr. Steven Mintz in Psychology Today. “They are instruments through which a ‘common-sense’ view of life is disseminated. Pithy aphorisms play a central role in the transmission of beliefs. They serve as conduits through which psychological concepts flow into the broader culture.”

Clichés shift over time, writes Dr. Mintz. Fortitude, stoicism and reticence once were regarded as admired virtues. A person facing adversity was encouraged to “suck it up” or “tough it out” in earlier times. Today, emotional expressiveness is more highly valued. We’re told to “express your anger” and “don’t hold it in.” Otherwise, we’re seen as uncommunicative and emotionally numb.

“Nurture your inner child,” we’re told. “Pursue your passion” and “never lose hope.” These are concepts of positive thinking.

“Though often misused,” writes Dr. Mintz, “clichés serve as guides to life that reflect assumptions deeply embedded in popular culture. Yet much as writers need to steer clear of clichés and invent images that are fresh and original, so, too, in our personal lives we need to break free from shopworn banalities and truisms and recognize that life does not conform to simplistic formulas.”

                                               Fresh, new clichés

And, as one of my journalism professors once said, stop using boring, old clichés. Give me some fresh, new clichés.

There is an abundance of old and tired clichés, and thinking people should avoid them. Author Robert Jay Lifton calls clichés “The language of non-thought.” It’s thought on automatic pilot.

HuffPost listed 13 clichés “you shouldn’t be caught dead using.” And they make you “unbearably boring.” Among them:

“Don’t cry over spilt milk.” It’s outdated and nonsensical. Who sheds tears over a toppled tumbler of milk?

“Selling like hotcakes.” Popular in the 19th century, they were made from cornmeal and fried in pork lard. They would be on no health-conscious shopper’s grocery list these days.

                                              History repeats itself

“Avoid like the plague.” Considered outdated and an unlikely expression only months ago. But then the coronavirus pandemic struck. This is more of a health warning now instead of a cliché, showing how history repeats itself.

“The rest is history.” A vapid way of wrapping up a well-known story, so why tell it?

“Every cloud has a silver lining.” The original source of this phrase is Milton’s “Comus,” in which the author is describing moonlight behind clouds at night, not every cloud. Aside from being trite, the cliché is incorrect.

“Beg the question.” Almost everyone uses this cliché incorrectly. It does not mean that a question needs to be asked or raised. Aristotle coined the phrase around 350 B.C. to describe a type of logical fallacy in which a statement assumes the very assertion it is trying to prove, or circular reasoning. That’s what happens when you try to simplify Aristotle.

“When it rains it pours.”  Not always. Sometimes it drizzles.

“Cat got your tongue?” A benefit of a cliché is that it communicates an idea most people can relate to. But who can relate to having their tongue stolen by a cat? It’s a bizarre way to ask somebody to speak up.

“Dressed to kill.” Defined as dressing in extravagantly fancy or stylish clothing to impress others. But it makes no sense. It does not mean dressing in a way fit to kill someone. Taken literally, it could mean wearing something that sheds blood stains. Nothing attractive about that.

                                             Lost in translation

“Spitting image.” Derives from the phrase “spit and image,” meaning you are genetically similar to your kin and look like them. But it sounds gross and looks like something got lost in the translation.

“Go climb a tree.” Meant as a mild insult or rebuke. In these contentious times, a person considering delivering an insult probably would be advised to make it soul-shattering. Or not to deliver it at all.

“Don’t judge a book by its cover.” Advice likely given by someone who doesn’t read books. A book’s cover contains a lot of useful information, like its title and the name of the author. An illustrated cover jacket can dazzle you with beckoning details.

                                                Creative writing

Writers are advised to shun clichés, but I suspect some take pride in having a vocabulary full of them, as good as any other best-selling author.

“Editors may reject creative writing on the basis of too many clichés alone,” advises be-a-better-writer.com. “Reviewers will point them out unless it’s obvious that the writer used them for comic effect, such as to define an overly earnest or boring character.”

The creative writing site adds: “If clichés are frequent and easy to spot, you’re not doing your job as a writer, and you should spend more time weeding them out.”

That’s exactly what to do with “thanks for having me.” Weed it out, mercilessly. Remove it as an irritant to our ears and our intelligence.

We are judged by our words. Use them wisely to express ourselves and our individuality. Thanks for your courteous attention.


 

 

 

Predicting a Future With Covid-19

Predicting a future with covid. Barrymoltz.com photo

By Casey Bukro

Ethics AdviceLine for Journalists

“Life as we know it” is a phrase used so blithely and innocently in the past, before the coronavirus ushered in a global pandemic that turned life as we know it into a big mystery.

How long will this deadly disease continue to stalk the world’s population? How many more cases? How many more deaths? Can it be cured or treated?  So far, there are more questions than answers.

In such uncertain times, humans respond by turning to an age-old tendency to divine the future with crystal balls, Ouija Boards, sorcerers, fortune-tellers and prophets. Today we call them predictions.

It’s always interesting to hear what people believe is in store for us. We normally get such reports at the advent of a new year, or the arrival of something totally unexpected.

One thing is certain: The disease already is changing life as we know it.

The AARP Bulletin appears to be among the first to make predictions on how life will change in the wake of this outbreak.

“Just a few months of life within the coronavirus pandemic has caused almost every business leader, researcher and planner to thoroughly rethink the future of America and how it will work for older Americans,” reports AARP, formerly known as the American Association of Retired Persons.

Americans might rethink past pleasures, like leisurely browsing in stores. Or living in a small apartment in a congested city. Or going to a ballgame with 50,000 others in the stadium. Or going to crowded restaurants. Or taking frequent vacations. Or using public transportation.

                                           Goodbye to handshakes

One epidemiologist, says AARP, predicts that handshakes will be retired, possibly for good. The report said nothing about elbow bumps. Others predict that downsizing retirees will choose less populated areas. Hyperattention to cleaning will be the new normal in aircraft, office buildings and wherever people gather.

It’s too early for a full exploration of how the pandemic will change future behavior, customs and policies. The coronavirus pandemic took the world by surprise, despite warnings from some scientists.

But this is a good time to consider whether past predictions by some of the smartest people in the world anticipated a pandemic or something like it. For that, it’s worth looking at two reports delving 50 years into the future.

“What Will the World Be Like in 50 Years? 19 Futuristic Predictions” appeared in Bustle.com in June 2014, written by Seth Millstein.

“Predicting the future is tricky business,” allowed Millstein. “And while attempting to project decades into the future is damn-near impossible, plenty of people attempt to do so on the regular regardless. They’re called futurists, and it’s their job to predict what the world will look like in hundreds of years from now and beyond.”

Many predictions are comically off-base, wrote Millstein. The New York Times in 1920 proclaimed that “a rocket will never be able to leave the Earth’s atmosphere,” while Variety insisted in 1955 that rock and roll was merely a fad, and would “be gone by June.”

                                   Predictions by leading minds

Millstein went on to list 19 predictions by some of the leading minds. Right at the top was, “disease will be more common, as everybody will be physically closer to everyone else….” Though a pandemic was not mentioned specifically, the prediction touched on the spread of disease and scored a point for the futurists.

Also touching on health, the report said going to a doctor for a checkup will not be necessary in the future. Run a scanner over your body and results will be forwarded to a health network.

Futurists commented on global warming, population growth and technological advances.

The pandemic clashes with two of the predictions: That a majority of people will live in cities and that air travel “will be exponentially more awesome.” The coronavirus already is putting a damper on those expectations as people flee crowded urban areas with high virus death rates and avoid sitting shoulder-to-shoulder on aircraft without social distancing. Disease is reversing those trends, at least for now.

All of us are racing toward what is blithely called “the new normal,” which is yet to be fully defined.

                                         Future of digital life

Another fifty-year forecast, practically on the eve of the pandemic, looked at the future of digital life.

“Fifty years after the first computer network was connected, most experts say digital life will mostly change humans’ existence for the better over the next 50 years,” wrote Kathleen Stansberry, Janna Anderson and Lee Rainie in October 2019. “However, they warn this will happen only if people embrace reforms allowing better cooperation, security, basic rights and economic fairness.”

Their report is based on work by the Pew Research Center and Elon University’s Imagining the Internet Center. They asked 530 experts how lives might be affected by the evolution of the internet over the next 50 years. They included technology pioneers, innovators, developers, business and policy leaders, researchers and activists.

Disease is not specifically mentioned, but one finding involved living longer and feeling better. “Internet-enabled technology will help people live longer and healthier lives. Scientific advances will continue to blur the line between human and machine,” said the report.

Artificial intelligence is expected to take over repetitive, unsafe and physically taxing labor, leaving humans with more time for leisure, a claim made since the beginning of the technological revolution.

                                Hopeful and worrisome visions

The report is broken down into hopeful visions and worrisome visions. Among the hopeful visions:

* Digital life will be tailored to each user.

* A fully networked world will enhance opportunities for global collaboration, cooperation and community development, unhindered by distances, language or time.

* Expanded internet access could lead to further disruption of existing social and political power structures, potentially reducing inequality and empowering individuals.

Among the worrisome visions:

* The divide between haves and have-nots will grow as a privileged few hoard the economic, health and educational benefits of digital expansion.

* A powerful elite will control the Internet and use it to monitor and manipulate, while providing entertainment that keeps the masses distracted and complacent.

* Personal privacy will be an archaic, outdated concept, as humans willingly trade discretion for improved healthcare, entertainment opportunities and promises of security.

* Digital life lays you bare. It can inspire a loss of trust, often earns too much trust and regularly requires that you take the plunge even though you have absolutely no trust.

* The future of humans is inextricably connected to the future of the natural world. Without drastic measures to reduce environmental degradation, the very existence of human life in 50 years is in question.

Some 72% of the respondents say there would be change for the better, 25% say there would be change for the worse and 3% believe there would be no significant change.

                              Updated predictions needed

The coronavirus was not yet loose in the world when this report came out. It might have changed perceptions and predictions.

Among those responding to the survey was John McNutt, a professor in the school of public policy and administration at the University of Delaware. He said:

“Not every technology is a good idea, and every advance should be carefully considered in terms of its consequence. On balance, technology has made much human progress possible. This is likely to continue. We will always have false starts and bad ideas. People will misuse technology, sometimes in horrific ways. In the end, human progress is based on creating a future underpinned by knowledge, not ignorance.”

It’s not a matter of good or bad outcomes, argues Erik Brynjolfsson, director of the MIT Initiative on the Digital Economy, but rather “how will we shape the outcome, which is currently indeterminate?”

Fiona Kerr, industry professor of neural and systems complexity at the University of Adelaide, South Australia, saw it this way:  “People love bright, shiny things. We adopt them quickly and then work out the disadvantages, slowly, often prioritizing on litigious risk. The Internet has been a wonderful summary of the best and worst of human development and adoption — making us a strange mixture of connected and disconnected, informed and funneled, engaged and isolated, as we learn to design and use multipurpose platforms shaped for an attention economy.”

Attention economy is the recognition of attention as a limited and valuable resource subject to market forces. The coronavirus captured world attention and swayed market forces.

The futurists and the experts most likely are rethinking their notions of life as we know it in the next 50 years.



Coronavirus Taking Mental Health Toll

Covid-19 taking a mental health toll. Web24.news photo.

By Casey Bukro

Ethics AdviceLine for Journalists

Writers often resort to the word “dystopian” to signify an imaginary place of misery and dread, a place beloved by horror and science-fiction movie fans.

Then along came covid-19, and the world found it had become such a place. It's not fictitious. It's real.

The toll this dreaded disease is taking on the human race is easy to measure in one way, and not so easy in another.

It’s relatively easy to count the dead, or those stricken, if reports are accurate. By about mid-May, the count by those measures was 4.8 million cases worldwide, with 319,187 deaths and 1.8 million recovered.

Past Terrors Shook The Nation

Polio patients in iron lungs. New York Daily News photo

By Casey Bukro

Ethics AdviceLine for Journalists

As uncomfortable as it is to us now, the coronavirus pandemic will interest future historians as another cataclysmic eruption distorting lives and causing death around the world.

Such events have happened before. Every generation, it seems, worries about some kind of existential threat. These are events that grab us by the throat and leave lasting impressions.

The struggle against COVID-19 is described as a war likely to last 12 to 24 months.

War, whether in medical or military terms, is a good description. One of its definitions is to “state one’s intent to suppress or eradicate.” The medical community is doing its best to suppress or eradicate the coronavirus, even as the virus tries to do the same to us biologically. It’s a war against a “novel” virus, meaning it is new and the way it acts is largely unknown.

Pandemic Ethics

A pandemic image. Allure.com photo.

By Casey Bukro

Ethics AdviceLine for Journalists

Look what happened to ethics in this time of a global viral pandemic.

It became important, a matter of life and death.

This became clear when the national demand for life-saving ventilators was greater than the supply, forcing doctors and medical technicians to decide which patients struggling to breathe get them.

Until now, this was not how most people imagined ethics worked. Mention ethics and they would think of something for ivory tower scholars to ponder, nothing that touched them personally; more a matter for study and debate. A sleepy sort of science, they thought. By definition, ethics is a system of moral principles or values, of right or good conduct.

Americans tend to have a me-first attitude. If they need something, they want it now. The coronavirus humbled those attitudes as medical ethicists stepped in to decide who gets scarce medical resources. Now people must wait their turn, if they get one at all.

Journalism of a Plague Year

Plague in Phrygia. Art Institute

By Hugh Miller

Ethics AdviceLine for Journalists

On April 3rd, Alexandria Ocasio-Cortez, a member of the U.S. House of Representatives for the 14th Congressional District of New York, wrote in a tweet: “COVID deaths are disproportionately spiking in Black + Brown communities. Why? Because the chronic toll of redlining, environmental racism, wealth gap, etc. ARE underlying health conditions. Inequality is a comorbidity.”

The following Tuesday, April 7th, Dr. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, stood at a podium at the White House and praised the “incredible courage and dignity and strength and activism” of the gay community’s response to the AIDS crisis. Fauci, much of whose career has been dedicated to battling HIV/AIDS, then drew a connection between the “extraordinary stigma” which then attached to the gay community, and a similar stigma and marginalization which, he argued, today was increasing the burden and death toll imposed on African-American COVID-19 sufferers, who make up a disproportionately high number of fatalities of the latter-day plague.

As a philosopher and ethicist, I’ve been reflecting on the role of my discipline in coming to grips with this new and sudden event since it first burst into the headlines in early March. As the novel virus grew from an outbreak to an epidemic and then to pandemic dimensions, and the gravity of the illness associated with it, COVID-19, became clearer, the ethical approach to it became less so, to me.

Muzzled Scientists, Stifled Media

Muzzled scientists, stifled media: New restrictions on speaking directly to government scientists about the coronavirus are dangerous, writes Margaret Sullivan.

“We’re now at a moment when experts must be free to share their knowledge and front-line workers must be free to tell their stories without being muzzled or threatened — and certainly without being fired,” she writes. Lives depend on it.