Monday, December 29, 2014

New year's resolutions 'r' us -- Dec. 30, 2014 column

By MARSHA MERCER

There’s one -- and only one -- foolproof way to avoid the disappointment of breaking your New Year’s resolutions: Don’t make any.

Let me quickly say that I do not subscribe to this strategy of avoidance, although many people do.

I’m a serial resolution maker. I love the freedom of the fresh start. The clarity of the clean slate. The blue sky of the blank page. I may not achieve my goal, but then again, I may. What have I to lose by trying?

As the lottery people say, you can’t win if you don’t play.

The first month of the year is the great equalizer. Each of us can erase the old and have 12 months to paint our new life. OK, we also have 12 months to fail, as if it takes that long. We usually stray from the road of good intentions within a couple of weeks.

But this time could be…will be!…different. And off we ride into the clutter-free land of health, wealth and size 4.

Forty-four percent of Americans said they planned to make New Year’s resolutions for 2015, a Marist Poll reported last month. That’s about the same as a year ago, but a jump from 2004. A decade ago, only 35 percent planned to make a resolution. The poll didn’t explore why people changed.  

Fifty-nine percent of people said they kept their resolutions in 2014 – which sounds high. But it compares with the 72 percent who said they’d kept theirs a year earlier. Really? Could we be seeing a wee bit of selective memory at work? 

The great 18th century essayist and biographer Samuel Johnson said second marriages are the triumph of hope over experience. The same can be said of New Year’s resolutions. We know all too well what happened before, but…that was then.

Social scientists at the University of Pennsylvania’s Wharton School say that certain time landmarks, such as birthdays and holidays, “create discontinuities in time perceptions that make people feel disconnected from their past imperfections…and disrupt people’s focus on day-to-day minutiae, thereby promoting a big-picture view of life.”

And that encourages people to engage in “aspirational behavior,” write Hengchen Dai, Katherine L. Milkman and Jason Riis in their paper on “The Fresh Start Effect,” published online in June in the journal Management Science.

I don’t know about all that, but I do know that January gives me a sense of possibilities that isn’t there most months, although back-to-school September always calls me to a new beginning.

In any case, younger people are more likely to make resolutions than oldsters. Among those under 45, more than half said they were very likely or somewhat likely to make a resolution for 2015. Among those 45 and older, only about one in three said so.

Interestingly, men and women are almost equally likely to make resolutions -- 43 and 44 percent, respectively -- but men are slightly more likely to keep them. Or so they say.

So let’s say you make a resolution. Some behaviorists who study how people keep resolutions suggest that it’s better to think small -- resolve to lose five pounds, not 50. Write down specific goals. Tell people. Keep track. Other experts, though, say it’s best to keep your resolutions vague. Don’t monitor yourself too closely or you might get discouraged.   

In light of such contradictory advice, you could consult your Uncle Sam. Seriously.   

The website www.usa.gov has a page on Popular New Year’s Resolutions. At the top of the list is Lose Weight, Americans’ No. 1 goal. There’s also Manage Stress, Save Money, Get Fit, Drink Less Alcohol and eight others. Each of the 13 topics links to other government sites for free help.

For example, click on Quit Smoking and you go to smokefree.gov, where you can choose to write a quit plan, call a live counselor and sign up for supportive text messages and free apps.

“It doesn’t matter where you start. Just start,” smokefree.gov says in big, bold letters.

That’s not bad advice. Just start.

Happy New Year! And good luck.

© 2014 Marsha Mercer. All rights reserved.

Tuesday, December 23, 2014

What's your grade for 2014? -- Dec. 23, 2014 column

By MARSHA MERCER

To hear President Barack Obama tell it, 2014 has been a heckuva year.

“Pick any metric you want: America’s resurgence is real,” the president declared at his year-end news conference.

In case you missed it: Over the last 57 months, businesses have created nearly 11 million new jobs, most of them full-time. Wages are rising. America is the world’s largest producer of natural gas. Gas is 70 cents a gallon cheaper than last Christmas. Ten million Americans have gained health insurance this year, the president said.

“Meanwhile, around the world, America is leading,” he said. We lead the coalition to destroy ISIL, the international fight to check Russian aggression in Ukraine, the global fight against Ebola in West Africa and efforts to combat climate change. Our combat mission in Afghanistan is nearly over, he said, thanking the troops and their families.

“There is no doubt that we can enter into the New Year with renewed confidence that America is making significant strides where it counts,” Obama said.

Sounds good, but when I asked a few people to grade 2014, they looked pained.

One said “fast,” as in the year flew by. Others reluctantly gave the year a C or just shrugged. I agree that 2014 deserves a C – not the worst of years, certainly not the best – but we may be easy graders.

Several polls this month have found between 64 and 69 percent of Americans still think the country is heading in the wrong direction. Seven in 10 people say the next president should govern differently than Obama, putting Obama in George W. Bush territory near the end of Bush’s second term, the Wall Street Journal reported.

Presidential historian Robert Dallek likens what has happened to Obama to what befell President Lyndon B. Johnson in the mid-1960s.

LBJ won the 1964 presidential election by one of the largest landslides in history and pushed landmark, controversial measures through Congress, including Medicare and Medicaid, civil rights and voting rights laws. Obama won decisively in 2008 and pushed through the Affordable Care Act.

“This is not to suggest that history is repeating itself. There are too many differences between Johnson and Obama — both the men and their presidencies — to argue that,” Dallek wrote in an online essay for Reuters in October. “Yet, as Mark Twain said, history may not repeat itself, but it does rhyme.”

After both presidents achieved progressive change, they lost public support. The war in Vietnam ruined LBJ’s credibility and stopped his domestic agenda. Obama has been blocked by a combination of foreign and domestic developments. His popularity has tanked.

But historians may view Obama differently than people do today, says Dallek, one of a group of historians who have had several dinners with the president.

“It is doubtful that Obama will end up with as poor a reputation as Johnson,” the historian writes. A recent ranking of public approval of the last 10 presidents placed Johnson third from the bottom, above only Richard M. Nixon and George W. Bush.

“Historians will likely credit the Obama administration with more advances toward a more humane society,” says Dallek, citing the Affordable Care Act, equal rights for women, equal treatment for gays and lesbians and sympathetic treatment for “Dreamers,” children brought illegally to the United States by their parents.

As Obama thinks about his place in history, he is signaling that he hasn’t given up on the present. His December surprise announcement to start normalizing relations with Cuba shows he wants to remain relevant. 

He acknowledges there is plenty of work left to do and says he’s energized and excited about what he calls his presidency’s fourth quarter. He also claims he wants to work with Republicans.

But as the GOP takes control of Congress with what it believes is a mandate to roll back Obama’s policies, the president is staking out a confrontational game plan.

“I intend to continue to do what I’ve been doing,” Obama said of his use of executive actions, which many in the GOP consider unconstitutional.

Partisan battles, standoffs, vetoes and more disappointment surely lie ahead. By this time next year, 2015 may be lucky to get a C.   


© 2014 Marsha Mercer. All rights reserved.

Wednesday, December 17, 2014

Wanted: Candidates who tell us to eat our spinach -- Dec. 18, 2014 column

By MARSHA MERCER

As we dive into the 2016 presidential race, it’s worth remembering a low point in the 2012 contest -- so that maybe, just maybe, we won’t repeat it.

Cast your mind back to August 2011, when eight candidates for the Republican presidential nomination were in a debate in Ames, Iowa. A moderator asked for a show of hands: Who would walk away from a deal that cut $10 from the deficit for every $1 in tax increases?

All eight candidates raised their hands – and the crowd of GOP activists erupted in cheers and shouts of approval.  

We can agree that the way the question was asked, with a quick show of hands, hardly invited serious discussion. And, yes, primaries and caucuses bring out true believers from the party wings. General election voters are more moderate. Still, it was a defining moment: Revenues would be off the table -- again.

Former Utah Gov. Jon Huntsman later said he regretted going along with the pack but not doing so would have required a lot of explaining.

“What was going through my mind was, ‘Don’t I just want to get through this?’” Huntsman said. “That has caused me a lot of heartburn.”

Democrats as well as Republicans want to “get through this” without tackling the tough fiscal issues. But thoughtful people in both parties believe meaningful deficit reduction must include a combination of spending cuts and revenue increases. We keep putting off the hard conversations needed to find the right mix.    

Conventional wisdom says that nobody gets elected telling voters to eat their spinach. Successful candidates promise pie with meringue a mile high. Obviously, it’s more inspiring to talk about creating jobs, improving education and promoting cutting-edge technology than about slicing entitlements and raising the retirement age.

And yet, many voters do want what’s best for America. They would listen to a presidential contender who has the courage and skill to explain why it’s time for spinach.

Two veteran political pros are urging us to ask more of our presidential candidates this time around.

Retired U.S. Sens. John C. Danforth, Republican of Missouri, and J. Robert Kerrey, Democrat of Nebraska, have a message for the presidential candidates of 2016: “Don’t duck the debt.”

The debt is the cumulative amount the nation owes, and it continues to rise. This is happening even though the deficit, the difference between what the government takes in and spends in a year, has been declining since the end of the Great Recession.

Kerrey and Danforth co-chaired the 1994 Bipartisan Commission on Entitlement and Tax Reform, in which 30 of the 32 members agreed that “current trends are not sustainable” and urged policy changes: raising national savings, preventing entitlements from consuming ever more of the federal budget, dealing with projected increases in health care costs, and balancing spending and revenues for Medicare and Social Security.

The New York Times headline about that 1994 report read, “Yawns greet a warning about the burning fuse on entitlements.”

Some things have changed for the better in the last 20 years, Kerrey and Danforth agreed Tuesday as they released an update at a forum sponsored by the Bipartisan Policy Center and the Concord Coalition in Washington. We spend less than projected on interest on the debt, and health care costs have grown more slowly than projected. Overall, though, the trend still is not sustainable, they said. The burning fuse is shorter.

The burden of debt has risen from about half the gross domestic product in 1994 to three-fourths, and it is still growing. We tiptoe around the big problems -- health care costs and entitlements. Our aging population taps Social Security and Medicare more each year, taking money we could invest in domestic and international priorities to make us stronger as a nation.

To the presidential candidates in 2016, whoever they are, Kerrey and Danforth say: “Your campaign promises for reviving the economy, strengthening national defense, improving the social safety net or reducing the tax burden will ring hollow if they count on an escalating and unsustainable infusion of borrowed money.”

Like it or not, the next president will have to focus on fiscal issues. Voters should ask candidates for their solutions to the nation’s fiscal mess, and candidates should be straight with the people.

Let us eat spinach.  

© 2014 Marsha Mercer. All rights reserved.

Thursday, December 11, 2014

Here we go again: Wealthy to get even louder voice -- Dec. 11, 2014 column

By MARSHA MERCER

Americans have a hate-hate relationship with money in politics.

We hate the corrupting influence of lobbyists and big money, but most of us also hate to open our own wallets to campaigns and political parties. We take a dim view as well of public or taxpayer funding of campaigns.   

Some wish for a constitutional amendment, which is about as likely as a magic wand, to make the money in politics disappear. Instead, we cede influence to a sliver of the population that does contribute to political campaigns and committees.

Only 0.2 percent of Americans gave $200 or more in the 2014 election cycle and only 0.04 percent gave more than $2,600, according to the Center for Responsive Politics, a nonpartisan group that tracks money in politics.

“In other words, current rules limiting how much an individual can give to party committees haven’t been a pressing problem for most Americans,” the center says on its opensecrets.org website.

But the fortunate few who do feel restrained by current limits have an ally in Sen. Mitch McConnell, R-Ky. The incoming majority leader negotiated a provision in the $1.1 trillion federal spending bill (on page 1,599 of 1,603 pages) to raise the limits on contributions by wealthy people to national party committees.

In effect, the provision guts what’s left of McCain-Feingold, the bipartisan 2002 law that prohibited large contributions by the wealthy and corporations to national party committees and required disclosure of those contributions. Donors soon found they could instead give unlimited “soft” money to outside groups, much of which didn’t have to be reported.

This year, the maximum someone can give either the Democratic National Committee or Republican National Committee is $32,400. Under the new scenario, the political committees could set up three new separate accounts with separate contribution limits. Estimates vary on how much people then could give.  

In a letter to senators, six campaign finance watchdog groups said a single individual could contribute $777,600 per year or $1,555,200 per two-year election cycle. A couple could give $3,110,400 per two-year cycle, said the letter signed by Common Cause, the League of Women Voters, Public Citizen and three other groups.
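For readers who want to check the watchdogs’ math, here is a minimal back-of-the-envelope sketch in Python. It takes the letter’s $777,600-per-year figure as given; the doubling steps and the comparison to the current $32,400 cap are my own arithmetic, not figures from the letter.

```python
# Back-of-the-envelope check of the watchdog groups' estimates.
# Assumption: the $777,600-per-year ceiling quoted in the letter is correct;
# the per-cycle and per-couple totals follow by simple doubling.

PER_YEAR = 777_600   # letter's estimate: one individual, one year
OLD_CAP = 32_400     # current annual limit on gifts to one national committee

per_cycle = PER_YEAR * 2     # a federal election cycle spans two years
per_couple = per_cycle * 2   # spouses may each give the individual maximum

print(f"Per two-year cycle:    ${per_cycle:,}")   # $1,555,200, as the letter says
print(f"Per couple, per cycle: ${per_couple:,}")  # $3,110,400, as the letter says
print(f"Versus the old cap:    {PER_YEAR // OLD_CAP}x the $32,400 limit per year")
```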

“There is absolutely no justification for allowing these massive contributions that can only be given by millionaires and billionaires and are bound to result in corruption and national scandals,” the letter said. 

So where did this bailout for the parties come from?

“This provision was worked out in a bipartisan way to allow those of us who are organizing political conventions to raise the money from private sources as opposed to using taxpayer funds,” House Speaker John Boehner told reporters.

It’s not the first time Congress has cut public funding of campaigns. In April, President Barack Obama signed bipartisan legislation to eliminate taxpayer funding for national political conventions and redirect the money to childhood disease research at the National Institutes of Health.

Former Rep. Eric Cantor, R-Va., championed the measure, which was named the Gabriella Miller Kids First Research Act in remembrance of a 10-year-old girl in Leesburg, Va., who died of brain cancer.

Taxpayers have been paying part of the conventions’ costs since 1976. In 2012, taxpayers gave each party about $18 million for the events, a pittance of the total bill, most of which was paid by corporate and individual donors.

We are witnessing the last gasp of the campaign finance reform era that was born after the Watergate scandal in the mid-1970s. Although the Supreme Court has watered down laws seeking to limit contributions, the people never embraced the laws either.

Congress approved a $1 checkoff ($2 for a married couple) to help pay for presidential campaigns. Later tripled to $3 and $6, the checkoff never caught on. At its peak, in 1980, 29 percent of taxpayers used it. By 2012, only 6 percent did.

Since 1976, about $1.5 billion has gone to publicly financed candidates and nominating conventions, according to the Congressional Research Service. But presidential candidates have chosen to opt out rather than abide by the limits that accompany public funds. 

In 2008, Obama became the first president elected without accepting public funds.

We tend to act only in the wake of scandals, so it may take the next Watergate to wake us up to the need for campaign finance reform. Until then, the people with deep pockets will dig even deeper, and reap the rewards.   

© 2014 Marsha Mercer. All rights reserved.


Thursday, December 4, 2014

New racial reality is transforming America -- Dec. 4, 2014 column

By MARSHA MERCER

The current clash between congressional Republicans and President Barack Obama, while crucial for millions of undocumented immigrants, is also political theater aimed at shoring up Democratic and Republican constituencies.  

But the political debate misses the point. Powerful demographic forces that will transform the country are already in place.    

Regardless of how the latest episode of “Washington on the Brink” ends, sometime between 2040 and 2050 whites will no longer be in the majority in the United States.  That’s a monumental change from 1790 to 1980, when whites made up 80 to 90 percent of the population.

Minorities will have as profound an impact on American society in the 21st Century as baby boomers did in the 20th Century, says demographer William H. Frey, a senior fellow at the Brookings Institution in Washington. He looks at the inevitable demographic changes and challenges in his new book, “Diversity Explosion: How New Racial Demographics Are Remaking America.”

It’s not future waves of immigrants who will change the country’s racial makeup, Frey says. Most of the immigration that will shape our future occurred in the 1980s and 1990s.

The shift to a “majority-minority” population will result from births to people who are already here – mostly Hispanics and Asians – and to multiracial births, says Frey, who expects the ranks of Hispanics, Asians and the multiracial population to more than double in the next 40 years.

The country reached a milestone in 2011 when more minority babies than white babies were born. In Texas, New Mexico and California, minorities are now in the majority. Hawaii has never had a white majority, and whites are a minority in the District of Columbia.

This year, for the first time, there are more minority students in U.S. elementary and high schools than white students. Multiracial marriages are also proliferating. About one in seven new marriages is multiracial, including nearly half of those involving Hispanics or Asians.

Racial change has never been easy and older whites may fear losing their majority status, Frey says. But he believes that the shift to a majority-minority population will come “just in time,” as the country copes with a dwindling white population.

Because of low levels of white immigration, reduced fertility and aging, the white population grew a tepid 1.2 percent from 2000 to 2010, Frey says. In 2010, the median age of whites was 42, Asians 35, blacks 32 and Hispanics 27.

“Rather than being feared, America’s new diversity – poised to reinvigorate the country at a time when other developed nations are facing advanced aging and population loss – can be celebrated,” he writes.  

The solvency of Social Security and other retirement programs depends on younger workers’ energizing the economy. Our culture and politics will also change with the influx of new minority voters.

Minorities -- defined as everyone but single-race, non-Hispanic whites -- now make up about 37 percent of the population. The Census Bureau projects minorities will be 57 percent in 2060.

Latinos are the nation’s largest minority group and growing fast. Their political clout has yet to be felt nationally.

More than 25 million Hispanics are eligible voters -- that is, citizens over 18 -- up from 17.3 million in 2006, the Pew Research Center reports. More of them are native-born citizens than naturalized ones.

Hispanics are moving into areas of the country that previously had few Spanish speakers. Since 2006, the number of Hispanic eligible voters has grown fastest in South Carolina (up 126.2 percent), Tennessee (up 113.7 percent) and Alabama (up 110.5 percent), according to Pew.

So far, minorities have overwhelmingly voted Democratic for president. In 2012, Hispanics voted for Obama over Republican Mitt Romney 71 percent to 27 percent. But Hispanics haven’t turned out to vote in rates as high as blacks or whites. 

In 2012, Latinos were 17 percent of the population but only 11 percent of eligible voters because of their lower median age and lower rates of citizenship and voter registration, Frey’s analysis found.

By 2024, one in three eligible voters will be nonwhite. That is also the first year the Latino share of eligible voters is projected to surpass the black share.

We can see the future – and it looks nothing like the 1950s or even 1980. How we adjust to the new reality will determine our success or failure.

© 2014 Marsha Mercer. All rights reserved.
 


Tuesday, November 25, 2014

It's better than you think -- Nov. 27, 2014 column

By MARSHA MERCER

Nobody ever went broke telling Americans things are worse than they think.

A book about the dysfunctional Congress seems unlikely to hit the bestseller lists, but that’s what happened in 2012. The provocative title didn’t hurt.

“It’s Even Worse Than It Looks: How the American Constitutional System Collided with the New Politics of Extremism” by Thomas Mann and Norman Ornstein placed the blame for the mess in Washington squarely at Republicans’ feet. Predictably, some readers were enraged, others energized.

And the race to warn people that things are worse than they think was on.

In the last year, pundits and think tanks have declared a wide range of problems “worse than you think”: Iraq, the federal budget outlook, wealth inequality, domestic spying, light pollution, the Home Depot security breach and the health of the homeless, to name a few.

Then there’s the “it’s bad and getting worse” department. Allergies, climate change and public housing are in that category, according to news reports.

But, in the spirit of the season, let’s consider the possibility that some things may be better than we think. For example, we hear a lot about rampant consumerism this time of year. Generosity -- not so much.  

And yet, as measured by our opening our wallets to charity, generosity in the United States has rebounded since the depths of the financial crisis.

Charitable giving rose 3 percent last year, the largest year-over-year increase since the Great Recession, according to Giving USA and Indiana University Lilly Family School of Philanthropy. It was the fourth straight year of increases, mostly fueled by individuals.  

Individuals, companies, foundations and bequests gave an estimated $335.2 billion last year, nearly as much as before the economic downturn. That figure comes from a study of itemized tax returns, household surveys and other sources.

While middle- and lower-income Americans give a larger share of their income to charity than wealthier people do, the 1 percent is also engaged. Since Bill and Melinda Gates and Warren Buffett founded The Giving Pledge in 2010 to spur charitable giving by the richest of the rich, 127 billionaires in 12 countries have pledged to give the majority of their wealth to philanthropic causes or charitable organizations, either during their lifetimes or in their wills. Check out the list and pledge profiles at givingpledge.org.

Among them is Facebook founder Mark Zuckerberg, who gave the CDC Foundation, the nonprofit that supports the federal Centers for Disease Control and Prevention, $25 million last month to help fight the Ebola crisis. He has made several eight- and nine-figure donations to education, health and community development groups, The New York Times reported, making him “one of the most generous entrepreneurs of his generation.”

But you don’t have to be as rich as Zuckerberg to give.

This Tuesday, Dec. 2, all of us can go online and donate. Now in its third year, Giving Tuesday targets the Tuesday after the mega-shopping days of Black Friday and Cyber Monday as a global day to give back.

Last year, people pledged $19 million on Giving Tuesday, with an average gift of $142. That was nearly double the $10 million raised in 2012, when the average gift was $101, according to Blackbaud, a company that provides software and services to nonprofits.
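A rough inference from those Blackbaud figures, sketched below in Python: dividing each year’s total by its average gift suggests how many gifts were made. The gift counts are my own division, not numbers Blackbaud reported.

```python
# Rough inference from the Blackbaud figures quoted above.
# Dividing each year's total pledged by its average gift size suggests
# roughly how many gifts were made; these counts are estimates, not
# figures reported by Blackbaud.

total_2013, avg_2013 = 19_000_000, 142   # Giving Tuesday 2013
total_2012, avg_2012 = 10_000_000, 101   # Giving Tuesday 2012

gifts_2013 = total_2013 / avg_2013   # about 133,800 gifts
gifts_2012 = total_2012 / avg_2012   # about 99,000 gifts

print(f"2013: roughly {gifts_2013:,.0f} gifts")
print(f"2012: roughly {gifts_2012:,.0f} gifts")
print(f"Gifts grew about {(gifts_2013 / gifts_2012 - 1):.0%}")  # ~35%
```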

Giving Tuesday encourages donors to tweet their gifts and to post “unselfies,” pictures of themselves making the donation.

Ten thousand nonprofits participated last year, up from 2,500 the first year. Groups are elbowing each other for support, and that’s prompted a bit of backlash.

The head of a charity in New York recently chastised other charities, saying they should stop begging on Giving Tuesday and start giving.

David Nocenti, executive director of the Union Settlement Association, the largest social service provider in East Harlem, wrote that on Giving Tuesday he and his staff will be “walking the streets of East Harlem, giving away hundreds of single-ride transit farecards to members of our community.

“We will then ask each recipient to give in some way to at least three other people, such as visiting someone who is lonely, bringing food to someone who is hungry or offering a helping hand to someone needing assistance. We’ll also ask them to tell us how they gave and to post their charitable action online.”

By focusing on giving instead of asking for gifts, “we aim to inspire hundreds of people to join in helping others,” he wrote on philanthropy.com, the website of the Chronicle of Philanthropy.  

Generosity is back. Things are better than we thought.

© 2014 Marsha Mercer. All rights reserved.


Wednesday, November 19, 2014

Thankfully, we finally agree on Thanksgiving -- Nov. 20, 2014 column

By MARSHA MERCER
Thanksgiving, now deeply entrenched in modern American life, got off to a shaky start.

Yes, there were prayers of thanksgiving in Virginia and harvest feasting in Massachusetts in the 17th century. But the first Congress squabbled over even asking the president to issue a thanksgiving proclamation.

In September 1789, a representative from New Jersey proposed that a committee from the House and Senate visit President George Washington and ask him to recommend to the people a day giving thanks for the many favors of Almighty God, especially the “opportunity peaceably to establish a Constitution of government for their safety and happiness.”

Two representatives from South Carolina objected -- one to the “mimicking of European customs, where they made a mere mockery of thanksgivings” and the other to interfering in matters beyond the proper scope of Congress, according to an account in The Papers of George Washington at the University of Virginia.

“Why should the president direct the people to do what, perhaps, they have no mind to do?” asked Thomas Tudor Tucker of South Carolina. “They may not be inclined to return thanks for a Constitution until they have experienced that it promotes their safety and happiness.”

Besides, said Tucker, Congress had no business getting involved in religion, and, he added, “If a day of thanksgiving must take place, let it be done by the authority of the several states.”

Despite the opposition, the resolution passed, and a committee did visit Washington, who issued a proclamation naming Thursday, Nov. 26, 1789, a day to unite in “sincere and humble thanks.”

Citizens and churches took to the first Thanksgiving, but the observance wasn’t set in November. Washington later proclaimed Feb. 19, 1795, a “day of public thanksgiving and prayer.”

The second president, John Adams, issued proclamations for May 9, 1798, and April 25, 1799, but they weren’t officially for thanksgiving. We’d never recognize our feast-football-shop extravaganza in Adams’ day of “solemn humiliation, fasting and prayer.”

But when Thomas Jefferson became president, the proclamations of prayer or thanksgiving ceased. For eight years, he refused to issue any on the ground that it would have infringed on the separation of church and state.

During the War of 1812, Congress asked President James Madison to declare a day of “public humiliation and fasting and prayer to Almighty God for the safety and welfare of these States,” and he chose Jan. 12, 1815. A few months later, Madison named the second Thursday in April 1815 as a day of thanksgiving for the blessing of peace.

After that, no president until Abraham Lincoln proclaimed a national day of thanksgiving.

Jefferson Davis, president of the Confederate States, called for a day of fasting and humiliation in 1861 “in view of impending conflict,” and Lincoln proclaimed three days of thanksgiving for battle victories in 1862 and 1863.

For the national Thanksgiving holiday, we can thank Sarah Josepha Hale, an author and editor of Godey’s Lady’s Book magazine who campaigned tirelessly for it. By the 1850s, she had successfully lobbied more than 30 states and territories to put Thanksgiving on their calendars. Her goal, though, was a national holiday, which she believed would unify the country.

With the nation torn apart by Civil War, Hale wrote Lincoln on Sept. 28, 1863, asking him to use his executive authority to give Thanksgiving national recognition “to become permanently an American custom and institution.”

Days later, on Oct. 3, Lincoln signed a proclamation, actually written by Secretary of State William Seward, that the last Thursday of November would be “a day of thanksgiving and praise to our beneficent Father who dwelleth in the heavens.”

Thanksgiving became our holiday on the last Thursday of November, not by law but by tradition.  

But in 1939, when the last Thursday fell on Nov. 30, with just 24 days before Christmas, retailers begged Franklin D. Roosevelt to move Thanksgiving up a week to lengthen the Christmas shopping season.

FDR proclaimed Thanksgiving to be on Nov. 23. His edict applied only to the District of Columbia and federal workers, but angry letters poured into the White House.

Sixteen states refused to accept the change. Two Thanksgivings were celebrated until 1941, when Congress stepped in.

A representative from Michigan declared that only Congress could change the date, “not the fancy or whim of any president.”

Congress set the federal holiday as the fourth Thursday in November. It may be one of the few things for which we all can be thankful.

©2014 Marsha Mercer. All rights reserved.




Thursday, November 13, 2014

On taking Christmas off the school calendar -- Nov. 13, 2014 column

By MARSHA MERCER

A Maryland school board could hardly have angered residents more had it abolished Christmas.

The Montgomery County Board of Education outside Washington didn’t scrap Christmas, but it did vote Tuesday to eliminate any mention of Christmas and other religious holidays from next year’s official school calendar.

Schools in Maryland’s largest county will still be closed as usual around Christmas, Easter and the Jewish holidays of Rosh Hashanah and Yom Kippur, but students will be on Winter Break, Spring Break or the awkward “no school for students and teachers.”

The reaction on social media was swift and intense. Infuriating almost everybody was the board’s insistence that it was not closing the schools to observe religious holidays, which it said would be illegal, but as a practical, operational matter because of the high absenteeism that would result if school were held on those days.

The illegality argument is debatable. The state requires schools to be closed at Christmas and Easter, and the county has been closing school on Jewish holidays since the 1970s. It’s disingenuous and ridiculous to pretend schools are not closed so families can observe religious holidays. But which families and which holidays?

For years, local Muslim leaders have asked Montgomery schools to close for at least one Muslim holiday. To bolster their case, they’ve urged Muslim parents to keep their students home on Eid al-Adha, also called the Feast of Sacrifice.

Montgomery says absenteeism runs about 5 percent that day, a little higher than usual, but not high enough to justify closing the schools. Absences are excused, but Muslim families say fairness demands that Muslim holidays be recognized.

Asked again to add a Muslim holiday, the school board opted out, deciding instead to scrub all mention of religious holidays from the calendar.

“It makes no sense,” Saqib Ali, a former Maryland state legislator and co-chair of the Eid Coalition, wrote on Facebook.

“By stripping the names Christmas, Easter, Rosh Hashanah and Yom Kippur, they have alienated other communities now, and we are no closer to equality,” he told The Washington Post.  

Board members said they meant no disrespect to any religion.

“No matter how well-intentioned we are, it comes off as insensitive” to Muslim families, said Michael A. Durso, the lone vote against the calendar change, the Post reported.

“Political correctness reaches a new level of absurdity,” one parent of former county students commented on Facebook. “Next thing you know they’ll change the name of Church Street in Rockville.”

This hullabaloo didn’t have to happen. Many school districts have already quit mentioning Christmas, Easter and the Jewish holidays on their official calendars – and it hasn’t caused a fuss. Baltimore city schools have Winter Holiday and Spring Break.

Montgomery County board members cited the example of Winter Break instead of Christmas Vacation in Fairfax County, Virginia’s largest school district. Fairfax’s Spring Break is March 30 through April 3 next year, and April 6 is a Student Holiday, a.k.a. elsewhere as Easter Monday.

The diverse Fairfax district also has an online combined calendar of religious events that teachers can consult in planning lessons. For example, Sikh Martyrdom Day is Nov. 24; Bodhi Day, a Buddhist celebration, and the Feast of the Immaculate Conception in the Roman Catholic Church are Dec. 8.

A check of websites finds that other Virginia districts calling their time off Winter Break and Spring Break include Richmond, Alexandria and Lynchburg. The school calendar in Bristol, Va., highlights Christmas programs, Christmas Eve and Christmas Day but doesn’t formally name the December vacation; Easter is mentioned as part of Spring Break.

In Atlanta, the December holidays are called Semester Break.

Most people understand that schools cannot and should not favor one religion over another, and calling holidays Winter Break and Spring Break may make religious minorities feel more accepted, a worthy goal. It’s not as if anyone needs a school calendar to remind them that it’s Christmas.

At the same time, not all districts are silent about Christmas. Dothan City Schools in Alabama have Christmas Break and schools are closed for Good Friday. And some districts are putting Christmas back into December.  

In 2006, Falcon School District northeast of Colorado Springs, Colo., returned to Christmas Break after receiving a letter from a religious rights group. In Woodbury, Tenn., the Cannon County Board of Education agreed in October 2013 to again call the December days off Christmas Break. 

When the school committee in Marshfield, Mass., a coastal town near Boston, changed the December calendar to Holiday Break, residents started a petition drive, demanding that Christmas Vacation be restored. More than 2,000 names have been collected.

Conservatives often rage against a War on Christmas. That’s silly, given the country’s obsession with the holiday. But we live in an age of silly political fights. A War on Winter Break could be next.

©2014 Marsha Mercer. All rights reserved.


Thursday, November 6, 2014

White Southern Democrats face extinction -- Nov. 6, 2014 column

By MARSHA MERCER

Fifty years ago, President Lyndon Johnson signed the Civil Rights Act of 1964 at a joyous White House ceremony. That night, though, when presidential aide Bill Moyers stopped by the living quarters, he found the president melancholy.

“He looked at me morosely and said, in effect, ‘I think we just handed the South to the Republicans for the rest of my life and yours,’” Moyers recounted on PBS, adding, “And so we had.”

The 2014 midterm elections marked the demise of the white Southern Democrat. On Tuesday, voters fired the last one in the U.S. House from a state in the Deep South.

Democrats also lost Senate races in Arkansas, Georgia, North Carolina and South Carolina and nearly lost Virginia. A Dec. 6 runoff in Louisiana is a challenge for Democratic Sen. Mary Landrieu. All seven gubernatorial races in the South went to the GOP.

Republicans ran the table across the country, not just in the South, but considering that Southern Democrats once ruled Congress – 103 of 105 House members from the South were Democrats in 1950 -- their disappearance is remarkable.

Dubbed “the loneliest man in Congress,” Rep. John Barrow, D-Ga., had the distinction of being the last white Democrat in the House from the Deep South. Barrow, a conservative who had the endorsement of the National Rifle Association, had held his seat since 2004. He lost to Republican Rick Allen. And so ends an era.

In the next Congress, every one of the Democrats in the House from the Deep South states of Alabama, Georgia, Louisiana, Mississippi and South Carolina will be black.

Virginia will have two white Democratic members in the U.S. House, both from Northern Virginia, and one black House member representing a majority-black district that stretches from Richmond to Hampton Roads.

An anti-President Obama fever felled Barrow and other Democrats. Southern voters weren’t just turning the page; they were tearing it up.

Even having a distinguished political pedigree couldn’t save the Southern Democrat. Also in Georgia, Jason Carter, grandson of former President Jimmy Carter, and Michelle Nunn, daughter of former Sen. Sam Nunn, lost their bids for governor and senator, respectively.

Nunn campaigned with her dad, promising to adopt his practice of working across the political aisle to get things done.

In Arkansas, Sen. Mark Pryor, son of former Sen. David Pryor, lost his re-election bid to freshman Rep. Tom Cotton, an Iraq War veteran. Pryor has been a name in Arkansas politics since 1960 when David Pryor was first elected a state representative. He went on to be a congressman and governor before serving in the Senate from 1979 to 1997.

The South has developed a two-party system deeply divided by race. White voters form the base of the Republican Party and African Americans the base of the Democratic Party.

“The racial split remains one of the starkest divides in Georgia politics,” the Associated Press reported from early exit polls.

Republican Senate candidate David Perdue won about 70 percent of the white vote and Nunn took the overwhelming majority of the black vote, AP said. Nunn had hoped to win enough of the white vote to force Perdue into a run-off, but he won with 53 percent to her 45 percent.

Mark Pryor also won the black vote, exit polls reported, but he suffered a stinging loss to Cotton, 57 percent to 40 percent.

The Southern disaffection with Democrats is hardly new. Sen. Strom Thurmond of South Carolina switched parties and became a Republican three months after Johnson signed the Civil Rights law. Sen. Harry F. Byrd Jr. of Virginia quit the party and became an independent in 1970, and Sen. Richard Shelby of Alabama became a Republican in 1994.

Ronald Reagan courted Southern voters in 1980 and enlisted support for his legislative agenda from the Conservative Democratic Forum, known as the boll weevils, many of whom were Southerners concerned about deficit spending.

The Blue Dog Coalition of fiscally conservative House Democrats, founded by Southerners in 1995 in a last gasp to remain relevant, has been shrinking. In 2010, it had about 50 members and before the midterm was down to 19. Now it has lost its last white member from the Deep South.    

© 2014 Marsha Mercer. All rights reserved.



Tuesday, November 4, 2014

On STATELINE.ORG -- Nov. 4, 2014

Interstate Egg Fight Erupts Over Cramped Hen Cages

By MARSHA MERCER
[Photo: Chickens huddle in their cages at an egg processing plant at the Dwight Bell Farm in Atwater, California, in September 2008, shortly before Californians approved a ballot initiative prohibiting farmers from confining hens in cramped cages. Six states are challenging California’s restrictions. (AP)]
In a case that could affect farmers and consumers nationwide, six states are back in federal court to challenge a California ban on the sale of eggs from hens kept in cramped cages.

The governor of Iowa and the attorneys general of Missouri, Nebraska, Oklahoma, Alabama and Kentucky filed a notice Oct. 24 that they will appeal a U.S. district court’s dismissal of their case. They had argued that the law forces farmers in other states to make costly changes in their operations and violates the U.S. Constitution.

“We don’t want a trade war in America but we think that California is dead wrong on this,” said Iowa Gov. Terry Branstad, a Republican. Iowa is the country’s top egg-producing state.

“In Alabama, consumers are free to make their own choice of which eggs to buy at their grocery stores, and it is preposterous and quite simply wrong for California to tell Alabama how we must produce eggs,” Alabama Attorney General Luther Strange said in a statement. “If California can get away with this, it won’t be long before the environmentalists in California tell us how we must build cars, grow crops, and raise cattle, too.”

In 2008, California voters approved a ballot initiative prohibiting the state’s farmers from confining hens in a way that prevents them from turning around freely, lying down, standing up and fully extending their limbs. Two years later, California lawmakers banned the sale of eggs—from any state—that have been produced by hens in conventional or “battery” cages.

Battery cages provide each hen an average of only 67 square inches of floor space, smaller than an 8x10 sheet of paper. The 2010 law, which goes into effect Jan. 1, cites the increased risk of salmonella from birds in large flocks in confined spaces.

About 95 percent of eggs in the U.S. are produced in battery cages. Farmers brought hens inside to battery cages in the 1950s as a way to reduce disease and produce a cleaner egg than those from barnyard chickens that pecked in filth. But animal welfare advocates, including the Humane Society of the United States, which pressed for Proposition 2, have long maintained that battery cages are cruel because hens are unable to behave naturally.

The Pew Commission on Industrial Farm Animal Production’s 2008 report, “Putting Meat on the Table: Industrial Farm Animal Production in America,” recommended the phaseout within 10 years of all intensive confinement systems, including battery cages. (Pew funds Stateline.)

Language from the U.K.

The language of California’s 2008 ballot measure echoed a 1965 United Kingdom report that advocated Five Freedoms for farm animals: to turn around, lie down, stand up, stretch and groom without restriction of movement. The European Union banned battery cages in 1999 with a phaseout period of 12 years.

“What farmers and ranchers need to recognize is that consumers are demanding higher animal welfare,” said Joe Maxwell, a farmer himself, former lieutenant governor of Missouri and a vice president of the Humane Society of the United States.

Some consumers are willing to pay higher prices for such products, Maxwell said.

But Blake Hurst, a farmer and president of the Missouri Farm Bureau, said he worries about people who may care about animal welfare but can’t afford to pay a higher price for eggs.

“That’s the person who doesn’t get a voice,” he said.

Plus, said Hurst, it’s not hard to imagine other states taking protectionist steps. Missouri grows wine grapes without irrigation, for example. It might decide to prohibit imports of wine made from grapes grown with irrigation, as in California, he said.

A Level Playing Field

In passing the 2010 law, California legislators wanted to protect California egg producers from being unfairly disadvantaged by out-of-state competition. It had become clear that complying with the 2008 requirements would cost California farmers more than their out-of-state competitors, so the law was extended to cover all eggs sold in the state, including those from other places.

Now Missouri farmers, who export one-third of their eggs to California, must decide whether to invest more than $120 million in new henhouses to conform to California’s law or stop selling to the largest egg market in the country, said Missouri Attorney General Chris Koster. The states filing the lawsuit claim California violated the commerce clause of the Constitution by requiring out-of-state farmers to meet its production requirements.

U.S. District Court Judge Kimberly Mueller ruled Oct. 6 that the officials lacked legal standing to bring the lawsuit because they were suing on behalf of only the subset of farmers who do not plan to comply with California’s law, not on behalf of their states’ residents generally.

Other States Act

Meanwhile, some other states are following California’s lead. Three states—Michigan, Oregon and Washington—have passed laws mandating more space for hens, and Ohio has banned the construction of new battery cages. Lawmakers in New York and Massachusetts also have considered bills. A proposal for a national standard for laying-hen cages was dropped from the 2014 farm bill.

While the six-state appeal makes its way through the courts, egg producers around the country are scrambling to meet California’s requirements by Jan. 1. The 2010 law did not specify what size or type of cage is acceptable, which has led to confusion. Many in the industry believe the EU standard of 116 square inches per hen is about right. That provides each hen space slightly smaller than a sheet of legal paper, which is 8.5x14 inches.
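To put those floor-space figures side by side, here is a small Python sketch. The paper-size benchmarks are the ones this story cites; the percentage comparison is my own arithmetic.

```python
# Side-by-side comparison of the floor-space figures cited in this story.
# The paper sizes are only the visual benchmarks the article mentions.

battery_cage = 67        # sq in per hen in a conventional "battery" cage
eu_standard = 116        # sq in per hen under the EU enriched-colony standard

photo_sheet = 8 * 10     # 80 sq in, the 8x10 sheet cited earlier
legal_sheet = 8.5 * 14   # 119 sq in, a sheet of legal paper

print(f"Battery cage: {battery_cage} sq in (vs. {photo_sheet} sq in 8x10 sheet)")
print(f"EU standard:  {eu_standard} sq in (vs. {legal_sheet:.0f} sq in legal sheet)")
extra = eu_standard - battery_cage
print(f"The EU standard adds {extra} sq in per hen, "
      f"about {extra / battery_cage:.0%} more space")  # ~73% more
```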
“We’re in new territory,” said Dermot Hayes, an agricultural economist at Iowa State University, who estimates that 40 percent of Iowa’s laying hens will be killed by Jan. 1 to free up the extra henhouse space required for the California market.

“Egg prices will go up everywhere – California, too, for a while,” Hayes predicted. Then egg producers will build new barns, production will rise and prices will settle down.

“It’s a sea change,” said Jill Benson, a fourth-generation egg farmer in Modesto, California, who started researching cages soon after Proposition 2 passed. Her family company, JS West & Companies, became the first in the country to choose “enriched colony” cages. These are the standard in the EU and are approved by the American Humane Association, a different group from the Humane Society of the United States.

About 150,000 of Benson’s 1.8 million hens live in enriched colony, also called furnished colony, cages. One cage typically houses 60 hens, with each getting 116 square inches of space. The cages are outfitted with perches, a nesting box for laying eggs in private and space to stretch and groom.

“We have been very pleased to see they can do all those behavioral things” listed in Proposition 2, she said. “We are compliant.” To show consumers how well the hens are treated, Benson has installed six video cameras in the henhouse that provide 24-hour “Hens Live” feeds online.

Is Cage-Free Healthier?

Besides battery and enriched colony cages, some hens live in cage-free and free-range settings. Cage-free typically means hens can move around the henhouse and go outside if they wish. Free-range hens live mostly outside. The Humane Society is calling on California egg producers to go cage-free.

“What level of animal cruelty do we want to tolerate?” said Paul Shapiro, vice president for farm animal protection at the Humane Society.

So is cage-free best for hens? Again, there’s disagreement.

“Hen health is better in cages and worse in cage-free,” said Joy Mench, professor of animal science at the University of California, Davis, who has done extensive research into hen housing. Enriched colony settings offer the cage’s protection from predators and give hens more opportunity to act like hens, she said.

The mortality rates for cage-free housing are double those of conventional and enriched colony cages, in part because cage-free systems tend to house very large groups of hens, which leads to cannibalistic behavior.

“I’ve seen some really awful cage-free systems that are without the things hens need,” she said, adding that amenities like perches, foraging areas and nesting boxes may be more important to hen welfare than cage size.

As for egg safety, both sides cite academic studies about cages and salmonella. The first federal study comparing hens in three commercial housing systems—cage-free, conventional and enriched colony—found no difference in the rate of salmonella infection.

“I can’t really tell them I have a silver bullet,” said Deana Jones, a research food technologist in the Egg Safety and Quality Research Unit of the U.S. Department of Agriculture’s Agricultural Research Service, who led that study and others.