Is Memorial Day About Grief, Glory, or Hot Dogs?

Memorial Day is one of America’s most confusing holidays. Depending on the celebrant, it can be a day of grief, glory—or backyard barbecues.

It’s not a bad thing to have such disparate takes on a day of remembrance. And don’t worry: You’re not a bad person if you choose to sit back and enjoy your day off. But sometimes it pays to think about why we get the day off in the first place and ponder the mysterious forces that bind hot dogs, tears, and flags all together.

Decoration Day, as the holiday was once known, arose in the years after the Civil War as a way to grieve for the 750,000 soldiers who had perished over four bloody years. Families who had stifled their mourning during wartime sought public ways to pay tribute to the fallen in peacetime. Understandably, graves became a focus for the bereaved, and mourners took flowers to cemeteries to decorate them.

This practice first received semi-official sanction in 1868 when General John Alexander Logan, the head of a large fraternal organization of Union veterans, designated a day each year “for the purpose of strewing with flowers or otherwise decorating the graves of comrades who died in defense of their country during the late rebellion.” Southerners didn’t take too kindly to this initial effort, but by 1890 all the Northern states had recognized the holiday.

This emphasis on the Northern dead wasn’t just born of sectional spite. The ultimate sacrifice made by hundreds of thousands of men to preserve the Union elevated the value of the nation to its citizens. Lacking the traditional building blocks of other nations (such as centuries of shared history on the land or ancient blood ties), the U.S. had long had a difficult time forging a unifying national culture. The idealistic nature of American nationhood left people hungry for a more flesh-and-blood connection to their country.

With work, less is more

“Leisure is the new productivity.”
 
That counterintuitive slogan emerged from a panel I attended last week at the annual conference of the New America Foundation, a Washington, D.C. think tank where I am fortunate to be a fellow. The panel was anchored by Brigid Schulte, a Washington Post reporter and the author of a new book, Overwhelmed: Work, Love and Play When No One Has the Time.
 
Schulte’s focus was time and the way we spend it. She argued that Americans spend too much time working, logging more hours at the office than employees in any other developed country save Japan and South Korea. As a result, “we have a lot of unproductive, sick, unhappy, burned out, and disengaged workers,” Schulte noted. Ironically, we are less productive, creative, and innovative than we would be if we had more time off.
 
Our continual state of busyness, she explained, prevents us from entering the loose, associative mental state in which unexpected connections and aha! insights are achieved. Schulte was drawing here on the research of psychologists and neuroscientists, one of whom, Northwestern University professor Mark Beeman, was also on the panel.
 

Look Out, Borrowers: Sequestration Is Back

This post originally appeared on Forbes.

Sequestration is back. Actually, it never left, but now it’s getting worse for federal student loan borrowers.

The Department of Education announced last week that the origination fees charged on federal student loans are set to rise yet again for the 2014-15 academic year. That’s because sequestration “cuts” funding in the student loan program by increasing the origination fees that the program charges to borrowers. Those increases first went into effect in 2013, and they followed another sharp increase in the student loan origination fees that Congress enacted as part of the law that mandated sequestration: the Budget Control Act.

Although the increases to the fees are relatively small this year, there are two important points to keep in mind. First, once all of the recent changes are factored in, the overall origination fees on certain federal student loans are now noticeably higher than they were a few years ago. For Stafford loans, the fee is now set at 1.073 percent, while for PLUS loans, it totals 4.292 percent. And second, sequestration will probably drive the fees even higher in coming years.
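
To make the arithmetic concrete, here is a minimal sketch of how an origination fee reduces what a borrower actually receives at disbursement. The rates are the 2014-15 figures cited above; the loan amounts are hypothetical examples, and the program’s actual rounding rules may differ:

```python
# Origination fees are withheld from each disbursement, so the borrower
# receives less money than the principal they owe. Rates are the 2014-15
# figures cited above; the loan amounts below are hypothetical.

STAFFORD_FEE = 0.01073  # 1.073 percent
PLUS_FEE = 0.04292      # 4.292 percent

def net_disbursement(principal: float, fee_rate: float) -> float:
    """Dollars the borrower actually receives after the fee is withheld."""
    return principal * (1 - fee_rate)

print(net_disbursement(5_500, STAFFORD_FEE))  # ~$5,440.99: about $59 withheld
print(net_disbursement(10_000, PLUS_FEE))     # ~$9,570.80: about $429 withheld
```

The borrower still owes interest on the full principal, which is why even a small percentage increase in the fee quietly raises the cost of every loan.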

I’m One of the Muslims the NYPD Spied On

For many years, local and federal law enforcement agencies spied on Muslims and mosques in the U.S., hoping to find bad guys before they could commit acts of terrorism. But earlier this month, New York City Mayor Bill de Blasio announced that he was abolishing the police department’s Muslim-spying task force so that “our cops and our citizens can help one another go after the real bad guys.” Which makes sense, because in over a decade of surveillance, the NYPD failed to find even one potential terrorist.

Instead, they found people like me.

On September 26, 2007, long before the surveillance program was public knowledge, I received a phone call from a blocked number. The man on the other end identified himself as a detective from the NYPD anti-terrorism unit. He and another detective, from the FBI’s weapons of mass destruction unit, wanted to meet with me in person, and they wanted to meet within 30 minutes. It was an urgent matter.

During our very brief conversation, the NYPD detective asked me where I was. Because I was going to be with a friend of mine in Manhattan’s Koreatown, we decided to meet on a nearby midtown corner. Rushing, I arrived about 10 minutes early. During those 10 minutes, time seemed to stop altogether. Why was I so important to them? What had I done that was so wrong?

I had talked to NYPD officers before. On September 11, 2001, I saw the second plane crash into the World Trade Center with my own eyes while reporting live over the phone from the Brooklyn Heights Promenade for Kanal-D TV in Turkey. I had worked on a top floor of the South Tower as a photographer from August 2000 to May 2001. After the attacks, I spent the next few months at Ground Zero, interviewing firefighters, rescue workers, and police officers. By 2001, I had lived in New York for three years. But September 11 became a defining moment in my personal life. For the first time, I felt like a New Yorker.

Is Placelessness the Cost of American Freedom?

Forty-four years ago—well before the advent of the contemporary mobile phone, Wi-Fi, and social media technology—fabled futurist Alvin Toffler predicted a “historic decline in the significance of place to human life.” He was right, of course. And no country has proven him more right than the United States.

Let’s face it. We are a nation of commitment-phobes, always eager to liberate ourselves from life’s constraints. Unhappy with your family of origin? Form a family of choice. Has your marriage soured over the years? Find someone new! In the 21st century, even an individual’s gender at birth is seen as changeable—and when we talk about sex change, we use the language of liberation: a man trapped in a woman’s body. From where we work to where we live, Americans see change—or is it exchange?—as a birthright.

This culture of impermanence has created what sociologists call “limited liability communities.” Whenever we attach ourselves to, well, anything, we reserve the right to quit when that attachment no longer serves our purposes. That’s what freedom of choice is all about, right?

Yes, but we rarely acknowledge that the flipside of all this freedom is weak attachments to people, groups, and, particularly, to place.

Toffler’s prediction has given way to a lot of contemporary handwringing about “placelessness”—the notion that a monotonous, standardized, and homogeneous American landscape, working in concert with new technologies, has disconnected us from the uniquely rooted locales that once grounded us in community. Not long ago, the National Trust for Historic Preservation launched a national campaign called “This Place Matters” to encourage communities to rally around iconic locales. A growing number of philanthropic foundations are funding a movement called “placemaking,” which emphasizes the role specific public places can play in building strong communities with a healthy sense of belonging.

Why Do Americans Drink Half as Much Coffee Today as They Did 60 Years Ago?

We live in a golden age of coffee. Starbucks alone has ensured that you can get a well-brewed cup anywhere in America—even in truck stops, strip malls, and drive-throughs. But it’s not just Starbucks. The boom in specialty coffee is so wide and deep that Dunkin’ Donuts and McDonald’s boast of their 100-percent Arabica beans. There’s also a thriving “third-wave” coffee scene, where self-proclaimed coffee snobs pay $5 for a cup of single-origin, drip-brewed joe. So, yes, our nation is awash in terrific coffee.

Given our obsessive, even fetishistic, interest in coffee, it seems axiomatic that we are drinking more coffee than ever before. But that’s not just wrong. It’s entirely wrong. Our grandparents drank twice as much coffee as we do.

American coffee consumption peaked just after World War II. In that era, soldiers chugged it from tin cups, factory workers used it to brace for long shifts, and office break rooms were chock-a-block with coffee pots. Gum-popping waitresses refilled countless coffee cups lining the counters of all-night diners. Jukeboxes blared the Ink Spots’ melodic harmonizing, “I love the java jive and it loves me,” and Frank Sinatra singing, “They’ve got an awful lot of coffee in Brazil.”

In 1946, American coffee consumption peaked at over 46 gallons per person annually. By 1995, it was less than half that amount.

So what happened to bring an end to coffee’s heyday? The short answer is that Coke took over.

L.A.’s Rocking a New Healthcare Culture

It’s still too early to evaluate all the impacts of the Affordable Care Act (ACA) on Los Angeles and its immigrants, but public perception of and conversation about the law are already driving huge changes both in Southern California and throughout the country, said panelists at a Zócalo Public Square event, co-presented by The California Wellness Foundation, at the Goethe-Institut Tuesday night.

Responding to questions from the moderator, Kaiser Health News senior correspondent Anna Gorman, those panelists—St. John’s Well Child and Family Center CEO Jim Mangia, National Immigration Law Center health policy attorney Gabrielle Lessard, and UCLA Center for Health Policy Research health insurance studies director Shana Alex Lavarreda—described emerging evidence of what Mangia called a new “culture of coverage.”

Immigrants who are eligible for coverage and care are starting to get it. And even undocumented immigrants, who are excluded from the benefits of Obamacare, are more likely to seek treatment as a result of the publicity surrounding the law and its focus on health issues, Mangia said. Many of those now seeking coverage have avoided or been unable to afford healthcare in the past, which creates opportunities for significant gains in public health.

“We are beginning to create a culture of access,” Mangia said. “I think the increased access as a result of the ACA is going to have an impact. I do believe that we’re getting much closer” to having healthcare access for all, but “we still have a ways to go.”

At the same time, public perceptions of Obamacare also represent obstacles. Lavarreda, whose UCLA center does extensive polling of Californians on health issues, said she had encountered many people—immigrants and non-immigrants—who are given reliable information about their eligibility for coverage but simply do not believe it. “There’s this barrier of, ‘Even if it exists, it doesn’t apply to me,’” Lavarreda said.

Does America Need a Tahrir Square?

Maidan Square in Kiev. Taksim Square in Istanbul. Tahrir Square in Cairo. Recent democratic movements around the globe have risen, or crashed and burned, on the hard pavement of vast urban public squares. The media has largely focused on the role of social media technology in these movements. But too few observers have considered the significance of the empty public spaces themselves.

Comedian Jon Stewart was one who got it. He quipped that if he ever became a dictator, he’d “get rid of these [bleep]ing squares.” Why? Because “nothing good happens for dictators” in such places.

In the U.S., children are taught that the public square is essential to democracy. Here, the phrase “public square” is practically synonymous with free political speech. But these days “public square” is more likely to be a metaphor for media in all its forms than it is a reference to an actual, concrete place.

For at least a generation, urban planners and sociologists have bemoaned the decline of public space in American life. While older towns and cities, particularly in the Northeast and South, may have been built around a commons or town square, most newer cities in the West—often planned with the automobile in mind—were designed without town centers. The explicit intention of many planners was to give people their own private spaces rather than provide opportunities to come together in public.

“We stopped building public squares in the post-war years also in part because of the fear of who would use them,” says Fred Kent, president of the Project for Public Spaces in New York. “And those we do have, we don’t use very much.”

If public squares are essential to democracy, is their relative absence in modern American life bad for our democracy—or a sign that we’re not as democratic as we imagine?

Public Service Loan Forgiveness Does Not Make Your Loan Affordable

Our Chronicle of Higher Education article, which showed how Georgetown Law uses federal student loans, Income-Based Repayment (IBR), and Public Service Loan Forgiveness (PSLF) to offer its students free educations financed largely with taxpayer funds, has won third prize for a stand-alone feature in the Education Writers Association’s National Awards for Education Reporting.

The article set off a lively debate about the excesses of an unlimited forgiveness program that largely benefits graduate students. Indeed, the implications of the scheme were not lost on the Obama administration—the president’s budget released in March proposes a series of reforms to the IBR program for student loans, including a cap on how much debt can be forgiven under PSLF—a recommendation we made in the article.

The proposed cap on PSLF would have little to no bearing on a borrower’s ability to take a low-paying job in the public sector.

Million Records Project Raises as Many Questions as Answers

Last month, the Student Veterans of America (SVA), together with the Department of Veterans Affairs and the National Student Clearinghouse, released a host of new data on veterans’ higher education outcomes. The Million Records Project report, published by SVA, made use of previously unavailable data to show that more than half of veterans in a large sample had graduated within the 10-year period. (For the full summary, check out our earlier write-up here.)

But for all the questions the SVA report answers, it raises at least as many. There are a number of blind spots in the report about veterans’ paths through college and their eventual outcomes. Future iterations of the Million Records Project may—and should—endeavor to answer some of these questions; without the answers, institutions of higher education, nonprofit groups, and veterans’ organizations will remain in the dark about how well higher education is serving our nation’s veterans.

Are veterans graduating?

The SVA report says 51.7 percent of the veterans in its sample attained a degree or credential. But that figure isn’t comparable to other graduation rates, like those calculated by the Department of Education. That’s because instead of following a single entering cohort of students for a fixed number of years, the report counts how many of the total sample—everyone who first used Montgomery or Post-9/11 GI Bill benefits between 2002 and 2010, all cohorts pooled together—had graduated by 2010.
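
To see why those two kinds of rates aren’t comparable, consider a minimal sketch with made-up records (the numbers are hypothetical, not drawn from the SVA data): a veteran who started in 2009 has had barely a year to finish by 2010, yet counts in the pooled denominator just like one who started in 2002.

```python
# Hypothetical records: (year the veteran first used GI Bill benefits,
# whether they had graduated by 2010). Illustrative only.
records = [(2002, True), (2003, True), (2004, False), (2006, True),
           (2008, False), (2009, False), (2010, False)]

# Pooled-sample rate (roughly the SVA approach): every entrant from
# 2002-2010 is in the denominator, however little time they have had.
pooled_rate = sum(grad for _, grad in records) / len(records)

# Cohort rate (roughly the Department of Education approach): follow one
# entering class for a fixed window, e.g. the 2002 cohort through 2010.
cohort = [grad for year, grad in records if year == 2002]
cohort_rate = sum(cohort) / len(cohort)

print(f"pooled: {pooled_rate:.0%}  2002 cohort: {cohort_rate:.0%}")
# Recent entrants pull the pooled rate down, so the two figures measure
# different things and can't be compared directly.
```

In this toy example the pooled rate is 43 percent while the 2002 cohort’s rate is 100 percent; neither is “wrong,” but they answer different questions, which is exactly the caveat that applies to the 51.7 percent figure.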