The debate around the Affordable Care Act (ACA) continues, but regardless of the outcome, a shift is already underway — the latest in a long history of shifts regarding who is responsible for healthcare.
Although government and employers have assumed more responsibility in recent years, the foundation of insurance has multiple lineages. We are now entering an era in which employers and government still provide the infrastructure, but employees must carry more of the decision-making and financial responsibility — setting in motion a bold new chapter of proliferating benefits, a cost-sharing model for healthcare and a new employer-employee contract. Both parties need to take a deeper look at their roles as we enter a period of turbulent change.
To understand how we got here, let’s first look back at history. The earliest instance of insurance actually traces back to the Roman Empire when military leaders set aside money to pay for proper burial of warriors killed in battle — key to attracting loyal fighters.
In the first example of insurance in the New World, neighbors in Charleston, South Carolina, set aside money to help insure their homes against fire. While similar examples pop up throughout history, it wasn’t until 1929 at Baylor University Hospital in Dallas, Texas, that the concept of employers providing for employees really caught hold. Baylor’s teachers could not afford to go to the hospital when they were sick, so the university came up with the “Baylor Plan,” which charged teachers 50 cents a month for up to 21 days of care, should they need it.
Then World War II began, and the government put price controls on everything, including labor. While factory owners couldn’t pay workers more than the amount set by the War Labor Board, they still needed some way to attract employees, so they petitioned the board for permission to offer fringe benefits. Employers took on the cost of these benefits, and when the war ended, the idea remained and grew, inadvertently creating a social contract, which set the precedent for ‘paternalistic’ employer-sponsored healthcare.
The government has since created various laws to strengthen employee protections, including ERISA (1974), which is considered the grandfather of modern welfare benefits, COBRA (1986) to protect those recently unemployed, health savings accounts (HSAs, 2003) to help individuals cover out-of-pocket healthcare costs and the ACA (2010), arguably the most controversial of all. These safeguards were not meant to turn healthcare responsibility over to the government. Instead, they were meant to put rules and requirements on employers to live up to what they were already providing, none more so than the ACA. So where does this leave us today?
Employer-sponsored healthcare isn’t going away, but it is about to go through another cycle of — perhaps violent — change, and very few people understand how disruptive it may be. We’re entering an era in which insurance has evolved from an employer benefit with virtually no decision-making or financial involvement from employees, to a cost-sharing model where employees have a real stake in the outcome. As the employer-provided health insurance model quickly evolves, it gives rise to a host of new benefits and programs designed to complement and supplement the primary health coverage.
This transition is not going to be easy, as employees have been sheltered from these costs and decisions for so long. As these expenses have become unsustainable for the employer alone, employees who now foot more of the bill are demanding control. Their mentality is, “If I’m going to pay for this, I’m going to choose what meets my current needs.”
In this new era, employees will consider their unique situation and look across the entire benefits package to choose what works best for them. Some may value paying down student loans more than health insurance, while others prioritize pet insurance.
One problem that will emerge is that healthcare is different from other consumer products, and applying ordinary consumer behavior to healthcare can backfire. For example, when presented with a list of premiums, people tend to buy down, which is consistent with other forms of purchasing. This holds true until they need care, at which point they often buy the most expensive service because they associate cost with quality. This counterintuitive way of thinking — in which the cheapest doctor is seen as the worst doctor — drives up healthcare costs and steers employees toward more expensive care.
Without a massive shift in the way we approach healthcare decisions and a true understanding of how these decisions impact our health and financial well-being, this situation will only get worse.
(I can’t help but hear Uncle Ben’s warning to Spider-Man: “With great power comes great responsibility.”)
For the cost-sharing model to work, both parties need to understand their new roles, and they need to think about the long-term consequences of their decisions. Employers should provide the options their workers demand, but of equal importance are the tools and education to help them succeed with their newfound responsibilities. Using data to help make these increasingly complex decisions will be vital. Employees will expect more personalized benefit options, and they will only be loyal to employers who offer a broad and flexible package from which to choose. Regardless of how the ACA shakes out, the next chapter of the employee-employer contract is well underway, and it’s up to both employers and employees to make it work.
Shawn Jenkins is co-founder and CEO of Benefitfocus (NASDAQ: BNFT), a platform that supports benefits enrollment and management for more than 825 large enterprises, 55 insurance carriers and numerous consumers. The company provides employers and carriers with a SaaS technology platform that empowers people to make better health, wealth and wellness decisions.
The views expressed by contributors are their own and are not the views of The Hill.