Are Americans healthier after Obamacare?
Nobody believes this. No one. After eight years of Obamacare, it's obvious that America has kept getting sicker at least as fast as before. In Obama's last few years in office, average life expectancy even began declining for the first time in decades. Obesity is up. More and more boys are on Ritalin. People want to change their sex.
There's a very clear trend: the more we spend on health care, the sicker we get. Doctors are worthless. They don't cure any diseases; they sell rip-off drugs that manage symptoms. That alone proves all these scientists should be in jail, because symptoms are obviously the body's way of telling us something is wrong, and covering them up only causes more problems. And even if we wanted these drugs, taking them should be our decision, not a doctor's.
"Abortion should be a decision between a woman and her doctor," liberals always say. That's how brainwashed they are: they think even an abortion shouldn't be up to the woman getting one. Liberals will never admit that health care is wrong. They will always demand more money, even when our entire economy is nothing but health care because everyone weighs 400 pounds.