Isn't it the case that you're required to have insurance to drive a car? How is that different from requiring people to buy health insurance? I presume the answer is that auto insurance is a state matter while the Obamacare mandate would be a federal one, and that you can, in theory, choose not to drive. I get how those might work as legal responses. But I'm seeing a fair amount of talk from people who seem mostly philosophically concerned with the idea of the mandate -- not worrying about it on a strictly legal basis, but fretting about the whole notion of a government that can make you buy things. To these people, neither of those answers seems relevant: from that point of view, state versus federal is surely a moot point (if it's oppressive, it's oppressive at either level, and if not, not), and choosing not to drive, while many people do it, is still a very significant restriction on life in our society. So what would they say about the fact that we already do this?
For that matter, it seems like the government requires us to buy all sorts of things. Don't public decency laws effectively require people to buy clothes? What would you say to a nudist who objects that they own no clothes, and that by requiring them to wear some, you're forcing them to engage in commerce?
1 comment:
I suggest you read more and listen less to state propaganda.