@indigomoth - In fact, although insurance companies often get painted as the bad guys, and sometimes they *are*, most of the time they do pay out if you're entitled to it. They have to, after all; you've signed a contract that entitles you to the money.
If you didn't pay to insure against a particular circumstance, though, they won't pay for it, and that's where people often get angry. They think they have full health coverage, but maybe they didn't read what was actually covered.
I personally think there's nothing wrong with working in insurance. It's the same as any other job: you can do good in it, or you can do harm.