Hello everyone,
Some time ago I was reading a post where someone mentioned travel insurance, specifically in connection with the problems Thailand was going through at the time.
So now I'm wondering, regardless of the Red Shirts, do you always get insurance when you travel? Any specific one?

I have to say I only got it once, last year during a month-long holiday in Thailand. I'm not even sure why I got it, to be honest, but it wasn't too expensive, so what the heck.
I already get worldwide coverage from my health plan, so I'm not sure why I should get an extra one...?