For example, in TV shows and sitcoms like How I Met Your Mother, America seems to 'hate' Canada and vice versa. I keep hearing lines like "Be careful, Canada might invade," etc., but I don't get the history that makes it funny.

I tried looking it up, but nothing seems to answer my question :] Also, I'm not from either of those countries, so I don't have a clue 8)