Anti-American ideas steering my country towards anti-American values? ATTACK. Just the way it is, man.
The people in the US I have nothing against, but the US itself? For the most part, it fucking sucks. I'd much prefer my national healthcare and the other things most European countries embrace. Your country is often false and tacky (particularly your TV). Your country's foreign policy is atrocious and, to me, the most damning part.
However, recent developments like "Obamacare" are a welcome change, and gradually show how the US is maturing as a country. The next thing the US should do is stop acting like the world police.
If I were to visit, it would mostly be for the food, really. At least the food's awesome. The exchange rate on the dollar is brilliant too, which means I can get stuff MUCH cheaper from the US than by buying it here.
If they don't like this country, they can fuck off somewhere else.
So what if they don't like it? I think they have as much right to stay as you do. Kicking people out for having different views and opinions only erodes personal liberties even further.
I don't subscribe to blind patriotism. I don't have to like my country in order to stay here, especially as long as I'm not committing crimes and am contributing to society. In fact, I hate my country and most of its inhabitants, but I have friends here, a family, and an education to finish.