As Americans, I think most of us welcome legal immigrants; we just do not feel that they should impose their belief systems and lifestyles on us. They need to become Americans, obey our laws, and respect our Constitution. Otherwise, they just need to go back to their homeland.
I guarantee you, if I became a citizen of a different country, I would have to obey the laws of that country, and I wouldn't even try to impose my lifestyle on its people.
Why is America's government becoming so liberal that we are bowing down to immigrants, allowing them to influence our government and show prejudice toward our religious beliefs?
Leave us alone; we are not trying to change you. That is why America has always been called the land of the free.
This irks me! What do you think about this subject?