This is by no means meant to be rude, but I need some questions answered. As a citizen of the United States, most of what I hear about Islam comes from the news, which of course tends to be skewed toward portraying Muslims as bad.
I don't want to be ignorant like so many people in my country, so here's my question. I've heard some say Islam is a religion of peace, but then I see things in the Middle East like a gay man being thrown off a building or women being controlled. I figured asking would be the best way to understand, and I hope I haven't offended anyone with this question.
Edit: If I said anything wrong, please point it out too; I'm a bit ignorant on social cues.
from Islam https://ift.tt/33FJj5x