Is there any doctrine that teaches that Islam will take over the Western world, or become its majority religion? BTW, I know that Muslims once had a large empire and conquered many places, like Spain and parts of Italy, but I mean now. Is it taught in Islamic theology that the Western world (Europe, North America, Latin America, and Australia) will be dominated by Islam?

Thanks.

P.S.: I am a Roman Catholic, so please do not insult my faith, even if you disagree. Thanks.

submitted by /u/orangeblueatm
