The question is... what is the main purpose of Hollywood? Are they just in it for the money? Or is there some kind of belief that they want to share with the world?
There's no doubt that the movie trade is a profitable one, but some people like to think that Hollywood is all part of a big plan to spread some religious or irreligious belief across the world...
I personally don't believe that Hollywood can be discussed as a single body in this sense, as it is made up of many people with various ideas and principles, and I don't think anyone can control the whole system as one!
However the industry may have started, perhaps to spread some faith of one kind or another, I think the sole purpose of most of the people in it is money... or perhaps, even more likely: fame!
A lot of people in the Hollywood trade may have been lonely in their youth, and perhaps want to be seen by the world, to be recognised as real people!
I don't want to make the whole industry sound vain, because I think there are a number of directors and others (e.g. George Lucas) who try to use it to speak to the world! But for the most part it's a very disjointed industry, and there is no single purpose!