Social media, online forums, and e-commerce platforms rely heavily on content posted by humans to attract visitors and enable participation in their sites. However, inappropriate user-generated content in the form of violent, disturbing, infringing, or fraudulent material has become a serious challenge for public safety, law enforcement, and business integrity. At the same time, it has become increasingly difficult for end users to locate the most relevant content amid the sheer volume and variety of potentially interesting material. Content moderation and curation therefore serve two key purposes: protection, ensuring compliance with site policy, local tastes and norms, or even the law; and promotion, creating an entertaining and compelling user experience through high-quality content. In this paper, we survey the governance models, processes, standards, and technologies developed and deployed within the industry. The primary challenge the industry faces today is the scalability of the governance model underlying the moderation and curation process. A symbiotic human-machine collaboration framework has emerged to address the burdensome and time-consuming nature of manual moderation and curation. We illustrate how this framework can be extended to optimize outcomes by focusing moderation and curation effort on content that has not been previously moderated or curated.