MODCLUB: The Future Of User-Generated Content & A Healthy Web3 Internet
The Internet today is a giant metaverse of content, communication, and social communities, made up of emails, videos, social media, publications, news, short video clips, live streams, forums, shopping, and a host of other services. However, the Internet has grown too big and has at times become an unhealthy and dangerous place. Examples of this unhealthiness include data privacy violations, online abuse and harassment, threats, and broadly malicious behavior that can escalate into fraud, hacks, violence, and other crimes.
In response, companies like Facebook, Twitter, Instagram, Google, Tencent, and YouTube have all employed different policies and standards to regulate, oversee, and moderate this content, but none has solved the problem or reached the ideal of a truly democratic and healthy Internet.
Governments around the world have also become closely involved, constantly calling on Big Tech "to clean up the Internet." Yet the problem remains unsolved, and no good Web2 solution appears to be in sight. This is where an innovative Web3 project called MODCLUB comes into play.
What Is MODCLUB?
MODCLUB is a decentralized content moderation platform that simplifies the moderation process through online content moderators and advanced technology. MODCLUB provides a Moderation-as-a-Service (MaaS) solution to applications on the Internet Computer, including Proof of Humanity (POH) to address fake accounts. What makes this system attractive is that moderators earn rewards, while developers gain better control and visibility over how their content is vetted, at a lower cost than running an in-house moderation team.
What’s The Issue With Content Moderation?
Content moderation is the practice of monitoring, assessing, and filtering content based on a predetermined set of rules defined by the platforms and their management teams. Moderation helps maintain and enforce community guidelines and fosters a healthy online environment and culture, which is what every company strives for.
Twitter, Facebook, and Instagram of course don't want bots and bad actors degrading their apps and their reputations, so they enforce company guidelines by incorporating AI technology, employing human moderators, and relying on the community to report harmful content.
In fact, as you can see in the graphic below, Twitter employs over 1,000 staff, YouTube and Google over 10,000, and Facebook around 15,000 staff just to moderate online content.
So it makes you wonder: is it really productive, efficient, and useful to have humans surfing the web all day trying to moderate content? In reality, no. The Internet expands too quickly and is too big for people to moderate by hand; technological solutions are the only way out.
It's also not economically viable for the Big Tech companies themselves. Imagine Twitter paying these moderators $5,000 a month, or $60,000 a year; multiplied by 1,500 personnel, that is a $90 million annual cost to run this service. The equivalent figure for YouTube is $600 million, and for Facebook $900 million. That is money that could otherwise be put back into R&D on a technological solution.
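As a rough back-of-the-envelope sketch, here is how those figures work out; the salary and headcount numbers are the illustrative assumptions used above, not published company data:

```typescript
// Back-of-the-envelope moderation payroll estimate (assumed figures only).
const monthlySalaryUsd = 5_000;                // assumed pay per moderator per month
const annualSalaryUsd = monthlySalaryUsd * 12; // $60,000 per moderator per year

const headcounts = { Twitter: 1_500, YouTube: 10_000, Facebook: 15_000 };

for (const [company, staff] of Object.entries(headcounts)) {
  const annualCostUsd = staff * annualSalaryUsd;
  console.log(`${company}: ~$${(annualCostUsd / 1e6).toFixed(0)}M per year`);
}
// Twitter: ~$90M, YouTube: ~$600M, Facebook: ~$900M
```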
Web2 Content Moderation Solutions < Web3 Content Moderation Solutions
As mentioned above, Web2 tech companies can do content moderation with their own labor, or they can hire one of the many outside services that offer it. However, this approach still relies largely on AI and human labor, and remains very costly for the company buying the service.
Web3 is a potential solution for content moderation, since it can better incentivize human moderators and use smart contracts to enforce online rules.
As a solution for a new era of content moderation, one project in particular has risen to the challenge: MODCLUB. MODCLUB recognizes the limitations of AI-based content moderation and the costs of full-time human labor, while also noting that human moderators exercise better judgment than AI. Its answer is to incentivize humans to moderate content by rewarding them with the platform's MOD token.
The process is fairly simple: an application submits content to MODCLUB for review, moderators vote to approve or reject it based on the app's conditions, rewards are paid out for the work, and the final result is filtered back down to the app layer.
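As a minimal sketch of how such a flow could look from an integrating app's point of view, consider the following; the type names, quorum rule, and reward split are illustrative assumptions, not MODCLUB's actual SDK or on-chain logic:

```typescript
// Hypothetical Moderation-as-a-Service flow (names and types are assumptions).
type Verdict = "approved" | "rejected";

interface ContentSubmission {
  contentId: string;
  appId: string;
  payload: string;   // e.g. text or a URL to the media being reviewed
  minVotes: number;  // how many moderator votes the app requires
}

interface ModeratorVote {
  moderatorId: string;
  approve: boolean;
}

// Tally votes once the required quorum is reached and decide the outcome.
function resolveSubmission(sub: ContentSubmission, votes: ModeratorVote[]): Verdict | "pending" {
  if (votes.length < sub.minVotes) return "pending";
  const approvals = votes.filter((v) => v.approve).length;
  return approvals * 2 > votes.length ? "approved" : "rejected";
}

// Illustrative reward split: moderators who voted with the majority earn MOD tokens.
function rewardMajorityVoters(votes: ModeratorVote[], verdict: Verdict, rewardPerVote: number): Map<string, number> {
  const rewards = new Map<string, number>();
  for (const v of votes) {
    if (v.approve === (verdict === "approved")) rewards.set(v.moderatorId, rewardPerVote);
  }
  return rewards;
}
```

The final verdict from `resolveSubmission` is what would be filtered back down to the app layer, while the reward map represents the MOD payouts to moderators.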
But who are the content moderators? They are real people who get verified through a process called Proof of Humanity, MODCLUB's answer to the bot problem that arises on blockchains because wallet addresses are so easy to create.
What's great about Proof of Humanity is that it ensures an actual human is associated with an account: it eliminates bot accounts, creates a high security threshold, is customizable and transferable, has its verification reviewed by other real people, and integrates seamlessly with other apps.
Moderators also don't have to worry about their data or privacy, because the data is stored entirely on the Internet Computer blockchain and is deleted after a grace period, leaving no trail of the personal identification involved and keeping privacy standards high.
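To make the idea concrete, here is a small sketch of how a Proof of Humanity record with grace-period deletion might be modeled; the data fields and the 14-day window are assumptions for illustration, not MODCLUB's actual implementation:

```typescript
// Hypothetical Proof of Humanity record with grace-period deletion (illustrative only).
interface PohRecord {
  principal: string;         // the account being verified
  verifiedAt: number;        // Unix timestamp (ms) when human reviewers approved the proof
  evidenceBlob?: Uint8Array; // the submitted evidence; removed after the grace period
}

const GRACE_PERIOD_MS = 14 * 24 * 60 * 60 * 1000; // assumed 14-day retention window

// An account stays verified even after its personal evidence is purged.
function isVerified(record: PohRecord): boolean {
  return record.verifiedAt > 0;
}

// Purge personal evidence once the grace period has passed, keeping only the verification status.
function purgeExpiredEvidence(record: PohRecord, now: number): PohRecord {
  if (record.evidenceBlob && now - record.verifiedAt > GRACE_PERIOD_MS) {
    return { principal: record.principal, verifiedAt: record.verifiedAt };
  }
  return record;
}
```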
Future Of Content Moderation & MODCLUB
The industry is stuck in a dilemma: keep doing what everyone is doing now, letting privacy leak and letting people continue to harass and target others with online hate speech, or employ a decentralized solution built on real humans who are incentivized not through salaries but through consensus to create a healthy Internet environment together. The latter seems like a solution worth attempting. Let's hope Web3 can build on top of Web2 and create a better, 'nicer' Internet.