Decentralized chat app OpenChat, which is built on the Internet Computer blockchain, aims to promote more virtuous discourse on social media and believes that requiring users both to prove they are human and to hold just one account could be the way forward.
To do this, the on-chain messaging app, similar to Discord and Slack, plans to test facial recognition.
“The proof of humanity is one thing, and that is relatively easy to do. What is more difficult is the proof of unique humanity. I can solve all kinds of tests for the proof of humanity, but I could do it a hundred times and get a hundred accounts,” OpenChat co-founder Matt Grogan told The Block. “This is huge for this space, proof of unique humanity,” he added, noting that preventing people from controlling multiple online accounts could limit the extent to which some users exploit token farming or airdrops by using more than one account.
OpenChat, which has more than 100,000 users, recently entered into a partnership with Modclub, a platform focused mainly on decentralized content moderation. Modclub, which also runs on the Internet Computer, will additionally test facial recognition for OpenChat as part of its push to implement a “proof of unique humanity” system, Grogan said.
“They have facial recognition… we’re going to try it out and see how it works,” he said. “It won’t be 100% perfect, but it should limit how easy it is to have multiple identities.”
However, Grogan also said that, in addition to not having worked out all the details of the trial yet, OpenChat will not require all users to verify that they are human and have only one account. He did say that proving unique humanity through facial recognition could eventually be used to determine who qualifies for future airdrops. Users could even use it to boost their own reputation on the platform, he added.
So far, unlike traditional social media platforms, which use email IDs and unique usernames, OpenChat has used crypto addresses and NFTs for authentication and monetization.
Avoiding Toxicity on Facebook and X
The anonymous use of multiple accounts and bots has long been considered a problem that not only contributes to illegal online behavior but also fuels the toxic discourse prevalent on traditional social networking platforms such as Facebook and X (formerly Twitter).
Modclub hopes to help prevent similar behavior on OpenChat. Earlier this week, the platform announced its partnership with OpenChat. “Users on the OpenChat platform will have the ability to report content that violates the rules established by the OpenChat DAO. These reports will be sent to Modclub’s pool of moderators for careful review,” Modclub said in its post. “Decisions will be made and content will be removed or left up based on the results.”
According to Grogan, moderating discourse has generally been the responsibility of the leaders of specific groups and communities within OpenChat. Those moderation tactics are intended to be guided, at least in part, by OpenChat’s platform rules, he also said.
However, with the new partnership, OpenChat will “transfer” that responsibility to Modclub, which incentivizes its moderators by paying them in crypto.
In addition to hoping to foster a reputation system that encourages virtuous discourse, OpenChat is also currently rewarding users with tokens in an effort to encourage growth and engagement.