Key learning points
- The Biden administration and companies are tackling AI misinformation with blockchain, amid growing deepfake technology that is outpacing detection efforts.
- The lack of comprehensive federal laws and the slow pace of regulatory responses highlight the urgent need for effective measures against AI-generated disinformation.
- Saudi Aramco has signed an agreement with droppGroup to build Web3 applications for Aramco employees.
As deepfake technology advances, the Biden administration and companies like the Saudi Arabian Oil Company (Saudi Aramco) are mobilizing to authenticate communications, pioneering blockchain tools to stem the tide of AI-powered disinformation. However, detection and regulation are lagging behind advances in AI and deepfake technology.
The White House is urgently working to ensure the authenticity of its statements and videos as sophisticated deepfakes undermine trust in institutions. Recently, an illegal AI-generated robocall mimicked President Biden’s voice in an attempt to discourage voting.
In response, Ben Buchanan, Biden’s AI adviser, revealed White House plans to “cryptographically authenticate” all communications using technology to certify real videos or documents. This initiative follows the explosive growth of AI, such as ChatGPT, which can efficiently create remarkably realistic fake multimedia.
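The White House has not published technical details of its authentication plan. One common building block for certifying that media has not been altered is cryptographic hashing: the publisher computes a digest of the original file and records it in a trusted registry (for example, anchored on a blockchain), and anyone can later recompute the digest to verify a copy. A minimal sketch, with a hypothetical in-memory registry standing in for the ledger:

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Return a SHA-256 digest that uniquely identifies a piece of media."""
    return hashlib.sha256(content).hexdigest()

# Hypothetical registry of digests the publisher has certified
# (in practice this could be a blockchain or a signed public ledger).
certified_hashes = {fingerprint(b"official video bytes")}

def is_authentic(content: bytes) -> bool:
    """Check a received copy of the media against the certified registry."""
    return fingerprint(content) in certified_hashes
```

Because even a single changed byte produces a completely different digest, a deepfake edit of a certified video would fail this check; real deployments would also add digital signatures so viewers can verify who published the digest.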
The push for verification stems from deep concern about public manipulation. Non-consensual deepfakes depicting celebrities and private individuals are flooding social media, often created as revenge or pornography. Multiple states have passed measures specifically banning deepfake pornography, but enforcement has been inconsistent.
Saudi Aramco partners with droppGroup
As governments catch up to deepfake threats, major companies like Saudi Aramco are moving forward with their partnership with droppGroup on Web3 and blockchain pilots to help employees.
Possible initiatives include tokenized incentive programs and blockchain-based training and onboarding. With $2 trillion in assets, Aramco’s efforts indicate growing mainstream adoption, moving blockchain from speculation to reality.
The applications will initially be aimed at Aramco employees. Planned offerings include blockchain-based onboarding and training ecosystems to ease workforce access and growth, along with tokenized incentive structures to motivate and reward staff.
Calls for deepfake regulation
Despite advances in corporate blockchain, unresolved dangers of AI, such as deepfakes, loom amid lax regulation. In addition to personal violations, deepfakes can also manipulate markets or public opinion around events such as Russia’s ongoing war in Ukraine.
The United States lacks an overarching federal law that explicitly prohibits the production or distribution of deepfakes. Under pending EU regulations, platforms would be required to label AI-generated content as synthetic. But without reliable technical markers, constantly evolving deepfakes can slip past such safeguards and continue to harm the groups they target.
Startups and tech giants like Intel are making strides in deepfake detection through AI and other analytics. However, detection technology trails the viral spread of fakes through social channels, creating a dangerous gap. As democracy and individual rights come under threat from unchecked AI risks, pressure is mounting on government and industry leaders for effective countermeasures.
Conclusion
As a global pioneer in Web3 technology with AI and machine learning, droppGroup knows how to apply Web3 tools to the AI field. Together with Saudi Aramco, droppGroup is developing several Web3 technologies that combine the benefits of Web3 and AI to support Aramco's workforce. Their goal is to build blockchain-based onboarding and training ecosystems that ease workforce access and growth.