OpenAI lays out its misinformation strategy ahead of 2024 elections


As the US prepares for the 2024 presidential election, OpenAI has shared its plans for preventing election disinformation around the world, with a focus on increasing transparency around the origin of information. One such measure is to encode the provenance of images generated by DALL-E 3 using cryptography standardized by the Coalition for Content Provenance and Authenticity (C2PA). This, combined with a provenance classifier, will allow the platform to better detect AI-generated images and help voters assess the credibility of the content they see.
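OpenAI has not shared implementation details, but the C2PA approach works by embedding a cryptographically signed "manifest" describing an image's origin inside the file itself. As a rough, hypothetical illustration (the file path below is a placeholder, and a real verifier would use a full C2PA SDK to validate the signatures rather than scanning bytes), a minimal Python sketch for spotting an embedded manifest might look like this:

from pathlib import Path

# Byte patterns that typically appear when a C2PA manifest is embedded
# in a JUMBF box inside the image file (heuristic only, not verification).
C2PA_MARKERS = (b"jumb", b"c2pa")

def looks_like_c2pa_tagged(image_path: str) -> bool:
    """Heuristic check: does the file contain bytes suggestive of a C2PA manifest?"""
    data = Path(image_path).read_bytes()
    return any(marker in data for marker in C2PA_MARKERS)

if __name__ == "__main__":
    path = "generated.png"  # placeholder path for illustration only
    if looks_like_c2pa_tagged(path):
        print(f"{path}: provenance metadata appears to be present")
    else:
        print(f"{path}: no C2PA markers found (metadata may have been stripped)")

Because this kind of metadata can be lost through screenshots or re-encoding, OpenAI is pairing it with a provenance classifier that analyzes the image content itself.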

This approach is similar to, if not more robust than, DeepMind's SynthID for digitally watermarking AI-generated images and audio, which is part of Google's own election content strategy published last month. Meta's AI image generator also adds an invisible watermark to its content, although the company has yet to share how it plans to tackle election-related disinformation.

OpenAI says it will soon work with journalists, researchers and platforms to get feedback on its provenance classifier. On a related note, ChatGPT users will start seeing real-time news from around the world, complete with attribution and links. They will also be directed to CanIVote.org, the official online resource for US voting information, when they ask procedural questions such as where or how to vote.

In addition, OpenAI reiterates its existing policies of blocking impersonation attempts in the form of deepfakes and chatbots, as well as content designed to distort the voting process or discourage people from voting. The company also bans applications built for political campaigning, and its new GPTs allow users to report potential violations.

OpenAI says the lessons from these initial measures, if they prove successful at all (and that is a very big "if"), will help it roll out similar strategies around the world. The company will have more related announcements in the coming months.



