Artists criticize Apple’s lack of transparency around Apple Intelligence data


By the end of this year, Apple Intelligence — Cupertino's generative artificial intelligence offering, which among other things lets people create images from text prompts — will launch on millions of Apple devices. But some members of the creative community are unhappy with the lack of transparency around the raw data that powers the company's AI models.

“I wish Apple would be more transparent with the public about how they collect their training data,” Vancouver-based video game artist and creator rights advocate Jon Lam told Engadget. “I think their announcement couldn’t have come at a worse time.”

Creators have historically been among Apple's most loyal customers, and co-founder Steve Jobs famously described the company as standing at "the intersection of technology and the liberal arts." But photographers, concept artists and sculptors who spoke to Engadget said they were frustrated by Apple's relative silence about how it collects data for its AI models.

Generative AI is only as good as the data its models are trained on. To that end, most companies have hoovered up whatever they can find on the internet, consent or compensation be damned. LAION-5B, a dataset of nearly 6 billion images scraped from the web, has been used to train numerous AI models. In an interview with Forbes, Midjourney CEO David Holz said the company's models are trained on "just a big scrape of the internet" and that "there isn't really a way to get a hundred million images and know where they're coming from."

Artists, authors and musicians have accused generative AI companies of siphoning off their work for free and profiting from it, in a string of lawsuits filed in 2023 alone. Last month, major music labels including Universal and Sony sued AI music generation startups Suno and Udio, seeking hundreds of millions of dollars in damages for copyright infringement. Tech companies have — ironically — both defended their practices and struck licensing agreements with content providers, including news publishers.

Some creatives expected Apple to do better. "So I wanted to give them a little benefit of the doubt," Lam said. "I thought they would approach the ethics conversation differently."

Instead, Apple has provided little information about the source of Apple Intelligence's training data. In a post on its machine learning research blog, the company wrote that, like other generative AI companies, it uses AppleBot — its web crawler — to scrape public data from the open internet, a point its executives have also repeated on stage. John Giannandrea, Apple's head of artificial intelligence and machine learning, reportedly said that "a large amount of the training data was actually created by Apple," but did not go into specifics. Apple is also said to have signed deals with Shutterstock and Photobucket to license training images, but has not publicly acknowledged those relationships. So while Apple is cultivating a reputation for a more privacy-focused approach — processing Apple Intelligence requests on-device and through its Private Cloud Compute — the fundamentals underlying its AI models are little different from those of its competitors.

Apple did not respond to specific questions from Engadget.

In May, Andrew Leung, a Los Angeles-based artist who has worked on films such as Black Panther, The Lion King and Mulan, called generative artificial intelligence "the greatest heist in the history of human intellect" in testimony before the California State Assembly on AI's impact on the entertainment industry. "I want to point out that when they use the term 'publicly available,' it just doesn't pass muster," Leung said in an interview. "It doesn't automatically become fair use."

It's also problematic, Leung said, when companies like Apple offer people a way to opt out only after their AI models have already been trained on data they never agreed to share: "We never wanted to be a part of it in the first place." Apple does let websites opt out of being scraped by AppleBot for Apple Intelligence training — the company says it respects robots.txt, a text file any website can post to tell crawlers to stay away — but that amounts to triage at best. It's not clear when AppleBot first started crawling the web, or how anyone could have opted out before then. And technically, it remains an open question how — or whether — requests to remove data from already-trained generative models can even be honored.
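For site owners, the opt-out described above works through robots.txt directives. A minimal sketch of what such a file might look like, assuming Apple's documented crawler tokens — Applebot, and Applebot-Extended, which Apple says governs whether crawled content may be used for training its foundation models:

```text
# robots.txt served at the site root
# Block Apple's crawler from fetching pages at all
User-agent: Applebot
Disallow: /

# Or: allow normal crawling for search features, but deny
# use of crawled content for AI model training
User-agent: Applebot-Extended
Disallow: /
```

As the article notes, a rule like this only affects future crawls — it does nothing about data gathered before the site posted it.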

It's a sentiment echoed even by blogs aimed at Apple devotees. "It's disappointing to see Apple's otherwise compelling feature set (some of which I genuinely want to try) marred by practices that are no better than the rest of the industry's," wrote Federico Viticci, founder and editor-in-chief of the Apple enthusiast blog MacStories.

Adam Beane, a Los Angeles-based sculptor who created a likeness of Steve Jobs for Esquire in 2011, has used Apple products exclusively for 25 years. But he said he was disillusioned by the company's unwillingness to be open about the source of Apple Intelligence's training data.

"I'm increasingly angry with Apple," he told Engadget. "You have to be informed and savvy enough to know how to opt out of training Apple's AI, and then you have to trust a corporation to honor your wishes. Plus, I can see opting out eventually being offered as a paid option — pay more and they won't train their AI on your data."

San Francisco-based illustrator Karla Ortiz is one of the plaintiffs in a 2023 lawsuit against Stability AI, DeviantArt and Midjourney — the companies behind Stable Diffusion, the image generator DreamUp and Midjourney's image models, respectively. "Bottom line, we know [that] for generative AI to work as it currently does, [it relies] on mass exploitation and the violation of personal and intellectual property rights," she wrote in a viral X thread about Apple Intelligence. "This is true for all [generative] AI companies, and as Apple shoves this technology down our throats, it's important to remember that they are no exception."

The anger at Apple is also part of a larger sense of betrayal felt by creative professionals toward the tech companies whose tools they depend on to do their jobs. In April, a Bloomberg report revealed that Adobe — which makes Photoshop and many other programs used by artists, designers and photographers — had used images from questionable sources to train Firefly, its image generation model, which Adobe claims is "ethically" trained. And earlier this month, the company was compelled to update its terms of service, following customer outrage, to clarify that it would not use customers' content to train generative AI models. "The entire creative community has been betrayed by every software company we ever trusted," Lam said. While walking away from Apple products entirely is impossible for him, he's trying to cut back — he plans to swap his iPhone for a Light Phone III.

“I think there’s a growing sense that Apple is just like everyone else,” Beane said. “A huge corporation that prioritizes their bottom line over the lives of the people who use their products.”




