An AI Executive Becomes an AI Defender for Creatives!

By Roze

Generative AI has an ethics problem, according to Ed Newton-Rex. He should know: until recently, he was part of the rapidly expanding sector himself. A former TikTok executive, Newton-Rex held a senior role at Stability AI before resigning in November in protest of the company's approach to gathering training data.

Following his public exit, Newton-Rex dove headfirst into numerous discussions about what it actually takes to develop AI ethically. An overwhelming number of people want generative AI models that are fair to creators, he says, and giving them better decision-making tools would help.

Newton-Rex has now established a new nonprofit, Fairly Trained, to provide exactly that kind of resource. It runs a certification program that identifies AI companies that license their training data. Think of it as a "fair trade" certification mark for artificial intelligence, similar to the ones you see on coffee.

Fairly Trained's label, the Licensed Model (L) certification, requires companies to demonstrate that their training data was either in the public domain, owned by the company, or explicitly licensed for training purposes.

The nine companies certified so far include the image generator Bria AI and the music generation platform LifeScore Music. The former trains on data licensed from sources like Getty Images, while the latter licenses material from major record labels.

Several others are close to completing certification. The nonprofit charges a fee of $500 to $6,000, depending on the size of the applicant's business. OpenAI, the world's leading generative AI company, has claimed that it is impossible to build generative AI services like ChatGPT without using unlicensed data.


Newton-Rex and the pioneering businesses approved by Fairly Trained beg to differ. "We already think of it as a mandatory thing to do," Bria CEO Yair Adato says of data licensing. He compares his own company to Spotify, and AI models built on unlicensed data to Napster and the Pirate Bay.

"It's really easy for us to be compliant," says Tom Gruber, cofounder of LifeScore Music and an advisor to Fairly Trained. "The music industry is very concerned with ownership and authenticity." Companies like Universal Music Group and trade associations like the Association of Independent Music Publishers have lent their support to Newton-Rex, according to Fairly Trained.

Still, the push to change how AI companies typically gather training data is in its early stages, and Fairly Trained is a one-person operation. Not that this bothers Newton-Rex; his mentality is still very much that of a startup founder.

"I think it's important to get things out the door early," he says. Fairly Trained isn't alone, either: several other organizations are working to standardize ingredient-style labels for AI products.

Howie Singer, a former Warner Music Group executive who now studies how technology is reshaping the music industry, compares Fairly Trained to Adobe's Content Authenticity Initiative, which aims to help users determine the legitimacy of images. He believes Newton-Rex's project is on the right track.

Singer believes the ethical-data certification will appeal more to certain industry insiders than to the general public, much as some consumers seek out pasture-raised, non-GMO eggs while others simply grab the cheapest carton they can find. "Will a typical individual be concerned?" he says. "Some might, but not everybody."

Still, there may be an appetite for additional credentials among people who care about where AI models get their data. Neil Turkewitz, a copyright activist and former RIAA executive, thinks Fairly Trained is a great concept but finds its current offering inadequate.

He explains that the certification shows only that an AI company isn't relying on fair use as a justification for scraping data without permission. "It doesn't state that the company's procedures are equitable, or that its business practices adhere to creators' expectations regarding the boundaries of their permission."

Newton-Rex doesn't dispute this. "What I refrain from doing is asserting that someone is completely ethical just because they hold this certification," he says. He plans to release further certifications in the future, which might address issues like compensation. Still, he is pleased with this first endeavor: "It won't fix every problem, but I believe it can make a difference."

