
As Generative AI Grows, Content Overload Could Lead To Data Challenges

Chris Comstock, chief growth officer, Claravine.

ChatGPT amassed more than 100 million active users within four months of its launch, and within its first month it claimed nearly half as many unique visitors as Microsoft’s Bing search engine.

Meanwhile, Snowflake recently announced a number of capabilities that bring generative AI and large language models (LLMs) directly to customers’ proprietary data through a single, secure platform. The allure of generative AI is its speedy, seemingly effortless output, which allows creative assets to be produced at scale.

So, what’s the catch?

One challenge is the bottleneck being created for the human-powered and analyzed data sets that accompany content and assets. More specifically, marketing analytics, data operations and ad ops teams are overwhelmed with an increased need to categorize and standardize associated data, which is compounded by the accelerated pace of generative AI-produced content.

Here are some of the biggest implications of generative AI’s looming content overload.

Inconsistencies and silos within the enterprise

Improved compliance and consistency are critical as AI delivers larger volumes of creative assets. In fact, the American Marketing Association reports that 90% of marketing materials are never put to use because they’re irrelevant, out of date or inaccessible.

Successful AI tools need quality data to operate effectively. Without the correct metadata or naming conventions as inputs, AI doesn’t have the proper resources to provide relevant results. Thus, irrelevant, incomplete or inconsistent assets become plentiful, and data gaps develop from content creation all the way to measurement.
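The point about metadata and naming conventions can be made concrete. A minimal sketch, in Python, of validating an asset’s metadata before it enters the content supply chain might look like the following. The field names and allowed values here are illustrative assumptions, not any real brand’s taxonomy:

```python
# Hypothetical sketch: check creative-asset metadata against a simple
# standard so incomplete or inconsistent assets are flagged at creation,
# not discovered later at measurement time.

REQUIRED_FIELDS = {"campaign", "channel", "language", "asset_type"}
ALLOWED_CHANNELS = {"social", "display", "email", "search"}

def validate_asset(metadata: dict) -> list:
    """Return a list of problems; an empty list means the asset passes."""
    problems = []
    missing = REQUIRED_FIELDS - metadata.keys()
    if missing:
        problems.append("missing fields: %s" % sorted(missing))
    channel = metadata.get("channel")
    if channel is not None and channel not in ALLOWED_CHANNELS:
        problems.append("unknown channel: %r" % channel)
    return problems

# An AI-generated asset with sloppy metadata is caught immediately.
issues = validate_asset({"campaign": "spring_launch", "channel": "Socail"})
```

Checks like this are what stop "irrelevant, incomplete or inconsistent assets" from piling up: the asset with a typoed channel and missing fields is rejected at the gate rather than silently stored.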

It is tempting to assume that a simple folder system for categorization will solve content retrieval issues, but that kind of organizational structure can become just as muddled and inconsistent. In fact, a 2020 Gartner study found that poor data quality costs organizations an average of $12.9 million annually.

And there is more to consider. AI-generated content also overloads the analytics and operations employees who shoulder the burden of sifting through large amounts of data to find a specific asset. Forrester found that teams can spend 2.4 hours per day searching for the correct data, which amounts to 30% of a person’s 40-hour week. The projected growth of AI will only increase this overload.

On the surface, more content sounds like a desirable outcome. But large brands run the risk of sharing more content without any control. As the content supply chain for a team or brand grows, it becomes harder to govern the people, teams and technologies that must work together to deliver the right customer experience.


Measurement and attribution challenges

AI-fueled content overload also has trickle-down effects on measurement and attribution, especially where metadata and taxonomies are inconsistent. When assets aren’t organized and tracked consistently, marketers can’t tell which assets in a campaign are performing, and measuring campaign results becomes guesswork.
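To illustrate why taxonomy inconsistency breaks attribution: if three teams record the same channel three different ways, per-channel performance never rolls up correctly. A minimal Python sketch, with an illustrative (assumed) alias mapping:

```python
# Hypothetical sketch: normalize inconsistent channel labels before
# aggregating spend, so attribution rolls up to real channels instead
# of fragmenting across each team's naming habits.

CHANNEL_ALIASES = {
    "fb": "social", "facebook": "social", "paid_social": "social",
    "gdn": "display", "banner": "display",
}

def normalize(rows):
    for row in rows:
        raw = row["channel"].strip().lower()
        row["channel"] = CHANNEL_ALIASES.get(raw, raw)
    return rows

def spend_by_channel(rows):
    totals = {}
    for row in rows:
        totals[row["channel"]] = totals.get(row["channel"], 0) + row["spend"]
    return totals

rows = normalize([
    {"channel": "Facebook", "spend": 100},
    {"channel": "paid_social", "spend": 50},
    {"channel": "GDN", "spend": 25},
])
```

Without the normalization step, the same report would show three fragments ("Facebook," "paid_social," "GDN") instead of two coherent channels, and any comparison between them would be meaningless.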

Since measurement and analytics teams sit at the receiving end of the content supply chain, they are often left to interpret whatever other teams or systems have provided. The goal should be to feed better inputs into AI models; otherwise, attribution and personalization become daunting as content volumes accelerate. And as more content is created with the help of AI, measuring the performance of human-created assets against computer-generated ones could surface valuable insights for optimization decisions.

With some recent announcements from Snowflake and other platforms, LLMs are becoming more accessible and commoditized. This means differentiation is going to be squarely on data access and the ability to consistently and accurately feed data to finally turn on the promise of AI for commercial use.

Yet data inconsistencies, employee burnout, and measurement and attribution challenges remain important considerations. AI-powered technology can be an incredible resource for marketing teams, but it will become ever more important to stay agile and be ready to update current practices as the AI revolution continues.

Data-Driven Thinking” is written by members of the media community and contains fresh ideas on the digital revolution in media.

