How AI Can Be an Asset, Not a Threat, to Creative Work

Jan 17, 2024

Depending on who you ask in advertising, the meteoric rise of AI either sparked a gold rush of creative inspiration or triggered an avalanche of creative obstructions… and several creative existential crises. As a member of Infillion’s Creative Studio team, I see this tension front and center in both our internal and external meetings. And because generative AI is continuing to evolve rapidly, the answers aren’t easy.

The potential of AI – in particular generative AI – to revolutionize creativity across the industry is undeniable and already in motion. AI’s integration automates mundane tasks, liberating creatives to focus on ideation and innovation instead of endlessly resizing images or adjusting character counts for an infinite number of placement sizes. At Infillion, our designers are using Adobe’s generative AI technology to craft masterpieces from lean asset packs, enabling resourceful creativity in seconds through Generative Fill in Photoshop. We can use ChatGPT as a catalyst during brainstorming sessions, as well as for help with research (as long as we fact-check). It’s inspiring to see how much it can help us.

Yet generative AI also poses many threats when used as a replacement for human creativity. There are ethical questions about intellectual property and displacement of human work, as well as questions of accuracy, and – yes – brand safety. When you put the machines in charge, they can miss the very human nuances of your brand.

So, in this new era where generative AI intersects with the business of advertising at every turn, the onus lies on us humans to wield this great power with great responsibility. Let’s look at a basic tactic that brands can employ to check themselves on their AI ethics, and get some inspiration from brands that are already doing it well.


How to build a framework for ethical AI use.

The Turing Test, developed by renowned mathematician Alan Turing in 1950, is a method used to determine a machine’s ability to generate responses indistinguishable from a human’s. Over 70 years later, now that powerful AI is an everyday reality rather than a far-off possibility, we can adapt this test to check for ethical use of generative AI.

Here are a few questions we ask ourselves at Infillion’s Creative Studio when utilizing AI in the workplace:

  • Does the creative result follow our internal best practices? 
  • Is the creative result relevant to our audience’s behavior? 
  • Is the creative result accurate? Could the creative results become inaccurate? 
  • Is the creative result ethical?

To ensure creatives use AI as a powerful tool that assists our creative endeavors, rather than stifling them, consider developing your own modified test before using generative AI to aid in your ad campaigns. 
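For teams that want to make the checklist above a formal gate rather than a mental exercise, it can be sketched as a simple review step in code. This is purely illustrative – the function and checklist names are hypothetical, not an actual Infillion tool – but it shows the core idea: an AI-assisted asset ships only if every question gets a “yes.”

```python
# A minimal sketch of an AI-output review gate based on the checklist above.
# All names here are illustrative, not part of any real product or workflow.

CHECKLIST = [
    "Does the creative result follow our internal best practices?",
    "Is the creative result relevant to our audience's behavior?",
    "Is the creative result accurate, and will it stay accurate?",
    "Is the creative result ethical?",
]

def review_ai_output(answers: dict) -> bool:
    """Approve an AI-assisted asset only if every checklist question is True."""
    missing = [q for q in CHECKLIST if q not in answers]
    if missing:
        raise ValueError(f"Unanswered checklist questions: {missing}")
    # A single "no" blocks approval.
    return all(answers[q] for q in CHECKLIST)

# Example: one "no" answer is enough to send the asset back for rework.
answers = {q: True for q in CHECKLIST}
answers[CHECKLIST[3]] = False  # ethics question answered "no"
print(review_ai_output(answers))  # False
```

The point of the sketch is the all-or-nothing design choice: a campaign asset that is on-brand and accurate but ethically questionable still fails the gate.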

Let’s take a look at a generative AI success story of the past few months that very much passed our test – and went on to engage consumers’ own creativity.

Over the past few years, Amazon has become a runaway leader in the sports broadcasting world, with its exclusive Thursday Night Football streaming rights and its AWS-powered “Next Gen Stats.”

So it’s fitting that the NFL relationship would find its way into other branches of the Amazon family tree. Last year, Amazon Web Services (AWS) and the NFL built a generative AI-based game called “Playbook Pass Rush,” which harnesses real-time NFL data and invites players to craft plays using the new stat developed by AWS – “pressure probability.”

Using data from over 90,000 plays across five years of NFL games, the game simultaneously showcases AWS’ AI capabilities and engages users in an experience that teaches them about the latest NFL stats and gives them insight into the complex machinations happening on the sidelines of each game.

The lesson here: Giving users access to generative AI in a controlled environment can help them personalize their experience with your brand.

Here’s why we like it so much: 

  • It’s a great showcase for AWS. Amazon’s B2B software and cloud computing arm might not get as much mainstream attention as its consumer retail and streaming businesses, but it’s a computing powerhouse – and its relationship with the NFL, which has near-universal brand recognition, can be an entry point for new business. Playbook Pass Rush is a fun, interactive entry point for fans to learn more about AWS’ deeper relationship with the league.

  • It helps enhance football fans’ experience. Thanks to a certain pop icon whose name we probably don’t need to mention, the NFL is getting an uptick of interest from new fans in demographics that historically haven’t followed football closely, or maybe never watched regularly beyond the Super Bowl. But football stats are complex and can seem intimidating to newcomers – especially the latest stats developed using AWS – and Playbook Pass Rush can help bring their knowledge up to speed.

  • It shows how generative AI activations can be controlled and brand-safe. The biggest brand-centric generative AI story of the past few months was the meme-fest surrounding Nicki Minaj’s album “Pink Friday 2” and its Barbie-hued “Gag City,” which got brands from Oreo to Microsoft using generative AI tools to create their own Gag City outposts. It was a hit – but a completely organic AI stunt, using enormous and sometimes unpredictable datasets, easily could’ve gone horribly wrong for brands.

    In the case of Playbook Pass Rush, the AI data in question came from a proprietary dataset – plays from relatively recent NFL games. The experience is closely guided; you can’t do just anything with it. While brands may be excited that “the sky’s the limit” when it comes to AI, setting parameters for both ethics and outcomes might bring them back down to earth… but it’s the right thing to do.

    In other words, a close human hand (or, more likely, many human hands) was guiding the creative process throughout development and deployment.


Generative AI is still new enough, especially to brands and publishers, that we’re hearing as many if not more stories about ethical and technical “fails” as we are about successes. News outlets experimenting with stories written partially or completely by AI chatbots were swiftly condemned, both for publishing factual inaccuracies as well as for not paying human journalists to do the same work. A supermarket chain’s AI-based meal planner served up a recipe for chlorine gas. And AI image creation tools’ ability to create realistic-looking but fake news photos came to light, albeit rather benignly, when people fell for a viral “photo” of the Pope.

That’s a lot for brands to navigate, especially as AI and the regulations and norms around it continue to evolve. An over-reliance on AI, after all, risks undermining the value of human creativity and input, and its inherent biases derived from existing patterns and data can inadvertently stifle creativity and diversity in ideas. 

In advertising, this over-reliance risks fostering homogeneity and monotony in design and tone across campaigns. Plus, inaccuracies and inappropriate content generated by AI can significantly compromise brand fidelity.

But as the AWS-NFL example shows, the potential to foster more rather than less creativity is thrilling. For brands that want to wield generative AI that skillfully, responsibility reigns supreme.

And it turns out that the easiest way to ensure responsibility is to return to an over-70-year-old “test” and see what we can learn from those who predicted our current moment. 


We’re always down to chat about AI ethics at Infillion. Reach out to us for a chat, or follow us on LinkedIn to learn more.
