How Marketing & Communications Teams Really Feel About AI—and What to Do Next
Executive Summary
Strategic AI adoption in Marketing & Communications requires leadership-first implementation, role-specific training, and transparent governance frameworks.
Key Findings: Qualitative Insights
Each finding below summarizes the research and training recommendations from our focus group analysis
Leadership-first adoption accelerates organizational transformation
Organizations achieving measurable AI efficiency gains implemented C-suite adoption before wider deployment, with executive competency KPIs driving cultural change. By contrast, companies struggling with AI adoption report that senior leaders remain disengaged from the innovative use cases emerging among junior employees.
Why it matters
Executive fluency determines whether AI becomes a strategic capability or remains fragmented experimentation. When leaders understand AI's practical applications, they can make informed decisions about tool procurement, governance frameworks, and team restructuring. Without executive engagement, junior staff innovations remain isolated, creating perception gaps that undermine credibility and limit scaling opportunities. Leadership behavior cascades throughout organizations, setting cultural "permission" for adoption and risk-taking.
Evidence from the research
"We rolled Claude out to the senior team first. We wanted to understand how it worked before changing any processes. We've seen clear success because of this approach."
Executive
"It's the huge elephant in the room that we don't discuss. Leadership doesn't understand what we're actually doing."
Mid-level
"Partners and senior management need to be hands-on themselves; without that, it feels like rules are being written in a vacuum."
Director
Inflexible guardrails and a 'hush-hush' culture are stifling AI innovation
Organizations with overly rigid AI policies and secretive cultures drive shadow usage and stall innovation. Successful AI adoption requires "sandbox plus" frameworks that balance security requirements with creative flexibility, supported by explicit cultural permission to experiment.
Why it matters
Marketing & Communications teams work with sensitive client information while requiring creative flexibility. Overly restrictive policies force employees toward potentially unsafe workarounds, while "hush-hush" cultures prevent knowledge sharing and best practice development. This creates compliance risks, limits innovation potential, and undermines the collaborative learning necessary for effective AI integration.
Evidence from the research
"It's quite hush-hush in the office... we need everyone comfortable talking about their AI usage."
Junior level
"I choose to willfully ignore the guidelines... I've never been satisfied with the sandbox results."
Mid-level
Prompt engineering is the core training priority for mid-to-junior employees
Prompt engineering represents the fundamental skill gap limiting AI effectiveness among mid-to-junior Marketing & Communications professionals. Current training approaches overemphasize generic tutorials while underdelivering on practical, role-specific prompt development.
Why it matters
Quality AI outputs depend entirely on input quality and context-setting. Without structured prompt engineering skills, junior staff cannot leverage AI's full potential, instead producing generic content that requires extensive revision. This skills gap also perpetuates the perception that AI outputs are inherently low-quality, when poor prompting is often the root cause. For M&C teams, where brand voice and strategic messaging are critical, prompt engineering becomes essential for maintaining standards while achieving efficiency gains.
Evidence from the research
"Brief-writing and context-setting will overtake raw execution as the core junior skill."
Junior level
"What we need is hands-on workshops comparing tools, not one-hour generic tutorials."
Mid-level
"Prompt generation is the hill that I will die on."
Junior level
Clear AI policies are becoming a client requirement for agencies
Agency teams report increasing client demands for AI usage transparency, with clear agency-wide approaches to disclosure becoming critical for maintaining client trust and winning new business. Proactive transparency policies and value narratives are emerging as competitive differentiators.
Why it matters
Client relationships depend on trust and perceived value. As AI adoption becomes widespread, clients are explicitly asking about agency AI capabilities and usage in RFP processes. Agencies without clear policies risk losing business to competitors who can articulate their AI-enhanced value proposition. Furthermore, undisclosed AI usage creates relationship risks if clients discover it independently, potentially damaging long-term partnerships and industry reputation.
Evidence from the research
"Clients are explicitly asking how we deploy AI; it's becoming a selection criteria in some RFPs. We're expecting this to become par for the course."
Director
"In a sea of AI-generated grey content, strategic judgment, sector expertise and a distinctive voice will still command premiums."
Executive
"I haven't been on an account where we've admitted using AI... the next hurdle is explaining our value."
Junior level
Skills development roadmaps must be reimagined for an AI-enabled workforce
Traditional talent development models are failing to address the dual challenge of preserving critical thinking capabilities while developing AI fluency. Managing Directors worry about analytical skill erosion in junior staff, while junior employees struggle to balance AI efficiency with voice preservation and career progression clarity.
Why it matters
Career progression in Marketing & Communications has historically depended on developing writing craft, analytical capabilities, and strategic judgment through repeated practice. AI automation of routine tasks threatens to remove these learning opportunities, potentially creating a generation of professionals who can orchestrate AI but lack the foundational skills to evaluate quality or provide strategic guidance. This skills gap could undermine the industry's long-term value proposition and professional standards.
Evidence from the research
"At a senior level we're challenging AI outputs; at a junior level they're taking it as truth. I'm often not seeing a critical layer of thinking in the output I'm being sent for final review."
Director
"I do get worried... my ability to write in my unique voice is clearly diminishing. But I'm also producing a higher volume of higher quality work."
Junior level
A measurement vacuum is stalling firm-wide AI integration strategies
Despite widespread anecdotal reports of AI efficiency gains, Marketing & Communications organizations lack systematic measurement frameworks connecting AI activity to business outcomes, limiting strategic decision-making and investment justification.
Why it matters
Without measurement frameworks, organizations cannot optimize AI investments, justify expanded access and training, or demonstrate value to clients and stakeholders. Anecdotal efficiency claims lack credibility for budget planning and strategic decisions. Furthermore, the absence of quality metrics alongside efficiency metrics risks prioritizing speed over standards, potentially undermining client relationships and professional reputation.
Evidence from the research
"Everyone talks about efficiency gains, but nobody measures what happens to the saved time. Does it go to innovation? More client work? Or just more of the same?"
Executive
"We've proven AI works in isolated projects, but without metrics showing broader impact, leadership hasn't yet committed to the tool access and training we need."
Director
Key Findings: Quantitative Data
Key insights from our survey of 82 Marketing & Communications professionals across 8 global markets
How would you rate your organization's support in helping employees adopt generative AI tools?
Is AI adoption consistent across teams and seniority levels in your organization?
Has your organization provided training tailored to your role on how to best use generative AI?
Does your organization have clear policies and guidelines dictating how generative AI can be used at work?
Top 8 Strategic Actions for Marketing & Communications Leaders
Essential steps to transform fragmented AI adoption into strategic communications advantage
Prioritize Leadership-First AI Competency Development
Executive teams must develop hands-on AI fluency before setting organizational direction. This means creating structured learning pathways for C-suite leaders that emphasize practical experimentation over theoretical understanding. Leaders who understand AI capabilities through direct usage make more informed decisions about governance, investment, and cultural change. Consider executive competency milestones, peer learning forums, and visible sponsorship of AI initiatives to model the behaviors you expect throughout the organization.
Invest in Prompt Engineering as a Core Professional Capability
Prompt engineering represents the fundamental skill determining AI effectiveness across all seniority levels. Organizations need comprehensive training programs that move beyond generic tutorials to role-specific, practical applications. Focus on building prompt libraries, sharing successful templates, and creating certification pathways that recognize this as a legitimate professional competency alongside traditional communications skills.
Design Adaptive Governance Frameworks That Enable Innovation
Move beyond restrictive policies toward "sandbox plus" models that balance security requirements with innovation flexibility. Successful frameworks provide clear boundaries while encouraging experimentation, with rapid approval pathways for new tools and regular policy iterations based on frontline experience. The goal is governance that enables rather than restricts, with explicit cultural permission for controlled experimentation.
Develop Transparent Client Engagement Strategies for the AI Era
Create proactive approaches to AI disclosure that build trust and demonstrate value. This includes developing clear narratives about how AI enhances rather than replaces human expertise, establishing pricing models that reflect judgment and strategic thinking over hours worked, and training account teams to confidently discuss AI usage as a competitive strength rather than a cost-cutting measure.
Build Measurement Frameworks That Connect AI Activity to Business Value
Establish systems that move beyond anecdotal efficiency claims to demonstrate tangible business impact. Track not just time saved but where that time is reinvested, monitor quality metrics alongside efficiency gains, and create role-specific productivity indicators. Measurement should inform strategic decisions about tool procurement, training investment, and organizational restructuring.
Protect Human Capabilities While Embracing AI Efficiency
Create deliberate strategies to preserve critical thinking, writing craft, and strategic judgment even as AI automates routine tasks. This means building "human-first" moments into workflows, funding mentoring programs from efficiency gains, and developing career progression models that value both AI orchestration skills and traditional expertise. The goal is preventing skill atrophy while leveraging technological advantages.
Close the Training Gap Through Systematic Capability Building
Address the reality that over 80% of professionals lack role-specific AI training by creating comprehensive upskilling initiatives. Focus on practical, hands-on learning that connects to real project work, establish internal champions who can provide ongoing support, and develop clear competency standards that align with career progression. Training should be continuous rather than one-time, adapting as tools and capabilities evolve.
Establish Centers of Excellence for Continuous Learning and Innovation
Create dedicated resources for tool evaluation, best practice development, and capability building across the organization. These centers should curate knowledge, facilitate cross-team learning, and provide rapid support for complex challenges. Consider rotating assignments to build broad organizational capability, regular demonstration of new applications, and structured experimentation budgets to explore emerging possibilities.
Methodology
Research approach, participant demographics, and analytical framework
Research Approach
This research employed a mixed-methods approach combining an industry-wide quantitative survey with structured qualitative focus groups segmented by seniority level. Focus groups were conducted under the Chatham House Rule with full anonymization to encourage candid discussion about AI integration experiences, challenges, and organizational needs.
Focus Group Structure
Three separate focus groups were conducted by seniority level: Executive (VP/Partner level), Director/Managing Director, and Junior/Mid-level professionals. Participants represented leading strategic communications agencies and in-house teams across multiple sectors, ensuring diverse organizational perspectives on AI adoption challenges.
Survey Demographics
The quantitative survey captured responses from 82 Marketing & Communications professionals across 8 global markets, representing diverse seniority levels, organization types, and geographic regions to ensure comprehensive industry perspective on AI adoption patterns and training needs.
Analysis & Limitations
Qualitative data were analyzed using thematic analysis to identify patterns across seniority levels and organizational types. Quantitative survey data underwent statistical analysis to validate key trends. This study focused primarily on English-speaking markets and may not fully represent global perspectives or other industry sectors.