How Marketing & Communications Teams Really Feel About AI—and What to Do Next

Author: fourfold | Date: September 2025 | Research Period: June–September 2025

Executive Summary

Strategic AI adoption in Marketing & Communications requires leadership-first implementation, role-specific training, and transparent governance frameworks.

Marketing & Communications teams are navigating a fundamental shift as generative AI moves from experimental tool to operational necessity. Our research, conducted across eight global markets with 82 survey respondents and nine focus group participants from June to September 2025, reveals a sector where individual productivity gains are real but organizational readiness remains fragmented.
The data shows that while individual employees report significant efficiency gains, company-wide adoption faces critical barriers. Nearly half (46.3%) of professionals rate their organization's AI support as poor or nonexistent, while 70.7% report inconsistent adoption across teams and seniority levels. Most tellingly, 63.4% have received no role-specific AI training despite daily tool usage.
Several key findings emerge from this research.
First, successful AI integration follows a leadership-first pattern. Organizations where executives developed hands-on fluency before wider deployment were more certain of AI's long-term value and clearer about which areas of their business operations offer the highest potential efficiency gains from AI integration.
Conversely, a lack of leadership advocacy combined with restrictive governance policies is creating "hush-hush" cultures that drive shadow IT usage, i.e. employees turning to unsanctioned AI models for more innovative use cases. This undermines both security and innovation.
The agency landscape faces unique pressures. Client demands for AI transparency are becoming explicit RFP requirements, forcing agencies to articulate value beyond hourly billing. Traditional talent development models are also breaking down. While prompt engineering has emerged as a priority skill gap for junior-to-mid level employees, in a work environment where AI automates entry-level tasks, senior leaders worry that recent graduates will not develop foundational skills like critical thinking and writing.
Measurement is a further barrier to teams seeing immediate value from AI adoption: despite widespread efficiency claims, organizations lack measurement frameworks that connect AI activity to business outcomes, limiting long-term strategic decision-making.
The path forward demands three critical actions: executive-led adoption with structured learning pathways, governance frameworks that enable rather than restrict innovation, and role-specific training. Marketing & Communications leaders who act now will transform AI from productivity tool to competitive advantage.

Key Findings: Qualitative Insights


Leadership-first adoption accelerates organizational transformation

Organizations achieving measurable AI efficiency gains implemented C-suite adoption before wider deployment, with executive competency KPIs driving cultural change. Companies struggling with AI adoption report senior leaders remaining disengaged from innovative use cases emerging from junior employees.

Why it matters

Executive fluency determines whether AI becomes a strategic capability or remains fragmented experimentation. When leaders understand AI's practical applications, they can make informed decisions about tool procurement, governance frameworks, and team restructuring. Without executive engagement, junior staff innovations remain isolated, creating perception gaps that undermine credibility and limit scaling opportunities. Leadership behavior cascades throughout organizations, setting cultural "permission" for adoption and risk-taking.

Evidence from the research

"We rolled Claude out to the senior team first. We wanted to understand how it worked before changing any processes. We've seen clear success because of this approach."

Executive

"It's the huge elephant in the room that we don't discuss. Leadership doesn't understand what we're actually doing."

Mid-level

"Partners and senior management need to be hands-on themselves; without that, it feels like rules are being written in a vacuum."

Director

70.7% of employees report inconsistent AI adoption across seniority levels

Implications for teams

People
Executive teams need practical AI fluency before setting organizational direction. Senior leaders must model usage behaviors and create explicit cultural permission for experimentation.
Process
Governance frameworks require input from leaders who understand AI capabilities firsthand. Policy development should follow, not precede, executive competency development.
Culture
Organizations need visible executive sponsorship and time-boxing for AI learning to legitimize experimentation and reduce perception gaps between leadership and frontline staff.

Inflexible guardrails and a 'hush-hush' culture are stifling AI innovation

Organizations with overly rigid AI policies and secretive cultures drive shadow usage and stall innovation. Successful AI adoption requires "sandbox plus" frameworks that balance security governance with flexibility, supported by explicit cultural permission to experiment.

Why it matters

Marketing & Communications teams work with sensitive client information while requiring creative flexibility. Overly restrictive policies force employees toward potentially unsafe workarounds, while "hush-hush" cultures prevent knowledge sharing and best practice development. This creates compliance risks, limits innovation potential, and undermines the collaborative learning necessary for effective AI integration.

Evidence from the research

"Partners and senior management need to be hands-on themselves; without that, it feels like rules are being written in a vacuum."

Director

"It's quite hush-hush in the office... we need everyone comfortable talking about their AI usage."

Junior level

"I choose to willfully ignore the guidelines... I've never been satisfied with the sandbox results."

Mid-level

Implications for teams

People
Leaders must model transparent AI usage and create explicit permission for guided experimentation with non-sensitive information.
Process
Governance frameworks need iterative design with frontline input, balancing security requirements with operational flexibility.
Culture
Organizations require open discussion forums, usage sharing sessions, and clear escalation paths for policy questions.

Prompt engineering is the core training priority for mid-to-junior employees

Prompt engineering represents the fundamental skill gap limiting AI effectiveness among mid-to-junior Marketing & Communications professionals. Current training approaches overemphasize generic tutorials while underdelivering on practical, role-specific prompt development.

Why it matters

Quality AI outputs depend entirely on input quality and context-setting. Without structured prompt engineering skills, junior staff cannot leverage AI's full potential, instead producing generic content that requires extensive revision. This skills gap also perpetuates the perception that AI outputs are inherently low-quality, when poor prompting is often the root cause. For M&C teams, where brand voice and strategic messaging are critical, prompt engineering becomes essential for maintaining standards while achieving efficiency gains.

Evidence from the research

"Brief-writing and context-setting will overtake raw execution as the core junior skill."

Junior level

"What we need is hands-on workshops comparing tools, not one-hour generic tutorials."

Mid-level

"Prompt generation is the hill that I will die on."

Junior level

63.4% of employees have not received role-specific AI training

Implications for teams

People
Junior roles must evolve from execution-focused to orchestration-focused, with prompt engineering as a core competency alongside traditional writing and research skills.
Process
Workflow design should incorporate prompt libraries, template sharing, and structured review processes that capture effective prompt patterns.
Culture
Teams need regular prompt-sharing sessions, cross-tool comparisons, and scenario-based learning that connects prompting to real project outcomes.

Clear AI policies are becoming a client requirement for agencies

Agency teams report increasing client demands for AI usage transparency, with clear agency-wide approaches to disclosure becoming critical for maintaining client trust and winning new business. Proactive transparency policies and value narratives are emerging as competitive differentiators.

Why it matters

Client relationships depend on trust and perceived value. As AI adoption becomes widespread, clients are explicitly asking about agency AI capabilities and usage in RFP processes. Agencies without clear policies risk losing business to competitors who can articulate their AI-enhanced value proposition. Furthermore, undisclosed AI usage creates relationship risks if clients discover it independently, potentially damaging long-term partnerships and industry reputation.

Evidence from the research

"Clients are explicitly asking how we deploy AI; it's becoming a selection criteria in some RFPs. We're expecting this to become par for the course."

Director

"In a sea of AI-generated grey content, strategic judgment, sector expertise and a distinctive voice will still command premiums."

Executive

"I haven't been on an account where we've admitted using AI... the next hurdle is explaining our value."

Junior level

Implications for teams

People
Account teams need training on AI value articulation and client education about proprietary AI applications.
Process
Agencies require standardized disclosure frameworks, client education materials, and pricing models that emphasize judgment and expertise over hours.
Culture
Organizations must shift from AI secrecy to AI transparency as a competitive strength, with consistent messaging across all client interactions.

Skills development roadmaps must be reimagined for an AI-enabled workforce

Traditional talent development models are failing to address the dual challenge of preserving critical thinking capabilities while developing AI fluency. Managing Directors worry about analytical skill erosion in junior staff, while junior employees struggle to balance AI efficiency with voice preservation and career progression clarity.

Why it matters

Career progression in Marketing & Communications has historically depended on developing writing craft, analytical capabilities, and strategic judgment through repeated practice. AI automation of routine tasks threatens to remove these learning opportunities, potentially creating a generation of professionals who can orchestrate AI but lack the foundational skills to evaluate quality or provide strategic guidance. This skills gap could undermine the industry's long-term value proposition and professional standards.

Evidence from the research

"At a senior level we're challenging AI outputs; at a junior level they're taking it as truth. I'm often not seeing a critical layer of thinking in the output I'm being sent for final review."

Director

"I do get worried... my ability to write in my unique voice is clearly diminishing. But I'm also producing a higher volume of higher quality work."

Junior level

63.4% of employees have not received role-specific AI training

Implications for teams

People
Career progression must emphasize judgment development and critical thinking alongside AI orchestration skills.
Process
Teams need "human-first" moments in workflows, dedicated voice preservation exercises, and mentoring time funded by AI efficiency gains.
Culture
Organizations should celebrate both AI proficiency and human craft skills, with clear progression pathways that value both capabilities.

A measurement vacuum is stalling firm-wide AI integration strategies

Despite widespread anecdotal reports of AI efficiency gains, Marketing & Communications organizations lack systematic measurement frameworks connecting AI activity to business outcomes, limiting strategic decision-making and investment justification.

Why it matters

Without measurement frameworks, organizations cannot optimize AI investments, justify expanded access and training, or demonstrate value to clients and stakeholders. Anecdotal efficiency claims lack credibility for budget planning and strategic decisions. Furthermore, the absence of quality metrics alongside efficiency metrics risks prioritizing speed over standards, potentially undermining client relationships and professional reputation.

Evidence from the research

"Everyone talks about efficiency gains, but nobody measures what happens to the saved time. Does it go to innovation? More client work? Or just more of the same?"

Executive

"We've proven AI works in isolated projects, but without metrics showing broader impact, leadership hasn't yet committed to the tool access and training we need."

Director

Implications for teams

People
Organizations need dedicated measurement capabilities and training on KPI development that connects AI activity to business outcomes.
Process
Workflow design must incorporate measurement from day one, with clear metrics for quality, efficiency, and business impact.
Culture
Teams require data-driven decision-making cultures that value measurement alongside experimentation and innovation.

Key Findings: Quantitative Data

Key insights from our survey of 82 Marketing & Communications professionals across 8 global markets

How would you rate your organization's support in helping employees adopt generative AI tools?

Good — some training and communication provided
42.7%
Poor — minimal support or inconsistent training
31.7%
None — no support or guidance at all
14.6%
Excellent — tailored training and clear guidelines
11.0%
Key Insight: 46.3% of professionals rate their organization's AI support as poor or nonexistent, while only 11% receive excellent support.

Is AI adoption consistent across teams and seniority levels in your organization?

70.7%
Report Inconsistent Adoption
Very inconsistent 42.7%
Somewhat inconsistent 28.0%
Yes, fairly consistent 18.3%
I don't know 11.0%
Key Insight: More than 7 in 10 employees experience inconsistent AI adoption across their organization, with only 18.3% reporting consistency.

Has your organization provided training tailored to your role on how to best use generative AI?

63.4%
Have Not Received Role-Specific Training
No 63.4%
In progress 18.3%
Yes 18.3%
Key Insight: Nearly two-thirds of professionals have not received role-specific AI training, with only 18.3% having completed such training.

Does your organization have clear policies and guidelines dictating how generative AI can be used at work?

Yes
52.4%
No
47.6%
Key Insight: Organizations are almost evenly split on AI governance, with just over half having clear policies while nearly half operate without formal guidelines.

Top 8 Strategic Actions for Marketing & Communications Leaders

Essential steps to transform fragmented AI adoption into strategic communications advantage

Methodology

Research approach, participant demographics, and analytical framework

Research Approach

This research employed a mixed-methods approach combining an industry-wide quantitative survey with structured qualitative focus groups segmented by seniority level. Focus groups were conducted under the Chatham House Rule with full anonymization to encourage candid discussion about AI integration experiences, challenges, and organizational needs.

82 Survey Respondents
3 Focus Group Sessions

Focus Group Structure

Three separate focus groups were conducted by seniority level: Executive (VP/Partner level), Director/Managing Director, and Junior/Mid-level professionals. Participants represented leading strategic communications agencies and in-house teams across multiple sectors, ensuring diverse organizational perspectives on AI adoption challenges.

9 Focus Group Participants
6 Discussion Questions Per Session

Survey Demographics

The quantitative survey captured responses from 82 Marketing & Communications professionals across 8 global markets, representing diverse seniority levels, organization types, and geographic regions to ensure comprehensive industry perspective on AI adoption patterns and training needs.

8 Global Markets
60% Agency Professionals

Analysis & Limitations

Qualitative data was analyzed using thematic analysis to identify patterns across seniority levels and organizational types. Quantitative survey data underwent statistical analysis to validate key trends. This study focused primarily on English-speaking markets and may not fully represent global perspectives or other industry sectors.

6 Primary Findings
3 Month Research Period