Case Study

Carnegie Investment Bank

Pontus Gavelin
·
December 20, 2025
·
6 min read

The Short Answer

Carnegie Investment Bank needed to move from AI curiosity to structured deployment without compromising its regulatory standing. Here's how a phased readiness-first approach delivered a scalable Copilot rollout across the organization.

Carnegie Investment Bank successfully deployed Microsoft Copilot across departments by starting with a comprehensive readiness assessment rather than immediate rollout. The assessment revealed that teams needed stronger Microsoft 365 foundations first — an insight that reshaped the entire deployment strategy. A structured two-track pilot followed: superusers testing Copilot across departments in parallel with a full department-scale adoption test. The result is a replicable adoption framework, a security-first deployment meeting financial services compliance requirements, and a custom internal AI solution built on Azure.

What Was Carnegie Investment Bank Trying to Achieve with AI?

Carnegie Investment Bank is one of the Nordic region's leading independent investment banks, operating across investment banking, securities, and wealth management. When Chief Digital Officer Emma Trygger set out to explore AI capabilities, the goal was clear: move from buzzword to business value.

The challenge wasn't technical ambition — it was responsible implementation. As a regulated financial institution, Carnegie needed to navigate data security, compliance requirements, and organizational readiness before deploying AI tools at scale.

Why Did the Engagement Start with a Readiness Assessment?

Because skipping readiness is the most common reason enterprise AI pilots fail to scale. Before any technology deployment, we ran a comprehensive Copilot Readiness Assessment covering Carnegie's data landscape, security posture, Microsoft 365 environment, and organizational readiness for AI adoption.

The assessment revealed a critical finding: teams needed stronger foundations in Microsoft 365 before Copilot could deliver its full value. Without this assessment, Carnegie would have deployed Copilot into an environment not set up to use it effectively — and attributed the underperformance to AI rather than to infrastructure gaps.

This finding shaped everything that followed.

How Was the Pilot Structured?

Rather than a broad rollout, the pilot ran two parallel tracks.

Track one: Superuser group. Representatives from across departments tested Copilot in diverse business contexts — investment research, client communications, internal reporting, compliance documentation. This generated a detailed picture of where Copilot delivered immediate value and where it required workflow adjustment.

Track two: Department-scale adoption. A full department deployed Copilot to test adoption dynamics at scale — not just whether individuals could use the tool, but whether a team could integrate it into their operating rhythm.

Critically, M365 proficiency training preceded both tracks. Sequencing training before tooling ensured that when AI capabilities were introduced, teams had the foundational skills to use them effectively.

What Were the Results?

The superuser pilot completed successfully, with Carnegie scaling Copilot deployment to full departments.

Key outcomes:

  • Structured adoption framework that can be replicated across additional departments with clear sequencing and training protocols — no ad hoc rollout required for each new department.
  • Security-first deployment meeting Carnegie's regulatory requirements for data handling in financial services. Compliance was designed in, not added after.
  • Internal AI capability developed: "chatCIB" — a custom AI solution built on Azure services for internal data analysis, extending beyond Copilot to address specific Carnegie use cases.
  • Change management playbook covering the human side of AI adoption — not just the technical implementation, but how to help teams actually change how they work.

What's the Key Lesson for Financial Institutions Exploring AI?

As Emma Trygger observed, organizations consistently overestimate what can be achieved in a year and underestimate what can happen in ten. The Carnegie engagement validates the clarity-before-action approach: assess honestly, prepare thoroughly, deploy methodically.

Don't start with the technology. Start with readiness. The time invested in assessment and preparation saves months of failed pilots and abandoned initiatives — and produces a deployment the organization can actually scale.

Key Takeaways

  • A readiness assessment before deployment is not a delay — it's what makes the deployment work. Carnegie's assessment revealed M365 gaps that would have undermined a direct Copilot rollout.
  • Two-track pilots — superusers for breadth, one department for depth — generate the adoption data needed to design a replicable rollout framework.
  • Sequencing training before tooling is non-negotiable for regulated industries where error tolerance is low and adoption confidence is critical.

Frequently Asked Questions

How long did the Carnegie AI deployment take from readiness assessment to production rollout?

The readiness assessment ran approximately four weeks, followed by M365 proficiency training across the pilot groups. The two-track pilot ran concurrently over six to eight weeks. From engagement start to department-scale production deployment, the full timeline was approximately four to five months. For a regulated financial institution deploying AI at scale, this is a fast and disciplined timeline — made possible by the readiness-first sequencing that prevented mid-deployment architectural pivots.

How did you address data security concerns for an investment bank?

Security architecture was defined in the readiness assessment phase, not after deployment. This meant identifying what data Copilot would have access to, configuring Microsoft 365 data governance settings to align with Carnegie's compliance requirements, and establishing clear policies on what data the AI tools could and could not access. The "chatCIB" custom solution added a second layer — a controlled environment for internal data analysis that operates within Carnegie's security perimeter rather than through general cloud AI services.
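To make the "clear policies on what data the AI tools could and could not access" concrete, here is a minimal, hypothetical sketch of a pre-send governance filter — the kind of control a security perimeter like chatCIB's might apply before a prompt ever reaches a model. The patterns, blocked terms, and function names below are illustrative assumptions, not Carnegie's actual implementation:

```python
import re

# Hypothetical pre-send governance filter: masks data that policy says
# AI tools must not see before a prompt leaves the security perimeter.
# The patterns and terms here are illustrative, not Carnegie's real rules.
ACCOUNT_RE = re.compile(r"\b\d{4}[- ]?\d{4}[- ]?\d{4}\b")  # account-number-like strings
BLOCKED_TERMS = {"Project Falcon"}  # e.g. codenames for undisclosed deals

def redact(prompt: str) -> str:
    """Return a copy of the prompt with restricted data masked."""
    cleaned = ACCOUNT_RE.sub("[REDACTED-ACCOUNT]", prompt)
    for term in BLOCKED_TERMS:
        cleaned = cleaned.replace(term, "[REDACTED-DEAL]")
    return cleaned

def safe_to_send(prompt: str) -> bool:
    """A prompt passes policy only if redaction leaves it unchanged."""
    return redact(prompt) == prompt

if __name__ == "__main__":
    raw = "Summarize Project Falcon exposure for account 1234-5678-9012."
    print(redact(raw))
    # → Summarize [REDACTED-DEAL] exposure for account [REDACTED-ACCOUNT].
```

In practice such a filter would sit alongside, not replace, platform-level controls like Microsoft 365 sensitivity labels and data loss prevention policies; the point of the sketch is only that governance checks run before data reaches the model, not after.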

What's the difference between Microsoft Copilot and a custom AI solution like chatCIB?

Microsoft Copilot is a general-purpose AI assistant integrated into the Microsoft 365 ecosystem — it works within Teams, Outlook, Word, and Excel. It's fast to deploy and broad in capability. A custom solution like chatCIB is purpose-built for specific internal use cases — in Carnegie's case, internal data analysis that requires working with proprietary financial data in a controlled environment. Both have roles: Copilot for general productivity enhancement, custom solutions for high-value specialized tasks that require tighter data governance and domain-specific accuracy.

Pontus Gavelin

Product and Platform Engineering

Pontus drives Ignite's Google Cloud and Gemini practice, helping enterprises move from AI experimentation to production-ready deployment.
