Rubberduck
AI-Powered Developer Assistant

Executive Summary

Scania’s development team needed a faster way to navigate their extensive GitLab repository ecosystem and internal documentation. Developers were spending significant time searching for code examples, understanding project structures, and finding internal best practices across multiple repositories.

Buzzcloud delivered a custom AI-powered assistant built on the Strands SDK and Amazon Bedrock that provides instant, context-aware answers about code, repositories, and internal processes. The solution integrates seamlessly with GitLab and the platform Wiki, enabling developers to ask natural-language questions and receive accurate, source-cited responses.

Value Created:

  • Reduced search time – Developers can find code and documentation in seconds instead of minutes
  • Improved onboarding – New team members can quickly understand project structures and conventions
  • Better knowledge sharing – Project-specific context automatically loaded for accurate, contextual answers

About the Customer

Scania PDM is the Product Data Management organisation within Scania, responsible for developing and operating IT solutions that support product development and lifecycle management. The organisation builds and maintains internal tools, platforms, and integrations used by engineers and developers across Scania and Traton.

Scania is a leading manufacturer of trucks, buses, and engines, with a strong focus on digital transformation and software development.

Industry: Automotive / Manufacturing
Size: Large enterprise with multiple development teams
Challenge: Managing knowledge across numerous GitLab repositories and internal documentation systems

Challenge

Scania’s PDM development teams faced several productivity obstacles:

What was missing:

  • Time-consuming onboarding – New team members struggled to understand the codebase and internal processes
  • No unified search – Developers had to manually browse multiple GitLab repositories to find code examples or understand project structures
  • Knowledge fragmentation – Project-specific conventions and best practices were scattered across repositories and wiki pages
  • Context switching – Developers frequently switched between GitLab, wiki, and documentation to answer questions

Why this created obstacles:

  • Inconsistent answers when team members interpreted code or processes differently
  • Risk of outdated information when documentation wasn’t kept in sync with code
  • Difficulty maintaining project-specific context across multiple repositories

The goals the customer wanted to achieve:

  1. Enable developers to ask natural language questions about code and get instant, accurate answers
  2. Automatically load project-specific context (coding conventions, architecture decisions) when working with repositories
  3. Integrate with existing GitLab and wiki infrastructure without disrupting workflows
  4. Scale to support multiple teams and hundreds of repositories

Solution

Buzzcloud designed and implemented PDM IT Rubberduck, a custom AI assistant built on Amazon Bedrock, using Anthropic’s Claude models orchestrated through the Strands SDK.

Technologies Used

  • Amazon Bedrock + Claude – Foundation model with strong accuracy for code understanding
  • Strands Agents SDK – Agent framework for tool orchestration and memory management
  • Amazon Bedrock AgentCore Memory – Persistent conversation memory for personalized responses
  • AWS Lambda + DynamoDB – Serverless architecture for scalability and cost efficiency
  • React + TypeScript – Modern frontend with WebSocket streaming for real-time responses
  • Microsoft Entra ID – Enterprise authentication and group-based access control
  • Terraform – Infrastructure as Code for reproducible deployments
  • GitLab API + Platform Wiki – Integrated search across code repositories and internal documentation

How It Works

The solution uses a serverless architecture: users interact with a React frontend that connects via WebSocket to Lambda functions running the AI agent. The assistant is built with the Strands SDK and uses MCP-style tool use (via Amazon Bedrock tool calling) to reason iteratively: it calls tools, inspects the results, updates its understanding, and decides the next best action. This decision loop lets the agent move between high-level architecture questions and low-level code details across repositories and internal documentation, producing reliable answers grounded in retrieved sources.

The agent automatically:
1. Searches GitLab repositories – Finds relevant code and reads file content
2. Loads project context – Automatically detects and loads project-specific documentation when accessing repositories
3. Searches dual sources – Combines information from both GitLab (code) and platform Wiki (processes/documentation)
4. Maintains conversation memory – Uses AgentCore Memory for personalized responses based on user preferences and past interactions
5. Streams responses – Provides real-time feedback as answers are generated
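The steps above can be sketched in simplified form. Every function name below is a hypothetical stand-in for illustration only; in the production system the Strands agent lets the model itself decide which tool to call next via Amazon Bedrock tool calling, rather than following a hard-coded order:

```python
# Simplified sketch of the agent's grounding loop (hypothetical helpers;
# the production agent uses the Strands Agents SDK with Bedrock tool
# calling instead of this hand-rolled dispatcher).

from dataclasses import dataclass, field

@dataclass
class AgentState:
    """Accumulates retrieved context and cited sources across tool calls."""
    context: list = field(default_factory=list)
    sources: list = field(default_factory=list)

def search_gitlab(query):
    # Stand-in for a GitLab API search; returns (file_path, snippet) pairs.
    return [("auth/service.py", "def authenticate(token): ...")]

def load_project_context(project):
    # Stand-in for loading project-specific conventions (e.g. a conventions doc).
    return f"{project}: use OAuth2 bearer tokens for all internal APIs"

def search_wiki(query):
    # Stand-in for a platform Wiki search.
    return ["Wiki: Authentication runbook, section 2"]

def answer(question, project):
    """One pass of the loop: call tools, fold results into state, cite sources."""
    state = AgentState()
    state.context.append(load_project_context(project))   # step 2: project context
    for path, snippet in search_gitlab(question):         # step 1: code search
        state.context.append(snippet)
        state.sources.append(f"{project}/{path}")
    state.context.extend(search_wiki(question))           # step 3: dual-source search
    # In production the model reasons over state.context and streams the reply
    # (steps 4-5); here we just return the grounded material with its citations.
    return {"context": state.context, "sources": state.sources}
```

The key design point this illustrates is that every answer is assembled from retrieved material with its origin attached, which is what lets the assistant cite file paths and project names rather than answering from model memory alone.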

Result

With Rubberduck in place, PDM IT developers can now find code and documentation much faster than before. The AI assistant automatically searches GitLab repositories and the platform Wiki, reads actual file content before answering, and cites sources with file paths and project names. Grounding every answer in retrieved sources sharply reduces hallucinations and keeps responses accurate and context-aware.

The solution significantly reduces the time developers spend searching repositories. New team members can quickly understand project structures and conventions through AI-guided code exploration, reducing onboarding time. The automatic project context loading ensures developers always follow the right conventions for each project.

The intuitive chat interface lets developers ask natural-language questions like “How is authentication handled in this project?” or “Where are the API endpoints defined?” instead of manually searching file names. Relevant content is retrieved from GitLab repositories and the wiki and surfaced within seconds.

Benefits

  • Faster code search – Questions answered quickly with accurate, source-cited responses
  • Reduced search time – Developers spend less time manually browsing repositories
  • Faster onboarding – New team members productive faster with AI-guided code exploration
  • Scalable architecture – Designed to handle multiple repositories and concurrent users
  • Future-proof platform – Ready for S3 Vectors integration and enhanced AI capabilities

Next steps

The solution is deployed and actively used by the development team, providing immediate value in daily development workflows. The platform is designed to scale horizontally, supporting multiple GitLab repositories and concurrent users. The flexibility of this platform allows for future integrations, enabling Scania PDM IT to expand their use of AI and developer productivity tools as their needs evolve.