AWS MCP Servers | Awesome MCP Servers


AWS MCP Servers provide access to AWS documentation, best practices, and service integrations via the Model Context Protocol: a suite of specialized, open source MCP servers that help you get the most out of AWS, wherever you use MCP.

Table of Contents
- What is the Model Context Protocol (MCP) and how does it work with MCP Servers for AWS?
- Transport Mechanisms
- Why MCP Servers for AWS?
- Available MCP Servers: Quick Installation
- 🚀 Getting Started with AWS
- Browse by What You're Building
- Browse by How You're Working
- MCP AWS Lambda Handler Module
- When to Use Local vs. Remote MCP Servers
- Use Cases for the Servers
- Installation and Setup
- Samples
- Vibe Coding
- Additional Resources
- Security
- Contributing
- Developer Guide
- License
- Disclaimer

What is the Model Context Protocol (MCP) and how does it work with MCP Servers for AWS?

The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external data sources and tools. Whether you're building an AI-powered IDE, enhancing a chat interface, or creating custom AI workflows, MCP provides a standardized way to connect LLMs with the context they need. An MCP Server is a lightweight program that exposes specific capabilities through the standardized Model Context Protocol. Host applications (such as chatbots, IDEs, and other AI tools) have MCP clients that maintain 1:1 connections with MCP servers.

Common MCP clients include agentic AI coding assistants (like Kiro, Cline, Cursor, Windsurf) as well as chatbot applications like Claude Desktop, with more clients coming soon. MCP servers can access local data sources and remote services to provide additional context that improves the generated outputs from the models. MCP Servers for AWS use this protocol to provide AI applications access to AWS documentation, contextual guidance, and best practices. Through the standardized MCP client-server architecture, AWS capabilities become an intelligent extension of your development environment or AI application.

MCP Servers for AWS enable enhanced cloud-native development, infrastructure management, and development workflows—making AI-assisted cloud computing more accessible and efficient. The Model Context Protocol is an open source project run by Anthropic, PBC, and is open to contributions from the entire community.

For more information on MCP, you can find further documentation here.

Transport Mechanisms

The MCP protocol currently defines two standard transport mechanisms for client-server communication:
- stdio: communication over standard input and standard output
- streamable HTTP

The MCP servers in this repository are designed to support stdio only. You are responsible for ensuring that your use of these servers complies with the terms governing them, and with any laws, rules, regulations, policies, or standards that apply to you.
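To make the stdio transport concrete, the sketch below frames a JSON-RPC message the way a client would write it to a server's standard input: one JSON object per newline-terminated line. The capability fields and protocol version shown are simplified assumptions for illustration; consult the MCP specification for the authoritative message shapes.

```python
import json

def frame_message(payload: dict) -> bytes:
    """Serialize one JSON-RPC message for the stdio transport:
    a single line of JSON terminated by a newline."""
    return (json.dumps(payload) + "\n").encode("utf-8")

# A simplified MCP initialize request (fields trimmed for illustration).
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

wire = frame_message(initialize)
# The server reads one line from stdin and parses it back:
decoded = json.loads(wire.decode("utf-8"))
```

The same framing is used in both directions: the server writes its responses and notifications to standard output, one JSON object per line.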

Server-Sent Events Support Removal

Important Notice: On May 26th, 2025, Server-Sent Events (SSE) support was removed from all MCP servers in their latest major versions. This change aligns with the Model Context Protocol specification's backwards-compatibility guidelines. We are actively working towards supporting streamable HTTP, which will provide improved transport capabilities in future versions. For applications still requiring SSE support, please use the previous major version of the respective MCP server until you can migrate to an alternative transport.

Why MCP Servers for AWS?

MCP servers enhance the capabilities of foundation models (FMs) in several key ways:

- Improved Output Quality: By providing relevant information directly in the model's context, MCP servers significantly improve model responses for specialized domains like AWS services. This approach reduces hallucinations, provides more accurate technical details, enables more precise code generation, and ensures recommendations align with current AWS best practices and service capabilities.
- Access to Latest Documentation: FMs may not have knowledge of recent releases, APIs, or SDKs. MCP servers bridge this gap by pulling in up-to-date documentation, ensuring your AI assistant always works with the latest AWS capabilities.
- Workflow Automation: MCP servers convert common workflows into tools that foundation models can use directly. Whether it's CDK, Terraform, or other AWS-specific workflows, these tools enable AI assistants to perform complex tasks with greater accuracy and efficiency.
- Specialized Domain Knowledge: MCP servers provide deep, contextual knowledge about AWS services that might not be fully represented in foundation models' training data, enabling more accurate and helpful responses for cloud development tasks.

Available MCP Servers: Quick Installation

Get started quickly with one-click installation buttons for popular MCP clients.

Click the buttons below to install servers directly in Cursor or VS Code.

🚀 Getting Started with AWS

For AWS interactions, we recommend starting with:
- 📚 Documentation: Real-time access to official AWS documentation

Browse by What You're Building

- 🏗️ Infrastructure & Deployment: Build, deploy, and manage cloud infrastructure with Infrastructure as Code best practices (container platforms; serverless and functions support).
- 🤖 AI & Machine Learning: Enhance AI applications with knowledge retrieval, content generation, and ML capabilities.
- 📊 Data & Analytics: Work with databases, caching systems, and data processing workflows (SQL and NoSQL databases; search and analytics, e.g. the Amazon OpenSearch MCP Server for OpenSearch-powered search, analytics, and observability; backend API providers; caching and performance).
- 🛠️ Developer Tools & Support: Accelerate development with code analysis, documentation, and testing utilities.
- 📡 Integration & Messaging: Connect systems with messaging, workflows, and location services.
- 💰 Cost & Operations: Monitor, optimize, and manage your AWS infrastructure and costs.
- 🧬 Healthcare & Lifesciences: Interact with AWS HealthAI services.

Browse by How You're Working

👨‍💻 Vibe Coding & Development: AI coding assistants like Kiro, Cline, Cursor, and Claude Code that help you build faster. Workshop: check out the Vibe Coding with AWS MCP Servers workshop for hands-on guidance and examples.

Relevant server groups include: Core Development Workflow, Infrastructure as Code, Application Development, Container & Serverless Development, Testing & Data, Lifesciences Workflow Development, and Healthcare Data Management.

💬 Conversational Assistants: Customer-facing chatbots, business agents, and interactive Q&A systems (knowledge and search; content processing and generation; business services).

🤖 Autonomous Background Agents: Headless automation, ETL pipelines, and operational systems (data operations and ETL; caching and performance; workflow and integration; operations and monitoring).

MCP AWS Lambda Handler Module

A Python library for creating serverless HTTP handlers for the Model Context Protocol (MCP) using AWS Lambda. This module provides a flexible framework for building MCP HTTP endpoints with pluggable session management, including built-in DynamoDB support.

Features:
- Easy serverless MCP HTTP handler creation using AWS Lambda
- Pluggable session management system
- Built-in DynamoDB session backend support
- Customizable authentication and authorization
- Example implementations and tests

See src/mcp-lambda-handler/README.md for full usage, installation, and development instructions.

When to Use Local vs. Remote MCP Servers

MCP servers can be run either locally on your development machine or remotely in the cloud.
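Before comparing the two, it may help to see the general shape of a serverless MCP HTTP handler like the Lambda handler module described above. The sketch below is not the module's actual API (see its README for that); it is a self-contained illustration of dispatching a JSON-RPC request from a Lambda-style HTTP event, with an in-memory dict standing in for the DynamoDB session backend. The tool table and header name are assumptions for the example.

```python
import json
from typing import Callable, Dict

class InMemorySessionStore:
    """Stand-in for a DynamoDB-backed session store (same get/put shape)."""
    def __init__(self) -> None:
        self._data: Dict[str, dict] = {}
    def get(self, session_id: str) -> dict:
        return self._data.get(session_id, {})
    def put(self, session_id: str, state: dict) -> None:
        self._data[session_id] = state

SESSION_STORE = InMemorySessionStore()

# Hypothetical tool table; a real MCP server would expose protocol
# methods such as initialize and tools/call instead.
TOOLS: Dict[str, Callable[[dict], dict]] = {
    "echo": lambda params: {"echoed": params},
}

def lambda_handler(event: dict, context: object = None) -> dict:
    """Parse a JSON-RPC request from an HTTP event and dispatch it."""
    request = json.loads(event["body"])
    session_id = event.get("headers", {}).get("mcp-session-id", "anonymous")
    state = SESSION_STORE.get(session_id)
    state["calls"] = state.get("calls", 0) + 1  # track per-session activity
    SESSION_STORE.put(session_id, state)

    method = request.get("method")
    if method in TOOLS:
        body = {"jsonrpc": "2.0", "id": request.get("id"),
                "result": TOOLS[method](request.get("params", {}))}
        status = 200
    else:
        body = {"jsonrpc": "2.0", "id": request.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}
        status = 400
    return {"statusCode": status, "body": json.dumps(body)}
```

Swapping the in-memory store for a DynamoDB table is what makes such a handler stateless at the Lambda level, which is the point of the pluggable session design.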

Here's when to use each approach:

Local MCP Servers
- Development & Testing: Perfect for local development, testing, and debugging
- Offline Work: Continue working when internet connectivity is limited
- Data Privacy: Keep sensitive data and credentials on your local machine
- Low Latency: Minimal network overhead for faster response times
- Resource Control: Direct control over server resources and configuration

Remote MCP Servers
- Team Collaboration: Share consistent server configurations across your team
- Resource-Intensive Tasks: Offload heavy processing to dedicated cloud resources
- Always Available: Access your MCP servers from anywhere, on any device
- Automatic Updates: Get the latest features and security patches automatically
- Scalability: Easily handle varying workloads without local resource constraints
- Security: Centralized security controls with IAM-based permissions and zero credential exposure
- Governance: Comprehensive audit logging and compliance monitoring for enterprise-grade governance

Note: Some MCP servers, like the official AWS MCP server (in preview) and the AWS Knowledge MCP, are provided as fully managed services by AWS.

These AWS-managed remote servers require no setup or infrastructure management on your part; just connect and start using them.

Use Cases for the Servers

For example, you can use the AWS Documentation MCP Server to help your AI assistant research and generate up-to-date code for any AWS service, like Amazon Bedrock Inline Agents. Alternatively, you could use the CDK MCP Server or the Terraform MCP Server to have your AI assistant create infrastructure-as-code implementations that use the latest APIs and follow AWS best practices.

With the AWS Pricing MCP Server, you could ask "What would be the estimated monthly cost for this CDK project before I deploy it?" or "Can you help me understand the potential AWS service expenses for this infrastructure design?" and receive detailed cost estimates and budget-planning insights. The Valkey MCP Server enables natural-language interaction with Valkey data stores, allowing AI assistants to efficiently manage data operations through a simple conversational interface.

Installation and Setup

Each server has specific installation instructions with one-click installs for Kiro, Cursor, and VS Code.

Generally, you can:
1. Install uv from Astral
2. Install Python using `uv python install 3.10`
3. Configure AWS credentials with access to required services
4. Add the server to your MCP client configuration

Example configuration for Kiro MCP settings (~/.kiro/settings/mcp.json):

For macOS/Linux:

```json
{
  "mcpServers": {
    "awslabs-core-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.core-mcp-server@latest"],
      "env": { "FASTMCP_LOG_LEVEL": "ERROR" }
    }
  }
}
```

See individual server READMEs for specific requirements and configuration options.
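Because the clients covered below all share this mcpServers JSON shape, a small script can add a server entry without hand-editing the file. This is a convenience sketch, not an official tool; the function name and the Kiro settings path in the commented example are assumptions.

```python
import json
from pathlib import Path

def add_mcp_server(config_path: Path, name: str, command: str,
                   args: list, env: dict) -> dict:
    """Insert (or overwrite) one server entry in an mcpServers config file."""
    config = {}
    if config_path.exists():
        config = json.loads(config_path.read_text())
    config.setdefault("mcpServers", {})[name] = {
        "command": command,
        "args": args,
        "env": env,
    }
    config_path.parent.mkdir(parents=True, exist_ok=True)
    config_path.write_text(json.dumps(config, indent=2))
    return config

# Example: register the core server in Kiro's global settings.
# add_mcp_server(Path.home() / ".kiro/settings/mcp.json",
#                "awslabs-core-mcp-server", "uvx",
#                ["awslabs.core-mcp-server@latest"],
#                {"FASTMCP_LOG_LEVEL": "ERROR"})
```

The same function works for any client whose config file uses the mcpServers object (Cline, Cursor, Windsurf, VS Code, Claude Code); only the path changes.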

For Windows

When configuring MCP servers on Windows, you'll need to use a slightly different configuration format:

```json
{
  "mcpServers": {
    "awslabs-core-mcp-server": {
      "disabled": false,
      "timeout": 60,
      "type": "stdio",
      "command": "uv",
      "args": [
        "tool", "run",
        "--from", "awslabs.core-mcp-server@latest",
        "awslabs.core-mcp-server.exe"
      ],
      "env": { "FASTMCP_LOG_LEVEL": "ERROR" }
    }
  }
}
```

If you have problems with MCP configuration or want to check whether the appropriate parameters are in place, you can try the following:

```bash
# Run the MCP server manually with a 15-second timeout
timeout 15s uv tool run <MCP Name> <args> 2>&1 || echo "Command completed or timed out"

# Example (Aurora MySQL MCP Server)
timeout 15s uv tool run awslabs.mysql-mcp-server --resource_arn <Your Resource ARN> --secret_arn <Your Secret ARN> ... 2>&1 || echo "Command completed or timed out"

# If the arguments are not set appropriately, you may see a message like:
# usage: awslabs.mysql-mcp-server [-h] --resource_arn RESOURCE_ARN --secret_arn SECRET_ARN
#                                 --database DATABASE --region REGION --readonly READONLY
# awslabs.mysql-mcp-server: error: the following arguments are required:
#   --resource_arn, --secret_arn, --database, --region, --readonly
```

Note about performance when using the uvx "@latest" suffix: using "@latest" checks for and downloads the latest MCP server package from PyPI every time you start your MCP client, at the cost of increased initial load time.
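The same smoke test can be scripted in Python, which is convenient if you want to check several server configurations in CI. This wrapper around the standard subprocess module is a sketch, not an official utility; substitute your real server command and arguments.

```python
import subprocess

def smoke_test(cmd: list, timeout: float = 15.0):
    """Run an MCP server command briefly and capture what it prints.

    Returns (timed_out, output). A server that starts cleanly usually
    runs until the timeout; a misconfigured one exits immediately with
    a usage or error message.
    """
    try:
        proc = subprocess.run(cmd, capture_output=True, text=True,
                              timeout=timeout)
        return False, (proc.stdout or "") + (proc.stderr or "")
    except subprocess.TimeoutExpired as exc:
        def _text(stream):
            # Output captured before the timeout may be bytes, str, or None.
            if stream is None:
                return ""
            return stream.decode() if isinstance(stream, bytes) else stream
        return True, _text(exc.stdout) + _text(exc.stderr)

# Example, mirroring the shell one-liner above (command is illustrative):
# timed_out, output = smoke_test(
#     ["uv", "tool", "run", "awslabs.mysql-mcp-server"])
# A missing-argument failure shows up in `output` as a usage message.
```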

If you want to minimize the initial load time, remove "@latest" and manage your uv cache yourself using one of these approaches:
- `uv cache clean <tool>`: where `<tool>` is the MCP server you want to delete from the cache and install again (e.g. "awslabs.lambda-tool-mcp-server"); remember to remove the angle brackets.
- `uvx <tool>@latest`: refreshes the tool to the latest version and adds it to the uv cache.

Running MCP Servers in Containers

Docker images for each MCP server are published to the public AWS ECR registry.

This example uses Docker with awslabs.nova-canvas-mcp-server and can be repeated for each MCP server.

Optionally save sensitive environment variables in a file:

```bash
# contents of a .env file with fictitious AWS temporary credentials
AWS_ACCESS_KEY_ID=ASIAIOSFODNN7EXAMPLE
AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
AWS_SESSION_TOKEN=AQoEXAMPLEH4aoAH0gNCAPy...truncated...zrkuWJOgQs8IZZaIv2BXIa2R4Olgk
```

Use the Docker options `--env`, `--env-file`, and `--volume` as needed, because the `"env": {}` settings are not available within the container:

```json
{
  "mcpServers": {
    "awslabs.nova-canvas-mcp-server": {
      "command": "docker",
      "args": [
        "run", "--rm", "--interactive",
        "--env", "FASTMCP_LOG_LEVEL=ERROR",
        "--env", "AWS_REGION=us-east-1",
        "--env-file", "/full/path/to/.env",
        "--volume", "/full/path/to/.aws:/app/.aws",
        "public.ecr.aws/awslabs-mcp/awslabs/nova-canvas-mcp-server:latest"
      ],
      "env": {}
    }
  }
}
```

For testing local changes, you can build and tag the image yourself, then update the MCP configuration to use that tag instead of the ECR image:

```bash
cd src/nova-canvas-mcp-server
docker build -t awslabs/nova-canvas-mcp-server .
```

Getting Started with Kiro

See the Kiro IDE documentation or the Kiro CLI documentation for details. In the Kiro IDE:
1. Navigate to Kiro > MCP Servers.
2. Add a new MCP server by clicking the + Add button.
3. Paste the configuration given below.

For global configuration, edit ~/.kiro/settings/mcp.json. For project-specific configuration, edit .kiro/settings/mcp.json in your project directory.

~/.kiro/settings/mcp.json

For macOS/Linux:

```json
{
  "mcpServers": {
    "awslabs-core-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.core-mcp-server@latest"],
      "env": { "FASTMCP_LOG_LEVEL": "ERROR" }
    }
  }
}
```

For Windows:

```json
{
  "mcpServers": {
    "awslabs-core-mcp-server": {
      "disabled": false,
      "timeout": 60,
      "type": "stdio",
      "command": "uv",
      "args": [
        "tool", "run",
        "--from", "awslabs.core-mcp-server@latest",
        "awslabs.core-mcp-server.exe"
      ],
      "env": { "FASTMCP_LOG_LEVEL": "ERROR" }
    }
  }
}
```

Getting Started with Cline and Amazon Bedrock

IMPORTANT: Following these instructions may incur costs, which are subject to Amazon Bedrock Pricing. You are responsible for any associated costs.

In addition to selecting the desired model in the Cline settings, ensure you have your selected model (e.g. anthropic.claude-3-7-sonnet ) also enabled in Amazon Bedrock. For more information on this, see these AWS docs on enabling model access to Amazon Bedrock Foundation Models (FMs). - Follow the steps above in the Installation and Setup section to install uv from Astral, install Python, and configure AWS credentials with the required services. - If using Visual Studio Code, install the Cline VS Code Extension (or equivalent extension for your preferred IDE).

Once installed, click the extension to open it. When prompted, select the tier you wish to use. In this case, we will be using Amazon Bedrock, so the free tier of Cline is fine, as we will be sending requests through the Amazon Bedrock API instead of the Cline API.
- Select the MCP Servers button.
- Select the Installed tab, then click Configure MCP Servers to open the cline_mcp_settings.json file.
- In the cline_mcp_settings.json file, add your desired MCP servers in the mcpServers object.

See the following example that will use some of the current MCP servers that are available in this repository. Ensure you save the file to install the MCP servers.

cline_mcp_settings.json

For macOS/Linux:

```json
{
  "mcpServers": {
    "awslabs-core-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.core-mcp-server@latest"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR",
        "MCP_SETTINGS_PATH": "path to your mcp settings file"
      }
    }
  }
}
```

For Windows:

```json
{
  "mcpServers": {
    "awslabs-core-mcp-server": {
      "disabled": false,
      "timeout": 60,
      "type": "stdio",
      "command": "uv",
      "args": [
        "tool", "run",
        "--from", "awslabs.core-mcp-server@latest",
        "awslabs.core-mcp-server.exe"
      ],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR",
        "MCP_SETTINGS_PATH": "path to your mcp settings file"
      }
    }
  }
}
```

Once installed, you should see a list of your MCP servers under the MCP Servers Installed tab, and they should have a green slider to show that they are enabled.

See the following for an example with two of the possible MCP servers for AWS. Click Done when finished. You should now see the Cline chat interface. - By default, Cline will be set as the API provider, which has limits for the free tier. Next, let's update the API provider to be AWS Bedrock, so we can use the LLMs through Bedrock, which would have billing go through your connected AWS account. - Click the settings gear to open up the Cline settings.

Then, under API Provider, switch from Cline to AWS Bedrock and select AWS Profile as the authentication type. As a note, the AWS Credentials option works as well; however, it uses static credentials (Access Key ID and Secret Access Key) instead of temporary credentials that are automatically refreshed when the token expires, so using temporary credentials with an AWS Profile is the more secure and recommended method. Fill out the configuration based on the existing AWS profile you wish to use, select the desired AWS Region, and enable cross-region inference.

Next, scroll down on the settings page until you reach the text box that says Custom Instructions. Paste in the following snippet to ensure the mcp-core server is used as the starting point for every prompt: For every new project, always look at your MCP servers and use mcp-core as the starting point every time. Also after a task completion include the list of MCP servers used in the operation. - Once the custom prompt is pasted in, click Done to return to the chat interface.

Now you can begin asking questions and testing out the functionality of your installed MCP servers. The default option in the chat interface is Plan, which will provide output for you to act on manually (e.g. a sample configuration that you copy and paste into a file). However, you can optionally toggle this to Act, which allows Cline to act on your behalf (e.g. searching for content using a web browser, cloning a repository, executing code, etc.).

You can optionally turn on Auto-approve to avoid having to approve each suggestion; however, we recommend leaving this off during testing, especially if you have the Act toggle selected. Note: for the best results, prompt Cline to use the specific MCP server you wish to use. For example: Using the Terraform MCP Server, do...

Getting Started with Cursor

- Follow the steps above in the Installation and Setup section to install uv from Astral, install Python, and configure AWS credentials with the required services.
- You can place MCP configuration in two locations, depending on your use case:
  A. Project configuration: for tools specific to a project, create a .cursor/mcp.json file in your project directory. This allows you to define MCP servers that are only available within that specific project.
  B. Global configuration: for tools that you want to use across all projects, create a ~/.cursor/mcp.json file in your home directory. This makes MCP servers available in all your Cursor workspaces.

.cursor/mcp.json

For macOS/Linux:

```json
{
  "mcpServers": {
    "awslabs-core-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.core-mcp-server@latest"],
      "env": { "FASTMCP_LOG_LEVEL": "ERROR" }
    }
  }
}
```

For Windows:

```json
{
  "mcpServers": {
    "awslabs-core-mcp-server": {
      "disabled": false,
      "timeout": 60,
      "type": "stdio",
      "command": "uv",
      "args": [
        "tool", "run",
        "--from", "awslabs.core-mcp-server@latest",
        "awslabs.core-mcp-server.exe"
      ],
      "env": { "FASTMCP_LOG_LEVEL": "ERROR" }
    }
  }
}
```

Using MCP in Chat: the Composer Agent will automatically use any MCP tools listed under Available Tools on the MCP settings page if it determines them to be relevant.

To invoke a tool intentionally, prompt Cursor to use the desired MCP server. For example: Using the Terraform MCP Server, do... Tool Approval: by default, when the Agent wants to use an MCP tool, it will display a message asking for your approval. You can use the arrow next to the tool name to expand the message and see what arguments the Agent is calling the tool with.

Getting Started with Windsurf

- Follow the steps above in the Installation and Setup section to install uv from Astral, install Python, and configure AWS credentials with the required services.

Access MCP Settings:
- Navigate to Windsurf - Settings > Advanced Settings, or use the Command Palette > Open Windsurf Settings Page.
- Look for the "Model Context Protocol (MCP) Servers" section.

Add MCP Servers:
- Click "Add Server" to add a new MCP server. You can choose from available templates like GitHub, Puppeteer, PostgreSQL, etc.

Alternatively, click "Add custom server" to configure your own server.

Manual Configuration: you can also manually edit the MCP configuration file located at ~/.codeium/windsurf/mcp_config.json.

For macOS/Linux:

```json
{
  "mcpServers": {
    "awslabs-core-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.core-mcp-server@latest"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR",
        "MCP_SETTINGS_PATH": "path to your mcp settings file"
      }
    }
  }
}
```

For Windows:

```json
{
  "mcpServers": {
    "awslabs-core-mcp-server": {
      "disabled": false,
      "timeout": 60,
      "type": "stdio",
      "command": "uv",
      "args": [
        "tool", "run",
        "--from", "awslabs.core-mcp-server@latest",
        "awslabs.core-mcp-server.exe"
      ],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR",
        "MCP_SETTINGS_PATH": "path to your mcp settings file"
      }
    }
  }
}
```

Getting Started with VS Code

Configure MCP servers in VS Code settings or in .vscode/mcp.json (see the VS Code MCP docs for more info).

.vscode/mcp.json

For macOS/Linux:

```json
{
  "mcpServers": {
    "awslabs-core-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.core-mcp-server@latest"],
      "env": { "FASTMCP_LOG_LEVEL": "ERROR" }
    }
  }
}
```

For Windows:

```json
{
  "mcpServers": {
    "awslabs-core-mcp-server": {
      "disabled": false,
      "timeout": 60,
      "type": "stdio",
      "command": "uv",
      "args": [
        "tool", "run",
        "--from", "awslabs.core-mcp-server@latest",
        "awslabs.core-mcp-server.exe"
      ],
      "env": { "FASTMCP_LOG_LEVEL": "ERROR" }
    }
  }
}
```

Getting Started with Claude Code

Configure MCP servers in Claude Code through the CLI or in .mcp.json.
- Follow the steps above in the Installation and Setup section to install uv from Astral, install Python, and configure AWS credentials with the required services.

Use the Claude Code CLI to add MCP servers:

```bash
# Add core AWS services
claude mcp add aws-api uvx awslabs.aws-api-mcp-server@latest
claude mcp add aws-cdk uvx awslabs.cdk-mcp-server@latest
claude mcp add aws-docs uvx awslabs.aws-documentation-mcp-server@latest
claude mcp add aws-support uvx awslabs.aws-support-mcp-server@latest
claude mcp add aws-pricing uvx awslabs.aws-pricing-mcp-server@latest

# Add AI/ML and Bedrock services
claude mcp add bedrock-kb uvx awslabs.bedrock-kb-retrieval-mcp-server@latest
claude mcp add nova-canvas uvx awslabs.nova-canvas-mcp-server@latest
claude mcp add synthetic-data uvx awslabs.syntheticdata-mcp-server@latest

# Add data and analytics services
claude mcp add aws-dataprocessing uvx awslabs.aws-dataprocessing-mcp-server@latest
claude mcp add aurora-dsql uvx awslabs.aurora-dsql-mcp-server@latest
claude mcp add valkey uvx awslabs.valkey-mcp-server@latest

# List installed servers
claude mcp list
```

Manual Configuration (alternative): you can also manually configure MCP servers by creating a .mcp.json file in your project root:

```json
{
  "mcpServers": {
    "awslabs.cdk-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.cdk-mcp-server@latest"],
      "env": { "FASTMCP_LOG_LEVEL": "ERROR" }
    },
    "awslabs.aws-documentation-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.aws-documentation-mcp-server@latest"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR",
        "AWS_DOCUMENTATION_PARTITION": "aws"
      }
    }
  }
}
```

Samples

Ready-to-use examples of open source MCP servers for AWS in action are available in the samples directory.

These samples provide working code and step-by-step guides to help you get started with each MCP server. Vibe coding You can use these MCP servers with your AI coding assistant to vibe code. For tips and tricks on how to improve your vibe coding experience, please refer to our guide.

Additional Resources

- Introducing AWS MCP Servers for code assistants
- Vibe coding with AWS MCP Servers | AWS Show & Tell
- Supercharging AWS database development with AWS MCP servers
- AWS costs estimation using Amazon Q CLI and AWS Pricing MCP Server
- Introducing AWS Serverless MCP Server: AI-powered development for modern applications
- Announcing new Model Context Protocol (MCP) Servers for AWS Serverless and Containers
- Accelerating application development with the Amazon EKS MCP server
- Amazon Neptune announces MCP (Model Context Protocol) Server
- Terraform MCP Server Vibe Coding
- How to Generate AWS Architecture Diagrams Using Amazon Q CLI and MCP
- Harness the power of MCP servers with Amazon Bedrock Agents
- Unlocking the power of Model Context Protocol (MCP) on AWS
- AWS Price List Gets a Natural Language Upgrade: Introducing the AWS Pricing MCP Server
- AWS SheBuilds: AWS Team's Journey from Internal Tools to Open Source AI Infrastructure
- Guidance for Vibe Coding with AWS MCP servers
- Vibe coding with AWS MCP Servers | Hands-on Workshop

Security

See CONTRIBUTING for more information.

Contributing

Big shout out to our awesome contributors! Thank you for making this project better! Contributions of all kinds are welcome! Check out our contributor guide for more information.

Developer Guide

If you want to add a new MCP server to the library, check out our development guide and be sure to follow our design guidelines.

License

This project is licensed under the Apache-2.0 License.

Disclaimer

Before using an MCP Server, you should consider conducting your own independent assessment to ensure that your use would comply with your own specific security and quality control practices and standards, as well as the laws, rules, and regulations that govern you and your content.
