Neurolink

MCP Server

Edge-first platform unifying 12 AI providers and 100+ models. Single interface for OpenAI, Anthropic, Gemini, and more.

by Community

★★★★☆ 4.2/5 (80 reviews)
Updated 2026-03-01
Tags: typescript · multi-provider · edge · ai · active

Installation

```
npx -y neurolink-mcp
```

Quick Start

```json
{
  "mcpServers": {
    "neurolink": {
      "command": "npx",
      "args": ["-y", "neurolink-mcp"],
      "env": { "NEUROLINK_API_KEY": "your-key" }
    }
  }
}
```

Tools & Capabilities

call_model
list_models
route_request
get_usage
configure_fallback
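Each tool is invoked through the standard MCP `tools/call` JSON-RPC method. As a sketch, the request for `call_model` might be shaped like this; the argument names (`model`, `prompt`) are illustrative assumptions, not the server's documented schema:

```typescript
// Build an MCP tools/call request for the call_model tool.
// NOTE: the "arguments" keys below are assumptions for illustration only.
function buildCallModelRequest(model: string, prompt: string) {
  return {
    jsonrpc: "2.0" as const,
    id: 1,
    method: "tools/call", // standard MCP method for invoking a server tool
    params: {
      name: "call_model",
      arguments: { model, prompt },
    },
  };
}
```

An MCP client (Claude Desktop, Cursor, etc.) constructs and sends such requests for you; this only shows the wire shape.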

Key Features

  • Unified API across 12 AI providers
  • Access to 100+ models through a single MCP server
  • Automatic fallback when primary providers are unavailable
  • Cost optimization with model routing rules
  • Edge deployment for low-latency inference
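The automatic-fallback behavior can be pictured as trying providers in priority order and moving to the next on failure. This is a minimal sketch of the idea, not Neurolink's actual implementation; the `Provider` type is hypothetical:

```typescript
// Hypothetical provider shape for illustrating fallback order.
type Provider = {
  name: string;
  call: (prompt: string) => Promise<string>;
};

// Try each provider in priority order; on failure, fall through to the next.
async function callWithFallback(
  providers: Provider[],
  prompt: string
): Promise<string> {
  let lastError: unknown;
  for (const p of providers) {
    try {
      return await p.call(prompt);
    } catch (err) {
      lastError = err; // provider unavailable; try the next one
    }
  }
  throw new Error(`All providers failed: ${String(lastError)}`);
}
```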

Use Cases

  • Use the best AI model for each task without code changes
  • Avoid provider lock-in with a unified model interface
  • Implement cost-optimized AI routing strategies
  • Build resilient AI applications with automatic failover
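A cost-optimized routing rule, as in the third use case, can be as simple as picking the cheapest model that supports the required capability. A sketch under assumed types (the `ModelInfo` shape is hypothetical, not Neurolink's schema):

```typescript
// Hypothetical catalog entry; field names are assumptions for illustration.
type ModelInfo = {
  id: string;
  costPer1kTokens: number; // USD per 1k tokens
  capabilities: string[]; // e.g. "chat", "vision", "code"
};

// Route to the cheapest model that supports the needed capability,
// or undefined if none qualifies.
function routeByCost(
  models: ModelInfo[],
  capability: string
): ModelInfo | undefined {
  return models
    .filter((m) => m.capabilities.includes(capability))
    .sort((a, b) => a.costPer1kTokens - b.costPer1kTokens)[0];
}
```

Real routing rules would also weigh latency, context window, and quality, but the selection logic follows the same filter-then-rank pattern.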

Compatibility

  • Claude Desktop
  • Claude Code
  • Continue
  • Cursor

About Neurolink

Neurolink is an edge-first AI gateway that unifies access to 12 major AI providers and 100+ models through a single MCP interface. It handles provider-specific authentication, model routing, fallback strategies, and cost optimization. With Neurolink, AI agents can use the best model for each task without provider lock-in.

Language: TypeScript

Category: Agent Frameworks

