TL;DR
Building a bridge between a VS Code extension (TypeScript/Node.js) and a Tauri desktop app (Rust) requires choosing an IPC mechanism. The three options: WebSocket (1ms latency, bidirectional, most flexible), stdio pipes (sub-1ms, unidirectional per pipe, simpler), and HTTP localhost (2-5ms, request-response only, simplest). This guide covers the architecture, implementation, and tradeoffs of each approach, with production code for the WebSocket pattern.
The Three IPC Options
When your VS Code extension needs to communicate with a Tauri companion app, you have three architectural choices:
WebSocket IPC
The Tauri app opens a WebSocket server on a local port. The VS Code extension connects as a client. Bidirectional, event-driven communication. Latency: ~1ms. Best for: real-time state synchronization where both sides push updates.
stdio Pipes
The VS Code extension spawns the Tauri binary as a child process and communicates via stdin/stdout. Latency: <1ms. Best for: simple request/response patterns where the extension controls the Tauri process lifecycle.
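A minimal sketch of the stdio pattern, assuming newline-delimited JSON framing (one message per line) and a hypothetical binary path supplied by the extension's configuration:

```typescript
import { spawn, ChildProcess } from 'node:child_process';

// Newline-delimited JSON keeps message boundaries unambiguous over a byte stream.
function frame(msg: object): string {
  return JSON.stringify(msg) + '\n';
}

// Split buffered stdout into complete messages; the trailing partial frame is
// returned so it can be prepended to the next chunk.
function parseFrames(buffer: string): { messages: unknown[]; rest: string } {
  const parts = buffer.split('\n');
  const rest = parts.pop() ?? '';
  return { messages: parts.filter(Boolean).map((p) => JSON.parse(p)), rest };
}

// Hypothetical entry point — resolve `binaryPath` from your extension's config.
function startCompanion(binaryPath: string): ChildProcess {
  const child = spawn(binaryPath, { stdio: ['pipe', 'pipe', 'inherit'] });
  let pending = '';
  child.stdout?.on('data', (chunk: Buffer) => {
    const { messages, rest } = parseFrames(pending + chunk.toString('utf8'));
    pending = rest;
    for (const msg of messages) console.log('from companion:', msg);
  });
  child.stdin?.write(frame({ type: 'file_focus', path: '/tmp/example.ts' }));
  return child;
}
```

Because the extension owns the child process, shutdown is trivial: kill the child when the extension deactivates.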
HTTP Localhost
The Tauri app runs an HTTP server on localhost. The extension makes fetch() calls. Latency: 2-5ms. Best for: simple, stateless queries where you don't need persistent connections or server-push.
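A sketch of the HTTP pattern from the extension side. The `/status` route name is a placeholder — match it to whatever your Tauri HTTP server actually exposes:

```typescript
// Build a localhost URL for the companion's HTTP server.
function companionUrl(port: number, route: string): string {
  return `http://127.0.0.1:${port}/${route.replace(/^\//, '')}`;
}

// Hypothetical stateless query; relies on the global fetch in modern Node.
async function queryCompanion(port: number): Promise<unknown> {
  const res = await fetch(companionUrl(port, 'status'));
  if (!res.ok) throw new Error(`companion returned HTTP ${res.status}`);
  return res.json();
}
```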
The WebSocket Architecture
For AI context applications, WebSocket is the best fit: both sides need to push updates in real time, and a persistent connection avoids per-message setup cost. Here's the architecture:
```typescript
// VS Code Extension (Client Side):
import WebSocket from 'ws'; // VS Code extensions run in Node; the `ws` package is the usual client

const ws = new WebSocket('ws://127.0.0.1:9319');
ws.onopen = () => {
  console.log('Connected to Tauri');
  // Send only after the socket is open — sending while connecting throws.
  ws.send(JSON.stringify({ type: 'file_focus', path: currentFile }));
};
```
```rust
// Tauri App (Server Side — Rust):
use tokio::net::TcpListener;
use tokio_tungstenite::accept_async;

let listener = TcpListener::bind("127.0.0.1:9319").await?;
while let Ok((stream, _)) = listener.accept().await {
    let ws_stream = accept_async(stream).await?;
    // Handle messages from the VS Code extension
}
```
Production Considerations
Building production IPC between VS Code and Tauri has several non-obvious challenges:
Port Discovery
Don't hardcode ports. Have the Tauri app write its port to a well-known file path (~/.context-snipe/port). The extension reads this file on activation. If the file is stale (Tauri not running), the extension shows a 'Start Companion' button.
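Reading the port file might look like this on the extension side — a sketch assuming the file contains just the port number (your format may also carry a shared secret, as the Security section suggests):

```typescript
import * as fs from 'node:fs';
import * as os from 'node:os';
import * as path from 'node:path';

// The companion writes its port here, per the convention above.
const PORT_FILE = path.join(os.homedir(), '.context-snipe', 'port');

// Returns the advertised port, or null when the file is missing or invalid —
// the caller should then surface the 'Start Companion' action.
function readCompanionPort(file: string = PORT_FILE): number | null {
  let raw: string;
  try {
    raw = fs.readFileSync(file, 'utf8');
  } catch {
    return null;
  }
  const port = Number.parseInt(raw.trim(), 10);
  return Number.isInteger(port) && port > 0 && port < 65536 ? port : null;
}
```

Note that a readable file can still be stale (the Tauri process may have exited without cleanup), so the extension should treat a failed connection to the advertised port the same as a missing file.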
Reconnection Logic
The VS Code extension must handle disconnections gracefully. Implement exponential backoff reconnection (1s, 2s, 4s, max 30s). Show connection status in the VS Code status bar. The developer should never need to manually reconnect.
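The backoff schedule above (1s, 2s, 4s, capped at 30s) can be computed from the attempt number; a sketch, with a hypothetical `connect` callback standing in for the real WebSocket dial:

```typescript
// Exponential backoff matching the schedule above: 1s, 2s, 4s, ... capped at 30s.
function backoffMs(attempt: number): number {
  return Math.min(1000 * 2 ** attempt, 30_000);
}

// Hypothetical reconnect driver; `connect` should resolve once the socket opens
// and reject on failure. A real client would restart this loop on socket close.
async function reconnectForever(connect: () => Promise<void>): Promise<void> {
  for (let attempt = 0; ; attempt++) {
    try {
      await connect();
      return; // connected — update the status bar here
    } catch {
      await new Promise((resolve) => setTimeout(resolve, backoffMs(attempt)));
    }
  }
}
```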
Message Protocol
Define a typed message protocol with version numbers. Each message has: type (string enum), version (semver), and payload (typed JSON). Version the protocol so the extension and Tauri app can evolve independently.
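On the TypeScript side, the envelope might be modeled like this. The concrete message types are illustrative — define your own enum — and the compatibility rule shown (match on major version) is one common convention, not the only option:

```typescript
// Illustrative type enum; replace with your application's actual message types.
type MessageType = 'file_focus' | 'selection' | 'ping';

interface Envelope {
  type: MessageType;
  version: string; // protocol semver, e.g. "1.0.0"
  payload: unknown;
}

// Runtime guard for messages arriving off the wire.
function isEnvelope(value: unknown): value is Envelope {
  if (typeof value !== 'object' || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    ['file_focus', 'selection', 'ping'].includes(v.type as string) &&
    typeof v.version === 'string' &&
    /^\d+\.\d+\.\d+$/.test(v.version) &&
    'payload' in v
  );
}

// Accept any minor/patch within a major version the peer understands.
function compatible(version: string, supportedMajor: number): boolean {
  return Number.parseInt(version.split('.')[0], 10) === supportedMajor;
}
```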
Security
Bind to 127.0.0.1, never 0.0.0.0. Validate the WebSocket origin header to reject connections from untrusted sources. Use a shared secret (written to the port file) for authentication.
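The origin and secret checks run on the server side (Rust, in the architecture above); sketched here in TypeScript for consistency with the other examples. The origin policy shown — allow an absent header (non-browser clients like a VS Code extension typically send none) and reject anything else — is an assumption to adapt to what your extension actually sends:

```typescript
import { timingSafeEqual } from 'node:crypto';

// Browsers always send an Origin header, so rejecting any present Origin
// blocks web pages while still admitting the extension's Node-based client.
function originAllowed(origin: string | undefined): boolean {
  return origin === undefined;
}

// Constant-time comparison of the shared secret read from the port file,
// to avoid leaking the secret's prefix through timing differences.
function secretMatches(presented: string, expected: string): boolean {
  const a = Buffer.from(presented, 'utf8');
  const b = Buffer.from(expected, 'utf8');
  return a.length === b.length && timingSafeEqual(a, b);
}
```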
Context Snipe Uses This Exact Architecture
The VS Code extension ↔ Tauri companion IPC architecture described here is exactly how Context Snipe works in production. 1ms message latency. Automatic reconnection. Typed protocol. Secure by default.
🔧 Production IPC. VS Code ↔ Rust. 1ms latency.
Context Snipe's VS Code extension communicates with its Rust companion via WebSocket IPC — exactly as described in this guide. The bridge is invisible to the developer. Context flows automatically. Start free — no credit card →