Stop writing YAML, start using proxy.ts
Ditch the indentation errors and start treating your network routing like actual code with programmable TypeScript proxies.
It’s 2026. Why are we still fighting indentation errors in YAML files to route HTTP traffic?
For the last decade, we accepted a painful separation of concerns. Developers wrote application code in TypeScript, Go, or Rust, while "DevOps" managed traffic in static, declarative configuration files for Nginx, HAProxy, or Kubernetes Ingress.
That separation was a mistake.
The feedback loop of editing a .yaml file, committing it, waiting for a CI/CD pipeline to apply it to a staging cluster, and then realizing you missed a trailing slash is broken. It's far too slow for how we build software today.
It is time to stop configuring your network boundaries and start programming them. Enter proxy.ts.
The Death of Static Gateways
Traditional gateways treat your application as a black box. You tell Nginx to listen on port 80 and forward /api to upstream_a. This works fine until logic gets complex.
What if you need to route based on a JWT claim? What if you need to transform a legacy XML response into JSON before it hits the client? In the old world, you’d be writing Lua scripts for OpenResty or dealing with complex Envoy filters. You are essentially coding in configuration files—untestable, brittle, and decoupled from your source.
We need to move the network boundary out of the infrastructure layer and directly into the application layer.
By using a programmable proxy written in TypeScript, your routing logic becomes version-controlled code. It lives in your repo. It runs through your linter. It passes your unit tests. You aren't "configuring" a route; you are importing a function.
// proxy.ts
import { createProxy, route } from './lib/gateway';

export const gateway = createProxy({
  port: 8080,
  routes: [
    // Logic, not config
    route('/api/v1', async (req) => {
      if (req.headers.get('x-beta-user')) {
        return 'http://beta-service:3000';
      }
      return 'http://stable-service:3000';
    }),
  ],
});
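Routing on a JWT claim, mentioned earlier, fits the same handler pattern. A minimal sketch, where the `decodeJwtPayload` helper and the `tier` claim are assumptions for illustration; in production you would verify the token's signature first (e.g. with a library like `jose`) before trusting any claim:

```typescript
import { Buffer } from 'node:buffer';

// Hypothetical helper: decode a JWT's payload WITHOUT verifying it.
// Verify the signature before trusting claims in real deployments.
function decodeJwtPayload(token: string): Record<string, unknown> {
  const [, payload] = token.split('.');
  return JSON.parse(Buffer.from(payload, 'base64url').toString('utf8'));
}

// Pick an upstream based on a claim; plugs into route() as a handler.
export function resolveByTier(authHeader: string | null): string {
  if (!authHeader?.startsWith('Bearer ')) return 'http://stable-service:3000';
  const claims = decodeJwtPayload(authHeader.slice('Bearer '.length));
  return claims.tier === 'enterprise'
    ? 'http://priority-service:3000'
    : 'http://stable-service:3000';
}
```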
Type-Safe Boundaries and Zero CORS
The biggest lie we told ourselves about microservices was that they could be completely decoupled. In reality, your frontend relies heavily on specific contract structures from your backend services.
When you use Nginx, there is no compile-time guarantee that the route /users/:id actually exists on the upstream service. You find out it's broken when your users get a 404 or a 502 in production.
With proxy.ts, we can unify fragmented microservices under a single, type-safe domain boundary. By importing the API definitions of your microservices directly into your proxy code, you can catch routing errors at compile time.
Furthermore, we can finally kill the CORS headache.
CORS is a browser security feature, but it makes local development miserable when your frontend runs on localhost:3000 and your API runs on localhost:8080. Developers waste hours configuring Access-Control-Allow-Origin headers.
When your proxy is the entry point for everything—serving your static frontend assets and proxying your API calls—everything comes from the same origin. No pre-flight requests. No header overhead. Just clean, fast networking.
// The compiler screams if 'UserService' doesn't export a 'validate' method
import { UserService } from '@monorepo/user-service';

route('/api/user/:id', (req, params) => {
  // Type-safe contract validation before the request leaves the proxy
  return UserService.validate(params.id)
    ? 'http://user-service'
    : new Response('Invalid ID', { status: 400 });
});
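The same-origin setup can be sketched with the standard fetch Request/Response APIs. The details here are assumptions about your deployment (the /api/ prefix, the upstream address, the static-asset placeholder); the point is that one process answers both kinds of request, so the browser never sees a cross-origin call:

```typescript
const API_UPSTREAM = 'http://localhost:8080'; // assumed backend address

// Pure routing decision, kept separate so it's trivially testable.
export function classify(pathname: string): 'proxy' | 'static' {
  return pathname.startsWith('/api/') ? 'proxy' : 'static';
}

// One entry point for all browser traffic: same origin, no CORS pre-flight.
export async function handle(req: Request): Promise<Response> {
  const url = new URL(req.url);
  if (classify(url.pathname) === 'proxy') {
    // Re-target the incoming request at the internal API service.
    return fetch(new Request(API_UPSTREAM + url.pathname + url.search, req));
  }
  // Placeholder: serve the real frontend asset here (e.g. Bun.file / Deno.readFile).
  return new Response(`static asset: ${url.pathname}`);
}
```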

Environment Parity in One Script
The phrase "It works on my machine" usually signals drift between local development and production infrastructure.
In the YAML era, replicating a production topology locally was a nightmare. You had to spin up Minikube or Docker Compose with a dozen containers and a mocked ingress controller just to test a redirect.
With proxy.ts, your infrastructure dependencies are abstracted. You can write a single script that detects its environment.
When running in NODE_ENV=production, it resolves upstream services using internal DNS or Service Discovery. When running locally, it seamlessly maps those same routes to localhost ports where your fellow developers are running their services.
New developer onboarding simplifies drastically. They don't need to install Nginx or understand Kubernetes ingress objects. They run npm run start:proxy, and they have an exact mirror of the production routing layer running on their laptop.
const getTarget = (serviceName: string, localPort: number) => {
  const isProd = process.env.NODE_ENV === 'production';
  // In prod, use K8s DNS. Locally, use localhost ports.
  return isProd
    ? `http://${serviceName}.svc.cluster.local`
    : `http://localhost:${localPort}`;
};

// The routing logic remains identical across envs
route('/payments', () => getTarget('payment-service', 4001));
route('/auth', () => getTarget('auth-service', 4002));

Programmable Traffic Control
Complex traffic management usually requires expensive enterprise gateways or service meshes like Istio. If you want to do a canary release (send 5% of traffic to a new version), you are often stuck editing weight maps in a YAML file or navigating a cloud console.
When your proxy is code, traffic control is just... logic.
You can implement A/B testing, feature flagging, and canary releases using standard if/else statements or by pulling from your existing feature flag provider (like LaunchDarkly or a database). You don't need a plugin ecosystem; you just need TypeScript.
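A percentage canary, for instance, is a few lines of plain logic. This sketch (the service names are illustrative) hashes a stable key such as the user id, so each user stays pinned to one variant across requests instead of flipping on Math.random() every time:

```typescript
// Route a fixed percentage of users to the canary build. Hashing a
// stable key keeps each user on the same variant across requests.
export function canaryTarget(userId: string, canaryPercent: number): string {
  // Cheap stable hash (FNV-1a); any stable hash would do.
  let h = 2166136261;
  for (const ch of userId) {
    h ^= ch.charCodeAt(0);
    h = Math.imul(h, 16777619);
  }
  const bucket = (h >>> 0) % 100; // 0..99
  return bucket < canaryPercent
    ? 'http://checkout-v2' // canary
    : 'http://checkout-v1'; // stable
}
```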
You also gain granular control over the request lifecycle. Need to inject an authentication header for a legacy service? Do it mid-stream. Need to sanitize sensitive data from a response before it leaves the boundary? Parse the JSON, delete the field, and re-serialize.
Because this runs on modern JavaScript runtimes (like Bun or Deno in 2026), the latency overhead is negligible compared to the network hop itself.
route('/checkout', async (req) => {
  // Complex logic usually reserved for enterprise meshes.
  // getSession is app-specific: resolve the user from the request.
  const user = await getSession(req);

  // A/B test logic lives in the app, not the infra
  if (user.flags.has('new-checkout-flow')) {
    // Inject headers dynamically
    req.headers.set('X-Experiment-Group', 'B');
    return 'http://checkout-v2';
  }
  return 'http://checkout-v1';
});
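The response-sanitization case works the same way: pull the upstream JSON through the boundary, delete the sensitive fields, and re-serialize. A minimal sketch, assuming flat JSON objects (the field names are examples; nested redaction would walk the object recursively):

```typescript
// Strip top-level fields from an upstream JSON response before it
// leaves the boundary. Only handles flat objects.
export async function sanitize(
  upstream: Response,
  fields: string[],
): Promise<Response> {
  const body = (await upstream.json()) as Record<string, unknown>;
  for (const f of fields) delete body[f];
  return new Response(JSON.stringify(body), {
    status: upstream.status,
    headers: { 'content-type': 'application/json' },
  });
}
```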
The Next Step
The era of treating infrastructure configuration as a separate discipline from application development is closing. The tools are here to collapse the stack.
By adopting a programmable proxy, you regain control, visibility, and speed. You stop guessing if your YAML indentation is correct and start trusting your compiler.
Stop writing config. Start shipping code.