
Upstash Redis vs Redis Cloud vs Valkey 2026

By the APIScout Team

TL;DR

Upstash for serverless. Redis Cloud for always-on production workloads. Valkey for self-hosted. Upstash's HTTP API makes it the only one of the three that works in edge environments (Cloudflare Workers, Vercel Edge), and its per-request pricing means $0 when you're not using it. Redis Cloud is the managed version of standard Redis: better performance at high throughput, but production features start at $43/month. Valkey (the open-source Redis fork created after Redis re-licensed in 2024) is now the go-to for self-hosting, with growing cloud support.

Key Takeaways

  • Upstash: HTTP API, works in Cloudflare Workers/edge, $0.20/100K commands, free 10K/day
  • Redis Cloud: lowest latency (<1ms), persistent connections, $43/month minimum for production features
  • Valkey: 100% Redis-compatible open-source fork, self-hosted via Docker
  • Edge requirement: only Upstash supports Cloudflare Workers (HTTP, no persistent connections)
  • Rate limiting: Upstash @upstash/ratelimit is the standard for Next.js/edge apps
  • Session storage: all three work; Redis Cloud/Valkey have lower latency for session-heavy apps

Upstash: Redis for Serverless

Best for: Cloudflare Workers, Vercel Edge, Next.js, any serverless function that can't hold persistent connections.

// npm install @upstash/redis
import { Redis } from '@upstash/redis';

const redis = new Redis({
  url: process.env.UPSTASH_REDIS_REST_URL!,
  token: process.env.UPSTASH_REDIS_REST_TOKEN!,
});

// All standard Redis operations:
await redis.set('key', 'value', { ex: 3600 });     // With TTL
const value = await redis.get<string>('key');
await redis.del('key');

// Hash:
await redis.hset('user:123', { name: 'Alice', plan: 'pro' });
const user = await redis.hgetall<{ name: string; plan: string }>('user:123');

// Lists:
await redis.lpush('queue', 'job1', 'job2');
const job = await redis.rpop('queue');

// Sets:
await redis.sadd('active-users', 'user:123', 'user:456');
const count = await redis.scard('active-users');

Rate Limiting with @upstash/ratelimit

This is Upstash's killer feature for Next.js and edge apps:

// npm install @upstash/ratelimit @upstash/redis
import { Ratelimit } from '@upstash/ratelimit';
import { Redis } from '@upstash/redis';
import { NextRequest, NextResponse } from 'next/server';

const ratelimit = new Ratelimit({
  redis: Redis.fromEnv(),
  limiter: Ratelimit.slidingWindow(10, '10 s'),  // 10 requests per 10 seconds
  analytics: true,  // Track rate limit hits in Upstash console
});

// Next.js middleware:
export async function middleware(request: NextRequest) {
  const ip = request.ip ?? '127.0.0.1';

  const { success, limit, remaining, reset } = await ratelimit.limit(ip);

  if (!success) {
    return NextResponse.json(
      { error: 'Rate limit exceeded. Try again later.' },
      {
        status: 429,
        headers: {
          'X-RateLimit-Limit': limit.toString(),
          'X-RateLimit-Remaining': remaining.toString(),
          'X-RateLimit-Reset': reset.toString(),
          'Retry-After': Math.ceil((reset - Date.now()) / 1000).toString(),
        },
      }
    );
  }

  return NextResponse.next();
}

export const config = {
  matcher: '/api/:path*',
};

// Different limits per plan:
const rateLimiters = {
  free: new Ratelimit({
    redis: Redis.fromEnv(),
    limiter: Ratelimit.slidingWindow(10, '1 m'),   // 10/minute
  }),
  pro: new Ratelimit({
    redis: Redis.fromEnv(),
    limiter: Ratelimit.slidingWindow(100, '1 m'),  // 100/minute
  }),
};

export async function POST(request: NextRequest) {
  const session = await getSession();
  const limiter = session?.user?.plan === 'pro' ? rateLimiters.pro : rateLimiters.free;

  const { success } = await limiter.limit(session?.user?.id ?? request.ip ?? '127.0.0.1');
  if (!success) return new Response('Rate limited', { status: 429 });

  // ... your API logic
}
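Under the hood, a sliding window approximates a rolling count by weighting the previous fixed window by how much of it still overlaps the current moment. Here is a self-contained, single-process sketch of that idea — it is not Upstash's actual implementation (which runs the equivalent counting inside Redis), just the algorithm the `slidingWindow` limiter is based on:

```typescript
// Approximate sliding-window limiter: per-window counters, with the previous
// window weighted by its remaining overlap. In-memory only; @upstash/ratelimit
// keeps these counters in Redis so limits hold across serverless instances.
function makeSlidingWindow(limit: number, windowMs: number) {
  const counts = new Map<number, number>(); // window index -> request count

  return function allow(nowMs: number): boolean {
    const current = Math.floor(nowMs / windowMs);
    const inPrevious = counts.get(current - 1) ?? 0;
    const inCurrent = counts.get(current) ?? 0;

    // Fraction of the previous window still inside the sliding range.
    const overlap = 1 - (nowMs % windowMs) / windowMs;
    const estimated = inPrevious * overlap + inCurrent;

    if (estimated >= limit) return false;
    counts.set(current, inCurrent + 1);
    return true;
  };
}
```

The weighting is why a burst right at a window boundary still gets throttled: the previous window's count carries over proportionally instead of resetting to zero.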

Caching with Upstash

// Simple cache wrapper:
async function cachedQuery<T>(
  key: string,
  ttlSeconds: number,
  query: () => Promise<T>
): Promise<T> {
  const cached = await redis.get<T>(key);
  if (cached !== null) return cached;

  const result = await query();
  await redis.set(key, result, { ex: ttlSeconds });
  return result;
}

// Usage:
const user = await cachedQuery(
  `user:${userId}`,
  300,  // 5 minutes
  () => db.users.findUnique({ where: { id: userId } })
);
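The wrapper above leaves one gap: when the underlying row changes, the cached copy stays stale until the TTL expires. The usual fix is write-through invalidation, deleting the key right after the write. A sketch against a minimal cache interface so it runs without a live Redis — the real `redis` client satisfies the same three methods, and `saveUser` stands in for your actual database write:

```typescript
// Minimal cache interface; the real Upstash/ioredis client provides
// compatible get/set/del methods.
interface CacheLike {
  get(key: string): Promise<string | null>;
  set(key: string, value: string): Promise<void>;
  del(key: string): Promise<void>;
}

// In-memory implementation, used here so the pattern is testable offline.
function memoryCache(): CacheLike {
  const store = new Map<string, string>();
  return {
    async get(k) { return store.get(k) ?? null; },
    async set(k, v) { store.set(k, v); },
    async del(k) { store.delete(k); },
  };
}

// Write-through invalidation: persist first, then drop the stale entry so
// the next cachedQuery() call repopulates it.
async function updateUser(
  cache: CacheLike,
  userId: string,
  saveUser: (id: string) => Promise<void>
): Promise<void> {
  await saveUser(userId);
  await cache.del(`user:${userId}`);
}
```

Deleting rather than overwriting keeps the cache logic in one place: only the read path ever writes cache entries.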

Upstash Pricing

Free: 10,000 commands/day
Pay-as-you-go: $0.20/100K commands

For reference:
  1M API requests, each doing 1 Redis read: $2/month
  10M API requests, each doing 2 Redis ops: $40/month
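The arithmetic behind those reference numbers, written out as a helper. The rate is taken from the table above; check upstash.com/pricing before relying on it:

```typescript
// Pay-as-you-go cost: $0.20 per 100K commands.
function upstashPaygCost(commandsPerMonth: number): number {
  const RATE_PER_100K = 0.2;
  return (commandsPerMonth / 100_000) * RATE_PER_100K;
}

// 1M commands/month  -> $2
// 20M commands/month -> $40
```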

Standard plans (always-on, lower latency):
  $10/month: 100M commands/month
  $25/month: 500M commands/month

Redis Cloud: Production-Grade

Best for: high-throughput apps needing <1ms latency, persistent connections, complex data structures.

// Standard Redis client (ioredis):
import Redis from 'ioredis';

const redis = new Redis({
  host: process.env.REDIS_HOST,
  port: parseInt(process.env.REDIS_PORT ?? '6379'),
  password: process.env.REDIS_PASSWORD,
  tls: {},  // Redis Cloud requires TLS
  retryStrategy: (times) => Math.min(times * 50, 2000),
  maxRetriesPerRequest: 3,
});

// Connection pooling for serverless (use lazyConnect):
const redis = new Redis({
  host: process.env.REDIS_HOST,
  password: process.env.REDIS_PASSWORD,
  tls: {},
  lazyConnect: true,
});
await redis.connect();

// Pipeline for batching multiple operations (reduces round trips):
const pipeline = redis.pipeline();
pipeline.get('key1');
pipeline.get('key2');
pipeline.set('key3', 'value3', 'EX', 300);
const results = await pipeline.exec();
// Each entry is an [error, result] pair:
// [ [null, 'value1'], [null, 'value2'], [null, 'OK'] ]
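Pipelines pay off most when you batch many keys at once, and in practice you cap each pipeline's size so one huge batch doesn't monopolize the connection. A generic chunking helper (our own utility, not part of ioredis) makes that easy:

```typescript
// Split a list of items into batches of at most `size`, e.g. before
// issuing one pipeline per batch.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Usage sketch with ioredis (assumes the `redis` client from above):
// for (const batch of chunk(keys, 100)) {
//   const p = redis.pipeline();
//   batch.forEach((k) => p.get(k));
//   await p.exec();
// }
```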

// Pub/Sub for real-time features:
const subscriber = new Redis({ host: process.env.REDIS_HOST, password: process.env.REDIS_PASSWORD });
const publisher = new Redis({ host: process.env.REDIS_HOST, password: process.env.REDIS_PASSWORD });

await subscriber.subscribe('notifications');

subscriber.on('message', (channel, message) => {
  const notification = JSON.parse(message);
  // Handle notification...
});

// Publish from another process:
await publisher.publish('notifications', JSON.stringify({
  userId: '123',
  type: 'message',
  content: 'New message from Alice',
}));

Redis Cloud Pricing

Free: 30MB (dev/testing only)
Essentials: $7/month (250MB, no persistence)
Production:
  $43/month: 1GB RAM, persistence, replication
  $107/month: 3GB RAM

Redis Cloud requires minimum $43/month for production features.
Compare to Upstash: $10/month for equivalent workloads at moderate QPS.
Redis Cloud wins at >1M commands/day where per-request pricing adds up.

Valkey: Self-Hosted Redis (Open Source)

Valkey is the Redis fork created by ex-Redis engineers after Redis changed its license in 2024. It's now the default "open source Redis" maintained by the Linux Foundation.

# docker-compose.yml — run Valkey locally or on a VPS:
services:
  valkey:
    image: valkey/valkey:8
    ports:
      - "6379:6379"
    command: valkey-server --save 60 1 --loglevel warning --requirepass ${REDIS_PASSWORD}
    volumes:
      - valkey_data:/data
    restart: unless-stopped

volumes:
  valkey_data:

# Valkey CLI:
valkey-cli -a $REDIS_PASSWORD ping
# PONG

# Or use redis-cli (100% compatible):
redis-cli -h localhost -a $REDIS_PASSWORD

Valkey is 100% drop-in compatible with Redis — any Redis client works:

// Works with any Redis client (ioredis, redis npm package):
import Redis from 'ioredis';

const valkey = new Redis({
  host: 'your-vps-ip',
  port: 6379,
  password: process.env.REDIS_PASSWORD,
});

// Exact same API as Redis
await valkey.set('test', 'hello');
const val = await valkey.get('test');

Valkey monthly cost: just your VPS ($5-20/month on DigitalOcean/Hetzner).


Common Patterns: Session Storage

// Session storage with any Redis:
// (Upstash example, same for Redis Cloud/Valkey)

const SESSION_TTL = 7 * 24 * 60 * 60;  // 7 days in seconds

async function createSession(userId: string): Promise<string> {
  const sessionId = crypto.randomUUID();
  await redis.set(
    `session:${sessionId}`,
    JSON.stringify({ userId, createdAt: Date.now() }),
    { ex: SESSION_TTL }
  );
  return sessionId;
}

async function getSession(sessionId: string) {
  const data = await redis.get<{ userId: string; createdAt: number }>(`session:${sessionId}`);
  return data;
}

async function deleteSession(sessionId: string) {
  await redis.del(`session:${sessionId}`);
}

// Extend session on activity:
async function extendSession(sessionId: string) {
  await redis.expire(`session:${sessionId}`, SESSION_TTL);
}
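Calling expire on every request doubles your command count. A common refinement is to refresh only once some fraction of the TTL has elapsed since the last refresh — this assumes sessions also store a lastRefreshedAt timestamp, an addition to the session shape shown above:

```typescript
// Decide whether a session's TTL is worth refreshing. Pure, so it's easy
// to test; call extendSession() only when this returns true.
function shouldExtend(
  lastRefreshedAtMs: number,
  nowMs: number,
  ttlSeconds: number,
  threshold = 0.5 // refresh once half the TTL has elapsed
): boolean {
  return nowMs - lastRefreshedAtMs > ttlSeconds * 1000 * threshold;
}
```

With a 7-day TTL and the default threshold, a session active daily gets one EXPIRE call roughly every 3.5 days instead of one per request.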

Valkey Cloud: Managed Options Growing

Valkey (the Linux Foundation's Redis fork) started as self-hosted-only, but managed cloud options are growing in 2026. Aiven for Valkey and Upstash (which added Valkey support) let you run Valkey without managing your own server. The distinction matters: if you want open-source Redis compatibility with no operational burden, Valkey Cloud is now a viable option.

Why Valkey over Redis Cloud managed? The license. Redis re-licensed in 2024 to SSPL (Server Side Public License), which restricts cloud providers from offering Redis as a managed service without a commercial agreement. Valkey remains BSD-licensed, meaning any cloud provider can offer it, which drives competition and lower prices. AWS ElastiCache now supports Valkey as a first-class engine (replacing Redis in their managed offering), and GCP Memorystore for Redis is transitioning. For teams on AWS or GCP, Valkey on their native managed service offers the lowest latency (same region as your application) and eliminates the need for a third-party Redis provider.

The catch: Valkey is newer. Version 8 is the current stable release, with somewhat less community documentation than Redis. For standard operations (get/set/expire, pub/sub, lists, sorted sets), Valkey is production-stable. For newer Redis-specific features (Redis modules, RediSearch, RedisJSON), verify Valkey compatibility first.


Real Cost Comparison for Typical SaaS Workloads

Abstract pricing is hard to evaluate — here's how the three options compare for specific workloads:

Low-volume startup (< 1M commands/month):

  • Upstash free tier: $0/month (10,000 commands/day = ~300K/month)
  • Redis Cloud free: 30MB (sufficient for dev/staging, not production)
  • Valkey self-hosted: $5-10/month (small VPS)

Winner: Upstash for serverless, Valkey for always-on.

Mid-volume SaaS (10M commands/month, 10K daily active users):

  • Upstash pay-as-you-go: $20/month (10M commands = 100 blocks of 100K at $0.20 each)
  • Redis Cloud Essentials: $7/month (250MB) — watch the memory limit
  • Valkey on DigitalOcean Droplet: $12/month (2GB RAM)

Winner: Redis Cloud or Valkey, depending on memory needs.

High-volume SaaS (200M commands/month, 100K+ DAU):

  • Upstash fixed plans: $25-100/month (500M-2B commands included)
  • Redis Cloud Production: $43-107/month (1-3GB RAM, replication)
  • Valkey on Hetzner (EU) or AWS ElastiCache: $30-60/month

Winner: Redis Cloud if you need enterprise support; Valkey on AWS if you're already on AWS.

The edge case that changes everything: if ANY of your code runs in Cloudflare Workers, Vercel Edge, or Deno Deploy, Upstash is the only option of the three. These environments don't support the long-lived TCP connections standard Redis clients require; only HTTP-based access (Upstash's core feature) works.


Which One to Use

Use UPSTASH if:
  → Serverless / edge functions (required for Cloudflare Workers)
  → Irregular traffic (scale to zero cost)
  → Rate limiting in Next.js middleware
  → Simple caching with per-request billing
  → Cost: $0 at low volume, predictable at scale

Use REDIS CLOUD if:
  → Long-running servers (not serverless)
  → >10M commands/day (per-request pricing gets expensive)
  → Need <1ms latency with persistent connections
  → Complex pub/sub or Redis Streams
  → Enterprise support needed

Use VALKEY (self-hosted) if:
  → Full control required
  → Data sovereignty / compliance
  → Cost optimization at scale (VPS cost only)
  → Already running own infrastructure
  → Development and staging environments

Pub/Sub and Real-Time Features

Upstash's HTTP API creates a fundamental limitation for real-time use cases: you can't subscribe to channels over HTTP because subscriptions require a persistent connection. This is the clearest boundary between Upstash and Redis Cloud/Valkey.

For pub/sub (chat, live feeds, collaborative features), Redis Cloud and Valkey both support the standard SUBSCRIBE/PUBLISH commands over persistent TCP connections. A typical pattern: one connection per subscriber (or a connection pool), messages published by your API server, and subscribers listening in a background process:

// Redis Cloud pub/sub — requires persistent connection (not edge):
const subscriber = new Redis({ host: process.env.REDIS_HOST, password: process.env.REDIS_PASSWORD });
const publisher = new Redis({ host: process.env.REDIS_HOST, password: process.env.REDIS_PASSWORD });

await subscriber.subscribe('user:notifications');
subscriber.on('message', (channel, message) => {
  const payload = JSON.parse(message);
  broadcastToWebSocketClients(payload.userId, payload.data);
});

await publisher.publish('user:notifications', JSON.stringify({
  userId: '123',
  data: { type: 'message', content: 'You have a new message' },
}));

Upstash offers subscribe() for basic pub/sub via HTTP long-polling rather than true TCP — latency is higher and it's not suitable for high-frequency message streams. For real-time collaborative features or chat, Redis Cloud or Valkey running in a long-lived process (Railway, Fly.io, Render) is the right choice.
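If you're stuck in an HTTP-only environment and need something pub/sub-adjacent, a list used as a queue is a common fallback: producers LPUSH, consumers poll with RPOP on an interval. Each message goes to exactly one consumer (a queue, not a broadcast), but it works over plain request/response. The sketch below targets a two-method interface so it runs without a live Redis; the real `@upstash/redis` client provides the same `lpush`/`rpop` methods:

```typescript
// The two Redis list operations this pattern needs.
interface QueueLike {
  lpush(key: string, value: string): Promise<number>;
  rpop(key: string): Promise<string | null>;
}

// In-memory stand-in, used here so the pattern is testable offline.
function memoryQueue(): QueueLike {
  const lists = new Map<string, string[]>();
  return {
    async lpush(k, v) {
      const l = lists.get(k) ?? [];
      l.unshift(v);
      lists.set(k, l);
      return l.length;
    },
    async rpop(k) {
      return lists.get(k)?.pop() ?? null;
    },
  };
}

// Pop messages until the list is empty; call this on a polling interval
// (or per request in a serverless handler). Returns how many were handled.
async function drainQueue(
  q: QueueLike,
  key: string,
  handle: (msg: string) => void
): Promise<number> {
  let processed = 0;
  for (;;) {
    const msg = await q.rpop(key);
    if (msg === null) return processed;
    handle(msg);
    processed++;
  }
}
```

LPUSH + RPOP gives FIFO ordering; polling latency is your interval, which is fine for notifications and background work but not for chat-grade real-time.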

Redis Streams for Persistent Event Queues

Redis Streams are a better choice than pub/sub when you need message persistence, consumer groups, and guaranteed delivery. Unlike pub/sub (fire-and-forget), Streams store messages and allow multiple consumer groups to process the same stream independently.

// Produce to a stream (Upstash-compatible via HTTP):
await redis.xadd('events:orders', '*', {
  orderId: 'order_abc123',
  customerId: 'cust_xyz',
  amount: '9900',
  status: 'created',
});

// Consume with consumer group (Redis Cloud/Valkey, persistent worker):
const results = await redis.xreadgroup(
  'GROUP', 'order-processors', 'worker-1',
  'COUNT', '10',
  'STREAMS', 'events:orders', '>'
);

for (const [id, fields] of results?.[0]?.[1] ?? []) {
  // ioredis returns fields as a flat [key, value, key, value, ...] array,
  // so rebuild the entry object before handing it off:
  const entry: Record<string, string> = {};
  for (let i = 0; i < fields.length; i += 2) entry[fields[i]] = fields[i + 1];
  await processOrder(entry);
  await redis.xack('events:orders', 'order-processors', id);
}

Streams are particularly valuable for webhooks, audit logging, and async job processing where you need at-least-once delivery guarantees. Upstash supports Streams via its HTTP API for lower-volume use cases, making it usable for edge-native event queues where Cloudflare Queues or similar isn't available.

Choosing Between Upstash Plans

Upstash's per-request pricing is transparent but can surprise teams as they scale. The key breakeven analysis: compare your expected monthly commands to the fixed plan thresholds.

At 1M commands/day (30M/month), per-request pricing at $0.20/100K commands costs $60/month. Upstash's Standard plan at $10/month offers 100M commands — this is the cost floor for dedicated instances. For bursty workloads with days of low usage, pay-as-you-go is cheaper. For consistent high-volume applications, the fixed plans save money.
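That breakeven is easy to write down as a helper. Figures come from the pricing sections above; verify against Upstash's current pricing before relying on them:

```typescript
// Compare pay-as-you-go against the $10/month fixed plan (100M commands
// included). Breakeven lands at 5M commands/month.
function cheaperPlan(commandsPerMonth: number): 'pay-as-you-go' | 'standard' {
  const payg = (commandsPerMonth / 100_000) * 0.2; // $0.20 per 100K
  return payg < 10 ? 'pay-as-you-go' : 'standard';
}
```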

The analytics feature (analytics: true in the Ratelimit constructor) adds approximately 2 commands per rate limit check — a small overhead that adds up at scale. Disable it in production if you're cost-sensitive and not actively using the Upstash console analytics.

Valkey Managed Options

Self-hosting Valkey on a VPS is the cheapest option, but managed Valkey is available if you want the open-source license without the operational overhead. Aiven and several cloud providers now offer managed Valkey clusters. The managed options are newer than Redis Cloud and have less enterprise track record, but they provide the same Linux Foundation-governed open-source binary without Redis Ltd.'s SSPL licensing terms.

The main reason to choose Valkey over Redis Cloud today is licensing: Valkey is BSD-3-Clause, Redis 7.4+ is SSPL. For companies with open-source distribution requirements or that prefer OSI-approved licenses, Valkey is the compliant path. For teams that don't care about license terms and want the most mature managed option, Redis Cloud has more operational history.

Operational Considerations: Persistence, Backups, and High Availability

The operational model differs significantly across all three options — and it's often the deciding factor for production deployments.

Upstash manages all infrastructure for you. Replication, failover, and backups are handled automatically. The serverless tier doesn't guarantee low-latency persistence (writes are eventually durable), but Upstash's Standard and Pro plans include daily backups and automatic failover. You have no visibility into or control over the underlying infrastructure — which is both the point and the limitation.

Redis Cloud offers the most mature managed experience. High availability (automatic failover with a replica) requires the Production tier ($43+/month) — the Essentials plan is a single node with no replica. Production clusters support multi-AZ replication and automatic backup to S3. Redis Cloud's SLA is 99.99% uptime for the Production tier. The trade-off: you're locked into Redis Ltd.'s pricing and SSPL licensing, and the per-GB pricing is higher than self-hosted equivalents.

Valkey self-hosted requires you to manage everything: replication setup, Sentinel or Cluster mode for high availability, backup configuration, and monitoring. A minimal production-grade Valkey setup for high availability requires at least 3 nodes (1 primary + 2 replicas in Sentinel mode) or a 6-node minimum for Cluster mode. This is meaningful operational overhead — plan for it in your infrastructure investment. The reward: full control, predictable costs, and no licensing constraints.

For teams with a dedicated DevOps function, Valkey's operational flexibility and cost at scale make it competitive. For teams where infrastructure management is a distraction from product development, Upstash (for serverless) or Redis Cloud (for persistent workloads) provides better price-to-engineer-time value.

Monitoring and Observability

Each option has different observability support:

Upstash provides a built-in console with command latency graphs, daily bandwidth usage, and real-time command monitoring. For custom alerting, Upstash supports webhook alerts on threshold breaches (e.g., database size approaching the plan limit). Integration with Datadog and Grafana is available via the Upstash metrics API.

Redis Cloud offers detailed cluster metrics in its UI: memory usage, operations per second per shard, hit/miss ratios, keyspace size, and connected clients. The Pro tier integrates with Datadog, Grafana, and Prometheus. Redis Insight (Redis Ltd.'s free GUI tool) connects to Redis Cloud clusters and provides real-time profiling, latency analysis, and key-level inspection.

Valkey self-hosted gives you INFO statistics via the CLI, which any Redis monitoring tool can consume. RedisInsight, Prometheus Redis Exporter, and Grafana are the standard monitoring stack — configure the redis_exporter sidecar alongside your Valkey container and point Prometheus at it. This requires setup but gives you the most flexible and powerful observability of the three options, including custom alerts on any Redis metric.
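If you script your own checks against Valkey (or any Redis), the INFO text is easy to parse: `# Section` headers, `key:value` lines, CRLF separators. A minimal parser — feed it the string returned by `redis.info()` in ioredis:

```typescript
// Parse INFO output into a flat record, skipping section headers and blanks.
// Splits on the first ':' only, since values can themselves contain colons.
function parseInfo(raw: string): Record<string, string> {
  const metrics: Record<string, string> = {};
  for (const line of raw.split(/\r?\n/)) {
    if (!line || line.startsWith('#')) continue;
    const sep = line.indexOf(':');
    if (sep === -1) continue;
    metrics[line.slice(0, sep)] = line.slice(sep + 1);
  }
  return metrics;
}
```

From there, alerting is a comparison, e.g. flag when `Number(parseInfo(raw).used_memory)` crosses a threshold.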

Methodology

Pricing sourced from Upstash (upstash.com/pricing), Redis Cloud (redis.io/cloud/pricing), and Valkey's Docker Hub page as of March 2026. Upstash per-request pricing applies to the serverless (pay-as-you-go) plan; Standard plans are billed monthly at fixed rates. Redis Cloud pricing reflects the Essentials tier (single node, no persistence) and the Pro tier (with persistence and replication).

Valkey version 8.x is the current stable release, maintained by the Linux Foundation's Valkey project (valkey.io). All Redis client libraries (ioredis, the redis npm package) are compatible with Valkey without code changes. Valkey's Sentinel mode requires a minimum of 3 Sentinel processes (typically co-located with Redis nodes) to achieve quorum-based leader election; 2-node Sentinel configurations cannot achieve quorum and should not be used in production.

Upstash pub/sub latency figures are estimates based on HTTP round-trip characteristics vs. TCP; actual latency depends on geographic proximity to Upstash's edge locations. The @upstash/redis TypeScript SDK wraps all Redis commands in fetch-compatible HTTP calls; commands that require long-held connections (blocking reads like BLPOP, SUBSCRIBE) either time out or fall back to polling, a known limitation documented in Upstash's SDK README.

Redis Streams consumer group behavior (XREADGROUP, XACK) is consistent across Upstash, Redis Cloud, and Valkey; all three implement the Redis Streams specification without API differences.


Find and compare caching and Redis-compatible APIs at APIScout.

Related: Upstash vs Redis Cloud API: Serverless Redis 2026, Auth0 vs Firebase Auth, Best Database-as-a-Service APIs in 2026
