# From O(n) to O(1): How We Fixed Our API Key Validation Performance
When you're building a SaaS API, every millisecond counts. We recently discovered that our API key validation was a ticking time bomb—and fixed it before it exploded.
## The Problem
Our API uses bearer tokens for authentication. Every request includes an API key:
```http
Authorization: Bearer paymint_production_apikey_a1b2c3...
```
For security, we encrypt API keys before storing them in the database. The encryption uses AES-256-GCM, which means we can't simply query for a matching key—we have to decrypt to compare.
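The post relies on `encrypt` and `decrypt` helpers without showing them. As a rough sketch (not the actual implementation), AES-256-GCM helpers built on `node:crypto` might look like this; the key is generated inline here purely for illustration, whereas in practice it would be loaded from the environment:

```typescript
import crypto from 'node:crypto';

// Stand-in for a 32-byte key loaded from the environment -- an assumption
// for this sketch, not the production setup.
const ENCRYPTION_KEY = crypto.randomBytes(32);

function encrypt(plaintext: string): string {
  const iv = crypto.randomBytes(12); // GCM's standard 96-bit IV, random per call
  const cipher = crypto.createCipheriv('aes-256-gcm', ENCRYPTION_KEY, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
  // Persist the IV and auth tag next to the ciphertext; both are needed to decrypt
  return [iv, cipher.getAuthTag(), ciphertext].map((b) => b.toString('hex')).join(':');
}

function decrypt(payload: string): string {
  const [iv, tag, ciphertext] = payload.split(':').map((h) => Buffer.from(h, 'hex'));
  const decipher = crypto.createDecipheriv('aes-256-gcm', ENCRYPTION_KEY, iv);
  decipher.setAuthTag(tag); // GCM authenticates as well as encrypts
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString('utf8');
}
```

The random IV in `encrypt` is what makes the scheme non-deterministic, which matters below.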
Here's what our original validation looked like:
```typescript
export async function validateApiKey(apiKey: string): Promise<ApiKeyValidation> {
  // Get ALL active API keys
  const apiKeyRecords = await database.apiKey.findMany({
    where: { status: 'active' },
  });

  // Decrypt each one and compare
  for (const record of apiKeyRecords) {
    const decryptedKey = decrypt(record.encryptedKey);
    if (decryptedKey === apiKey) {
      return { isValid: true, organizationId: record.organizationId };
    }
  }

  return { isValid: false };
}
```
This works. But there's a problem hiding in plain sight.
## The Math
Let's say decryption takes 5ms per key (it's actually faster, but let's be conservative).
| Active API Keys | Time to Validate |
|---|---|
| 10 | 50ms |
| 100 | 500ms |
| 1,000 | 5 seconds |
| 10,000 | 50 seconds |
Every API request—fetching products, listing subscriptions, canceling a subscription—would need to wait for this validation. At 1,000 keys, we'd be adding 5 seconds of latency to every single request.
This is O(n) complexity. As our customer base grows, performance degrades linearly.
## Why Not Just Query the Encrypted Key?
You might think: "Just store the encrypted key and query for it directly."
```sql
SELECT * FROM api_keys WHERE encrypted_key = ?
```
This doesn't work because encryption is non-deterministic. AES-GCM uses a random initialization vector (IV) for each encryption, so encrypting the same plaintext twice produces different ciphertexts.
```typescript
encrypt('my_api_key'); // => 'abc123...'
encrypt('my_api_key'); // => 'xyz789...' (different!)
```
This is actually a security feature—it prevents attackers from identifying duplicate keys by comparing ciphertexts.
## The Solution: Hash-Based Lookup
The fix is elegant: store a hash of the API key alongside the encrypted version.
Unlike encryption, hashing is deterministic—the same input always produces the same output. And unlike encryption, we don't need to reverse it. We just need to find a match.
```typescript
import crypto from 'node:crypto';

function hashApiKey(apiKey: string): string {
  return crypto.createHash('sha256').update(apiKey).digest('hex');
}
```
### New Schema
```prisma
model ApiKey {
  id           String  @id @default(uuid())
  keyHash      String? @unique // SHA-256 hash for O(1) lookup
  encryptedKey String          // AES-256-GCM encrypted key
  // ... other fields
}
```
### New Validation
```typescript
export async function validateApiKey(apiKey: string): Promise<ApiKeyValidation> {
  const keyHash = hashApiKey(apiKey);

  // O(1) lookup using unique index
  const record = await database.apiKey.findFirst({
    where: {
      keyHash: keyHash,
      status: 'active',
    },
  });

  if (!record) {
    return { isValid: false };
  }

  // Defense in depth: verify by decrypting
  const decryptedKey = decrypt(record.encryptedKey);
  if (decryptedKey !== apiKey) {
    return { isValid: false };
  }

  return {
    isValid: true,
    organizationId: record.organizationId,
  };
}
```
Now validation is O(1)—constant time regardless of how many API keys exist.
| Active API Keys | Old Time | New Time |
|---|---|---|
| 10 | 50ms | ~5ms |
| 100 | 500ms | ~5ms |
| 1,000 | 5s | ~5ms |
| 10,000 | 50s | ~5ms |
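The structural difference shows up even in a toy, in-memory version (no database, no encryption, just the shape of the lookup; names here are illustrative): a linear scan touches every key, while a hash-keyed index goes straight to the match.

```typescript
import crypto from 'node:crypto';

const sha256 = (s: string): string =>
  crypto.createHash('sha256').update(s).digest('hex');

// 10,000 fake keys standing in for the api_keys table
const keys = Array.from({ length: 10_000 }, (_, i) => `paymint_fake_key_${i}`);

// O(n): what the original validation does, structurally -- walk the whole list
function linearFind(target: string): string | undefined {
  return keys.find((k) => k === target);
}

// O(1): build a hash index once, then each lookup is a single Map probe
const byHash = new Map<string, string>(
  keys.map((k): [string, string] => [sha256(k), k]),
);
function hashedFind(target: string): string | undefined {
  return byHash.get(sha256(target));
}
```

The database's unique index on `keyHash` plays the role of the `Map` here: the cost of a lookup no longer depends on how many keys exist.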
## Why Keep the Encrypted Key?
You might wonder: if we have the hash, why bother with encryption?
Two reasons:

1. **Defense in depth**: After finding a hash match, we decrypt and verify. This protects against hash collisions (astronomically unlikely with SHA-256, but cheap to guard against).
2. **Key rotation**: If we ever need to re-encrypt keys (e.g., rotating the encryption key), we need the actual key value. The hash alone isn't reversible.
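As a sketch of what rotation could look like: `encryptWith`/`decryptWith` below are hypothetical variants of the helpers that take an explicit key (not the post's actual API). The important point is that `keyHash` is derived from the plaintext, so it survives rotation untouched.

```typescript
import crypto from 'node:crypto';

// Hypothetical keyed variants of the post's encrypt()/decrypt() helpers.
function encryptWith(key: Buffer, plaintext: string): string {
  const iv = crypto.randomBytes(12);
  const cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
  const ct = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
  return [iv, cipher.getAuthTag(), ct].map((b) => b.toString('hex')).join(':');
}

function decryptWith(key: Buffer, payload: string): string {
  const [iv, tag, ct] = payload.split(':').map((h) => Buffer.from(h, 'hex'));
  const decipher = crypto.createDecipheriv('aes-256-gcm', key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ct), decipher.final()]).toString('utf8');
}

// Rotation: decrypt under the old key, re-encrypt under the new one.
// keyHash needs no update, because it hashes the plaintext key.
function rotate(encryptedKey: string, oldKey: Buffer, newKey: Buffer): string {
  return encryptWith(newKey, decryptWith(oldKey, encryptedKey));
}
```

In production this would run as a batch job over the table, updating only `encryptedKey` on each row.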
## Migration Strategy
We couldn't just flip a switch—existing API keys didn't have hashes. Here's how we handled the migration:
### 1. Make the hash column nullable

```sql
ALTER TABLE "ApiKey" ADD COLUMN "keyHash" TEXT;
CREATE UNIQUE INDEX "ApiKey_keyHash_key" ON "ApiKey"("keyHash");
```
### 2. Add hash on new key creation

```typescript
function generateApiKey(environment: 'sandbox' | 'production') {
  const key = `paymint_${environment}_apikey_${crypto.randomBytes(32).toString('hex')}`;
  const keyHash = hashApiKey(key);
  const encryptedKey = encrypt(key);
  return { key, keyHash, encryptedKey };
}
```
### 3. Fallback for legacy keys

```typescript
export async function validateApiKey(apiKey: string): Promise<ApiKeyValidation> {
  const keyHash = hashApiKey(apiKey);

  // Try O(1) lookup first
  const record = await database.apiKey.findFirst({
    where: { keyHash, status: 'active' },
  });

  if (record) {
    // Verify and return
    const decryptedKey = decrypt(record.encryptedKey);
    if (decryptedKey === apiKey) {
      return { isValid: true, organizationId: record.organizationId };
    }
    return { isValid: false };
  }

  // Fallback: check legacy keys without hash
  return validateApiKeyLegacy(apiKey);
}

async function validateApiKeyLegacy(apiKey: string): Promise<ApiKeyValidation> {
  const records = await database.apiKey.findMany({
    where: { status: 'active', keyHash: null },
  });

  for (const record of records) {
    const decryptedKey = decrypt(record.encryptedKey);
    if (decryptedKey === apiKey) {
      // Auto-migrate: add hash for future O(1) lookups
      const keyHash = hashApiKey(apiKey);
      await database.apiKey.update({
        where: { id: record.id },
        data: { keyHash },
      });
      return { isValid: true, organizationId: record.organizationId };
    }
  }

  return { isValid: false };
}
```
The legacy fallback auto-migrates keys on first use. Over time, all keys get hashes, and the fallback path becomes unused.
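To confirm the fallback path really is draining, it helps to count the keys still missing a hash; a query along these lines (column names assumed from the schema above) can be watched until it reaches zero:

```sql
SELECT COUNT(*) FROM "ApiKey" WHERE "keyHash" IS NULL AND "status" = 'active';
```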
## Security Considerations
### Is storing a hash safe?
Yes. SHA-256 is a one-way function—you can't reverse it to get the original key. An attacker with database access would see:
```
keyHash:      "a1b2c3d4e5f6..."
encryptedKey: "encrypted_blob..."
```
They can't use the hash to authenticate (the API expects the original key), and they can't decrypt without the encryption key (stored separately in environment variables).
### What about rainbow tables?

API keys contain 32 bytes of randomness (64 hex characters). Rainbow tables are only practical for low-entropy inputs like passwords. The search space for the random portion of our keys is 16^64, or 2^256, which is... large.
### What about timing attacks?

For the final plaintext verification we use a constant-time comparison (the snippets above use `!==` for brevity). Note that `timingSafeEqual` throws if the inputs differ in length, so we check that first:

```typescript
import crypto from 'node:crypto';

function secureCompare(a: string, b: string): boolean {
  const bufA = Buffer.from(a);
  const bufB = Buffer.from(b);
  // timingSafeEqual throws on length mismatch, so short-circuit first
  if (bufA.length !== bufB.length) return false;
  return crypto.timingSafeEqual(bufA, bufB);
}
```
## Results

After deploying this change:

- **P50 latency**: Reduced by 40ms
- **P99 latency**: Reduced by 200ms
- **Database load**: Significantly reduced (no more full table scans)
More importantly, we removed a scaling bottleneck. Our API can now handle 10x more customers without degrading performance.
## Key Takeaways

1. **Audit your auth paths**: Authentication runs on every request. Even small inefficiencies compound.
2. **Encryption ≠ Hashing**: Encryption is reversible and non-deterministic. Hashing is one-way and deterministic. Use the right tool.
3. **Plan for scale**: Code that works at 10 customers might break at 10,000. Think about complexity classes.
4. **Migrate gracefully**: Use fallbacks and auto-migration to avoid big-bang deployments.
5. **Defense in depth**: Even with hash-based lookup, we still verify by decrypting. Belt and suspenders.
Building a SaaS? Check out Paymint—we handle subscription billing so you can focus on your product.
Tags: performance, security, api-design, database, optimization, saas, authentication

