Biometric Verification: Backend Architecture
Designing the backend architecture for biometric verification systems, covering liveness detection, face matching, secure storage, and integration patterns in TypeScript.
Biometric verification adds a powerful layer to identity checks. A forged document might pass visual inspection, but matching a live selfie to a passport photograph is significantly harder to fake. At the same time, biometric data is among the most sensitive categories of personal information. A leaked password can be changed; a leaked face template cannot. The backend architecture must be designed not just for accuracy but for security, privacy, and regulatory compliance from the ground up.
At Klivvr, Oasis integrates biometric verification as part of the identity verification pipeline. This article covers the backend architecture: how biometric data flows through the system, how we handle liveness detection and face matching, how templates are stored and protected, and the integration patterns we use with external biometric providers.
The Biometric Verification Flow
A biometric verification session in Oasis follows a defined sequence: the client captures a selfie, the backend verifies liveness, the face is compared against the reference document photo, and the result is recorded. Each step is handled by a dedicated component.
interface BiometricSession {
id: string;
customerId: string;
verificationRequestId: string;
status: BiometricSessionStatus;
selfieReference: string | null;
referenceDocumentId: string;
livenessResult: LivenessResult | null;
matchResult: FaceMatchResult | null;
createdAt: Date;
completedAt: Date | null;
}
type BiometricSessionStatus =
| "awaiting_capture"
| "processing_liveness"
| "processing_match"
| "completed"
| "failed";
interface LivenessResult {
isLive: boolean;
confidence: number;
method: "passive" | "active";
spoofType: string | null;
}
interface FaceMatchResult {
isMatch: boolean;
similarity: number;
threshold: number;
referenceQuality: number;
selfieQuality: number;
}
class BiometricVerificationService {
constructor(
private sessionStore: BiometricSessionStore,
private livenessDetector: LivenessDetector,
private faceMatcher: FaceMatchService,
private documentStore: DocumentStore,
private imageStore: SecureImageStore,
private auditLogger: AuditLogger
) {}
async createSession(
customerId: string,
verificationRequestId: string,
referenceDocumentId: string
): Promise<BiometricSession> {
const session: BiometricSession = {
id: generateUUID(),
customerId,
verificationRequestId,
status: "awaiting_capture",
selfieReference: null,
referenceDocumentId,
livenessResult: null,
matchResult: null,
createdAt: new Date(),
completedAt: null,
};
await this.sessionStore.save(session);
await this.auditLogger.log("biometric_session_created", {
sessionId: session.id,
customerId,
});
return session;
}
async processSelfie(
sessionId: string,
selfieData: Buffer
): Promise<BiometricVerificationResult> {
const session = await this.sessionStore.findById(sessionId);
if (!session) {
throw new SessionNotFoundError(`Session ${sessionId} not found`);
}
if (session.status !== "awaiting_capture") {
throw new InvalidSessionStateError(
`Session ${sessionId} is in state ${session.status}, expected awaiting_capture`
);
}
const selfieRef = await this.imageStore.storeEncrypted(
selfieData,
session.customerId
);
session.selfieReference = selfieRef;
session.status = "processing_liveness";
await this.sessionStore.save(session);
const livenessResult = await this.livenessDetector.detect(selfieData);
session.livenessResult = livenessResult;
if (!livenessResult.isLive) {
session.status = "failed";
session.completedAt = new Date();
await this.sessionStore.save(session);
await this.auditLogger.log("biometric_liveness_failed", {
sessionId,
spoofType: livenessResult.spoofType,
});
return {
success: false,
reason: "Liveness check failed",
session,
};
}
session.status = "processing_match";
await this.sessionStore.save(session);
const referenceDoc = await this.documentStore.findById(
session.referenceDocumentId
);
const referenceImage = await this.documentStore.fetchImage(
referenceDoc.fileReference
);
const matchResult = await this.faceMatcher.compare(
selfieData,
referenceImage
);
session.matchResult = matchResult;
session.status = "completed";
session.completedAt = new Date();
await this.sessionStore.save(session);
await this.auditLogger.log("biometric_verification_completed", {
sessionId,
isMatch: matchResult.isMatch,
similarity: matchResult.similarity,
});
return {
success: matchResult.isMatch,
reason: matchResult.isMatch ? "Verified" : "Face mismatch",
session,
};
}
}
The flow is deliberately sequential rather than parallel. Liveness must be confirmed before face matching proceeds, because running a face match against a spoofed selfie wastes resources and could produce misleading audit records.
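The sequencing can be made explicit with a small state table. This is an illustrative sketch, not part of the Oasis codebase: the `ALLOWED_TRANSITIONS` map and `canTransition` helper are hypothetical, and the status union is repeated here so the block is self-contained.

```typescript
// Encode the session's legal state transitions as data, so any
// out-of-order step (e.g. matching before liveness) fails fast.
type BiometricSessionStatus =
  | "awaiting_capture"
  | "processing_liveness"
  | "processing_match"
  | "completed"
  | "failed";

const ALLOWED_TRANSITIONS: Record<BiometricSessionStatus, BiometricSessionStatus[]> = {
  awaiting_capture: ["processing_liveness"],
  processing_liveness: ["processing_match", "failed"],
  processing_match: ["completed", "failed"],
  // Terminal states: nothing may follow.
  completed: [],
  failed: [],
};

function canTransition(
  from: BiometricSessionStatus,
  to: BiometricSessionStatus
): boolean {
  return ALLOWED_TRANSITIONS[from].includes(to);
}
```

A guard like this turns the prose rule "liveness before matching" into something the compiler and tests can hold the service to.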
Liveness Detection Strategies
Liveness detection determines whether the selfie was captured from a real, physically present person rather than a photograph, screen, or mask. There are two broad approaches: passive liveness and active liveness.
Passive liveness analyzes a single image for telltale signs of spoofing, such as screen moiré patterns, paper texture, or unnatural lighting. Active liveness asks the user to perform actions (turn their head, blink, smile) and analyzes the video sequence for natural motion.
interface LivenessDetector {
detect(imageData: Buffer): Promise<LivenessResult>;
}
class PassiveLivenessDetector implements LivenessDetector {
constructor(private provider: LivenessProvider) {}
async detect(imageData: Buffer): Promise<LivenessResult> {
const PASSIVE_LIVENESS_THRESHOLD = 0.8;
const analysis = await this.provider.analyzeImage(imageData);
const isLive = analysis.livenessScore >= PASSIVE_LIVENESS_THRESHOLD;
return {
isLive,
confidence: analysis.livenessScore,
method: "passive",
spoofType: isLive ? null : analysis.detectedSpoofType,
};
}
}
class ActiveLivenessDetector implements LivenessDetector {
constructor(
private provider: LivenessProvider,
private challengeGenerator: ChallengeGenerator
) {}
async detect(imageData: Buffer): Promise<LivenessResult> {
const challenge = this.challengeGenerator.generate();
const analysis = await this.provider.analyzeWithChallenge(
imageData,
challenge
);
const challengeCompleted = analysis.challengeResults.every(
(r) => r.completed && r.confidence >= 0.75
);
return {
isLive: challengeCompleted && analysis.livenessScore >= 0.7,
confidence: analysis.livenessScore,
method: "active",
spoofType: !challengeCompleted ? "challenge_failed" : null,
};
}
}
class ChallengeGenerator {
private readonly challenges = [
{ action: "turn_left", instruction: "Slowly turn your head to the left" },
{ action: "turn_right", instruction: "Slowly turn your head to the right" },
{ action: "blink", instruction: "Blink your eyes" },
{ action: "smile", instruction: "Smile naturally" },
];
generate(): LivenessChallenge {
const selectedActions = this.selectRandom(this.challenges, 2);
return {
id: generateUUID(),
actions: selectedActions,
timeoutSeconds: 30,
createdAt: new Date(),
};
}
private selectRandom<T>(array: T[], count: number): T[] {
// Fisher-Yates shuffle: sorting by a random comparator is biased and
// engine-dependent, which matters when challenges should be unpredictable.
const shuffled = [...array];
for (let i = shuffled.length - 1; i > 0; i--) {
const j = Math.floor(Math.random() * (i + 1));
[shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
}
return shuffled.slice(0, count);
}
}
We use passive liveness as the default because it requires no extra effort from the customer. Active liveness is reserved for cases where the passive check returns a borderline score or when the customer's risk profile warrants enhanced verification. This graduated approach minimizes friction for the majority of customers while providing stronger assurance for higher-risk cases.
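The graduated approach can be sketched as a detector that wraps a passive and an active implementation. The `GraduatedLivenessDetector` class below is an assumption for illustration, as are the risk flag and the borderline band of 0.6-0.85; they are not the production values.

```typescript
interface LivenessResult {
  isLive: boolean;
  confidence: number;
  method: "passive" | "active";
  spoofType: string | null;
}

interface LivenessDetector {
  detect(imageData: Buffer): Promise<LivenessResult>;
}

// Hypothetical composite: cheap passive check first, escalate to an
// active challenge only for borderline scores or high-risk customers.
class GraduatedLivenessDetector implements LivenessDetector {
  constructor(
    private passive: LivenessDetector,
    private active: LivenessDetector,
    private isHighRisk: boolean,
    private borderlineBand: [number, number] = [0.6, 0.85] // illustrative
  ) {}

  async detect(imageData: Buffer): Promise<LivenessResult> {
    const passiveResult = await this.passive.detect(imageData);
    const [low, high] = this.borderlineBand;
    const borderline =
      passiveResult.confidence >= low && passiveResult.confidence < high;
    if (borderline || this.isHighRisk) {
      // Escalation: the active result replaces the inconclusive passive one.
      return this.active.detect(imageData);
    }
    return passiveResult;
  }
}
```

Because the composite satisfies the same `LivenessDetector` interface, it can be swapped into `BiometricVerificationService` without changing the calling code.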
Face Matching Architecture
Face matching compares the selfie against the reference photo extracted from the customer's identity document. The comparison produces a similarity score, and a configurable threshold determines whether the match is accepted.
class FaceMatchService {
constructor(
private primaryProvider: FaceMatchProvider,
private fallbackProvider: FaceMatchProvider,
private config: FaceMatchConfig
) {}
async compare(
selfie: Buffer,
reference: Buffer
): Promise<FaceMatchResult> {
const selfieQuality = await this.assessFaceQuality(selfie);
const referenceQuality = await this.assessFaceQuality(reference);
if (selfieQuality < this.config.minimumFaceQuality) {
throw new InsufficientImageQualityError(
`Selfie quality ${selfieQuality} is below minimum ${this.config.minimumFaceQuality}`
);
}
let matchResult: ProviderMatchResult;
try {
matchResult = await this.primaryProvider.match(selfie, reference);
} catch (error) {
matchResult = await this.fallbackProvider.match(selfie, reference);
}
const adjustedThreshold = this.adjustThreshold(
this.config.baseThreshold,
selfieQuality,
referenceQuality
);
return {
isMatch: matchResult.similarity >= adjustedThreshold,
similarity: matchResult.similarity,
threshold: adjustedThreshold,
referenceQuality,
selfieQuality,
};
}
private adjustThreshold(
base: number,
selfieQuality: number,
referenceQuality: number
): number {
const qualityPenalty = Math.max(
0,
(1 - Math.min(selfieQuality, referenceQuality)) * 0.1
);
return Math.min(base + qualityPenalty, 0.95);
}
private async assessFaceQuality(image: Buffer): Promise<number> {
const analysis = await this.primaryProvider.analyzeFace(image);
return analysis.qualityScore;
}
}
interface FaceMatchConfig {
baseThreshold: number;
minimumFaceQuality: number;
maxRetries: number;
}
const DEFAULT_CONFIG: FaceMatchConfig = {
baseThreshold: 0.75,
minimumFaceQuality: 0.4,
maxRetries: 2,
};
An important detail is threshold adjustment based on image quality. When both images are high quality, a lower similarity threshold is acceptable because the comparison is more reliable. When either image is lower quality, we raise the threshold to compensate for the increased uncertainty. This prevents false positives from degraded images while maintaining a good experience for customers with clear photos.
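Pulling the adjustment out into a standalone function makes the arithmetic easy to check with concrete numbers. The standalone form below is for illustration only; the constants mirror `DEFAULT_CONFIG` above.

```typescript
function adjustThreshold(
  base: number,
  selfieQuality: number,
  referenceQuality: number
): number {
  // The worst of the two qualities drives the penalty: a perfect selfie
  // cannot compensate for a blurry document photo.
  const worstQuality = Math.min(selfieQuality, referenceQuality);
  const qualityPenalty = Math.max(0, (1 - worstQuality) * 0.1);
  // Cap the threshold so it never becomes practically unreachable.
  return Math.min(base + qualityPenalty, 0.95);
}

// Two perfect images: no penalty, the base threshold applies.
adjustThreshold(0.75, 1.0, 1.0); // → 0.75
// A marginal reference photo (quality 0.4) raises the bar by 0.06.
adjustThreshold(0.75, 0.95, 0.4); // ≈ 0.81
// Very poor inputs hit the 0.95 cap rather than growing without bound.
adjustThreshold(0.95, 0.0, 0.0); // → 0.95
```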
Secure Storage of Biometric Data
Biometric data requires special handling. Many jurisdictions classify it as a special category of personal data under privacy regulations like GDPR, requiring explicit consent and enhanced protection. We follow the principle of minimal retention: store the least amount of biometric data for the shortest time necessary.
interface SecureImageStore {
storeEncrypted(imageData: Buffer, ownerId: string): Promise<string>;
retrieveDecrypted(reference: string, requesterId: string): Promise<Buffer>;
scheduleDelete(reference: string, afterDate: Date): Promise<void>;
}
// Node's built-in crypto module backs the AES-256-GCM encryption below.
import crypto from "node:crypto";

class EncryptedBiometricStore implements SecureImageStore {
constructor(
private storage: ObjectStorage,
private keyManager: KeyManagementService,
private accessLog: AccessLogger
) {}
async storeEncrypted(imageData: Buffer, ownerId: string): Promise<string> {
const encryptionKey = await this.keyManager.generateDataKey(ownerId);
const encrypted = await this.encrypt(imageData, encryptionKey.plaintext);
const reference = `biometric/${ownerId}/${generateUUID()}`;
await this.storage.put(reference, encrypted, {
metadata: {
ownerId,
encryptionKeyId: encryptionKey.id,
storedAt: new Date().toISOString(),
},
});
await this.accessLog.logStore(reference, ownerId);
return reference;
}
async retrieveDecrypted(
reference: string,
requesterId: string
): Promise<Buffer> {
const stored = await this.storage.get(reference);
const keyId = stored.metadata.encryptionKeyId;
const decryptionKey = await this.keyManager.decryptDataKey(keyId);
const decrypted = await this.decrypt(stored.data, decryptionKey);
await this.accessLog.logAccess(reference, requesterId, "retrieve");
return decrypted;
}
async scheduleDelete(reference: string, afterDate: Date): Promise<void> {
await this.storage.setExpiration(reference, afterDate);
await this.accessLog.logScheduledDeletion(reference, afterDate);
}
private async encrypt(data: Buffer, key: Buffer): Promise<Buffer> {
const iv = crypto.randomBytes(16);
const cipher = crypto.createCipheriv("aes-256-gcm", key, iv);
const encrypted = Buffer.concat([cipher.update(data), cipher.final()]);
const authTag = cipher.getAuthTag();
return Buffer.concat([iv, authTag, encrypted]);
}
private async decrypt(data: Buffer, key: Buffer): Promise<Buffer> {
const iv = data.subarray(0, 16);
const authTag = data.subarray(16, 32);
const encrypted = data.subarray(32);
const decipher = crypto.createDecipheriv("aes-256-gcm", key, iv);
decipher.setAuthTag(authTag);
return Buffer.concat([decipher.update(encrypted), decipher.final()]);
}
}
Each biometric image is encrypted with a unique data key derived from the customer's master key. The data key is itself encrypted and stored alongside the data. This envelope encryption pattern means that even if the storage layer is compromised, the images cannot be decrypted without access to the key management service.
We schedule biometric images for deletion as soon as the verification decision is made. The raw selfie and reference images are typically deleted within 24 hours of verification completion. Only the match result (similarity score, liveness result) is retained for audit purposes. Every access to biometric data is logged with the requester's identity, creating a full access trail.
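The retention rule can be sketched as a small planning step run at verification completion. The `RetentionPlan` shape and `planRetention` helper are hypothetical names for illustration; only the 24-hour window is taken from the policy described above.

```typescript
// Raw biometric images become a liability once the decision is made;
// schedule them for deletion and keep only the scalar outcome.
const RAW_IMAGE_RETENTION_MS = 24 * 60 * 60 * 1000; // 24 hours

interface RetentionPlan {
  deleteImageRefs: string[];
  deleteAfter: Date;
  retainForAudit: { similarity: number; isLive: boolean };
}

function planRetention(
  completedAt: Date,
  selfieReference: string,
  referenceImageRef: string,
  similarity: number,
  isLive: boolean
): RetentionPlan {
  return {
    // Both raw images are scheduled, never retained indefinitely.
    deleteImageRefs: [selfieReference, referenceImageRef],
    deleteAfter: new Date(completedAt.getTime() + RAW_IMAGE_RETENTION_MS),
    // Only the match result survives for the audit trail.
    retainForAudit: { similarity, isLive },
  };
}
```

A plan like this would be fed into `SecureImageStore.scheduleDelete` for each reference as the final step of `processSelfie`.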
Error Handling and Retry Strategies
Biometric verification involves external services that can fail. Network timeouts, provider outages, and transient processing errors must be handled gracefully without forcing the customer to restart the entire process.
class ResilientBiometricService {
constructor(
private primaryService: BiometricVerificationService,
private circuitBreaker: CircuitBreaker,
private sessionStore: BiometricSessionStore
) {}
async processSelfieWithResilience(
sessionId: string,
selfieData: Buffer
): Promise<BiometricVerificationResult> {
if (this.circuitBreaker.isOpen()) {
return this.handleDegradedMode(sessionId, selfieData);
}
try {
const result = await this.primaryService.processSelfie(
sessionId,
selfieData
);
this.circuitBreaker.recordSuccess();
return result;
} catch (error) {
this.circuitBreaker.recordFailure();
if (this.isRetryable(error as Error)) {
return this.retryWithBackoff(sessionId, selfieData, 3);
}
throw error;
}
}
private async retryWithBackoff(
sessionId: string,
selfieData: Buffer,
maxAttempts: number
): Promise<BiometricVerificationResult> {
for (let attempt = 1; attempt <= maxAttempts; attempt++) {
try {
await this.sleep(Math.pow(2, attempt) * 1000);
return await this.primaryService.processSelfie(sessionId, selfieData);
} catch (error) {
if (attempt === maxAttempts) throw error;
}
}
throw new Error("Exhausted all retry attempts");
}
private async handleDegradedMode(
sessionId: string,
selfieData: Buffer
): Promise<BiometricVerificationResult> {
const session = await this.sessionStore.findById(sessionId);
await this.sessionStore.markForManualReview(sessionId, {
reason: "biometric_provider_unavailable",
selfieStored: true,
});
return {
success: false,
reason: "Verification temporarily unavailable, queued for manual review",
session: { ...session, status: "failed" },
};
}
private isRetryable(error: Error): boolean {
return (
error.message.includes("timeout") ||
error.message.includes("503") ||
error.message.includes("429")
);
}
private sleep(ms: number): Promise<void> {
return new Promise((resolve) => setTimeout(resolve, ms));
}
}
The circuit breaker pattern prevents cascading failures when a biometric provider is down. When the circuit is open, new requests are immediately routed to a degraded mode that queues the verification for manual review rather than failing outright. This ensures that customers can still complete onboarding even during provider outages, albeit with a slight delay.
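The `CircuitBreaker` used by `ResilientBiometricService` is not shown in the article. One minimal, count-based sketch could look like the following; the failure threshold, cooldown, and injectable clock are illustrative assumptions, not the production implementation.

```typescript
// Minimal count-based circuit breaker: opens after N consecutive
// failures, lets a trial request through after a cooldown (half-open).
class CircuitBreaker {
  private consecutiveFailures = 0;
  private openedAt: number | null = null;

  constructor(
    private failureThreshold = 5,
    private cooldownMs = 30_000,
    private now: () => number = Date.now // injectable for testing
  ) {}

  isOpen(): boolean {
    if (this.openedAt === null) return false;
    // After the cooldown elapses, report closed so a trial request
    // can probe the provider (half-open behavior).
    return this.now() - this.openedAt < this.cooldownMs;
  }

  recordSuccess(): void {
    this.consecutiveFailures = 0;
    this.openedAt = null;
  }

  recordFailure(): void {
    this.consecutiveFailures += 1;
    if (this.consecutiveFailures >= this.failureThreshold) {
      this.openedAt = this.now();
    }
  }
}
```

Injecting the clock keeps the breaker deterministic under test; a production version might also track a rolling error rate rather than a simple consecutive-failure count.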
Conclusion
Biometric verification strengthens identity assurance significantly, but the backend architecture must respect the sensitivity of the data involved. The patterns described here (a sequential verification flow, graduated liveness detection, quality-aware face matching, envelope-encrypted storage with scheduled deletion, and resilient error handling) form a system that is both effective and responsible.
The most important principle is minimal data retention. Biometric images should be treated as toxic assets: necessary for the moment they are used but a liability the moment they are no longer needed. By encrypting at rest, logging every access, and deleting images as soon as the verification decision is made, Oasis maintains strong security posture while delivering a verification experience that customers trust.