
Clay Filters and Conditions: Building Smart Enrichment Logic

Enriching every lead the same way wastes credits and misses context. Use Clay filters to route leads through different enrichment paths based on fit, source, and available data.


Published on February 22, 2026

Overview

Enriching every lead the same way is one of the most expensive mistakes GTM teams make. When your Clay table runs the same waterfall of enrichment providers against every row regardless of fit, source, or existing data, you burn credits on leads that will never convert and miss critical context on the ones that matter most.

The solution lies in Clay's filtering and conditional logic capabilities. By building intelligent routing into your enrichment workflows, you can direct high-fit leads through premium enrichment paths while routing low-fit or data-rich leads through lighter-touch processes. This approach protects your budget, improves data quality, and ensures your sales team receives the context they need to have meaningful conversations.

In this guide, we will walk through how to design conditional enrichment logic in Clay, from basic filters to multi-branch routing strategies. Whether you are building your first Clay workflow for outbound or optimizing an existing pipeline, these patterns will help you scale enrichment without scaling costs linearly.

Why One-Size-Fits-All Enrichment Fails

The default approach to enrichment treats every lead identically. A free trial signup from a 10-person startup gets the same premium data pulls as an enterprise demo request from a Fortune 500 company. This creates three distinct problems that compound as your pipeline grows.

Credit Burn on Low-Fit Leads

Most enrichment providers charge per successful lookup. Running Apollo, Clearbit, and ZoomInfo against every lead regardless of fit burns through your monthly allocation quickly. When a single comprehensive enrichment can cost $0.50 to $2.00 per lead, enriching 10,000 leads monthly without filtering wastes thousands on prospects who will never convert.

The math becomes painful when you analyze conversion rates by segment. If only 15% of your inbound leads match your ICP, you are spending 85% of your enrichment budget on leads your sales team will ultimately disqualify. Smart filtering inverts this equation by front-loading qualification before expensive enrichment.

Context Gaps on High-Value Prospects

Paradoxically, treating all leads equally often means under-enriching your best prospects. When budget constraints force you to limit enrichment depth across the board, enterprise accounts that deserve deep research get the same shallow data pulls as SMB leads. Your AEs end up researching manually what automation should have surfaced.

The opportunity cost compounds in competitive deals. When your team lacks the right Clay columns for personalization, they default to generic outreach that fails to demonstrate understanding of the prospect's specific situation. High-fit leads deserve premium enrichment paths that surface company initiatives, technology stack, organizational structure, and buying signals.

Redundant Data Pulls

Not all leads arrive with the same data completeness. Leads from LinkedIn Ads often include job title and company size. Leads from your CRM may already have phone numbers and historical engagement data. Running the same enrichment waterfall against these leads wastes credits pulling data you already possess.

Conditional logic solves this by checking existing fields before triggering enrichment. If a lead already has a verified mobile number, skip the phone enrichment step. If company size is known, skip the firmographic lookup. These micro-optimizations accumulate into significant savings at scale.

Clay Filter Fundamentals

Before building sophisticated routing logic, you need to understand Clay's filtering primitives. These building blocks combine to create nuanced enrichment paths that respond to lead characteristics dynamically.

Row-Level Filters

Clay's most basic filtering mechanism operates at the row level. You can configure any column to only run when specific conditions are met. For example, an enrichment column might only execute when the "Employee Count" field is empty or when "Lead Score" exceeds a threshold.

Row-level filters use simple conditional logic: equals, not equals, contains, greater than, less than, and is empty. You can combine multiple conditions with AND logic, requiring all conditions to be true before the column executes. This prevents unnecessary API calls and keeps you within Clay's rate limits and provider quotas.
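
As a mental model, a row-level filter with AND logic behaves like the sketch below. The field names and the score threshold are illustrative, not Clay's actual API; Clay configures this in the column settings rather than in code.

```python
def should_run_enrichment(row: dict) -> bool:
    """Mirror a Clay row filter: execute the enrichment column only
    when Employee Count is empty AND Lead Score exceeds a threshold."""
    employee_count_missing = not row.get("employee_count")
    lead_score_high = (row.get("lead_score") or 0) > 70
    # AND logic: every condition must hold before the column runs.
    return employee_count_missing and lead_score_high

should_run_enrichment({"employee_count": None, "lead_score": 85})  # True
should_run_enrichment({"employee_count": 500, "lead_score": 85})   # False: already known
```

A column gated this way never fires an API call for rows that fail either condition, which is what keeps the credit spend down.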

Filter Placement Matters

Apply filters to the enrichment columns themselves, not just downstream processing columns. A filtered enrichment column that never executes costs zero credits, while an unfiltered enrichment feeding a filtered AI column still burns the enrichment credit even if the output is discarded.

Formula-Based Conditions

For more sophisticated logic, Clay supports formula columns that evaluate conditions and return boolean values. These formulas can reference multiple columns, perform calculations, and implement nested IF statements that would be unwieldy in standard filters.

A typical formula might evaluate ICP fit based on multiple signals: IF(AND(employee_count > 50, employee_count < 1000, industry IN ("SaaS", "FinTech", "MarTech")), "high_fit", "standard"). This formula column then serves as the filter condition for downstream enrichment steps, routing leads through different paths based on calculated fit scores.
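
The nested IF above translates to roughly this Python, keeping the same illustrative thresholds and industry list:

```python
def classify_fit(employee_count: int, industry: str) -> str:
    """Python equivalent of:
    IF(AND(employee_count > 50, employee_count < 1000,
           industry IN ("SaaS", "FinTech", "MarTech")), "high_fit", "standard")"""
    target_industries = {"SaaS", "FinTech", "MarTech"}
    if 50 < employee_count < 1000 and industry in target_industries:
        return "high_fit"
    return "standard"

classify_fit(200, "SaaS")    # "high_fit"
classify_fit(20, "SaaS")     # "standard": too small
classify_fit(200, "Retail")  # "standard": industry miss
```

Downstream enrichment columns then filter on this column's value rather than re-evaluating the raw signals.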

Waterfall Logic

Clay's waterfall enrichment feature automatically tries multiple data providers in sequence until one returns a result. But you can make waterfalls smarter by wrapping them in conditional logic. Instead of running a full waterfall for every lead, configure different waterfall depths based on lead priority.

High-fit leads might run through Apollo, then Clearbit, then ZoomInfo, then manual research queues. Standard leads might stop after Apollo. Low-fit leads might skip person-level enrichment entirely, receiving only basic company verification. This tiered approach maximizes ROI on your Clay enrichment recipes.
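
A minimal sketch of tiered waterfall depth, with hypothetical provider names and a caller-supplied lookup function standing in for Clay's built-in integrations:

```python
# Provider sequences per tier; names are illustrative, not real endpoints.
WATERFALLS = {
    "high_fit": ["apollo", "clearbit", "zoominfo", "manual_research"],
    "standard": ["apollo"],
    "low_fit":  [],  # company verification only, no person-level lookups
}

def enrich(lead, lookup):
    """Try each provider in the lead's tier until one returns a result."""
    for provider in WATERFALLS.get(lead["tier"], []):
        result = lookup(provider, lead)
        if result:
            return result  # stop early: remaining providers cost nothing
    return None
```

Because a skipped provider consumes no credits, the tier assignment alone determines the maximum spend per lead.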

Practical Routing Patterns

Understanding primitives matters, but implementation patterns drive results. These four routing strategies cover most conditional enrichment scenarios GTM teams encounter.

Pattern 1: Source-Based Routing

Different lead sources warrant different enrichment depths. A hand-raised demo request signals higher intent than a content download, which signals higher intent than a purchased list contact. Your enrichment logic should reflect these differences.

1. Create a source classification column that normalizes your various lead sources into tiers (Tier 1: Demo requests, Free trials; Tier 2: Content downloads, Webinar registrations; Tier 3: Purchased lists, Scraped data).
2. Configure premium enrichment columns with filters that only execute for Tier 1 sources. These might include phone verification, org chart lookups, and technology stack identification.
3. Configure standard enrichment columns with filters for Tier 1 and Tier 2, providing email verification and basic firmographics.
4. Configure minimal enrichment columns that run for all tiers, handling only essential data validation like email deliverability checks.
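
The four steps above amount to a tier lookup plus a per-column filter, roughly like this (source keys and tier assignments are illustrative):

```python
# Step 1: normalize raw source values into tiers.
SOURCE_TIERS = {
    "demo_request": 1, "free_trial": 1,
    "content_download": 2, "webinar_registration": 2,
    "purchased_list": 3, "scraped": 3,
}

def column_runs(source: str, column_max_tier: int) -> bool:
    """Steps 2-4: a column filtered at column_max_tier executes only
    for sources at or above that tier (unknown sources default to Tier 3)."""
    return SOURCE_TIERS.get(source, 3) <= column_max_tier

column_runs("demo_request", 1)      # True: premium column fires
column_runs("content_download", 1)  # False: premium column skipped
column_runs("content_download", 2)  # True: standard column fires
column_runs("purchased_list", 3)    # True: minimal validation always runs
```
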

This pattern ensures your most engaged prospects receive the deepest context while protecting budget on less qualified leads. As you refine your understanding of which sources convert best, adjust tier assignments accordingly.

Pattern 2: Fit-Score Gating

Rather than routing by source alone, calculate a preliminary fit score before triggering enrichment. This approach works particularly well when combined with minimal-data qualification strategies that assess fit using only the data present at capture.

The key insight is that you often have enough information to calculate rough fit before any enrichment. Company domain alone can reveal industry, employee count ranges, and funding status through lightweight lookups. Job title strings contain seniority and function signals. Geographic data indicates timezone and regulatory environment.

Build a preliminary scoring formula that weights available signals: score = (title_match * 30) + (industry_match * 25) + (size_match * 25) + (geo_match * 20). Leads scoring above 70 proceed to full enrichment. Leads scoring between 40 and 70 receive basic enrichment. Leads below 40 receive only validation before being deprioritized or excluded.
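
The scoring formula and thresholds above can be sketched as follows, with each match signal treated as 0 or 1:

```python
def preliminary_fit_score(signals: dict) -> int:
    """Weighted sum from the formula above; each *_match is 0 or 1."""
    return (signals.get("title_match", 0) * 30
            + signals.get("industry_match", 0) * 25
            + signals.get("size_match", 0) * 25
            + signals.get("geo_match", 0) * 20)

def route_by_score(score: int) -> str:
    """Above 70: full enrichment; 40-70: basic; below 40: validate only."""
    if score > 70:
        return "full"
    if score >= 40:
        return "basic"
    return "validate_only"

route_by_score(preliminary_fit_score(
    {"title_match": 1, "industry_match": 1, "size_match": 1, "geo_match": 1}
))  # "full": perfect score of 100
```
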

Pattern 3: Data Completeness Checks

Not every lead needs every enrichment step. A lead arriving with verified email, phone, company size, and technology stack needs far less enrichment than a lead arriving with only name and company domain. Check what data exists before deciding what to fetch.

Create a data completeness score column that counts populated fields: completeness = COUNTIF(email, phone, title, company_size, industry, tech_stack). Then configure enrichment columns to run only when their target fields are empty. This prevents paying for data you already have while ensuring gaps get filled.
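
A rough Python equivalent of that check, assuming the six fields named above:

```python
FIELDS = ["email", "phone", "title", "company_size", "industry", "tech_stack"]

def completeness(lead: dict) -> int:
    """Count populated fields, mirroring the completeness formula above."""
    return sum(1 for field in FIELDS if lead.get(field))

def needs_enrichment(lead: dict, target_field: str) -> bool:
    """An enrichment column runs only when its target field is empty."""
    return not lead.get(target_field)

crm_lead = {"email": "a@b.com", "phone": "+15550100", "title": "VP Sales"}
completeness(crm_lead)                  # 3 of 6 fields populated
needs_enrichment(crm_lead, "phone")     # False: skip, data already exists
needs_enrichment(crm_lead, "industry")  # True: fill the gap
```
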

This pattern proves especially valuable when coordinating Clay with your CRM. Leads synced from Salesforce might arrive with substantial historical data. Leads from form submissions might arrive nearly empty. The same Clay table handles both scenarios efficiently by checking completeness before enriching.

Pattern 4: Multi-Branch Persona Routing

Different personas require different enrichment strategies. A VP of Engineering benefits from technology stack data and GitHub activity. A CFO benefits from funding data and financial signals. A VP of Sales benefits from sales tech stack and team size data. Rather than enriching everyone with everything, route personas through tailored paths.

First, classify personas using job title parsing or AI classification. Then create persona-specific enrichment branches that gather relevant context. The VP of Engineering path pulls BuiltWith data and developer community presence. The CFO path pulls Crunchbase funding and financial news. The VP of Sales path pulls sales tool indicators and SDR team signals.
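
One way to sketch that branch mapping, with naive substring matching standing in for Clay's AI classification and hypothetical data-source names:

```python
# Persona-specific enrichment branches; source names are illustrative.
PERSONA_BRANCHES = {
    "vp_engineering": ["builtwith", "developer_community_presence"],
    "cfo":            ["crunchbase_funding", "financial_news"],
    "vp_sales":       ["sales_tool_indicators", "sdr_team_signals"],
}

def classify_persona(title: str):
    """Naive title parsing; production workflows often use an AI column."""
    t = title.lower()
    if "engineer" in t or "cto" in t:
        return "vp_engineering"
    if "cfo" in t or "financ" in t:
        return "cfo"
    if "sales" in t or "revenue" in t:
        return "vp_sales"
    return None  # unmatched titles fall through to a default branch

PERSONA_BRANCHES.get(classify_persona("VP of Engineering"), [])
# -> ["builtwith", "developer_community_presence"]
```
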

This approach requires more table complexity but delivers significantly better context for personalization. Your AI-powered personalization performs better when fed persona-relevant context rather than generic firmographic data.

Implementation Guide

With patterns understood, let's walk through implementing conditional enrichment in a real Clay workflow. We will build a table that routes leads through different enrichment depths based on source, fit, and data completeness.

Step 1: Define Your Routing Logic

Before touching Clay, document your routing rules on paper. Which signals matter most? What constitutes high fit for your business? How many enrichment tiers do you need? Most teams benefit from three tiers: premium (full enrichment), standard (core enrichment), and minimal (validation only).

Create a decision tree that maps input signals to enrichment paths. For example:

  • Demo request + Enterprise company = Premium path
  • Demo request + SMB company = Standard path
  • Content download + Enterprise company = Standard path
  • Content download + SMB company = Minimal path
  • Any source + Disqualified industry = Excluded

Step 2: Build Classification Columns

In your Clay table, create columns that classify leads according to your routing logic. Start with a source tier column that maps lead source values to your defined tiers. Add a preliminary fit column that evaluates available signals. Add a completeness column that counts populated fields.

Finally, create a routing column that combines these signals into a final path assignment: IF(source_tier = "Tier1" AND fit > 70, "premium", IF(source_tier IN ("Tier1", "Tier2") AND fit > 40, "standard", "minimal")). This column becomes the master switch for all downstream enrichment.
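
The routing formula above, expressed in Python for clarity:

```python
def assign_route(source_tier: str, fit: int) -> str:
    """Python equivalent of:
    IF(source_tier = "Tier1" AND fit > 70, "premium",
       IF(source_tier IN ("Tier1", "Tier2") AND fit > 40, "standard",
          "minimal"))"""
    if source_tier == "Tier1" and fit > 70:
        return "premium"
    if source_tier in ("Tier1", "Tier2") and fit > 40:
        return "standard"
    return "minimal"

assign_route("Tier1", 80)  # "premium"
assign_route("Tier2", 60)  # "standard"
assign_route("Tier3", 90)  # "minimal": source tier caps the path
```
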

Step 3: Configure Filtered Enrichment

For each enrichment column, add a filter condition based on the routing column. Premium enrichment steps filter on routing = "premium". Standard enrichment steps filter on routing IN ("premium", "standard"). Minimal enrichment steps run for all leads.

Within each tier, implement data quality checks to validate returned data. An enrichment that returns obviously bad data (fake emails, impossible phone numbers, mismatched names) should trigger fallback logic rather than polluting your dataset.

Step 4: Handle Edge Cases

Real-world data is messy. Build logic to handle missing classification signals gracefully. If source is empty, default to minimal path rather than premium. If job title parsing fails, flag for manual review rather than guessing persona. If company domain is invalid, skip enrichment entirely rather than wasting credits on guaranteed failures.
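
Those fail-safe defaults can be sketched as a guard function that runs before the main routing column (field and path names are illustrative):

```python
def safe_route(lead: dict) -> str:
    """Fail toward the cheap path when classification signals are missing."""
    if not lead.get("company_domain"):
        return "skip"          # guaranteed enrichment failure: save the credits
    if not lead.get("source"):
        return "minimal"       # unknown source defaults down, never up to premium
    if not lead.get("title"):
        return "needs_review"  # persona unknown: human queue, don't guess
    return "route_normally"    # all signals present: apply standard routing

safe_route({"company_domain": "acme.com"})  # "minimal": source missing
```

The key property is the direction of failure: every missing signal degrades toward cheaper handling, never toward premium spend.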

Consider creating a "needs review" path for leads that defy easy classification. These might be redirected to a human queue or held for manual enrichment rather than processed automatically. Some leads simply require judgment that automation cannot provide.

Advanced Techniques

Once basic routing works reliably, these advanced techniques can further optimize your enrichment efficiency and data quality.

Dynamic Waterfall Depth

Rather than fixed waterfall configurations, adjust depth based on how critical complete data is for each lead. Enterprise leads in active sales cycles might justify running through all 5 providers in your waterfall. Nurture leads might stop after the first successful hit. Implement this by creating multiple waterfall columns with different provider sequences, then filtering each based on lead priority.

Feedback Loop Integration

Connect your routing logic to downstream outcomes. When a lead converts, track which enrichment path they took. Over time, analyze whether premium enrichment actually improves conversion rates enough to justify the cost. If standard enrichment produces similar conversion rates at lower cost, adjust your routing thresholds accordingly.

Tools like Octave can help close this feedback loop by connecting enrichment decisions to pipeline outcomes. Rather than guessing which leads deserve premium enrichment, let actual conversion data inform your routing rules. This transforms enrichment from a cost center into an optimized investment.

AI-Assisted Routing

For complex routing decisions that resist formula-based logic, consider using Clay's AI column to evaluate fit. An AI prompt might assess whether a lead's situation warrants deep research based on nuanced signals that formulas miss. "Based on this company description, job title, and recent news, is this lead likely to benefit from our enterprise solution?"

AI routing works best as a tie-breaker for edge cases rather than the primary routing mechanism. AI columns consume credits and add latency, so reserve them for leads that fall between your formula-defined boundaries.

Measuring Success

How do you know if your conditional enrichment logic is working? Track these metrics to evaluate performance and identify optimization opportunities.

Credit Efficiency

Calculate enrichment cost per qualified lead rather than per lead overall. If you spend $5,000 on enrichment and produce 500 qualified leads, your cost is $10 per qualified lead. Conditional routing should improve this metric by reducing spend on leads that never qualified.

Compare credit usage by routing path. If premium enrichment costs $2 per lead and produces 40% qualification rates, while standard enrichment costs $0.50 per lead and produces 35% qualification rates, the marginal value of premium enrichment is clear. Adjust routing thresholds to capture that marginal value where it exists.
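
The arithmetic behind both comparisons is a single division; using the figures from this section:

```python
def cost_per_qualified_lead(total_spend: float, qualified_leads: int) -> float:
    """Overall efficiency: total enrichment spend per qualified lead."""
    return total_spend / qualified_leads

def path_cost_per_qualified(cost_per_lead: float, qualification_rate: float) -> float:
    """Per-path efficiency: cost per lead divided by qualification rate."""
    return cost_per_lead / qualification_rate

cost_per_qualified_lead(5000, 500)   # 10.0: $10 per qualified lead overall
path_cost_per_qualified(2.00, 0.40)  # 5.0: premium path
path_cost_per_qualified(0.50, 0.35)  # ~1.43: standard path
```

Comparing the per-path figures this way shows whether premium's higher qualification rate actually justifies its higher cost per lead, which is the comparison to run before adjusting routing thresholds.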

Data Quality by Path

Measure data completeness for leads in each routing path. Premium paths should produce more complete records. If completeness is similar across paths, your premium enrichment may not be worth the additional cost. Alternatively, if standard paths produce incomplete data that causes downstream problems, you may need to tighten routing criteria.

Sales Feedback

The ultimate test of enrichment quality is whether sales teams find it useful. Survey reps about data quality for leads from different paths. Are premium-path leads noticeably better researched? Do standard-path leads have the context needed for initial outreach? This qualitative feedback guides routing adjustments that metrics alone might miss.

When working with research-to-qualification workflows, ensure the context reaching your sequencer actually improves reply rates. The best enrichment in the world adds no value if it never reaches the rep or never influences the conversation.

Common Pitfalls to Avoid

Conditional enrichment logic can fail in subtle ways. Watch for these common mistakes as you build and refine your routing.

Over-Engineering Initial Builds

Start simple. Two paths (enriched vs. minimal) often work better than five paths with complex branching. Add complexity only when simple routing proves insufficient. Every additional branch increases maintenance burden and debugging difficulty.

Ignoring Edge Cases

What happens when a lead has no company domain? When job title is empty? When source is null? Unconsidered edge cases cause silent failures where leads receive no enrichment at all. Test your routing with incomplete data to ensure graceful handling.

Static Thresholds

Routing thresholds that made sense six months ago may not make sense today. As your product evolves, ICP shifts, and market conditions change, revisit routing logic quarterly. What constituted a high-fit lead last year might be standard fit today.

Missing Monitoring

Without monitoring, routing failures go undetected. Build alerts for when premium path volume drops unexpectedly (possible classification bug) or when minimal path volume spikes (possible source quality issue). Catch problems before they impact pipeline quality.

Scaling Your Conditional Logic

As volume grows, conditional enrichment logic must scale without becoming unmanageable. These practices keep complexity under control.

Document your routing rules outside of Clay. When logic lives only in formula columns and filter conditions, new team members struggle to understand intent. Maintain a routing specification document that explains why each path exists and what signals trigger each branch.

Use consistent naming conventions. If routing_tier indicates enrichment depth in one table, use the same column name in every table. Standardization makes copying logic between tables straightforward and reduces errors.

Consider centralizing routing logic in a shared Clay template or using Octave's context engine to standardize how leads are classified across your GTM stack. When every Clay table implements its own routing logic, inconsistencies creep in. Centralized classification ensures a "premium" lead in your outbound table receives the same treatment as a "premium" lead in your inbound table.

Finally, plan for the eventual complexity ceiling. Clay tables with 50+ columns become unwieldy. If your conditional logic requires dozens of routing branches with unique enrichment steps, consider splitting into multiple specialized tables that handle different lead types. A table optimized for enterprise leads can implement different logic than a table optimized for SMB leads, without either becoming unmanageable.

Moving Forward

Conditional enrichment transforms Clay from a blunt instrument into a precision tool. By routing leads through appropriate enrichment paths based on fit, source, and data completeness, you protect budget, improve data quality, and ensure your sales team receives the context they need.

Start with the simplest routing that addresses your biggest inefficiency. For most teams, that means gating expensive enrichment steps behind basic fit criteria. As you see results, add sophistication gradually, always measuring whether additional complexity improves outcomes.

The patterns in this guide provide a foundation, but optimal routing depends on your specific ICP, data sources, and sales motion. Test aggressively, measure honestly, and refine continuously. Your enrichment logic should evolve as fast as your go-to-market strategy.

For teams looking to take conditional logic further, Octave provides the infrastructure to connect enrichment decisions with downstream outcomes, enabling data-driven routing that improves automatically as you learn what works for your pipeline.
