TL;DR — Intent-based prospecting relies on collecting and processing business events that often touch personal data. GDPR applies the moment you handle information linked to an identifiable person, even in a B2B context. This guide walks you through the technical measures you need: data minimization at the API level, lawful basis selection, retention and purge policies, handling data subject access requests (DSARs), erasure workflows, and what to look for in a Data Processing Agreement (DPA). Follow these steps and you will build a prospecting pipeline that is both effective and defensible under GDPR.
What Is GDPR in the Context of Intent-Based Prospecting?
The General Data Protection Regulation (GDPR) is the European Union’s framework for protecting personal data. It applies to any organization that processes the personal data of individuals in the EU, regardless of where the organization is based. In the context of B2B prospecting, “personal data” does not only mean consumer information. A professional email address, a LinkedIn profile URL, a job title attached to a named individual: all of these fall under GDPR’s scope.
Intent-based prospecting amplifies this concern. When you monitor intent signals (funding rounds, job postings, leadership changes, recruitment campaigns), you are collecting data about companies. But the moment those signals reference a specific person, say a newly appointed CTO or a hiring manager behind a recruitment drive, the data becomes personal. That means GDPR obligations kick in.
Many sales teams assume that B2B data lives in a gray area. It does not. Recital 14 of the GDPR makes clear that the regulation covers the processing of personal data of natural persons, regardless of context. The name and professional email of a decision-maker at a SaaS company is personal data, full stop.
This guide focuses on the technical implementation of GDPR compliance when using APIs like the Rodz API to collect, enrich, and act on intent signals. If you are looking for a general overview of the Rodz API, see the complete API reference.
Prerequisites
Before implementing the compliance measures described in this guide, make sure you have the following in place:
- A data processing inventory. You need a clear picture of what personal data you collect through intent-based prospecting, where it is stored, and how long you keep it. If you do not have this inventory yet, build it first.
- A designated data protection point of contact. For companies required to appoint a Data Protection Officer (DPO), this is mandatory. For smaller teams, designate someone responsible for GDPR decisions.
- Access to your API configurations. You should be able to modify your signal configurations, webhook endpoints, and data storage logic. Familiarity with the Rodz API endpoints is helpful. Refer to the API reference if needed.
- A working webhook pipeline with proper security in place. If you have not yet set up webhook signature verification, follow our HMAC-SHA256 webhook verification guide before continuing.
- Legal review. The technical measures in this guide support compliance, but they do not replace legal advice. Have your legal team or external counsel validate your approach before going live.
Technical Compliance Measures
The sections below cover the six pillars of GDPR compliance as they apply to intent-based prospecting. Each section includes concrete implementation steps.
1. Data Minimization
Article 5(1)(c) of the GDPR states that personal data must be “adequate, relevant and limited to what is necessary.” In practice, this means you should never collect more data than you actually need.
At the signal configuration level, use filters aggressively. The Rodz API lets you define which signal types, geographies, company sizes, and industries to monitor. The tighter your filters, the less irrelevant personal data flows into your systems.
For example, if your ICP targets Series A startups in France with 20-100 employees, configure your signals to match exactly that. Do not monitor “all funding rounds globally” and filter later. Filtering at the source is both a performance optimization and a compliance measure.
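As a sketch, a tightly scoped configuration might look like the following. The field names here are illustrative assumptions, not the Rodz API's actual schema; check the API reference for the real parameters.

```javascript
// Hypothetical signal configuration: filter at the source, not after ingestion.
// Field names are illustrative; consult the Rodz API reference for the real schema.
function buildSignalConfig() {
  return {
    signalTypes: ['funding-round'],        // only the events your ICP cares about
    geographies: ['FR'],                   // France only
    fundingStages: ['series-a'],
    employeeRange: { min: 20, max: 100 },  // matches the 20-100 employee ICP
  };
}

// A guard that rejects overly broad configs before they go live
function isMinimized(config) {
  return config.signalTypes.length > 0 &&
         config.geographies.length > 0 &&
         config.employeeRange.min >= 1;
}
```

Running a guard like `isMinimized` in CI or at deploy time turns data minimization from a policy statement into an enforced property of your pipeline.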
At the enrichment level, request only the fields you need. If your sales workflow requires a prospect’s name, company, and professional email, do not also pull their phone number, social media profiles, and personal address “just in case.” Every additional field increases your compliance surface area.
At the storage level, strip or pseudonymize fields that are not essential for your use case before writing to your database. If you only need to know that a company hired a new VP of Sales, you may not need to store the person’s full name in your signal log. Consider storing a hashed identifier instead and resolving it only when a sales rep takes action.
```javascript
// Example: selective field extraction from a webhook payload
const crypto = require('crypto');

// One-way hash so the contact can be re-identified only when a rep takes action
const hash = (value) =>
  crypto.createHash('sha256').update(value.toLowerCase()).digest('hex');

const relevantData = {
  companyName: signal.company.name,
  companySiren: signal.company.siren,
  signalType: signal.type,
  signalDate: signal.detected_at,
  // Only store the person's role, not their full identity
  contactRole: signal.contact?.job_title || null,
  contactIdHash: signal.contact ? hash(signal.contact.email) : null
};
```
2. Lawful Basis for Processing
GDPR requires a lawful basis for every processing activity. For B2B intent-based prospecting, two bases are commonly used:
Legitimate interest (Article 6(1)(f)) is the most frequently cited basis for B2B prospecting. The reasoning: your business has a legitimate interest in identifying sales opportunities, and the data subjects (business professionals) can reasonably expect their professional activities to be visible to potential business partners. However, legitimate interest is not a free pass. You must conduct and document a Legitimate Interest Assessment (LIA) that weighs your interest against the data subject’s rights.
A proper LIA for intent-based prospecting should cover:
- Purpose. What specific business outcome does each signal type serve?
- Necessity. Could you achieve the same outcome with less personal data?
- Balancing. Does the processing cause any detriment to the data subject? Would they reasonably expect it?
- Safeguards. What measures do you have in place to protect the data?
Document this assessment for each signal type you activate. A signal monitoring public funding announcements has a different risk profile than a signal tracking individual job changes.
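One lightweight way to keep these assessments auditable is to store them as structured records alongside your signal configuration, and to refuse to activate a signal type without a complete record. The field names below are our own convention, not a regulatory format.

```javascript
// A minimal Legitimate Interest Assessment (LIA) record per signal type.
// Throws if any of the four LIA pillars is missing, so incomplete
// assessments cannot silently reach production.
function createLiaRecord({ signalType, purpose, necessity, balancing, safeguards }) {
  const fields = { signalType, purpose, necessity, balancing, safeguards };
  for (const [key, value] of Object.entries(fields)) {
    if (!value) throw new Error(`LIA record is missing "${key}"`);
  }
  return {
    ...fields,
    lawfulBasis: 'legitimate_interest', // Article 6(1)(f)
    assessedAt: new Date().toISOString(),
  };
}
```

Treating the LIA as a gate rather than a document means a new signal type cannot go live until someone has answered the purpose, necessity, balancing, and safeguards questions.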
Consent (Article 6(1)(a)) is the alternative basis, but it is rarely practical for prospecting at scale. Consent must be freely given, specific, informed, and unambiguous. Collecting consent before you even know who the prospect is contradicts the nature of intent-based discovery. Reserve consent for scenarios where you have an existing relationship and want to expand the scope of data processing.
Whichever basis you choose, record it in your processing inventory alongside the specific processing activity, the data categories involved, and the date of the assessment.
3. Retention Policies
Article 5(1)(e) requires that personal data be kept “for no longer than is necessary.” Signal data is inherently time-sensitive, which actually works in your favor from a compliance perspective. A funding round detected six months ago is stale intelligence anyway.
Define retention periods per data category. Here is a starting framework:
| Data Category | Suggested Retention | Rationale |
|---|---|---|
| Raw signal payloads | 30 days | Enough time to process and act on the signal |
| Enriched contact data | 90 days | Covers a typical B2B sales cycle |
| Aggregated/anonymous signal stats | Unlimited | No personal data, purely statistical |
| Webhook delivery logs | 14 days | Debugging window for delivery issues |
| DSAR response records | 3 years | Regulatory evidence of compliance |
Implement automated purge jobs. Do not rely on manual cleanup. Set up scheduled tasks that delete or anonymize data once the retention period expires.
```bash
# Example: cron job to purge signal data older than 30 days
0 2 * * * /usr/bin/psql -d prospecting \
  -c "DELETE FROM raw_signals WHERE detected_at < NOW() - INTERVAL '30 days';"
```
Log every deletion. Maintain a purge log that records what was deleted, when, and under which retention policy. This log itself should not contain personal data. Use record counts and data categories rather than listing individual records.
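A purge-log entry along these lines (our own shape, not a prescribed format) records the event with counts and categories only, so the log itself never re-creates personal data:

```javascript
// Record what a purge job deleted: counts and categories only, never the rows.
function logPurge({ dataCategory, retentionPolicy, deletedCount }) {
  const entry = {
    dataCategory,     // e.g. 'raw_signals'
    retentionPolicy,  // e.g. '30_days'
    deletedCount,     // number of rows removed, not their contents
    purgedAt: new Date().toISOString(),
  };
  // In production this would append to a durable, append-only audit store
  return entry;
}
```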
4. Data Subject Access Rights (DSARs)
Under Articles 15 to 20, individuals have the right to access, rectify, and port their personal data. If a prospect contacts you and asks “What data do you hold about me?”, you must respond within one month (Article 12(3)), commonly operationalized as a 30-day deadline.
Build a DSAR lookup capability. Your system should be able to search across all data stores (signal database, CRM, enrichment cache, email tools) using an email address or name as the search key. This is harder than it sounds in a distributed pipeline.
Practical steps:
- Create a central data map that documents every system where personal data from signals might land: your database, your CRM, your email sequencing tool, any analytics platforms.
- Build or configure a search endpoint that queries all these systems. Some CRMs (HubSpot, Pipedrive) offer search APIs that can help.
- Prepare a response template that presents the data in a structured, machine-readable format (JSON or CSV). GDPR does not prescribe a format, but clarity helps.
- Track DSAR requests in a dedicated log with timestamps, actions taken, and response dates to demonstrate compliance.
Example: a structured DSAR response in JSON:

```json
{
  "data_subject": "jane.doe@example.com",
  "request_date": "2026-03-10",
  "response_date": "2026-03-15",
  "data_held": {
    "signals": [
      {
        "type": "leadership-change",
        "company": "Acme Corp",
        "detected_at": "2026-02-20",
        "source": "rodz-api"
      }
    ],
    "enrichment": {
      "name": "Jane Doe",
      "job_title": "VP of Sales",
      "company": "Acme Corp",
      "email": "jane.doe@example.com"
    }
  },
  "processing_basis": "legitimate_interest",
  "retention_policy": "90_days_from_collection"
}
```
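The lookup behind such a response can be sketched as a fan-out over every store in your data map. The store interface below is an assumption for illustration; real connectors (CRM search APIs, database queries) would replace the stubs and typically be asynchronous.

```javascript
// Fan out a DSAR lookup across every system in the data map.
// Each "store" is assumed to expose findByEmail(email) -> array of records.
// Real connectors would be async; this sketch keeps them synchronous.
function lookupDataSubject(email, stores) {
  const dataHeld = {};
  for (const [name, store] of Object.entries(stores)) {
    dataHeld[name] = store.findByEmail(email);
  }
  return { data_subject: email, data_held: dataHeld };
}
```

Keeping the store registry in one place means adding a new tool to your stack forces the question: how will this system answer a DSAR?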
5. Right to Erasure
Article 17 gives individuals the right to request deletion of their personal data. In an intent-based pipeline, this creates a specific challenge: you need to erase data across multiple systems and ensure the person is not re-collected by future signals.
Implement a suppression list. When someone exercises their right to erasure, delete their data from all systems and add a hashed version of their email to a suppression list. Before writing any new enriched contact data, check this list. If there is a match, discard the record.
```javascript
const crypto = require('crypto');
const sha256 = (value) => crypto.createHash('sha256').update(value).digest('hex');

// Suppression check before storing enriched data
const emailHash = sha256(contact.email.toLowerCase());
const isSuppressed = await suppressionList.has(emailHash);
if (isSuppressed) {
  logger.info('Contact suppressed, skipping storage', { hash: emailHash });
  return;
}
```
Cascade deletions across your pipeline. If your signal data flows from the Rodz API to a webhook handler, then to a database, then to a CRM, then to an email tool, the erasure request must propagate through every stage. Map out the data flow and build deletion scripts for each node.
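One way to make the cascade explicit is to register a deletion function per node and run them in pipeline order, recording per-node outcomes so a failed node (say, a tool with no deletion API) surfaces immediately. The node names below are examples taken from the flow above.

```javascript
// Run erasure across every node in the pipeline, in order,
// and report the outcome per node. Real deleters would call
// vendor APIs and likely be async; these are synchronous stubs.
function cascadeErasure(email, deleters) {
  const report = [];
  for (const [node, deleteFn] of Object.entries(deleters)) {
    try {
      const deletedCount = deleteFn(email);
      report.push({ node, status: 'deleted', count: deletedCount });
    } catch (err) {
      // A failure here means a manual step from your erasure runbook is needed
      report.push({ node, status: 'failed', error: err.message });
    }
  }
  return report;
}
```

The report doubles as evidence for your DSAR log: it shows which systems were purged automatically and which required manual follow-up.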
Respond within 30 days. The clock starts when you receive the request. Automate what you can. If a particular system requires manual deletion (some SaaS tools do not have a deletion API), document the manual steps and include them in your erasure runbook.
Preserve the suppression record. This may seem contradictory, but storing a hashed email on a suppression list is lawful because it serves the purpose of complying with the erasure request itself. The CNIL (France’s data protection authority) has confirmed this approach. Just make sure the suppression list contains only the hash, not the original email or any other personal data.
6. Data Processing Agreements (DPAs)
When you use the Rodz API (or any third-party data provider), you are engaging a data processor. Article 28 requires a Data Processing Agreement between you (the controller) and the processor.
A compliant DPA should cover:
- Subject matter and duration of the processing.
- Nature and purpose of processing (signal detection, contact enrichment).
- Types of personal data processed (names, email addresses, job titles, company affiliations).
- Categories of data subjects (business professionals in your target market).
- Obligations of the processor, including security measures, sub-processor management, breach notification timelines, and assistance with DSARs.
- Audit rights allowing you to verify compliance.
- Data return and deletion procedures upon termination of the agreement.
Review DPAs with every vendor in your prospecting stack: your signal provider, your enrichment tools, your CRM, your email sequencing platform. Each one processes personal data on your behalf, and each one needs a DPA in place.
For webhook-based integrations, verify that data in transit is protected. TLS encryption is the minimum. Payload-level verification through HMAC signatures adds a second layer. See our webhook security guide for the implementation details.
Data Storage Recommendations
Where and how you store signal-derived personal data matters as much as what you collect. Here are practical recommendations:
Encrypt at rest and in transit. Use AES-256 encryption for databases containing personal data. Ensure all API calls and webhook deliveries use TLS 1.2 or later. If you are self-hosting, configure your database server to reject unencrypted connections.
Separate personal and non-personal data. Store signal metadata (type, date, company identifier) in one table and personal contact data in another. This separation makes it easier to implement retention policies (you can purge contact data while keeping anonymized signal statistics) and to respond to erasure requests without losing valuable aggregate intelligence.
Use role-based access controls. Not every team member needs access to raw personal data. Sales reps may only need the enriched profile when they are ready to reach out. Engineers debugging the pipeline may only need anonymized samples. Configure your database and application permissions accordingly.
Choose EU-based hosting when possible. If your prospects are in the EU, storing their data in EU data centers simplifies compliance. It eliminates the need for additional transfer mechanisms like Standard Contractual Clauses (SCCs). Major cloud providers (AWS, GCP, Azure) all offer EU regions.
Log access to personal data. Every time a user or system reads personal data from your database, log the access event. This audit trail is invaluable during compliance audits and when responding to DSARs. Include the user or service identity, the timestamp, and the type of data accessed.
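A sketch of this pattern (the log shape is our own convention): wrap every function that reads personal data so each call emits who accessed what, and when, before returning the data.

```javascript
// Wrap a data-access function so every call appends an audit event.
// In production the audit log would be a durable, append-only store.
function withAccessLog(auditLog, actor, dataCategory, readFn) {
  return (...args) => {
    auditLog.push({
      actor,        // user or service identity
      dataCategory, // e.g. 'enriched_contact'
      accessedAt: new Date().toISOString(),
    });
    return readFn(...args);
  };
}
```

Because the wrapper sits in front of the read path, there is no way to fetch personal data without leaving an audit trail, which is exactly the property a compliance auditor will ask you to demonstrate.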
Frequently Asked Questions
Does GDPR apply to B2B prospecting at all?
Yes. GDPR protects the personal data of natural persons, not companies. The moment your prospecting data includes a person’s name, email, phone number, or any other identifier, GDPR applies. The B2B context does not create an exemption. What it does influence is the lawful basis analysis: legitimate interest is generally easier to justify in a B2B setting because business professionals can reasonably expect their professional data to be used for commercial purposes.
What qualifies as “personal data” in an intent signal?
Any information that relates to an identified or identifiable person. In intent-based prospecting, common examples include: the name of a person mentioned in a leadership change signal, a professional email address returned by an enrichment endpoint, a LinkedIn profile URL, a job title when combined with a company name (which often makes the person identifiable even without their name). Company-level data like revenue, employee count, or industry classification is not personal data on its own.
Can I use legitimate interest as my lawful basis?
In most B2B prospecting scenarios, yes. Legitimate interest under Article 6(1)(f) is the standard basis for outbound B2B sales activity. However, you must conduct a Legitimate Interest Assessment (LIA) for each processing activity and document it. The LIA must show that your interest is genuine, the processing is necessary to achieve it, and it does not override the data subject’s fundamental rights. Keep these assessments updated and reviewable.
How long can I keep signal data that contains personal information?
There is no fixed number in the regulation. GDPR says “no longer than is necessary.” In practice, signal data loses relevance quickly. A 30-day retention window for raw signals and 90 days for enriched contact data is a reasonable starting point. The key is to define your retention periods, document the rationale, and enforce them through automated purge jobs. If an auditor asks why you keep enriched data for 90 days, you should be able to explain that it aligns with your average sales cycle length.
What happens if a prospect asks me to delete their data?
You must comply within 30 days. Delete the person’s data from every system in your pipeline: your signal database, your CRM, your enrichment cache, your email tools. Then add a hashed version of their email to a suppression list so they are not re-collected by future signals. Confirm the deletion in writing to the requester. If full deletion is technically impossible in a specific system within the deadline, inform the data subject of the delay and complete it as soon as feasible.
Do I need a DPA with my signal provider?
Yes. If you use a third-party API to collect or enrich data that includes personal information, that provider is a data processor under GDPR. Article 28 requires a written agreement (the DPA) between you and the processor. This applies to your signal API provider, your enrichment tools, your CRM vendor, and any other SaaS platform that handles personal data on your behalf. Request a DPA from each vendor and review it for the elements listed in Section 6 above.
How do I handle cross-border data transfers?
If personal data leaves the European Economic Area (EEA), you need a valid transfer mechanism. The most common options are Standard Contractual Clauses (SCCs), an adequacy decision (for countries the EU recognizes as having equivalent protection), or Binding Corporate Rules (for intra-group transfers). In practice, the simplest approach is to store EU prospect data in EU-based data centers. If your tooling requires transfer to the US or elsewhere, ensure the vendor has signed SCCs and check whether supplementary measures are needed following the Schrems II ruling.
Where can I find the full Rodz API documentation?
The complete Rodz API documentation is available at api.rodz.io/docs. It covers authentication, all signal and enrichment endpoints, webhook configuration, rate limits, and error handling. For a structured walkthrough, start with the API reference guide on this blog.