Automated decision-making is a routine feature of credit scoring, insurance pricing, recruitment screening, and public benefit assessments. When those decisions carry legal or significant financial consequences for individuals, organisations are not free to rely on algorithms without accountability.
Under the EU General Data Protection Regulation (“GDPR”), individuals have the right to know how automated decisions about them were made, and organisations have a corresponding legal obligation to explain them. Brazil’s Lei Geral de Proteção de Dados (“LGPD”) does not replicate article 22 of the GDPR in identical terms, but it imposes broadly comparable practical obligations through its transparency and data subject rights framework. Organisations operating across both jurisdictions must understand the similarities and the gaps.
Quick Read
Prohibition with exceptions: Automated decisions with legal or significant effects are, in principle, prohibited under the GDPR unless supported by law, contract necessity, or explicit consent. The LGPD takes a parallel approach through its transparency and rights framework.
Two explanation levels: A general explanation, provided proactively to all individuals; and a specific (personal) explanation, provided when an individual makes an access request.
Technical accuracy is not enough: Providing a mathematical formula does not satisfy the obligation. The individual must be able to understand the outcome and, if necessary, challenge it.
Opaque models create a compliance problem: If an algorithm cannot be adequately explained, the organisation cannot lawfully use it for automated decisions with significant impact under the GDPR. Under the LGPD, the same issue creates a significant compliance risk: if the controller cannot provide clear and adequate information about the criteria and procedures used, it may be unable to satisfy the transparency, accountability and automated decision-review requirements.
Explainability must be built in: Organisations that design automated decision-making processes without considering explanation will find compliance significantly harder to achieve under both regimes.
1. When Does the Right to an Explanation Apply?
(a) The rule under GDPR
The GDPR prohibits fully automated decision-making where the decision produces legal effects for the individual or similarly significant effects, for example, a decision affecting access to credit, insurance, employment, or public services. Organisations can depart from this prohibition in three cases:
• a law specifically authorises the automated decision and provides appropriate safeguards;
• the decision is necessary for the performance of a contract with the individual; or
• the individual has given explicit consent.
In each case, the organisation must take protective measures, including providing an explanation of the decision.
(b) The LGPD framework
Brazil’s LGPD does not contain a provision equivalent to article 22 GDPR that expressly prohibits automated decision-making. However, the law creates parallel obligations through several interconnected provisions:
• Article 20 gives data subjects the right to request a review of any decision made solely on the basis of automated processing, including decisions that affect their interests, such as professional, consumer, credit, or personality profiles.
• Article 18 establishes data subject rights to access, correction, and portability (with article 19 governing how and when access must be provided), which apply directly to algorithmic outputs such as scores and categorisations.
• Article 6 requires that all personal data processing comply with the principles of purpose, adequacy, necessity, free access, data quality, transparency, security, prevention, non-discrimination, and accountability.
• Brazil’s National Data Protection Authority (Autoridade Nacional de Proteção de Dados, “ANPD”) is explicitly empowered under article 20, paragraph 2 to issue regulations on automated decision-making, signalling that more specific rules are expected.
In practice, the LGPD’s transparency principle and the right to review of automated decisions create an obligation functionally similar to the GDPR’s explanation requirement. Organisations subject to both regimes will find that a GDPR-compliant explanation approach substantially addresses LGPD obligations, subject to the differences noted below.
2. What counts as a significant automated decision?
Under the GDPR, significant decisions are not limited to formal legal determinations. They include decisions that materially affect an individual’s financial situation, access to services, freedom of movement, or similar interests. A credit score that determines whether a loan is approved, an algorithm that prices an insurance premium, or an automated recruitment shortlisting tool all fall within scope.
Under the LGPD, the threshold for the right to review is framed broadly: any decision made solely on the basis of automated processing that affects the data subject’s interests qualifies. This formulation is at least as broad as the GDPR’s, and arguably broader, since it is not restricted to decisions with legal or similarly significant effects.
3. What Must the Explanation Cover?
(a) Two distinct obligations under the GDPR
Organisations subject to article 22 GDPR must provide explanation at two levels: a general explanation, provided proactively and available before any specific decision is made; and a specific (personal) explanation, triggered when an individual makes an access request or invokes their rights.
(b) General explanation: what it must contain
The general explanation must describe:
• the fact that automated decision-making is used, and whether profiling is involved;
• the categories of personal data taken into account (for example, age, income, driving history);
• how the different factors are weighted in general terms;
• the legal basis on which the automated decision is made; and
• the significance and expected consequences of the decision for individuals.
(c) Personal explanation: what it must contain
The personal explanation must go further and cover:
• the specific personal data used as inputs in the individual’s decision;
• the essential elements of the algorithm applied to that decision, including the weighting of factors, the steps followed, and any intermediate outputs such as scores or category assignments; and
• the relationship between the data and the outcome, in other words, why the data produced the result it did.
It is good practice to provide both levels together in response to an access request: a personal explanation of the specific outcome, with a reference to the general explanation for context on how the system works overall.
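By way of illustration, the sketch below shows one way to structure an access-request response so that the personal explanation is paired with a pointer to the general explanation. Every field name and value is hypothetical; neither regime prescribes a particular format.

```python
# Hypothetical structure for an access-request response pairing the
# personal explanation with a reference to the general explanation.
# All field names and values are illustrative only; no format is prescribed.
access_request_response = {
    "general_explanation": "https://example.com/how-our-credit-scoring-works",
    "personal_explanation": {
        "decision": "loan application rejected",
        "data_used": {"annual_income": "R$80,000", "loan_amount_requested": "R$50,000"},
        "main_reason": "loan amount requested is high relative to annual income",
        "how_to_challenge": "reply to this notice to request a review of the decision",
        "rights": ["human intervention", "express your point of view", "contest the decision"],
    },
}
```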
(d) The LGPD parallel
The LGPD’s framework does not use the distinction between general and personal explanations in the same explicit terms, but the same structure emerges from its rights framework. Under article 9, controllers must inform data subjects of the purpose and form of processing at the time of data collection. Under articles 18 and 20, data subjects can request access to their data and a review of automated decisions. Together, these provisions require:
• proactive disclosure of the use of automated processing and its purposes (equivalent to the general explanation); and
• on request, disclosure of the specific data used, the logic applied, and how those factors produced the outcome (equivalent to the personal explanation).
4. What Does the Explanation Have to Look Like?
(a) Useful, not just technical
The GDPR requires information to be provided in a concise, transparent, intelligible and easily accessible form, in clear and plain language. This standard applies to explanations of automated decisions.
A technically accurate description of an algorithm is not sufficient. The Court of Justice of the EU confirmed in Dun & Bradstreet (C-203/22, February 2025) that a complex algorithm, or a step-by-step description of all its operations, is neither concise nor comprehensible, and does not meet the GDPR’s requirements. The test is whether the individual can actually understand why the decision was made and, if they disagree, how to challenge it.
A useful rule of thumb: the explanation should be comparable to what a person who made the same decision manually would say if asked to justify it.
The LGPD’s transparency principle (article 6, IV) reaches the same outcome. The ANPD has confirmed that transparency means providing clear, accessible and accurate information, not a technical description that the data subject cannot act on.
(b) Language and accessibility
Under both regimes, organisations must:
• avoid technical jargon, acronyms and internal terminology;
• avoid vague language such as “may”, “could” or “possibly”;
• write at a reading level appropriate to the audience; and
• test the explanation with real members of the relevant audience and remain open to feedback.
(c) Layered explanations
Where conciseness and completeness are difficult to achieve simultaneously, a layered approach can help. The first layer presents the key information: the fact of automated decision-making, the outcome, its consequences, and the individual’s rights. A second layer, accessible on request or via a link, provides the more detailed technical explanation. This approach is endorsed in regulatory guidance under both the GDPR and, by analogy, the LGPD.
5. Explaining the Logic: How Insightful Is the Model?
How easily an automated decision can be explained depends substantially on the design of the model used. Organisations should understand where their models sit on the spectrum from transparent to opaque and what that means for their ability to fulfil their obligations under either regime.
| Model Type | Characteristics | Explanation Approach |
| --- | --- | --- |
| Insightful | Linear, monotonic, non-complex (e.g. small decision trees, rule-based systems, small regression models) | Explain directly: show the decision rules, formula or parameters |
| Requires additional techniques | Non-linear or complex models, large regressions, neural networks with a limited number of inputs | Use factor weighting (e.g. SHAP) or comparative (counterfactual) explanations |
| Opaque | Highly non-linear, non-monotonic, many interacting parameters; large neural networks; most large language models | Cannot currently be explained sufficiently under the GDPR or the LGPD transparency standard |
(a) Insightful models
A model is insightful when changes in the input have a direct, predictable effect on the outcome; when the direction of that effect is consistent; and when the number of interacting factors is small enough to follow. Rule-based systems, small decision trees and simple regression models typically fall into this category. Their logic can be explained directly by showing the rules, the formula, or the decision tree.
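As a concrete (and deliberately simplified) illustration, the sketch below shows a rule-based decision whose complete logic can be disclosed. The rules, thresholds and field names are hypothetical.

```python
# A minimal sketch of an "insightful" model: a rule-based credit decision
# whose complete logic can be disclosed. Rules and thresholds are hypothetical.
RULES = [
    ("monthly income below R$3,000", lambda a: a["monthly_income"] < 3_000),
    ("loan exceeds 10x monthly income", lambda a: a["loan_amount"] > 10 * a["monthly_income"]),
    ("more than one default in the last 24 months", lambda a: a["recent_defaults"] > 1),
]

def decide(applicant: dict) -> tuple[str, list[str]]:
    """Reject if any rule fires; return the outcome and the reasons verbatim."""
    reasons = [label for label, rule in RULES if rule(applicant)]
    return ("rejected" if reasons else "approved"), reasons

outcome, reasons = decide({"monthly_income": 2_500, "loan_amount": 40_000, "recent_defaults": 0})
print(outcome, reasons)
# rejected ['monthly income below R$3,000', 'loan exceeds 10x monthly income']
```

Because every rule is human-readable, the explanation given to the individual can simply quote the rules that fired.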
(b) Models that can be made transparent
Many models in commercial use are too complex to follow step by step but can be made more transparent through supplementary techniques. Two categories are most commonly used:
- factor weighting shows how much each input variable contributed to the specific outcome. Tools such as SHAP (SHapley Additive exPlanations) calculate the average effect of each factor, making it possible to tell an individual: “Your application was rejected primarily because of the loan amount requested, relative to your income.” Factor weighting must be calculated for the individual decision, not just presented as a general model summary.
- comparative (counterfactual) explanations show what would have had to change in the individual’s data for the outcome to have been different. For example: “If your annual income had been R$120,000 rather than R$80,000, or if the loan amount requested had been R$25,000 rather than R$50,000, the application would have been approved.” Comparative explanations give the individual a basis for action and for challenging the decision.
Neither technique is a complete solution on its own. Factor weighting does not show how factors interact; comparative explanations do not describe the underlying model logic. Organisations will typically need to combine both, supplemented by information about the algorithm’s objectives, the full set of factors considered, and the relationship between factors and outcome.
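The sketch below illustrates how the two techniques can be combined on a per-decision basis, using scikit-learn and the shap library. The training data, model, features and thresholds are all hypothetical; the exact values printed depend on the fitted model, shap’s output shape can vary by model type and library version, and a production deployment would use the organisation’s own model and validated explanation tooling.

```python
# Hypothetical sketch: per-decision factor weighting (SHAP) plus a simple
# one-feature-at-a-time counterfactual search. Illustrative only.
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
FEATURES = ["annual_income", "loan_amount", "years_employed"]

# Hypothetical training data: approval loosely depends on income vs. loan size.
X = rng.uniform([30_000, 5_000, 0], [200_000, 100_000, 30], size=(2_000, 3))
y = (X[:, 0] > 1.8 * X[:, 1]).astype(int)  # 1 = approved
model = GradientBoostingClassifier().fit(X, y)

applicant = np.array([[80_000.0, 50_000.0, 4.0]])  # predicted: rejected

# 1. Factor weighting: SHAP contributions for this individual decision,
#    not a global model summary.
contributions = shap.TreeExplainer(model).shap_values(applicant)[0]
for name, value in sorted(zip(FEATURES, contributions), key=lambda t: abs(t[1]), reverse=True):
    print(f"{name}: {value:+.3f}")

# 2. Comparative (counterfactual) explanation: the smallest single-feature
#    change that flips the outcome to "approved".
def counterfactual(x, feature_index, candidates):
    for v in candidates:
        trial = x.copy()
        trial[0, feature_index] = v
        if model.predict(trial)[0] == 1:
            return v
    return None

flip_income = counterfactual(applicant, 0, np.arange(80_000, 200_001, 5_000))
flip_loan = counterfactual(applicant, 1, np.arange(50_000, 4_999, -5_000))
if flip_income is not None:
    print(f"Approved if annual income were about R${flip_income:,.0f}.")
if flip_loan is not None:
    print(f"Approved if the loan requested were about R${flip_loan:,.0f}.")
```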
(c) Opaque models and the compliance problem
Some algorithms (particularly large neural networks and large language models) are too complex to explain adequately using any available technique. No current technique provides a sufficient explanation for these models in the context of individual automated decisions with significant impact.
Organisations that wish to use such systems for automated decisions face a fundamental compliance problem under both the GDPR and the LGPD: if the model cannot be explained, the obligation to explain cannot be met. Before deploying any automated decision-making system, organisations must satisfy themselves that adequate explanation is achievable. Complexity is not a defence; it is a risk to be managed at the design stage.
6. GDPR and LGPD: A Comparative Overview
The table below summarises the key differences between the GDPR and LGPD frameworks for automated decision-making explanation. Organisations operating across both jurisdictions should use this as the basis for a gap analysis, not a substitute for legal advice on specific circumstances.
| Topic | GDPR (EU) | LGPD (Brazil) |
| --- | --- | --- |
| Specific prohibition on automated decisions | Yes. Article 22 prohibits decisions producing legal or significant effects, subject to three exceptions | No equivalent express prohibition, but article 20 grants the right to review of automated decisions affecting the data subject’s interests |
| Scope of trigger | Legal effects or similarly significant effects | Decisions affecting the data subject’s interests, arguably broader than the GDPR threshold |
| Right to explanation | Not framed as a standalone right under that label, but derived from Articles 13, 14, 15 and 22, Recital 71, and confirmed by the CJEU in Dun & Bradstreet | Implicit through articles 9, 18 and 20; ANPD expected to issue further specific regulation |
| General (proactive) explanation | Required under articles 13 and 14 at or before data collection | Required under article 9 (purpose and transparency) and the general transparency principle |
| Personal (specific) explanation | Required on access request under article 15; confirmed by CJEU in Dun & Bradstreet (2025) | Available under article 20 right to review and article 18 right of access |
| Human review right | Mandatory under article 22(3): right to human intervention, to express a view, and to challenge the decision | Article 20: right to request review of the automated decision; the LGPD does not currently require that the review be conducted by a human, although human review may be necessary or advisable in higher-risk cases. |
| Legal bases for processing | 6 legal bases under article 6 GDPR | 10 legal bases under article 7 LGPD, including “legitimate interest” and “credit protection” |
| Enforcement authority | EU member state data protection authorities (e.g. CNIL, ICO, Datatilsynet); significant enforcement record | ANPD; enforcement increasingly active since 2023; first sanctions issued |
| Maximum penalty | 4% of global annual turnover or EUR 20 million, whichever is higher | 2% of Brazilian revenues (prior year), capped at BRL 50 million per violation |
| Explainability-by-design | Derived from data protection by design (article 25 GDPR) | Derived from accountability and prevention principles (article 6 LGPD) |
7. Limits on the Obligation to Explain
(a) Trade secrets
Where a complete explanation would require disclosure of genuinely protected trade secret information, for example, a proprietary algorithmic formula subject to intellectual property rights, the organisation may limit the explanation to avoid that disclosure. However, a general concern about commercial sensitivity is not sufficient. The organisation must demonstrate that the information is legally protected, that disclosure would cause concrete harm, and that the limitation is proportionate. Critically, trade secrecy limits what must be disclosed, not whether an explanation must be given at all.
The LGPD addresses this tension directly in the context of the article 20 right to review: article 20, paragraph 1 requires the controller to provide clear and adequate information about the criteria and procedures used in the automated decision, while expressly preserving commercial and industrial secrecy. The practical result is similar to the GDPR position: an organisation that cannot disclose its algorithm may still be required to describe what the algorithm does and what factors it considered.
(b) Gaming the system
An organisation may withhold specific information where disclosure would allow individuals to manipulate the decision-making process, for example, in fraud detection systems where knowledge of the detection criteria would enable circumvention. A general concern is not enough; the risk must be concrete and the withholding proportionate. Both the GDPR and LGPD frameworks support this balancing exercise.
(c) Procedural requirements when limiting explanation
When an organisation limits an explanation, it must in both jurisdictions:
• inform the individual that the explanation has been limited;
• explain the reasons for the limitation; and
• inform the individual of their right to complain to the relevant supervisory authority (the ANPD or the competent EU data protection authority) and to seek judicial remedy.
8. Governance: Building Explainability
(a) Explainability-by-design
Both the GDPR’s data protection by design principle (article 25) and the LGPD’s accountability and prevention principles (article 6) require organisations to consider explainability from the outset, not as an afterthought. In practice, this means making conscious choices at the design stage about which type of algorithm to use, confirming that adequate explanation is achievable, and documenting those choices.
(b) A three-phase explanation process
A structured approach to explainability covers three phases:
- Phase 1: Technique selection. Choose which explanation techniques are appropriate for the model in use. Determine how those techniques will be integrated into the decision-making process and IT systems.
- Phase 2: Delivery strategy. Develop a strategy for communicating explanations clearly and simply. Train the staff responsible for responding to access requests. Establish a contact point for individual queries.
- Phase 3: Evaluation. Test explanations with representative members of the relevant audience. Gather feedback. Review the process regularly, particularly when the underlying model changes.
(c) DPIAs and the LGPD equivalent
Under the GDPR, automated decision-making with significant impact will typically require a Data Protection Impact Assessment (“DPIA”). The DPIA is the appropriate place to document the organisation’s assessment of explainability risks and the measures taken to address them.
The LGPD does not yet have a mandatory DPIA equivalent in the same terms, though the ANPD has issued guidance encouraging impact assessments for high-risk processing. Organisations already conducting DPIAs for GDPR purposes should extend their explainability documentation to cover LGPD compliance as part of the same exercise.
9. Rights of Individuals
When an automated decision is made about an individual, that individual has specific rights under both regimes. The explanation is the mechanism that makes those rights meaningful: without it, an individual cannot know whether the decision was based on correct data, whether the algorithm applied relevant factors fairly, or whether there are grounds to challenge the outcome.
| Right | GDPR | LGPD |
| --- | --- | --- |
| Human review | Article 22(3): right to obtain human intervention in the decision | Article 20: right to request review of the automated decision (the LGPD does not currently require that the reviewer be a natural person) |
| Express a view | Article 22(3): right to express one’s point of view before a final decision | General principle; no express provision equivalent to article 22(3) |
| Challenge the decision | Article 22(3): right to contest the decision | Article 20: right to request review and correction; right to complain to the ANPD |
| Access to data and logic | Article 15: right of access includes explanation of the automated decision | Articles 18 and 20: right to access data and request review of automated processing |
| Correction | Article 16: right to rectification of inaccurate data | Article 18, III: right to correction of incomplete, inaccurate or outdated data |
| Portability | Article 20: right to data portability | Article 18, V: right to data portability |
10. Frequently Asked Questions
(a) Does the right to explanation apply to every automated decision?
No. Under the GDPR, it applies to fully automated decisions that produce legal effects or similarly significant effects. Decisions made with meaningful human involvement are generally outside scope. Under the LGPD, the threshold is broader: any automated decision affecting the data subject’s interests qualifies, but the ANPD has not yet issued detailed implementing guidance.
(b) Is providing the algorithm’s formula a sufficient explanation?
No. A mathematical formula tells most individuals nothing they can act on. The explanation must translate the algorithm’s operation into language that enables the individual to understand the outcome and, if necessary, challenge it. This was confirmed by the CJEU in Dun & Bradstreet (2025) for GDPR purposes and is consistent with the LGPD’s transparency standard.
(c) Can an organisation use a model it cannot explain?
Not for automated decisions with significant impact under the GDPR. Not, in practice, for automated decisions affecting data subjects under the LGPD, given the transparency and accountability obligations. The organisation must either choose a more explainable model, implement sufficient transparency techniques, or ensure meaningful human involvement in the decision.
(d) How quickly must an explanation be provided?
Under the GDPR, an access request, which triggers the obligation to provide a personal explanation, must be answered within one month, extendable by a further two months for complex cases. Under the LGPD, article 19 requires that access requests be responded to within 15 days. The LGPD timeline is substantially shorter and must be factored into operational planning.
(e) What penalties apply for failing to explain?
Under the GDPR, supervisory authorities may impose fines of up to 4% of global annual turnover or EUR 20 million per violation, whichever is higher. They may also order compliance, temporary or permanent processing bans, or data deletion. Under the LGPD, the ANPD can impose fines of up to 2% of Brazilian revenues (prior year), capped at BRL 50 million per violation, as well as daily fines for ongoing non-compliance, public disclosure of the violation, and data blocking or deletion.
Using automated decisions in Brazil or the EU?
My law firm advises organisations on LGPD and GDPR compliance, including automated decision-making obligations, explainability frameworks, and data subject rights.
Last modified: 29 April 2026
I am a lawyer admitted to practise in Brazil and Australia, enrolled as a barrister and solicitor in New Zealand and licensed as an attorney-at-law in New York. I write and edit articles for this site and practise law at 

