Understanding the Grades
What each rating means for you — and what to do about it
Limpid grades are not verdicts. They are readiness signals. Every tool on Limpid was reviewed because it is growing fast and developers are adopting it. That growth alone makes it worth scrutinising — not condemning. A low grade does not mean "don't use this". It means "here is what you need to know and do first".
The grade reflects how much work a typical developer or team needs to do to use the tool safely and responsibly. A high grade means the tool does most of that work for you — strong defaults, clear data practices, an accountable team. A low grade means more of that work falls on you.
How grades are assigned
Each review examines four areas: who is accountable behind the tool, how it handles your data and your users' data, whether it is safe to run without significant configuration, and whether its marketing accurately represents what it actually does. These four areas are weighted to reflect what matters most for security-conscious adoption.
Grades run from A to F across eight bands. The bands are not evenly spaced — the difference between a C and a D reflects a meaningful step up in the care required before adoption, not just a marginal score difference.
The eight grades
| Grade | Signal | What it means in practice |
|---|---|---|
| A | ✅ Production ready | The tool has done the security and privacy work. Named team, strong defaults, clear data practices, honest marketing. Suitable for most use cases including those handling sensitive data, with standard due diligence. |
| A- | ✅ Production ready with minor notes | Strong across the board with small gaps — a slightly vague clause in the privacy policy, or a minor default that warrants a quick config check. Nothing that should block adoption for most teams. |
| B | 🟡 Good — review the specifics | Generally trustworthy. Some hardening recommended before team or production use. The review will tell you exactly what to check. Most developers can adopt this with modest preparation. |
| B- | 🟡 Good — a few things to address | Above average, but with notable areas to investigate — data retention policies worth reading, or defaults that need adjusting before wider team use. Workable with attention. |
| C | 🟠 Proceed with awareness | Not a blocker for experimentation, but requires active investigation before any production or team use. The recommendations section of the review will tell you what safeguards to put in place. |
| C- | 🟠 High friction — prepare carefully | Using this tool safely requires deliberate effort. Read the full review before proceeding and follow the safeguards closely. |
| D | 🔴 Significant risks | Not recommended for production environments or any use case involving personal data. Can still be used in isolated, controlled settings by developers who understand the risks and follow the recommended safeguards. |
| F | 🚫 Avoid or isolate | If you must evaluate this tool, do so in an air-gapped environment with no access to real data. Not suitable for any connected or production use. |
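For teams that track grades programmatically, the eight bands above could be captured as a simple lookup. This is an illustrative sketch, not part of Limpid itself: the names are invented, and the boolean is a deliberate simplification — it flags only whether a band is generally workable for production with preparation (A through B-), collapsing the per-band nuance in the table.

```python
# Illustrative lookup of Limpid's eight grade bands, ordered highest to lowest.
# The labels come from the table above; the boolean is a simplification that
# marks bands generally workable for production with preparation (A through B-).
GRADE_BANDS = [
    ("A",  "Production ready",                  True),
    ("A-", "Production ready with minor notes", True),
    ("B",  "Good, review the specifics",        True),
    ("B-", "Good, a few things to address",     True),
    ("C",  "Proceed with awareness",            False),
    ("C-", "High friction, prepare carefully",  False),
    ("D",  "Significant risks",                 False),
    ("F",  "Avoid or isolate",                  False),
]

def production_workable(grade: str) -> bool:
    """True if the band is generally workable for production with preparation.

    Unknown grades default to False (treat ungraded tools conservatively).
    """
    return {g: ok for g, _, ok in GRADE_BANDS}.get(grade, False)
```

A policy like this could back a CI gate that blocks new dependencies below a chosen band, though the table's per-grade guidance is richer than any single flag.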
Grades change as tools evolve
A tool rated C today may be rated A in six months. Emerging tools move fast. Limpid reviews reflect a point in time, and every review carries a date. If a tool has addressed issues raised in a previous review, it can be re-reviewed and regraded. Grades are not permanent.
The goal is not to protect you from new tools. It is to make sure that when you adopt one, you do so with your eyes open — knowing what the risks are, what safeguards to put in place, and what to watch as the tool evolves.
If you are evaluating a tool and cannot find a Limpid review for it, you can suggest it for review — Limpid prioritises tools by how fast they are growing and how many readers have requested them.