The Transparency Index: How We Actually Test Gummy Supplements
Why I Started Actually Verifying What’s Inside These Bottles
Look, I’ll be honest with you – this whole transparency scoring system started because I got burned.
Three years ago, I was dealing with vitamin D deficiency (hello, desk job life), and like millions of other people, I’d developed serious pill fatigue. You know that feeling when you’re already taking a handful of medications and the thought of adding another horse-pill vitamin makes you want to skip it entirely? That was me.
So I switched to gummy vitamins. Specifically, a popular brand that claimed 2000 IU of vitamin D3 per serving, had great reviews, and actually tasted decent. I took them religiously for six months, got retested, and my levels had barely budged.
That’s when I started digging.
Turns out, independent testing showed this particular product contained about 40% of the vitamin D claimed on the label. Not 90%. Not 80%. Forty percent. I’d been essentially taking an 800 IU supplement while paying for – and thinking I was getting – 2000 IU.
The Problem Nobody’s Talking About
Here’s what really got to me: this wasn’t some sketchy no-name brand. This was a company with millions in revenue, fancy marketing, and thousands of five-star reviews. And they were essentially lying on their label.
But here’s the thing – they probably weren’t doing it intentionally. Gummy manufacturing is incredibly complex. Vitamins degrade during the heating process. Some nutrients are fundamentally unstable in gummy format. Others interact with the gelatin or pectin base. Many companies genuinely don’t understand these formulation challenges, and consumers definitely don’t.
The result? An industry where “close enough” has become the standard, and consumers are left guessing whether they’re actually getting what they paid for.
What Makes This Different
After that experience, I became obsessed with verification. Not just reading labels or trusting manufacturer claims, but actually tracking down independent testing data, checking FDA databases, verifying certifications, and understanding the real research behind health claims.
The Transparency Index isn’t about perfection – it’s about honesty. A company that scores 7.5 isn’t necessarily worse than one that scores 8.2. But a company that refuses to disclose where their ingredients come from, has never been independently tested, and makes health claims with zero research backing? That’s a red flag you deserve to know about.
What you’ll find here:
- The actual research process I use for every product
- Examples of what high vs. low transparency looks like in practice
- The five categories I evaluate and why each one matters for your health and wallet
- Honest limitations about what this scoring system can and can’t tell you
Because honestly? You shouldn’t have to become a supplement research expert just to figure out if your gummy vitamins contain what they claim. That’s my job. Your job is understanding what these scores mean for your specific needs.
See the Difference: High vs. Low Transparency Scores
Here’s what these scores actually look like in practice. I’ve created two example products based on real testing patterns I see – one representing companies that impress me with their openness, and another that… well, let’s just say they’re not making it easy for consumers to make informed decisions.
Remember, this isn’t about the products working or not working (that’s a whole different conversation). This is purely about how transparent companies are about what they’re putting in these bottles.
Premium Brand A Organics – High Transparency (8.5/10)
- Top strength: USDA Organic certified, with detailed supplier disclosure and third-party testing verification
- Minor gap: Limited batch consistency data available, though no consumer complaints found

Generic Supplement Co. – Low Transparency (2.7/10)
- Major concern: No independent testing found, and multiple consumer reports of inconsistent products
- Red flag: Makes broad immunity claims with zero research citations or evidence backing
The Five Categories Explained
Look, when I first started doing this transparency analysis, I thought I’d just check if the vitamin content matched the label and call it a day. Turns out, that was… naive. The more I dug into how these products actually get made, the more I realized there are five distinct areas where companies can either be completely open with consumers or leave you guessing.
Here’s what I evaluate for every single product, and honestly? The patterns you see across these categories tell you a lot about a company’s priorities.
Label Accuracy (25%)
This is the big one – does the bottle actually contain what it claims? I weight this most heavily because, well, if you can’t trust the numbers on the label, everything else is kind of pointless.
My process here gets pretty nerdy. I’m searching ConsumerLab databases, checking PubMed for any independent testing studies, and digging through FDA warning letter databases to see if companies have gotten in trouble for mislabeling. The gold standard is when I can find third-party testing that shows, say, 95% of the claimed vitamin C content. That’s fantastic. When I find products with 40% of claimed content? That’s a problem.
Here’s what drives me crazy – I’ve found products where the company genuinely didn’t know their manufacturing process was destroying half the nutrients. Heat, pH levels, interaction with the gummy base… there are so many ways vitamins can degrade during production. Some companies test for this and adjust their formulations. Others just… don’t.
The scoring here is pretty straightforward: 90-110% of claimed content gets you a high score. Major deviations or FDA violations for mislabeling? You’re looking at a score below 5.
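If you like seeing rules as code, here’s a minimal sketch of how that mapping could work – measured content (as a percentage of the label claim) in, a 0–10 category score out. The exact thresholds and penalty values here are illustrative assumptions, not my precise formula.

```python
def label_accuracy_score(measured_pct: float, fda_violation: bool = False) -> float:
    """Map independently measured content (% of label claim) to a 0-10 score.

    Thresholds are illustrative: 90-110% of the claim scores high, and an
    FDA mislabeling violation caps the score below 5 no matter what.
    """
    if fda_violation:
        return min(4.0, measured_pct / 25)  # violations never reach a 5
    if 90 <= measured_pct <= 110:
        return 9.0  # within normal analytical and overage tolerance
    # Otherwise the score falls off with the size of the deviation
    deviation = min(abs(measured_pct - 100), 100)
    return max(0.0, 9.0 - deviation / 10)

# The 40%-of-claim vitamin D product from the intro lands around a 3:
print(label_accuracy_score(40))   # 3.0
print(label_accuracy_score(95))   # 9.0
```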
Red flag I always watch for: Companies that refuse to provide Certificates of Analysis when asked. If they’re confident in their product, why wouldn’t they share the testing data?
Research Backing (20%)
This category makes me want to bang my head against the wall sometimes. The number of products making wild health claims with zero – and I mean zero – research backing is just… ugh.
I’m not expecting every gummy company to fund their own clinical trials (though some actually do, which is impressive). But when you claim your product “boosts immunity” or “improves focus,” I want to see some evidence. Real evidence. Not a study on rats, not research using ten times the dose in your product, and definitely not a study that was completely misrepresented.
My approach here is systematic: I document every health claim on the packaging and website, then I go hunting for supporting research. I’m looking for randomized controlled trials first, then meta-analyses, then individual ingredient studies. I rank the evidence quality and check if the company is accurately representing the findings.
What impresses me: Companies that actually cite their research sources and don’t overstate the benefits. Like, a sleep gummy that says “may help with occasional sleeplessness” and cites three melatonin studies? That’s honest marketing.
What makes me skeptical: Vague claims about “supporting your immune system” with no studies mentioned anywhere. Or worse, companies that cite completely irrelevant research hoping you won’t check.
Here’s the thing about this category – I’ve personally tested products with amazing research backing that tasted terrible, and products with zero research that I actually liked using. The research score isn’t about whether the product works for you; it’s about whether the company is making honest, evidence-based claims.
Ingredient Sourcing Disclosure (20%)
Now we’re getting into territory where most companies just… don’t want to talk. Which, honestly, tells you something right there.
I’m not expecting every company to reveal their trade secrets, but basic transparency about where ingredients come from? That should be standard. I want to know if your vitamin C is synthetic (which is fine!) or food-derived. Is your vitamin D3 from lanolin (sheep’s wool) or lichen? Where are you sourcing your organic berry extracts?
The best companies I’ve tested actually name their suppliers. Like, “Our vitamin D3 comes from DSM in Scotland.” That’s transparency. The worst companies won’t even tell you if ingredients are domestic or imported, which makes me wonder what they’re trying to hide.
What I’m looking for: sustainability certifications that are actually verified, geographic origins for major ingredients, and any information about supplier quality standards. Some companies publish detailed sourcing reports. Others won’t respond to basic questions about ingredient origins.
Personal pet peeve: Companies that slap “premium ingredients” on their labels without defining what that means. Premium compared to what? Show me the sourcing standards that make your ingredients “premium.”
I’ve spent way too much time tracking down ingredient suppliers through LinkedIn and trade publications, but it’s worth it. The companies that are transparent about sourcing tend to be more careful about quality across the board.
Manufacturing Transparency (20%)
This category is fascinating because it reveals so much about a company’s approach to quality. Are they manufacturing in their own facilities or contracting it out? (Both can be fine, by the way.) Do they disclose where products are made? What quality control processes do they follow?
I’m digging through FDA facility databases, checking for manufacturing certifications like cGMP (current Good Manufacturing Practices), and looking for any inspection reports or violations. The companies that score well here aren’t necessarily the biggest or most expensive – they’re just honest about their processes.
What really impresses me: Companies that provide facility tour videos or detailed quality control explanations. I found one brand that shows their entire batch testing process on their website. That’s the kind of transparency I wish was standard.
Major red flag: Any company claiming their products are made in an “FDA-approved facility.” The FDA doesn’t approve facilities – they register them and inspect them. This kind of misleading language makes me wonder what else they’re getting wrong.
The scoring here ranges from companies that provide comprehensive facility information and certifications (high scores) to companies that won’t even tell you what country their products are made in (low scores).
Between you and me, I was pretty surprised by how much manufacturing transparency varies even within the same price range. Some budget brands are incredibly open about their processes, while some premium brands are surprisingly secretive.
Certifications (15%)
Last but definitely not least – and honestly, this category can be the most revealing because it’s where companies either put their money where their mouth is or try to fake it.
Real certifications cost money and require ongoing compliance. USP Verified, NSF Certified, USDA Organic – these aren’t just logos you can buy. They require testing, facility inspections, and continuous monitoring. When I see these certifications, I always verify them through the issuing organizations’ databases.
Here’s what kills me: the number of products displaying fake or expired certification logos. I’ve found companies using NSF logos without any NSF certification, claiming “FDA approval” (which doesn’t exist for supplements), and displaying expired organic certifications. It’s like they’re counting on consumers not checking.
The hierarchy I use (see the sketch after this list):
- Tier 1: USP Verified, NSF Certified for Sport, Informed Choice – these involve actual product testing
- Tier 2: Manufacturing certifications like cGMP, SQF – these verify processes but not finished products
- Tier 3: Ingredient certifications like USDA Organic, Non-GMO Project – important but more limited in scope
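To make that hierarchy concrete, here’s a minimal sketch of how it might translate into scoring logic. The tier assignments come straight from the list above; the point values and the fake-certification penalty are illustrative assumptions, not my exact numbers.

```python
# Tier 1 certifications test finished products; tier 2 verify processes;
# tier 3 cover ingredients only (see the hierarchy above).
CERT_TIERS = {
    "USP Verified": 1,
    "NSF Certified for Sport": 1,
    "Informed Choice": 1,
    "cGMP": 2,
    "SQF": 2,
    "USDA Organic": 3,
    "Non-GMO Project": 3,
}

# Illustrative base scores (out of 10) keyed by the best tier present
TIER_SCORES = {1: 9.0, 2: 7.0, 3: 6.0}

def certification_score(verified_certs: list[str], fake_or_expired: int = 0) -> float:
    """Score the certifications category from verified certifications."""
    tiers = [CERT_TIERS[c] for c in verified_certs if c in CERT_TIERS]
    base = TIER_SCORES[min(tiers)] if tiers else 3.0  # no certs: low, not zero
    # Fake or expired logos are an integrity problem, so they cost heavily
    return max(0.0, base - 4.0 * fake_or_expired)

print(certification_score(["USDA Organic", "cGMP"]))              # 7.0
print(certification_score(["NSF Certified for Sport"]))           # 9.0
print(certification_score(["USDA Organic"], fake_or_expired=1))   # 2.0
```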
The companies that score highest here don’t just have certifications – they have the right certifications for their claims, they’re all current and verified, and they understand what each one actually means.
This category gets the lowest weight (15%) because certifications aren’t everything. I’ve tested great products with minimal certifications and mediocre products with impressive certification displays. But fake certifications? That’s an immediate red flag about the company’s overall integrity.
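And since all five weights are now on the table (25/20/20/20/15), here’s a minimal sketch of how the category scores roll up into the overall number. The category values in the example product are hypothetical.

```python
WEIGHTS = {
    "label_accuracy": 0.25,
    "research_backing": 0.20,
    "sourcing_disclosure": 0.20,
    "manufacturing_transparency": 0.20,
    "certifications": 0.15,
}

def overall_score(category_scores: dict[str, float]) -> float:
    """Weighted average of the five category scores (each 0-10)."""
    return round(sum(WEIGHTS[k] * category_scores[k] for k in WEIGHTS), 1)

# Hypothetical product: strong label accuracy, weak sourcing disclosure
example = {
    "label_accuracy": 9.0,
    "research_backing": 7.5,
    "sourcing_disclosure": 4.0,
    "manufacturing_transparency": 6.5,
    "certifications": 7.0,
}
print(overall_score(example))  # 6.9
```

Notice how one opaque category drags an otherwise solid product down into “fair” territory – that’s the weighting doing exactly what it’s supposed to do.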
The bigger picture: None of these categories exist in isolation. The most transparent companies score well across all five areas because transparency is part of their company culture. The least transparent companies tend to have gaps everywhere – it’s like they’re hoping consumers won’t ask the hard questions.
And honestly? That’s exactly why I started doing this analysis in the first place. Someone needs to ask the hard questions.
What Your Transparency Score Actually Means
Okay, so you’ve got a number. Now what? Look, I know seeing “7.3 out of 10” on a supplement doesn’t automatically tell you if you should buy it or not. That’s not really the point. These scores are more like… a transparency GPS. They tell you how much you can actually know about what you’re putting in your body.
Let me break down what these score ranges actually mean in practice, because honestly? I’ve learned some surprising things about how transparency (or lack thereof) plays out in the real world.
9.0-10.0: Exceptional Transparency
“These companies get it”
I’m not gonna lie – I don’t hand out scores in this range very often. We’re talking about maybe 5% of the products I test. Companies in this range aren’t just checking boxes; they’re actively trying to make it easy for you to verify their claims.
What this looks like in practice: They’ll publish Certificates of Analysis on their website. They name their suppliers. When they cite research, the studies actually support what they’re claiming (revolutionary concept, I know). You can usually find independent testing that confirms their label accuracy, and their customer service will actually answer detailed questions about sourcing and manufacturing.
Personal example: I tested one brand where I could literally trace their vitamin D3 back to the specific lichen farms in Iceland. Their manufacturing facility had live webcams you could watch. That’s… maybe a little excessive, but it shows a commitment to transparency that goes way beyond compliance.
Should you automatically buy these products? Not necessarily. High transparency doesn’t guarantee the product will work better for you, and these brands often cost more because, well, transparency and quality control aren’t free. But if you’re someone who wants to know exactly what you’re getting, this is your comfort zone.
8.0-8.9: Very Good Transparency
“Solid companies with minor gaps”
This is actually where I find a lot of my personal favorites. These companies are doing most things right, but they might have one area where they’re a bit secretive or haven’t invested in comprehensive testing yet.
Common patterns here: Great certifications and research backing, but limited sourcing disclosure. Or excellent ingredient transparency but minimal manufacturing details. Sometimes it’s a newer company that hasn’t had time to build up all their verification systems yet.
Real talk: I’ve found some genuinely excellent products in this range. The gaps tend to be more about transparency than actual quality issues. Like, maybe they won’t tell you exactly where their organic berries come from, but the USDA Organic certification is legitimate and their independent testing results are solid.
What to watch for: These companies are usually responsive when you ask questions about their gaps. A company that scores 8.2 but won’t answer basic questions about their manufacturing? That’s more concerning than a company that scores 8.2 and says, “We’re working on getting more detailed sourcing information available by next quarter.”
7.0-7.9: Good Transparency
“Meeting most consumer needs”
Here’s where things get interesting. Companies in this range usually nail the basics – their labels are generally accurate, they have some legitimate certifications, and they’re not making completely outrageous health claims. But they’re also not going above and beyond to prove their quality.
This is actually where I think most consumers should be comfortable. You’re getting enough information to make an informed decision without paying premium prices for comprehensive transparency. It’s like… you can see the engine of the car, but maybe you can’t inspect every individual part.
Typical scenario: Independent testing shows 85% of claimed vitamin content (not perfect, but functional). They mention “high-quality ingredients” without naming specific suppliers. Their research citations are relevant but maybe not comprehensive. Manufacturing location is disclosed but quality processes aren’t detailed.
Between you and me, I use plenty of products in this range. Sometimes I care more about how something tastes or fits my routine than having a complete supply chain audit.
6.0-6.9: Fair Transparency
“Some important information is missing”
Now we’re getting into territory where I start asking more questions. These scores usually mean there are some significant gaps in what you can verify about the product, but it’s not necessarily a deal-breaker.
Common issues here: Limited or no independent testing data. Health claims that are… optimistic compared to the research. Basic quality compliance but minimal disclosure about actual processes. Sometimes it’s ingredient sourcing that’s completely opaque.
When this might be okay: If you’re trying a new ingredient category and want to test the waters without a big investment. If you’re more focused on convenience and taste than comprehensive quality verification. If the specific gaps don’t matter much for your particular needs.
When to be cautious: If this is a supplement you’ll be taking long-term. If you have specific health conditions that make ingredient quality crucial. If the company won’t answer basic questions when you contact them directly.
I’ve definitely tried products in this range that worked great for me, but I went in with realistic expectations about what I could actually verify.
5.0-5.9: Basic Transparency
“Minimal disclosure, several unknowns”
Look, I’m not going to sugarcoat this – scores in this range mean you’re essentially trusting the company without much ability to verify their claims. That doesn’t automatically make the product bad, but it does make it a gamble.
What typically drives these scores: No available independent testing. Vague or missing ingredient sourcing. Health claims with little to no research backing. Basic compliance statements but no detailed quality information.
Here’s my take: Sometimes these are perfectly functional products from companies that just haven’t invested in transparency infrastructure. Other times, they’re from companies that have something to hide. The problem is, you can’t really tell the difference from the outside.
Personal experience: I’ve tested budget products in this range that actually performed well in my own usage, but I had no way to verify if I was getting consistent quality batch to batch. That uncertainty became frustrating over time.
Below 5.0: Poor Transparency
“Significant concerns”
Okay, this is where I start getting genuinely worried about what’s going on behind the scenes. Scores below 5.0 usually mean multiple red flags across different categories.
Red flags that drive these scores: False or unverifiable certifications. Major discrepancies between claims and available evidence. Refusal to answer basic questions about products. Health claims that are completely unsupported or potentially misleading.
Real example without naming names: I tested one product that claimed FDA approval (which doesn’t exist for supplements), displayed an expired organic certification logo, and made immunity claims based on a study that was completely unrelated to their actual ingredients. That’s… not great.
Should you avoid these entirely? That’s really up to your risk tolerance. Some people are fine with uncertainty if the price is right. Personally? I’ve got enough good options in higher score ranges that I don’t feel the need to gamble.
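For quick reference, here’s the whole ladder as one small lookup – the same bands described above, nothing more:

```python
def transparency_band(score: float) -> str:
    """Name the transparency band for an overall 0-10 score."""
    bands = [
        (9.0, "Exceptional Transparency"),
        (8.0, "Very Good Transparency"),
        (7.0, "Good Transparency"),
        (6.0, "Fair Transparency"),
        (5.0, "Basic Transparency"),
    ]
    for floor, label in bands:
        if score >= floor:
            return label
    return "Poor Transparency"

print(transparency_band(7.3))  # Good Transparency
print(transparency_band(2.7))  # Poor Transparency
```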
How to Actually Use These Scores
Here’s the thing – I don’t want you to just buy the highest-scoring products and call it a day. These scores are tools for making informed decisions based on your priorities.
If you’re dealing with a specific health condition: Higher transparency scores give you more confidence that you’re getting consistent, verified ingredients. The last thing you want is to think you’ve found a supplement that helps, only to discover the next batch has completely different potency.
If you’re budget-conscious: Look at the score breakdown by category. Maybe a product scores 6.8 overall, but it’s got great label accuracy and research backing – just limited sourcing disclosure. If you don’t care much about where ingredients come from, that might be perfect for you.
If you’re trying something new: Start with higher-scoring products to establish whether the ingredient category actually works for you, then you can decide if you want to experiment with less transparent options.
If taste and convenience matter most: Don’t let perfect be the enemy of good. A product that scores 7.2 but tastes terrible and has a weird texture isn’t going to help you if you won’t actually take it consistently.
And honestly? Sometimes I ignore my own scoring system. There’s one sleep gummy I’ve been using for months that only scores 6.4, but it works incredibly well for my schedule and doesn’t leave me groggy. The transparency gaps don’t outweigh the practical benefits for my specific situation.
The scores give you information. What you do with that information depends on your individual priorities, risk tolerance, and health needs. That’s exactly how it should be.
Frequently Asked Questions
How often do you update these scores?
This is actually something I’m still figuring out, to be honest.
Right now, I do formal updates quarterly for products I review regularly, but I’ll update scores immediately if I find new information that significantly changes the picture – like if a product gets hit with an FDA warning letter, or if new independent testing becomes available.
Between you and me, I probably spend too much time monitoring for updates. I’ve got Google alerts set up for every brand I’ve scored, and I check the major testing databases monthly. It’s… maybe a little obsessive, but supplement quality can change pretty dramatically when companies reformulate or switch manufacturers.
The tricky part is figuring out when a change is significant enough to warrant a score update. If a company adds a new certification, that’s easy. But what about when consumer reports start mentioning batch inconsistency issues? How many reports constitute a pattern? I’m still working out those judgment calls.
What if a company disagrees with their score?
Here’s my policy: I’m always willing to look at new evidence or correct factual errors. But I’m not changing scores just because a company doesn’t like the result. The whole point is independent analysis.
How does this compare to ConsumerLab or other testing organizations?
Totally different animals, really. ConsumerLab does actual laboratory testing of finished products – they can tell you the exact nutrient content, check for contaminants, and test dissolution rates. That’s incredibly valuable information that I can’t provide.
What I’m doing is more like… investigative journalism meets consumer advocacy. I’m looking at the broader transparency picture – not just what’s in the bottle, but how honest companies are being about everything from sourcing to research claims.
Think of it this way: ConsumerLab might tell you that a vitamin D gummy contains 92% of the claimed vitamin D content. I might tell you that the company making that gummy refuses to disclose where their vitamin D comes from, cites irrelevant research for their health claims, and has never been independently tested outside of that one ConsumerLab report.
Both pieces of information are useful. Neither tells the complete story by itself.
Do companies pay you for good scores?
No. Full stop.
I do use affiliate links when I recommend products (and I’m always transparent about that), but the scoring system is completely separate from any revenue considerations. I’ve given high scores to products I don’t have affiliate relationships with, and I’ve given low scores to products from brands that wanted to work with me.
The affiliate thing is different because it only kicks in if you decide to buy something after reading my analysis. But direct payments from companies? That would completely compromise the independence that makes this scoring system useful.
What are the biggest limitations of your system?
Oh man, where do I start? I think about this constantly, and honestly, there are some pretty significant limitations you should understand.
First, I can only score what’s publicly available or verifiable through databases. If a company is doing amazing quality control internally but doesn’t publicize it, that won’t show up in my transparency scores. I’m measuring openness, not necessarily actual quality.
Second, my research is only as good as the sources available to me. If ConsumerLab hasn’t tested a product, if the FDA hasn’t inspected a facility recently, if a company’s research citations are behind paywalls – there are gaps in what I can verify.
Third, I’m human. I make judgment calls about how to weigh different types of evidence, and other reasonable people might weigh things differently. I try to be consistent, but I’m sure there’s some subjectivity creeping in.
And honestly? The scoring system doesn’t capture everything that matters for supplement decisions. A product might score 9.2 for transparency but taste terrible, be ridiculously expensive, or just not work well for your particular biochemistry. High transparency is great, but it’s not the only factor that matters.
I’m overwhelmed by all this. Can’t you just tell me what to buy?
I get this question a lot, and I totally understand the impulse. You’re busy, you want supplements that work, and you don’t necessarily want to become an expert in supplement transparency analysis.
Here’s my practical advice for using these scores without getting overwhelmed:
If you’re a “set it and forget it” person: Look for products scoring 7.5 or higher. You’ll have enough transparency to feel confident, without needing to dig into all the details.
If you’re budget-conscious: Focus on the Label Accuracy scores. A product with great label accuracy but limited sourcing disclosure might be perfect if you don’t care about ingredient origins.
If you have specific health concerns: Higher overall scores give you more confidence in consistency, which matters when you’re trying to determine if a supplement actually helps your particular situation.
If you just want to try something new: Don’t overthink it. Pick something with decent scores that fits your budget and preferences. You can always switch to a higher-transparency option later if you decide you like the ingredient category.
The scores are there to inform your decision, not paralyze you with analysis. Sometimes good enough really is good enough.
What about new products that haven’t been independently tested yet?
This is actually a growing issue as the supplement market expands so rapidly. New brands and products are launching faster than testing organizations can keep up with.
For newer products, I rely more heavily on the other categories – research backing for claims, manufacturing transparency, certifications. Sometimes a new product from an established manufacturer will get the benefit of the doubt based on that company’s track record.
But honestly? This is where the scoring system shows its limitations. A brand-new product from a company with no testing history is going to score lower than it might deserve, simply because there’s no verification data available yet.
My advice for new products is to wait for some independent verification if possible, unless you’re comfortable being an early adopter. Sometimes that means missing out on innovative formulations, but it also means avoiding products that might not deliver what they promise.
How do I know when a score difference actually matters?
Great question, and one I wrestle with regularly. Is a product scoring 7.8 really meaningfully different from one scoring 8.1? Probably not.
Here’s how I think about significant differences (there’s a quick code sketch after the list):
Within 0.5 points: Basically equivalent. Choose based on other factors like price, taste, or convenience.
0.5-1.0 point difference: Noticeable difference in transparency, but might not matter depending on your priorities.
1.0+ point difference: Significantly different levels of transparency. The higher-scoring product gives you substantially more confidence in what you’re getting.
2.0+ point difference: Major transparency gap. The lower-scoring product involves considerably more uncertainty.
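If it helps to see those cutoffs in one place, here’s the same rule of thumb as a tiny sketch:

```python
def compare_scores(a: float, b: float) -> str:
    """Translate the gap between two transparency scores into plain language."""
    gap = abs(a - b)
    if gap < 0.5:
        return "basically equivalent - decide on price, taste, or convenience"
    if gap < 1.0:
        return "noticeable transparency difference - may not matter for you"
    if gap < 2.0:
        return "significantly different levels of transparency"
    return "major transparency gap - much more uncertainty with the lower score"

print(compare_scores(7.8, 8.1))  # basically equivalent...
print(compare_scores(8.5, 2.7))  # major transparency gap...
```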
But honestly, sometimes I care more about the category breakdown than the overall score. A product might score 7.2 overall, but if it gets a 9.0 for label accuracy and that’s what matters most to you, the overall score is less relevant.
Use the scores as a starting point for comparison, not as the final word on what you should buy.
Got a question I didn’t cover here? Send it my way. I’m always interested in hearing what aspects of this analysis are most (or least) useful for actual supplement decisions. This whole system evolved from questions like yours, and it’ll keep evolving as I learn more about what consumers actually need to make informed choices.