How We Research, Test & Review Everything on Funnel Secrets

We Don’t Just Skim Sales Pages and Regurgitate Features. Here’s Our Actual Process.

Let me guess why you’re here.

You’re wondering: “Can I actually trust what these people say? Or is this just another affiliate site recommending whatever pays the highest commission?”

Fair question. Smart question, actually.

The internet is infested with “review sites” that:

  • Never touched the product they’re reviewing
  • Copy-paste the same feature lists from every other blog
  • Give everything 4.5 stars because… affiliate money
  • Hide their methodology (if they even have one)

That’s not us.

This page explains exactly how we research, test, and form our opinions on everything we cover—software, books, and courses. No mystery. No black box. Complete transparency.

If you disagree with our methods, tell us. If you find errors, call us out. We’ll either fix it or explain our reasoning.

That’s the deal.

But First: Let’s Talk About Most “Review” Sites

You know what most funnel software reviews look like?

Some SEO guy (who’s never built a funnel in his life) Googles “ClickFunnels review,” opens the top 3 results, mashes them together like a content smoothie, sprinkles in some keywords, and hits publish.

The result?

Dry. Generic. Boring as watching paint dry on a Tuesday afternoon.

You get the same recycled “pros and cons” lists. The same vague feature descriptions. The same suspiciously identical “4.7 out of 5” ratings that somehow every product gets.

And don’t even get me started on the “Updated 202X!” badges slapped on articles that clearly haven’t been touched since 2021.

(If I see another review that still mentions features that were discontinued two years ago, I might scream.)

Here’s What Makes Us Different

We actually USE the stuff we review.

Wild concept, I know.

But also? We try to make this stuff actually readable.

Look—sales funnels, marketing software, conversion optimization—none of this is life-or-death serious. It’s not neurosurgery. It’s not defusing bombs.

It’s building web pages that sell stuff.

So why do most reviews read like they were written by robots having a bad day?

We believe you can be helpful AND entertaining. Thorough AND not put people to sleep. We throw in jokes. We use weird analogies. We occasionally get sidetracked talking about whether a software’s UI looks like it was designed by someone who hates joy.

Because if you’re not enjoying the read, you’re not going to finish it. And if you don’t finish it, we can’t help you make a good decision.


Our Review Philosophy

Before we get into specifics, here are the non-negotiable principles that guide everything we publish:

1. We Actually Use the Thing

Revolutionary concept, apparently.

If we review software, we create a real account. If we review a book, we read it cover to cover. If we review a course, we go through the modules.

No exceptions.

(This is why we don’t review 50 different funnel builders. We review the ones we’ve actually spent meaningful time inside.)

2. We Share BOTH Pros and Cons


Every product has weaknesses. Every. Single. One.

If a review only talks about how amazing something is? That’s not a review. That’s an advertisement.

We dedicate real space to limitations, frustrations, and “who should NOT buy this.” Even for products we genuinely recommend.

Sometimes especially for products we genuinely recommend.

3. We Prioritize “Who Is This For?” Over “Is This Good?”

A tool that’s perfect for a beginner might be terrible for an agency.

A $2,000 course might be a waste for someone who just needs a $15 book.

Context matters more than absolute ratings. We try to match products to people, not just rank everything on a generic scale.

4. We Update When Things Change

Software gets updated. Pricing changes. New features launch. Old features disappear.

A review written in 2022 might be dangerously outdated by 2026. We revisit and update our content—and clearly mark when we do.

(Unlike those “Updated January 2025!” articles that are clearly lying.)

5. We Try Not to Be Boring

This is apparently controversial in the review space.

Most software reviews read like they were written by a corporate committee, reviewed by legal, and then had all personality surgically removed.

We’d rather you actually enjoy reading about funnel builders. Even if that means occasionally comparing a clunky UI to something that was designed by a committee of confused badgers.

6. Honest Assessment > Affiliate Revenue

Yes, we use affiliate links. (More on that later.)

But we’ve actively recommended AGAINST products that pay higher commissions because they weren’t the right fit. We’ve called out problems with products we’re affiliates for.

Long-term reader trust > short-term commission check.

Every time.

How We Review Funnel Software

This is probably why most of you are here. Let’s break down exactly how we evaluate funnel builders, landing page tools, and marketing software.

The Problem With Most Software Reviews

Here’s how 90% of “reviews” get written:

  1. An SEO keyword researcher finds that “ClickFunnels alternatives” has search volume
  2. A content writer (who has never touched ClickFunnels) gets assigned the article
  3. The writer Googles the topic, opens the top 5 results, and takes notes
  4. The writer rewrites the same information in slightly different words
  5. The article gets published with stock photos and a generic “comprehensive” headline
  6. The article ranks because Google hasn’t figured out this game yet

The result?

You get 47 articles that all say the exact same things.

Same features listed. Same vague “user-friendly interface” claims. Same suspiciously similar “cons” sections that read like they were copied from each other. (Because they were.)

Nobody actually logged in. Nobody built anything.

That’s not what we do.

The Funnel Secrets Testing Process

Step 1: Create a Real Account

Not a demo. Not a sandbox. A real account with real money (or at least a real trial that we actually use).

We sign up as a regular customer would. We experience the onboarding. We get the upsells. We see what the “inside” actually looks like.

Step 2: Build Actual Funnels

We don’t just poke around the dashboard for 20 minutes and call it a review.

We build real pages. Real opt-in forms. Real email sequences. Real payment integrations (when applicable). We try to break things. We test edge cases.

We connect the email tool. We set up the automation. We see what happens when you try to do something slightly outside the template.

(This is usually where the frustrations emerge.)

Step 3: Document Everything

Screenshots. Screen recordings. Notes on what worked, what didn’t, what made us want to throw our laptop across the room.

This becomes the evidence base for our review. Not theoretical features from a sales page—actual experience.

Step 4: Test Across Multiple Use Cases

Is it good for beginners? What about advanced users? Does it work for e-commerce? Service businesses? Course creators?

Different users have different needs. We try to evaluate from multiple perspectives—not just “what I personally want.”

Step 5: Research Pricing Thoroughly

Not just the advertised price—the REAL cost.

What features are locked behind higher tiers? What’s the price after the “introductory offer” expires? What hidden fees exist? What happens when you hit subscriber limits?

We do the annoying math so you don’t have to.

(Pro tip: That “$97/month” plan is almost never actually $97/month once you need anything useful.)
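Here’s what that “annoying math” looks like in practice. A quick sketch, with entirely made-up numbers (no specific vendor’s pricing is being quoted here), showing how an advertised monthly price balloons once the usual add-ons are counted:

```python
# Hypothetical example: the real first-year cost of a "$97/month" plan
# once the add-ons most users end up needing are factored in.
# Every figure below is invented for illustration only.

base_monthly = 97    # advertised price
email_addon = 29     # email automation, sold separately
extra_domain = 10    # a second custom domain
overage_fee = 15     # average monthly overage past the contact limit

real_monthly = base_monthly + email_addon + extra_domain + overage_fee
first_year = real_monthly * 12

print(f"Advertised: ${base_monthly}/mo")
print(f"Real cost:  ${real_monthly}/mo (${first_year} in year one)")
```

In this invented scenario, the “$97/month” plan actually runs $151/month, or $1,812 for the first year. That gap is what our pricing research tries to surface for each product.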

Step 6: Check What Others Say

We’re not the only opinion that matters. We look at user reviews, community feedback, support forum complaints, Reddit threads, and Facebook group rants.

[Screenshots: examples of a positive and a negative user review]

If everyone except us is having problems, we need to investigate further. If our experience differs from the consensus, we try to figure out why.

Step 7: Write the Review

Only after all of that do we actually write anything.

And we try to make it entertaining. Because life’s too short for another 3,000-word snoozefest about “robust feature sets” and “intuitive user experiences.”

What We Evaluate in Software Reviews

  • Ease of Use: Can a beginner figure this out without a PhD in marketing technology? Or does the “intuitive” interface actually require a 47-video tutorial series?
  • Feature Set: Does it actually do what it claims? Are the features useful or just checkbox marketing to look good on a comparison table?
  • Templates & Design: Quality and variety of pre-built templates. Modern or stuck in 2015?
  • Integrations: Does it play nice with other tools you probably use? Or is connecting your email provider an exercise in frustration?
  • Performance: Page load speed, uptime, mobile responsiveness. Does it actually work when it matters?
  • Support: Response time, quality of help, available channels. When things break at 11pm, can you get help?
  • Pricing & Value: Is the cost justified for what you get? Hidden fees? Sneaky tier limitations?
  • Learning Curve: How long until you’re actually productive? Days? Weeks? Therapy sessions?

What We DON’T Do

Let’s be honest about limitations:

  • We don’t test every feature exhaustively. Some enterprise-level features we simply can’t evaluate properly without being an actual enterprise. We’re honest about that.
  • We don’t do long-term (6+ month) usage studies. Our reviews reflect focused testing periods, not years of daily use. Things might change over time.
  • We don’t have access to internal company data. We can’t verify claims about uptime percentages or customer counts. We can only report our experience.

When we don’t know something, we say so.

Unlike those reviews that confidently state “99.9% uptime!” because that’s what the sales page said.

How We Review Books

Books are different from software. You can’t really “test” them the same way. Here’s our approach:

We Read the Entire Book

Not summaries. Not Blinkist. Not “key takeaways from someone else’s blog post” that we rewrite with slightly different words.

Cover to cover. Usually with highlights, notes, and occasional profanity in the margins when we disagree with something.

(This is why we don’t review 100 books. Reading takes time. Good reviews take more time.)

We Extract Actionable Insights

A book can be entertaining but useless. It can also be boring but incredibly valuable. Some books are both boring AND useless, which is a special kind of tragedy.

We care about: What can you actually DO differently after reading this?

Our reviews focus on practical takeaways, not just whether the author is a good writer or has impressive credentials.

We Consider the Reader’s Level

A book that’s amazing for beginners might be too basic for experienced marketers.

We try to clearly identify who should read the book—and who should skip it.

“This is great!” is useless. “This is great for people who have never built a funnel and need step-by-step handholding, but probably too basic if you’ve already launched three products” is useful.

We Compare to Alternatives

Is this the best book on the topic? Or are there better options for the same information?

We contextualize books within the broader landscape of available resources. If there’s a $15 book that teaches the same thing as a $200 course, we’ll tell you.

What We Evaluate in Book Reviews

  • Actionable Content: Can you implement what you learn? Or is it all theory with no practical application?
  • Depth vs. Fluff: Is it substantive or padded to hit a page count? (Looking at you, books that could’ve been blog posts.)
  • Author Credibility: Does this person actually have relevant experience? Or are they teaching theory they’ve never applied?
  • Clarity: Is it well-written and easy to follow? Or do you need a decoder ring?
  • Originality: Does it offer new insights or just repackage existing ideas in a shinier cover?
  • Value for Time: Is it worth the hours required to read it? Your time has value.

How We Review Courses

Courses are the trickiest category. They’re expensive, time-consuming to evaluate, and highly variable in quality.

They’re also where the most scams live. So we’re extra careful here.

We Go Through the Actual Content

Not just the sales page. Not just the free preview. Not just the testimonials from people who may or may not exist.

We access the full course (whether purchased, through review access, or otherwise) and work through the modules. We take notes on what’s taught, how it’s taught, and whether it delivers on the promises made on that very convincing sales page.

We Evaluate Against the Price

A $500 course needs to deliver more than a $50 course. Seems obvious, but many reviewers ignore this.

We assess value relative to cost—not just whether the information is good, but whether it’s good for the price.

Sometimes a $27 ebook contains better information than a $1,997 “masterclass.” We’ll tell you when that’s the case.

We Consider the Creator’s Track Record

Has this person actually achieved what they’re teaching? Or are they selling theory they’ve never applied?

We research the creator’s background and look for evidence of real-world results. Not just “I made $X million” claims on a sales page—actual verifiable track record.

(Spoiler: A lot of “gurus” teaching you how to make money are primarily making money from… teaching you how to make money. Circular, isn’t it?)

We Note What’s NOT Included

Some courses oversell and underdeliver.

We point out when important topics are missing, when “bonuses” are underwhelming, or when the course requires additional purchases to be useful.

“Get everything you need to succeed!” often means “Get the basics, then buy our advanced program for the actual good stuff.”

What We Evaluate in Course Reviews

  • Content Quality: Is the information accurate, current, and well-organized? Or recycled from free YouTube videos?
  • Production Value: Video quality, audio quality, presentation. Can you actually hear and see what’s happening?
  • Actionability: Can you implement as you go? Or is it just an information dump with no practical guidance?
  • Support & Community: Is there help available? An active community? Or are you on your own after you pay?
  • Value for Money: Is the price justified? Compared to alternatives? Could you learn this elsewhere for less?
  • Who It’s For: A clear fit for a specific audience? Or vague “everyone” marketing that means “nobody in particular”?

Our Rating System Explained

We use a simple 5-point scale. Here’s what each rating actually means:

  • ⭐⭐⭐⭐⭐ (5/5): Outstanding. One of the best in its category. Strong recommendation for the right user. (We don’t give these out like candy.)
  • ⭐⭐⭐⭐ (4/5): Very Good. A solid choice with minor limitations. Recommended with noted caveats.
  • ⭐⭐⭐ (3/5): Average. Gets the job done but nothing special. Consider alternatives first.
  • ⭐⭐ (2/5): Below Average. Significant problems. Only recommended in very specific situations.
  • ⭐ (1/5): Poor. Not recommended. Major issues with quality, value, or trustworthiness.

Important Notes on Ratings

We don’t give many 5-star ratings. Outstanding means outstanding—not just “pretty good” or “I got paid well for this affiliate link.”

If everything gets 5 stars, the rating becomes meaningless.

A 4-star rating is NOT a failure. Four stars is a solid recommendation. The difference between 4 and 5 is usually about specific use cases, not overall quality.

Context matters more than the number. A 3-star product might be perfect for YOU even if it’s average overall. Read the review, not just the score.

We’re not afraid of low ratings. Some products deserve 2 stars. We’ll give them 2 stars. Even if it means less affiliate revenue. Even if the company gets mad.

How We Handle Affiliate Links (The Honest Truth)

Alright, let’s talk about the elephant in the room.

Yes, we use affiliate links. When you click certain links on this site and make a purchase, we may earn a commission.

This is how we keep the lights on. No ads. No paywalls. No sponsored posts. Just affiliate commissions from products we’d recommend anyway.

Our Affiliate Principles

1. We never recommend products BECAUSE of affiliate commissions.

If something sucks, we’ll say it sucks—even if it pays well.

If something great has no affiliate program, we’ll still recommend it.

2. We disclose affiliate relationships.

If a link is an affiliate link, we tell you. Either in the review or on our affiliate disclaimer page.

3. Affiliate commissions never influence our ratings.

Our review process is the same regardless of affiliate status. The research, testing, and evaluation happen before we even consider monetization.

4. Sometimes we recommend against our financial interest.

We’ve told people NOT to buy things we’re affiliates for because it wasn’t right for their situation.

Short-term commission < long-term trust.

Full disclosure: Read our complete Affiliate Disclaimer →

How We Update Reviews

Software changes. Pricing changes. New competitors emerge. Features get added, removed, or completely redesigned.

A review from 2021 might be dangerously outdated by 2025.

(Unlike some sites, we actually care about this.)

Our Update Process

Regular Review Audits

We periodically revisit published reviews to check if the information is still accurate.

Update Triggers

We prioritize updates when:

  • Major new features launch
  • Pricing changes significantly
  • We receive multiple reader reports of outdated information
  • The product undergoes significant changes (new ownership, rebrands, complete redesigns)

Transparent Dating

Every review shows:

  • When it was originally published
  • When it was last updated (if applicable)
  • What changed in the update (for significant revisions)

If you notice something outdated, please tell us. Seriously. Email us. We’ll investigate and update if needed.

We’d rather fix an error than pretend it doesn’t exist.

Who Does the Testing?

This isn’t a faceless content mill. There’s an actual human behind these reviews.


Key Nguyen — Founder & Chief Funnel Nerd

I’ve been obsessing over sales funnels, direct response marketing, and conversion strategy since 2013. Started Funnel Secrets in 2017 out of pure frustration with the garbage content that dominated this space.

What I’ve Actually Done

Training & Education:

  • Russell Brunson’s ecosystem: One Funnel Away Challenge, Funnel Builder Secrets, Traffic Secrets training, Two Comma Club coaching
  • Sabri Suby’s Sell Like Crazy framework and consulting methodology
  • Frank Kern’s CORE and Mass Control programs
  • Dan Kennedy’s direct response marketing material (probably too much of it)
  • Ryan Deiss’s Digital Marketer certifications
  • Dozens of smaller courses, workshops, and trainings I’ve accumulated over the years like a marketing hoarder

Books I’ve Actually Read Cover-to-Cover:

  • DotCom Secrets (twice)
  • Expert Secrets
  • Traffic Secrets
  • Sell Like Crazy
  • Influence by Cialdini
  • Breakthrough Advertising (yes, the whole thing—it’s dense)
  • Scientific Advertising by Claude Hopkins
  • Ca$hvertising
  • And probably 50+ others that are gathering dust on my shelf

Hands-On Experience:

  • Tested every major funnel platform with real accounts and real money
  • Built funnels that crushed it and funnels that flopped spectacularly (learned from both)
  • Made plenty of expensive mistakes so I can help you avoid them

Why This Matters

I’m not an SEO content writer who got assigned “funnel software reviews” last Tuesday.

I’ve actually used this stuff. I’ve felt the frustration of a broken integration at midnight. I’ve experienced the joy of a funnel that finally converts. I’ve wasted money on courses that promised the world and delivered a pamphlet.

When I say something is good or bad, it’s based on actual experience—not a summary of what other websites said.

Also important: I’m one person, not a team of anonymous writers. If I get something wrong, there’s a real human you can contact. Not a faceless corporation hiding behind “our editorial team.”

Found an Error? Disagree With Something?

We’re not perfect. We make mistakes. We have blind spots. We occasionally write something that turns out to be wrong.

If you:

  • Find factual errors in our reviews
  • Have a different experience with a product we reviewed
  • Think we missed something important
  • Disagree with our assessment
  • Just want to tell us we’re wrong (respectfully, please)

Contact us here →

We take feedback seriously. If you’re right, we’ll update the review and thank you. If we disagree, we’ll explain our reasoning.

Either way, we’ll actually respond. Unlike those “contact us” forms that disappear into the void.

Ready to Read Some Reviews?

Now that you know how we work, go see the results: