By Chris St. Germain, Co-Host of The OSINT Output
A Field Guide to Not Getting Played by Bad Info

As a kid, one of the places I lived had a laundry chute that went from the second floor to the basement. Well, we were just small and dumb enough to decide that this could be a rapid travel tunnel from the top floor to the bottom. So my older brothers went to the basement to pad the landing. I asked if there was enough padding to soften my travel, and my brothers said in unison, “Trust me, Bro.”
The internet, from its ARPANET days, was and is designed to connect the world to each other and to the sum of all human knowledge. But there are also too many internet sites whose information is backed up by “Trust me, Bro.” My injury history can attest that “Trust me, Bro” provides insufficient support for both booties and information.
If you do OSINT, journalism, research—or just like being right on the internet—you need something better than vibes to tell good info from garbage. That’s where a little mnemonic comes in handy: CRAAP B.S. No, it’s not just how you feel after scrolling news for three hours. It’s a quick, structured way to sniff out whether a source is actually useful or just confidently wrong with good branding.
Let’s walk through it like a mini‑lifecycle for any link, PDF, tweet, or “my uncle works in intelligence” claim that lands in your lap.
C – Currency: Is This Thing Still Alive?
You would not trust a weather report from 2014 to decide whether to bring an umbrella today. Yet people will absolutely cite a 10‑year‑old blog post to “prove” how a platform works right now.
Ask: When was this created, and when was it last updated? Is the topic time‑sensitive (laws, platforms, sanctions, conflicts, breaches, tech features)? Is the data about something that changes fast (prices, troop locations, company ownership, platform policies)? If you’re doing OSINT on live operations, “last year” may already be ancient history. For a history piece, a 50‑year‑old book might be fine; for a Telegram op, 50 minutes can be too old. Context is everything.
R – Relevance: Is This for You or Just for SEO?
The internet is full of content that is technically “about” your topic but practically useless for your actual question. This is where you ask: Does this answer the specific question I’m working on, or is it generic fluff? Is the geographic, linguistic, and cultural context right? Is the granularity correct—am I getting raw data when I need analysis, or op‑eds when I need timelines? If you’re investigating disinformation in West Africa, a generic explainer on “social media manipulation worldwide” might be interesting, but it’s not what you need to make a call in your case file.
A – Authority: Who’s Actually Talking Here?
On the internet, everyone looks the same at 12‑point font size (16‑point for our older readers, me…I mean me). “A report says…” is doing a lot of suspiciously heavy lifting. Authority is not about worshipping credentials; it’s about understanding who is speaking and from where: What is the author’s background? Are they a researcher, a random blog commenter, a PR intern, a state‑linked outlet? What’s the institution? University, think tank, front group, advocacy org, “Institute for Truth” registered last Tuesday? Do others in the field take this person or org seriously, or only when they confirm certain narratives? Online, someone can be both authoritative and biased. Your job is to know which hat they’re wearing when they speak.
A – Accuracy: Does the Story Survive Contact with Reality?
This is where we move from “vibes” to evidence.
Run through: Are there citations, links to data, documents, or primary sources—or just confident assertions? Can you cross‑check key claims with at least one independent source? Are numbers plausible, methods explained, and quotes verifiable? If a piece sounds great but refuses to show its work, treat it like a magician on stage: entertaining, maybe useful for inspiration, but not something you build policy or investigations on.
P – Purpose: Why Does This Exist?
Nothing online is neutral. Probe: Is this trying to inform, persuade, recruit, sell, troll, or distract? Who benefits if people believe and share this? Is the emotional tone dialed up—fear, outrage, pride, shame—as a delivery mechanism? Purpose doesn’t automatically invalidate a source (advocacy groups can produce excellent research), but it does color how you should read it.
B – Bias: What’s Warping the Lens?
Bias is not just “they’re on the other side.” It’s all the invisible pressures nudging what gets included, excluded, or exaggerated. Funding and affiliation (state, corporate, NGO, party, movement). Ideological framing (“freedom fighters” vs “terrorists,” “reform” vs “crackdown”). Selection bias (which cases they highlight, which voices they quote). Platform bias (algorithms favoring certain kinds of content and emotional tones). Instead of asking “Is this biased?” try asking, “In which direction is this biased, and by how much?” Then adjust your mental calibration accordingly.
S – Source: Where Did They Get It?
You’re not just evaluating the article; you’re evaluating its supply chain: Are they using primary sources (documents, satellite imagery, original interviews, raw data), or just recycling someone else? Is there a single chokepoint source—one tweet, one government press release, one anonymous quote—being laundered through ten outlets? Can you follow the citation trail back to an origin, or does it vanish into “experts say” fog? Once you start tracing this, you notice how often a “wall of sources” all lead back to one dubious node. Ten outlets repeating one unverified claim is still one unverified claim, just louder.
Putting It Together: A Quick CRAAP B.S. Walkthrough
Imagine you’re investigating a viral claim: “New encrypted messaging app from Country X is secretly funneling all messages to the government.” Run it through CRAAP B.S.:
Currency: The claim is from last week, but cites documentation from three years ago. Has the app changed hands or updated since?
Relevance: The piece is focused on general privacy hype, not specifically on state access mechanisms. It might not be granular enough.
Authority: The author is a cybersecurity blogger, not a reverse‑engineer; they rely on others’ technical analysis.
Accuracy: There’s one technical report linked, but all other claims are “experts believe.” The report itself is nuanced; the blog post is not.
Purpose: The tone is panic‑bait, full of “shocking,” “you won’t believe,” and affiliate links to “safer” apps at the bottom.
Bias: The site is heavily aligned with a rival vendor and regularly trashes tools from Country X.
Source: Every article screaming about this leak ultimately cites the same two paragraphs from that original technical report.
Conclusion: There might be a real risk, but the viral story you’re reading is an exaggerated game of telephone. For OSINT, you log the technical report as a core source, treat the commentary as context, and go hunting for additional corroboration.
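If you want to make the checklist above repeatable in your own tooling, it can be sketched as a small data structure. Everything in this example—the field names, the simple pass/fail scoring, the placeholder URL—is an illustrative assumption, not a formal scoring method from the CRAAP B.S. framework:

```python
# A minimal sketch of the CRAAP B.S. checklist as a reusable structure.
# All names and scoring choices here are illustrative assumptions.

from dataclasses import dataclass, field

CHECKS = {
    "Currency": "When was this created or last updated, and does the topic move fast?",
    "Relevance": "Does this answer my specific question at the right granularity?",
    "Authority": "Who is speaking, and from what background or institution?",
    "Accuracy": "Are claims cited and cross-checkable, or just asserted?",
    "Purpose": "Why does this exist: to inform, persuade, sell, recruit?",
    "Bias": "In which direction is this biased, and by how much?",
    "Source": "Can I trace the citation trail back to an independent origin?",
}

@dataclass
class SourceEvaluation:
    url: str
    # Each answer: 1 (passes the check), 0 (fails), or missing (not yet checked).
    answers: dict = field(default_factory=dict)

    def score(self) -> float:
        """Fraction of answered checks that passed (0.0 if none answered)."""
        answered = [v for v in self.answers.values() if v is not None]
        return sum(answered) / len(answered) if answered else 0.0

    def unchecked(self) -> list:
        """Checks still waiting for an answer."""
        return [name for name in CHECKS if self.answers.get(name) is None]

# Example: logging the viral messaging-app claim from the walkthrough.
ev = SourceEvaluation(url="https://example.com/shocking-app-leak")
ev.answers = {
    "Currency": 0,   # cites three-year-old documentation
    "Accuracy": 0,   # one report, everything else is "experts believe"
    "Purpose": 0,    # panic-bait plus affiliate links to "safer" apps
    "Source": 0,     # every article leads back to the same two paragraphs
}

print(f"score: {ev.score():.2f}")          # prints "score: 0.00"
print("still to check:", ev.unchecked())   # Relevance, Authority, Bias remain
```

The point isn’t the numbers—it’s that an explicit structure forces you to notice which checks you skipped before you cite the source.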
“I Saw It Online” Is Not a Methodology
CRAAP B.S. won’t turn bad data into good. What it does is force you to slow down just enough to avoid building your work on sand. For OSINT practitioners, journalists, researchers, or anyone stuck explaining “no, that’s not actually what the report said” at family dinners, this gives you a repeatable, teachable checklist.
Start with CRAAP, work through Bias, and wipe with Source for additional scrutiny. Then you can flush away the bad info and keep yourself out of laundry chutes without enough padding.
