Analytics

Analytics is the practice of collecting data about how people use your website, app, or campaigns, then turning that data into decisions you can actually act on. The narrow software meaning is the dashboard (Google Analytics, Mixpanel, Amplitude). The useful meaning is the discipline of asking the right question of that data.

Most teams have analytics installed, but almost no one is doing analytics. There’s a difference between collecting data and actually looking at it.

The four questions analytics is supposed to answer

Strip the dashboards back and a useful analytics setup answers four things:

What’s happening? Sessions, users, pageviews, conversion events. Descriptive - the baseline reality.

Why is it happening? Source, channel, content path, device. The diagnostic layer that tells you why the numbers moved.

What should we do about it? The decision layer. If trial signups dropped 20% from organic and you’ve changed nothing on the page, what’s the hypothesis? Algorithm shift? Competitor ranking? Seasonality?

Did the thing we did work? The feedback loop. Ship a change, watch the relevant metric, decide whether the experiment paid off.

If your analytics setup can’t get you cleanly from question one to question four, it’s not analytics - it’s a wallpaper of charts.

The trap most teams fall into

Watching the wrong metric daily and the right metric never. Sessions go up - celebrate. Sessions go down - panic. But sessions don’t pay rent. Sessions × conversion rate × average order value × repeat purchase rate pay rent. The intermediate metric is a leading indicator at best and a vanity metric at worst.
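The "pay rent" arithmetic is worth making concrete. A minimal sketch with invented numbers (every figure below is hypothetical) shows why the multiplication matters more than any single factor:

```python
# Revenue decomposition: sessions only matter multiplied through the funnel.
# All numbers are hypothetical, for illustration only.
sessions = 10_000
conversion_rate = 0.02        # share of visitors who buy
average_order_value = 50.0    # revenue per order
repeat_purchase_rate = 1.4    # average orders per customer over the period

revenue = sessions * conversion_rate * average_order_value * repeat_purchase_rate
print(f"Revenue: ${revenue:,.0f}")
```

A 10% lift in sessions and a 10% lift in conversion rate move revenue by exactly the same amount here, which is why celebrating sessions in isolation is a vanity exercise.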

The harder discipline: pick three metrics that actually map to revenue, watch them weekly, ignore the rest unless something dramatic moves. The dashboard with 47 widgets isn’t analytics - it’s analysis paralysis dressed up as rigour.

An example

I worked with a B2B SaaS that had Google Analytics, Mixpanel, Hotjar, and a custom data warehouse pipeline feeding Tableau. Six tools, $4k/month in subscriptions, two analysts on payroll. The CEO couldn’t tell you which channel produced their last 10 customers without scheduling a meeting with the analytics team.

We rebuilt the answer. One spreadsheet, refreshed weekly: trial-to-paid conversion by source, paid customer count by source, average revenue per customer by source. Three metrics, one column per channel, six rows for the last six weeks. Took the analyst 40 minutes a week to populate.
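The spreadsheet logic above is simple enough to sketch in a few lines. This is an illustrative reconstruction, not the client's actual pipeline, and the sample records are invented:

```python
# Weekly three-metric rollup: trial-to-paid conversion, paid customer
# count, and average revenue per customer, grouped by source.
# Sample data is hypothetical.
from collections import defaultdict

trials = [
    {"source": "organic",  "paid": True,  "revenue": 99},
    {"source": "organic",  "paid": False, "revenue": 0},
    {"source": "paid_ads", "paid": True,  "revenue": 49},
    {"source": "paid_ads", "paid": False, "revenue": 0},
    {"source": "paid_ads", "paid": False, "revenue": 0},
]

stats = defaultdict(lambda: {"trials": 0, "paid": 0, "revenue": 0})
for t in trials:
    s = stats[t["source"]]
    s["trials"] += 1
    if t["paid"]:
        s["paid"] += 1
        s["revenue"] += t["revenue"]

for source, s in stats.items():
    conv = s["paid"] / s["trials"]
    arpc = s["revenue"] / s["paid"] if s["paid"] else 0
    print(f"{source}: {conv:.0%} trial-to-paid, "
          f"{s['paid']} customers, ${arpc:.0f} avg revenue")
```

One column per channel, one row per week of output like this, and the CEO's question answers itself.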

Within a quarter the CEO knew which two channels mattered, which three were noise, and which one to double down on. The other tools didn’t disappear (the analysts still used them) - but the decision-maker now had a dashboard that fit on one screen and answered one question.

What good analytics looks like for content people

Three things, in order:

Honest source attribution. Know where traffic actually came from. If Google Search Console shows organic clicks and Google Analytics shows half of them as “(direct)”, you have a tracking gap to close before any other question can be answered well.

Conversion event tracking that survives a redesign. The single most common analytics failure mode is shipping a new site and not noticing the conversion goal stopped firing for three weeks. Test the goal post-deploy.

A regular cadence to actually look. Weekly is enough for most teams. Monthly is too slow to catch issues. Daily is too fast - you’ll react to noise.
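The post-deploy goal check can be as crude as scanning the shipped page for the tracking calls you expect. A minimal sketch, assuming a gtag-style snippet; the marker strings and the sign_up event name are hypothetical placeholders for whatever your tag manager actually emits:

```python
# Post-deploy smoke test: confirm the conversion tracking calls are still
# present in the page HTML. Marker strings here are hypothetical examples.
def missing_tracking_markers(html: str, markers: list[str]) -> list[str]:
    """Return any required markers absent from the page's HTML."""
    return [m for m in markers if m not in html]

REQUIRED = [
    "gtag(",                    # analytics snippet loaded at all
    "gtag('event', 'sign_up'",  # conversion goal wired up
]

# In CI you would fetch the live signup page after deploy; here a
# stand-in string simulates a redesign that dropped the event call.
sample_html = "<script>gtag('js');</script>"
missing = missing_tracking_markers(sample_html, REQUIRED)
print(missing)
```

If `missing` is non-empty, fail the deploy loudly. Three weeks of a silent goal is three weeks of unrecoverable data.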

We built Penfriend to produce content whose performance shows up clearly in the analytics layer: distinct URLs, proper structured data, clean source attribution. The analytics story for content only makes sense when the content’s identity is stable and trackable.

Here's how we can help you

Want a glossary just like this?

Get in touch for our DFY glossary service.