Content Updates answers one question: are the pages you publish and refresh actually pulling more traffic to your site?
Slate records every page you ship — through workflows, the CMS, or by hand — and lines those updates up against the search, AI, and engagement signals that follow. The result is a clear read on whether your content work is paying off.
Why it matters
You can publish and refresh content all day. Without a way to tie those updates to traffic, you’re guessing about which work paid off. Content Updates connects each publish or refresh to the clicks, impressions, citations, and engagement metrics that came after, so you can prove what worked, repeat it, and stop doing what didn’t.
Where to find it
In the Pages module, two tabs power this feature:
- Performance — the dashboard that answers “is it helping traffic?”
- Update History — the underlying ledger of every update, used for logging, auditing, and editing entries.
Start on Performance for the answer. Use Update History to make sure the data feeding it is complete.
How traffic impact gets measured
Two things need to be true for Slate to tell you whether a page update worked:
- The update is in the ledger. Workflows that publish through CMS blocks or the Track Content Update block log this automatically. Updates made directly in your CMS or by a teammate need to be logged manually on the Update History tab.
- At least one search or analytics source is connected. Google Search Console powers clicks and impressions, GA4 powers engagement time, and AI Search Analytics powers AI citations.
For every entry in the ledger, Slate compares a window of metrics before the update against the window after. That delta is what “did it move traffic?” means in this feature.
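Conceptually, that before/after comparison is simple arithmetic. A minimal sketch, assuming daily click counts keyed by date; the 28-day window length and function names here are illustrative, not Slate's actual implementation:

```python
from datetime import date, timedelta

def traffic_delta(daily_clicks: dict[date, int], update_day: date, window: int = 28) -> float:
    """Compare total clicks in the `window` days after an update against
    the same-length window before it (hypothetical logic, for illustration)."""
    before = sum(daily_clicks.get(update_day - timedelta(days=d), 0)
                 for d in range(1, window + 1))
    after = sum(daily_clicks.get(update_day + timedelta(days=d), 0)
               for d in range(1, window + 1))
    if before == 0:
        return float("inf") if after > 0 else 0.0
    return (after - before) / before * 100  # percent change: the Δ shown in the table

# Example: a page that went from 10 clicks/day to 15 clicks/day after a refresh
update = date(2024, 6, 1)
clicks = {update - timedelta(days=d): 10 for d in range(1, 29)}
clicks.update({update + timedelta(days=d): 15 for d in range(1, 29)})
print(traffic_delta(clicks, update))  # → 50.0
```

A positive result means the post-update window outperformed the pre-update window; that is the sense in which an update "worked."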
Find out if your publishing or refreshing is moving traffic
Open the Performance tab in Pages, then work through these four steps.
Step 1: Pick the window you want to evaluate
The Performance tab has its own date picker, separate from the rest of Pages.
Range presets:
| Preset | Window |
|---|---|
| Last 7 days | Rolling 7-day window |
| Last 28 days | Rolling 28-day window |
| Last 90 days | Rolling 90-day window |
| Last 12 months | Rolling 12-month window |
| Custom | Any start and end date |
Comparison options (these drive the deltas on the KPI cards):
| Option | Compares the period against |
|---|---|
| vs previous period | The window of the same length immediately before |
| vs same period last month | The same window shifted back one month |
| vs same period last year | The same window shifted back one year |
| No comparison | Hides deltas |
Pick a range that covers the work you want to evaluate — a sprint, a quarter, or a year. Pick a comparison that reflects the question you’re asking. “Are we shipping more than last year?” → vs same period last year. “Did this month’s push beat last month’s?” → vs previous period.
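The comparison options in the table above reduce to simple date arithmetic. A sketch using Python's standard library; "one month" is approximated as 30 days here for simplicity, which may differ from Slate's actual calendar handling:

```python
from datetime import date, timedelta

def comparison_window(start: date, end: date, option: str) -> tuple[date, date]:
    """Return the (start, end) of the comparison period for a selected range."""
    length = end - start
    if option == "vs previous period":
        # Same length, ending the day before the selected window starts
        return start - length - timedelta(days=1), start - timedelta(days=1)
    if option == "vs same period last month":
        # Shift both endpoints back ~one month (30 days, a simplification)
        return start - timedelta(days=30), end - timedelta(days=30)
    if option == "vs same period last year":
        return start - timedelta(days=365), end - timedelta(days=365)
    raise ValueError(f"unknown option: {option}")

# A 7-day window compared against the 7 days immediately before it
print(comparison_window(date(2024, 6, 8), date(2024, 6, 14), "vs previous period"))
# → (datetime.date(2024, 6, 1), datetime.date(2024, 6, 7))
```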
Step 2: Compare your publishing cadence with search performance
The Search performance vs publishing cadence chart overlays your output on your search results.
- Bars: Published (dark) and Refreshed (orange) per bucket.
- Lines: Clicks (solid) and Impressions (dashed), from Google Search Console.
Use the Day / Week / Month toggle to find a bucket size where patterns become visible. Click the Clicks or Impressions legend to hide a series when one dominates the scale.
What to look for:
- Lines climb in the buckets where bars are tallest → your publishing rhythm is correlating with search demand.
- Lines stay flat or fall while bars are tall → output is not converting to traffic. Drop into Step 3 to find which pages are pulling their weight and which aren’t.
Step 3: Find the specific pages that moved the needle
The Top performing pages table shows the pages with the most measurable impact in the period.
| Column | What it shows |
|---|---|
| Page | Page title and URL |
| Type | Published or Refreshed |
| Date | When the update happened |
| Clicks | Clicks in the post-update window |
| Δ | Percent change in clicks vs the pre-update window |
| AI cites | AI citations in the post-update window (when AI Search Analytics is set up) |
| Engaged | Average engagement time per session, in mm:ss (from GA4) |
How to use it to answer the traffic question:
- Sort by Δ (click the Clicks column) to surface the biggest gainers and the biggest losers. Large positive Δ rows are patterns to study and repeat. Negative Δ rows are pages that lost traffic after you updated them — worth opening to see what changed.
- Filter to Published to evaluate net-new pages on their own; filter to Refreshed to evaluate refreshes on their own. This separates “is publishing working?” from “is refreshing working?”
- Sort by AI cites when you care about visibility in AI answers, not just search clicks.
Step 4: Read the headline numbers
The KPI cards at the top of the tab give you the period summary, each with a sparkline of the daily trend and a delta against the comparison period:
| Card | What it counts |
|---|---|
| Published | New pages logged as Publish in the period |
| Refreshed | Existing pages logged as Refresh in the period |
| Drafts in flight | Entries currently marked as Draft across the workspace |
The deltas tell you whether your output is changing period-over-period. The chart and the table tell you whether that output is working. Read them together — high output with falling deltas in Top performing pages means you’re shipping more without moving traffic, which is the most important signal this feature surfaces.
Keep the ledger complete
The Performance tab is only as accurate as the ledger feeding it. The Update History tab is where you keep that ledger clean.
How updates get into the ledger
| Source | When it appears |
|---|---|
| Manual | You add an entry with the Add update button |
| Workflow | A workflow runs the Track Content Update block or publishes through a CMS block |
| Sheets workflow | A sheet-attached workflow publishes a row to a CMS |
| API | An external system posts to the Slate Content Updates API |
| MCP | An MCP client (for example, Claude Desktop) records an update |
The Source column in the Update History table tells you which path each entry came from.
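For the API path, an external system records an entry with an HTTP POST. The endpoint URL, payload fields, and auth scheme below are hypothetical placeholders — check Slate's API reference for the real contract:

```python
import json
import urllib.request

# Hypothetical endpoint and payload shape -- consult Slate's API docs
# for the actual path, field names, and authentication scheme.
payload = {
    "url": "https://example.com/blog/new-post",
    "type": "publish",      # or "refresh"
    "date": "2024-06-01",
    "draft": False,
}
req = urllib.request.Request(
    "https://api.slatehq.com/v1/content-updates",  # placeholder URL
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer YOUR_API_KEY",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to send; the entry appears with Source = API
```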
Update types
| Type | Meaning |
|---|---|
| Publish | A new page went live |
| Refresh | An existing page was updated |
| Draft | The page was saved but not yet published |
Drafts are kept in the ledger but do not count toward Published or Refreshed totals on Performance — so they don’t pollute the traffic-impact picture.
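The split matters when tallying output. A sketch of how the KPI counts fall out of the ledger, with illustrative field names rather than Slate's actual schema:

```python
# Hypothetical ledger entries; field names are illustrative
ledger = [
    {"url": "/guide-a", "type": "publish", "draft": False},
    {"url": "/guide-b", "type": "refresh", "draft": False},
    {"url": "/guide-c", "type": "publish", "draft": True},  # still in flight
]

# Drafts stay in the ledger but are excluded from the output totals
published = sum(1 for e in ledger if e["type"] == "publish" and not e["draft"])
refreshed = sum(1 for e in ledger if e["type"] == "refresh" and not e["draft"])
drafts = sum(1 for e in ledger if e["draft"])

print(published, refreshed, drafts)  # → 1 1 1
```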
Log a manual update
Use this when content ships outside of Slate (a teammate publishes directly in WordPress, an agency edits a Webflow page) and you want it counted on the Performance tab.
- On the Update History tab, click Add update.
- Enter the Page URL.
- Choose the Update type — Published or Refreshed.
- Pick the Date of update.
- Toggle Draft on if the page is not yet live.
- Click Add update.
The entry is in the ledger immediately, with Manual as the source.
Edit or delete an entry
- Find the entry in the table.
- Click the three-dot menu on the right.
- Select Edit to change the URL, type, date, or draft status. Select Delete to remove it.
Deleting an entry pulls it out of the Performance tab’s calculations.
Filter and search
- Date range — pick a preset or set a custom range.
- Update type — click Add filter > Update type and choose Publish, Refresh, or Draft.
- URL search — type any part of a URL to filter by page.
Source labels
| Label | Meaning |
|---|---|
| Manual | Added by hand from the Update History tab |
| Workflow | Recorded by a workflow run (CMS publish block or Track Content Update block) |
| Sheets workflow | Recorded by a workflow attached to a Slate Sheet |
| API | Recorded by an external system using the Slate API |
| MCP | Recorded by an MCP client |
When the source has a name (the workflow’s name, for example), it appears after the label as Workflow · My Publish Flow.
Use cases
Prove a content sprint moved traffic
Set the Performance range to the sprint window and choose vs previous period. The KPI cards show whether output went up. The chart shows whether clicks and impressions followed. The Top performing pages table names the specific pages that drove the result.
Decide whether to publish more or refresh more
On the Performance tab, look at the deltas on Published and Refreshed alongside the Δ column in Top performing pages, filtered to each type. Whichever motion shows more pages with positive Δ is the one worth doubling down on next quarter.
Find your highest-leverage refresh patterns
Filter Top performing pages to Refreshed, then sort by Δ. The pages at the top are refreshes that gained the most search traffic in the post-update window. Open them, study what changed, and apply the same pattern to similar pages.
Spot pages that lost traffic after a refresh
Filter Top performing pages to Refreshed and sort by Δ ascending. Pages with negative Δ are refreshes that hurt traffic — open them, compare what changed, and roll back or rewrite as needed.
Audit ledger completeness before a stakeholder review
Before reporting on content performance, spend a minute on the Update History tab. Filter by date range and scan the Source column. If you see fewer Manual entries than you expected for the period, your team is probably shipping content directly in the CMS without logging it — and your Performance numbers will undercount the work.
Tips
- The Performance tab and Update History tab use independent date pickers. Keep Performance on a long reporting window (90 days, 12 months) and Update History on a tighter window for editing.
- The AI cites column on Top performing pages only appears when AI Search Analytics is set up.
- Engagement time comes from GA4 — connect Google Analytics for that column to populate.
- If a workflow logs the same URL on the same day as an existing entry, the system deduplicates rather than creating a second row.
- Use canonical URLs (no tracking parameters) so manual entries match workflow-logged entries on the same page.
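The last two tips both come down to how entries are keyed. A sketch of canonicalizing a URL (dropping query parameters and fragments) and deduplicating on a (URL, date) pair — the exact rules Slate applies aren't documented here, so treat this as an assumption:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Strip query string and fragment so tracking parameters
    (utm_source, etc.) don't split one page into several entries."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path.rstrip("/"), "", ""))

def dedupe(entries: list[dict]) -> list[dict]:
    """Keep the first entry per (canonical URL, date) pair."""
    seen: set[tuple[str, str]] = set()
    out = []
    for e in entries:
        key = (canonicalize(e["url"]), e["date"])
        if key not in seen:
            seen.add(key)
            out.append(e)
    return out

entries = [
    {"url": "https://example.com/post?utm_source=x", "date": "2024-06-01"},
    {"url": "https://example.com/post", "date": "2024-06-01"},
]
print(len(dedupe(entries)))  # → 1
```

With canonical URLs, a manual entry and a workflow-logged entry for the same page on the same day collapse to one row instead of two.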
What’s next