From intuition to insight: making data useful for cultural teams

Cultural institutions collect plenty of data but often struggle to use it effectively. A practical guide to asking the right questions, understanding digital and physical behaviour together, and turning insight into action, without losing sight of what matters.

Nickolas Lago

Head of Data and Insights

6 min read

28 Jan 2026

Photo by Campaign Creators on Unsplash

Cultural professionals are excellent at reading rooms. They notice when a gallery label holds attention, when an exhibition layout confuses visitors, when a particular object becomes a talking point. This intuitive understanding, built from years of observation and experience, is invaluable. But it's also limited. Intuition tells you what's happening in the spaces you can see, with the visitors you happen to encounter, at the moments you're present. Data, used well, can extend that understanding across time, scale, and contexts you'd never otherwise observe.

The challenge is that most analytics tools weren't designed for cultural institutions. They're built for e-commerce and content publishing – contexts where success means conversions, clicks, and time-on-site. Cultural teams inherit frameworks that don't quite fit, dashboards that overwhelm rather than inform, and metrics that feel disconnected from the work that actually matters.

The problem with most analytics

Walk into any analytics platform and you're immediately confronted with numbers. Page views, bounce rates, session duration, acquisition channels, conversion funnels. The volume is paralysing. Most cultural teams don't have dedicated data analysts, so the responsibility falls to curators, educators, and digital managers who are already stretched thin. The result? Data gets glanced at occasionally, but rarely drives decisions.

Too many metrics create noise, not clarity. When everything is measured, nothing stands out. Teams dutifully check dashboards, note that numbers have gone up or down, and then... do nothing differently. The metrics exist, but they don't connect to actionable insight.

Not enough meaning is the deeper problem. A metric like "average session duration" sounds useful, but what does it actually tell you? Is three minutes good or bad? Does it mean people are engaged, or confused? Are they finding what they need, or giving up? Without context, numbers are just numbers.

Data without decisions is perhaps the most common failure. Institutions collect vast amounts of information and then struggle to translate it into concrete actions. Spreadsheets pile up. Reports get filed. But the exhibition layout stays the same, the content doesn't change, and the visitor experience never improves. Data becomes a bureaucratic exercise rather than a tool for improvement.

What cultural teams actually need to know

Forget dashboards with 47 metrics. Cultural teams need answers to a small number of important questions – questions that actually inform how they work.

What content works matters enormously. Which gallery labels do people read? Which audio stops do they listen to? Which images do they zoom into? Which stories do they share? Understanding this doesn't require complex analytics. It requires tracking engagement with specific pieces of content and noticing patterns. When you know what resonates, you can create more of it.
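
None of this needs a data warehouse. If your platform can export a simple list of interaction events, a quick tally per piece of content already surfaces the patterns. The sketch below is purely illustrative – the events and field names are invented, not any particular platform's schema.

```python
from collections import Counter

# Hypothetical export of interaction events: (content_id, action).
# Neither the identifiers nor the actions reflect a specific platform's schema.
events = [
    ("label-weaving-12", "read"),
    ("audio-stop-03", "listened"),
    ("image-tapestry", "zoomed"),
    ("label-weaving-12", "shared"),
    ("audio-stop-03", "listened"),
    ("label-portrait-04", "read"),
]

# Count interactions per piece of content and list the most engaged-with items first.
engagement = Counter(content_id for content_id, _ in events)
for content_id, count in engagement.most_common():
    print(f"{content_id}: {count} interactions")
```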

Where people drop off reveals problems. If half your visitors abandon a digital tour at the third stop, something's wrong. Maybe the content is too long. Maybe the navigation is confusing. Maybe the technical performance is poor. Drop-off points are diagnostic tools – they tell you exactly where the experience breaks down.
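
Spotting that drop-off point is equally straightforward once you can see which stops each visitor reached. The sketch below works from an invented log of (visitor, stop) events – treat it as an illustration of the idea rather than a recipe for any specific system.

```python
from collections import Counter

# Hypothetical log: each row records that a visitor reached a tour stop.
events = [
    ("v1", 1), ("v1", 2), ("v1", 3),
    ("v2", 1), ("v2", 2),
    ("v3", 1), ("v3", 2), ("v3", 3), ("v3", 4),
    ("v4", 1),
]

# Work out the furthest stop each visitor reached.
furthest = {}
for visitor, stop in events:
    furthest[visitor] = max(stop, furthest.get(visitor, 0))

total = len(furthest)
dropped_at = Counter(furthest.values())  # visitors whose journey ended at each stop

# Show how many visitors are still with the tour at each stop.
remaining = total
for stop in range(1, max(dropped_at) + 1):
    print(f"Stop {stop}: {remaining}/{total} visitors ({remaining / total:.0%})")
    remaining -= dropped_at[stop]
```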

How behaviour changes over time shows whether your interventions are working. You've rewritten a set of labels, redesigned a wayfinding feature, or launched a new accessibility option. Did it make a difference? Comparing behaviour before and after a change is one of the most valuable things data can do – but only if you're set up to track it.
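
At its simplest, that comparison is just two averages. The figures below are made up, and a real evaluation would want comparable time periods, a similar visitor mix, and enough data to rule out noise – but the principle really is this simple.

```python
from statistics import mean, stdev

# Hypothetical per-visitor dwell times (seconds) before and after rewriting a set of labels.
before = [34, 28, 41, 22, 30, 37, 25, 33]
after = [45, 39, 52, 31, 48, 40, 36, 44]

def summarise(label, values):
    print(f"{label}: n={len(values)}, mean={mean(values):.1f}s, sd={stdev(values):.1f}s")

summarise("Before", before)
summarise("After", after)
print(f"Change in mean dwell time: {mean(after) - mean(before):+.1f}s")
```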

Digital + physical behaviour together

The most interesting data insights come from understanding how digital and physical experiences intersect. Cultural visits aren't purely online or purely in-person – they're increasingly hybrid.

Routes through spaces reveal how people actually navigate, not how you hoped they would. Heat maps of physical movement, combined with data on which digital content gets accessed where, show whether your intended narrative is landing. If everyone's skipping the introduction and heading straight to the final room, that's useful information.

Dwell time – both physical and digital – indicates engagement. How long do people spend with an object? How long do they listen to audio? How long do they read? Dwell time isn't the only metric that matters, but combined with other signals, it helps distinguish between cursory glances and genuine engagement.
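
If your audio or content events record start and end times, dwell time falls out directly, and even a rough threshold helps separate a tap-and-skip from a full listen. The sketch below uses invented data and an arbitrary 60-second cut-off; the right threshold depends on the length of your content.

```python
from datetime import datetime

# Hypothetical log of audio-stop plays: (content_id, start, end) timestamps.
plays = [
    ("audio-stop-3", "2026-01-12T10:02:10", "2026-01-12T10:04:55"),
    ("audio-stop-3", "2026-01-12T11:15:00", "2026-01-12T11:15:20"),
    ("audio-stop-7", "2026-01-12T10:20:00", "2026-01-12T10:21:30"),
]

# Dwell time in seconds for each play, grouped by piece of content.
dwell = {}
for content_id, start, end in plays:
    seconds = (datetime.fromisoformat(end) - datetime.fromisoformat(start)).total_seconds()
    dwell.setdefault(content_id, []).append(seconds)

for content_id, durations in dwell.items():
    genuine = sum(1 for d in durations if d >= 60)  # arbitrary 60s threshold for a "real" listen
    average = sum(durations) / len(durations)
    print(f"{content_id}: {len(durations)} plays, avg {average:.0f}s, {genuine} genuine listens")
```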

Repeat engagement is a powerful indicator of value. Visitors who return to content, revisit spaces, or continue engaging after they leave are telling you something worked. Repeat behaviour is harder to game than vanity metrics because it suggests genuine interest rather than accidental clicks.

Pre-visit and post-visit patterns show the full arc of engagement. What do people do before they arrive? What prompts them to visit? What do they engage with afterward? Understanding the visitor journey beyond the physical visit helps institutions design better experiences at every stage.

Turning insight into action

Data is only useful if it changes behaviour. The goal isn't to collect information, it's to use it.

Content tweaks are often the fastest way to improve experiences. If data shows people consistently skip a particular audio stop, you can rewrite it. If a gallery text performs well, you can apply the same approach elsewhere. Small, evidence-based adjustments compound over time into significantly better experiences.

Operational decisions benefit from data in ways that aren't always obvious. Knowing when galleries are busiest helps with staffing. Understanding which content gets used most helps with maintenance priorities. Seeing where technical problems cluster helps with infrastructure investment. Data can make operations more efficient and more responsive.

Accessibility improvements are easier to prioritise when you have evidence. If captions are heavily used, that justifies investment in better captioning. If audio description has low uptake, maybe it needs better signposting. Or maybe the content needs work. Data helps distinguish between features that look good in theory and features that actually serve audiences.

Measuring success without losing soul

There's a danger in optimisation: you can make things measurably better while making them meaningfully worse. Cultural institutions aren't trying to maximise engagement the way social media platforms are. They're trying to create experiences that are educational, moving, challenging or thought-provoking – and those qualities don't always correlate with metrics.

Balancing numbers with narrative means remembering that data is one input among many. A gallery experience that "performs poorly" by conventional metrics might still be artistically important, intellectually rigorous, or emotionally powerful. Not everything worth doing is worth measuring, and not everything measurable is worth doing.

Avoiding optimisation for optimisation's sake requires discipline. It's tempting to chase increases: more visitors, longer sessions, higher engagement. But what if the goal is contemplation rather than engagement? What if less content, consumed more thoughtfully, is better than more content consumed quickly? Data should serve your mission, not replace it.

The best use of data in cultural contexts isn't to dictate decisions, it's to inform judgment. It gives you evidence to test assumptions, spot problems, and understand impact. But it doesn't tell you what's worth doing. That requires human judgment, institutional values, and a clear sense of purpose that no dashboard can provide.

Explore Smartify's analytics and reporting tools