Create clean, governed building blocks so your teammates and Metabot AI can get reliable, consistent answers from your data without hand-holding.
Curate data from one or more tables. Add context and signpost your data with more intuitive column names and descriptions, making it easier for end-users and AI to interpret your data and put together queries. Verify models and persist their results.
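For a sense of what a model wraps, here's a minimal SQL sketch, assuming hypothetical orders and customers tables; in Metabase you'd save a query like this as a model, then layer friendlier column names and descriptions on top in its metadata.

```sql
-- Illustrative query a model might wrap. Table and column
-- names (orders, customers, etc.) are hypothetical.
SELECT
  o.id         AS order_id,
  o.created_at AS ordered_at,
  o.total      AS order_total,
  c.name       AS customer_name,
  c.plan       AS customer_plan
FROM orders o
JOIN customers c ON c.id = o.customer_id
WHERE o.status <> 'cancelled';
```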
Create the official way to calculate important numbers that everyone can use. Standardize aggregations, e.g. how your revenue is calculated, for consistent, trustworthy results every time.
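As a rough sketch, the aggregation behind a revenue metric could boil down to SQL like this, assuming a hypothetical orders table with subtotal and refund_amount columns; defining it once as a metric means nobody has to re-derive it by hand.

```sql
-- One possible 'revenue' definition (hypothetical columns):
-- net of refunds, counting only completed orders.
SELECT SUM(subtotal - refund_amount) AS revenue
FROM orders
WHERE status = 'completed';
```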
Give Metabot the context it needs to generate accurate answers by defining shared models and metrics it can reference.
We're expanding the semantic layer to include Transforms. Define logic in SQL or Python and save the results as a new, reusable table. Perfect for preparing the data your team - and Metabot - needs, right inside Metabase. Keep an eye on our upcoming release announcements, or follow the roadmap.
A semantic layer is a shared map of the business logic behind your analytics. It’s where you define the key models, metrics, and relationships that describe your data. The semantic layer gives your team a shared source of truth for important things they’ll need to reference often, like revenue, active users, or churn. It makes analytics consistent, reusable, and understandable, whether you’re writing queries, building dashboards, or asking an AI for help.
Metabase already powers your team’s analysis. Adding your semantic layer here means your definitions live where your questions are asked, without extra tools or duplication. Define models, metrics, and permissions once and reuse them everywhere for consistent, self-serve answers. Plus, your data becomes AI-ready — giving Metabot a reliable foundation for accurate responses.
Models are curated datasets - clean, named queries with added metadata that anyone can build on.
Metrics are reusable calculations (think common aggregations, like total revenue, conversion rate, or active users) that you define once, so everyone (including Metabot) gets consistent results in their own questions and queries.
You can reuse models and metrics (and soon transforms) in questions, so everyone’s working from the same definitions.
Transforms let you do the T in ETL: define data preparation logic, like cleaning, joining, or aggregating, in SQL or Python.
Unlike models, which are reusable queries anyone can build on, Transforms can write their results back to your database for faster performance and richer pipelines.
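To make the write-back idea concrete, here's a sketch of the kind of SQL a transform might run, with hypothetical events and users tables; the result would be persisted as a new, reusable table in your database.

```sql
-- Sketch of a transform: clean, join, and aggregate raw events
-- into a daily summary. Table and column names are hypothetical;
-- the transform saves this result as a new table.
SELECT
  DATE(e.occurred_at)       AS activity_date,
  u.plan                    AS plan,
  COUNT(DISTINCT e.user_id) AS active_users
FROM events e
JOIN users u ON u.id = e.user_id
WHERE e.user_id IS NOT NULL
GROUP BY DATE(e.occurred_at), u.plan;
```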
Together, they form a semantic layer: a shared vocabulary of your business logic that humans, AI, and tools alike all understand.
Metabot uses your semantic layer as context to answer questions accurately. It understands your models (curated tables), metrics (definitions), and transforms, so instead of guessing, it queries your defined logic.
Soon! Transforms will let you define reusable transformations in SQL or Python without leaving Metabase. They're designed for data shaping and logic, not full notebook-style scripting or machine learning, but if you want to use Python to prepare and define the data your team analyzes, transforms will hit the spot.