Normalized vs. Denormalized Design in FileMaker

Expert

Evaluate when to normalize FileMaker data for integrity and when to deliberately denormalize for query performance, reporting speed, or API simplicity.

What you'll learn

  • When normalization is essential and when it is counterproductive
  • How to use stored calculated fields as a controlled denormalization strategy
  • How to design reporting tables that denormalize for read performance
  • The maintenance cost of denormalization and how to manage it

Database normalization reduces redundancy and improves update integrity. Denormalization trades some redundancy for query performance or simplicity. FileMaker's unique blend of relational structure and calculated fields enables a hybrid approach: normalize for transactional integrity, strategically denormalize computed or display values into stored calculated fields. The key is knowing when each approach serves the solution.
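As a sketch of the hybrid approach, a stored calculated field can denormalize a derived value while FileMaker keeps it consistent automatically (field and table names here are illustrative, not from a specific solution):

TEXT
// Normalized source of truth:
// Contacts::NameFirst, Contacts::NameLast

// Controlled denormalization: stored calculated field in Contacts
// NameFull (calculation, result = text, storage = stored and indexed):
NameLast & ", " & NameFirst

// Because NameFull is stored, finds and sorts on it use the index
// instead of evaluating the expression per record at query time.
// FileMaker re-evaluates it whenever either source field changes,
// so there is no update anomaly to manage by hand.

Note that this only works when the calculation references fields in the same record; a stored calculation cannot reference related fields, which is one reason reporting tables exist for cross-table denormalization.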

1. When normalization is essential

Normalize whenever the same logical fact could be stored in multiple places. Update anomalies (updating one copy but not another) are the primary risk of denormalization. Any fact with a single authoritative source should live in exactly one table and be reached through relationships, never copied.

TEXT
// MUST normalize:
// Customer name and address -- single source of truth in Contacts
// Product price -- single source of truth in Products
// Tax rates -- single source of truth in TaxRates table

// Update anomaly without normalization:
// Order table stores CustomerName (copied on order creation)
// Customer changes name -> old orders show old name, new orders show new name
// -> Which is correct? This is ambiguous and unmaintainable

// Historical snapshot exception:
// Invoice stores the price at time of sale (intentional snapshot for legal reasons)
// This is valid denormalization: the historical price should NOT change
