# Custom Filters
## The Filter Interface

A filter is a single function that decides whether a token is included:

```ts
{
  filter: (token: ResolvedToken) => boolean
}
```

Return `true` to include the token, `false` to exclude it.
## Simple Example: Exclude Deprecated Tokens

```ts
import type { Filter } from 'dispersa'

const excludeDeprecated: Filter = {
  filter: (token) => !token.$deprecated,
}
```

Any token with `$deprecated: true` is excluded from the output.
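To apply it, pass the filter in the build options. A minimal sketch; the full build example appears under “Combining Multiple Filters” below:

```ts
await dispersa.build({
  filters: [excludeDeprecated],
  outputs: [/* ... */],
})
```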
## Example: Include Only Tokens in a Namespace

Filter tokens by path prefix:

```ts
import type { Filter } from 'dispersa'

const componentTokens: Filter = {
  filter: (token) => token.name.startsWith('component.'),
}
```

Use a RegExp for more flexible matching:
```ts
const spacingOnly: Filter = {
  filter: (token) => /^spacing\.(sm|md|lg)/.test(token.name),
}
```
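If several outputs need pattern-based filters, a small factory keeps them consistent. The `byPattern` helper below is a hypothetical sketch built on the `Filter` interface, not part of dispersa:

```ts
import type { Filter } from 'dispersa'

// Hypothetical factory: builds a Filter from any RegExp.
// Not a dispersa export; shown only to illustrate the Filter interface.
const byPattern = (pattern: RegExp): Filter => ({
  filter: (token) => pattern.test(token.name),
})

const spacingOnly = byPattern(/^spacing\.(sm|md|lg)/)
const componentTokens = byPattern(/^component\./)
```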
## Example: Include Only Tokens with a Specific Extension

Filter by custom extension fields:
```ts
import type { Filter } from 'dispersa'

const webOnly: Filter = {
  filter: (token) => token.$extensions?.platform !== 'mobile',
}

const figmaEnabled: Filter = {
  filter: (token) => token.$extensions?.['com.figma']?.export === true,
}
```
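Extension lookups like these repeat quickly. As a sketch (again hypothetical, not a dispersa export), a generic factory can wrap the lookup:

```ts
import type { Filter } from 'dispersa'

// Hypothetical helper: filters on a predicate over one $extensions key.
const byExtension = (key: string, predicate: (value: unknown) => boolean): Filter => ({
  filter: (token) => predicate(token.$extensions?.[key]),
})

const webOnly = byExtension('platform', (v) => v !== 'mobile')
const figmaEnabled = byExtension('com.figma', (v) => (v as { export?: boolean })?.export === true)
```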
## Combining Multiple Filters

All filters must pass (AND logic). A token is included only when every filter returns `true`.
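Because the pipeline ANDs filters together, OR semantics require composing them into a single filter yourself. A minimal sketch, where the `anyOf` combinator is hypothetical and not provided by dispersa:

```ts
import type { Filter } from 'dispersa'

// Hypothetical combinator: accepts a token if ANY of the given filters accepts it.
const anyOf = (...filters: Filter[]): Filter => ({
  filter: (token) => filters.some((f) => f.filter(token)),
})

// Include a token if it is a component token OR explicitly flagged for Figma export.
const componentOrExported = anyOf(componentTokens, figmaEnabled)
```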
Use both global and per-output filters:
```ts
await dispersa.build({
  filters: [excludeDeprecated], // global: all outputs
  outputs: [
    css({
      name: 'colors',
      filters: [byType('color')], // per-output: further narrow
      file: 'colors.css',
      transforms: [nameKebabCase(), colorToHex()],
    }),
    css({
      name: 'spacing',
      filters: [byType('dimension'), byPath('spacing.')],
      file: 'spacing.css',
      transforms: [nameKebabCase(), dimensionToRem()],
    }),
  ],
})
```

- Global filters reduce the token set for all outputs.
- Per-output filters narrow tokens further for that specific output.
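The `byType` and `byPath` factories used above reduce to the same single-function interface as any custom filter. As an illustration only, here is a hypothetical reimplementation; the built-ins' exact behavior may differ:

```ts
import type { Filter } from 'dispersa'

// Illustrative sketches of filter factories, not the actual built-ins.
const byType = (type: string): Filter => ({
  filter: (token) => token.$type === type,
})

const byPath = (prefix: string): Filter => ({
  filter: (token) => token.name.startsWith(prefix),
})
```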