
Imagine you're a data analyst at a growing e-commerce company. Every month, you need to pull sales data from different regions, apply similar transformations (cleaning, aggregating, adding calculated columns), and create reports. Currently, you're copying and pasting queries, manually changing region names, and updating date ranges. What if a single query could handle all regions dynamically, accepting parameters to determine which data to process and how to transform it?
This is where dynamic M queries shine. Instead of creating static transformations that work for one specific scenario, you'll learn to build flexible, parameterized queries that adapt based on inputs. These queries can change their behavior, data sources, column operations, and logic flow based on parameters you define.
By mastering dynamic queries, you'll transform from writing rigid, single-use code to creating reusable transformation engines that scale with your organization's needs.
What you'll learn:
How to create and reference parameters that make queries configurable
How to branch transformation logic with conditional (if-then-else) expressions
How to select, rename, and calculate columns dynamically
How to package dynamic logic into reusable custom functions
Before diving into dynamic queries, you should be comfortable with basic M language syntax, including let expressions, record syntax, and simple function creation. You should also understand fundamental Power Query operations like filtering, adding columns, and basic data transformations through the Power Query Editor interface.
Parameters are named values that you can reference throughout your query, making them configurable without editing the underlying code. Think of parameters as variables that users can modify to change how your query behaves.
In Power Query, parameters appear in the Queries pane just like regular queries, but they represent single values rather than tables. When you reference a parameter in your query, Power Query substitutes the parameter's current value.
Let's start with a simple example. Imagine you regularly analyze sales data for different date ranges. Instead of hardcoding dates, you can create parameters:
// Parameter: StartDate
#date(2024, 1, 1)

// Parameter: EndDate
#date(2024, 3, 31)

// Main query using parameters
let
    Source = Excel.CurrentWorkbook(){[Name="SalesData"]}[Content],
    FilteredRows = Table.SelectRows(Source,
        each [OrderDate] >= StartDate and [OrderDate] <= EndDate)
in
    FilteredRows
To create parameters in Power Query Editor, go to Home tab → Manage Parameters → New Parameter. Give your parameter a name, select its data type, and set a default value. The parameter becomes available to reference in any query within your workbook.
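Under the hood, a parameter created this way is just a small query that returns a single value, tagged with metadata so the editor treats it as a parameter. A minimal sketch of what the generated M for StartDate looks like (the exact metadata record can vary by Power Query version):

```m
// A parameter is a single-value query with parameter metadata attached.
// This is roughly what Manage Parameters generates for StartDate:
#date(2024, 1, 1) meta [
    IsParameterQuery = true,
    Type = "Date",
    IsParameterQueryRequired = true
]
```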
The real power emerges when you combine multiple parameters to control different aspects of your query's behavior. Consider a scenario where you want to analyze different product categories across various time periods:
// Parameters
// CategoryFilter: "Electronics"
// MinSalesAmount: 1000
// IncludeDiscountedItems: true
let
    Source = Excel.CurrentWorkbook(){[Name="ProductSales"]}[Content],

    // Dynamic category filtering
    CategoryFiltered = Table.SelectRows(Source,
        each [Category] = CategoryFilter),

    // Dynamic sales amount filtering
    SalesFiltered = Table.SelectRows(CategoryFiltered,
        each [SalesAmount] >= MinSalesAmount),

    // Conditional inclusion of discounted items
    FinalData = if IncludeDiscountedItems
        then SalesFiltered
        else Table.SelectRows(SalesFiltered, each [Discount] = 0)
in
    FinalData
This query adapts its filtering logic based on three parameters. Change the CategoryFilter parameter from "Electronics" to "Clothing," and the same query analyzes completely different data. Adjust MinSalesAmount to focus on higher-value transactions. Toggle IncludeDiscountedItems to include or exclude promotional sales.
Tip: Parameters are especially powerful in organizational settings where different users need the same analysis for different data subsets. Instead of creating separate queries for each department or region, build one dynamic query that parameters control.
Real-world data scenarios rarely follow simple rules. Sometimes you need to apply different transformations based on data characteristics, user preferences, or business rules. M's conditional expressions (if-then-else) combined with parameters create sophisticated decision trees within your queries.
Let's build a dynamic pricing analysis query that adjusts its calculations based on business requirements:
// Parameters
// AnalysisType: "Profitability" (could be "Volume", "Growth", "Profitability")
// RegionFocus: "North America"
// IncludeSeasonalAdjustment: false
let
    Source = Excel.CurrentWorkbook(){[Name="ProductData"]}[Content],

    // Dynamic region filtering
    RegionFiltered = if RegionFocus = "All Regions"
        then Source
        else Table.SelectRows(Source, each [Region] = RegionFocus),

    // Dynamic analysis based on type
    AnalysisColumns =
        if AnalysisType = "Profitability" then
            Table.AddColumn(RegionFiltered, "ProfitMargin",
                each ([Revenue] - [Cost]) / [Revenue])
        else if AnalysisType = "Volume" then
            Table.AddColumn(RegionFiltered, "VolumeIndex",
                each [UnitsSold] / List.Average(RegionFiltered[UnitsSold]))
        else if AnalysisType = "Growth" then
            Table.AddColumn(RegionFiltered, "GrowthRate",
                each ([CurrentPeriodSales] - [PreviousPeriodSales]) / [PreviousPeriodSales])
        else
            RegionFiltered,

    // Conditional seasonal adjustment
    FinalData = if IncludeSeasonalAdjustment then
        Table.AddColumn(AnalysisColumns, "SeasonallyAdjusted",
            each [Revenue] * (if Date.Month([SalesDate]) >= 11 then 0.8 else 1.0))
    else
        AnalysisColumns
in
    FinalData
This query demonstrates several dynamic patterns:
Conditional filtering: The query applies regional filtering only when a specific region is selected, otherwise processing all data.
Branching logic: Different calculation columns are added based on the analysis type parameter.
Nested conditions: Seasonal adjustments apply additional logic only when enabled.
The beauty of this approach is maintainability. Business requirements change, but your query structure remains stable. Need a new analysis type? Add another condition branch. New seasonal logic? Modify the seasonal adjustment condition.
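For instance, supporting a hypothetical "Retention" analysis type (assuming the source had ReturningCustomers and TotalCustomers columns, which are illustrative names) would be one more branch in the AnalysisColumns step:

```m
// Hypothetical new branch - the column names are illustrative
else if AnalysisType = "Retention" then
    Table.AddColumn(RegionFiltered, "RetentionRate",
        each [ReturningCustomers] / [TotalCustomers])
```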
Static queries work with fixed column sets, but dynamic queries can adapt their column operations based on parameters or data characteristics. This flexibility is crucial when working with evolving datasets or building generic transformation engines.
Consider a common scenario: you receive monthly reports with varying column structures. Some months include bonus columns, others have different naming conventions, and regional reports might contain additional fields. A dynamic query can handle these variations gracefully:
// Parameters
// RequiredColumns: "CustomerID,OrderDate,Revenue" (comma-separated string)
// OptionalColumns: "Discount,Bonus" (comma-separated string)
// RenameColumns: true
let
    Source = Excel.CurrentWorkbook(){[Name="MonthlyData"]}[Content],
    SourceColumns = Table.ColumnNames(Source),

    // Parse parameter strings into lists
    RequiredList = Text.Split(RequiredColumns, ","),
    OptionalList = Text.Split(OptionalColumns, ","),

    // Build dynamic column selection: all required columns plus any
    // optional columns actually present in the source
    ColumnsToKeep = RequiredList & List.Intersect({OptionalList, SourceColumns}),

    // Select only those columns
    SelectedColumns = Table.SelectColumns(Source, ColumnsToKeep),

    // Dynamic column renaming
    RenamedColumns = if RenameColumns then
        Table.RenameColumns(SelectedColumns, {
            {"CustomerID", "Customer_ID"},
            {"OrderDate", "Order_Date"},
            {"Revenue", "Total_Revenue"}
        })
    else
        SelectedColumns,

    // The revenue column's name depends on whether renaming ran
    RevenueColumn = if RenameColumns then "Total_Revenue" else "Revenue",

    // Add calculated columns dynamically
    WithCalculations = if List.Contains(Table.ColumnNames(RenamedColumns), "Discount") then
        Table.AddColumn(RenamedColumns, "NetRevenue",
            each Record.Field(_, RevenueColumn) * (1 - [Discount]))
    else
        Table.AddColumn(RenamedColumns, "NetRevenue",
            each Record.Field(_, RevenueColumn))
in
    WithCalculations
This pattern handles several dynamic scenarios:
Flexible column selection: The query selects required columns and any available optional columns, adapting to different source structures.
Conditional renaming: Column renaming applies only when enabled, allowing the same query to work with different naming standards.
Adaptive calculations: The NetRevenue calculation adjusts based on whether discount information is available.
Another powerful dynamic column technique involves creating columns based on parameter-driven rules:
// Parameter: MetricCalculations
// Value: "Profit=Revenue-Cost,Margin=Profit/Revenue,Growth=Current/Previous-1"
let
    Source = Excel.CurrentWorkbook(){[Name="BusinessData"]}[Content],

    // Parse calculation definitions
    CalcDefinitions = Text.Split(MetricCalculations, ","),

    // Function to parse and create a single calculation
    CreateCalculation = (table as table, definition as text) =>
        let
            Parts = Text.Split(definition, "="),
            ColumnName = Parts{0},
            Formula = Parts{1},
            // Simplified - a real implementation would need expression parsing
            NewColumn =
                if Formula = "Revenue-Cost" then
                    Table.AddColumn(table, ColumnName, each [Revenue] - [Cost])
                else if Formula = "Profit/Revenue" then
                    Table.AddColumn(table, ColumnName, each [Profit] / [Revenue])
                else if Formula = "Current/Previous-1" then
                    Table.AddColumn(table, ColumnName, each [Current] / [Previous] - 1)
                else
                    table
        in
            NewColumn,

    // Apply all calculations in sequence
    FinalData = List.Accumulate(CalcDefinitions, Source, CreateCalculation)
in
    FinalData
Warning: Dynamic column operations can become complex quickly. Start with simple patterns and gradually add sophistication. Always test edge cases like missing columns or invalid parameter values.
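One built-in safeguard worth knowing: Table.SelectColumns (and several related functions) accept an optional MissingField argument, so an absent column can be skipped instead of raising an error:

```m
// MissingField.Ignore silently drops absent columns from the selection;
// MissingField.UseNull keeps them as null-filled columns instead
SafeSelection = Table.SelectColumns(Source,
    {"CustomerID", "Revenue", "Bonus"}, MissingField.Ignore)
```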
The ultimate expression of dynamic queries is custom functions that encapsulate common transformation patterns. These functions accept parameters and return transformed tables, making them reusable across multiple queries and workbooks.
Let's build a comprehensive data cleaning function that handles common preparation tasks:
// Custom Function: CleanSalesData
(
    sourceTable as table,
    dateColumn as text,
    amountColumn as text,
    removeOutliers as logical,
    standardizeNames as logical,
    outlierThreshold as number
) as table =>
let
    // Step 1: Validate inputs
    ValidateInputs =
        if not Table.HasColumns(sourceTable, {dateColumn, amountColumn})
        then error "Required columns not found in source table"
        else sourceTable,

    // Step 2: Clean date column
    CleanedDates = Table.TransformColumns(ValidateInputs, {
        {dateColumn, each if _ is date then _ else Date.FromText(Text.From(_))}
    }),

    // Step 3: Clean amount column
    CleanedAmounts = Table.TransformColumns(CleanedDates, {
        {amountColumn, each if _ is number then _ else Number.FromText(Text.Replace(Text.From(_), ",", ""))}
    }),

    // Step 4: Remove outliers conditionally
    OutliersRemoved = if removeOutliers then
        let
            AmountValues = Table.Column(CleanedAmounts, amountColumn),
            Mean = List.Average(AmountValues),
            StdDev = List.StandardDeviation(AmountValues),
            LowerBound = Mean - (outlierThreshold * StdDev),
            UpperBound = Mean + (outlierThreshold * StdDev)
        in
            Table.SelectRows(CleanedAmounts,
                each Record.Field(_, amountColumn) >= LowerBound and
                     Record.Field(_, amountColumn) <= UpperBound)
    else
        CleanedAmounts,

    // Step 5: Standardize names conditionally
    StandardizedNames = if standardizeNames then
        if Table.HasColumns(OutliersRemoved, {"CustomerName"}) then
            Table.TransformColumns(OutliersRemoved, {
                {"CustomerName", each Text.Proper(Text.Trim(_))}
            })
        else
            OutliersRemoved
    else
        OutliersRemoved
in
    StandardizedNames
This function demonstrates several key patterns for reusable dynamic queries:
Input validation: Always validate that required columns exist and parameters are reasonable.
Conditional processing: Each transformation step can be enabled or disabled based on parameters.
Flexible column handling: The function works with any table as long as required columns are present.
Error handling: Clear error messages help users understand what went wrong.
To use this function in your queries:
let
    Source = Excel.CurrentWorkbook(){[Name="RawSalesData"]}[Content],
    CleanedData = CleanSalesData(
        Source,        // sourceTable
        "OrderDate",   // dateColumn
        "Revenue",     // amountColumn
        true,          // removeOutliers
        true,          // standardizeNames
        2.5            // outlierThreshold
    )
in
    CleanedData
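Because the function raises an error when required columns are missing, callers can wrap it in M's try ... otherwise to fall back gracefully rather than failing the whole refresh:

```m
// Fall back to the raw table if cleaning fails for any reason
CleanedOrRaw = try CleanSalesData(Source, "OrderDate", "Revenue", true, true, 2.5)
               otherwise Source
```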
For more complex scenarios, you can create functions that return other functions, enabling even greater flexibility:
// Function Factory: CreateAggregator
(groupByColumns as list, aggregateRules as record) as function =>
    (sourceTable as table) as table =>
        let
            // Turn the rules record into a Table.Group aggregation spec:
            // each field name becomes the output column; each field value
            // is {sourceColumn, aggregateFunction}
            AggregationSpec = List.Transform(
                Record.FieldNames(aggregateRules),
                (name) =>
                    let rule = Record.Field(aggregateRules, name)
                    in {name, (t) => rule{1}(Table.Column(t, rule{0}))}
            ),
            GroupedData = Table.Group(sourceTable, groupByColumns, AggregationSpec)
        in
            GroupedData

// Usage example
let
    CreateRevenueAggregator = CreateAggregator(
        {"Region", "Product"},
        [
            TotalRevenue = {"Revenue", List.Sum},
            AvgOrderSize = {"OrderValue", List.Average},
            OrderCount = {"OrderID", List.Count}
        ]
    ),
    Source = Excel.CurrentWorkbook(){[Name="OrderData"]}[Content],
    AggregatedData = CreateRevenueAggregator(Source)
in
    AggregatedData
Let's build a comprehensive dynamic reporting query that demonstrates all the concepts we've covered. You'll create a flexible sales analysis system that adapts based on multiple parameters.
Scenario: Your organization needs a standardized sales report that works across different regions, time periods, and analysis focuses. The report should handle varying data structures and provide different analytical perspectives based on user requirements.
Step 1: Set up your parameters. In Power Query Editor, create these parameters:
ReportRegion (Text): "North America" (options: "North America", "Europe", "Asia", "All Regions")
AnalysisPeriod (Text): "Last Quarter" (options: "Last Month", "Last Quarter", "Last Year")
FocusMetric (Text): "Revenue" (options: "Revenue", "Units", "Profit")
IncludeForecasting (True/False): false
OutlierRemoval (True/False): true
DetailLevel (Text): "Summary" (options: "Summary", "Detailed")
Step 2: Create the main dynamic query:
let
    // Data source - adjust this to match your data
    Source = Excel.CurrentWorkbook(){[Name="SalesData"]}[Content],

    // Dynamic date filtering based on period
    DateFiltered =
        if AnalysisPeriod = "Last Month" then
            Table.SelectRows(Source, each [SalesDate] >= Date.AddMonths(Date.From(DateTime.LocalNow()), -1))
        else if AnalysisPeriod = "Last Quarter" then
            Table.SelectRows(Source, each [SalesDate] >= Date.AddMonths(Date.From(DateTime.LocalNow()), -3))
        else if AnalysisPeriod = "Last Year" then
            Table.SelectRows(Source, each [SalesDate] >= Date.AddYears(Date.From(DateTime.LocalNow()), -1))
        else
            Source,

    // Dynamic region filtering
    RegionFiltered =
        if ReportRegion = "All Regions" then
            DateFiltered
        else
            Table.SelectRows(DateFiltered, each [Region] = ReportRegion),

    // Remove outliers conditionally (IQR method)
    OutliersHandled = if OutlierRemoval then
        let
            RevenueValues = Table.Column(RegionFiltered, "Revenue"),
            Q1 = List.Percentile(RevenueValues, 0.25),
            Q3 = List.Percentile(RevenueValues, 0.75),
            IQR = Q3 - Q1,
            LowerBound = Q1 - (1.5 * IQR),
            UpperBound = Q3 + (1.5 * IQR)
        in
            Table.SelectRows(RegionFiltered,
                each [Revenue] >= LowerBound and [Revenue] <= UpperBound)
    else
        RegionFiltered,

    // Dynamic focus metric calculation
    WithFocusMetric =
        if FocusMetric = "Revenue" then
            Table.AddColumn(OutliersHandled, "FocusValue", each [Revenue])
        else if FocusMetric = "Units" then
            Table.AddColumn(OutliersHandled, "FocusValue", each [UnitsSold])
        else if FocusMetric = "Profit" then
            Table.AddColumn(OutliersHandled, "FocusValue", each [Revenue] - [Cost])
        else
            OutliersHandled,

    // Conditional forecasting
    WithForecasting = if IncludeForecasting then
        Table.AddColumn(WithFocusMetric, "Forecasted",
            each [FocusValue] * 1.15) // Simple 15% growth assumption
    else
        WithFocusMetric,

    // Dynamic aggregation based on detail level
    FinalData =
        if DetailLevel = "Summary" then
            Table.Group(WithForecasting, {"Region", "Product"}, {
                {"TotalFocusValue", each List.Sum([FocusValue])},
                {"AverageFocusValue", each List.Average([FocusValue])},
                {"TransactionCount", each List.Count([OrderID])}
            })
        else
            WithForecasting
in
    FinalData
Step 3: Test your dynamic query by changing parameters:
ReportRegion to "Europe" and refresh - you should see only European data
FocusMetric to "Profit" and refresh - calculations should shift to profit analysis
DetailLevel to "Detailed" to see row-level data instead of aggregated summaries
IncludeForecasting to add forecasted values
Step 4: Add error handling and validation:
let
    // Add validation at the start
    ValidatedSource =
        if not Table.HasColumns(Source, {"Region", "SalesDate", "Revenue"})
        then error "Source table missing required columns: Region, SalesDate, Revenue"
        else Source,
    // Rest of your query here...
This exercise demonstrates how parameters can completely transform query behavior without changing the underlying logic structure. You've built a single query that can serve multiple business needs across different regions, time periods, and analytical focuses.
Dynamic queries introduce complexity that can lead to subtle errors. Here are the most common issues and how to resolve them:
Parameter Reference Errors: When parameters aren't available or have incorrect names, queries fail with cryptic error messages. Always ensure parameter names in your query exactly match parameter names in your workbook. Use the formula bar to verify parameter references - they should appear as simple text without quotes (e.g., StartDate, not "StartDate").
Type Mismatches: Parameters have specific data types, but M sometimes treats them as text. This commonly occurs with date parameters:
// Wrong - treats parameter as text
FilteredRows = Table.SelectRows(Source, each [OrderDate] >= "StartDate")
// Correct - uses parameter value
FilteredRows = Table.SelectRows(Source, each [OrderDate] >= StartDate)
// Safe with explicit conversion
FilteredRows = Table.SelectRows(Source, each [OrderDate] >= Date.From(StartDate))
Null Parameter Handling: Users might leave parameters empty, causing queries to fail. Build defensive code:
// Defensive parameter usage
RegionFilter = if RegionParameter = null or RegionParameter = ""
    then "All Regions"
    else RegionParameter,

FilteredData = if RegionFilter = "All Regions"
    then Source
    else Table.SelectRows(Source, each [Region] = RegionFilter)
Column Existence Issues: Dynamic column operations fail when referenced columns don't exist. Always validate column presence:
// Safe column operations
SafeColumnAdd = if Table.HasColumns(Source, {"Revenue", "Cost"})
    then Table.AddColumn(Source, "Profit", each [Revenue] - [Cost])
    else Source
Performance Problems: Dynamic queries can become slow, especially with complex conditional logic. Monitor refresh times and optimize by filtering rows as early as possible, buffering tables that multiple steps reference (Table.Buffer), and keeping conditional branches shallow.
Circular References: When parameters reference other parameters or queries, you might create circular dependencies. Keep parameter relationships simple and avoid parameter chains more than one level deep.
Debugging Tip: When dynamic queries behave unexpectedly, add intermediate steps that expose parameter values and intermediate results. A throwaway query whose entire formula is = ParameterName helps verify a parameter is resolving to the value you expect.
You've learned to transform static, single-purpose queries into flexible, reusable transformation engines. Dynamic queries with parameters enable you to build scalable data solutions that adapt to changing requirements without code duplication.
The key concepts you've mastered include:
Parameterizing queries so one transformation serves many scenarios
Branching transformation logic with conditional expressions
Selecting, renaming, and calculating columns dynamically
Encapsulating reusable logic in custom functions and function factories
These techniques scale beyond simple parameter substitution. You can build entire data processing frameworks where business users modify parameters to generate different reports, analyses, and transformations. The same core query logic serves multiple departments, regions, or analytical needs.
Next steps in your dynamic query journey:
Explore advanced function patterns: Learn to create function libraries that your organization can share across workbooks and projects.
Master expression parsing: Build queries that can interpret user-provided formulas and calculations dynamically.
Integrate with external systems: Use dynamic queries with APIs and databases where parameters control endpoint calls and query generation.
Performance optimization: Learn techniques for making complex dynamic queries performant at enterprise scale.
Dynamic queries represent a paradigm shift from writing code that works to writing code that adapts. This flexibility becomes invaluable as your data needs grow and evolve, providing a foundation for scalable, maintainable analytics solutions.
Learning Path: Advanced M Language