Question
I am trying to create a measure that gives me a value based on an expected percentage of the whole, by date. This will let me create a visual showing expected and actual execution values. I have the actual value calculation down, but I am having issues with the expected calculation.
The table with the expected percentage by date is set up with the columns
Date, Project, Expected Pct Complete
1/1/2019, ProjA, .20
1/2/2019, ProjA, .40
1/3/2019, ProjA, .60
1/4/2019, ProjA, .80
1/5/2019, ProjA, 1.00
1/1/2019, ProjB, .33
1/2/2019, ProjB, .66
1/3/2019, ProjB, 1.00
I then have a table with all the test executions, where the data has this general format:
Execution Date, Project, Script, Status
1/1/2019, ProjA, Script1, Passed
1/1/2019, ProjB, ScriptA, Failed
1/1/2019, ProjA, Script2, Failed
1/2/2019, ProjA, Script3, Passed
I want the measure to generate values of the form (assume projects A and B both have 100 scripts)
Date, Expected Amount, Project
1/1/2019, 20, ProjA
1/1/2019, 33, ProjB
1/2/2019, 40, ProjA
1/2/2019, 66, ProjB
1/3/2019, 60, ProjA
1/3/2019, 100, ProjB
1/4/2019, 80, ProjA
1/5/2019, 100, ProjA
How can I create this measure so that I can put it into a visualization showing the actual vs. expected rates when I select a particular project from a slicer?
Answer 1:
Try adding a calculated column to the first table in your post with the following code.
Expected Amount =
VAR _totalScriptsPerProject =
    CALCULATE (
        DISTINCTCOUNT ( tests[Script] );
        FILTER ( ALL ( tests ); tests[Project] = expectations[Project] )
    )
RETURN
    expectations[Expected Pct Complete] * _totalScriptsPerProject
(Note: I assumed the first table is called 'expectations' and the second 'tests'. Also note that I use semicolons as the argument separator.)
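As a cross-check of the logic (not the DAX itself), here is a minimal pandas sketch of the same calculation on the sample rows from the question: count the distinct scripts per project in the tests table, then multiply each row's expected percentage by that count. The table and column names mirror the assumed 'expectations' and 'tests' tables above.

```python
import pandas as pd

# Sample rows from the question (a subset, for illustration)
expectations = pd.DataFrame({
    "Date": ["1/1/2019", "1/2/2019", "1/1/2019"],
    "Project": ["ProjA", "ProjA", "ProjB"],
    "Expected Pct Complete": [0.20, 0.40, 0.33],
})

tests = pd.DataFrame({
    "Execution Date": ["1/1/2019", "1/1/2019", "1/1/2019", "1/2/2019"],
    "Project": ["ProjA", "ProjB", "ProjA", "ProjA"],
    "Script": ["Script1", "ScriptA", "Script2", "Script3"],
    "Status": ["Passed", "Failed", "Failed", "Passed"],
})

# Equivalent of DISTINCTCOUNT(tests[Script]) filtered to each project
scripts_per_project = tests.groupby("Project")["Script"].nunique()

# Equivalent of the calculated column: pct * total scripts for that project
expectations["Expected Amount"] = (
    expectations["Expected Pct Complete"]
    * expectations["Project"].map(scripts_per_project)
)
print(expectations)
```

With this tiny sample, ProjA has 3 distinct scripts and ProjB has 1, so the expected amounts come out as 0.20 × 3, 0.40 × 3, and 0.33 × 1; with the full 100-script projects from the question, the same logic yields 20, 40, and 33.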
Source: https://stackoverflow.com/questions/57401841/create-measure-to-calculate-value-based-on-lookup-by-date