by Zebco Fuckface 01/31/2003, 5:15pm PST
TAFKAM wrote:
I know V's a database person, anyone else? I've been busting my head against this seemingly simple problem and thought I might as well run it up a flagpole over here as well:
Item=Text PRIMARY KEY
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, Q1, Q2, Q3, Q4, FULLYEAR = real
1-12 = Months
I have aggregate fields Q1, Q2, Q3, Q4, and FULLYEAR that I want to fill with the average of their periods, across a table full of records.
What's the cleanest, easiest way to do it? 1-12 are nullable, and that could potentially whack out the average values for a flat (1+2+3)/3.
I thought of using variables and setting them to 1 or 0 depending on whether each month is NULL, then dividing by (var1+var2+var3), but that would be seriously wasteful in a loop over all the records: resetting the variables, testing them, and writing, times 5 fields times 100,000,000 records.
Am I missing something obvious?
Well, putting IfNull(column_name, 0) in the query string would fix up that problem. Sounds like you're doing mini data-warehousing; a general solution depends entirely on the database you're using. You can use materialized views (or creating a unique clustered index on the view, or whatever the fuck they call it) on SQL Server, as an example.
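Something like this, say (a sketch only; I'm making up the table name, and calling the month columns M1-M12 since literal numeric column names would need bracket-quoting anyway). COALESCE is the portable spelling of IfNull/IsNull, and dividing by the count of non-NULL months is what keeps empty months from skewing the flat /3:

    -- One set-based UPDATE instead of a variable-juggling loop per record.
    -- COALESCE keeps a NULL month from nulling out the whole sum, and
    -- NULLIF(..., 0) leaves Q1 NULL instead of dividing by zero when all
    -- three months are NULL.
    UPDATE Sales
    SET Q1 = (COALESCE(M1, 0) + COALESCE(M2, 0) + COALESCE(M3, 0))
             / NULLIF(  CASE WHEN M1 IS NULL THEN 0 ELSE 1 END
                      + CASE WHEN M2 IS NULL THEN 0 ELSE 1 END
                      + CASE WHEN M3 IS NULL THEN 0 ELSE 1 END, 0);

Same pattern for Q2-Q4 and FULLYEAR in the same statement; the engine makes one pass over the table instead of resetting and testing variables 5 fields x 100,000,000 records' worth of times.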
Of course, tracking something monthly but letting NULLs into your data set is kind of silly.
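If you do try the indexed-view route on SQL Server, note it won't accept AVG or a SUM over a nullable column directly, so you materialize sums and non-NULL counts and do the division when you read. A rough sketch, assuming the months got normalized into one row per (Item, MonthNum, Val) in a table I'm calling dbo.MonthlyVals; I haven't run this, and the indexed-view restrictions (SCHEMABINDING, two-part names, COUNT_BIG(*)) are fussy about details:

    CREATE VIEW dbo.QuarterAgg
    WITH SCHEMABINDING
    AS
    SELECT Item,
           (MonthNum - 1) / 3 + 1 AS Qtr,       -- months 1..12 -> quarters 1..4
           SUM(ISNULL(Val, 0)) AS QtrSum,       -- SUM over a nullable column is rejected, hence ISNULL
           SUM(CASE WHEN Val IS NULL THEN 0 ELSE 1 END) AS NonNullMonths,
           COUNT_BIG(*) AS RowCnt               -- required in a grouped indexed view
    FROM dbo.MonthlyVals
    GROUP BY Item, (MonthNum - 1) / 3 + 1;

    CREATE UNIQUE CLUSTERED INDEX IX_QuarterAgg ON dbo.QuarterAgg (Item, Qtr);

    -- The average then gets computed at read time:
    SELECT Item, Qtr, QtrSum / NULLIF(NonNullMonths, 0) AS QtrAvg
    FROM dbo.QuarterAgg;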