New to the group. I've been looking around for a solution to this issue and have found several possibilities, but each has its own drawbacks, so I'm not sure which approach would be best practice.
I have a table that contains a person's daily stats from a phone system. Each record contains the person's ID, the date of the record, the person's current position in the company, and the ID of the formula to be used for that record based on the person's position. If it stopped there it would be pretty straightforward; the complication is that a person can change position mid-stream at any time. Policy is that position changes begin on the first of the month, but that isn't always the case depending on the needs of the company, which is why I tag each record with the formula ID associated with that person's position on that day. The formulas shouldn't change from fiscal year to fiscal year.

Each position within the department corresponds to a formula for calculating a particular key performance indicator (KPI), so if there are five positions there could be three to five different formulas corresponding to them. To add to the complexity, I will need to aggregate these KPIs into possibly a weekly average, definitely a monthly and a 3-month average, and then an annual figure at the end of the fiscal year.
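To make the setup concrete, here is a minimal sketch of what I mean by tagging each daily record with the formula in effect that day. All table and column names here are made up for illustration; only the general shape (person, date, position, formula ID) comes from my actual table:

```sql
-- Hypothetical schema: one row per person per day, stamped with the
-- position and formula that applied on that specific date.
CREATE TABLE DailyStats (
    PersonID     INT  NOT NULL,
    StatDate     DATE NOT NULL,
    PositionID   INT  NOT NULL,  -- position held on StatDate
    FormulaID    INT  NOT NULL,  -- KPI formula tied to that position
    CallsHandled INT  NOT NULL,  -- example raw metrics (assumed)
    TalkTimeSec  INT  NOT NULL,
    PRIMARY KEY (PersonID, StatDate)
);
```

Because the formula ID is stamped on each row, a mid-month position change just shows up as a different FormulaID from that date forward; the history stays intact.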
The approaches I have investigated are dynamic SQL and a cursor, or maybe a combination of the two, or maybe there is something else I am missing. My thought is to bring the data together into a temp table, pass each record through a cursor to retrieve the formula associated with it, plug that formula into a variable used in a SQL statement, and then insert the final records into another temp table for output to the client side, which is an ASP.NET page.

The record counts it will be working with daily will be small to start but will grow quite large. Daily volume will be 100-250 records; multiply that by 7 days for the weekly, 30 days for the monthly, 90 days for the 3-month, and 365 days for the annual, and the counts quickly get into the thousands: an annual run could be approximately 73,000 records. I will be grouping by position and conditioning the data as far down as I can before processing it through the cursor, so that 73,000 could be cut down quite substantially before I run it through.
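For comparison with the cursor idea, here is a sketch of a set-based alternative: since the formulas are identified by FormulaID on each row, they can be written as branches of a CASE expression and aggregated in one pass, with no cursor or dynamic SQL. The formulas and column names here are invented placeholders; the real ones would come from whatever each position's KPI actually is:

```sql
-- Set-based monthly KPI: evaluate the per-row formula via CASE,
-- then average per person per month. Formula bodies are assumed.
SELECT
    PersonID,
    YEAR(StatDate)  AS StatYear,
    MONTH(StatDate) AS StatMonth,
    AVG(CASE FormulaID
            WHEN 1 THEN CallsHandled * 1.0 / NULLIF(TalkTimeSec, 0)
            WHEN 2 THEN (CallsHandled - 5) * 1.0 / NULLIF(TalkTimeSec / 60, 0)
        END) AS MonthlyKPI
FROM DailyStats
GROUP BY PersonID, YEAR(StatDate), MONTH(StatDate);
```

The weekly, 3-month, and annual rollups would be the same query with a different GROUP BY granularity. The trade-off, as I understand it, is that a CASE approach means editing the query when a formula is added, whereas dynamic SQL can pull formula text from a table at the cost of plan caching and injection concerns.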
So the question is: am I on the right track, or is there a better way to do this?