We are looking at creating a "dashboard" for our customers. Our customers use our service to submit data to us for various government auditing needs. We then aggregate their data, compare it to others at a national level, and give them feedback on how they rank within their industry segment, etc.

This new dashboard would offer:

- Current status of each data set: submitted, in progress, or not submitted yet (customers might submit 4 data sets or 40)
- Errors in the data set (e.g., they said they would send data from 5 of their factory sites but we have only received 3 so far, or data from site 2 conflicts with audit data from site 4, so they need to investigate and resolve it)

[b]Here is the real question:[/b]

If we run these queries directly against OLAP, they take 3 to 10 seconds per client... that seems like a long time to wait for a page to load. If all our clients logged in at the same time, say 8 AM on Monday, they would all be requesting that same ~10 second query, which touches around 80% of our OLAP tables to check whether data is there, mismatched, etc. Meanwhile, scanning for all our clients at once takes only 3 or 4 times longer than a query for a single client. It seems more efficient to query for all clients once and store the result rather than running the query per client on demand.

I was thinking of using SSAS to gather up those results and hold them, then refreshing SSAS every 5-15 minutes or incrementally. The design would be a data mart: dimensions for customer, factory site, data set, and error type, plus facts for their status, e.g. "DataSet 15: Not Submitted / Submitted No Errors / Partially Submitted".

I haven't worked with SSAS before, so I wanted to be sure I wasn't misunderstanding its purpose entirely.

Thanks.
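To make the idea concrete, here is a rough sketch (in Python, outside SSAS) of the precompute-and-cache pattern I have in mind: one expensive pass over all clients' rows every few minutes, then cheap per-client reads from the result. All of the names, status values, and fields here are made up for illustration; the real values would come from the data mart's dimensions.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical status values; real ones would live in a status dimension.
class Status(Enum):
    NOT_SUBMITTED = "Not Submitted"
    PARTIALLY_SUBMITTED = "Partially Submitted"
    SUBMITTED_NO_ERRORS = "Submitted No Errors"
    SUBMITTED_WITH_ERRORS = "Submitted With Errors"

# Hypothetical shape of one (client, data set) submission record.
@dataclass(frozen=True)
class Submission:
    client_id: int
    data_set: int
    sites_expected: int
    sites_received: int
    has_conflicts: bool

def classify(sub: Submission) -> Status:
    """Derive a single dashboard status from one submission record."""
    if sub.sites_received == 0:
        return Status.NOT_SUBMITTED
    if sub.has_conflicts:
        return Status.SUBMITTED_WITH_ERRORS
    if sub.sites_received < sub.sites_expected:
        return Status.PARTIALLY_SUBMITTED
    return Status.SUBMITTED_NO_ERRORS

def refresh_summary(all_submissions):
    """The expensive scan over ALL clients at once (run every 5-15
    minutes), producing a cache keyed by (client_id, data_set)."""
    return {(s.client_id, s.data_set): classify(s) for s in all_submissions}

def dashboard_statuses(summary, client_id):
    """Cheap per-client page load: read from the precomputed cache."""
    return {ds: st for (cid, ds), st in summary.items() if cid == client_id}

if __name__ == "__main__":
    rows = [
        Submission(1, 15, sites_expected=5, sites_received=3, has_conflicts=False),
        Submission(1, 16, sites_expected=2, sites_received=2, has_conflicts=True),
        Submission(2, 15, sites_expected=4, sites_received=0, has_conflicts=False),
    ]
    summary = refresh_summary(rows)
    for ds, status in sorted(dashboard_statuses(summary, 1).items()):
        print(f"DataSet {ds}: {status.value}")
```

The point of the sketch is only the shape of the workload: the per-client dashboard read never touches the OLAP tables, and the one full scan is amortized across every client who logs in during the refresh window.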