I use a tool at work that lets me do queries and get back HTML tables of info. I do not have any kind of back-end access to it.
A lot of this info would be more useful if I could get it into a spreadsheet for sorting, averaging, and so on.
Quick and dirty:
Copy the table out of the browser into Excel and save it as a CSV.
Better solution (for long term use):
Write a bit of code in the language of your choice that pulls the HTML contents down and scrapes out the bits that you want. You could probably throw the data operations (sorting, averaging, etc.) on top of the data retrieval as well. That way, you just run your code and get the actual report you want.
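As a rough sketch in Python, assuming the report is reachable at a plain URL and using the requests and pandas libraries (the URL and the "total"/"region" column names are placeholders for whatever your tool actually returns):

```python
import requests
import pandas as pd
from io import StringIO

# Hypothetical report URL -- substitute whatever query URL your tool exposes
REPORT_URL = "http://internal-tool.example.com/report?query=123"

response = requests.get(REPORT_URL)
response.raise_for_status()

# read_html() turns every <table> on the page into a DataFrame;
# assume the table you care about is the first one
tables = pd.read_html(StringIO(response.text))
data = tables[0]

# The "data operations" layer: sort, average, whatever the report needs
data = data.sort_values("total", ascending=False)    # placeholder column name
print(data.groupby("region")["total"].mean())        # placeholder column names

# Drop the result into a CSV so it still opens cleanly in Excel
data.to_csv("report.csv", index=False)
```

If the tool sits behind a login, a requests.Session with whatever auth it uses would slot in where the plain get() is; pandas also needs an HTML parser such as lxml installed for read_html() to work.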
It all depends on how often you will be performing this particular task.