I need to optimize the loading speed of several existing websites. One of the issues I have is the number of requests per page. The websites have 7 or more different types of…
Depending on your development environment, you might consider automating the process. It is a fair bit more work up front, but I have found it worth it in the long run. How you go about it depends largely on your project and environment. There are several options, but I will explain (at a high level) what we did.
In our case, we have several ASP.NET-based websites. I wrote an ASP.NET control that simply contains a list of static dependencies (CSS and JavaScript), and each page lists what it needs. We have some pages with 7 or 8 JS dependencies and 4 or 5 CSS dependencies, depending on which shared libraries/controls are being used. The first time the page loads, I create a new background worker thread that evaluates all the static resources, combines them into a single file (one for CSS, one for JS), and then minifies them using the Yahoo YUI Compressor (which handles both JS and CSS). I then output the result into a new "merged" or "optimized" directory.
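To make the idea more concrete, here is a rough sketch of what that background merge-and-minify step can look like. This is not our actual control; the class and method names (StaticResourceBundler, QueueBundle) are made up for illustration, and the Compress calls assume the YUICompressor.NET port of the Yahoo YUI Compressor (Yahoo.Yui.Compressor namespace), whose exact API differs between versions:

```csharp
// Sketch only: merge the page's listed dependencies into one file and minify it.
// The compressor calls below assume the YUICompressor.NET port; treat them as
// placeholders for whatever minification API you end up using.
using System.IO;
using System.Text;
using System.Threading;
using Yahoo.Yui.Compressor;

public static class StaticResourceBundler
{
    // Kick off the merge on a background thread the first time the page loads,
    // so the initial request is not blocked by the minification work.
    public static void QueueBundle(string[] sourcePaths, string outputPath, bool isCss)
    {
        var worker = new Thread(() => Bundle(sourcePaths, outputPath, isCss))
        {
            IsBackground = true
        };
        worker.Start();
    }

    private static void Bundle(string[] sourcePaths, string outputPath, bool isCss)
    {
        // Concatenate every listed dependency into one buffer, in page order.
        var merged = new StringBuilder();
        foreach (var path in sourcePaths)
            merged.AppendLine(File.ReadAllText(path));

        // Minify the combined text (assumed YUI Compressor API, see note above).
        string minified = isCss
            ? new CssCompressor().Compress(merged.ToString())
            : new JavaScriptCompressor().Compress(merged.ToString());

        // Write the result into the "merged"/"optimized" directory so later
        // requests can pick it up instead of the individual files.
        Directory.CreateDirectory(Path.GetDirectoryName(outputPath));
        File.WriteAllText(outputPath, minified);
    }
}
```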
The next time someone loads that page, the ASP.NET control sees the optimized version of the resources and loads that instead of the list of 10-12 individual files.
Furthermore, it is designed to load the optimized resources only when the project is running in "RELEASE" mode (as opposed to DEBUG mode inside Visual Studio). This is fantastic because we can keep the classes, pages, controls, etc. separate for organization (and for sharing across projects), yet we still get the benefit of optimized loading. It is a completely transparent process that requires no additional attention once it is working. We even went back and added a condition that loads the non-optimized resources if "debug=true" is specified in the URL's query string, for cases where you need to verify or replicate bugs in production.
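A simplified sketch of the decision the control makes on each request is below. Again, the names (ResourceResolver, Resolve) are made up, and the build-mode check is shown as an #if DEBUG constant purely for illustration; in practice you could just as easily read the compilation debug flag from web.config:

```csharp
// Sketch only: decide whether to emit the single merged file or the raw list.
using System;
using System.Collections.Generic;
using System.IO;
using System.Web;

public static class ResourceResolver
{
    public static IEnumerable<string> Resolve(
        IList<string> dependencies,   // the individual CSS/JS files the page listed
        string optimizedPath,         // the merged/minified file for this page
        HttpRequest request)
    {
        // "debug=true" on the query string forces the raw files even in production,
        // which makes it possible to step through un-minified code when replicating bugs.
        bool forceDebug = string.Equals(request.QueryString["debug"], "true",
                                        StringComparison.OrdinalIgnoreCase);

#if DEBUG
        bool releaseBuild = false;    // inside Visual Studio we always want the raw files
#else
        bool releaseBuild = true;
#endif

        if (releaseBuild && !forceDebug && File.Exists(request.MapPath(optimizedPath)))
            return new[] { optimizedPath };   // one merged request instead of 10-12

        return dependencies;                  // fall back to the individual resources
    }
}
```

The page-level control would call something like Resolve(...) and emit one script or link tag per returned path, so the pages themselves never need to know whether they are getting the merged file or the originals.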