I've bumped into a problem while working on a project. I want to "crawl" certain websites of interest and save them as a "full web page" including styles and images in or
You actually need to parse the HTML and all referenced CSS files, which is NOT easy. However, a fast way to do it is to use an external tool like wget. After installing wget, you can run the following from the command line:
wget --no-parent --timestamping --convert-links --page-requisites --no-directories --no-host-directories -erobots=off http://example.com/mypage.html
This will download mypage.html along with all linked CSS files, images, and any images referenced inside the CSS.
After installing wget on your system, you can use PHP's system() function to control wget programmatically.
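As a minimal sketch of that idea, you could wrap the wget invocation in a small shell script and call that one script from PHP's system(). The function name (build_cmd) and the echo-only "dry run" are my own additions for illustration; remove the echo to actually run the download:

```shell
#!/bin/sh
# Sketch: build the wget invocation for a given URL.
# It only *prints* the command (dry run); drop the leading
# "echo" inside the function to execute it for real.
build_cmd() {
    echo wget --no-parent --timestamping --convert-links \
        --page-requisites --no-directories --no-host-directories \
        -erobots=off "$1"
}

build_cmd "http://example.com/mypage.html"
```

From PHP you would then call something like system('sh save_page.sh ' . escapeshellarg($url)), letting escapeshellarg() protect against shell injection when the URL comes from user input.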
NOTE: You need at least wget 1.12 to properly save images that are referenced through CSS files.