I cannot get wget to mirror a section of a website (a folder path below root) - it only seems to work from the website homepage.
I've tried many options - here is o
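For reference, limiting wget to a subdirectory usually hinges on its `--no-parent` flag, which stops the crawler from climbing above the starting path back to the site root. A minimal sketch (the URL is a placeholder; all flags are standard wget options):

```shell
# Mirror only the /docs/ subtree of a site.
# --no-parent is the key flag: without it, recursive retrieval
# can follow links upward and end up crawling from the homepage.
wget --mirror \
     --no-parent \
     --convert-links \
     --page-requisites \
     --adjust-extension \
     https://example.com/docs/
```

`--convert-links` and `--page-requisites` make the local copy browsable offline by rewriting links and fetching each page's CSS/JS/images.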
Check out archivebox.io, an open-source, self-hosted tool that creates a local, static, browsable HTML clone of websites (it saves HTML, JS, media files, PDFs, screenshots, static assets, and more).
By default, it only archives the URL you specify, but we're adding a --depth=n flag soon that will let you recursively archive links from the given URL.
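A minimal session with the pip-installed CLI looks roughly like this (a sketch; exact commands may differ by version):

```shell
# Install ArchiveBox and initialize a new archive collection
# (assumes Python and pip are available).
pip install archivebox
mkdir archive && cd archive
archivebox init

# Archive a single URL into the collection; the URL is a placeholder.
archivebox add 'https://example.com/some/page'
```

Each added URL gets its own snapshot folder containing the saved HTML, assets, PDF, and screenshot.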