For a version of the above solution that actually works and doesn't rely on jsonfile (it uses the standard fs module instead), check this out:
Setup:
const fs = require('fs');
// Path of the file that stores the session cookies (JSON, despite the .txt extension)
const cookiesPath = "cookies.txt";
Reading the cookies (put this code first):
// If the cookies file exists, read the cookies.
const previousSession = fs.existsSync(cookiesPath);
if (previousSession) {
  const content = fs.readFileSync(cookiesPath);
  const cookiesArr = JSON.parse(content);
  if (cookiesArr.length !== 0) {
    for (let cookie of cookiesArr) {
      await page.setCookie(cookie);
    }
    console.log('Session has been loaded in the browser');
  }
}
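As a side note, Puppeteer's page.setCookie accepts several cookie objects at once, so if you prefer you can replace the loop with a single spread call:

if (cookiesArr.length !== 0) {
  await page.setCookie(...cookiesArr);
  console.log('Session has been loaded in the browser');
}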
Writing the cookies:
// Write the current session's cookies to disk.
const cookies = await page.cookies();
fs.writeFileSync(cookiesPath, JSON.stringify(cookies));
console.log('Session has been saved to ' + cookiesPath);
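For context, here is a minimal sketch of how the two snippets fit together in a complete script. The URL and the "log in / scrape" step are placeholders you would replace with your own logic:

const puppeteer = require('puppeteer');
const fs = require('fs');
const cookiesPath = "cookies.txt";

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Restore the previous session, if any.
  if (fs.existsSync(cookiesPath)) {
    const cookiesArr = JSON.parse(fs.readFileSync(cookiesPath));
    for (let cookie of cookiesArr) {
      await page.setCookie(cookie);
    }
  }

  await page.goto('https://example.com'); // placeholder URL
  // ... log in / scrape here ...

  // Save the current session for next time.
  const cookies = await page.cookies();
  fs.writeFileSync(cookiesPath, JSON.stringify(cookies));

  await browser.close();
})();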