requestjs

http request function won't return result

吖頭↗ · submitted on 2021-02-10 20:45:37
Question: I am setting up a server with Express.js and I want a 'GET' request to '/' to return the results of a function. The function makes a GET request to a news API. When I make the call to '/', the function is triggered and the results ('stories') are logged to the console, but nothing is sent in the response to the '/' 'GET' request. I have tried putting the 'return' statement in a few different places and it still doesn't work... any ideas would be hugely appreciated!

NodeJs ECONNRESET

会有一股神秘感。 · submitted on 2020-03-18 11:19:09
Question: I get an error when executing a GET request. Error: read ECONNRESET at TLSWrap.onStreamRead (internal/stream_base_commons.js:111:27) errno: 'ECONNRESET', code: 'ECONNRESET', syscall: 'read' } It happens when I try to make a GET request to https://www.adidas.co.uk/api/search/taxonomy?query=men but it works when I use https://jsonplaceholder.typicode.com/api/todos/1 . Request: var request = require('request'); [...] app.use('/api', cacheMiddleware(), function (req, out) { var headers = { //

How to POST from JS or node.js?

与世无争的帅哥 · submitted on 2019-12-25 01:37:25
Question: I want to run the following POST command from my JS or node.js file. terminal.zsh curl -L --data-binary @data/scrape.csv https://script.google.com/macros/s/#/exec I can successfully write my .csv file from my node.js file with the following code. node.js const ObjectsToCsv = require('objects-to-csv'); const itemsAsCsv = new ObjectsToCsv(formattedItems); itemsAsCsv.toDisk(filePathCsv, { allColumns: true, }); I have unsuccessfully tried the following. I expect to see the data hit my API but

stream response from nodejs request to s3

房东的猫 · submitted on 2019-12-22 04:06:41
Question: How do you use request to download the contents of a file and stream it directly up to S3 using the aws-sdk for Node? The code below gives me Object #<Request> has no method 'read', which makes it seem like request does not return a readable stream... var req = require('request'); var s3 = new AWS.S3({params: {Bucket: myBucket, Key: s3Key}}); var imageStream = req.get(url) .on('response', function (response) { if (200 == response.statusCode) { //imageStream should be read()able by now right? s3

What is the proper way to loop through an array in an EJS template after an AJAX Call (using ExpressJS)?

安稳与你 · submitted on 2019-12-22 03:46:08
Question: I am trying to loop through an array of objects that I got from an HTTP call to my internal API using the request module/package. So far, I am able to get my data back from the API and display the full object on my page. I would like to display it on my page and loop through it using the EJS templating system. I know I could use AngularJS for frontend work, but I would like to see how far I can go with server-side code only. Below is my code: server.js // Prepend /api to my apiRoutes

Getting binary content in Node.js using request

夙愿已清 · submitted on 2019-12-17 02:21:04
Question: I was trying to GET binary data using request, and had something like: var requestSettings = { method: 'GET', url: url, }; request(requestSettings, function(error, response, body) { // Use body as a binary Buffer } But body was always a few bytes different from what I expected. After further investigation I found out that request assumed body was a string and replaced all non-Unicode bytes. I tried adding encoding: 'binary' to requestSettings but it didn't help. How can I get the binary data? Answer 1: OK

Piping from request.js to s3.upload results in a zero byte file

泪湿孤枕 · submitted on 2019-12-11 19:34:13
Question: I'm trying to pipe a file from request.js straight to S3. The code below is the closest I've managed to figure out. It runs, but the file that ends up on S3 is zero bytes. What am I getting wrong? var request = require('request'); var AWS = require('aws-sdk'); var s3 = new AWS.S3(); request('https://placekitten.com/g/2000/2000') .on('response', function(response) { response.on('data', function(data) { // this is getting called console.log(data.length); }); s3.upload({ Body: response, Bucket:

Why does NodeJS request() fail on localhost in flight mode, but not 127.0.0.1? (Windows 10)

北慕城南 · submitted on 2019-12-11 14:59:59
Question: Given I already have a server up and running on localhost (see further down for an example), at a Node command line while online, I get the following: > var x = request('http://localhost:8080/test.html', ... function(err) { if (err) console.log(err) }) undefined > I expect to get the above result all the time. If I've switched to flight mode, I instead get the following: > var x = request('http://localhost:8080/test.html', ... function(err) { if (err) console.log(err) }) undefined > { Error:

Why does requestjs reject a self-signed SSL certificate that works with Firefox?

断了今生、忘了曾经 · submitted on 2019-12-11 10:34:26
Question: Here's the situation. I created a self-signed CA certificate and used it to sign a second certificate for use with HTTPS. The web server is nginx, doing SSL termination and reverse proxying to an expressjs application. To verify that the chain of trust is correct, I installed the CA in Firefox and was able to access the website over HTTPS without warnings, as expected. Further, I can inspect the server's certificate with openssl x509 -in server.crt -text -noout, and I see both the expected