How to download and save a file from Internet using Java?

情深已故 2020-11-21 05:06

There is an online file (such as http://www.example.com/information.asp) I need to grab and save to a directory. I know there are several methods for grabbing a…

21 Answers
  •  清歌不尽
    2020-11-21 05:48

    To summarize (and somewhat polish and update) the previous answers: the three following methods are practically equivalent. (I added explicit timeouts because I think they are a must; nobody wants a download to freeze forever when the connection is lost.)

    import java.io.*;
    import java.net.*;
    import java.nio.channels.*;
    import java.nio.file.*;

    public static void saveUrl1(final Path file, final URL url,
            int secsConnectTimeout, int secsReadTimeout)
            throws IOException {
        // Files.createDirectories(file.getParent()); // optional, make sure parent dir exists
        try (BufferedInputStream in = new BufferedInputStream(
                streamFromUrl(url, secsConnectTimeout, secsReadTimeout));
             OutputStream fout = Files.newOutputStream(file)) {
            final byte[] data = new byte[8192];
            int count;
            while ((count = in.read(data)) > 0)
                fout.write(data, 0, count);
        }
    }
    
    public static void saveUrl2(final Path file, final URL url,
            int secsConnectTimeout, int secsReadTimeout)
            throws IOException {
        // Files.createDirectories(file.getParent()); // optional, make sure parent dir exists
        try (ReadableByteChannel rbc = Channels.newChannel(
                streamFromUrl(url, secsConnectTimeout, secsReadTimeout));
             FileChannel channel = FileChannel.open(file,
                 StandardOpenOption.CREATE,
                 StandardOpenOption.TRUNCATE_EXISTING,
                 StandardOpenOption.WRITE)) {
            channel.transferFrom(rbc, 0, Long.MAX_VALUE);
        }
    }
    
    public static void saveUrl3(final Path file, final URL url,
            int secsConnectTimeout, int secsReadTimeout)
            throws IOException {
        // Files.createDirectories(file.getParent()); // optional, make sure parent dir exists
        try (InputStream in = streamFromUrl(url, secsConnectTimeout, secsReadTimeout)) {
            Files.copy(in, file, StandardCopyOption.REPLACE_EXISTING);
        }
    }
    
    public static InputStream streamFromUrl(URL url, int secsConnectTimeout, int secsReadTimeout)
            throws IOException {
        URLConnection conn = url.openConnection();
        if (secsConnectTimeout > 0) conn.setConnectTimeout(secsConnectTimeout * 1000);
        if (secsReadTimeout > 0) conn.setReadTimeout(secsReadTimeout * 1000);
        return conn.getInputStream();
    }
    

    I don't find significant differences; all seem correct to me. They are safe and efficient. (Differences in speed seem hardly relevant: I download 180 MB from a local server to an SSD in times that fluctuate around 1.2 to 1.5 seconds.) They don't require external libraries. All work with arbitrary sizes and (in my experience) with HTTP redirections.
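
    For example, saveUrl3 can be exercised as below. This is a sketch, not part of the original answer: it uses a file: URL (created from a temp file) so it runs without network access; in real code you would pass an http(s) URL instead. The class name SaveUrlDemo is just for the demo.

    ```java
    import java.io.IOException;
    import java.io.InputStream;
    import java.net.URL;
    import java.net.URLConnection;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.StandardCopyOption;

    public class SaveUrlDemo {
        // Same helper as in the answer: open a stream with explicit timeouts.
        static InputStream streamFromUrl(URL url, int secsConnectTimeout, int secsReadTimeout)
                throws IOException {
            URLConnection conn = url.openConnection();
            if (secsConnectTimeout > 0) conn.setConnectTimeout(secsConnectTimeout * 1000);
            if (secsReadTimeout > 0) conn.setReadTimeout(secsReadTimeout * 1000);
            return conn.getInputStream();
        }

        // saveUrl3 from the answer: Files.copy does the buffering for us.
        static void saveUrl3(Path file, URL url, int secsConnectTimeout, int secsReadTimeout)
                throws IOException {
            try (InputStream in = streamFromUrl(url, secsConnectTimeout, secsReadTimeout)) {
                Files.copy(in, file, StandardCopyOption.REPLACE_EXISTING);
            }
        }

        public static void main(String[] args) throws Exception {
            // A file: URL stands in for the remote resource so the demo is offline.
            Path source = Files.createTempFile("demo-src", ".txt");
            Files.write(source, "hello download".getBytes("UTF-8"));
            Path target = Files.createTempFile("demo-dst", ".txt");

            saveUrl3(target, source.toUri().toURL(), 10, 30);

            System.out.println(new String(Files.readAllBytes(target), "UTF-8"));
        }
    }
    ```

    Note that the timeouts are honored for http(s) connections; for a file: URL they are simply ignored by the underlying URLConnection.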

    Additionally, all of them throw FileNotFoundException if the resource is not found (typically HTTP error 404) and java.net.UnknownHostException if DNS resolution fails; other IOExceptions correspond to errors during transmission.
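
    That error behaviour can be sketched as follows. This is an illustration, not from the original answer: a file: URL pointing at a nonexistent path stands in for an HTTP 404, since getInputStream() throws FileNotFoundException in both cases, so the demo runs offline.

    ```java
    import java.io.FileNotFoundException;
    import java.io.InputStream;
    import java.net.URL;

    public class NotFoundDemo {
        public static void main(String[] args) throws Exception {
            // Nonexistent file: URL; an HttpURLConnection returning 404
            // throws the same FileNotFoundException from getInputStream().
            URL url = new URL("file:/no/such/file-demo-404.txt");
            try (InputStream in = url.openConnection().getInputStream()) {
                System.out.println("unexpectedly found the resource");
            } catch (FileNotFoundException e) {
                System.out.println("FileNotFoundException: resource not found");
            }
        }
    }
    ```

    A caller that wants to distinguish "not found" from transmission failures can therefore catch FileNotFoundException before the generic IOException.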

    (Marked as community wiki, feel free to add info or corrections)
