How Do You Programmatically Download a Webpage in Java
Question: I would like to be able to fetch a web page's HTML and save it to a String, so I can do some processing on it. Also, how could I handle various types of compression? How would I go about doing that using Java?

Answer 1: Here's some tested code using Java's URL class. I'd recommend doing a better job than I do here of handling the exceptions, or passing them up the call stack, though.

    public static void main(String[] args) {
        URL url;
        InputStream is = null;
        BufferedReader br;
        String line;
        try {
            url = new URL("https://www.example.com/");
            is = url.openStream();
            br = new BufferedReader(new InputStreamReader(is));
            while ((line = br.readLine()) != null) {
                System.out.println(line);
            }
        } catch (IOException ioe) {
            ioe.printStackTrace();
        } finally {
            if (is != null) {
                try {
                    is.close();
                } catch (IOException ioe) {
                    // nothing we can do about it here
                }
            }
        }
    }
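The `URL` approach above ignores the compression part of the question. One way to handle it, sketched below under the assumption that the server only uses gzip or deflate (the class name `PageFetcher` and the helper methods `decode`, `readAll`, and `fetch` are illustrative, not from the original answer): advertise `Accept-Encoding` on an `HttpURLConnection`, then wrap the response stream in `GZIPInputStream` or `InflaterInputStream` based on the `Content-Encoding` response header.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.InflaterInputStream;

public class PageFetcher {

    // Wrap the raw response stream according to the Content-Encoding header.
    static InputStream decode(InputStream raw, String encoding) throws IOException {
        if (encoding == null) {
            return raw; // no compression
        }
        switch (encoding.toLowerCase()) {
            case "gzip":
                return new GZIPInputStream(raw);
            case "deflate":
                return new InflaterInputStream(raw);
            default:
                return raw; // unknown encoding: pass through unchanged
        }
    }

    // Read an entire stream into a String, one line at a time.
    static String readAll(InputStream in, Charset cs) throws IOException {
        StringBuilder sb = new StringBuilder();
        try (BufferedReader br = new BufferedReader(new InputStreamReader(in, cs))) {
            String line;
            while ((line = br.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }
        return sb.toString();
    }

    // Fetch a page, advertising and transparently decoding gzip/deflate.
    static String fetch(String pageUrl) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(pageUrl).openConnection();
        conn.setRequestProperty("Accept-Encoding", "gzip, deflate");
        InputStream body = decode(conn.getInputStream(), conn.getContentEncoding());
        return readAll(body, StandardCharsets.UTF_8);
    }
}
```

Note that `getContentEncoding()` returns `null` when the server sent the body uncompressed, which is why `decode` treats a missing header as "pass through". For production use, a library such as Apache HttpClient handles this (and charset detection from the `Content-Type` header) automatically.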