java-io

reading a file from HDFS only after it is fully written and closed

无人久伴 submitted on 2019-12-07 07:13:12
Question: I have two processes running. One is writing files to HDFS and the other is loading those files. The first process (the one that writes the file) is using:

    private void writeFileToHdfs(byte[] sourceStream, Path outFilePath) {
        FSDataOutputStream out = null;
        try {
            // create the file
            out = getFileSystem().create(outFilePath);
            out.write(sourceStream);
        } catch (Exception e) {
            LOG.error("Error while trying to write a file to hdfs", e);
        } finally {
            try {
                if (null != out) out.close();
            } catch (Exception e) {
                LOG.error("Error while closing the output stream", e);
            }
        }
    }
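One common way to make the reading process see only fully written files is to write to a temporary path and rename it after the stream is closed, since an HDFS rename is atomic. This is only a sketch of that idea, not necessarily the thread's accepted answer; the class name, the "._tmp" suffix and the writeFully signature are made up for illustration, and the caller is assumed to already hold the FileSystem that getFileSystem() returns in the question.

    import java.io.IOException;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class AtomicHdfsWriter {
        // Writes to a temporary sibling path first, then renames. The rename is
        // atomic on HDFS, so a reader sees either no file or the complete file.
        static void writeFully(FileSystem fs, byte[] content, Path finalPath) throws IOException {
            Path tmpPath = new Path(finalPath.getParent(), finalPath.getName() + "._tmp");
            try (FSDataOutputStream out = fs.create(tmpPath, true)) {
                out.write(content);
            }
            if (!fs.rename(tmpPath, finalPath)) {
                throw new IOException("Rename failed: " + tmpPath + " -> " + finalPath);
            }
        }
    }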

Making sure file gets deleted on JVM exit

China☆狼群 submitted on 2019-12-07 04:01:01
Question: Does File.deleteOnExit() guarantee that the file is deleted even if the JVM is killed prematurely? Answer 1: "Deletion will be attempted only for normal termination of the virtual machine, as defined by the Java Language Specification." So no; check for the file at the next start-up if possible. Answer 2: As Tim Bender notes, File.deleteOnExit() does not guarantee that the file actually gets deleted. On Unix-like systems (such as Linux or OS X), however, it's possible to delete the temporary file before writing to it
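A sketch of the "delete before writing" idea from Answer 2: on Unix-like systems an open stream keeps its data reachable after the directory entry is removed, so the space is reclaimed no matter how the JVM dies. The file name prefix is arbitrary, and this does not behave the same way on Windows.

    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;

    public class DeleteBeforeUse {
        public static void main(String[] args) throws IOException {
            File tmp = File.createTempFile("scratch", ".tmp");
            try (FileOutputStream out = new FileOutputStream(tmp)) {
                // Unlink the file while the descriptor is still open; the kernel
                // frees the blocks once the stream (or the whole JVM) goes away.
                if (!tmp.delete()) {
                    System.err.println("Could not delete " + tmp);
                }
                out.write("temporary data".getBytes(StandardCharsets.UTF_8));
            }
        }
    }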

How do java.io.Buffer* streams differ from normal streams?

泄露秘密 submitted on 2019-12-06 20:46:07
Question: 1) How do buffered streams work in the background, how do they differ from normal streams, and what are the advantages of using them? 2) DataInputStream is also byte-based, but it has a readLine() method. What is the point of that? Answer 1: From the BufferedInputStream javadoc: "A BufferedInputStream adds functionality to another input stream, namely the ability to buffer the input and to support the mark and reset methods. When the BufferedInputStream is created, an internal buffer array is created."
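A small illustration of the difference, with "input.bin" as a placeholder file: the plain FileInputStream goes to the operating system for every read(), while the BufferedInputStream serves most read() calls from its in-memory buffer and only refills it occasionally, which is where the speed-up comes from.

    import java.io.BufferedInputStream;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    public class BufferedVsPlain {
        public static void main(String[] args) throws IOException {
            long a = countBytes(new FileInputStream("input.bin"));                          // one OS call per byte
            long b = countBytes(new BufferedInputStream(new FileInputStream("input.bin"))); // one OS call per buffer refill
            System.out.println(a + " / " + b); // same count, very different speed
        }

        private static long countBytes(InputStream in) throws IOException {
            try (InputStream s = in) {
                long n = 0;
                while (s.read() != -1) n++;   // byte-at-a-time on purpose, to expose the difference
                return n;
            }
        }
    }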

Java IO : Writing into a text file line by line [closed]

ε祈祈猫儿з submitted on 2019-12-06 16:23:07
Question: Closed. This question is off-topic and is not currently accepting answers. Closed 5 years ago. I have a requirement where I need to write a text file line by line. The number of lines may be up to 80K. I am opening the file output stream and, inside a for loop, iterating a list, forming a line, and writing the line to the file. This means 80K write operations are made on the file. Opening and
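The usual way to keep 80K logical writes from becoming 80K physical writes is to open the file once and write through a BufferedWriter, which batches the output in memory. A sketch, where lines stands in for the question's list and the path is a placeholder:

    import java.io.BufferedWriter;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;

    public class LineWriter {
        // Opens the file once; the writer's internal buffer turns the 80K logical
        // writes into far fewer physical writes to disk.
        static void writeLines(List<String> lines, String path) throws IOException {
            try (BufferedWriter out = Files.newBufferedWriter(Paths.get(path), StandardCharsets.UTF_8)) {
                for (String line : lines) {
                    out.write(line);
                    out.newLine();
                }
            }
        }
    }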

Resolving IOException, FileNotFoundException when using FileReader

て烟熏妆下的殇ゞ submitted on 2019-12-06 14:06:59
Question: I've not been able to resolve the following exception in the code below. What is the problem with the way I use BufferedReader? I'm using BufferedReader inside the main method. Output: ParseFileName.java:56: unreported exception java.io.FileNotFoundException; must be caught or declared to be thrown BufferedReader buffread = new BufferedReader(new FileReader("file.txt")); // ParseFileName is used to get the file name from a file path // For example: get crc.v from "$ROOT/rtl/..path/crc.v"
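FileNotFoundException is a checked exception, so the compiler insists that the FileReader construction either be wrapped in try/catch or that main declare it. A minimal sketch of the first option (the class and file names simply echo the question):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class ParseFileName {
        public static void main(String[] args) {
            // try-with-resources closes the reader, and the catch block satisfies
            // the "must be caught or declared to be thrown" compile error.
            try (BufferedReader buffread = new BufferedReader(new FileReader("file.txt"))) {
                String line;
                while ((line = buffread.readLine()) != null) {
                    System.out.println(line);
                }
            } catch (IOException e) {   // FileNotFoundException is a subclass of IOException
                System.err.println("Could not read file.txt: " + e.getMessage());
            }
        }
    }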

Spring Mvc java.io.FileNotFoundException - ApplicationContext.xml

穿精又带淫゛_ submitted on 2019-12-06 12:24:35
The applicationContext.xml is in the WEB-INF folder, so why am I getting this error:

org.springframework.beans.factory.BeanDefinitionStoreException: IOException parsing XML document from class path resource [applicationContext.xml]; nested exception is java.io.FileNotFoundException: class path resource [applicationContext.xml] cannot be opened because it does not exist

web.xml:

    <?xml version="1.0" encoding="UTF-8"?>
    <web-app xmlns="http://java.sun.com/xml/ns/javaee"
             xmlns:web="http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi
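"class path resource" means Spring resolves the name against the classpath (WEB-INF/classes and the jars under WEB-INF/lib), and the WEB-INF folder itself is not on the classpath; the usual fixes are to move the file into WEB-INF/classes (src/main/resources in a Maven layout) or to point contextConfigLocation at /WEB-INF/applicationContext.xml. A small sketch of the same classpath lookup, purely for illustration (the class name is made up):

    import java.io.InputStream;

    public class ClasspathCheck {
        public static void main(String[] args) {
            // Mirrors how "class path resource [applicationContext.xml]" is resolved:
            // relative to the classpath roots, never relative to WEB-INF itself.
            InputStream in = ClasspathCheck.class.getClassLoader()
                                                 .getResourceAsStream("applicationContext.xml");
            System.out.println(in == null ? "not on the classpath" : "found on the classpath");
        }
    }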

Need help with pdf-renderer

强颜欢笑 submitted on 2019-12-06 12:07:35
Question: I'm using PDF-Renderer to view PDF files within my Java application. It's working perfectly for normal PDF files. However, I want the application to be able to display encrypted PDF files. The encrypted file will be decrypted with CipherInputStream, but I do not want to save the decrypted data on disk. I am trying to figure out a way to pass the decrypted data from the CipherInputStream to the PDFFile constructor without having to write the decrypted data to a file. I will also appreciate it if someone can
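PDF-Renderer's PDFFile can be constructed from a ByteBuffer, so one way to stay entirely in memory is to drain the stream into a byte array and wrap it. A sketch only; decryptedStream stands in for the question's CipherInputStream and the helper name is made up:

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.ByteBuffer;
    import com.sun.pdfview.PDFFile;

    public class InMemoryPdf {
        // Drains the (already decrypting) stream into memory and hands PDF-Renderer
        // a ByteBuffer, so the decrypted bytes never touch the disk.
        static PDFFile load(InputStream decryptedStream) throws IOException {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            byte[] chunk = new byte[8192];
            int n;
            while ((n = decryptedStream.read(chunk)) != -1) {
                bos.write(chunk, 0, n);
            }
            return new PDFFile(ByteBuffer.wrap(bos.toByteArray()));
        }
    }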

base64 decoded file is not equal to the original unencoded file

◇◆丶佛笑我妖孽 submitted on 2019-12-06 11:19:19
I have a normal PDF file A.pdf; a third party encodes the file in base64 and sends it to me via a web service as a long string (I have no control over the third party). My problem is that when I decode the string with java org.apache.commons.codec.binary.Base64 and write the output to a file called B.pdf, I expect B.pdf to be identical to A.pdf, but B.pdf turns out a little different from A.pdf. As a result B.pdf is not recognized as a valid PDF by Acrobat. Does base64 have different types of encoding/charset mechanisms? Can I detect how the string I received is encoded so that B.pdf = A.pdf? EDIT-
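Base64 itself has no charset variants worth worrying about; differences usually come from the string being altered in transit or from the decoded bytes being written through a character Writer instead of a byte stream. A sketch that decodes with commons-codec, writes the raw bytes directly, and prints an MD5 checksum to compare against the sender's A.pdf ("payload.b64" is a placeholder for wherever the received string is stored):

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;
    import org.apache.commons.codec.binary.Base64;
    import org.apache.commons.codec.binary.Hex;

    public class DecodePdf {
        public static void main(String[] args) throws IOException, NoSuchAlgorithmException {
            String encoded = new String(Files.readAllBytes(Paths.get("payload.b64")),
                                        StandardCharsets.US_ASCII);
            // decodeBase64 skips whitespace, so MIME-style line breaks are harmless.
            byte[] raw = Base64.decodeBase64(encoded);
            // Write the bytes as bytes; never route them through a Writer or a String.
            Files.write(Paths.get("B.pdf"), raw);
            // Compare this digest with one computed over the original A.pdf.
            byte[] digest = MessageDigest.getInstance("MD5").digest(raw);
            System.out.println("MD5 of decoded bytes: " + Hex.encodeHexString(digest));
        }
    }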

Reading integer values from binary file using Java

浪尽此生 submitted on 2019-12-06 07:16:32
Question: I am trying to write values greater than 256 using the DataOutputStream.write() method. When I try reading the same value using DataInputStream.read(), it returns 0. So I used the DataOutputStream.writeInt() and DataInputStream.readInt() methods to write and retrieve values greater than 256, and that works fine. Referring to the code snippet below, I would like to understand what in.readInt() does inside the while statement. FileOutputStream fout = new
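For context, write(int) stores only the low 8 bits of its argument (so 300 would come back as 44), while writeInt() stores all four bytes and readInt() reads exactly four bytes per call, throwing EOFException at end of file rather than returning -1. A small self-contained sketch ("ints.bin" is a placeholder name):

    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.EOFException;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;

    public class IntFileDemo {
        public static void main(String[] args) throws IOException {
            try (DataOutputStream out = new DataOutputStream(new FileOutputStream("ints.bin"))) {
                out.writeInt(300);      // all 4 bytes are written, unlike write(300)
                out.writeInt(70000);
            }
            try (DataInputStream in = new DataInputStream(new FileInputStream("ints.bin"))) {
                while (true) {
                    try {
                        System.out.println(in.readInt()); // reads exactly 4 bytes per call
                    } catch (EOFException eof) {
                        break;                            // readInt signals end-of-file this way
                    }
                }
            }
        }
    }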

Why is File.exists() behaving flakily in a multithreaded environment?

本小妞迷上赌 submitted on 2019-12-06 05:25:02
Question: I have a batch process running under Java JDK 1.7. It is running on a system with RHEL, 2.6.18-308.el5 #1 SMP. This process gets a list of metadata objects from a database. From this metadata it extracts a path to a file. This file may or may not actually exist. The process uses an ExecutorService (Executors.newFixedThreadPool()) to launch multiple threads. Each thread runs a Callable that launches a process that reads that file and writes another file if that input file exists (and logs
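For reference, a stripped-down sketch of the structure the question describes, with placeholder paths and pool size; it checks existence with java.nio.file.Files.exists, which is one commonly suggested alternative when File.exists() looks flaky, though it does not by itself explain or cure stale results:

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class ExistenceCheckDemo {
        public static void main(String[] args) throws Exception {
            List<String> pathsFromMetadata = Arrays.asList("/data/in/a.txt", "/data/in/b.txt");
            ExecutorService pool = Executors.newFixedThreadPool(4);
            List<Future<Boolean>> results = new ArrayList<>();
            for (final String p : pathsFromMetadata) {
                results.add(pool.submit(new Callable<Boolean>() {
                    @Override
                    public Boolean call() {
                        // Files.exists throws no checked exception and reports the
                        // filesystem's view at the moment of the call.
                        boolean exists = Files.exists(Paths.get(p));
                        System.out.println(p + " exists: " + exists);
                        return exists;
                    }
                }));
            }
            for (Future<Boolean> f : results) {
                f.get();        // surface any exception thrown inside a worker
            }
            pool.shutdown();
        }
    }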