Programming
This forum is for all programming questions. The question does not have to be directly related to Linux, and any language is fair game.
public static String getTextFromFile(String fileName, String encoding)
        throws IOException
{
    BufferedReader input = new BufferedReader(
            new InputStreamReader(new FileInputStream(fileName), encoding));
    String line = input.readLine();
    StringBuilder stringBuilder = new StringBuilder();
    while (line != null)
    {
        stringBuilder.append(line);
        stringBuilder.append("\n");
        line = input.readLine();
    }
    input.close();
    return stringBuilder.toString();
}
For a file with 3000 lines, this will take 0.25 seconds on my PC (when run as a Java application).
Being run as a Tomcat servlet, this takes minutes...
By debugging the application, I found that the part that takes so long on Tomcat is the loop.
I've also tried to use the StringBuffer class.
Hmmm, it became fast again after a reboot... I'm using Eclipse to start Tomcat (in both run and debug mode), and apparently it gets unreliable (and/or slower) after several starts, even though all Java processes were ended properly.
I think reading one line at a time would be rather slow. Furthermore, you create and then discard a String object for every loop iteration (it might even be two Strings per iteration; I seem to remember that using a String literal, the "\n", creates a new String). You might try to read all the bytes at once:
Code:
...
// read in the file in raw byte form
byte[] buf = new byte[fileSize];
fin = new FileInputStream(fileName);
int off = 0;
while (off < buf.length)
{
    // a single read() is not guaranteed to fill the buffer
    int n = fin.read(buf, off, buf.length - off);
    if (n < 0) break; // unexpected end of file
    off += n;
}
...
return new String(buf, encoding); // translate into encoding
This requires knowing the file size though. There is a more sophisticated method that takes an estimate for file size here.
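For what it's worth, knowing the file size in advance isn't strictly necessary: a ByteArrayOutputStream grows on its own as bytes arrive. A minimal sketch of that approach (the class name and 8 KB chunk size here are just illustrative choices, not anything from the thread):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamSlurp {
    // Reads an entire stream without knowing its length up front;
    // ByteArrayOutputStream resizes its internal buffer as needed.
    public static String slurp(InputStream in, String encoding) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            out.write(chunk, 0, n);
        }
        return out.toString(encoding); // decode the collected bytes
    }
}
```

You'd call it as, e.g., StreamSlurp.slurp(new FileInputStream(fileName), encoding); the trade-off versus a pre-sized array is one extra copy of the data.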
I mean, the code in my first post is what you can read in any basic Java book, or find by searching Google for "java read from text file". And while your code might work (or might not), it's a rather uncommon, experimental approach; i.e., it contains a lot of FIXME comments.
Well, I actually found that experimental piece of code through Googling. Although today I see I was totally on the wrong track: I had read somewhere that reading a text file into memory was called "slurping", so I was searching for that, but today when I searched for "java read file string" I saw that things are much simpler. I somehow missed the available() method in FileInputStream.
So this should work fine:
Code:
...
// read in the file in raw byte form
fin = new FileInputStream(fileName);
// for a plain local file, available() reports the remaining length
byte[] buf = new byte[fin.available()];
int off = 0;
while (off < buf.length)
{
    // a single read() is not guaranteed to fill the buffer
    int n = fin.read(buf, off, buf.length - off);
    if (n < 0) break; // unexpected end of file
    off += n;
}
...
return new String(buf, encoding); // translate into encoding
Although I really feel you ought to be able to just attach a FileInputStream to a StringWriter and have all the low-level stuff be taken care of automagically, oh well...
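Something close to that does exist in the standard library: wrap the FileInputStream in an InputStreamReader (which does all the byte-to-char decoding) and copy chars into a StringWriter. A sketch along those lines (the class name and 4 KB buffer size are my own choices):

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringWriter;

public class ReaderSlurp {
    // Copies any Reader into a StringWriter in fixed-size chunks;
    // the Reader takes care of charset decoding and buffering concerns.
    public static String slurp(Reader reader) throws IOException {
        StringWriter writer = new StringWriter();
        char[] chunk = new char[4096];
        int n;
        while ((n = reader.read(chunk)) != -1) {
            writer.write(chunk, 0, n);
        }
        return writer.toString();
    }
}
```

Usage would be something like ReaderSlurp.slurp(new InputStreamReader(new FileInputStream(fileName), encoding)); unlike the readLine() loop, this also preserves the file's original line endings.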