Old 02-12-2008, 11:20 PM   #1
rblampain
HTML/PHP pages and unnecessary characters removed


I have a BASIC program that produces two HTML or PHP files from the same input text file. One is formatted normally, without removing unnecessary spaces and newlines; the other has those characters removed, which turns the document into a single string (about 40k in my problem case). Both files are checked with the W3C validator and are always found to be valid code.

This program gives the file a .PHP extension if it contains "include" statements, or a .HTM extension if it does not.

The reason for this sort of compression is that I read somewhere, some time ago, that it makes the file download faster to a visitor's computer.
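
Roughly speaking, the stripping step is equivalent to this PHP sketch (the real program is in BASIC and is not shown here; the function name is invented):

Code:
<?php
// Illustration only: a PHP equivalent of the stripping step.
function strip_whitespace($html)
{
    // Collapse every run of spaces, tabs and newlines into a single
    // space. The result contains no newlines at all, so the whole
    // page becomes one long line (about 40k in my case).
    return preg_replace('/\s+/', ' ', $html);
}
?>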

The problem is that, although the files render correctly and without error on my machine, they do not when served from the Internet server. When I view the file from the server, it appears as one line overwritten on itself, and when the browser tries to render it I get an error message saying an unexpected "/" has been found.

When I display the same file on my own computer, it takes a long time to load, but it shows as a single string and renders correctly.

My questions are:
What could the problem be?
Should I do it differently?

Any hint is welcome, and thank you for your help.
 
Old 02-13-2008, 02:38 AM   #2
j-ray
1.
It would be helpful if you posted the source code of the HTML/PHP file your browser receives. In general, sorting the whitespace out of HTML files does not make the page much faster (a quick way to check this is sketched at the end of this post).

2.
Are there include statements in the text, so that we should expect PHP? If so, as I assume, maybe there are some quotes missing or something like that...
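
To put a number on point 1, here is a rough PHP sketch that compares the sizes before and after stripping ('page.html' is a placeholder file name):

Code:
<?php
// Quick check of how much whitespace stripping actually saves.
$orig = file_get_contents('page.html');
$min  = preg_replace('/\s+/', ' ', $orig); // collapse whitespace runs

printf("original: %d bytes\n", strlen($orig));
printf("stripped: %d bytes (%.1f%% saved)\n", strlen($min),
       100 * (strlen($orig) - strlen($min)) / strlen($orig));
?>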
 
Old 02-13-2008, 04:15 AM   #3
graemef
Without more information about your error it's difficult to give advice. However, some thoughts:

Do the operating systems differ? Windows and Linux handle newlines differently (see the sketch at the end of this post for one way to check and normalise line endings).
What character encoding are you using? ASCII, Unicode (UTF-8), etc.?

What do you mean by "it takes a long time to load"? Could this be an indication of your problem?
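
If line endings are the suspect, a quick PHP sketch like this could show and normalise them ('page.html' is a placeholder name):

Code:
<?php
// Count the line-ending styles in a file, then normalise to Unix LF.
$text = file_get_contents('page.html');

printf("CRLF (Windows): %d\n", substr_count($text, "\r\n"));
printf("LF   (Unix):    %d\n",
       substr_count(str_replace("\r\n", '', $text), "\n"));

// Convert CRLF and bare CR to plain LF and write the result out.
$unix = str_replace(array("\r\n", "\r"), "\n", $text);
file_put_contents('page.unix.html', $unix);
?>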
 
Old 02-13-2008, 07:32 AM   #4
rupertwh
A 40K long text line?
Please! What if you ever want to inspect the HTML code? Happy scrolling.

You will not achieve any worthwhile speed increase, but you will break any parser/editor/viewer that (wrongfully) doesn't expect a text file to contain a forty-thousand-character line.
In fact, anything above 80 characters per line can become a nuisance to someone looking at the code.
 
Old 02-13-2008, 10:00 PM   #5
rblampain
Original Poster
Thank you for your answers.

The only PHP code is "include" statements like this one:
<?php include '../charts/p1.chart01.php'; ?>
The Internet site provider uses Linux, and I use UTF-8. When I said it takes a long time to load, I meant in "gedit": between 4 and 35 seconds (for 40k). This could be due to a bug in my system, because I regularly get messages like "window not responding" for no apparent reason; we could not solve that problem. Rendering the file in Firefox is instantaneous, but I have just discovered that Iceweasel opens the file in Bluefish instead of rendering it, and that is also very slow.

If there is little or no advantage in "compressing" the file into a single string, as j-ray mentioned, then it is much safer to use the "uncompressed" file, which gives no problems, and that is what I will do. (A gentler alternative that keeps the newlines is sketched below, should I ever revisit the idea.)
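
Something like this PHP sketch (function name invented) would trim the fat without ever producing a single long line:

Code:
<?php
// A gentler alternative: trim trailing whitespace and squeeze runs
// of blank lines, but keep one newline per line so no editor or
// parser has to cope with a 40k-character line.
function tidy_html($html)
{
    $html = preg_replace('/[ \t]+$/m', '', $html);    // trailing spaces/tabs
    $html = preg_replace("/\n{3,}/", "\n\n", $html);  // squeeze blank lines
    return $html;
}
?>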
 
Old 02-13-2008, 10:05 PM   #6
rblampain
Original Poster
Thank you, rupertwh, for your comment, which supports j-ray's answer.

 
  

