Programming
This forum is for all programming questions. The question does not have to be directly related to Linux, and any language is fair game.
Hi all,
This may prove to be more of a vent than something anyone can do anything about right away.
My experience with the computer, as a piece of hardware and software, has made me on the wary side. Quite some time back I made a very basic DOS program that maintains 3 or so files in parallel. They are all identical (the long-term goal was to get them dispersed across the storage devices and help stability that way as well). I have heard that there are networked databases that do something similar across network links. Anyway, even that simple little prog caught failures while working in a Windows environment. It is not very happy with dosemu (only 24 lines allowed), many Alt keys are captured by the OS, and some Ctrl keys do not react properly.
What I think was going on was that memory buffering of the data on the hard drive kept a write error from being discovered right away. Since I follow the program's runs closely with a thorough file-compare process in a batch file, the error was discovered on the next use of the files. So I have seen directly that hardware integrity problems can create data degradation. I'm sure there is other very important information out there that needs high integrity as well. The reason I stick so tight with it is that I use it to keep records on 2 cemeteries I maintain. There are records dating back to 1904 still in effect. Extrapolating that into the future, who can say how long one needs to keep such information validly intact? There isn't the seven-year cutoff that many businesses might slip by with as it relates to funds and taxes.
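For what it's worth, the triple-copy-and-compare idea can be sketched in Python; this is a minimal illustration under my own assumptions (the file naming, and md5 for the compare), not the original DOS program. The fsync call is one way to keep OS write buffering from hiding a failed write until the next session:

```python
import hashlib
import os

def write_replicated(path_stem, data, copies=3):
    """Write identical copies of `data`, fsync each so OS buffering
    can't hide a failed write, then cross-check all the copies."""
    paths = [f"{path_stem}.{i}" for i in range(copies)]
    for p in paths:
        with open(p, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())   # push the bytes past the OS write cache
    # Read every copy back and compare digests; any mismatch is a fault.
    digests = set()
    for p in paths:
        with open(p, "rb") as f:
            digests.add(hashlib.md5(f.read()).hexdigest())
    if len(digests) != 1:
        raise IOError("replica mismatch: one of the copies is corrupt")
    return paths
```

A batch-file compare on the next run would then only be a second line of defense rather than the first place the error shows up.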
Anyway, in the endeavor to serve well, I can see the need for high-integrity programs that will react wisely to hardware flubs, if nothing else. I look at the space shuttle as a nice example of a way to address such concerns. Radiation has a noted effect on memory chips, and even people at sea level are getting some of it. (Another benefit of addressing things in code like this is that it also catches errant code while you're still programming it.) Some of the approaches I would take would be:
1) use more than one variable in the program to store data that is in the process of being updated.
2) reverse-operate a function to make sure the process executed properly, where possible.
a) floating-point math is very tough to verify at some points, especially in functions like sines and tangents. The best I could guess would be to change the variable by one bit in each direction and make sure the reverse-operation results land on both sides of the starting value.
3) use diverse hardware (CPUs, memory, and storage devices) where possible, both to speed up the process and to assure integrity.
4) store data files with built-in keys for each record, and perhaps even a checksum of some type for each column (and perhaps even across sheets) of a large spreadsheet file. That would pinpoint any errant data fields quite simply. If nothing else, one could use older data files that potentially were not modified to see if a valid value can be found to restore the information.
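Point 4 in the list above, per-record keys plus a checksum field, might be sketched like this in Python (the field separator and the use of md5 are my assumptions, not a prescribed format):

```python
import hashlib

def seal_record(fields):
    """Append an md5 check field computed over the record's columns,
    so later reads can detect a silently corrupted field."""
    digest = hashlib.md5("|".join(fields).encode()).hexdigest()
    return fields + [digest]

def verify_record(sealed):
    """Recompute the digest over the data columns and compare it
    against the stored check field."""
    *fields, digest = sealed
    return hashlib.md5("|".join(fields).encode()).hexdigest() == digest
```

A mismatch points straight at the bad record, and an older unmodified copy of the file can then supply the valid value.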
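Point 2, reverse-operating a function, could look roughly like this. I use sqrt because, as noted under 2a, sines and tangents are much harder to invert cleanly; the tolerance value here is an assumption on my part:

```python
import math

def checked_sqrt(x):
    """Compute sqrt, then reverse the operation (squaring) and confirm
    the result lands back on the input within rounding tolerance."""
    r = math.sqrt(x)
    if abs(r * r - x) > 1e-9 * max(1.0, abs(x)):
        raise ArithmeticError("sqrt self-check failed: possible fault")
    return r
```

The same check also catches errant code during development, since a buggy forward computation rarely survives its own reverse operation.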
There may be a way to make about any program function with more integrity by having the OS, or another functional shell environment, run redundant copies of the same program (even on the same machine, though several machines might be even better).
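Within a single machine, that redundant-copies idea can at least be approximated by running the same computation several times and majority-voting on the results; a rough sketch (three runs is my arbitrary choice, not a requirement):

```python
from collections import Counter

def vote(results):
    """Return the majority value among redundant results, or raise
    when no value wins a clear majority."""
    winner, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority: redundant runs disagree")
    return winner

def redundant_call(fn, arg, runs=3):
    """Run fn several times and vote on the answers."""
    return vote([fn(arg) for _ in range(runs)])
```

Running the copies on diverse CPUs or machines, as in point 3, would make the votes more independent than repeated runs on one box.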
If you want me to suggest a name for such processing, how about 'srock', for solid rock, or 'shiroc' for System Health and Integrity by Reflective Computations.
Should anyone have direction I might take to progress to that end I'll sure listen. I keep trying to get into more programming myself. I do enjoy trying, but time and finances are not much in my favor presently.
I definitely agree with your sentiment: software needs to be able to compensate for hardware deficiencies. As you rightly noted, this is not quite so easily done as said.
Regarding the implementation of safeguards in programming, this would depend, of course, upon which platform the application is designed to operate on. In a *NIX (UNIX, Linux, etc.) environment, one has access to a number of comparison and integrity utilities (most importantly diff, grep, and md5sum) and a wealth of knowledge gathered by other users with similar problems. I would suggest researching database programming, especially SQL, as many modern database templates and guides include a number of built-in integrity safeguards.
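As one concrete example of the built-in integrity safeguards mentioned above, SQLite (which ships with Python) can self-check the structure of a database file; this is just an illustration, with made-up table contents:

```python
import sqlite3

# An in-memory database stands in for a real records file here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plots (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
conn.execute("INSERT INTO plots (name) VALUES ('Section A, Lot 4')")
conn.commit()

# SQLite's built-in structural self-check reports 'ok' on a sound file.
status = conn.execute("PRAGMA integrity_check").fetchone()[0]
```

On a real on-disk database, running that pragma periodically gives the kind of hardware-fault detection discussed in the original post without writing the checking code yourself.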
Hi,
I have used md5sum quite a bit. I've yet to conquer the technique of working it over the network, particularly to a Windows machine. I use the 'dvdsig' freeware quite a bit there. I wish their outputs were compatible. 'diff' has come in very handy at times also. 'grep' seems to have evaded me; I'll have to look into it.
I guess what I'm really wanting [(parallel) redundancy and/or checking] will be a long way down the road for the most part, the way it looks. The hardware should be able to do it these days without much sign of slowing things down, if properly configured, as I see it. So that is not a limiting factor. Something of a standardized framework to build upon may be a big concern for people who have to work in concert to develop things. I'm all for the endeavor, in any case.