Why can't a universal program write drivers based on the actual hardware it sees?
If it runs a byte through every little *bit* of hardware, couldn't it reverse-engineer a hardware driver? Or would that involve too much factoring and permutation?
I'm thinking the computer could write a hardware driver since it knows the input to and the output from the hardware... it would just have to simulate what happens in between. Say the computer knew where it was trying to get (say, a network device or a sound card), knew the results it was getting from doing thus and such, and knew (because we preprogrammed it how to respond) what to put in the next line of code. Wouldn't we eventually get a working driver?
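To make the idea concrete, here's a toy sketch of that probe-and-record loop. The `mystery_device` function is a made-up stand-in for real hardware (an assumption on my part, since real devices have internal state and timing that a pure function can't capture): we run a byte through every possible input, record what comes back, and generate a "driver" that is just a lookup table of the observations.

```python
def mystery_device(command: int) -> int:
    # Pretend hardware: some fixed but unknown response to each command byte.
    return (command * 37 + 11) % 256

def probe_device(device) -> dict[int, int]:
    # Run a byte through every possible input and record the response.
    return {cmd: device(cmd) for cmd in range(256)}

def build_driver(observations: dict[int, int]):
    # The "written driver": just replays the learned input -> output mapping.
    def driver(command: int) -> int:
        return observations[command]
    return driver

observations = probe_device(mystery_device)
driver = build_driver(observations)
```

The catch is that this only works if the device is stateless, so each input always gives the same output. The moment the hardware has internal state, the number of input *sequences* to try explodes, which may be exactly the "too much factoring and permutation" problem.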
Of course, if we could do it, it probably would have been done already. Just a thought, though...
If it is possible, why don't we write a really robust driver-of-all-drivers that can reverse-compile a driver from observed behavior, and then get IBM to put one of their supercomputers on writing drivers for really old ISA cards? If there were a learning mechanism, so that when it found that thus-and-such a result most of the time meant *this* was happening in the hardware, it could catch those recurrences and begin to expect them. Eventually we would have a computer that was very fast at guessing how the newest sound card or NIC should work.
The software could be tested against known hardware and drivers, and then we could compare and tweak things until we had a near-perfect driver writer. As long as the program could learn, I don't see why this wouldn't work. One last option would be for the computer to analyze a 3D X-ray of the device, just to give it another frame of reference.
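That "learning mechanism" could start as something as simple as counting co-occurrences: if a given result usually shows up alongside a particular hardware behavior, begin to expect that behavior. This is a minimal sketch, and the result codes and labels in it are made up for illustration.

```python
from collections import Counter, defaultdict
from typing import Optional

class RecurrenceLearner:
    """Counts which hardware behavior each result usually coincides with."""

    def __init__(self) -> None:
        self.counts: defaultdict[int, Counter] = defaultdict(Counter)

    def observe(self, result: int, meaning: str) -> None:
        # Record that this result coincided with this hardware behavior.
        self.counts[result][meaning] += 1

    def expect(self, result: int) -> Optional[str]:
        # Guess the meaning seen most often for this result so far.
        if not self.counts[result]:
            return None
        return self.counts[result].most_common(1)[0][0]

learner = RecurrenceLearner()
for _ in range(3):
    learner.observe(0x10, "buffer ready")
learner.observe(0x10, "error")
```

After those observations, `learner.expect(0x10)` guesses "buffer ready", since that's what result `0x10` meant most of the time. Real driver inference would need far more than frequency counting, but this is the shape of "catching recurrences and beginning to expect them."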
Of course, I'm only a TI-83+ programmer, so I have no idea if this would be possible.