Although this isn't really an issue anymore since the advent of things like the HIMEM.SYS XMS driver for the 286 (including the PC XT 286), and DOS/4GW and EMM386 for the 386SX and beyond, up to the point where memory management was built into Windows (and BSD, Linux, and the other Unixes always had it)... I still wonder:
Why wasn't data compression used to overcome the 640K limit?
Couldn't we have compressed the contents of RAM by at least 25% on a good day, when most of what was loaded into it was EXE and COM code and device driver data?
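(If anyone wants to sanity-check that 25% figure, here's a rough sketch: feed it a few old EXE/COM files, or any binaries you have lying around, and see how much a modern deflate pass shaves off. Obviously zlib is way heavier than anything an 8088 could have run in real time, so treat it as an upper bound, not what a period-correct scheme would actually have managed.)

    import sys
    import zlib

    # Rough sanity check of the "25% on a good day" figure: deflate each
    # file given on the command line and report how much smaller it gets.
    for path in sys.argv[1:]:
        with open(path, "rb") as f:
            data = f.read()
        if not data:
            continue  # skip empty files to avoid dividing by zero
        packed = zlib.compress(data, level=9)
        ratio = 100.0 * (1 - len(packed) / len(data))
        print(f"{path}: {len(data)} -> {len(packed)} bytes ({ratio:.1f}% smaller)")

Run it as something like "python ratio.py GAME.EXE DRIVER.SYS" on whatever old binaries you still have around.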
Do you think things would have been done differently, or would it have made a big difference to you back then (or now) to have 800K or even 1MB of effective memory without a memory expansion board or external RAM? Especially for device drivers and program stack and data?
Would assembly-language routines and ROMs for IBM PCs that included hardware compression instructions have been a game changer for the industry?
I'm not sure how many will know what I'm saying or appreciate what it would have meant if their first systems were 386 or 486 computers (since those already had hardware memory management for anywhere from 4MB to 64MB of RAM, depending on the operating system they were running), but those of you who used a 286, an MSX, a Commodore 64/128, an Atari computer, a Tandy Color Computer / TRS-80, or a similar system from years ago will. Most of those had 512K, 256K, 128K, or even less for conventional memory, to the point where you really had to make sure everything fit under 64K most of the time.
Do you think using compression for this back then would have made a difference in what we do now, how we do it, and in computer history today?