

WildRhino

Junior Member
There is a graphics demo on this site

http://chillblast.netfirms.com

under news for Saturday (4th paragraph, I think), and it says that it is only 64k, but the program lasts about 10 minutes. Can someone explain what is going on, and how they get so much into such a small program? It needs DirectX 8, and it is well worth the 10-second d/l >8}
 
Well, if you play it to the end, they explain how much data is compressed into 64K. It's probably some kind of fractal/recursive compression. According to their claims, they achieve a 30,000:1 ratio.
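To give a feel for how that kind of ratio is even possible: a lot of it isn't compression in the usual sense, it's generating the data from formulas at runtime, so the textures and geometry never have to be stored at all. Here's a rough toy sketch (my own example, not the demo's actual code) that produces a 64 KiB plasma texture from about ten lines of Python:

```python
import math

def plasma_pixel(x, y):
    """Classic plasma value in [0, 255], computed from coordinates alone."""
    v = (math.sin(x / 16.0)
         + math.sin(y / 8.0)
         + math.sin((x + y) / 16.0)
         + math.sin(math.sqrt(x * x + y * y) / 8.0))
    return int((v + 4) / 8 * 255)  # map [-4, 4] onto [0, 255]

def generate_texture(width, height):
    """Produce width*height bytes of image data from tiny code."""
    return bytes(plasma_pixel(x, y)
                 for y in range(height)
                 for x in range(width))

texture = generate_texture(256, 256)  # 64 KiB of pixels, ~0 bytes stored
print(len(texture))                   # 65536
```

The executable only has to ship the formula, not the 64 KiB result, which is why the "compression ratio" of these demos can look impossibly high.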

 
Does this mean that all software in the future is going to be greatly reduced in size, thereby reducing the amount of storage needed?
 
In order to produce such a compact executable, the code and data need to be hand-coded and optimized, and it takes quite a while to get such results. The compression algorithms themselves still need improvement too, because AFAIK each type of algorithm only works well with certain types of data, and there are a lot of constraints. MP3 vs. WAV is a good example: it takes time to do it right. I still remember when the technology first came out, a lot of people wouldn't believe it was actually possible to achieve 10:1 lossy compression on audio, but now everyone accepts it as a standard.
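The "one algorithm only suits certain data" point is easy to demonstrate with the simplest compressor there is, run-length encoding (a toy example of mine, not anything the demo uses): it crushes data full of repeated runs but actually inflates data with no repetition.

```python
def rle_encode(data: bytes) -> bytes:
    """Run-length encoding as (count, value) pairs.
    Great for long runs, terrible for run-free data."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        # count identical bytes, capped at 255 so the count fits in one byte
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

smooth = b"\x00" * 500 + b"\xff" * 500  # run-heavy data: compresses well
noisy = bytes(range(256)) * 4           # no adjacent repeats at all

print(len(rle_encode(smooth)))  # 8    -> 125:1 ratio
print(len(rle_encode(noisy)))   # 2048 -> twice the original size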

But those demos demonstrate that we are much further behind on the software curve than the hardware one. If you gathered those scene people and had them write a complete OS package, it'd probably be 10x faster and 10x smaller... (just guessing)

 