
100 to 1 Compressor

Started by Sergiu FUNIERU, February 23, 2010, 11:15:06 PM

Sergiu FUNIERU

I am working on an algorithm that will compress every file by at least 100 to 1, no matter what OTHER algorithm was used to compress it before.

I contacted the creator of a well-known commercial archive manager, and he told me that he's not interested in better algorithms. That got me thinking: is this a dead end? Is there no interest in compression algorithms? I know that I received a "No, thanks" answer from only one person, but I think highly of that person and his work.

dedndave

100 to 1 isn't very likely - not without data loss
of course, it depends on what the file has in it
but i seriously doubt any routine can maintain that on average

long ago, i wrote a program that would usually compress to about 65% or so
it took forever - about 3 minutes for 64 kb on a 200 MHz Pentium MMX
it took so long because it scanned forward, looking for the most efficient method to use (about a dozen different methods)
it decompressed very fast, though   :P

i could probably write it faster, now - i may dig it out when i get caught up (like, the end of summer)
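
A minimal sketch of that "try several methods per block, keep the smallest" structure, in C rather than asm for brevity - the two encoders here (raw store and run-length) are placeholders of my own, not dedndave's actual dozen:

Code:
#include <stddef.h>
#include <string.h>

#define BLOCK 256   /* process the input in small blocks */

typedef size_t (*encoder_fn)(const unsigned char *in, size_t n,
                             unsigned char *out, size_t cap);

/* method 0: store the block verbatim - never compresses, never fails */
static size_t enc_raw(const unsigned char *in, size_t n,
                      unsigned char *out, size_t cap)
{
    if (n > cap) return (size_t)-1;
    memcpy(out, in, n);
    return n;
}

/* method 1: byte run-length encoding - only wins on repeated bytes */
static size_t enc_rle(const unsigned char *in, size_t n,
                      unsigned char *out, size_t cap)
{
    size_t i = 0, o = 0;
    while (i < n) {
        size_t run = 1;
        while (i + run < n && in[i + run] == in[i] && run < 255)
            run++;
        if (o + 2 > cap) return (size_t)-1;   /* output grew past cap */
        out[o++] = (unsigned char)run;
        out[o++] = in[i];
        i += run;
    }
    return o;
}

/* scan the block with every method and keep the shortest result;
   a one-byte method tag tells the decompressor what to undo, which
   is why decompression is fast - it never has to search */
size_t compress_block(const unsigned char *in, size_t n,
                      unsigned char *out, size_t cap)
{
    static const encoder_fn methods[] = { enc_raw, enc_rle };
    unsigned char tmp[2 * BLOCK], best[2 * BLOCK];
    size_t best_len = (size_t)-1;
    unsigned best_m = 0;

    for (unsigned m = 0; m < 2; m++) {
        size_t len = methods[m](in, n, tmp, sizeof tmp);
        if (len < best_len) {
            best_len = len;
            best_m = m;
            memcpy(best, tmp, len);
        }
    }
    if (best_len == (size_t)-1 || best_len + 1 > cap)
        return (size_t)-1;
    out[0] = (unsigned char)best_m;   /* method tag for the decoder */
    memcpy(out + 1, best, best_len);
    return best_len + 1;
}

All the searching happens at compression time; decompression is a single dispatch on the tag byte plus a linear pass, which matches the slow-in, fast-out behaviour described above.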

Sergiu FUNIERU

Quote from: dedndave on February 23, 2010, 11:55:37 PM
100 to 1 isn't very likely - not without data loss
of course, it depends on what the file has in it
but i seriously doubt any routine can maintain that on average
Not on average - EVERY time. Without any data loss. The algorithm doesn't contradict Shannon entropy.
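
(For reference, since Shannon entropy is invoked: the source coding theorem says a lossless code needs, on average, at least

H(X) = -\sum_x p(x) \log_2 p(x)

bits per symbol. A guaranteed 100:1 ratio on 8-bit bytes would require every input to carry at most 0.08 bits of entropy per byte - and already-compressed files sit close to 8.)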

dedndave

well, i am sure we'd all like to see it in action   :bg

Sergiu FUNIERU

Quote from: dedndave on February 24, 2010, 12:02:16 AM
well, i am sure we'd all like to see it in action   :bg
So do I.

Right now, my biggest problem is that I don't know how to create a patent for it.

dedndave

patents are usually devices that tell the competitor how you did it
they can quite often be overcome by minor improvements
but, i know a little about math and i can tell you that, unless the file is filled with repeat bytes, it isn't possible
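
(Spelled out, the counting argument behind this: a lossless compressor must map different inputs to different outputs. There are 256^n distinct n-byte files, but only

\sum_{k=0}^{\lfloor n/100 \rfloor} 256^k < 2 \cdot 256^{n/100}

files short enough to be a 100:1 output. Since 256^n is astronomically larger, two inputs would have to share one compressed output, and whichever of them the decompressor doesn't return is lost.)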

Sergiu FUNIERU

Quote from: dedndave on February 24, 2010, 12:10:18 AM
patents are usually devices that tell the competitor how you did it
they can quite often be overcome by minor improvements
A patent is the best protection I know of. Do you know a better method?



Quote from: dedndave on February 24, 2010, 12:10:18 AM
but, i know a little about math and i can tell you that, unless the file is filled with repeat bytes, it isn't possible
I can't prove my point without revealing my method.

BlackVortex

You should patent that idea of yours as soon as possible - this is groundbreaking!

Ficko

Quote
"Not on average - EVERY time. Without any data loss...."

That's a very bold statement. :lol

That would mean I give you 100 bytes and you compress it to 1 without losing the information? :dazzled:

Quote
... and he told me that he's not interested in better algorithms

Don't take it wrong, but I would turn you down as well for offering me such a program, just like I would turn down anybody else trying to sell me a perpetuum mobile. :toothy

ecube

heh, this guy's funny, be gentle with him. :bg

dedndave

it's cool - we all have our theories that go sour
i was raised on a farm
so, i learned early that if you try to put 10 pounds of shit in a 5 pound sack, you wind up with a bit of a mess   :P
i think it is pretty cool that you can squeeze the air out of it and sometimes get 10 pounds into a 6.5 pound sack
nowadays they call it "information theory"
as simple farmers, we just called it a "shitty mess"

oex

You could win the remaining Hutter Prize with that http://prize.hutter1.net/

Sergiu FUNIERU

Quote from: Ficko on February 24, 2010, 08:45:16 AM
That would mean I give you 100 bytes and you compress it to 1 without losing the information?
No, I can't do that. But I can compress 100 MB to 1 MB, or 100 KB to 1 KB. I might go under 100 KB for the lowest limit, but I'm not sure it's worth the effort.
My algorithm would be best for large data compression, like storing two Blu-ray discs on one regular CD.

Many people try to perfect the existing algorithms to squeeze out one more bit. My method is a totally different approach.

Quote from: Ficko on February 24, 2010, 08:45:16 AM
Don't take it wrong, but I would turn you down as well for offering me such a program, just like I would turn down anybody else trying to sell me a perpetuum mobile. :toothy
It's not a perpetuum mobile. It's like trying to sell someone nuclear energy technology. If that person strongly believes that classic energy is the best way to go, I don't have any chance of persuading that person.

At first, I was upset that the person didn't even want to hear the idea. Maybe it's better this way. I would have given him my idea for free at that time - to a person who doesn't value its potential.

I know some people who still don't believe that lossless compression is possible. Their question is "How do I get back the information I removed when I compressed?" They simply don't believe that some information is redundant and that the compressor uses that fact.
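
(A concrete answer to that question, as a minimal round-trip sketch in C: run-length encoding is a toy lossless compressor that removes redundancy - here, repeated bytes - yet loses nothing, because the decoder rebuilds the input exactly. The sample data is made up.)

Code:
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* encode: each run of equal bytes becomes a (count, byte) pair */
static size_t rle_encode(const unsigned char *in, size_t n, unsigned char *out)
{
    size_t i = 0, o = 0;
    while (i < n) {
        size_t run = 1;
        while (i + run < n && in[i + run] == in[i] && run < 255)
            run++;
        out[o++] = (unsigned char)run;
        out[o++] = in[i];
        i += run;
    }
    return o;
}

/* decode: expand each (count, byte) pair back into the original run */
static size_t rle_decode(const unsigned char *in, size_t n, unsigned char *out)
{
    size_t i = 0, o = 0;
    while (i + 1 < n) {
        memset(out + o, in[i + 1], in[i]);
        o += in[i];
        i += 2;
    }
    return o;
}

int main(void)
{
    unsigned char data[] = "aaaaaaaaaabbbbbbbbbbcccccccccc";
    unsigned char packed[64], restored[64];
    size_t plen = rle_encode(data, sizeof data - 1, packed);
    size_t rlen = rle_decode(packed, plen, restored);

    assert(rlen == sizeof data - 1);              /* same length back */
    assert(memcmp(data, restored, rlen) == 0);    /* bit-exact recovery */
    printf("%u bytes -> %u bytes and back, nothing lost\n",
           (unsigned)(sizeof data - 1), (unsigned)plen);
    return 0;
}

This prints "30 bytes -> 6 bytes and back, nothing lost"; the "removed" information was never information at all, only redundancy.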

oex

Compression algorithms hit real limitations when put into practice - you should write your algorithm in asm and test it on various random sample data before you make too many wild claims.... What might seem obvious right now will soon find boundaries once you put pen to paper.

If you do indeed get it to work, only then is it worth worrying about patents.
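
(A quick version of that random-data test, sketched in C rather than asm for brevity: generate random bytes and measure their empirical entropy. Compile with -lm.)

Code:
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    enum { N = 1 << 20 };               /* 1 MB of test data */
    static long freq[256];
    double h = 0.0;

    srand(12345);                       /* fixed seed: repeatable test */
    for (long i = 0; i < N; i++)
        freq[rand() & 0xFF]++;          /* histogram of byte values */

    for (int s = 0; s < 256; s++) {     /* Shannon entropy of the sample */
        if (freq[s] == 0) continue;
        double p = (double)freq[s] / N;
        h -= p * log2(p);
    }
    printf("empirical entropy: %.4f bits/byte\n", h);
    printf("best possible lossless ratio: %.3f to 1\n", 8.0 / h);
    return 0;
}

On random input the entropy should come out a hair under 8.0 bits/byte, i.e. a best possible ratio of about 1:1 - which is why "compress EVERY file 100:1" and "test on random data" cannot both survive.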

Sergiu FUNIERU

Quote from: oex on February 24, 2010, 12:39:45 PM
You could win the remaining Hutter Prize with that http://prize.hutter1.net/
Wow!

Thank you so much for telling me! I didn't know of such a contest, but it looks very interesting. I will read the conditions to see if I have to give them the algorithm in exchange for the prize.