
### New form of recursive compression tested and proven. Now how do I market it?


I have been using this technology for the last month and compressed my
entire hard drive into a small file less than a megabyte.

So my question is.....what now? I want to patent it, but the cost is
daunting. Is patent pending really enough protection?


ffkfsoxfdzgjez@mailinator.com writes:
> I have been using this technology for the last month and compressed my
> entire hard drive into a small file less than a megabyte.
>
> So my question is.....what now? I want to patent it, but the cost is
> daunting. Is patent pending really enough protection?

Don't bother unless you can decompress that small file and recover the
original contents of your hard drive.

--
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Nokia
"We must do something.  This is something.  Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"


On May 29, 5:52 pm, Keith Thompson <ks...@mib.org> wrote:
> ffkfsoxfdzg...@mailinator.com writes:
> > I have been using this technology for the last month and compressed my
> > entire hard drive into a small file less than a megabyte.
>
> > So my question is.....what now? I want to patent it, but the cost is
> > daunting. Is patent pending really enough protection?
>
> Don't bother unless you can decompress that small file and recover the
> original contents of your hard drive.
>
> --
> Keith Thompson (The_Other_Keith) ks...@mib.org  <http://www.ghoti.net/~kst>
> Nokia
> "We must do something.  This is something.  Therefore, we must do this."
>     -- Antony Jay and Jonathan Lynn, "Yes Minister"



ffkfsoxfdzgjez@mailinator.com wrote:
> I have been using this technology for the last month and compressed my
> entire hard drive into a small file less than a megabyte.
>
> So my question is.....what now?

I suggest you try to restore your data from a backup - since it is lost by now.

> I want to patent it, but the cost is
> daunting. Is patent pending really enough protection?

Patent offices don't patent scams.

So long,
Thomas


On May 30, 2:06 am, Thomas Richter <t...@math.tu-berlin.de> wrote:
> ffkfsoxfdzg...@mailinator.com wrote:
>
> > I have been using this technology for the last month and compressed my
> > entire hard drive into a small file less than a megabyte.
>
> > So my question is.....what now?
>
> I suggest you try to restore your data from a backup - since it is lost by now.
>
>  > I want to patent it, but the cost is
>
> > daunting. Is patent pending really enough protection?
>
> Patent offices don't patent scams.
>
> So long,
>         Thomas

I have a WORKING prototype. I use it to back up and restore files on my
hard drive now.

I'm really just looking for patent advice because I'm not really sure
what the best way to proceed with this tech is.


<ffkfsoxfdzgjez@mailinator.com> wrote in message
> On May 30, 2:06 am, Thomas Richter <t...@math.tu-berlin.de> wrote:
>> ffkfsoxfdzg...@mailinator.com wrote:
>>
>> > I have been using this technology for the last month and compressed my
>> > entire hard drive into a small file less than a megabyte.
>>
>> > So my question is.....what now?
>>
>> I suggest you try to restore your data from a backup - since it is lost by
>> now.
>>
>>  > I want to patent it, but the cost is
>>
>> > daunting. Is patent pending really enough protection?
>>
>> Patent offices don't patent scams.
>>
>> So long,
>>         Thomas
>
> I have a WORKING prototype. I use it to back up and restore files on my
> hard drive now.
>
> I'm really just looking for patent advice because I'm not really sure
> what the best way to proceed with this tech is.

If you're in the US, then get free advice regarding that from www.score.org.

Paul


On May 29, 6:32 pm, ffkfsoxfdzg...@mailinator.com wrote:
> I have been using this technology for the last month and compressed my
> entire hard drive into a small file less than a megabyte.
>
> So my question is.....what now? I want to patent it, but the cost is
> daunting. Is patent pending really enough protection?

Before you patent it, you must prove the technology really works.
Which it doesn't.  Unless your hard drive is filled with a single
value, I can guarantee you that you haven't compressed your entire
hard drive down to less than a megabyte.

Here is a simple way to test your process without giving up any of
your technology:

1. You (inventor) provide a decompressor.
2. We (testers) send you a file to compress.
3. You run the file through your compressor and send us the resulting
compressed output.
4. We run the compressed output through the decompressor and compare
to the original file we sent you.

- If the files match, your decompression system works.
- If the compressed file is smaller than the original file, then your
compression system works.
- If the size of the compressed file + the size of the decompression
system is smaller than the original file, then you have achieved
actual compression of the data.

I am willing to help you test your system; you can email me if you
like.
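The four-step protocol above maps directly onto a small verification harness. The sketch below is illustrative only: `compress`/`decompress` are hypothetical stand-ins for the inventor's unseen tools (zlib is used purely so the code runs), not the claimed method.

```python
import zlib

# Hypothetical stand-ins for the inventor's tools. zlib is used here
# only so the sketch runs; it is NOT the claimed recursive method.
def compress(data: bytes) -> bytes:
    return zlib.compress(data)

def decompress(data: bytes) -> bytes:
    return zlib.decompress(data)

def verify(original: bytes, decompressor_size: int) -> dict:
    """Run the tester's three checks from the protocol above."""
    compressed = compress(original)
    restored = decompress(compressed)
    return {
        # 1. Round trip recovers the file exactly -> lossless.
        "lossless": restored == original,
        # 2. Compressed file is smaller than the original.
        "smaller": len(compressed) < len(original),
        # 3. Compressed output plus the decompressor itself still
        #    beats the original -> actual compression achieved.
        "net_gain": len(compressed) + decompressor_size < len(original),
    }

# Highly repetitive input, so even zlib passes all three checks here.
result = verify(b"abracadabra " * 1000, decompressor_size=100)
```

The point of check 3 is that a "compressor" cannot hide the original data inside the decompressor: the decompressor's own size counts against the claimed savings.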


On May 29, 7:32 pm, ffkfsoxfdzg...@mailinator.com wrote:
> I have been using this technology for the last month and compressed my
> entire hard drive into a small file less than a megabyte.

First, you need it tested.  Jim Leonard has posted the procedure; it
could be run on Haiku/BeOS/Windows XP/Windows Vista/Linux machines,
and I know local testers as well.

Without independent testing no-one is going to believe your claims.


On Jun 1, 8:47 am, Jim Leonard <MobyGa...@gmail.com> wrote:

> Before you patent it, you must prove the technology really works.
> Which it doesn't.  Unless your hard drive is filled with a single
> value, I can guarantee you that you haven't compressed your entire
> hard drive down to less than a megabyte.
>
> Here is a simple way to test your process without giving up any of
> your technology:
>
> 1. You (inventor) provide a decompressor.
> 2. We (testers) send you a file to compress.
> 3. You run the file through your compressor and send us the resulting
> compressed output.
> 4. We run the compressed output through the decompressor and compare
> to the original file we sent you.
>
> - If the files match, your decompression system works.
> - If the compressed file is smaller than the original file, then your
> compression system works.
> - If the size of the compressed file + the size of the decompression
> system is smaller than the original file, then you have achieved
> actual compression of the data.
>
> I am willing to help you test your system; you can email me if you
> like.

I have thought of doing something just like this. The worry I have
though is with potential reverse engineering of the decompressor. I'm
having a rough time finding a way to show my product's ability, while
still keeping it protected.



<ffkfsoxfdzgjez@mailinator.com> wrote in message
> On Jun 1, 8:47 am, Jim Leonard <MobyGa...@gmail.com> wrote:
>
>> Before you patent it, you must prove the technology really works.
>> Which it doesn't.  Unless your hard drive is filled with a single
>> value, I can guarantee you that you haven't compressed your entire
>> hard drive down to less than a megabyte.
>>
>> Here is a simple way to test your process without giving up any of
>> your technology:
>>
>> 1. You (inventor) provide a decompressor.
>> 2. We (testers) send you a file to compress.
>> 3. You run the file through your compressor and send us the resulting
>> compressed output.
>> 4. We run the compressed output through the decompressor and compare
>> to the original file we sent you.
>>
>> - If the files match, your decompression system works.
>> - If the compressed file is smaller than the original file, then your
>> compression system works.
>> - If the size of the compressed file + the size of the decompression
>> system is smaller than the original file, then you have achieved
>> actual compression of the data.
>>
>> I am willing to help you test your system; you can email me if you
>> like.
>
> I have thought of doing something just like this. The worry I have
> though is with potential reverse engineering of the decompressor. I'm
> having a rough time finding a way to show my product's ability, while
> still keeping it protected.

Exactly!  That is why you should never prove anything to these guys.  They
want proof - then let them wait for it until after you get a patent.

Paul


On Jun 2, 7:15 am, ffkfsoxfdzg...@mailinator.com wrote:
> On Jun 1, 8:47 am, Jim Leonard <MobyGa...@gmail.com> wrote:
>
> > Before you patent it, you must prove the technology really works.
> > Which it doesn't.  Unless your hard drive is filled with a single
> > value, I can guarantee you that you haven't compressed your entire
> > hard drive down to less than a megabyte.
>
> > Here is a simple way to test your process without giving up any of
> > your technology:
>
> > 1. You (inventor) provide a decompressor.
> > 2. We (testers) send you a file to compress.
> > 3. You run the file through your compressor and send us the resulting
> > compressed output.
> > 4. We run the compressed output through the decompressor and compare
> > to the original file we sent you.
>
> > - If the files match, your decompression system works.
> > - If the compressed file is smaller than the original file, then your
> > compression system works.
> > - If the size of the compressed file + the size of the decompression
> > system is smaller than the original file, then you have achieved
> > actual compression of the data.
>
> > I am willing to help you test your system; you can email me if you
> > like.
>
> I have thought of doing something just like this. The worry I have
> though is with potential reverse engineering of the decompressor. I'm
> having a rough time finding a way to show my product's ability, while
> still keeping it protected.

Yes - no Java (Java has decompilers), no C (there are lots of assembly
hackers). You should write your program in Erlang or Fortran or COBOL,
and encrypt it with a custom UPX packer with the identity header
removed, then use a program wrapper around it together with a USB
token. Possibly demand a DAT drive, and define one (no random access)
and record your tape activity. And refresh the BIOS memory and unplug
the RAM after the test.

Get what I mean?

Regards,
Fibonacci



On Jun 1, 7:15 pm, ffkfsoxfdzg...@mailinator.com wrote:
> On Jun 1, 8:47 am, Jim Leonard <MobyGa...@gmail.com> wrote:
>
> > Before you patent it, you must prove the technology really works.
> > Which it doesn't.  Unless your hard drive is filled with a single
> > value, I can guarantee you that you haven't compressed your entire
> > hard drive down to less than a megabyte.
>
> > Here is a simple way to test your process without giving up any of
> > your technology:
>
> > 1. You (inventor) provide a decompressor.
> > 2. We (testers) send you a file to compress.
> > 3. You run the file through your compressor and send us the resulting
> > compressed output.
> > 4. We run the compressed output through the decompressor and compare
> > to the original file we sent you.
>
> > - If the files match, your decompression system works.
> > - If the compressed file is smaller than the original file, then your
> > compression system works.
> > - If the size of the compressed file + the size of the decompression
> > system is smaller than the original file, then you have achieved
> > actual compression of the data.
>
> > I am willing to help you test your system; you can email me if you
> > like.
>
> I'm
> having a rough time finding a way to show my product's ability, while
> still keeping it protected.

Is that because it doesn't actually work?

I don't get you trolls, like it's an obvious gag, so why do you keep
perpetuating this nonsense?  An n-bit string cannot have 2^{n+1}
possible meanings, just by a simple argument of counting.  Yet you
insist on trolling usenet for laughs or whatever about your amazing
compression idea that you know that we know can't possibly exist.

People like you seriously just need to get a life.  There is an entire
world outside of usenet, and I suggest you go explore it before it's
too late.

Tom


On 2 Jun., 12:15, t...@iahu.ca wrote:
> On Jun 1, 7:15 pm, ffkfsoxfdzg...@mailinator.com wrote:
>
> > I'm
> > having a rough time finding a way to show my product's ability, while
> > still keeping it protected.
>
> Is that because it doesn't actually work?
>
> I don't get you trolls, like it's an obvious gag, so why do you keep
> perpetuating this nonsense?

I think the most likely explanation is incompetence. Many of those
random-compression-posters probably believe in their algorithms and
this belief is sometimes so strong that they think they don't need any
testing (decompressor).
http://en.wikipedia.org/wiki/Dunning-Kruger_effect
http://en.wikipedia.org/wiki/Hanlon%27s_razor

Cheers!
SG


On Jun 2, 3:15 am, t...@iahu.ca wrote:
> On Jun 1, 7:15 pm, ffkfsoxfdzg...@mailinator.com wrote:
>
> > On Jun 1, 8:47 am, Jim Leonard <MobyGa...@gmail.com> wrote:
>
> > > Before you patent it, you must prove the technology really works.
> > > Which it doesn't.  Unless your hard drive is filled with a single
> > > value, I can guarantee you that you haven't compressed your entire
> > > hard drive down to less than a megabyte.
>
> > > Here is a simple way to test your process without giving up any of
> > > your technology:
>
> > > 1. You (inventor) provide a decompressor.
> > > 2. We (testers) send you a file to compress.
> > > 3. You run the file through your compressor and send us the resulting
> > > compressed output.
> > > 4. We run the compressed output through the decompressor and compare
> > > to the original file we sent you.
>
> > > - If the files match, your decompression system works.
> > > - If the compressed file is smaller than the original file, then your
> > > compression system works.
> > > - If the size of the compressed file + the size of the decompression
> > > system is smaller than the original file, then you have achieved
> > > actual compression of the data.
>
> > > I am willing to help you test your system; you can email me if you
> > > like.
>
> > I'm
> > having a rough time finding a way to show my product's ability, while
> > still keeping it protected.
>
> Is that because it doesn't actually work?
>
> I don't get you trolls, like it's an obvious gag, so why do you keep
> perpetuating this nonsense?  An n-bit string cannot have 2^{n+1}
> possible meanings, just by a simple argument of counting.  Yet you
> insist on trolling usenet for laughs or whatever about your amazing
> compression idea that you know that we know can't possibly exist.
>
> People like you seriously just need to get a life.  There is an entire
> world outside of usenet, and I suggest you go explore it before it's
> too late.
>
> Tom

I understand the history of this type of claim, and why you think it's
untrue, but it really has been accomplished. I'm looking for advice on
how to show that it's real, while keeping it protected.


On Jun 1, 6:15 pm, ffkfsoxfdzg...@mailinator.com wrote:
>
> I have thought of doing something just like this. The worry I have
> though is with potential reverse engineering of the decompressor. I'm
> having a rough time finding a way to show my product's ability, while
> still keeping it protected.

If you are worried that reverse-engineering of the decompressor will

There is mathematical proof that your method does not work for all
inputs (it may work for *some* inputs, but not all).  The purpose of
the procedure I posted is not only to verify your claims, but also to
expose flaws in the method that may not be obvious to you because you
are too close to it.

I am willing to test your method for you.  You supply me the
decompressor first, I will then supply you a file to compress second,
you send me back the resulting output third, and I will decompress it
and compare it to the original.  I will sign any NDA you deem
necessary if you're worried (but believe me, you have nothing to be
worried about).

Most of the time I have helped people do this, they go ahead and do
the decompression/compare themselves, find it doesn't work, and
finally realize the flaw in their concept.


ffkfsoxfdzgjez@mailinator.com wrote:
) I understand the history of this type of claim, and why you think it's
) untrue, but it really has been accomplished. I'm looking for advice on
) how to show that it's real, while keeping it protected.

Step one: Point out where the flaw is in the Counting Theorem.

You know, the mathematical proof that any given compressor
can only compress one in every 2^n possible inputs by n bits.

SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT
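The counting theorem Willem cites can be made concrete with a minimal sketch: of the 2^m bit strings of length m, at most one in every 2^n can shrink by n bits, simply because only 2^(m-n) strings of the shorter length exist. The function name below is an illustration, not from the thread.

```python
# Willem's counting theorem, checked for small sizes: of the 2**m bit
# strings of length m, at most one in every 2**n can be compressed by
# n bits, because only 2**(m - n) strings of that shorter length exist.
def fraction_compressible_by(m: int, n: int) -> float:
    inputs = 2 ** m             # all bit strings of length m
    short_codes = 2 ** (m - n)  # all bit strings of length m - n
    return short_codes / inputs  # best case: each short code used once

# Exhaustive sanity check over a range of sizes.
for m in range(2, 12):
    for n in range(1, m):
        assert fraction_compressible_by(m, n) == 1 / 2 ** n
```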


On Jun 2, 8:34 am, Jim Leonard <MobyGa...@gmail.com> wrote:
> On Jun 1, 6:15 pm, ffkfsoxfdzg...@mailinator.com wrote:
>
> > I have thought of doing something just like this. The worry I have
> > though is with potential reverse engineering of the decompressor. I'm
> > having a rough time finding a way to show my product's ability, while
> > still keeping it protected.
>
> If you are worried that reverse-engineering of the decompressor will
>
> There is mathematical proof that your method does not work for all
> inputs (it may work for *some* inputs, but not all).  The purpose of
> the procedure I posted is not only to verify your claims, but also to
> expose flaws in the method that may not be obvious to you because you
> are too close to it.
>
> I am willing to test your method for you.  You supply me the
> decompressor first, I will then supply you a file to compress second,
> you send me back the resulting output third, and I will decompress it
> and compare it to the original.  I will sign any NDA you deem
> necessary if you're worried (but believe me, you have nothing to be
> worried about).
>
> Most of the time I have helped people do this, they go ahead and do
> the decompression/compare themselves, find it doesn't work, and
> finally realize the flaw in their concept.

I'm sure you're a trustworthy guy, Jim. But you have to understand why
I can't send the decompressor to you, even with an NDA. Do you know of
any well-known organizations that do independent testing? My purpose
here is to verify my claims, but again, with protection for the
method.

As for the mathematical proof....let's just say I look forward to
turning it on its ear. :)


<ffkfsoxfdzgjez@mailinator.com> wrote in message

> I'm sure you're a trustworthy guy Jim [Leonard].
> But you have to understand why
> I can't send the decompressor to you, even with an NDA.
First, you need to explain why.
Then we might understand why.

> Do you know of any well-known organizations that do
> independent testing? My purpose
> here is to verify my claims, but again, with protection for the
> method.

And this well-known organization will be better than Jim how?
More trustworthy?
More competent?
More likely to be snowed?

Enquiring minds want to know.

Pete



On 2 Jun., 17:20, ffkfsoxfdzg...@mailinator.com wrote:
> I understand the history of this type of claim, and why you think it's
> untrue,

Frankly, it seems you don't understand why we think it's untrue.

But to be fair: What exactly are you claiming? A detailed explanation
of what the algorithm is able to do (not how, we don't need to know
how it works) might be a good step forward. You could reuse this text
as the introduction for your patent application. And we get the chance
of responding accordingly (à la "It's not possible because ..." or
"Ok, that's possible").

You should mention the following things:
- the set of possible files your compressor accepts as input
- a statement about whether the compressor might produce the same
output file for two distinct input files.
- a statement about its compression performance (i.e. average file
size reduction over a uniform distribution of all possible input
files, something like "output file is always smaller than input file",
or ...)

Cheers!
SG


On Jun 2, 9:22 am, "Pete Fraser" <pfra...@covad.net> wrote:
> <ffkfsoxfdzg...@mailinator.com> wrote in message
>
>
> > I'm sure you're a trustworthy guy Jim [Leonard].
> > But you have to understand why
> > I can't send the decompressor to you, even with a NDA.
>
> First, you need to explain why.
> Then we might understand why.
>
> > Do you know of any well-known organizations that do
> > independent testing? My purpose
> > here is to verify my claims, but again, with protection for the
> > method.
>
> And this well-known organization will be better than Jim how?
> More trustworthy?
> More competent?
> More likely to be snowed?
>
> Enquiring minds want to know.
>
> Pete

1) I'm sure he's a good guy, but there's no way I'm sending the
decompressor to someone I don't know. Wouldn't be smart, surely you
can see that.

2) What's wrong with independent analysis of the program? It's done in
every field of science; I don't see the problem. You honestly don't see
the difference between testing through a trusted organization versus
an individual over usenet?


On Jun 2, 9:23 am, SG <s.gesem...@gmail.com> wrote:
> On 2 Jun., 17:20, ffkfsoxfdzg...@mailinator.com wrote:
>
> > I understand the history of this type of claim, and why you think it's
> > untrue,
>
> Frankly, it seems you don't understand why we think it's untrue.
>
> But to be fair: What exactly are you claiming? A detailed explanation
> of what the algorithm is able to do (not how, we don't need to know
> how it works) might be a good step forward. You could reuse this text
> as the introduction for your patent application. And we get the chance
> of responding accordingly (à la "It's not possible because ..." or
> "Ok, that's possible").
>
> You should mention the following things:
> - the set of possible files your compressor accepts as input
> - a statement about whether the compressor might produce the same
> output file for two distinct input files.
> - a statement about its compression performance (i.e. average file
> size reduction over a uniform distribution of all possible input
> files, something like "output file is always smaller than input file",
> or ...)
>
> Cheers!
> SG

The method can compress any data (above a minimal size) recursively at
a rate from 60% to 90% per cycle depending on the data. Conversely, the
compressed file can then be decompressed back to the original input
with lossless precision.

Your tips are welcomed and I will look into analyzing those other
attributes of the programs. It may take a few days to get some numbers.


ffkfsoxfdzgjez@mailinator.com wrote:
) As for the mathematical proof....let's just say I look forward to
) turning it on its ear. :)

How ?

Showing that your compressor works is not enough; as long as the proof
stands, any compressor that contradicts it is provably fraudulent.

SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT


ffkfsoxfdzgjez@mailinator.com wrote:
) The method can compress any data (above a minimal size) recursively at
) a rate from 60% to 90% per cycle depending on the data. Conversely, the
) compressed file can then be decompressed back to the original input
) with lossless precision.

Please define 'minimal size'.  It's OK if you overestimate it.

) Your tips are welcomed and I will look into analyzing those other
) attributes of the programs. May take a few days to get some numbers.

Here's a more precise statement of what you claim your method can do:
(I'm going for 50% compression for ease of discussion, and am specifically
stating N bits as input.  This is a subset of your claim.)

- Your compressor method takes as input a series of N bits.
- It outputs a series of N/2 bits.

- Your decompressor method takes as input a series of N/2 bits.
- No other input is used for the method.
- It outputs a series of N bits.

- If the output of the compressor method is used as the input for the
decompressor method, the output of the decompressor method will be
identical to the input of the compressor method.
- The above statement holds for all possible input series of N bits.

Does the above accurately represent your method, and if not,
which of the points in the above claim do not hold for your method ?

SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT
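Willem's claim set can be refuted exhaustively at a toy size: any fixed map from N bits down to N/2 bits must send two distinct inputs to the same output, so no decompressor can invert it for all inputs. A brute-force sketch, where the "keep the first half" toy compressor is an arbitrary stand-in (any deterministic map gives the same result):

```python
from itertools import product

# Brute-force check of "N-bit input -> N/2-bit output, losslessly,
# for ALL inputs" at a toy size: 2**N inputs cannot fit into
# 2**(N // 2) outputs, so collisions are unavoidable.
N = 8

def toy_compressor(bits: tuple) -> tuple:
    # An arbitrary stand-in: keep the first half of the input.
    return bits[: N // 2]

outputs = {}
collision_found = False
for bits in product((0, 1), repeat=N):
    out = toy_compressor(bits)
    if out in outputs and outputs[out] != bits:
        # Two distinct inputs share one output: no decompressor can
        # recover both from that single compressed string.
        collision_found = True
    outputs[out] = bits

assert collision_found
assert len(outputs) <= 2 ** (N // 2)
```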


On 2 Jun., 18:54, ffkfsoxfdzg...@mailinator.com wrote:
> The method can compress any data (above a minimal size) recursively at
> a rate from 60% to 90% per cycle depending on the data. Conversely, the
> compressed file can then be decompressed back to the original input
> with lossless precision.

Thanks for answering my questions. In this case the proof contained in
the comp.compression FAQ [1] applies. Look for "counting argument".
It's a "reductio ad absurdum"-type proof [2] which proves that no such
algorithm with the mentioned properties can exist.

Have a nice day,
SG

[1] http://www.faqs.org/faqs/compression-faq/part1/ (section 9)


On Jun 2, 10:19 am, Willem <wil...@snail.stack.nl> wrote:
> ffkfsoxfdzg...@mailinator.com wrote:
>
> ) The method can compress any data (above a minimal size) recursively at
> ) a rate from 60% to 90% per cycle depending on the data. Conversely, the
> ) compressed file can then be decompressed back to the original input
> ) with lossless precision.
>
> Please define 'minimal size'.  It's OK if you overestimate it.
>
> ) Your tips are welcomed and I will look into analyzing those other
> ) attributes of the programs. May take a few days to get some numbers.
>
> Here's a more precise statement of what you claim your method can do:
> (I'm going for 50% compression for ease of discussion, and am specifically
>  stating N bits as input.  This is a subset of your claim.)
>
> - Your compressor method takes as input a series of N bits.
> - It outputs a series of N/2 bits.
>
> - Your decompressor method takes as input a series of N/2 bits.
> - No other input is used for the method.
> - It outputs a series of N bits.
>
> - If the output of the compressor method is used as the input for the
>   decompressor method, the output of the decompressor method will be
>   identical to the input of the compressor method.
> - The above statement holds for all possible input series of N bits.
>
> Does the above accurately represent your method, and if not,
> which of the points in the above claim do not hold for your method ?
>
> SaSW, Willem
> --
> Disclaimer: I am in no way responsible for any of the statements
>             made in the above text. For all I know I might be
>             drugged or something..
>             No I'm not paranoid. You all think I'm paranoid, don't you !
> #EOT

Let me correct an error in how I spoke a few posts before. When I
wrote that the compression rate was 60-90% per cycle, I meant to say
that the compressed file is 60-90% of the original data, so it
actually compresses at a rate of 10-40% per cycle. Sorry for
misspeaking.

1) The minimal size is roughly anything over 100 bytes to be safe, but
I'm confident that testing will yield an even smaller size than that.
I haven't done much testing as to how small a file can be yet.

2) Yes, that's a fair description of the method. There is a wrinkle to
my compression/decompression method, though, that makes this possible.
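Taking the corrected figures at face value, the cycle counts the claim implies are easy to work out. In the sketch below the 100 GB starting size is an assumption for illustration; the thread never states the drive's capacity.

```python
import math

# Cycles needed to shrink a drive below 1 MB if each pass leaves
# 60% (best case) or 90% (worst case) of its input, per the claim.
# The 100 GB starting size is an assumption for illustration only.
start = 100 * 10**9  # bytes
target = 10**6       # 1 MB

def cycles_needed(ratio: float) -> int:
    # Smallest k with start * ratio**k < target.
    return math.ceil(math.log(target / start) / math.log(ratio))

best = cycles_needed(0.60)   # 23 cycles at the optimistic rate
worst = cycles_needed(0.90)  # 110 cycles at the pessimistic rate
```

Repeated application at these rates is exactly what the counting argument rules out: after a few dozen best-case cycles, virtually any drive's contents would have to fit in under a megabyte, regardless of content.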


<ffkfsoxfdzgjez@mailinator.com> wrote in message

> 1) I'm sure he's a good guy, but there's no way I'm sending the
> decompressor to someone I don't know. Wouldn't be smart, surely you
> can see that.

How about Mark Nelson then?
He's a published author of compression books, and
he'll give you $100 if you succeed.

http://marknelson.us/2006/06/20/million-digit-challenge/

If Mark determines that you (or anybody for that matter)
has satisfied his criteria, I'll contribute an additional $100
to the winner.

Pete



On Jun 2, 10:25 am, SG <s.gesem...@gmail.com> wrote:
> On 2 Jun., 18:54, ffkfsoxfdzg...@mailinator.com wrote:
>
> > The method can compress any data (above a minimal size) recursively at
> > a rate from 60% to 90% per cycle depending on the data. Conversely, the
> > compressed file can then be decompressed back to the original input
> > with lossless precision.
>
> Thanks for answering my questions. In this case the proof contained in
> the comp.compression FAQ [1] applies. Look for "counting argument".
> It's a "reductio ad absurdum"-type proof [2] which proves that no such
> algorithm with the mentioned properties can exist.
>
> Have a nice day,
> SG
>
> [1] http://www.faqs.org/faqs/compression-faq/part1/ (section 9)

Sorry, but this is simply not true.

All I'm asking for here is advice on a way to reveal this method while
keeping it protected. If you don't believe me, that's fine. But at
least give me some tips on how I can be proven wrong instead of just
dismissing this altogether.

This method REALLY works, it's real. That's why I'm here asking for
advice.

On Jun 2, 10:55 am, "Pete Fraser" <pfra...@covad.net> wrote:
> <ffkfsoxfdzg...@mailinator.com> wrote in message
>
>
> > 1) I'm sure he's a good guy, but there's no way I'm sending the
> > decompressor to someone I don't know. Wouldn't be smart, surely you
> > can see that.
>
> How about Mark Nelson then?
> He's a published author of compression books, and
> he'll give you $100 if you succeed.
>
> http://marknelson.us/2006/06/20/million-digit-challenge/
>
> If Mark determines that you (or anybody for that matter)
> has satisfied his criteria, I'll contribute an additional $100
> to the winner.
>
> Pete

I know of Mark, but again, $100 isn't much incentive to risk a method such as this. The Ocarina Prize Matt Mahoney posted about last year was interesting, but I don't think that's still going on.   0 Reply ffkfsoxfdzgjez 6/2/2009 6:05:43 PM ffkfsoxfdzgjez@mailinator.com wrote: ) Sorry, but this is simply not true. ) ) All I'm asking for here is advice on a way to reveal this method while ) keeping it protected. If you don't believe me, that's fine. But at ) least give me some tips on how I can be proven wrong instead of just ) dismissing this altogether. Tip: The Counting Theorem proves you wrong. If you don't agree then please point out the flaw in the theorem. SaSW, Willem -- Disclaimer: I am in no way responsible for any of the statements made in the above text. For all I know I might be drugged or something.. No I'm not paranoid. You all think I'm paranoid, don't you ! #EOT   0 Reply Willem 6/2/2009 6:08:50 PM On Jun 2, 11:20=A0am, ffkfsoxfdzg...@mailinator.com wrote: > On Jun 2, 3:15=A0am, t...@iahu.ca wrote: > > > > > On Jun 1, 7:15=A0pm, ffkfsoxfdzg...@mailinator.com wrote: > > > > On Jun 1, 8:47=A0am, Jim Leonard <MobyGa...@gmail.com> wrote: > > > > > Before you patent it, you must prove the technology really works. > > > > Which it doesn't. =A0Unless your hard drive is filled with a single > > > > value, I can guarantee you that you haven't compressed your entire > > > > hard drive down to less than a megabyte. > > > > > Here is a simple way to test your process without giving up any of > > > > your technology: > > > > > 1. You (inventor) provide a decompressor. > > > > 2. We (testers) send you a file to compress. > > > > 3. You run the file through your compressor and send us the resulti= ng > > > > compressed output. > > > > 4. We run the compressed output through the decompressor and compar= e > > > > to the original file we sent you. > > > > > - If the files match, your decompression system works. 
> > > > - If the compressed file is smaller than the original file, then your > > > > compression system works. > > > > - If the size of the compressed file + the size of the decompression > > > > system is smaller than the original file, then you have achieved > > > > actual compression of the data. > > > > I am willing to help you test your system; you can email me if you > > > > like. > > > I'm > > > having a rough time finding a way to show my product's ability, while > > > still keeping it protected. > > Is that because it doesn't actually work? > > I don't get you trolls, like it's an obvious gag, so why do you keep > > perpetuating this nonsense? An n-bit string cannot have 2^{n+1} > > possible meanings, just by a simple argument of counting. Yet you > > insist on trolling usenet for laughs or whatever about your amazing > > compression idea that you know that we know can't possibly exist. > > People like you seriously just need to get a life. There is an entire > > world outside of usenet, and I suggest you go explore it before it's > > too late. > > Tom > I understand the history of this type of claim, and why you think it's > untrue, but it really has been accomplished. I'm looking for advice on > how to show that it's real, while keeping it protected. Your type really needs to come up with a new, more entertaining gimmick. The "it's true, I've done it, just can't tell you, how do I get more attention to myself?" line has been done to death. If you "actually" knew why we know the others didn't work, you'd not think you could do it. But I'll bet this isn't about you being mistaken and more about you just trying to get a rise out of people. Oh look at the regulars, squirming to explain why your new-found amazing algorithm is bust. Setting aside reality, if you had an algorithm you'd just patent it. It's not like compression patents are a new idea. Look at MPEG, LZW, etc. And it's not like finding funding or a patent attorney would be such a problem. Grow a pair and admit you're trolling. Tom   0 Reply tstdenis 6/2/2009 6:09:28 PM <ffkfsoxfdzgjez@mailinator.com> wrote in message news:b110cd9b-a5a7-4ae1-b1f3-3e7b18ac1ec7@l28g2000vba.googlegroups.com... > $100 isn't much incentive to risk a method such as this.

$200 with the special matching challenge grant.   0 Reply Pete 6/2/2009 6:10:03 PM ffkfsoxfdzgjez@mailinator.com wrote: ) On Jun 2, 10:19 am, Willem <wil...@snail.stack.nl> wrote: )> - Your compressor method takes as input a series of N bits. )> - It outputs a series of N/2 bits. )> )> - Your decompressor method takes as input a series of N/2 bits. )> - No other input is used for the method. )> - It outputs a series of N bits. )> )> - If the output of the compressor method is used as the input for the )> decompressor method, the output of the decompressor method will be )> identical to the input of the compressor method. )> - The above statement holds for all possible input series of N bits. )> )> Does the above accurately represent your method, and if not, )> which of the points in the above claim do not hold for your method ? )> )> SaSW, Willem )> -- )> Disclaimer: I am in no way responsible for any of the statements )> made in the above text. For all I know I might be )> drugged or something.. )> No I'm not paranoid. You all think I'm paranoid, don't you ! )> #EOT ) ) Let me correct an error in how I spoke a few posts before. When I ) wrote that the compression rate was 60-90% per cycle, I meant to say ) that the compressed cycle is 60-90% of the original data, so it ) actually compresses at a rate of 10-40% per cycle. Sorry for ) misspeaking. 100 bytes, to be safe ? OK, that's 800 bits. Is 1000 okay as well ? So let's replace N by 1000 and N/2 by 950. Is that okay with you ? ) 2) Yes, that's a fair description of the method. There is a wrinkle to ) my compression/decompression method though, that makes this possible. Does this 'wrinkle' involve any kind of extra input to the decompression method ? In any case, let's simplify your claim: (To ensure the output is always 950 bits, you can simply add 10 bits for length, and then pad the end with 0-bits.)
- Your compression method takes a 1000-bit string as input and produces a 950-bit string as output. - Your decompression method takes a 950-bit string as input and produces a 1000-bit string as output. - If a 1000-bit string is given to the compression method, and the output of the compression method is used as input to the decompression method, then the output of the decompression method will be identical to the original 1000-bit string. Can your method do this for all possible 1000-bit strings ? If so, here's a question for you: Will the output of the compression method be different for each and every input string ? Yes or no ? SaSW, Willem -- Disclaimer: I am in no way responsible for any of the statements made in the above text. For all I know I might be drugged or something.. No I'm not paranoid. You all think I'm paranoid, don't you ! #EOT   0 Reply Willem 6/2/2009 6:20:25 PM On Jun 2, 11:08 am, Willem <wil...@snail.stack.nl> wrote: > ffkfsoxfdzg...@mailinator.com wrote: > ) Sorry, but this is simply not true. > ) > ) All I'm asking for here is advice on a way to reveal this method while > ) keeping it protected. If you don't believe me, that's fine. But at > ) least give me some tips on how I can be proven wrong instead of just > ) dismissing this altogether. > Tip: The Counting Theorem proves you wrong. > If you don't agree then please point out the flaw in the theorem. > SaSW, Willem > -- > Disclaimer: I am in no way responsible for any of the statements > made in the above text. For all I know I might be > drugged or something.. > No I'm not paranoid. You all think I'm paranoid, don't you ! > #EOT I really don't want to get into a debate over the counting theorem. I'd rather focus on ways to show my method, while protecting it. If you were in my shoes, how would you approach the situation?
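[Editor's note: Willem's counting-theorem argument is easy to check mechanically. A minimal sketch, with the claim scaled down from 1000-bit/950-bit strings to 3-bit/2-bit strings so the sets can be enumerated; the function name and the scale-down are illustrative, not from the thread.]

```python
from itertools import product

def lossless_shrink_possible(n_in: int, n_out: int) -> bool:
    # A lossless compressor must map every input to a distinct output,
    # or the decompressor cannot tell two inputs apart. That requires
    # at least as many possible outputs as inputs (the counting theorem).
    return 2 ** n_out >= 2 ** n_in

# Willem's claim, scaled down so the sets fit on screen: every 3-bit
# string would have to land on one of only four 2-bit strings.
inputs = ["".join(bits) for bits in product("01", repeat=3)]   # 8 strings
outputs = ["".join(bits) for bits in product("01", repeat=2)]  # 4 strings
print(len(inputs), "inputs vs", len(outputs), "outputs")  # 8 inputs vs 4 outputs
print(lossless_shrink_possible(1000, 950))                # False
```

By the pigeonhole principle at least four pairs of inputs must collide, so some inputs can never be recovered; the same arithmetic, with larger exponents, sinks the 1000-to-950-bit claim.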
0 Reply ffkfsoxfdzgjez 6/2/2009 6:25:37 PM On Jun 2, 11:20 am, Willem <wil...@snail.stack.nl> wrote: > > 100 bytes, to be safe ? OK, that's 800 bits. Is 1000 okay as well ? > > So let's replace N by 1000 and N/2 by 950. Is that okay with you ? > > ) 2) Yes, that's a fair description of the method. There is a wrinkle to > ) my compression/decompression method though, that makes this possible. > > Does this 'wrinkle' involve any kind of extra input to the decompression > method ? > > In any case, let's simplify your claim: > > (To ensure the output is always 950 bits, you can simply add 10 bits > for length, and then pad the end with 0-bits.) > > - Your compression method takes a 1000-bit string as input and produces a > 950-bit string as output. > - Your decompression method takes a 950-bit string as input and produces a > 1000-bit string as output. > - If a 1000-bit string is given to the compression method, and the output > of the compression method is used as input to the decompression method, > then the output of the decompression method will be identical to the > original 1000-bit string. > > Can your method do this for all possible 1000-bit strings ? > > If so, here's a question for you: Will the output of the compression > method be different for each and every input string ? Yes or no ? > > SaSW, Willem > -- > Disclaimer: I am in no way responsible for any of the statements > made in the above text. For all I know I might be > drugged or something.. > No I'm not paranoid. You all think I'm paranoid, don't you ! > #EOT 1) Absolutely. 800 bits, 1000 bits, is okay. I can go below 100 bytes, I was just overestimating to be safe because I haven't figured the exact numbers yet. 2) No, the wrinkle doesn't involve extra input. 3) Yes, it works for all possible 1000-bit strings. 4) Yes, the compressed output is different for every input.
0 Reply ffkfsoxfdzgjez 6/2/2009 6:34:26 PM ffkfsoxfdzgjez@mailinator.com wrote: ) 1) Absolutely. 800 bits, 1000 bits, is okay. I can go below 100 bytes, ) I was just overestimating to be safe because I haven't figured the ) exact numbers yet. ) ) 2) No, the wrinkle doesn't involve extra input. ) ) 3) Yes, it works for all possible 1000-bit strings ) ) 4) Yes, the compressed output is different for every input. OK, next questions: - How many possible 1000-bit strings are there ? - Will the compressed output be different for each and every one of those ? - If so, then how many different possible compressed outputs are there ? - How many possible 950-bit strings are there ? SaSW, Willem -- Disclaimer: I am in no way responsible for any of the statements made in the above text. For all I know I might be drugged or something.. No I'm not paranoid. You all think I'm paranoid, don't you ! #EOT   0 Reply Willem 6/2/2009 6:45:58 PM On Jun 2, 10:20 am, ffkfsoxfdzg...@mailinator.com wrote: > I understand the history of this type of claim, and why you think it's > untrue, but it really has been accomplished. I'm looking for advice on > how to show that it's real, while keeping it protected. I gave a very simple explanation of how to do this last week: http://groups.google.com/group/comp.compression/msg/ea781d2661a81124 If you follow the steps I list here, you will produce a demo webcast that will show your ability to compress the million digit file. This will not constitute definitive proof but you will have made a best effort that keeps your efforts completely safe. And it changes the playing field. Nobody will think you are a crank any more. Instead, they will think either a) maybe there is something real here or b) the guy is a crook. Much more interesting. It will also establish priority for your source code without revealing it.
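[Editor's note: one standard way to establish priority for source code without revealing it, in the spirit of Mark's suggestion but not spelled out in his post, is a hash commitment. A sketch; the file contents here are a placeholder.]

```python
import hashlib

# Commit now: publish only the digest of the secret source file.
source = b"the inventor's secret compressor source"  # placeholder contents
digest = hashlib.sha256(source).hexdigest()
print("publish this digest today:", digest)

# Reveal later: anyone can recompute the hash of the revealed file and
# compare it to the digest published earlier, showing the file existed,
# unmodified, at commitment time.
assert hashlib.sha256(source).hexdigest() == digest
```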
- Mark   0 Reply Mark 6/2/2009 6:51:21 PM On Jun 2, 11:45 am, Willem <wil...@snail.stack.nl> wrote: > ffkfsoxfdzg...@mailinator.com wrote: > ) 1) Absolutely. 800 bits, 1000 bits, is okay. I can go below 100 bytes, > ) I was just overestimating to be safe because I haven't figured the > ) exact numbers yet. > ) > ) 2) No, the wrinkle doesn't involve extra input. > ) > ) 3) Yes, it works for all possible 1000-bit strings > ) > ) 4) Yes, the compressed output is different for every input. > > OK, next questions: > > - How many possible 1000-bit strings are there ? > > - Will the compressed output be different for each and every one of those ? > - If so, then how many different possible compressed outputs are there ? > > - How many possible 950-bit strings are there ? > > SaSW, Willem > -- > Disclaimer: I am in no way responsible for any of the statements > made in the above text. For all I know I might be > drugged or something.. > No I'm not paranoid. You all think I'm paranoid, don't you ! > #EOT Willem, I understand the counting theorem, and know where you are going with this. You honestly need to see the wrinkle to understand my compression method. It's like me trying to explain the periodic table to 19th century scientists who didn't properly understand the atom.   0 Reply ffkfsoxfdzgjez 6/2/2009 7:06:52 PM ffkfsoxfdzgjez@mailinator.com wrote: ) Willem, I understand the counting theorem, and know where you are ) going with this. You honestly need to see the wrinkle to understand my ) compression method. It's like me trying to explain the periodic table ) to 19th century scientists who didn't properly understand the atom. Then point out the flaw in the argument. Otherwise I'm calling bullshit. SaSW, Willem -- Disclaimer: I am in no way responsible for any of the statements made in the above text. For all I know I might be drugged or something.. No I'm not paranoid.
You all think I'm paranoid, don't you ! #EOT   0 Reply Willem 6/2/2009 7:14:44 PM On Jun 2, 11:51 am, Mark Nelson <snorkel...@gmail.com> wrote: > On Jun 2, 10:20 am, ffkfsoxfdzg...@mailinator.com wrote: > > I understand the history of this type of claim, and why you think it's > > untrue, but it really has been accomplished. I'm looking for advice on > > how to show that it's real, while keeping it protected. > I gave a very simple explanation of how to do this last week: > http://groups.google.com/group/comp.compression/msg/ea781d2661a81124 > If you follow the steps I list here, you will produce a demo webcast > that will show your ability to compress the million digit file. > This will not constitute definitive proof but you will have made a > best effort that keeps your efforts completely safe. > And it changes the playing field. Nobody will think you are a crank > any more. Instead, they will think either a) maybe there is something > real here or b) the guy is a crook. Much more interesting. > It will also establish priority for your source code without revealing > it. > - Mark I really like this idea. I'll try this in a few days, but I need to familiarize myself with ltrace and strace.   0 Reply ffkfsoxfdzgjez 6/2/2009 7:17:34 PM On Jun 2, 12:14 pm, Willem <wil...@snail.stack.nl> wrote: > ffkfsoxfdzg...@mailinator.com wrote: > ) Willem, I understand the counting theorem, and know where you are > ) going with this. You honestly need to see the wrinkle to understand my > ) compression method. It's like me trying to explain the periodic table > ) to 19th century scientists who didn't properly understand the atom. > Then point out the flaw in the argument. > Otherwise I'm calling bullshit. > SaSW, Willem > -- > Disclaimer: I am in no way responsible for any of the statements > made in the above text. For all I know I might be > drugged or something..
> No I'm not paranoid. You all think I'm paranoid, don't you ! > #EOT Sorry, my lips are sealed. In due time, my friend. :)   0 Reply ffkfsoxfdzgjez 6/2/2009 7:27:16 PM On 2 Jun., 21:27, ffkfsoxfdzg...@mailinator.com wrote: > On Jun 2, 12:14 pm, Willem <wil...@snail.stack.nl> wrote: > > ffkfsoxfdzg...@mailinator.com wrote: > > ) Willem, I understand the counting theorem, and know where you are > > ) going with this. You honestly need to see the wrinkle to understand my > > ) compression method. It's like me trying to explain the periodic table > > ) to 19th century scientists who didn't properly understand the atom. > > Then point out the flaw in the argument. > > Otherwise I'm calling bullshit. > Sorry, my lips are sealed. In due time, my friend. :) You seem to be suffering from a superiority bias. You have exactly two options here to defend yourself: - Explain why the counting argument doesn't apply to your situation - Point out a flaw of the counting argument. Neither of these options requires you to reveal your algorithm, but you simply chose to evade the argument. Excellent move, Sherlock. Your credibility is at a new low and I guess it can't get any worse. Cheers! SG   0 Reply SG 6/2/2009 8:03:23 PM On Jun 2, 10:58 am, ffkfsoxfdzg...@mailinator.com wrote: > You have to understand why > I can't send the decompressor to you, even with an NDA. Not really, since the compressor does all of the decision work. If you think that analyzing the decompressor will reveal your method, then you don't have a patentable method (or your method doesn't achieve any actual compression). > any well-known organizations that do independent testing? My purpose > here is to verify my claims, but again, with protection for the > method. There was one that did, but was fooled by a con-artist and subsequently had its reputation damaged nearly beyond repair.
They recanted the publication of the test and no longer agree to software tests of claims of this nature. There is no better group to do your testing than us. If you need a single person to verify your method, choose Mark Nelson as he would no doubt be called upon by any testing organization as a subject matter expert. If you won't "trust" experts in the field of information theory, then try this simple test: Download the million digits file (here's a link: http://marknelson.us/attachments/million-digit-challenge/AMillionRandomDigits.bin ), compress it using your method, then decompress it, then compare both files (not by size, but by content -- you can do this with "fc /b" in DOS/windows, or "diff" in unix). It has been theorized that the million digits file can be compressed by up to 50 bits due to faint redundancies in the way the original data was generated. If your method completely blows past that, then it is mathematically obvious your method is broken and you need to re-examine why you think it works. > As for the mathematical proof....let's just say I look forward to > turning it on its ear. :) I don't think you understand the proof. Here are your claims: - Any input stream can be compressed to smaller output - That smaller output can be used to reconstruct the original It is your use of the word "any" that shows you are missing the obvious. If any data can be compressed by your method, then it is possible to compress any data stream down to a single bit simply by taking the output of the previous compression run and using it as the input in the next run. Let's say I give you two bits of data; by your claim, you could compress this data down to a single bit. If all you have is a single bit of representation, you have a grand total of two values you can hold in that bit, but the original data held four values. Without any additional bits in your compressed output, you have no indication of which original input is the correct one.
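[Editor's note: the round-trip test Jim describes (compress, decompress, then byte-compare, the job "fc /b" or "diff" does) can be sketched as a small harness. zlib stands in here for any candidate method; the harness itself doesn't care how the compressor works.]

```python
import zlib

def round_trip_ok(compress, decompress, data: bytes) -> bool:
    # The verification protocol from the thread: compress, decompress,
    # then compare content byte for byte, not just by size.
    return decompress(compress(data)) == data

# zlib is only a stand-in; a real test would swap in the claimed tools.
sample = b"compressible " * 100 + bytes(range(256))
assert round_trip_ok(zlib.compress, zlib.decompress, sample)
print(len(sample), "bytes ->", len(zlib.compress(sample)), "bytes, round trip OK")
```

Note the harness only proves losslessness on the files actually tried; the counting argument is what rules out shrinking every possible file.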
0 Reply Jim 6/2/2009 8:09:09 PM On Jun 2, 11:25 am, ffkfsoxfdzg...@mailinator.com wrote: > On Jun 2, 11:08 am, Willem <wil...@snail.stack.nl> wrote: > > ffkfsoxfdzg...@mailinator.com wrote: > > ) Sorry, but this is simply not true. > > ) > > ) All I'm asking for here is advice on a way to reveal this method while > > ) keeping it protected. If you don't believe me, that's fine. But at > > ) least give me some tips on how I can be proven wrong instead of just > > ) dismissing this altogether. > > Tip: The Counting Theorem proves you wrong. > > If you don't agree then please point out the flaw in the theorem. > I really don't want to get into a debate over the counting theorem. > I'd rather focus on ways to show my method, while protecting it. If > you were in my shoes, how would you approach the situation? How would _I_ proceed? I would listen to the otherwise-apparently-bright people who can _prove_, as surely as 1+1=2, that my method _can_ not work, and wonder where I'm probably going wrong.   0 Reply crisgoogle 6/2/2009 9:02:29 PM In article <a68a723d-f7f1-421d-ab96-d9754b602216@z7g2000vbh.googlegroups.com>, <ffkfsoxfdzgjez@mailinator.com> wrote: >Willem, I understand the counting theorem, and know where you are >going with this. You honestly need to see the wrinkle to understand my >compression method. It's like me trying to explain the periodic table >to 19th century scientists who didn't properly understand the atom. No, it's like trying to explain that 1+1+1=2, because you've figured out a new way to add them up (hint: it involves breaking 1 up into 1/2 + 1/4 + 1/8 + ...). Alan -- Defendit numerus   0 Reply amorgan 6/2/2009 9:16:52 PM Once again, I'm not going to discuss the methodology of my compression process. There seem to be quite a few assumptions in this thread, many of which are wrong. My goal is to prove the method in a secure manner. If you think I'm wrong, help me prove that it's flawed by suggesting organizations that specialize in this field.   0 Reply ffkfsoxfdzgjez 6/2/2009 9:41:00 PM In article <13ff45b2-833c-4c01-ae7d-5d70245bb786@q14g2000vbn.googlegroups.com>, <ffkfsoxfdzgjez@mailinator.com> wrote: >On Jun 2, 11:08 am, Willem <wil...@snail.stack.nl> wrote: >> ffkfsoxfdzg...@mailinator.com wrote: >> ) Sorry, but this is simply not true. >> ) >> ) All I'm asking for here is advice on a way to reveal this method while >> ) keeping it protected. If you don't believe me, that's fine. But at >> ) least give me some tips on how I can be proven wrong instead of just >> ) dismissing this altogether. >> Tip: The Counting Theorem proves you wrong. >> If you don't agree then please point out the flaw in the theorem. >> SaSW, Willem >> -- >> Disclaimer: I am in no way responsible for any of the statements >> made in the above text. For all I know I might be >> drugged or something.. >> No I'm not paranoid. You all think I'm paranoid, don't you ! >> #EOT > >I really don't want to get into a debate over the counting theorem. >I'd rather focus on ways to show my method, while protecting it. If >you were in my shoes, how would you approach the situation? I would patent the algorithm or I would have some well respected member of the community sign an NDA and then I'd demo it for them. If I were really, really paranoid about demoing it I might modify the source code to include lots of irrelevant calculations and random nonsense to obscure what it was I was doing. Or I might write it in some high level language that compiles to C (e.g. sml via Mlton or lisp via Stalin), add lots of irrelevant crap, generate the C code, add lots of irrelevant crap to *that*, and compile the resulting noise. Anyone who can reverse engineer that is smart enough to write the thing in the first place.
Alan -- Defendit numerus   0 Reply amorgan 6/2/2009 9:49:24 PM On Jun 2, 4:41 pm, ffkfsoxfdzg...@mailinator.com wrote: > My goal is to prove the method in a secure manner. If you think I'm > wrong, help me prove that it's flawed by suggesting organizations that > specialize in this field. The only way to prove the method is to follow the previously outlined procedure. No independent party, "specialized organizations" or otherwise, can verify your claim without the basic procedure of sending them your decompressor first, then them sending you input to compress, then you sending the compressed output, then them decompressing the output and comparing it to the original. That *is* the verification procedure. Whether it's an "organization" or a single person, an NDA is required in all cases so you're not going to get anything more "secure" than an NDA. The only difference is that an independent organization will charge you five-digit numbers to perform this service. We are trying to be nice in offering to perform your verification for you for free (not to "steal your idea" but rather to help show you why your method cannot possibly work). Any independent organization that would perform testing would bring in Mark Nelson as a subject matter expert to oversee the testing. I suggest you simply work with him, as he has offered to do so already in this group and, assuming he is still in a nice mood, for free. If you need proof of Mark's credentials, I suggest you look up gzip/zlib, his books, and his editorial work.   0 Reply Jim 6/2/2009 10:08:54 PM Jim Leonard <MobyGamer@gmail.com> writes: > On Jun 1, 6:15 pm, ffkfsoxfdzg...@mailinator.com wrote: >> I have thought of doing something just like this. The worry I have >> though is with potential reverse engineering of the decompressor. I'm >> having a rough time finding a way to show my product's ability, while >> still keeping it protected.
> > If you are worried that reverse-engineering of the decompressor will > somehow compromise your method, then either 1. your method doesn't > work, or 2. your method has already been invented. [...] I'm not sure I follow the reasoning here. I agree that the original poster's claims are mathematically impossible. But suppose somebody claims to have invented a *really good* compressor/decompressor, not one that violates the laws of mathematics, but one that does extraordinarily good compression and reliable decompression on typical (non-random) data, using some genuinely novel technique. Isn't it plausible that providing a copy of either the compressor or the decompressor might risk revealing details about the technique? -- Keith Thompson (The_Other_Keith) kst-u@mib.org <http://www.ghoti.net/~kst> Nokia "We must do something. This is something. Therefore, we must do this." -- Antony Jay and Jonathan Lynn, "Yes Minister"   0 Reply Keith 6/2/2009 10:41:28 PM ffkfsoxfdzg...@mailinator.com wrote: > 1) I'm sure he's a good guy, but there's no way I'm sending the > decompressor to someone I don't know. Wouldn't be smart, surely you > can see that. > > 2) What's wrong with independent analysis of the program? It's done in > every field of science, I don't see the problem. You honestly don't see > the difference between testing through a trusted organization versus > an individual over usenet?
Create a command line version of your compressor and/or decompressor Create a zip file from it Visit http://www.metacompressor.com/submit.aspx Code: LongNightsDebugging Email: only if you like to give one away Enable new archiver Archiver name: what you like Archiver version: what you like Archiver author: what you like Archiver file: browse to your zip file of compressor/decompressor Command line compress: fill right parameters (test local first) Command line decompress: fill right parameters (test local first) File to test: Choose file44 the million random digits challenge or file13 both random Store archiver: disable Secure delete: enable Press Submit Then, directly and without human intervention, a program tests your archiver and removes all traces of the uploaded and unpacked archiver and the temp files created, except the result, which will be visible here: http://www.metacompressor.com/uploads.aspx?testfile=file44 and here: http://www.metacompressor.com/top.aspx?testfile=file44 Good luck!   0 Reply Sportman 6/2/2009 11:52:36 PM  "Pete Fraser" <pfraser@covad.net> wrote in message news:n6ednYkVkNYY-rjXnZ2dnUVZ_hydnZ2d@supernews.com... > <ffkfsoxfdzgjez@mailinator.com> wrote in message > news:068d73e3-f9ff-4ed4-acb1-2380fef458fe@h23g2000vbc.googlegroups.com... > >> 1) I'm sure he's a good guy, but there's no way I'm sending the >> decompressor to someone I don't know. Wouldn't be smart, surely you >> can see that. > > How about Mark Nelson then? > He's a published author of compression books, and > an editor of Dr Dobb's Journal. The added advantage > is that he'll give you $100 if you succeed.
>
> http://marknelson.us/2006/06/20/million-digit-challenge/
>
> If Mark determines that you (or anybody for that matter)
> has satisfied his criteria, I'll contribute an additional $100 > to the winner. > > Pete > I just happened to notice this. Would you do that for anyone else as well? I'm also working on my own Random Data compression which you can read about here: http://www.tretbase.com/forum/viewtopic.php?f=4&t=84 Paul   0 Reply Paul 6/3/2009 1:51:42 AM  "Jim Leonard" <MobyGamer@gmail.com> wrote in message news:735db3e3-0d31-4026-95a7-51e66206a9dd@r16g2000vbn.googlegroups.com... > On Jun 2, 10:58 am, ffkfsoxfdzg...@mailinator.com wrote: >> You have to understand why >> I can't send the decompressor to you, even with a NDA. > > Not really, since the compressor does all of the decision work. If > you think that analyzing the decompressor will reveal your method, > then you don't have a patentable method (or your method doesn't > achieve any actual compression). > >> any well-known organizations that do independent testing? My purpose >> here is to verify my claims, but again, with protection for the >> method. > > There was one that did, but was fooled by a con-artist and > subsequently had its reputation damaged nearly beyond repair. They > recanted the publication of the test and no longer agree to software > tests of claims of this nature. > > There is no better group to do your testing than us. If you need a > single person to verify your method, choose Mark Nelson as he would no > doubt be called upon by any testing organization as a subject matter > expert. > > If you won't "trust" experts in the field of information theory, then > try this simple test: Download the million digits file (here's a > link: > http://marknelson.us/attachments/million-digit-challenge/AMillionRandomDigits.bin > ), compress it using your method, then decompress it, then compare > both files (not by size, but by content -- you can do this with "fc / > b" in DOS/windows, or "diff" in unix). 
It has been theorized that the > million digits file can be compressed by up to 50 bits due to faint > redundancies in the way the original data was generated. If your > method completely blows past that, then it is mathematically obvious > your method is broken and you need to re-examine why you think it > works. > >> As for the mathematical proof....let's just say I look forward to >> turning it on it's ear. :) > > I don't think you understand the proof. Here are your claims: > > - Any input stream can be compressed to smaller output > - That smaller output can be used to reconstruct the original > > It is your use of the word "any" that shows you are missing the > obvious. If any data can be compressed by your method, then it is > possible to compress any data stream down to a single bit simply by > taking the output of the previous compression run and using it as the > input in the next run. > > Let's say I give you two bits of data; by your claim, you could > compress this data down to a single bit. If all you have is a single > bit of representation, you have a grand total of two values you can > hold in that bit, but the original data held four values. Without any > additional bits in your compressed output, you have no indication of > which original input is the correct one. > Your first statement was false. An output file alone can be sufficient to reverse engineer a compression scheme. Given a decompressor and the output file a programmer has all he needs to reverse engineer the compression scheme. Paul   0 Reply Paul 6/3/2009 1:57:27 AM On Jun 2, 5:41=A0pm, Keith Thompson <ks...@mib.org> wrote: > Isn't it plausible that providing a copy > of either the compressor or the decompressor might risk revealing > details about the technique? The compressor, most definitely. The decompressor, usually not. 
I can see how CM methods like LZP might give away a few hints looking at the decompression process only (LZP builds a table and uses it to provide prior context, just like the compression phase), but not the entire method.   0 Reply Jim 6/3/2009 1:58:20 AM  "Keith Thompson" <kst-u@mib.org> wrote in message news:lnskii45t3.fsf@nuthaus.mib.org... > Jim Leonard <MobyGamer@gmail.com> writes: >> On Jun 1, 6:15 pm, ffkfsoxfdzg...@mailinator.com wrote: >>> I have thought of doing something just like this. The worry I have >>> though is with potential reverse engineering of the decompressor. I'm >>> having a rough time finding a way to show my product's ability, while >>> still keeping it protected. >> >> If you are worried that reverse-engineering of the decompressor will >> somehow compromise your method, then either 1. your method doesn't >> work, or 2. your method has already been invented. > [...] > > I'm not sure I follow the reasoning here. > > I agree that the original poster's claims are mathematically > impossible. But suppose somebody claims to have invented a *really > good* compressor/decompressor, not one that violates the laws of > mathematics, but one that does extraordinarily good compression and > reliable decompression on typical (non-random) data, using some > genuinely novel technique. Isn't it plausible that providing a copy > of either the compressor or the decompressor might risk revealing > details about the technique? > > -- > Keith Thompson (The_Other_Keith) kst-u@mib.org > <http://www.ghoti.net/~kst> > Nokia > "We must do something. This is something. Therefore, we must do this." > -- Antony Jay and Jonathan Lynn, "Yes Minister" Exactly! Some of these "experts" here don't have a clue about programming apparently to make such statements. Paul   0 Reply Paul 6/3/2009 2:01:11 AM  "Jim Leonard" <MobyGamer@gmail.com> wrote in message news:b6564790-5a56-44a8-b129-72f826be22b8@y9g2000yqg.googlegroups.com... 
> On Jun 2, 5:41 pm, Keith Thompson <ks...@mib.org> wrote: >> Isn't it plausible that providing a copy >> of either the compressor or the decompressor might risk revealing >> details about the technique? > > The compressor, most definitely. The decompressor, usually not. I > can see how CM methods like LZP might give away a few hints looking at > the decompression process only (LZP builds a table and uses it to > provide prior context, just like the compression phase), but not the > entire method. Of course the decompressor would. Doesn't matter what language or type of compressor or method it uses. If it is software, it can be reverse-engineered. Paul   0 Reply Paul 6/3/2009 3:25:28 AM "Paul" <paul@tretbase.com> wrote in message news:YGkVl.6858$En5.5079@newsfe09.iad...
>
> "Pete Fraser" <pfraser@covad.net> wrote in message
>> If Mark determines that you (or anybody for that matter)
>> has satisfied his criteria, I'll contribute an additional $100 >> to the winner. > I just happened to notice this. Would you do that for anyone else as > well? Sure. Pete   0 Reply Pete 6/3/2009 5:05:56 AM On 2 Jun., 23:41, ffkfsoxfdzg...@mailinator.com wrote: > Once again, I'm not going to discuss the methodology of my compression > process. We're not interested in the details of your compression methodology. At least I'm not. This is about whether such a program with the mentioned properties can exist. And it can not. We simply asked you to defend yourself against the counting argument. If you can't do that, we have a contradiction (the counting argument proves you wrong, but your compressor really exists). A contradiction which is explained by you being wrong, which renders the "how do I securely demonstrate my program?" question irrelevant. There is no such program that would need any testing. > My goal is to prove the method in a secure manner. If you think I'm > wrong, help me prove that it's flawed by suggesting organizations that > specialize in this field. We /know/ you're wrong and we've been trying to explain that to you. People keep pointing this out because you're an easy target. They like to correct others on the internet and can't stand lies. :-) You can actually consider this as helping even though it doesn't answer your original question. Why? Because your question about secure demonstrations relies on the false premise of you having developed a magic compressor. Cheers! SG   0 Reply SG 6/3/2009 6:16:25 AM On Jun 2, 10:25 pm, "Paul" <p...@tretbase.com> wrote: > >> Isn't it plausible that providing a copy > >> of either the compressor or the decompressor might risk revealing > >> details about the technique? > > > The compressor, most definitely. The decompressor, usually not.
=A0I > > can see how CM methods like LZP might give away a few hints looking at > > the decompression process only (LZP builds a table and uses it to > > provide prior context, just like the compression phase), but not the > > entire method. > > Of course the decompressor would. =A0Doesn't matter what language or type= of > compressor or method it uses. =A0If it is software, it can be > reverse-engineered. Just because it can be reverse-engineered doesn't mean the *compression* method can be derived from the effort. For example, here is my work-in-progress prototype for the fastest possible decompressor possible on a 16-bit 8088/8086 CPU: =3D=3D=3Dbegin=3D=3D=3D (this assumes you have set SS:SP to the start of your compressed data, ES:DI to the start of the output buffer, and DS:SI equal to ES:DI) POP DX ;total size of output, so we know when to stop! @cbitloop: POP BX ;get 16 bits of literal/match data ;-------begin of code block that repeats a total of 16 times------- ;-------this is not only an unroll speedup, but also because we're- ;-------out of registers to use as count vars! ;-)----------------- SHL BX,1 ;get leftmost control bit into carry bit JC @process_literal ;if carry set, next token is a literal POP CX ;otherwise, it's a code: grab the length, SHR CX,1 ;SHR to both grab carry bit and adjust range JC @process_match ;carry bit set? It's a match POP AX ;not set? It's a run. Grab date to make run REP STOSW ;output the word AX, CX times JMP @token_finished ;Finished with this token @process_match: POP SI ;next token is the offset to copy FROM REP MOVSW ;Copy CX words JMP @token_finished ;Finished with this token @process_literal: POP AX ;grab the literal... 
STOSW ;...and store to output buffer @token_finished: ;---------end of code block that repeats a total of 16 times------- CMP DI,DX ;compare di (where we are now) to dx (total) JL @cbitloop ;if current pos less than total, go another 16 rounds =3D=3D=3D=3Dend=3D=3D=3D=3D (It's a bit jump-heavy, which I will try to optimize someday, but the basic core optimizations are there.) So, there's my decompressor, source code included. I'll even help you understand it by explaining that it takes compressed data in the form of literals, match copies, and match runs, and copies/stores them to the output buffer using registers for everything, with the only memory accesses being reads from the input data, copies to the destination buffer, and the physical opcode fetches themselves. (Yes, I'm somewhat proud of it :-) Now that you have my complete decompressor, how does my COMPRESSOR work? Is it an RDC/LZRW1 variant that uses lazy matching? Is it LZSS with optimal parsing? Is it a rewritten LZO? (If so, which LZO variant did I use?) Is it refactored zlib/pkzip output that has been de-tokenized? There's no way of knowing. Even if I supply you with the decompressor *and* sample compressed data to run through it, it still doesn't tell you exactly how that compressed data was created. It is impossible for you to know exactly how my compression method works based on the decompressor alone.   0 Reply Jim 6/3/2009 8:43:58 AM <ffkfsoxfdzgjez@mailinator.com> wrote in message news:6950fc30-c458-4e45-91a2-479fcbe96ab3@z7g2000vbh.googlegroups.com... > On Jun 1, 8:47 am, Jim Leonard <MobyGa...@gmail.com> wrote: > >> Before you patent it, you must prove the technology really works. >> Which it doesn't. Unless your hard drive is filled with a single >> value, I can guarantee you that you haven't compressed your entire >> hard drive down to less than a megabyte. >> >> Here is a simple way to test your process without giving up any of >> your technology: >> >> 1. 
You (inventor) provide a decompressor. >> 2. We (testers) send you a file to compress. >> 3. You run the file through your compressor and send us the resulting >> compressed output. >> 4. We run the compressed output through the decompressor and compare >> to the original file we sent you. >> >> - If the files match, your decompression system works. >> - If the compressed file is smaller than the original file, then your >> compression system works. >> - If the size of the compressed file + the size of the decompression >> system is smaller than the original file, then you have achieved >> actual compression of the data. >> >> I am willing to help you test your system; you can email me if you >> like. > > I have thought of doing something just like this. The worry I have > though is with potential reverse engineering of the decompressor. I'm > having a rough time finding a way to show my product's ability, while > still keeping it protected. You don't need to be paranoid, none of us thinks it really works so none of us would try to steal it. And btw, exposing a web interface wouldn't work. Maybe you have it same the original file and you'd have no way to prove that you don't.   0 Reply Harold 6/3/2009 10:31:09 AM  "Pete Fraser" <pfraser@covad.net> wrote in message news:humdnUr8QrMgmbvXnZ2dnUVZ_sOdnZ2d@supernews.com... > "Paul" <paul@tretbase.com> wrote in message > news:YGkVl.6858$En5.5079@newsfe09.iad...
>>
>> "Pete Fraser" <pfraser@covad.net> wrote in message
>>> If Mark determines that you (or anybody for that matter)
>>> has satisfied his criteria, I'll contribute an additional $100
>>> to the winner.
>
>> I just happened to notice this.  Would you do that for anyone else as
>> well?
>
> Sure.
>
> Pete

I think you and Mark should up the amount.  I mean if it really CAN'T be
done then why not $10,000?

Paul


 0


"Jim Leonard" <MobyGamer@gmail.com> wrote in message
> On Jun 2, 10:25 pm, "Paul" <p...@tretbase.com> wrote:
>> >> Isn't it plausible that providing a copy
>> >> of either the compressor or the decompressor might risk revealing
>> >> details about the technique?
>>
>> > The compressor, most definitely.  The decompressor, usually not.  I
>> > can see how CM methods like LZP might give away a few hints looking at
>> > the decompression process only (LZP builds a table and uses it to
>> > provide prior context, just like the compression phase), but not the
>> > entire method.
>>
>> Of course the decompressor would.  Doesn't matter what language or type
>> of
>> compressor or method it uses.  If it is software, it can be
>> reverse-engineered.
>
> Just because it can be reverse-engineered doesn't mean the
> *compression* method can be derived from the effort.
>
> For example, here is my work-in-progress prototype for the fastest
> possible decompressor possible on a 16-bit 8088/8086 CPU:
>
> ===begin===
>
> (this assumes you have set SS:SP to the start of your compressed data,
> ES:DI to the start of the output buffer, and DS:SI equal to ES:DI)
>
>  POP   DX ;total size of output, so we know when to stop!
> @cbitloop:
>  POP   BX       ;get 16 bits of literal/match data
> ;-------begin of code block that repeats a total of 16 times-------
> ;-------this is not only an unroll speedup, but also because we're-
> ;-------out of registers to use as count vars! ;-)-----------------
>  SHL   BX,1 ;get leftmost control bit into carry bit
>  JC    @process_literal ;if carry set, next token is a literal
>  POP   CX ;otherwise, it's a code: grab the length,
>  SHR   CX,1 ;SHR to both grab carry bit and adjust range
>  JC    @process_match ;carry bit set?  It's a match
>  POP   AX ;not set?  It's a run.  Grab data to make run
>  REP   STOSW ;output the word AX, CX times
>  JMP   @token_finished ;Finished with this token
> @process_match:
>  POP   SI                      ;next token is the offset to copy FROM
>  REP   MOVSW ;Copy CX words
>  JMP   @token_finished ;Finished with this token
> @process_literal:
>  POP   AX ;grab the literal...
>  STOSW ;...and store to output buffer
> @token_finished:
> ;---------end of code block that repeats a total of 16 times-------
>  CMP   DI,DX ;compare di (where we are now) to dx (total)
>  JL    @cbitloop ;if current pos less than total, go another 16
> rounds
>
> ====end====
>
> (It's a bit jump-heavy, which I will try to optimize someday, but the
> basic core optimizations are there.)  So, there's my decompressor,
> source code included.  I'll even help you understand it by explaining
> that it takes compressed data in the form of literals, match copies,
> and match runs, and copies/stores them to the output buffer using
> registers for everything, with the only memory accesses being reads
> from the input data, copies to the destination buffer, and the
> physical opcode fetches themselves.  (Yes, I'm somewhat proud of
> it :-)
>
> Now that you have my complete decompressor, how does my COMPRESSOR
> work?  Is it an RDC/LZRW1 variant that uses lazy matching?  Is it LZSS
> with optimal parsing?  Is it a rewritten LZO?  (If so, which LZO
> variant did I use?)  Is it refactored zlib/pkzip output that has been
> de-tokenized?  There's no way of knowing.  Even if I supply you with
> the decompressor *and* sample compressed data to run through it, it
> still doesn't tell you exactly how that compressed data was created.
> It is impossible for you to know exactly how my compression method
> works based on the decompressor alone.

Jim, I'm not going to take the time to figure it out.  But if you're an
assembly programmer then you should have known your statements were
ridiculous.

Paul


 0
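The dispute above is easier to follow with the token format restated in a
higher-level language.  Below is a rough Python rendering of the 8088
decompressor Jim posted (a sketch, not his code: the word-oriented stream,
the 16 flag bits per control word, and the literal/run/match split come from
his description; the absolute word-index match source and the per-token end
check are assumptions):

```python
def decompress(words):
    """Reference decoder for the token stream described above: the first
    word is the output length in words; each control word supplies 16 flag
    bits (MSB first); flag=1 means a literal word follows, flag=0 means a
    length word whose low bit selects match-copy (1) or run (0)."""
    it = iter(words)
    total = next(it)                    # POP DX: output size in words
    out = []
    while len(out) < total:
        ctrl = next(it)                 # POP BX: 16 control bits
        for _ in range(16):
            if len(out) >= total:       # simplification: check per token
                break
            carry = (ctrl >> 15) & 1    # SHL BX,1: MSB into carry
            ctrl = (ctrl << 1) & 0xFFFF
            if carry:                   # literal: POP AX / STOSW
                out.append(next(it))
            else:
                length = next(it)       # POP CX
                kind = length & 1       # SHR CX,1: low bit into carry
                length >>= 1
                if kind:                # match: POP SI / REP MOVSW
                    src = next(it)      # assumed: word index into out
                    for i in range(length):
                        out.append(out[src + i])
                else:                   # run: POP AX / REP STOSW
                    out.extend([next(it)] * length)
    return out

# literal 65, run of three 7s, literal 9, match copying 1 word from index 0:
stream = [6, 0b1010000000000000, 65, (3 << 1) | 0, 7, 9, (1 << 1) | 1, 0]
print(decompress(stream))   # [65, 7, 7, 7, 9, 65]
```

Note that nothing in this decoder reveals how the matching token stream was
produced, which is exactly Jim's point in the exchange above.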


"Harold Aptroot" <harold.aptroot@gmail.com> wrote in message
news:a0615$4a2650ec$53558f0b$28312@cache110.multikabel.net...
> <ffkfsoxfdzgjez@mailinator.com> wrote in message
> news:6950fc30-c458-4e45-91a2-479fcbe96ab3@z7g2000vbh.googlegroups.com...
>> On Jun 1, 8:47 am, Jim Leonard <MobyGa...@gmail.com> wrote:
>>
>>> Before you patent it, you must prove the technology really works.
>>> Which it doesn't.  Unless your hard drive is filled with a single
>>> value, I can guarantee you that you haven't compressed your entire
>>> hard drive down to less than a megabyte.
>>>
>>> Here is a simple way to test your process without giving up any of
>>> your technology:
>>>
>>> 1. You (inventor) provide a decompressor.
>>> 2. We (testers) send you a file to compress.
>>> 3. You run the file through your compressor and send us the resulting
>>> compressed output.
>>> 4. We run the compressed output through the decompressor and compare
>>> to the original file we sent you.
>>>
>>> - If the files match, your decompression system works.
>>> - If the compressed file is smaller than the original file, then your
>>> compression system works.
>>> - If the size of the compressed file + the size of the decompression
>>> system is smaller than the original file, then you have achieved
>>> actual compression of the data.
>>>
>>> I am willing to help you test your system; you can email me if you
>>> like.
>>
>> I have thought of doing something just like this. The worry I have
>> though is with potential reverse engineering of the decompressor. I'm
>> having a rough time finding a way to show my product's ability, while
>> still keeping it protected.
>
> You don't need to be paranoid, none of us thinks it really works so none
> of us would try to steal it.
> And btw, exposing a web interface wouldn't work.  Maybe you have it save
> the original file and you'd have no way to prove that you don't.

Yeah, I also looked at that idea and came to the same conclusion.  It
doesn't provide any proof.

I'm curious if using some form of trusted screen capture program that
can't be manipulated would work.  One that has a proprietary video
format.

Paul

 0  Reply  Paul  6/3/2009 2:22:04 PM

On Jun 2, 3:06 pm, ffkfsoxfdzg...@mailinator.com wrote:
> Willem, I understand the counting theorem, and know where you are
> going with this. You honestly need to see the wrinkle to understand my
> compression method. It's like me trying to explain the periodic table
> to 19th century scientists who didn't properly understand the atom.

It would seem you don't.  You were ok with saying all 1000-bit strings
had unique compressions, then ignored the inconsistency with the fact
that there are fewer shorter strings than longer ones.

Congrats, you're a troll.  Get a life.

 0  Reply  tom  6/3/2009 2:24:34 PM

On Jun 3, 2:15 pm, "Paul" <p...@tretbase.com> wrote:
> "Pete Fraser" <pfra...@covad.net> wrote in message
>
> news:humdnUr8QrMgmbvXnZ2dnUVZ_sOdnZ2d@supernews.com...
>
> > "Paul" <p...@tretbase.com> wrote in message
> >news:YGkVl.6858$En5.5079@newsfe09.iad...
>
> >> "Pete Fraser" <pfra...@covad.net> wrote in message
> >>> If Mark determines that you (or anybody for that matter)
> >>> has satisfied his criteria, I'll contribute an additional $100 > >>> to the winner. > > >> I just happened to notice this. Would you do that for anyone else as > >> well? > > > Sure. > > > Pete > > I think you and Mark should up the amount. I mean if it really CAN'T be > done then why not$10,000?
>
> Paul

Mike Goldman used to offer $5,000 for anyone who could pass his test.
'Dunno if he still does.  Negotiate and ask him to up the prize to
$10,000 due to inflation and all that.

 0

ffkfsoxfdzgjez@mailinator.com wrote:
> On Jun 2, 11:45 am, Willem <wil...@snail.stack.nl> wrote:
>> ffkfsoxfdzg...@mailinator.com wrote:
>>
>> ) 1) Absolutely. 800 bits, 1000 bits, is okay. I can go below 100 bytes,
>> ) I was just overestimating to be safe because I haven't figured the
>> ) exact numbers yet.
>> )
>> ) 2) No, the wrinkle doesn't involve extra input.
>> )
>> ) 3) Yes, it works for all possible 1000 bit strings
>> )
>> ) 4) Yes, the compressed output is different for every input.
>>
>> OK, next questions:
>>
>> - How many possible 1000-bit strings are there ?
>>
>> - Will the compressed output be different for each and every one of those ?
>> - If so, then how many different possible compressed outputs are there ?
>>
>> - How many possible 950-bit strings are there ?
>>
>> SaSW, Willem
>> --
>> Disclaimer: I am in no way responsible for any of the statements
>>             made in the above text. For all I know I might be
>>             drugged or something..
>>             No I'm not paranoid. You all think I'm paranoid, don't you !
>> #EOT
>
> Willem, I understand the counting theorem, and know where you are
> going with this. You honestly need to see the wrinkle to understand my
> compression method. It's like me trying to explain the periodic table
> to 19th century scientists who didn't properly understand the atom.

You need a better metaphor ...
Dmitri Mendeleev created the periodic table in 1869.

You are claiming new mathematics invalidating the counting argument;
publish in a math journal.  Primacy of publication will give you all the
legal protection you need.

 0
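The counting argument that Willem's questions are driving at can be checked
mechanically for small sizes.  A sketch in Python; n = 10 stands in for 1000
purely so the exhaustive loop finishes instantly, and the bit-dropping
"compressor" is a deliberately bad stand-in, not anyone's actual method:

```python
from itertools import product

n = 10  # bit length; the argument is identical for n = 1000

inputs = 2 ** n                          # distinct n-bit strings: 1024
outputs = sum(2 ** k for k in range(n))  # strings shorter than n bits
assert outputs == inputs - 1             # 2**n - 1 = 1023

# Since there are fewer short strings than long ones, ANY map from all
# n-bit strings into shorter strings must send two inputs to the same
# output, and a collision is exactly what no decompressor can undo.
# Demonstrate with a concrete (deliberately bad) "compressor" that
# simply drops the last bit:
compress = lambda bits: bits[:-1]

seen = {}
collision = None
for tup in product("01", repeat=n):
    s = "".join(tup)
    c = compress(s)
    if c in seen:
        collision = (seen[c], s)  # two inputs, one compressed form
        break
    seen[c] = s

print(collision)  # ('0000000000', '0000000001') -- indistinguishable
```

Any claimed "wrinkle" would have to evade this count, which no injective
mapping can do.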

"Paul" <paul@tretbase.com> wrote in message
news:nGvVl.7563$En5.5559@newsfe09.iad...

> I'm curious if using some form of trusted screen capture program that
> can't be manipulated would work.

If you don't want to send one of us the decompressor first,
at least do the test yourself.

How big is your decompressor?
How big is the compressed result of the million digit file?
Have you verified that the output of the decompressor
is identical to the original file?

Pete

 0  Reply  Pete  6/3/2009 2:56:18 PM

"Pete Fraser" <pfraser@covad.net> wrote in message
news:2Yqdnb3jwZiFErvXnZ2dnUVZ_uKdnZ2d@supernews.com...
> "Paul" <paul@tretbase.com> wrote in message
> news:nGvVl.7563$En5.5559@newsfe09.iad...
>
>> I'm curious if using some form of trusted screen capture program that
>> can't be manipulated would work.
>
> If you don't want to send one of us the decompressor first,
> at least do the test yourself.
>
> How big is your decompressor?
> How big is the compressed result of the million digit file?
> Have you verified that the output of the decompressor
> is identical to the original file?
>
> Pete
>

Pete, I don't have those answers for you just yet.  Decompressor size
continues to change.  I have verified output is same on data that I have
been working with.

I'll get those for you though here after some more changes to the
decompressor (which will result in smaller size).  By the way, I'm currently
using a prototype language so it is interpreted which allows me to quickly
make changes at this point.  I plan to move on to something performance
oriented and concise later on.  But currently, to provide a decompressor
size would require providing the interpreter size and the script size.

Paul


 0
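Jim's four-step test and Pete's three questions amount to a small harness
anyone can run.  A sketch in Python, with zlib standing in for the claimed
compressor and the helper name run_challenge invented for illustration:

```python
import os
import zlib

def run_challenge(original, compress, decompress, decompressor_size):
    """Steps 2-4 of the protocol quoted earlier: compress, decompress,
    verify the round trip, then report the size criteria."""
    compressed = compress(original)
    assert decompress(compressed) == original, "round trip failed"
    return {
        "original": len(original),
        "compressed": len(compressed),
        # the honest yardstick: compressed data PLUS the decompressor
        "total": len(compressed) + decompressor_size,
    }

# decompressor_size is 0 here only because zlib ships with Python; a
# real challenge would count the size of the shipped decompressor too.
text = run_challenge(b"hello world " * 10_000, zlib.compress,
                     zlib.decompress, decompressor_size=0)
rand = run_challenge(os.urandom(100_000), zlib.compress,
                     zlib.decompress, decompressor_size=0)

print(text["compressed"] < text["original"])  # True: redundant input shrinks
print(rand["compressed"] < rand["original"])  # False: random input does not
```

Running this shows why the million-random-digits file is the standard
challenge input: a real compressor wins on redundant data and loses on
random data, exactly as the counting argument predicts.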

"Paul" <paul@tretbase.com> wrote in message
news:tmxVl.581$Qg6.53@newsfe08.iad...

> Pete, I don't have those answers for you just yet.  Decompressor size
> continues to change.  I have verified output is same on data that I have
> been working with.
>
> I'll get those for you though here after some more changes to the
> decompressor (which will result in smaller size).

Forget the decompressor size.
What size is the compressed file?

 0  Reply  Pete  6/3/2009 4:26:43 PM

"Pete Fraser" <pfraser@covad.net> wrote in message
news:8vKdnUcmyMrQObvXnZ2dnUVZ_gadnZ2d@supernews.com...
>
> "Paul" <paul@tretbase.com> wrote in message
> news:tmxVl.581$Qg6.53@newsfe08.iad...
>
>> Pete, I don't have those answers for you just yet.  Decompressor size
>> continues to change.  I have verified output is same on data that I have
>> been working with.
>>
>> I'll get those for you though here after some more changes to the
>> decompressor (which will result in smaller size).
>
> Forget the decompresser size.
> What size is the compressed file?
>

I'll tell you after I compress the file.  I'm going to reconfigure for 4 bit
blocks to 3 bits and let you know.  I'll also post in my forum at
www.tretbase.com when I get it completed.

Paul


 0

On Jun 3, 9:19 am, "Paul" <p...@tretbase.com> wrote:
> Jim, I'm not going to take the time to figure it out.  But if you're an
> assembly programmer then you should have known your statements were
> ridiculous.

I guess you didn't read what I wrote.  My point is that, given my
decompressor, it is impossible to figure out exactly the method I am
using to compress the data.  I gave sufficient information to prove
this.  Therefore, your claims of "anyone can figure out the entire
method given the decompressor" is false.

If you're not going to read what people are posting, then stop making
idiotic statements without basis or proof.

 0

On Jun 3, 12:08 pm, "Paul" <p...@tretbase.com> wrote:
>
> I'll tell you after I compress the file.  I'm going to reconfigure for 4 bit
> blocks to 3 bits and let you know.

Why stop there?  If your method can take any 4-bit block and make it
into 3 bits and can still somehow reverse the transform in all cases,
why not just go all the way configured for 2-bit down to 1-bit?  You
should be able to compress any file down to a single bit, right?

> I'll also post in my forum at www.tretbase.com when I get it completed.

We do not follow your registration-required forum.  Post here, as you
have been doing.


 0


"Jim Leonard" <MobyGamer@gmail.com> wrote in message
> On Jun 3, 9:19 am, "Paul" <p...@tretbase.com> wrote:
>> Jim, I'm not going to take the time to figure it out.  But if you're an
>> assembly programmer then you should have known your statements were
>> ridiculous.
>
> I guess you didn't read what I wrote.  My point is that, given my
> decompressor, it is impossible to figure out exactly the method I am
> using to compress the data.  I gave sufficient information to prove
> this.  Therefore, your claims of "anyone can figure out the entire
> method given the decompressor" is false.
>
> If you're not going to read what people are posting, then stop making
> idiotic statements without basis or proof.

I just got done talking about this subject in another group with some much
more talented programmers than myself who are directly involved in the
production of a programming language, and they said that anyone with a
decompressor and the input/output files would be able to reverse engineer
and determine the method.  So again, your statements don't hold weight.

Paul


 0


"Jim Leonard" <MobyGamer@gmail.com> wrote in message
> On Jun 3, 12:08 pm, "Paul" <p...@tretbase.com> wrote:
>>
>> I'll tell you after I compress the file.  I'm going to reconfigure for 4
>> bit
>> blocks to 3 bits and let you know.
>
> Why stop there?  If your method can take any 4-bit block and make it
> into 3 bits and can still somehow reverse the transform in all cases,
> why not just go all the way configured for 2-bit down to 1-bit?  You
> should be able to compress any file down to a single bit, right?
>
>> I'll also post in my forum atwww.tretbase.comwhen I get it completed.
>
> We do not follow your registration-required forum.  Post here, as you
> have been doing.
>

No, I don't make such a claim that I can take any file down to 1 bit.  That
is not a claim I have ever made here.  I make the claim that I can
recursively reduce random data.  It is others that want to produce the
strawman argument into this.

Paul


 0

On Jun 3, 2:53 pm, "Paul" <p...@tretbase.com> wrote:
> No, I don't make such a claim that I can take any file down to 1 bit.  That
> is not a claim I have ever made here.  I make the claim that I can
> recursively reduce random data.  It is others that want to introduce the
> strawman argument into this.

You've said you can compress all files of N bits in length
(specifically N = 1000) and uniquely decode them all.

I wonder what prevents you from compressing all files from N bits to
<= N-1 bits until N = 2 that doesn't prevent you from compressing all
N = 1000 bit files?

Oh, could it be NOTHING?

Also, if your claim is just that you can compress compressed data
(some of the time) that's not original nor unique.  Have you actually
compressed all 1000-bit files, or do you have a valid proof your method
would work?  I sincerely doubt so.

At best you have an algorithm you applied to a few handcrafted files
and saw some progress and are [incorrectly] extrapolating it to all
files.

At worst, and sadly most likely, you're just trolling because you're
seeking attention by negative means because you don't know [or care]
to act better.

Tom

 0


<tom@iahu.ca> wrote in message
> On Jun 3, 2:53 pm, "Paul" <p...@tretbase.com> wrote:
>> No, I don't make such a claim that I can take any file down to 1 bit.
>> That
>> is not a claim I have ever made here.  I make the claim that I can
>> recursively reduce random data.  It is others that want to produce the
>> strawman argument into this.
>
> You've said you can compress all files of N-bits of length
> (specifically N = 1000) and uniquely decode them all.
>
> I wonder what prevents you from compressing all files from N-bits to
> <= N-1 bits until N=2 that doesn't prevent you from compressing all
> N=1000 bit files?
>
> Oh, could it be NOTHING?
>
> Also, if your claim is just that you can compress compressed data
> (some of the time) that's not original nor unique.  Have you actually
> compressed all 1000-bit files, or do you have a valid proof your method
> would work?  I sincerely doubt so.
>
> At best you have an algorithm you applied to a few handcrafted files
> and saw some progress and are [incorrectly] extrapolating it to all
> files.
>
> At worst, and sadly most likely, you're just trolling because you're
> seeking attention by negative means because you don't know [or care]
> to act better.
>
> Tom

You got the wrong person.  You must be referring to the person that started
this thread.  I never made such a claim.

Paul


 0

Jim Leonard <MobyGamer@gmail.com> writes:
> On Jun 2, 5:41 pm, Keith Thompson <ks...@mib.org> wrote:
>> Isn't it plausible that providing a copy
>> of either the compressor or the decompressor might risk revealing
>> details about the technique?
>
> The compressor, most definitely.  The decompressor, usually not.  I
> can see how CM methods like LZP might give away a few hints looking at
> the decompression process only (LZP builds a table and uses it to
> provide prior context, just like the compression phase), but not the
> entire method.

*Usually* not.

I understand that the impossibility of unlimited recursive compression
of random data is based on fundamental mathematics (the counting
theorem).  Is the impossibility (or great difficulty) of determining
the methods used by a compressor given a copy of the decompressor
similarly based on fundamental mathematics, or is it just a common
attribute of existing compressors and decompressors?  (That's not a
rhetorical question; I don't know the answer.)

These people are claiming novel techniques.  If we're going to take
them seriously enough to suggest how they can demonstrate their
claims, using testing methods that would give actual results *if*
their claims were valid, then I think we have to acknowledge the
possibility that revealing the decompressor would reveal the
techniques, even if it wouldn't do so for an ordinary
compressor/decompressor.

And if there is a fundamental mathematical reason why revealing the
decompressor wouldn't pose any risk, I suspect that it's something
more subtle than the counting theorem -- and if they don't accept
that, they're hardly going to accept the other idea.

It seems to me that insisting on a copy of the decompressor just gives
them an excuse to refuse testing.

--
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Nokia
"We must do something.  This is something.  Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"

 0

On Jun 3, 3:14 pm, Keith Thompson <ks...@mib.org> wrote:
> And if there is a fundamental mathematical reason why revealing the
> decompressor wouldn't pose any risk, I suspect that it's something
> more subtle than the counting theorem -- and if they don't accept
> that, they're hardly going to accept the other idea.
>
> It seems to me that insisting on a copy of the decompressor just gives
> them an excuse to refuse testing.

I think more so they refuse because it means their ruse would be up.
They think they can keep up the "debate" or thread or whatever, so
long as they appear to have a legitimate complaint against the
standards of science and refuse to listen to reason.

Amongst us sane people we're all aware of the counting argument and it
"just makes sense."  We're not trying to convince ourselves.  I think
a few people here genuinely want to "educate" the troll, a few just
like kicking around replies, and people like me like calling spades
spades.

The OP and Paul [and people like them] are just trolls.  They know, or
should by now, that their nonsense doesn't pass muster.  They just
keep it up for the attention it gets them.  So we should stop acting
like there is some need to "correct" them because engaging them, even
on a "challenging your result" level implies you acknowledge them as
someone worthy to discuss interesting matters with.  They're not.  If
you met one of these people in real life the conversation would last
all of a minute at most.  You'd say "counting argument" and they'd
claim "oh but no I have that solved" and it'd just end there.

Who cares?  Provided they get a patent it's not like you'd be using
the method anyways.  Look at the resistance to LZW prior to the
expiration of that patent.  People actually invented a whole new
[better] format just to side-step that annoying patent.

That's also an argument against patenting your algorithm.  Unless you
find a new novel application space [like they did with MDCT for audio]
nobody cares.  Bulk data compression is handled just fine by things
like PPM, BWT, LZ77, etc...  Bandwidth is cheap.  So push come to
shove I'll use PPM or BWT (like bzip2) over a patented method any day.

Tom

 0

On Jun 3, 3:14 pm, "Paul" <p...@tretbase.com> wrote:
> You got the wrong person.  You must be referring to the person that
> started this thread.  I never made such a claim.
>
> Paul

Ah, perhaps, you trolls are so interchangeable these days...

Well since you're on an infinite compressor bender too, you can answer
the claim too.

Tom

 0

"Keith Thompson" <kst-u@mib.org> wrote in message
news:ln8wk92kpo.fsf@nuthaus.mib.org...

> Is the impossibility (or great difficulty) of determining
> the methods used by a compressor given a copy of the decompressor
> similarly based on fundamental mathematics,

No.


 0

On Jun 2, 7:41 am, SG <s.gesem...@gmail.com> wrote:
> On 2 Jun., 12:15, t...@iahu.ca wrote:
>
> > On Jun 1, 7:15 pm, ffkfsoxfdzg...@mailinator.com wrote:
>
> > > I'm
> > > having a rough time finding a way to show my product's ability, while
> > > still keeping it protected.
>
> > Is that because it doesn't actually work?
>
> > I don't get you trolls, like it's an obvious gag, so why do you keep
> > perpetuating this nonsense?
>
> I think the most likely explanation is incompetence. Many of those
> random-compression-posters probably believe in their algorithms and
> this belief is sometimes so strong that they think they don't need any
> testing (decompressor).
> http://en.wikipedia.org/wiki/Dunning-Kruger_effect
> http://en.wikipedia.org/wiki/Hanlon%27s_razor
>
> Cheers!
> SG

Interesting articles.  I don't disagree that that's possible.  But
since it's so trivial to disprove their claims (which many have done
here over the years) you'd think they'd open their eyes to it.

Which is why I attribute it to malice.  EVEN IF they still think
they're right, they shouldn't in face of the abundance of evidence/
proof to the contrary.  It's a case of "they ought to know better."
So I wouldn't let them off the hook even if they just happen to not
know better.

Point is, if I walked into my lab here, claimed something for which I
couldn't prove, and was then easily disproven, I would not only be
expected to readjust my thinking, but if I was being an ass about my
claims I'd have to make a beer store run to return normalcy to the
lab.  :-)

Tom

 0

On Jun 3, 10:15 am, "Paul" <p...@tretbase.com> wrote:
> I think you and Mark should up the amount.  I mean if it really CAN'T be
> done then why not $10,000?
>
> Paul

How about this, I'll offer $10,000 for a compressor which fairly
violates the counting argument (e.g. can decompress correctly,
decompressor+compressed data smaller than the source), if you offer to
pay me $5,000 up front, refundable upon completion of the challenge.

If you're so confident it CAN be done, you'd be stupid not to accept
my terms.

Tom

 0  Reply  tom  6/3/2009 7:56:26 PM

tom@iahu.ca wrote:
) On Jun 3, 3:14 pm, Keith Thompson <ks...@mib.org> wrote:
)> <...>
)> It seems to me that insisting on a copy of the decompressor just gives
)> them an excuse to refuse testing.
)
) I think more so they refuse because it means their ruse would be up.

Yes, but that's not the *excuse* they would use.

SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
            made in the above text. For all I know I might be
            drugged or something..
            No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT

 0  Reply  Willem  6/3/2009 8:02:26 PM

On Jun 3, 9:15 am, "Paul" <p...@tretbase.com> wrote:
>
> I think you and Mark should up the amount.  I mean if it really CAN'T be
> done then why not $10,000?
>

Offering $1,000 would mean that people would be tempted to game the
rules, not the compression.

I really don't want to be in court trying to explain to a judge
why it doesn't count for a win when you move information from the
contents of a file into the name of the file.

-  Mark

 0  Reply  Mark  6/3/2009 8:05:49 PM

On Jun 3, 1:51 pm, "Paul" <p...@tretbase.com> wrote:
> I just got done talking about this subject in another group with some much
> more talented programmers than myself who are directly involved in the
> production of a programming language, and they said that anyone with a
> decompressor and the input/output files would be able to reverse engineer
> and determine the method.

Then I invite you to direct them to my post with my decompressor and
ask them to determine the compression method.

 0  Reply  Jim  6/3/2009 8:20:36 PM

<tom@iahu.ca> wrote in message
news:97d5f2fa-6a19-49f2-b0c2-eb8ae6c3111d@z7g2000vbh.googlegroups.com...
> On Jun 3, 10:15 am, "Paul" <p...@tretbase.com> wrote:
>> I think you and Mark should up the amount.  I mean if it really CAN'T be
>> done then why not $10,000?
>>
>> Paul
>
> How about this, I'll offer $10,000 for a compressor which fairly > violates the counting argument (e.g. can decompress correctly, > decompressor+compressed data smaller than the source), if you offer to > pay me$5,000 up front, refundable upon completion of the challenge.
>
> If you're so confident it CAN be done, you'd be stupid not to accept
> my terms.
>
> Tom

Publish your challenge.  Start a webpage with all the details.  What is
acceptable and what is not acceptable.

Paul


 0
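[The counting argument invoked in the challenge above can be checked mechanically. The following sketch is illustrative only (it is not anyone's contest code): there are 2**N distinct N-bit strings, but only 2**N - 1 bit strings of any shorter length, so no lossless compressor can shrink every input.]

```python
# Pigeonhole check behind the counting argument: compare the number of
# bit strings strictly shorter than N bits against the 2**N strings of
# length exactly N.
def shorter_strings(n):
    # there are 2**i strings of each length i = 0 .. n-1
    return sum(2**i for i in range(n))

for n in range(1, 21):
    assert shorter_strings(n) == 2**n - 1   # closed form of the sum
    assert shorter_strings(n) < 2**n        # always one codeword short
```

[However the bits are rearranged between passes, a scheme that shrinks every N-bit input needs 2**N distinct shorter outputs and only 2**N - 1 exist; a recursive scheme faces the same deficit on every pass.]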


"Mark Nelson" <snorkelman@gmail.com> wrote in message
> On Jun 3, 9:15 am, "Paul" <p...@tretbase.com> wrote:
>>
>> I think you and Mark should up the amount.  I mean if it really CAN'T be
>> done then why not $10,000?
>>
>
> Offering $1,000 would mean that people would be tempted to game the
> rules, not the compression.
>
> I really don't want to be in court trying to explain to a judge
> why it doesn't count for a win when you move information from the
> contents of a file into the name of the file.
>
> -  Mark

Come on Mark, that is an excuse.  You can make all the clarifications you
believe are necessary.  Right now a $100 challenge tells me you have
concerns and are not confident in your claims.

Paul

 0 Reply Paul 6/3/2009 9:02:36 PM

"Jim Leonard" <MobyGamer@gmail.com> wrote in message
news:39e75c6b-a39f-48e7-a9e6-799c1b537aa1@q2g2000vbr.googlegroups.com...
> On Jun 3, 1:51 pm, "Paul" <p...@tretbase.com> wrote:
>> I just got done talking about this subject in another group with some
>> much more talented programmers than myself that are directly involved
>> in the production of a programming language, and they said that anyone
>> with a decompressor and the input/output files would be able to
>> reverse engineer and determine the method.
>
> Then I invite you to direct them to my post with my decompressor and
> ask them to determine the compression method.

I really don't think anyone is going to waste their time with that.

Paul

 0 Reply Paul 6/3/2009 9:04:11 PM

<tom@iahu.ca> wrote in message
news:a579407a-8d8f-43f3-a265-4715f7cb0b1d@s28g2000vbp.googlegroups.com...
> On Jun 3, 3:14 pm, "Paul" <p...@tretbase.com> wrote:
>> You got the wrong person.  You must be referring to the person that
>> started this thread.  I never made such a claim.
>>
>> Paul
>
> Ah, perhaps, you trolls are so interchangeable these days...
>
> Well since you're on an infinite compressor bender too, you can answer
> the claim too.
>
> Tom

Tom, my claim is that I can recursively reduce random data.  Does that
sound impossible to you?

Paul

 0 Reply Paul 6/3/2009 9:10:07 PM

"Paul" <paul@tretbase.com> writes:
[...]
> Tom, my claim is that I can recursively reduce random data.  Does that
> sound impossible to you?

Have you tried it on AMillionRandomDigits.bin?

--
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Nokia
"We must do something.  This is something.  Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"

 0 Reply Keith 6/4/2009 12:20:59 AM

"Keith Thompson" <kst-u@mib.org> wrote in message
news:lniqjc26j8.fsf@nuthaus.mib.org...
> "Paul" <paul@tretbase.com> writes:
> [...]
>> Tom, my claim is that I can recursively reduce random data.  Does that
>> sound impossible to you?
>
> Have you tried it on AMillionRandomDigits.bin?

I will, I am currently re-programming my compressor.

Paul

 0 Reply Paul 6/4/2009 12:37:33 AM

This thread really grew, there's a lot to digest.  But let me say a few
things:

1) I'm not here to "troll" or deceive.  I'm not trying to sell you guys
anything or get praise for my method.  My ONLY goal here is to get
advice from people who I think are well-versed in compression on a way
to bring this to market.

2) If you don't believe me, that's fine.  Skepticism is always healthy
and always welcome.  Many of the assumptions made in this thread are
false, and I look forward to showing why when my product comes to
market.  I'm not here to debate theorems or methodology; this is not the
time nor place for that.  Would I love to reveal my method right this
very second?  But then years of work would be wasted and I wouldn't see
a penny for my efforts.

3) Thanks to all that have given me advice, instead of insults.
Especially Paul, and Mark Nelson.

4) Let's please keep this discussion civil.  I know many think my claims
are "impossible", but we're all adults here, let's treat each other as
such.

 0 Reply ffkfsoxfdzgjez 6/4/2009 1:15:49 AM

<ffkfsoxfdzgjez@mailinator.com> wrote in message
news:beeedbbe-e10e-4e5e-995b-dd21616fc536@c36g2000yqn.googlegroups.com...
> Many of the assumptions made in this thread are
> false, and I look forward to showing why when my product comes to
> market.  I'm not here to debate theorems or methodology, this is not
> the time nor place for that.  Would I love to reveal my method right
> this very second, but then years of work would be wasted and I
> wouldn't see a penny for my efforts.

Would you be interested in providing us the same information
that Paul has provided (or will soon)?

1) Can you compress the million random digits file, then confirm that
the decompressor returns the original file?
Paul has confirmed this already.

2) Can you give us the size of the compressed file?  Paul is making
a few changes to his compressor, then will do that.  I'm not sure
why he didn't give us the size with his previous compressor.
Perhaps it didn't compress as well as he had hoped - I assume he'll
repeat step 1) with the new compressor.

3) Can you tell us the size of the decompressor?  Paul was not
willing to tell us the size of his current decompressor, because
he felt it could be smaller.

Looking forward to the answers.

Pete

 0 Reply Pete 6/4/2009 1:42:27 AM

"Paul" <paul@tretbase.com> writes:
> "Keith Thompson" <kst-u@mib.org> wrote in message
> news:lniqjc26j8.fsf@nuthaus.mib.org...
>> "Paul" <paul@tretbase.com> writes:
>> [...]
>>> Tom, my claim is that I can recursively reduce random data.  Does that
>>> sound impossible to you?
>>
>> Have you tried it on AMillionRandomDigits.bin?
>
> I will, I am currently re-programming my compressor.

Ok, I'll just sit here and hold my breath until you post some concrete
results.

Or you could just try it with any existing working version of your
compressor rather than waiting until you've finished re-programming it.
You *do* have a working version, right?

--
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Nokia
"We must do something.  This is something.  Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"

 0 Reply Keith 6/4/2009 1:48:01 AM

On Jun 3, 6:42 pm, "Pete Fraser" <pfra...@covad.net> wrote:
> <ffkfsoxfdzg...@mailinator.com> wrote in message
>
> news:beeedbbe-e10e-4e5e-995b-dd21616fc536@c36g2000yqn.googlegroups.com...
> [...]
>
> Would you be interested in providing us the same information
> that Paul has provided (or will soon)?
>
> 1) Can you compress the million random digits file, then confirm that
> the decompressor returns the original file?
> Paul has confirmed this already.
>
> 2) Can you give us the size of the compressed file?  Paul is making
> a few changes to his compressor, then will do that.  I'm not sure
> why he didn't give us the size with his previous compressor.
> Perhaps it didn't compress as well as he had hoped - I assume he'll
> repeat step 1) with the new compressor.
>
> 3) Can you tell us the size of the decompressor?  Paul was not
> willing to tell us the size of his current decompressor, because
> he felt it could be smaller.
>
> Looking forward to the answers.
>
> Pete

1) Yes.  I did this early on to test my method.

2) I didn't compress it to the limit, so I'm not sure what the minimum
size is yet.  I only did it for a few cycles and the compressed file was
around 50% of the original, so next time I will see how low I can take
it.  I'll get some numbers for you soon; with work, I haven't had time
to do a lot of testing the past couple weeks.  That said, the
compression was consistent for the cycles I ran it through, and it
decompressed to match the original input.
3) 200 kb

 0 Reply ffkfsoxfdzgjez 6/4/2009 2:03:08 AM

Paul wrote:
> Tom, my claim is that I can recursively reduce random data.  Does that
> sound impossible to you?

If you're talking about *lossless* (reversible) compression, and
compression implies a reduction in size of at least one bit, and "random
data" includes all possible files a typical file system can hold, then
yes, it does not only sound impossible, it *is* impossible with 100%
certainty -- *provably* impossible.  This means your efforts are futile
and you're better off spending the time trying to understand the proof.

Cheers!
SG

 0 Reply SG 6/4/2009 6:05:23 AM

On Jun 3, 5:00 pm, "Paul" <p...@tretbase.com> wrote:
> <t...@iahu.ca> wrote in message
>
> news:97d5f2fa-6a19-49f2-b0c2-eb8ae6c3111d@z7g2000vbh.googlegroups.com...
>
> > On Jun 3, 10:15 am, "Paul" <p...@tretbase.com> wrote:
> >> I think you and Mark should up the amount.  I mean if it really CAN'T be
> >> done then why not $10,000?
>
> >> Paul
>
> > How about this, I'll offer $10,000 for a compressor which fairly > > violates the counting argument (e.g. can decompress correctly, > > decompressor+compressed data smaller than the source), if you offer to > > pay me$5,000 up front, refundable upon completion of the challenge.
>
> > If you're so confident it CAN be done, you'd be stupid not to accept
> > my terms.
>
> > Tom
>
> Publish your challenge.  Start a webpage with all the details.  What is
> acceptable and what is not acceptable.

You wouldn't accept the challenge anyways.  I mean are you really
telling me you have $5,000 ready to part with to prove your claims?

That being said, my offer was more hyperbole to expose the flaw in
your argument.  It really should be the other way around: you pay us
for the chance to compete for a prize.  That way there is more
motivation to be correct and not cheat.  This no-money-down contest
idea just gives you incentive to find loopholes in the rules.

Tom

 0 Reply tom 6/4/2009 10:36:40 AM

On Jun 3, 5:10 pm, "Paul" <p...@tretbase.com> wrote:
> <t...@iahu.ca> wrote in message
>
> news:a579407a-8d8f-43f3-a265-4715f7cb0b1d@s28g2000vbp.googlegroups.com...
> [...]
>
> Tom, my claim is that I can recursively reduce random data.  Does that
> sound impossible to you?

You need stronger definitions of what you claim to do.  Recursively
could include the set of operations performed 0 times just as easily
as N>0 times.  You haven't defined what "random" data is either.

If you're claiming that you can compress all N-bit strings (N>1) to
something less than N bits long, you're a liar.  It's trivial to
disprove.  You can't compress all 2-bit strings to 1 bit because 1 bit
can only represent half as many values as a 2-bit string.  You can't
compress all 3-bit strings to 2 or 1 bits because even combined
(ignoring the need to be uniquely decodable) there are only 6 2- and
1-bit strings whereas there are 8 3-bit strings.  You can continue this
argument to N bits and the sum of all possible 1,2,3,...,N-1 bit
strings: \sum_{i=1}^{N-1} 2^i will always be less than 2^N (hint: the
sum is equal to 2^N - 2).
Even if you somehow allowed the zero-length string as a possible
encoding, you're still only up to a sum of 2^N - 1.  There are 2^N
values you need to encode.  So therefore you failed.  And this is
ignoring the fact that you need some escape bits to tell one length of
string from another.

So now that I've disproven that ANY FORM of random data compressor is
impossible, you're continuing to maintain that you've done it is just
you being a troll.  And there are no two ways about it (excuse the
pun).  You're now aware that it can't be done, so maintaining that it
can is just you being contrary.

What you need to do is stop trolling usenet because you're wasting
your life.  If you want to go out and have a good time, hit the local
pub, pound a few back and talk big with the folks at the counter.  At
least then you'll be interacting with real people in real life as
opposed to getting a rise out of people remotely and anonymously.
Honestly, I'm not saying this to be mean; you really ARE wasting
everyone's [including your own] time.

Tom

 0 Reply tom 6/4/2009 11:07:25 AM

On Jun 4, 7:07 am, t...@iahu.ca wrote:
> So now that I've disproven that ANY FORM of random data compressor is
> impossible, you're continuing to maintain that you've done it is just
> you being a troll.  And there are no two ways about it (excuse the
> pun).  You're now aware that it can't be done, so maintaining that it
> can is just you being contrary.

That should be of course "proven" ... teach me to post before coffee.

Anyways, I'm going back to the sidelines in these threads.

Tom

 0 Reply tom 6/4/2009 11:35:12 AM

<tom@iahu.ca> wrote in message
news:9604e056-877d-4724-b18c-54b047274780@o14g2000vbo.googlegroups.com...
> On Jun 3, 5:00 pm, "Paul" <p...@tretbase.com> wrote:
>> <t...@iahu.ca> wrote in message
>>
>> news:97d5f2fa-6a19-49f2-b0c2-eb8ae6c3111d@z7g2000vbh.googlegroups.com...
>>
>> > On Jun 3, 10:15 am, "Paul" <p...@tretbase.com> wrote:
>> >> I think you and Mark should up the amount.  I mean if it really CAN'T
>> >> be done then why not $10,000?
>>
>> >> Paul
>>
>> > How about this, I'll offer $10,000 for a compressor which fairly >> > violates the counting argument (e.g. can decompress correctly, >> > decompressor+compressed data smaller than the source), if you offer to >> > pay me$5,000 up front, refundable upon completion of the challenge.
>>
>> > If you're so confident it CAN be done, you'd be stupid not to accept
>> > my terms.
>>
>> > Tom
>>
>> Publish your challenge.  Start a webpage with all the details.  What is
>> acceptable and what is not acceptable.
>
> You wouldn't accept the challenge anyways.  I mean are you really
> telling me you have $5,000 ready to part with to prove your claims?
>
> That being said, my offer was more hyperbole to expose the flaw in
> your argument.  It really should be the other way around: you pay us
> for the chance to compete for a prize.  That way there is more
> motivation to be correct and not cheat.  This no-money-down contest
> idea just gives you incentive to find loopholes in the rules.
>
> Tom

Tom, I would be willing to give $5,000 to make $10,000.  Wouldn't
anyone?

Paul

 0 Reply Paul 6/4/2009 11:56:45 AM

<tom@iahu.ca> wrote in message
news:fd764921-31ce-4853-8c41-2a9c9a4396a1@r3g2000vbp.googlegroups.com...
> On Jun 3, 5:10 pm, "Paul" <p...@tretbase.com> wrote:
> [...]
>> Tom, my claim is that I can recursively reduce random data.  Does that
>> sound impossible to you?
>
> You need stronger definitions of what you claim to do.  Recursively
> could include the set of operations performed 0 times just as easily
> as N>0 times.  You haven't defined what "random" data is either.
>
> If you're claiming that you can compress all N-bit strings (N>1) to a
> size of something less than N bits long, you're a liar.  It's trivial
> to disprove.  You can't compress all 2-bit strings to 1 bit because 1
> bit can only represent half as many values as a 2-bit string.  You
> can't compress all 3-bit strings to 2 or 1 bits because even combined
> (ignoring the need to be uniquely decodable) there are only 6 2- and
> 1-bit strings whereas there are 8 3-bit strings.
> You can continue this argument to N bits and the sum of all possible
> 1,2,3,...,N-1 bit strings: \sum_{i=1}^{N-1} 2^i will always be less
> than 2^N (hint: the sum is equal to 2^N - 2).  Even if you somehow
> allowed the zero-length string as a possible encoding, you're still
> only up to a sum of 2^N - 1.  There are 2^N values you need to encode.
> So therefore you failed.
>
> [...]
>
> Tom

Tom, I'm beginning to believe that there is no such thing as Random
Data.  But I know the depth at which people here consider data Random,
so I go by that.  But for the sake of argument, let's call Random data
whatever you believe exhibits the greatest entropy.

There is no trolling going on here, Tom.  You disbelieve, so you
consider anyone that does believe to be trolling.

Paul

 0 Reply Paul 6/4/2009 12:02:12 PM

<tom@iahu.ca> wrote in message
news:c31e4767-e887-4006-9d6a-36b7ea49063f@s21g2000vbb.googlegroups.com...
> On Jun 4, 7:07 am, t...@iahu.ca wrote:
>> So now that I've disproven that ANY FORM of random data compressor is
>> impossible, [...]
>
> That should be of course "proven" ... teach me to post before coffee.
>
> Anyways, I'm going back to the sidelines in these threads.
>
> Tom

Tom, get your coffee.

Paul

 0 Reply Paul 6/4/2009 12:05:16 PM

"SG" <s.gesemann@gmail.com> wrote in message
news:03108d47-006b-4430-b121-e9a3d16ccc01@l12g2000yqo.googlegroups.com...
> Paul wrote:
>> Tom, my claim is that I can recursively reduce random data.  Does that
>> sound impossible to you?
>
> If you're talking about *lossless* (reversible) compression, and
> compression implies a reduction in size of at least one bit, and
> "random data" includes all possible files a typical file system can
> hold, then yes, it does not only sound impossible, it *is* impossible
> with 100% certainty -- *provably* impossible.  This means your efforts
> are futile and you're better off spending the time trying to
> understand the proof.
>
> Cheers!
> SG

Good, then I know I'm on to something.  Would have hated reinventing
the wheel.

Paul

 0 Reply Paul 6/4/2009 12:26:54 PM

On Jun 4, 8:02 am, "Paul" <p...@tretbase.com> wrote:
> <t...@iahu.ca> wrote in message
>
> news:fd764921-31ce-4853-8c41-2a9c9a4396a1@r3g2000vbp.googlegroups.com...
>
> > On Jun 3, 5:10 pm, "Paul" <p...@tretbase.com> wrote:
> >> <t...@iahu.ca> wrote in message
>
> >> news:a579407a-8d8f-43f3-a265-4715f7cb0b1d@s28g2000vbp.googlegroups.com...
>
> >> > On Jun 3, 3:14 pm, "Paul" <p...@tretbase.com> wrote:
> >> >> You got the wrong person.  You must be referring to the person that
> >> >> started this thread.  I never made such a claim.
>
> >> >> Paul
>
> >> > Ah, perhaps, you trolls are so interchangeable these days...
>
> >> > Well since you're on an infinite compressor bender too, you can
> >> > answer the claim too.
>
> >> > Tom
>
> >> Tom, my claim is that I can recursively reduce random data.  Does
> >> that sound impossible to you?
>
> > You need stronger definitions of what you claim to do.  [...]
>
> Tom, I'm beginning to believe that there is no such thing as Random
> Data.  But I know the depth at which people here consider data Random
> so I go by that.  But for the sake of argument, let's call Random data
> whatever you believe exhibits the greatest entropy.
>
> There is no trolling going on here, Tom.  You disbelieve so you
> consider anyone that does believe to be trolling.

The point though is that it doesn't matter if the data is random or
not.  Messages of length M where M<N cannot represent all possible
N-bit messages, whether they're random or enumerated.  Like a program
that simply emits all 1024-bit integers will need to have at least
1024 bits of storage to represent the counter, even though the
integers are sequential and totally non-random.  So to have a program
that emits all N-bit messages, you'll need either N bits [or more] of
input, or combine M<N bits of input with bits from the program.

And if you want to emit specific messages then the input must uniquely
determine the output, which means you either can't compress all N-bit
messages, or you violate M>N on average.

Tom

 0 Reply tom 6/4/2009 1:16:44 PM

On Jun 4, 8:05 am, "Paul" <p...@tretbase.com> wrote:
> <t...@iahu.ca> wrote in message
>
> news:c31e4767-e887-4006-9d6a-36b7ea49063f@s21g2000vbb.googlegroups.com...
> [...]
>
> Tom, get your coffee.
>
> Paul

It's all cute to be playful off my typo, but you've totally [as usual]
sidestepped the entire discussion.  Do you not accept my reasoning
that the sum total of all M<N bit messages is less than the total of
all messages of N bits long?

Tom

 0 Reply tom 6/4/2009 1:19:10 PM

<tom@iahu.ca> wrote in message
news:98056804-0474-4a6d-9757-608d54d329e6@s16g2000vbp.googlegroups.com...
> On Jun 4, 8:05 am, "Paul" <p...@tretbase.com> wrote:
> [...]
>
> It's all cute to be playful off my typo, but you've totally [as usual]
> sidestepped the entire discussion.  Do you not accept my reasoning
> that the sum total of all M<N bit messages is less than the total of
> all messages of N bits long?
>
> If you do, then you cannot then state "I can compress all N bit
> messages" without knowingly stating something you disagree with.
>
> If you do not, then can you please point out the flaw in my one
> sentence proof that you can't compress all equal length strings?
>
> Tom

Tom, I answered this a bit in the response to you in the million bit
SUCCESS thread.  Here is what I know.  There is a sequence of data that
may not compress on first pass.  However, it may recursively compress.
The reason for this is that the representation of the original data is
now represented differently on the first pass.  Therefore, when the
recursion runs through the processed data it may indeed then compress
the data.  The interesting thing is that I don't think I have optimized
the method yet.  I'm getting awesome results but I think it can be
improved further.

Paul

 0 Reply Paul 6/4/2009 2:28:54 PM

<tom@iahu.ca> wrote in message
news:074e4133-fb24-42b2-82f0-6001fdb8ff33@s21g2000vbb.googlegroups.com...
> On Jun 4, 8:02 am, "Paul" <p...@tretbase.com> wrote:
> [...]
>
> You need stronger definitions of what you claim to do.
> Recursively could include the set of operations performed 0 times
> just as easily as N>0 times.  You haven't defined what "random" data
> is either.
>
> [...]
>
> The point though is that it doesn't matter if the data is random or
> not.  Messages of length M where M<N cannot represent all possible
> N-bit messages, whether they're random or enumerated.
>
> And if you want to emit specific messages then the input must uniquely
> determine the output, which means you either can't compress all N-bit
> messages, or you violate M>N on average.
>
> Tom

Then by your understanding it should be impossible to save 12941 bytes
of data from the million bit bin file, correct?

Paul

 0 Reply Paul 6/4/2009 2:30:48 PM

On Jun 4, 10:28 am, "Paul" <p...@tretbase.com> wrote:
> <t...@iahu.ca> wrote in message
>
> news:98056804-0474-4a6d-9757-608d54d329e6@s16g2000vbp.googlegroups.com...
>
> > On Jun 4, 8:05 am, "Paul" <p...@tretbase.com> wrote:
> >> <t...@iahu.ca> wrote in message
>
> >> news:c31e4767-e887-4006-9d6a-36b7ea49063f@s21g2000vbb.googlegroups.com...
>
> >> > On Jun 4, 7:07 am, t...@iahu.ca wrote:
> >> >> So now that I've disproven that ANY FORM of random data compressor
> >> >> is impossible, you're continuing to maintain that you've done it is
> >> >> just you being a troll.  And there are no two ways about it (excuse
> >> >> the pun).
You're now aware that it can't be done, so maintaining that it > >> >> can is just you being contrary. > > >> > That should be of course "proven" ... teach me to post before coffee.. > > >> > Anyways, I'm going back to the sidelines in these threads. > > >> > Tom > > >> Tom, get your coffee. > > >> Paul > > > It's all cute to be playful off my typo, but you've totally [as usual] > > sidestepped the entire discussion. Do you not accept my reasoning > > that the sum total of all M<N bit messages is less than the total of > > all messages of N bits long? > > > If you do, then you cannot then state "I can compress all N bit > > messages" without knowingly stating something you disagree with. > > > If you do not, then can you please point out the flaw in my one- > > sentence proof that you can't compress all equal-length strings? > > > Tom > > Tom, I answered this a bit in the response to you in the million bit SUCCESS > thread. Here is what I know. There is a sequence of data that may not > compress on first pass. However, it may recursively compress. The reason > for this is that the representation of the original data is now represented > differently on the first pass. Therefore, when the recursion runs through > the processed data it may indeed then compress the data. The interesting > thing is that I don't think I have optimized the method yet. I'm getting > awesome results but I think it can be improved further. And I'm saying that's not possible. Reduce your "multi passes" to just a black box. I hand you every N-bit message, and with the same box you compress them all. Is that not the claim? If that is the claim, then it's clearly false for the reasons we've stated. If it's not, then draw a line in the sand and say where exactly your claims lie. Speak to generalities; stop saying "I compressed a file ... " I can do that too. The thread here is about compressing all inputs. Is that what you're claiming? 
Tom   0 Reply tom 6/4/2009 2:40:32 PM On Jun 4, 10:30 am, "Paul" <p...@tretbase.com> wrote: > <t...@iahu.ca> wrote in message > > news:074e4133-fb24-42b2-82f0-6001fdb8ff33@s21g2000vbb.googlegroups.com... > > > > > On Jun 4, 8:02 am, "Paul" <p...@tretbase.com> wrote: > >> <t...@iahu.ca> wrote in message > > >>news:fd764921-31ce-4853-8c41-2a9c9a4396a1@r3g2000vbp.googlegroups.com... > > >> > On Jun 3, 5:10 pm, "Paul" <p...@tretbase.com> wrote: > >> >> <t...@iahu.ca> wrote in message > > >> >>news:a579407a-8d8f-43f3-a265-4715f7cb0b1d@s28g2000vbp.googlegroups.com... > > >> >> > On Jun 3, 3:14 pm, "Paul" <p...@tretbase.com> wrote: > >> >> >> You got the wrong person. You must be referring to the person that > >> >> >> started > >> >> >> this thread. I never made such a claim. > > >> >> >> Paul > > >> >> > Ah, perhaps, you trolls are so interchangeable these days... > > >> >> > Well since you're on an infinite compressor bender too, you can > >> >> > answer > >> >> > the claim too. > > >> >> > Tom > > >> >> Tom, my claim is that I can recursively reduce random data. Does that > >> >> sound > >> >> impossible to you? > > >> > You need stronger definitions of what you claim to do. Recursively > >> > could include the set of operations performed 0 times just as easily > >> > as N>0 times. You haven't defined what "random" data is either. > > >> > If you're claiming that you can compress all N-bit strings (N>1) to a > >> > size of something less than N bits long, you're a liar. It's trivial > >> > to disprove. You can't compress all 2-bit strings to 1 bit because 1 > >> > bit can only represent half as many values as a 2-bit string. You > >> > can't compress all 3-bit strings to 2 or 1 bit because even combined > >> > (ignoring a need to be uniquely decodable) there are only 6 2- and 1- > >> > bit strings whereas there are 8 3-bit strings. 
You can continue this > >> > argument up to N bits and the sum of all possible 1,2,3,...,N-1 bit > >> > strings. \sum_{i=1}^{N-1}2^i will always be less than 2^N (hint: the > >> > sum is equal to 2^N - 2). Even if you somehow allowed in the zero- > >> > length string as a possible encoding, you're still only up to a sum of > >> > 2^N - 1. There are 2^N values you need to encode. So therefore you > >> > failed. And this of course is ignoring the fact you need some escape > >> > bits to tell one length of string from another. > > >> > So now that I've disproven that ANY FORM of random data compressor is > >> > impossible, you're continuing to maintain that you've done it is just > >> > you being a troll. And there are no two ways about it (excuse the > >> > pun). You're now aware that it can't be done, so maintaining that it > >> > can is just you being contrary. > > >> > What you need to do is stop trolling usenet because you're wasting > >> > your life. If you want to go out and have a good time hit the local > >> > pub, pound a few back and talk big with the folks at the counter. At > >> > least then you'll be interacting with real people in real life as > >> > opposed to getting a rise out of people remotely and anonymously. > >> > Honestly, I'm not saying this to be mean, you really ARE wasting > >> > everyone's [including your own] time. > > >> > Tom > > >> Tom, I'm beginning to believe that there is no such thing as Random Data. >> But I know the depth at which people here consider data Random so I go by >> that. But for the sake of argument, let's call Random data whatever you >> believe exhibits the greatest entropy. > > >> There is no trolling going on here, Tom. You disbelieve so you consider >> anyone that does believe to be trolling. > > > The point though is that it doesn't matter if the data is random or > > not.
Messages of length M where M<N cannot represent all possible N- > > bit messages. Whether they're random or enumerated. Like a program > > that simply emits all 1024-bit integers will need to have at least > > 1024 bits of storage to represent the counter. Even though the > > integers are sequential and totally non-random. So to have a program > > that emits all N-bit messages, you'll need either N bits [or more] of > > input, or combine M<N bits of input with bits from the program. > > > And if you want to emit specific messages then the input must uniquely > > determine the output which means you either can't compress all N-bit > > messages, or you violate M>N on average. > > > Tom > > Then by your understanding it should be impossible to save 12941 bytes of > data from the million bit bin file, correct? You haven't done it until you can decompress it. And specifically my comments refer to the claim of compressing ALL inputs. For all I know the million bit file [which I didn't make] is the output of a presumed cryptographically strong PRNG with a short seed. Maybe you broke the algorithm and figured out what the seed is. I have no way of knowing as I didn't make the file. Can you compress ALL inputs, or just some inputs?   0 Reply tom 6/4/2009 2:42:42 PM  <tom@iahu.ca> wrote in message news:bc5c5a75-8dbe-44fb-94d3-4f93490919ec@v4g2000vba.googlegroups.com... > On Jun 4, 10:28 am, "Paul" <p...@tretbase.com> wrote: >> <t...@iahu.ca> wrote in message >> >> news:98056804-0474-4a6d-9757-608d54d329e6@s16g2000vbp.googlegroups.com... >> >> >> >> > On Jun 4, 8:05 am, "Paul" <p...@tretbase.com> wrote: >> >> <t...@iahu.ca> wrote in message >> >> >>news:c31e4767-e887-4006-9d6a-36b7ea49063f@s21g2000vbb.googlegroups.com... 
>> >> >> > On Jun 4, 7:07 am, t...@iahu.ca wrote: >> >> >> So now that I've disproven that ANY FORM of random data compressor >> >> >> is >> >> >> impossible, you're continuing to maintain that you've done it is >> >> >> just >> >> >> you being a troll. And there is no two ways about it (excuse the >> >> >> pun). You're now aware that it can't be done, so maintaining that >> >> >> it >> >> >> can is just you being contrary. >> >> >> > That should be of course "proven" ... teach me to post before >> >> > coffee. >> >> >> > Anyways, I'm going back to the sidelines in these threads. >> >> >> > Tom >> >> >> Tom, get your coffee. >> >> >> Paul >> >> > It's all cute to be playful off my typo, but you've totally [as usual] >> > sidestepped the entire discussion. Do you not accept my reasoning >> > that the sum total of all M<N bit messages is less than the total of >> > all messages of N bits long? >> >> > If you do, then you cannot then state "I can compress all N bit >> > messages" without knowingly stating something you disagree with. >> >> > If you do not, then can you please point out the flaw in my one >> > sentence proof that you can't compress all equal length strings? >> >> > Tom >> >> Tom, I answered this a bit in the response to you in the million bit >> SUCCESS >> thread. Here is what I know. There is a sequence of data that may not >> compress on first pass. However, it may recursively compress. The >> reason >> for this is that the representation of the original data is now >> represented >> differently on the first pass. Therefore, when the recursion runs >> through >> the processed data it may indeed then compress the data. The >> interesting >> thing is that I don't think I have optimized the method yet. I'm getting >> awesome results but I think it can be improved further. > > And I'm saying that's not possible. Reduce your "multi passes" to > just a black box. I hand you every N-bit message, and with the same > box you compress them all. 
> > Is that not the claim? > > If that is the claim, then it's clearly false for the reasons we've > stated. If it's not then draw a line in the sand and say where > exactly your claims lie. Speak to generalities stop saying "I > compressed a file ... " I can do that to. The thread here is about > compressing all inputs. is that what you're claiming? > > Tom Tom, nobody can do what your saying. It is a STUPID argument. Nobody is going to tell you that they can compress 1 bit to less than 1 bit. Now can you stop with the ridiculous assertions? But to say that we can't compress 1024 bytes of data that exhibit the highest entropy is also a wrong assertion. Paul   0 Reply Paul 6/4/2009 2:57:00 PM  <tom@iahu.ca> wrote in message news:11f26d3c-d0af-4a92-ba29-85eec7682e97@n4g2000vba.googlegroups.com... > On Jun 4, 10:30 am, "Paul" <p...@tretbase.com> wrote: >> <t...@iahu.ca> wrote in message >> >> news:074e4133-fb24-42b2-82f0-6001fdb8ff33@s21g2000vbb.googlegroups.com... >> >> >> >> > On Jun 4, 8:02 am, "Paul" <p...@tretbase.com> wrote: >> >> <t...@iahu.ca> wrote in message >> >> >>news:fd764921-31ce-4853-8c41-2a9c9a4396a1@r3g2000vbp.googlegroups.com... >> >> >> > On Jun 3, 5:10 pm, "Paul" <p...@tretbase.com> wrote: >> >> >> <t...@iahu.ca> wrote in message >> >> >> >>news:a579407a-8d8f-43f3-a265-4715f7cb0b1d@s28g2000vbp.googlegroups.com... >> >> >> >> > On Jun 3, 3:14 pm, "Paul" <p...@tretbase.com> wrote: >> >> >> >> You got the wrong person. You must be referring to the person >> >> >> >> that >> >> >> >> started >> >> >> >> this thread. I never made such a claim. >> >> >> >> >> Paul >> >> >> >> > Ah, perhaps, you trolls are so interchangeable these days... >> >> >> >> > Well since you're on a infinite compressor bender too, you can >> >> >> > answer >> >> >> > the claim too. >> >> >> >> > Tom >> >> >> >> Tom, my claim is that I can recursively reduce random data. Does >> >> >> that >> >> >> sound >> >> >> impossible to you? 
>> >> >> > You need stronger definitions of what you claim to do. Recursively >> >> > could include the set of operations performed 0 times just as easily >> >> > as N>0 times. You haven't defined what "random" data is either. >> >> >> > If you're claiming that you can compress all N-bit strings (N>1) to >> >> > a >> >> > size of something less than N-bits long. You're a liar. It's >> >> > trivial >> >> > to disprove. You can't compress all 2-bit strings to 1 bit because >> >> > 1 >> >> > bit can only represent half as many values as a 2-bit string. You >> >> > can't compress all 3-bit strings to 2 or 1 bit because even combined >> >> > (ignoring a need to be uniquely decodable) where are only 6 2- and >> >> > 1- >> >> > bit strings whereas there are 8 3-bit string. You can continue this >> >> > argument on as N-bit and the sum of all possible 1,2,3,...,N-1 bit >> >> > strings. \sum_{i=1}^{N-1}2^i will always be less than 2^N (hint: >> >> > the >> >> > sum is equal to 2^N - 2). Even if you somehow allowed in the zero- >> >> > length string as a possible encoding, you're still only up to a sum >> >> > of >> >> > 2^N - 1. There are 2^N values you need to encode. So therefore you >> >> > failed. And this of course is ignoring the fact you need some >> >> > escape >> >> > bits to tell one length of string from another. >> >> >> > So now that I've disproven that ANY FORM of random data compressor >> >> > is >> >> > impossible, you're continuing to maintain that you've done it is >> >> > just >> >> > you being a troll. And there is no two ways about it (excuse the >> >> > pun). You're now aware that it can't be done, so maintaining that >> >> > it >> >> > can is just you being contrary. >> >> >> > What you need to do is stop trolling usenet because you're wasting >> >> > your life. If you want to go out and have a good time hit the local >> >> > pub, pound a few back and talk big with the folks at the counter. 
>> >> > At >> >> > least then you'll be interacting with real people in real life as >> >> > opposed to getting a rise out of people remotely and anonymously. >> >> > Honestly, I'm not saying this to be mean, you really ARE wasting >> >> > everyones [including your own] time. >> >> >> > Tom >> >> >> Tom, I'm beginning to believe that there is no such thing as Random >> >> Data. >> >> But I know the depth at which people here consider data Random so I go >> >> by >> >> that. But for the sake of argument, let's call Random data whatever >> >> you >> >> believe exhibits the greatest entropy. >> >> >> There is no trolling going on here, Tom. You disbelieve so you >> >> consider >> >> anyone that does believe to be trolling. >> >> > The point though is that it doesn't matter if the data is random or >> > not. messages of length M where M<N cannot represent all possible N- >> > bit messages. Whether they're random or enumerated. Like a program >> > that simply emits all 1024-bit integers will need to have at least >> > 1024-bits of storage to represent the counter. Even though the >> > integers are sequential and totally non-random. So to have a program >> > that emits all N-bit messages, you'll need either N-bits [or more] of >> > input, or combine M<N bits of input with bits from the program. >> >> > And if you want to emit specific messages than the input must uniquely >> > determine the output which means you either can't compress all N-bit >> > messages, or you violate M>N on average. >> >> > Tom >> >> Then by your understanding it should be impossible to save 12941 bytes of >> data from the million bit bin file, correct? > > You haven't done it until you can decompress it. And specifically my > comments refer to the claim of compressing ALL inputs. > > For all I know the million bit file [which I didn't make] is the > output of a presumed cryptographically strong PRNG with a short seed. > Maybe you broke the algorithm and figured out what the seed is. 
I > have no way of knowing as I didn't make the file. > > Can you compress ALL inputs, or just some inputs? Tom, nobody will EVER be able to compress all inputs. I give you 1 bit and say compress - you will fail. However, speaking to practicality, I am confident that many shall be impressed. Paul   0 Reply Paul 6/4/2009 2:59:33 PM Paul schrieb: > > > Tom, nobody can do what you're saying. It is a STUPID argument. Nobody > is going to tell you that they can compress 1 bit to less than 1 bit. > Now can you stop with the ridiculous assertions? But to say that we > can't compress 1024 bytes of data that exhibit the highest entropy is > also a wrong assertion. No, it's a right assumption. If you can compress it, then the model you picked for defining the entropy does not fit the model that is used in your compressor. IOW, by using words inappropriately you come to wrong conclusions. Greetings, Thomas   0 Reply Thomas 6/4/2009 3:45:42 PM On Jun 4, 10:59 am, "Paul" <p...@tretbase.com> wrote: > <t...@iahu.ca> wrote in message > > news:11f26d3c-d0af-4a92-ba29-85eec7682e97@n4g2000vba.googlegroups.com... > > > > > On Jun 4, 10:30 am, "Paul" <p...@tretbase.com> wrote: > >> <t...@iahu.ca> wrote in message > > >>news:074e4133-fb24-42b2-82f0-6001fdb8ff33@s21g2000vbb.googlegroups.com... > > >> > On Jun 4, 8:02 am, "Paul" <p...@tretbase.com> wrote: > >> >> <t...@iahu.ca> wrote in message > > >> >>news:fd764921-31ce-4853-8c41-2a9c9a4396a1@r3g2000vbp.googlegroups.com... > > >> >> > On Jun 3, 5:10 pm, "Paul" <p...@tretbase.com> wrote: > >> >> >> <t...@iahu.ca> wrote in message > > >> >> >>news:a579407a-8d8f-43f3-a265-4715f7cb0b1d@s28g2000vbp.googlegroups.com... > > >> >> >> > On Jun 3, 3:14 pm, "Paul" <p...@tretbase.com> wrote: > >> >> >> >> You got the wrong person. You must be referring to the person > >> >> >> >> that > >> >> >> >> started > >> >> >> >> this thread. I never made such a claim.
> > >> >> >> Paul > > >> >> > Ah, perhaps, you trolls are so interchangeable these days... > > >> >> > Well since you're on an infinite compressor bender too, you can > >> >> > answer > >> >> > the claim too. > > >> >> > Tom > > >> >> Tom, my claim is that I can recursively reduce random data. Does > >> >> that > >> >> sound > >> >> impossible to you? > > >> > You need stronger definitions of what you claim to do. Recursively > >> > could include the set of operations performed 0 times just as easily > >> > as N>0 times. You haven't defined what "random" data is either. > > >> > If you're claiming that you can compress all N-bit strings (N>1) to > >> > a > >> > size of something less than N bits long, you're a liar. It's > >> > trivial > >> > to disprove. You can't compress all 2-bit strings to 1 bit because > >> > 1 > >> > bit can only represent half as many values as a 2-bit string. You > >> > can't compress all 3-bit strings to 2 or 1 bit because even combined > >> > (ignoring a need to be uniquely decodable) there are only 6 2- and > >> > 1- > >> > bit strings whereas there are 8 3-bit strings. You can continue this > >> > argument up to N bits and the sum of all possible 1,2,3,...,N-1 bit > >> > strings. \sum_{i=1}^{N-1}2^i will always be less than 2^N (hint: > >> > the > >> > sum is equal to 2^N - 2). Even if you somehow allowed in the zero- > >> > length string as a possible encoding, you're still only up to a sum > >> > of > >> > 2^N - 1. There are 2^N values you need to encode. So therefore you > >> > failed. And this of course is ignoring the fact you need some > >> > escape > >> > bits to tell one length of string from another. 
> > >> >> > So now that I've disproven that ANY FORM of random data compressor > >> >> > is > >> >> > impossible, you're continuing to maintain that you've done it is > >> >> > just > >> >> > you being a troll. And there are no two ways about it (excuse the > >> >> > pun). You're now aware that it can't be done, so maintaining that > >> >> > it > >> >> > can is just you being contrary. > > >> >> > What you need to do is stop trolling usenet because you're wasting > >> >> > your life. If you want to go out and have a good time hit the local > >> >> > pub, pound a few back and talk big with the folks at the counter. > >> >> > At > >> >> > least then you'll be interacting with real people in real life as > >> >> > opposed to getting a rise out of people remotely and anonymously. > >> >> > Honestly, I'm not saying this to be mean, you really ARE wasting > >> >> > everyone's [including your own] time. > > >> >> > Tom > > >> >> Tom, I'm beginning to believe that there is no such thing as Random > >> >> Data. > >> >> But I know the depth at which people here consider data Random so I go > >> >> by > >> >> that. But for the sake of argument, let's call Random data whatever > >> >> you > >> >> believe exhibits the greatest entropy. > > >> >> There is no trolling going on here, Tom. You disbelieve so you > >> >> consider > >> >> anyone that does believe to be trolling. > > >> > The point though is that it doesn't matter if the data is random or > >> > not. Messages of length M where M<N cannot represent all possible N- > >> > bit messages. Whether they're random or enumerated. Like a program > >> > that simply emits all 1024-bit integers will need to have at least > >> > 1024 bits of storage to represent the counter. Even though the > >> > integers are sequential and totally non-random.
So to have a program > >> > that emits all N-bit messages, you'll need either N bits [or more] of > >> > input, or combine M<N bits of input with bits from the program. > > >> > And if you want to emit specific messages then the input must uniquely > >> > determine the output which means you either can't compress all N-bit > >> > messages, or you violate M>N on average. > > >> > Tom > > >> Then by your understanding it should be impossible to save 12941 bytes of > >> data from the million bit bin file, correct? > > > You haven't done it until you can decompress it. And specifically my > > comments refer to the claim of compressing ALL inputs. > > > For all I know the million bit file [which I didn't make] is the > > output of a presumed cryptographically strong PRNG with a short seed. > > Maybe you broke the algorithm and figured out what the seed is. I > > have no way of knowing as I didn't make the file. > > > Can you compress ALL inputs, or just some inputs? > > Tom, nobody will EVER be able to compress all inputs. I give you 1 bit and > say compress - you will fail. However, speaking to practicality, I am > confident that many shall be impressed. Well there is a difference between compressing some files really well and compressing ALL files even a little bit. And using some vague notion of "recursive" compression is just a quick way to get branded a troll. Frankly, I don't see the holdout; many people have posted their algorithms in the past. Some have done amazingly well at compressing known data sets (like the Calgary Corpus). I don't see why you're making a production out of your work. Just post it already and be done with it. Either you have something real, tangible, working to show off, or you don't. 
Tom   0 Reply tom 6/4/2009 4:12:14 PM On Jun 3, 12:17 pm, "Paul" <p...@tretbase.com> wrote: > "Pete Fraser" <pfra...@covad.net> wrote in message > I'll get those for you though here after some more changes to the > decompressor (which will result in smaller size). By the way, I'm currently > using a prototype language so it is interpreted which allows me to quickly > make changes at this point. I plan to move onto something performance > oriented and concise later on. But currently, to provide a decompressor > size would require providing the interpreter size and the script size. To me that sounds like you are a scam artist. Every magic compressor company that came along goes thru a phase when they start making excuses about changing their compression tech as a reason why they have not released anything yet. And the interesting thing is no matter how many tech changes they make (Assembly, Interpreters, HLDLs, Hardware, etc) they never deliver something that works. To me it looks like you are going down the same road/excuses. If you have something that works already, why do you need to redo it before you can prove your claims?   0 Reply earlcolby 6/4/2009 4:17:58 PM  <tom@iahu.ca> wrote in message news:9b260c33-11a6-422e-a312-46704c9dc642@n4g2000vba.googlegroups.com... > On Jun 4, 10:59 am, "Paul" <p...@tretbase.com> wrote: >> <t...@iahu.ca> wrote in message >> >> news:11f26d3c-d0af-4a92-ba29-85eec7682e97@n4g2000vba.googlegroups.com... >> >> >> >> > On Jun 4, 10:30 am, "Paul" <p...@tretbase.com> wrote: >> >> <t...@iahu.ca> wrote in message >> >> >>news:074e4133-fb24-42b2-82f0-6001fdb8ff33@s21g2000vbb.googlegroups.com... 
>> >> >> >> > On Jun 3, 5:10 pm, "Paul" <p...@tretbase.com> wrote: >> >> >> >> <t...@iahu.ca> wrote in message >> >> >> >> >>news:a579407a-8d8f-43f3-a265-4715f7cb0b1d@s28g2000vbp.googlegroups.com... >> >> >> >> >> > On Jun 3, 3:14 pm, "Paul" <p...@tretbase.com> wrote: >> >> >> >> >> You got the wrong person. You must be referring to the >> >> >> >> >> person >> >> >> >> >> that >> >> >> >> >> started >> >> >> >> >> this thread. I never made such a claim. >> >> >> >> >> >> Paul >> >> >> >> >> > Ah, perhaps, you trolls are so interchangeable these days... >> >> >> >> >> > Well since you're on a infinite compressor bender too, you can >> >> >> >> > answer >> >> >> >> > the claim too. >> >> >> >> >> > Tom >> >> >> >> >> Tom, my claim is that I can recursively reduce random data. >> >> >> >> Does >> >> >> >> that >> >> >> >> sound >> >> >> >> impossible to you? >> >> >> >> > You need stronger definitions of what you claim to do. >> >> >> > Recursively >> >> >> > could include the set of operations performed 0 times just as >> >> >> > easily >> >> >> > as N>0 times. You haven't defined what "random" data is either. >> >> >> >> > If you're claiming that you can compress all N-bit strings (N>1) >> >> >> > to >> >> >> > a >> >> >> > size of something less than N-bits long. You're a liar. It's >> >> >> > trivial >> >> >> > to disprove. You can't compress all 2-bit strings to 1 bit >> >> >> > because >> >> >> > 1 >> >> >> > bit can only represent half as many values as a 2-bit string. >> >> >> > You >> >> >> > can't compress all 3-bit strings to 2 or 1 bit because even >> >> >> > combined >> >> >> > (ignoring a need to be uniquely decodable) where are only 6 2- >> >> >> > and >> >> >> > 1- >> >> >> > bit strings whereas there are 8 3-bit string. You can continue >> >> >> > this >> >> >> > argument on as N-bit and the sum of all possible 1,2,3,...,N-1 >> >> >> > bit >> >> >> > strings. 
\sum_{i=1}^{N-1}2^i will always be less than 2^N (hint: >> >> >> > the >> >> >> > sum is equal to 2^N - 2). Even if you somehow allowed in the >> >> >> > zero- >> >> >> > length string as a possible encoding, you're still only up to a >> >> >> > sum >> >> >> > of >> >> >> > 2^N - 1. There are 2^N values you need to encode. So therefore >> >> >> > you >> >> >> > failed. And this of course is ignoring the fact you need some >> >> >> > escape >> >> >> > bits to tell one length of string from another. >> >> >> >> > So now that I've disproven that ANY FORM of random data >> >> >> > compressor >> >> >> > is >> >> >> > impossible, you're continuing to maintain that you've done it is >> >> >> > just >> >> >> > you being a troll. And there is no two ways about it (excuse the >> >> >> > pun). You're now aware that it can't be done, so maintaining >> >> >> > that >> >> >> > it >> >> >> > can is just you being contrary. >> >> >> >> > What you need to do is stop trolling usenet because you're >> >> >> > wasting >> >> >> > your life. If you want to go out and have a good time hit the >> >> >> > local >> >> >> > pub, pound a few back and talk big with the folks at the counter. >> >> >> > At >> >> >> > least then you'll be interacting with real people in real life as >> >> >> > opposed to getting a rise out of people remotely and anonymously. >> >> >> > Honestly, I'm not saying this to be mean, you really ARE wasting >> >> >> > everyones [including your own] time. >> >> >> >> > Tom >> >> >> >> Tom, I'm beginning to believe that there is no such thing as Random >> >> >> Data. >> >> >> But I know the depth at which people here consider data Random so I >> >> >> go >> >> >> by >> >> >> that. But for the sake of argument, let's call Random data >> >> >> whatever >> >> >> you >> >> >> believe exhibits the greatest entropy. >> >> >> >> There is no trolling going on here, Tom. You disbelieve so you >> >> >> consider >> >> >> anyone that does believe to be trolling. 
>> >> >> > The point though is that it doesn't matter if the data is random or >> >> > not. messages of length M where M<N cannot represent all possible >> >> > N- >> >> > bit messages. Whether they're random or enumerated. Like a program >> >> > that simply emits all 1024-bit integers will need to have at least >> >> > 1024-bits of storage to represent the counter. Even though the >> >> > integers are sequential and totally non-random. So to have a >> >> > program >> >> > that emits all N-bit messages, you'll need either N-bits [or more] >> >> > of >> >> > input, or combine M<N bits of input with bits from the program. >> >> >> > And if you want to emit specific messages than the input must >> >> > uniquely >> >> > determine the output which means you either can't compress all N-bit >> >> > messages, or you violate M>N on average. >> >> >> > Tom >> >> >> Then by your understanding it should be impossible to save 12941 bytes >> >> of >> >> data from the million bit bin file, correct? >> >> > You haven't done it until you can decompress it. And specifically my >> > comments refer to the claim of compressing ALL inputs. >> >> > For all I know the million bit file [which I didn't make] is the >> > output of a presumed cryptographically strong PRNG with a short seed. >> > Maybe you broke the algorithm and figured out what the seed is. I >> > have no way of knowing as I didn't make the file. >> >> > Can you compress ALL inputs, or just some inputs? >> >> Tom, nobody will EVER be able to compress all inputs. I give you 1 bit >> and >> say compress - you will fail. However, speaking to practicality, I am >> confident that many shall be impressed. > > Well there is a difference between compressing some files really well > and compressing ALL files even a little bit. > > And using some vague notion of "recursive" compression is just a quick > way to get branded a troll. > > frankly I don't see hold out, many people have posted their algorithms > in the past. 
Some have done amazingly well at compressing known data > sets (like the Calgary Corpus). I don't see why you're making a > production out of your work. Just post it already and be done with it. > > Either you have something real, tangible, working to show off, or you > don't. > > Tom Tom, you won't ever be able to verify my technology until I get a patent on the process. But Tom, please tell me. Do you think it is impossible that I compressed the million bit Random .bin file by 12,941 bytes? Paul   0 Reply Paul 6/4/2009 4:28:46 PM  <earlcolby.pottinger@sympatico.ca> wrote in message news:e347fd3c-60bf-453d-8abc-bbf18cd8b376@f10g2000vbf.googlegroups.com... > On Jun 3, 12:17 pm, "Paul" <p...@tretbase.com> wrote: >> "Pete Fraser" <pfra...@covad.net> wrote in message > >> I'll get those for you though here after some more changes to the >> decompressor (which will result in smaller size). By the way, I'm >> currently >> using a prototype language so it is interpreted which allows me to >> quickly >> make changes at this point. I plan to move onto something performance >> oriented and concise later on. But currently, to provide a decompressor >> size would require providing the interpreter size and the script size. > > To me that sounds like you are a scam artist. > > Every magic compressor company that came along goes thru a phase when > they start making excuses about changing their compression tech as a > reason why they have not released anything yet. And the interesting > thing is no matter how many tech changes they make (Assembly, > Interpreters, HLDLs, Hardware, etc) they never deliver something that > works. > > To me it looks like you are going down the same road/excuses. > > If you have something that works already, why do you need to redo it > before you can prove your claims? Earl I have explained this already - go read the posts. 
Paul   0 Reply Paul 6/4/2009 4:30:14 PM On Jun 4, 7:02=A0am, "Paul" <p...@tretbase.com> wrote: > Tom, I'm beginning to believe that there is no such thing as Random Data. > But I know the depth at which people here consider data Random so I go by > that. =A0But for the sake of argument, let's call Random data whatever yo= u > believe exhibits the greatest entropy. This is a bad definition. Entropy is relative to a model, not absolute, and your definition does not take that into account. A computer that can generate the million random digit file with a single instruction would consider that file to have extremely low entropy. Personally I think a really good working definition of randomness is "the length of the smallest C++ program able to generate the sequence using only the standard library." It is reasonable to add "plus a data file" to that. Any sequence that can be generated by a program smaller than the sequence might be called "not very random". A sequence that requires a program as long or greater than itself is "somewhat random". By this definition that million random digit file is most definitely random, and this situation is not likely to ever change. - Mark   0 Reply Mark 6/4/2009 5:25:12 PM On Jun 4, 11:28=A0am, "Paul" <p...@tretbase.com> wrote: > you wont ever be able to verify my technology until I get a patent on > the process. =A0 I guess our efforts to spare you the$4000 patent filing fees have
fallen on deaf ears.
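[An aside on Mark's program-length definition of randomness upthread: the length of the smallest generating program (Kolmogorov complexity) is uncomputable in general, but any real compressor yields a computable *upper bound* on it.  A hedged Python sketch, with zlib standing in for "smallest program":]

```python
# Toy illustration, not Mark's definition itself: a compressor gives an
# upper bound on program-length randomness.  Regular data gets a small
# bound; data from a good RNG does not.
import os
import zlib

sequential = bytes(range(256)) * 256   # 65536 highly regular bytes
random_ish = os.urandom(65536)         # 65536 bytes from the OS RNG

for name, data in [("sequential", sequential), ("random", random_ish)]:
    bound = len(zlib.compress(data, 9))
    print(f"{name}: {len(data)} bytes -> upper bound {bound} bytes")

# zlib squeezes the regular data to a small fraction of its size, while
# the random bytes stay at (or slightly above) their original length.
```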

 0


"Jim Leonard" <MobyGamer@gmail.com> wrote in message
> On Jun 4, 11:28 am, "Paul" <p...@tretbase.com> wrote:
>> you won't ever be able to verify my technology until I get a patent on
>> the process.
>
> I guess our efforts to spare you the $4000 patent filing fees have > fallen on deaf ears. Yes, couldn't I just win the$5000 challenge and use that money for the
patent ;-)


 0

On Jun 4, 9:42 am, t...@iahu.ca wrote:

> For all I know the million bit file [which I didn't make] is the
> output of a presumed cryptographically strong PRNG with a short seed.
>
> Can you compress ALL inputs, or just some inputs?

No, the file was constructed quite carefully, and stands up well. No
PRNG or other magic function is going to generate its sequence.

There is a pointer to the monograph on my site:

http://marknelson.us/2006/06/20/million-digit-challenge/

You can find most of the information on how this was constructed with
Google searches.  It was done in 1955, when the tools were a bit less
powerful than they are now, but it was nonetheless a great piece of work.

- Mark

 0

In article <wjRVl.5579$Qg6.5246@newsfe08.iad>, Paul <paul@tretbase.com> wrote:
>
><tom@iahu.ca> wrote in message
>news:11f26d3c-d0af-4a92-ba29-85eec7682e97@n4g2000vba.googlegroups.com...
>>
>> Can you compress ALL inputs, or just some inputs?
>
>Tom, nobody will EVER be able to compress all inputs.  I give you 1 bit
>and say compress - you will fail.

Okay.  Can you compress all inputs over a certain size (say, 16K)?

Alan
--
Defendit numerus

 0
Reply amorgan 6/4/2009 5:33:00 PM

"Mark Nelson" <snorkelman@gmail.com> wrote in message
news:ac47c7dd-7da1-429e-bfed-a0475811c084@o18g2000yqi.googlegroups.com...
> On Jun 4, 7:02 am, "Paul" <p...@tretbase.com> wrote:
>> Tom, I'm beginning to believe that there is no such thing as Random
>> Data.  But I know the depth at which people here consider data Random,
>> so I go by that.  But for the sake of argument, let's call Random data
>> whatever you believe exhibits the greatest entropy.
>
> This is a bad definition.  Entropy is relative to a model, not
> absolute, and your definition does not take that into account.
>
> A computer that can generate the million random digit file with a
> single instruction would consider that file to have extremely low
> entropy.
>
> Personally, I think a really good working definition of randomness is
> "the length of the smallest C++ program able to generate the sequence
> using only the standard library."
>
> It is reasonable to add "plus a data file" to that.
>
> Any sequence that can be generated by a program smaller than the
> sequence might be called "not very random".
>
> A sequence that requires a program as long as or longer than itself is
> "somewhat random".
>
> By this definition that million random digit file is most definitely
> random, and this situation is not likely to ever change.
>
> - Mark

Well, I'm hoping that, for the sake of testing, you experts here can
present data that exhibits the best sample for the experiment.  So far I
have the million bit bin file.  If there is another file someone would
rather I test against, I would appreciate a link to download the data.

Paul

 0
Reply Paul 6/4/2009 5:33:12 PM

Paul wrote:
> Tom wrote:
>
> > Either you have something real, tangible, working to show off, or you
> > don't.
>
> Tom, you wont ever be able to verify my technology until I get a
> patent on the process.

Translation: Never.

> But Tom, please tell me.  Do you think it is impossible that I
> compressed the million bit Random.bin file by 12,941 bytes?

No, of course it's possible.  But it's completely irrelevant.  For every
file (bit string) q I can build a lossless compressor c_q which
compresses q to one bit:

  c_q(x) = 1    for x == q
  c_q(x) = 0|x  for x != q

where "|" denotes the concatenation of bit strings.

What you should try instead is to compress this Random.bin file to a
size S so that

  S + size_of(decompressor) < size_of(Random.bin)

where "decompressor" can be the binary file or a ZIP of your source
code, including required tables (if you have any tables) of your
decompressor.

Try that.  The chance of succeeding on a uniformly randomly chosen file
is almost zero.  But hey ... it's not exactly zero ... you might get
lucky. :-)

Oh, and don't forget to write this decompressor and test the whole thing
before you claim anything.

Cheers!
SG

 0
Reply SG 6/4/2009 5:38:35 PM

"Alan Morgan" <amorgan@xenon.Stanford.EDU> wrote in message
news:h090gc$hqa$1@xenon.Stanford.EDU...
> In article <wjRVl.5579$Qg6.5246@newsfe08.iad>, Paul <paul@tretbase.com>
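[SG's c_q construction above can be written down directly.  A minimal Python sketch, a toy under stated assumptions: it flags with a whole byte rather than a single bit, and a short placeholder string stands in for Random.bin:]

```python
# A compressor specialized for one known file q shrinks q to a single
# flag byte and grows everything else by one byte -- lossless, but useless.
Q = b"stand-in for Random.bin"  # hypothetical placeholder for the real file

def compress(x: bytes) -> bytes:
    return b"\x01" if x == Q else b"\x00" + x

def decompress(y: bytes) -> bytes:
    return Q if y == b"\x01" else y[1:]

assert decompress(compress(Q)) == Q                       # q -> 1 byte
assert decompress(compress(b"anything else")) == b"anything else"

# The catch SG points out: the decompressor contains Q itself, so
# size_of(compressed) + size_of(decompressor) exceeds size_of(Q).
```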
> wrote:
>>
>><tom@iahu.ca> wrote in message
>>>
>>> Can you compress ALL inputs, or just some inputs?
>>
>>Tom, nobody will EVER be able to compress all inputs.  I give you 1 bit
>>and
>>say compress - you will fail.
>
> Okay.  Can you compress all inputs over a certain size (say, 16K?).
>
> Alan
> --
> Defendit numerus

Ah, a more rational question.  I can only presume yes.  Obviously, I cannot
test every input greater than 16k.  But I do not see why it wouldn't work.

Paul


 0

Paul wrote:
> Alan Morgan wrote:
>
> > Okay.  Can you compress all inputs over a certain size (say, 16K)?
>
> Ah, a more rational question.  I can only presume yes.  Obviously,
> I cannot test every input greater than 16k.  But I do not see why
> it wouldn't.

X = the number of all possible files of length exactly 16 KB
Y = the number of all possible files below 16 KB.

For someone like you this should be fairly easy.

Please compare X and Y to see which one of the following three cases
applies:
1. X < Y
2. X = Y
3. X > Y

What is the implication of case 3 with respect to compressing all 16
KB files?

Cheers!
SG
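[SG's comparison can be carried out exactly with arbitrary-precision integers.  A small sketch, using n-bit files in place of 16 KB files; the relationship is identical at any size:]

```python
# Count bit strings of length exactly n versus all strictly shorter ones.
def count_exact(n):
    return 2 ** n

def count_below(n):
    return sum(2 ** k for k in range(n))  # 2^0 + ... + 2^(n-1) = 2^n - 1

for n in (8, 64, 1024):
    X, Y = count_exact(n), count_below(n)
    assert X == Y + 1  # case 3: X > Y, and the gap is always exactly 1

# Implication of case 3: there are more length-n files than shorter files
# combined, so no lossless compressor can map every length-n file to a
# distinct shorter file -- at least one must fail to shrink.
```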

 0

Paul wrote:
) Tom, I answered this a bit in the response to you in the million bit SUCCESS
) thread.  Here is what I know.  There is a sequence of data that may not
) compress on first pass.  However, it may recursively compress.  The reason
) for this is that the representation of the original data is now represented
) differently on the first pass.  Therefore, when the recursion runs through
) the processed data it may indeed then compress the data.   The interesting
) thing is that I don't think I have optimized the method yet.  I'm getting
) awesome results but I think it can be improved further.

So you need to remember how many times the compressor has run, and
you need to add that piece of information to the compressed file,
otherwise how would the decompressor know ?  There's your loss.

By the way, here's a stronger (but still provable) version
of the Counting Theorem:

The probability that a given compressor will compress a
randomly generated string by N bits is equal to 1 in 2^N

Which is easily provable:

There are half as many possible bit strings of length X-1
as there are of length X, again half as many of length X-2,
and so on.  So there are 1/2^N possible bit strings of length
X-N.  Each string of X bits must be compressed to a different
string of X-N bits, so only 1 in 2^N can be compressed suchly.

SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT
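[Willem's bound can be checked exhaustively for small sizes.  A sketch verifying the slightly stronger "shrinks by N bits *or more*" form, which still allows at most about 1 in 2^N of the X-bit strings:]

```python
# For any lossless (injective) encoder, the number of X-bit strings that
# can map to outputs of length <= X-N is limited by how many such short
# outputs exist at all.
def short_outputs(X, N):
    # count of bit strings of length 0 .. X-N
    return sum(2 ** k for k in range(X - N + 1))  # = 2^(X-N+1) - 1

for X in range(2, 16):
    for N in range(1, X + 1):
        fraction = short_outputs(X, N) / 2 ** X
        assert fraction < 2 / 2 ** N  # at most ~1 in 2^N can shrink by N
```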

 0

Paul wrote:
)>> Then by your understanding it should be impossible to save 12941 bytes of
)>> data from the million bit bin file, correct?

Unless, of course, the decompressor, implicitly or explicitly, contains
part of the million bit file.  How many bytes is the decompressor ?

) Tom, nobody will EVER be able to compress all inputs.  I give you 1 bit and
) say compress - you will fail.  However, speaking to practicality, I am
) confident that many shall be impressed.

Crossthread I've stated the stronger version of the Counting Theorem:

It's impossible to compress more than 1 in 2^N files by N bits.

So, the probability that a given compressor shrinks a random file
by 12500 bytes (100000 bits) is one in 2^100000, which is positively
microscopic.

SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT
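[For perspective: 12,500 bytes is 100,000 bits, so by the counting theorem the odds of a fixed compressor shrinking a uniformly random file by that much are 1 in 2^100000.  A tiny sketch using Python's big integers to show how large that denominator is:]

```python
# The denominator in "1 in 2^100000", measured exactly.
odds_denominator = 2 ** (12_500 * 8)       # 2^100000
print(len(str(odds_denominator)))          # -> 30103 decimal digits
```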

 0


"Willem" <willem@snail.stack.nl> wrote in message
news:slrnh2g9cg.vvj.willem@snail.stack.nl...
> Paul wrote:
> )>> Then by your understanding it should be impossible to save 12941 bytes
> of
> )>> data from the million bit bin file, correct?
>
> Unless, of course, the decompressor, implicitly or explicitly, contains
> part of the million bit file.  How many bytes is the decompressor ?
>
> ) Tom, nobody will EVER be able to compress all inputs.  I give you 1 bit
> and
> ) say compress - you will fail.  However, speaking to practicality, I am
> ) confident that many shall be impressed.
>
> Crossthread I've stated the stronger version of the Counting Theorem:
>
> It's impossible to compress more than 1 in 2^N files by N bits.
>
> So, the probability that a given compressor shrinks a random file
> by 12500 bytes (100000 bits) is one in 2^100000, which is positively
> microscopic.
>
>
> SaSW, Willem
> --
> Disclaimer: I am in no way responsible for any of the statements
>            made in the above text. For all I know I might be
>            drugged or something..
>            No I'm not paranoid. You all think I'm paranoid, don't you !
> #EOT

Then I have a feeling there is something definitely wrong with what
compression theory experts are using for calculation.

Paul


 0

"Paul" <paul@tretbase.com> wrote in message

> Then I have a feeling there is something definitely wrong with what
> compression theory experts are using for calculation.

Certainly there are some compression theory experts in this group.
I'm not one of them - I'm just a lowly hardware hacker with
some common sense. However, a little bit of common sense is
all that's needed to understand the counting argument.
IT'S NOT ROCKET SCIENCE.


 0

On Jun 4, 4:06 pm, "Paul" <p...@tretbase.com> wrote:

> Then I have a feeling there is something definitely wrong with what
> compression theory experts are using for calculation.

Feelings mean nothing to computers.  Code is KING!  Until you have a
working decompressor you have got NOTHING!

Stop posting, start coding.  But we know you will not stop posting,
because you can't write code that does what you claim, and no-one here
who has written real working compressors/decompressors will believe
such a claim without working code.

The ball is all in your court.

PS: we know that in a month or two, like all the rest before you, you
will simply vanish.

PPS: To the best of my knowledge, only two people have entered this
UseNet group with a lot of postings about their 'magic' compression
systems and then later admitted they were wrong.  All the rest
vanished rather than admit their mistakes.

 0

<earlcolby.pottinger@sympatico.ca> wrote in message

> PSS. To the best of my knowledge only two people have entered this
> UseNet Group with a lot of postings about their 'magic' compression
> systems and then later admitted they were wrong.  All the rest
> vanished rather than admit thier mistakes.

With the notable exception of Jules Gilbert (a.k.a. Energizer Bunny).
He's been not admitting his mistakes for fifteen years at least.


 0

On Jun 4, 12:33 pm, "Paul" <p...@tretbase.com> wrote:
> Well, I'm hoping that, for the sake of testing, you experts here can
> present data that exhibits the best sample for the experiment.  So far I
> have the million bit bin file.  If there is another file someone would

No, the million digits file is more than enough to show you the flaw
in your system.  Work with that one.

 0

On 4 Jun., 11:30, Mark Nelson <snorkel...@gmail.com> wrote:

> There is a pointer to the monograph on my site:

It's hilarious that the file came from a roulette machine.  It was a
gamble, like that of the random-compressor creators (betting on the
right number to win it all).  I like that tiny spot of irony! ;-)

Ciao
Niels

 0

Paul wrote:
) "Willem" <willem@snail.stack.nl> wrote in message
) news:slrnh2g9cg.vvj.willem@snail.stack.nl...
)> Crossthread I've stated the stronger version of the Counting Theorem:
)>
)> It's impossible to compress more than 1 in 2^N files by N bits.
)>
)> So, the probability that a given compressor shrinks a random file
)> by 12500 bytes is one in 2^10000, which is positively microscopic.
)
) Then I have a feeling there is something definitely wrong with what
) compression theory experts are using for calculation.

I don't understand what you mean by that statement.

What do you believe it is that compression theory experts
are using for calculation ?  And what would be wrong with it ?

SaSW, Willem
--
Disclaimer: I am in no way responsible for any of the statements
made in the above text. For all I know I might be
drugged or something..
No I'm not paranoid. You all think I'm paranoid, don't you !
#EOT

 0

On 2 Jun., 19:58, ffkfsoxfdzg...@mailinator.com wrote:
> On Jun 2, 10:25 am, SG wrote:
> > On 2 Jun., 18:54, ffkfsoxfdzg...@mailinator.com wrote:
>
> > > The method can compress any data (above minimal size) recursively
> > > at rates from 60% to 90% per cycle depending on the data.
> > > Conversely, the compressed file can then be decompressed back to
> > > the original input with lossless precision.
>
> > Thanks for answering my questions. In this case the proof contained in
> > the comp.compression FAQ [1] applies. Look for "counting argument".
> > It's a "reductio ad absurdum"-type proof [2] which proves that no such
> > algorithm with the mentioned properties can exist.
>
> > [1] http://www.faqs.org/faqs/compression-faq/part1/ (section 9)
>
> Sorry, but this is simply not true.

Care to point out exactly what is "not true" about it?
Does the proof not apply? If so, why not?
Is the proof flawed? If so, what's the flaw?

Cheers!
SG

 0

132 Replies
198 Views
