C++ vs C

I've just been working with a bit of C++ code for the first time (with a 
view to spending a couple of hours converting it to C. Or to *any* language 
that mere humans could understand, for that matter).

But after that experience I don't think I'll ever complain about anything in 
C again!

The thing was bristling with templates, specialisation templates, classes, 
namespaces, operators, friends, the full works, used, as far as I could see, 
just for the sake of it. (I managed to get rid of the templates, and it 
still worked..)

This was a jpeg encoder/decoder, a complex enough task to follow without the 
language making it impossible.

(I gave up on the conversion task, but fortunately, by some miracle, managed 
to build it as it was into the 64-bit DLL file I needed. Except I have to 
import snappy function names such as 
"_ZN4jpge27compress_image_to_jpeg_fileEPKciiiPKhRKNS_6paramsE" into the 
non-C++ code that uses it!)

Anyone reading this and is at the point of being undecided about whether to 
learn C++ or not, please do everyone else a favour and don't bother!

(The only thing I can say in its favour, is that at least I managed to get 
something working. The only C equivalent I could find involved 72 C source 
files, compared with the two used here. One way or another, everyone seems 
intent on making things more complicated than they need be.)

-- 
Bartc 

BartC
10/12/2014 8:05:59 PM
comp.lang.c

On 10/12/14, 4:05 PM, BartC wrote:
> I've just been working with a bit of C++ code for the first time (with a
> view to spending a couple of hours converting it to C. Or to *any*
> language that mere humans could understand, for that matter).
>
> But after that experience I don't think I'll ever complain about
> anything in C again!
>
> The thing was bristling with templates, specialisation templates,
> classes, namespaces, operators, friends, the full works, used, as far as
> I could see, just for the sake of it. (I managed to get rid of the
> templates, and it still worked..)
>
> This was a jpeg encoder/decoder, a complex enough task to follow without
> the language making it impossible.
>
> (I gave up on the conversion task, but fortunately, by some miracle,
> managed to build it as it was into the 64-bit DLL file I needed. Except
> I have to import snappy function names such as
> "_ZN4jpge27compress_image_to_jpeg_fileEPKciiiPKhRKNS_6paramsE" into the
> non-C++ code that uses it!)
>
> Anyone reading this and is at the point of being undecided about whether
> to learn C++ or not, please do everyone else a favour and don't bother!
>
> (The only thing I can say in its favour, is that at least I managed to
> get something working. The only C equivalent I could find involved 72 C
> source files, compared with the two used here. One way or another,
> everyone seems intent on making things more complicated than they need be.)
>

YES, I see many C++ programs which use features just for the sake of using 
features, which makes them hard to read.

One comment on the lousy function names for export: if you declare your 
API interface functions (where possible) like:

extern "C" void fun(char* name);

in the interface header (which is included in the file where the 
function is defined), then the compiler will NOT mangle the name (or 
just use the limited C mangling which might add a _ before the name).
Richard
10/12/2014 8:51:00 PM
In article <afB_v.422927$_u1.247933@fx30.am4>, "BartC" <bc@freeuk.com> wrote:

> I've just been working with a bit of C++ code for the first time (with a 

Unclean! Unclean! Drive a stake through the heart of the beast!

> Anyone reading this and is at the point of being undecided about whether to 
> learn C++ or not, please do everyone else a favour and don't bother!

The problem is they tried to make an object oriented language without garbage 
collection, and they tried to make it as statically typed as possible. 90% of 
C++ is kludges to implement these objectives.

The lack of garbage collection is also why Cocoa on MacOSX is such a pain in the 
ass: temporary results have to be named and you have to maintain the bloody 
reference counts. However Objective-C is willing to do more work at runtime so 
it doesn't have all the crap C++ has.

Objective-C with garbage collection isn't too bad and with a few modifications 
it's available as Javascript. Some people want a typed Javascript, but in 18 
months the new CPU will handle the current code fast enough.

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
Siri
10/12/2014 9:15:06 PM
I enjoy the C++ class, exceptions, and very little
else. I would tell anyone, "you get very far beyond
those two add-ons to C, and you're into the uncertain
lands called obfuscation and misery."

Hard to beat:
    Cxyz* myvar = new Cxyz(whatever);
    myvar->function(a, b, c);
    delete myvar;

Rather than:
    Sxyz* myvar = malloc(sizeof(Sxyz));
    xyz_init(myvar, whatever);
    xyz_function(myvar, a, b, c);
    xyz_delete(myvar);
    free(myvar);

Best regards,
Rick C. Hodgin
Rick
10/12/2014 9:22:09 PM
On Sunday, 12 October 2014 21:10:56 UTC+1, Bart  wrote:
> I've just been working with a bit of C++ code for the first time (with a 
> view to spending a couple of hours converting it to C. Or to *any* language 
> that mere humans could understand, for that matter).
> 
> But after that experience I don't think I'll ever complain about anything in 
> C again!
> 
> The thing was bristling with templates, specialisation templates, classes, 
> namespaces, operators, friends, the full works, used, as far as I could see, 
> just for the sake of it. (I managed to get rid of the templates, and it 
> still worked..)
> 
> This was a jpeg encoder/decoder, a complex enough task to follow without the 
> language making it impossible.
> 
> (I gave up on the conversion task, but fortunately, by some miracle, managed 
> to build it as it was into the 64-bit DLL file I needed. Except I have to 
> import snappy function names such as 
> "_ZN4jpge27compress_image_to_jpeg_fileEPKciiiPKhRKNS_6paramsE" into the 
> non-C++ code that uses it!)
> 
> Anyone reading this and is at the point of being undecided about whether to 
> learn C++ or not, please do everyone else a favour and don't bother!
> 
> (The only thing I can say in its favour, is that at least I managed to get 
> something working. The only C equivalent I could find involved 72 C source 
> files, compared with the two used here. One way or another, everyone seems 
> intent on making things more complicated than they need be.)

Hi Bart!

C++ is supposed to make things easier, not harder. The idea is to write functions etc. that abstract away the details, so that the details only occur in one place (in what is admittedly probably pretty horrible code) and then can be glossed over.

For instance, in C++ you can write:

x = a + (b * c);

where in C you would have to write:

mult(temp, b, c);
add(x, a, temp);

Or, here is the test code for a little parser I wrote - it detects inputs of two different types and (remember this is just to test it) reflects them back:

int main(void) {
int a=0, n=0, x=0, y=0; 
Parse par; 
int st = FALSE;
char string[100];

while (!st) {
  gets(string);
  if (string[0] == 'q') st = TRUE; else {
    par << string; 
    if (par >> "From" >> ' ' >> x >> ' ' >> "to" >> ' ' >> y) a = 1; 
    else if (par >> "Number" >> ' ' >> n) a = 2; 
    else a = 3; 

    switch (a) {
      case 1: printf("From %d to %d\n", x, y); break;
      case 2: printf("Number %d\n", n); break;
      default: printf("Other\n"); } } }
}
Paul
10/12/2014 9:32:28 PM
In article <b0706107-6999-4f4b-8f35-57939905c723@googlegroups.com>,
 "Rick C. Hodgin" <rick.c.hodgin@gmail.com> wrote:

> I enjoy the C++ class, exceptions, and very little
> else. I would tell anyone, "you get very far beyond
> those two add-ons to C, and you're into the uncertain
> lands called obfuscation and misery."
> 
> Hard to beat:
>     Cxyz* myvar = new Cxyz(whatever);
>     myvar->function(a, b, c);
>     delete myvar;
> 
> Rather than:
>     Sxyz* myvar = malloc(sizeof(Sxyz));
>     xyz_init(myvar, whatever);
>     xyz_function(myvar, a, b, c);
>     xyz_delete(myvar);
>     free(myvar);

The code I've been writing is
    xyz_function(newSxyz(parameters), a, b, c);

newSxyz handles whatever allocation and initialisation is necessary.

The anonymous value is garbage collected sometime later.

newSxyz can attach a finaliser to anything allocated on the heap to do any 
additional work before freeing.

If I want to use a variable
    Sxyz var = newSxyz(parameters);
    f(var, a, b, c);
    g(h(var), a, b, c);

Do note I don't declare var to be a pointer. If it needs to be a pointer, that 
is hidden behind the interface in the declaration of Sxyz and implementation of 
newSxyz etc: I can change Sxyz to be a pointer, a struct, or anything else 
without rewriting the code.

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
Siri
10/12/2014 9:35:28 PM
Rick C. Hodgin wrote:

> I enjoy the C++ class, exceptions, and very little
> else. I would tell anyone, "you get very far beyond
> those two add-ons to C, and you're into the uncertain
> lands called obfuscation and misery."
> 
> Hard to beat:
>     Cxyz* myvar = new Cxyz(whatever);

Hard to re-size.

>     myvar->function(a, b, c);
>     delete myvar;
> 
> Rather than:
>     Sxyz* myvar = malloc(sizeof(Sxyz));
>     xyz_init(myvar, whatever);

Why not:

Sxyz *myvar = xyz_create(whatever);

and let xyz_create do the mallocing?

>     xyz_function(myvar, a, b, c);

Yes, the absence of an effective 'this' mechanism is, I will grant you, a 
pain.

>     xyz_delete(myvar);
>     free(myvar);

Why not: xyz_delete(&myvar);

and let xyz_delete do not only the deleting but also the freeing? And while 
you're at it, you could even point myvar to NULL.

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
Richard
10/12/2014 9:35:51 PM
Paul N wrote:

<snip>

> C++ is supposed to make things easier, not harder. The idea is to write
> functions etc that abstract away the details, so that the deatails only
> occur in one place (in what is admittedly probably pretty horrible code)
> and then can be glossed over.
> 
> For instance, in C++ you can write:
> 
> x = a + (b * c);
> 
> where in C you would have to write:
> 
> mult(temp, b, c);
> add(x, a temp);

Operator overloading is something I'd love to see in C. Unfortunately, it 
isn't going to happen, and I realise why, and the reasons are good reasons, 
and I can accept that, but... well, a chap can dream, right?

> 
> Or, here is the test code for a little parser I wrote - it detects inputs
> of two different types and (remember this is just to test it) reflects
> them back:
> 
> int main(void) {
> int a=0, n=0, x=0, y=0;
> Parse par;
> int st = FALSE;
> char string[100];
> 
> while (!st) {
>   gets(string);

Oh, no no no no no no no no no. It is never, ever right to use gets(). Look 
up fgets().

It is possible to use fgets incorrectly, just as it is possible to use any 
function incorrectly. Nevertheless, when used correctly, it works fine. The 
gets function, on the other hand, whether used correctly or incorrectly, 
opens a gaping security hole in your program. It is not possible to use gets 
safely. Don't try. Just don't use it at all, ever ever ever.

<snip>

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
Richard
10/12/2014 9:40:49 PM
Hard to beat: 
    Cxyz* myvar = new Cxyz(whatever);
    myvar->function(a, b, c);
    delete myvar;

Rather than:
    Sxyz* myvar = xyz_new(whatever);
    xyz_function(myvar, a, b, c);
    xyz_delete(&myvar);

...both in use, and in definition. :-)

Best regards,
Rick C. Hodgin
Rick
10/12/2014 9:44:48 PM
Rick C. Hodgin wrote:

> Hard to beat:
>     Cxyz* myvar = new Cxyz(whatever);

This is still just as hard to re-size as it was the last time you posted it.

>     myvar->function(a, b, c);

This is still hard to beat, yes.

>     delete myvar;

This is still just as easy to re-use accidentally after deletion as the last 
time you posted it.

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
Richard
10/12/2014 9:47:52 PM
I do like this syntax though:
    Cxyz.new(whatever).=myvar;
    myvar.function(a, b, c);
    myvar.delete();

The type on myvar is given by its usage, and
there are no ->, just dot references.

The ".=" is a post-op assignment, same as:
    myvar = Cxyz.new(whatever);

...except it follows the thing. Also works in
nested ops.

Best regards,
Rick C. Hodgin
Rick
10/12/2014 9:57:14 PM
> > x = a + (b * c);
> > 
> > where in C you would have to write:
> > 
> > mult(temp, b, c);
> > add(x, a, temp);
> 
> Operator overloading is something I'd love to see in C. Unfortunately, it 
> isn't going to happen, and I realise why, and the reasons are good reasons, 
> and I can accept that, but... well, a chap can dream, right?

The problem is how to free anonymous temporary results.
        x = a + (b*c)

        T1 = b*c;
        x = a + T1;
        release(T1);

You either have to make T1 explicit and its release explicitly (C), forbid T1 
from needing a release (this is what Ada does), or have some mechanism to 
implicitly save the value and release it later
        x = a + ({T1 =} b*c); {release(T1);}
where {...} marks hidden code added by the compiler.

The original way to hide this was with reference counts; the latest MacOSX 
compiler apparently can do this for you. Garbage collection was the next 
solution.

C++ was going to be better than everyone afore it so it introduced its zeusawful 
destructors, and it's been limping along ever since dealing with the 
ramifications of that decision.

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
Siri
10/12/2014 10:03:03 PM
Consider:
    Cxyz.new(whatever).function(a, b, c).delete();

Best regards,
Rick C. Hodgin
Rick
10/12/2014 10:03:46 PM
Siri Crews wrote:
>
> C++ was going to be better than everyone afore it so it introduced its zeusawful
> destructors, and it's been limping along ever since dealing with the
> ramifications of that decision.

Nearly all of those ramifications are good.

The ability to automatically allocate (through a constructor) and 
release (through a destructor) is probably the biggest advantage C++ has 
over C.  It is the one C++ feature neither C nor languages with garbage 
collection can emulate.

-- 
Ian Collins
Ian
10/12/2014 10:23:03 PM
In article <ca0da8Fk5h7U3@mid.individual.net>,
 Ian Collins <ian-news@hotmail.com> wrote:

> Siri Crews wrote:
> >
> > C++ was going to be better than everyone afore it so it introduced its 
> > zeusawful
> > destructors, and it's been limping along ever since dealing with the
> > ramifications of that decision.
> 
> Nearly all of those ramifications are good.
> 
> The ability to automatically allocate (through a constructor) and 
> release (through a destructor) is probably the biggest advantage C++ has 
> over C.  It is the one C++ feature neither C nor languages with garbage 
> collection can emulate.

'That's not a bug--it's a feature.'

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
Siri
10/12/2014 10:29:08 PM
Ian Collins wrote:
> The ability to automatically allocate (through
> a constructor) and release (through a
> destructor) is probably the biggest advantage
> C++ has over C. It is the one C++ feature
> neither C nor languages with garbage
> collection can emulate.

Sxyz* myvar = xyz_new(whatever);  //ctor
..
..
..
xyz_delete(&myvar);  //dtor

Best regards,
Rick C. Hodgin
Rick
10/12/2014 10:33:23 PM
Siri Crews wrote:
> In article <ca0da8Fk5h7U3@mid.individual.net>,
>   Ian Collins <ian-news@hotmail.com> wrote:
>
>> Siri Crews wrote:
>>>
>>> C++ was going to be better than everyone afore it so it introduced its
>>> zeusawful
>>> destructors, and it's been limping along ever since dealing with the
>>> ramifications of that decision.
>>
>> Nearly all of those ramifications are good.
>>
>> The ability to automatically allocate (through a constructor) and
>> release (through a destructor) is probably the biggest advantage C++ has
>> over C.  It is the one C++ feature neither C nor languages with garbage
>> collection can emulate.
>
> 'That's not a bug--it's a feature.'

In the case of constructors and destructors, it certainly is a feature. 
  Try emulating automatic resource management in C and see how far you get.

-- 
Ian Collins
Ian
10/12/2014 10:44:54 PM
On 12/10/2014 23:40, Richard Heathfield wrote:
> Operator overloading is something I'd love to see in C. Unfortunately, it
> isn't going to happen, and I realise why, and the reasons are good reasons,
> and I can accept that, but... well, a chap can dream, right?

I have presented in this group a compiler that implements operator 
overloading in C. Around 10 years ago.

The reactions were almost without exception negative, from a group led 
by Mr Heathfield and Mr Thompson.

IT IS NOT C!

HORROR!

IT IS NOT STANDARD!

I am glad you have changed your mind. It would have been better to look 
at my implementation, see what could be improved, and try to make a 
stand in the C Committee.

That still can be done Mr Heathfield. I would appreciate your comments.

jacob
jacob
10/12/2014 10:53:21 PM
"Paul N" <gw7rib@aol.com> wrote in message
news:9bb94b05-6247-4334-8fd8-674134d5aa56@googlegroups.com...
> On Sunday, 12 October 2014 21:10:56 UTC+1, Bart  wrote:

>> The thing was bristling with templates, specialisation templates,
>> classes,
>> namespaces, operators, friends, the full works, used, as far as I could
>> see,
>> just for the sake of it. (I managed to get rid of the templates, and it
>> still worked..)

>
> C++ is supposed to make things easier, not harder. The idea is to write
> functions etc that abstract away the details, so that the deatails only
> occur in one place (in what is admittedly probably pretty horrible code)
> and then can be glossed over.
>
> For instance, in C++ you can write:
>
> x = a + (b * c);
>
> where in C you would have to write:
>
> mult(temp, b, c);
> add(x, a, temp);

Yes, that's what the sales brochure would have you believe. The reality is
somewhat different.

(That add or multiply, or even assignment, might not mean what you think it
means. Those x, a, b and c names might not be defined anywhere nearby. The
function you're in might somehow be associated with some implied class
instance, and that class definition (buried in a separate file) might
finally yield the types of x, a, b and c if you're lucky. Code is just no
longer obvious.)

(The bit of code I was trying to convert is here:
https://code.google.com/p/jpeg-compressor/source/browse/trunk/jpgd.cpp
Clicking 'trunk' in the source path will lead you to the jpgd.h header that
is part of it.)

(BTW, when I started looking at C++ many years ago, wondering what it could
do, I was always amused by the examples because I'd been doing exactly the
same in the dynamic/scripting languages I created for my 3D/graphics
applications.

That took care of the temporaries someone mentioned elsewhere in the thread,
and had built-in types for things such as points, vectors and matrices, with
the ability to do arithmetic such as x = a+b*c (here, a, b might be vectors,
and c a scalar, or c is also a vector and b*c forms a vector product,
another vector iirc).

The difference is these are built-in features, so no surprises, and you
didn't need to hunt down the definitions of the names to find out what they
were. It's possible to make use of high-level features like this without the
code looking like a dog's dinner..

My point of view is a little different however, because I see a language
like C being used to implement something more high-level; I don't see the
need for it to have fancy features of its own. When C is used to implement a
Python interpreter, then to implement the latter's flexible arrays and
classes, it is not necessary for C itself to have flexible arrays and 
classes. Actually it would likely impede it!)

-- 
Bartc

 

BartC
10/13/2014 12:47:43 AM
On Sunday, October 12, 2014 5:22:16 PM UTC-4, Rick C. Hodgin wrote:
>
> I enjoy the C++ class, exceptions, and very little
> else. I would tell anyone, "you get very far beyond
> those two add-ons to C, and you're into the uncertain
> lands called obfuscation and misery."

That seems to be a weird property specific to C++: the more you do it right, 
the worse it gets.

In most languages, if you think in the language and write in a style 
appropriate to the language, using the language's features, everything is 
better; and if you try to write a program the same way you would in another 
language, the result is often longer, awkward and unreadable.

In C++, however, the opposite is true: you can get some nice advantages by 
writing C code in C++ and using a couple features that remove the cruft of 
memory and string management. Or converting Java to C++ and mapping over the 
object features you use in Java. But once you bust out templates and 
nontrivial OO features to write C++ like a C++ programmer, suddenly you just 
have an abstract wall of bytes.

We just had some summer projects using OpenCV, which is written in full-blown 
proper C++. Several times we had to take some OpenCV code and figure out the 
underlying algorithm so we could implement it in another language. This 
always led us on a wild goose chase through dozens of source files, because 
of all the functions that just set up a call to some other function in some 
other file. And that's not because OpenCV is badly written, it's because 
OpenCV is written correctly.

In defense of C++, however, one should bear in mind that it exists to handle 
very large complicated systems that need extensive scaffolding, like user 
interface APIs. There's probably no better language for writing OpenCV 
because OpenCV is an astoundingly big project. So naturally C++ code will 
look monstrous given the kind of things that have to be written in it---
however, I would also argue that even things you don't need to write in C++ 
will look like a glacier of code if written in C++.

--S
Xcott
10/13/2014 1:01:01 AM
jacob navia wrote:

> On 12/10/2014 23:40, Richard Heathfield wrote:
>> Operator overloading is something I'd love to see in C. Unfortunately, it
>> isn't going to happen, and I realise why, and the reasons are good
>> reasons, and I can accept that, but... well, a chap can dream, right?
> 
> I have presented in this group a compiler that implements operator
> overloading in C. Around 10 years ago.
> 
> The reactions were almost without exception negative by a group leaded
> by Mr Heathfield and Mr Thompson.
> 
> IT IS NOT C!

That's right - it isn't. Unless ISO adds operator overloading to C (which it 
isn't going to), C doesn't have operator overloading. It would be nice, but 
it isn't going to happen. (I can dream, but it won't help.)

But there is always C++, so if I want a C-like language with operator 
overloading, I know where to find it. (And, on those occasions where I find 
operator overloading to be a significant advantage, I will use C++.)

> 
> HORROR!
> 
> IT IS NOT STANDARD!
> 
> I am glad you have changed your mind.

I don't know what makes you think I've changed my mind. Operator overloading 
would have been nice ten years ago, and it would be nice now, but that 
doesn't change the fact that C doesn't have it and isn't likely to get it.

I am, however, prepared to change my mind if the facts change. If ISO 
standardises operator overloading in C, I'll be all in favour of that 
change, provided it doesn't break anything.

> It would have been better to look
> at my implementation, see what could be improved, and try to make a
> stand in the C Committee.

If you want to take up operator overloading with ISO, nobody is stopping 
you.

> 
> That still can be done Mr Heathfield. I would appreciate your comments.

I realise that what you are trying to do is to improve the C language 
according to your opinion of what constitutes an improvement. In this case, 
I happen to agree that adding operator overloading *would* be an improvement 
*if* it could be done without breaking other stuff (e.g. the simplicity of 
the language) - BUT at present the language does not incorporate this 
feature, and therefore it would be unwise for anyone to rely on its presence 
if they require their code to be portable to implementations other than your 
own. In my own case, for example, I don't use your implementation and 
therefore the fact that you've implemented operator overloading doesn't help 
me in the slightest. The same reasoning applies to all your extensions to 
C's grammar and syntax. It is less true of library functions, I suppose, 
since these can sometimes be ported if the source code is available.

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
Richard
10/13/2014 1:39:39 AM
On 2014-10-12 23:15, Siri Crews wrote:
> Some people want a typed Javascript,

TypeScript comes to mind.

> but in 18 months the new CPU will handle the current code fast
> enough.

The main benefit of static typing is that more programming errors can be 
found at compile-time. Explicit static typing also serves as documentation.


-- August
August
10/13/2014 6:51:04 AM
On 2014-10-13 02:47, BartC wrote:
> "Paul N" <gw7rib@aol.com> wrote in message
> news:9bb94b05-6247-4334-8fd8-674134d5aa56@googlegroups.com...
>> For instance, in C++ you can write:
>>
>> x = a + (b * c);
>>
>> where in C you would have to write:
>>
>> mult(temp, b, c);
>> add(x, a, temp);
>
> Yes, that's what the sales brochure would have you believe. The reality is
> somewhat different.
>
> (That add or multiply, or even assignment, might not mean what you think it
> means. Those x, a, b and c names might not be defined anywhere nearby. The
> function you're in might somehow be associated with some implied class
> instance, and that class definition (buried in a separate file) might
> finally yield the types of x, a, b and c if you're lucky. Code is just no
> longer obvious.)

Very well put. For the same reasons I eschew the wild card import 
feature that some languages provide.


-- August
August
10/13/2014 7:16:26 AM
On Sunday, October 12, 2014 9:10:56 PM UTC+1, Bart wrote:
> This was a jpeg encoder/decoder, a complex enough task to follow without the 
> language making it impossible.
> 
> (The only thing I can say in its favour, is that at least I managed to get 
> something working. The only C equivalent I could find involved 72 C source 
> files, compared with the two used here. One way or another, everyone seems 
> intent on making things more complicated than they need be.)
> 
> 
I've got a two file JPEG codec, one for the encoder one for the decoder.
It's in pure ANSI C.
Drop me a line if you want it.



Malcolm
10/13/2014 9:18:25 AM

"Malcolm McLean" <malcolm.mclean5@btinternet.com> wrote in message 
news:f73e23fa-b34f-4208-a9b4-3d74f3fdf364@googlegroups.com...
> On Sunday, October 12, 2014 9:10:56 PM UTC+1, Bart wrote:
>> This was a jpeg encoder/decoder, a complex enough task to follow without 
>> the
>> language making it impossible.
>>
>> (The only thing I can say in its favour, is that at least I managed to 
>> get
>> something working. The only C equivalent I could find involved 72 C 
>> source
>> files, compared with the two used here. One way or another, everyone 
>> seems
>> intent on making things more complicated than they need be.)
>>
>>
> I've got a two file JPEG codec, one for the encoder one for the decoder.
> It's in pure ANSI C.
> Drop me a line if you want it.

Thanks, I've sent you an email.

-- 
Bartc 

BartC
10/13/2014 10:08:09 AM

"Siri Crews" <chine.bleu@yahoo.com> wrote in message 
news:chine.bleu-E549FA.15025412102014@news.eternal-september.org...
>> > x = a + (b * c);
>> >
>> > where in C you would have to write:
>> >
>> > mult(temp, b, c);
>> > add(x, a, temp);
>>
>> Operator overloading is something I'd love to see in C. Unfortunately, it
>> isn't going to happen, and I realise why, and the reasons are good 
>> reasons,
>> and I can accept that, but... well, a chap can dream, right?
>
> The problem is how to free anonymous temporary results.
>        x = a + (b*c)
>
>        T1 = b*c;
>        x = a + T1;
>        release(T1);
>
> You either have to make T1 explicit and its release explicitly (C), forbid 
> T1
> from needing a release (this is what Ada does), or have some mechanism to
> implicitly save the value and release it later
>        x = a + ({T1 =} b*c); {release(T1);}
> where {...} marks hidden code added by the compiler.
>
> The original way to hide this was with reference counts; the latest 
> MacOSX
> compiler apparently can do this for you. Garbage collection was the next
> solution.

I would stay clear of both reference counts and garbage collection unless 
you want the language to use shallow assignments and therefore shared data. 
So if the types of the above are arbitrarily long integers for example, then 
after:

 a = b;

both a and b refer to exactly the same number. You would have to prohibit 
any in-place operations on a, because they would change b too, or have 
complicated mechanisms to do copy-on-write.

Your example I might implement in a dynamic language like this (using a 
stack model because it's simple), which manipulates references:

 push a, marking a as a copy
 push b, as a copy
 push c, as a copy
 multiply [b by c], freeing b and c as needed
 add [a and multiply result], freeing operands as needed
 pop to x, freeing x beforehand and duplicating a as needed

No garbage collection or reference counts needed. For a static language, the 
compiler will know what needs freeing or duplicating (and can make it 
register-based).

The real problems are how to impart to the compiler all the information that 
a new type needs so that it knows how to create it, duplicate it, free it, 
and, for numeric types (which I believe are the main ones that should be 
allowed overloading for arithmetic ops), how to denote them and how they are 
promoted and converted when mixed with the standard types and with other 
user-defined numeric types. It quickly becomes a can of worms that can just 
end up as a bad imitation of C++.

-- 
Bartc 

BartC
10/13/2014 10:40:27 AM
In article <Q1O_v.660967$AB3.269619@fx07.am4>, "BartC" <bc@freeuk.com> wrote:

> I would stay clear of both reference counts and garbage collection unless 
> you want the language to use shallow assignments and therefore shared data. 
> So if the types of the above are arbitrarily long integers for example, then 
> after:
> 
>  a = b;
> 
> both a and b refer to exactly the same number. You would have to prohibit 
> any in-place operations on a, because they would change b too, or have 
> complicated mechanisms to do copy-on-write.

See also CLU http://publications.csail.mit.edu/lcs/pubs/pdf/MIT-LCS-TR-225.pdf 
mutable and immutable objects.


> Your example I might implement in a dynamic language like this (using a 
> stack model because it's simple), which manipulates references:
> 
>  push a, marking a as a copy
>  push b, as a copy
>  push c, as a copy
>  multiply [b by c], freeing b and c as needed
>  add [a and multiply result], freeing operands as needed
>  pop to x, freeing x beforehand and duplicating a as needed

Are you willing to duplicate possibly very large values?

I implement an immutable stack with a heap and garbage collection. A push 
allocates a new cell, links it to the stack, and returns the new cell as the 
new stack value:
    s = push(s, v);
Nothing below the new top is modified so the stack is immutable and can be 
safely shared on multiple threads without locking.
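
A minimal sketch of such a persistent stack in C (cell reclamation is left to the garbage collector the scheme assumes, so nothing is freed here):

```c
#include <stdlib.h>

/* A persistent ("immutable") stack: push never modifies existing cells,
   it only allocates a new head that points at the old stack.  Two stacks
   can therefore share their tails safely across threads. */
typedef struct Cell {
    int value;
    const struct Cell *next;
} Cell;

const Cell *push(const Cell *s, int v)
{
    Cell *c = malloc(sizeof *c);
    c->value = v;
    c->next = s;          /* nothing below the new top is touched */
    return c;             /* the new stack; the old one is still valid */
}

int top(const Cell *s) { return s->value; }
```

After `s2 = push(s1, 2)`, both `s1` and `s2` remain usable: `s2` shares `s1` as its tail rather than copying it.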

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/13/2014 11:08:02 AM
"Malcolm McLean" <malcolm.mclean5@btinternet.com> wrote in message 
news:f73e23fa-b34f-4208-a9b4-3d74f3fdf364@googlegroups.com...
> On Sunday, October 12, 2014 9:10:56 PM UTC+1, Bart wrote:
>> This was a jpeg encoder/decoder, a complex enough task to follow without 
>> the
>> language making it impossible.
>>
>> (The only thing I can say in its favour, is that at least I managed to 
>> get
>> something working. The only C equivalent I could find involved 72 C 
>> source
>> files, compared with the two used here. One way or another, everyone 
>> seems
>> intent on making things more complicated than they need be.)
>>
>>
> I've got a two file JPEG codec, one for the encoder one for the decoder.
> It's in pure ANSI C.
> Drop me a line if you want it.

(I've just tried it, it compiles, links and works perfectly first time. It 
even exports exactly the same names ('loadjpeg', 'savejpeg') that I had to 
add manually to the other library because 
'jpgd::decompress_jpeg_image_from_file' was a trifle long-winded even before 
name-mangling. The binary is also 1/6th the size of the C++ one!

Why can't more open-source programs be like this? The C libjpeg project 
comprises 150 files.)

-- 
Bartc 

0
BartC
10/13/2014 11:30:29 AM
"Siri Crews" <chine.bleu@yahoo.com> wrote in message 
news:chine.bleu-5A4B44.04075313102014@news.eternal-september.org...
> In article <Q1O_v.660967$AB3.269619@fx07.am4>, "BartC" <bc@freeuk.com> 
> wrote:

>>  a = b;
>>
>> both a and b refer to exactly the same number. You would have to prohibit
>> any in-place operations on a, because they would change b too, or have
>> complicated mechanisms to do copy-on-write.
>
> See also CLU 
> http://publications.csail.mit.edu/lcs/pubs/pdf/MIT-LCS-TR-225.pdf
> mutable and immutable objects.

(I love that 1979 manual style! I was at college around the same time)

>> Your example I might implement in a dynamic language like this (using a
>> stack model because it's simple), which manipulates references:
.....
>>  pop to x, freeing x beforehand and duplicating a as needed
>
> Are you willing to duplicate possibly very large values?

That's one drawback. But most values aren't large. When they are, you are 
generally aware of that and will avoid assignments of such values (the only 
time the duplication occurs).

(And as I implement this stuff, I use a special, explicit type when data is 
expected to be shared, called a 'handle' (because the concept is similar to 
manipulating file handles: you don't expect a file's contents to be copied 
when you assign a handle from one variable to another).)

The advantage is being able to type:

  a = b;

and knowing that modifying a will never affect b, a behaviour that is 
persistent /regardless of type/.
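
A sketch of that guarantee in C, using an invented heap-backed type: "assignment" is a deep copy, so later in-place changes to one variable cannot reach the other:

```c
#include <stdlib.h>
#include <string.h>

/* Value semantics for a heap-backed type.  Big, big_from and big_assign
   are illustrative names, not from any real library. */
typedef struct { size_t len; int *digits; } Big;

Big big_from(int d)               /* make a one-digit value */
{
    Big b;
    b.len = 1;
    b.digits = malloc(sizeof(int));
    b.digits[0] = d;
    return b;
}

Big big_assign(const Big *src)    /* "a = b;" as a deep copy */
{
    Big dst;
    dst.len = src->len;
    dst.digits = malloc(src->len * sizeof(int));
    memcpy(dst.digits, src->digits, src->len * sizeof(int));
    return dst;
}
```

With shallow (reference) assignment the memcpy would be skipped and both variables would point at the same digits, which is exactly the sharing problem described above.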

-- 
Bartc
 

0
BartC
10/13/2014 11:46:11 AM
BartC wrote:

> 
> 
> "Malcolm McLean" <malcolm.mclean5@btinternet.com> wrote in message
> news:f73e23fa-b34f-4208-a9b4-3d74f3fdf364@googlegroups.com...
<snip>
>>>
>> I've got a two file JPEG codec, one for the encoder one for the decoder.
>> It's in pure ANSI C.
>> Drop me a line if you want it.
> 
> Thanks, I've sent you an email.

So have I. Curiosity got the better of me.
 
-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/13/2014 12:28:02 PM
BartC wrote:

> "Malcolm McLean" <malcolm.mclean5@btinternet.com> wrote in message
> news:f73e23fa-b34f-4208-a9b4-3d74f3fdf364@googlegroups.com...
<snip>
>>>
>> I've got a two file JPEG codec, one for the encoder one for the decoder.
>> It's in pure ANSI C.
>> Drop me a line if you want it.
> 
> (I've just tried it, it compiles, links and works perfectly first time. It
> even exports exactly the same names ('loadjpeg', 'savejpeg') that I had to
> add manually to the other library because
> 'jpgd::decompress_jpeg_image_from_file' was a trifle long-winded even
> before name-mangling. The binary is also 1/6th the size of the C++ one!
> 
> Why can't more open-source programs be like this? The C libjpeg project
> comprises 150 files.)

Malcolm very kindly dropped a copy to me as well. I haven't had time yet to 
have a decent look, but I did peek at the header file, and was somewhat 
astonished to find that it looks like this:

#ifndef jpeg_h
#define jpeg_h

unsigned char *loadjpeg(const char *path, int *width, int *height);
int savejpeg(char *path, unsigned char *rgb, int width, int height);

#endif

Now *that's* a header! Malcolm, if you have really truly made it as simple 
as this to load and save jpegs, you just earned yourself something like 
three thousand brownie points. Before you know it, you'll be a Guide! ;-)

I won't be able to take a close look for a day or two, but I'm very much 
looking forward to it.

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/13/2014 1:14:07 PM
On Monday, October 13, 2014 5:18:33 AM UTC-4, Malcolm McLean wrote:
> I've got a two file JPEG codec, one for the encoder one for the decoder.
> It's in pure ANSI C.
> Drop me a line if you want it.

Malcolm, is it code you can post for all?  I sure would like to see/use it.

Best regards,
Rick C. Hodgin
0
Rick
10/13/2014 2:00:12 PM
BartC wrote:
>
>
> "Siri Crews" <chine.bleu@yahoo.com> wrote in message
> news:chine.bleu-E549FA.15025412102014@news.eternal-september.org...
>>>> x = a + (b * c);
>>>>
>>>> where in C you would have to write:
>>>>
>>>> mult(temp, b, c);
>>>> add(x, a, temp);
>>>
>>> Operator overloading is something I'd love to see in C. Unfortunately, it
>>> isn't going to happen, and I realise why, and the reasons are good
>>> reasons,
>>> and I can accept that, but... well, a chap can dream, right?
>>
>> The problem is how to free anonymous temporary results.
>>         x = a + (b*c)
>>
>>         T1 = b*c;
>>         x = a + T1;
>>         release(T1);
>>
>> You either have to make T1 explicit and its release explicitly (C), forbid
>> T1
>> from needing a release (this is what Ada does), or have some mechanism to
>> implicitly save the value and release it later
>>         x = a + ({T1 =} b*c); {release(T1);}
>> where {...} marks hidden code added by the compiler.
>>
>> The originally way to hide this was with reference counts; the latest
>> MacOSX
>> compiler apparently can do this for you. Garbage collection was the next
>> solution.
>
> I would stay clear of both reference counts and garbage collection unless
> you want the language to use shallow assignments and therefore shared data.

Or you just do what the C++ standard does and include rules for the 
lifetime of temporary objects...

-- 
Ian Collins
0
Ian
10/13/2014 7:08:40 PM
On Sunday, October 12, 2014 4:36:00 PM UTC-5, Richard Heathfield wrote:
> Rick C. Hodgin wrote:
> 
> > I enjoy the C++ class, exceptions, and very little
> > else. I would tell anyone, "you get very far beyond
> > those two add-ons to C, and you're into the uncertain
> > lands called obfuscation and misery."
> > 
> 

The standard containers are pretty awesome, as are the standard algorithms.
With the std::copy function template, I can:

 - Copy the contents of an input stream to a container;
 - Write the contents of a container to an output stream;
 - Copy the contents of one container to another;

with almost no changes from one call to the other.  

    #include <algorithm>   // std::copy
    #include <iostream>    // std::cin, std::cout
    #include <iterator>    // stream iterators, back_inserter
    #include <vector>

    std::vector<int> myvec;
    /**
     * load vector from standard input
     */
    std::copy( std::istream_iterator<int>( std::cin ),
               std::istream_iterator<int>( ),
               std::back_inserter( myvec ));
     /**
      * write vector contents to standard output
      */
    std::copy( myvec.begin(), myvec.end(), 
               std::ostream_iterator<int>( std::cout, " " ));

That'll work with any container, stream, or other source that provides 
iterators and begin() and end().    

> > Hard to beat:
> >     Cxyz* myvar = new Cxyz(whatever);
> 
> Hard to re-size.
> 

When would you re-size a class instance?
0
John
10/13/2014 7:50:13 PM
John Bode wrote:

> On Sunday, October 12, 2014 4:36:00 PM UTC-5, Richard Heathfield wrote:
>> Rick C. Hodgin wrote:
>> 
<snip>
 
>> > Hard to beat:
>> >     Cxyz* myvar = new Cxyz(whatever);
>> 
>> Hard to re-size.
>> 
> 
> When would you re-size a class instance?

When I want one to start with, and then some more of them later.

Sxyz *myvar = malloc(sizeof *myvar);

/* ...later... */

Sxyz *t = realloc(myvar, n * sizeof *myvar);
if(t != NULL)
{
  myvar = t;
}

In C++, you could use new [] instead of new, but you'd struggle to resize 
it. You'd probably end up using std::vector instead. That's normally what I 
do, but occasionally I get weird problems with std::vector on one particular 
implementation I use, so I often end up reverting to malloc/realloc anyway.
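
That malloc/realloc pattern, written out in full as a sketch (Sxyz and grow_to are illustrative names), with the usual guard of assigning realloc's result to a temporary so the original pointer isn't lost on failure:

```c
#include <stdlib.h>

typedef struct { int x; } Sxyz;

/* Grow an allocation to hold n objects, keeping the old block if
   realloc fails (the old pointer remains valid in that case). */
Sxyz *grow_to(Sxyz *myvar, size_t n)
{
    Sxyz *t = realloc(myvar, n * sizeof *t);
    if (t != NULL)
        myvar = t;        /* success: adopt the (possibly moved) block */
    return myvar;
}

int demo(void)
{
    Sxyz *myvar = malloc(sizeof *myvar);   /* one object to start with */
    myvar->x = 42;
    myvar = grow_to(myvar, 100);           /* ...some more of them later */
    int kept = myvar[0].x;                 /* original contents preserved */
    free(myvar);
    return kept;
}
```

realloc preserves the contents of the old block up to the smaller of the two sizes, which is what makes this resizing idiom safe.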

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/13/2014 8:20:49 PM
On Sun, 2014-10-12, BartC wrote:
> I've just been working with a bit of C++ code for the first time (with a 
> view to spending a couple of hours converting it to C. Or to *any* language 
> that mere humans could understand, for that matter).
>
> But after that experience I don't think I'll ever complain about anything in 
> C again!
....
> Anyone reading this and is at the point of being undecided about whether to 
> learn C++ or not, please do everyone else a favour and don't bother!

Let me see that code, and I'll have an opinion about it.

/Jorgen

-- 
  // Jorgen Grahn <grahn@  Oo  o.   .     .
\X/     snipabacken.se>   O  o   .
0
Jorgen
10/13/2014 8:21:01 PM
On Monday, October 13, 2014 3:21:01 PM UTC-5, Richard Heathfield wrote:
[snip]
> You'd probably end up using std::vector instead. That's normally what I 
> do, but occasionally I get weird problems with std::vector on one 
> particular implementation I use, so I often end up reverting to 
> malloc/realloc anyway.
> 

Interesting - what implementation?

I pretty much live on vectors and maps these days; the only times I rely
on manual memory management are where I've coded myself into a corner
and don't have another option.  
0
John
10/13/2014 8:38:54 PM

"Jorgen Grahn" <grahn+nntp@snipabacken.se> wrote in message 
news:slrnm3od1b.1ks.grahn+nntp@frailea.sa.invalid...
> On Sun, 2014-10-12, BartC wrote:
>> I've just been working with a bit of C++ code for the first time (with a
>> view to spending a couple of hours converting it to C. Or to *any* 
>> language
>> that mere humans could understand, for that matter).
>>
>> But after that experience I don't think I'll ever complain about anything 
>> in
>> C again!
> ...
>> Anyone reading this and is at the point of being undecided about whether 
>> to
>> learn C++ or not, please do everyone else a favour and don't bother!
>
> Let me see that code, and I'll have an opinion about it.

I did post a link to it in a reply, but here it is again:

(The bit of code I was trying to convert is here:
https://code.google.com/p/jpeg-compressor/source/browse/trunk/jpgd.cpp
Clicking 'trunk' in the source path will lead you to the jpgd.h header that
is part of it.)

There might be a lot worse, but this is the first I've had to try and 
understand.

-- 
Bartc 

0
BartC
10/13/2014 9:06:49 PM
On Mon, 2014-10-13, BartC wrote:
>
>
> "Jorgen Grahn" <grahn+nntp@snipabacken.se> wrote in message 
> news:slrnm3od1b.1ks.grahn+nntp@frailea.sa.invalid...
>> On Sun, 2014-10-12, BartC wrote:
>>> I've just been working with a bit of C++ code for the first time (with a
>>> view to spending a couple of hours converting it to C. Or to *any* 
>>> language
>>> that mere humans could understand, for that matter).
>>>
>>> But after that experience I don't think I'll ever complain about anything 
>>> in
>>> C again!
>> ...
>>> Anyone reading this and is at the point of being undecided about whether 
>>> to
>>> learn C++ or not, please do everyone else a favour and don't bother!
>>
>> Let me see that code, and I'll have an opinion about it.
>
> I did post a link to it in a reply, but here it is again:
>
> (The bit of code I was trying to convert is here:
> https://code.google.com/p/jpeg-compressor/source/browse/trunk/jpgd.cpp
> Clicking 'trunk' in the source path will lead you to the jpgd.h header that
> is part of it.)
>
> There might be a lot worse, but this is the first I've had to try and 
> understand.

Ok, thanks.

I see nothing remarkable about it, except
- it's JPEG decompression; it's bound to be almost unreadable
  to normal people who just want to uncompress images. E.g. what
  is a Matrix44 or X113 or IDCT?
- the author seems to have a thing for few and really big source
  files with really long lines; that also makes it much harder

Other than that, I mostly see plain old C code, lightly and tastefully
wrapped in C++. He /does/ use templates, but it makes sense, and it's
really basic usage of templates -- the kind you can learn by reading a
book for an hour or so.  I can explain it if you want.

But of course it /is/ C++ code: you cannot expect to understand it
without some understanding of the language it's written in.  Since you
say it's the first time you've worked with C++, I'm not surprised you
have difficulties.

I'm a bit surprised, though, that you reject the language
based on this.

/Jorgen

-- 
  // Jorgen Grahn <grahn@  Oo  o.   .     .
\X/     snipabacken.se>   O  o   .
0
Jorgen
10/13/2014 10:34:33 PM
"Jorgen Grahn" <grahn+nntp@snipabacken.se> wrote in message
news:slrnm3okro.1ks.grahn+nntp@frailea.sa.invalid...
> On Mon, 2014-10-13, BartC wrote:

>> I did post a link to it in a reply, but here it is again:
>>
>> (The bit of code I was trying to convert is here:
>> https://code.google.com/p/jpeg-compressor/source/browse/trunk/jpgd.cpp
>> Clicking 'trunk' in the source path will lead you to the jpgd.h header
>> that
>> is part of it.)
>>
>> There might be a lot worse, but this is the first I've had to try and
>> understand.
>
> Ok, thanks.
>
> I see nothing remarkable about it, except
> - it's JPEG decompression; it's bound to be almost unreadable
>  to normal people who just want to uncompress images. E.g. what
>  is a Matrix44 or X113 or IDCT?
> - the author seems to have a thing for few and really big source
>  files with really long lines; that also makes it much harder
>
> Other than that, I mostly see plain old C code

That's what I thought too at first! The templates (although odd-looking such
as template<>struct col<1> {....}) at least stood out with their <,>
brackets, while "::" I was familiar with.

Until I tried to trace through the code and found it was impossible.  If you
had to construct some code that was the embodiment of the 'twisty little
passages, all alike' description, then this comes close!

Look, for example, at function (or method) jpeg_decoder::gray_convert() at
line 2054. In the body you will see names such as m_max_mcu_y_size; these
are only defined inside the class jpeg_decode. Presumably, then, they refer
to fields in an instance of that class. But where is that instance? Because
::gray_convert() takes apparently no parameters.

It's called from line 2196, but there's nothing in front of it such as
X.gray_convert() (or is it X:: ?) so that X is passed as the
implied parameter 'self' or 'this' (which gray_convert doesn't use anyway).
So where /is/ this seemingly intangible class instance? Apparently nowhere!

This seems odd: on the one hand, you have jpeg_decoder:: everywhere; on the
other, there all these naked, unqualified class members dotted about,
unattached to any particular class instance. (I put a dummy 'int
m_max_mcu_y_size' definition at the top of the file; it still compiled, but
which m_max_mcu_y_size is now being used inside ::gray_convert()? It could
be anyone's guess!)

> But of course it /is/ C++ code: you cannot expect to understand it
> without some understanding of the language it's written in.

You're right, I don't know the language, and don't want to. But I could know
it inside out, it would still be a huge amount of work to disentangle
everything and turn it into regular C. With other languages I don't know,
however, usually I can figure what the code is trying to do, provided it
doesn't use OOP which is like a death-knell for any readability it might
have had. (And provided it isn't Lisp or anything weird-looking...)

> I'm a bit surprised, though, that you reject the language
> based on this.

I reject it in this case because it's encouraged what I think is the wrong
approach for this task, bundling data structures and function declarations
untidily into one class, while the function bodies themselves are everywhere
else, and in a separate file (so much for encapsulation!).

-- 
Bartc 

0
BartC
10/14/2014 12:13:06 AM
On 10/13/2014 7:13 PM, BartC wrote:
> "Jorgen Grahn" <grahn+nntp@snipabacken.se> wrote in message
> news:slrnm3okro.1ks.grahn+nntp@frailea.sa.invalid...
>> On Mon, 2014-10-13, BartC wrote:
>>> I did post a link to it in a reply, but here it is again:

>>> (The bit of code I was trying to convert is here:
>>> https://code.google.com/p/jpeg-compressor/source/browse/trunk/jpgd.cpp
>>> Clicking 'trunk' in the source path will lead you to the jpgd.h header
>>> that
>>> is part of it.)

>>> There might be a lot worse, but this is the first I've had to try and
>>> understand.

>> Ok, thanks.

>> I see nothing remarkable about it, except
>> - it's JPEG decompression; it's bound to be almost unreadable
>>  to normal people who just want to uncompress images. E.g. what
>>  is a Matrix44 or X113 or IDCT?
>> - the author seems to have a thing for few and really big source
>>  files with really long lines; that also makes it much harder

>> Other than that, I mostly see plain old C code

> That's what I thought too at first! The templates (although odd-looking
> such
> as template<>struct col<1> {....}) at least stood out with their <,>
> brackets, while "::" I was familiar with.

> Until I tried to trace through the code and found it was impossible.  If
> you
> had to construct some code that was the embodiment of the 'twisty little
> passages, all alike' description, then this comes close!

> Look, for example, at function (or method) jpeg_decoder::gray_convert() at
> line 2054. In the body you will see names such as m_max_mcu_y_size; these
> are only defined inside the class jpeg_decode. Presumably, then, they refer
> to fields in an instance of that class. But where is that instance? Because
> ::gray_convert() takes apparently no parameters.

It's a member function. There's an implicit parameter that is, 
effectively, a constant pointer to a jpeg_decoder.

> It's called from line 2196, but there's nothing in front of it such as
> X.gray_convert() (or is it X:: ?) so that X is passed as the
> implied parameter 'self' or 'this' (which gray_convert doesn't use anyway).
> So where /is/ this seemingly intangible class instance? Apparently nowhere!

Since it's called from a member function, it comes from the implicit 
this parameter that jpeg_decoder::decode was called with.
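
For readers coming from C, that implicit parameter can be written out explicitly. The type and member names below echo the thread, but the code is a hypothetical sketch, not the real jpgd source:

```c
/* The C spelling of a C++ member function: the instance is an explicit
   first parameter, and m_... fields are reached through it. */
typedef struct jpeg_decoder {
    int m_max_mcu_y_size;
} jpeg_decoder;

int gray_convert(jpeg_decoder *self)      /* 'self' plays the role of 'this' */
{
    return self->m_max_mcu_y_size;        /* not a global: a field of *self */
}

int decode(jpeg_decoder *self)
{
    /* in C++ this call is written simply gray_convert() inside a member
       function; the instance is forwarded invisibly */
    return gray_convert(self);
}
```

So the "intangible" instance is just the decoder object the outermost member function was called on, threaded through every call without being written down.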

> This seems odd: on the one hand, you have jpeg_decoder:: everywhere; on the
> other, there all these naked, unqualified class members dotted about,
> unattached to any particular class instance. (I put a dummy 'int
> m_max_mcu_y_size' definition at the top of the file; it still compiled, but
> which m_max_mcu_y_size is now being used inside ::gray_convert*()? It could
> be anyone's guess!)

>> But of course it /is/ C++ code: you cannot expect to understand it
>> without some understanding of the language it's written in.

> You're right, I don't know the language, and don't want to. But I could
> know
> it inside out, it would still be a huge amount of work to disentangle
> everything and turn it into regular C. With other languages I don't know,
> however, usually I can figure what the code is trying to do, provided it
> doesn't use OOP which is like a death-knell for any readability it might
> have had. (And provided it isn't Lisp or anything weird-looking...)

It wasn't any particular effort for me. Then again, I do have a fair 
amount of experience with C++. Still, it's at least a data point that 
runs counter to your assertion that it would be a huge amount of work 
even if you knew C++ inside out.

>> I'm a bit surprised, though, that you reject the language
>> based on this.

> I reject it in this case because it's encouraged what I think is the wrong
> approach for this task, bundling data structures and function declarations
> untidily into one class, while the function bodies themselves are
> everywhere
> else, and in a separate file (so much for encapsulation!).

It doesn't do anything of the sort. The four class definitions related 
to decoding are in a single header file. The implementations of the 
functions defined in those classes are in a single implementation file. 
This method of splitting up the definition and implementation is taken 
pretty much straight from C. It seems odd to reject C++ (in favor of C) 
for this reason.

Martin Shobe

0
Martin
10/14/2014 2:15:56 AM
Richard Heathfield <invalid@see.sig.invalid> wrote:
> BartC wrote:
> 
>> "Malcolm McLean" <malcolm.mclean5@btinternet.com> wrote in message
>> news:f73e23fa-b34f-4208-a9b4-3d74f3fdf364@googlegroups.com...
> <snip>
>>>>
>>> I've got a two file JPEG codec, one for the encoder one for the decoder.
>>> It's in pure ANSI C.
>>> Drop me a line if you want it.
>> 
>> (I've just tried it, it compiles, links and works perfectly first time.
>> Why can't more open-source programs be like this?
> 
> Malcolm very kindly dropped a copy to me as well. I haven't had time yet to 
> have a decent look,
> I won't be able to take a close look for a day or two, but I'm very much 
> looking forward to it.

If you've got $2.96 USD to spare, look at it alongside page 459
(and related pages) of
 lulu.com/shop/malcolm-mclean/basic-algorithms/ebook/product-17550309.html
Minor quibble: the toc has chapter page#'s wrong, e.g., "A JPEG Codec"
(you may want to read that whole chapter) starts on pg 419, not pg 440
as listed. Major remark: it's overall remarkably useful; kind of like
an expanded version of the "Selected Topics" chapter in Cormen, et al.
Actually, that's not quite an adequate comparison, e.g., MM has many
more topics and much more "fully functional" code, but I can't think of
a better comparison offhand.
-- 
John Forkosh  ( mailto:  j@f.com  where j=john and f=forkosh )
0
JohnF
10/14/2014 4:57:03 AM
On Sun, 12 Oct 2014 14:44:48 -0700 (PDT), "Rick C. Hodgin"
<rick.c.hodgin@gmail.com> wrote:

>Hard to beat: 
>    Cxyz* myvar = new Cxyz(whatever);
>    myvar->function(a, b, c);
>    delete myvar;

CppClassxyz  myvar;
myvar.function(a, b, c);

>Rather than:
>    Sxyz* myvar = xyz_new(whatever);
>    xyz_function(myvar, a, b, c);
>    xyz_delete(&myvar);
>
>...both in use, and in definition. :-)
>
>Best regards,
>Rick C. Hodgin

0
Rosario193
10/14/2014 7:05:42 AM
John Bode wrote:

> On Monday, October 13, 2014 3:21:01 PM UTC-5, Richard Heathfield wrote:
> [snip]
>> You'd probably end up using std::vector instead. That's normally what I
>> do, but occasionally I get weird problems with std::vector on one
>> particular implementation I use, so I often end up reverting to
>> malloc/realloc anyway.
>> 
> 
> Interesting - what implementation?

Borland.

> I pretty much live on vectors and maps these days; the only times I rely
> on manual memory management are where I've coded myself into a corner
> and don't have another option.

There are two schools of thought on this: (a) memory management is far too 
important to be left to the programmer; (b) memory management is far too 
important to be left to the language. Clearly we are in opposing camps. But 
I have some good news, John - in a couple of months, there's going to be a 
24-hour ceasefire. Fancy a game of football about half-way between?

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/14/2014 7:06:18 AM
On Sun, 12 Oct 2014 15:03:03 -0700, Siri Crews wrote:
>> > x = a + (b * c);
>> > 
>> > where in C you would have to write:
>> > 
>> > mult(temp, b, c);
>> > add(x, a, temp);
>> 
>> Operator overloading is something I'd love to see in C. Unfortunately, it 
>> isn't going to happen, and I realise why, and the reasons are good reasons, 
>> and I can accept that, but... well, a chap can dream, right?
>
>The problem is how to free anonymous temporary results.
>        x = a + (b*c)
>
>        T1 = b*c;
>        x = a + T1;
>        release(T1);

one can ensure correct results and efficient memory use if the expression
is not too deeply nested, i.e. if there is a limit on the nesting depth

for example g=(x+(b+y+(z+p))) would have a depth of 3 or something like that

for example one can assume expressions are never nested more than 20 ()
deep, etc....

the trick I found is to use a circular set of, for example, 20 global
variables of the result type, with a global index...

so the next temporary is circular[(index++)%20]

the problem is that the operator + above in "g=(x+(b+y+(z+p)))" cannot
itself use that circular buffer in its own code

so it could be safe only within one function, and not for the functions
that function calls...
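
A minimal sketch of that circular-pool idea, with invented names (Temp, next_temp, add2). It stays correct only while an expression never has more than POOL temporaries live at once, which is the depth limit being assumed:

```c
/* Fixed circular pool of expression temporaries: no malloc/free per
   intermediate result.  Nested calls each consume one slot, so the
   nesting depth must stay below POOL. */
#define POOL 20

typedef struct { long v; } Temp;

static Temp pool[POOL];
static int pool_index = 0;

static Temp *next_temp(void)
{
    return &pool[(pool_index++) % POOL];
}

Temp *add2(const Temp *a, const Temp *b)   /* overloaded '+' stand-in */
{
    Temp *t = next_temp();
    t->v = a->v + b->v;
    return t;                              /* caller never frees this */
}

/* g = x + (b + y + (z + p)) -- four adds, four pool slots */
long demo(void)
{
    Temp x = {1}, b = {2}, y = {3}, z = {4}, p = {5};
    return add2(&x, add2(&b, add2(&y, add2(&z, &p))))->v;
}
```

The caveat in the post applies: if add2 itself evaluated further pooled expressions internally, it could recycle a slot its caller still holds, which is why the scheme only works within one level of function.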

>You either have to make T1 explicit and its release explicitly (C), forbid T1 
>from needing a release (this is what Ada does), or have some mechanism to 
>implicitly save the value and release it later
>        x = a + ({T1 =} b*c); {release(T1);}
>where {...} marks hidden code added by the compiler.
>
>The originally way to hide this was with reference counts; the latest MacOSX 
>compiler apparently can do this for you. Garbage collection was the next 
>solution.

I think it is not a solution

>C++ was going to be better than everyone afore it so it introduced its zeusawful 
>destructors, and it's been limping along ever since dealing with the 
>ramifications of that decision.

0
Rosario193
10/14/2014 7:06:49 AM
Rosario193 wrote:
> CppClassxyz myvar;
> myvar.function(a, b, c);

It may sound silly to some developers,
but I am not a fan of non-pointers. I
am a fan of documenting the steps
involved in completing a task, and not of
letting "invisible" things happen when
there are non-invisible ways to do the
same. That's what procedural function
steps/calls are for, they help document,
and make debugging easier.

For example, having documented the extra
1.5 steps makes it clear even to a non-
developer what is happening.

I believe there should be a lnew command
to allocate the local stack for this purpose.
It would accomplish the same thing, but
do so explicitly, and would also not
require the delete.

    Cxyz* myvar = lnew Cxyz(whatever);
    myvar->function(a,b,c);

However, I still like:

    Cxyz.new(whatever).function(a,b,c).delete();

But with the explicit new/reference/delete
cycle, a new "dew" keyword could be
introduced, meaning "new with (implicit)
delete after use," as in:

    Cxyz.dew(whatever).function(a,b,c);

It is also a play on word sounds in
English as "dew" sounds like "do." :-)

Its purpose differs from what could be:

    Cxyz::function(a,b,c)

....because of the ctor and dtor, and all
that brings.

Best regards,
Rick C. Hodgin
0
Rick
10/14/2014 7:47:10 AM
Rick C. Hodgin wrote:
> It would accomplish the same thing,
> but do so explicitly, and would also not
> require the delete.
>
>    Cxyz* myvar = lnew Cxyz(whatever);
>    myvar->function(a,b,c);

I shouldn't say, "would not require the delete,"
because it is still required, but all lnew allocations
would self-clean any non-explicitly-delete'd
objects at function exit.

I realize this may also seem to go against my
"no invisible things" statement, but it would not
be completely invisible, as the lnew allocation
was explicitly made, and the explicit delete can
still be used and documented.

Best regards,
Rick C. Hodgin
0
Rick
10/14/2014 8:10:46 AM
"Martin Shobe" <martin.shobe@yahoo.com> wrote in message 
news:m1i11h$cih$1@dont-email.me...
> On 10/13/2014 7:13 PM, BartC wrote:

>>But I could know it inside out, it would still be a huge amount of work to 
>>disentangle
>> everything and turn it into regular C.

> It wasn't any particular effort for me. Then again, I do have a fair 
> amount of experience with C++. Still, it's at least a data point that runs 
> counter to your assertion that it would be a huge amount of work even if 
> you knew C++ inside out.

If that was the case, why is an equivalent C version so hard to find? As 
you'd think someone would have done it already. Or perhaps if you've got a 
couple of hours spare ...

(I suspect the only people who are capable of translating such code will 
also be the people who think it's fine as it is!)

>> I reject it in this case because it's encouraged what I think is the 
>> wrong
>> approach for this task, bundling data structures and function 
>> declarations
>> untidily into one class, while the function bodies themselves are
>> everywhere
>> else, and in a separate file (so much for encapsulation!).
>
> It doesn't do anything of the sort. The four class definitions related to 
> decoding are in a single header file. The implementation of the functions 
> defined in those classes are in a single implementation file. This method 
> of splitting up the definition and implementation is taken pretty much 
> straight from C. It seems odd to reject C++ (in favor of C) for this 
> reason.

The difference is that, given actual C code, you can pretty much translate 
it line-by-line to another language, without sight of the declarations or 
definitions of the identifiers on each line. (Well, provided people haven't 
gone mad with macros.)

You can't do that with the C++ code even if you know the language well.

Another thing is that, given any standalone identifier (not following "." or 
"->"), the process of tracing it back to its definition is straightforward 
in C.

In C++, it's more laborious and might even be ambiguous when you have 
overloaded function and operator names (not forgetting multiple 
inheritance).

(The language-building features of C++ are interesting, and useful when used 
in a constrained manner, but not when used everywhere in everyday code 
because they lead to obfuscation).

-- 
Bartc 

0
BartC
10/14/2014 10:07:58 AM
On Tuesday, October 14, 2014 6:10:07 AM UTC-4, Bart wrote:
> If that was the case, why is an equivalent C version so hard to find?
> As you'd think someone would have done it already. Or perhaps if you've
> got a  couple of hours spare ...

I agree with you, Bart.  It was confusing.  I would rather write one
from scratch myself using a typewritten explanation of the algorithm
than sift through the source code at that link.

"Real use" of C++ makes everything so hard.  It's not worth it because
it's not THAT much more difficult to do it in C, or in a lesser subset
of C++, while simultaneously making it 10x easier to maintain.

I think C++ has gone way too far in its "simplification".  The
complexity and steep learning curves, to me, simply do not justify
the benefits.

Best regards,
Rick C. Hodgin
0
Rick
10/14/2014 10:43:34 AM
Richard Heathfield <invalid@see.sig.invalid> writes:
<snip>
> There are two schools of thought on this: (a) memory management is far too 
> important to be left to the programmer; (b) memory management is far too 
> important to be left to the language. Clearly we are in opposing
> camps.

I find this view very strange!  Why is it that programmers are in these
two camps rather than programs or circumstances?  It seems to me there are
situations where memory management is too important to be left to the
programmer, and others where it is too important to be left to the language
and, no doubt, programmers will disagree about the boundary cases, but do
you think that disagreement is likely to be consistent enough to
attribute it to the programmer rather than the program?

<snip>
-- 
Ben.
0
Ben
10/14/2014 10:49:15 AM
"Rick C. Hodgin" <rick.c.hodgin@gmail.com> writes:
<snip>
> It may sound silly to some developers,
> but I am not a fan of non-pointers. I
> am a fan of documenting the steps
> involved in completing a task, and not of
> letting "invisible" things happen when
> there are non-invisible ways to do the
> same.
<snip>
> I believe there should be a lnew command
> to allocate the local stack for this purpose.
> It would accomplish the same thing, but
> do so explicitly, and would also not
> require the delete.
>
>     Cxyz* myvar = lnew Cxyz(whatever);
>     myvar->function(a,b,c);

myvar is allocated on the stack and that's being done "invisibly".  Algol
68 did what you want, but it finished the job by doing it in all cases.

Anyway, what does lnew get you over and above the normal stack allocation:

  Cxyz myvar[] = {whatever};

you get (more or less) a pointer, and everyone knows it's on the stack?

Note, I'm not saying it's a good thing to do -- I have no idea what your
objection to "non-pointers" is -- but I don't see the need to
proliferate language features to satisfy it.  (To keep it slightly
topical, I put the "=" in so it's valid C.  I think modern C++ style is
to omit it, given the new {} initialisers.)
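A sketch of that one-element-array form (the `Cxyz` struct and field are invented for illustration); it is valid as both C and C++:

```cpp
struct Cxyz { int whatever; };

int get_value(const struct Cxyz *p) { return p->whatever; }

int demo_stack(void) {
    /* A one-element array with automatic storage duration: in expressions,
       myvar decays to a pointer, so the code reads like pointer-based code,
       yet the object visibly lives on the stack and needs no delete. */
    struct Cxyz myvar[] = { { 42 } };
    return get_value(myvar);      /* decays to &myvar[0] */
}
```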

<snip>
-- 
Ben.
0
Ben
10/14/2014 11:09:06 AM
On 10/14/2014 5:07 AM, BartC wrote:
> "Martin Shobe" <martin.shobe@yahoo.com> wrote in message
> news:m1i11h$cih$1@dont-email.me...
>> On 10/13/2014 7:13 PM, BartC wrote:
>>> But I could know it inside out, it would still be a huge amount of
>>> work to disentangle
>>> everything and turn it into regular C.

>> It wasn't any particular effort for me. Then again, I do have a fair
>> amount of experience with C++. Still, it's at least a data point that
>> runs counter to your assertion that it would be a huge amount of work
>> even if you knew C++ inside out.

> If that was the case, why is an equivalent C version so hard to find? As
> you'd think someone would have done it already. Or perhaps if you've got
> a couple of hours spare ...

What does this have to do with what I've said? Do you think that in 
order to understand what some C++ code means I have to create an 
equivalent C version?

> (I suspect the only people who are capable of translating such code will
> also be the people who think it's fine as it is!)

I suspect that anyone who wanted a C jpeg library would simply write one 
rather than trying to duplicate a C++ one.

>>> I reject it in this case because it's encouraged what I think is the
>>> wrong
>>> approach for this task, bundling data structures and function
>>> declarations
>>> untidily into one class, while the function bodies themselves are
>>> everywhere
>>> else, and in a separate file (so much for encapsulation!).

>> It doesn't do anything of the sort. The four class definitions related
>> to decoding are in a single header file. The implementation of the
>> functions defined in those classes are in a single implementation
>> file. This method of splitting up the definition and implementation is
>> taken pretty much straight from C. It seems odd to reject C++ (in
>> favor of C) for this reason.

> The difference is that, given actual C code, you can pretty much
> translate it line-by-line to another language, without sight of the
> declarations or definitions of the identifiers on each line. (Well,
> provided people haven't gone mad with macros.)

If that was your point, why didn't you say that instead of talking about 
how bad something was that it borrowed from C? (I also seriously doubt 
your assertion in any case. Try converting C line-by-line to PATH.)

> You can't do that with the C++ code even if you know the language well.

But why would you want to? Why would anyone care if the translation is 
line-by-line?

> Another thing, is that given any standalone identifier (not following
> "." or "->"), the process of tracing that back to its definition is
> straightforward in C.

> In C++, it's more laborious and might even be ambiguous when you have
> overloaded function and operator names (not forgetting multiple
> inheritance).

It's not ambiguous, the compiler wouldn't be able to do it if it were. 
Otherwise, this is a legitimate complaint. If you find overloading
confusing, then C++ will confuse you.
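For instance, resolution is driven by the static type of each argument, so a reader traces a call by applying the same rule the compiler does (names here are invented for illustration):

```cpp
#include <string>

// Two overloads: each call site picks one by the static argument type.
std::string describe(int)    { return "int"; }
std::string describe(double) { return "double"; }

std::string demo_overload() {
    // 1 is an int, 1.5 is a double; each call resolves at compile time.
    return describe(1) + "/" + describe(1.5);
}
```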

[snip]

Martin Shobe

0
Martin
10/14/2014 12:10:12 PM
"Martin Shobe" <martin.shobe@yahoo.com> wrote in message
news:m1j3rt$fal$1@dont-email.me...
> On 10/14/2014 5:07 AM, BartC wrote:

>> (I suspect the only people who are capable of translating such code will
>> also be the people who think it's fine as it is!)
>
> I suspect that anyone who wanted a C jpeg library would simply write one
> rather than trying to duplicate a C++ one.

That was the next option. Not an appealing one because jpeg is complex and
also because I've done plenty of messing with image file formats in the
past.

>> The difference is that, given actual C code, you can pretty much
>> translate it line-by-line to another language, without sight of the
>> declarations or definitions of the identifiers on each line. (Well,
>> provided people haven't gone mad with macros.)

> If that was your point, why didn't you say that instead of talking about
> how bad something was that it borrowed from C? (I also seriously doubt
> your assertion in any case. Try converting C line-by-line to PATH.)

I can convert a lot of C code line-by-line to my own two languages that I
use. I can think of other /conventional/ languages where it would be also
straightforward, especially if the algorithm is clear and that is translated
instead.

>> You can't do that with the C++ code even if you know the language well.
>
> But why would you want to? Why would anyone care if the translation is
> line-by-line?

Someone who would have to do exactly that?

>> Another thing, is that given any standalone identifier (not following
>> "." or "->"), the process of tracing that back to its definition is
>> straightforward in C.
>
>> In C++, it's more laborious and might even be ambiguous when you have
>> overloaded function and operator names (not forgetting multiple
>> inheritance).
>
> It's not ambiguous, the compiler wouldn't be able to do it if it were.
> Otherwise, this is a legitimate complaint. If you find overloading
> confusing, then C++ will confuse you.

I design my own languages and write compilers. I've looked at overloading
(and a lot of other OO stuff) and decided not to bother with it (as
user-defined parts of  the language) because, while a compiler might be able
to follow some torturous algorithm to figure out what's what, for the source
to be understandable, the human reader would have to do the same!

(Besides, one of my languages being dynamically typed, you get a lot of
these features for free anyway, eg. generics.)

However I don't particularly want to start a language war. This is just my 
personal opinion. And in the last couple of days, my opinion of C has gone 
up, and that of C++ has gone down.

-- 
Bartc 

0
BartC
10/14/2014 12:55:42 PM
On Sun, 12 Oct 2014 15:33:23 -0700, Rick C. Hodgin wrote:

> Ian Collins wrote:
>> The ability to automatically allocate (through a constructor) and
>> release (through a destructor) is probably the biggest advantage C++ has
>> over C. It is the one C++ feature neither C nor languages with garbage
>> collection can emulate.
> 
> Sxyz* myvar = xyz_new(whatever);  //ctor .
> .
> .
> xyz_delete(&myvar);  //dtor

Except that now you have to ensure that xyz_delete is called regardless of
how control escapes the scope: reaching the end of the scope, return,
break, continue, goto, exception, signal, longjmp(), ...

Handling the earlier examples is feasible in C, albeit tedious, ugly and
error-prone. Once you add in exceptions, destructors become non-optional
as a language feature. FWIW, C++ won't invoke exceptions for signals or
longjmp(), but the existence of exceptions make those far less common.

Destructors are sufficiently useful that they've spawned their own idiom,
creating class wrappers which exist solely so that you can put any
clean-up (e.g. closing files) into the destructor and rely upon the
implementation to ensure that it gets called at the appropriate point.
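A minimal sketch of that wrapper idiom (the `FileGuard` name and `write_line` helper are invented for illustration):

```cpp
#include <cstdio>

// Minimal RAII wrapper: the destructor runs when control leaves the scope
// by any route (return, break, exception, ...), so the file always closes.
struct FileGuard {
    std::FILE *fp;
    FileGuard(const char *path, const char *mode)
        : fp(std::fopen(path, mode)) {}
    ~FileGuard() { if (fp) std::fclose(fp); }
    FileGuard(const FileGuard &) = delete;            // forbid double-close
    FileGuard &operator=(const FileGuard &) = delete;
};

bool write_line(const char *path) {
    FileGuard f(path, "w");
    if (!f.fp) return false;      // early exit: destructor runs, fclose skipped
    std::fputs("hello\n", f.fp);
    return true;                  // normal exit: destructor closes the file
}
```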

0
Nobody
10/14/2014 1:37:53 PM
On Tuesday, October 14, 2014 9:38:05 AM UTC-4, Nobody wrote:
> On Sun, 12 Oct 2014 15:33:23 -0700, Rick C. Hodgin wrote:
> > Sxyz* myvar = xyz_new(whatever);  //ctor .
> > .
> > .
> > xyz_delete(&myvar);  //dtor
> 
> Except that now you have to ensure that xyz_delete is called regardless
> of how control escapes the scope: reaching the end of the scope, return,
> break, continue, goto, exception, signal, longjmp(), ...

You have to do that in C++ with any explicitly created objects using new.
If you use a local variable definition the compiler handles it for you, but
then it is being done automatically and invisibly.

You could accomplish the same thing by always using this:

    goto clean_quit_and_exit;

;-)
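That single-label idiom usually looks like this in C (a sketch with invented names; the casts are only there so it also compiles as C++):

```cpp
#include <stdlib.h>

// The classic single-exit pattern: every failure path funnels through one
// label, so each resource is released exactly once, in reverse order.
int process_demo(void) {
    int rc = -1;
    char *buf1 = NULL, *buf2 = NULL;

    buf1 = (char *)malloc(64);
    if (!buf1) goto clean_quit_and_exit;
    buf2 = (char *)malloc(64);
    if (!buf2) goto clean_quit_and_exit;

    rc = 0;                       /* the real work would happen here */

clean_quit_and_exit:
    free(buf2);                   /* free(NULL) is a no-op, so this is safe */
    free(buf1);
    return rc;
}
```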

> Handling the earlier examples is feasible in C, albeit tedious, ugly and
> error-prone. Once you add in exceptions, destructors become non-optional
> as a language feature. FWIW, C++ won't invoke exceptions for signals or
> longjmp(), but the existence of exceptions make those far less common.
> 
> Destructors are sufficiently useful that they've spawned their own idiom,
> creating class wrappers which exist solely so that you can put any
> clean-up (e.g. closing files) into the destructor and rely upon the
> implementation to ensure that it gets called at the appropriate point.

I have no issues with constructors or destructors.  My issues relate to
using local variable class definitions which are not created explicitly
with new, or in my suggestion lnew.  I do not like the fact that these
things are taking place "invisibly".

I have devised some workarounds for these in RDC that I like.  Along
with return values, a message or something I call an "inquiry" can be
returned, which is like an error condition.  The compiler automatically
adds codes to branch to those locations and they are added with something
I call a cask.  They also use a new flow control mechanism called a flow.

    flow {
        // Do some code, if it has an inquiry branch to the code below
        something() (|meia|inquiry||);

    } subflow inquiry optionalNameHere(meia* parameter) {
        // Code for the inquiry will go here

    } void something(void) {
        return meia(whatever);

    } always after {
        // Clean house
    }

Sample of what casks look like, along with the almost current version
of my IDE, can be seen here:  http://www.visual-freepro.org/images/vjr_052.png

In this design, you enter a flow.  There are ways to flowto other places,
you can also flowout at any time, and because of the presence of "always
after" code, it will always exit that code.  You can also have "always
before".  The "void something(void)" is called an adhoc.  It allows
functions related explicitly to that flow to be created which are only
accessible to the flow, and self-document their associativity to that
entity.  They do not have to be functions though either.  They can be
mere blocks of code which remove the logic from some lengthy nested
place inside of a normal bit of code, and categorize it by name for
both documentation, and ease of maintenance.  When it reaches the },
it will automatically return in the case of a definition which has a
return parameter, or if it was simply given a subflow name, then it
will exit out of the flow.

Lots of other such things will be introduced with my RDC (should the
Lord ever allow me to complete it James 4:15 :-) :-) :-)).

I just at present don't have the time to complete these things because
it is just me working on it, and I have the necessities of life and
living as through a normal job.  In short:  I have the will, I have the
ability, but I do not have the time.

Best regards,
Rick C. Hodgin
0
Rick
10/14/2014 2:15:33 PM
You're essentially arguing that your inability to comprehend (relatively
mundane) C++ is a consequence of C++ itself rather than your lack of
familiarity with it. This isn't true.

It's quite possible to use C++'s features to write code which is
objectively hard to understand. But this isn't an example of that. Anyone
familiar with C++ will have no problem understanding it.

None of the issues which you've described are any more unintuitive
than e.g. C collapsing arrays to pointers, or the difference between a
const pointer and pointer-to-const, or any of the hundreds of other quirks
which trip up new users year after year.

>> I'm a bit surprised, though, that you reject the language based on this.
> 
> I reject it in this case because it's encouraged what I think is the wrong
> approach for this task, bundling data structures and function declarations
> untidily into one class, while the function bodies themselves are
> everywhere else, and in a separate file (so much for encapsulation!).

This is no different to C code which provides a header file containing a
struct declaration plus declarations of the functions which use it, and a
source file containing the definitions of those functions.

The only difference is on which side of the closing brace the function
declarations live.
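Reduced to a sketch, the "side of the closing brace" difference is just this (the `widget` names are invented):

```cpp
// C style: the struct knows nothing about the functions that use it; the
// function declarations sit in the header *after* the closing brace.
struct widget_c { int id; };
int widget_id_c(const struct widget_c *w) { return w->id; }

// C++ style: the same declaration moves *inside* the closing brace.
struct widget_cpp {
    int id;
    int widget_id() const { return id; }
};

int demo_c_style(void)   { struct widget_c w = { 7 }; return widget_id_c(&w); }
int demo_cpp_style(void) { widget_cpp w = { 7 }; return w.widget_id(); }
```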

0
Nobody
10/14/2014 2:24:22 PM
On Tue, 14 Oct 2014 00:47:10 -0700 (PDT), "Rick C. Hodgin" wrote:
>Rosario193 wrote:
>> CppClassxyz myvar;
>> myvar.function(a, b, c);
>
>It may sound silly to some developers,
>but I am not a fan of non-pointers. I
>am a fan of documenting the steps
>involved in completing a task, and not of
>letting "invisible" things happen 

when i see the above code i imagine the call of the constructor for
myvar in "CppClassxyz myvar;"
and at the end of the function, the call to the destructor for the same variable..

the point, or the clue, is that i have to write that constructor and
destructor, so i would know what happens there....

so it should be in my memory when i see some variable definition

>when
>there are non-invisible ways to do the
>same. That's what procedural function
>steps/calls are for, they help document,
>and make debugging easier.

i had no problem with that, but the constructor and destructor have to be written by me

>For example, having documented the extra
>1.5 steps makes it clear even to a non-
>developer what is happening.
>
>I believe there should be a lnew command
>to allocate the local stack for this purpose.
>It would accomplish the same thing, but
>do so explicitly, and would also not
>require the delete.
>
>    Cxyz* myvar = lnew Cxyz(whatever);
>    myvar->function(a,b,c);

the above code is difficult for me to understand
because in c++ i use only the C malloc, or my home-made malloc...

>However, I still like:
>
>    Cxyz.new(whatever).function(a,b,c).delete();

i do not understand the c++ word/operator "new"; for me it is all
malloc...

>But with the explicit new/reference/delete

i did not use new/delete too much [in code i wrote explicitly];
i used many references to structs or classes
in operator or function arguments

>cycle, a new "dew" keyword could be
>introduced, meaning "new with (implicit)
>delete after use," as in:
>
>    Cxyz.dew(whatever).function(a,b,c);
>
>It is also a play on word sounds in
>English as "dew" sounds like "do." :-)
>
>Its purpose differs from what could be:
>
>    Cxyz::function(a,b,c)
>
>...because of the ctor and dtor, and all
>that brings.
>
>Best regards,
>Rick C. Hodgin
0
Rosario193
10/14/2014 2:34:06 PM
On Tuesday, October 14, 2014 10:24:34 AM UTC-4, Nobody wrote:
> You're essentially arguing that your inability to comprehend (relatively
> mundane) C++ is a consequence of C++ itself rather than your lack of
> familiarity with it. This isn't true.

That's not my argument.  I comprehend it well enough (most of it anyway).
I just think C++ on the whole is generally hideous and awful, and it makes
what should be very straight-forward tasks very very difficult to understand
at a glance the deeper you get into it. :-)

Best regards,
Rick C. Hodgin
0
Rick
10/14/2014 2:45:12 PM
On Tuesday, October 14, 2014 3:24:34 PM UTC+1, Nobody wrote:
>
> This is no different to C code which provides a header file containing a 
> struct declaration plus declarations of the functions which use it, and a
> source file containing the definitions of those functions.
> 
> The only difference is on which side of the closing brace the function
> declarations live.
>
That makes all the difference.

In C, the functions have a link to the structures they operate on, but the 
structures don't know anything about the functions that operate on them.
So we can easily delete a function, without affecting anything else.
We can add a function in static file scope, without affecting anything else.
And we can write a function that operates on two structures of different
types, without creating any dependencies between the two structures.

C++ is more encapsulated at run time, because only designated functions
have access to the members. But that makes it less encapsulated at
compile time, because member functions have to be designated.
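Malcolm's cross-type point can be seen in a few lines (the `point`/`rect` names and the containment function are invented for illustration):

```cpp
struct point { int x, y; };
struct rect  { int x0, y0, x1, y1; };

// A free function relating two types couples neither struct to the other:
// it can be added or deleted without touching either definition.
int rect_contains(const struct rect *r, const struct point *p) {
    return p->x >= r->x0 && p->x < r->x1 &&
           p->y >= r->y0 && p->y < r->y1;
}

int demo_contains(void) {
    struct rect r = { 0, 0, 10, 10 };
    struct point inside = { 3, 4 }, outside = { 12, 4 };
    return rect_contains(&r, &inside) && !rect_contains(&r, &outside);
}
```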
0
Malcolm
10/14/2014 2:51:48 PM
On 10/14/2014 7:55 AM, BartC wrote:
> "Martin Shobe" <martin.shobe@yahoo.com> wrote in message
> news:m1j3rt$fal$1@dont-email.me...
>> On 10/14/2014 5:07 AM, BartC wrote:
[snip]

>>> The difference is that, given actual C code, you can pretty much
>>> translate it line-by-line to another language, without sight of the
>>> declarations or definitions of the identifiers on each line. (Well,
>>> provided people haven't gone mad with macros.)

>> If that was your point, why didn't you say that instead of talking about
>> how bad something was that it borrowed from C? (I also seriously doubt
>> your assertion in any case. Try converting C line-by-line to PATH.)

> I can convert a lot of C code line-by-line to my own two languages that I
> use. I can think of other /conventional/ languages where it would be also
> straightforward, especially if the algorithm is clear and that is
> translated
> instead.

You can translate this C++ code line-by-line to a number of conventional 
languages. C# and Java come to mind.

>>> You can't do that with the C++ code even if you know the language well.

>> But why would you want to? Why would anyone care if the translation is
>> line-by-line?

> Someone who would have to do exactly that?

Why would anyone be required to do that?

[snip]

Martin Shobe

0
Martin
10/14/2014 2:51:59 PM
On Tuesday, October 14, 2014 10:34:16 AM UTC-4, Rosario193 wrote:
> On Tue, 14 Oct 2014 00:47:10 -0700 (PDT), "Rick C. Hodgin" wrote:
> >I believe there should be a lnew command
> >to allocate the local stack for this purpose.
> >It would accomplish the same thing, but
> >do so explicitly, and would also not
> >require the delete.
> 
> >    Cxyz* myvar = lnew Cxyz(whatever);
> >    myvar->function(a,b,c);
> 
> the above code result difficult to understand to me
> because in c++ i use only the C malloc, or my home made malloc...

lnew is like new, except instead of allocating globally, it allocates
on the local stack.  It will automatically be deleted upon exit, just
as if you had declared:

    Cxyz myvar;

The only differences are that you are creating one that has a pointer,
one that can be explicitly deleted at any time, and one which will be
automatically cleaned up (with the destructor called) upon exit if it
wasn't explicitly deleted.

I like to see the new and the delete.  It documents procedurally
what's happening, rather than leaving it to the "invisible" aspects of
things the compiler does for you.  If I were to write a scanner/parser
for my source code to determine logic flow, I would go step-by-step.
But if I used the Cxyz myvar ability, it would have to parse out those
aspects as "invisible code being run."

> >However, I still like:
> >    Cxyz.new(whatever).function(a,b,c).delete();
> 
> i not understand the c++ word/operator "new" for me it is all
> malloc...

:-)  For me too (mostly).  It would be the same as:

    Cxyz* temp = new Cxyz(whatever);
    temp->function(a,b,c);
    delete temp;

But, it's all in one line.

> >But with the explicit new/reference/delete
> i used no too much new/delete [in code i wrote exsplicity]
> i used many reference for struct or class
> in arg operator or function

Me too.  My C/C++ work can be seen here:

VFrP:  https://github.com/RickCHodgin/libsf/tree/master/vvm/core
VJr:   https://github.com/RickCHodgin/libsf/tree/master/source/vjr

Nearly all of what I've done is C, but all of it is compiled using
a C++ compiler which changes the way C code has to be represented
in various ways.  I like almost all of them.

Best regards,
Rick C. Hodgin
0
Rick
10/14/2014 2:54:36 PM
On Tuesday, October 14, 2014 6:49:24 AM UTC-4, Ben Bacarisse wrote:
> Richard Heathfield writes:
>
> <snip>
> > There are two schools of thought on this: (a) memory management is far too
> > important to be left to the programmer; (b) memory management is far too
> > important to be left to the language. Clearly we are in opposing
> > camps.
>
> I find this view very strange!  Why is that programmers are in these two
> camps rather than programs or circumstances?

The same reason people get into religious wars over computer programming
languages: because a lot of people seem to believe that "there can be only
one."

In fact, while there are languages that commit to (a) or (b) exclusively, a
lot of environments allow a blend of both, including C++, Objective C, and
Tcl with C extensions.

I also don't see the point of fighting over which is better: in my mind,
it's a job for the programming language when I'm sick of doing it, which is
clearly a personal preference.  It's a job for the programmer when I feel
the need to control something completely at the byte level, and when I'd
hate to manipulate data hidden under the pillow fortress of a high-level
language.  As long as I can switch between the two approaches whenever I
want, what's to fight about?

--S
0
Xcott
10/14/2014 3:32:19 PM
On Tuesday, October 14, 2014 2:06:26 AM UTC-5, Richard Heathfield wrote:

[snip]

>
> There are two schools of thought on this: (a) memory management is far too
> important to be left to the programmer; (b) memory management is far too
> important to be left to the language. Clearly we are in opposing camps.

Not necessarily; for me it's situational.  I'm living on vectors and maps
because for the code I'm working on, they're the right tool.  It's pretty
braindead management of a bunch of items, so the standard containers plus
iterators are pretty ideal.

I have another tool that I'm working on where the memory management isn't
so straightforward, so I have to do some things the old-fashioned way.  Even
so, I'm storing the resulting pointers in a set container to make them
easier to manage.

I've never warmed to auto_ptr (which is apparently a good thing, as it's
been deprecated in favor of unique_ptr), but I have used reference count
types in the past, with varying degrees of satisfaction.  They worked
well enough to deal with the problem at the time, but there are use cases
where they're a pain in the ass to deal with.

> But I have some good news, John - in a couple of months, there's going to
> be a 24-hour ceasefire. Fancy a game of football about half-way between?
>

Depends; are we talking Howard Cosell football ("HE, COULD, GO, ALL,
THE, WAY!") or Andrés Cantor football ("GOOOOOOOOAAAAAAALLLLLLL!!!!")?

Oh, hell, it doesn't matter, I suck equally at both...
0
John
10/14/2014 3:46:03 PM
"BartC" <bc@freeuk.com> writes:
[...]
> However I don't particularly want to start a language war.
[...]

Then perhaps starting a thread with the subject "C++ vs C" was not the
best approach.

-- 
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Working, but not speaking, for JetHead Development, Inc.
"We must do something.  This is something.  Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
0
Keith
10/14/2014 4:26:37 PM
Le 13/10/2014 22:20, Richard Heathfield a écrit :
> In C, you could use new [] instead of new, but you'd struggle to resize it.
Not if you use the C containers library (CCL).

You just use the resizable vectors  IN C without any trouble.

As I have said here a zillion times.
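For readers who haven't seen such a library, the underlying mechanism is just the `realloc` growth pattern; this is a generic sketch with invented names, not the CCL's actual API (the casts are only there so it also compiles as C++):

```cpp
#include <stdlib.h>

// A bare-bones growable int array in plain C: doubling capacity on demand.
typedef struct { int *data; size_t len, cap; } IntVec;

int vec_push(IntVec *v, int value) {
    if (v->len == v->cap) {
        size_t ncap = v->cap ? v->cap * 2 : 4;
        int *p = (int *)realloc(v->data, ncap * sizeof *p);
        if (!p) return 0;         /* keep the old buffer on failure */
        v->data = p;
        v->cap = ncap;
    }
    v->data[v->len++] = value;
    return 1;
}

int vec_demo(void) {
    IntVec v = { NULL, 0, 0 };
    int sum = 0;
    for (int i = 1; i <= 100; i++) vec_push(&v, i);
    for (size_t i = 0; i < v.len; i++) sum += v.data[i];
    free(v.data);
    return sum;                   /* 1 + 2 + ... + 100 */
}
```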
0
jacob
10/14/2014 5:23:28 PM
Le 12/10/2014 23:32, Paul N a écrit :
> For instance, in C++ you can write:
>
> x = a + (b * c);
>
> where in C you would have to write:
>
> mult(temp, b, c);
> add(x, a, temp);

Not if you use lcc-win, a C compiler with operator overloading.

http://www.cs.virginia.edu/~lcc-win32
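For reference, the standard C++ spelling of what Paul N describes (a sketch with an invented `Vec2` type; lcc-win's C extension is analogous, but its exact syntax is not shown here):

```cpp
struct Vec2 { double x, y; };

// With operator overloading, "x = a + (b * c)" works on user-defined types.
Vec2 operator*(Vec2 a, double s) { return Vec2{ a.x * s, a.y * s }; }
Vec2 operator+(Vec2 a, Vec2 b)   { return Vec2{ a.x + b.x, a.y + b.y }; }

double demo_vec(void) {
    Vec2 a{ 1, 2 }, b{ 3, 4 };
    Vec2 x = a + (b * 2.0);       // reads as arithmetic, not mult()/add()
    return x.x + x.y;             // {7, 10} gives 17
}
```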

0
jacob
10/14/2014 5:25:20 PM
Ben Bacarisse wrote:
> Richard Heathfield <invalid@see.sig.invalid> writes:
> <snip>
>> There are two schools of thought on this: (a) memory management is far too
>> important to be left to the programmer; (b) memory management is far too
>> important to be left to the language. Clearly we are in opposing
>> camps.
>
> I find this view very strange!  Why is that programmers are in these two
> camps rather than programs or circumstances?


Beats me. I find either heuristic to be useful in different problem
domains.

An example: I've written multiple things that use both FFTW and 
libsndfile ( audio tweaking apps stuff - command line tools ).
I built some C++ classes to make these easier to deal with. No more
malloc() or fftw_malloc().

(ObDisclosure; I've dealt a lot with OO and C++ at work; this is not 
that).

A lot happened when the GUI became the interface standard. GUIs are
inherently difficult and complex. There were a lot of miscues in 
paradigms for GUI development.

(aside: Tk is the very best standard, IMO. It's *actually* object-oriented 
and very flexible.)

This got worse with the Web.  And games? Those guys are crazy. I
knew a guy who knew the guys at ID. I did a happy hour with 'em
and they were pretty... intense.

It's funny; I read John Carmack's blog now and he's learning things
that I learned thirty years ago. Just goes to show; this is a lifetime 
discipline.

> It seems to me there are
> situations where memory management is too important to be left to the
> programmer, and others where it too important to be left to the language

Right.

> and, no doubt, programmers will disagree about the boundary cases, but do
> you think that disagreement is likely to be consistent enough to
> attribute it to the programmer rather than the program?
>

No, I think there's personal identity tied up in it. I kinda wish there
was a better social structure to help people ameliorate this; Usenet
used to be pretty good for that, but people left.

Your favorite paradigm is your favorite paradigm because
that's the path you crawled up the mountain.

I still feel like the "best" approach is - use 'C' but write code
generators and other superstructure to support using 'C' in
your preferred scripting language. With scripting languages, you can
get closer to proving certain invariants about the code before it's
ever compiled, and "templated" 'C' means you can be quite repetitive
but consistent.

This may sound like nonsense; you'd have to see it to understand, I'd
think. Most problems have an internal "kernel" structure; if you can
use  something that can do permutations easily, you can frequently
unpack that into nice-looking tables that 'C' code can then exploit. Or
just generate a type-specific method each for an operation and have a
dispatcher. Or whatever.

Now throw in that other combinators and "Lisp" like recursion can be
invoked to generate code.

An awful lot of human problems are people who use social interaction
purely to reinforce their sense of identity. I can't tell you what
I've learned over the years on Usenet from people who disagree with me.

> <snip>
>

-- 
Les Cargill

0
Les
10/14/2014 6:13:19 PM
On 2014-10-14, jacob navia <jacob@spamsink.net> wrote:
> Le 12/10/2014 23:32, Paul N a écrit :
>> For instance, in C++ you can write:
>>
>> x = a + (b * c);
>>
>> where in C you would have to write:
>>
>> mult(temp, b, c);
>> add(x, a temp);
>
> Not if you use lcc-win, a C compiler with operator overloading.
>
> http://www.cs.virginia.edu/~lcc-win32

Right: gee, why use C++, when you can do the same thing using a
Windows-on-Intel-specific dialect of the C language?

If all you want is a C dialect with overloading, C++ will give it to you:
just use the C compatible subset of C++, plus its operator overloading.
This restricted dialect of C++ will be widely portable.

And you wonder why you get pissed on this group.

Why don't you just stick a "Kiki me!" note on your back?
0
Kaz
10/14/2014 7:05:08 PM
Les Cargill wrote:

> Your favorite paradigm is your favorite paradigm because
> that's the path you crawled up the mountain

That pretty much explains it, I think. If people thought logically and 
objectively in all ways at all times, this would be a very different world!

Programmers are, I think, better than average at being objective, but we all 
have our favourite ways of thinking about programming.

The trick is to recognise when it might be advantageous to switch paths, and 
when not. (I recall, for example, failing to get even remotely excited about 
J++. Just as well, really...)

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/14/2014 7:09:38 PM
jacob navia wrote:

> Le 13/10/2014 22:20, Richard Heathfield a écrit :
>> In C, you could use new [] instead of new, but you'd struggle to resize
>> it.

Typo - that should have read "in C++".

> Not if youy use the c containers library (CCL)

Very kind offer, Jacob, but I've got my own library, thanks.

> 
> You just use the resizable vectors  IN C without any trouble.
> 
> As I have said here a zillion times.

I already have resizeable containers, thanks. Yes, in C.

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/14/2014 7:11:22 PM
Le 14/10/2014 21:05, Kaz Kylheku a écrit :

> And you wonder why you get pissed on this group.
>
> Why don't you just stick a "Kiki me!" note on your back?
>


There are two kinds of people in this group

The ones that produce things, do things, and try to improve the C language.

And the assholes that go around pissing other people, writing stupid 
replies like you.

I do not care about your opinion. I write for the others.


:-)


0
jacob
10/14/2014 7:23:21 PM
In article <8521c6c8-1a0e-4f9b-bdab-61d09b6d2323@googlegroups.com>,
 "Rick C. Hodgin" <rick.c.hodgin@gmail.com> wrote:

> I like to see the new and the delete.  It documents procedurally
> what's happening, rather than leaving it to the "invisible" aspects of

I don't need to see register spills and loads.

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/14/2014 7:29:59 PM
On Tuesday, October 14, 2014 3:30:23 PM UTC-4, Siri Crews wrote:
> In article <8521c6c8-1a0e-4f9b-bdab-61d09b6d2323@googlegroups.com>,
>  "Rick C. Hodgin" <rick.c.hodgin@gmail.com> wrote:
> > I like to see the new and the delete.  It documents procedurally
> > what's happening, rather than leaving it to the "invisible" aspects of
> 
> I don't need to see register spills and loads.

:-)

I like to see those too!  That's probably actually where this
desire came from (my assembly days).

I also like to have all pointer variables on single lines.  It
makes it more clear to the eye (at least for me, as I have a
very difficult time reading at times). The more I can have
arranged into groups, the less reading it is for my overtaxed,
underpaid brain.

Best regards,
Rick C. Hodgin
0
Rick
10/14/2014 7:54:44 PM
BartC wrote:
>
> I design my own languages and write compilers. I've looked at overloading
> (and a lot of other OO stuff) and decided not to bother with it (as
> user-defined parts of  the language) because, while a compiler might be able
> to follow some torturous algorithm to figure out what's what, for the source
> to be understandable, the human reader would have to do the same!

So you find the expression "c = a+b" where a, b and c are objects 
torturous?

> (Besides, one of my languages being dynamically typed, you get a lot of
> these features for free anyway, eg. generics.)
>
> However I don't particularly want to start a language war. This is just my
> personal opinion. And in the last couple of days, my opinion of C has gone
> up, and that of C++ has gone down.

I fail to see how your opinion of what is effectively a super-set of C 
can go down without your opinion of C going down...

-- 
Ian Collins
0
Ian
10/14/2014 8:04:50 PM
On 2014-10-14, Richard Heathfield <invalid@see.sig.invalid> wrote:
> Les Cargill wrote:
>
>> Your favorite paradigm is your favorite paradigm because
>> that's the path you crawled up the mountain
>
> That pretty much explains it, I think. If people thought logically and 
> objectively in all ways at all times, this would be a very different world!

Yes; specifically, it would be a world without art, literature, music,
electricity and running water.

The arts are not based on logic, and progress depends on creativity which
has something in common with art.

Logic alone gets you perhaps as far as using primitive tools for hunting,
gathering and building shelter. Progress requires forming increasingly
sophisticated hypotheses (which do not proceed from logical deduction).

Progress requires imaginative, creative leaps.
0
Kaz
10/14/2014 8:20:31 PM
Les Cargill wrote:
>
> Your favorite paradigm is your favorite paradigm because
> that's the path you crawled up the mountain.

I guess that depends how much you enjoyed the crawl.

My deep loathing for all caps anything probably comes from doing my 
initial crawl on a teletype.

My preference for abstraction probably comes from doing the next part of 
my crawl in assembler.

-- 
Ian Collins
0
Ian
10/14/2014 8:29:11 PM
In article <4b8e05fa-bb21-41b1-9cca-5aec8ffc667c@googlegroups.com>,
 "Rick C. Hodgin" <rick.c.hodgin@gmail.com> wrote:

> On Tuesday, October 14, 2014 3:30:23 PM UTC-4, Siri Crews wrote:
> > In article <8521c6c8-1a0e-4f9b-bdab-61d09b6d2323@googlegroups.com>,
> >  "Rick C. Hodgin" <rick.c.hodgin@gmail.com> wrote:
> > > I like to see the new and the delete.  It documents procedurally
> > > what's happening, rather than leaving it to the "invisible" aspects of
> > 
> > I don't need to see register spills and loads.
> 
> :-)
> 
> I like to see those too!  That's probably actually where this
> desire came from (my assembly days).

Some of us have learned to exploit the human power of abstraction.

I did assembly on Cyber 170 and 205. Now I'm free to use a Javascript level of 
abstraction in C or Javascript.

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/14/2014 8:34:37 PM
"Ian Collins" <ian-news@hotmail.com> wrote in message
news:ca5dv3FerbqU1@mid.individual.net...
> BartC wrote:
>>
>> I design my own languages and write compilers. I've looked at overloading
>> (and a lot of other OO stuff) and decided not to bother with it (as
>> user-defined parts of  the language) because, while a compiler might be
>> able
>> to follow some torturous algorithm to figure out what's what, for the
>> source
>> to be understandable, the human reader would have to do the same!
>
> So you find the expression "c = a+b" where a, b and c are objects
> torturous?

When it is necessary to understand exactly what that does (for a variety of
reasons including porting the code to a different language) then yes it can
be torturous. That's if it involves templates and overloads and inheritance
and class members and namespaces, all the usual stuff that every C++ program
seems to be obliged to use even though, as you say, it is a superset of C
and therefore programmers could exercise some restraint if they wished.

>> (Besides, one of my languages being dynamically typed, you get a lot of
>> these features for free anyway, eg. generics.)
>>
>> However I don't particularly want to start a language war. This is just
>> my
>> personal opinion. And in the last couple of days, my opinion of C has
>> gone
>> up, and that of C++ has gone down.
>
> I fail to see how your opinion of what is effectively a super-set of C can
> go down without your opinion of C going down...

It's /because/ C doesn't have that super-set that my opinion has gone up, 
and therefore you /know/ that programs are likely to be reasonably 
transparent. (And because I often operate outside of all these languages and 
need to know what goes on, the transparency is important.)

There are still issues I have with C, there's a load of stuff I don't like
about it, but it's mostly small things such as syntax preferences. And
people can still create large, convoluted projects with it, but they have to
make more effort than with C++.

-- 
Bartc
 

0
BartC
10/14/2014 8:43:47 PM
In article <gXf%v.550369$Zu6.436642@fx06.am4>, "BartC" <bc@freeuk.com> wrote:

> It's /because/ C doesn't have that super-set that my opinion has gone up, 
> and therefore you /know/ that programs are likely to be reasonably 
> transparent. (And because I often operate outside of all these languages and 

See also Obfuscated C Contest.

Arguably programming in anything other than long strings of hex numbers can be 
called not transparent. Programming languages exploit a marvelous human ability 
called abstraction, which allows repetitive, redundant, and unnecessary details to 
go unmentioned. It is then up to the programmer to come up with abstractions that 
aid understanding or hinder it. You can write obfuscated code in any programming 
language. You can also write clean and comprehensible code in any language 
except possibly Intercal.

How about this?

    program ::= statement
            A program consists of a single statement.

    pack ::= empty | ( series )
    series ::= statement | series; statement
            Zero or more statements evaluated consecutively.

    statement ::= variable := value - value pack
            Assign the difference of two values. If pack is
            not empty, evaluate the packed statements and repeat the
            difference until the difference is nonpositive.

    value ::= variable | integer-constant

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/14/2014 9:06:35 PM
On Tuesday, October 14, 2014 9:05:02 PM UTC+1, Ian Collins wrote:
>
> So you find the expression "c = a+b" where a, b and c are objects 
> torturous?
>
It's a nightmare.
One possibility is that the objects are scalars. In that case, it's
hard to tell from looking at client code whether they are aliases
for another type, or if application-specific code is being called.
Can the bug be that c isn't actually a + b?

Another possibility is that the objects are vectors or matrices
or other mathematical objects for which addition is defined and
called addition. This is a good use of operator overloading. 
Addition is usually unproblematic, but c = a * b and d = b * a
won't yield the same result for matrix multiplication. It's got its 
pitfalls for the unwary.

Another possibility is that a and b are strings, or a is a file
and b a record, or some other defensible but confusing misuse of 
the addition operator. Now it gets very difficult. It's hard to
find the function. Worse, someone might have defined the +
operation as concatenation in one place, and insertion in 
another.   
>  
> I fail to see how your opinion of what is effectively a super-set
> of C can go down without your opinion of C going down...
> 
C gives you enough to write code without worrying unduly about
book-keeping. You don't have to reset the stack, perform 
double register precision arithmetic, set up parameter blocks
for subroutine calls. On the other hand, it makes it difficult to
set up tangled high-level structures. Unless you go madly
overboard with function pointers, you know what is calling what
and with which arguments.

The main thing that is lacking is the realloc() dance, and the 
related problem of destroying half-created structures held 
together with pointers.

0
Malcolm
10/14/2014 9:25:18 PM
Le 14/10/2014 22:29, Ian Collins a écrit :
> Les Cargill wrote:
>>
>> Your favorite paradigm is your favorite paradigm because
>> that's the path you crawled up the mountain.
>
> I guess that depends how much you enjoyed the crawl.
>
> My deep loathing for all caps anything probably comes form doing my
> initial crawl on a teletype.
>
> My preference for abstraction probably comes form doing the next part of
> my crawl in assembler.
>


10 Read input
20 Process input
30 Output results
40 IF NOT (done?) goto 10

ALL possible programs are the same at this abstraction level

:-)

The abstraction level is dependent on what you want to do with the 
machine at your disposal.

Want to do web programming for your aunt's little personal web page?

Use java script.

Want to do a mathematical package for calculating in 128 bit precision?

Use assembly language

Want to do a C compiler?

Use C.

Want to loose your mind?

Use C++.

jacob
0
jacob
10/14/2014 9:32:01 PM
Le 14/10/2014 23:32, jacob navia a écrit :
> Want to loose your mind?

I think that should have been "lose". Sorry about that.
0
jacob
10/14/2014 9:35:44 PM
Malcolm McLean wrote:
> On Tuesday, October 14, 2014 9:05:02 PM UTC+1, Ian Collins wrote:
>>
>> So you find the expression "c = a+b" where a, b and c are objects
>> torturous?
>>
> It's a nightmare.
> One possibility is that the objects are scalars. In that case, it's
> hard to tell from looking at client code whether they are aliases
> for another type, or if application-specific code is being called.
> Can the bug be that c isn't actually a + b?

I don't know, what does your test tell you?

> Another possibility is that the objects are vectors or matrices
> or other mathematical objects for which addition is defined and
> called addition. This is a good use of operator overloading.
> Addition is usually unproblematic, but c = a * b and d = b * a
> won't yield the same for matrix multiplication. It's got its
> pitfalls if writing for the unwary.

If you don't understand how the objects behave, whether or not you use 
operator overloading won't save you.

> Another possibility is that a and b are strings, of a is a file
> and b a record, or some other defensible but confusing misuse of
> the addition operator. Now it gets very difficult. It's hard to
> find the function. Worse, someone might have defined the +
> operation as concatenation in one place, and insertion in
> another.

Then they are a fool who would be equally likely to write messed up C.

>> I fail to see how your opinion of what is effectively a super-set
>> of C can go down without your opinion of C going down...
>>
> C gives you enough to write code without worrying unduly about
> book-keeping. You don't have to reset the stack, perform
> double register precision arithmetic, set up parameter blocks
> for subroutine calls. On the other hand, it makes it difficult to
> set up tangled high-level structures. Unless you go madly
> overboard with function pointers, you know what is calling what
> and with which arguments.
>
> The main thing that is lacking is the realloc() dance, and the
> related problem of destroying half-created structures held
> together with pointers.

There you are then, use the C + resource management objects subset of 
C++ and you are sorted.  More than one team I have worked with has been 
happy to use this with function overloading as their working language.

Crap code comes from a crap process, no matter what the language.

-- 
Ian Collins
0
Ian
10/14/2014 9:38:36 PM
BartC wrote:
> "Ian Collins" <ian-news@hotmail.com> wrote:
>>
>> So you find the expression "c = a+b" where a, b and c are objects
>> torturous?
>
> When it is necessary to understand exactly what that does (for a variety of
> reasons including porting the code to a different language) then yes it can
> be torturous.

I guess one could classify being a human compiler as a corner case :)

> That's if it involves templates and overloads and inheritance
> and class members and namespaces, all the usual stuff that every C++ program
> seems to be obliged to use even though, as you say, it is a superset of C
> and therefore programmers could exercise some restraint if they wished.

If they are properly managed, they should be.  In the open source realm, 
the child in a toyshop mentality often takes over...

Pretty much the only time I overload the maths operators is for objects 
that represent specialised numeric types, such as bounded integers 
or bignums.  Here there isn't any inheritance and the benefits 
(especially if you are mixing special and built-in types or using 
generic algorithms) far outweigh any costs (except for human compilers).

-- 
Ian Collins
0
Ian
10/14/2014 9:54:15 PM
"Malcolm McLean" <malcolm.mclean5@btinternet.com> wrote in message 
news:9c1ccfd3-a3fe-47a5-a050-94433fbd690f@googlegroups.com...
> On Tuesday, October 14, 2014 9:05:02 PM UTC+1, Ian Collins wrote:
>>
>> So you find the expression "c = a+b" where a, b and c are objects
>> torturous?
>>
> It's a nightmare.
> One possibility is that the objects are scalars. In that case, it's

> Another possibility is that the objects are vectors or matrices
> or other mathematical objects for which addition is defined and
> called addition.

> Another possibility is that a and b are strings,

Actually, those possibilities aren't much of a problem. (I don't think I'm 
saying that simply because I used to have a language with operators that worked 
on exactly those types...)

It's when a and b are classes and "+" is defined between instances of that 
class. There might be inheritance at work and a and/or b are subclasses (or 
whatever they are called), and it's not clear exactly which "+" handler is 
being used. When you look at those handlers, then maybe specialisation 
templates have been employed so the search for the exact handler forks yet 
again.

Leaving that aside, look at the actual a and b operands; where are they 
defined? If they are not declared, then maybe you're in some method 
belonging to some class, and a and b might be somewhere in that class 
definition (and might themselves be instances of some other class). I don't 
know what happens when a and/or b are also locally declared in this method, or 
in the file, and could be the same or conflicting types. Probably smoke 
starts issuing from the machine and/or the programmer.

This stuff *can* be done properly so that it enhances the code and makes 
life easier for everyone. I don't believe that jpeg example was enhanced by 
these methods.

-- 
Bartc 

0
BartC
10/14/2014 10:01:22 PM
On Tuesday, October 14, 2014 4:35:06 PM UTC-4, Siri Crews wrote:
> Rick C. Hodgin <rick.c.hodgin@gmail.com> wrote:
> > I like to see those too!  That's probably actually where this
> > desire came from (my assembly days).
> 
> Some of us have learned to exploit the human power of abstraction.
> 
> I did assembly on Cyber 170 and 205. Now I'm free to use a Javascript
> level of abstraction in C or Javascript.

It's not for abstraction, it's for ease of long term maintenance.  Coming
back to code over years ... I've never found anything which makes the
maintenance faster than to be a little obvious and inefficient in various
areas.  It doesn't affect performance that much, as the compiler will
almost always optimize it away, but the more important component of that
equation (my time) is greatly benefited from having code be in a rather
obvious, sometimes overly simplistic form.

It's just my experience over time.  The more simplistically elegant you
can make your code, coupled with good commenting, the easier it will be
to maintain in the long term.

Best regards,
Rick C. Hodgin
0
Rick
10/14/2014 10:24:14 PM
Rick C. Hodgin wrote:
> On Tuesday, October 14, 2014 4:35:06 PM UTC-4, Siri Crews wrote:
>> Rick C. Hodgin <rick.c.hodgin@gmail.com> wrote:
>>> I like to see those too!  That's probably actually where this
>>> desire came from (my assembly days).
>>
>> Some of us have learned to exploit the human power of abstraction.
>>
>> I did assembly on Cyber 170 and 205. Now I'm free to use a Javascript
>> level of abstraction in C or Javascript.
>
> It's not for abstraction, it's for ease of long term maintenance.  Coming
> back to code over years ... I've never found anything which makes the
> maintenance faster than to be a little obvious and inefficient in various
> areas.  It doesn't affect performance that much, as the compiler will
> almost always optimize it away, but the more important component of that
> equation (my time) is greatly benefited from having code be in a rather
> obvious, sometimes overly simplistic form.

The best help you can give your future self for maintaining code is a 
good set of tests.  These will show you if you break the code and provide 
you with a comprehensive set of examples.

> It's just my experience over time.  The more simplistically elegant you
> can make your code, coupled with good commenting, the easier it will be
> to maintain in the long term.

Which is all fine and dandy until the problem the code is solving 
becomes more complex.  You then have to ask the question "do I want to 
expose all of the complexity in the code, or should I allow the language 
to take care of the more mundane parts?".  Being able to see the bigger 
picture without getting bogged down in the detail has significant 
benefits for maintainers.

-- 
Ian Collins
0
Ian
10/14/2014 11:00:52 PM
Ian Collins wrote:
> [snip]

I thought I was in your killfile.

Best regards,
Rick C. Hodgin
0
Rick
10/14/2014 11:09:32 PM
Rick C. Hodgin wrote:
> Ian Collins wrote:
>> [snip]
>
> I thought I was in your killfile.

Only in c.l.c++, you haven't been evangelising here.

-- 
Ian Collins
0
Ian
10/15/2014 12:20:37 AM
On Tuesday, October 14, 2014 8:20:49 PM UTC-4, Ian Collins wrote:
> Rick C. Hodgin wrote:
> > Ian Collins wrote:
> >> [snip]
> > I thought I was in your killfile.
> Only in c.l.c++, you haven't been evangelising here.

It's not "evangelizing," but an incorporation into my life the teaching
about the way things are.  I did not know about that way before it was
given to me ... but now I do know, and I then proceed.

How does it come up?  From time to time throughout the course of each
of my days things happen which move me to speak about the necessity of
knowing the Lord, so that what's coming after we die is the desirable
course.

Best regards,
Rick C. Hodgin
0
Rick
10/15/2014 1:20:26 AM
Kaz Kylheku wrote:
> On 2014-10-14, Richard Heathfield <invalid@see.sig.invalid> wrote:
>> Les Cargill wrote:
>>
>>> Your favorite paradigm is your favorite paradigm because
>>> that's the path you crawled up the mountain
>>
>> That pretty much explains it, I think. If people thought logically and
>> objectively in all ways at all times, this would be a very different world!
>
> Yes; specifically, it would be a world without art, literature, music,
> electricity and running water.
>
> The arts are not based on logic,

Some are. A great deal of modern art is more or less a way of 
experimenting  with the visual cortex. Some modern art is explicitly
mathematical. There's cubism.

Even representative art drew on perspective geometry, emerging
understanding of color, many other technical details. I'm pretty sure
that Galileo advanced art a great deal.

Music as we refer to it comes from Pythagoras.

Literature is of the art of rhetoric, which is *the* Liberal Art. At its
purest, rhetoric is logic.

It's all one big thing, really. And I'll find very few things as 
beautiful as some proofs I've read. Cantor's  Diagonalization is a 
masterpiece. In a way, it's in a class by itself.

> and progress depends on creativity which
> has something in common with art.
>

It does and it doesn't. Programmers have an awful lot in common
with composers, I think.

> Logic alone gets you perhaps as far as using primitive tools for hunting,
> gathering and building shelter. Progress requires forming increasingly
> sophisticated hypotheses (which do not proceed from logical deduction).
>

Logic is generally post-hoc. And when it's not, it's usually part of a
creative process. Don't get me wrong - I think the *cult* of creativity
is misplaced - not only are there famous people who are creative, but
in a very meaningful way, we all are.

Logic is important because we're all hypocrites by necessity - knowing
some logic is simply a way to try to improve consistency some.


> Progress requires imaginative, creative leaps.
>

-- 
Les Cargill
0
Les
10/15/2014 2:19:54 AM
Ian Collins wrote:
> Rick C. Hodgin wrote:
>> On Tuesday, October 14, 2014 4:35:06 PM UTC-4, Siri Crews wrote:
>>> Rick C. Hodgin <rick.c.hodgin@gmail.com> wrote:
>>>> I like to see those too!  That's probably actually where this
>>>> desire came from (my assembly days).
>>>
>>> Some of us have learned to exploit the human power of abstraction.
>>>
>>> I did assembly on Cyber 170 and 205. Now I'm free to use a Javascript
>>> level of abstraction in C or Javascript.
>>
>> It's not for abstraction, it's for ease of long term maintenance.  Coming
>> back to code over years ... I've never found anything which makes the
>> maintenance faster than to be a little obvious and inefficient in various
>> areas.  It doesn't affect performance that much, as the compiler will
>> almost always optimize it away, but the more important component of that
>> equation (my time) is greatly benefited from having code be in a rather
>> obvious, sometimes overly simplistic form.
>
> The best help you can give your future self for maintaining code is a
> good set of tests.  These will show you if you break the code and provide
> you with a comprehensive set of examples.
>
>> It's just my experience over time.  The more simplistically elegant you
>> can make your code, coupled with good commenting, the easier it will be
>> to maintain in the long term.
>
> Which is all fine and dandy until the problem the code is solving
> becomes more complex.  You then have to ask the question "do I want to
> expose all of the complexity in the code, or should I allow the language
> to take care of the more mundane parts?".  Being able to see the bigger
> picture without getting bogged down in the detail has significant
> benefits for maintainers.
>

Maintainers are *keenly interested* in the details. Keenly.

I once found a bug that caused *all* use of STL to be ended on a 
project. A map() failed utterly and we could not replace the STL
version... Replacement took two hours; diagnosis had gone on, on 
and off, for two *years*.

As a maintainer, I get to trudge through your insufferable* abstractions 
and try to find where you went wrong :) After all,
if you left a bug, it must be hard to find. If I have to trace through a 
dozen levels of indirection, including late-bound virtual** methods...

**we took to calling "virtual"  "the vorpal keyword".

*they all are when you're doing maintenance. They conspire to become
insufferable the instant you let go of them for some inexplicable reason :)

I once worked on a project where 500 classes were in play. Another old 
'C' hand and I estimated it would have been roughly 1/10th the 
complexity done in raw 'C'. That is not the tools' fault. It
was <name withheld>'s fault.

The principal virtue of software is transparency. How do you know it
works?

-- 
Les Cargill




0
Les
10/15/2014 2:34:15 AM
Les Cargill wrote:
>
> The principal virtue of software is transparency. How do you know it
> works?

You test it?

-- 
Ian Collins
0
Ian
10/15/2014 2:41:11 AM
Ian Collins <ian-news@hotmail.com> writes:
> Rick C. Hodgin wrote:
>> Ian Collins wrote:
>>> [snip]
>>
>> I thought I was in your killfile.
>
> Only in c.l.c++, you haven't been evangelising here.

He has.  I'm surprised you missed it.

-- 
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Working, but not speaking, for JetHead Development, Inc.
"We must do something.  This is something.  Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
0
Keith
10/15/2014 5:29:18 AM
Le 15/10/2014 04:41, Ian Collins a écrit :
> Les Cargill wrote:
>>
>> The principal virtue of software is transparency. How do you know it
>> works?
>
> You test it?
>

Yes, but in most projects and most cases the tests will test only 0.1% of 
the possible combinations of input values the software should accept.

To test if a floating point package is correct you would need more time 
than the age of the Universe...

Yes, tests are very good but you should not believe they will ensure bug 
free software!

Other problems aren't covered by testing: porting the software to a new 
environment, for instance.

Or, in the case of C++, new incompatible changes in the language can make 
your code obsolete, so that it must be rewritten.
0
jacob
10/15/2014 5:30:32 AM
jacob navia wrote:
> Le 15/10/2014 04:41, Ian Collins a écrit :
>> Les Cargill wrote:
>>>
>>> The principal virtue of software is transparency. How do you know it
>>> works?
>>
>> You test it?
>>
>
> Yes, but in most project and most cases the tests will test only 0.1% of
> the possible combinations of input values a software should accept.

Humbug.

> To test if a floating point package is correct you would need more time
> than the age of he Universe...

Well obviously you don't aim to test every number, you apply a little 
of the intelligence you were hired for.

> Yes, tests are very good but you should not believe they will ensure bug
> free software!

Close enough, if done well.

> Other problems aren't covered by testing: Porting the software to a new
> environment for instance.

Run the tests again?

> Or, in the case of C++, new incompatible changes in the language made
> your code obsolete and it must be rewritten.

C++ is no better or worse than C in this regard.

-- 
Ian Collins
0
Ian
10/15/2014 6:35:57 AM
On Tue, 14 Oct 2014 21:34:15 -0500, Les Cargill wrote:

>I once worked on a project where 500 classes were in play. Myself and 
>another old 'C' hand estimated it would have been roughly 1/10th  the 
>complexity done in raw 'C'. 

From my beginner point of view, I don't agree: I don't find it difficult
to follow this subset of C++:
classes, constructors, destructors
operator and function definitions [overloading]
the C subset

all with some malloc wrapper that allows one to see if there are *leaks*
at the end

>That is not the tools' fault. It
>was <name withheld>'s fault.
>
>The principal virtue of software is transparency. How do you know it
>works?

Scalability: use the right language for each operation.

I think asm [for the math operations, building the operators called from
C++] and C/C++ [calling the math operator functions, and the less heavy
operations] are a good combination.
0
Rosario193
10/15/2014 6:36:26 AM
On Tue, 14 Oct 2014 14:25:18 -0700 (PDT), Malcolm McLean wrote:

>On Tuesday, October 14, 2014 9:05:02 PM UTC+1, Ian Collins wrote:
>>
>> So you find the expression "c = a+b" where a, b and c are objects 
>> torturous?
>>
>It's a nightmare.
>One possibility is that the objects are scalars. In that case, it's
>hard to tell from looking at client code whether they are aliases
>for another type, or if application-specific code is being called.
>Can the bug be that c isn't actually a + b?

I don't find the above expression too difficult:
get a (a has its own address, &a)
get b (b has its own address, &b)
get c (c has its own address, &c)
call +(&c, &a, &b),
the function that does a+b and puts the result in c.

The problem could be in the expression

x=g+(a+b)

get a (a has its own address, &a)
get b (b has its own address, &b)

request temporary storage from a circular buffer, m=&buffer[index++%20]
do +(m, &a, &b)
do +(&x, &g, m)

The problem...

for example

classType&  function(classType&   b)
{
.....
....
 x=c+d+n;
 r=b+k+x;
 .....
 (#)
 return r+b;
}

and one calls the function in the following way:

k=function(m+(n+c))

Because m+(n+c) uses the global buffer, its result would be, say,
&buffer[4].

If that global circular buffer has only a few elements, then at (#)
above, b points inside the function to &buffer[4] - OK,

but it could be that &buffer[4] has already been reused by the
operations inside function(), and so contains the wrong value...

I think one could make it safe this way:
"x=m+(n+c); k=f(x);"

If there were no such problems, the circular buffer would be the way to
go, because one always reuses the same memory; so for me the circular
buffer could stay entirely in the cache memory...

Imagined efficiency vs. the right result:

I choose the first, and am careful to check that the buffer does not
overflow for function calls that use the buffer, too many () levels, or
some unknown bug of that type.
0
Rosario193
10/15/2014 7:45:11 AM
"Rosario193" wrote:

> On Tue, 14 Oct 2014 21:34:15 -0500, Les Cargill wrote:
>
>>I once worked on a project where 500 classes were in play. Myself and
>>another old 'C' hand estimated it would have been roughly 1/10th  the
>>complexity done in raw 'C'.
>
> from my beginner point of vew:
> i'm not agree i not find difficult to follow the subset of c++ :
> classes constructor distructors
> operator and function definitions [overloading]
> c subset

He did say 500 *classes*, not objects. I have difficulty even imagining such 
a beast.
That says, to me, there were 500 different *types* of variables.  Plus what 
you started with in C.  ISTM there was no adult supervision at all during 
that design.

The world is not full of simple problems where classes make sense, so the 
books really strain credulity.  Look at the books full of dogs that bark, 
monkeys that chatter.  Images where a rectangle is a variant of a square (or 
is it the other way around?). These were the books I encountered while 
learning in the 90's, have the books changed?  The student adopts the 
attitude of the author or instructor.  Cobol has long variable names; that's 
what my instructor thought, so now *I* think that. Does it make sense, or is 
it true? Of course not!  But the thought is still there in the back of my 
mind.

On the other hand the world is full of math problems (and thus examples) 
that want to be solved in simple minded C.

Properly used, with restraint, OO is a great idea.  But OO is an idea that 
has been, and is being, terribly oversold.  This notion of forecasting the 
future so your 20 lines of super clever code can work its way into all 
kinds of wonderful programs, not yet written, makes me want to puke. If I 
were supervising programmers, the number of times you would see "protected" 
in code written by my guys would be very close to zero. Write the damn 
program, let the future take care of itself. 


0
Osmium
10/15/2014 12:21:05 PM
On Wednesday, October 15, 2014 8:21:15 AM UTC-4, Osmium wrote:
> He did say 500 *classes*, not objects. I have difficulty even
> imagining such a beast.

I had someone respond the other day with "when you're trying to manage
over a billion lines of C++ code, there are certain rules and procedures
which must be followed," (or something like that).

My first thought was ... really?  A billion lines of C++ code?  What
kind of system would that be?  All of Linux with all its drivers are
less than 20 million lines.  Windows Server 2003 was estimated to be
50 million lines.  The full Debian OS release is less than 500 million,
and that includes every available app that's available on Debian.

Here's a website which says that, over the past seven years, their company 
has analyzed 850 million lines of code from over 300 open source projects 
(http://www.net-security.org/secworld.php?id=14871), and these include 
projects like Linux, PHP, and Apache, and all they entail.

Where would someone come up with a billion lines of C++ code in a single
project?

Best regards,
Rick C. Hodgin
0
Rick
10/15/2014 12:42:29 PM
On 10/13/14, 2:35 PM, Xcott Craver wrote:
> On Sunday, October 12, 2014 5:22:16 PM UTC-4, Rick C. Hodgin wrote:
>>
>> I enjoy the C++ class, exceptions, and very little else. I would
>> tell anyone, "you get very far beyond those two add-ons to C, and
>> you're into the uncertain lands called obfuscation and misery."
>
> That seems to be a weird property specific to C++:  the more you do
> it right, the worse it gets.
>
> In most languages, if you think in the language and write in a style
> appropriate to the language, using the language's features,
> everything is better; and if you try to write a program the same way
> you would in another language, the result is often longer, awkward
> and unreadable.
>
> In C++, however, the opposite is true:  you can get some nice
> advantages by writing C code in C++ and using a couple features that
> remove the cruft of memory and string management.  Or converting Java
> to C++ and mapping over the object features you use in Java.  But
> once you bust out templates and nontrivial OO features to write C++
> like a C++ programmer, suddenly you just have an abstract wall of
> bytes.
....

> --S
>

The thing I have found is it is often worthwhile to think of C++ as a 
FAMILY of languages (often quite different from each other). They have a 
common standard that specifies the low level framework you use to write 
in them, but you can write in a number of different styles/idioms. This 
is perhaps one strength (and weakness) of the language. Many languages 
have primarily one way to write a program, and if the task at hand 
doesn't fit that method well, it is hard to write. In C++ you can choose 
which method to use. The problem is that if you are used to only a few 
of the methods, and pick up a program written in one you aren't familiar 
with, it will seem like it is written in an unknown language (because it 
sort of is).
0
Richard
10/15/2014 12:55:16 PM
On 10/15/2014 02:35 AM, Ian Collins wrote:
> jacob navia wrote:
....
>> Yes, but in most project and most cases the tests will test only 0.1% of
>> the possible combinations of input values a software should accept.
> 
> Humbug.

How could it possibly be otherwise? The range of possible combinations
of inputs is huge even for relatively trivial functions such as
add(a,b), where a and b are long integers. For any realistic function,
it can be so large that even achieving 0.1% coverage with your tests is
a fantasy, not a reality.

>> To test if a floating point package is correct you would need more time
> than the age of the Universe...
> 
> Well obviously you don't aim to test every number, you apply a little 
> of the intelligence you were hired for.

Yes - with the associated problem that your intelligence might fail to
identify the particular combination of inputs that will trigger a
particular problem. I've seen some pretty obscure bugs triggered by very
odd situations I'd never have thought to test for.

For instance, a bug in third party software that caused it to delete a
file, rather than update it, if the disk volume it was stored in ran out
of space. It did so without even reporting a failure - because the
authors hadn't bothered to check whether fwrite(), fclose(), or rename()
actually succeeded. It didn't help that the third party library didn't
even document the fact that the file was subject to updates. I never
even thought to check whether there had been any changes to that file
until this bug made it mysteriously disappear.

>> Yes, tests are very good but you should not believe they will ensure bug
>> free software!
> 
> Close enough, if done well.

Nothing can ensure bug free software; it's just as much a mistake to
rely entirely upon testing as it is to rely entirely upon any other
single quality assurance technique.

>> Other problems aren't covered by testing: Porting the software to a new
>> environment for instance.
> 
> Run the tests again?

That's not necessarily sufficient, if the problem is due to something
that you didn't think to test because you weren't aware that it posed a
portability problem.
-- 
James Kuyper
0
James
10/15/2014 1:50:53 PM
On 2014-10-14, jacob navia <jacob@spamsink.net> wrote:
> Le 14/10/2014 21:05, Kaz Kylheku a écrit :
>
>> And you wonder why you get pissed on this group.
>>
>> Why don't you just stick a "Kiki me!" note on your back?
>>
>
>
> There are two kinds of people in this group
>
> The ones that produce things, do things, and try to improve the C language.

The C++ people already improved the C language with operator overloading and
containers several decades ago. You're also scoffing at people who have
worked to improve the language.

Improving the C language in nonproductive ways that mimic C++ is something
that I'm definitely not interested in.

> And the assholes that go around pissing other people, writing stupid 
> replies like you.
>
> I do not care about your opinion. I write for the others.

Both lcc-win32 and C++ are dialects of C which have operator overloading
and containers.

C++ is standardized, and widely implemented, so that many platforms
can be targeted by C++.

lcc-win32 targets Intel Windows.

When someone wants a C-like language with operator overloading and containers,
he or she faces trade-offs in choosing between lcc-win and C++.

So far, everything above isn't an opinion.

My opinion is that those trade-offs do not generally look favorable for
lcc-win32 from this perspective. I cannot think why someone who needs
polymorphic containers or operator overloading would rationally choose that
over C++, even if the features have some advantages over their C++
counterparts, like nicer syntax.

Name-calling isn't going to convince anyone otherwise, nor remove the
barriers to adoption.

Even programmers who don't plan to make a portable program still have
portability in the backs of their minds. Even if they don't think about the
portability of the whole thing, they think about the portability of pieces of
that program. They also think about the portability of their *skills* to future
projects. If you can "take the tool with you", you take the associated skills
with you also.

When Stroustrup started working on a better C in 1980, ("C with Classes"), he
did a smart thing: he implemented a compiler that produced C source.  This
meant that his dialect rode on the coattails of C portability.  People who had 
a C compiler could either use his implementation directly, or else, failing 
that, they could at least take the output of Stroustrup's implementation run 
on machine A and compile it with a C compiler running on machine B.

That might be something to think about: developing such a front end which
lets people use the lcc-win32 dialect anywhere they have a regular C compiler
in addition to having the "native" lcc-win32 on Windows.

Note that unlike some other people in this group, I recognize that C has
dialects, and I'm not opposed to their use. lcc-win32 with its containers
and overloading and whatnot is a kind of C.  I do think that someone using a
GCC extension, or C++, is in better shape than someone relying on a lcc-win32
extension. This is purely because of the portability and wide-spread use of
the GNU toolchain. You can take your code with GCC extensions from Windows
and get it running on ARM or MIPS. GCC also has enough clout that some of its
extensions appear elsewhere, like Clang.

I'm sorry you spent all that effort to make things that are a tough sell; that
is not anyone else's problem, but your own.  In life, you have to know whether
what you're making is something someone wants. If you choose to make something
that few are going to want, you have to do so consciously and refrain from
spiteful behavior when that is confirmed.

I do electronics as a hobby, here and here. I have built myself hardware of
which there is only one instance. I designed circuits, etched and drilled
PCB's, and assembled these things, and they are for my use only. This is fine.

I also developed and maintain a programming language that very few people use
at all. (It is portable, and to make it even easier for some people, I
maintain binaries for Windows, a couple of Linuxes, Mac OS X, and Solaris.) I
do not call people names when they raise concerns about the language or
voice their opinion that it doesn't seem to offer them any advantage over some
other language that they already know well.  I encourage people to use
alternatives, if they solve their problems more easily, and have looked the
other way when someone else has done so in my very mailing list. That's what it
takes to be professional: cool detachment.
0
Kaz
10/15/2014 3:07:36 PM
"Kaz Kylheku" <kaz@kylheku.com> wrote in message 
news:20141015071013.889@kylheku.com...
> On 2014-10-14, jacob navia <jacob@spamsink.net> wrote:

>> The ones that produce things, do things, and try to improve the C 
>> language.

> When someone wants a C-like language with operator overloading and 
> containers,
> he or she faces trade-offs in choosing between lcc-win and C++.

> That might be something to think about: developing such a front end which
> lets people use the lcc-win32 dialect anywhere they have a regular C 
> compiler
> in addition to having the "native" lcc-win32 on Windows.

AFAIK lcc-win's containers library is written in standard C and ought to be 
portable.

-- 
Bartc 

0
BartC
10/15/2014 3:27:14 PM
Le 15/10/2014 17:07, Kaz Kylheku a écrit :
> The C++ people already improved the C language with operator overloading and
> containers several decades ago. You're also scoffing at people who have
> worked to improve the language.

I do not accept the complexity of C++.


Your viewpoint is very popular in this group, however. Anyone that 
proposes to improve the C language is shown the C++ pile of horrors as if 
it were the best thing since sliced bread, including (of course) by the C 
committee, where most of the members work to improve the C++ language.

Since C++ started with their (OHHH so new) OO approach that was supposed 
to be the new silver bullet, C programmers have been denigrated by those 
people like somehow stupid folks that can't understand the new techniques.

We had one from JPL that said in this forum that he was waiting for C 
programmers to die out so that C would disappear.

You continue that tradition. If I propose to improve the C language with 
something that is current in Fortran, C# etc, almost ANY language 
nowadays, the C++ people in this group will block me with their eternal

"C++ has done it already"

Yes, it has "done it already" but that is NO REASON to block the 
development of a simple language without making it the monstrosity that 
C++ has become.

Just one data point:

The creator of the language, Mr Stroustrup, after YEARS of trying to 
introduce a new feature, was forced to acknowledge that he wasn't able to 
do so. C++ has become too complex to fit in a single human brain, even 
the brain of the creator of the language.

My compiler system today targets PowerPC CPUs and Intel ones. I do not 
have the financial backing of gcc/Microsoft/whatever. Because C is 
despised by those people as a matter of course.

Even though the usage figures of C continue to GROW.

Why is that?

Why, after ALL THOSE YEARS OF C++, does C continue to grow?

Because it is a better, simpler language. Operator overloading in lcc-win 
is exclusively used for NUMBERS, as it should be. A small but very 
important application.

Since there are no classes, nor templates, nor lambdas, nor... well, ETC!, 
everything remains easy to understand and read.

C++ is UNREADABLE!

Just the specifications of the overloading of an operator in C++ implies 
a topological sort of the classes to figure out which function will be 
called! They run for PAGES AND PAGES.

Do all C++ programmers KNOW all that and did they do the topological 
sort in their heads before writing the code?

Of course not, they just think they know and in many cases they are right.

Until someone else adds a class somewhere else and suddenly a lot of 
things break without anyone figuring out what is going on.

Look, I think C is the right measure. C++ will die from its own weight. 
It is just part of the obesity epidemic.
0
jacob
10/15/2014 4:26:21 PM
"jacob navia" <jacob@spamsink.net> wrote in message 
news:m1m77a$sai$1@speranza.aioe.org...
> Le 15/10/2014 17:07, Kaz Kylheku a écrit :
>> The C++ people already improved the C language with operator overloading 
>> and
>> containers several decades ago. You're also scoffing at people who have
>> worked to improve the language.
>
> I do not accept the complexity of C++.

> Your viewpoint is very popular in this group however. Anyone that proposes 
> to improve the C language is shown the c++ pile of horrors as it would be 
> the best thing since sliced bread, including (of course) the C committee 
> where most of its member work to improve the C++ language.
>
> Since C++ started with their (OHHH so new) OO approach that was supposed 
> to be the new silver bullet,

I think you mean 'magic bullet'. A silver bullet would have rather the 
opposite effect.

(Although thinking about it, perhaps you were right with 'silver bullet' 
after all.)

-- 
Bartc 

0
BartC
10/15/2014 4:48:50 PM
On 2014-10-15, jacob navia <jacob@spamsink.net> wrote:
> Le 15/10/2014 17:07, Kaz Kylheku a écrit :
>> The C++ people already improved the C language with operator overloading and
>> containers several decades ago. You're also scoffing at people who have
>> worked to improve the language.
>
> I do not accept the complexity of C++.
>
> Your viewpoint is very popular in this group however.

Not the part about C++ being a dialect of C, or that it's a better C.

Yesterday, I found a gaping bug in a C program with the help of C++.
I maintain the TXR language in "Clean C": it can be compiled with C or C++.
But I have been neglecting to maintain the C++ port.

I introduced a change in August revolving around using Flex
to generate a reentrant scanner. Unfortunately, the GNU Flex pinheads
chose to use a really stupid approach, namely "typedef void *yyscan_t"
as the public type representing a scanner.

I screwed up in some places where a function takes
(void *scanner, parser_t *parser) parameters, but the arguments
were swapped! Of course the parser passes to a void * in C without
complaint, and the scanner is a void * and passes to a parser_t *.

I compiled the code as C++ yesterday and instantly found the glaring
problem.

When I build the program with C++, it has almost the same size and performance.

Built with g++:

  $ ldd ./txr
          libm.so.6 => /lib/i386-linux-gnu/libm.so.6 (0xb7779000)
          libgcc_s.so.1 => /lib/i386-linux-gnu/libgcc_s.so.1 (0xb775b000)
          libc.so.6 => /lib/i386-linux-gnu/libc.so.6 (0xb75b0000)
          /lib/ld-linux.so.2 (0xb77b9000)
  $ size ./txr
     text          data     bss     dec     hex filename
   625339          4460 1072816 1702615  19fad7 ./txr
  $ time make tests
  [...]
  ** tests passed!

  real  0m11.269s
  user  0m9.777s
  sys   0m0.296s

With gcc:

  $ ldd ./txr
          libm.so.6 => /lib/i386-linux-gnu/libm.so.6 (0xb778b000)
          libc.so.6 => /lib/i386-linux-gnu/libc.so.6 (0xb75e1000)
          /lib/ld-linux.so.2 (0xb77cb000)
  $ size ./txr
     text          data     bss     dec     hex filename
   622472          4444 1072768 1699684  19ef64 ./txr
  $ time make tests
  [...]
  ** tests passed!

  real  0m10.851s
  user  0m9.533s
  sys   0m0.308s

These differences can be noted:

- The text segment size of the C++ version is only 0.4% bigger, and it
  has a few more bytes of initialized and uninitialized data.

- Based on the user + sys times, and these single samples only, the C runs
  2.3% faster. (This is suspicious and probably the gap would be shown
  to be smaller with a few repetitions.)

- The C++ version links an extra shared library.

The monstrous complexity of C++ is not hurting here at all. And I haven't even
taken any special steps. I haven't told the C++ compiler, for instance, to
disable exception handling or not to link C++ specific libraries.

In spite of the complexity in the language, the C++ designers have managed to
do a good job of adhering to the principle "don't pay for what you don't use"
or at least don't pay much.

> proposes to improve the C language is shown the c++ pile of horrors as 
> it would be the best thing since sliced bread, including (of course) the 
> C committee where most of its member work to improve the C++ language.
>
> Since C++ started with their (OHHH so new) OO approach that was supposed 
> to be the new silver bullet, C programmers have been denigrated by those 
> people like somehow stupid folks that can't understand the new techniques.

Someone who wants operator overloading and generic containers obviously
understands some of the techniques. Your one-implementation, one
platform-specific dialect is a difficult "sell" to woo that person away
from C++.

The stuff you *don't* have compared to C++ isn't really a selling point.  The
average PC has tons of drive space to store the tools, and the extra cruft in
C++ which the person isn't interested in (above and beyond operator overloading
and containers) can be shown not to significantly impact the executable sizes
or performance.

> You continue that tradition. If I propose to improve the C language with 
> something that is current in Fortran, C# etc, almost ANY language 
> nowadays, the C++ people in this group will block me with their eternal
>
> "C++ has done it already"

This is a perfectly valid, rational point. C++ has done it already.
Demonstrably: there it is.  The counterargument "no, it hasn't" cannot be used.

C++ has not only done it, but you can keep writing C in C++ all you want, using
all the same libraries.

You haven't made any statements that lead me to believe that if you continue
unchecked in your own efforts, you will also not whip up something
complex.

Like so many people in computer science, you despise *other* people's
complexity but revel in that of your own creation.

"Haskell type system with monadic I/O? Bah, complicated junk; I will stick to my
Lisp macros with quadruply nested backquotes."

> Yes, it has "done it already" but that is NO REASON to block the 
> development of a simple language without making it the monstruosity that 
> C++ has become.

True; but it's not honest to promote that language as an alternative to C++
without qualifications, to people who might not even be able to use it.

> Just one data point:
>
> The creator of the language, Mr Stroustrup, after YEARS of trying to 
> introduce a new feature was forced to acknowledge that he wasn't able to 
> do so. C++ has become too complex to fit in a single human brain even 
> the brain of the creator of the language.

That's a sign that the process of piling features into C++ is self-limiting;
it ought to make you glad.

> My compiler system targets today Power PC CPUs and intel ones. I do not 
> have the fonancial backing of gcc/microsoft whatever. Because C is 
> despises by those people as a mater of course.

C++ is obviously not despised by the GCC people. What?

> Why after ALL THOSE YEARS OF C++ C continues to grow?

I suspect, because the Google search index doesn't do a good job with symbols
like "++".

> Because is a better, simpler language. Operator overloading in lcc-win 
> is exclusively used for NUMBERS, as it should be. A small but very 
> important application.

That's a limitation; right there, you're losing potential customers.  "Oh, I
was looking for overloading that is a little more general; so it will
have to be C++."

C++ has function overloading, actually, and operator overloading is just
a syntactic sugar on top of that. The + operator is actually the function
operator +(). This function is overloaded using function overloading,
just like any other function such as foo(). The syntactic sugar
of the binary + operator is just an interface to those overloads.

> C++ is UNREADABLE!

Not what people want to read from a compiler writer and vendor.

> Just the specifications of the overloading of an operator in C++ implies 
> a topological sort of the classes to figure out which function will be 
> called! They run for PAGES AND PAGES.

Sounds like something I could bang up in half a page of Lisp, based
on about an equal amount of pseudocode from a CS algorithms textbook.

(Code for that might run for pages and pages if it's written *in* C++,
and even longer if it's written in C.)

> Do all C++ programmers KNOW all that and did they do the topological 
> sort in their heads before writing the code?

Do you do a topological sort of the clothes when dressing in the morning?

Topological sorts across type lattices just do the intuitive thing,
generally: they help discover the parameters which have the most specific
combination of types for the call.

What is most specific is obvious to the programmer at a glance without
laboring through the topological sort.

There are some obvious certainties, like that if you have an overload
for types (A, B, C) and three arguments are given which are *exactly*
of those types, then they go to that exact overload. This shortcut
completely bypasses the need for a topological sort.

C++ also simplifies things by only allowing a single user-defined conversion in
the deduction of a parameter. Unlike some other languages, it also fails
to resolve an overload sometimes, producing a diagnostic. Those situations
are precisely those in which the type lattice is confusing. The overload
resolution rules were carefully designed so that the candidate functions
are "obvious" in some sense, as is the uniquely best one among them.
For instance, if you have:

  func(double, int);
  func(int, double);

The call func(1, 1) is ambiguous in C++ and flagged that way at compile time.
Both functions are candidates, but they are equally suited, and so there
is no single best candidate.  If you add a func(int, int) function, then that
overload will take the call. Always. No type signature can be more specific
for the func(1, 1) call, and any other signature which is different in any
way is necessarily less specific.

A naive topological sort without the added complexity of eliminating ambiguous
cases would be harder to work with. Better intuition requires more complexity!

Anyway, a C compiler has to do a bunch of graph theory to generate decent code:
allocate registers, analyze data flows and variable liveness through the
program, and so on.

You cannot reveal, as a C compiler developer and vendor, that you are scared
of topological sorts.

> Of course not, they just think they know and in many cases they are right.
>
> Until someone else adds a class somewhere else and suddenly a lot of 
> things break without anyone figuring out what is going on.

Whereas nobody ever adds anything to a C program that breaks, and makes it hard
to figure out what is going on?

Nobody ever makes an ad-hoc OO system in C to solve a problem, whose
control pathways are indecipherable (and which has that darned topological
sort in it somewhere).

> Look, I think C is the right measure. C++ will die from its own weight.
> It is just part of the obesity epidemic.

Fact remains that if you unleash a good C++ compiler on C code (which has been
carefully written to also compile as C++), you get decent performance and
executable size.

The bloat is undeniably there, but it's in the toolchain.
0
Kaz
10/15/2014 6:22:57 PM
jacob navia wrote:
> Le 15/10/2014 17:07, Kaz Kylheku a écrit :
>> The C++ people already improved the C language with operator overloading and
>> containers several decades ago. You're also scoffing at people who have
>> worked to improve the language.
>
> I do not accept the complexity of C++

One of these days you will grasp the simple concept of NOT USING THE 
BITS YOU DON'T LIKE!

If someone just wants C + operator overloading they can simply use that 
subset of C++.  This works well and it is portable.

-- 
Ian Collins
0
Ian
10/15/2014 6:48:39 PM
James Kuyper wrote:
> On 10/15/2014 02:35 AM, Ian Collins wrote:
>> jacob navia wrote:
> ....
>>> Yes, but in most project and most cases the tests will test only 0.1% of
>>> the possible combinations of input values a software should accept.
>>
>> Humbug.
>
> How could it possibly be otherwise? The range of possible combinations
> of inputs is huge even for relatively trivial functions such as
> add(a,b), where a and b are long integers. For any realistic function,
> it can be so large that even achieving 0.1% coverage with your tests is
> a fantasy, not a reality.
>
>>> To test if a floating point package is correct you would need more time
>>> than the age of he Universe...
>>
>> Well obviously you don't aim to test every number, you apply a little
>> of the intelligence you were hired for.
>
> Yes - with the associated problem that your intelligence might fail to
> identify the particular combination of inputs that will trigger a
> particular problem. I've seen some pretty obscure bugs triggered by very
> odd situations I'd never have thought to test for.

Which is why programmers pair and sensible projects use more than one 
level of testing and testers.

>>> Yes, tests are very good but you should not believe they will ensure bug
>>> free software!
>>
>> Close enough, if done well.
>
> Nothing can ensure bug free software; it's just as much a mistake to
> rely entirely upon testing as it is to rely entirely upon any other
> single quality assurance technique.

No one claims it does.  My projects had a zero defect target, with one 
or two a year eventually popping up.  If your QA isn't good enough to 
make a bug report rare enough to justify a team meeting and an analysis 
of how the bug wasn't spotted, you are doing it wrong.

>>> Other problems aren't covered by testing: Porting the software to a new
>>> environment for instance.
>>
>> Run the tests again?
>
> That's not necessarily sufficient, if the problem is due to something
> that you didn't think to test because you weren't aware that it posed a
> portability problem.

Performing the analysis of where such problems may occur is an important 
part of any porting process.

-- 
Ian Collins
0
Ian
10/15/2014 6:57:07 PM
On 2014-10-15, Ian Collins <ian-news@hotmail.com> wrote:
> jacob navia wrote:
>> Le 15/10/2014 17:07, Kaz Kylheku a écrit :
>>> The C++ people already improved the C language with operator overloading and
>>> containers several decades ago. You're also scoffing at people who have
>>> worked to improve the language.
>>
>> I do not accept the complexity of C++
>
> One of these days you will grasp the simple concept of NOT USING THE 
> BITS YOU DON'T LIKE!
>
> If someone just wants C + operator overloading they can simply use that 
> subset of C++.  This works well and it is portable.

Also, it runs in about the same time and space as a C program which
defines a bunch of separately named functions for all the possible
overloads, where the programmer does the overload resolution.

That is to say, this C++ code:

  int foo(int, int);
  double foo(double, int);
  double foo(int, double);
  double foo(double, double);

  foo(foo(3.0, 1), foo(4.0, foo(7, 6.1)));

isn't any worse in time and space than this C code:

  int foo0(int, int);
  double foo1(double, int);
  double foo2(int, double);
  double foo3(double, double);

  foo3(foo1(3.0, 1), foo3(4.0, foo2(7, 6.1)));

I just had to manually work out which foo is called.

This isn't quite operator overloading; it is function overloading, but
it lets a like-for-like comparison be made.

There is no run-time overhead from C++ figuring out which overloads
are called. Effectively, it writes the same code that is manually
written in C. The C++ compiler is larger and more complicated on account
of this, but the resulting code isn't.
0
Kaz
10/15/2014 7:05:40 PM

"Ian Collins" <ian-news@hotmail.com> wrote in message
news:ca7ts8FerbqU12@mid.individual.net...
> jacob navia wrote:
>> Le 15/10/2014 17:07, Kaz Kylheku a écrit :
>>> The C++ people already improved the C language with operator overloading
>>> and
>>> containers several decades ago. You're also scoffing at people who have
>>> worked to improve the language.
>>
>> I do not accept the complexity of C++
>
> One of these days you will grasp the simple concept of NOT USING THE BITS
> YOU DON'T LIKE!
>
> If someone just wants C + operator overloading they can simply use that
> subset of C++.  This works well and it is portable.

So why does gcc distribute both 'gcc' and 'g++' executables?

In fact, why does the C language still exist at all, if it is pretty much a 
subset of C++?

-- 
Bartc 

0
BartC
10/15/2014 7:06:20 PM
Ian Collins <ian-news@hotmail.com> writes:
> jacob navia wrote:
>> Le 15/10/2014 17:07, Kaz Kylheku a écrit :
>>> The C++ people already improved the C language with operator
>>> overloading and containers several decades ago. You're also scoffing
>>> at people who have worked to improve the language.
>>
>> I do not accept the complexity of C++
>
> One of these days you will grasp the simple concept of NOT USING THE 
> BITS YOU DON'T LIKE!
>
> If someone just wants C + operator overloading they can simply use that 
> subset of C++.  This works well and it is portable.

But it's difficult to enforce the use of such a subset.

Let's say you want to implement a project using code written in
the intersection of C and C++, except that operator overloading is
permitted.  So all the code is valid and portable C++, but avoids
most C++ features that are not also in C.  (And you have to cast
the result of malloc and avoid using C++ keywords as identifiers.)

You can't compile your code with a C compiler (unless that compiler
happens to support operator overloading as an extension *and* does so
in a manner that's compatible with C++).  You can of course compile
it with a C++ compiler, but if somebody on your team (even if your
team is just you) accidentally refers to a structure by its tag name:

    struct foo { /* ... */ };
    foo obj;

then your C++ compiler will quietly accept it and not diagnose the
fact that you're using a C++ feature that's not on the permitted
list.

In short, the dialect you're using has to be enforced by programmer
discipline and perhaps by external tools, not by the compiler.
It's going to be hard to avoid the temptation to use inheritance
or exceptions *just this once*.

I'm not trying to argue that using such a subset is therefore a bad
idea.  Most projects have coding standards that are enforced only
by programmer discipline.  Compilers aren't going to complain about
bad brace placement or violation of a tabs-vs-spaces indentation
convention.  Restricting yourself to a subset of a language isn't
much different.  But it's something to keep in mind.

-- 
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Working, but not speaking, for JetHead Development, Inc.
"We must do something.  This is something.  Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
0
Keith
10/15/2014 7:11:08 PM
On 10/15/2014 02:48 PM, Ian Collins wrote:
> jacob navia wrote:
....
>> I do not accept the complexity of C++
> 
> One of these days you will grasp the simple concept of NOT USING THE 
> BITS YOU DON'T LIKE!

At a minimum, you still need to understand the bits you don't like well
enough to avoid accidentally bringing them into play. Furthermore, in
any context where there's more than one programmer involved, every
member of the team needs at least a basic understanding of all the parts
of C++ that are used by any member of the team.
0
James
10/15/2014 7:12:32 PM
On 2014-10-15, BartC <bc@freeuk.com> wrote:
>
>
> "Ian Collins" <ian-news@hotmail.com> wrote in message
> news:ca7ts8FerbqU12@mid.individual.net...
>> jacob navia wrote:
>>> Le 15/10/2014 17:07, Kaz Kylheku a écrit :
>>>> The C++ people already improved the C language with operator overloading
>>>> and
>>>> containers several decades ago. You're also scoffing at people who have
>>>> worked to improve the language.
>>>
>>> I do not accept the complexity of C++
>>
>> One of these days you will grasp the simple concept of NOT USING THE BITS
>> YOU DON'T LIKE!
>>
>> If someone just wants C + operator overloading they can simply use that
>> subset of C++.  This works well and it is portable.
>
> So why does gcc distribute both 'gcc' and 'g++' executables?
>
> In fact, why does the C language still exist at all, if it is pretty much a 
> subset of C++?

It's not clear. You can chalk it up to ideological fanaticism.

Also, look up a song called "I'm On a Committee", by Phong Ngo.

In particular the lines:

  We resolve and absolve, but never dissolve
  Since it's out of the question for us.

Committees like ISO C have a tendency to keep getting together long after
having served their purpose.

This is why I really respect the people who standardized ANSI Common Lisp. They
got the job done in 1994. Then they had the good taste and sense to disappear
and not to be heard from again! When a real issue comes up, maybe a working
group can be assembled again.
0
Kaz
10/15/2014 7:46:22 PM
In article <20141015071013.889@kylheku.com>, Kaz Kylheku <kaz@kylheku.com> 
wrote:

> Both lcc-win32 and C++ are dialects of C which have operator overloading
> and containers.

Smalltalk, Lisp, Scheme, etc had containers a long time ago. And because those 
languages are willing to push type operations to runtime, they do so with much 
less complication syntactically and semantically. CLU and Algol 68 also have 
operator overloading. Algol 68 can recover anonymous temporary results safely 
without destructors. Unfortunately Algol 68 closures are based on declarations 
rather than names, so a powerful technique was just missed.

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/15/2014 7:48:42 PM
Keith Thompson wrote:
> Ian Collins <ian-news@hotmail.com> writes:
>> jacob navia wrote:
>>> Le 15/10/2014 17:07, Kaz Kylheku a écrit :
>>>> The C++ people already improved the C language with operator
>>>> overloading and containers several decades ago. You're also scoffing
>>>> at people who have worked to improve the language.
>>>
>>> I do not accept the complexity of C++
>>
>> One of these days you will grasp the simple concept of NOT USING THE
>> BITS YOU DON'T LIKE!
>>
>> If someone just wants C + operator overloading they can simply use that
>> subset of C++.  This works well and it is portable.
>
> But it's difficult to enforce the use of such a subset.

No, it isn't.  Teams I have coached chose an agreed subset, there wasn't 
any enforcement involved.  I don't generally believe in enforced 
standards, someone will inevitably find a way to circumvent them.

<snip>

> I'm not trying to argue that using such a subset is therefore a bad
> idea.  Most projects have coding standards that are enforced only
> by programmer discipline.  Compilers aren't going to complain about
> bad brace placement or violation of a tabs-vs-spaces indentation
> convention.  Restricting yourself to a subset of a language isn't
> much different.  But it's something to keep in mind.

Sure, but once a consensus is reached, there are non-proscriptive ways 
of ensuring the rules are followed.

-- 
Ian Collins
0
Ian
10/15/2014 7:51:28 PM
James Kuyper wrote:
> On 10/15/2014 02:48 PM, Ian Collins wrote:
>> jacob navia wrote:
> ....
>>> I do not accept the complexity of C++
>>
>> One of these days you will grasp the simple concept of NOT USING THE
>> BITS YOU DON'T LIKE!
>
> At a minimum, you still need to understand the bits you don't like well
> enough to avoid accidentally bringing them into play. Furthermore, in
> any context where there's more than one programmer involved, every
> member of the team needs at least a basic understanding of all the parts
> of C++ that are used by any member of the team.

A team (if it can truly be called a team) will have agreed in advance 
what language features they will use.  I've helped teams through this 
process.  One wanted little more than C with overloading, another the 
kitchen sink and others something in between.  The subset doesn't have 
to be fixed.  As the collective understanding of the features on offer 
and the problems to be solved grows, so can the subset of the language 
in use.

-- 
Ian Collins
0
Ian
10/15/2014 7:57:49 PM
In article <ca7ts8FerbqU12@mid.individual.net>,
 Ian Collins <ian-news@hotmail.com> wrote:

> jacob navia wrote:
> > Le 15/10/2014 17:07, Kaz Kylheku a écrit :
> >> The C++ people already improved the C language with operator overloading 
> >> and
> >> containers several decades ago. You're also scoffing at people who have
> >> worked to improve the language.
> >
> > I do not accept the complexity of C++
> 
> One of these days you will grasp the simple concept of NOT USING THE 
> BITS YOU DON'T LIKE!
> 
> If someone just wants C + operator overloading they can simply use that 
> subset of C++.  This works well and it is portable.

Suppose I want to write operators for mxn Matrices where the matrix size is 
variable and so the matrix data is allocated on a heap. How would I recover the 
anonymous scalar product in A + r*B without reference counts, garbage 
collection, or destructors?

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/15/2014 7:58:06 PM
Siri Crews wrote:
> In article <ca7ts8FerbqU12@mid.individual.net>,
>   Ian Collins <ian-news@hotmail.com> wrote:
>
>> jacob navia wrote:
>>> Le 15/10/2014 17:07, Kaz Kylheku a écrit :
>>>> The C++ people already improved the C language with operator overloading
>>>> and
>>>> containers several decades ago. You're also scoffing at people who have
>>>> worked to improve the language.
>>>
>>> I do not accept the complexity of C++
>>
>> One of these days you will grasp the simple concept of NOT USING THE
>> BITS YOU DON'T LIKE!
>>
>> If someone just wants C + operator overloading they can simply use that
>> subset of C++.  This works well and it is portable.
>
> Suppose I want to write operators for mxn Matrices where the matrix size is
> variable and so the matrix data is allocate on a heap. How would I recover the
> anonymous scalar product in A + r*B without reference counts, garbage
> collection, or destructors?

I don't see your point.  Once you start introducing objects that manage 
a resource, you need a means to allocate and free that resource.  Yes, you 
could set yourself up for a world of pain and manage this by hand, or 
you could simply add a destructor and leave the mundane tasks to the 
compiler.

I'm still waiting to see your attempt at emulating automatic resource 
management in C.

-- 
Ian Collins
0
Ian
10/15/2014 8:19:45 PM
BartC wrote:
 > In fact, why does the C language still exist at all, if it is pretty 
much a subset of C++?

Kaz Kylheku wrote :

> It's not clear. You can chalk it up to ideological fanaticism.

Thousands of programmers build thousands of great applications using the 
language they like. They are all fanatics.

Mr Kylheku is NOT a fanatic OF COURSE!

:-)

He comes here with the purpose of destroying the group, and converting all 
those C fanatics to C++, the great new religion.

The C Committee should dissolve (it has done so to a great extent),
we should embrace the new world of unreadable software and load those 
gigabytes of computer trivia into our overloaded minds :-)

Sure, anyone can buy more RAM for his/her computer to accommodate a new 
monstrosity.

There are no stores where you can buy more brain space, however. People have 
only a limited amount of patience to swallow more and more C++ stuff. It is a 
language that takes years and years to get accustomed to, and it never 
ends. Nobody masters ALL of the language anymore, not even 
Stroustrup.

You can learn C very quickly. And yes, it has some pitfalls but they are 
very limited compared to C++.

My goal is to continue to develop C as a language and to avoid C++ as 
far as I can.
0
jacob
10/15/2014 8:20:25 PM
Le 15/10/2014 21:58, Siri Crews a écrit :
> Suppose I want to write operators for mxn Matrices where the matrix size is
> variable and so the matrix data is allocate on a heap. How would I recover the
> anonymous scalar product in A + r*B without reference counts, garbage
> collection, or destructors?

Very easy

All your overloaded operators work with a TOKEN that records the 
operations, and they pass on an annotated token.

The ASSIGNMENT operator receives the token sequence, and then can handle 
all the intermediate data as it likes.

typedef struct tagMatrixToken { List *OperationsList; } Matrix;

Matrix operator+(Matrix A,Matrix B)
{
	// Add an addition token annotating the operation with the
	// actual arguments.
	// Return the annotated token
}
Matrix operator*(Matrix A,Matrix B)
{
	// etc, the same
}

Matrix & operator=(Matrix &A, Matrix B)
{
	// In Matrix B we find the list of operations parsed for this
	// expression.
	// We can compile now the best sequence of operations
	// to emit FOR THIS expression
}

This implies that the assignment operator will catch anonymous 
assignments, for instance to function arguments. This can be done if we 
overload the cast operation from our Matrix Token to a real Matrix.

Roughly it would work like this. Please do not drown in a glass of water.

:-)


0
jacob
10/15/2014 8:32:35 PM
In article <m1mlkv$4hu$1@speranza.aioe.org>, jacob navia <jacob@spamsink.net> 
wrote:

> Le 15/10/2014 21:58, Siri Crews a écrit :
> > Suppose I want to write operators for mxn Matrices where the matrix size is
> > variable and so the matrix data is allocate on a heap. How would I recover 
> > the
> > anonymous scalar product in A + r*B without reference counts, garbage
> > collection, or destructors?
> 
> Very easy
> 
> All your overloaded operators work with a TOKEN, that records the 
> operations, and pass an annotated token.
> 
> The ASSIGNMENT operator receives the token sequence, and then can handle 
> all the intermediate data as it likes.

There was no assignment in the above expression, which might appear in F(A+r*B). Nor 
is there an assignment in if (issingular(A+r*B)) {...}


> Roughly it would work like this. Please do not drown in a glass of water.

All you're really doing is reinventing destructors, but even worse than C++ did

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/15/2014 8:52:52 PM
On 15/10/14 21:06, BartC wrote:
>
>
> "Ian Collins" <ian-news@hotmail.com> wrote in message
> news:ca7ts8FerbqU12@mid.individual.net...
>> jacob navia wrote:
>>> Le 15/10/2014 17:07, Kaz Kylheku a écrit :
>>>> The C++ people already improved the C language with operator
>>>> overloading
>>>> and
>>>> containers several decades ago. You're also scoffing at people who have
>>>> worked to improve the language.
>>>
>>> I do not accept the complexity of C++
>>
>> One of these days you will grasp the simple concept of NOT USING THE BITS
>> YOU DON'T LIKE!
>>
>> If someone just wants C + operator overloading they can simply use that
>> subset of C++.  This works well and it is portable.
>
> So why does gcc distribute both 'gcc' and 'g++' executables?

gcc and g++ are both "drivers" - they are programs that call the other 
bits of the compiler, assembler, linker, etc.  They do almost exactly 
the same thing, except that g++ automatically adds the C++ static 
library to the linker setup.  As long as you take that library into 
account when linking, you can compile C and C++ files with gcc and g++.

>
> In fact, why does the C language still exist at all, if it is pretty
> much a subset of C++?
>

As you can see, there are a lot of people who think C is the best 
language around and can't quite fathom that you can use a C++ compiler 
to compile most good quality C.  There are also those who insist on 
writing C that can't be compiled as C++ - either for good reasons (such 
as making use of designated initialisers), avoidable reasons (such as 
using "friend" and "class" as variable names), or bad reasons (such as 
writing poor quality C code that is rejected by fussier C++ compilers).

A major reason is that it is /much/ harder to write a C++ compiler than 
to write a C compiler, and the difference grows with each major version 
of C++.  There are only a few good C++ compilers around, with far fewer 
C++11 compilers than C++03 compilers, while there are loads of C 
compilers.  The "big computer" world is dominated by a few players, 
which are big enough to be able to make C++ compilers (like gcc, clang, 
Intel, MSVC, Borland, Pathscale, and a couple of others).  But in the 
world of embedded development, C++ toolchains are bowing out of the 
market - for ARM cpus, ARM's own toolchain, Keil, and CodeWarrior have 
all dropped out or are dropping out (ARM now uses clang in their 
development kit and also actively support gcc, while Freescale and most 
other manufacturers have moved to gcc).  C toolchains abound, however, 
for all sorts of devices (though few have C11 support).


0
David
10/15/2014 8:59:36 PM
In article <ca8371FerbqU15@mid.individual.net>,
 Ian Collins <ian-news@hotmail.com> wrote:

> Siri Crews wrote:
> > In article <ca7ts8FerbqU12@mid.individual.net>,
> >   Ian Collins <ian-news@hotmail.com> wrote:
> >
> >> jacob navia wrote:
> >>> Le 15/10/2014 17:07, Kaz Kylheku a écrit :
> >>>> The C++ people already improved the C language with operator overloading
> >>>> and
> >>>> containers several decades ago. You're also scoffing at people who have
> >>>> worked to improve the language.
> >>>
> >>> I do not accept the complexity of C++
> >>
> >> One of these days you will grasp the simple concept of NOT USING THE
> >> BITS YOU DON'T LIKE!
> >>
> >> If someone just wants C + operator overloading they can simply use that
> >> subset of C++.  This works well and it is portable.
> >
> > Suppose I want to write operators for mxn Matrices where the matrix size is
> > variable and so the matrix data is allocate on a heap. How would I recover 
> > the
> > anonymous scalar product in A + r*B without reference counts, garbage
> > collection, or destructors?
> 
> I don't see your point.  Once you start introducing objects that manage 

Perhaps not in your world, but matrices are rather important in software that 
deals with linear functions, quantum mechanics, open-gl, etc. You told me I can 
use operator overloading without anything from C++, and I want to use it for a 
linear algebra library. Are you now placing restrictions on me after the fact? 
Where can I send my bill for wasted project time?

> I'm still waiting to see your attempt at emulating automatic resource 
> management in C.

I use the Boehm-Demers allocator and collector. I can add finalisers to 
objects, but in practice I only do that for things like decrementing Tcl or 
Cocoa reference counts.

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/15/2014 9:03:02 PM
Siri Crews wrote:
> In article <ca8371FerbqU15@mid.individual.net>,
>   Ian Collins <ian-news@hotmail.com> wrote:
>
>> Siri Crews wrote:
>>> In article <ca7ts8FerbqU12@mid.individual.net>,
>>>    Ian Collins <ian-news@hotmail.com> wrote:
>>>
>>>> jacob navia wrote:
>>>>> Le 15/10/2014 17:07, Kaz Kylheku a écrit :
>>>>>> The C++ people already improved the C language with operator overloading
>>>>>> and
>>>>>> containers several decades ago. You're also scoffing at people who have
>>>>>> worked to improve the language.
>>>>>
>>>>> I do not accept the complexity of C++
>>>>
>>>> One of these days you will grasp the simple concept of NOT USING THE
>>>> BITS YOU DON'T LIKE!
>>>>
>>>> If someone just wants C + operator overloading they can simply use that
>>>> subset of C++.  This works well and it is portable.
>>>
>>> Suppose I want to write operators for mxn Matrices where the matrix size is
>>> variable and so the matrix data is allocate on a heap. How would I recover
>>> the
>>> anonymous scalar product in A + r*B without reference counts, garbage
>>> collection, or destructors?
>>
>> I don't see your point.  Once you start introducing objects that manage
>
> Perhaps not in your world, but matrices are rather important in software that
> deals with linear functions, quantum mechanics, open-gl, etc. You told me I can
> use operator overloading without anything from C++, and I want to use it for a
> linear algebra library. Are you now placing restrictions on me after the fact?
> Where can I send my bill for wasted project time?

If you want to use C++ for a linear algebra library, you would want to 
take advantage of other language features.  Operator overloading only 
starts to add value when used with other language features, especially 
references.

>> I'm still waiting to see your attempt at emulating automatic resource
>> management in C.
>
> I use the Boehm-Deremer allocator and collector. I can add finalisers to
> objects, but in practice I only do that for things like decrementing Tcl or
> Cocoa reference counts.

There are plenty of resources other than memory; how would you 
automatically manage those?

-- 
Ian Collins
0
Ian
10/15/2014 9:27:21 PM
In article <ca875pFerbqU16@mid.individual.net>,
 Ian Collins <ian-news@hotmail.com> wrote:

> If you want to use C++ for a linear algebra library, you would want to 
> take advantage of other language features.  Operator overloading only 
> starts to add value when used with other language features, especially 
> references.

So you're saying I should ignore this guy:

    In article <ca7ts8FerbqU12@mid.individual.net>,
     Ian Collins <ian-news@hotmail.com> wrote:

    > One of these days you will grasp the simple concept of NOT USING THE 
    > BITS YOU DON'T LIKE!
    > 
    > If someone just wants C + operator overloading they can simply use that 
    > subset of C++.  This works well and it is portable.

> There are plenty of resources other than memory; how would you 
> automatically manage those?

[Boehm-Demers.]

http://www.hboehm.info/gc/finalization.html

Many garbage collectors provide a facility for executing user code just before 
an object is collected. This can be used to reclaim any system resources or 
non-garbage-collected memory associated with the object. Experience has shown 
that this can be a useful facility. It is indispensible in cases in which system 
resources are embedded in complex data structures (e.g. file descriptors in the 
cord package).

Our collector provides the necessary functionality through GC_register_finalizer 
in gc.h, or by inheriting from gc_cleanup in gc_cpp.h.
...

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/15/2014 9:46:06 PM
Le 15/10/2014 22:52, Siri Crews a écrit :
> In article <m1mlkv$4hu$1@speranza.aioe.org>, jacob navia <jacob@spamsink.net>
> wrote:
>
>> Le 15/10/2014 21:58, Siri Crews a écrit :
>>> Suppose I want to write operators for mxn Matrices where the matrix size is
>>> variable and so the matrix data is allocate on a heap. How would I recover
>>> the
>>> anonymous scalar product in A + r*B without reference counts, garbage
>>> collection, or destructors?
>>
>> Very easy
>>
>> All your overloaded operators work with a TOKEN, that records the
>> operations, and pass an annotated token.
>>
>> The ASSIGNMENT operator receives the token sequence, and then can handle
>> all the intermediate data as it likes.
>
> There was no assignment in the above expression, which might be in F(A+r*B). Nor
> is there an assignment in if (issingular(A+r*B)) {...}
>

Apparently you have difficulties reading. In my answer I handled this 
case explicitly (sorry to cite myself):

<quote>
This implies that the assignment operator will catch anonymous 
assignments, for instance to function arguments. This can be done if we 
overload the cast operation from our Matrix Token to a real Matrix.
<end quote>

If your function issingular receives a real matrix, not a token, when 
you pass it a token list then there will be a cast from token list to 
real matrix. The cast operator does the same thing as the assignment 
operator.

Is that clearer now?

>
>> Roughly it would work like this. Please do not drown in a glass of water.
>
> All you're really doing is reinventing destructors, but even worse than C++ did
>

And you think that throwing keywords around will help you?

You asked a specific question:

 >>> Suppose I want to write operators for mxn Matrices where the matrix size is
 >>> variable and so the matrix data is allocate on a heap. How would I recover the
 >>> anonymous scalar product in A + r*B without reference counts, garbage
 >>> collection, or destructors?

I answered that question.

Any more questions?


0
jacob
10/15/2014 10:01:11 PM
Le 15/10/2014 22:59, David Brown a écrit :
> As you can see, there are a lot of people who think C is the best
> language around and can't quite fathom that you can use a C++ compiler
> to compile most good quality C.

You are unable to accept the fact that there are loads of programmers who do 
not want to fill their minds with C++ trivia and who try to solve the task 
at hand without getting bogged down in an incomprehensible language.


0
jacob
10/15/2014 10:05:07 PM
David Brown <david.brown@hesbynett.no> writes:
> On 15/10/14 21:06, BartC wrote:
[...]
>> In fact, why does the C language still exist at all, if it is pretty
>> much a subset of C++?
>
> As you can see, there are a lot of people who think C is the best 
> language around and can't quite fathom that you can use a C++ compiler 
> to compile most good quality C.  There are also those who insist on 
> writing C that can't be compiled as C++ - either for good reasons (such 
> as making use of designated initialisers), avoidable reasons (such as 
> using "friend" and "class" as variable names), or bad reasons (such as 
> writing poor quality C code that is rejected by fussier C++ compilers).

And there are a lot of people (myself included) who happen to like C
(while acknowledging its flaws), and who enjoy using it and/or are
required to use it at work.

If I want to use a language that includes most of C as a subset,
it's not obvious that I should choose C++ over any of the other
C-derived languages like Objective-C, or lcc-win32 C, or GNU C
(yes, the latter two are more dialects than distinct languages).

If I'm programming in C, I generally don't even think about whether
my code will also compile as C++, because I'm not going to try to
compile it as C++.  There's no need to avoid calling a variable
"class" or "friend" if that happens to be the best name for it.
If I need to use C code in a C++ project, there's always extern "C".
Sure, there are cases where it makes sense to write code that will
compile either as C or as C++.  In my experience, those cases are
rarer than some people think they are, and I don't think I've ever
had such a requirement myself.

I do avoid non-prototype function declarations and definitions --
not because C++ doesn't permit them, but because it makes for better
C code.

[snip]

-- 
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Working, but not speaking, for JetHead Development, Inc.
"We must do something.  This is something.  Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
0
Keith
10/15/2014 10:54:55 PM
Siri Crews wrote:
> In article <ca875pFerbqU16@mid.individual.net>,
>   Ian Collins <ian-news@hotmail.com> wrote:
>
>> If you want to use C++ for a linear algebra library, you would want to
>> take advantage of other language features.  Operator overloading only
>> starts to add value when used with other language features, especially
>> references.
>
> So you're saying I should ignore this guy:
>
>      In article <ca7ts8FerbqU12@mid.individual.net>,
>       Ian Collins <ian-news@hotmail.com> wrote:
>
>      > One of these days you will grasp the simple concept of NOT USING THE
>      > BITS YOU DON'T LIKE!
>      >
>      > If someone just wants C + operator overloading they can simply use that
>      > subset of C++.  This works well and it is portable.

No, I'm saying exercise some intelligence.

Sure you can just use operator overloading alone:

struct Vector
{
   int *data;

   int operator[]( unsigned n ) { return data[n]; }
};

However overloaded arithmetic operators are usually associated with 
objects (such as a complex number, a bignum, a matrix...).  Once the 
object representation becomes non-trivial, it makes sense to use more 
language features.  For example you could have a simple complex number 
type thus:

struct Complex
{
   double i,j;
};

Complex operator+( Complex lhs, Complex rhs )
{
   Complex tmp;
   tmp.i = lhs.i + rhs.i;
   tmp.j = lhs.j + rhs.j;
   return tmp;
}

Which is OK, but what would you use as the return type of operator+=? 
Simple: use a reference:

struct Complex
{
   double i,j;

   Complex& operator+=( Complex rhs )
   {
     i += rhs.i;
     j += rhs.j;
     return *this;
   }
};

What if your type was a bignum rather than a double?  Then it would make 
sense to pass the parameters by reference.

What if you want to use float, double and a bignum for your complex 
type?  Make Complex a class template.

Each time the requirements get more complex, the language has a tool to 
help you implement them.

-- 
Ian Collins
0
Ian
10/15/2014 11:06:59 PM
Siri Crews wrote:
> In article <ca875pFerbqU16@mid.individual.net>,
>   Ian Collins <ian-news@hotmail.com> wrote:
>
>> The are plenty of other resources other than memory, how would you
>> automatically manage those?
>
> [Boehm-Demers.]
>
> http://www.hboehm.info/gc/finalization.html
>
> Many garbage collectors provide a facility for executing user code just before
> an object is collected. This can be used to reclaim any system resources or
> non-garbage-collected memory associated with the object. Experience has shown
> that this can be a useful facility. It is indispensible in cases in which system
> resources are embedded in complex data structures (e.g. file descriptors in the
> cord package).
>
> Our collector provides the necessary functionality through GC_register_finalizer
> in gc.h, or by inheriting from gc_cleanup in gc_cpp.h.

"Actions that must be executed promptly do not belong in finalizers. 
They should be handled by explicit calls in the code (or C++ destructors 
if you prefer)."

I still don't see how GC would help with a stand alone file descriptor 
or any resource not associated with allocated memory.

-- 
Ian Collins
0
Ian
10/15/2014 11:11:38 PM
jacob navia wrote:
> Le 15/10/2014 22:59, David Brown a écrit :
>> As you can see, there are a lot of people who think C is the best
>> language around and can't quite fathom that you can use a C++ compiler
>> to compile most good quality C.
>
> You are unable to accept the fact that there are loads of programmers who do
> not want to fill their minds with C++ trivia and who try to solve the task
> at hand without getting bogged down in an incomprehensible language.

You don't seem to realise that there are programmers who are able to 
comprehend C++.  Just because it is beyond you, don't tar the rest of us 
with the same brush.

-- 
Ian Collins
0
Ian
10/15/2014 11:14:09 PM
Siri Crews wrote:
>
> All you're really doing is reinventing destructors, but even worse than C++ did

What exactly do you have against destructors?

If I could have one C++ feature added to C, it would be struct 
constructors and destructors.  If I couldn't have the pair, it would be 
destructors.

-- 
Ian Collins
0
Ian
10/15/2014 11:17:10 PM
On 2014-10-15, Keith Thompson <kst-u@mib.org> wrote:
> David Brown <david.brown@hesbynett.no> writes:
>> On 15/10/14 21:06, BartC wrote:
> [...]
>>> In fact, why does the C language still exist at all, if it is pretty
>>> much a subset of C++?
>>
>> As you can see, there are a lot of people who think C is the best 
>> language around and can't quite fathom that you can use a C++ compiler 
>> to compile most good quality C.  There are also those who insist on 
>> writing C that can't be compiled as C++ - either for good reasons (such 
>> as making use of designated initialisers), avoidable reasons (such as 
>> using "friend" and "class" as variable names), or bad reasons (such as 
>> writing poor quality C code that is rejected by fussier C++ compilers).
>
> And there are a lot of people (myself included) who happen to like C
> (while acknowledging its flaws), and who enjoy using it and/or are
> required to use it at work.
>
> If I want to use a language that includes most of C as a subset,
> it's not obvious that I should choose C++ over any of the other
> C-derived languages like Objective-C, or lcc-win32 C, or GNU C
> (yes, the latter two are more dialects than distinct languages).

This is correct. For each of these possible choices, we can evaluate
the advantages of ensuring that your ISO C code is also valid in
that dialect.

Let us enumerate:

Objective C:

  The extensions of Objective C are orthogonal to C. They are introduced
  by the @ character which isn't in the translation character set of
  standard C.

  Standard C which doesn't use the extensions of Objective C is already
  Objective C; nothing has to be done to be compatible with Objective C.
  Objective C compilers probably do not provide additional diagnostics
  for ISO C code.

  Conclusion: there appears to be little utility in writing in the C subset
  of Objective C and using an Objective C compiler.

lcc-win32:

  Making ISO C code also compile on lcc-win32 has little utility,
  since it is a narrow dialect with a single implementation, and not
  a major dialect. Maybe it has great diagnostics, in which case
  there could be some value.

GNU C:

  Writing ISO C code which also compiles as GNU C is pointless, because
  GNU C is the dialect of the GNU compiler, which can be told to suppress its
  extensions and accept ISO C, so writing code not to step on GNU C extensions
  is a waste of effort. I do not believe that GCC in its native dialect mode
  provides additional useful safety or diagnostics.

  Conclusion: the subset of GNU C and C is not useful, either.


C++:

  Aha! Writing ISO C code which also compiles as C++ allows us to benefit from:

  * safer treatment of void * (this saved my ass just yesterday!)
  * type-safe enumerations
  * checks associated with const string literals
  * type-safe linkage.
  * enforcement of one-definition rule (versus some relaxed linkage
    mode of a given C compiler).
  * elimination of old-style cruft like calling undeclared functions.
  * elimination of C nonsense like file scope structs, unions and enums
    declared inside a struct or union body.

C++ is the dialect of C which brings some value to the table when used
as a C dialect.

Some of these benefits can be obtained from C compilers by manipulating their
options. That requires tedious research. Keep the code portable as C++ and you
get a whole package at once.
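A minimal sketch of two of the checks listed above (the enum name and function are illustrative, not from the thread): the commented-out lines are legal C that a C++ compiler diagnoses, and the version that survives both compilers is better C anyway.

```c
#include <assert.h>

enum color { RED, GREEN, BLUE };

/* Legal C that C++ rejects:
 *   enum color c = 2;            -- int does not convert to enum in C++
 *   char *s = "hi"; s[0] = 'H';  -- C++ forbids the non-const pointer
 *                                   (writing the literal is UB in C too)
 * The cast below satisfies C++'s stricter enum rules and is valid C. */
enum color next(enum color c)
{
    return (enum color)((c + 1) % 3);
}
```

Compiled as C++, dropping the cast turns a silent C conversion into a hard error.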
0
Kaz
10/15/2014 11:22:19 PM
"Kaz Kylheku" <kaz@kylheku.com> wrote in message 
news:20141015155845.947@kylheku.com...
> On 2014-10-15, Keith Thompson <kst-u@mib.org> wrote:

>> If I want to use a language that includes most of C as a subset,
>> it's not obvious that I should choose C++ over any of the other
>> C-derived languages like Objective-C, or lcc-win32 C, or GNU C
>> (yes, the latter two are more dialects than distinct languages).

> C++:
>
>  Aha! Writing ISO C code which also compiles as C++ allows us to benefit 
> from:
>
>  * safer treatment of void * (this saved my ass just yesterday!)
>  * type-safe enumerations
>  * checks associated with const string literals
>  * type-safe linkage.
>  * enforcement of one-definition rule (versus some relaxed linkage
>    mode of a given C compiler).
>  * elimination of old-style cruft like calling undeclared functions.
>  * elimination of C nonsense like file scope structs, unions and enums
>    declared inside a struct or union body.

Those aren't very interesting. And for that, you have to suffer much more 
pedantic type-strictness which can waste a lot of time to sort out.

> C++ is the dialect of C which brings some value to the table when used
> as a C dialect.

The only benefit I can think of is when interfacing to certain 
libraries/APIs that are only defined for C++ (eg. DirectX or MS' GDI+; I 
think WxWidgets can now work with other languages. Or when the only simple 
jpeg library you can find happens to be in C++).

That might still require more dabbling with the language than someone would 
like (some people like to be in control of their development and when most 
of the time is spent battling the language and/or compiler, then that is not 
very productive).

-- 
Bartc 

0
BartC
10/16/2014 12:06:25 AM
BartC wrote:
> "Kaz Kylheku" <kaz@kylheku.com> wrote in message
> news:20141015155845.947@kylheku.com...
>> On 2014-10-15, Keith Thompson <kst-u@mib.org> wrote:
>
>>> If I want to use a language that includes most of C as a subset,
>>> it's not obvious that I should choose C++ over any of the other
>>> C-derived languages like Objective-C, or lcc-win32 C, or GNU C
>>> (yes, the latter two are more dialects than distinct languages).
>
>> C++:
>>
>>   Aha! Writing ISO C code which also compiles as C++ allows us to benefit
>> from:
>>
>>   * safer treatment of void * (this saved my ass just yesterday!)
>>   * type-safe enumerations
>>   * checks associated with const string literals
>>   * type-safe linkage.
>>   * enforcement of one-definition rule (versus some relaxed linkage
>>     mode of a given C compiler).
>>   * elimination of old-style cruft like calling undeclared functions.
>>   * elimination of C nonsense like file scope structs, unions and enums
>>     declared inside a struct or union body.
>
> Those aren't very interesting. And for that, you have to suffer much more
> pedantic type-strictness which can waste a lot of time to sort out.

I'd change that to "much more pedantic type-strictness which can save 
you from nasty, arse-biting problems later on".

My experiences match Kaz's.  I was drawn to C++ in the early 90s when I 
had to find no end of obscure bugs in some badly written C (and a not 
very helpful C compiler).  Coercing the code to compile as C++ removed 
the bulk of the bugs, most of which were down to misuse of enumerations 
and mismatched parameters.

I'm sure a lot of these (except maybe the enum abuse) would be picked 
up by a modern C compiler or lint with the appropriate options, but that's 
more buggering about than just using g++/CC.

<snip>

> That might still require more dabbling with the language than someone would
> like (some people like to be in control of their development and when most
> of the time is spent battling the language and/or compiler, then that is not
> very productive).

Ever used Ada? :)

-- 
Ian Collins
0
Ian
10/16/2014 12:17:30 AM

"Ian Collins" <ian-news@hotmail.com> wrote in message 
news:ca8d0jFerbqU17@mid.individual.net...
> Siri Crews wrote:
>> In article <ca875pFerbqU16@mid.individual.net>,
>>   Ian Collins <ian-news@hotmail.com> wrote:
>>
>>> If you want to use C++ for a linear algebra library, you would want to
>>> take advantage of other language features.  Operator overloading only
>>> starts to add value when used with other language features, especially
>>> references.
>>
>> So you're saying I should ignore this guy:
>>
>>      In article <ca7ts8FerbqU12@mid.individual.net>,
>>       Ian Collins <ian-news@hotmail.com> wrote:
>>
>>      > One of these days you will grasp the simple concept of NOT USING 
>> THE
>>      > BITS YOU DON'T LIKE!
>>      >
>>      > If someone just wants C + operator overloading they can simply use 
>> that
>>      > subset of C++.  This works well and it is portable.
>
> No, I'm saying exercise some intelligence.
>
> Sure you can just use operator overloading alone:
>
> struct Vector
> {
>   int *data;
>
>   int operator[]( unsigned n ) { return data[n]; }
> };
>
> However overloaded arithmetic operators are usually associated with 
> objects (such as a complex number, a bignum, a matrix...).  Once the 
> object representation becomes non-trivial, it makes sense to use more 
> language features.

Once objects become non-trivial, it makes more sense to switch to a dynamic 
scripting language (Python etc), since the overheads of such languages will 
not be so significant compared with the manipulation of those objects (and 
there are faster versions coming along).

Once done, programming in a scripting language is far easier and much more 
productive (and the mechanics of the overloading etc can be kept out of the 
way more).

(It's possible that the implementation of individual methods may end up 
being done in C. But that's C, not C++, as any OO features necessary are 
taken care of by the scripting language.)

-- 
Bartc 

0
BartC
10/16/2014 12:32:32 AM
"Ian Collins" <ian-news@hotmail.com> wrote in message 
news:ca8h4qFerbqU20@mid.individual.net...
> BartC wrote:


>> That might still require more dabbling with the language than someone 
>> would
>> like (some people like to be in control of their development and when 
>> most
>> of the time is spent battling the language and/or compiler, then that is 
>> not
>> very productive).
>
> Ever used Ada? :)

A couple of statistics I remember from somewhere: a C++ compiler takes 
around 10 man-years of effort to write. An Ada one takes 50 
man-years...

-- 
Bartc 

0
BartC
10/16/2014 12:36:09 AM
Ian Collins wrote:
> Les Cargill wrote:
>>
>> The principle virtue of software is transparency. How do you know it
>> works?
>
> You test it?
>


YEP!

--
Les Cargill
0
Les
10/16/2014 4:02:16 AM
Osmium wrote:
> "Rosario193" wrote:
>
>> On Tue, 14 Oct 2014 21:34:15 -0500, Les Cargill wrote:
>>
>>> I once worked on a project where 500 classes were in play. Myself and
>>> another old 'C' hand estimated it would have been roughly 1/10th  the
>>> complexity done in raw 'C'.
>>
>> from my beginner point of view:
>> i don't agree: i don't find it difficult to follow this subset of c++:
>> classes, constructors, destructors
>> operator and function definitions [overloading]
>> the c subset
>
> He did say 500 *classes*, not objects. I have difficulty even imagining such
> a beast.


If somebody says "Let's use Rational Rose", don't.

<snip>

-- 
Les Cargill
0
Les
10/16/2014 4:17:02 AM
Les Cargill wrote:
>
> If somebody says "Let's use Rational Rose", don't.

I was once given the budget for Rose licenses for a team, so I spent the 
money on upgrades to their PCs and decent monitors.  A much wiser 
investment!

-- 
Ian Collins
0
Ian
10/16/2014 4:27:51 AM
On Thu, 16 Oct 2014 17:27:51 +1300, Ian Collins 
<ian-news@hotmail.com> wrote:
> Les Cargill wrote:
> >
> > If somebody says "Let's use Rational Rose", don't.


> I was once given the budget for Rose licenses for a team, so I spent the 
> money on upgrades to their PCs and decent monitors.  A much wiser 
> investment!

I hope to eventually switch to Rust, which does not have classes, and am 
currently experimenting with Go. 
UML is totally useless for them :P

-- 
Press any key to continue or any other to quit 
0
Mel
10/16/2014 4:41:53 AM
On Thu, 16 Oct 2014 12:06:59 +1300, Ian Collins wrote:

>However overloaded arithmetic operators are usually associated with 
>objects (such as a complex number, a bignum, a matrix...).  Once the 
>object representation becomes non-trivial, it makes sense to use more 
>language features.  For example you could have a simple complex number 
>type thus:
>
>struct Complex
>{
>   double i,j;
>};
>
>Complex operator+( Complex lhs, Complex rhs )
>{
>   Complex tmp;
>   tmp.i = lhs.i + rhs.i;
>   tmp.j = lhs.j + rhs.j;
>   return tmp;
>}

how does this function return its value? 
for example, copy tmp onto the stack as 2 doubles and call the destructor
for tmp? 
[i think not...
it could reserve memory for 2 doubles using malloc or new and return a
pointer to that memory in eax...]

the same for the lhs and rhs args... are they copied onto the stack?
[i think yes]

>Which is OK, but what would you use for the return type operator +=? 
>Simple, use a reference:
>
>struct Complex
>{
>   double i,j;
>
>   Complex& operator+=( Complex rhs )
>   {
>     i += rhs.i;
>     j += rhs.j;
>     return *this;
>   }
>};
>
>What if your type was a bignum rather than a double?  
>Then it would make 
>sense to pass the parameters by reference.

in C++

Complex operator+( Complex& lhs, Complex& rhs );

seems a little easy...

if one has instead
in for example a

class v{public: char*   m;  u32  size;};
[m points to an array of char of length size]

v    operator+( v& lhs, v& rhs )
{v    tmp_v;
 ....
 return tmp_v;
} 

as above
so operations would be in returning the result for +() operator
     compiler add code for malloc/new mem obtaining space for
           air_v of type class v
     air_v.size=tmp_v.size
     compiler add code for malloc/new mem obtaining r[0..size-1]
              add code for copy  m[0..size-1] in r[0..size-1]
     air_v.m=&r
call destructor for tmp_v

return a pointer or reference to air_v in eax

and the compiler adds code to run the destructor for air_v
at end for statement ";"

is it right?

>What if you want to use float, double and a bignum for your complex 
>type?  Make Complex a class template.
>
>Each time the requirements get more complex, the language as a tool to 
>help you implement them.
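To sketch an answer to the return-value question above (hedged: the exact mechanics are ABI-specific): a small struct is returned by value, typically in registers or in a result slot the caller provides; no malloc, new, or eax pointer gymnastics are involved, and a plain struct holding no resources needs no destructor work either. A corrected version of the quoted operator:

```cpp
#include <cassert>

struct Complex {
    double re, im;
};

// Returned by value: the compiler builds the result directly in storage
// the caller supplies (or copies it there) -- commonly two registers for
// a struct this small.  No heap allocation takes place.
Complex operator+(Complex lhs, Complex rhs)
{
    Complex tmp;
    tmp.re = lhs.re + rhs.re;
    tmp.im = lhs.im + rhs.im;   // the quoted code assigned tmp.i twice
    return tmp;
}
```

The arguments here are indeed copied (passed by value); for a heavyweight type you would pass `const Complex&` instead. Returning a reference to a local, as in the `v&` variant elsewhere in the thread, would be a dangling reference, not what by-value return does.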

0
Rosario193
10/16/2014 8:29:09 AM
i think i made some errors in the above post... so i rewrote it

On Thu, 16 Oct 2014 12:06:59 +1300, Ian Collins wrote:
>However overloaded arithmetic operators are usually associated with 
>objects (such as a complex number, a bignum, a matrix...).  Once the 
>object representation becomes non-trivial, it makes sense to use more 
>language features.  For example you could have a simple complex number 
>type thus:
>
>struct Complex
>{
>   double i,j;
>};
>
>Complex operator+( Complex lhs, Complex rhs )
>{
>   Complex tmp;
>   tmp.i = lhs.i + rhs.i;
>   tmp.j = lhs.j + rhs.j;
>   return tmp;
>}

how does this function return its value? 
for example, copy tmp onto the stack as 2 doubles and call the destructor
for tmp? 
[i think not...
it could reserve memory for 2 doubles using malloc or new and return
*thatpointer
 [but i'm not able to understand what *thatpointer means... 
        one sees that these should all be pointers] 
.... ]

the same for the lhs and rhs args... are they copied onto the stack?
[i think yes, and not as pointer values]

>Which is OK, but what would you use for the return type operator +=? 
>Simple, use a reference:
>
>struct Complex
>{
>   double i,j;
>
>   Complex& operator+=( Complex rhs )
>   {
>     i += rhs.i;
>     j += rhs.j;
>     return *this;
>   }
>};
>
>What if your type was a bignum rather than a double?  
>Then it would make 
>sense to pass the parameters by reference.

in C++

Complex operator+( Complex& lhs, Complex& rhs );

seems a little easy...

if one has instead
in for example a

class v{public: char*   m;  u32  size;};
[m points to an array of char of length size]

v&    operator+( v& lhs, v& rhs )
{v    tmp_v;
 ....
 return tmp_v;
} 

as above
so operations would be in returning the result for +() operator
     compiler add code for malloc/new mem obtaining space for
           air_v of type class v
     air_v.size=tmp_v.size
     compiler add code for malloc/new mem obtaining r[0..size-1]
              add code for copy  m[0..size-1] in r[0..size-1]
     air_v.m=&r
call destructor for tmp_v

return a pointer or reference to air_v in eax

and the compiler adds code to run the destructor for air_v
at end for statement ";"

is it right?

>What if you want to use float, double and a bignum for your complex 
>type?  Make Complex a class template.
>
>Each time the requirements get more complex, the language as a tool to 
>help you implement them.

0
Rosario193
10/16/2014 8:49:42 AM
On Wed, 15 Oct 2014 07:21:05 -0500, "Osmium" wrote:

>"Rosario193" wrote:
>
>> On Tue, 14 Oct 2014 21:34:15 -0500, Les Cargill wrote:
>>
>>>I once worked on a project where 500 classes were in play. Myself and
>>>another old 'C' hand estimated it would have been roughly 1/10th  the
>>>complexity done in raw 'C'.

it can be, if someone uses all the power of c++ [c++ exceptions,
algorithm libraries such as the stl without their c++ source, etc] and
doesn't keep count of leaks... 

but if someone writes easy-to-read constructors and destructors, and they
can be seen in the c++ code [and not only in the binary], and they are 90%
pure, easy-to-read C [or assembly], i don't see the problem...

>> from my beginner point of view:
>> i don't agree: i don't find it difficult to follow this subset of c++:
>> classes, constructors, destructors
>> operator and function definitions [overloading]
>> the c subset
>
>He did say 500 *classes*, not objects. I have difficulty even imagining such 
>a beast.
>That says, to me, there were 500 different *types* of variables.  Plus what 
>you started with in C.  ISTM there was no adult supervision at all during 
>that design.

it is not the number of classes i fear... i fear code that is too
difficult for me to understand... i have seen c++ code [and c code
too] that is very difficult to understand.
i'm lucky: i write the code i have to understand...

>The world is not full of simple problems where classes make sense, so the 
>books really strain credulity.  Look at the books full of dogs that bark, 
>monkeys that chatter.  Images where a rectangle is a variant of a square (or 
>is it the other way around?). These were the books I encountered while 
>learning in the 90's, have the books changed?  The student adopts the 
>attitude of the author or instructor.  Cobol has long variable names, that's 
>what my instructor thought so now *I* think that. Does it make sense or is 
>it true? Of course not!  But the thought is still there in the back of my 
>mind.

it is easy: if it runs ok, it would be ok
it is not too important who said what

>On the other hand the world is full of math problems (and thus examples) 
>that want to be solved in simple minded C.
>
>Properly used with constraint, OO is a great idea.  But OO is an idea that 
>has been, and is being, terribly oversold.  This notion of forecasting the 
>future so your 20 lines of super clever code can work its way into all 
>kinds of wonderful programs, not yet written, makes me want to puke. If I 
>were supervising programmers, the number of times you would see "protected" in 
>code written by my guys would be very close to zero. Write the damn 
>program, let the future take care of itself. 

for me, one wants classes for a pair of reasons:
1) less code to write or indent [or think about]
2) constructor and destructor are the easiest way to ensure every object
   is born and is ended

for "protected" i agree, if i understand right: 
in classes i used only the keyword "public", no word "private", no word
"protected"

the same for const: for me const doesn't exist, but i have to fight with
the c++ compiler so i put some const somewhere :)
0
Rosario193
10/16/2014 4:51:10 PM
On 16/10/14 00:54, Keith Thompson wrote:
> David Brown <david.brown@hesbynett.no> writes:
>> On 15/10/14 21:06, BartC wrote:
> [...]
>>> In fact, why does the C language still exist at all, if it is pretty
>>> much a subset of C++?
>>
>> As you can see, there are a lot of people who think C is the best
>> language around and can't quite fathom that you can use a C++ compiler
>> to compile most good quality C.  There are also those who insist on
>> writing C that can't be compiled as C++ - either for good reasons (such
>> as making use of designated initialisers), avoidable reasons (such as
>> using "friend" and "class" as variable names), or bad reasons (such as
>> writing poor quality C code that is rejected by fussier C++ compilers).
>
> And there are a lot of people (myself included) who happen to like C
> (while acknowledging its flaws), and who enjoy using it and/or are
> required to use it at work.
>

Yes, personal preference is a perfectly good reason (along with 
experience, familiarity with tools, etc.).

> If I want to use a language that includes most of C as a subset,
> it's not obvious that I should choose C++ over any of the other
> C-derived languages like Objective-C, or lcc-win32 C, or GNU C
> (yes, the latter two are more dialects than distinct languages).

I think it is obvious - Objective-C has an unbelievably ugly syntax and 
there is no clear way to guess what object code will be produced from 
given source code (if you want that kind of abstraction, use a /real/ 
high level language such as Python).  Objective-C is also highly 
Mac-oriented - great if Macs are your life, but not for anything else. 
lcc-win32 C is probably fine for people who use it, but is tightly 
limited to a single platform and a single compiler.  And gcc C is not 
very different from standard C - it has a few improvements over standard 
C, but only enough to be an "accent" rather than a "dialect".


>
> If I'm programming in C, I generally don't even think about whether
> my code will also compile as C++, because I'm not going to try to
> compile it as C++.  There's no need to avoid calling a variable
> "class" or "friend" if that happens to be the best name for it.
> If I need to use C code in a C++ project, there's always extern "C".
> Sure, there are cases where it makes sense to write code that will
> compile either as C or as C++.  In my experience, those cases are
> rarer than some people think they are, and I don't think I've ever
> had such a requirement myself.

The embedded C++ projects I have been involved in (which have not been 
that many - most of my work is in C) have mixed C with C++. 
Sometimes the customer also compiles the C code as C++, so compatibility 
was vital there - in other cases, I /could/ have used local variables 
called "friend" in the C code, but it would just cause confusion.

But I fully agree that there is no point in artificially imposing 
limitations when you know that they will not be relevant for the code 
you write.  And writing code that works as both C and C++ can 
look a bit odd in each language - to make C++ happy, you have to cast 
the return value of malloc(), and to make C happy you have to declare 
your parameterless functions as "foo(void)".
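A short sketch of what such dual-dialect code looks like (the function name is illustrative): both accommodations appear, and the result compiles cleanly under gcc and g++ alike.

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Compiles as both C and C++: the parameter list says (void) so that C
 * sees a prototype, and malloc's result is cast because C++ will not
 * convert void * implicitly. */
char *dup_greeting(void)
{
    const char *src = "hello";
    char *copy = (char *)malloc(strlen(src) + 1);
    if (copy)
        strcpy(copy, src);
    return copy;
}
```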

>
> I do avoid non-prototype function declarations and definitions --
> not because C++ doesn't permit them, but because it makes for better
> C code.
>

Indeed.  Most C "features" that are not valid in C++ are poor C practice 
too.


0
David
10/16/2014 7:22:01 PM
On 10/16/2014 03:22 PM, David Brown wrote:
....
> Indeed.  Most C "features" that are not valid in C++ are poor C practice 
> too.

I believe that even the latest version of C++ has not incorporated all
of the new features that were added in C99 and C2011 (though I'm not
sure which ones were left out - I haven't had time to spare to do a
detailed comparison), and I like most of those new features.

0
James
10/16/2014 7:31:43 PM
David Brown <david.brown@hesbynett.no> writes:
> On 16/10/14 00:54, Keith Thompson wrote:
[...]
>> If I'm programming in C, I generally don't even think about whether
>> my code will also compile as C++, because I'm not going to try to
>> compile it as C++.  There's no need to avoid calling a variable
>> "class" or "friend" if that happens to be the best name for it.
>> If I need to use C code in a C++ project, there's always extern "C".
>> Sure, there are cases where it makes sense to write code that will
>> compile either as C or as C++.  In my experience, those cases are
>> rarer than some people think they are, and I don't think I've ever
>> had such a requirement myself.
[...]
> The embedded C++ projects (which have not been that many - most of my
> work is in C) I have been involved in have mixed C with the
> C++. Sometimes the customer also compiles the C code as C++, so
> compatibility was vital there - in other cases, I /could/ have used
> local variables called "friend" in the C code, but it would just cause
> confusion.
>
> But I fully agree that there is no point in artificially imposing
> limitations when you know that they will not be relevant for the code
> you write.  And writing code that works equally as C and C++ code can
> look a bit odd in each language - to make C++ happy, you have to cast
> the return value of malloc(), and to make C happy you have to declare
> your parameterless functions as "foo(void)".

Beyond that, if I deliberately write my C code so it won't compile as
C++, then I don't have to think about C++ compatibility, particularly
about any code I might *accidentally* write that is valid C and
valid C++ but with different behavior.  (Such cases are not common,
but it's nice to have one fewer thing to worry about.)

A clumsy way to do that is to declare something like "int friend;".
A less clumsy way, which I've actually used, is:

#ifdef __cplusplus
#error "Please use a C compiler not a C++ compiler"
#endif
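The classic instance of the hazard Keith describes - code valid in both languages but with different behavior - is the type of a character constant:

```c
#include <assert.h>

/* In C a character constant has type int, so this returns sizeof(int)
 * (typically 4); compiled as C++ it would return 1, because C++ gives
 * 'a' type char.  Same source, both languages accept it, different
 * answers. */
int char_const_size(void)
{
    return (int)sizeof 'a';
}
```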

-- 
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Working, but not speaking, for JetHead Development, Inc.
"We must do something.  This is something.  Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
0
Keith
10/16/2014 7:57:49 PM
On Wednesday, October 15, 2014 2:07:24 PM UTC-5, Bart wrote:
[snip] 
 
> In fact, why does the C language still exist at all, if it is pretty much
> a subset of C++?

It's not a proper subset (not all legal C programs are also legal C++
programs with identical semantics), and with each standard revision, that
subset gets smaller.  There are C features that C++ compilers are never
going to support, such as VLAs.  
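For instance, a variably modified parameter type like the one below (an illustrative function, not from the post) is C99/C11 and has never been adopted by any C++ standard:

```c
#include <assert.h>

/* A variable-length array type in a parameter list: valid C99/C11
 * (VLAs became optional in C11), rejected by every C++ compiler in
 * standard mode. */
double row_sum(int n, const double a[n])
{
    double total = 0.0;
    for (int i = 0; i < n; i++)
        total += a[i];
    return total;
}
```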

C++ is a much bigger and more complicated language than C, hence C++
compilers are bigger, more complicated, and harder to validate than C
compilers.  There are still some areas of the C++ language where the 
rules are hard to understand, and two compiler vendors may choose to 
interpret them differently.  

So, in some cases, C is the more appropriate choice.  
0
John
10/16/2014 10:10:00 PM
On 2014-10-16, Keith Thompson <kst-u@mib.org> wrote:
> David Brown <david.brown@hesbynett.no> writes:
>> On 16/10/14 00:54, Keith Thompson wrote:
> [...]
>>> If I'm programming in C, I generally don't even think about whether
>>> my code will also compile as C++, because I'm not going to try to
>>> compile it as C++.  There's no need to avoid calling a variable
>>> "class" or "friend" if that happens to be the best name for it.
>>> If I need to use C code in a C++ project, there's always extern "C".
>>> Sure, there are cases where it makes sense to write code that will
>>> compile either as C or as C++.  In my experience, those cases are
>>> rarer than some people think they are, and I don't think I've ever
>>> had such a requirement myself.
> [...]
>> The embedded C++ projects (which have not been that many - most of my
>> work is in C) I have been involved in have mixed C with the
>> C++. Sometimes the customer also compiles the C code as C++, so
>> compatibility was vital there - in other cases, I /could/ have used
>> local variables called "friend" in the C code, but it would just cause
>> confusion.
>>
>> But I fully agree that there is no point in artificially imposing
>> limitations when you know that they will not be relevant for the code
>> you write.  And writing code that works equally as C and C++ code can
>> look a bit odd in each language - to make C++ happy, you have to cast
>> the return value of malloc(), and to make C happy you have to declare
>> your parameterless functions as "foo(void)".
>
> Beyond that, if I deliberately write my C code so it won't compile as
> C++ ...

Then you're programming just to spite what some people said in some
Usenet newsgroup, which is a bad mindset to be in.

> then I don't have to think about C++ compatibility, particularly
> about any code I might *accidentally* write that is valid C and
> valid C++ but with different behavior.

*Deliberately* making code nonportable to C++ is the same as deliberately
writing nonportable code for any other target, which provides for
easy analogies to check the soundness of your reasoning.

"If I deliberately write my code so it doesn't port to MIPS or ARM,
then I don't have to think about portability, particularly about any
code that I might accidentally write that is valid for my x86
PC, MIPS, ARM but with different behavior."

In my experience, maintaining a C++ port of some C code is quite easy;
easier than maintaining an operating system or architecture port,
though that doesn't have to be difficult either.

Even producing an initial port to C++ isn't very hard.

  "It is not uncommon to be able to convert tens of thousands of lines of ANSI
  C to C-style C++ in a few hours. " -- Bjarne Stroustrup, B.S.'s FAQ.

Very little, if any, code that compiles cleanly as C and C++ with
decent diagnostic levels turned on, will have different behavior.

If code shows behavior changes when ported from a C compiler to a C++
compiler, then it cannot be trusted not to show behavior changes
when ported from one C compiler to another.

If testing shows that you have behavior changes between a C and C++ build,
that is your lucky day: you have found a bug that you would not have
seen if you had only used one of those compilers.

That bug is certainly not a *result* of writing in Clean C. Just because
you don't run your code through a C++ compiler (and have put in juvenile
hacks to prevent that from happening) doesn't mean that the remaining
parts which could compile as C++ do not have different behavior in C++.
The fact remains what it is.

This is related to how the world does not cease to exist when you close your
eyes.

> (Such cases are not common,
> but it's nice to have one fewer thing to worry about.)

Speaking of having a lot less to worry about ...

> A clumsy way to do that is to declare something like "int friend;".
> A less clumsy way, which I've actually used, is:
>
> #ifdef __cplusplus
> #error "Please use a C compiler not a C++ compiler"
> #endif

And of course:

 #if CHAR_BIT != 8
 #error "Please use a compiler for a Vax-like machine"
 #endif

 assert (sizeof(int) == 4);

If you want a *lot* fewer things to worry about, you can just toss ISO C out
the window and write according to the compiler and architecture manual for your
favorite platform.
0
Kaz
10/16/2014 10:22:49 PM
Kaz Kylheku <kaz@kylheku.com> wrote:
> On 2014-10-16, Keith Thompson <kst-u@mib.org> wrote:
>> David Brown <david.brown@hesbynett.no> writes:
>>> On 16/10/14 00:54, Keith Thompson wrote:
>> [...]
>>>> If I'm programming in C, I generally don't even think about whether
>>>> my code will also compile as C++, because I'm not going to try to
>>>> compile it as C++.  There's no need to avoid calling a variable
>>>> "class" or "friend" if that happens to be the best name for it.
>>>> If I need to use C code in a C++ project, there's always extern "C".
>>>> Sure, there are cases where it makes sense to write code that will
>>>> compile either as C or as C++.  In my experience, those cases are
>>>> rarer than some people think they are, and I don't think I've ever
>>>> had such a requirement myself.
>> [...]
>>> The embedded C++ projects (which have not been that many - most of my
>>> work is in C) I have been involved in have mixed C with the
>>> C++. Sometimes the customer also compiles the C code as C++, so
>>> compatibility was vital there - in other cases, I /could/ have used
>>> local variables called "friend" in the C code, but it would just cause
>>> confusion.
>>>
>>> But I fully agree that there is no point in artificially imposing
>>> limitations when you know that they will not be relevant for the code
>>> you write.  And writing code that works equally as C and C++ code can
>>> look a bit odd in each language - to make C++ happy, you have to cast
>>> the return value of malloc(), and to make C happy you have to declare
>>> your parameterless functions as "foo(void)".
>>
>> Beyond that, if I deliberately write my C code so it won't compile as
>> C++ ...
> 
> Then you're programming just to spite what some people said in some
> Usenet newsgroup, which is a bad mindset to be in.
> 
>> then I don't have to think about C++ compatibility, particularly
>> about any code I might *accidentally* write that is valid C and
>> valid C++ but with different behavior.
> 
> *Deliberately* making code nonportable to C++ is the same as deliberately
> writing nonportable code for any other target, which provides for
> easy analogies to check the soundness of your reasoning.

He didn't say he deliberately made it non-portable just to be non-portable.

What's he's saying is that he's not deliberately making it portable (quite a
big difference). There are dangerous, subtle differences that he's not
interested in policing. For example, g++ compiles compound literals into
C++ temporaries; the former have block-scoped lifetimes while the latter are
expression-scoped. (g++ originally compiled compound literals using C
semantics, but in 4.8 they changed course. That was a crappy day for me.)

He said he deliberately inserts non-portable constructs as a safeguard so
people don't accidentally shoot themselves by assuming he deliberately kept
his code base portable.
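The compound-literal difference is easy to demonstrate (the function is illustrative): in C the literal below is an lvalue whose lifetime is the enclosing block, so the pointer stays valid for the return expression; under g++'s post-4.8 temporary semantics the same source has a narrower lifetime.

```c
#include <assert.h>

struct point { int x, y; };

/* In C the compound literal has the lifetime of the enclosing block,
 * so dereferencing p on the next line is well defined.  Treated as a
 * C++ temporary (g++ >= 4.8), its lifetime ends at the full expression
 * instead -- the subtle divergence described above. */
int sum_of_literal(void)
{
    struct point *p = &(struct point){ 3, 4 };
    return p->x + p->y;
}
```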

> "If I deliberately wite my code so it doesn't port to MIPS or ARM,
> then I don't have to think about portability, particularly about any
> code that I might accidentally write that is valid for my my x86
> PC, MIPS, ARM but with different behavior."

That's a very poor analogy.

> In my experience, maintaining a C++ port of some C code is quite easy;
> easier than maintaining an operating system or architecture port,
> though that doesn't have to be difficult either.

Easy or not, some people aren't interested in maintaning C++ compatability
in their C code.

I'm not interested in forgoing designated initializers or compound literals.
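For reference, the feature in question (struct and field names are illustrative): designated initializers are C99, with only a restricted form arriving in C++20, so code that uses them freely stays C-only.

```c
#include <assert.h>
#include <string.h>

struct config { int port; const char *host; int retries; };

/* Designated initializers name the fields they set; everything else
 * (retries here) is zero-initialized.  C99 and later; not valid in
 * C++ until the narrower C++20 form. */
struct config default_config(void)
{
    struct config c = { .port = 8080, .host = "localhost" };
    return c;
}
```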

I'm not interested in casting void pointers--casting in general leads to
obfuscated code.

"C/C++" is the worst of both worlds.

> Even producing an initial port to C++ isn't very hard.

Rubbing my belly and tapping my head isn't particularly difficult in itself,
either. But I typically find such exercises pointless and not worth the
opportunity costs.

>  "It is not uncommon to be able to convert tens of thousands of lines of
>  ANSI C to C-style C++ in a few hours. " -- Bjarne Stroustrup, B.S.'s FAQ.

1) It's implied that the cost in bugs is an acceptable price.

2) Porting C to C++ is not the same task as maintaining a code base which
compiles as both C and C++.

3) This is 2014. Many developers no longer use ANSI C. Even Visual Studio
supports most of C99 now.

> Very little, if any, code that compiles cleanly as C and C++ with
> decent diagnostic levels turned on, will have different behavior.

In most parts of the country I can cross an interstate highway with my eyes
closed with a high probability of surviving. But why? I can do it with my
eyes open, too, but it's still a risk. And I need a reason to take the risk.

> If code shows behavior changes when ported from a C compiler to a C++
> compiler, then it cannot be trusted not to show behavior changes
> when ported from one C compiler to another.

That's ridiculous.
 
> If testing shows that you have behavior changes between a C and C++ build,
> that is your lucky day: you have found a bug that you would not have
> seen if you had only used one of those compilers.

Again with the testing. You and Ian Collins are the only engineers on the
face of the earth who unit test every expression on every line of every
source code unit in all their projects.

Sadly, C++ operator overloading deflates the sarcasm in that remark.

> That bug is certainly not a *result* of writing in Clean C. Just because
> you don't run your code through a C++ compiler (and have put in juvenile
> hacks to prevent that from happening) doesn't mean that the remaining
> parts which could compile as C++ do not have different behavior in C++.
> The fact remains what it is.

Without argument or substantial evidence (I'll grant you that data _is_ the
plural of anecdote), you've claimed that it's trivial to maintain
compatibility. Furthermore, you've failed to establish why anybody should
bother implementing their project in Clean C--which you clearly understand
is neither C nor C++.

I'd be happy to make the case that C header files should be kept compatible
with C++. But as a general matter I see no purpose in implementing projects
in Clean C. I prefer C over C++ because of C's simplicity. But Clean C
doesn't result in the kind of simplicity that matters.

Are there systems with C++ compilers but no C compiler? Or at the very
least, where the C++ compiler is modern and well maintained but the C
compiler has been left to rot? If you can show that, that would be
_something_.

0
william
10/16/2014 11:11:34 PM
On Tue, 14 Oct 2014 07:15:33 -0700, Rick C. Hodgin wrote:

> On Tuesday, October 14, 2014 9:38:05 AM UTC-4, Nobody wrote:
>> On Sun, 12 Oct 2014 15:33:23 -0700, Rick C. Hodgin wrote:
>> > Sxyz* myvar = xyz_new(whatever);  //ctor . .
>> > .
>> > xyz_delete(&myvar);  //dtor
>> 
>> Except that now you have to ensure that xyz_delete is called regardless
>> of how control escapes the scope: reaching the end of the scope, return,
>> break, continue, goto, exception, signal, longjmp(), ...
> 
> You have to do that in C++ with any explicitly created objects using new.

Objects created using new are normally supposed to "leak" from the scope
in which they're created.

If you want "auto" semantics for such objects, there's std::auto_ptr,
std::unique_ptr or boost::scoped_ptr.

> I have no issues with constructors or destructors.  My issues relate to
> using local variable class definitions which are not created explicitly
> with new, or in my suggestion lnew.  I do not like the fact that these
> things are taking place "invisibly".

You still have the C mindset. If you use C++ enough, you eventually take
it for granted that every value is constructed and destroyed by some code.
Types with default (compiler-generated) constructors and destructors are
the special case.

Even C generates code for initialisation, assignment, type conversions,
parameter passing and the like. C++ just allows the code to be
user-defined, whereas in C it's all hard-coded into the compiler.

0
Nobody
10/16/2014 11:34:11 PM
On Tue, 14 Oct 2014 07:51:48 -0700, Malcolm McLean wrote:

>> This is no different to C code which provides a header file containing a
>> struct declaration plus declarations of the functions which use it, and
>> a source file containing the definitions of those functions.
>> 
>> The only difference is on which side of the closing brace the function
>> declarations live.
>>
> That makes all the difference.
> 
> In C, the functions have a link to the structures they operate on, but the
> structures don't know anything about the functions that operate on them.
> So we can easily delete a function, without affecting anything else. We
> can add a function in static file scope, without affecting anything else.
> And we can write a function that operates on two structures of different
> types, without creating any dependencies between the two structures.

You can do all that in C++ too. Just because a function could be a method
(member function), it doesn't mean that it must be or even should be. The
only functions which have to be methods are those which refer to private
members and virtual methods. But that's the whole point: you can obtain an
exhaustive list of the functions which manipulate private members directly
without having to search the entire code base.

Adding or removing methods doesn't change the layout of the structure
(beyond the fact that whether the number of virtual methods is non-zero
has an impact).

Making such a change to the header will require recompilation of affected
code, but that's equally true of C code which places function declarations
in the same header as the structure definition.

Aside: the other significant distinction between methods and non-member
functions, namely the call syntax (obj.method() vs function(obj)) may be
removed in the future:

	https://isocpp.org/files/papers/N4165.pdf

Short version: if code uses obj.function() and no such method exists,
check whether function(obj) is valid (and if so, use it) before reporting
an error.

0
Nobody
10/16/2014 11:56:33 PM
"Nobody" <nobody@nowhere.invalid> wrote in message 
news:pan.2014.10.16.23.34.10.153000@nowhere.invalid...
> On Tue, 14 Oct 2014 07:15:33 -0700, Rick C. Hodgin wrote:
>
>> On Tuesday, October 14, 2014 9:38:05 AM UTC-4, Nobody wrote:
>>> On Sun, 12 Oct 2014 15:33:23 -0700, Rick C. Hodgin wrote:
>>> > Sxyz* myvar = xyz_new(whatever);  //ctor . .
>>> > .
>>> > xyz_delete(&myvar);  //dtor
>>>
>>> Except that now you have to ensure that xyz_delete is called regardless
>>> of how control escapes the scope: reaching the end of the scope, return,
>>> break, continue, goto, exception, signal, longjmp(), ...
>>
>> You have to do that in C++ with any explicitly created objects using new.
>
> Objects created using new are normally supposed to "leak" from the scope
> in which they're created.
>
> If you want "auto" semantics for such objects, there's std::auto_ptr,
> std::unique_ptr or boost::scoped_ptr.
>
>> I have no issues with constructors or destructors.  My issues relate to
>> using local variable class definitions which are not created explicitly
>> with new, or in my suggestion lnew.  I do not like the fact that these
>> things are taking place "invisibly".
>
> You still have the C mindset.

You obviously have a C++ mindset if you think introducing entities such as
'std::auto_ptr', 'std::unique_ptr' and 'boost::scoped_ptr' is going to
make life easier for anyone. No doubt there are hundreds more of those,
designed to solve problems that only arise because of the language.

> If you use C++ enough, you eventually take
> it for granted that every value is constructed and destroyed by some code.
> Types with default (compiler-generated) constructors and destructors are
> the special case.
>
> Even C generates code for initialisation, assignment, type conversions,
> parameter passing and the like. C++ just allows the code to be
> user-defined, whereas in C it's all hard-coded into the compiler.

That's where a lot of this belongs. You don't want to be doing the 
compiler's job for it.

Some things ought to be done behind-the-scenes by the language, and some by 
the actual program. But just because it is technically possible in C++, it 
seems that as much as possible is off-loaded to the language, in the form of 
unwieldy sets of classes and methods. To the extent that little is left to 
be explicitly written in the source code. (And it's not so much behind the 
scenes either; enough is visible to clutter the program, without it being 
clear what is actually happening. The interactions are complex.)

-- 
Bartc
 

0
BartC
10/17/2014 12:07:09 AM
"Nobody" <nobody@nowhere.invalid> wrote in message
news:pan.2014.10.16.23.56.31.949000@nowhere.invalid...


> Aside: the other significant distinction between methods and non-member
> functions, namely the call syntax (obj.method() vs function(obj)) may be
> removed in the future:
>
> https://isocpp.org/files/papers/N4165.pdf
>
> Short version: if code uses obj.function() and no such method exists,
> check whether function(obj) is valid (and if so, use it) before reporting
> an error.

(Interesting, I introduced such syntax to one of my projects last year.

So that f(x, y) can also be written x.f(y). A bit of a laugh really, 20
minutes spent so that the syntax resembles some OO code, although that
project had few OO features (f() here is a single global function, there
isn't a specific method f() to match the type of x.)

Even so, this does change the emphasis so that f() appears to be applied to
x, rather than x being a mere argument to some function f. Maybe C could do
with it too, except such changes happen so slowly that it could be decades
away.)

-- 
Bartc

 

0
BartC
10/17/2014 12:27:42 AM
On Thursday, October 16, 2014 7:33:57 PM UTC-4, Nobody wrote:
> On Tue, 14 Oct 2014 07:15:33 -0700, Rick C. Hodgin wrote:
> > I have no issues with constructors or destructors.  My issues relate
> > to using local variable class definitions which are not created
> > explicitly with new, or in my suggestion lnew.  I do not like the
> > fact that these things are taking place "invisibly".
> 
> You still have the C mindset. If you use C++ enough, you eventually take
> it for granted that every value is constructed and destroyed by some code.
> Types with default (compiler-generated) constructors and destructors are
> the special case.

I have developed in C++.  It's not a matter of not recognizing or
appreciating what it does for me ... it's a matter of I think it is
wrong, an incorrect manner to process data, to have the compiler
doing things for you which are not explicitly conveyed to it, unless
it is part of an optional protocol where you still are in the driver's
seat, and if you then choose not to exercise your prerogative to act,
then the compiler will fill in for you.

> Even C generates code for initialisation, assignment, type
> conversions, parameter passing and the like. C++ just allows the
> code to be user-defined, whereas in C it's all hard-coded into
> the compiler.

Initialization is required.  Assignment is required.  Type conversions
are required.  Parameter passing is required.  Those are all things
that are required to process data.  It is not a required thing to have
the compiler prepare a variable for you upon entry into a function,
whether it ultimately ends up being used or not based on flow through
the function, to then have it destroy it for you upon exit.

I would rather always see the explicit creation, and the explicit
destruction, at the C/C++ level.  When I get into higher level languages
like an XBASE language, then I want it to do things for me.

My mindset of C/C++ is a higher level version of assembly, but still
a lower-level language than most others.

Best regards,
Rick C. Hodgin
0
Rick
10/17/2014 12:35:50 AM
<william@wilbur.25thandClement.com> writes:
[...]
> Are there systems with C++ compilers but no C compiler? Or at the very
> least, where the C++ compiler is modern and well maintained but the C
> compiler has been left to rot? If you can show that, that would be
> _something_.

That nearly describes MS Windows.  There are non-Microsoft compilers,
but Microsoft's own C compiler doesn't support any C standard past
1990, while their C++ compiler is (I think) reasonably up to date.

-- 
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Working, but not speaking, for JetHead Development, Inc.
"We must do something.  This is something.  Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
0
Keith
10/17/2014 1:23:42 AM
Keith Thompson <kst-u@mib.org> wrote:
> <william@wilbur.25thandClement.com> writes:
> [...]
>> Are there systems with C++ compilers but no C compiler? Or at the very
>> least, where the C++ compiler is modern and well maintained but the C
>> compiler has been left to rot? If you can show that, that would be
>> _something_.
> 
> That nearly describes MS Windows.  There are non-Microsoft compilers,
> but Microsoft's own C compiler doesn't support any C standard past
> 1990, while their C++ compiler is (I think) reasonably up to date.
> 

Visual Studio now supports designated initializers, compound literals,
snprintf, _Bool, and a tremendous amount of other C99 features.

Specifics are hard to find, but see

* http://msdn.microsoft.com/en-us/library/hh409293.aspx 
* http://blogs.msdn.com/b/vcblog/archive/2013/07/19/c99-library-support-in-visual-studio-2013.aspx
* http://blogs.msdn.com/b/vcblog/archive/2014/06/03/visual-studio-14-ctp.aspx

They're not officially pursuing C99 or C11 conformance. But neither have
they left their C compiler to wither away. Abandoning their C compiler was
their stated intention, but reality intervened. C just isn't going away, so
they had to adjust.

I think they may support _Complex. I'm unsure if their <complex.h>
implementation is just a wrapper around C++. In any event, given that C11
made so many aspects of the language optional, Visual Studio is
theoretically within spitting distance of C11 conformance.

0
william
10/17/2014 1:53:03 AM
> Objects created using new are normally supposed to "leak" from the scope
> in which they're created.
> 
> If you want "auto" semantics for such objects, there's std::auto_ptr,
> std::unique_ptr or boost::scoped_ptr.

You make something so easy so hard.

Compare to Javascript, which doesn't have so many different kinds of pointers.
Actually it doesn't have any pointers. Its prototype (aka delegate) syntax is a
bit obscure, but it makes do without a dozen different types of prototypes.

> You still have the C mindset. If you use C++ enough, you eventually take

And you have a C++ mindset. Objective-C and Javascript are object-using
languages with much simpler syntax and semantics. The difference is Objective-C
and Javascript are willing to be 'inefficient' to be comprehensible.

Apple tried to make Objective-C 'efficient' and ended up with the bug generator 
known as reference counts.

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/17/2014 2:12:37 AM
On Tuesday, October 14, 2014 10:38:47 PM UTC+1, Ian Collins wrote:
> Malcolm McLean wrote:
> 
> > Another possibility is that a and b are strings, of a is a file
> > and b a record, or some other defensible but confusing misuse of
> > the addition operator. Now it gets very difficult. It's hard to
> > find the function. Worse, someone might have defined the +
> > operation as concatenation in one place, and insertion in
> > another.
> 
> Then they are a fool who would be equally likely to write messed up C.
> 
Someone could very reasonably write

streamobject & streamobject::operator << (numericalobject &x)
{
  this->append( x.tostring() );
  return *this;
} 

template <integral> numericalobject & numericalobject::operator << integral i
{
   while(i != 0)
   {
      *this *= 2;
      i = i - 1;
   }
   return *this;
}

In client code

numericalobject a, b;
streamobject str;

str << a << b;

Now what did the programmer intend, and what will happen?
 
0
Malcolm
10/17/2014 6:57:59 AM
Malcolm McLean wrote:
> On Tuesday, October 14, 2014 10:38:47 PM UTC+1, Ian Collins wrote:
>> Malcolm McLean wrote:
>>
>>> Another possibility is that a and b are strings, of a is a file
>>> and b a record, or some other defensible but confusing misuse of
>>> the addition operator. Now it gets very difficult. It's hard to
>>> find the function. Worse, someone might have defined the +
>>> operation as concatenation in one place, and insertion in
>>> another.
>>
>> Then they are a fool who would be equally likely to write messed up C.
>>
> Someone could very reasonably write
>
> streamobject & streamobject::operator << (numericalobject &x)
> {
>    this->append( x.tostring() );
>    return *this;
> }
>
> template <integral> numericalobject & numericalobject::operator << integral i
> {
>     while(i != 0)
>     {
>        *this *= 2;
>        i = i - 1;
>     }
>     return *this;
> }
>
> In client code
>
> numericalobject a, b;
> streamobject str;
>
> str << a << b;
>
> Now what did the programmer intend, and what will happen?

I've no idea, but I'm sure the code wouldn't get past a review (or a 
compiler).

-- 
Ian Collins
0
Ian
10/17/2014 7:11:30 AM
On Friday, October 17, 2014 8:11:41 AM UTC+1, Ian Collins wrote:
> Malcolm McLean wrote:
>> On Tuesday, October 14, 2014 10:38:47 PM UTC+1, Ian Collins wrote:
>>> Malcolm McLean wrote:
>>>> Another possibility is that a and b are strings, of a is a file
>>>> and b a record, or some other defensible but confusing misuse of
>>>> the addition operator. Now it gets very difficult. It's hard to
>>>> find the function. Worse, someone might have defined the +
>>>> operation as concatenation in one place, and insertion in
>>>> another.
>>> Then they are a fool who would be equally likely to write messed up C.
>> Someone could very reasonably write
>>
>> streamobject & streamobject::operator << (numericalobject &x)
>> {
>>    this->append( x.tostring() );
>>    return *this;
>> }
>>
>> template <integral> numericalobject & numericalobject::operator << integral i
>> {
>>     while(i != 0)
>>     {
>>        *this *= 2;
>>        i = i - 1;
>>     }
>>     return *this;
>> }
>>
>> In client code
>>
>> numericalobject a, b;
>> streamobject str;
>>
>> str << a << b;
>>
>> Now what did the programmer intend, and what will happen?
>
> I've no idea, but I'm sure the code wouldn't get past a review (or a
> compiler).
>
On what grounds would you fail it?

0
Malcolm
10/17/2014 7:22:12 AM
On 16/10/14 21:31, James Kuyper wrote:
> On 10/16/2014 03:22 PM, David Brown wrote:
> ...
>> Indeed.  Most C "features" that are not valid in C++ are poor C practice 
>> too.
> 
> I believe that even the latest version of C++ has not incorporated all
> of the new features that were added in C99 and C2011 (though I'm not
> sure which ones were left out - I haven't had time to spare to do a
> detailed comparison), and I like most of those new features.
> 

It is true that there are features in C99 and C11 that are not in C++ of
any version, but there are not many that I think are particularly useful
- perhaps designated initialisers are the example most used of something
C++ is missing.

There are also some features that the languages share, but can be subtly
different.  A "const" at file scope in C++ has, by default, internal
linkage - but in C it has external linkage.  It is easy to make code that
works in either language, by adding "static" or "extern" explicitly, at the
cost of being slightly non-idiomatic in either language.

And there are points that make you feel the C++ and the C committees
should have their heads banged together until they agree to cooperate -
C++ has "static_assert" while C has "_Static_assert" to do the same
thing.  (Yes, I /know/ there are good reasons for this.)  And each
committee has come up with an implementation of atomics and threading
that is, I believe, different and incompatible - and about 15 years too
late to be really useful.


There were a lot of C99 features that are very useful, but the only C11
feature that I find important is static assertions (and anonymous
structs and unions, but gcc has supported them for years).  Is there
anything else in C11 that you use?


0
David
10/17/2014 8:00:34 AM
On 17/10/14 00:10, John Bode wrote:
> On Wednesday, October 15, 2014 2:07:24 PM UTC-5, Bart wrote:
> [snip] 
>  
>> In fact, why does the C language still exist at all, if it is pretty much
>> a subset of C++?
> 
> It's not a proper subset (not all legal C programs are also legal C++
> programs with identical semantics), and with each standard revision, that
> subset gets smaller.  

The subset (or rather, intersection between C and C++) does not get
smaller - it has been getting larger with each revision.  But the areas
outside that subset get bigger faster, especially for C++ (C has seen
only minor changes in C11).

> There are C features that C++ compilers are never
> going to support, such as VLAs.  

VLAs are a planned feature for C++17, and are already supported as an
extension by g++.

> 
> C++ is a much bigger and more complicated language than C, hence C++
> compilers are bigger, more complicated, and harder to validate than C
> compilers.  There are still some areas of the C++ language where the 
> rules are hard to understand, and two compiler vendors may choose to 
> interpret them differently.  
> 
> So, in some cases, C is the more appropriate choice.  
> 

I agree, even though I think it is seldom an issue to write your C code
as C++ compatible.

0
David
10/17/2014 8:13:18 AM
On 2014-10-15 22:20, jacob navia wrote:
> There are no stores to buy more brain space however. People have just a
> limited amount of patience to swallow more and more C++ stuff. It is a
> language that takes years and years to get accustomed to, and it will
> never end. Nobody masters ALL of the language anymore, not even
> Stroustrup.

This got me thinking of a passage from the ACM talk given by Edsger 
Dijkstra:

http://www.youtube.com/watch?v=yKiVz71AVKg#t=7m30s

Although it's about PL/1 I think it's equally applicable to C++.


-- August
0
August
10/17/2014 8:13:27 AM
On 17/10/14 09:22, Malcolm McLean wrote:
> On Friday, October 17, 2014 8:11:41 AM UTC+1, Ian Collins wrote:
>> Malcolm McLean wrote:
<snip>
>>> Now what did the programmer intend, and what will happen?
>>
>>
>> I've no idea, but I'm sure the code wouldn't get past a review (or a 
>> compiler).
>>
> On what grounds would you fail it?
> 

On the grounds that he has no idea what the programmer intended, or what
will happen.  That is the most important reason for failing code in a
review.

0
David
10/17/2014 8:15:41 AM
David Brown wrote:
> On 17/10/14 09:22, Malcolm McLean wrote:
>> On Friday, October 17, 2014 8:11:41 AM UTC+1, Ian Collins wrote:
>>> Malcolm McLean wrote:
> <snip>
>>>> Now what did the programmer intend, and what will happen?
>>>
>>>
>>> I've no idea, but I'm sure the code wouldn't get past a review (or a
>>> compiler).
>>>
>> On what grounds would you fail it?
>>
>
> On the grounds that he has no idea what the programmer intended, or what
> will happen.  That is the most important reason for failing code in a
> review.

You just pipped me to it!

-- 
Ian Collins
0
Ian
10/17/2014 8:17:23 AM
Keith Thompson wrote:

> <william@wilbur.25thandClement.com> writes:
> [...]
>> Are there systems with C++ compilers but no C compiler? Or at the very
>> least, where the C++ compiler is modern and well maintained but the C
>> compiler has been left to rot? If you can show that, that would be
>> _something_.
> 
> That nearly describes MS Windows.

I don't see why. Windows doesn't ship with either a C compiler or a C++ 
compiler, so it doesn't qualify on that count. If we're allowed to add 
compilers after the fact, there are plenty of C compilers, as well as C++ 
compilers, available for Windows.

> There are non-Microsoft compilers,

There most certainly are.

> but Microsoft's own C compiler doesn't support any C standard past
> 1990, while their C++ compiler is (I think) reasonably up to date.

Why are you inferring a requirement for a connection between the OS vendor 
and the compiler vendor?
 
-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/17/2014 9:19:44 AM
In article <m1qj34$de9$1@dont-email.me>,
 August Karlstrom <fusionfile@gmail.com> wrote:

> On 2014-10-15 22:20, jacob navia wrote:
> > There are no stores to buy more brain space however. People have just a
> > limited amount of patience to swallow more and more C++ stuff. It is a
> > language that takes years and years to get accustomed to, and it will
> > never end. Nobody masters ALL of the language anymore, not even
> > Stroustrup.
> 
> This got me thinking of a passage from the ACM talk given by Edsger 
> Dijkstra:
> 
> http://www.youtube.com/watch?v=yKiVz71AVKg#t=7m30s
> 
> Although it's about PL/1 I think it's equally applicable to C++.
> 
> 
> -- August

http://worrydream.com/refs/Hoare%20-%20The%20Emperors%20Old%20Clothes.pdf

CAR Hoare's 'The Emperor's Old Clothes' including comments on Ada.

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/17/2014 10:07:20 AM
"Ian Collins" <ian-news@hotmail.com> wrote in message
news:ca8h4qFerbqU20@mid.individual.net...
> BartC wrote:

>> Those aren't very interesting. And for that, you have to suffer much more
>> pedantic type-strictness which can waste a lot of time to sort out.
>
> I'd change that to "much more pedantic type-strictness which can save you
> form nasty, arse biting problems later on".
>
> My experiences match Kaz's.  I was drawn to C++ in the early 90s when I
> had to find no end of obscure bugs in some badly written C (and a not very
> helpful C compiler).  Coercing the code to compile as C++ removed the bulk
> of the bugs, most of which were down to misuse of enumerations and
> mismatched parameters.

I've been using C++ to compile a couple of my modules to see what would
happen.

The main thing (apart from having to convert many char* parameters to const
char*) is that it doesn't do implicit coercions from void* to T*.

So code such as:

 T *p;

 p = f();
 ....
 p = g();

now has to be written:

 T *p;

 p = (T*)f();
 ....
 p = (T*)g();

Apart from the extra clutter (and having to search the code to find out
exactly what type of pointer p is), the type T now is distributed throughout
the code. Changing T to U will take a lot of work! (Especially if p is
actually declared in a different file maintained by someone else, or for
some reason you cannot easily find or duplicate the type).

This can't be considered good, so maybe there is a 'type_of' kind of
operator? Apparently so, although it doesn't work for everything. Now the
code turns into this:

 T *p;

 p = (typeof(p))f();
 ....
 p = (typeof(p))g();

Looks good ... not! And there is still the repetition of p (simple in this
case, can be arbitrarily complex in others).

Considering how many things happen implicitly in C++, you'd think that
allowing void* to T* conversion would be one of them. It's allowed in C
after all, and is well-defined; what could go wrong?

Or, maybe there is another construct, some 'auto' cast that has been
hastily bolted on to solve this problem? There is one in C, and it looks
like this:



(That is, you don't need to type anything at all! But actually, I created 
such a feature for my own language, and, in C-style, would be written like 
this:

   x = (cast)y;

which, if x is of type T, is equivalent to: x = (T)y. Which is extremely
handy, solving some of those problems above /if/ a cast is actually needed.
Of course, the conversion also has to be allowed.)

(This is one tiny thing I've picked on, when testing C++ on working code
that has already been honed to work with a multitude of C compilers with
little complaint.

What will happen with new code, where C++, ready to accept all sorts of
special object-oriented syntax that I know nothing about, bamboozles me
with error messages that will mean nothing to me? I've already had messages
about errors in some 'lambda' function I must have been
inadvertently attempting!)


-- 
Bartc
 

0
BartC
10/17/2014 11:21:45 AM
David Brown <david.brown@hesbynett.no> writes:
<snip>
> There are also some features that the languages share, but can be subtly
> different.  A "const" in C++ has, by default, static linkage - but in C
> it has "extern" linkage.  It is easy to make code that works in either
> language, by adding "static" or "extern" explicitly, at the cost of
> being slightly non-idiomatic in either language.

You may have to do a bit more than that because a C++ const can often be
used in a constant expression so

  const int size = 42;
  int array[size];

is valid C++ at file scope.  You can't make it valid C no matter what
storage class specifiers you add to the definition of size!

<snip>
-- 
Ben.
0
Ben
10/17/2014 2:06:37 PM
On 2014-10-17, BartC <bc@freeuk.com> wrote:
> "Ian Collins" <ian-news@hotmail.com> wrote in message
> news:ca8h4qFerbqU20@mid.individual.net...
>> BartC wrote:
>
>>> Those aren't very interesting. And for that, you have to suffer much more
>>> pedantic type-strictness which can waste a lot of time to sort out.
>>
>> I'd change that to "much more pedantic type-strictness which can save you
>> form nasty, arse biting problems later on".
>>
>> My experiences match Kaz's.  I was drawn to C++ in the early 90s when I
>> had to find no end of obscure bugs in some badly written C (and a not very
>> helpful C compiler).  Coercing the code to compile as C++ removed the bulk
>> of the bugs, most of which were down to misuse of enumerations and
>> mismatched parameters.
>
> I've been using C++ to compile a couple of my modules to see what would
> happen.
>
> The main thing (apart from having to convert many char* parameters to const
> char*) is that it doesn't do implicit coercions from void* to T*.
>
> So code such as:
>
>  T *p;
>
>  p = f();
>  ....
>  p = g();
>
> now has to be written:
>
>  T *p;
>
>  p = (T*)f();
>  ....
>  p = (T*)g();
>
> Apart from the extra clutter (and having to search the code to find out
> exactly what type of pointer p is), the type T now is distributed throughout
> the code.

> Changing T to U will take a lot of work!

Aha, so you can now thank C++ for showing you that your code has
a screwed-up organization that invites bugs.

You've been (ab)using C as a typeless language.

Of course you don't want to spread all these casts through the program.
Even though they explicitly document that a potentially unsafe conversion
is going on, the conversion is still going on! Documenting it doesn't
make it right.

A well-organized C program minimizes the unsafe conversions to small
areas, keeping the rest of the program type safe.

> (Especially if p is
> actually declared in a different file maintained by someone else, or for
> some reason you cannot easily find or duplicate the type).

In that case you can make a type-safe wrapper for p which works with the
proper type.

   /* Wrapper for void * based library */
   T *T_p(void)
   {
     return (T *) p();
   }

   ...

   p = T_p();  /* no cast */

> This can't be considered good, so maybe there is a 'type_of' kind of
> operator?

Not in standard C or C++. "typeof" is a C and C++ extension in the GNU
compilers and perhaps others.

> Considering how many things happen implicitly in C++, you'd think that
> allowing void* to T* conversion would be one of them. It's allowed in C
> after all, and is well-defined; what could go wrong?

   double d, *pd = &d;
   void *unsafe = pd;
   int *pi = unsafe;

   *pi = 3;

   printf("%f\n", d);

No diagnostic required in C and there are no casts.

C++ diagnoses the "pi = unsafe" initialization.

The behavior is not well defined in either C or C++.

The void * pointer type carries no information about the type of what it
is pointing to, so allowing it to convert to any pointer type without
a cast means that you can do type punning without a cast:

   /* library header */

   typedef void *opaque_t; /* Monumentally stupid */

   void foo(opaque_t opaque, widget_t *pwidget);

   /* your module */
   foo(pwidget, opaque); /* Reversed! Not diagnosed by C, caught by C++ */

In C, pwidget implicitly converts to void *, and opaque implicitly
converts to widget_t *.

The fact that one of the two arguments is strongly typed doesn't catch the
reversal in C.

This happened to me recently with a re-entrant scanner generated by GNU Flex,
which stupidly produces "void* yyscan_t;" as the external representation
of the scanner state.  The GNU g++ compiler caught it for me.
0
Kaz
10/17/2014 2:23:31 PM
In article <87r3y6zt76.fsf@bsb.me.uk>, Ben Bacarisse <ben.usenet@bsb.me.uk> 
wrote:

> David Brown <david.brown@hesbynett.no> writes:
> <snip>
> > There are also some features that the languages share, but can be subtly
> > different.  A "const" in C++ has, by default, static linkage - but in C
> > it has "extern" linkage.  It is easy to make code that works in either
> > language, by adding "static" or "extern" explicitly, at the cost of
> > being slightly non-idiomatic in either language.
> 
> You may have to do a bit more than that because a C++ const can often be
> used in a constant expression so
> 
>   const int size = 42;
>   int array[size];
> 
> is valid C++ at file scope.  You can't make it valid C no matter what
> storage class specifiers you add to the definition of size!

@ cc -c t.c
@ cat t.c
enum {size = 42};
int array[size];

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/17/2014 2:31:45 PM
On 2014-10-17, Ben Bacarisse <ben.usenet@bsb.me.uk> wrote:
> David Brown <david.brown@hesbynett.no> writes:
><snip>
>> There are also some features that the languages share, but can be subtly
>> different.  A "const" in C++ has, by default, static linkage - but in C
>> it has "extern" linkage.  It is easy to make code that works in either
>> language, by adding "static" or "extern" explicitly, at the cost of
>> being slightly non-idiomatic in either language.
>
> You may have to do a bit more than that because a C++ const can often be
> used in a constant expression so
>
>   const int size = 42;
>   int array[size];
>
> is valid C++ at file scope.  You can't make it valid C no matter what
> storage class specifiers you add to the definition of size!

Someone knowledgeable who is promoting the use of "Clean C" in a project would
know better than to write this in the first place, knowing that it's C++ only.
Should it happen to someone else, it would be flushed out by the C build.
0
Kaz
10/17/2014 2:51:29 PM
Richard Heathfield <invalid@see.sig.invalid> writes:
> Keith Thompson wrote:
>> <william@wilbur.25thandClement.com> writes:
>> [...]
>>> Are there systems with C++ compilers but no C compiler? Or at the very
>>> least, where the C++ compiler is modern and well maintained but the C
>>> compiler has been left to rot? If you can show that, that would be
>>> _something_.
>> 
>> That nearly describes MS Windows.
>
> I don't see why. Windows doesn't ship with either a C compiler or a C++ 
> compiler, so it doesn't qualify on that count. If we're allowed to add 
> compilers after the fact, there are plenty of C compilers, as well as C++ 
> compilers, available for Windows.

Good point.  The restrictions are less relevant for programmers who are
able to use non-Microsoft compilers.  (Not all are able to do so.)

[...]

> Why are you inferring a requirement for a connection between the OS vendor 
> and the compiler vendor?

Sometimes such requirements are imposed from above.

-- 
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Working, but not speaking, for JetHead Development, Inc.
"We must do something.  This is something.  Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
0
Keith
10/17/2014 2:59:53 PM
On 2014-10-17, David Brown <david.brown@hesbynett.no> wrote:
> There are also some features that the languages share, but can be subtly
> different.  A "const" in C++ has, by default, static linkage - but in C
> it has "extern" linkage.  It is easy to make code that works in either
> language, by adding "static" or "extern" explicitly, at the cost of
> being slightly non-idiomatic in either language.

A "static const" is only non-idiomatic in C++. Every other combination
is idiomatic in both languages.

If we want a small (perhaps scalar-valued) constant c of type T to be inlined in
translation units, then in C++ we can put a "const T c = ...;" into a header
file which is then included into multiple translation units. Each one
ends up with a copy of this, but we expect occurrences of c to be folded
into code anyway. In C, we would make it "static const T c = ...;" which works
in C++, but is slightly non-idiomatic, as you note.

If T is something large that will likely generate a definition, then
we might instead put an "extern const T c;" into that header, and
then define the object somewhere, once.  This is idiomatic C and C++.

If you forget the static, and your C compiler follows the "relaxed ref/def" model
(a term used in the 1989 ANSI C Rationale), then it won't be diagnosed.
With gcc you have to use -fno-common to enforce the rule that an external
object name is defined at most once.
0
Kaz
10/17/2014 3:11:27 PM
On 10/17/2014 1:57 AM, Malcolm McLean wrote:
> On Tuesday, October 14, 2014 10:38:47 PM UTC+1, Ian Collins wrote:
>> Malcolm McLean wrote:
>>
>>> Another possibility is that a and b are strings, of a is a file
>>> and b a record, or some other defensible but confusing misuse of
>>> the addition operator. Now it gets very difficult. It's hard to
>>> find the function. Worse, someone might have defined the +
>>> operation as concatenation in one place, and insertion in
>>> another.
>>
>> Then they are a fool who would be equally likely to write messed up C.
>>
> Someone could very reasonably write
>
> streamobject & streamobject::operator << (numericalobject &x)
> {
>    this->append( x.tostring() );
>    return *this;
> }
>
> template <integral> numericalobject & numericalobject::operator << integral i
> {
>     while(i != 0)
>     {
>        *this *= 2;
>        i = i - 1;
>     }
>     return *this;
> }
>
> In client code
>
> numericalobject a, b;
> streamobject str;
>
> str << a << b;
>
> Now what did the programmer intend, and what will happen?

You'll get a syntax error in the template.

What will happen (assuming the error is fixed and this is all the 
relevant code) is that a's tostring method will be called and its return 
will be passed to str's append method. Then, b's tostring method will be 
called and its return will be passed to str's append method.

What the programmer intended is anybody's guess as any information that 
would give us a hint was left out (probably deliberately).

Martin Shobe

0
Martin
10/17/2014 3:27:30 PM
Keith Thompson wrote:

> Richard Heathfield <invalid@see.sig.invalid> writes:
>> Keith Thompson wrote:
>>> <william@wilbur.25thandClement.com> writes:
>>> [...]
>>>> Are there systems with C++ compilers but no C compiler? Or at the very
>>>> least, where the C++ compiler is modern and well maintained but the C
>>>> compiler has been left to rot? If you can show that, that would be
>>>> _something_.
>>> 
>>> That nearly describes MS Windows.
>>
>> I don't see why. Windows doesn't ship with either a C compiler or a C++
>> compiler, so it doesn't qualify on that count. If we're allowed to add
>> compilers after the fact, there are plenty of C compilers, as well as C++
>> compilers, available for Windows.
> 
> Good point.  The restrictions are less relevant for programmers who are
> able to use non-Microsoft compilers.  (Not all are able to do so.)

It's not hard, so presumably you mean they're not allowed to. I have 
encountered a good many such restrictions on tool choices over the years, 
mostly arbitrary and pointless, but imposed anyway. But the Windows example 
fails anyway, even when one is restricted to MS tools, because Visual Studio 
has a fairly decent C compiler and a fairly decent C++ compiler.

> 
> [...]
> 
>> Why are you inferring a requirement for a connection between the OS
>> vendor and the compiler vendor?
> 
> Sometimes such requirements are imposed from above.

The question was whether there are "systems" that have C++ compilers but no 
well-maintained C compilers. That question does not refer to such a 
requirement, so we can't assume that such a requirement is part of the 
question. That meaningless and pointless restrictions can be imposed in some 
organisations does not imply that all instances of a system are subject to 
those meaningless and pointless restrictions. It is easy to construct *one* 
instance of a system that has a C++ compiler but no C compiler - just find a 
box, install an OS, remove any C compilers that might happen to be lurking 
therein, and then install a C++ compiler. But to do so is, I think, to miss 
the point of the question.

In practice, C is just about as universal as a language is likely to get, 
and C++ isn't quite so widespread. So I think the true answer to the 
question is very likely to be "no".
 
-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/17/2014 3:40:06 PM
"BartC" <bc@freeuk.com> writes:
<snip>
> Considering how many things happen implicitly in C++, you'd think that
> allowing void* to T* conversion would be one of them. It's allowed in C
> after all, and is well-defined; what could go wrong?

Lots of things:

  int i = 42;
  void *p = &i;
  MyBigClass *mbcp = p;
  mbcp->do_something();

C++ tries hard to be much more type-safe than C.  Obviously it can't go
all the way, but it stops up some of the largest holes in C's type
system.

<snip>
-- 
Ben.
0
Ben
10/17/2014 3:52:00 PM
BartC wrote:
> "Ian Collins" <ian-news@hotmail.com> wrote in message
> news:ca8h4qFerbqU20@mid.individual.net...
>> BartC wrote:
>
>>> Those aren't very interesting. And for that, you have to suffer much more
>>> pedantic type-strictness which can waste a lot of time to sort out.
>>
>> I'd change that to "much more pedantic type-strictness which can save you
>> from nasty, arse-biting problems later on".
>>
>> My experiences match Kaz's.  I was drawn to C++ in the early 90s when I
>> had to find no end of obscure bugs in some badly written C (and a not very
>> helpful C compiler).  Coercing the code to compile as C++ removed the bulk
>> of the bugs, most of which were down to misuse of enumerations and
>> mismatched parameters.
>
> I've been using C++ to compile a couple of my modules to see what would
> happen.
>
> The main thing (apart from having to convert many char* parameters to const
> char*) is that it doesn't do implicit coercions from void* to T*.
>
> So code such as:
>
>   T *p;
>
>   p = f();
>   ....
>   p = g();
>
> now has to be written:
>
>   T *p;
>
>   p = (T*)f();
>   ....
>   p = (T*)g();
>
> Apart from the extra clutter (and having to search the code to find out
> exactly what type of pointer p is), the type T now is distributed throughout
> the code. Changing T to U will take a lot of work! (Especially if p is
> actually declared in a different file maintained by someone else, or for
> some reason you cannot easily find or duplicate the type).
>
> This can't be considered good, so maybe there is a 'type_of' kind of
> operator? Apparently so, although it doesn't work for everything. Now the
> code turns into this:

*If* you have to resort to all those casts, C++ helps you both keep them 
relatively safe and easy to find by using named casts.  Yes they can 
look ugly, but that just goes to emphasise that what you are doing is 
ugly.  As for keeping things safe, they save you from bugs like:

const void* f();

int* p = (int*)f();

-- 
Ian Collins
0
Ian
10/17/2014 6:20:52 PM
Richard Heathfield <invalid@see.sig.invalid> writes:
> Keith Thompson wrote:
>> Richard Heathfield <invalid@see.sig.invalid> writes:
>>> Keith Thompson wrote:
>>>> <william@wilbur.25thandClement.com> writes:
>>>> [...]
>>>>> Are there systems with C++ compilers but no C compiler? Or at the very
>>>>> least, where the C++ compiler is modern and well maintained but the C
>>>>> compiler has been left to rot? If you can show that, that would be
>>>>> _something_.
>>>> 
>>>> That nearly describes MS Windows.
>>>
>>> I don't see why. Windows doesn't ship with either a C compiler or a C++
>>> compiler, so it doesn't qualify on that count. If we're allowed to add
>>> compilers after the fact, there are plenty of C compilers, as well as C++
>>> compilers, available for Windows.
>> 
>> Good point.  The restrictions are less relevant for programmers who are
>> able to use non-Microsoft compilers.  (Not all are able to do so.)
>
> It's not hard, so presumably you mean they're not allowed to. I have 
> encountered a good many such restrictions on tool choices over the years, 
> mostly arbitrary and pointless, but imposed anyway. But the Windows example 
> fails anyway, even when one is restricted to MS tools, because Visual Studio 
> has a fairly decent C compiler and a fairly decent C++ compiler.

Unless something has changed recently, Visual Studio has a fairly
decent C90 compiler; it doesn't attempt to conform to either of the
two more recent ISO C standards.  Whether that qualifies it as a
"fairly decent C compiler" is a judgement call.  (I do not suggest
that anyone who is happy with C90 shouldn't be.)

[snip]

-- 
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Working, but not speaking, for JetHead Development, Inc.
"We must do something.  This is something.  Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
0
Keith
10/17/2014 6:48:00 PM
On 2014-10-17, Ian Collins <ian-news@hotmail.com> wrote:
> BartC wrote:
>> "Ian Collins" <ian-news@hotmail.com> wrote in message
>> news:ca8h4qFerbqU20@mid.individual.net...
>>> BartC wrote:
>>
>>>> Those aren't very interesting. And for that, you have to suffer much more
>>>> pedantic type-strictness which can waste a lot of time to sort out.
>>>
>>> I'd change that to "much more pedantic type-strictness which can save you
>>> from nasty, arse-biting problems later on".
>>>
>>> My experiences match Kaz's.  I was drawn to C++ in the early 90s when I
>>> had to find no end of obscure bugs in some badly written C (and a not very
>>> helpful C compiler).  Coercing the code to compile as C++ removed the bulk
>>> of the bugs, most of which were down to misuse of enumerations and
>>> mismatched parameters.
>>
>> I've been using C++ to compile a couple of my modules to see what would
>> happen.
>>
>> The main thing (apart from having to convert many char* parameters to const
>> char*) is that it doesn't do implicit coercions from void* to T*.
>>
>> So code such as:
>>
>>   T *p;
>>
>>   p = f();
>>   ....
>>   p = g();
>>
>> now has to be written:
>>
>>   T *p;
>>
>>   p = (T*)f();
>>   ....
>>   p = (T*)g();
>>
>> Apart from the extra clutter (and having to search the code to find out
>> exactly what type of pointer p is), the type T now is distributed throughout
>> the code. Changing T to U will take a lot of work! (Especially if p is
>> actually declared in a different file maintained by someone else, or for
>> some reason you cannot easily find or duplicate the type).
>>
>> This can't be considered good, so maybe there is a 'type_of' kind of
>> operator? Apparently so, although it doesn't work for everything. Now the
>> code turns into this:
>
> *If* you have to resort to all those casts, C++ helps you both keep them 
> relatively safe and easy to find by using named casts.  Yes they can 
> look ugly, but that just goes to emphasise that what you are doing is 
> ugly.  As for keeping things safe, they save you from bugs like:
>
> const void* f();
>
> int* p = (int*)f();

The diagnostic when the cast is not there is what keeps you from the bug.
With the cast in place, the compiler is quiet.

The C++ way to deal with this is to avoid the "all powerful" C cast,
and to use the more specific C++ casts.

  // This version expresses only the intent to strip the const qualifier
  // without changing type:
  void *p = const_cast<void *>(f());

  // This version expresses the intent of converting to int *
  // without stripping any qualifiers:
  int *q = static_cast<int *>(f());

These are not an option if you're just compiling C as C++ to catch bugs
of course. Or are they?

You can certainly do something like this in a header file somewhere:

  #ifdef __cplusplus
  #define strip_qual(TYPE, EXPR) const_cast<TYPE>(EXPR)
  #define convert(TYPE, EXPR) static_cast<TYPE>(EXPR)
  #define reinterpret(TYPE, EXPR) reinterpret_cast<TYPE>(EXPR)
  #else
  #define strip_qual(TYPE, EXPR) ((TYPE) (EXPR))
  #define convert(TYPE, EXPR) ((TYPE) (EXPR))
  #define reinterpret(TYPE, EXPR) ((TYPE) (EXPR))
  #endif

Do the precise cast that is intended when compiling as C++, or
the generic "cast all" cast in C.

Then:

  void *p = strip_qual(void *, f());
  int *q = convert(int *, f());
  int r = reinterpret(int, f());

In other words, in "Clean C", using conditional compilation and macros, you can
avail yourself of the precision of C++ style casts.

Anything that can be hidden behind a macro which can be retargetted to
C or C++ is fair game.

This is fantastic; I'm going to use this idea to improve some of my Clean C
code even more.
0
Kaz
10/17/2014 7:43:06 PM
On Friday, October 17, 2014 2:48:11 PM UTC-4, Keith Thompson wrote:
> Unless something has changed recently, Visual Studio has a fairly
> decent C90 compiler;

In all of the development I've done in Visual Studio (from VS98 through
VS2010), I've never had any issues that it could not handle, except for
two:

(1) Needed a way to initialize constant strings at compile time to
read/write memory.  I found my solution thanks to this group in the
form of a compound literal.  I was able to use GCC via MinGW to
compile that portion and link it directly to my VS output, even
executing the GCC command line as part of a pre-build event from
the normal build process in Visual Studio.  Was very pleased with
this solution.

(2) Needed a way to populate non-first-member unions at compile-time.
I found the solution for this in GCC but the implementation of it was
more trouble than I wanted, so I just worked around it with custom
code at startup which populates those few members.  This solution is
an issue for me, but it works.

Best regards,
Rick C. Hodgin
0
Rick
10/17/2014 7:50:32 PM
David Brown <david.brown@hesbynett.no> wrote:
> On 16/10/14 21:31, James Kuyper wrote:
>> On 10/16/2014 03:22 PM, David Brown wrote:
>> ...
>>> Indeed.  Most C "features" that are not valid in C++ are poor C practice 
>>> too.
>> 
>> I believe that even the latest version of C++ has not incorporated all
>> of the new features that were added in C99 and C2011 (though I'm not
>> sure which ones were left out - I haven't had time to spare to do a
>> detailed comparison), and I like most of those new features.
>> 
<snip>
> And there are points that make you feel the C++ and the C committees
> should have their heads banged together until they agree to cooperate -
> C++ has "static_assert" while C has "_Static_assert" to do the same
> thing. (Yes, I /know/ there are good reasons for this.)

If you look at the treatment of <stdbool.h>, <assert.h>, <stdalign.h>, and
<complex.h>, the C++ committee has been moving away from adopting the
underlying C language constructs.

For example, in C++ <stdbool.h> defines __bool_true_false_are_defined but
C++ doesn't support _Bool nor does it #define true or #define false. That's
reasonable, but of course it's not how C99 or C11 define <stdbool.h>.

<assert.h> defines static_assert in C, so C++ had no incentive to bother
with _Static_assert, given their shift away from C. Same situation wrt
<stdalign.h> and alignof.

C++ code can include <complex.h>, but C++ doesn't support _Complex. Rather,
it only has std::complex, which is a templated type. So you cannot mix C and
C++ code using complex types. However, C and C++ define their respective
complex types such that they're ABI compatible. C requires that _Complex be compatible
with an array of two floating-point objects, with index 0 being the real
part and index 1 the imaginary part. See 6.2.5p13 (N1256 or N1570). C++ has
the same requirement. See 26.4p4 (N3092).

> And each committee has come up with an implementation of atomics and
> threading that is, I believe, different and incompatible - and about 15
> years too late to be really useful.

Actually, atomics is one area where they worked closely together. If you
read the C++11 and C11 working papers, it clearly was a joint effort. AFAICT
C code using the <stdatomic.h> macro interfaces should work in C++.

I didn't follow the work on threading so can't speak to it.
0
william
10/17/2014 8:03:28 PM
"Richard Damon" <Richard@Damon-Family.org> wrote in message
news:VQB_v.298948$Ub6.127543@fx20.iad...
> On 10/12/14, 4:05 PM, BartC wrote:

>> managed to build it as it was into the 64-bit DLL file I needed. Except
>> I have to import snappy function names such as
>> "_ZN4jpge27compress_image_to_jpeg_fileEPKciiiPKhRKNS_6paramsE" into the
>> non-C++ code that uses it!)

> One comment on the lousy function names for export, if you declare your
> API interface functions (where possible) like:
>
> extern "C" void fun(char* name);
>
> in the interface header (which is included in the file where the function
> is defined), then the compiler will NOT mangle the name (or just use the
> limited C mangling which might add a _ before the name).

I've tried that and it works well, thanks.

(Not on this library, as I'm going with the native C version which is more
amenable to translating or generally hacking around. But I might need to use
C++ for interfaces which can't be easily linked to with C.)

-- 
Bart 

0
BartC
10/17/2014 10:45:34 PM
On Fri, 17 Oct 2014 01:07:09 +0100, BartC wrote:

> You obviously have a C++ mindset if you think introducing entities such as
> 'std::auto_ptr', 'std::unique_ptr' and 'boost::scoped_ptr' is going to
> make life easier for anyone. No doubt there are hundreds more of those,
> designed to solve problems that only arise because of the language.

The problems don't arise from the language; the solution does. Nothing
stops you from using the C approach: explicitly de-allocate at each
possible exit point (and hope you don't miss any cases).

But unlike in C, you also have the "fire and forget" option where you
specify at creation time that destroying the pointer should destroy the
object to which it points, and leave the rest to the compiler.

>> Even C generates code for initialisation, assignment, type conversions,
>> parameter passing and the like. C++ just allows the code to be
>> user-defined, whereas in C it's all hard-coded into the compiler.
> 
> That's where a lot of this belongs. You don't want to be doing the
> compiler's job for it.

But that's exactly what C makes you do. You can't just tell it to "do X
whenever Y happens". You have to track down all the cases where Y happens
and insert explicit code to do X.

0
Nobody
10/18/2014 2:01:16 AM
On Thu, 16 Oct 2014 17:35:50 -0700, Rick C. Hodgin wrote:

> I have developed in C++.  It's not a matter of not recognizing or
> appreciating what it does for me ... it's a matter of I think it is wrong,
> an incorrect manner to process data, to have the compiler doing things for
> your which are not explicitly conveyed to it,

They are explicitly conveyed to it, by writing the relevant methods or
overloads.

> My mindset of C/C++ is a higher level version of assembly,

Even C isn't really that any more. It "officially" stopped being so
in the transition from K&R C to ISO C, although it took a while for
compilers to advance to the point that the distinction really mattered.

C++ was never meant to be that in the first place.

In the meantime, many of the same issues now apply to assembly, i.e. the
relationship between the assembly code and what happens on the CPUs pins
has been getting ever more "high-level".

Where C89 added "volatile" for the cases where you need to force the
compiler to interpret code literally, CPUs have added memory barriers and
MTRRs to force load/store to be interpreted literally.

0
Nobody
10/18/2014 2:21:28 AM
In article <pan.2014.10.18.02.01.15.428000@nowhere.invalid>,
 Nobody <nobody@nowhere.invalid> wrote:

> But that's exactly what C makes you do. You can't just tell it to "do X
> whenever Y happens". You have to track down all the cases where Y happens
> and insert explicit code to do X.

I guess it's obvious then: we need to reintroduce COBOL 88 condition definitions.

   05  SEX                   PIC X.
       88  MALE     VALUE "M".
       88  FEMALE   VALUE "F".

http://www.3480-3590-data-conversion.com/article-reading-cobol-layouts-1.html

COBOL--The initial 'C' language.

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/18/2014 2:34:26 AM
On Thu, 16 Oct 2014 19:12:37 -0700, Siri Crews wrote:

>> If you want "auto" semantics for such objects, there's std::auto_ptr,
>> std::unique_ptr or boost::scoped_ptr.
> 
> You make so easy so hard.
> 
> Compare to Javascript which doesn't have so many different kinds of
> pointers. Actually it doesn't have any pointers.

Oh, it has lots and lots of pointers. It has so many that it doesn't
even bother distinguishing the pointer from the thing it points to.

>> You still have the C mindset. If you use C++ enough, you eventually take
> 
> And you have a C++ mindset.

If you're going to program in C++, that's kind of essential. This
thread started with a complaint which basically boils down to "C++ doesn't
behave like C". Having a C mindset is perfectly fine ... if you're
actually using C.

I do still use C regularly, but for anything non-trivial it doesn't take
very long before I start wishing I could use C++ instead.

> Objective-C and Javascript are object using
> languages with much simpler syntax and semantics. The difference is
> Objective-C and Javascript are willing to be 'inefficient' to be
> comprehensible.

There are many applications of C++ where such a trade-off isn't
acceptable. If C++ also behaved like that, what would we be left with?

0
Nobody
10/18/2014 2:36:38 AM
On Fri, 17 Oct 2014 12:21:45 +0100, BartC wrote:

> So code such as:
> 
>  T *p;
> 
>  p = f();
>  ....
>  p = g();
> 
> now has to be written:
> 
>  T *p;
> 
>  p = (T*)f();
>  ....
>  p = (T*)g();

Actually, it should be written as:

  p = reinterpret_cast<T*>(f());
  ....
  p = reinterpret_cast<T*>(g());

The reason for using the more specific cast operators is that you can
then search for occurrences of "reinterpret_cast" (which is the most
dangerous case) while ignoring the safer cases (static_cast, const_cast,
dynamic_cast).

> Considering how many things happen implicitly in C++, you'd think that
> allowing void* to T* conversion would be one of them. It's allowed in C
> after all, and is well-defined; what could go wrong?

Well one thing which could go wrong is that sometimes the compiler
has to actually modify the pointer value. Specifically, if you define a
derived class with multiple base classes, you get the equivalent of:

typedef struct {
   base1 b1_;
   base2 b2_;
   // members specific to derived
} derived;

While you can simplify "&p->b1_" (converting a pointer to the derived
class to a pointer to the first base class) to just "(base1*)p",
the same trick won't work for the second base class (i.e. "&p->b2_"); the
compiler has to offset the pointer value.

So although you can (legally) use a derived* wherever a base2* is
expected, converting from derived* to void* to base2* won't work, because
you're hiding the information the compiler needs to perform the conversion.

0
Nobody
10/18/2014 3:11:32 AM
Nobody wrote:
> Even C isn't really that any more. It "officially"
> stopped being so in the transition from K&R C
> to ISO C...
>
> C++ was never meant to be that in the first
> place.

C is that way in my view. C++ is lesser that way,
but only marginally.

My targets are basic hardware, i386 as a base,
but also other ISAs and machines with a range
of abilities. I am not particularly interested in
optimization specific to architectures, but only
algorithms in their general terms.

I believe an imminent breakthrough in semiconductor
research will increase performance by more than an
order of magnitude. Even "inefficient" algorithms
at those speeds will be exceedingly usable.

Best regards,
Rick C. Hodgin
0
Rick
10/18/2014 3:24:38 AM
Nobody wrote:
> On Fri, 17 Oct 2014 01:07:09 +0100, BartC wrote:
>
>> You obviously have a C++ mindset if you think introducing entities such as
>> 'std::auto_ptr', 'std::unique_ptr' and 'boost::scoped_ptr' is going to
>> make life easier for anyone. No doubt there are hundreds more of those,
>> designed to solve problems that only arise because of the language.
>
> The problems don't arise from the language; the solution does. Nothing
> stops you from using the C approach: explicitly de-allocate at each
> possible exit point (and hope you don't miss any cases).
>
> But unlike in C, you also have the "fire and forget" option where you
> specify at creation time that destroying the pointer should destroy the
> object to which it points, and leave the rest to the compiler.
>

You still have that in 'C' ( atexit() or your own "burn notice" method
when a frame goes out of ... frame ) but you have to make stupid
lists. I've actually *done* it as stupid lists. Precambrian
GC, so to speak.

>>> Even C generates code for initialisation, assignment, type conversions,
>>> parameter passing and the like. C++ just allows the code to be
>>> user-defined, whereas in C it's all hard-coded into the compiler.
>>
>> That's where a lot of this belongs. You don't want to be doing the
>> compiler's job for it.
>
> But that's exactly what C makes you do. You can't just tell it to "do X
> whenever Y happens". You have to track down all the cases where Y happens
> and insert explicit code to do X.
>

I am not completely sure that is true. Dispatch pattern in 'C'
is pretty easy.

I dunno - for any given problem, you can run a 'C' solution and a C++
solution and see what works better.

-- 
Les Cargill


0
Les
10/18/2014 3:32:20 AM
Les Cargill wrote:
> Nobody wrote:
>>
>> But that's exactly what C makes you do. You can't just tell it to "do X
>> whenever Y happens". You have to track down all the cases where Y happens
>> and insert explicit code to do X.
>
> I am not completely sure that is true. Dispatch pattern in 'C'
> is pretty easy.

It is, try "whenever this function returns, free any resources allocated 
by the current code path".  Or "when the last use of this resource goes 
out of scope, free it"

-- 
Ian Collins
0
Ian
10/18/2014 4:44:57 AM
On Saturday, October 18, 2014 3:36:45 AM UTC+1, Nobody wrote:
>
> I do still use C regularly, but for anything non-trivial it doesn't take
> very long before I start wishing I could use C++ instead.
> 
The secret is to have your components written in portable ANSI C
with clean interfaces, not exporting unnecessary types, taking
parameters as double for reals, ints for integers, and char *s
for text, and flat pointers for arrays, including multi-dimensional
ones. 
Then you can plug them together with glue code.
0
Malcolm
10/18/2014 6:18:25 AM
Malcolm McLean wrote:
> On Saturday, October 18, 2014 3:36:45 AM UTC+1, Nobody wrote:
>>
>> I do still use C regularly, but for anything non-trivial it doesn't take
>> very long before I start wishing I could use C++ instead.
>>
> The secret is to have your components written in portable ANSI C
> with clean interfaces, not exporting unnecessary types, taking
> parameters as double for reals, ints for integers, and char *s
> for text, and flat pointers for arrays, including multi-dimensional
> ones.
> Then you can plug them together with glue code.

Oh you make it all sound so easy!  That "glue code" is otherwise known 
as application or business logic which makes up what, 90% of most 
applications?

-- 
Ian Collins
0
Ian
10/18/2014 9:21:13 AM
jacob navia <jacob@spamsink.net> wrote:

> Le 13/10/2014 22:20, Richard Heathfield a écrit :
> > In C, you could use new [] instead of new, but you'd struggle to resize it.
> Not if you use the C containers library (CCL)
> 
> You just use the resizable vectors  IN C without any trouble.
> 
> As I have said here a zillion times.

And as you've been told a zillion times as well, if we want
unpredictable overhead, we'll just _go_ with C++ instead of fscking up
our clean C code.

Richard
0
raltbos
10/18/2014 11:28:37 AM
Ian Collins <ian-news@hotmail.com> wrote:

> jacob navia wrote:
> > Le 15/10/2014 17:07, Kaz Kylheku a écrit :
> >> The C++ people already improved the C language with operator overloading and
> >> containers several decades ago. You're also scoffing at people who have
> >> worked to improve the language.
> >
> > I do not accept the complexity of C++
> 
> One of these days you will grasp the simple concept of NOT USING THE 
> BITS YOU DON'T LIKE!

That day will come when I will have to read my own code, and only my own
code, and I, and only I, will have to read that code.

The problem with C++ is not the 20% of it that I might like to use. The
problem with C++ is that every other programmer out there will want to
use a different 20%, and therefore to use someone else's code I'll have
to understand 100% of C++ (which, I firmly believe, nobody does) - _and_
still suffer from code that doesn't do what it appears to do on the
surface.

Richard
0
raltbos
10/18/2014 11:32:33 AM
Keith Thompson <kst-u@mib.org> wrote:

> Richard Heathfield <invalid@see.sig.invalid> writes:
> > Why are you inferring a requirement for a connection between the OS vendor 
> > and the compiler vendor?
> 
> Sometimes such requirements are imposed from above.

Frankly, that's a problem with "above", not (for once) with Micro$oft.

Richard
0
raltbos
10/18/2014 11:38:14 AM
On Saturday, October 18, 2014 10:21:24 AM UTC+1, Ian Collins wrote:
>
> Oh you make it all sound so easy!  That "glue code" is otherwise known 
> as application or business logic which makes up what, 90% of most 
> applications?
> 
Most projects fail because the glue code cannot manage the complexity
of all the interactions demanded by the users, it's true. If a 
component cannot be written, then generally it's known that it cannot
be written without a major research effort, and the project never
even begins.
0
Malcolm
10/18/2014 2:35:02 PM
Le 18/10/2014 13:28, Richard Bos a écrit :
> jacob navia <jacob@spamsink.net> wrote:
>
>> Le 13/10/2014 22:20, Richard Heathfield a écrit :
>>> In C, you could use new [] instead of new, but you'd struggle to resize it.
>> Not if you use the C containers library (CCL)
>>
>> You just use the resizable vectors  IN C without any trouble.
>>
>> As I have said here a zillion times.
>
> And as you've been told a zillion times as well, if we want
> unpredictable overhead, we'll just _go_ with C++ instead of fscking up
> our clean C code.
>

WHAT "unpredictable overhead" ?

The SAME "unpredictable overhead" as C++ since C++ resizes the vector 
the same way as the CCL does!


But with you I have been wrong for several years now. You answer ALL the
threads I start with the same tactics: lies, insults and personal
attacks.

You have never used the software I am speaking about, nor do you know
anything about it. But you say things here AS IF you had used it
for years.

FSCK UP!

... to use your "Richard Bos English"

0
jacob
10/18/2014 2:53:30 PM
jacob navia <jacob@spamsink.net> wrote:

> Le 18/10/2014 13:28, Richard Bos a écrit :
> > jacob navia <jacob@spamsink.net> wrote:
> >
> >> Le 13/10/2014 22:20, Richard Heathfield a écrit :
> >>> In C, you could use new [] instead of new, but you'd struggle to resize it.
> >> Not if you use the C containers library (CCL)
> >>
> >> You just use the resizable vectors  IN C without any trouble.
> >>
> >> As I have said here a zillion times.
> >
> > And as you've been told a zillion times as well, if we want
> > unpredictable overhead, we'll just _go_ with C++ instead of fscking up
> > our clean C code.
> 
> WHAT "unpredictable overhead" ?
> 
> The SAME "unpredictable overhead" as C++ since C++ resizes the vector 
> the same way as the CCL does!

Exactly.

So why choose a one-vendor, non-portable solution over an equally bad
but at least no worse, portable, and extensively documented C++?

> FSCK UP!
> 
> ... to use your "Richard Bos english"

My dear boy, _my_ English is a good deal better than _that_ abortion.
Et de plus, je suis sûr que _mon_ français, assez pauvre qu'il est, est
supérieur à _votre_ néerlandais.

So stop trying to use sarcasm; you French are pathetically bad at it.
Stick to what you're good at: wine rather than whine.

Richard
0
raltbos
10/18/2014 3:28:03 PM
On Saturday, October 18, 2014 4:28:14 PM UTC+1, Richard Bos wrote:
> 
> So why choose a one-vendor, non-portable solution over an equally bad 
> but at least no worse, portable, and extensively documented C++?
> 
People have repeatedly gone to C, tried to implement some of the features
of C++, while also trying to avoid what they consider to be C++'s
mistakes. Jacob has done the same.

There is a real issue with the standards effect. That's the main
reason I don't use Jacob's extensions. Not because I think C++
is superior, but because I'd have to persuade the entire company
that it was a tool we should adopt. It's not an effective use of
my emotional energy.
   
0
Malcolm
10/18/2014 4:24:17 PM
On 18/10/14 05:24, Rick C. Hodgin wrote:
> Nobody wrote:
>> Even C isn't really that any more. It "officially"
>> stopped being so in the transition from K&R C
>> to ISO C...
>>
>> C++ was never meant to be that in the first
>> place.
>
> C is that way in my view. C++ is lesser that way,
> but only marginally.
>
> My targets are basic hardware, i386 as a base,
> but also other ISAs and machines with a range
> of abilities. I am not particularly interested in
> optimization specific to architectures, but only
> algorithms in their general terms.
>
> I believe a soon breakthrough in semiconductor
> research will increase performance more than an
> order of magnitude.

Why would you think that?  The history of semiconductors so far has 
shown a remarkably stable increase in performance, and there are no
indications of a "jump of an order of magnitude" on the horizon.

> Even "inefficient" algorithms
> at those speeds will be exceedingly usable.
>

That statement demonstrates a total lack of understanding of algorithmic 
complexity.

<http://en.wikipedia.org/wiki/Wirth%27s_law>


0
David
10/18/2014 6:56:12 PM
On Saturday, October 18, 2014 2:56:23 PM UTC-4, David Brown wrote:
> On 18/10/14 05:24, Rick C. Hodgin wrote:
> > Nobody wrote:
> >> Even C isn't really that any more. It "officially"
> >> stopped being so in the transition from K&R C
> >> to ISO C...
> >>
> >> C++ was never meant to be that in the first
> >> place.
> >
> > C is that way in my view. C++ is lesser that way,
> > but only marginally.
> >
> > My targets are basic hardware, i386 as a base,
> > but also other ISAs and machines with a range
> > of abilities. I am not particularly interested in
> > optimization specific to architectures, but only
> > algorithms in their general terms.
> >
> > I believe a soon breakthrough in semiconductor
> > research will increase performance more than an
> > order of magnitude.
>
> Why would you think that?  The history of semiconductors so far has
> shown a remarkably stable increase in performance, and there are no
> indications of a "jump of an order of magnitude" on the horizon.

It's just a feeling.  It's the same feeling I have that we will soon
uncover a revolution in mathematics that will correlate the super-
symmetry seen in physics with peculiarities in other math disciplines,
such as the symmetry of the primes:

https://sites.google.com/site/geometryoftheprimes/

Some facts I do possess:  MIT researchers announced last Friday a
superconducting transistor that is approaching the point of being
feasible in commercial products.  They have demonstrated its transistor
switch speed at 770 GHz.  While a commercial product could likely
never achieve the individual switch speed maximum, the fact that it
is superconducting means there is no heat generation, so something
on the order of 100x to 300x or greater would be attainable.

These will affect supercomputers and the computers huge companies
and governments use because they will require special cooling,
which means money (at least until such time as room temperature
superconductors become economically viable, something I do not
believe will happen in a computing product before Jesus returns).

Now, here's where it's pure gut feeling:

I believe something else will soon be discovered, however, which will
benefit general consumer products across a very wide array of products,
giving us a speedup of about 10x in silicon products, but it will also
impact communication speed, the size and power footprint of devices,
allowing for far more advanced and economically viable electronic devices,
such as those seen in the Tom Cruise film, Minority Report.

It will come in the form of a discovery in manufacturing.  They will
use a metal layer process whereby a soon-to-be discovered molecule is
used for doping all around the outside of the metal layers.  It will
be like sprinkling toppings on a pizza. :-)  However, the particles
will impact a radius around their deposition, allowing a mere smattering
to be employed while still keeping the effect intact over the entire
3D structure, including the wires on and off the chip.  This will work
also in products which go around thin wires, such as use of a thin
film casing.

These particles will induce a type of "calming" effect in the metal
wires, a peculiar principle of the molecule, and it will be something
which has the effect of lowering their electrical resistance,
ultimately allowing for much cooler operations.

Manufacturing companies will be able to perfect the doping process
very quickly and will race to push clock speeds up and up to their
thermal limits again, which will terminate around 10x faster than
what we have today.  And this will be the last general purpose
advancement we see in traditional semiconductors before Jesus
returns.

I can't explain how I would know this ... I just have that feeling.
:-)  I could be completely wrong of course.  Time will tell.

> > Even "inefficient" algorithms
> > at those speeds will be exceedingly usable.
> 
> That statement demonstrates a total lack of understanding of
> algorithmic complexity.
> 
> <http://en.wikipedia.org/wiki/Wirth%27s_law>

Wirth's "Law" becomes wirthless when the hardware speedups no
longer follow historical patterns.  To put it plainly, and in
basic math...

Consider:

    Algorithm X at Y throughput equals "almost usable to usable,"
    Algorithm X at 10Y throughput equals "uber-usable up into wow!"

Best regards,
Rick C. Hodgin
0
Rick
10/18/2014 7:44:18 PM
On Saturday, October 18, 2014 8:44:27 PM UTC+1, Rick C. Hodgin wrote:
>
> Wirth's "Law" becomes wirthless when the hardware speedups no
> longer follow historical patterns.  To put it plainly, and in
> basic math...
> 
> Consider:
> 
>     Algorithm X at Y throughput equals "almost usable to usable,"
>     Algorithm X at 10Y throughput equals "uber-usable up into wow!"
> 
I don't know much about how to design computer hardware. But my 
understanding is that the process of increasing speed and reducing
power consumption at the same time by making transistors smaller
and smaller has reached a limit. In the medium term, it will be a
process of slightly reducing clock speed, but running several 
instructions in parallel. In the long run, totally new electronic
principles may make massively faster speeds possible, I really 
couldn't comment on that. Jesus hasn't consulted me on when He 
should return either.

Algorithms scale by a factor. So if you multiply two integers by
repeated addition, that's O(N), where N is the value of the multiplier.
If you multiply using the primary school method, it's O(log N)
additions. It follows that as integers get reasonably large, the
O(N) method becomes totally impractical, whilst the O(log N)
method is quite feasible. Consider two twenty-digit numbers.

However normally humans want to multiply integers which are in
the low thousands at most. To do it by repeated addition, with
pencil and paper, is infeasible. To do a couple of thousand
additions electronically takes a tiny fraction of a second. 
There is a sense in which massively increasing speed is a game 
changer. However if we go to twenty digits, then even on an electronic
computer, the repeated addition method struggles.

That's not just true of multiplication. It's true of all algorithms.
O(2^N) algorithms are feasible only for very tiny N, and you only
add one or two for heroic improvements in hardware.

However parallel hardware changes this a bit. In some environments,
parallel processors are effectively available on demand, at least up
up to low thousands. So if you can make an algorithm parallel,
it will run in constant time until N reaches the low thousands. Then
you'll have a sudden jump to double execution time as you need to
run one step on the same processor twice. The complexity notation
is still valid, ultimately as N goes very high it cannot be evaded,
but for as long as N is less than the number of processors, the
behaviour is quite different, and often that's the practically
important case.
 
0
Malcolm
10/18/2014 8:05:47 PM
On Saturday, October 18, 2014 4:05:56 PM UTC-4, Malcolm McLean wrote:
> [snip]

Go back to a 350 MHz Pentium III and try installing some modern version
of Linux on it.  It will work (provided you have enough RAM), but it
will be a painfully slow procedure.

Now, take that same version of Linux and install it on a 3.5 GHz CPU.
While it will not be 10x faster in this case (because the other
hardware has not increased commensurately), it will be notably faster.

In this hardware advancement I'm talking about, all aspects of the
machine will be increased, ultimately yielding around a 10x
improvement.

I could be wrong.  Wouldn't be the first time (even the first time
today).

Best regards,
Rick C. Hodgin
0
Rick
10/18/2014 8:27:40 PM
On Saturday, 18 October 2014 23:27:52 UTC+3, Rick C. Hodgin  wrote:
> On Saturday, October 18, 2014 4:05:56 PM UTC-4, Malcolm McLean wrote:
> > [snip]

Don't you, Rick, feel that you are being sort of rude here? Maybe Malcolm
missed some of your point or something, but if so, state it.

> Go back to a 350 MHz Pentium III and try installing some modern version
> of Linux on it.  It will work (provided you have enough RAM), but it
> will be a painfully slow procedure.

You are likely wrong there. The thing is that a lot of Linuxes run on
embedded devices, and that pushes the operating system towards being
small and light. I recently saw boys playing with that Raspberry Pi
toy, the size of a credit card; it runs at 700 MHz and Linux on it seemed
to work fine.

You maybe can't run some monstrous bloat (like Chrome or Firefox
or Eclipse or Skype) on such hardware too well but Linux is fine. 

> Now, take that same version of Linux and install it on a 3.5 GHz CPU.
> While it will not be 10x faster in this case (because the other
> hardware has not increased commensurately), it will be notably faster.
> 
> In this hardware advancement I'm talking about, all aspects of the
> machine will be increased, ultimately yielding around a 10x
> improvement.

Indeed, same things that use scalable algorithms to solve same problems
run faster on faster computer. However that does not mean that we may
ever expect to use non-scalable algorithms loosely. These will keep
hanging our computers for eons if we aren't using them carefully.
0
ISO
10/18/2014 10:50:26 PM
On Saturday, October 18, 2014 6:50:33 PM UTC-4, Öö Tiib wrote:
> Don't you, Rick, feel that you are sort of rude here? Maybe Malcolm
> missed some of your point or something but then state it.

I apologize if I was rude.  I was shooting for humor as much as
anything.  But I am aware from my 45 years on this Earth that
my humor is sometimes only that:  "my humor."

I do apologize to you, (I can't produce the characters of your name,
so I will just try with) Oo, and also to you, Malcolm, as well as
the others on this list (to whom it came across rude).

> > Go back to a 350 MHz Pentium III and try installing some modern version
> > of Linux on it.  It will work (provided you have enough RAM), but it
> > will be a painfully slow procedure.
>
> You are likely wrong there. Thing is that lot of Linuxes run on=20
> embedded devices and that pushes the operating system towards being
> small and light. I recently saw boys playing with that Rasberry Bi
> toy, size of credit card, runs with 700 MHz and Linux on it seemed
> to work fine.

I recently did something like this.  It was a 1.5 GHz VIA C7-M CPU
in a 2005 laptop with 1 GB of RAM.  I installed Linux Mint 17 from
USB disk, and then several other apps afterward with the same high-
speed Internet connection.  It was excruciating how slow it was, and
I know from past performance that the VIA C7-M CPU at 1.5 GHz is
about on par with an Intel Celeron 1.0 GHz.  The laptop also had
a slow hard drive.  Overall it took about 1.5 hours to install.

I had also recently installed Linux Mint 17 onto a 2007 Pentium Dual
Core 1.7 GHz laptop, and it was notably faster (two CPUs, higher
clock speed, OoO super-pipelined, superscalar architecture, faster
memory subsystem, etc.).  Took about 30 minutes.  Made all the
difference.

The same install on my modern AMD A8-based laptop took about 20
minutes from start to finish.  Same on my AMD 6-core desktop
machine (probably only slightly faster than the A8 laptop).

I used those recent installs as reference, assuming the 350 MHz
Pentium III would be slower still, but not much, than the 1.5 GHz
VIA C7-M laptop.

> > In this hardware advancement I'm talking about, all aspects of the
> > machine will be increased, ultimately yielding around a 10x
> > improvement.
>
> Indeed, same things that use scalable algorithms to solve same problems
> run faster on faster computer. However that does not mean that we may
> ever expect to use non-scalable algorithms loosely. These will keep
> hanging our computers for eons if we aren't using them carefully.

We won't be here on the Earth that long. :-)  These are truly the end
times, though we are still years and years away.

Best regards,
Rick C. Hodgin
0
Rick
10/18/2014 11:20:37 PM
On Sat, 18 Oct 2014 11:32:33 +0000, Richard Bos wrote:

>> One of these days you will grasp the simple concept of NOT USING THE
>> BITS YOU DON'T LIKE!
> 
> That day will come when I will have to read my own code, and only my own
> code, and I, and only I, will have to read that code.
> 
> The problem with C++ is not the 20% of it that I might like to use. The
> problem with C++ is that every other programmer out there will want to use
> a different 20%, and therefore to use someone else's code I'll have to
> understand 100% of C++ (which, I firmly believe, nobody does) - _and_
> still suffer from code that doesn't do what it appears to do on the
> surface.

This is a genuine problem, to which the only viable solution is to learn
the entire language (although not necessarily to "language lawyer" level).

Don't like templates? Well most libraries (including the standard library)
use them. Same for most of the other features an individual programmer
could probably live without (i.e. anything that's not in C; C programmers
manage to live without any of those).

But it's worth remembering that when it was created, C was very much a
"minimalist" language by the standards of the day. Its complexity is
roughly comparable to Pascal, which was supposed to be simplistic in light
of its role as an educational language.

Even C++ is somewhat simpler than what you might have expected the typical
language of 2014 to look like, given the trajectory of language
development prior to C's appearance.

0
Nobody
10/19/2014 12:16:21 AM

"Nobody" <nobody@nowhere.invalid> wrote in message
news:pan.2014.10.19.00.16.20.329000@nowhere.invalid...
> On Sat, 18 Oct 2014 11:32:33 +0000, Richard Bos wrote:
>
>>> One of these days you will grasp the simple concept of NOT USING THE
>>> BITS YOU DON'T LIKE!
>>
>> That day will come when I will have to read my own code, and only my own
>> code, and I, and only I, will have to read that code.
>>
>> The problem with C++ is not the 20% of it that I might like to use. The
>> problem with C++ is that every other programmer out there will want to
>> use
>> a different 20%, and therefore to use someone else's code I'll have to
>> understand 100% of C++ (which, I firmly believe, nobody does) - _and_
>> still suffer from code that doesn't do what it appears to do on the
>> surface.
>
> This is a genuine problem, to which the only viable solution is to learn
> the entire language (although not necessarily to "language lawyer" level).

> But it's worth remembering that when it was created, C was very much a
> "minimalist" language by the standards of the day. Its complexity is
> roughly comparable to Pascal, which was supposed to be simplistic in light
> of its role as an educational language.

It's so not simple anymore. And the compilers (especially the ones that also
support C++) are monstrous.

A non-gcc, non-C++ compiler seems to occupy some tens of megabytes, while 
gcc with C++ support needs hundreds of megabytes (and my Mingw was around 
1.2GB, I think because it had support for other languages, which took 20 
minutes to copy onto another machine, because my own 0.7MB compiler project 
had a dependency on it!)

So, while pure C compilers are already bigger and more unwieldy than
they should be, including support for C++ seems to further increase the size
and complexity by an order of magnitude. It's not always possible to 'ignore' that
complexity.

(BTW I've been looking at an independent linker called 'golink', which is 
only 47KB (kilobytes; remember those?), and does not need .a or .lib files 
for every third-party library in existence. It's refreshing to see other 
people still producing human-scale, non-bloated software.)

-- 
Bartc 

0
BartC
10/19/2014 10:31:08 AM
On Sun, 2014-10-19, BartC wrote:
....
> A non-gcc, non-C++ compiler seems to occupy some tens of megabytes, while 
> gcc with C++ support needs hundreds of megabytes

Not true, not on my systems at least (Debian Linux).  It seems you
need roughly 20 MB for the C part of gcc, and another 20 for g++.

/Jorgen

-- 
  // Jorgen Grahn <grahn@  Oo  o.   .     .
\X/     snipabacken.se>   O  o   .
0
Jorgen
10/19/2014 12:23:34 PM
On Sunday, October 19, 2014 11:32:54 AM UTC+1, Bart wrote:
>
> A non-gcc, non-C++ compiler seems to occupy some tens of megabytes, while  
> gcc with C++ support needs hundreds of megabytes (and my Mingw was around  
> 1.2GB, I think because it had support for other languages, which took 20 
> minutes to copy onto another machine, because my own 0.7MB compiler project 
> had a dependency on it!)
> 
> So, while pure C compilers are already bigger and more unwieldy than
> they should be, including support for C++ seems to further increase the size
> and complexity by a magnitude. It's not always possible to 'ignore' that
> complexity.
> 
> (BTW I've been looking at an independent linker called 'golink', which is 
> only 47KB (kilobytes; remember those?), and does not need .a or .lib files 
> for every third-party library in existence. It's refreshing to see other 
> people still producing human-scale, non-bloated software.)
> 
I was fond of tcc (tiny C compiler) but Microsoft seemed to break it on my
notebook. It's a one man project.
You can write a C interpreter or even a C compiler yourself; writing a C++
compiler is a full-time job, and probably not even a one-person job. That's
not usually too important, because normally you get the compiler from
someone else. But where it's important, it can be crucial.
0
Malcolm
10/19/2014 12:23:47 PM
"Jorgen Grahn" <grahn+nntp@snipabacken.se> wrote in message 
news:slrnm47ba5.1ks.grahn+nntp@frailea.sa.invalid...
> On Sun, 2014-10-19, BartC wrote:
> ...
>> A non-gcc, non-C++ compiler seems to occupy some tens of megabytes, while
>> gcc with C++ support needs hundreds of megabytes
>
> Not true, not on my systems at least (Debian Linux).  It seems you
> need roughly 20 MB for the C part of gcc, and another 20 for g++.

I wish I could tell you the size of gcc on my Ubuntu system, but I wouldn't 
have a clue how to. Linux applications seem to have a habit of disseminating 
themselves across the file system, making the task harder. But it's also 
possible that with gcc and Linux, the line between compiler and OS is hazier 
than with Windows.

-- 
Bartc 

0
BartC
10/19/2014 1:13:29 PM
On 2014-10-19, BartC <bc@freeuk.com> wrote:
> A non-gcc, non-C++ compiler seems to occupy some tens of megabytes, while 
> gcc with C++ support needs hundreds of megabytes (and my Mingw was around 
> 1.2GB, I think because it had support for other languages, which took 20 
> minutes to copy onto another machine, because my own 0.7MB compiler project 
> had a dependency on it!)

Surprising fact:

$ ls -l /usr/lib/gcc/i686-linux-gnu/4.6/cc1*
-rwxr-xr-x 1 root root 10721104 Apr 15  2012 /usr/lib/gcc/i686-linux-gnu/4.6/cc1
-rwxr-xr-x 1 root root 11548532 Apr 15  2012 /usr/lib/gcc/i686-linux-gnu/4.6/cc1plus

$ size  /usr/lib/gcc/i686-linux-gnu/4.6/cc1*
   text    data     bss     dec     hex filename
   10689955       25308 1227148 11942411         b63a0b /usr/lib/gcc/i686-linux-gnu/4.6/cc1
   11519570       25340 1238668 12783578         c30fda /usr/lib/gcc/i686-linux-gnu/4.6/cc1plus

On this Ubuntu system, the GCC 4.6 executable for the C++ compiler has only
a 7.7% larger text segment relative to the C executable.

The initialized data is only 32 bytes larger, and the uninitialized data
(zero-filled globals) are only 0.9% larger.

There is considerable bloat in C++ in the header files and libraries.
Much of that is stuff you don't pay for if you don't use.

Other kinds of bloat in the toolchain is shared.
0
Kaz
10/19/2014 3:05:12 PM
On 2014-10-19, Kaz Kylheku <kaz@kylheku.com> wrote:
> On 2014-10-19, BartC <bc@freeuk.com> wrote:
>> A non-gcc, non-C++ compiler seems to occupy some tens of megabytes, while 
>> gcc with C++ support needs hundreds of megabytes (and my Mingw was around 
>> 1.2GB, I think because it had support for other languages, which took 20 
>> minutes to copy onto another machine, because my own 0.7MB compiler project 
>> had a dependency on it!)
>
> Surprising fact:
>
> $ ls -l /usr/lib/gcc/i686-linux-gnu/4.6/cc1*
> -rwxr-xr-x 1 root root 10721104 Apr 15  2012 /usr/lib/gcc/i686-linux-gnu/4.6/cc1
> -rwxr-xr-x 1 root root 11548532 Apr 15  2012 /usr/lib/gcc/i686-linux-gnu/4.6/cc1plus
>
> $ size  /usr/lib/gcc/i686-linux-gnu/4.6/cc1*
>    text    data     bss     dec     hex filename
>    10689955       25308 1227148 11942411         b63a0b /usr/lib/gcc/i686-linux-gnu/4.6/cc1
>    11519570       25340 1238668 12783578         c30fda /usr/lib/gcc/i686-linux-gnu/4.6/cc1plus
>
> On this Ubuntu system, the GCC 4.6 executable for the C++ compiler has only
> a 7.7% larger text segment relative to the C executable.

Mind you, that 7.7% difference, in absolute terms, is 829615 bytes of code.
You can implement a lot of logic in that much code, especially if you have
good libraries elsewhere.

It rather raises the question of what is taking ten megs to implement in the C
compiler.
0
Kaz
10/19/2014 3:13:02 PM
Le 19/10/2014 12:31, BartC a écrit :

> (BTW I've been looking at an independent linker called 'golink', which
> is only 47KB (kilobytes; remember those?), and does not need .a or .lib
> files for every third-party library in existence. It's refreshing to see
> other people still producing human-scale, non-bloated software.)
>

lcc-win is just 1.5MB of source code. Of that, 600K is generated C that
implements the machine description, which itself is only 100K long. The
actual code, then, is only 1MB of C source, including the assembler.

lcclnk, the linker, is just 2 files of pure C.

More code is inside the IDE and the debugger, but it stays all very 
reasonable.
0
jacob
10/19/2014 8:36:39 PM
On 18/10/14 21:44, Rick C. Hodgin wrote:
> On Saturday, October 18, 2014 2:56:23 PM UTC-4, David Brown wrote:
>> On 18/10/14 05:24, Rick C. Hodgin wrote:
>>> Nobody wrote:
>>>> Even C isn't really that any more. It "officially"
>>>> stopped being so in the transition from K&R C
>>>> to ISO C...
>>>>
>>>> C++ was never meant to be that in the first
>>>> place.
>>>
>>> C is that way in my view. C++ is lesser that way,
>>> but only marginally.
>>>
>>> My targets are basic hardware, i386 as a base,
>>> but also other ISAs and machines with a range
>>> of abilities. I am not particularly interested in
>>> optimization specific to architectures, but only
>>> algorithms in their general terms.
>>>
>>> I believe a soon breakthrough in semiconductor
>>> research will increase performance more than an
>>> order of magnitude.
>>
>> Why would you think that?  The history of semiconductors so far has
>> shown a remarkably stable increase in performance, and there are no
>> indications of a "jump of an order of magnitude" on the horizon.
>
> It's just a feeling.  It's the same feeling I have that we will soon
> uncover a revolution in mathematics that will correlate the super-
> symmetry seen in physics with peculiarities in other math disciplines,
> such as the symmetry of the primes:
>
> https://sites.google.com/site/geometryoftheprimes/

Both supersymmetry and the distribution of primes are complex fields 
where people have gradually made progress in understanding.  For many of 
the problems regarding primes, such as the Riemann Hypothesis, people are
more concerned about whether it can be solved or not, or if it is
possible to prove that it is solvable or unsolvable.  Someone might
surprise us, but certainly nothing suggests that breakthroughs are imminent.

>
> Some facts I do possess:  MIT researchers announced last Friday a
> superconducting transistor that is approaching the point of being
> feasible in commercial products.  They have demonstrated its transistor
> switch speed at 770 GHz.  While a commercial product could likely
> never achieve the individual switch speed maximum, the fact that it
> is superconducting means there is no heat generation, so something
> on the order of 100x to 300x or greater would be attainable.

News like that turns up regularly - and has done for decades.  Someone 
gets a great idea, and after laborious testing finds the perfect 
materials to make a fast switch.  Then after millions of dollars and 
many years work, they completely fail to turn it into a practical 
technology.  This is great for our general understanding of physics, and 
can result in the occasional step forward for technology, but you simply 
don't get the kind of leaps you are hoping for.

And heat /will/ be generated.  Superconducting means zero resistance in 
the steady state, but you cannot transfer charge around and change 
states without energy loss unless the process is perfectly reversible 
and no information is removed from the system.  So even if you could do 
your calculation without heat, you would generate the heat when you read 
the results (and the heat and energy loss is dependent on the 
calculations, not just the number of bits in the result).

>
> These will affect supercomputers and the computers huge companies
> and governments use because they will require special cooling,
> which means money (at least until such time as room temperature
> superconductors become economically viable, something I do not
> believe will happen in a computing product before Jesus returns).

Superconductors have plenty of possible applications, including high 
speed interconnections and power lines to supercomputers.  But they 
won't turn up in practical, economic and efficient computers.  Even if 
someone does figure out a room temperature superconductor, it will not 
be used for the processors themselves.

>
> Now, here's where it's pure gut feeling:
>
> I believe something else will soon be discovered, however, which will
> benefit general consumer products across a very wide array of products,
> giving us a speedup of about 10x in silicon products, but it will also
> impact communication speed, the size and power footprint of devices,
> allowing for far more advanced and economically viable electronic devices,
> such as those seen in the Tom Cruise, Minority Report.

A 10x speed up will occur over the next 5 years.  That's Moore's law - 
you don't need a crystal ball to see it.  It's not guaranteed, being no 
more of a /law/ than Murphy's, but it certainly looks like it will hold 
for the foreseeable future.

And a 10x speed up will not change the algorithms we use, nor make 
inefficient algorithms efficient.  It might mean that we can run those 
algorithms on slightly bigger data sets, or run them in slightly shorter 
time.  But it's not a world changer.

>
> It will come in the form of a discovery in manufacturing.  They will
> use a metal layer process whereby a soon-to-be discovered molecule is
> used for doping all around the outside of the metal layers.  It will
> be like sprinkling toppings on a pizza. :-)  However, the particles
> will impact a radius around their deposition, allowing a mere smattering
> to be employed while still keeping the effect intact over the entire
> 3D structure, including the wires on and off the chip.  This will work
> also in products which go around thin wires, such as use of a thin
> film casing.
>
> These particles will induce a type of "calming" effect in the metal
> wires, a peculiar principle of the molecule, and it will be something
> which has the effect of lowering their electrical resistance,
> ultimately allowing for much cooler operations.
>
> Manufacturing companies will be able to perfect the doping process
> very quickly and will race to push clock speeds up and up to their
> thermal limits again, which will terminate around 10x faster than
> what we have today.  And this will be the last general purpose
> advancement we see in traditional semiconductors before Jesus
> returns.
>
> I can't explain how I would know this ... I just have that feeling.
> :-)  I could be completely wrong of course.  Time will tell.

Countless millions have prophesied the end of the world or the return of 
their favourite hero, and everyone else has been wrong so far.  Maybe 
you are the one in a billion who gets it right!

>
>>> Even "inefficient" algorithms
>>> at those speeds will be exceedingly usable.
>>
>> That statement demonstrates a total lack of understanding of
>> algorithmic complexity.
>>
>> <http://en.wikipedia.org/wiki/Wirth%27s_law>
>
> Wirth's "Law" becomes wirthless when the hardware speedups no
> longer follow historical patterns.  To put it plainly, and in
> basic math...
>
> Consider:
>
>      Algorithm X at Y throughput equals "almost usable to usable,"
>      Algorithm X at 10Y throughput equals "uber-usable up into wow!"
>

Give me just /one/ example of an algorithm where that applies.  You can 
find hundreds of counter examples at 
<http://en.wikipedia.org/wiki/List_of_algorithms>


0
David
10/19/2014 8:51:56 PM
On 18/10/14 22:27, Rick C. Hodgin wrote:
> On Saturday, October 18, 2014 4:05:56 PM UTC-4, Malcolm McLean wrote:
>> [snip]
>
> Go back to a 350 MHz Pentium III and try installing some modern version
> of Linux on it.  It will work (provided you have enough RAM), but it
> will be a painfully slow procedure.
>
> Now, take that same version of Linux and install it on a 3.5 GHz CPU.
> While it will not be 10x faster in this case (because the other
> hardware has not increased commensurately), it will be notably faster.
>

All you demonstrate there is Wirth's law.

The 350 MHz machine was fine for what it did - but your modern version 
of Linux runs so much more, and spends much more effort on "looking good".

But if you are talking about doing the same job, you can run a perfectly 
usable desktop on a 350 MHz machine.  For stuff that doesn't involve 
graphics (where the advances over that same period have been much more 
than 10x both in the hardware and in the requirements made of that 
hardware), there is no problem.  It's only about three years since I 
retired a server machine running Linux on a 90 MHz pentium - and moving 
it onto multi-GHz hardware made /no/ difference to its speed or usability.

10x faster is always nice, but it is /not/ a game changer (except to 
marketing and advertisers).

> In this hardware advancement I'm talking about, all aspects of the
> machine will be increased, ultimately yielding around a 10x
> improvement.
>
> I could be wrong.  Wouldn't be the first time (even the first time
> today).
>
> Best regards,
> Rick C. Hodgin
>

0
David
10/19/2014 8:58:57 PM
David Brown wrote:

> For many of
> the problems regarding primes, such as Riemann Hypothesis, people are
> more concerned about whether it can be solved or not, or if it is
> possible to prove that it is solvable or unsolvable.  Someone might
> surprise us, but certainly nothing suggests that breakthroughs are
> immanent.

Louis de Branges may have a proof for the Riemann Hypothesis. But then 
again, he may not. Having proved the Bieberbach Conjecture, he cannot easily 
be dismissed as a crank. Nevertheless, mathematicians have been trying hard 
to ignore his claim for ten years now. They have, on the whole, succeeded.

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/19/2014 9:07:20 PM
In article <m218md$r6a$1@dont-email.me>, David Brown <david.brown@hesbynett.no> 
wrote:

> 10x faster is always nice, but it is /not/ a game changer (except to 
> marketing and advertisers).

Quantum computers are being researched. If they can work, markets will drive 
their rapid development and deployment. They should be able to speed things by 
more than tenfold.

Currently shared memories require serialisation that steals away much of the 
parallelism. HP is researching huge shared memories that support parallel access 
without needing storage hierarchies or serialisation. If it works, that will 
make massively parallel computation much faster: suppose you have enough 
processing elements to handle any practical matrix and you can do matrix 
multiplies and other operations in one clock.

Or not.

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/19/2014 10:29:12 PM
On Sunday, October 19, 2014 11:29:30 PM UTC+1, Siri Crews wrote:
>
> Currently shared memories require serialisation that steals away 
> much of the parallelism. HP is researching huge shared memories 
> that support parallel access without needing storage hierarchies 
> or serialisation. If it works, that will make massively parallel
> computation much faster: suppose you have enough processing elements
> to handle any practical matrix and you can do matrix multiplies and
> other operations in one clock.
>  
> Or not.
> 
Graphics chips are highly parallel, and can be pressed into service 
for many non-graphics calculations. However they're restricted to
executing the same instructions on multiple data sets, which limits
what they can do, but is usually good for graphics where all the pixels
are essentially identical.
 
0
Malcolm
10/19/2014 10:38:08 PM
In article <m21898$q1b$1@dont-email.me>, David Brown <david.brown@hesbynett.no> 
wrote:

> Give me just /one/ example of an algorithm where that applies.  You can 
> find hundreds of counter examples at 

O(N^2) would seem to be always slower than O(N) but if the problem has a fixed 
upper bound on N, O(N^2) becomes O(1).

Weather prediction, for example, simulates the world as three-dimensional 
cellular automata and runs various energy transport algorithms on that. Because 
of the limited resolution of input data, it doesn't make sense to refine the 
grain of the simulation indefinitely, so you end up with a maximum number of 
useful points in the simulation.

CGI is another: you have a limited number of pixels in each frame. It doesn't 
make sense to use models that have higher resolution than the frame resolution.

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/19/2014 10:38:44 PM
> David Brown wrote:
> > Rick C. Hodgin wrote:
> > Consider:
> > Algorithm X at Y throughput equals "almost
> > usable to usable,"
> > Algorithm X at 10Y throughput equals
> > "uber-usable up into wow!"
>
> Give me just /one/ example of an algorithm
> where that applies. You can find hundreds of
> counter examples at
> <http://en.wikipedia.org/wiki/List_of_algorithms>

Consider the wind-up mechanical monkey toy.
He has a governor that only allows him to cheep
and bang his cymbals together at a maximum
speed. He can crash them together, say, three
times per second: 3 Hz.

    http://www.google.com/m?q=mechanical+wind+up+monkey+toy

Remove the governor, replace it with one that
lets him operate at 2x. Now he's crashing away
at six times per second. Twice the fun.

Now, he's a mechanical creation, subject to
many things electronic devices aren't. But, it's
the same general principle.

I think there was also an "I Love Lucy" episode
about this concept: "Speed her up!" (in the
manufacture and packaging of chocolates.

Best regards,
Rick C. Hodgin
0
Rick
10/20/2014 9:20:28 AM
)

Best regards,
Rick C. Hodgin
0
Rick
10/20/2014 9:28:24 AM
On 20/10/14 00:29, Siri Crews wrote:
> In article <m218md$r6a$1@dont-email.me>, David Brown <david.brown@hesbynett.no> 
> wrote:
> 
>> 10x faster is always nice, but it is /not/ a game changer (except to 
>> marketing and advertisers).
> 
> Quantum computer are being researched. If they can work, markets will drive 
> their rapid development and deployment. They should be able to speed things by 
> more than tenfold.

Quantum computers have been researched for many years, and will continue
to be researched for decades to come.  The theory is wonderful, but the
practice is useless.  Shor's algorithm gives a way to factor numbers in
polynomial time (instead of exponential time, with conventional
computers) - but the record, using millions of dollars worth of
equipment, is to factor 21.  Forgive me for being unimpressed.

The only "success" in quantum computing has been D-Wave, which have
produced a machine that combines some fantastic low-temperature
engineering with even more impressive marketing tricks.  The result is a
computer that solves a very small range of problems, and runs slower
than a simple off-the-shelf PC (once you use appropriate software,
unlike in their demonstrations) while only costing several thousand
times as much.

I believe that quantum computers will get gradually more powerful.  But
as you add more and more qbits (since you have to scale in qbits if you
want to avoid scaling in time - you don't get the speed for free),
practical considerations such as noise and stability rapidly get worse.
 I don't believe we will ever reach a point where a quantum computer can
solve a given task faster than the same money spent on conventional
computing.

> 
> Currently shared memories require serialisation that steals away much of the 
> parallelism. HP is researching huge shared memories that support parallel access 
> without needing storage hierarchies or serialisation. If it works, that will 
> make massively parallel computation much faster: suppose you have enough 
> processing elements to handle any practical matrix and you can do matrix 
> multiplies and other operations in one clock.
> 

Shared memories, content-addressable memories, and mixing computing
elements with memory elements, are all gradual steps forward in
computing architecture.  They are nothing new, and are not sudden leaps
or breakthroughs - they are technologies which are in use now and which
have gradually been improving.  Just like using graphics card
technologies for SIMD computations, these things have their place and
may well move into the mainstream (CAM's are key to high-end routers and
network switches, shared memories are useful in massively parallel
systems, and mixed memory and compute elements have been used to speed
up sorting algorithms).

> Or not.
> 

We'll see some of these things move into practical technology, but while
science and research often moves in leaps and limps, technology tends to
have more steady progress.

0
David
10/20/2014 2:33:28 PM
On 20/10/14 00:38, Siri Crews wrote:
> In article <m21898$q1b$1@dont-email.me>, David Brown <david.brown@hesbynett.no> 
> wrote:
> 
>> Give me just /one/ example of an algorithm where that applies.  You can 
>> find hundreds of counter examples at 
> 
> O(N^2) would seem to be always slower than O(N) but if the problem has a fixed 
> upper bound on N, O(N^2) becomes O(1).
> 
> Weather prediction for examples simulates the world into three dimensional 
> cellular automata and runs various energy transport algorithms on that. Because 
> of the limitted resolution of input data, it doesn't make sense to reduce that 
> grain of the simulation indefinitely, so you end up with a maximum number of 
> useful points in the simulation.
> 
> CGI is another: you have a limitted number of pixels in each frame. It doesn't 
> make sense to use models that have higher resolution than the frame resolution.
> 

That's all true - if N is constant, or bounded, then O(f(N)) always
reduces to O(1), albeit usually with a large scale factor.

However, I was looking for an example of an algorithm which is currently
"almost usable to usable" and after a 10x speed up becomes "uber-usable
up into wow!".  And as usual in such cases, I wanted N to be variable,
even though I didn't say so.

0
David
10/20/2014 2:37:34 PM
On Monday, October 20, 2014 10:37:44 AM UTC-4, David Brown wrote:
> However, I was looking for an example of an algorithm which is currently
> "almost usable to usable" and after a 10x speed up becomes "uber-usable
> up into wow!".  And as usual in such cases, I wanted N to be variable,
> even though I didn't say so.

It's not one algorithm, unless you'd consider a source code like
"p->nextFrame();" to be calling "the nextFrame() algorithm." :-)

Back in the 1990s some 3D games started surfacing for the PC.  Games like
Doom, Quake, Castle Wolfenstein, etc.  I had a slow PC and these games
were not quite usable.  However, when I later got a faster PC they were
usable.  That transformation from something like 8fps to 40fps (a 5x
speedup) made the game very impressive, though perhaps not quite "wow"
as much as the graphics were still very limited.  Later graphics card
improvements drove it up into the 80+ fps, but with much higher video
resolutions and color modes.  Those were into the "wow" stage.

Other algorithms are SQL SELECT statements in FoxPro for DOS/Windows.
Again, back in the 1990s, the SQL SELECT statements were limited by
several factors related to CPU in some instances, but more toward disk
I/O and available memory.  Once the machines got faster, and more RAM
became available to cache previously retrieved data, the same queries
which were, back then, usable, now became very speedy and actually
exceedingly impressive.  For full apps, processes that used to take
many seconds to run were now being done in just a few seconds, which
for many users translated into the "wow" area.

Today we simply expect that kind of performance.

Best regards,
Rick C. Hodgin
0
Rick
10/20/2014 3:22:37 PM
David Brown <david.brown@hesbynett.no> wrote:

(snip)

> Quantum computers have been researched for many years, and will continue
> to be researched for decades to come.  The theory is wonderful, but the
> practice is useless.  Shor's algorithm gives a way to factor numbers in
> polynomial time (instead of exponential time, with conventional
> computers) - but the record, using millions of dollars worth of
> equipment, is to factor 21.  Forgive me for being unimpressed.

This is all true, but if you had made predictions anywhere along the
development of modern digital electronics based on the recent past,
you would have been very surprised.
 
(snip)

> I believe that quantum computers will get gradually more powerful.  But
> as you add more and more qbits (since you have to scale in qbits if you
> want to avoid scaling in time - you don't get the speed for free),
> practical considerations such as noise and stability rapidly get worse.
> I don't believe we will ever reach a point where a quantum computer can
> solve a given task faster than the same money spent on conventional
> computing.

We might some day be surprised, and then wonder why we didn't see
it coming. 

Or maybe not.

-- glen


0
glen
10/20/2014 3:54:59 PM
On 10/19/14 22:51, David Brown wrote:
> A 10x speed up will occur over the next 5 years.  That's Moore's law - you don't
> need a crystal ball to see it.  It's not guaranteed, being no more of a /law/
> than Murphy's, but it certainly looks like it will hold for the foreseeable future.

If one looked carefully at the middle of [1], where the cyan and yellow data
plots show performance development of the TOP500 average and the 500th system
respectively, one could see a slow-down of the performance increase for 1.5
years. We might indeed be at a time where performance, like clock frequency
(which hasn't increased much over the last 8 years), is no longer increasing as
fast as it used to, and the end of Moore's law may not be that far away, with
fewer and fewer CPU/GPGPU manufacturers spending ever more money and ever more
time on shrinking their process.

Some architectural or technology breakthrough might be just around the corner or
still very far away. Exponential growth is bound to end.

Thomas

[1] http://s.top500.org/static/lists/2014/06/TOP500_201406_Poster.png
0
Thomas
10/20/2014 4:01:45 PM
Ian Collins <ian-news@hotmail.com> writes:
> Siri Crews wrote:
> >
> > All you're really doing is reinventing destructors, but even worse than C++ did
> 
> What exactly do you have against destructors?

That question could be rephrased as "What exactly do you
have against exceptions?".

Phil
-- 
The best part of re-inventing the wheel is that you get to pick how
many sides the new one has. -- egcagrac0 on SoylentNews
0
Phil
10/20/2014 4:18:53 PM
"BartC" <bc@freeuk.com> writes:
> "Jorgen Grahn" <grahn+nntp@snipabacken.se> wrote in message
> news:slrnm47ba5.1ks.grahn+nntp@frailea.sa.invalid...
> > On Sun, 2014-10-19, BartC wrote:
> > ...
> >> A non-gcc, non-C++ compiler seems to occupy some tens of megabytes, while
> >> gcc with C++ support needs hundreds of megabytes
> >
> > Not true, not on my systems at least (Debian Linux).  It seems you
> > need roughly 20 MB for the C part of gcc, and another 20 for g++.
> 
> I wish I could tell you the size of gcc on my Ubuntu system, but I
> wouldn't have a clue how to. Linux applications seem to have a habit
> of disseminating themselves across the file system, making the task
> harder. But it's also possible that with gcc and Linux, the line
> between compiler and OS is hazier than with Windows.

apt-cache show has an Installed-Size: field, apparently measured in
KB or KiB. Remember to recurse down the right dependencies.

Phil
-- 
The best part of re-inventing the wheel is that you get to pick how
many sides the new one has. -- egcagrac0 on SoylentNews
0
Phil
10/20/2014 4:23:47 PM
On 20/10/14 17:54, glen herrmannsfeldt wrote:
> David Brown <david.brown@hesbynett.no> wrote:
>
> (snip)
>
>> Quantum computers have been researched for many years, and will continue
>> to be researched for decades to come.  The theory is wonderful, but the
>> practice is useless.  Shor's algorithm gives a way to factor numbers in
>> polynomial time (instead of exponential time, with conventional
>> computers) - but the record, using millions of dollars worth of
>> equipment, is to factor 21.  Forgive me for being unimpressed.
>
> This is all true, but if you had made predictions anywhere along the
> development of modern digital electronics based on the recent past,
> you would have been very surprised.
>

It's hard to predict the particular direction technology will take, or 
what will become important or popular.  But the /way/ technology changes 
is relatively predictable, based on simple economics.  For the most 
part, if someone has a bright idea that can be turned into working 
technology, devices are released early when they are big, slow and 
expensive.  This tests the market, and brings in money - if it is a 
money-maker, then people invest towards making it gradually smaller, 
more powerful and cheaper.  It is extremely rare that you get a sudden 
jump, because that would involve someone taking a massive gamble about 
the success of a product and investing hugely in untried technology.

Sometimes this gradual progression happens quite quickly, but it is 
still fairly smooth.  Jumps of 10x performance do not happen.

> (snip)
>
>> I believe that quantum computers will get gradually more powerful.  But
>> as you add more and more qbits (since you have to scale in qbits if you
>> want to avoid scaling in time - you don't get the speed for free),
>> practical considerations such as noise and stability rapidly get worse.
>> I don't believe we will ever reach a point where a quantum computer can
>> solve a given task faster than the same money spent on conventional
>> computing.
>
> We might some day be surprised, and then wonder why we didn't see
> it coming.
>
> Or maybe not.
>

Like all scientists, I love to be proven wrong!

In a couple of decades, I'll either get to say "I told you so!", or I'll 
get to have fun quantum toys to play with.  I can't lose :-)


0
David
10/20/2014 6:43:40 PM
On Monday, October 20, 2014 2:43:50 PM UTC-4, David Brown wrote:
> In a couple of decades, I'll either get to say "I told you so!", or I'll 
> get have fun quantum toys to play with.  I can't lose :-)

The thing about quantum toys is they're like us (here one moment, gone
the next, and you never know when that departure will happen).  So, be
sure to take care of everything that needs to be taken care of before ...
they're (you're) ... gone.  Get it?

After the departure, there is no return to change anything ... it is
at that point now a matter of historical record.

Best regards,
Rick C. Hodgin
0
Rick
10/20/2014 8:19:39 PM
Rick C. Hodgin <rick.c.hodgin@gmail.com> wrote:

(snip)

> The thing about quantum toys is they're like us (here one moment, gone
> the next, and you never know when that departure will happen).  So, be
> sure to take care of everything that needs taken care of before ...
> they're (you're) ... gone.  Get it?

There is a cartoon in the latest Physics Today, though there seems
to be no online reference.

With a Quantum Microwave oven, and Chicken a la Schrodinger.

Your chicken either comes out cooked, or walks out, and you 
don't know which one it will be.

-- glen
0
glen
10/20/2014 8:54:55 PM
On Monday, October 20, 2014 4:55:08 PM UTC-4, glen herrmannsfeldt wrote:
> There is a cartoon in the lastest Physics Today, though there seems
> to be no online reference.
> 
> With a Quantum Microwave oven, and Chicken a la Schrodinger.
> 
> Your chicken either comes out cooked, or walks out, and you 
> don't know which one it will be.
> 
> -- glen

:-)  I like that.

I think we'll soon discover that Einstein was right, and that God
does not play dice with the universe.  The quantum phenomenon will
be explained by the next layer of revelations that will stem from
research which takes place following "The Adinkra Revelations" as
they might come to be known. :-)

Best regards,
Rick C. Hodgin
0
Rick
10/20/2014 9:16:40 PM
http://worldnewsdailyreport.com/newly-found-document-holds-eyewitness-account-of-jesus-performing-miracle/

A first hand narrative from an author who saw
Jesus revive a stillborn infant by prayer.

Recently discovered in Vatican archives
from 31 A.D.

Best regards,
Rick C. Hodgin
0
Rick
10/21/2014 11:05:01 AM
On 20/10/14 23:16, Rick C. Hodgin wrote:
> On Monday, October 20, 2014 4:55:08 PM UTC-4, glen herrmannsfeldt wrote:
>> There is a cartoon in the lastest Physics Today, though there seems
>> to be no online reference.
>>
>> With a Quantum Microwave oven, and Chicken a la Schrodinger.
>>
>> Your chicken either comes out cooked, or walks out, and you 
>> don't know which one it will be.
>>
>> -- glen
> 
> :-)  I like that.
> 
> I think we'll soon discover that Einstein was right, and that God
> does not play dice with the universe.  The quantum phenomenon will
> be explained by the next layer of revelations that will stem from
> research which takes place following "The Adinkra Revelations" as
> they might come to be known. :-)
> 
> Best regards,
> Rick C. Hodgin
> 

I can recommend a book for you:

<http://physics.about.com/od/einsteinrelativitybooks/gr/whydoesemc2.htm>

Although the title implies it is about relativity, it also includes a
discussion about quantum mechanics, and shows how the uncertainty
principle is the inevitable consequence of a few simple and sensible
assumptions.

A lot of people like to think that the uncertainties and weird effects
we see in quantum mechanics are just because we can't see everything
properly, and that particles really behave with some sort of classical
mechanics if only we had the complete picture - perhaps string theory
with its extra dimensions will show that the particles behave "nicely"
as long as we consider those dimensions.  Unfortunately, that's not what
happens in reality - and more advanced theories of physics are not
challenging the uncertainty principle.

0
David
10/21/2014 11:18:03 AM
Rick C. Hodgin wrote:

> http://worldnewsdailyreport.com/newly-found-document-holds-eyewitness-account-of-jesus-performing-miracle/
> 
> A first hand narrative from an author who saw
> Jesus revive a stillborn infant by prayer.
> 
> Recently discovered in Vatican archives
> from 31 A.D.

<sigh> What has this to do with C?</sigh>

Okay, here we go. From that site:

Disclaimer

World News Daily Report is a news and political satire web publication, 
which may or may not use real names, often in semi-real or mostly fictitious 
ways. All news articles contained within worldnewsdailyreport.com are 
fiction, and presumably fake news. Any resemblance to the truth is purely 
coincidental, except for all references to politicians and/or celebrities, 
in which case they are based on real people, but still based almost entirely 
in fiction.

Snopes is always worth checking, too:

http://www.snopes.com/media/notnews/miraclewitness.asp

Even if it were a genuine report, which it clearly isn't, it is hard to see 
how it would convince anyone who is not already convinced by existing 
eyewitness accounts.

The truth is not well served by the spread of falsehood.

I applaud your zeal and your determination, but I would like to invite you 
to consider spending a little more time checking your sources (and, if you 
would be so good, selecting newsgroups appropriate to your choice of topic).

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/21/2014 11:24:52 AM
David,

I could recommend a book to you as well.

Best regards,
Rick C. Hodgin
0
Rick
10/21/2014 11:30:11 AM
Made you think. :-)

Best regards,
Rick C. Hodgin
0
Rick
10/21/2014 11:35:22 AM
Here's the real article link if anyone wants to
research truth:

    http://biblehub.com/kjv/genesis/1.htm

:-) I warn you, many who start reading the
content at that link are forever changed.

Best regards,
Rick C. Hodgin
0
Rick
10/21/2014 11:47:01 AM
Rick C. Hodgin wrote:

> Made you think. :-)

I always think before posting articles to comp.lang.c. It is a course of 
action that I can heartily recommend.

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/21/2014 11:57:33 AM
David Brown <david.brown@hesbynett.no> writes:
> On 20/10/14 23:16, Rick C. Hodgin wrote:
> > On Monday, October 20, 2014 4:55:08 PM UTC-4, glen herrmannsfeldt wrote:
> >> There is a cartoon in the lastest Physics Today, though there seems
> >> to be no online reference.
> >>
> >> With a Quantum Microwave oven, and Chicken a la Schrodinger.
> >>
> >> Your chicken either comes out cooked, or walks out, and you 
> >> don't know which one it will be.
> >>
> >> -- glen
> > 
> > :-)  I like that.
> > 
> > I think we'll soon discover that Einstein was right, and that God
> > does not play dice with the universe.  The quantum phenomenon will
> > be explained by the next layer of revelations that will stem from
> > research which takes place following "The Adinkra Revelations" as
> > they might come to be known. :-)
> 
> I can recommend a book for you:
> 
> <http://physics.about.com/od/einsteinrelativitybooks/gr/whydoesemc2.htm>
> 
> Although the title implies it is about relativity,

Its title implies it's full of bogosity. E != mc^2.
If you're determined to isolate familiar expressions on one side
of the formula, then
  mc^2 = sqrt(E^2 - p^2.c^2)
or
  E = sqrt(m^2.c^4 + p^2.c^2) = m.sqrt(1 + (p/(mc))^2).c^2
But m.sqrt(1 + (p/(mc))^2) is not m unless p = 0. So E_0 = mc^2, but E_0
is not E *precisely because of relativity*. I.e. relativity *proves*
that E != mc^2. The m.sqrt(1 + (p/(mc))^2) term dubbed "relativistic mass"
causes more harm and more confusion than it solves. Some have called
the propagation of its use a pedagogical virus. Einstein himself
decried its use when he saw how easy it was to misuse.

> it also includes a
> discussion about quantum mechanics, and shows how the uncertainty
> principle is the inevitable consequence of a few simple and sensible
> assumptions.

If they've simplified it down to misleading falsity, then no thanks.

Phil, E&OE, look in a reliable physics textbook for physics, not c.l.c.
-- 
The best part of re-inventing the wheel is that you get to pick how
many sides the new one has. -- egcagrac0 on SoylentNews
0
Phil
10/21/2014 12:58:59 PM
On Tuesday, October 21, 2014 12:25:02 PM UTC+1, Richard Heathfield wrote:
>
> Disclaimer
> 
> World News Daily Report is a news and political satire web publication, 
> which may or may not use real names, often in semi-real or mostly fictitious 
> ways. All news articles contained within worldnewsdailyreport.com are 
> fiction, and presumably fake news.
>
You've saved me a lot of time.
I take an interest in these things and wondered how I hadn't heard of this.

But it's not a comp.lang.c topic.
0
Malcolm
10/21/2014 1:02:37 PM
On 21/10/14 13:30, Rick C. Hodgin wrote:
> David,
> 
> I could recommend a book to you as well.
> 

Yes, but the one I recommended is relevant to the topic under discussion
(though still not about C), and is factual.

I already own several copies of the book you are thinking of, and have
read a solid chunk of it in the past (at a guess, perhaps 30-40% of the
NT, and maybe 5-10% of the OT).  It has very little to say on the
subjects of quantum mechanics and C programming.


0
David
10/21/2014 1:03:05 PM
David Brown wrote:

> On 21/10/14 13:30, Rick C. Hodgin wrote:
>> David,
>> 
>> I could recommend a book to you as well.
>> 
> 
> Yes, but the one I recommended is relevant to the topic under discussion
> (though still not about C), and is factual.
> 
> I already own several copies of the book you are thinking of, and have
> read a solid chunk of it in the past (at a guess, perhaps 30-40% of the
> NT, and maybe 5-10% of the OT).  It has very little to say on the
> subjects of quantum mechanics and C programming.

Off-hand, I can't think of a QM reference, but Isaiah 55:11 does make it 
quite clear that God is a C programmer.

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/21/2014 1:39:16 PM
In article <dc5624b8-8099-4399-a076-56c5f7fc8a28@googlegroups.com>,
Malcolm McLean  <malcolm.mclean5@btinternet.com> wrote:
>On Tuesday, October 21, 2014 12:25:02 PM UTC+1, Richard Heathfield wrote:
>>
>> Disclaimer
>> 
>> World News Daily Report is a news and political satire web publication, 
>> which may or may not use real names, often in semi-real or mostly fictitious 
>> ways. All news articles contained within worldnewsdailyreport.com are 
>> fiction, and presumably fake news.
>>
>You've saved me a lot of time.
>I take an interest in these things and wondered how I hadn't heard of this.
>
>But it's not a comp.lang.c topic.

But no moreso or lessso (is that a word?) than anything else posted in this
newsgroup.  Because, as I've demonstrated so many times, there is nothing
that is on-topic here.  Everything is off-topic.

P.S.  Ya gotta love Ricky posting religious tripe in a thread titled "C vs C++".

-- 
The whole rightwing media machinery exists to be an endless
spigot of almost-reasonable-sounding excuses for their listeners to use to
self-justify their racism.  For them to be able to say to themselves (and
others), "Oh no, I don't hate Obama because he's black, I hate him because
he's _______________".

0
gazelle
10/21/2014 2:13:26 PM
Nobody <nobody@nowhere.invalid> writes:

> [snip]
> Aside: the other significant distinction between methods and
> non-member functions, namely the call syntax (obj.method() vs
> function(obj)) may be removed in the future:
>
> 	https://isocpp.org/files/papers/N4165.pdf
>
> Short version: if code uses obj.function() and no such method
> exists, check whether function(obj) is valid (and if so, use it)
> before reporting an error.

I laughed when I saw this.  After decades of existence, C++
decides to reinvent work done 40 years ago.
0
Tim
10/21/2014 6:15:36 PM
David Brown <david.brown@hesbynett.no> wrote:
> A lot of people like to think that the uncertainties and weird effects
> we see in quantum mechanics is just because we can't see everything
> properly, and that particles really behave with some sort of classical
> mechanics if only we had the complete picture - perhaps string theory
> with its extra dimensions will show that the particles behave "nicely"
> as long as we consider those dimensions.  Unfortunately, that's not what
> happens in reality - and more advanced theories of physics are not
> challenging the uncertainty principle.

My high school chemistry teacher (in the 1990s!) once told me that she
thought they'd eventually find the missing mass in nuclear fission and
fusion reactions. In other words, she was convinced it was just a fancy
chemical reaction.

I'm fairly certain she had a Masters in chemistry. And I presume from a
Florida public university, which quietly excels at STEM education.

I don't think her opinion matters, though. Lots of people also think the
Earth is only several thousand years old. I don't think the people who doubt
the fundamental implications of relativistic and quantum mechanics reflect
any more poorly on the scientific field than young earth creationists.

There's a fine line between skepticism and ignorance. I really disapprove of
Donald Rumsfeld's politics (I once deliberately passed up an opportunity* to
personally meet him and several other officials, including Richard Perlman),
but he was spot on with his famous "Known knowns..." observation, at least
as a general matter**. See http://en.wikipedia.org/wiki/There_are_known_knowns

* It wouldn't have been a particularly intimate affair. If it had been I'd
have gone just to hear the devil uncensored.

** The particular rhetorical purpose, however, was FUD.
0
william
10/21/2014 7:31:02 PM
In article <msoihb-bn.ln1@wilbur.25thandClement.com>,
 <william@wilbur.25thandClement.com> wrote:

> My high school chemistry teacher (in the 1990s!) once told me that she
> thought they'd eventually find the missing mass in nuclear fission and
> fusion reactions. In other words, she was convinced it was just a fancy
> chemical reaction.

I guess she never found out that chemical bonds also change mass slightly.

-- 
:-<> Siri Seal of Disavowal #000-001. Disavowed. Denied. Deleted.
'I desire mercy, not sacrifice.'
Icke's razor: Given two equally plausible explanations, choose the weirder.
Defenders of Anarchy.
0
Siri
10/21/2014 10:27:00 PM
On Tuesday, October 21, 2014 9:03:14 AM UTC-4, David Brown wrote:
> I already own several copies of the book you are thinking of, and have
> read a solid chunk of it in the past (at a guess, perhaps 30-40% of the
> NT, and maybe 5-10% of the OT).  It has very little to say on the
> subjects of quantum mechanics and C programming.

You will be consumed by your arrogance and pride when you discover what
it says regarding the power of God. :-(

I would save you from that day by imploring you:  Read it again.  Today.
Seek the Truth and you WILL find it.  He is found, when you seek Him
with all your heart (a wholly purposed effort to search for the truth).

I think you'll also be amazed what He has to say about quantum mechanics,
and more specifically the fundamental nature of the entirety of the
substance of our universe, made by Him, maintained continually by Him.
For example, consider very carefully Ezekiel's vision of the tempest,
the four creatures with the faces of a lion, a man, an ox, and an eagle,
and then also John's vision in Heaven.  Ezekiel saw four wings.  John
saw six.  I'll give you a hint:  These variances are important.

Best regards,
Rick C. Hodgin
0
Rick
10/22/2014 12:51:21 PM
"Rick C. Hodgin" <rick.c.hodgin@gmail.com> wrote in message 
news:d9ce5ba5-196c-497b-b0e8-47eb31bfd654@googlegroups.com...
> On Tuesday, October 21, 2014 9:03:14 AM UTC-4, David Brown wrote:
>> I already own several copies of the book you are thinking of, and have
>> read a solid chunk of it in the past (at a guess, perhaps 30-40% of the
>> NT, and maybe 5-10% of the OT).  It has very little to say on the
>> subjects of quantum mechanics and C programming.
>
> You will be consumed by your arrogance and pride when you discover what
> it says regarding the power of God. :-(
>
> I would save you from that day by imploring you:  Read it again.  Today.
> Seek the Truth and you WILL find it.  He is found, when you seek Him
> with all your heart (a wholly purposed effort to search for the truth).

(Somewhat off-topic, possibly blasphemous too)

(I would love to be able to read that book. I've got a beautiful annotated 
and illustrated edition printed in 1792, and it's huge.

But as for reading it rather than just looking, unfortunately it's also the 
dullest, most boring and badly written book I've ever seen in my life. And 
those 31,000 verses - talk about repetition! They just look like the same 
set of stock words and phrases but in lots of different permutations.

Still, it does look splendid enough to see that people used to take this 
stuff very seriously. And must have been an astonishing feat of type-setting 
for the time.)

-- 
Bartc 

0
BartC
10/22/2014 2:07:32 PM
On 22/10/14 14:51, Rick C. Hodgin wrote:
> On Tuesday, October 21, 2014 9:03:14 AM UTC-4, David Brown wrote:
>> I already own several copies of the book you are thinking of, and have
>> read a solid chunk of it in the past (at a guess, perhaps 30-40% of the
>> NT, and maybe 5-10% of the OT).  It has very little to say on the
>> subjects of quantum mechanics and C programming.
> 
> You will be consumed by your arrogance and pride when you discover what
> it says regarding the power of God. :-(

I have read what it says about "the power of god".  But unless you have
some external reason to think that the book is particularly special,
it's just a book with no scientific or historical evidence.  Indeed, it
is extraordinary how little historical evidence there is for the
principal characters.  Note that Biblical references to the authority of
the Bible are pointless, since they are circular - it's no more
compelling than someone saying "I'm right because I say I'm right".

So without anything clear, irrefutable and independent of the Bible that
gives the Bible special authority, it is no stronger evidence for the
existence of God than "The Lion, The Witch and the Wardrobe" is evidence
for the existence of Narnia.

> 
> I would save you from that day by imploring you:  Read it again.  Today.
> Seek the Truth and you WILL find it.  He is found, when you seek Him
> with all your heart (a wholly purposed effort to search for the truth).
> 
> I think you'll also be amazed what He has to say about quantum mechanics,
> and more specifically the fundamental nature of the entirety of the
> substance of our universe, made by Him, maintained continually by Him.
> For example, consider very carefully Ezekiel's vision of the tempest,
> the four creatures with the faces of a lion, a man, an ox, and an eagle,
> and then also John's vision in Heaven.  Ezekiel saw four wings.  John
> saw six.  I'll give you a hint:  These variances are important.
> 

The variances show that the Bible is inconsistent.  It has nothing to do
with quantum mechanics.


0
David
10/22/2014 2:15:01 PM
BartC wrote:

> "Rick C. Hodgin" <rick.c.hodgin@gmail.com> wrote in message
> news:d9ce5ba5-196c-497b-b0e8-47eb31bfd654@googlegroups.com...
>> On Tuesday, October 21, 2014 9:03:14 AM UTC-4, David Brown wrote:
>>> I already own several copies of the book you are thinking of, and have
>>> read a solid chunk of it in the past (at a guess, perhaps 30-40% of the
>>> NT, and maybe 5-10% of the OT).  It has very little to say on the
>>> subjects of quantum mechanics and C programming.
>>
>> You will be consumed by your arrogance and pride when you discover what
>> it says regarding the power of God. :-(
>>
>> I would save you from that day by imploring you:  Read it again.  Today.
>> Seek the Truth and you WILL find it.  He is found, when you seek Him
>> with all your heart (a wholly purposed effort to search for the truth).
> 
> (Somewhat off-topic, possibly blasphemous too)

Oh, I don't know that it's blasphemous. It is, however, OT, as you say. 
Still, we're here now. And I do manage to drag C into this reply at one 
point, albeit briefly.

> (I would love to be able the read that book. I've got a beautiful
> annotated and illustrated edition printed in 1792, and it's huge.

Family heirloom?

> But as for reading it rather than just looking, unfortunately it's also
> the dullest, most boring and badly written book I've ever seen in my life.

Ah now, you're judging the authors by today's standards. The Bible was 
written over some eight hundred (or so) years, by many different people, at 
a time when writing was very much in alpha. Since its completion, we've had 
two thousand years of incremental improvements in writing style, and you've 
become used to much better writing nowadays than was common at the time the 
Bible was being penned (and that doesn't stop our culture from producing 
some utter tripe at frequent intervals).

And yet the Bible contains some remarkably beautiful language, although I 
will accept that this is not necessarily evident on every page.

Example, he says. Come on, he says, show me a well-written bit. All right, 
hang on. Psalm 139, for example. Adjust your translation according to taste. 
Biblegateway has a few different versions. Not the only well-written bit, by 
the way, not by a long chalk - just one of many examples.

So, to take your other charge, is the Bible dull and boring? Yes, perhaps, 
but in the same way that a treatise on mathematics might be dull and boring 
to a chav, or in the same way that K&R might be dull and boring to an arts 
student, or a paper on the public sector borrowing requirement might be dull 
and boring to me. To those who "buy into" the Bible, it is far from boring. 
In fact, it's fascinating.

Even for those who don't buy into it, it has its moments. Jael and the tent-
peg (Judges 4:14ff), Elisha and the bears (2 Kings 2:23ff), David and 
Goliath... ooh, while we're on the subject, let's look at the theology of 
David and Goliath. It's quite interesting in its own little way. David uses 
a stone to kill Goliath, right? And then he chops off Goliath's head (1 
Samuel 17:48-51). This is actually in obedience to the blasphemy law in 
Leviticus 24:14, but there's a twist. Leviticus requires that the blasphemer 
be taken outside the camp (well, Goliath /was/ outside the camp, so that was 
okay), and the witnesses to his blasphemy are to lay hands on his head, and 
then he's to be stoned. But David does this the other way round. (That's the 
bit I find quite interesting in its own little way.) Why? Because David's 
not very tall, and Goliath is a giant. By stoning him first, David causes 
Goliath to fall, so that he can /reach/ his head.

(Okay, it was barbaric, but actually the Levitical punishments are, on the 
whole, considerably /less/ barbaric than was the case for comparable 
contemporaneous cultures, so it actually represented a significant step 
forward in humane treatment. And how can there be a more barbaric punishment 
than to stone a blasphemer to death? Easy. Stone his family, his friends, 
even his cattle...)

Anyway, so David actually acts strictly in accordance with the Law, but 
practical constraints require him to be a little flexible, rather like a 
purist C programmer who /has/ to know the size of a file without reading it 
first, so he has to cheat a bit and use stat() or filelength().

Maybe little twists like that aren't your cup of tea. But don't dismiss the 
Bible just because it's got lots of genealogical and census data. Even those 
who don't subscribe to its premise

> And those 31,000 verses - talk about repetition! They just look like the
> same set of stock words and phrases but in lots of different permutations.

There are some big blocks that are more or less repeated, yes, but not so 
very many of them really.

> Still, it does look splendid enough to see that people used to take this
> stuff very seriously.

People still do.

> And must have been an astonishing feat of
> type-setting for the time.)

Indeed.

A quick question if I may - do you find Shakespeare boring?

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/22/2014 2:52:15 PM
Richard Heathfield wrote:

<snip>
 
> Maybe little twists like that aren't your cup of tea. But don't dismiss
> the Bible just because it's got lots of genealogical and census data. Even
> those who don't subscribe to its premise

Sloppy editing on my part. Sorry. I meant to say that even those who don't 
subscribe to its premise can be enriched by the beauty of its language.

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/22/2014 2:55:03 PM
On Wednesday, October 22, 2014 10:15:10 AM UTC-4, David Brown wrote:
> The variances show that the Bible is inconsistent.  It has nothing to do
> with quantum mechanics.

In Ezekiel's vision the divider was above the creatures (Ezekiel 1:22),
it was the color of "terrible crystal," and the creatures behaved
somewhat differently than the ones seen in John's vision, where the
sea of glass was clear as crystal and below / in front of the throne
(Revelation 4:6).

I'm urging you, David ... think on this.  Think deeply.  It is there
for you if you'll receive it.  Seek to know truly, and you will know.

Best regards,
Rick C. Hodgin
0
Rick
10/22/2014 3:43:16 PM
On Wednesday, October 22, 2014 3:15:10 PM UTC+1, David Brown wrote:
>
> So without anything clear, irrefutable and independent of the Bible that
> gives the Bible special authority, it is no stronger evidence for the
> existence of God than "The Lion, The Witch and the Wardrobe" is evidence
> for the existence of Narnia.
>
People think that "The Bible" is one source because you happen to buy it bound
in one handy volume.
Funny how details like that can dictate someone's theological position.
0
Malcolm
10/22/2014 3:54:41 PM
On Wednesday, October 22, 2014 11:43:30 AM UTC-4, Rick C. Hodgin wrote:
> On Wednesday, October 22, 2014 10:15:10 AM UTC-4, David Brown wrote:
> > The variances show that the Bible is inconsistent.  It has nothing
> > to do with quantum mechanics.
> 
> In Ezekiel's vision the divider was above the creatures (Ezekiel 1:22),
> it was the color of "terrible crystal," and the creatures behaved
> somewhat differently than the ones seen in John's vision, where the
> sea of glass was clear as crystal and below / in front of the throne
> (Revelation 4:6).
> 
> I'm urging you, David ... think on this.  Think deeply.  It is there
> for you if you'll receive it.  Seek to know truly, and you will know.

I will give you some additional clues:  In Ezekiel's vision, the four
living creatures were red.  The wheels that were with them were green.
They were all covered with eyes all round about.  Between the four
living creatures were burning coals of fire passing back-and-forth
between them.  And when the creatures moved they went as a flash of
lightning, but each one only went straight ahead, never diagonally.

Overhead was a sea of a dividing expanse, and over the top of the
glass was a throne, brilliant blue.  And the one who sat on the throne
was like glowing metal, surrounded by burning fire.  And a continuous
stream of rainbow brightness extended out (all the colors).

Graphically animated here:
    https://www.youtube.com/watch?v=EKh_U2NnIs8

-----
In John's vision, he was called up in the spirit (this is important).
He was not yet perfected as he still doubted and considered things as
a man considers them, not as a perfected Heavenly being would do in
full faith and trust.  Yet, he was called up as a man to record what
he saw, to give this vision unto us.

The one on the throne was red, and surrounding the throne was a full
emerald rainbow, not as the half rainbow we see here on Earth, but
the full circular rainbow (this is important) being only green.  In
front of the throne are seven lamps which are the seven spirits of God.
A lamb was in front of the throne looking like it had been slain.  It
had seven horns and seven eyes.  And in the hand of the one on the
throne was a scroll sealed with seven seals.  We later learn that
no one could open the scroll, or even look inside, except for one:
Jesus Christ (this is important).

The four living creatures had six wings and were also full of eyes
front and back (this is important).  We see the living creatures in
Heaven fully composed of their form, whereas in Ezekiel's vision each
had a portion of the full form (this is important).  In John's vision
we see the creatures looking in form like a lion, a calf, a man, and
an eagle in flight.

Encircling the four living creatures and the throne were 24 elders.
They were wearing white clothes, had golden crowns, and each had been
given a bowl of incense and a harp (both of gold).  There was also an
exceedingly large host of angels surrounding all of this.

Day and night they gave God thanks, speaking specifically of His
attributes and holiness (perfection, singular purpose of foundational
truth and truth only, this is important).  The 24 elders also bowed
down before the throne and cast their crowns before Him saying
likewise (this is important).  And the surrounding angels also gave
like honor to the One upon the throne.  And everyone in Heaven, and
on Earth, and in the Earth, and on the sea, and in the sea, all gave
like honor to the One upon the throne (this is important).

Graphically animated here:
    https://www.youtube.com/watch?v=8dI-990ati8

God has created this universe upon the model of His very Kingdom.
If you look deeply, very deeply, you will see what I am talking
about.

Look deeply.

And here's something else to consider:  The new Jerusalem which
will descend out of Heaven, the river of Life, the tree of Life:

Graphically animated here:
    https://www.youtube.com/watch?v=uvNR5i4Keg8

Best regards,
Rick C. Hodgin
0
Rick
10/22/2014 5:16:08 PM
On 22/10/14 19:16, Rick C. Hodgin wrote:
> On Wednesday, October 22, 2014 11:43:30 AM UTC-4, Rick C. Hodgin wrote:
>> On Wednesday, October 22, 2014 10:15:10 AM UTC-4, David Brown wrote:
>>> The variances show that the Bible is inconsistent.  It has nothing
>>> to do with quantum mechanics.
>>
>> In Ezekiel's vision the divider was above the creatures (Ezekiel 1:22),
>> it was the color of "terrible crystal," and the creatures behaved
>> somewhat differently than the ones seen in John's vision, where the
>> sea of glass was clear as crystal and below / in front of the throne
>> (Revelation 4:6).
>>
>> I'm urging you, David ... think on this.  Think deeply.  It is there
>> for you if you'll receive it.  Seek to know truly, and you will know.
>
> I will give you some additional clues:  In Ezekiel's vision, the four
> living creatures were red.  The wheels that were with them were green.
> They were all covered with eyes all round about.  Between the four
> living creatures were burning coals of fire passing back-and-forth
> between them.  And when the creatures moved they went as a flash of
> lightning, but each one only went straight ahead, never diagonally.
>
> Overhead was a sea of a dividing expanse, and over the top of the
> glass was a throne, brilliant blue.  And the one who sat on the throne
> was like glowing metal, surrounded by burning fire.  And a continuous
> stream of rainbow brightness extended out (all the colors).
>
> Graphically animated here:
>      https://www.youtube.com/watch?v=EKh_U2NnIs8
>
> -----
> In John's vision, he was called up in the spirit (this is important).
> He was not yet perfected as he still doubted and considered things as
> a man considers them, not as a perfected Heavenly being would do in
> full faith and trust.  Yet, he was called up as a man to record what
> he saw, to give this vision unto us.
>
> The one on the throne was red, and surrounding the throne was a full
> emerald rainbow, not as the half rainbow we see here on Earth, but
> the full circular rainbow (this is important) being only green.  In
> front of the throne are seven lamps which are the seven spirits of God.
> A lamb was in front of the throne looking like it had been slain.  It
> had seven horns and seven eyes.  And in the hand of the one on the
> throne was a scroll sealed with seven seals.  We later learn that
> no one could open the scroll, or even look inside, except for one:
> Jesus Christ (this is important).
>
> The four living creatures had six wings and were also full of eyes
> front and back (this is important).  We see the living creatures in
> Heaven fully composed of their form, whereas in Ezekiel's vision each
> had a portion of the full form (this is important).  In John's vision
> we see the creatures looking in form like a lion, a calf, a man, and
> an eagle in flight.
>
> Encircling the four living creatures and the throne were 24 elders.
> They were wearing white clothes, had golden crowns, and each had been
> given a bowl of incense and a harp (both of gold).  There was also an
> exceedingly large host of angels surrounding all of this.
>
> Day and night they gave God thanks, speaking specifically of His
> attributes and holiness (perfection, singular purpose of foundational
> truth and truth only, this is important).  The 24 elders also bowed
> down before the throne and cast their crowns before Him saying
> likewise (this is important).  And the surrounding angels also gave
> like honor to the One upon the throne.  And everyone in Heaven, and
> on Earth, and in the Earth, and on the sea, and in the sea, all gave
> like honor to the One upon the throne (this is important).
>
> Graphically animated here:
>      https://www.youtube.com/watch?v=8dI-990ati8
>
> God has created this universe upon the model of His very Kingdom.
> If you look deeply, very deeply, you will see what I am talking
> about.
>
> Look deeply.
>
> And here's something else to consider:  The new Jerusalem which
> will descend out of Heaven, the river of Life, the tree of Life:
>
> Graphically animated here:
>      https://www.youtube.com/watch?v=uvNR5i4Keg8
>
> Best regards,
> Rick C. Hodgin
>

That's not quantum mechanics - it is a description from a bad trip on 
magic mushrooms.

It's one thing to believe in God, and that Jesus "died on the cross for 
our sins".  If you were trying to convince me of that, I would 
understand you and your aim.  But your uncritical acceptance of this 
sort of tripe shows serious mental issues - and your belief that posting 
it will convince others is even more ludicrous.

(This opinion is, I think, backed up by other Christian posters here who 
post quietly after due consideration and respect for their listeners, 
rather than foaming at the mouth and regurgitating the same nonsense 
that alienated your audience long ago.)

0
David
10/22/2014 9:52:21 PM
On 22/10/14 17:54, Malcolm McLean wrote:
> On Wednesday, October 22, 2014 3:15:10 PM UTC+1, David Brown wrote:
>>
>> So without anything clear, irrefutable and independent of the Bible that
>> gives the Bible special authority, it is no stronger evidence for the
>> existence of God than "The Lion, The Witch and the Wardrobe" is evidence
>> for the existence of Narnia.
>>
> People think that "The Bible" is one source because you happen to buy it bound
> in one handily volume.
> Funny how details like that can dictate someone's theological position.
>

They also seem to think that there is something special about the books 
of the Bible, while other contemporary texts (such as the Gospel of 
Judas, or the Gnostic writings) are heretical.  They forget that the 
choice of what to put in the Bible was made by non-religious bureaucrats 
who were concerned with bringing order back to the Roman society after 
the chaos caused by intolerant Christians.  Books were picked for the 
Bible if they strengthened the appearance of legitimacy for the Church 
and encouraged Christians to obey the Church rather than have their own 
faith.  Theology, historical accuracy, and even the "truth" (if there 
were such a thing) were pretty much irrelevant.

0
David
10/22/2014 9:58:04 PM
On 22/10/2014 14:51, Rick C. Hodgin wrote:
> I would save you from that day by imploring you:  Read it again.  Today.
> Seek the Truth and you WILL find it.  He is found, when you seek Him
> with all your heart (a wholly purposed effort to search for the truth).

Let's read that book then, seeking the "truth".

Deuteronomy 2:34
At that time we took all his towns, and gave them over to complete 
destruction, together with men, women, and children; we had no mercy on any:

That is the truth Rick. That is the bible, that is god. Violence, 
destruction and hate: the essence of religion.

Deuteronomy 3:3
So the Lord our God gave up Og, king of Bashan, and all his people into 
our hands; and we overcame him so completely that all his people came to 
their end in the fight.

Deuteronomy 3:4
At that time we took all his towns; there was not one town of the sixty 
towns, all the country of Argob, the kingdom of Og in Bashan, which we 
did not take.

Deuteronomy 3:5
All these towns had high walls round them with doors and locks; and in 
addition we took a great number of unwalled towns.

Deuteronomy 3:6
And we put them to the curse, every town together with men, women, and 
children.

THIS IS THE TRUTH RICK!

THIS IS THE TRUTH ABOUT RELIGION!

0
jacob
10/22/2014 11:27:26 PM
On Wednesday, October 22, 2014 7:27:36 PM UTC-4, jacob navia wrote:
> On 22/10/2014 14:51, Rick C. Hodgin wrote:
> > I would save you from that day by imploring you:  Read it again.  Today.
> > Seek the Truth and you WILL find it.  He is found, when you seek Him
> > with all your heart (a wholly purposed effort to search for the truth).
>
> Let's read that book then, seeking the "truth".
> ...
> THIS IS THE TRUTH RICK!
> THIS IS THE TRUTH ABOUT RELIGION!

You forgot to mention the end result of the people who died in the flesh
here upon the Earth, who are presently asleep within the Earth.  They
will again rise, being summoned by name to stand before God giving an
account of their life.  Those who have accepted Jesus Christ will be
spared judgment, but those who have not accepted Him will be judged.

http://biblehub.com/revelation/20-15.htm
"And whosoever was not found written in the book of life was cast into
the lake of fire."

Our lives here upon this Earth are temporary.  Everybody dies.  It is
the second death that is our great concern.  That is the death nobody
will ever return from.

Salvation from that second death is the reason Jesus Christ came to
this Earth.  The Old Testament shows the strictness of God, the
punishment of God, the necessity of maintaining the Law of Moses,
which is God's Law.  However, that Law has no life in it.  It only
condemns.

God completed the revelation of who He is by sending us His Son to die
in our place, so that we don't have to.  He did this while we were yet
sinners, while we were still under eternal condemnation.

Everybody dies in the flesh.  Nobody has to die in eternity ... except
for those who refuse to accept His free offer of salvation.

Best regards,
Rick C. Hodgin
0
Rick
10/23/2014 12:25:08 AM
In article <AwP1w.579886$Au1.275395@fx36.am4>, invalid@see.sig.invalid 
says...
> 
> BartC wrote:
> 
> > "Rick C. Hodgin" <rick.c.hodgin@gmail.com> wrote in message
> > news:d9ce5ba5-196c-497b-b0e8-47eb31bfd654@googlegroups.com...
> >> On Tuesday, October 21, 2014 9:03:14 AM UTC-4, David Brown wrote:
> >>> I already own several copies of the book you are thinking of, and have
> >>> read a solid chunk of it in the past (at a guess, perhaps 30-40% of the
> >>> NT, and maybe 5-10% of the OT).  It has very little to say on the
> >>> subjects of quantum mechanics and C programming.
> >>
> >> You will be consumed by your arrogance and pride when you discover what
> >> it says regarding the power of God. :-(
> >>
> >> I would save you from that day by imploring you:  Read it again.  Today.
> >> Seek the Truth and you WILL find it.  He is found, when you seek Him
> >> with all your heart (a wholly purposed effort to search for the truth).
> > 
> > (Somewhat off-topic, possibly blasphemous too)
> 
> Oh, I don't know that it's blasphemous. It is, however, OT, as you say. 
> Still, we're here now. And I do manage to drag C into this reply at one 
> point, albeit briefly.
> 
> > (I would love to be able the read that book. I've got a beautiful
> > annotated and illustrated edition printed in 1792, and it's huge.
> 
> Family heirloom?
> 
> > But as for reading it rather than just looking, unfortunately it's also
> > the dullest, most boring and badly written book I've ever seen in my life.
> 
> Ah now, you're judging the authors by today's standards. The Bible was 
> written over some eight hundred (or so) years, by many different people, at 
> a time when writing was very much in alpha. Since its completion, we've had 
> two thousand years of incremental improvements in writing style, and you've 
> become used to much better writing nowadays than was common at the time the 
> Bible was being penned (and that doesn't stop our culture from producing 
> some utter tripe at frequent intervals).
> 
> And yet the Bible contains some remarkably beautiful language, although I 
> will accept that this is not necessarily evident on every page.
> 
> Example, he says. Come on, he says, show me a well-written bit. All right, 
> hang on. Psalm 139, for example. Adjust your translation according to taste. 
> Biblegateway has a few different versions. Not the only well-written bit, by 
> the way, not by a long chalk - just one of many examples.
> 
> So, to take your other charge, is the Bible dull and boring? Yes, perhaps, 
> but in the same way that a treatise on mathematics might be dull and boring 
> to a chav, or in the same way that K&R might be dull and boring to an arts 
> student, or a paper on the public sector borrowing requirement might be dull 
> and boring to me. To those who "buy into" the Bible, it is far from boring. 
> In fact, it's fascinating.
> 
> Even for those who don't buy into it, it has its moments. Jael and the tent-
> peg (Judges 4:14ff), Elisha and the bears (2 Kings 2:23ff), David and 
> Goliath... ooh, while we're on the subject, let's look at the theology of 
> David and Goliath. It's quite interesting in its own little way. David uses 
> a stone to kill Goliath, right? And then he chops off Goliath's head (1 
> Samuel 17:48-51). This is actually in obedience to the blasphemy law in 
> Leviticus 24:14, but there's a twist. Leviticus requires that the blasphemer 
> be taken outside the camp (well, Goliath /was/ outside the camp, so that was 
> okay), and the witnesses to his blasphemy are to lay hands on his head, and 
> then he's to be stoned. But David does this the other way round. (That's the 
> bit I find quite interesting in its own little way.) Why? Because David's 
> not very tall, and Goliath is a giant. By stoning him first, David causes 
> Goliath to fall, so that he can /reach/ his head.
> 
> (Okay, it was barbaric, but actually the Levitical punishments are, on the 
> whole, considerably /less/ barbaric than was the case for comparable 
> contemporaneous cultures, so it actually represented a significant step 
> forward in humane treatment. And how can there be a more barbaric punishment 
> than to stone a blasphemer to death? Easy. Stone his family, his friends, 
> even his cattle...)
> 
> Anyway, so David actually acts strictly in accordance with the Law, but 
> practical constraints require him to be a little flexible, rather like a 
> purist C programmer who /has/ to know the size of a file without reading it 
> first, so he has to cheat a bit and use stat() or filelength().
> 
> Maybe little twists like that aren't your cup of tea. But don't dismiss the 
> Bible just because it's got lots of genealogical and census data. Even those 
> who don't subscribe to its premise can still find something of interest in it.
> 
> > And those 31,000 verses - talk about repetition! They just look like the
> > same set of stock words and phrases but in lots of different permutations.
> 
> There are some big blocks that are more or less repeated, yes, but not so 
> very many of them really.
> 
> > Still, it does look splendid enough to see that people used to take this
> > stuff very seriously.
> 
> People still do.
> 
> > And must have been an astonishing feat of
> > type-setting for the time.)
> 
> Indeed.
> 
> A quick question if I may - do you find Shakespeare boring?

Anybody who saw
<https://www.youtube.com/watch?feature=player_embedded&v=ToHhQUhdyBY>
in first run and wasn't profoundly moved has no soul.  Alas, many
atheists seem to fall into that
category.


0
J
10/23/2014 6:20:18 AM
13/10/2014 23:06, BartC wrote:
>
>
> "Jorgen Grahn" <grahn+nntp@snipabacken.se> wrote in message
> news:slrnm3od1b.1ks.grahn+nntp@frailea.sa.invalid...
>> On Sun, 2014-10-12, BartC wrote:
>>> I've just been working with a bit of C++ code for the first time (with a
>>> view to spending a couple of hours converting it to C. Or to *any*
>>> language
>>> that mere humans could understand, for that matter).
>>>
>>> But after that experience I don't think I'll ever complain about
>>> anything in
>>> C again!
>> ...
>>> Anyone reading this and is at the point of being undecided about
>>> whether to
>>> learn C++ or not, please do everyone else a favour and don't bother!
>>
>> Let me see that code, and I'll have an opinion about it.
>
> I did post a link to it in a reply, but here it is again:
>
> (The bit of code I was trying to convert is here:
> https://code.google.com/p/jpeg-compressor/source/browse/trunk/jpgd.cpp

I agree with you that it's impossible to understand that bit of C++ 
code, for the simple reason that it's one of the most inelegant uses of C++ 
I have ever seen. The code looks like the result of some unlucky 
experiments in C++ attempted by some "old" C programmer: unfortunately, 
there are some C programmers who have had an exposure to C to the point 
that they are now mentally mutilated "beyond hope of regeneration".

0
Luca
10/23/2014 9:08:19 PM
Le 23/10/2014 23:08, Luca Risolia a écrit :

> I agree with you that it's impossible to understand that bit of C++
> code, for the simple reason that it's one of the most inelegant uses of C++
> I have ever seen. The code looks like the result of some unlucky
> experiments in C++ attempted by some "old" C programmer: unfortunately,
> there are some C programmers who have had an exposure to C to the point
> that they are now mentally mutilated "beyond hope of regeneration".
>

This arrogance can't come from anybody else but from a C++ fanatic. 
Exposure to C++ makes your mind a pile of rubbish, so that you can think 
that people that program in C, a widely used programming language, have 
their brains mutilated.

I do not know what that code does, if it is readable or not, whatever. I 
just can't support the unfathomable arrogance of the C++ guys and their 
continuous insults, especially when they come to this group to vomit 
their preconceived ideas about us.

Yes, I am a C programmer and I have been using this computer language 
for a long time.

SO WHAT?

Is my brain "beyond hope of regeneration"? Maybe, I think I will never 
interest myself in C++. It suffices to see the effects of that 
programming language in people like you to see that it is not a good idea.


0
jacob
10/23/2014 10:18:28 PM
Il 24/10/2014 00:18, jacob navia ha scritto:
> Le 23/10/2014 23:08, Luca Risolia a écrit :
>
>> I agree with you that it's impossible to understand that bit of C++
>> code, for the simple reason that it's one of the most inelegant uses of C++
>> I have ever seen. The code looks like the result of some unlucky
>> experiments in C++ attempted by some "old" C programmer: unfortunately,
>> there are some C programmers who have had an exposure to C to the point
>> that they are now mentally mutilated "beyond hope of regeneration".
>>
>
> This arrogance can't come from anybody else but from a C++ fanatic.
> Exposure to C++ makes your mind a pile of rubbish, so that you can think
> that people that program in C, a widely used programming language, have
> their brains mutilated.
>
> I do not know what that code does, if it is readable or not, whatever. I
> just can't support the unfathomable arrogance of the C++ guys and their
> continuous insults, especially when they come to this group to vomit
> their preconceived ideas about us.
>
> Yes, I am a C programmer and I have been using this computer language
> for a long time.
>
> SO WHAT?
>
> Is my brain "beyond hope of regeneration"? Maybe, I think I will never
> interest myself in C++. It suffices to see the effects of that
> programming language in people like you to see that it is not a good idea.

I think I did not offend anyone and did not mean to offend anyone. If you 
read more carefully, I deliberately said *some* C programmers. To put it in other 
words, some people with a strong C background claiming that C++ is 
subtle and overcomplicated simply cannot/don't want to change their mind 
(with regard to programming). I personally don't see anything bad in 
this. It's just that you cannot program in C++ as you program in C: you 
get the worst from both the languages. That was my point. Ok ?

0
Luca
10/23/2014 10:54:30 PM
On 10/23/2014 06:54 PM, Luca Risolia wrote:
....
> this. It's just that you cannot program in C++ as you program in C: you 
> get the worst from both the languages. That was my point. Ok ?

If that was your point, I'll have to disagree with it. You can program
in C++ almost exactly as you programmed in C - there's just a few
special cases you need to worry about (see Appendix C of the C++
standard for details). The resulting programs will work in C++ just
about as efficiently as they would have if compiled in C. You're not
taking advantage of any of C++'s special features that way, but you're
also not suffering any significant disadvantages from using a C++
compiler as if it were essentially a C compiler.
0
James
10/23/2014 11:12:42 PM
Il 24/10/2014 01:12, James Kuyper ha scritto:
> On 10/23/2014 06:54 PM, Luca Risolia wrote:
> ...
>> this. It's just that you cannot program in C++ as you program in C: you
>> get the worst from both the languages. That was my point. Ok ?
>
> If that was your point, I'll have to disagree with it. You can program
> in C++ almost exactly as you programmed in C - there's just a few
> special cases you need to worry about (see Appendix C of the C++
> standard for details). The resulting programs will work in C++ just
> about as efficiently as they would have if compiled in C. You're not
> taking advantage of any of C++'s special features that way, but you're
> also not suffering any significant disadvantages from using a C++
> compiler as if it were essentially a C compiler.

That's evident but not what I was talking about (read my first comment 
as well)...
0
Luca
10/23/2014 11:23:09 PM
On 10/23/2014 07:23 PM, Luca Risolia wrote:
> Il 24/10/2014 01:12, James Kuyper ha scritto:
>> On 10/23/2014 06:54 PM, Luca Risolia wrote:
>> ...
>>> this. It's just that you cannot program in C++ as you program in C: you
>>> get the worst from both the languages. That was my point. Ok ?
>>
>> If that was your point, I'll have to disagree with it. You can program
>> in C++ almost exactly as you programmed in C - there's just a few
>> special cases you need to worry about (see Appendix C of the C++
>> standard for details). The resulting programs will work in C++ just
>> about as efficiently as they would have if compiled in C. You're not
>> taking advantage of any of C++'s special features that way, but you're
>> also not suffering any significant disadvantages from using a C++
>> compiler as if it were essentially a C compiler.
> 
> That's evident but not what I was talking about (read my first comment 
> as well)...

I'm not sure whether "first comment" refers to the immediately preceding
sentence, or the immediately preceding message. Either way,
I don't see how your earlier comments change anything that would render
your later comments valid. Can you explain how they do?

Your later comments would make a lot more sense if C++ were more
different from C than it actually is - but C++ was designed from the
beginning with backwards compatibility with C in mind, and for the most
part it has achieved that goal.
0
James
10/23/2014 11:39:09 PM
Rick C. Hodgin wrote:
> Everybody dies in the flesh.  Nobody has to die in eternity ... except
> for those who refuse to accept His free offer of salvation.

Boy, you make it sound like a protection racket...

-- 
Joost Kremers                                   joostkremers@fastmail.fm
Selbst in die Unterwelt dringt durch Spalten Licht
EN:SiS(9)
0
Joost
10/23/2014 11:56:04 PM
On Thursday, October 23, 2014 7:12:53 PM UTC-4, James Kuyper wrote:
> ...You can program
> in C++ almost exactly as you programmed in C - there's just a few
> special cases you need to worry about (see Appendix C of the C++
> standard for details).

Try that around here, and you'll be run out of town on a rail. :-)

Seriously though, in my experience, very few people who begin to
program in C++ stop at the class, or some of the tightening aspects,
and some of the loosening aspects.  That is my personal desire for
the C/C++ combination I develop in, though I am using the class
less and less as I progress.

Best regards,
Rick C. Hodgin
0
Rick
10/24/2014 12:02:26 AM

"James Kuyper" <jameskuyper@verizon.net> wrote in message 
news:54498B6A.2070002@verizon.net...
> On 10/23/2014 06:54 PM, Luca Risolia wrote:
> ...
>> this. It's just that you cannot program in C++ as you program in C: you
>> get the worst from both the languages. That was my point. Ok ?
>
> If that was your point, I'll have to disagree with it. You can program
> in C++ almost exactly as you programmed in C

I'll have to disagree with /that/. People are not going to seek out C++ just 
so they can continue to program in C but with a slower, more elaborate 
compiler (that will also display incomprehensible error messages every time 
they inadvertently hit on some obscure C++ feature).

They will want to make use of all those goodies they've heard about. Or for 
various reasons they are obliged to use it and have to utilise C++ style 
interfaces to do so. And by C++ I don't mean a language that has a C subset, 
I mean everything else that makes the language controversial.

-- 
Bartc 

0
BartC
10/24/2014 12:49:41 AM
On 24/10/14 02:49, BartC wrote:
>
>
> "James Kuyper" <jameskuyper@verizon.net> wrote in message
> news:54498B6A.2070002@verizon.net...
>> On 10/23/2014 06:54 PM, Luca Risolia wrote:
>> ...
>>> this. It's just that you cannot program in C++ as you program in C: you
>>> get the worst from both the languages. That was my point. Ok ?
>>
>> If that was your point, I'll have to disagree with it. You can program
>> in C++ almost exactly as you programmed in C
>
> I'll have to disagree with /that/. People are not going to seek out C++
> just so they can continue to program in C but with a slower, more
> elaborate compiler (that will also display incomprehensible error
> messages every time they inadvertently hit on some obscure C++ feature).
>
> They will want to make use of all those goodies they've heard about. Or
> for various reasons they are obliged to use it and have to utilise C++
> style interfaces to do so. And by C++ I don't mean a language that has a
> C subset, I mean everything else that makes the language controversial.
>

Actually, it is not uncommon to use a C++ compiler for "ABC" - "A Better 
C".  You make use of C++'s slightly stricter handling of types, void* 
pointers, etc., and improvements to the C subset such as using static 
consts for sizing arrays.

Most C++ compilers will compile at virtually the same speed as their 
matching C compiler, for the same source - and they will give the same 
error messages (unless, as you say, you hit an odd C++ feature).

You can then start using additional features of the C++ language that 
are arguably still in the spirit of C, such as namespaces or constexpr, 
and possibly overloading (some people think it should be in C), or 
default values.  But you avoid classes, exceptions, templates, and 
lambdas as these are certainly C++ features that are not in line with C 
programming style.  (Clearly people will vary as to what they think is 
in "ABC" rather than full C++.)

Of course the step from ABC to "real" C++ is shorter than from C to C++, 
but it is common when transitioning code from C to C++.

0
David
10/24/2014 1:43:39 AM
Joost Kremers wrote:
> Rick C. Hodgin wrote:
>> Everybody dies in the flesh.  Nobody has to die in eternity ... except
>> for those who refuse to accept His free offer of salvation.
>
> Boy, you make it sound like a protection racket...
>


Nice soul you got there. Shame if anything were to happen to it...

--
Les Cargill
0
Les
10/24/2014 3:33:12 AM
On 2014-10-24, Rick C. Hodgin <rick.c.hodgin@gmail.com> wrote:
> On Thursday, October 23, 2014 7:12:53 PM UTC-4, James Kuyper wrote:
>> ...You can program
>> in C++ almost exactly as you programmed in C - there's just a few
>> special cases you need to worry about (see Appendix C of the C++
>> standard for details).
>
> Try that around here, and you'll be run out of town on a rail. :-)

Not by anyone who has a clue. The C-like subset of C++ is a dialect
of C. The name of the group is comp.lang.c, not comp.lang.iso-c-only.

ISO C has been influenced by more than one dialect, and a major one of
those has been C++.

C++ first had // comments(*), inline functions, declarations mixed
with statements, and declarations in the for (;;) syntax.

Compiler-specific dialects of C picked up some of these C++ features
also, before ISO C incorporated them.

ISO C has "C++ DNA" in it, whether anyone likes it or not.

(Speaking of which, every compiler has its own dialect, and it's usually the
case that that is what it accepts unless told to conform to ISO.
So there are as many dialects of C as there are compilers, including
C++ compilers.)

Bjarne Stroustrup invented the (void) declaration syntax for
prototyping functions as having no arguments, but withdrew it from
his own language at the urging of Dennis Ritchie, who thought it
was an "abomination"! (See? C++ and C people *talked*!)
C adopted (void) to retain backward compatibility, so that ()
could keep meaning "function with an unknown (but not variadic) parameter list".

Bjarne Stroustrup also independently invented "void *" pointers
at around the same time as some C people. (Both were making a monumental
mistake, and didn't know it; his mistake was smaller because he made
void * safer than the C people.)

---

* Actually one of the languages which inspired C, namely BCPL, has // comments.
Stroustrup may have been familiar with BCPL and re-introduced them into his C
dialect. Most C programmers knew these comments as a C++ feature available in
the native dialect of some C compilers. The /*...*/ comments appear in IBM's
PL/I language.
0
Kaz
10/24/2014 3:51:06 AM
On 2014-10-24, BartC <bc@freeuk.com> wrote:
>
>
> "James Kuyper" <jameskuyper@verizon.net> wrote in message 
> news:54498B6A.2070002@verizon.net...
>> On 10/23/2014 06:54 PM, Luca Risolia wrote:
>> ...
>>> this. It's just that you cannot program in C++ as you program in C: you
>>> get the worst from both the languages. That was my point. Ok ?
>>
>> If that was your point, I'll have to disagree with it. You can program
>> in C++ almost exactly as you programmed in C
>
> I'll have to disagree with /that/. People are not going to seek out C++ just 
> so they can continue to program in C but with a slower, more elaborate 
> compiler (that will also display incomprehensible error messages every time 
> they inadvertently hit on some obscure C++ feature).

I just ran a test. My TXR language builds in 12.8 seconds when compiled
with gcc, and 13.3 when compiled with g++.

This is only 3.9% difference; less than a year's worth of Moore's Law.

The executable size and performance are even closer, though on this
platform the C++-produced binary links an extra library (libgcc).
0
Kaz
10/24/2014 3:57:47 AM
On Wednesday, October 22, 2014 10:58:13 PM UTC+1, David Brown wrote:
>
> They also seem to think that there is something special about the books 
> of the Bible, while other contemporary texts (such as the Gospel of 
> Judas, or the Gnostic writings) are heretical.  They forget that the 
> choice of what to put in the Bible was made by non-religious bureaucrats 
> who were concerned with bringing order back to the Roman society after 
> the chaos caused by intolerant Christians.  Books were picked for the 
> Bible if they strengthened the appearance of legitimacy for the Church 
> and encouraged Christians to obey the Church rather than have their own 
> faith.  Theology, historical accuracy, and even the "truth" (if there 
> were such a thing) were pretty much irrelevant.
>
I recommend my book 12 Common Atheist Arguments (refuted).

I won't discuss the issue further here out of respect to readers who
are irritated by religious discussions on a non-religious ng.
0
Malcolm
10/24/2014 5:07:21 AM
On Friday, October 24, 2014 1:51:40 AM UTC+1, Bart wrote:
>
> I'll have to disagree with /that/. People are not going to seek out
> C++ just so they can continue to program in C but with a slower, more
> elaborate compiler (that will also display incomprehensible error 
> messages every time they inadvertently hit on some obscure C++ feature).
> 
> They will want to make use of all those goodies they've heard about.
> Or for various reasons they are obliged to use it and have to utilise
> C++ style interfaces to do so. And by C++ I don't mean a language that
> has a C subset, 
> I mean everything else that makes the language controversial.
> 
I've got to use C++ at the moment because I'm writing code which
is for a plug-in to a commercial product. So all the interfaces are
in C++. It's pretty C-like, basically you acquire "suites" which
consist of structures of function pointers, then call the environment
through them. But the odd type is a C++ class rather than a plain
structure. I don't know if you could get it working in C, but it's
not worth the bother.

However of course you can easily call components written in C.
0
Malcolm
10/24/2014 5:18:52 AM
On Friday, October 24, 2014 4:51:17 AM UTC+1, Kaz Kylheku wrote:
>  
> Bjarne Stroustrup also independently invented "void *" pointers
> at around the same time as some C people. (Both were making a monumental
> mistake, and didn't know it; his mistake was smaller because he made
> void * safer than the C people.)
> 
Baby X is a simple windowing toolkit.

Almost every widget takes a callback and an arbitrary pointer when it
is created. So for example if you want a button it's:

MYAPP app;  /* context for the button */

app.message = "Hello Baby X"; /* some data created somewhere */

BABYX *bbx; /* connection to Baby X system */
BBX_Button *mybutton;

mybutton = bbx_button(bbx, "Press me", mybuttonpress, &app);

....
void mybuttonpress(void *ptr)
{
   MYAPP *app = ptr;

   printf("%s\n", app->message);
}

How would you do this differently, eliminating the void * ?
0
Malcolm
10/24/2014 5:28:03 AM
David Brown wrote:

> Actually, it is not uncommon to use a C++ compiler for "ABC" - "A Better
> C".

ITYM "ADC" - A Different C. ADC is indisputable. The jury is still out on 
ABC.

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/24/2014 6:16:10 AM
On 24/10/14 07:07, Malcolm McLean wrote:
> On Wednesday, October 22, 2014 10:58:13 PM UTC+1, David Brown wrote:
>>
>> They also seem to think that there is something special about the books 
>> of the Bible, while other contemporary texts (such as the Gospel of 
>> Judas, or the Gnostic writings) are heretical.  They forget that the 
>> choice of what to put in the Bible was made by non-religious bureaucrats 
>> who were concerned with bringing order back to the Roman society after 
>> the chaos caused by intolerant Christians.  Books were picked for the 
>> Bible if they strengthened the appearance of legitimacy for the Church 
>> and encouraged Christians to obey the Church rather than have their own 
>> faith.  Theology, historical accuracy, and even the "truth" (if there 
>> were such a thing) were pretty much irrelevant.
>>
> I recommend my book 12 Common Atheist Arguments (refuted).
> 
> I won't discuss the issue further here out of respect to readers who
> are irritated by religious discussions on a non-religious ng.
> 

That's fair enough.  It would be interesting to discuss this (especially
with someone who has thought about this enough to write a book, rather
than someone who simply generates random fire-and-brimstone posts).

It is unlikely that I will read your book - I simply don't have the
spare time to devote to it - but I appreciate the reference, and I will
at least read some online reviews to get an idea of it.

0
David
10/24/2014 9:35:39 AM
On 24/10/14 08:16, Richard Heathfield wrote:
> David Brown wrote:
> 
>> Actually, it is not uncommon to use a C++ compiler for "ABC" - "A Better
>> C".
> 
> ITYM "ADC" - A Different C. ADC is indisputable. The jury is still out on 
> ABC.
> 

That is in the eye of the beholder.  I am sure that most people using
such a C-like subset of C++ would consider it to be "better", or they
would not be using it.

Fortunately, there is no judge or jury here, nor any requirement to
decide one way or the other - different people and different projects
have different needs and opinions.

0
David
10/24/2014 9:40:49 AM
Joost Kremers <joost.m.kremers@gmail.com> wrote:

> Rick C. Hodgin wrote:
> > Everybody dies in the flesh.  Nobody has to die in eternity ... except
> > for those who refuse to accept His free offer of salvation.
> 
> Boy, you make it sound like a protection racket...

Nope. It really _is_ free.

But it's out of place here. Stop it, Rick.

Richard
0
raltbos
10/24/2014 10:17:41 AM
Luca Risolia <luca.risolia@linux-projects.org> wrote:

> I think I did not offend anyone and did not mean to offend anyone. If you 
> read more carefully, I deliberately said *some* C programmers. To put it in other 
> words, some people with a strong C background claiming that C++ is 
> subtle and overcomplicated simply cannot/don't want to change their mind 
> (with regard to programming).

I don't think you'll find many C programmers claiming that C++ is
subtle. In fact, its main problem is that it lacks any and all subtlety;
overcomplicated, yes, that it is, and the result is that it's about as
subtle as a Dan Brown plot.

If you want subtle, there are much better OO languages out there.

Richard
0
raltbos
10/24/2014 10:19:44 AM
On Friday, October 24, 2014 6:17:51 AM UTC-4, Richard Bos wrote:
> > Rick C. Hodgin wrote:
> > > Everybody dies in the flesh.  Nobody has to die in eternity ... except
> > > for those who refuse to accept His free offer of salvation.
> But it's out of place here. Stop it, Rick.

A warning, Richard:  the voice asking one to stop speaking and teaching
others about the Lord Jesus Christ only comes from one source -- Satan.

Matthew 28:18+
18 And Jesus came and spake unto them, saying, All power is given unto
me in heaven and in earth.
19 Go ye therefore, and teach all nations, baptizing them in the name of
the Father, and of the Son, and of the Holy Ghost:
20 Teaching them to observe all things whatsoever I have commanded you:
and, lo, I am with you alway, even unto the end of the world. Amen.

Poignant points:

    (1) All power.
    (2) Go and teach.
    (3) Teaching them all things He has commanded.

This is a continuous, unending effort that is not suppressed by other
things in life, but rather remains first and foremost in all, except
for those things the Lord allows for us to be suppressed in.

Men are dying, Richard.  It is a real war, and there are no places
which are not battlegrounds.  We need more men speaking about Jesus
Christ, with the object of their focus and attention being obedience
to the Lord's command (a new command I give you, that you love one
another as I have loved you, so you must love one another), that our
love is not upon worldly affections, but rather eternal considerations.

Jesus took our punishment so that we could be set free.  It is in no
wise at any point for us to heed Satan, or the voice of Satan speaking
through men who would say, "be silent" when it comes to any and all
matters of the Lord.

He is Lord of lords.  King of kings.  It is never, in any circumstance,
at any point and time, in any place or location, inappropriate to speak
of Him and His Kingdom to those who are perishing, because some of those
who hear might hear and come out and be saved ... the very reason the
Lord came to the Earth, suffered, and died, and then rose again, so that
those who die with Him might live with Him.

http://biblehub.com/matthew/16-23.htm
"23 But he turned, and said unto Peter, Get thee behind me, Satan: thou
art an offence unto me: for thou savourest not the things that be of God,
but those that be of men."

A warning on the necessity of serving our Lord before all other things,
including the misguided advice of Satan speaking through men's mouths.

Best regards,
Rick C. Hodgin
0
Rick
10/24/2014 10:39:26 AM
On Friday, October 24, 2014 11:39:48 AM UTC+1, Rick C. Hodgin wrote:
> On Friday, October 24, 2014 6:17:51 AM UTC-4, Richard Bos wrote:
>
> A warning, Richard:  the voice asking one to stop speaking and teaching
> others about the Lord Jesus Christ only comes from one source -- Satan.
> 
No it doesn't.

I'm the author of 12 Common Atheist Arguments (refuted). So I've got credentials
as a Christian apologist.

But I'm asking you to discuss C in C programming forums and religion, if you
like, on religious forums. (There are plenty of atheist trolls posting to the
catholic ng if you like doing battle).
0
Malcolm
10/24/2014 11:20:50 AM
David Brown wrote:

> On 24/10/14 08:16, Richard Heathfield wrote:
>> David Brown wrote:
>> 
>>> Actually, it is not uncommon to use a C++ compiler for "ABC" - "A Better
>>> C".
>> 
>> ITYM "ADC" - A Different C. ADC is indisputable. The jury is still out on
>> ABC.
>> 
> 
> That is in the eye of the beholder.  I am sure that most people using
> such a C-like subset of C++ would consider it to be "better", or they
> would not be using it.
> 
> Fortunately, there is no judge or jury here, nor any requirement to
> decide one way or the other - different people and different projects
> have different needs and opinions.

This is precisely why all such language wars are fruitless and pointless. 
Cultural diversity is extraordinarily important, and the rich variety of 
languages at our disposal is a strength, not a weakness.

C++ is my second-favourite language. There are times when I choose it over 
C. Mostly, however, I use C. But I like them both, and use them both. It's 
odd that people think there has to be some kind of competition between them.

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/24/2014 11:51:07 AM
Rick C. Hodgin wrote:

> On Friday, October 24, 2014 6:17:51 AM UTC-4, Richard Bos wrote:
>> > Rick C. Hodgin wrote:
>> > > Everybody dies in the flesh.  Nobody has to die in eternity ...
>> > > except for those who refuse to accept His free offer of salvation.
>> But it's out of place here. Stop it, Rick.
> 
> A warning, Richard:  the voice asking one to stop speaking and teaching
> others about the Lord Jesus Christ only comes from one source -- Satan.

He isn't saying you should stop speaking and teaching. He's saying you 
should use a forum where it's appropriate.

When you go to the cinema, do you use the opportunity to stand up in front 
of the screen and preach to the hundreds of people in the audience?

If not, why not?

And if so, how many times have you been arrested for antisocial behaviour?

Your behaviour here is no less antisocial just because you can't be arrested 
here.

Antisocial behaviour is not a good witness.

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/24/2014 11:55:59 AM
"Kaz Kylheku" <kaz@kylheku.com> wrote in message
news:20141023205329.989@kylheku.com...
> On 2014-10-24, BartC <bc@freeuk.com> wrote:

>> I'll have to disagree with /that/. People are not going to seek out C++
>> just
>> so they can continue to program in C but with a slower, more elaborate
>> compiler

> I just ran a test. My TXR language builds in 12.8 seconds when compiled
> with gcc, and 13.3 when compiled with g++.

Some tests I did between g++ and gcc showed g++ 40% slower, even after
several compiles and everything being cached.

But then after everything settled down, the difference reduced to around 15%
(for 32-bit compiles in Windows; 10% for 64-bit). This is without any
optimisation enabled.

However, even gcc was 300% slower than the fastest C compiler (and 200%
slower than the next fastest). (Bear in mind that C compilers tend to be
written in C, and compiled with themselves. Since gcc produces very fast
code, it suggests the differences would be even greater if all compilers
were compiled with the same standard compiler!)

So if people don't care about slow C compilation speeds anyway, then they're
not going to be bothered about 10% or 20% overhead for C++.

(Why is gcc so slow even when compiling C and even after being compiled
(presumably) with gcc -O3? This is for non-optimised output code, remember.
Is it possible that it is arranged internally to accommodate compiling 
source code as C++? In that case, you will /already/ be experiencing a 
slowdown for C++, even when you think you're compiling C!)

(BTW the fastest 32-bit Windows compiler in my test was lccwin, and the next
fastest was DMC. My own compiler for the equivalent code was roughly the
same speed as lccwin, and that compiler is interpreted!)

-- 
Bartc 

0
BartC
10/24/2014 11:57:00 AM
On 24/10/2014 13:51, Richard Heathfield wrote:
> C++ is my second-favourite language. There are times when I choose it over
> C. Mostly, however, I use C. But I like them both, and use them both. It's
> odd that people think there has to be some kind of competition between them.

In this subthread I was answering a guy that said:

<quote>
The code looks like the result of some unlucky experiments in C++ 
attempted by some "old" C programmer: unfortunately, there are some C 
programmers who have had an exposure to C to the point that they are now 
mentally mutilated "beyond hope of regeneration".
<end quote>

My answer was:
<quote>
I do not know what that code does, if it is readable or not, whatever. I 
just can't support the unfathomable arrogance of the C++ guys and their 
continuous insults, especially when they come to this group to vomit 
their preconceived ideas about us.

Yes, I am a C programmer and I have been using this computer language 
for a long time.
<end quote>

Then, Mr Kuyper answers and starts a different thread, ignoring what I 
said, and ignoring all insults by this person.

And then you conclude (ahh so nicely above everything) with your 
consensual stuff, without EVER bothering to address the fact that for a 
C++ programmer a C programmer is somehow a mentally ill person.

And that arrogance since the start of C++

It has been almost 20 years of insults, and somehow you (and nobody in 
this group) will deem necessary to ask the c++ people to stop.

Well, it is normal since all of you are mentally retarded?






0
jacob
10/24/2014 12:01:06 PM
"Richard Heathfield" <invalid@see.sig.invalid> wrote in message
news:M2r2w.767126$an2.567598@fx09.am4...
> David Brown wrote:

>> Fortunately, there is no judge or jury here, nor any requirement to
>> decide one way or the other - different people and different projects
>> have different needs and opinions.
>
> This is precisely why all such language wars are fruitless and pointless.
> Cultural diversity is extraordinarily important, and the rich variety of
> languages at our disposal is a strength, not a weakness.
>
> C++ is my second-favourite language. There are times when I choose it over
> C. Mostly, however, I use C. But I like them both, and use them both. It's
> odd that people think there has to be some kind of competition between
> them.

But sometimes it is rammed down your throat.

You can say the same about C, but while C is largely transparent, C++ is
generally opaque.

It means the difference between being able to proceed with something, and
coming to a dead end.

-- 
Bartc 

0
BartC
10/24/2014 12:11:41 PM
BartC wrote:

> "Richard Heathfield" <invalid@see.sig.invalid> wrote in message
> news:M2r2w.767126$an2.567598@fx09.am4...
>> David Brown wrote:
> 
>>> Fortunately, there is no judge or jury here, nor any requirement to
>>> decide one way or the other - different people and different projects
>>> have different needs and opinions.
>>
>> This is precisely why all such language wars are fruitless and pointless.
>> Cultural diversity is extraordinarily important, and the rich variety of
>> languages at our disposal is a strength, not a weakness.
>>
>> C++ is my second-favourite language. There are times when I choose it
>> over C. Mostly, however, I use C. But I like them both, and use them
>> both. It's odd that people think there has to be some kind of competition
>> between them.
> 
> But sometimes it is rammed down your throat.

<shrug>
The world has no shortage of people who want everybody to be like them, to 
do what they do, like what they like, use what they use, believe what they 
believe. We can choose to react vehemently to such people, or we can choose 
to accept that they're not going to change no matter what we say to them.

The existence of C++ zealots does not mean I can't use C. It only means that 
there are such things as C++ zealots. We can waste our time arguing with 
them, or we can get on with writing C programs, and discussing them here. I 
prefer the latter course, don't you?

<snip>

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/24/2014 12:24:19 PM
"Richard Heathfield" <invalid@see.sig.invalid> wrote in message 
news:Uxr2w.506232$177.444153@fx04.am4...
> BartC wrote:
>> "Richard Heathfield" <invalid@see.sig.invalid> wrote in message

>>> C++ is my second-favourite language.

>> But sometimes it is rammed down your throat.
>
> <shrug>
> The world has no shortage of people who want everybody to be like them, to
> do what they do, like what they like, use what they use, believe what they
> believe. We can choose to react vehemently to such people, or we can 
> choose
> to accept that they're not going to change no matter what we say to them.
>
> The existence of C++ zealots does not mean I can't use C. It only means 
> that
> there are such things as C++ zealots. We can waste our time arguing with
> them, or we can get on with writing C programs, and discussing them here. 
> I
> prefer the latter course, don't you?

I'm not having a go at people. Only sometimes you need a piece of code, or 
an algorithm, or a library or API, and it's in impenetrable C++. With C, at 
least I have a fighting chance.

Some people are suggesting that I learn C++, but that's like suggesting I 
learn Chinese when I can't find some article, document or manual in English! 
(Which I have actually tried, and it's not easy.)

-- 
Bartc 

0
BartC
10/24/2014 12:35:47 PM
jacob navia wrote:

> On 24/10/2014 13:51, Richard Heathfield wrote:
>> C++ is my second-favourite language. There are times when I choose it
>> over C. Mostly, however, I use C. But I like them both, and use them
>> both. It's odd that people think there has to be some kind of competition
>> between them.
> 
> In this subthread I was answering a guy that said:
> 
> <quote>
> The code looks like the result of some unlucky experiments in C++
> attempted by some "old" C programmer: unfortunately, there are some C
> programmers who have had an exposure to C to the point that they are now
> mentally mutilated "beyond hope of regeneration".
> <end quote>

I've heard C programmers say the same thing about BASIC. I started with 
BASIC, and I still use it occasionally, admittedly mostly for old times' 
sake. The fact that some C programmers despise BASIC is of no consequence to 
me. If they think the worse of me for having started off my programming 
experience by using it, that's their problem, not mine.

> 
> My answer was:
> <quote>
> I do not know what that code does, if it is readable or not, whatever. I
> just can't support the unfathomable arrogance of the C++ guys and their
> continuous insults, specially when they come to this group to  vomit
> their preconceived ideas about us.
> 
> Yes, I am a C programmer and I have been using this computer language
> since a long time.
> <end quote>

I don't think it's constructive to attack people, even if we perceive them 
to be attacking us. It /is/ constructive to point out that C has its 
strengths and weaknesses, and so does C++. In some areas, C++ is stronger. 
In some areas (notably its simplicity), C is stronger. Surely we can discuss 
those issues with C++ aficionados without resorting to name-calling, even if 
there is name-calling coming in the other direction.

There is a world of difference between a conversation like this:

A: You're an idiot to use C. Use C++ instead.
B: Under <itemised> circumstances, I use C instead of C++ because <reasons>. 
Do you see any flaw in my reasoning?
A: Yes, you're a bonehead for using C.
B: Please provide evidence for that.

and a conversation like this:

A: You're an idiot to use C. Use C++ instead.
B: You're a fathead.
A: Loser!
B: Arrogant fool!

Do you see the difference? The first conversation marks A out to be a fool, 
and B to be a wise man. In the second conversation, they are both fools.

I prefer to aim for wisdom.

> Then, Mr Kuyper answers and starts a different thread, ignoring what I
> said, and ignoring all insults by this person.

Ignoring insults is often the best way to proceed.

> 
> And then you conclude (ahh so nicely above everything) with your
> consensual stuff, without EVER bothering to address the fact that for a
> C++ programmer a C programmer is somehow a mentally ill person.

Some C++ programmers might say such a thing, but most of those don't really 
mean it. They are just using hyperbole. Most C++ programmers wouldn't say 
such a thing, because they're too busy writing C++ programs. More power to 
them. Even those C++ programmers who both say and mean such a thing are not 
going to be persuaded to change their behaviour by being called names.

And of course many C++ programmers are also C programmers, and vice versa. 
It is a strange kind of programmer who (effectively) calls /himself/ names, 
except perhaps after a particularly long and drawn-out debugging session.

> 
> And that arrogance since the start of C++
> 
> It has been almost 20 years of insults, and somehow you (and nobody in
> this group) will deem necessary to ask the c++ people to stop.

What good would that do? Have /you/ tried it? Yes. Did it work? No. Why bang 
your head against a brick wall? It's pointless. There are more productive 
ways to spend our time.

> 
> Well, it is normal since all of you are mentally retarded?

It is possible that any one of us, whether a C programmer, a C++ programmer, 
or both, could be mentally retarded. How would we know? But we work on the 
assumption that we are not. I work on the assumption that C++ programmers, 
like C programmers, and like those who use both languages, are mostly 
ordinary hard-working people like ourselves. The abusive minority /is/ a 
minority. Let us not waste too much time trying to persuade the 
unpersuadable. And let us waste /no/ time in insulting them. To do so merely 
reduces us to their level.

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/24/2014 12:40:04 PM
Bartc wrote:
> So if people don't care about slow C
> compilation speeds anyway, then they're not
> going to be bothered about 10% or 20%
> overhead for C++

Are you talking 10% or 20% slowdown in compilation
time? Or runtime execution time?

Best regards,
Rick C. Hodgin
0
Rick
10/24/2014 1:20:47 PM

"Rick C. Hodgin" <rick.c.hodgin@gmail.com> wrote in message 
news:5006969a-78e5-4c13-88f9-614a8e4ba889@googlegroups.com...
> Bartc wrote:
>> So if people don't care about slow C
>> compilation speeds anyway, then they're not
>> going to be bothered about 10% or 20%
>> overhead for C++
>
> Are you talking 10% or 20% slowdown in compilation
> time? Or runtime execution time?

Compilation time.

Execution time is much more difficult to measure, with various levels of 
optimisation and many categories of application to consider. But I'm not 
suggesting that C++ programs are slower than C ones.

-- 
Bartc  
0
BartC
10/24/2014 1:54:10 PM
On 24/10/14 13:51, Richard Heathfield wrote:
> David Brown wrote:
> 
>> On 24/10/14 08:16, Richard Heathfield wrote:
>>> David Brown wrote:
>>>
>>>> Actually, it is not uncommon to use a C++ compiler for "ABC" - "A Better
>>>> C".
>>>
>>> ITYM "ADC" - A Different C. ADC is indisputable. The jury is still out on
>>> ABC.
>>>
>>
>> That is in the eye of the beholder.  I am sure that most people using
>> such a C-like subset of C++ would consider it to be "better", or they
>> would not be using it.
>>
>> Fortunately, there is no judge or jury here, nor any requirement to
>> decide one way or the other - different people and different projects
>> have different needs and opinions.
> 
> This is precisely why all such language wars are fruitless and pointless. 
> Cultural diversity is extraordinarily important, and the rich variety of 
> languages at our disposal is a strength, not a weakness.
> 
> C++ is my second-favourite language. There are times when I choose it over 
> C. Mostly, however, I use C. But I like them both, and use them both. It's 
> odd that people think there has to be some kind of competition between them.
> 

If you view discussions like this as a "war", with aim of convincing
other people of the "one true path", then I agree that it would be
pointless.  But if you view them as an exchange of ideas, to improve
your ability to choose the right tool for the job in hand, then they can
be quite productive on occasion.

Like you, I think there is plenty of room for diversity - and different
languages suit different tasks.

0
David
10/24/2014 2:17:46 PM
"David Brown" <david.brown@hesbynett.no> wrote in message
news:m2cas6$e4l$1@dont-email.me...
> On 24/10/14 02:49, BartC wrote:

>> They will want to make use of all those goodies they've heard about. Or
>> for various reasons they are obliged to use it and have to utilise C++
>> style interfaces to do so. And by C++ I don't mean a language that has a
>> C subset, I mean everything else that makes the language controversial.
>>
>
> Actually, it is not uncommon to use a C++ compiler for "ABC" - "A Better
> C".  You make use of C++'s slightly stricter handling of types, void*
> pointers, etc., and improvements to the C subset such as using static
> consts for sizing arrays.

>(Clearly people will vary as to what they think is in "ABC" rather than
>full C++.)

Yes, I also had my own ideas, which can be summarised here:
http://pastebin.com/TNegH8bV

(Warning: there's quite a lot! This was created in response to a separate
discussion elsewhere.)

But these are generally fairly innocuous changes; the resulting language if
the ideas were all implemented wouldn't be at a much higher level; it
would just be /nicer/. It certainly wouldn't bring C a step closer to C++.

Using C++ would address hardly any of the points I listed.

-- 
Bartc 

0
BartC
10/24/2014 2:35:02 PM
On 10/23/2014 08:49 PM, BartC wrote:
> 
> 
> "James Kuyper" <jameskuyper@verizon.net> wrote in message 
> news:54498B6A.2070002@verizon.net...
>> On 10/23/2014 06:54 PM, Luca Risolia wrote:
>> ...
>>> this. It's just that you cannot program in C++ as you program in C: you
>>> get the worst from both the languages. That was my point. Ok ?
>>
>> If that was your point, I'll have to disagree with it. You can program
>> in C++ almost exactly as you programmed in C
> 
> I'll have to disagree with /that/. People are not going to seek out C++ just 
> so they can continue to program in C

Your expectations to the contrary notwithstanding, this is in fact a
fairly common practice. I've heard it referred to as "C++ in name only",
among other things.

In any event, I was only talking about what's possible - whether anybody
is actually doing what I referred to, is irrelevant to my comment.
0
James
10/24/2014 3:15:50 PM
On 2014-10-24, BartC <bc@freeuk.com> wrote:
> "Richard Heathfield" <invalid@see.sig.invalid> wrote in message
> news:M2r2w.767126$an2.567598@fx09.am4...
>> David Brown wrote:
>
>>> Fortunately, there is no judge or jury here, nor any requirement to
>>> decide one way or the other - different people and different projects
>>> have different needs and opinions.
>>
>> This is precisely why all such language wars are fruitless and pointless.
>> Cultural diversity is extraordinarily important, and the rich variety of
>> languages at our disposal is a strength, not a weakness.
>>
>> C++ is my second-favourite language. There are times when I choose it over
>> C. Mostly, however, I use C. But I like them both, and use them both. It's
>> odd that people think there has to be some kind of competition between
>> them.
>
> But sometimes it is rammed down your throat.
>
> You can say the same about C, but while C is largely transparent, C++ is
> generally opaque.

There are C programs which are opaque and there are C++ programs which
are opaque. Then there are transparent examples of both.

Some C programs which are opaque are precisely opaque because they
are filled with the bookkeeping that could be made invisible (and reliable)
in a higher level language. More than 50% of the C code has to do with
the programmer emulating the language that he or she *ought* to be using,
and it's done in a clumsy way which spreads this logic throughout the
program.

(A recent example of this are the internals of the "systemd" program.)

Even the use of something simple like using a table of function pointers to
simulate virtual functions can make a C program hard to follow.
Even though C++ does the same thing with virtual functions, it will be
easier because of the object oriented design which explicitly expresses
the inheritance relationships. Oh, and of course, the overrides of these
functions have the same darn names! In a C program, a virtual function
might be some  "void (*bar)(int)" member in some structure, but the
actual instances assigned to may have ad hoc names. In C++, the virtual
function base::bar will be called derived::bar, and we can hunt down
the derived class itself because it clearly inherits from base.

In the past, I've sometimes had to resort to debuggers to work this out!
I had no idea what function is being called in ptr->foo(), and it was
too hard to follow the source. Just put a breakpoint on it, and,
aha, the value of ptr->foo is the address of module_foo_impl.
0
Kaz
10/24/2014 6:56:19 PM
On Friday, October 24, 2014 7:56:35 PM UTC+1, Kaz Kylheku wrote:
>
> (A recent example of this are the internals of the "systemd" program.)
> Even the use of something simple like using a table of function
> pointers to simulate virtual functions can make a C program hard 
> to follow.
> Even though C++ does the same thing with virtual functions, it will be
> easier because of the object oriented design which explicitly expresses
> the inheritance relationships. Oh, and of course, the overrides of these
> functions have the same darn names!
>
Heavy use of function pointers makes a program difficult to follow.

But in a  mouse-event driven program, it's hard to avoid. The user
clicks on standard or conceptually isolated widgets, and somehow 
they have to be linked up to make a change in the internal model,
then synchronise the result across all the displays.

But going to C++ or a functional programming language doesn't make
the issue go away. You gain a bit on some aspects of the syntax
and the fact that the reading programmer knows what you are doing, 
but the inevitable side effect of that is loss of flexibility,
the worst being linking by name (this module calls a function by
the name "length", expecting the number of elements it can access)

If function pointers are short and sweet, such as a hash function
or a key extractor for a hash table, it's not too bad.

0
Malcolm
10/24/2014 7:18:12 PM
On 2014-10-24, Malcolm McLean <malcolm.mclean5@btinternet.com> wrote:
> On Friday, October 24, 2014 7:56:35 PM UTC+1, Kaz Kylheku wrote:
>>
>> (A recent example of this are the internals of the "systemd" program.)
>> Even the use of something simple like using a table of function
>> pointers to simulate virtual functions can make a C program hard 
>> to follow.
>> Even though C++ does the same thing with virtual functions, it will be
>> easier because of the object oriented design which explicitly expresses
>> the inheritance relationships. Oh, and of course, the overrides of these
>> functions have the same darn names!
>>
> Heavy use of function pointers makes a program difficult to follow.
>
> But in a  mouse-event driven program, it's hard to avoid. The user
> clicks on standard or conceptually isolated widgets, and somehow 
> they have to be linked up to make a change in the internal model,
> then synchronise the result across all the displays.
>
> But going to C++ or a functional programming language doesn't make
> the issue go away. You gain a bit on some aspects of the syntax

It means that a lot of the issue's clutter goes away from the code,
which partially substitutes for "goes away".
0
Kaz
10/24/2014 7:59:18 PM
On Friday, October 24, 2014 8:59:38 PM UTC+1, Kaz Kylheku wrote:
> 
> > But going to C++ or a functional programming language doesn't make
> > the issue go away. You gain a bit on some aspects of the syntax
> 
> It means that a lot of the issue's clutter goes away from the code,
> which partially substitutes for "goes away".
>
Function pointers convert the call graph from a tree to a web.

You don't get round that by prettying it up with virtual inheritance.
(Admittedly you can do the same with mutually recursive functions,
but you've got to try pretty hard to make a comparable mess).

 
0
Malcolm
10/24/2014 9:05:07 PM
On Tuesday, October 21, 2014 6:35:30 AM UTC-5, Rick C. Hodgin wrote:
> Made you think. :-)
> 
> Best regards,
> Rick C. Hodgin

No, you didn't; all you did was set off a bunch of bullshit detectors.  
0
John
10/24/2014 10:08:55 PM
Malcolm McLean wrote:
> On Friday, October 24, 2014 8:59:38 PM UTC+1, Kaz Kylheku wrote:
>>
>>> But going to C++ or a functional programming language doesn't make
>>> the issue go away. You gain a bit on some aspects of the syntax
>>
>> It means that a lot of the issue's clutter goes away from the code,
>> which partially substitutes for "goes away".
>>
> Function pointers convert the call graph from a tree to a web.
>
> You don't get round that by prettying it up with virtual inheritance.
> (Admittedly you can do the same with mutually recursive functions,
> but you've got to try pretty hard to make a comparable mess).

Some big advantages of virtual inheritance over function pointers 
are:

a) you have a much smaller set of possible callees.
b) the compiler is better placed to diagnose errors.
c) there are more opportunities for optimisation.

-- 
Ian Collins
0
Ian
10/24/2014 10:21:26 PM
Malcolm McLean <malcolm.mclean5@btinternet.com> writes:

> On Friday, October 24, 2014 8:59:38 PM UTC+1, Kaz Kylheku wrote:
>> 
>> > But going to C++ or a functional programming language doesn't make
>> > the issue go away. You gain a bit on some aspects of the syntax
>> 
>> It means that a lot of the issue's clutter goes away from the code,
>> which partially substitutes for "goes away".
>>
> Function pointers convert the call graph from a tree to a web.

How?  All they do (in C) is to prevent you knowing the name of the
called function.

A C program has only a fixed and finite number of functions that can be
called.  Thus a function pointer can (with a bit of hand-waving here) be
converted into a conditional expression (or a switch) using a plain
integer.  If such conditional calls are what you call a web, then
that term applies to at least some programs without function pointers.

(I want to avoid arguing about words, but using the normal meaning of
the term, the call graph of a C program need not be a tree.  Also, I
don't think "a web" is a widely used term, and it lacks the specificity
of the more usual "directed graph".)

<snip>
-- 
Ben.
0
Ben
10/24/2014 10:45:13 PM
Malcolm said:
>> Function pointers convert the call graph from a tree to a web.

Ben answered:
> How?  All they do (in C) is to prevent you knowing the name of the
> called function.

Function pointers can allow you to:

1) Use a generic name like "add" for a conceptually similar procedure. 
"Create" comes to mind.

2) They form the basis of functions as data, when passed around to a 
function or calculated from a table of functions.

3) You can group function pointers into interfaces, for establishing a 
set of related methods that work on a common data structure. I used 
this extensively in the containers library. Happily I am not in C++, so 
no pre-established rules decide for me how things should run: just 
the need for clarity, and the most efficient way of organizing the 
computation.

Other uses are:

o Converting switches to function tables and (maybe) an access method. 
Adding/deleting functionality in a table makes it much less hard-coded 
than in a switch statement. Basically, a function table with an access 
method is fully equivalent to a switch statement.

lcc-win will generate a table of "functions" from a switch if you ask it 
to, eliminating 100% of the switch overhead.

o Nothing prevents you from knowing the function that will be called if 
you care to document in the code, with clear names, what the function 
pointer does, and to describe in the associated docs the algorithm used 
for selecting the function pointer.

All the rest (C++ and C# lambda expressions, F#, Lisp, etc.) are just 
glorified function pointers.

What is a "call" instruction?

Push the current instruction pointer;
load the instruction pointer from the code just behind the call opcode. 
This is hardwired.

What is an indirect call instruction?

Push the current instruction pointer;
load from memory a pointer value with the next instruction pointer.

This extra indirection makes software vastly more powerful, and that 
is why all modern processors have such an instruction.

You can then:

o Define a data structure (a hash table, say)
o Define a table of function pointers that point to methods that create, 
update, and destroy the data structure. That is the interface of the 
data structure.

This way you can at run time slightly change the behavior of the object 
by substituting the function pointer with an enhanced function that 
calls the old function pointer after it is done with the added 
functionality, or calls it before doing anything, or both, or neither.

If you want to slightly change the way a button is drawn you call the 
default routine and then change the text, or you take over all painting 
but leave the interface with the rest of the software intact or you take 
over completely and build a completely new widget.

Or you discover that the library has a bug, you fix it and at run time 
you patch the library with your fix.

Etc, the applications are infinite.

0
jacob
10/24/2014 11:16:38 PM
On 2014-10-24, jacob navia <jacob@spamsink.net> wrote:
> ALL th rest of C++ C# lambda expressions, F#, lisp, etc are just 
> glorified function pointers.

Yes; glorified with this gaudy little ornament called an "environment"
which hardly does anything.

> What is a "call" instruction?

A challenge which summons the warrior to the terminal's keyboard.

> Push the current instruction point
> load the instruction pointer from the code just nehind the call opcode. 
> This is hardwired.
>
> What is an indirect call instruction?
>
> Push the current instruction point
> load from memory a pointer value with the next instruction point.
>
> This extra indirection makes software incredibly more powerful and that 
> is why all modern processors have such an instruction.

I'm not convinced. If you don't have indirect calling, I think you can still
create an interpreter for a virtual machine which does have indirect calling, so
the power is there. Indirect branches allow for efficiencies. Given
that we have instructions addressed in memory using some instruction pointer,
it makes sense to have the power to assign computed values to that pointer.
Even relocatable code, to an extent, relies on this. We don't think of it
as using function pointers because the offsets are fixed, but it's in effect
a form of indirection that lets it be relocatable: a "JUMP +500"
actually does "IP <- IP + 500", which is a computed jump.

> You can then:
>
> o Define a data structure (a hash table, say)
> o Define a table of function pointers that point to methods that create, 
> update, and destroy the data structure. That is the interface of the 
> data structure.
>
> This way you can at run time slightly change the behavior of the object 
> by substituting the function pointer with an enhanced function that 
> calls the old function pointer after it is done with the addded 
> functionality, or calls it before doing anything or both, or none.

Almost 15 years ago (December 1999) I made a container that can reorganize
itself into different kinds of trees at run-time. You simply tell it to change
its type: "I want you to be a red-black tree. Now be a splay tree. Now be just a
sorted list.  Now be a splay tree. Now be just an unbalanced binary tree ..."

http://www.kylheku.com/~kaz/austin.html

This works simply. We tell the object to flatten itself to a list
using the convert_to_list function pointer. Then we change its type by
switching the table of operations. Then we tell it to convert_from_list through
the new table of operations. All operations use the same node structure,
so this change works without reallocating any memory; it just re-interprets
some of the fields of the tree nodes.
0
Kaz
10/24/2014 11:40:56 PM
jacob navia <jacob@spamsink.net> writes:

> Malcom said:
>>> Function pointers convert the call graph from a tree to a web.
>
> Ben answered:
>> How?  All they do (in C) is to prevent you knowing the name of the
>> called function.
>
> Function pointers can allow you to:

<snip lots of uses>

Yes.  I did not mean to suggest that they are not useful, just that they
have no logical effect on the call graph.

> ALL th rest of C++ C# lambda expressions, F#, lisp, etc are just
> glorified function pointers.

But that's not true at all unless you mean "glorified" more liberally
than I suspect you do.  The glory of true higher-order functions is that
they permit all sort of programming patterns that are simply not
possible in C (always excepting the "all languages can implement each
other" answer).

Whether you want to use these ways of writing programs, or if they help
to make software more reliable, easier to maintain, or what have you is
another question, but it's a gross distortion to say that higher-order
functions are "just" glorified function pointers.

<snip>
-- 
Ben.
0
Ben
10/25/2014 12:09:28 AM
David Brown <david.brown@hesbynett.no> wrote:
<snip>
> You can then start using additional features of the C++ language that 
> are arguably still in the spirit of C, such as namespaces or constexpr, 
> and possibly overloading (some people think it should be in C), or 
> default values.

C supports limited overloading via _Generic, which avoids the name mangling
headaches because the author gets to define the symbol names.

C++ supports overridable default function parameter values. But C supports
overridable default structure initializer values via designated
initializers, a feature the C++ committee claims is useless.

Actually, because C++ only supports positional parameters, it's trivial to
mimic default parameter initialization in C. I've used constructs
such as this before:

  #define foo(...) foo(__VA_ARGS__, -1)
  void (foo)(int a, int b);

Granted, it still requires that the first parameter be explicit. But you can
use similar tricks to do function overloading by the number of positional
parameters. And using designated initializers, you can simulate named
parameters in C, something not even possible in C++.

  struct foo { int a; int b; };
  #define foo(...) foo((struct foo){ .b = -1, __VA_ARGS__ })
  void (foo)(struct foo args);

It's all ugly, though. _Less_ because of the macro hacks and _more_ because
it can be confusing when reading the code. When one line calls foo(x) and
the other calls foo(x, y), what are you supposed to make of that? You're
compelled to go read the code, no matter how well you use namespaces.

In fact, I find functional overloading to be problematic in general in terms
of adding needless complexity to code. Yes, it can be convenient. But
convenience should not be the litmus test for the soundness of a language
feature.

People confuse expressiveness with convenience. Many C++ features tend to be
convenient, not expressive. Operator overloading is a classic example.
Templates fall into a grey area. As originally conceived they were
expressive, but now they're _widely_ used to create impenetrable interfaces.
I'm often tempted to switch to C++ just to be able to use templates, but
then I realize that even the default library does evil with templates. And
then faced with the prospect of giving up everything that C brings, the
most important of which is simplicity (not merely by convention, but by
limitation!), I come to my senses.

Instead, I'm happy to use other languages with significantly more
expressiveness than C++, like Lua, which in many respects interoperates with
C even better than C++. For example, I can manipulate Lua coroutines and Lua
closures from C. I can even implement Lua closures and coroutines using C
functions. (For coroutines you register a continuation cookie along with the
C function; there's no games played with the C stack. It simply reinvokes
the C function and you jump back to where you yielded. It's all 100%
standards conformant C code.) C++ doesn't offer me that flexibility.

0
william
10/25/2014 2:31:35 AM
william@wilbur.25thandClement.com wrote:

<interesting stuff snipped>

> In fact, I find functional overloading to be problematic in general in terms
> of adding needless complexity to code. Yes, it can be convenient. But
> convenience should not be the litmus test for the soundness of a language
> feature.

I can't resist commenting on this paragraph.

Like other features present in C++ but not C, function overloading can 
be considered only reasonably useful on its own but it is all but 
essential for the language as a whole to work.  You can't do type safe 
generics (templates in C++) without function overloading.  Similar 
arguments apply to pass by reference: it can justifiably be written off 
as syntactic sugar as a standalone feature, but it becomes all but 
essential once you add operator overloading to the language.

-- 
Ian Collins
0
Ian
10/25/2014 3:55:45 AM
Ian Collins wrote:
> william@wilbur.25thandClement.com wrote:
>
> <interesting stuff snipped>
>
>> In fact, I find functional overloading to be problematic in general in
>> terms
>> of adding needless complexity to code. Yes, it can be convenient. But
>> convenience should not be the litmus test for the soundness of a language
>> feature.
>
> I can't resist comment on this paragraph.
>
> Like other features present in C++ but not C, function overloading can
> be considered only reasonably useful on its own but it is all but
> essential for the language as a whole to work.  You can't do type safe
> generics (templates in C++) without function overloading.  Similar
> arguments apply to pass by reference: it can justifiably be written off
> as syntactic sugar as a standalone feature, but it becomes all be
> essential once you add operator overloading to the language.
>

I am reminded of a quote from Truman Capote - "More tears are shed over 
answered prayers than over unanswered prayers." - Teresa of Ávila.

-- 
Les Cargill
0
Les
10/25/2014 4:36:18 AM
On Saturday, October 25, 2014 12:41:08 AM UTC+1, Kaz Kylheku wrote:
> On 2014-10-24, jacob navia <jacob@spamsink.net> wrote:
> > ALL th rest of C++ C# lambda expressions, F#, lisp, etc are just 
> > glorified function pointers.
> 
> Yes; glorified with this gaudy little ornament called an "environment"
> which hardly does anything.
> 
It is absolutely crucial that every C function pointer take a void *,
which is passed in with the pointer. Baby X follows this rule
rigorously. Without it, it would be impossible to build independent,
reinstantiable Baby X components.

Sadly, K and R forgot this when defining the qsort interface. 
So say I've a set of points x, y which index into an image,
and I want to sort the points by the red component. With qsort,
I can't do it, other than via a global. Add the void *, and
it's easy.
0
Malcolm
10/25/2014 9:35:29 AM
On Friday, October 24, 2014 11:45:25 PM UTC+1, Ben Bacarisse wrote:
> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> 
> > Function pointers convert the call graph from a tree to a web.
> 
> (I want to avoid arguing about words, but using the normal meaning of
> the term, the call graph of a C program need not be a tree.  Also, I
> don't think "a web" is a widely used term, and it lacks the specificity
> of the more usual "directed graph".)
> 
Even you must be familiar with the term "world wide web".
0
Malcolm
10/25/2014 9:42:08 AM
Malcolm McLean <malcolm.mclean5@btinternet.com> writes:

> On Friday, October 24, 2014 11:45:25 PM UTC+1, Ben Bacarisse wrote:
>> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
>> 
>> > Function pointers convert the call graph from a tree to a web.
>> 
>> (I want to avoid arguing about words, but using the normal meaning of
>> the term, the call graph of a C program need not be a tree.  Also, I
>> don't think "a web" is a widely used term, and it lacks the specificity
>> of the more usual "directed graph".)
>> 
> Even you must be familiar with the term "world wide web".

When you use a word metaphorically (as I presume you are doing here) the
reader is not always sure what aspect of the association is being
referenced.  What is it about the web-like nature of a call graph that
function pointers introduce?  To me, any old C program's call graph is
web-like, with the exception that there is an "initial" node not present
in any kind of web.  This is more usually called a rooted directed
graph, and this does not change (as far as I can see) when function
pointers are introduced.  What is the change you want your metaphor to
convey to me?  (You can assume that I have heard of the world wide web.)

-- 
Ben.
0
Ben
10/25/2014 11:12:02 AM
Malcolm McLean wrote:

> On Friday, October 24, 2014 11:45:25 PM UTC+1, Ben Bacarisse wrote:
>> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
>> 
>> > Function pointers convert the call graph from a tree to a web.
>> 
>> (I want to avoid arguing about words, but using the normal meaning of
>> the term, the call graph of a C program need not be a tree.  Also, I
>> don't think "a web" is a widely used term, and it lacks the specificity
>> of the more usual "directed graph".)
>> 
> Even you must be familiar with the term "world wide web".

Malcolm! I'm shocked.

Sarky is my job. You know that. So what we 'ave 'ere is a demarcation 
dispute. I'll 'ave you up before the union, I will.

Anyway, Ben really didn't deserve that.

Bad hair day? :-)

-- 
Richard Heathfield
Email: rjh at cpax dot org dot uk
"Usenet is a strange place" - dmr 29 July 1999
Sig line 4 vacant - apply within
0
Richard
10/25/2014 11:14:43 AM
On Saturday, October 25, 2014 12:12:14 PM UTC+1, Ben Bacarisse wrote:
> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> 
> > On Friday, October 24, 2014 11:45:25 PM UTC+1, Ben Bacarisse wrote:
> >> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> >> 
> >> > Function pointers convert the call graph from a tree to a web.
> >> 
> >> (I want to avoid arguing about words, but using the normal meaning of
> >> the term, the call graph of a C program need not be a tree.  Also, I
> >> don't think "a web" is a widely used term, and it lacks the specificity
> >> of the more usual "directed graph".)
> >> 
> > Even you must be familiar with the term "world wide web".
> 
> When you use a word metaphorically (as I presume you are doing here) the
> reader is not always sure what aspect of the association is being
> referenced.  What it is about the web-like nature of a call graph do
> function pointers introduce?  To me, any old C program's call graph is
> web-like, with the exception that there is an "initial" node not present
> in any kind of web.  This is more usually called a rooted directed
> graph, and this is does not change (as far as I can see) when function
> pointers are introduced.  What is the change you want your metaphor to
> convey to me?  (You can assume that I have heard of the world wide web.)
> 
A tree means a nice graph with a start point at the root and branches
going from it, then sub branches, and so on. A web means a mess with
links going everywhere.

You're right that C can produce a mess. Recursive functions create
a loop, but it's not important for understanding program flow
(though it is important for other things such as stack usage).
Mutually recursive functions can create an uninterpretable control
flow graph. But in C you're usually very aware that you're creating
mutually recursive functions, the language doesn't encourage it
because of the scoping rules.

The exception is when you make heavy use of function pointers,
like Baby X. Baby X comes in two versions, an X Windows version
and an MS Windows version. Both X and Windows work by giving
messages on request. However, X gives a rich message structure,
while Windows uses a message pump: basically you call Windows again
and it calls a function pointer that points to a user-defined
function attached to each window. Baby X then operates in the
Windows mode: it calls user-defined functions attached to
buttons, text edit boxes, and the like to perform application
logic.

The result is that the calling programmer doesn't think of his
Baby X program as a tree, starting from main and with subroutines
branching. He thinks of it as a single call to startbabyx() to
get the ball rolling, followed by creation of Baby X windows
and callbacks to his defined functions, most of which can be called
at any time and in any order. Then in a complex Baby X program,
these callbacks are themselves creating and tearing down Baby X
widgets.
So the program is held together with function pointers and dynamically
allocated structures which other structures hold on to. Then the 
X version of Baby X has a central loop, which everything ultimately 
returns to (though even that you push and pop to make dialogs modal). 
But the Windows version has two systems of function pointers going, the 
Baby X system and the Windows one. It was much more difficult to
write, and it's still being debugged. (I've just returned to it)
 
I wrote Baby X and I don't think there's another approach I could
have reasonably have used. Maybe using another language than C 
would have helped by imposing discipline rather than forcing the
programmer to apply it himself, but I suspect not.
0
Malcolm
10/25/2014 11:52:08 AM
Malcolm McLean <malcolm.mclean5@btinternet.com> writes:

> On Saturday, October 25, 2014 12:12:14 PM UTC+1, Ben Bacarisse wrote:
>> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
>> 
>> > On Friday, October 24, 2014 11:45:25 PM UTC+1, Ben Bacarisse wrote:
>> >> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
>> >> 
>> >> > Function pointers convert the call graph from a tree to a web.
>> >> 
>> >> (I want to avoid arguing about words, but using the normal meaning of
>> >> the term, the call graph of a C program need not be a tree.  Also, I
>> >> don't think "a web" is a widely used term, and it lacks the specificity
>> >> of the more usual "directed graph".)
>> >> 
>> > Even you must be familiar with the term "world wide web".
>> 
>> When you use a word metaphorically (as I presume you are doing here) the
>> reader is not always sure what aspect of the association is being
>> referenced.  What it is about the web-like nature of a call graph do
>> function pointers introduce?  To me, any old C program's call graph is
>> web-like, with the exception that there is an "initial" node not present
>> in any kind of web.  This is more usually called a rooted directed
>> graph, and this is does not change (as far as I can see) when function
>> pointers are introduced.  What is the change you want your metaphor to
>> convey to me?  (You can assume that I have heard of the world wide web.)
>> 
> A tree means a nice graph with a start point at the root and branches
> going from it, then sub branches, and so on. A web means a mess with
> links going everywhere.

How is this different when function pointers are introduced?  Exactly
the same functions call exactly the same set of functions.  The call
graph is identical.  What changes is how easy it usually is to know the
name of the called function by looking at the text of the program.  I
think you are maybe misusing the term "call graph".

> You're right that C can produce a mess.

I never said that.  I said that the call graph of a C program is not
necessarily a tree, even in the absence of function pointers.  You
seemed to have been saying that it was.

The call graph of a general C program is a rooted directed graph, and
that does not depend, as far as I can see, on whether the program uses
function pointers or not.

<snip>
-- 
Ben.
0
Ben
10/25/2014 12:29:35 PM
On Saturday, October 25, 2014 1:29:50 PM UTC+1, Ben Bacarisse wrote:
> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> 
> > On Saturday, October 25, 2014 12:12:14 PM UTC+1, Ben Bacarisse wrote:
> >> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> >> 
> >> > On Friday, October 24, 2014 11:45:25 PM UTC+1, Ben Bacarisse wrote:
> >> >> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> >> >> 
> >> >> > Function pointers convert the call graph from a tree to a web.
> >> >> 
> >> >> (I want to avoid arguing about words, but using the normal meaning of
> >> >> the term, the call graph of a C program need not be a tree.  Also, I
> >> >> don't think "a web" is a widely used term, and it lacks the specificity
> >> >> of the more usual "directed graph".)
> >> >> 
> >> > Even you must be familiar with the term "world wide web".
> >> 
> >> When you use a word metaphorically (as I presume you are doing here) the
> >> reader is not always sure what aspect of the association is being
> >> referenced.  What it is about the web-like nature of a call graph do
> >> function pointers introduce?  To me, any old C program's call graph is
> >> web-like, with the exception that there is an "initial" node not present
> >> in any kind of web.  This is more usually called a rooted directed
> >> graph, and this is does not change (as far as I can see) when function
> >> pointers are introduced.  What is the change you want your metaphor to
> >> convey to me?  (You can assume that I have heard of the world wide web.)
> >> 
> > A tree means a nice graph with a start point at the root and branches
> > going from it, then sub branches, and so on. A web means a mess with
> > links going everywhere.
> 
> How is this different when function pointers are introduced?  Exactly
> the same functions call exactly the same set of functions.  The call
> graph is identical.  What changes is how easy it usually is to know the
> name of the called function by looking at the text of the program.  I
> think you are maybe misusing the term "call graph".
>
Fixed versus dynamic.

Without function pointers, you have a list of a's subroutines,
which admittedly might include a again somewhere on the graph,
but you can check.

With function pointers, a calls a list of supplied functions,
which could be anything. They also call a list of supplied functions,
which could be anything. Even the size of the list of callable
supplied functions might change as the program executes.
It's now very hard to determine if there is a possible cycle
or not, or if function x is reachable from a. 

You can artificially write a function which switches on a random
value and calls one of every other function in the program,
including itself, but that's a degenerate case of structured
programming, also a degenerate fully-connected graph. We're talking 
about graphs where the number of connections between nodes is small 
but greater than two or three, and the number of cycles is low but not 
necessarily totally excluded. Without function pointers, the 
number of candidates for each connection is one and the number of 
connections is fixed; with function pointers, the number of candidates 
is two or three and the number of connections is usually but not 
necessarily fixed.
The last rules change the characteristics of the system totally.
0
Malcolm
10/25/2014 2:56:18 PM
Malcolm McLean <malcolm.mclean5@btinternet.com> writes:

> On Saturday, October 25, 2014 1:29:50 PM UTC+1, Ben Bacarisse wrote:
>> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
>> 
>> > On Saturday, October 25, 2014 12:12:14 PM UTC+1, Ben Bacarisse wrote:
>> >> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
>> >> 
>> >> > On Friday, October 24, 2014 11:45:25 PM UTC+1, Ben Bacarisse wrote:
>> >> >> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
>> >> >> 
>> >> >> > Function pointers convert the call graph from a tree to a web.
>> >> >> 
>> >> >> (I want to avoid arguing about words, but using the normal meaning of
>> >> >> the term, the call graph of a C program need not be a tree.  Also, I
>> >> >> don't think "a web" is a widely used term, and it lacks the specificity
>> >> >> of the more usual "directed graph".)
>> >> >> 
>> >> > Even you must be familiar with the term "world wide web".
>> >> 
>> >> When you use a word metaphorically (as I presume you are doing here) the
>> >> reader is not always sure what aspect of the association is being
>> >> referenced.  What it is about the web-like nature of a call graph do
>> >> function pointers introduce?  To me, any old C program's call graph is
>> >> web-like, with the exception that there is an "initial" node not present
>> >> in any kind of web.  This is more usually called a rooted directed
>> >> graph, and this is does not change (as far as I can see) when function
>> >> pointers are introduced.  What is the change you want your metaphor to
>> >> convey to me?  (You can assume that I have heard of the world wide web.)
>> >> 
>> > A tree means a nice graph with a start point at the root and branches
>> > going from it, then sub branches, and so on. A web means a mess with
>> > links going everywhere.
>> 
>> How is this different when function pointers are introduced?  Exactly
>> the same functions call exactly the same set of functions.  The call
>> graph is identical.  What changes is how easy it usually is to know the
>> name of the called function by looking at the text of the program.  I
>> think you are maybe misusing the term "call graph".
>>
> Fixed versus dynamic.

Maybe you should say what you mean by the call graph because, to me, it
is always a dynamic thing.  The graph nodes are functions and a
(directed) edge between f and g means that f calls g.

<snip>
-- 
Ben.
0
Ben
10/25/2014 5:34:04 PM
On Sat, 25 Oct 2014 18:34:04 +0100
Ben Bacarisse <ben.usenet@bsb.me.uk> wrote:

> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> 
> > On Saturday, October 25, 2014 1:29:50 PM UTC+1, Ben Bacarisse wrote:
> >> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> >> 
> >> > On Saturday, October 25, 2014 12:12:14 PM UTC+1, Ben Bacarisse
> >> > wrote:
> >> >> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> >> >> 
> >> >> > On Friday, October 24, 2014 11:45:25 PM UTC+1, Ben Bacarisse
> >> >> > wrote:
> >> >> >> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> >> >> >> 
> >> >> >> > Function pointers convert the call graph from a tree to a
> >> >> >> > web.
> >> >> >> 
> >> >> >> (I want to avoid arguing about words, but using the normal
> >> >> >> meaning of the term, the call graph of a C program need not
> >> >> >> be a tree.  Also, I don't think "a web" is a widely used
> >> >> >> term, and it lacks the specificity of the more usual
> >> >> >> "directed graph".)
> >> >> >> 
> >> >> > Even you must be familiar with the term "world wide web".
> >> >> 
> >> >> When you use a word metaphorically (as I presume you are doing
> >> >> here) the reader is not always sure what aspect of the
> >> >> association is being referenced.  What it is about the web-like
> >> >> nature of a call graph do function pointers introduce?  To me,
> >> >> any old C program's call graph is web-like, with the exception
> >> >> that there is an "initial" node not present in any kind of
> >> >> web.  This is more usually called a rooted directed graph, and
> >> >> this is does not change (as far as I can see) when function
> >> >> pointers are introduced.  What is the change you want your
> >> >> metaphor to convey to me?  (You can assume that I have heard of
> >> >> the world wide web.)
> >> >> 
> >> > A tree means a nice graph with a start point at the root and
> >> > branches going from it, then sub branches, and so on. A web
> >> > means a mess with links going everywhere.
> >> 
> >> How is this different when function pointers are introduced?
> >> Exactly the same functions call exactly the same set of
> >> functions.  The call graph is identical.  What changes is how easy
> >> it usually is to know the name of the called function by looking
> >> at the text of the program.  I think you are maybe misusing the
> >> term "call graph".
> >>
> > Fixed versus dynamic.
> 
> Maybe you should say what you mean by the call graph because, to me,
> it is always a dynamic thing.  The graph nodes are functions and a
> (directed) edge between f and g means that f calls g.
> 
> <snip>

I think that he meant fixed, when function calls are direct and dynamic
when function calls are through pointers (can change).


-- 
Manjaro all the way!
http://manjaro.org/

0
Melzzzzz
10/25/2014 6:32:10 PM
Melzzzzz <mel@zzzzz.com> writes:

> On Sat, 25 Oct 2014 18:34:04 +0100
> Ben Bacarisse <ben.usenet@bsb.me.uk> wrote:
>
>> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
>> 
>> > On Saturday, October 25, 2014 1:29:50 PM UTC+1, Ben Bacarisse wrote:
<snip>
>> >> >> >> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
>> >> >> >> 
>> >> >> >> > Function pointers convert the call graph from a tree to a
>> >> >> >> > web.
<snip>
>> >> How is this different when function pointers are introduced?
>> >> Exactly the same functions call exactly the same set of
>> >> functions.  The call graph is identical.  What changes is how easy
>> >> it usually is to know the name of the called function by looking
>> >> at the text of the program.  I think you are maybe misusing the
>> >> term "call graph".
>> >>
>> > Fixed versus dynamic.
>> 
>> Maybe you should say what you mean by the call graph because, to me,
>> it is always a dynamic thing.  The graph nodes are functions and a
>> (directed) edge between f and g means that f calls g.
>> 
>> <snip>
>
> I think that he meant fixed, when function calls are direct and dynamic
> when function calls are through pointers (can change).

Yes, I think so too.  That's why I want to know what he means by the call
graph since it is independent of how the functions get called.

-- 
Ben.
0
Ben
10/25/2014 8:02:47 PM
On Saturday, October 25, 2014 6:34:17 PM UTC+1, Ben Bacarisse wrote:
> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> 
> Maybe you should say what you mean by the call graph because, to me, it
> is always a dynamic thing.  The graph nodes are functions and a
> (directed) edge between f and g means that f calls g.
> 
In a structured program the functions are the nodes, and there's
a link between a function and all its subroutines, regardless of 
whether called or not during execution. But the functions are not 
nodes, because if both a and b call printf(), we think of it as
a tree with two printf() leaves, not a directed graph with two
paths to printf(). That breaks down when you allow cycles, of
course.

With heavy use of function pointers, that model ceases to be useful. 
We've got lists of dynamically allocated structures with links
to each other, and embedded function pointers. The general pattern 
is that you follow a link, and call a function pointer associated 
with it. It then calls other function pointers with its links.
So the program structure is now in the pattern of linking between 
the structs. And the usual reason for writing a program like that
is that that pattern of linking isn't a fixed thing, it changes as
the program executes and nodes get added and deleted.

Ultimately it's got to be a tree, because the program is built on
top of a structured program paradigm of nested subroutine calls,
and few languages break out of that. 
0
Malcolm
10/25/2014 9:37:00 PM
Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> On Saturday, October 25, 2014 6:34:17 PM UTC+1, Ben Bacarisse wrote:
>> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
>> 
>> Maybe you should say what you mean by the call graph because, to me, it
>> is always a dynamic thing.  The graph nodes are functions and a
>> (directed) edge between f and g means that f calls g.
>> 
> In a structured program the functions are the nodes, and there's
> a link between a function and all its subroutines, regardless of 
> whether called or not during execution. But the functions are not 
> nodes, because if both a and b call printf(), we think of it as
> a tree with two printf() leaves, not a directed graph with two
> paths to printf().

Really?  I think of it as a directed graph with two paths to printf().

>                    That breaks down when you allow cycles, of
> course.

I think you're saying your model breaks down when we consider recursion.
To me, that implies it's not a very good model.

[snip]

-- 
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Working, but not speaking, for JetHead Development, Inc.
"We must do something.  This is something.  Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
0
Keith
10/25/2014 10:42:44 PM
Malcolm McLean <malcolm.mclean5@btinternet.com> writes:

> On Saturday, October 25, 2014 6:34:17 PM UTC+1, Ben Bacarisse wrote:
>> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
>> 
>> Maybe you should say what you mean by the call graph because, to me, it
>> is always a dynamic thing.  The graph nodes are functions and a
>> (directed) edge between f and g means that f calls g.
>> 
> In a structured program the functions are the nodes, and there's
> a link between a function and all its subroutines, regardless of 
> whether called or not during execution.

I thought that might be what you meant, but I prefer to keep the term
"call graph" for the graph of calls rather than the graph of function
mentions or references.

<snip>
-- 
Ben.
0
Ben
10/25/2014 10:50:08 PM
On Saturday, October 25, 2014 11:42:53 PM UTC+1, Keith Thompson wrote:
> Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> 
> > In a structured program the functions are the nodes, and there's
> > a link between a function and all its subroutines, regardless of 
> > whether called or not during execution. But the functions are not 
> > nodes, because if both a and b call printf(), we think of it as
> > a tree with two printf() leaves, not a directed graph with two
> > paths to printf().
> 
> Really?  I think of it as a directed graph with two paths to printf().
>
It doesn't really matter. You can think of it as a tree with two 
separate leaves of type printf(), or as a directed acyclic graph
with just one node for each function name. In the non-function-pointer,
non-recursive model, it's just a matter of whether you find lots of
nodes or lots of incoming links easier to visualise.

However if we go for your method, it then becomes much harder to 
extend the model to calls via function pointers. A node of type
qsort() could have one link going from it or lots of links, as 
well as lots of links coming into it. With the previous model,
each node of type qsort has just one link going into it and
usually just one link coming out of it. 
> 
> > That breaks down when you allow cycles, of course.
> 
> I think you're saying your model breaks down when we consider recursion.
> To me, that implies it's not a very good model.
> 
It's inevitable. Allowing cycles fundamentally changes the behaviour
of a graph. And the truth is that, except in the special case of
functions that call themselves, recursion is rare and usually 
unwanted. There are exceptions, like parsers. But most C programs
don't use mutual recursion.
0
Malcolm
10/25/2014 11:03:20 PM
On Thu, 2014-10-23, Luca Risolia wrote:
> 13/10/2014 23:06, BartC wrote:
>>
>>
>> "Jorgen Grahn" <grahn+nntp@snipabacken.se> wrote in message
>> news:slrnm3od1b.1ks.grahn+nntp@frailea.sa.invalid...
>>> On Sun, 2014-10-12, BartC wrote:
>>>> I've just been working with a bit of C++ code for the first time (with a
>>>> view to spending a couple of hours converting it to C. Or to *any*
>>>> language
>>>> that mere humans could understand, for that matter).
>>>>
>>>> But after that experience I don't think I'll ever complain about
>>>> anything in
>>>> C again!
>>> ...
>>>> Anyone reading this and is at the point of being undecided about
>>>> whether to
>>>> learn C++ or not, please do everyone else a favour and don't bother!
>>>
>>> Let me see that code, and I'll have an opinion about it.
>>
>> I did post a link to it in a reply, but here it is again:
>>
>> (The bit of code I was trying to convert is here:
>> https://code.google.com/p/jpeg-compressor/source/browse/trunk/jpgd.cpp
>
> I agree with you that it's impossible to understand that bit of C++ 
> code, for the only reason that it's one of the most inelegant uses of C++ 
> I have ever seen. The code looks like the result of some unlucky 
> experiments in C++ attempted by some "old" C programmer:

Did we read the same code?  To me it looked like someone who knew what
he was doing: he had the old C version of the decompressor, saw that
he could simplify or speed up that block or matrix code using template
functions, and did it.

That's one uncommon but good way of using C++: C plus function
templates.  Makes sense if you're modifying algorithm-heavy C code
(and if it's ok to switch language).

Or perhaps it was the many interfaces to the decompressor which you
found lacking?  In the header file? I must admit I didn't look closely
at them -- they could easily have been a failure, like many other
"simplified" APIs.

/Jorgen

-- 
  // Jorgen Grahn <grahn@  Oo  o.   .     .
\X/     snipabacken.se>   O  o   .
0
Jorgen
10/26/2014 4:45:43 PM
On Sun, 2014-10-19, BartC wrote:
> "Jorgen Grahn" <grahn+nntp@snipabacken.se> wrote in message 
> news:slrnm47ba5.1ks.grahn+nntp@frailea.sa.invalid...
>> On Sun, 2014-10-19, BartC wrote:
>> ...
>>> A non-gcc, non-C++ compiler seems to occupy some tens of megabytes, while
>>> gcc with C++ support needs hundreds of megabytes
>>
>> Not true, not on my systems at least (Debian Linux).  It seems you
>> need roughly 20 MB for the C part of gcc, and another 20 for g++.
>
> I wish I could tell you the size of gcc on my Ubuntu system, but I wouldn't 
> have a clue how to. Linux applications seem to have a habit of disseminating 
> themselves across the file system, making the task harder.

They disseminate themselves because they can: there's a package
manager which keeps track.  I used 'dpkg -s g++-4.7' and so on to
look at the sizes.

> But it's also 
> possible that with gcc and Linux, the line between compiler and OS is hazier 
> than with Windows.

It isn't, really.  There's compiler and libraries.  Libraries are
split in runtime, development (header files and so on) and
documentation.

There's no gigabyte of C++ bloat in every Linux installation if that's
what you're thinking.

/Jorgen

-- 
  // Jorgen Grahn <grahn@  Oo  o.   .     .
\X/     snipabacken.se>   O  o   .
0
Jorgen
10/26/2014 5:05:31 PM
jacob navia <jacob@spamsink.net> writes:

> Malcolm said:
>>> Function pointers convert the call graph from a tree to a web.
>
> Ben answered:
>> How?  All they do (in C) is to prevent you knowing the name of
>> the called function.
>
> Function pointers can allow you to:
>
> 1) Use a generic name like "add" for a conceptually similar
> procedure.  "Create" comes to mind.
>
> 2) They form the base of functions as data, when passed around to
> a function or calculated from a table of functions.
>
> 3) You can group function pointers into interfaces, for
> establishing a set of related methods that work on a common data
> structure.  I used this extensively in the containers library.
> Happily I am not in C++, so no pre-established rules decide
> for me how things should run: just the need for clarity, and the
> most efficient way of organizing the computation.
>
> Other uses are:
>
> o Converting switches to function tables and (maybe) an access
> method.  Adding/deleting functionality in a table makes it much
> less hard coded than in a switch statement.  Basically, a function
> table with an access method is fully equivalent to a switch
> statement.
>
> lcc-win will generate a table of "functions" from a switch if you
> ask it to, eliminating 100% of the switch overhead.
>
> o Nothing prevents you from knowing the function that will be
> called if you care to document in the code with clear names what
> the function pointer does, and in the associated docs to describe
> the algorithm of choice for selecting the function pointer.
>
> ALL the rest (C++ and C# lambda expressions, F#, Lisp, etc.) are just
> glorified function pointers.

There are several key differences between function pointers in C
and first-class functions in advanced functional programming
languages (AFPL).

A. In C the set of functions that exist in a program is
fixed and unchanging, whereas AFPL functions can be created
dynamically.

B. Created functions carry an environment with them which the
function definition can make use of.  This can be faked in C
using, e.g., an extra void * argument, but that is not nearly as
convenient as the environment being carried along with the
function itself, because then the caller doesn't have to know
about it;  in C though you do, and the two kinds are not
interchangeable the way they are in good functional languages.

C. Functions in AFPLs support currying, or partial application to
some but not all arguments.  Doing this in C is probably possible
in some abstract sense, but in practical terms it is extremely
cumbersome, to the point of being effectively unworkable.

D. AFPLs support parametric polymorphism, which in C is difficult
even to approximate, and effectively impossible to duplicate.
Parametric polymorphism combines the advantages of strong typing,
generic definition, and single instantiation (ie, code for the
function is generated only once).  When used in conjunction with
first-class functions, parametric polymorphism allows some
surprisingly general mechanisms to be defined.  For example,
I recently wrote a completely generic prune-as-you-go backtrack
control algorithm, and it was only four lines of code.  Trying to
do that in C would (a) be much more difficult to do, (b) result
in quite a bit more code, and (c) require giving up type safety,
or at least compile-time checking of type safety.

Don't get me wrong, function pointers in C provide a significant
capability, and giving them up would be a huge loss.  But they
are not nearly as powerful as true first-class functions.
0
Tim
10/27/2014 1:25:44 PM
On 24/10/14 16:35, BartC wrote:
> "David Brown" <david.brown@hesbynett.no> wrote in message
> news:m2cas6$e4l$1@dont-email.me...
>> On 24/10/14 02:49, BartC wrote:
> 
>>> They will want to make use of all those goodies they've heard about. Or
>>> for various reasons they are obliged to use it and have to utilise C++
>>> style interfaces to do so. And by C++ I don't mean a language that has a
>>> C subset, I mean everything else that makes the language controversial.
>>>
>>
>> Actually, it is not uncommon to use a C++ compiler for "ABC" - "A Better
>> C".  You make use of C++'s slightly stricter handling of types, void*
>> pointers, etc., and improvements to the C subset such as using static
>> consts for sizing arrays.
> 
>> (Clearly people will vary as to what they think is in "ABC" rather than
>> full C++.)
> 
> Yes, I also had my own ideas, which can summarised here:
> http://pastebin.com/TNegH8bV
> 
> (Warning: there's quite a lot! This was created in response to a separate
> discussion elsewhere.)
> 
> But these are generally fairly innocuous changes; the resulting language if
> the ideas were all implemented wouldn't be at a that much higher level; it
> would just be /nicer/. It certainly wouldn't bring C a step closer to C++.
> 
> Using C++ would address hardly any of the points I listed.
> 

I've had a look at your list - much of it was quite interesting.

C++ addresses many of your points, contrary to your claims, and a number
are implemented as common extensions (such as on gcc).  Some are common
in coding standards (which some compilers or lints can check and
enforce).  A few of your points I agree with entirely, others I think
are completely daft (this being very much a matter of opinion).  For
many points, your suggestions would be best implemented as warnings in a
compiler to check for inconsistent style.

(1..5) Your comments on braces and semicolons go to the heart of the C
style of language - for the most part, the language is not sensitive to
the number or type of whitespace.  Changing this would cause
massive disruption - it is very unlikely that it could be done in a
consistent way.  You mention Python for comparison - the whitespace
sensitivity of Python is probably that language's single most
controversial feature.

In the particular case of braces after an "if" and similar statements,
I would be happy to see missing braces as a warning in compilers -
they are a cause of many subtle errors in code.

(8, 10) Many compilers have supported the "0b00101101" notation for
binary constants for a fair number of years.  It is now part of C++14,
but for some reason it is missing from C11.  C++14 has a digit separator
1'234 (though I would have preferred Ada's 1_234 myself) - again, it is
strangely missing from C11.

(9) I would like to see octal numbers going the way of trigraphs - they
belong to a bygone era.  The only modern use of them is for Unix file
permissions, and that small usage (better served using named constants)
is not worth the errors they cause.

(12) In Windows, I believe you can normally use / as a directory
separator.  Raw strings, as you describe, are in C++11.

(16 .. 20, 27) You are a bit confused regarding types - just because
"int" and "long int" happen to be the same size on a particular platform,
does not mean that they are the same type.  They are incompatible as
they stand, though there are automatic conversions between them.  (For
example, an int* cannot be assigned directly to a long* pointer.)  But I
agree that the type system in C is not ideal - I strongly prefer to use
<stdint.h> types.  (It doesn't really matter how they are defined by the
compiler and/or headers - though it would be more logical to define
"int", "long", etc., in terms of fixed size types.)  I thoroughly that
"signed char" is simply wrong - it's only good use is to define "int8_t".

(21 .. 26) You mention "type attributes", such as "int.maxvalue".  I
would prefer to go a little further, and support Ada-style attributes,
such as "int'last" or "x'succ".  It is extremely rare that I have needed
to get the limits of an integer type, but far more common to be
interested in the limits for an enumerated type or array.

(30) "int* p, q, r" should be flagged as a warning by compilers - it is
probably a mistake in the code, and is certainly a maintainability and
readability error.

(35, 36) The way to make sense of complicated type declarations is not
to use complicated type declarations.  Build them up in clear, logical
steps with typedefs - then there is no problem.  C++11 has a new "alias
declaration" that has the same effect as typedef, but can be neater for
function pointer types.

(39) I agree that the default for file-level definitions should be
"static", not global.  That is standard for good language design.

(40) I also agree that it would be nice to have optional parameters in C,
just as in C++.

(41) And I agree entirely that C (and C++) should support named
parameters - I believe it would be simple to add consistently to the
language, and (relatively) simple to implement in compilers.  For
functions with many parameters, it would be a huge benefit.

(46) "static const" declarations function as simple named constants for
most purposes.  About the only limitations (which are removed in C++)
are that you can't use them for array sizes or case labels.

(48) A better module system, rather than headers, is on its way to C++
(and implemented already in some compilers).  It remains to be seen
whether C will ever support it.

(50 .. 52) C++ has namespaces.

(53 .. 56) C++ has scoped enums, strongly distinct types, different
sorts of struct members, and makes "struct" tags optional.  You can also
use typedef to avoid "struct" tags (as I do), though many C programmers
prefer to keep them.  Structs in C define new user types, but only in
C++ can you make them as convenient to use as other types.

(59) Struct members can be padded because the hardware requires it.
Compilers can usually warn if padding is added, and provide pragmas or
attributes to avoid padding.  Consistent standards-specified attributes
would be nice, however.

(61) C++11 gives you std::array with most of the same properties as C
arrays, but without the array/pointer mixups.

(62) C++ has more reference types than you could want...

(63 .. 72) C has all the control structures it needs.  Adding extra
keywords such as "unless (x)" instead of "if (!x)" would pointlessly
complicate the language.  The for loops work fine, as long as you don't
try to be "smart" and use comma operators, and "while (true)" gives a
perfectly good infinite loop.

(73, 74) gcc allows pointers to labels - but there are very few
situations where labels of any sort are a good idea.

(75) I agree that making fall-through the default in switch cases is a
silly design fault in the language.

(76, 78) gcc supports case ranges as an extension - I would like to see
them (along with simple lists of values) in the language.  And switching
based on other constant expressions would be nice too.

(77) Giving the compiler information about avoiding range-checking would
be a compiler-specific optimisation, rather than a language feature.
For example, in gcc you can write:

if (x >= 10) __builtin_unreachable();

The compiler can assume that after that statement, x can never be >= 10.

(80) Print statements are a bad idea - stick to functions.  If you don't
like printf syntax, there is always C++'s cout...

(85) It would be nice to have the syntax "a, b = b, a".  That would
conflict with the comma operator, but I think the comma operator should
be banned anyway.

(109) A simple inline assembler is no use to anyone.  To be relevant, it
needs to be powerful and integrate with the rest of the code generator
in a safe but efficient manner.  The syntax used by gcc (and copied by
many other compilers) is complex, but you can work through it and get it
right.



0
David
10/27/2014 2:28:19 PM
David Brown <david.brown@hesbynett.no> writes:
[...]
> (8, 10) Many compilers have supported the "0b00101101" notation for
> binary constants for a fair number of years.  It is now part of C++14,
> but for some reason it is missing from C11.  C++14 has a digit separator
> 1'234 (though I would have preferred Ada's 1_234 myself) - again, it is
> strangely missing from C11.

I like underscores as digit separators too, but in C++ that syntax is
used for user-defined literals.  Using ' within numeric constants can
cause ambiguities in tokenization; currently 1'2'3 tokenizes as 1 '2' 3
(which is a syntax error) and a single ' can cause problems for the
preprocessor.  (I've heard about problems running Ada code through a C
preprocessor.)  Presumably the C++ standard has already resolved any
such issues.

> (9) I would like to see octal numbers going the way of trigraphs - they
> belong to a by-gone era.  The only modern usage of them is for unix file
> permissions, and that small usage (better served using named constants)
> is not worth the errors they cause.

I seriously doubt that Unix file permissions are the *only* modern use
of octal constants.  For that particular use, I don't think that any use
of named constants would result in clearer code than:

    chmod("filename", 0755);

Yes, you have to understand what a mode of 0755 means, but if you know
what chmod means you almost certainly do understand it.

If C didn't already have octal constants, I wouldn't advocate adding
them -- or at least I'd want to use a less error-prone syntax.  But
removing them now would quietly break existing code, and that's not an
option.

> (12) In Windows, I believe you can normally use / as a directory
> separator.  Raw strings, as you describe, are in C++11.

You *can* use / as a directory separator in Windows, but it's
questionable whether you *should*, especially for strings intended
to be displayed to the user.

>                                                      I thoroughly that
> "signed char" is simply wrong - it's only good use is to define "int8_t".

You accidentally a word.

The three char types are both character types and narrow integer types.
I don't think that was a good design choice, but we're very much stuck
with it.  `signed char` is useful if you want to store values in the
range -127..+127 or less while minimizing space.  The closest equivalent
is int_least8_t (unless you want to add a new requirement that
CHAR_BIT==8).

[...]

> (30) "int* p, q, r" should be flagged as a warning by compilers - it is
> probably a mistake in the code, and is certainly a maintainability and
> readability error.

But
    int *p, q, r;

could very well be intentional.  In this case, the spacing probably
matters in determining (actually guessing) whether it's a programmer
error or not.  The standard has no requirements for optional warnings.
For a particular compiler, warnings that depend on layout to guess what
the programmer meant are IMHO a good idea, but it should at least be
possible to disable them.

[...]

> (39) I agree that the default for file-level definitions should be
> "static", not global.  That is standard for good language design.

You mean that the default linkage should be internal rather than
external.  I agree that that would probably have been a better choice --
but again, breaking existing code is not an option.

[...]

> (59) Struct members can be padded because the hardware requires it.
> Compilers can usually warn if padding is added, and provide pragmas or
> attributes to avoid padding.  Consistent standards-specified attributes
> would be nice, however.

Pragmas that disable padding (`#pragma pack` is common) can be
error-prone.  For example, if an int member is misaligned, the compiler
can generate code to access it correctly if it knows that it's a member
of a packed struct, but not if a pointer to it is used in another
context.  See http://stackoverflow.com/q/8568432/827263 for more
discussion.

[...]

> (63 .. 72) C has all the control structures it needs.  Adding extra
> keywords such as "unless (x)" instead of "if (!x)" would pointlessly
> complicate the language.  The for loops work fine, as long as you don't
> try to be "smart" and use comma operators, and "while (true)" gives a
> perfectly good infinite loop.

`while (true)` requires `#include <stdbool.h>`.  `while (1)` is more
idiomatic in C.

[...]

> (75) I agree that making fall-through the default in switch cases is a
> silly design fault in the language.

I agree, but the only way to fix it now would be to add a new and
syntactically distinct kind of switch statement (perhaps with a new
keyword that causes a fallthrough to the next case).

> (76, 78) gcc supports case ranges as an extension - I would like to see
> them (along with simple lists of values) in the language.  And switching
> based on other constant expressions would be nice too.

One problem with case ranges is the temptation to treat 'a' .. 'z' as a
contiguous range.  You could do that consistently if you changed the
language to require an ASCII-compatible character set, or at least to
require the 26 lower case letters (and the 26 upper case letters) to be
contiguous.  As it is, though, code using 'a' .. 'z' will break on a
system that uses EBCDIC.

[...]

> (85) It would be nice to have the syntax "a, b = b, a".  That would
> conflict with the comma operator, but I think the comma operator should
> be banned anyway.

The comma operator is probably overused.  My rule of thumb is that if a
comma operator or a semicolon would work equally well, you should use
the semicolon.  But it does have legitimate uses, particularly in macro
definitions.  And again, language changes that break existing code are
not an option.

[...]

I've been assuming that these are suggestions for changes to C.  Any
such changes that break existing code are vanishingly unlikely to
happen.  On the other hand, many of them could be good guidelines for
the design of future languages that aren't required to be
backward-compatible with C.

-- 
Keith Thompson (The_Other_Keith) kst-u@mib.org  <http://www.ghoti.net/~kst>
Working, but not speaking, for JetHead Development, Inc.
"We must do something.  This is something.  Therefore, we must do this."
    -- Antony Jay and Jonathan Lynn, "Yes Minister"
0
Keith
10/27/2014 5:21:07 PM
On Monday, October 27, 2014 5:21:18 PM UTC, Keith Thompson wrote:
>
> > (8, 10) Many compilers have supported the "0b00101101" notation for
> > binary constants for a fair number of years.  It is now part of C++14,
> > but for some reason it is missing from C11.  C++14 has a digit separator
> > 1'234 (though I would have preferred Ada's 1_234 myself) - again, it is
> > strangely missing from C11.
> 
> I like underscores as digit separators too, but in C++ that syntax is
> used for user-defined literals.  Using ' within numeric constants can
> cause ambiguities in tokenization; currently 1'2'3 tokenizes as 1 '2' 3
> (which is a syntax error) and a single ' can cause problems for the
> preprocessor.  (I've heard about problems running Ada code through a C
> preprocessor.)  Presumably the C++ standard has already resolved any
> such issues.
> 
There's no convention that a quote is a digit separator.

The options are whitespace or commas, or maybe periods, but the last is unusable.

Commas are no good because you might want long numerical constants in
array initialiser lists. Whitespace seems to be a goer, however. You could specify
that the trailing digits must be in groups of three, to reduce accidental errors.




0
Malcolm
10/27/2014 5:33:12 PM
Malcolm McLean <malcolm.mclean5@btinternet.com> writes:
> On Monday, October 27, 2014 5:21:18 PM UTC, Keith Thompson wrote:
>> > (8, 10) Many compilers have supported the "0b00101101" notation for
>> > binary constants for a fair number of years.  It is now part of C++14,
>> > but for some reason it is missing from C11.  C++14 has a digit separator
>> > 1'234 (though I would have preferred Ada's 1_234 myself) - again, it is
>> > strangely missing from C11.
>> 
>>