* Posts by Ian Joyner

622 publicly visible posts • joined 6 Jun 2014

H2? Oh! New water-splitting technique pushes progress of green hydrogen

Ian Joyner Bronze badge

Fossil Fuel Cult

Excellent. We must break this cult of fossil fuels and realise that gasoline is just a battery store of energy, like other stores such as electric batteries and hydrogen. Even when other forms of energy become much cheaper, we still have the psychological lock-in of cult status.

Google screwed rivals to protect monopoly, says Uncle Sam in antitrust lawsuit: We go inside the Sherman parked on a Silicon Valley lawn

Ian Joyner Bronze badge

Re: Dumbed-down User Search

"I'd love advanced search to be there as an option, but I suspect that this is one case where the market really has spoken, and has said that it wants simplified searching."

You can have both – simplified searching with just a list of words, but operators in between for advanced searches. I think they used to have that (is it still there?), but it seems not to work now.
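For example (using operators I believe Google still documents – quotes for an exact phrase, OR between alternatives, a minus sign to exclude a term, site: to narrow the source):

    "green hydrogen" (electrolysis OR catalysis) -sponsored site:theregister.com

That is the kind of control over our own searches we used to take for granted.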

Ian Joyner Bronze badge

Re: Dumbed-down User Search

"Oh, indeed. Perhaps Google can enable the use of C++ to build queries."

Ugh. As I've pointed out before, you don't need languages as ugly and flawed as C++ to do powerful things.

Ian Joyner Bronze badge

Dumbed-down User Search

We used to have searches that depended on some computing knowledge and Boolean logic. Where has that gone?

It seems that the user used to be in control of their own search – you could specify whether you wanted anything with any of the words (or), or combination of words (and), and a whole lot of other conditions. (Older people will remember INSPEC https://www.theiet.org/publishing/inspec/).

Even if you can still do that, it is not widely advertised.

Why have they done this? I can speculate it is because the search engines have moved the knowledge of what we might want to themselves. They collect this data so they have the power over what we want and can inject their own preferences.

Consumers need to take this power back. It is annoying that searches often result in what the search engine wants you to see, not really what you want.

Yes, we want ease of use, but we should not lose the capability to specify more advanced searches ourselves. We have allowed the search engines to steal that from us.

It is not just Google that has dumbed down search. We need to put the power of search back into the hands of users.

Ian Joyner Bronze badge

Don't wait for government

It is undesirable that these big corporations have so much on us individually.

Competition should mean voting with our feet. But are people really just too lazy to be bothered?

We can't just rely on competition and people, we still need government protection – but it will be really slow.

And slowness means companies can use the political process. After decades of action against IBM's malpractice, Ronald Reagan simply dropped the antitrust case against them. It was consumer power, and an industry moving faster than IBM could hold back, that humbled IBM. But the power was not entirely broken – it just moved to Microsoft.

Samsung aims boot at Apple's decision not to bundle a charger in with the iPhone 12, foot ends up in mouth

Ian Joyner Bronze badge

Sleazy marketing

Samsung is just an aggressive marketing company that wants to take over the world. The business model is 'take on Apple and put them out of business'. Others have tried:

https://www.cultofmac.com/501138/apple-history-steve-ballmer-iphone-freakout/

It is a very old marketing tactic in computing, going back to the 1890s, when Patterson of NCR would blacken the names of competing cash registers with FUD. Patterson went to gaol. Thomas Watson worked for Patterson and learnt the tactics and applied them at IBM, but was a little more careful.

Apple explained why they were doing this in the keynote. 1) people already have enough chargers, they just become waste. 2) the packaging for iPhone is smaller, so you get more in a shipment, saving on fuel and shipment costs.

That is great thinking. Most marketing people want their packaging to look big on shelves. Even buying a packet of chips – pumped up with gas to make it look big. Open it and you find it is only 1/3 full. That means you need 3 times the number of trucks to deliver the same amount of product. Apple have taken a step to break that wasteful model.

Apple Watch Series 6 isn't a step back for repairability but in its own way that's a leap forward

Ian Joyner Bronze badge

Re: Watch repairs?

"Stop trying to put words in my mouth" I didn't say you'd said that, but others would use the same words to imply just that. "Seems we both agree it is not made to be reparable." So I'm glad we seem to agree.

Ian Joyner Bronze badge

Re: Watch repairs?

"It is not made to be repairable."

The suggestion that it is a deliberate dark tactic is wrong. It really is the only way this stuff can be built. Thus it is by nature, not by some deception.

Ian Joyner Bronze badge

Re: Watch repairs?

"Yes they were very repairable every part could be removed and replaced"

You have a point to an extent. But electronic components are microscopic and soldered for more reliability. Your counterexamples don't disprove the fact that most watches weren't at all user serviceable.

Ian Joyner Bronze badge

Watch repairs?

Cast your mind back to when watches were driven by tiny cogs and springs. Were they very repairable?

No, you had to take them to a specialist who had a tool to unscrew the back, and then fiddle with the insides.

Anyone else attempting that would end in disaster.

Ian Joyner Bronze badge

Hyperbole?

"usual fanboi whooping but heavy on the hyperbole"

That's patronising hyperbole on the part of Register.

Surprise! Apple launches iOS 14 today, and developers were given just 24 hours' notice

Ian Joyner Bronze badge

Re: Deep Analysis

DaLo "that is what it is saying in the article"

That is your interpretation. OK, the sentence in the article "Android devices have had this for some time and it must be implemented by developers on an app-by-app basis" is badly written, but it seems to only be referring to Android, not making a claim about Apple's PiP.

Regardless of this fact, my original point still stands – that often the competition rush out a half-baked version of things, and Apple takes time to get it right.

Ian Joyner Bronze badge

Re: Deep Analysis

"So your "deep analysis" is stating that others create a "half-baked version" and…"

Sorry DaLo, what you say in that post makes so little sense I don't even know where to start responding to it. What did you not understand about Android having PiP because it was provided by individual apps: "Android devices have had this for some time and it must be implemented by developers on an app-by-app basis" (so long as Register's reporting is correct)?

"Which even with some "light analysis" is obviously talking about Apple developers needing to implement it on an app-by-app basis."

That is not what at least the Reg article suggested at all.

Ian Joyner Bronze badge

Title Misleading (no not this one!)

"Surprise! Apple launches iOS 14 today, and developers were given just 24 hours' notice"

No, they have had the beta for at least several months since WWDC. Everyone has known iOS 14 has been imminent.

The last paragraph addresses this but relies on a couple of tweets by unknown people. The first:

"Apple announcing the iOS 14 release date 24 hours ahead of time as a summer ending gift to developers"

Well, that does not sound bad. If Apple had said "well, it will be available next month", people could be complaining. In fact, people are used to this kind of announcement: an expectation of release (at WWDC), then a sudden 'we have it'.

The reply tweet says "@KarlCheeseAji Apple really really hates developers"

That's it. So what? Maybe KarlCheeseAji is one of those people who trolls the net posting anti-Apple stuff.

It is not true anyway. But Register uses a single tweet as absolute proof. Poor journalism.

I won't say it is a trouble-free relationship. Apple, like technology, moves fast. Personally, I have been caught a few times by this. But Apple do a whole lot to support developers too. One has been very successful – Microsoft. But they are also responsible for Apple being careful about what they let out, because it is well known Gates took advantage of Apple's early openness.

Ian Joyner Bronze badge

"There goes Apple, inventing things that already exist again."

A bit more deep analysis needed here. Other companies compete with Apple. They get an idea (maybe because Apple is working on it), and rush a half-baked version to market.

Apple takes its time and gets it right. The Register article says:

"Meanwhile, the Picture-in-Picture feature allows users to watch videos and place voice calls while using other applications. Android devices have had this for some time and it must be implemented by developers on an app-by-app basis"

Note PiP is in Android because app developers have done it. Could be the same on iOS, but developers probably don't bother because they know Apple is doing it independent of apps.

Ian Joyner Bronze badge

Re: Shocked, shocked I say

Some Japanese guy posts a tweet that only says "Apple really really hates developers." And Register uses this as absolute proof.

I won't say it is a trouble-free relationship. Apple, like technology, moves fast. Personally, I have been caught a few times by this. But Apple do a whole lot to support developers too. One has been very successful – Microsoft. But they are also responsible for Apple being careful about what they let out, because it is well known Gates took advantage of Apple's early openness.

Who cares what Apple's about to announce? It owes us a macOS x86 virtual appliance for non-Mac computers

Ian Joyner Bronze badge

Re: Apple Vaporware

'Apple vaporware prior to rollouts and the Pavlovian reaction of tech blog writers and fanbois has become something between a social phenomenon and a bad joke.'

Vapourware is preannouncing products. That is not what Apple does. So I think your comment is wrong (it is not very clear what you are getting at).

Ian Joyner Bronze badge

Re: Cloud-based virtual machines

"they don't see themselves as a software company and the software they do produce is with one intention - to sell Apple hardware."

Wrong – a software company is exactly what Apple is. They keep control of the hardware to support the software. Hardware is secondary to software. You can't trust hardware to hardware people and companies.

https://www.scribd.com/doc/61812037/Barton-B5000

Ian Joyner Bronze badge

Re: Hackintosh

"That said, the move to soldering everything is a real PITA"

No, soldering is more reliable than pluggable connections.

Ian Joyner Bronze badge

Industry Standards

"Microsoft and have made their hardware a little more open and/or standards complaint, paving the way for open source drivers that can potentially be written for any hardware-OS combination."

"Could Apple's jump to ARM be enough to cause not only a swing away from their products but also a reset of the 'industry standards' on some of the expensive propriety software and especially the property data-interchange formats that currently mean that switching software but isn't an option in some professions?"

I think the writer of the above does not understand industry standards. Industry standards are not whatever is most widely used. Industry standards are data layouts and protocols for interoperation, so that there is as little lock-in as possible.

It does not mean that Intel, as the most widely used, is the standard that vendors must use. That would forever hold the industry back. Standards should be as minimally prescriptive as possible.

Ian Joyner Bronze badge

Walled Garden? Linux?

People keep throwing in this now pejorative phrase 'walled garden'. This is nonsense.

The basis of security is to set boundaries and respect boundaries.

Actually, it is more than respecting boundaries it is enforcing boundaries. A platform and OS must enforce boundaries.

Some spread this myth that such controls are against freedom. For some this is childish, others naive, some dishonest – at the very least it is a complete misunderstanding of security.

Linux is also overrated. I think it is a good OS, but its adherents push that it is technically superior to anything else. That is not true. Linux has made a tradeoff of security for performance. And that is why MacOS is based on a better version of Unix for end users. Mind you I also think Unix is overrated, but at least the Mach, Darwin, BSD underpinnings of Apple are better than Linux.

Ian Joyner Bronze badge

Re: IBM seems to have done very well over the years

IBM have done well out of using the dirtiest tricks in the industry. Read Richard DeLamarter's 'Big Blue: IBM's Use and Abuse of Power'.

The financial world would be much better off using Burroughs mainframes from Unisys, with maybe what is still the best processor architecture in the business. In fact, the ideas from these machines influenced Apple, because Alan Kay, who invented the window, was a student of the Burroughs designer Bob Barton. Burroughs machines were the first to be exclusively programmed in HLLs, and Burroughs extensions to ALGOL make C look like a toy (which it is).

IBM systems saw people as being peripherals to the system. That was broken by Silicon Valley putting people in control of computers.

Burroughs systems take a fraction of the staff that IBM systems do. But complexity ensures lock-in in this industry, and that is a very bad thing.

IBM eventually lost because it brought out the awful IBM PC, which existed only to crush Apple.

IBM doing well over the years has been to the detriment of the industry and computing as a way of exploring what computing and humanity is.

Mind you IBM has done some very good stuff, like relational databases, but then somewhat wrecked that invention with SQL.

Ian Joyner Bronze badge

Why listen to someone who won't buy Apple anyway?

Why would Apple listen to someone who probably won't buy anything from Apple anyway? Are you admitting that MacOS might be desirable to run rather than your beloved Linux, Android, or Windows?

The best computing experience is from integration – integration of both hardware and software. The best security is also from integration.

Bob Barton – the original 'think different' guy – realised this. He went on to teach Alan Kay and others who really invented today's industry and influenced Apple in no small way.

https://www.scribd.com/doc/61812037/Barton-B5000

Classy move: C++ 20 wins final approval in ISO technical ballot, formal publication expected by end of year

Ian Joyner Bronze badge

Re: C++ – never classy

"It seems to me you know very little about the object of your vitriol.". This is the typical response to try to make legitimate problems of C++ as vitriol on the part of those who raise them,

"Now there is an alternative to #include, there is no reason to use #defines either as the last use case, guard defines, are rendered redundant. Obvs."

No, they are orthogonal. You confused #include with #define in the first place.
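To spell out what I mean – a rough sketch of C++20, not a claim about anyone's codebase: modules replace textual #include, and the only #define they make redundant is the include guard; every other use of #define is untouched.

    // Traditional header: the guard #define exists only to survive repeated
    // textual #include.
    // widget.h
    #ifndef WIDGET_H
    #define WIDGET_H
    int widget_count();
    #endif

    // C++20 module interface (the file name is just illustrative):
    // widget.cppm
    export module widget;
    export int widget_count();

    // A consumer imports it instead of including text:
    import widget;

    // But an ordinary macro, e.g. #define MAX_WIDGETS 42, is a separate
    // feature and is not replaced by modules at all.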

Ian Joyner Bronze badge

Re: @Ian Joyner

"I'm still not Jake. Jake is one of at least two other entities here on ElReg."

Well, don't post as Jake.

Ian Joyner Bronze badge

Re: C++ – never classy

Someone Else: "Well then, I bet you're crying in your beer that C++ has effectively deprecated #defines with modules...."

Wait a minute – modules replace #include, not #define.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

Man inna barrel:

"Languages get messy because life is messy."

No, that is absolutely the wrong approach to languages.

Yes, the problems are complex. But that complexity (essential complexity) should remain in the problems. We should solve complex problems with simple tools – paradoxical as that sounds, it is true – not accept that "Languages get messy because life is messy." When the languages themselves get messy, that is accidental complexity, and you end up wasting a lot of time dealing with complexity that is not necessary.

"C is actually a very good language, because it provides just enough abstraction from the underlying hardware"

Which underlying hardware are you talking about? This is actually not true, but one of the simplistic maxims of C. I have ported large C systems (from well-known vendors) to platforms that C does not abstract well at all. C needs its own sandbox in that case. Mind you I knew the internals of that C compiler (written in Pascal!) and it was very well implemented.

"With C++, I am still learning C++11, so I am way behind. I lapped up many of the new concepts, but I still have a lot to learn."

Well, why this process of learning each step of C++? This is why I say that to understand C++ you actually need to understand its history and evolution, much more than with any other language.

Ian Joyner Bronze badge

Re: @Stjalodbaer-- Object

@Stjalodbaer-- Re: Object

"Like most programming languages, it has no formal logical or mathematical basis and is just a bundle of nostrums, as it seemed good in someone’s judgement."

Someone Else : "So a language that can be formally defined by BNF " has no formal logical or mathematical basis", and is "just a bundle of nostrums"?"

Wow! Just plain Wow...

So what constitutes a "formal logical or mathematical basis", in your nsHO?

BNF is only for syntax. A language should also be semantically rigorously defined. The problem with C++ is that it lacked this at the beginning. Strachey (the designer of CPL, the forerunner of C) did work on denotational semantics.

And as far as you can shoehorn C++ into BNF, that does not actually prove a formal syntax. That was retrofitted as well.

http://trevorjim.com/c-and-cplusplus-are-not-context-free/

Why talk about these language lawyer and formal things? Because if you get them right the languages are simpler for everyday programmers. When they are wrong you get all sorts of problems and headaches. Complexity is a pain and C++ inflicts a lot of that.
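A well-worn illustration (not mine, and not from the linked article specifically): the same statement parses differently depending on an earlier declaration, which a grammar alone cannot know.

    int a = 2;      // try replacing this line with:  typedef int a;
    int b = 3;

    int main() {
        a * b;      // with the variable: a multiplication whose result is discarded
                    // with the typedef: a declaration of a local int* named b
        return 0;
    }
    // Both versions compile (with warnings), but the parse of "a * b;" differs –
    // syntax alone cannot decide which it is.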

Ian Joyner Bronze badge

Re: C++ – never classy

Me: "Now here's the thing. People who have used such other languages and systems seem to be able to move on."

"Yes, and they seem to have moved on to C and C++...."

Like all matter gets sucked into a black hole.

Lock-in is the second-worst problem in the industry after security – which C and C++ are also very bad at. It is not a good thing, and the state of programming is not in good shape.

Ian Joyner Bronze badge

Re: C++ is great

So I said "It is you who miss the point. C and C++ are self-obsessed languages. You absolutely have to know what there is and why it might be marked deprecated or something softer. To understand C++, you must understand that."

Someone Else (SE) responds: "Complete garbage. I believe I have a firm understanding of C++ (I've been using it successfully and commercially since '92,"

That proves my point: SE has been following this for nearly 30 years – about the same time since I was assigned to a large X.500 C++ project. People who have been involved with the language for that long don't realise how much history and change they have absorbed along the way, but for someone new to the language, trying to understand all of that is difficult. That is why I said C and C++ are 'self-obsessed' languages.

"But understanding the history of the changes? Bah" you just don't get it. You have been doing it for 30 years. Perhaps you understand not to use pointers and rather references or smart pointers or whatever. It is not just a case of syntax (although C++ is peculiarly convoluted and ugly). To understand C++ and why it is the way it is you need to read the history and evolution books, more so than in any other language. And that is for regular programmers, when it should really only be for the language lawyers. In other words C++ exposes all that when other languages abstracts that working away.

"Oh, and "self-obsessed"? Puh-LEEZE! You have a lot of damn gall to project your personal opinion on the entirety of the ANSI and ISO standards committees. Methinks you overreach, my friend."

OK, I'll address that again. Complexity becomes self-obsessed. Complex technologies create lock-in. Note how obsessed C and C++ people are, always leaping to the defence whenever anyone criticises them. There is an obsession around C and C++ that is almost cult-like – deny anything is wrong and abuse those who point it out.

I've been involved in ISO myself and there is a lot of posturing that goes on. The observations are not overreach. There are probably some good minds trying to fix C++ and get it right. But it is still not in that position in 2020, and they still have things held over to fix in C++23 – things that have been around in other languages for 30 years. The problem is trying to kludge them into a language that was not suitable in the first place.

Ian Joyner Bronze badge

"I don't get all the hate for C++"

You don't get it because it is not hate. All programming languages are technical artefacts and should be analysed for what they are good for, where they are a match to some problem domain more than others. But they should also be criticised for their cross-domain flaws.

What computer people should aim for is simplicity. C++ is the antithesis of that. That is not hate on the part of people pointing that out – the problem IS WITH C++.

"If that's not your use case: pick another language. It's as simple as that"

No it's not as 'simple as that'. Many are stuck with having to suffer C++ because of its flaws. Some can see the flaws – others just don't know that is why they are having a hard time programming.

"It is definitely evolving."

Evolving towards what other languages were 30 years ago. Still, contracts are put off until C++23. Many people who know other languages roll their eyes and say 'C++ finally got that'. But then they look at it and find the C++ way is so much more obscure, and that it interacts badly with other parts of the language in non-orthogonal ways.

"The right tool for the right job should be the mantra for every programmer." The C and C++ world is full of these false mantras. Yes, there is some truth in that. but in contrast C++ tries to be the tool for all jobs and in that becomes overly complex. The right tools are simple and sophisticated. C++ takes the unsophisticated and complex path.

Ian Joyner Bronze badge

Re: Evolving

Timrowledge:

Yes, I quite agree.

"C users write a file with a cpp extension and claim “hey, I’m programming objects!”"

This is a very good observation and no less than David Parnas said it somewhere as "the worst thing about OO is that people think they are doing good programming just by using an OO language" – maybe not in those exact words. Anyway, he was making the point it is the clean and enforced design of APIs that is important, that is defining all possible interactions of objects.

With C++ you have pointers which can subvert the published APIs of objects.
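To show what I mean by subverting a published API – a deliberately bad fragment, purely illustrative:

    #include <iostream>

    class Account {
    public:
        int balance() const { return amount; }   // the only published access
    private:
        int amount = 0;
    };

    int main() {
        Account a;
        // The published interface gives no way to set the balance, but a raw
        // pointer cast reaches straight past it. Undefined behaviour, yet it
        // will usually "work" – which is exactly the problem.
        *reinterpret_cast<int*>(&a) = 1000000;
        std::cout << a.balance() << '\n';
    }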

"I’d argue that new programmers should be taught documenting before any sort of programming language!"

Well, a programming language is actually a documentation language, but what makes it better is that a program is a living document. That is one of the great things about programming.

Ian Joyner Bronze badge

Re: @Ian Joyner

Jake: "That's not ad-hom, it's an observation based on available evidence."

Oh, yes it is, you said:

"Quite frankly, you are coming off as a jaded, disillusioned academic."

You used the word 'you' referring to me and then made a remark, which is just wrong. Instead of addressing the subject which is C++, you attacked the person. That is ad hominem.

Jake: "Whoever I am, I am not Jake."

So you don't understand (or hopefully you do now if you read my last post) that system programmers should be hiding the details of the platform so that applications programmers don't all have to deal with those details (that is the aim of software at all levels), and yet you want to hide your own identity.

I can only think you have come here to troll, and this discussion has certainly declined into ranting.

Ian Joyner Bronze badge

Re: "Competent, core language"

Jake: "Wait, what? If a systems language is used for applications, the systems programmers have failed to do their jobs? How on Earth is it the system programmer's fault if an application programmer chooses an/the incorrect language for the application? And how the fuck does it follow that the system programmer doesn't understand their job, based of what some other person does?

Your illogic is mind boggling."

OK, you don't understand what the fundamental goal of system programming is. It is to provide a platform free of the underlying considerations of hardware and 'bare metal', as C programmers love to call it. That is, any hardware timing concerns that might affect the outcome of computations are handled, along with any other foibles of the hardware platform. Those details are abstracted away.

Similarly in distributed systems, middleware abstracts from the differences between systems to provide a single consistent view.

Put even more simply, the goal of any level of software is to provide a strong and consistent abstraction.

When system programmers think other programmers should be using C, exposing the lower level details that should have been handled, it means the system programmers have not done their job.
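As a rough sketch of what I mean by 'done their job' (the names here are mine, purely illustrative): the systems layer publishes a clean interface once, and application code never sees the hardware behind it.

    #include <cstdint>
    #include <vector>

    // What application programmers should see: no registers, no timing
    // windows, no 'bare metal'.
    class BlockDevice {
    public:
        virtual ~BlockDevice() = default;
        virtual std::vector<std::uint8_t> read_block(std::uint64_t index) = 0;
        virtual void write_block(std::uint64_t index,
                                 const std::vector<std::uint8_t>& data) = 0;
    };

    // The platform's quirks live behind this interface, written once by the
    // systems programmers, instead of being re-exposed to every application.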

I'm sorry if I wasn't clear on that before, and I hope that explains it to you.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

"Do keep carrying on talking about the 1950s."

What I'm suggesting is that we should not forget the language ambitions of those times. They were very far-sighted. For example, Christopher Strachey's CPL, which could not be implemented at the time, so Martin Richards did BCPL, out of which C came.

But C was a compromise at the time both because it was done on limited machines, and because it did not understand the vision of the other languages.

So anything based on C is going to fall short of what could be done in languages. C++ tried to fix some of that, but really just added its own level of cruft.

Yes, Dan 55, do move on. The rest of us have.

Ian Joyner Bronze badge

Re: "Competent, core language"

"Hilarious. Keep on claiming Smalltalk, ALGOL, Simula, Smalltalk, Eiffel, Pascal, Oberon, etc... are all perfectly designed"

No, I did not say that. However, it does not matter what people think – C++ is a terribly designed language.

Dan 55 is descending to ranting.

Ian Joyner Bronze badge

Re: "Competent, core language"

"None of the languages you cite are getting traction in the commercial world and you miss all the langagues that provide real competition like Java or are up-and-coming like Go and Rust. But apparently I am wrong."

See, that is what always happens: people like Dan 55 fall back on "oh, it's not used widely". That is the problem of today's populism. People like Dan 55 can't break out of the mould, are too scared to admit the flaws in C and C++, and can only keep beating the same drum.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

" was trying to get the point across that there was no contemporary competition to C "

You are living in the narrow world of C. There were plenty of other languages around when C came out. In the 1960s there was much language activity. C people like you like to take the view that C had no contenders. It did.

"I don't know, if you don't like C++ and even don't like C but cite languages like ALGOL and Pascal in other posts"

Because, again, that was to contradict your contention that there was nothing else. ALGOL was an ancestor of C, and yet in many ways better – better and rigorously defined. C was a step backwards. ALGOL was improved by CPL, ALGOL 68, Pascal and beyond. C and C++ have become stuck. Even though C++ might seem to have improved with new versions, that is still trying to fix mistakes, not any real improvement.

I have more than moved on, but we should not ignore the lessons from the late 1950s and beyond, because those people understood foundations of computing that are lacking in today's practitioners, who think programming comes down to a couple of flawed languages.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

Dan 55: "while the language itself remained pristine and unsullied and unusable so the academics were happy.

Meanwhile, in the real world"

You have this false division between academic and real world. While you persist in that false belief and distinction there is no discussion to be had with you. You live in some kind of 'fool's hell'.

Programming is an intellectual activity. It is based on computational models and virtual machines, some of which are implemented in electrical circuits.

You just want to feel like you can dismiss anything else as not being 'real world'. That is a narrow and silly view.

Computing must improve and move on and if that is a move to academic purity, that is a good thing.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

Dan 55 "See also Why Pascal is Not My Favorite Programming Language."

There is another thing here: C people treat whatever Kernighan, Ritchie, and Stroustrup have written as some kind of holy scripture and the last word on the subject.

Like I said, Kernighan has a lot of things right in that paper (which I have read several times over the years), but it is not the last word, and Pascal wasn't where that stream of languages ended. Like most things, people moved on from Pascal. But C and C++ seem to attract the kind of people who refuse to move on.

Ian Joyner Bronze badge

Re: Is C++ becoming too large and complex?

Dan 55: >>Pascal could manage a glorified bootstrapper with a GUI, which I guess is an improvement in MS-DOS, but Apple moved away from Pascal with the move towards co-operative multitasking and PowerPC.

See also Why Pascal is Not My Favorite Programming Language.<<

Apple moved to NeXT and OS X. So?

And the Kernighan paper – most of it I agree with. But most C people take it too widely and think that it applies to any language with Pascal-like syntax, and that is an excuse to remain in the C black hole. Kernighan's paper is not an excuse just to ignore everything else – it is about some very specific things about Pascal, very notably having array sizes as part of a type, which is wrong.

Kernighan's paper is not permission or exhortation to just ignore everything else.

Many of Kernighan's complaints were addressed in commercial Pascal implementations.

Ian Joyner Bronze badge

Re: C++ – never classy

Someone Else: "Modules"

Modules? Isn't that what separate files did with separate compilation, or maybe namespaces?

In another post you said just don't use things if you don't like them and you don't need to know all the history.

If modules really replace #define, then you need to know that you should use modules instead and why you should not use #define. That is all history, and it means having to know this about C++ – it has this thing, but don't use it.

Ian Joyner Bronze badge

Re: C++ is great

Someone Else (yet another cowardly anonymous poster)

>>It is much more than that. You need to learn what to use and not use and that means understanding why it was there, or still is there.<<

"No, you miss the point (and now I finally understand why you whine so much about C++ and revere academic oddities like Eiffel). You don't need to learn what to use, based on some abstruse history lesson. Rather, you need to learn when to use a feature, and by induction, when not to use it."

It is you who miss the point. C and C++ are self-obsessed languages. You absolutely have to know what there is and why it might be marked deprecated or something softer. To understand C++, you must understand that.

"revere academic oddities like Eiffel" That comment is no misinformed, it does not even deserve a response.

"It's a different mindset, and one that may not be available to you."

No, it is a mindset I consciously reject because it is a wrong mindset – one stuck in the mistakes of the past. One that is locked in and unwilling to see that there have been other much better things.

Ian Joyner Bronze badge

Re: C++ – never classy

Someone Else: "Well then, I bet you're crying in your beer that C++ has effectively deprecated #defines with modules...."

No, the Burroughs defines were argued about when they were first included. They were much more powerful than the weak C defines, with several levels.

It was very much text-based processing along the lines of the General Purpose Macrogenerator (which one of my language teachers worked on).

https://en.wikipedia.org/wiki/General-purpose_macro_processor#General_Purpose_Macrogenerator

Amazing what you find. Here is an article by Andrew Herbert (who I knew at ISO ODP meetings) on GPM:

https://www.computerconservationsociety.org/software/elliott903/more903/Manuals/CCS%20Resurrection%20GPM%20Article.pdf

Strachey then developed CPL, which became BCPL, B, and C.

A couple of us Burroughs people did a language called SDL on the Apple II, which was a cut-down version (because it had to be) of Burroughs and Elliott ALGOL. The main language designer did not put #define in it – he invented a rather neat macro mechanism to directly put in 6502 assembler code for things not worth doing in the language. However, in a bigger system (which we have these days) you would put such systems stuff directly in the language.

Now here's the thing. People who have used such other languages and systems seem to be able to move on. Sadly the same is not true for C and C++ people who seem to become rusted on to the deficiencies and bad ideas in C and C++.

Ian Joyner Bronze badge

Re: "Competent, core language"

Doctor Syntax:

Re: "Competent, core language"

""A systems language should only be used for systems, not for applications. Where a systems language is used for applications, it means the systems programmers have failed to do their job."

I think there's a couple of non sequiturs there."

You are wrong. And you don't even bother to say what the non sequiturs are.

I'll say it again, because it is right:

A systems language should only be used for systems, not for applications. Where a systems language is used for applications, it means the systems programmers have failed to do their job.

And more than failed to do their job – it probably means they don't even understand what their job is.

Ian Joyner Bronze badge

Doctor Syntax:

""'Best approximation of C++ ideals so far,' ... but is it too big and complex?"

I thought too big and complex were the ideals."

No, sophistication is the ideal – and sophistication results in simplicity. It is the complexity of the problems we should handle, not the complexity of the tools. Problem complexity is essential complexity – complexity in tools is accidental, self-inflicted complexity. That is the problem with C++.

Ian Joyner Bronze badge

Re: Too large and complex

"There a huge body of software written in C and its derivatives (including Java)."

Popularity is not a measure of quality. Neither is the misguided belief in the industry that C and its derivatives are good and all else is bad.

Ian Joyner Bronze badge

Re: Coroutines?

"C++ was an OO systems programming language leveraging the advantages of C and providing alternatives"

Most programming is NOT systems programming. The fact that systems programmers think all other programmers should be using C and C++ indicates the systems programmers have NOT done their job (they probably don't even understand that is their job).

Simula (like any language) can very simply be extended for system facilities – system facilities are actually very small and they should be limited to small areas of an OS and device drivers.

C and C++ are completely the wrong approach.

Ian Joyner Bronze badge

Re: C++ is great

The problem in computing is that programming is really powerful and you can do anything in just about any language, no matter how bad it is. BASIC programmers used to extol how good BASIC was and would not move on.

You must learn what orthogonality is – it will greatly help build better software.

Basically, orthogonality means the inclusion of features that are independent of one another and that don't interact in bad ways. It is the independence of 90º. Non-orthogonality means things interact in surprising and unpleasant ways. Complex languages are more likely to suffer from non-orthogonality. This makes it difficult to predict what will happen.

http://searchstorage.techtarget.com/definition/orthogonal
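A small, well-known C/C++ example of two features interacting badly: arrays silently decay to pointers when passed to a function, so sizeof stops meaning what it meant a line earlier.

    #include <cstdio>

    void report(int data[10]) {
        // Despite the [10], 'data' here is really just an int*.
        std::printf("inside:  %zu\n", sizeof(data));   // size of a pointer
    }

    int main() {
        int data[10];
        std::printf("outside: %zu\n", sizeof(data));   // 10 * sizeof(int)
        report(data);
        return 0;
    }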

Orthogonality is fundamental to good software design. Mind you, I do sympathise with you: because of all the academic-speak around 'orthogonality' (the kind of pretentious academic-speak I hate, where usually the person using the term does not know what it means either), it took me years to understand it – but the definition is simplicity itself (see what I did there!).

Ian Joyner Bronze badge

Re: @Ian Joyner

"Quite frankly, you are coming off as a jaded, disillusioned academic."

You descend to ad hominem. I just wrote this in another answer: my first exposure to this world was being taught Simula by a professor who had worked in the UK with some of the great minds in computing (he did not say it at the time, but I have found out since). I then became one of the first adopters of OO in this country, have done languages and compilers, many large software projects, and studied OO deeply.

So you can take your false assessments and you know what to do with them.

As for giving you (Jake, whoever you are) examples – well you can do your own research. C++ people just bash everything.
