back to article Python creator Guido van Rossum sys.exit()s as language overlord

Guido van Rossum – who created the Python programming language in 1989, was jokingly styled as its “benevolent dictator for life”, and ushered it to global ubiquity – has stepped down, and won’t appoint a successor. In a mailing list post on Thursday titled, “Transfer of Power,” he wrote: “Now that PEP 572 is done, I don't …

  1. Anonymous Coward
    Devil

    reflecting opinions more than best practice

    From The Article:

    [ ... ] some developers felt PEP 572 was a poor approach that reflected van Rossum’s opinions more than best practice.

    The same can be said about the entire Python Language.

    Actually, this has already been said about Python. Many times over.

    1. Anonymous Coward
      Anonymous Coward

      Re: reflecting opinions more than best practice

      Ah, repeating bilious burps. Annoying, those.

    2. The Man Who Fell To Earth Silver badge
      Mushroom

      Re: reflecting opinions more than best practice

      I'm old enough to know that Python is just the present "flavor of the month" programming language. It is only a matter of time before another comes along.

      1. JDX Gold badge

        Re: reflecting opinions more than best practice

        Flavour of the month for 15 years or so. It's been a widely used niche language for that time, widely espoused by developers as something that "should" take off more widely.

        In the meantime, Ruby came as a competitor... and seemingly went.

        1. Anonymous Coward
          Anonymous Coward

          Re: reflecting opinions more than best practice

          Flavour of the month for 15 years or so.

In the end it seems Python beat out Ruby, PHP, JavaScript, and Lua as the king of convenience languages, which only exist because C(++) takes forever to compile and nitpicks your syntax to death. Now that language designers have run out of 'clever' hooks for new languages, it is only a matter of time until a master architect distills this mess into a single proper language. But not in our lifetimes.

          And that's why we can't have nice things, etc, etc. "Learn to program, build anything your heart desires!" they said. But if you want to build real software, you'll mostly be learning and relearning the intricacies of ecosystems/jungles like C++, Python, .NET, and HTML5.

          Y'all see why I'm a cynical troll?

        2. The Man Who Fell To Earth Silver badge

          Re JDX: reflecting opinions more than best practice

          "It's been a widely used niche language..."

You can say the same thing about FORTRAN, COBOL, PL/I, BASIC's various forms, C's various forms, etc. They are all still around and used in their various niches. Flavor of the month does not require that they become extinct when the month is over. My definition of "flavor of the month" is whatever fad language entry-level college computer programming courses use. In my high school (in the '70s), it was a form of Assembly. In my college (also in the '70s), it was PL/I.

      2. Anonymous Coward
        Anonymous Coward

        Re: reflecting opinions more than best practice

        "I'm old enough to know that Python is just the present "flavor of the month" programming language"

If using a flavour of the month language means that Perl finally crawls away and dies then that's good enough for me. Anyway, after being around this long I think it's fair to say Python is part of the dev furniture now, not a newcomer.

        1. Anonymous Coward
          Alert

          Re: reflecting opinions more than best practice

          You should not have asked for that. Perl has crawled off, but it's crawled off into the abyssal void beneath the world we know where it waits, surrounded by an uncountable[*] host of chittering things, waiting, waiting for the time to be right to rise again and destroy this transient thing we call the world. Even now, in the machine room late at night I catch glimpses of unspeakable things and see the trails of slime on the racks: I no longer dare look under the false floor as I know what waits there. Very soon now we will die, horribly screaming as we are slowly eaten from within. And then our world will be a memory and shortly not even that: what remains will be darkness, and Perl, as there once was and as there always will be. As there always should have been.

          [*] You understand of course what I mean by 'uncountable': there are more of them than there are rationals. I know this because I have seen them.

      3. Anonymous Coward
        Anonymous Coward

        Another language will come along

        Yes, one with curly brackets, hopefully (yes, and proper indentation as well).

        I’m sorry, but I just find code much easier to read (and write) when I can actually see the block delimiters!

    3. MarkB

      Re: reflecting opinions more than best practice

As ST comments, surely that's always been the case with Python, just as it's the case (as far as I can understand) with Perl - the language enshrines the opinions and prejudices of its single originator. That's part of my issue with both languages, as I'm not keen on some aspects of both Guido's and Larry's opinions and prejudices.

      1. Anonymous Coward
        Anonymous Coward

        Re: reflecting opinions more than best practice

perl -- not so much, since Larry went on to try to create perl6. Since then it's been a bunch of curmudgeons who want to make it more obtuse and harder to learn. In addition to a recent backtrack on relaxing explicit prefix sigils (@%) for references (which always include their type), they even added postfix sigils -- little tails you can play "pin the tail on the var" with to indicate type.

        perl's problem has been that most of those in charge know nothing about computer languages or computer science and have no formal training. A few do and that shows, but machine room operators don't make for good language designers.

This lack of background, combined with expertise in perl, shows just as it shows for C++ -- the designers design for experts, not new users, with ideas & projects to help and attract new users shot down as too simplistic or too DWIM'y (Do What I Mean) -- one of the original design goals of perl that it's continued to drift away from. Anything that *they* wouldn't use or that deobfuscates the language is considered wrong. In addition to holes in the language that generate errors rather than provide a logical feature, they are adding more errors by taking away existing usable and useful features -- what a bonus! They don't want the language to go forward without them, so they are designing in its obsolescence. A shame really, especially since in many applications it ran 5-15 times faster than an equivalent python program -- more so if it involved parallelism.

But when it came to new designs, they believed in nuking problems rather than applying touch-up fixes. It took about 20 years to get back to a working Unicode that works the same as it did 20 years back -- if only they'd just fixed binary files (vs console text files). Instead they killed the whole feature, and now perl's Unicode has a permanently established bug where you can read in files in a local 8-bit encoding and they'll be written out in a multi-byte UTF-8 encoding. They believed it was better to preserve this bug for posterity for ancient (20yr+) program compatibility. On one hand they'd keep the bad, but on the other hand they have no problem destroying compatibility by adding new behaviors that are 'non-optional'. In regard to bugs, they regard them as virtues: "Historically, we've held ourselves to a far higher standard than backward-compatibility -- bugward-compatibility. Any accident of implementation or unintentional side-effect of running some bit of code has been considered to be a feature of the language to be defended with the same zeal as any other feature or functionality" -- thus bugs have been cemented into the language with the effect of crippling growth.

    4. phuzz Silver badge
      IT Angle

      Re: reflecting opinions more than best practice

      It's nice to see ST getting a perfectly balanced 17 upvotes and 17 downvotes (at time of writing). That's a fine line to walk.

      1. DropBear
        Joke

        Re: reflecting opinions more than best practice

        On the other hand, you seem a bit imbalanced with 1:0 - just say the word, I'd be happy to help...!

        1. onefang

          Re: reflecting opinions more than best practice

          Phuzz was 2:2 when I read it. I'm now reluctant to upvote.

    5. Anonymous Coward
      Alert

      Re: reflecting opinions more than best practice

      I think that's true: there are many, many questionable decisions in Python's design and it seems to gleefully ignore a lot of things that people understood how to do quite well. And that's before the gratuitous 2-3 incompatibility.

      But, it exists, it works everywhere, it has a big library. There's something to be said for that, and it almost certainly would not have happened without Guido van Rossum. It's easy for language snobs like me to snipe from the sidelines, but that's all it is: sniping.

      1. John Smith 19 Gold badge
        Headmaster

        it works everywhere, it has a big library.

        Library(s) (or rather libraries).

        But you're right.

        They are all pretty big.

        1. Anonymous Coward
          Anonymous Coward

          Re: it works everywhere, it has a big library.

I was treating the collection of all the generally available modules & packages as the Python library, hence singular. Also by 'big' I meant 'large in coverage' rather than 'a lot of bytes': since I turned off my 11/70 I've stopped worrying about counting bytes.

    6. asdf

      Re: reflecting opinions more than best practice

Most languages that have been around as long as Python have advantages and disadvantages. Bloat and performance have always been the knocks against Python, as is to be expected given what Python does, what it was designed for, and what it gives you. As always, horses for courses. Still glad it's around, as choice is always good, and I hope the community picks up the pieces well.

    7. John Smith 19 Gold badge
      Unhappy

      Actually, this has already been said about Python. Many times over.

      True.

      Think I'll still learn it though.

      It looks good enough to get the job done.

      1. asdf

        Re: Actually, this has already been said about Python. Many times over.

        >It looks good enough to get the job done.

        Probably true for vast majority of use cases out there. You wouldn't want to use it say for embedded systems development though. It does have some JIT capability IIRC but horses for courses.

      2. Tom 7

        Re: Actually, this has already been said about Python. Many times over.

For a while, until someone 'upgrades' something. I'm finding it increasingly difficult to 'get the job done again' as code that used to work no longer does, due to different choices of which version of the language/libraries to settle on.

        I'm trying to teach kids this stuff and having to take them through a virtual environment configuration for each flashy example is getting a bit fucking tedious.

    8. buttler987

      Re: reflecting opinions more than best practice

Yes, you are right.

  2. Notas Badoff

    Live! from your keyboard - your reputation

Um, wow. When the BDFL (retired) is minded to point to the code of conduct and also remind everyone that mailing lists are public information...? It has obviously been rough riding herd on a federation of (some) foul tempers.

    Two things to note here.

    No misogyny, misandry, racism, classism, or other -isms were employed in the making of this debacle. Just people being much much less than ideal. Thus this is a good (?) example to point to, that there are way too many people out there who simply don't know how to play well together. Quit trying to out-Godwin each other, everybody loses.

    From the nature of the interactions, you have the chance to be a much *nicer*, more *intelligent* person in print than in real life. When you forget one person's name in real life you've lost one future friend. When you forget your humanity on the web, you've lost your career.

    1. Dan 55 Silver badge

      Re: Live! from your keyboard - your reputation

      I'm glad this is a site where this post receives more upvotes than downvotes. Seems most Internet commentary is a shocking cesspit.

  3. This post has been deleted by its author

    1. a_yank_lurker

      Re: Futuristic progression of Programming Languages?

I think that if you can write the code then visual-style development is a great aid, because you should be able to hand-tweak the code if needed (though that is probably rarely done). Where I find visual-style development a disaster waiting to happen is when the person does not understand the basics of the underlying languages and cannot visualize what the code might actually look like.

      1. The Indomitable Gall

        Re: Futuristic progression of Programming Languages?

        I agree that visual programming is very limited.

        I think the future of code is in frame-based programming. Check out the Stride programming language used in the Greenfoot and BlueJ educational IDEs. It's designed to let you develop traditional line-paradigm code more efficiently by reducing the number of keystrokes and making syntax errors impossible and scope errors rare.

        Every type of code statement has a limited range of possible syntaxes, so Stride turns each statement type into a template where you fill in the boxes. As an educational programming language, Stride maps to a subset of Java, and any Java code can be called from Stride.

        It also renders the block delimiters vs meaningful whitespace debate moot, as blocks, scopes and indentation are handled by the editor automatically as the programmer is no longer dealing with plaintext.

        I think this frame-based paradigm has real potential to change coding practices in a way visual coding never really did.

    2. Anonymous Coward
      Anonymous Coward

      Re: Futuristic progression of Programming Languages?

      All hardware sucks, all software sucks, all languages suck. They all suck the same (but in different ways).

      If your eyes glow brightly when a particular language is mentioned, I'm going to treat you as radioactive!

      1. jake Silver badge

        Re: Futuristic progression of Programming Languages?

        "All hardware sucks, all software sucks, all languages suck."

        You forgot OSes, they suck too. And worse, you forgot fanbois, who suck in all kinds of spectacular ways.

        1. FeRDNYC

          Re: Futuristic progression of Programming Languages?

          You forgot OSes, they suck too. And worse, you forgot fanbois, who suck in all kinds of spectacular ways.

          Yes, well, the value of pointing out that all hardware, software, and OSes suck is to remind us to focus on an individual example's positives, rather than its negatives, because they all have negatives.

          There are no redeeming positives to fanbois, though, so there's less value in noting that they all suck, even though they most certainly do. They suck not only individually but collectively, and they should be dismissed in the same manner: Begone, all ye fanbois!

      2. Wilco
        Coat

        Re: Futuristic progression of Programming Languages?

        "All hardware sucks, all software sucks, all languages suck. They all suck the same (but in different ways)."

        ... except Scala, obvs

        Mine's the one with the trefoil on the back ...

      3. AndrueC Silver badge
        Thumb Up

        Re: Futuristic progression of Programming Languages?

        All hardware sucks, all software sucks, all languages suck...

        Yes, and as was already pointed out 'all operating systems suck'. I learnt that one back in the 90s when I briefly became an OS/2 fanboi. Since then I've forced/taught myself to use whatever is most suited to the task. Of course some of that 'suitability' is 'what managers/customers/the market wants' but a good engineer is a pragmatic engineer ;)

        So for now it's Windows and C# for me in the main. With luck that'll see me through another few years until I retire. Beyond that I will endeavour to try and not care.

    3. Anonymous Coward
      Devil

      Re: Futuristic progression of Programming Languages?

      > they all seem kind-of C-like

      Python is like C in the same way a goat's ass is like an orchid.

      Just sayin'.

    4. thames

      Re: Futuristic progression of Programming Languages?

      A program written in Python can be a fraction of the number of lines as a program which does the same thing in C.

      Time is money, or whatever other means you want to measure the value of time in. You can get a finished program in fewer man-hours. That matters in a lot of fields where being first to market is what counts, or where you are delivering a bespoke solution to a single customer at the lowest cost, or where you have a scientific problem that needs solving without investing a lot of time in writing the software part of the project.

      Python isn't the best solution to all possible problems, but it is a very good solution to a lot of problems which are fairly prominent at this time. It also interfaces to C very nicely, which allows it to use lots of popular C libraries that already exist outside of Python itself. These are why it is popular right now.

      There is no one size fits all solution to all programming problems. It is in fact considered to be good practice to write bits of your program in C and the rest in Python if that is what makes for a better solution for your problem. There is no necessity to re-write everything in Python in the manner that certain other languages require everything to be re-written in "their" language. The result is that Python has become the language of choice for a lot of fields of endeavour where you can reuse existing industry standard C and Fortran libraries from Python.
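For the curious, the C interfacing can be as simple as a few lines of the stdlib ctypes module. A minimal sketch (example mine, and it assumes a standard C library can be located; for serious work you'd reach for cffi, Cython, or a proper extension module):

```python
# Call a plain C library function directly from Python via ctypes.
import ctypes
import ctypes.util

# Locate and load the platform's C library (assumes a Unix-like system).
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare the C signature: size_t strlen(const char *s);
libc.strlen.restype = ctypes.c_size_t
libc.strlen.argtypes = [ctypes.c_char_p]

length = libc.strlen(b"python")  # calls the C library's strlen directly
```

No wrapper code generation, no recompile: the FFI boundary is just a few declarations.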

Van Rossum's "retirement" isn't a huge shock and won't make much difference. For quite some time other members of the community have been taking the lead in developing new features, and Van Rossum's main role has been to say "no" to adding stuff that was trendy but didn't provide a lot of value. Everything should continue along fine with the BDFL further in the background. Overall, it is probably a good idea to get the post-BDFL era started now while the BDFL is still around.

      1. jake Silver badge

        Re: Futuristic progression of Programming Languages?

        "A program written in Python can be a fraction of the number of lines as a program which does the same thing in C."

        This is both a blessing and a curse. Unfortunately, the folks who see it as a blessing will probably never fully understand why it's also a curse. Folks who know it's a curse debug code with the compiler's assembly output anyway, and don't care.

      2. The Man Who Fell To Earth Silver badge
        WTF?

        Re: Futuristic progression of Programming Languages?

"A program written in Python can be a fraction of the number of lines as a program which does the same thing in C."

This isn't unique to Python. And it's only a "blessing" in limited circumstances, when writing pedestrian software where knowing what is really going on is deemed unimportant.

        1. Vincent Ballard

          Re: Futuristic progression of Programming Languages?

          90%+ of software is pedestrian.

  4. Nick Kew

    Reinventing a more limited wheel

    I clicked the PEP 572 link. Seems to me that everything claimed for it has been accomplished by the C comma list since before Python was ever indented.

    (Never Python's greatest fan - can you tell?)

    1. thames

      Re: Reinventing a more limited wheel

      I would be fascinated to hear how you would do the following in one line of idiomatic C using commas.

      results = [(x, y, x/y) for x in input_data if (y := f(x)) > 0]

      The major objective appears to be avoiding duplicating work unnecessarily when doing multiple things in a single expression. The previous way of doing the above would have been:

      results = [(x, f(x), x/f(x)) for x in input_data if f(x) > 0]

      I can think of multiple instances in which I could have used this feature in cases similar to the above.

      1. tb7842676

        Re: Reinventing a more limited wheel

        [(x, y, x/y) for x, y in ((x, f(x)) for x in input_data) if y > 0]

        The new syntax is using less characters. This appeals to many programmers but I thought Python was not such a language.

        1. Tom 38

          Re: Reinventing a more limited wheel

          [(x, y, x/y) for x, y in ((x, f(x)) for x in input_data) if y > 0]

          The new syntax is using less characters. This appeals to many programmers but I thought Python was not such a language.

At $JOB, that would immediately fail code review. Nested list comprehensions are hard to comprehend, particularly compared to assignment expressions, and disguise their purpose. Let's run through how many PEP 20 violations that is: it's ugly, it's complex, it's nested, it's dense, and it has poor readability.

          1. FeRDNYC

            Re: Reinventing a more limited wheel

            Nested list comprehensions are hard to comprehend

            I'm impressed how you seemingly managed to say that without a trace of irony.

          2. Anonymous Coward
            Anonymous Coward

            Re: Reinventing a more limited wheel

            @Tom 38: Glad you mentioned it also. That code line is fucking horrendous. Just because you can do something in one line doesn't mean you should. Likewise I find that line to be a violation of the principles relating to code readability.

        2. FeRDNYC

          Re: Reinventing a more limited wheel

          The new syntax is not only using fewer characters, it's far clearer about what's actually going on than your example. Yes, you have to wrap your brain around the := operator, but that's just syntax and syntax is trivial. But when you compare your version:

          [(x, y, x/y) for x, y in ((x, f(x)) for x in input_data) if y > 0]

          vs. the PEP 572 version:

          results = [(x, y, x/y) for x in input_data if (y := f(x)) > 0]

          The fact that there will be a result for every item in the input_data list and that y = f(x) in the output are both made far clearer and more obvious in the latter form.

      2. Tom 38

        Re: Reinventing a more limited wheel

        I would be fascinated to hear how you would do the following in one line of idiomatic C using commas.

Well, newlines are optional and there is no limit on the number of statements per line, so pretty easy.

      3. rgmiller1974

        Re: Reinventing a more limited wheel

        I'm curious about the example thames posted. Is

        results = [(x, f(x), x/f(x)) for x in input_data if f(x) > 0]

        really any slower than

        results = [(x, y, x/y) for x in input_data if (y := f(x)) > 0] ?

        In the first expression, I can see that f(x) could potentially be evaluated three times, but would that actually happen? The example implicitly assumes that f(x) returns the same value each time it's evaluated. If that's the case, is the interpreter not smart enough to cache the result of f(x) for later use? (And if that's not the case - for example if f(x) returned x+time.time() - then the two expressions above aren't actually equivalent.)

        1. thames

          Re: Reinventing a more limited wheel

          @rgmiller1974 said: "I'm curious about the example thames posted. Is ... really any slower than ..."

          The interpreter doesn't cache the results of f(x), and I doubt it would be feasible to determine if it could do so in all cases. Static analysis couldn't determine that since function "f" could be written in another language (e.g. 'C') for which you might not even have the source code and dynamic analysis would run into similar problems.

          The new syntax achieves the same result under the control of the programmer as well as being useful in other applications. Plus, you can see what is going on without having to analyse the behaviour of f.
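A quick sketch with an instrumented f (counter name mine, not from the thread) makes the call counts concrete: the old form can call f up to three times per element, the := form exactly once.

```python
# Count how often f actually runs under each form (requires Python 3.8+).
calls = 0

def f(x):
    global calls
    calls += 1
    return x - 1  # f(x) > 0 only for x > 1

input_data = [1, 2, 3]

# Old form: f may run up to three times per element (filter + two uses).
calls = 0
old = [(x, f(x), x/f(x)) for x in input_data if f(x) > 0]
old_calls = calls  # 1 call for x=1 (filter fails), 3 each for x=2 and x=3

# PEP 572 form: f runs exactly once per element.
calls = 0
new = [(x, y, x/y) for x in input_data if (y := f(x)) > 0]
new_calls = calls  # 3 calls total
```

Here f happens to be pure, so both forms produce the same list; only the amount of work differs.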

        2. Claptrap314 Silver badge
          Facepalm

          Re: Reinventing a more limited wheel

          No, and that is precisely the point. If you think that

          results = [(x, f(x), x/f(x)) for x in input_data if f(x) > 0]

          is in any way equivalent to

          results = [(x, y, x/y) for x in input_data if (y := f(x)) > 0]

then I don't want you on my team, and may I never be affected by any of your code.

          There is absolutely no way to ensure that f(x) is idempotent. If you don't understand that, then step away from the keyboard.

          1. The Indomitable Gall

            Re: Reinventing a more limited wheel

            " There is absolutely no way to ensure that f(x) is idempotent. If you don't understand that, then step away from the keyboard. "

            And this is another reason to favour the PEP -- taking the same example

            results = [(x, f(x), x/f(x)) for x in input_data if f(x) > 0]

            if the function f is not idempotent, then we now have the possibility of throwing a divide-by-zero exception, which would cause the whole comprehension to be binned. (E.g. first call to f(x) returns 1, but in x/f(x), f(x) returns zero.)

            In the case of the assignment expression version...

            results = [(x, y, x/y) for x in input_data if (y := f(x)) > 0]

            ... as f(x) is only evaluated once, x/y will never result in a divide-by-zero exception.
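To make that failure mode concrete, here's a sketch (example mine) with a deliberately non-idempotent f that returns 1 on its first call and 0 afterwards, as in the scenario above:

```python
# f returns 1 the first time it is called and 0 on every later call.
state = {"n": 0}

def f(x):
    state["n"] += 1
    return 1 if state["n"] == 1 else 0

input_data = [10]

# Old form: the filter sees f(x) == 1 > 0, but x/f(x) re-calls f, gets 0,
# and the whole comprehension dies with ZeroDivisionError.
state["n"] = 0
try:
    _ = [(x, f(x), x/f(x)) for x in input_data if f(x) > 0]
    raised = False
except ZeroDivisionError:
    raised = True

# Assignment-expression form (Python 3.8+): f runs once, so x/y never
# sees the second, zero-returning call.
state["n"] = 0
results = [(x, y, x/y) for x in input_data if (y := f(x)) > 0]
```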

      4. Phil Endecott

        Re: Reinventing a more limited wheel

        > results = [(x, y, x/y) for x in input_data if (y := f(x)) > 0]

        That would surely be much more understandable in multiple lines:

vector<tuple<T,T,T>> results;

for (auto x : input_data) {

auto y = f(x);

if (y > 0) results.push_back(make_tuple(x, y, x/y));

}

        ...if that’s what it actually does....

    2. Flocke Kroes Silver badge

      Re: Reinventing a more limited wheel

      No because this is not about comma lists, which have very context dependent meanings in both languages. It is not about the comma operator, which in C means calculate the left argument for side effects and return the right argument (in python, the , operator creates a tuple of the two or more arguments). This is about expressions, statements and assignments. Both languages have contexts where an expression is permitted but not a statement, for example C: "while(expression) statement or {statement list}" and python: "while expression: statement or indented block". The difference was that in C assignments are expressions but in python assignments had to be statements. (Both languages allow using an expression where a statement is expected).

      The new feature in python is an extra assignment operator (:=) so assignment expressions are now possible. In the past, converting C: "if (a=b) {...}" to python required the assignment to be in a separate statement from the condition making it abundantly clear that the programmer did not intend C: "if (a==b) {...}". Python will now allow: "if (a:=b): ..."
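A minimal sketch of that conversion (example mine, Python 3.8+):

```python
import re

# Pre-PEP 572 style: the assignment has to be its own statement.
match = re.search(r"\d+", "build 42")
if match:
    found_old = match.group()

# PEP 572 style: the C idiom "if (a = b)" becomes "if (a := b)" in Python,
# while a bare "=" in a condition remains a syntax error, as before.
if (m := re.search(r"\d+", "build 42")):
    found_new = m.group()
```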

This has clearly caused a blood feud between different styles of language designers. On one hand, some people think that best practices must be forced down the throat of all programmers, because some of them will create insane code whenever the language allows it. Other people think that programmers that dumb are going to fuck up no matter what the language enforces, so the language should rely on the sensible programmers' self-discipline to follow best practices and not get in the way when a programmer has an outstanding reason to do something odd.

      Before you reply that clearly python has lacked C's assignment operator for years, there are plenty of things that C++ has attempted to copy from python (or whatever language python copied from), and often still struggles to get close to tolerable.

  5. jake Silver badge

    And people wonder why ...

    ... Linus goes all draconian on brats who don't play well with others.

    If it's your project, make it yours. People who don't like the concept can fuck right the hell off. You're probably better off without them anyway. Right, Mr. Wall?

    1. Voland's right hand Silver badge

      Re: And people wonder why ...

      Linus goes all draconian on brats who don't play well with others.

Indeed. Guido has been way too benevolent over the years. In fact, I am surprised Python got that far without regular exercise of the dictator part of BDFL.

    2. Flocke Kroes Silver badge

      Re: And people wonder why ...

      Actually people who disagree with the project leader can fork off. If they are right then others will switch to the fork and leave the old leader behind.

      1. jake Silver badge

        Re: And people wonder why ...

        Of course, Mr. Kroes! I wouldn't wish it any other way, either.

    3. Camilla Smythe

      Re: And people wonder why ...

In any online discussion about computer language development and/or maintenance, as the discussion goes on, the probability of a comparison with Linus Torvalds approaches 1.

      Assuming this has not already been postulated you may now refer to it as Camilla's law.

    4. Mike Moyle
      Joke

      Re: And people wonder why ...

      "I'm going to try and let you all (the current committers) figure it out for yourselves."

      Fortunately, given the "many eyes" theory of open source, we can at least be assured that -- whatever form the resulting administrative system takes -- it will be simple, breathtakingly logical, and satisfactory to all involved.

  6. vincent himpe

    Here's a PEP

I propose introducing proper keywords like Begin and End, or at least brackets of some type, as opposed to whitespace and indentation...

    1. thames

      Re: Here's a PEP

      Just use #{ and #} where you would like to use brackets and you can put in as many as you want.

    2. jake Silver badge

      Re: Here's a PEP

      Might want to add GOTO to your listie. All teh kewl h4xors use GOTO.

      1. Korev Silver badge

        Re: Here's a PEP

        Might want to add GOTO to your listie. All teh kewl h4xors use GOTO.

        10 print "Great idea"

        20 goto 10

      2. vincent himpe

        Re: Here's a PEP

Very funny. You need to look at the assembly created by compilers.

It is full of goto statements: jump, jump, jump. You can't make a CPU without a JMP operation.

So I don't get the hate against 'goto'. If I am here and the next few statements do not need executing, simply skip to the continuation point. Place a label there and off you go. It is up to you to make sure you don't imbalance the stack.

What is different between these two constructs:

-----------------
if x = 1 then goto continue_here
do this
then that
continue_here:
some more stuff
-----------------

vs

-----------------
if (x != 1) {
do this
then that
}
some more stuff
-----------------

The compiler still converts that if-then statement into a jump operation.

        1. Claptrap314 Silver badge

          Re: Here's a PEP

I spent eight years programming assembly. Here's my response: http://www.u.arizona.edu/~rubinson/copyright_violations/Go_To_Considered_Harmful.html

The fact that a computer necessarily branches while executing your code does not mean that your brain is fit to reason about the state of the machine when it does. Demanding that later maintainers tiptoe around the landmines created by gotos is at best unneighborly.

          1. jake Silver badge

            Re: Here's a PEP

            My admittedly bad and somewhat tired[0] throw-away joke aside, I quite agree with the comp.lang.c FAQ on the subject of GOTO. See section 17:10 here.

            [0] Some would say exhausted to the point of being a euthanasia candidate.

    3. Anonymous Coward
      Angel

      Re: Here's a PEP

      > i propose introducing proper keywords like Begin and End

      If you keep going down this path you'll end up writing FORTRAN. :-)

      But please make the Begin and End keywords uppercase mandatory:

      BEGIN PYTHON PROCEDURE Foo

      [ ... ]

      END PYTHON PROCEDURE Foo

      And yes, the name of the PYTHON PROCEDURE must be repeated in the END clause. What else are parsers for?

      Also: the line continuation token can be either a '&' on the preceding line, or a '$' on the succeeding line. Screw regexp. And the succeeding line must be indented by 4 tabs. Because.

      Also also: scalar variables must be declared with var, while all other variable types must be declared with dim. For clarity, and complexity reduction.

      1. Waseem Alkurdi
        Alert

        Re: Here's a PEP

        @ST

        BEGIN PYTHON PROCEDURE Foo

        [ ... ]

        END PYTHON PROCEDURE Foo

        And yes, the name of the PYTHON PROCEDURE must be repeated in the END clause. What else are parsers for?

        Visual Basic is calling and it wants its End Function/End Sub/End (something) back.

        while all other variable types must be declared with dim.

        Visual Basic is calling again and wants its Dim x As Integer back.

      2. Warm Braw

        Re: Here's a PEP

        you'll end up writing FORTRAN

        Proper FORTRAN didn't have BEGIN and END, though it did have significant leading whitespace, so perhaps had more in, er, COMMON with Python than is at first apparent.

        I'd offer an alternative vision of brevity if I could find my APL golfball...

        1. Someone Else Silver badge
          Happy

          @Warm Braw -- Re: Here's a PEP

          Proper FORTRAN didn't have BEGIN and END, though it did have significant leading whitespace, so perhaps had more in, er, COMMON with Python than is at first apparent.

          Ba-DOOM--tish!

        2. MacroRodent

          Re: Here's a PEP

          > Proper FORTRAN didn't have BEGIN and END, though it did have significant leading whitespace,

In classic FORTRAN, you had to have the leading white space for statements (I think 6 spaces was the standard, but most compilers also accepted a single TAB), and everything after column 72 was ignored. Otherwise white space was optional and ignored. And I mean optional: IF(I.GT.0)GOTO100 is valid and jumps to the statement labeled 100 when I>0. You could also add extra white space inside keywords and identifiers, so you could write GO TO 100 if you preferred that.

          1. MJB7

            Re: Here's a PEP

Don't forget "DO 15 I = 1. 100" ... which implicitly declares a real variable called DO15I and assigns it the value 1.1. Not to be confused with "DO15I=1,100", which runs a DO loop to the line labeled 15 with I varying from 1 to 100 inclusive. An error of exactly this kind is often said to have caused the failure of the Mariner 1 probe in 1962, though that attribution is probably apocryphal.

    4. JLV
      Trollface

      Re: Here's a PEP

While you are at it you should also open up a PEP to the LISP folk and tell them off about their parenthesis use. I am sure they're dying to have your input.

      It's a core language design decision. Doesn't mean it's a good one for everyone. But there's also nothing, zip, nada, zilch to keep you from coding in something else.

      1. codejunky Silver badge

        Re: Here's a PEP

To be honest I do appreciate the forced indentation formatting, but that is possibly because I have had to decipher code written by a lazy programmer who either didn't indent or, worse, indented sporadically without necessarily lining things up correctly.

While I don't do anything serious in Python (never had the need), I do keep coming back to it for my own little projects and I do like it. Like all languages it is a personal preference thing.

      2. onefang

        Re: Here's a PEP

        "LISP folk and tell them off about about their parenthese use"

        Wait, you are saying that LISP should replace parentheses with BEGIN and END? That'll be a lot of typing. Though I guess that would then be renamed LISBE.

    5. Charlie Clark Silver badge

      Re: Here's a PEP

      Try import braces…

      1. Robin Bradshaw

        Re: Here's a PEP

        >>> from __future__ import braces

        File "<stdin>", line 1

        SyntaxError: not a chance

      2. jake Silver badge

        Re: Here's a PEP

        I have a Mustang style export brace on my '70 Cougar ...

    6. Anonymous Coward
      Anonymous Coward

      Re: Here's a PEP

I used to think that. Then I started working somewhere with scientists. It turns out that the whole indentation thing means the code I have to look at is not just completely randomly indented, or indented from the right margin, or something.

    7. VeganVegan
      Unhappy

      Re: Here's a PEP

Geez, now you've dredged up a memory I tried to suppress all these years. I once used a language (BASIC? On the Apple II? Fortran on the IBM 1130?) that insisted that constants start with I, J, K or something like that, while the other letters are reserved for variables, or some such nonsense (yes, the memory is still somewhat suppressed).

      1. HandleAlreadyTaken

        Re: Here's a PEP

        IMPLICIT INTEGER (A,Z), REAL*8 (I,J,K,L,M,N)

      2. MJB7

        Re: Here's a PEP

It was FORTRAN. Variables that started I to M inclusive were INTEGER. All other variables were REAL. When I was using FORTRAN IV in 1980 you could at least *choose* to declare the types if you didn't like that. Very early FORTRAN didn't have that option.

  7. Anonymous Coward
    Anonymous Coward

    Any language that depends on differing amounts of whitespace to alter the program is stupid. This especially includes Python. It's worse than not using curly braces on a one line if statement in C.

    The lack of compile time type checking ain't great either. I've seen several large Python developments hit painful runtime problems years down the line.

I know that every pythonista's response is "unit testing", and they'll probably sling in the word "mock" too, but that's a stupid thing to do for what is essentially type checking. Compilers do that work for free.

    And as for the Python 2 vs 3 schism, oh deary me...

    1. Flocke Kroes Silver badge

      Unit Testing! :-)

      Why did C++ re-purpose the auto keyword to mean "the type no sane person can type in correctly even given five attempts and a type checking compiler with enormous error messages"?

      1. Anonymous Coward
        Devil

        Re: Unit Testing! :-)

        > Why did C++ re-purpose the auto keyword to mean [ ... bla bla bla ... ]

        C++ did not do that. But thanks for completely missing the point of C++'s auto.

    2. BinkyTheHorse
      Facepalm

      Really, because Python-style whitespace-based control flow is any more difficult for our powerful human visual cortex to process. Maybe that's why there's multiple competing styles on how to write that... oh, wait, no, that's brackets.

      I always derive a certain sense of Schadenfreude by imagining how people who complain about whitespace in Python would fare in times when not being able to move your arse out of your comfort zone meant not noticing that new-in-the-area predator preparing to ambush and maul you.

    3. thames

      @AC said: "Any language that depends on differing amounts of whitespace to alter the program is stupid. "

      For those who have moved on since the days of GWBASIC, everybody (other than you apparently) indents their code in a way which is intended to convey meaning about it.

      Differing amounts of white space alter the meaning of programs in all programming languages - in the eyes of the programmer for whose benefit those visual cues are present. The fact that in most programming languages indentation level doesn't alter the meaning of the program in the "eyes" of the compiler is a major problem.

      The Python compiler reads code the same way that a human would and derives meaning from the indentation level similar to how a human would. That eliminates whole classes of errors which would derive from humans reading it one way and the compiler reading it another. And once the compiler uses the same cues that the programmer does the block tokens become redundant and can be eliminated.
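The point about the compiler deriving meaning from indentation can be made concrete: the same statements with one line re-indented are a different program, and Python sees the difference exactly as a human reader would. A made-up sketch:

```python
# Identical statements; only the indentation of "return" differs.
def tally_after_loop(xs):
    total = 0
    for x in xs:
        total += x
    return total          # runs after the loop: sums everything

def tally_inside_loop(xs):
    total = 0
    for x in xs:
        total += x
        return total      # runs inside the loop: returns on the first item

print(tally_after_loop([1, 2, 3]))   # 6
print(tally_inside_loop([1, 2, 3]))  # 1
```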

      1. Anonymous Coward
        Anonymous Coward

        For those who have moved on since the days of GWBASIC, everybody (other than you apparently) indents their code in a way which is intended to convey meaning about it.

        Just because most languages are uncaring about indentation, it does not follow that the programmers who use them are bone idle scruffy devs churning out unreadable code. Most serious coding standards require code to be correctly formatted. For example, try getting scruffy C code past Linus Torvalds.

        Differing amounts of white space alter the meaning of programs in all programming languages - in the eyes of the programmer for whose benefit those visual cues are present.

A programmer relying on the outline shape of their code for clues to its behaviour isn't paying enough attention. How can something that you can't see, i.e. white space, act as a visual cue as to what's going to happen when you run it? Python gives you no help in this. In contrast, a curly brace, being visible, printable, and unambiguous, is a visual cue, which is why the better coding standards require them to be on their own line.

        The fact that in most programming languages indentation level doesn't alter the meaning of the program in the "eyes" of the compiler is a major problem.

        No, it's a major advantage. Furthermore it makes it possible to use a code beautifier to ensure that the code is neat, and a decent static analysis tool can point out where style problems have been typed. Python's problem is that it can look neat, and therefore is susceptible to being accepted as correct because only a human can pronounce on its correctness, but can in fact be utterly wrong.

    4. Charlie Clark Silver badge

      Any language that depends on differing amounts of whitespace to alter the program is stupid.

Python itself doesn't really need the whitespace; the syntax enforcement is for the users.

      Compile-time type-checking is now optionally available for those few projects that need it. Books have been written on whether compile-time type-checking brings any advantages over memory optimisation.
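The optional checking referred to here presumably means PEP 484 type hints, verified by an external tool such as mypy rather than by the interpreter; a sketch:

```python
# PEP 484 annotations: ignored at runtime, but a static checker such as
# mypy would reject a call like greet(42) before the code ever runs.
def greet(name: str) -> str:
    return "Hello, " + name

print(greet("world"))  # fine at runtime and under the type checker
# greet(42)            # passes the interpreter's parser; flagged by mypy,
#                      # and would raise TypeError only when executed
```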

      Unit testing won't really pick up type errors either, you need fuzzing for that.

      The 2 to 3 schism is now largely over. Most new projects are using Python 3, 3.5+ finally brings tangible improvements, 2.7 will be maintained until at least 2020. Projects can coexist for 2 and 3 with fairly minimal changes. That said, it's an investment with no immediate return for many older libraries. But rinse and repeat the discussion for any major release in any popular language.

      Face it, you're a grumpy old git who resents the success that many people have because of Python.

      1. Anonymous Coward
        Anonymous Coward

Python itself doesn't really need the whitespace; the syntax enforcement is for the users.

        Hmm, well try un-indenting your Python code and see how your if statements pan out.

        Compile-time type-checking is now optionally available for those few projects that need it. Books have been written on whether compile-time type-checking brings any advantages over memory optimisation.

        Welcome to the 1960s. Been there, done that.

        The 2 to 3 schism is now largely over. Most new projects are using Python 3, 3.5+ finally brings tangible improvements, 2.7 will be maintained until at least 2020. Projects can coexist for 2 and 3 with fairly minimal changes. That said, it's an investment with no immediate return for many older libraries. But rinse and repeat the discussion for any major release in any popular language.

        It should never have happened in the first place. It was always going to cause problems, those problems arose immediately, and they're still annoying today. And given that there's a shit load of code written in 2.7 (large chunks of a lot of Linux distros, for example), are we seriously saying that it is OK to risk all that falling prey to security vulnerabilities after 2020 unless it is re-written for Python 3? Python might have language features that make a programmer efficient, but making them write, review, test, release, deploy and support their code twice simply because some language theoretician got all excited about a new way of expressing themselves in Python makes them much less efficient. And error prone.

        AFAIK no other language has cut off source code bases written in previous versions of the language. For example, K&R C will still compile, just like it always did, possibly requiring a compiler switch to say it's OK.

        Face it, you're a grumpy old git who resents the success that many people have because of Python.

        Smirk. I've seen plenty of people get into deep, deep trouble with Python chasing idiotic bugs long after deployment that weren't picked up in development, simply because Python brings no certainty to what's going to happen when your code runs.

        All the so called "improvements" you've outlined in your post (hints of type checking, resolving the schism, etc) are simply acknowledging that the things being "improved" represent a litany of errors made by those in control of the Python language. It's like they've experimented with radical ideas, and then slowly understood why others didn't do them first. Next thing we know Python will add curly braces.

As it happens there's a lot I think that Python has got right, it's a useful language. I like the multiprocessing, list comprehension, string handling, etc. But the pitfalls are too problematic for it to be taken seriously as a major language. Plus it's dog slow, unless relying on a shit load of compiled wheels, or pips, or whatever (usually they're written in C). Just how pythonic is it to rely on code written in C?

Also there's the mad, lunatic idea of having a package manager as part of the language installation that is independent of all the OS-provided package managers and makes massive assumptions about what sort of C/C++ build chain is present. It makes deploying Python applications highly painful. It's fine for the enthusiastic dev who has root privileges, owns their own box, etc, but useless elsewhere.

        Plus there's the issue of the fact that Python is not quite the same on different platforms; Python on Linux does some things subtly differently to Python on Windows. To make for totally reliable Python the dev has to be on top of what all these differences are and code around them as required. Of course, no one actually does.

        I'll be sticking to my C, C++, C#.

        1. Charlie Clark Silver badge

          It should never have happened in the first place. It was always going to cause problems

          Some major changes in the language were inevitable after the introduction of new-style classes. Python 3 was more or less inevitable but the release was very badly handled. Thankfully, the core developers realised this at some point and did something to fix it. It's no longer an issue for the vast majority of developers.

          Smirk. I've seen plenty of people get into deep, deep trouble with Python chasing idiotic bugs long after deployment

          And I could point to a long list of projects that were saved by migrating to Python…

          Also there's the mad, lunatic idea of having a package manager as part of the language installation

          This is unavoidable because there is no installer that works for all OSes. I maintain a popular library and much as I moan about Python's packaging I'm so glad I don't have to package it for all the different Linux distros, MacOS, Windows.

It makes deploying Python applications highly painful.

          Strangely I rarely hear this, in fact usually the opposite. Deployment is such a huge problem that there are entire conferences devoted to the infrastructure.

          Python on Linux does some things subtly differently to Python on Windows.

          Python itself doesn't do things differently but invariably the different OSes have different APIs, file handling springs to mind. It's the same for any language that interfaces directly with the OS.

          I'll be sticking to my C, C++, C#.

          I'm happy for you. Why can't you be happy for people who, despite your predictions of doom, are successful with Python?

    5. GIRZiM
      Coat

      Any language that depends on differing amounts of whitespace to alter the program is stupid.

      And just what have you against Whitespace?

      You'll be turning your nose up at Malbolge next as well, I shouldn't wonder!

Bring back BBC BASIC, I say! It just needs to be modified to be compilable, and for procedures as well as functions to allow parameter passing, and you'll have a nice, all-purpose language capable of handling... I'll get my coat ; D

      1. Anonymous Coward
        Anonymous Coward

        Re: Any language that depends on differing amounts of whitespace to alter the program is stupid.

        And just what have you against Whitespace?

        You'll be turning your nose up at Malbolge next as well, I shouldn't wonder!

        Nothing wrong with Whitespace. I was going to mention Brainfuck, but I read that Wiki page first and saw that Malbolge builds on it. Deary holy mackerel. Still, I'll add it to the list of candidate languages for my next project (though possibly quite a long way down, but still above Python...).

        1. GIRZiM

          Re: Any language that depends on differing amounts of whitespace to alter the program is stupid.

          Deary holy mackerel

          If you can still find any reference to it, you might like to take a look at NNAPL (Neural Net Application Programming Language): a general purpose programming language that just happened to have built-in primitives for NN development.

The NN side of things was impressive: you created your nodes and layers by saying 'trainnet <data file>' or 'testnet <data file>' and it knew exactly what you meant - you didn't have to specify anything (not even the number of nodes or layers) and, as far as I could tell, it did it by magic (or possibly quantum).

          On the rather more 'exciting' (as in 'brown trousers') side of things, it had no data types of any kind beyond integers: when developing, you had to assign variables (even integers) by their byte size and add an extra byte for a CR/LF marker to indicate 'End of Variable'. Insane stuff that meant you could quite literally add 144 to Thursday and get a result that was, as far as the thing was concerned, perfectly valid - I didn't explore it greatly, but trying to teach it floating point arithmetic was really quite hairy.

          1. FeRDNYC

            Re: Any language that depends on differing amounts of whitespace to alter the program is stupid.

            5½ years later, rereading this comment, what's fascinating to me is how well it's aged.

            It's now 2024 (welcome, mind the hangover), and the difference from 5+ years ago is the subsequent explosion of AI into the public consciousness — in the form of large language models (LLMs), in particular.

            And, of course, when dealing with LLMs, the notion that "everything is a [number]" — or, more precisely, a vector of them — seems perfectly reasonable, even implicit. As does the notion of "adding 144 to Thursday", since working with LLMs is all about reducing everything to a vector of coordinates in n-dimensional space, then manipulating those vectors.

            (A classic example in the neural net space is that "King - Man + Woman == Queen". You take the three vectors on the LHS, you combine them as indicated, and you come out with something very close to the vector representing Queen.)

            I daresay this NNAPL's greatest weakness might've been that it was just a bit too far ahead of its time!

(Well, if it couldn't do floating-point math, then that was probably an even bigger weakness. Most models on Hugging Face etc. work in floating-point space, with fractional vector coordinates. There are models like MarkupLM constructed from integer data - its dataset is all torch.LongTensor values - but even those typically perform floating-point math and output torch.FloatTensor results.)

    6. The Indomitable Gall

      Whitespace/brackets

      Anyone arguing whitespace vs block delimiters is really missing the point -- these are both compromise solutions for coding in plaintext on memory-limited machines.

      Modern computers handle block nesting in much more sophisticated ways -- think XML.

      Take a look at Stride, an educational programming language based on Java.

      The editor is "frame-based", meaning all the block delimitation is implicit, and it's damn near impossible for a coder to mix up the flow control.

      It's also impossible to commit any syntax errors, as starting a line with if, while etc leads to a template being presented with all possible slots presented as text boxes, so you can only put something where it's permissible.

      Better still, you no longer have to type full commands, with a single keystroke indicating the line's key function: "=" for assignment, "i" for if, "w" for while etc.

It's quicker to code than standard Java, it's less bug-prone than standard Java, and it's made up almost entirely of keyboard shortcuts; and because it saves as XML, you technically could edit it in plaintext if you wanted. It seems to be everything programmers want... so why aren't programmers picking up on it?

  8. JLV

    Too bad. I've chatted with Guido twice and he is really a very nice, unassuming person. And you can tell that he is also quite engineer-curious - him and his posse got sidetracked wondering about snow load vs roof slope (or something like that). Just geek-ish fun, not at all business types.

PEP 572? I took a short look at it and didn't get it till I saw what @Thames had posted. I have wished in the past to reuse a filter-side expression. It's nice, though maybe a bit of syntactic sugar rather than a core need. If it's a really expensive computation you can always do it in 2 lines.

    result = [f(x) for x in input]

    result = [x for x in result if x]
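For what it's worth, the assignment expression PEP 572 introduces (Python 3.8+) collapses those two lines into a single pass; a sketch with a made-up f:

```python
def f(x):
    # hypothetical expensive computation standing in for the real one
    return x * 2 if x > 0 else 0

data = [-1, 0, 1, 2]

# Two-pass version, as above:
result = [f(x) for x in data]
result = [x for x in result if x]

# Walrus version: f(x) is computed once and reused in the filter.
result2 = [y for x in data if (y := f(x))]
assert result == result2    # both are [2, 4]
```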

    I guess maybe the pushback was concern that too much special casing of stuff and symbols ends up PERL-ing things up?

    Suppose it's all for the best though - the language will need to transition to another governance one of these days anyway.

    BTW, for those who just don't get it: liking Python does not mean a dislike of C. Quite the opposite, matter of fact.

    1. Charlie Clark Silver badge

      I think a lot of the more recent PEPs have been syntactic sugar, hence the pushback. And maybe it is time for Guido to take more of a backseat. I've met him too a couple of times and he is a really nice guy and very smart.

      Often the biggest win is using generators (or equivalents) with memory use being Python's biggest challenge.

      For those increasing hordes using Python with TensorFlow, et al., it's just noise.

    2. Tom 38

      Ahem

      result = (f(x) for x in input)

      result = [x for x in result if x]

      No point in having that extra list around wasting memory!

  9. A Non e-mouse Silver badge

I hope the Python community wakes up (quickly) and realizes what it's done. You must have really screwed up if your dictator leader walks away 'cause you've all been such ****s.

  10. Anonymous Coward
    Anonymous Coward

    I like Python and C

What I don't understand is why a language allows you to do things in two different ways (a Python one-liner or spread over multiple lines) while the Police at StackOverflow tell you how wrong you are when you write Python in a C style, and in fact belittle you, when, since it works, it is good code.

Coming from a C background some of the Python list comprehension stuff confuses me, so I avoid it. Those that tell me I am wrong and belittle me seem nothing more than fashion victims who would probably tell others that wearing a baseball cap with the peak at the front is wrong. If writing Python in a C style is so wrong, why does it work in Python? Seems like the developers themselves didn't know what they wanted.

    The style Police who show themselves in StackOverflow are clearly demonstrating the problems Python PEPs will have now that their dictator has gone. With a dictator a project moves forward, when fashion and committees come together to make a decision nothing can happen without a style war and bloodshed.

I like Python because I can pick and choose how I write code, and have flexibility. There's nothing better or easier than writing (something that could be a one-liner) over multiple lines so a simple print statement can be inserted as an extra (if condition, print the state at this point) that can be removed later. Then once happy I can combine it into a one-liner if I choose. Show this development code to a Python fashion victim midway through development and you'll get lambasted as a dimwit.
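That workflow (spread out for debugging, condensed once happy) looks something like this, with made-up data:

```python
data = range(10)

# Development form: easy to drop a print into the middle.
result = []
for x in data:
    if x % 2 == 0:
        # print("keeping", x)   # temporary instrumentation, removed later
        result.append(x * x)

# Condensed one-liner, once the logic is settled:
result2 = [x * x for x in data if x % 2 == 0]
assert result == result2
```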

We are already at the stage where a PC needs two versions of Python, where loads of modules that worked perfectly ceased development and then no longer work with Python 3. The way Python is going (may go) it'll repeat the Python 2 to Python 3 problem with Python 4 or 5.

    1. Gordon 10
      Mushroom

      Re: I like Python and C

      Then avoid SE. Simples!

      If your code performs the same over multiple lines as a one liner then ignore them. If the one liner is more efficient then suck it up and learn the art and zen of 1 line code.

Personally I hate "short hand" code and the syntactic sugar that is introduced to support it. It's just an excuse to act elitist, make things obscure to noobs, and a modern variant of not commenting your code. If it introduces new functionality or more speed or more safety then great. If it's elitist ego wanking then f*ck off. </thus endeth my rant>

      1. Baldrickk

        Re: I like Python and C

        I don't think comprehensions are obscure, they're just different, a construct that doesn't exist in C.

        But then again, someone moving from python to C will probably feel the same way about pointers and references.

Comprehensions have the advantage of being succinct, which is why I like them in general.

    2. Charlie Clark Silver badge

      Re: I like Python and C

      What I don't understand is why a language allows you to do things in two different ways

      Python doesn't really like this. Hence, the claim from import this, that There should be one-- and preferably only one --obvious way to do it.

Comprehensions and multiline expressions are more or less functionally equivalent, hence I wouldn't expect flame wars about them, but otherwise there is a sense of what is Pythonic. It's not canonical and is open for debate, but it's also considered a value in itself.

    3. JLV

      Re: I like Python and C

      SE has some extremely toxic individuals, esp in the G... police that moderates it

      As to 2 vs 3 an analogy might help: DOS’s 640k RAM limit.

Some design decisions were, in hindsight, wrong. ‘print’ as a statement rather than a ‘print()’ expression, for example. Means you can’t ‘[print(c) for c in x]’.

Revisiting them was considered better than enshrining them forevermore. Guido, who in temperament is very much a consensus-builder - ‘dictator’ is a bit of a Monty P joke - has mused that 3.x would be handled differently in hindsight.

Now, you can write 90% matching 2-3 code on 2.7, which is intended as a transitional 2.x bringing in as much of 3 as possible. There are automated converters. Almost all core lib packages written for 3, if they do not rely on 3's internals, get backported into 2.7. Being careful, you can use special scaffolding that keeps your code running on both 2 and 3.
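The scaffolding in question is mostly a handful of __future__ imports, which switch on Python 3 semantics under 2.7 and are accepted as no-ops on 3; a sketch:

```python
# On Python 2.7 these enable Python 3 behaviour for print, division and
# string literals; on Python 3 they are no-ops, so one source file runs
# unchanged on both interpreters.
from __future__ import print_function, division, unicode_literals

print("7 / 2 =", 7 / 2)   # true division on both: 3.5, not 3
```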

The problem was less the language incompatibilities _for your code_ than whether your core dependency packages (ORMs, web frameworks..) existed for 3. Plus, for the first 3-4 years 3.x was slower by 10-20%.

      So... much later, a cleaner language. How many times do we techies regret cruft and inconvenience kept in the name of backward compatibility? Yet, for all that, it was a traumatic birth and unlikely to be repeated in a 4.x release.

      1. Charlie Clark Silver badge

        Re: I like Python and C

Some design decisions were, in hindsight, wrong. ‘print’ as a statement rather than a ‘print()’ expression, for example. Means you can’t ‘[print(c) for c in x]’.

        I actually want my print statement back. Why on earth would I ever wrap a print call in a comprehension? Never ever felt even the urge to do this in the past. And if I ever did, two lines would be fine.

        Code that works on both 2 and 3 requires minimal changes if you can start with 3-style. Then it's really just unicode literals, ints instead of longs. It's a bit more work if you have extensions. All in all less work than changing a major component.

      2. jake Silver badge

        Re: I like Python and C

        "DOS’s 640k RAM limit"

Assuming you mean the MS-DOS/PC-DOS twins, they didn't actually have a 640K limit. They could use all the contiguous memory that you could give them. Really. Look it up. The actual limit was in the hardware. IBM, in its infinite wisdom, decided to put the system ROM at the top of memory. Had they put it at the bottom, DOS would have been able to use as much memory as the CPU could address.

        And it wasn't really 640K, it was more like 704K, if you knew what you were doing. Later, memory tricks allowed up to about 720K, later still around 950K. I find it absolutely amazing that the "640K DOS limit" piece of incorrect trivia is still being parroted as fact after all these years ...

        On the other hand, I personally remember Steve Jobs saying that "128K ought to be enough for anybody", at a meeting of the Homebrew Computer Club in late 1983, as he was demonstrating the original 128K Mac, just before the public unveiling. At the time, he had a point ... people were running flight simulators in 64K!

        1. JLV

          Re: I like Python and C

Well, I didn't know that. But I'm not sure I agree with your viewpoint that DOS had nothing to do with it.

          It's not that I disagree with the hardware info you state, you know more than I do about it, apparently.

          However, I well remember mucking around with config.sys to get DOS to see enough RAM to run a flight sim on my first pc. So, the OS, which should have abstracted away the hardware specifics, wasn't quite up to that role. Ditto tons of memory manager utilities floating around to help out.

          Also, IIRC, things like QNX had no trouble running 4MB RAM workstations.

          Regardless of the actual causes, it's still one of the better known examples of (hardware or software) design issues sticking around like a wart.

          Last "640K is enough for everyone" is, perhaps wrongly, attributed to Gates, so hardly surprising the meme would survive as a "DOS sin".

          https://www.computerworld.com/article/2534312/operating-systems/the--640k--quote-won-t-go-away----but-did-gates-really-say-it-.html

          1. jake Silver badge

            Re: I like Python and C

            JLV, if you configured your hardware such that the CGA memory space was replaced with RAM (CGA was at the bottom of the so-called "Upper Memory Area"), that 64K of RAM would be recognized by DOS automatically, giving you 704K of so-called "Lower Memory" without the use of third-party memory management tricks, or even any mucking about with config.sys or autoexec.bat.

            "So, the OS, which should have abstracted away the hardware specifics, wasn't quite up to that role."

            You are quite correct, DOS wasn't up to that kind of thing. But then DOS wasn't really an operating system, it was just a glorified program loader.

        2. Adrian Harvey
          Boffin

          Re: I like Python and C

          “IBM, in its infinite wisdom, decided to put the system ROM at the top of memory.”

          I’m not sure the blame doesn’t sit with Intel there - the 8086 processor bootstrap begins by executing code at FFFF:0000 - right at the end of memory (for those too young to remember segmented addressing, that’s 16 bytes shy of the 1MB highest possible address on the 20 bit address bus). So you would have to have some ROM there to handle the bootstrap process. And putting the system ROM somewhere else in the memory map would probably have required a second chip or some custom part.

          For all I know it may sit further back in computing history than that....

          1. W@ldo

            Re: I like Python and C

            IBM owning a big chunk of a then faltering Intel gave us the joys of segment:offset memory addressing. That set us back years in writing quality software in assembly. There were Zilog and Motorola chips around at the time that were much better and could handle direct memory addressing. We all paid a price for 15+ years until Intel reached that point.

            If you don't know how segment:offset works, take some time and you'll see the futility we all faced back in the day....I moved on to C, never embraced C++ and Python is the only interpreted language I became fond of. Too bad about the situation as it has been a fun ride.

            1. jake Silver badge

              Re: I like Python and C

              W@ldo, IMO, Intel's biggest problem back in the day was a lack of MMU.

              Frankly, I never had an issue with the segmented address space. Every CPU has its quirks, some are more quirky than others. They all suck, but we use 'em anyway.

              1. Charlie Clark Silver badge

                Re: I like Python and C

                Frankly, I never had an issue with the segmented address space.

                x86's memory addressing and "context switching" chained CPU performance to the 1970s for decades. Intel had admirable processes for a dreadful architecture. But, as with VHS over Beta (feel free to add your own examples), it's often not the best technology which succeeds initially. Eventually, however, the better technology is likely to be adopted.

                1. jake Silver badge

                  Re: I like Python and C

                  Rather clumsy wording on my part. What I meant was that I had no real issues with programming Intel's kludge of a flagship. It was there, it worked, we used it. Some were quite vocal in their displeasure, I just learned the faults & foibles & got on with it. Seemed easier than griping about a problem that was never going to go away.

                2. onefang

                  Re: I like Python and C

                  "Eventually, however, the better technology is likely to be adopted."

                  You had me nodding my head in agreement up until this line. Marketing wins over good tech every time, and then the better tech disappears from view, coz the marketing worked so well. By the time the better tech is generally accepted as actually being better, both techs are obsolete, and the current tech winner is the new shiny with the better marketing.

                  I think the reason is that the better tech knows they are better, and don't bother to market it so well, but the not so good tech knows it's not so good, so they pour money into good marketing to get their higher market share.

                  1. Charlie Clark Silver badge

                    Re: I like Python and C

                    Marketing wins over good tech every time, and then the better tech disappears from view, coz the marketing worked so well.

                    Tech that is "good enough" is likely to win through, but I think we've also seen examples of good technology winning through in the end. For example, similar to the VHS versus Beta battle, the microchannel architecture of the PS/2 was most definitely better than the ISA of the PC, but it wasn't that much better, plus IBM didn't want to repeat the BIOS problem so they didn't want to licence it. A few years later when a replacement for ISA became essential we got VESA local bus and PCI. VESA was first to market and the devices were cheaper but PCI won out. Okay, Intel's backing did help. Intel was also behind the push of USB for everything but at some point jumped ship to back Apple's FireWire follow-up that is "USB in name only". But Intel came a cropper by betting on Sprint and WiMax: not sufficiently good enough to get networks to dump LTE for it. The same goes for trying to convince the developers of embedded devices that x86 can do the same work as an ARM with the same power.

          2. jake Silver badge

            Re: I like Python and C

            True enough, Adrian Harvey. Note where I said "contiguous"?

            "And putting the system ROM somewhere else in the memory map would probably have required a second chip or some custom part."

            As happened starting with the 80286.

        3. Charlie Clark Silver badge

          Re: I like Python and C

          And it wasn't really 640K, it was more like 704K, if you knew what you were doing.

          Like running DOS inside OS/2…

          1. jake Silver badge

            Re: I like Python and C

            Indeed, Charlie Clark :-)

      3. Anonymous Coward
        Anonymous Coward

        Re: I like Python and C

        The actual decision that was wrong about print was having statements. Expression languages have none of these problems. But I realise that that's far too radical an idea to ever succeed.

  11. Potemkine! Silver badge

    Whoever is not a misanthrope at forty can never have loved mankind

    Facing the stupidity, violence, hatred from the many despicable assholes making a big part of the human kind is a harsh experiment. It's getting even harsher with the so-called 'social' media which are so great to propagate this stupidity, violence and hatred even more.

    I'm going back in the garage.

    1. Gordon 10

      Re: Whoever is not a misanthrope at forty can never have loved mankind

      Nice Weezer song, have an upvote.

  12. Anonymous Coward
    Anonymous Coward

    IIRC the use of ":=" for (all) assignments used to be the sensible choice since using the mathematical equality symbol for this was obviously inappropriate. But then people complained that not only was it two keypresses instead of one, but it also involved finding the shift key - adding "short hand" to programming languages has a long history.

    1. Wensleydale Cheese

      Re introducing ":="

      "But then people complained that not only was it two keypresses instead of one, but it also involved finding the shift key - adding "short hand" to programming languages has a long history."

      On a recent personal project which involves the manual entry of lots of time stamps, I decided to drop the colon mainly because it requires shift. I simply use HHMM format rather than HH:MM.

      It's my data and if I want to use it in other applications which expect a colon, it's a simple enough matter to let the computer do the work (via an extra workflow step, if necessary).

      There, saved myself a lot of keyboard work, and a useful by-product is fewer typos which need correcting.

      1. Anonymous Coward
        Anonymous Coward

        Re: Re introducing ":="

        Isn't that the sort of thing which caused the year 2000 bug, saving all the keyboard work on not typing 19. Recently I came across some files which shouldn't of been destroyed in 1910 which they hadn't been, maybe because they didn't exist then.

        Your data so you can do want you want, I've just spent to much time dealing with other peoples and there typos, when there wasn't anything in place to highlight the typos when they happened.

        Anon of course

        1. FrankAlphaXII
          FAIL

          Re: Re introducing ":="

          >>I've just spent to much time dealing with other peoples and there typos

          Instead of complaining about other people's typos, maybe you should worry more about your own, as your own sample of typing here isn't exactly a shining example of perfection.

        2. FeRDNYC

          Re: Re introducing ":="

          Two-digit years weren't so much about typing as about data storage, back in a time when saving two bytes per date seemed like real economy. If it was just about typing, they could've allowed operators to type a two-digit year, but stored it as 19xx, in which case Y2K would've been a snooze. (The same way UNIX has always stored dates as integers — seconds since the epoch — for efficiency, and therefore UNIX/Linux were largely immune to the Y2K bug, at least in terms of system code.)
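          A quick sketch of how that ambiguity survives: CPython's strptime follows the POSIX pivot rule for two-digit %y years, silently picking a century for you:

```python
from datetime import datetime

# POSIX pivot rule for %y: 69-99 are read as 1969-1999,
# 00-68 as 2000-2068. "10" therefore means 2010, not 1910.
print(datetime.strptime("99-12-31", "%y-%m-%d").year)  # 1999
print(datetime.strptime("10-01-01", "%y-%m-%d").year)  # 2010
```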

          Pascal did indeed use ':=' for assignment (and '=' for equality test), whereas C went with '=' and '=='. Which probably was about typing laziness regarding the shifted character.

          (Or possibly internationalization — did the ':' character appear on all early keyboard layouts? Does it appear on all of them now, for that matter? Obviously it's already required if you want to write Python code, so it's not an issue using it for ':='.)

  13. Anonymous Coward
    Anonymous Coward

    "Python can be a fraction of the number of lines as a program which does the same thing in C."

    Interesting... What are the Top-Things people like about Python specifically? Does it have easy syntax, rich libraries of functionality, no overbearing pointers, headers, macros, virtual-funcs, mem-management?

    1. Anonymous Coward
      Anonymous Coward

      Re: "Python can be a fraction of the number of lines as a program which does the same thing in C."

      Yes. And ease of use, and ubiquity across platforms, and the ability to do very rapid development, and built-in debugging/trace aids and...

    2. Flocke Kroes Silver badge

      Re: "Python can be a fraction of the number of lines as a program which does the same thing in C."

      Python has a buffer protocol that effectively wraps pointers and allows you to experience all the joys of debugging double free, use after free and memory leaks - if you want to.

      Importing python modules is a bit like including a header file.

      You can emulate macros with string templates and the codeop module - if you want to.

      All python methods are virtual functions.

      What I like about python is that programming styles from different languages are (to a varying extent) supported. I can pick the style best suited to each part of the problem and the interpreter will not add pointless road blocks to send me in the direction Bjarne Stroustrup knows is the only possible way to solve a different problem.
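      A minimal, safe illustration of the buffer protocol mentioned above, using the stdlib memoryview (no double frees required): the view wraps the array's underlying buffer without copying, so writes through it are visible in the original object.

```python
import array

# array.array exposes a writable buffer; memoryview wraps it in place.
a = array.array('i', [1, 2, 3])
m = memoryview(a)
m[0] = 99          # write through the view, no copy involved
print(a[0])        # 99 - the original array sees the change
```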

      1. Anonymous Coward
        Anonymous Coward

        'programming styles from different languages are (to a varying extent) supported'

        Cheers!

        How does that actually work in practice, can you say more?

        1. JLV

          Re: 'programming styles from different languages are (to a varying extent) supported'

          Different paradigms, rather than languages:

          Functional - the list comprehensions and iterators, maps, reduces, lambdas, all sorts of goodies (which I know little about).
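          The functional goodies listed above, in a minimal sketch (the numbers are arbitrary examples):

```python
from functools import reduce

squares = [x * x for x in range(1, 5)]               # list comprehension
evens = list(filter(lambda n: n % 2 == 0, squares))  # lambda + filter
total = reduce(lambda acc, n: acc + n, squares)      # fold/reduce
print(squares, evens, total)  # [1, 4, 9, 16] [4, 16] 30
```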

          Object Oriented - pretty much everything is an object, including classes themselves and functions/methods. There's a lot of depth in the data model that few people use. For example, classes are themselves objects with their class being a metaclass. That's useful, for some use cases, or for some people's programming preferences - I have a bit of a blind spot for them, they're an unused tool in my case. You can generate classes on the fly as well - say a class for each database table you are reading from.
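          Generating a class on the fly, as described, can be sketched with the built-in type() - make_row_class and the table names here are made-up examples:

```python
# type(name, bases, namespace) builds a class at runtime,
# e.g. one class per database table being read.
def make_row_class(table_name, columns):
    return type(table_name.capitalize(), (object,),
                {"table": table_name, "columns": columns})

Users = make_row_class("users", ("id", "name"))
print(Users.__name__, Users.table)  # Users users
assert isinstance(Users, type)      # classes are themselves objects
```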

          (One thing to beware of: mutable objects as attributes, at class level or as default arguments, bite everyone sooner or later. self._list = [] looks like cls._list = [], but in the first case appending with self._list.append(hit) affects only your instance; in the second, all of that class's instances.)
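          The classic mutable-default version of that bite, sketched (function names are arbitrary):

```python
def append_hit(hit, hits=[]):   # the [] is created ONCE, at def time
    hits.append(hit)
    return hits

print(append_hit(1))  # [1]
print(append_hit(2))  # [1, 2]  <- surprise: same list as the last call

def append_hit_safe(hit, hits=None):  # the usual fix
    if hits is None:
        hits = []
    hits.append(hit)
    return hits

print(append_hit_safe(1))  # [1]
print(append_hit_safe(2))  # [2]
```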

          Procedural - if you want to write something with a main calling all sorts of functions, there's really nothing forcing you to use classes or objects as your building blocks - sticking to functions is perfectly permitted. Ditto avoiding list comprehensions.

          Multiplatform/scripting - rare is the case where you really have to worry about Windows vs Nix. os.path.join("foo","bar","zoom") will do the right thing on either, barring issues with Windows C:/D: drive names.
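          That cross-platform joining, sketched with both os.path and its more modern pathlib equivalent (the path components here are arbitrary examples):

```python
import os.path
from pathlib import Path

# Both pick the right separator for the host OS.
p1 = os.path.join("foo", "bar", "zoom")
p2 = Path("foo") / "bar" / "zoom"
print(p1)              # foo/bar/zoom on *nix, foo\bar\zoom on Windows
print(str(p2) == p1)   # True - same result either way
```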

          Since functions are objects, you can assign any attribute, say a template, to a function. The reasons why you might want to do this are not common, but it can be helpful at times. For example, I explicitly assign template file paths to webserver functions because it allows you to automatically introspect which urls use which templates.

          def f_view(**kwds):
              print(f_view.template % kwds)

          f_view.template = "my foo is: %(foo)s"

          f_view(foo=2, bar="1")

          All these tricks need to be sanity-checked against clarity - it is just as possible to write incomprehensible code in Python as in C!

          1. JLV

            Re: 'programming styles from different languages are (to a varying extent) supported'

            things it doesn't do (I've probably missed some):

            - compile time checks - there are some basic syntax checks (dangling commas, bad indents, etc...), but nothing like a real compiler. It is an interpreted language, albeit a strongly typed one. Even the 3.x type annotations are more intended for 3rd party library parsing than real compile time type checking. That's a hard separation - you either want compile checks or you don't.

            - information hiding and encapsulation. There is no privacy as such to class and module attributes, though single underscore, _my_somewhat_private, by convention means non-public and double underscores, __my_almost_private, are obfuscated, but still accessible.

            - full-on threading. There's something called the Global Interpreter Lock in the main (C-based) version of the language that enforces code locks. It looks like full threading from the POV of the coder, but code blocks will take their turn in some cases. Different ways exist to mitigate, and it looks fine from the dev's POV, but it's still there.

            - speed. You can find cases of quick Python programs that compare fairly favorably to C alternatives, but that's just because the algos are not CPU-bound. Or they are, but the heavy lifting could be left to objects which are implemented in C. For example, the built-in hash maps are very clever and can often make a huge difference in speed, but they're C-based, not native Python. Ditto things like pandas or numpy, used in data science - libs are all in C, but dev need not care. Generally, Python knows full well that it can't do everything quickly and goes out of its way to facilitate interfacing to native compiled code.

            Pure Python CPU-bound code? Slow. Writing a driver in Python? Not a great idea.

            - manual memory management. Which means you may experience the joys of garbage collection kicking in at inopportune moments.

            1. druck Silver badge

              Re: 'programming styles from different languages are (to a varying extent) supported'

              Multi-threading is pretty useless in Python unless you are launching system commands, any real parallelism has to be done using multi-processing (multiple processes), and is far more difficult to get right. It's more akin to writing code for MPP systems in that all data has to be transferred between sub processes using slow mechanisms.

              On Linux, fork allows the sub processes to get any existing data already calculated in the main program (although they can't modify anything not explicitly shared), but on Windows each sub process starts with a clean sheet and all data it needs must be sent to it. Unless you are doing a lot of processing with a small amount of data, there's a big risk that multiprocessing ends up slower than just using a single process.

  14. petef

    Larry Wall for BDFL

    Let Parrot flourish

  15. CAPS LOCK

    I hope he goes on to start a robotics company...

    ... general purpose ones...

  16. Paddy

    Change is inevitable.

    Thanks for your role as BDFL for so long. You did so well in helping to bring Python to where it is, and in fostering such a great community.

    Now I have to sit and think of what more I might need to/could do to help grow Python and its community.

  17. Someone Else Silver badge
    Coat

    Hmmmm...

    So, all you have to do is post a bunch of disagreements with der Führer to have a disruptive change in top level management? I guess it’s good that Guido isn’t leading Linux. (But I wish it was that easy to affect the "top management" in Washington....)

    1. JLV

      Re: Hmmmm...

      yessssss, our first Godwin.

  18. jelabarre59
    Joke

    Fictional leader

    Well, I understand Miss Kobayashi is a crack Python programmer (according to this screenshot). You don't have to deal with a real person, and anyone who disagrees with her decisions will have to deal with the wrath of her maid/dragon.

  19. IGnatius T Foobar ✅

    This happened to Perl too...

    Actually, Guido wasn't the first to do this. Larry Wall stepped down some time ago, but no one was able to read his resignation letter.

    *rimshot*

    1. FeRDNYC

      Re: This happened to Perl too...

      ...I don't want to find that joke baked from low-hanging fruit funny ("Hey, this guy is finding humor in the common perception that Perl syntax is hard to read, do they have a Nobel Prize for Humor yet?") ... ... ... but dammit, it IS funny.

      There's More Than One Way To Earn An Upvote.

  20. Anonymous Coward
    Anonymous Coward

    Is this equivalent to the following, widely recognised as horrible in any C-like?

    if (a = f(b)) { ...

    rather than

    a = f(b);

    if (a) { ...

    Because if so, I'm absolutely on the side of the haters.

    1. FeRDNYC

      Re: Is this equivalent to the following, widely recognised as horrible in any C-like?

      Soooooorta, but you're also inside a for loop (which is what a list comprehension is, an expression evaluated for every member of a given list), so there's also a sense in which it's a shorthand for (in Perl):

      foreach my $x ( @input_data ) {
          my $y = f($x);
          push(@output_data, [$x, $y, $x/$y]) if ($y > 0);
      }

      Yes, the assignment is inside the if statement, but that's because you've already compressed the for loop into a one-line comprehension and there's really no way to locate it anywhere else. The point is to call f(x) only ONCE, assign it to y, and then output the tuple iff y > 0;
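      For comparison, the Python 3.8+ walrus version of the same idea - f and input_data here are stand-ins, since the original comment left them unspecified:

```python
# f is a stand-in function; := binds y exactly once per element,
# and the tuple is emitted only when y > 0.
def f(x):
    return x - 2

input_data = [1, 2, 3, 4]
output_data = [(x, y, x / y) for x in input_data if (y := f(x)) > 0]
print(output_data)  # [(3, 1, 3.0), (4, 2, 2.0)]
```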

    2. FeRDNYC

      Re: Is this equivalent to the following, widely recognised as horrible in any C-like?

      And, in fact, PEP 572 isn't JUST about list comprehensions. It adds in-expression assignment generally, a very common language pattern which Python previously lacked. So, while these two expressions are equivalent:

      foreach my $x (@input_data) { # Perl

      for x in input_data: # Python

      other common in-loop assignments such as (again, Perl):

      while (my $x = $parser->get_token()) {
          do_stuff_to_x
      }

      had no direct syntactic Python equivalent. But with PEP 572, this can now be written in Python as:

      while x := parser.get_token():
          do_indented_stuff_to_x

      In-expression assignments like that are a common pattern across languages. They can be abused, of course. But C is the king of "enough rope to hang yourself", so a feature being open to abuse isn't an argument against the feature; it's an argument against abusing it.

      (And I would argue that the list comprehension form does not count as abuse. There's a reason they went with := instead of =, and it was to avoid exactly the type of horrible C code you mentioned.)

      Because while

      if (a = f(b)) {

      is absolutely horrible C due to its "hidden" assignment, if the syntax were

      if (a := f(b)) {

      that would be far less problematic, because it's clearly different from a == test for equality.
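      A runnable sketch of the while-loop form, using io.StringIO as a stand-in for the hypothetical parser:

```python
import io

buf = io.StringIO("alpha\nbeta\n")
tokens = []
# The walrus assigns AND tests in one expression, like the Perl idiom;
# readline() returns "" at EOF, which is falsy and ends the loop.
while (line := buf.readline()):
    tokens.append(line.strip())
print(tokens)  # ['alpha', 'beta']
```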

  21. mfabian

    It seems to me, you will be useful to read this https://webcase.studio/blog/advantages-and-disadvantages-python-programming-language/

    You will see the pros and cons of Python programming language over Java, C, C++, and understand why companies prefer Python.

    1. FeRDNYC

      Nice spam, but as a warning for other readers,

      You will see the pros and cons of Python programming language over Java, C, C++, and understand why companies prefer Python.

      ...You really won't.

      What you'll find is a bunch of empty, regurgitated half-platitudes with no real explanation or justification. (Example: "Python has some unique characteristics that are valuable for programmers because they make coding easier." ...That's it, the claim is never expanded upon.)

      Plus, you'll find blatantly incorrect nonsense like:

      "Another advantage of Python programming is that no bug can originate a segmentation fault."

      ...Wanna BET? (This is the sentence that immediately follows the previously-quoted one, BTW. Like I said, no justifications, just jumping from nonsense point to nonsense point.)

      Or how about:

      "C++ or any other typical scripting language might be easier to use for constructing mobile applications or games"

      #WAT. Android native code is written in Java. iOS native code is written in Swift or Objective C. There are a few exceptions (like Qt's mobile framework), but really what is this even TALKING about?!?
