Peyton-Jones's Book of Martyrs
Nearer than you think. Simon P-J's wife Dorothy is a priest at my local Anglican church.
Obvious icon choice.
"Today, object-oriented programming (OOP) rules the IT industry absolutely. It is impossible to dislodge. While functional programming (FP) has seen a resurgence of late, it is typically used as an adjunct to OOP." – Louis Cyphre, 'Why are functional programming languages used so rarely in practice?' The leader of the …
Back in the mid-70s, at the U of St Andrews, one of our lecturers was Dave Turner, who created the functional programming language SASL, and later moved to the University of Kent at Canterbury where he developed Miranda, an ancestor of Haskell.
His colleagues thought it was a great joke to allocate him to give us the Fortran course... and scheduled the lectures for 0900 in the morning.
For some reason, my Fortran was never very good...
"His colleagues thought it was a great joke to allocate him to give us the Fortran course... and scheduled the lectures for 0900 in the morning."
Ah, the joys of being a young lecturer. Lectures at 9:00 Monday and 5:00pm Friday, while the Profs & Readers get the slots just before lunch midweek.
"For some reason, my Fortran was never very good..."
Was anybody's? There was a joke back in the 70s & 80s that as all the particle physicists shared code, a significant number of the particles discovered were probably Fortran bugs.
I remember FORTRAN as being painful because of the decidedly non-agile development cycle. It was at school in the 70s, where coding took place on paper worksheets, transformed into cards using a manual hand punch, and then the card decks were sent via mail to the local technical college (remember them?), where the program would be compiled and the almost inevitable error message sent back to you. A week-long process for one iteration. But eventually I did get a graphical plot, on the classic green stripy paper, that looked a bit like a pair of boobs - as any 14 year old would want.
No, that is why you punch sequence numbers in columns 73-80 of the cards. The sequence numbers should increment by 10, to allow insertions between lines without re-sequencing.
If you look hard enough in the back of the computer room, there is an old electromechanical sorter, with 10 output bins, and you know how to program it (by wiring a punchboard) to perform a radix sort, most significant column first.
And if you are an honor student, you can write a simple program for the computer to punch the numbers for you. If you are a total genius, you can program the 029 keypunch to do it.
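How the punchboard was actually wired is beyond me, but the algorithm itself translates directly. A minimal Python sketch of a most-significant-column radix sort over eight-digit sequence numbers (the bins standing in for the sorter's ten output pockets):

```python
def msd_radix_sort(cards, pos=0, width=8):
    """Sort fixed-width numeric strings most-significant column first,
    the way the comment describes wiring a 10-bin card sorter."""
    if pos >= width or len(cards) <= 1:
        return cards
    bins = [[] for _ in range(10)]          # one bin per digit 0-9
    for card in cards:
        bins[int(card[pos])].append(card)   # drop each card in its bin
    out = []
    for b in bins:                          # recurse into each bin in order
        out.extend(msd_radix_sort(b, pos + 1, width))
    return out
```

The recursion into each bin is what the operator did by hand: re-feed each pocket's cards through the sorter on the next column.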
I remember Dave at Kent - he was still teaching Fortran occasionally - and actually mine was quite good. What I really remember are the SNOBOL4 courses - run for arts students, mainly English students who wanted to do analysis on texts - very popular with CS students for some reason (nothing whatsoever to do with the large number of women who attended, of course).
I just use "array.sorted", but I'm sure that there are much more complicated ways to achieve the same thing. ;-)
A functional version of quicksort is only about 7 lines long, and the implementation is very close to the abstract algorithm, making use of recursion.
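For reference, the short version usually looks something like this - sketched here in Python rather than Haskell, with list comprehensions mirroring the classic one-liner; the copying makes it a demonstration of the idea rather than a production sort:

```python
def quicksort(xs):
    """Functional-style quicksort: partition around a pivot and recurse.
    Stays close to the abstract algorithm, at the cost of copying lists."""
    if not xs:
        return []
    pivot, rest = xs[0], xs[1:]
    return (quicksort([x for x in rest if x < pivot])
            + [pivot]
            + quicksort([x for x in rest if x >= pivot]))
```

Seven-ish lines, and you can read the algorithm straight off the code.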
"Inefficiently, yes, and with the danger of blowing the stack, yes."
You seem to be under the impression that functional languages have some magic fairy dust they sprinkle on their recursion. Newsflash - it all gets converted to assembler under the hood, so the recursion either works the same way as in procedural languages, by pushing a stack frame and making a call, or it's simulated by iterative loops and heap allocation. The latter can also be done in procedural and OO languages, which oddly enough is what most functional language compilers & interpreters are written in.
"You seem to be under the impression that functional languages have some magic fairy dust they sprinkle on their recursion."
Yes, they do. The magic fairy dust is called bloody necessity. You have to have fast recursion in a functional language or it all goes pear-shaped. In an object-oriented language, you often don't.
Case in point: the JVM does not implement generalised (or indeed any) tail-call elimination. A problem for both Scala and Clojure, which are compiled to JVM byte code. Solution: both use a hack in their respective compilers to provide at least a specialised tail recursion that addresses the issue in most cases.
Java could do it, but doesn't. Clojure and Scala do do it, because they have to.
"Yes, they do. The magic fairy dust is called bloody necessity. You have to have fast recursion in a functional language or it all goes pear-shaped"
Congratulations on not understanding the point I was making. I guess you think functional languages use a special functional based CPU rather than the normal one? Never mind.
"Congratulations on not understanding the point I was making. I guess you think functional languages use a special functional based CPU rather than the normal one? Never mind."
The magic fairy dust is this: if you structure your function so that the recursive call is the last act of your function, then Erlang will reuse the same stack memory, allowing you to write a recursive function that iterates over a very long list without running out of stack space.
Other languages don't tend to do this, as the programmer will have written a loop instead.
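Python is a handy counter-example here, since it deliberately does no tail-call elimination - so the transformation a tail-call-optimising compiler performs can be shown by hand (a sketch of the idea, not Erlang's actual mechanism):

```python
def sum_rec(xs, acc=0):
    """Tail-recursive sum: the recursive call is the last act, so a
    language like Erlang reuses the stack frame. CPython does not,
    so a long enough list raises RecursionError."""
    if not xs:
        return acc
    return sum_rec(xs[1:], acc + xs[0])

def sum_iter(xs):
    """What a tail-call-eliminating compiler effectively turns it into:
    the accumulator carries the state, the stack never grows."""
    acc = 0
    while xs:
        acc, xs = acc + xs[0], xs[1:]
    return acc
```

Because `sum_rec` passes all its state forward in the accumulator, rewriting the call as a jump back to the top of the function changes nothing observable - which is exactly why the pattern matters in a language where recursion is the only loop.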
I never understood the value of sorting arrays, except as a way to demonstrate a technique. I would always build an index with the desired new sort order. OK, I admit that this comes from a time when batch processing (without BAT files) was a thing. Yeah, and sometimes I'd be looking at a potentially confusing element such as myarray(index1(index2(counter))). Sigh. I suppose that's why they invented debuggers.
You work out a way to "copy" a list efficiently, so that you can make a new one for every step without it going really slow. It's easy to do this with a singly linked list; you can share identical list tails between all other lists that have the same tail. It's relatively easy to program, although quite hard to visualize, which is a recurrent theme for functional programming.
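The tail-sharing trick is small enough to sketch directly - here with a hand-rolled Python class standing in for a functional language's built-in list type:

```python
class Cons:
    """Minimal persistent singly linked list node: prepending allocates
    one new head cell and shares the entire tail with the original list."""
    __slots__ = ("head", "tail")
    def __init__(self, head, tail=None):
        self.head, self.tail = head, tail

def prepend(x, lst):
    return Cons(x, lst)        # O(1): no copying, the old list is the tail

def to_pylist(lst):
    """Walk the cells into an ordinary Python list, for inspection."""
    out = []
    while lst is not None:
        out.append(lst.head)
        lst = lst.tail
    return out
```

Two lists built by prepending different heads onto the same tail literally share that tail in memory - `prepend(1, shared).tail is prepend(0, shared).tail` holds - which is why "copying" a list at every step of a functional algorithm can be cheap.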
The other way is to google the answer. Although we are, apparently, going through a functional craze at the moment, people have been thinking about this stuff for years. All the simple and obvious questions were answered 20 years ago, so you usually find the answer if you really want it.
Them were the days all right.
Running RSX-11D on a very early PDP-11/40 with 56KB of RAM and three 2.4MB RK05s
We developed a flight simulator to test aircraft avionics. We could run the Real Time simulation while others could compile Fortran. All on a 200KHz machine.
So nowadays, we need a 4GHz PC to answer emails? [redacted]
Not sure if things really have got better
I learned programming on the HP2000F Time Sharing System, 4K RAM, 5MB HD, 32 simultaneous users.
... so I dutifully followed the link, and encountered a vastly expanded backstory compared to when I was there last, far far away a long time ago possibly around 1977. Newer illustrations as well.
I recognise those eyes. And those porcine cheeks.
From the thumbnail sketch of the Hatt family tree provided, I expect we shall also find a connection to the Hutt family tree, with Jabba being on a cadet branch (line).
The directors of national rail companies related to some of the most notorious organized criminals in the Galaxy. Who'd have thought it?
The closest I've got to functional programming is SQL and that has often made me fear for my sanity.
Perhaps functional programmers have travelled through a black hole and out the other side into hell and ended up as some unnatural human-functional code merged construct, like in the Disney film of the same name*. They don't make them like that any more.
* The Black Hole, not that whole sentence.
I think that Functional Programming means different things to different people. I don't go in for the really abstract stuff.
As someone who has 20+ years of successfully writing professional C in embedded systems, then my view of functional programming is that it is generally a better way to more quickly write programs that are easier to understand and have far fewer problems.
If I try and sum it up in a nutshell, then it feels to me that imperative programming is generally about telling the computer *how* to manipulate some data, but functional programming is more about telling the computer *what* manipulation you want it to perform on the data.
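A toy illustration of the "how" versus "what" difference, with Python playing both roles:

```python
data = [3, -1, 4, -1, 5, 9]

# "How": spell out the loop, the accumulator, and the branch yourself.
squares_imp = []
for n in data:
    if n > 0:
        squares_imp.append(n * n)

# "What": declare the filter and the transformation; the plumbing is the
# language's problem, not yours.
squares_fn = list(map(lambda n: n * n, filter(lambda n: n > 0, data)))

assert squares_imp == squares_fn == [9, 16, 25, 81]
```

Same result either way; the functional version just says less about the mechanics and more about the intent.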
You might think that this makes functional programs really slow, but their speed is comparable to imperative programs, e.g. a reasonable Scala program is generally within 3 times the speed of highly optimised C. And, by highly optimised, I mean that most professional C programmers would not choose to write the code that way, and other programmers needing to maintain it wouldn't be thanking them for it either.
Further, it is also my belief that it makes me a better C programmer because I start to think about things in different ways, and I try to write my code differently, so that it doesn't just work today, but so that it will work reliably (even when extended) in 20 years time.
Finally, I see that the best techniques of functional programming are making their way into mainstream languages like Java, and the new crop of languages like Swift, Rust, etc are adopting even more functional principles, just because it makes programmers' lives easier.
Perhaps google for "Programming in Scala" by Martin Odersky, Lex Spoon, and Bill Venners. The first edition is freely available online, and gives a good way into functional programming from one of the people very much at the forefront of modern programming language design ...
You're one of the few commentards to "get it," Robot W. Functional programming is not some obscure pastime for nerds, but a powerful way of thinking about programming and the management of complexity in the systems we create. From experience, a solid foundational knowledge of FP makes programmers more effective in any language.
The article was still mildly funny though, if only for the excellent TtTE reference.
And a humble way of programming is to use an ASR33 teletype as your I/O device while you code everything up.
All in all a humbling experience.
Of course, the alternative before that was a keypunch and cards with a turn around time of over an hour, but I digress.
I remember the particle physicist (it might even have been Jim Virdee in his PhD student days) running an overnight job on our DECsystem-10 that produced a whole box of line printer paper for its output.
He looked at the first page, realised it had all gone wrong, and scrapped the output.
Still, it helped the Christmas Party fund along nicely.
I'd like to posit the claim that having to wait a non-trivial amount of time to get code results may actually make for better code, not worse. Because you don't want to explain to your boss why you took 48 hours to produce a core dump, so you take the time to code it right the first time. As opposed to the hack/build/hack/build loop that we are in today.
Discuss.
Well, I agree.
It's basically "measure twice, cut once". Inconvenient obstacles make you think things through a little further. Useful not only when writing programs.
(FORTRAN 77 with punchcards, a 40 minute bus ride to the uni datacentre, waiting anything between a couple of hours to a week until they had time to feed your stack into the card reader. Yes, diagonal stripe in big felt tip pen on the side of the stack. Columns 78-80 had to contain your initials. Still, after the first trip to the datacentre to collect your printout and stack which inevitably meant collecting something that had not worked as intended, you thought harder and longer before you punched that card.)
From https://rjlipton.wordpress.com/2009/05/27/arithmetic-hierarchy-and-pnp/
> While working for Church, Stephen was trying to show that the lambda calculus could compute anything “computable”. Unfortunately, Church and Kleene could not see how to compute even the following function:
> $x \rightarrow x+1.$
> Yes, even the lowly successor function was not obviously computable in their calculus. The story goes that one day Stephen was getting his hair cut, when he realized how to encode the successor function into the lambda calculus. It is claimed, he jumped out of the barber chair, with half a haircut, and ran to tell Church. Not quite Archimedes running naked through the streets of Syracuse, shouting “Eureka,” but close. I think the story, true or not, says more about the nature of the lambda calculus as a basis of computing than anything else, but that is my take.
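For the curious, the encoding Kleene hit on works in any language with first-class functions. A Python sketch of Church numerals and the half-haircut successor (the standard encoding, not anything specific to the linked post):

```python
# Church numerals: the number n is represented as "apply f, n times".
zero = lambda f: lambda x: x                       # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))    # one more application of f

def to_int(n):
    """Decode a Church numeral by counting applications of +1 on 0."""
    return n(lambda k: k + 1)(0)
```

`to_int(succ(succ(zero)))` comes out as 2: each `succ` wraps exactly one more application of `f`, which is the whole trick that reportedly interrupted the haircut.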