Re: Penny in the air?
I also came here to find the obligatory, "I have my own crackpot Grand Unified Theory" post.
Why, hello there.
5761 publicly visible posts • joined 20 Jul 2010
the underpinnings of a model must be simpler than the higher levels of a model.
I can show you some nice intuitive UIs and the code that underpins them as a proof-by-example of the falseness of that hypothesis.
Hell, your argument is akin to saying the workings of a clock must be simpler than the clock face.
You really have some more thinking to do if you believe that, just because you can't see something, it must be simpler than the things you can see. Reality isn't a computer game, where if you clip through the surface there is nothing underneath.
Really, profit is the number you want to look at, not turnover. You could have a turnover of £100bn, but that's not great if your costs are £200bn, because you'd be making a £100bn loss.
A quick google, though, tells me that GoDaddy's revenue is an astounding $3.7bn. They must be really fleecing their customers for that, unless they're doing something to earn that sort of dosh beyond hosting and domain name registrations.
It doesn't surprise me, when one of the board members of Nominet is an executive from GD. A cynic would suggest that he is only interested in the size of his bonus, and a bonus from a larger organisation (Nominet) can be larger than one from a smaller one (GD). Like corporate executives everywhere, his primary focus is on his own salary, not the wellbeing of the company; otherwise there'd be no such thing as large companies taking over smaller ones and asset-stripping them for short-term profits.
The ones making the laws mostly didn't study law, they all studied PPE (no, not that PPE, the other one) at Oxford, or even worse, Classics.
The notable exception, of course, being the current Leader of Her Majesty's Opposition, who not only studied law, went on to practise it, became a top barrister (QC) and then the Attorney General (head lawyer of the CPS), and got knighted for his service to the Crown as a result. Perhaps people should pay more attention to him, rather than the sack of blubber opposite him...
Your erstwhile organ has neglected to report on the apparent route the ship took in the Red Sea prior to being wedged into the canal, as reported elsewhere:
(with apologies for linking to the bastard offspring of the Fascist Daily Mail)
Everyone knows status icons are displayed as white in the Android task bar. If an app developer failed to do even the cursory testing that would have indicated this, whose fault is it? Clue: not the OS.
As for "If I set the background to be white, I can't see the white things in front of it" - whose fault is that? (hint: might be the person setting the background)
Guess what? It used to be the case in Windows (I don't know if it still is, because I don't waste my time doing the electronic equivalent of navel-gazing and sit there fiddling with personalisation) that you could set all the customisable bits of the OS to the same colour. A lime green window, with a lime green title bar, and lime green text, lime green scroll bar, etc. etc. Only an idiot would actually do such a thing and then claim it was the fault of the OS. I have a strong inkling that you may not agree, for fear of self-incrimination.
To me, Apple sounds far from "pretty damn good". Overpriced hardware with built-in obsolescence, hard to repair, phones which have a 50% chance of having a broken screen at any time, judging from the number of smashed iPhone screens I've seen on people's desks. Ecosystem lock-in, hardware you can't even put your own software on until it has been "approved" by the manufacturer. Shops that look like recruitment centres for a cult religion, with "geniuses" that probably put their underpants on backwards half the time. I could go on.
Sure, other companies aren't great either, but standing up for Apple by attacking Google is a pure straw-man argument. "Yes your honour, my client murdered the victim, but look over there, Fred West murdered more".
The fact of the matter here, is that Apple knew their keyboard design was shoddy, and it's pretty reasonable to assume that in normal usage, the odd crumb is going to fall into a keyboard. It's not a huge stretch to assume that crumbs aren't going to stop your keyboard from working, especially since keyboards have been around longer than the computers that use them.
Granted, there's a bunch of infrastructure that needs to be maintained, and upgraded as hardware reaches EoL. That hardly needs a board to oversee it, it's a technical day-to-day job by people the board would never involve themselves with. I think of the hardware people in my organisation, and the sort of things they have to do on a day-to-day basis, with no board involvement whatsoever, and I work in an organisation that by its nature manages a lot of change for a lot of clients, in various and myriad ways.
The job of a registrar, on the other hand, is notable in its unchanging nature: Keep a list of domain names, manage renewals, maintain infrastructure hardware such as root DNS servers, have DR plans for such, and so on. All stuff which should have been well settled a couple of decades ago. It's not like the protocols that underpin the internet change substantially on an ongoing basis. The last actual change I can think of is the deprecation of TLS 1.0, and unless I'm mistaken, that will have affected Nominet in nothing but the most cursory fashion, since they are responsible for the domain names, not the transport layer implementation of individual websites.
it's well within the realms of reason that every antagonistic, duplicitous trick the board played in the last six weeks cost them the 5 percent they lost by.
It's also quite possible that their tricks - being secretive about how voting numbers were calculated, a concerted campaign of FUD, making email addresses of members unavailable to campaigners whilst bombarding the same members with pro no-vote emails, and so on - actually narrowed that margin to only 5%. If they had played fair, it could well have been a bigger margin, but by the sounds of it, the likelihood of them deciding to play fair at the last minute was minuscule. Once they started down the road of heavy-handedness and obfuscation, there was no going back, and it's good to see them get their just deserts.
My experience of people who claim they never make mistakes is of a machine-room manager in a previous job who had uttered the immortal line, "I don't make mistakes," in my presence, about a fortnight before his wife left him.
Any person, or organisation, claiming such a thing is exhibiting an unwise amount of hubris.
...and even if it's fizzy (permissible I s'pose if it's sunny out) it shouldn't have an inch of froth on top, and certainly not below the rim of the glass.
But then, I like a pint of real beer, not that shite with detergent in it.
I've never quite figured out why they carbonate bottled ale, though; the same stuff that would be completely flat from a beer tap is, for some reason, expected to have fizz that goes up your nose if it's bottled. And most of them are pasteurised, so they can't even claim live bottling...
One would hope that this is because testing is getting better (and thus finding more existing issues) rather than the developers getting worse, and creating more of them.
Whatever happened to test-driven development? One would have thought that with all the emphasis on things like dependency injection in .Net Core, Microsoft would be writing perfect code by now. In reality, I suspect they are carving out great chunks of unverifiable C++ instead.
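For contrast, the test-first workflow lamented above takes only a few lines to demonstrate; the `clamp` function and its test here are invented purely for illustration:

```python
# Step 1 of TDD: write the test first. It fails until clamp() exists
# and behaves correctly.
def test_clamp():
    assert clamp(15, 0, 10) == 10   # too big: pinned to the top
    assert clamp(-3, 0, 10) == 0    # too small: pinned to the bottom
    assert clamp(5, 0, 10) == 5     # in range: left untouched

# Step 2: write just enough code to make the test pass.
def clamp(value, low, high):
    """Pin a value into the closed range [low, high]."""
    return max(low, min(high, value))

test_clamp()  # green: the red-green-refactor cycle is complete
print("tests pass")
```

The point of the discipline isn't the test itself, it's that untestable designs get flushed out before the code exists, which is exactly what great chunks of unverifiable C++ avoid.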
Unless you're going for a lockable metal shed with no windows (which is going to advertise that there may be something worth stealing inside), you should consider a garden shed as merely a means to keep (some of) the weather off your tools.
As an allotment holder, we are advised not to put locks on our sheds, because when the local thieves break into the site (which they usually do once a year or so, on a stormy night when nobody is around) they will just break the hinges off the doors if they are locked. Fixing or replacing the door on a shed costs more than replacing whatever low-value items they decide to steal. If you're silly enough to keep anything that is valuable, and light enough to steal with little effort, in a shed, then you are asking for a lesson in risk assessment.
And for any operation that will change data, you should have started that with a BEGIN TRANSACTION.
That way, if you screw it up, the worst you face is the locking caused by shuffling all the affected records into the log and back again when you do a ROLLBACK.
Of course, if you want to empty a massive table, you might want to forego the transactions and use a TRUNCATE to avoid the log file bloat. You'd better make sure you have your steel-reinforced underpants on first though before going anywhere near a TRUNCATE command (on a production server or otherwise, because of that situation where you thought it was the test environment but it turned out not to be).
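The BEGIN TRANSACTION safety net above can be sketched with Python's built-in sqlite3 (the exact T-SQL syntax, like BEGIN TRANSACTION vs BEGIN, varies by engine; this in-memory table is purely illustrative):

```python
import sqlite3

# In-memory database purely for illustration; isolation_level=None
# gives us manual control over transactions.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice'), ('bob')")

# Start an explicit transaction before the destructive operation...
conn.execute("BEGIN")
conn.execute("DELETE FROM users")  # oops, forgot the WHERE clause
# ...so the screw-up can be undone with a ROLLBACK.
conn.execute("ROLLBACK")

count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(count)  # 2 - both rows survive the botched DELETE
```

Without the BEGIN, that DELETE would have autocommitted and the only road back would be a restore from backup.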
It's trivial to handle short disconnects by a retry, but how long should the software wait before giving up, and how should that be handled? It's hard to be graceful when your database suddenly isn't there, especially if the software is in the middle of some complicated dance between several different things that need to be coordinated. The sort of things where you're trying to commit the result of an operation to several databases to update them to say you're finished, but the last one goes down as you're committing and doesn't respond with whether the transaction was completed. Do you reverse everything else you just committed on the other data sources and risk the DB coming back up with a completed status when it has actually been rolled back? Or do the opposite and risk an operation being duplicated?
Sometimes failing gracefully isn't graceful...
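The "how long before giving up" question above can be sketched as a capped retry wrapper; everything here (the function names, the delays, the fake flaky commit) is invented for illustration, and deciding what "give up" actually means - raise, queue for later, reverse the earlier commits - remains the hard, application-specific part:

```python
import time

def with_retries(operation, attempts=3, delay=0.01):
    """Retry a flaky operation a fixed number of times, then give up."""
    last_error = None
    for attempt in range(attempts):
        try:
            return operation()
        except ConnectionError as err:
            last_error = err
            time.sleep(delay * (2 ** attempt))  # exponential back-off
    raise last_error  # out of patience: surface the failure to the caller

# Simulate a database that drops the first two connection attempts.
calls = {"n": 0}
def flaky_commit():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("database went away")
    return "committed"

result = with_retries(flaky_commit)
print(result)  # committed - succeeded on the third attempt
```

Note that this only papers over short outages; it does nothing for the in-doubt transaction problem, where the commit was sent but no acknowledgement ever came back.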
I wouldn't call C++ a high-level language here though; it's compiled down to machine code, and still has low-level memory management (malloc and pointers). It's really just C with bells on. Perhaps wrongly, I refer to C and C++ interchangeably.
Languages like Java or C# compile to bytecode, which runs in an interpreter. Yes, you can run Python on embedded devices, but again, this relies on an interpreter, which makes it a lot slower than something compiled to run on bare metal.
I'm not thinking of anything so sophisticated as something that can run a GUI either - things like Arduinos and Pi Picos that are being used to control things at a very basic level, and which have very limited memory and storage to work with. Things where available memory is measured in kilobytes.
Sometimes you just don't have the luxury of the headspace to run even a cut-down kernel if you need to use that memory for things like shifting a lot of data about between I/O channels.
Touching the car body is unlikely to properly ground you after building up a static charge, rubber tyres are generally a good insulator. There was a trend in the '80s and '90s of having "earthing straps" on the backs of cars, because of the belief that static charge causes car sickness. As any fule kno, car sickness is, of course, caused by witchcraft.
Well, there's several potential (geddit) ways for there to be a static potential between the user and the pump handle, from the pump not being grounded, and the user being grounded, to the other way round, or neither. Clothing can build a static charge, from moving around, etc. The main point here, though, is that if the user is holding onto the pump, they are going to be at the same potential all the time whilst they are doing so, and the only discharge is going to occur when they initially touch the pump, which has a metal body, and is most definitely grounded, and unless something is seriously wrong, won't be surrounded by an explosive mix of air and petrol vapour, unlike the tip of the nozzle whilst it is pumping.
People really did smoke a lot then though. When you see films and series made now and set in the '80s or earlier, it's a dead give-away when nobody is smoking, and all the white walls aren't yellow.
It's remarkable what we put up with back then. Even back in the '90s, I recall night-clubs where you couldn't see the other end of the room, and if they were hot and sweaty, there would be tar dripping from the ceiling. I can't say I miss it.
I believe the reason for not having the "click the handle and let go" thing here is that there are instances of static charge building up on the nozzle whilst it is pumping, and then sparking when it is picked up again, igniting the petrol vapour. I don't know if this is apocryphal, but having to hold onto the nozzle while it pumps would mean that it would be continually earthed through your body, and it wouldn't spark when you touch it again.
Apparently not having the locking pins also helps prevent accidental spillages, from nozzles falling out of tanks, overflows, etc.
Almost every smoker I've ever encountered has said things like (in reference to the smoking ban before it came in) "It's my right to smoke in a pub... if non smokers don't like it they can go somewhere else..."
This reminds me of an anecdote, in a time before the smoking ban, and when I myself used to smoke.
I was sat, with a friend, in the pub at the only occupied table in the section at the back of the pub, having a ciggie with our pint. A guy rocks up, with a couple of lackeys, sits at the other end of the table and starts doing exaggerated coughing and waving of arms, as if we had decided to sit at his table and start smoking. Whatever you might think of smoking in pubs, I think you'll agree that this sort of behaviour is just a dick move.
It turned out later, that the guy was our local (Lib Dem) MP. He got voted out at the next election.
Modal windows with NO Back, OK, Cancel or X; apparently you tap or click on the background. Who hired the idiots that decided on this?
You find this most commonly in web pages (Bootstrap, I'm looking at you), but it seems to have crept its way into apps that are pretending to be web pages (or are just wrappers around them). I find hitting Esc also usually dismisses the pop-up, which more often than not is something you didn't even care about. Sometimes more drastic measures are required (changing display properties from the browser debug window) when it's a pop-up trying to get you to agree to something you don't want to ("I see you're using an ad blocker, please turn it off so we can spew tripe at you" seems to be the usage of the day).
Better examples might have been things like theft and violence, which are definitely illegal in the home, even when the victim is your wife or children.
Ask Stephen Fry about that one, eh? He went to gaol for stealing from his parent (forging cheques in his father's name).
My other points stand though. I don't think anyone would refer to a compiled program as a document.
We need a simple name for a concept that is complex and abstracted from the user. "File" is as good as any other, especially since it is nomenclature that has been established. Changing it now is only going to confuse people. "File" is a metaphor for a bunch of sequential bytes that are on a storage device, it is in the nature of metaphors that they do not accurately reflect the nature of a thing, but serve as a simplification that is conceptually easier to deal with.
I see a parallel here between this and how people refer to subatomic particles. An electron can be referred to as either a wave or a particle - it has properties that are akin to both, but it is in fact neither. It is useful, on a conceptual level, to refer to it as one or the other, even though any physicist will know it is a convenience and nothing more. Arguing about whether it is better to call it a wave, or call it a particle, is utterly moot. The electron doesn't care. It also doesn't care that we assign the name "spin" to one of its properties that has absolutely nothing to do with spinning. It definitely has the property we call "spin", otherwise every atom in the universe (other than free neutral hydrogen) would be violating the Pauli Exclusion Principle.
My experience is generally the opposite of this. Those who abuse the English language are generally those for whom it is their first language. Abominations like "should of", "take are country back", etc. arise from the organic learning of spoken language, without reference to the written form. This generally happens when people learn a language as a child, not as a result of formalised teaching, which tends to teach written and spoken language together. Thus you find that non-native English speakers who are fluent tend to speak the language with much greater precision than native English speakers, and, more subtly, tend to speak and write the same form of language. For an example of this, try listening to people you know having a conversation, and compare it to their written language. It will likely be peppered with "syntactic sugar", such as the "quotative like", "ummms" and "aahs", and all sorts of syntactical errors which you wouldn't see written in an email, for example.
Part of this also comes from the fact that when learning a second language, you have to also learn about such things as grammar, syntax, tenses, etc., so speakers of more than one language tend to understand the structure of language better. When you become aware of this, you can often spot things from the speaker's native language that give them away. For example, a number of Eastern European languages don't tend to use definite articles; rather than "the cat sat on the mat", they might say "cat sat on mat". Some languages, such as Mandarin Chinese, don't really have past tenses in the same way as English, so when talking about something that happened in the past, they might talk in the present tense, and imply the timescale through context instead. Rather than "the cat sat on the mat", they might say "it was the cat sits on the mat" or something that to our ear sounds equally syntactically garbled.
Anyway, whether someone speaking English as a second language uses "less" and "fewer" correctly would probably have more to do with whether their native language has the concepts of countable and non-countable comparisons, and whether the words for each actually differ. Note also that if we think about the opposite of "less/fewer", which is "more/greater", we can see that the battle is already lost there, and we just say "more". A pedant might pick you up if you say "3 oranges is less than 5", but I bet you they won't even notice if you say "5 is more than 3".
Well yes, the offensive intent is codified in the manner of communication, and the cultural and personal references that may be in play. Without getting too philosophical about it, I'm pretty sure it's easy enough to be clearly and unambiguously, deliberately offensive to another person who has the same cultural background as yourself. It is also pretty clear that people can take offence where none is meant, and also to affect offence where none is either intended, or in truth, taken.
Culture does, of course, change over time. Whilst calling someone a cunt is considered offensive now (my spell-checker even tells me "this word is considered offensive"), it is arguably less so than it would have been, say, 50 years ago. If you go back further to the origins of the word, it was originally not considered offensive at all before some time around the 18th century.
"England and America are two countries divided by a common language"George Bernad Shaw (attrib)
This is a great example of how the British and American versions of this idiom differ. Being, as I am, British, the American version is obviously wrong. I'm sure any cross-pondians will take the opposite stance. QED.
Taking offence might be a matter of morals/ethics. Giving offence, not so much. You might be offended by my shirt, or might not. If I call you a cunt for dissing my shirt though, there's not really a question about whether I'm being offensive, and deliberately so. Whilst there might be a fair amount of grey space in the middle, sometimes things really are black and white.
Yes, "primary" and "secondary" would be much clearer terminology here. Or even "active" and "passive". This smacks of someone being told to change it by their boss, but not being told what to change it to, and not being given the guidance or context to make a correct choice. As a developer, this sort of thing is the bane of my existence.