Re: TL; DR
people who talk with unusual accents
Do we get to deport anyone who speaks Mockney? If so, count me in!
12110 publicly visible posts • joined 16 Apr 2007
Threatening to use a gun, whether you have one or not, is an offence in itself.
If you have the means to receive the BBC's content then you are, by law, required to pay the licence fee. You might not like it, and arguments about plurality obviously don't interest you, but it's still the law.
It also tells you what a bargain the licence fee is: lots of original production, local news on TV and radio.
A subscription fee is, by definition, not compatible with a universal service such as that which the BBC is required by law to provide. The licence fee feels like a tax but provides some degree of independence from government interference by going straight to the broadcaster.
Of course, all the populist arguments are about how shit and biased the BBC is. This is because they are designed to ignore the millions of people who depend on the service for one thing or another.
I'd much rather have a debate about possibly narrowing the remit for the service and taking some of the money away from sport by defining more events of "national importance" – footballers, test cricketers and Formula 1 drivers already have more than enough money. We can also discuss how to check for bias and mitigate it in ways other than giving nutjobs a right to vent – they're doing this anyway, without any checks, all over the interwebs – and also how much the BBC has pissed away on that pet project BBC Sounds. Apparently iPlayer Radio is no longer available within the UK: get yerselves the international version for buzzword-free enjoyment.
SQL Server is still playing catch up with Oracle and DB2, which is why it can afford to / has to be cheaper. To some it's still the "new kid on the block" but if you look at the cost of licences for other MS "enterprise" products, they have a distinctly "Oracle" feeling about them.
500,000 USD to 1,000,000 USD
Equates to around two to four top-notch DBAs for a year, so you were saving money from the start. Of course, a full analysis would cover what compromises (say, reports), if any, you had to make and whether you got any additional benefits.
Most of the work I automated with Python (hey, I'm a lazy programmer/DBA – what can I say…)
But that's the best kind. The beancounters love automation but they often don't understand the kind of people it takes to get it.
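For the curious, a minimal sketch of the sort of thing that gets automated – pulling slow statements out of a Postgres log instead of eyeballing it every morning. The log format, names and threshold here are all invented for illustration:

```python
import re

# Hypothetical sketch (log format and names invented): extract slow
# statements from log_min_duration_statement-style Postgres log lines.
LINE_RE = re.compile(r"duration: (?P<ms>[\d.]+) ms\s+statement: (?P<sql>.*)")

def slow_queries(log_lines, threshold_ms=500.0):
    """Yield (duration_ms, statement) for entries at or over the threshold."""
    for line in log_lines:
        m = LINE_RE.search(line)
        if m and float(m.group("ms")) >= threshold_ms:
            yield float(m.group("ms")), m.group("sql").strip()

sample = [
    "2019-10-01 LOG:  duration: 12.3 ms  statement: SELECT 1",
    "2019-10-01 LOG:  duration: 950.7 ms  statement: SELECT * FROM big_table",
]
print(list(slow_queries(sample)))  # only the 950.7 ms entry survives
```

In practice you'd point this at the real log file and bolt an email or ticket on the end – which is exactly the bit the beancounters never see.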
Performance doing what, precisely? Postgres is generally considered to have excellent support for concurrency but not the best write performance, though changes in the last few releases have brought significant improvements there.
But in any particular test (OLAP/OLTP) it's going to come down more to the ability of the DBA to configure the system correctly than to any inherent DB features: if you don't know what the features are or how to control them, they aren't going to help much.
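As a rough illustration of what "configure the system correctly" means in Postgres, a few of the settings whose shipped defaults are far too conservative for any serious benchmark (which values are right depends entirely on the workload and hardware):

```sql
-- Illustrative Postgres knobs – check what you're actually running with:
SHOW shared_buffers;        -- default is a tiny 128MB; often raised substantially
SHOW work_mem;              -- per-sort/per-hash memory; makes or breaks OLAP queries
SHOW effective_cache_size;  -- planner's estimate of OS cache; affects index choices
```

Comparing databases with these left at defaults mostly measures who ships the least cautious config file.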
That's a nice AWS setup you've got there, would be a pity if something were to happen to it.
All the cloud vendors have a vested interest in making it easy to check in but difficult to leave their data hotels.
And their engagement with open source is often only as far as it serves their narrow, commercial requirements.
The commas are a little confusing on the VMware licences.
Did you spend all your time doing this? In which case it's presumably savings after the first year or so?
Oracle -> PG is tried and tested and EnterpriseDB means you can keep most of your "stored procedures" aka Oracle lock-in bombs.
Welcome aboard! The familiarity is, of course, not really a coincidence, as back in the mists of time they were both based on the same project at Berkeley.
And I think experienced Oracle DBAs are going to help make Postgres even better. In the last 10 years it's got so much better largely thanks to the input (both source code and comments) of seasoned industry professionals.
It shouldn't really be done as a black-ops project, but it might have to start as one (cf. how YouTube convinced the internet to stop using Internet Explorer 6) with a proof of concept. I suspect that the vast majority of any company's databases do nothing special and could be migrated to another system with a dump/load. The rest will, of course, take research, planning and costing, but the savings can quickly become substantial once you stop having to buy licences for everyone in the company on all their devices.
If Oracle is required for "mission critical" stuff then work around it. Once you can demonstrate functional equivalence without the licensing costs for some projects, it should be possible to draw up a high-level analysis that quotes Amazon as having done the same thing already. Going to the "cloud" will probably be mentioned, and any references to this new form of lock-in are likely to fall on deaf ears, because "cloud" sounds a lot like "lower headcount" to a PHB.
Note, as soon as Oracle finds out that something like this is in the offing they'll launch the sales droids with FUD bombs to try and frighten people off, so you must have functional equivalence for backup/restore, etc. But, also, once you've moved one proper application you can also start asking for rebates.
At the end of the day, there should be nothing wrong in keeping some stuff in Oracle if it really is the best tool for the job. The problem is that Oracle needs to learn that that is what they have to provide and not Faustian contracts.
AFAIK GitLab is still being bankrolled by the VCs, so having a warm and cuddly policy is just good PR that costs next to nothing. Expect it to have a shelf-life of the time it takes to IPO or be bought.
Companies do, in general*, have the right to choose their customers, the boycotts of Apartheid South Africa being particularly well-documented – including the fact that the boycotts had a negligible effect on the politics. But if you do get into bed with the devil, make sure you don't get caught or fall foul of one of the arbitrary US sanctions: bombs to Saudi Arabia for dropping on Yemen are good; bombs to Iran for dropping on Yemen are, of course, bad. Greasing palms in Africa for mineral rights is always good.
* There are some exceptions when it comes to dealing with the general public.
I think you'll find that in our post-truth world we line up to spill bile on stuff.
TBH Google used the Nexus phones to trail Android's features while manufacturers were churning out crippled, customised versions. It hasn't needed to do this for a few years now, with Samsung in particular, but others as well, producing top-of-the-range devices. So the Pixels are pitched at a different crowd – Google Fanbois and Fangrrls, I assume – which the price should make clear. But, basically, it allows Google to test on-device AI.
Yes, upfront payment almost certainly formalises a contractual arrangement between the parties. This may or may not be something you want, but refunds are certainly a complication and Meetup allows people to cancel whenever they want. We do charge for our local events (rooms and equipment, it turns out, aren't free) but I'd quite like to keep Meetup and PayPal out of the payment handling.
It is a truly awful platform that unnecessarily removed Markdown support from event descriptions and still provides attendee lists as CSV files disguised as XLS files, but it does have the reach. You can already opt to charge people for attending, but this introduces additional complications. But something is needed because, frankly, lots of people sign up to events and never turn up. This can lead to additional costs or the exclusion of other people who would like to attend.
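If you're stuck with those exports, they read fine as CSV whatever the extension claims – a minimal sketch with invented data:

```python
import csv
import io

# Meetup's attendee export arrives with an .xls extension but is plain CSV;
# the csv module reads it happily (the data below is made up).
fake_export = "Name,RSVP\nAlice,yes\nBob,no\n"
rows = list(csv.DictReader(io.StringIO(fake_export)))
print(rows[0]["Name"])  # Alice
```

With a real download you'd pass `open("attendees.xls", newline="")` (a hypothetical filename) to `DictReader` instead – the extension is purely cosmetic.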
If they would only bother to maintain and continue to develop the platform – such as adding support for public nicknames with private real names, required for insurance or workplace-security purposes in many cases – I wouldn't mind paying the fees as much. It would also be nice to be able to ban no-shows more easily: I currently have to mark them as such and then ban them individually.
:=
)
This means they are a common optimisation method, and anything which makes them even faster is generally a good thing.
Premature optimisation is the root of all evil and a great way to waste time. I always advise people to write clear code and profile if they need more speed. Comprehensions don't necessarily loop much faster – not that Python's loops are that slow – but they do allocate less memory. That said, running nested loops through PyPy's JIT, or compiling them with Cython, will generally get you C-style speeds.
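A minimal illustration of the point about comprehensions – same result either way, the comprehension just skips the repeated method lookups and intermediate work:

```python
# A nested loop and the equivalent comprehension produce identical results;
# the win is mostly fewer allocations and attribute lookups, not algorithmic.
pairs_loop = []
for x in range(3):
    for y in range(3):
        if x != y:
            pairs_loop.append((x, y))

pairs_comp = [(x, y) for x in range(3) for y in range(3) if x != y]

assert pairs_loop == pairs_comp
print(len(pairs_comp))  # 6
```

Which is exactly why profiling first matters: if this loop isn't hot, the rewrite buys you nothing but churn.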
I'm not sure that the syntax change will stop people writing code to avoid nested if statements.
DSLs are probably the way to go for the small group of people (relatively) who need this.
Python has a history of introducing syntactic changes only to reverse them later (backticks, print >>, map, reduce, etc.). Some stuff stays because it is genuinely useful, but this can take years of refinement before it becomes standard. My initial reaction to this change is that it is, unfortunately, yet more special-use syntax forced on the rest of us. It certainly won't aid readability and the improvements are marginal. But let's see what the take-up is over the next few years.
In previous, similar discussions Guido tended to follow the argument that doing nothing would cause the fewest problems. This is exactly the kind of syntax that, because it's occasionally useful, gets used all over the place where it isn't just useless but downright confusing, aka dangerous.
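For reference, the syntax in question (PEP 572's walrus operator, Python 3.8) next to the older two-step pattern – a toy sketch:

```python
import re

text = "spam eggs spam"

# Old style: assign, then test on the next line.
m = re.search(r"eggs", text)
if m:
    print(m.group())  # eggs

# Walrus style: assign and test in one expression.
if (m := re.search(r"eggs", text)):
    print(m.group())  # eggs
```

In a `while` loop reading chunks it genuinely saves a duplicated line; sprinkled through ordinary `if` statements it mostly saves the reader's patience.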
Seeing as it's backwards incompatible
Less to maintain is the main reason – it means fixing fewer bugs twice. But also, with the move to turn the Mac into a glorified iPad, it makes cross-compiling easier.
Apple knows it can force people to migrate over time, even if they hold out for a year or two. Last year SWMBO's MacBook Pro went out of mainstream support.
Seems like par for the course to me. Apple has routinely only offered fixes in new versions, though this has usually meant bug fixes rather than security exploits. Catalina breaks a lot (if you're not 64-bit, you're not coming in) and doesn't offer much for anyone who doesn't use iOS. I normally wait until January but am planning to skip this one entirely.
Searching and sorting are not the same, so I'm not quite sure what the issue is. In general, for columns you should normalise your data for whatever processing you're going to do. Case-sensitive sorts are almost always going to be fastest, and you can define per-table or per-column collations and even set one per query. See the docs for more information. The docs aren't perfect, but that was the first page that popped up when I searched for collation; as long as you don't use the C locale you shouldn't have too much trouble.
For searching, there are worlds of difference between using the builtin LIKE or REGEX and the text search extensions. If you do have data that is case sensitive but want to do case insensitive search (you have "JOBS", "jobs" and "job" in your data), then you really should normalise your data and/or query, or at least use a relevant index.
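To make the collation and normalisation options above concrete, a hedged Postgres sketch (table, column and index names are all made up):

```sql
-- Per-column collation at table creation:
CREATE TABLE people (surname text COLLATE "en_US");

-- Per-query override; "C" gives the fast byte-wise sort:
SELECT surname FROM people ORDER BY surname COLLATE "C";

-- Case-insensitive search: normalise in the query and back it with an
-- expression index so equality lookups can still use an index:
CREATE INDEX people_surname_lower ON people (lower(surname));
SELECT * FROM people WHERE lower(surname) = lower('JOBS');
```

Without the expression index, that `lower()` in the WHERE clause forces a sequential scan – which is usually what people are actually measuring when they say case-insensitive search is "slow".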
For full-text searches case sensitivity is the least of your worries!
If I see query speeds being compared then I generally think someone hasn't thought enough about their indices, or hasn't tried the analyzer. Now, while there is definitely scope for Postgres's analyzer tools to improve so that you know what to look for, that doesn't mean the database is slow. If something is "mission critical", be prepared to spend some money on external support to get the design right.
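The first stop when a query seems slow is to ask the planner what it's actually doing (the query and table names below are illustrative):

```sql
EXPLAIN (ANALYZE, BUFFERS)
SELECT o.id
FROM orders o
JOIN customers c ON c.id = o.customer_id
WHERE c.country = 'UK';
-- A sequential scan over a large table in the output usually means a
-- missing or unusable index, not a slow database.
```

Reading the output takes practice – which is rather the point about the analyzer tools having room to improve.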
What do machine learning and "AI" have to do with a relational database? Please take your buzzwords with you and close the door on the way out.
I'm struggling to think of features that SQL Server has that Postgres doesn't come with or have extensions for. Maybe management functions? In other areas Postgres seems to be leading the pack.
For really high-end deployments, EnterpriseDB's version comes with an Oracle compatibility layer, which makes migration of some large installs feasible.
The whole Debian-on-the-Gemini effort was a huge distraction for Planet for the sake of a few vocal users. Because the device uses a MediaTek SoC, getting proper Linux support was always going to be difficult. For those who want to go that route, Sailfish is the better option. But personally, I think Android with a good SSH client is sufficient for most sysadmin work that I'm likely to want to do on the move.
Unfortunately, the software is the Achilles' heel of Planet's offerings. The hardware gremlins have been resolved, but firmware updates are few and far between and long-standing bugs just haven't been fixed. The device itself is immensely practical. You really need a table to use it properly, but it does then let you do a lot of stuff: I was able to demonstrate PyDroid on it recently, albeit without the native HDMI out, which for some reason refused to work. USB-C on one side is charging only, on the other output only. Sound is tinny but bearable. I only got a replacement German keyboard after Andrew Orlowski intervened. This is all okay for a device that you can always take with you and that works.
But the software side is poor:
Basically, it looks like Gemini users have been frozen out so that the Cosmo can happen. I knew what I was getting into when I signed up, but it's still disappointing to see the after-sales service fail so badly.
There won't be a confidence vote until Corbyn agrees that someone else should be caretaker Prime Minister. His minders won't let him pull another December stunt.
Trying to repeal the Benn Act on any grounds would almost certainly be quashed by an injunction. The Supreme Court has asserted the primacy of Parliament, and it's difficult to see even the High Court trying to go against that judgement.
Bojo could resign, but the Fixed-term Parliaments Act doesn't mean this will lead to a new election. Cf. the resignations of Cameron and May. May eroded an already slim majority through bloody-minded attrition, but Bojo just threw his away in a fit of entitled pique.
PMQs this afternoon with a decidedly C-team: Raab and Abbott.
Who hasn't done what?
I think the government might be planning to break and subsequently challenge the law in which case, given the Supreme Court's recent judgement, it's likely to find itself subject to an injunction and likely to be held in contempt. And then there are still plans to prorogue for a Queen's speech…
The B-team has already demonstrated it doesn't have the best legal team and Bojo's declaiming that he won't comply won't help him in court either.
Pity the fools who have put their trust in these idiots.
The ruling is quite clear: consent must be explicitly given, so such messages are invalid. This shouldn't really surprise anyone, as it has consistently been the position of the courts.
However, GDPR is currently being revised and it wouldn't surprise me if some cookies become legitimate without consent as long as there is a list of them with their purpose and lifespan. It's not as if this is hard to do, though I've seen multiple, ahem, "developers" fail to provide a correct list and explanation of a website's cookies. Note that failure to do this correctly could go beyond GDPR and enter the realms of fraud.