There is no change to the actual license. If the original license was GPLv2, then that remains the license. What happens is they add another file to the project which says that in the event of a license violation, the "cure" procedure for the violation will be as specified in the new file. Since GPLv2 doesn't address what happens at that point, there is no conflict with that license. And since Red Hat have stated what they would do in that instance, so far as a court is concerned they are effectively as bound by it as they would be if it were a clause in the license itself.
GPLv3 addressed a lot of issues in GPLv2 like this, and is in my opinion overall a better license and what I use in my own open source projects unless I need to conform to the license of an existing project. The GPLv3 drafting process also took in a lot of input from lawyers around the world to correct issues relating to legal systems which are different from that in the US, as well as many other matters.
The main objection that people had against GPLv3 was the provision that manufacturers of locked-down platforms had to provide unlock keys. The main objectors back then appeared to be TiVo and other makers of things like home TV video recorders. These days it is cell phone and tablet makers who object to it.
I won't be surprised if eventually they end up with what is effectively a "GPLv2.9" - or a GPLv3 without the anti-lock-down provisions.
Re: puts a dampener on rival GitLab’s claim?
This is what I'm doing. Within the next couple of weeks I will be setting up an account at GitLab, but will still keep the GitHub repo. The project will simply be hosted in two places. If that works out well, then I may look for a third location as well. I will want to automate this with proper scripting first however so I don't have to do it manually.
My plan isn't to simply switch hosting providers. I did that once before when I moved from SourceForge to GitHub. What I intend to do is to have multiple mirrors where the project is hosted so that the loss of any one of them is not a major setback. There is no point in trying to do that after you have been presented with the choice of either accepting new terms of service or being locked out of your one and only account.
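For what it's worth, this sort of mirroring doesn't even need much scripting; git can push to several remotes in one command. A minimal sketch — the GitHub and GitLab URLs below are placeholders, substitute your own:

```shell
# Add extra push URLs to "origin" so that a single push updates every mirror.
# (Hypothetical repo URLs; replace with your actual ones.)
git remote set-url --add --push origin git@github.com:example/project.git
git remote set-url --add --push origin git@gitlab.com:example/project.git

# From now on, one push updates both hosts:
git push origin main
```

Fetches still come from whichever URL is the remote's fetch URL, so the "primary versus mirror" question only really matters for issues and pull requests, not for the code itself.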
So I will be moving to GitLab, but I will still be at GitHub for now as well. This is what I would expect other people who are concerned about this to do as well.
The only question really is which one becomes the primary repo and which one becomes the secondary mirror. A lot of GitHub's value is in the "community" aspect of having the largest number of developers already active there. If the community becomes more dispersed then a lot of that value will fade away.
@J. R. Hartley said: "Wonder which new and exciting way they're gonna fuck it up."
They'll integrate it with LinkedIn, Skype, Azure and MS developer tools.
Their press release said: "... bring Microsoft’s developer tools and services to new audiences."
Expect to see GitHub features appearing which hook your code repos directly into MS Azure for deployment. Your GitHub rankings will be reflected in your LinkedIn profile. If you don't have a LinkedIn profile, one will be automatically created for you based on your GitHub data. Skype will be integrated into team meetings for projects. MS Visual Studio will have deep integration into GitHub beyond just being a Git client.
So, you'll still be able to use Github via the web interface and via the command line Git client, but every possible Microsoft service that can be integrated into Github will be to the degree that a software developer could work through the life of an entire project without ever leaving the Microsoft walled garden.
Microsoft just paid a staggering amount for Github (three times as much as press analysts were speculating) and they will be looking for ways to make that back. Introducing a variety of forms of vendor lock-in in order to sell other goods and services is the obvious choice here.
I'm looking into what is involved in setting up a GitLab account. I won't pull my open source projects from GitHub, but I won't use it as the sole public repo any more. Just like a lot of YouTubers have come to the realisation that they need to diversify their options rather than being at the mercy of YouTube's latest policies, I'm going to make sure that I can pull the plug on GitHub at any time if necessary, just like I did with SourceForge.
P.S. Don't be surprised if Amazon come out with some sort of response to this.
@AC said: "As a random example, lets say you're a manufactuer that has a line of custom Linux laptops. Want really good support added to them for nearly no cost? Well then, send in ten or twenty thousand entries for your stuff, randomising things to look legit and using fake source IP info."
Or just send an email to Canonical telling them that you are a manufacturer who is planning on coming out with a line of custom Linux laptops and that you would like them to work with Ubuntu out of the box on launch. Then ask them if their developers would like some free laptops. They're happy to work with anyone who wants to support Linux.
However, just have a look at the type of information being collected. According to the story it just amounts to the following:
- Ubuntu Version.
- BIOS version.
- Amount of RAM.
- Partitions (I assume that is number and size of disk partitions).
- Screen resolution and frequency, and number of screens.
- Whether you auto log in.
- Whether you use live kernel patching.
- Type of desktop (e.g. Gnome, Mate, etc.).
- Whether you use X11 or Wayland.
- Type of install media.
- Whether you automatically downloaded updates after installation.
- Whether you used the minimal install.
- Whether you used any proprietary add-ons.
There are basically two types of information there. One is some basic parameters such as RAM, CPU, GPU, hard drive size, etc. That tells you what you should be targeting in terms of hardware resources, and thus whether your desktop (e.g. Gnome) is getting too fat for the average user (as opposed to the average complainer, at which point you are far too late to be addressing the issue).
The other is what install options people changed compared to the default install. If most people don't pick live kernel patching, then you know not to make that option the default. If a lot of people are selecting Urdu as the language, then you might want to make sure that language has better default support. Etc.
Ubuntu will publish this information publicly. Personally I am looking forward to the RAM and CPU type data, as that will give me information on what CPU features to target in certain software I have been working on. I have been relying on Steam data, but that may not be very representative of the science and engineering field which my software relates to.
@doublelayer - They'll use the data to decide what ought to be the defaults for the next release. They will be making decisions based on actual data rather than someone's wild guesses. A major problem has been that developers often assume that the sort of hardware they have on their desks is typical of what everyone else has.
In the past they've had to make decisions on things such as "should the default install disk be CD sized so that it will work with PCs which have CD drives but not DVD drives, or should it be DVD sized so that the user is less dependent on having network access at the time of installation to install stuff that wouldn't fit on the CD?".
They've also had to worry about things like graphics support, what CPU optimisations to compile in as default (some packages have optional libraries for older CPUs), etc.
Apple know exactly what hardware they ship. Microsoft can simply assume that the non-Apple PC market is the same as the Windows market. Linux distros can't make these assumptions so they either just pull numbers out of the air, use opt-in surveys which are usually wildly unrepresentative of the user base, or do something like this.
Before this they had a detailed opt-in hardware data survey which so few people bothered with that it was pretty much useless. The new one collects far less information, but does so from a sample which will likely be representative of the overall user base.
The article seems to be mainly buzzword bingo.
* unpatched Apache Struts.
* IoT security
None of these have anything to do with license terms. They are related to keeping your systems patched and up to date.
However, the real issue in that case is whether you are talking about vendor support of software you have bought, or whether you are talking about supporting software you have developed in-house (or via a contractor).
In the case of vendor support, the license is irrelevant to this issue. The real issue would be the quality of service provided by that vendor. Whether that vendor is Red Hat or Microsoft, the issue is the same.
In the case of self-support of something you developed yourself (or paid a contractor to develop for you), then you need to handle this aspect yourself.
In the general case of security patches for open source libraries and components though, if all of that came from the standard repos of a Linux distro then the distro manages all of this for you. They have security teams and their distro comes with an updating system that manages security patches. They can't make you apply those patches though, that is up to you being willing to do so and having the procedures in place which prevent the issues from being ignored.
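As a concrete example of the "updating system" mentioned above: on Debian and Ubuntu, the distro-provided mechanism for applying security patches automatically is, to the best of my knowledge, the unattended-upgrades package. A sketch of enabling it:

```shell
# Install and enable automatic security updates on Debian/Ubuntu.
sudo apt install unattended-upgrades
# Writes /etc/apt/apt.conf.d/20auto-upgrades to switch the feature on:
sudo dpkg-reconfigure -plow unattended-upgrades
```

Even with this enabled, you still need procedures in place to check that updates are actually landing, which is the point being made above.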
This though is really just another variation on the vendor support question, with the license being irrelevant except that you now have a variety of competing vendors all supporting very similar systems to choose from.
Check the List
The authors have a list of email clients they tested where they state which ones had a problem, and which ones didn't.
My email client of choice - Claws Mail - was listed as not vulnerable to either attack.
Claws looks very old style, but it is fast, reliable, and has all the features I want. I have used Claws for years and highly recommend it.
Re: Got to give this punk some credit.
AZump said: "never saw a Linux distribution swap before Ubuntu, let alone suck up memory like Windows"
I'm more than a little sceptical about that claim. I'm writing this on an Ubuntu 16.04 desktop that has been doing software development and web browsing all day long. Amount of swap being used - zero. That is typical for a system that I use on a daily basis, and I see the amount of usage on a regular basis as it is displayed incidentally to certain tests I run as part of software development.
About the only thing that pushes it into using swap is heavy memory usage from applications that are allocating a large proportion of available memory (e.g. very large arrays in the software I am working on). And that is exactly what happens in every other modern operating system since that is why swap exists in the first place.
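If you want to check this on your own machine, the kernel reports swap usage directly. A quick way to see it (Linux-specific):

```shell
# Total and free swap, straight from the kernel:
grep -E '^Swap(Total|Free)' /proc/meminfo

# Human-readable summary, including how much swap is currently in use:
free -h
```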
If you want to make comments like that, I would suggest doing it on a non-IT related web site where you are less likely to run into people who actually use this stuff on a daily basis.
ZTE is also a major customer of certain US chip manufacturers, particularly in ZTE's networking gear. For example Acacia gets 30 percent of their revenue from ZTE. Acacia's share price went into free-fall when the news came out. The same is true for a bunch of other American suppliers.
ZTE can source many components from other places, but will have difficulties doing so with some.
However, this situation is a long term problem for US companies who supply anybody outside the US. While their name might not be on the box, a lot of the value of what is nominally Chinese kit is actually made in the US, South Korea, and Japan. The Chinese assemble it and put a "made in China" label on it, but the majority is actually made elsewhere.
The Chinese government's current economic plan is to design and build more of this high-tech chippery in China. If the US is seen as too risky a supplier, that will only accelerate this trend in China and the rest of the world, to the detriment of US business and the US economy.
I should note that many European defence firms go to great lengths to avoid American suppliers because of the risk inherent in buying from the US. Look for "ITAR-free" suppliers as an example of this.
It doesn't take much imagination to see how this could go horribly wrong.
Now waiting for a nation state to infiltrate the over the air update system and deliver a patch which bricks every vehicle in the country simultaneously, causing transport, the economy, and society in general to collapse with no practical means of recovery.
Meanwhile the government will defend their plans on the grounds that they just make policy and law, but it's someone else's role to be held responsible for the consequences of it when the government's plans invariably go wrong.
Laser cannon and sonic death rays.
As the story notes, loads of these incidents happen in the US all the time. They are caused by scrotes with laser pointers. I don't see why Djibouti would be any different and I suspect that the importation of laser pointers in terms of power and frequency sees a lot less regulation there.
This smacks of the American story about the Cuban sonic death ray supposedly being directed at their embassy personnel in Havana. That would be the Cuban sonic death ray that no other country finds credible. Canada has investigated it and come to the conclusion that the sonic weapons theory isn't plausible. The Americans none the less persist in blaming Cuba.
I have an AMD APU in the PC I am typing this on (CPU and GPU in one chip package). Before that I had always used NVidia graphics cards.
For my next PC I would definitely choose an AMD APU again. I have had zero problems with it in several years of use. It's fast, glitch-free, and reliable, as are the open-source drivers used with it by default (I'm using Ubuntu).
In contrast I always had some problems with NVidia graphics cards used with multiple Linux distros, especially when using the proprietary drivers.
Considering the AMD APU comes with CPU and GPU in the same chip package for considerably less money than I would have paid for a comparable CPU plus separate graphics card, it is pretty difficult to justify anything else for typical desktop applications.
I don't play video games so I can't speak to that field of use. I use mine for software development, full screen video playback, and web browsing. I have no complaints whatsoever about AMD APUs in my applications.
Re: New Linux poweruser here ...
I started trying to write a short summary of all the different issues, but realised there's really no way to cover even a fraction of them.
The short answer is that the basic idea is good, as it is more or less a clone of the init systems used by Solaris and Apple.
However, the implementation was sadly lacking, mainly because of what the Systemd developers were like. The Systemd developers didn't know how much they didn't know about all the obscure edge cases which exist in server applications, and wouldn't listen to the people who did know. When they made mistakes, they blamed other projects for having "bad" software, because well, Systemd is perfect so obviously the problem couldn't be there.
They also insisted that everyone else had to rewrite their own software to work "properly" with Systemd (mainly to do with network socket initiation on start up). The fact that this then made server applications incompatible with BSD and others without a lot of if-defs didn't go over well with the maintainers who were affected or with BSD people (the Systemd developers had no interest in working with the latter on these issues).
Debian had to ditch their project for a BSD based Debian distro version because they didn't have the resources to support two init systems (and all the resulting Debian specific init scripts) and the Systemd developers as mentioned above had no interest in working with the BSD people on this.
And since we are talking about Ubuntu in this story, I should also mention that the Systemd developers screamed much abuse at Ubuntu for not volunteering to be the guinea pig for the first commercial distro release of Systemd (no other commercial distro was using it at the time either). Ubuntu was bad, bad, bad, they insisted. The fact that Red Hat wasn't shipping it either at the time seemed to go right over the heads of the Systemd developers, the leaders of whom just happened to be Red Hat employees.
As for why Systemd got adopted by most distros, the answer is simple. It was backed by Red Hat, and they have enough muscle in the Linux market to push through things they want. The same is true for Gnome 3 by the way.
If you are using a desktop distro that uses Systemd, or you are using bog standard server applications (e.g. LAMP, mail, SSH, standard database, etc.) then all of this probably doesn't make much difference. Your distro will have figured out the problems and fixed them. The distro that I'm using on my desktop to write this adopted Systemd a few years ago, and I didn't really notice any difference other than boot up taking longer (Systemd has the slowest boot times of any init system that I've measured).
If you are administering a complex server system, especially if you are using proprietary software that isn't packaged properly, then you have to deal with all the Systemd issues yourself instead of just hacking on an init script or installing a third party logging system. A lot of the complaints about it come from people who have to deal with this aspect of it.
Re: Upgrade, but not right now?
@Notas Badoff: This is not a new policy, they did this with the 14.04 to 16.04 upgrade as well. Existing LTS users don't get upgraded until the first point release comes out (18.04.1). The point releases bundle up accumulated security and bug fixes so that new installations don't have to download them all again.
Normally by that time bug and security fixes related to a new release seeing first widespread use should be down to a trickle. This in turn means that LTS users will see fewer update notifications. If you are an LTS user, you probably care more about not having as many updates than you do about having to wait a few months before getting the next LTS. Non-LTS users on the other hand probably do want the latest stuff ASAP.
When the release does go out to existing LTS users, it won't go out to all of them at once. Instead it will be trickled out to smaller numbers of users at a time over the course of a week or so. Thus even after the LTS upgrade cycle begins, some of those users will be waiting for a while.
If you are an LTS user but really can't wait, then you can force an upgrade now if you know what you are doing (there is a package you need to install which automates the Debian command line process to make it easier).
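If memory serves, the package in question is update-manager-core, which provides the do-release-upgrade wrapper around the underlying Debian process. A sketch, on the assumption that this is still the current tooling:

```shell
sudo apt install update-manager-core
# Without -d, an LTS install waits for the first point release (18.04.1);
# -d forces the jump to the new release now.
sudo do-release-upgrade -d
```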
Re: Ooops they violated GDPR
Canonical are a UK company. I suspect they have heard of GDPR and know what data is personally identifiable and what isn't amongst the data they actually intend to store.
Re: On the face of it
@I ain't Spartacus - "It's funny as a non-Penguiny person. I've not read as much about Linux of late, so was amused to see a review talking about people being sad to see the back of Unity."
The sort of person who is motivated enough to write a comment on an IT oriented web forum is generally not the typical user. There are loads of Unity users out there who are just using their PCs to get work done. Fans of the less commonly used desktops or distros seem to feel they need to slag off the major ones rather than promote what is actually good about their own. KDE versus Gnome flame wars for example go back to near the beginning of modern Linux desktop distros.
I ain't Spartacus said: "So when do I expect the article mourning the loss of systemd?"
Based on how these things tend to go, I expect we'll see that in about 10 years.
Re: On the face of it
@K - Even the version numbers on your middle two examples are indistinguishable.
The reason that Ubuntu bailed out on Gnome 3 in the early days is that it had a very unstable UI that was not ready for prime time and the Gnome developers were no longer supporting Gnome 2. Quite a few people in those days thought that the Gnome project had committed collective suicide and would soon be an ex-parrot.
From that came Unity. It addressed the major usability problems with Gnome 2 (dock moved to the left and reduced use of vertical window space to work with modern display proportions, bigger dock icons, integration of the dock with workspaces, etc.) while keeping the keyboard shortcuts and underlying assumptions as similar to Gnome 2 as possible.
After that the user facing stuff remained more or less the same, with changes mostly just polishing what they had. That polishing did however include a good deal of major work on the underlying bits and pieces to account for major changes in common PC hardware and driver support. The biggest example is the work they did on compositing desktops when the third parties Ubuntu had been depending on dropped support for older hardware.
And all that suited most Ubuntu users quite nicely. The Unity desktop worked and was based on sound ideas so why change it? Ubuntu started out as just a much more polished and more up to date version of Debian Gnome 2 and was very popular as that.
Several other currently popular desktops got their start in a similar way. Now, however, that the Gnome 3 developers have cut back on the crack smoking, stopped changing how their desktop works every other release, and quite frankly copied some of the better parts of Unity, the reasons for continuing with Unity have to a large extent gone away, and Ubuntu can go back to its roots of being a better (and commercially supported) version of Debian.
Some of the major criticisms that I have of Gnome 3 at this time are that support for keyboard shortcuts is not as good as with Unity (this is the biggest complaint I have), the dock is not as well integrated with workspaces or application indicators, and the workspace concepts are non-traditional (such as a variable number of workspaces and only linear navigation between them). I made very little use of Unity's HUD, so its loss doesn't bother me much.
Most of the complaints about "Ubuntu" on forums such as this one seem to come from people who are using third party derivatives with non-Unity desktops (I'll avoid mentioning any in particular to avoid flame wars). These non-Unity desktops are put out by community members rather than Canonical, and simply don't have the resources to put the same degree of polish into them that full time distro maintainers do. I've tried some of them and salute the volunteers who work on them for their effort, but I'm more interested in using my PC than in experimenting with desktops. As a result I will be using Gnome 3 after the upgrade notification comes in.
Existing users of Ubuntu will get the upgrade notification in July when Ubuntu 18.04.1 comes out rather than on release day. This is the same policy as was used with 16.04.
They had one non-LTS version, 17.10, which used Wayland. Other than that, every official mainstream version of Ubuntu right from the beginning used X.
Russians poised to fire intercontinental ballistic missile... into space with Sentinel-3 sat on board
What goes up, must come down (in pieces).
And meanwhile in Canada today's news headline is that the %@!#$%# Europeans are dropping another one of their left over missiles on us again, left over toxic fuel and all. Savikataaq, the Nunavut Territory minister of the environment, said: "It is a concern for us. No country wants to be a dumping ground for another country's spent rockets."
Re: Garbage in, garbage out
The main value of Turkey to NATO these days is its position in the Middle East. American bases in Turkey are ideally situated to strike east into Iran or south into Iraq or Syria and Lebanon, and generally complement the US bases in Bahrain and Qatar.
The US bases in Turkey saw extensive use in the first and second Iraq wars, and in the war against ISIS in Iraq. Their key role in providing bases for aerial refuelling means that even aircraft based elsewhere depend upon them.
So long as the Middle East has oil, Turkey will be important to NATO.
Re: "nd what's the problem with an ally (*) buying a potential adversary's kit?"
The S-400 system is not a specific missile and radar combination. It is an air defence system with a family of missiles and radars. What the Russians export is not necessarily the most advanced versions of what they used themselves.
As for why the Turks are buying them, they put out an RFP for an air defence system. Part of the requirement for any major Turkish defence contract these days is a degree of technology transfer to Turkish defence firms. The Turks are trying to build up their own defence industry. This by the way is why they are making parts of the F-35 as well as doing the engine overhauls. Turkey makes a major section of the fuselage, landing gear components, parts of the engine, electronics, sensors, and a whole range of other items. They are sole source suppliers of a number of pieces, so every F-35 built today is partly Turkish.
As for missiles, the Americans submitted a bid for the Patriot missile system, while the Russians submitted a bid for the S-400. However, the American bid did not include technology transfer, while the Russian bid did. Hence, the Russians won the contract. Toys were quickly ejected from the Americans' pram - they wanted the contract, but not on terms the Turks were willing to grant. The only thing that will satisfy the Americans on this one is for the Turks to buy Patriot missiles on terms the Americans dictate.
As for stealth fighters in general, the Turks are designing their own, with British and other foreign help. The UK has its own sovereign stealth aircraft technology which is as good as anything the US has, which is why the UK was invited to be the only Tier One foreign supplier for the F-35 (which caused the UK's own stealth fighter project to be cancelled). BAE is supplying extensive unspecified technology, and Rolls-Royce are supplying the engine technology licences. The UK involvement has support from the highest political levels in the UK government. The Turkish fighter is scheduled to replace their F-16s and will supposedly first fly in 2023. The Turkish fighter will do the air-to-air fighting while their F-35 fleet will act as bombers/air support.
On its own it might be remotely plausible as a "security" action. In the wider context though, it fits in as American trade protectionism. Canadian steel and aluminum companies have also been labelled "national security risks" by the Americans. Bombardier is "bad" until they promise to assemble planes in the US, and then the trade complaint gets magically thrown out at the next hurdle.
I think the head of Huawei said something along the lines that being blocked from the US market feels much more relaxing, now that they no longer have to worry about keeping the Americans happy.
@Charles 9 said: "Because Google's strongest platform, Android, runs on ARM, as does Apple's iOS the #2 mobile platform."
Google's response to that was PNaCl, which was supposed to be a portable form of NaCl based on LLVM intermediate code. That wasn't any more successful because LLVM intermediate code isn't really suited to that.
By that point everyone had decided that ASM.js was a much better solution from a technical and practical perspective so Google threw in their (P)NaCl cards.
Sending native x86 binaries over the web to execute in a sandbox on the other hand is what Google Chrome did with NaCl. That went over with developers like a lead balloon, and Google pulled the life support on it last year in favour of joining Mozilla in using WebAssembly.
El Reg said: "The technology is a W3C standard, emerged from Apple and promises a secure sandbox running inside a browser."
That will come as a surprise to the people who actually developed WebAssembly. Here's one of the original announcements: https://brendaneich.com/2015/06/from-asm-js-to-webassembly/
Who: A W3C Community Group, the WebAssembly CG, open to all. As you can see from the github logs, WebAssembly has so far been a joint effort among Google, Microsoft, Mozilla, and a few other folks. I’m sorry the work was done via a private github account at first, but that was a temporary measure to help the several big companies reach consensus and buy into the long-term cooperative game that must be played to pull this off.
So far as I know, WebAssembly actually came out of primarily Mozilla's success with ASM.js, plus some of Google's work with the less successful PNaCl.
Re: Of course it's not an important security issue
@Anonymous Coward said: "It's not on Windows."
Oh look, someone trolling anonymously. What a surprise.
Well guess what, it's not normally installed on Linux either, as you would know if you had actually read the story. It's a third party program that an administrator can install if he or she wants to, but very, very few actually do.
Almost nobody even has beep installed.
According to Debian, only 1.88% of users have beep installed. Only 0.31% use it regularly. Apparently "beep" doesn't even work on most hardware. I suspect that the few people who do have it installed used it in a bash script somewhere years ago and forgot about it. I checked my PC (Ubuntu), and it is not installed.
The best solution is probably to check if you have it installed, and if you are one of the few people who do, to simply uninstall it. If you are worried about some obscure script failing because it got an error when it tried to call beep, then perhaps symlink it to some fairly innocuous "do nothing" command, or possibly even a script which will write to a log somewhere to tell you when it was called.
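The check-then-symlink idea can be done in a couple of lines. /bin/true ignores its arguments and always exits successfully, so old scripts calling beep keep working silently. This assumes a Debian/Ubuntu layout; paths may differ elsewhere:

```shell
# Is beep installed? (Debian/Ubuntu)
dpkg -l beep 2>/dev/null | grep '^ii' || echo "beep is not installed"

# Remove it, then give any old scripts a harmless stand-in that always succeeds:
sudo apt remove beep
sudo ln -s /bin/true /usr/local/bin/beep
```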
If I need to have my speakers on my desktop make any noise I use "espeak", which is a text to speech utility. There are other noise making utilities as well which unlike beep actually work on modern hardware.
Here's the list of Chinese kit facing extra US import tariffs: Hard disk drives, optic fiber, PCB making equipment, etc
The tariffs will apply to goods originating in China, otherwise there would be plenty of places in the world where they could be transshipped through and relabelled.
Plenty of US imports already arrive through Canadian ports, as many US ports are generally more expensive. Of course American port operators are crying this is unfair and want tariffs applied on port services.
Trump has opened a new eastern front in his trade wars before finishing the one he started with Canada and Mexico. Washington is now desperately trying to make trade peace with those two now that it turns out that China isn't going to surrender. Boeing's sales in China may turn out to be the Stalingrad in all this.
Even American military arms suppliers can't compete in the US market
Here are my favourite items from the list. Apparently, Chinese torpedo makers are selling their wares in the US market at unfairly low prices. American howitzer makers and makers of aircraft carrier catapults and arrestor gear are facing similar problems. If only the Pentagon didn't insist on buying the lowest price armaments sold at Walmart instead of buying from American suppliers.
- Artillery weapons (for example, guns, howitzers, and mortars)
- Rocket launchers; flame-throwers; grenade launchers; torpedo tubes and similar projectors
- Rifles, military
- Shotguns, military
- Military weapons, nesoi
- Bombs, grenades, torpedoes, mines, missiles and similar munitions of war and pts thereof; other ammunition projectiles & pts. thereof
- Aircraft launching gear and parts thereof; deck-arrestors or similar gear and parts thereof
- Air combat ground flying simulators and parts thereof
More seriously, I have scanned over the list and a lot of items look like they are there to pad out the length of the list. I suspect that a great many of the more mundane items are not made in the US at all and there is no US industry to protect.
Where the US may run into problems is with a lot of obscure components that go into products that are made in the US, and which will raise the cost of producing those items enough that the company simply closes up shop in the US and moves the production to Mexico to get around the tariffs.
Let's all "hack" each others elections.
So everybody is "hacking" everybody else's election. A British company "hacked" the US election. A Canadian company "hacked" the UK referendum. Not mentioned in the story is the news, reported a couple of years ago, that a US organisation "hacked" the Canadian election before all this. It sounds like the Russians are a bit late to the game.
Re: Is it just me
@regbadgerer - I haven't tried it yet, but from what I understand it's all about making the billing more granular for highly variable loads. Instead of getting billed for provisioning a micro-service which may sit about not getting used much, you get billed on actual usage. You do have to structure your application to work effectively with that however.
Whether that makes financial sense is going to depend a lot on your individual use case. It isn't for everyone, but it may reduce costs for some people who have highly variable loads. If you run high loads on a fairly consistent basis, then it's probably not a good fit for you.
The main problem with it, I think, has been the tendency to create new terminology in an attempt to differentiate it from micro-services. The basic concept though is to make your application better fit something that can be billed on a more granular basis. It probably has a place, but as another option rather than the one and only way of doing things.
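The "does it make financial sense" question above boils down to simple arithmetic. A toy cost model, with entirely made-up rates (real cloud pricing differs and has free tiers, memory multipliers, and so on):

```python
def monthly_cost_always_on(hourly_rate, hours=730):
    """Cost of a provisioned instance billed whether it is used or not."""
    return hourly_rate * hours

def monthly_cost_per_invocation(invocations, seconds_each, rate_per_second):
    """Cost when billed only for actual execution time."""
    return invocations * seconds_each * rate_per_second

# Illustrative (invented) rates: a small always-on instance at
# $0.02/hour versus function billing at $0.00002 per execution-second.
always_on = monthly_cost_always_on(0.02)                        # ~ $14.60
spiky = monthly_cost_per_invocation(100_000, 0.2, 0.00002)      # $0.40
steady = monthly_cost_per_invocation(50_000_000, 0.2, 0.00002)  # $200.00
```

With these numbers the spiky, rarely-used workload is dramatically cheaper billed per invocation, while the consistently busy one is far cheaper on a provisioned instance, which is exactly the "depends on your use case" point.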
@Dan 55 said: "So what's stopping them moving to Vimeo or Dailymotion?"
Several of the channels that I follow were in the process of moving copies of their content to Vidme, and then Vidme shut down. That was a short time ago, so I'm not sure if they are gearing up to look for another destination. Most of the people running the channels that I watch seem to know one another, and the move to Vidme was started by one individual with the others following.
I don't follow many shooting channels, as my own interests are more along the lines of history. People doing channels related to history and (non-pop) culture have been hit by recent Youtube policies just as hard as, or harder than, people running shooting channels.
Some of the problem I suspect is that Youtube was trying to decide which videos to ban or demonetise by using AI. I've heard things from various podcasts interviewing people who have worked on AI systems for that purpose for Youtube. However, the AI doesn't seem to work in any sort of reasonable, logical, or consistent manner and its randomness is driving the content creators mad.
Karl Kasarda I think is some sort of computer security consultant in his day job and very big on the "digital rights" movement in general. Of the two he's the one who is always looking for alternative video distribution networks as he doesn't like the idea of Youtube being able to shut down anyone they take a dislike to. He is also a bit more social media savvy, and the Pornhub thing sounds like more of a publicity stunt that he dreamed up rather than a serious effort at diversification. Since it got his name into the news, it sounds like it has been a pretty successful publicity stunt which also happens to line up with his views on censorship in general.
McCollum is a lot more laid back and lets Kasarda take the lead on things like this. On the other hand, he has his own Forgotten Weapons web site (rare, historic, and antique firearms) which is his main effort. His site has been around for a long time, has its own forums, and he sources his own ads for that site, so he isn't completely dependent upon Youtube to sustain his "brand". He does use Youtube as a video host and they do bring a lot of new viewers to him, but he wouldn't have to start from scratch if Youtube kicked him off their platform.
The big point which has so many Youtube content creators of all types up in arms is Youtube's opaqueness and seemingly random application of their "rules". Creators who want to invest time and money into a high quality production will find themselves "demonetised" for no apparent reason. They will complain to Youtube, who will then reverse the demonetisation, but by the time it goes through Youtube's bureaucracy most of the potential views will have gone by, turning the video into a loss maker for the producer. Nobody at Youtube can give them a reason why they were demonetised or point to a policy which they may have "violated", or even seem to care about any of it. And I'm talking about content creators who have hundreds of thousands of subscribers and received awards from Youtube, not some guy with a few dozen views.
The end result is that content creators have become risk averse in terms of how much they are willing to invest in production costs, and the market is tilted in favour of creators who put little effort into quality. People who simply babble into a microphone about video games have much less at risk than people who have to purchase material or pay for travel to do a historical documentary.
Almost all of the Youtube channels that I follow now depend upon Patreon to make ends meet, as Youtube ad revenues are simply too high risk. None of them make a living from Youtube, but all have to try to at least cover their expenses somehow as they aren't wealthy enough to fund their productions out of their own pockets.
With respect to the latest changes, the firearms related channels can't get any sort of answer out of what "manufacture" of ammunition means and whether there are any clear guidelines for reviewers. Does this cover normal reloading using commercial components, or are they talking about improvised ammunition?
Anyone doing serious target shooting will hand load their own ammunition, as the commercial grade stuff simply isn't good enough for competition use. Anyone firing antique or otherwise old or rare firearms will also usually have to reload their own ammunition, as obsolete calibres are simply not available or the stuff that is available may be unsafe to use in older firearms. So is what they are doing "manufacture" of ammunition according to Youtube? Nobody knows, and there is apparently no way to get any sort of answer out of Youtube.
The majority of the content creators on the channels that I watch regularly have all said that they are actively looking for alternatives to Youtube and only stay there because that is where they can get new viewers they can attract to things such as their Patreon channel. Content creators are looking to decamp en masse from Youtube as soon as a viable alternative arises. The market is ripe for a competitor; the main barrier to entry being the ability to line up advertisers.
"the majority appear to be straight white men"
That sounds to be pretty representative of software developers. Normally, a poll which is intended to discover what products software developers are using ought to be polling a representative sample of software developers, not a representative sample of for example trendy social media advertising consultants or PR flacks.
If there ought to be any concern about how representative the sample is, the concern ought to be with respect to how well the sort of person who answers Stackoverflow surveys is representative of the sort of experienced and knowledgeable software developers whose opinion on matters of what software is good or bad is worth listening to.
Woe Canada: Rather than rise from the ashes, IBM-built C$1bn Phoenix payroll system is going down in flames
The History Goes Back Further Than That
El Reg said: "Launched in 2016, Phoenix was an IBM implementation of the Oracle PeopleSoft platform"
It was actually started well before then, in 2009, under the previous government. The contract was awarded to IBM in 2011. It was part of an overhaul of IT systems which were consolidated as a "cost saving" measure. It went live just as the present government came to power after the election.
None of the other projects which were initiated as part of this cost saving project were successful either. It became obvious soon after it went live that the project was in trouble. The opposition are of course blaming the government for not having pulled the plug on it immediately after being elected.
The Auditor General's investigation didn't really address whether IBM did a good job or not. Rather it focused on whether the government's response to things going wrong was adequate.
From what I can see, the major problem was that the project was rammed through, ready or not, prior to the election in order to claim the cost savings for electoral campaign purposes, and there was no "plan B" if "plan A" didn't come off perfectly.
It turned out though that payroll for such a broad range of employees was much more complicated than had been envisioned back when the project was started nearly a decade ago.
The main failing of the present government has been in persisting in trying to salvage something from the mess they inherited instead of pulling the plug on it earlier. Their response to that however is that there was no fall back position available. The Auditor General however noted that the government of Queensland had pulled the plug much sooner when they faced a similar problem.
The project which was supposed to save $70 million per year has turned into a persistent financial black hole which will continue to cost money for years to come.
Re: Microsoft's idea of system administration...
Anonymous Coward said: "Do sane people run ubuntu as a server though?"
Funny how there's all these anonymous posts on this thread making various claims about other company's products.
Ubuntu is used very extensively in cloud applications. Microsoft isn't putting so much emphasis on supporting Ubuntu because they've run out of other things to do.
Canonical have always placed a lot of emphasis on server and cloud applications. That is why they have a number of deployment and management tools focused on that area. The desktop version of Ubuntu is just a loss-leader intended to get developers using Ubuntu, with the intention that those developers will also use Ubuntu as their choice of server. People who have connections inside Ubuntu have said a number of times that Canonical's server business has been profitable for some time now. They've recently dropped the phone OS project, and Unity has been scaled back as they currently focus on profitability, supposedly to clean up the balance sheet in order to go for a stock market listing.
Ubuntu is based on Debian, but has the advantage of offering commercial support contracts for those who want them. Debian themselves of course do not, and finding commercial support for it is not as straightforward.
Ubuntu Mate for Matebook X
El Reg said: "Huawei guns for Apple with Mac-alike Matebook X" as well as: "The rest of the cruft is Microsoft's: the usual garbage of discarded kids' toys emptied over your desk."
Obviously the proper OS for the Matebook X is Ubuntu Mate.
Re: *A* Brit Expert
The whole premise of the theory is bonkers. A machine is not going to be held "liable" for anything. The police are not going to arrest your car and put it in jail.
The people who are held accountable for how the software performs will be determined the same way as the people who are held accountable for how the hardware performs. There are loads of safety critical software systems in operation today, and there have been for decades. There is plenty of established legal precedent for deciding liability. Putting the letters "AI" into the description isn't going to change that.
The company who designed and built the system and sold it to the public are 100% responsible for whatever is in their self driving car (or whatever). They may in turn sue their suppliers to try to recover some of that money, but that's their problem. Individual employees may be held criminally liable, but only if they acted in an obviously negligent manner or tried to cover up problems. The VW diesel scandal is a good analogy in this case, even if it wasn't a safety issue.
There are genuine legal problems to be solved with respect to self driving cars, but these revolve more around defining globally accepted safety standards as well as consumer protection (e.g. who pays for software updates past the warranty period).
The people who have an interest in pushing off liability from themselves are dodgy California start-ups who push out crap that only half-works and are here today and gone tomorrow and don't have the capital or cash flow to back up what they sell. They might try to buy insurance coverage, but the insurers may get a serious case of cold feet when they see their development practices. Uber's in house designed self driving ambitions are going to run into a serious road block from this perspective.
Liferea seems to be pretty much the standard RSS feed reader on Linux. It's fast, configurable, and easy to use. It can also run external scripts to massage malformed feeds or even scrape web pages.
I'm not going to criticise Vulture-feeds - hats off to you for actually building something that suits your needs rather than just regurgitating press releases. It certainly gives you insight into Electron that you wouldn't get any other way.
However, this bit really stuck out: "vulture-feeds weighed in at 368.9 MB". Liferea, which does far more, is "594.9 kB on disk" according to Ubuntu Software Centre. That's right, less than 600 kB. Electron is mind bogglingly huge.
Nearly all the sites that I read regularly I monitor via RSS. If a web site doesn't offer an RSS feed, then it may as well not exist so far as I am concerned. I read the articles in the web browser, but RSS is where I find out that the article exists.
I think that much of the source of the problem with "fake news" is that too many people seem to get their news spoon fed to them from Facebook or Twitter instead of getting it directly from reputable news sources followed via RSS. RSS is decentralised, which also keeps any one company from getting a choke hold on the supply of information. That of course is why the companies who do want a stranglehold on the web don't like it.
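The core of the RSS workflow described above is genuinely simple, which is part of its appeal. A minimal sketch using only the Python standard library (a real reader like Liferea does vastly more - scheduling, favicons, scraping, and so on):

```python
"""Minimal RSS sketch: parse a feed document, pull out item titles
and links, and work out which items haven't been read yet."""
import xml.etree.ElementTree as ET

def parse_rss_items(rss_xml):
    """Return (title, link) pairs from an RSS 2.0 document string."""
    root = ET.fromstring(rss_xml)
    items = []
    for item in root.iter("item"):
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        items.append((title, link))
    return items

def new_items(items, seen_links):
    """Filter out items whose links we have already read."""
    return [(t, l) for (t, l) in items if l not in seen_links]
```

Fetching the XML itself is one HTTP GET per feed on a timer; the decentralised nature of it is just that each site serves its own file, with no platform in the middle.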
decrypt it or we'll legislate
If the companies are going to have to do it one way or another, why not demand the government produce clear legislation with a detailed description of the means they propose, and then publicly poke holes in the logic of the legislation?
The "we want to force you to do it voluntarily" argument only exists because the people pushing the agenda want to have their cake and eat it too. They knowingly want weak security, but they want someone else to act as a whipping boy when ordinary people suffer as a result of it.
Next up - the government will legislate that all automobiles must be powered by perpetual motion engines, with heavy fines on any auto company who fails to produce one by next year. Well, why not?
@wolfetone: "So why are the UK buying F-35's then?"
The UK currently flies two main fighter types: the Typhoon and the Tornado. The Typhoon is optimised for air defence, but also does bombing. The Tornado is optimised for dropping bombs but also does air defence. Generally air defence planes can do ground attack very well in this era of guided weapons and smart bombs, provided they are equipped with the appropriate sensors and electronics. The earliest versions of the Typhoon left out the ground attack kit as an economy measure (since the buyers already had Tornadoes which could do that job anyway), hence the common myth that the Typhoon couldn't drop bombs. Later versions included the kit for both roles by default, as does current production. However, the Tornado is still often used for those jobs because, well, they've got them, so they may as well use them and get their wear out of them.
The Typhoon is still a relatively new plane and will form the backbone of the UK's air force for many years to come, but the Tornado is at least a generation older and is getting long in the tooth and has to be replaced due to increasing age and obsolescence.
The UK is buying the F-35 as a replacement for the Tornado. The 'B' version is being bought to also operate from carriers so it can be dual purpose. So far the UK has committed to buying 48 in total. That order could conceivably be extended to 138, but that decision awaits future approval by parliament. It is possible that the larger order may include buying some cheaper 'A' versions instead as strictly land based Tornado replacements, but that is up to the government of the day to decide in future. The F-35 is built by the equivalent of a consortium, and the UK has the second largest share in it (I think around 15%). The UK has various parts which go into it, Italy and Turkey will have final assembly lines, etc. The US of course as the biggest customer has the lion's share of the workshare.
The Typhoon/Eurofighter is also built by a consortium. The parts which go into the plane are built in various countries, but there are four final assembly lines, one each in the UK, Germany, Italy, and Spain. Each of the four takes turns in leading the sales effort to countries outside of the consortium and getting the largest share of the resulting benefits from it. It appears likely that the UK is the lead country for sales to Belgium. Going by attendance at recent meetings, early indications are that the UK is also the lead country for sales to Canada (who are also shopping for new planes).
Re: Network connectivity or not
I think it's about whether there was network connectivity at install time or whether the network connection came later. At the moment when you do an install they ask you if you want to download updates during the install or do it later.
I suspect they want to simplify the install procedure still further (it's already the simplest to install of any modern OS of any that I've tried) and are looking for what defaults to set and what to push off into an optional "advanced" menu.
At present they have an optional hardware configuration collection program which you can go through after installation to send information about your PC to them. I've used it a number of times, but I think it has too many questions and it inherently biases their data towards the sort of user who cares about what is in their PC. I think getting less and more basic information but getting it from a wider selection of users will give better results.
They've said they will publish this information in aggregated form on their web site. I'm in particular looking forward to seeing what proportion of people are using what sorts of CPUs and GPUs. I've been writing software recently which uses SIMD instructions, but it's very difficult to get a good idea of what SIMD level to target since the publicly available data sets are for games users, who are atypical so far as my software is concerned.
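The "what SIMD level to target" decision mentioned above is essentially a preference-ordered lookup against the CPU's feature flags. A hedged sketch: the flag names follow the Linux /proc/cpuinfo convention, and the particular tier ordering here is just an example, not a recommendation.

```python
"""Pick the best SIMD target from CPU feature flags (e.g. as read
from /proc/cpuinfo on Linux). Tier list is illustrative only."""

# From most to least capable; target the first one the CPU supports.
SIMD_TIERS = ["avx512f", "avx2", "avx", "sse4_2", "sse2"]

def best_simd_level(cpu_flags, tiers=SIMD_TIERS):
    """Return the most capable supported tier, or None for a scalar fallback."""
    flags = set(cpu_flags)
    for tier in tiers:
        if tier in flags:
            return tier
    return None

def read_cpu_flags(path="/proc/cpuinfo"):
    """Parse the 'flags' line from a Linux cpuinfo file (assumed format)."""
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return line.split(":", 1)[1].split()
    return []
```

The hard part isn't the lookup, of course; it's knowing the distribution of flags across your actual users, which is exactly what aggregated hardware survey data would answer.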
Would that be the same Mint that pings Ubuntu's servers with your current IP address daily to ask for security updates?
Re: Location, Location, Location.
The location asked at install time is just country and time zone. It uses that to display the local time and to know which local time zone rules to apply, plus the installer uses it to guess what language, keyboard layout, currency, etc. you probably want.
Fedora first started collecting this type of information at least 10 years ago, and RHEL, CentOS, OpenSuse, and Gentoo copied it from them. I think it got retired a few years ago though because of a lack of maintainers for the code and the server.
Debian created a system which tracked which packages you installed and which ones you used how often and a lot of Debian derivatives use it also.
Any non-server distro that wants to know your current IP address already has it anyway, since your PC constantly pings their server for security updates. That is true these days for any PC operating system.
Apple customers obviously need to buy themselves some lace doilies to set their collection of Apple things on. I'm sure their grandmothers could give them some other helpful decorating tips to complement the rest of their post-modern furnishings as well.
Joke icon required, because Apple customers are not exactly noted for having a sense of humour when their latest eye-wateringly expensive fashion purchase goes wrong.
Re: Bad bargaining
I would rather see Creative Commons (and other similar licenses) photos at the top of results, with anything else located further down in a separate section. A lot of what amounts to spam from these companies appears in search results when you are looking for a clear photo (e.g. no watermark plastered across it) of something for non-commercial purposes (just to look at, for example). I want to see stock photos in my image search results about as much as I want to see "shopping comparison sites" in my text search results (i.e., not at all).
Oh, and as a note to journalists and blog writers, stop putting pointless stock photos at the top of your stories. It's a waste of bandwidth and it's a waste of my time and effort as it means the first thing I have to do is scroll down past an utterly pointless and irrelevant stock photo before I can start reading. If the photo is directly relevant to the story, by all means include it. A pointless picture of a model holding something irrelevant though provides no value to the reader.
If you want to really see the height of hypocrisy though, just have a look at almost all of the news stories condemning crypto currency miners for their alleged vast energy consumption. Almost all of those very same news stories will include very large format pointless stock photos which have no direct relevance to the story, but which consume vast amounts of energy in sending, transmitting, receiving, and displaying them. Pot meet kettle.
Does it even work on Linux?
El Reg said: "The Photoshop maker said that – so far – only Windows machines have been attacked, although Windows, Macintosh, Linux, and Chrome OS systems are potentially vulnerable."
I'm using Ubuntu 16.04. I just had a look in the user reviews in the Ubuntu Software Centre (software installation manager) and most of them are saying it doesn't work. I looked at quite a few reviews, but found only two who said it worked (the most recent from a year and a half ago), but they didn't have anything positive to say about it. I think the ones who did have it were using Ubuntu 14.04, so I have serious doubts that many Linux users these days have Flash installed.
I haven't had Flash installed in many years, and it is very rare that I see any web sites that make any use of it at all. For some years now the main laggards still using it tended to be ads, and quite frankly I didn't miss them at all.
If you've got it installed, you can almost certainly just delete it (if you can) without missing anything of value. For the very, very, few people who have a legitimate application for it, you're going to have to find another solution before too long anyway when Adobe finally pulls the plug on it and all the browser vendors blacklist it from being installed at all.
I use the open source driver in Ubuntu with my AMD APU with Radeon graphics, and it is much preferable to the proprietary drivers in terms of stability. I've never had much luck with the proprietary drivers from either NVidia or AMD, as I have a very low tolerance for crashes.
On the other hand, I don't play games, so I can't say much about performance. The desktop is fast with no lags or visible graphics defects, and that is good enough for me. I would rather have the greater reliability of the open source drivers over the theoretical speed improvements of the proprietary ones. AMD's current proprietary drivers are based on the open source driver anyway, with just some features added.
As for Wayland, it's like nuclear fusion: it's been a year away from being "ready" for a great many years now. I'll believe it when I see it.
El Reg said: "As was the case throughout negotiations of the first deal, there's no text for the proposed treaty. Just what Australia, Brunei Darussalam, Canada, Chile, Japan, Malaysia, Mexico, New Zealand, Peru, Singapore and Vietnam signed up for therefore remains obscure."
And not all that was agreed to is in the treaty text. Some of it is in additional agreements which sit outside the treaty but which override it. This is how the final issues were addressed recently.
The hold up in the treaty has been Canada's insistence on having some of the worst bits of it watered down or excluded. With the US gone, that left Canada as the second largest economy in the treaty, and so with additional negotiating power to get those changes made.
The US and certain other parties originally tried to keep Canada out of the negotiations, planning on presenting them with a fait accompli later and telling them to sign it (TPP was intended to replace NAFTA). Canada elbowed their way into the negotiations mainly to try to undermine them from the inside.
Australia seemed to be the main proponent for signing the treaty as is. Much hate was directed from their government towards Canada over delays caused by Canada insisting on changes.
There's a good chance that the worst aspects of the TPP have been de-fanged in the past year and it's now been watered down to a normal trade treaty.