* Posts by thames

1125 publicly visible posts • joined 4 Sep 2014

Gartner confirms what we all know: AWS and Microsoft are the cloud leaders, by a fair way

thames

Re: Confused

As the story notes, this study is on "Infrastructure as a Service" (IaaS), not "Platform as a Service". Google placed their bets on PaaS very early, and got into IaaS much later. The absence of other major cloud vendors from the study is due to them not being in the IaaS side of the market at all.

Microsoft's cloud is more focused on IaaS because they have a lot of customers who want to move legacy applications to "the cloud" so they can shut down data centres and bin staff as a cost saving (theoretically, anyway). Microsoft's efforts at PaaS on the other hand have supposedly not gone as well.

Google doesn't have that legacy system base, as they started off as a cloudy type company right from the beginning. Thus, they're focused on applications that were designed from the ground up for "the cloud", and their PaaS is oriented around this.

So it goes like this:

Amazon - By far the biggest, cheap, lots of third parties based on it.

Microsoft - Move your legacy applications to "the cloud".

Google - Mainly PaaS.

Salesforce, etc. - Entirely PaaS.

OpenStack vendors - Like Amazon, but with less vendor lock-in.

Theoretically, PaaS applications should take less effort to create, be more scalable, and require fewer resources to run and so should be cheaper to operate in the long run. On the other hand, if the PaaS is proprietary, then there's more vendor lock-in. Decisions, decisions.

Debian devs dedicate new version 9 to the late Ian Murdock

thames
Linux

Iceweasel

The Firefox/Iceweasel issue came about due to Mozilla's trademark policy conflicting with Debian's stable release policy. Mozilla would not allow Debian to create their own stable version of Firefox by backporting patches while still calling it Firefox. Mozilla has this degree of control due to trademark law. The code is free and open source, but the trademark belongs to Mozilla.

Debian's policy is no version updates to stable releases, just security and bug fixes. Mozilla's policy was that distributors were not to create their own LTS version of the browser while still using the Firefox trademark.

Mozilla now has their own official LTS version, which is what Debian packages in their stable distro. Since the impasse has been resolved, Firefox is now Firefox again in Debian.

Software dev bombshell: Programmers who use spaces earn MORE than those who use tabs

thames

@Paul Crawford - "However python demand spaces."

As pointed out by Vic, Python will work just fine with either spaces or tabs, whichever you prefer. You just can't mix them together in the same syntactic block. The Python compiler will give you a syntax error if you mix spaces and tabs in an ambiguous manner.

As an example, suppose I have a small program with some print statements inside an if statement. If I indent the first print statement using tabs, but indent the next line an equal amount using spaces, I get a syntax error at compile time. Even though visually it looks the same, the compiler detects the mix and gives a specific error message pointing to a specific line. This happens at compile time, not during execution. Here's the error message.

File "./test.py", line 5

print('Maybe')

^

IndentationError: unindent does not match any outer indentation level

Python enforces consistent indentation as part of the language syntax. This is a design feature, as it eliminates entire classes of bugs which would otherwise result from humans reading code structure from its indentation while the compiler ignores it.
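A minimal reproduction of that mix, with the indentation built as a string so the tab and the spaces are explicit. Note that recent Python 3 versions report this particular mix as TabError, a subclass of IndentationError; the exact message varies by version.

```python
# Mixed indentation inside one block: a tab on one line, spaces on the next.
# Built as a string so the whitespace is explicit (\t = tab).
source = (
    "if True:\n"
    "\tprint('Yes')\n"          # indented with a single tab
    "        print('Maybe')\n"  # indented with eight spaces
)

try:
    compile(source, "test.py", "exec")
except IndentationError as err:  # TabError is a subclass of IndentationError
    print(type(err).__name__, "at line", err.lineno)
```

The key point is that the error is raised by the compiler/tokenizer, before a single statement runs.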

thames

Re: correlation is not causation.

Yes, correlation is not causation. If you analyse a set of data enough ways, you stand a strong chance of finding various meaningless correlations purely by chance. They could just as easily have found, by chance, a correlation between pay levels and the colour of your underwear.

This is a common mistake amongst people who don't understand statistics well enough to know better.

Move over, Stuxnet: Industroyer malware linked to Kiev blackouts

thames

Re: Puzzling

Siemens PLCs are so ubiquitous for the same reason that Microsoft products are so ubiquitous. They have the distribution networks, training and skills certification partnerships, third party consultant and value added product partnerships, an entrenched install base and vendor lock-in with existing customers, etc., etc.

Once you've got one Siemens product installed, you have to keep buying more of their stuff because their stuff only works with either their other stuff, or with someone they have a partnership with. Most of the other big vendors work the same way. They've got the "vendor lock-in" dials turned up to 11.

thames

Where the real problem lies

The original article (the link is in the Reg story) waffles about a bit, as the reality is nowhere near as dramatic as the hype.

First, "Industroyer" appears to be just another Windows virus, but with a payload intended to send commands to devices on an attached private network rather than sending spam or mining Bitcoin. In other words, it's a bog standard IT security problem rather than something super special to industrial control systems. The solution is to apply whatever security measures are considered adequate for whatever other revenue critical functions your business may have running on MS Windows.

CVE-2015-5374 is completely unrelated. This affects a Siemens Ethernet to industrial network protocol converter. This is a small box that you plug an Ethernet cable (or optic fibre cable) into on one side, and one of the special industrial media/protocol combinations such as Profibus into the other side. It will then accept commands from your Windows PC via Ethernet, translate them into industrial protocol commands, and do the electrical (or optical) conversion to pass them on to the industrial devices. It also does the same in reverse.

The listed CVE apparently refers to the converter having a software bug which will crash the firmware if it receives a malformed packet. This doesn't surprise me, as most devices in this class are absolutely craptastic regardless of who sells them. A lot seem to be private labelled versions of products from one or two small companies. The only relationship that "Industroyer" has to this CVE is that Industroyer could theoretically send it the sort of malformed data packets that could crash it.

As noted by another poster, there is, by the way, zero evidence presented that Industroyer had anything to do with the Kiev blackouts.

Whether we are talking about Stuxnet, Industroyer, or whatever scary story of the week is being promoted, they all have one thing in common. They are basically just MS Windows viruses with specialised payloads. For whatever reason, good or bad, the big industrial equipment vendors have decided to host their configuration and monitoring software on MS Windows PCs. This is the software used to monitor what is going on in the equipment, tell the equipment what product to make or how much electricity to produce, log data to databases for analysis, and program the equipment via proprietary IDEs.

That software has to be able to communicate with the equipment in order to do its job, and vice versa. If a virus payload manages to take over one of those PCs, it can then masquerade as a legitimate user, and send out commands using that user's authorisation to do whatever those industrial devices are normally intended to do. That can include shutting down an electrical power plant, which is something that happens routinely for legitimate reasons anyway. Altering the industrial protocols or devices will do nothing to mitigate that.

Where the main, real vulnerability lies is in the multitude of Windows PCs which are an inseparable part of many modern industrial systems. Security efforts need to be focused there. The reason why these types of articles waffle around that point is that admitting it isn't going to generate higher consultancy fees for security consultants who have added the word "industrial" to their business cards.

Intel to Qualcomm and Microsoft: Nice x86 emulation you've got there, shame if it got sued into oblivion

thames

Re: 40 year old tech....

@Jack of Shadows - "unfortunately, most/much of that portfolio is around SIMD and right there is the basis of their threat."

This is software emulation, not a new chip. Actually it's probably not even real emulation, but rather binary translation. That is, it would be cross-compiling x86 binary instructions to corresponding ARM binary instructions.

ARM has its own SIMD. If the emulator can translate x86 SIMD binary instructions directly to corresponding ARM SIMD binary instructions, then there's no problem as the ARM chip is implementing it directly already.

The only way Intel's patents can mean anything is if their x86 chip is doing something related to SIMD that doesn't exist in ARM. And if it is doing that, then the ARM chip has to do it using normal non-SIMD instructions anyway. It's pretty easy to imagine the binary translator seeing an x86 SIMD instruction that doesn't exist in ARM, and just calling a library function that accomplishes the same thing using conventional instructions. I can't see Intel's patents coming into play in that case.
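A toy sketch of that translate-or-fall-back idea. The mnemonic mappings and the helper-routine naming here are purely illustrative, not taken from any real translator:

```python
# Toy model of binary translation for SIMD: map each x86 SIMD mnemonic
# either to a native ARM NEON equivalent or, when none exists, to a call
# into a helper routine built from ordinary instructions.
# The mappings and helper names are illustrative only.
DIRECT_MAP = {
    "paddb": "vadd.i8",   # packed 8-bit add -> NEON 8-bit vector add
    "paddw": "vadd.i16",  # packed 16-bit add -> NEON 16-bit vector add
}

def translate(x86_mnemonic: str) -> str:
    if x86_mnemonic in DIRECT_MAP:
        return DIRECT_MAP[x86_mnemonic]    # direct 1:1 translation
    return f"bl __sim_{x86_mnemonic}"      # fall back to a library call

print(translate("paddb"))     # direct NEON equivalent
print(translate("movntdq"))   # no direct equivalent -> helper routine
```

The point of the sketch is the branch, not the table: only instructions with no ARM counterpart ever reach the slow library path, and those are exactly the cases where no ARM SIMD hardware is being used.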

I've been doing a bunch of work using SIMD recently, and what I can say about Intel's SIMD instruction set is that there may be a lot of instructions but that's mainly because there are multiple different overlapping sets of instructions that do very similar things. They just kept adding new instructions that were variations on the old ones while also retaining the older ones, resulting in a huge, tangled, horrifying mess of legacy instructions which they can't get rid of because some legacy software out there might use it.

Off the top of my head, the only SIMD feature that I have run across so far that Intel has a unique patent on is a load instruction which has the ability to automatically SIMD align arrays which were not aligned in RAM. It sounds great, but it's not really as big a deal as you might think, since good practice would have you simply align the arrays to begin with when you declare them. It's mostly of use to library writers who want to be able to handle non-aligned as well as aligned arrays for some reason. You take a performance hit for that flexibility however.

I suspect that software publishers, including Microsoft, will offer native ARM ports for the most popular applications rendering this moot so far as they're concerned.

UK surveillance law raises concerns security researchers could be 'deputised' by the state

thames

Re: Education is key...

"Anon for good reasons" - You only think you're anonymous, citizen.

thames

It sounds more like it's about telling you to keep your mouth shut about vulnerabilities than asking you to find some for them.

For example, suppose GCHQ are exploiting a vulnerability in the telecoms gear in Berlin to monitor cabinet conversations in the German government. GCHQ know from reading the literature that you've published previous papers on related security research. They then serve a warrant on you telling you to "assist" them by running anything by them first before publishing it. If you start getting warm with regards to a vulnerability that they're using, they'll tell you to "assist" them further by stopping work in that direction and not to publish anything about it. The warrant will also forbid you to say that there's even a warrant. That protects their ongoing use of that vulnerability.

The sort of equipment we're talking about is expensive and not in the hands of the general public, so the number of independent security researchers working on that problem domain will be very limited. Keeping them muzzled would not be difficult, given the tools described. The fact that nobody seems to know what the law actually means in practice is unlikely to be an accident, as they have an ingrained reflex against revealing that they even want to do stuff like this.

They could be doing this right now, and there's no way for you to tell.

New 'Beaver' web server has exactly ONE user outside China

thames

So are you saying that IIS is fine for hobbyists, but when real work needs to be done, real professionals use Apache or Nginx? That's what your argument amounts to.

Many, many, registered domain names don't exist as real web sites. Instead, domain name speculators ("domain squatters") buy them and hold onto them, hoping to sell a "valuable" name to a real business at a huge mark-up. These names don't need a real associated web site since they won't receive any traffic unless someone types a domain name into their browser incorrectly. They can just exist as a database entry in an old server that serves up a default template with some ads and a "this domain name for sale" page.

There were reports a number of years ago that Microsoft was occasionally paying domain name registrars to put these unused sites on IIS as a marketing wheeze. There wasn't a lot of money involved, but it's not an expensive service to run. This is why the "all sites" numbers go up and down like a roller coaster, while "active sites" and "top million sites" numbers change relatively slowly and in line with well known industry trends. The "active sites" figure has just one "bump", which I believe is from when Netcraft had to start accounting for this phenomenon by testing whether a site is a "real" site.

This is why the "all sites" numbers are meaningless. They don't have any correlation to how many web servers or real web sites actually exist. Compare the "active sites" numbers to the "top million sites" numbers though, and the relative rankings are the same, and the market share numbers are roughly in line (although "top million" trends tend to lag "active" obviously, as the biggest sites have generally been around for a while).

If you're a server administrator or web developer and want to know what new skills you ought to be working on, then the "active sites" numbers will tell you what you should be paying attention to if you want to address the widest real market.

thames

So What Is It?

I did a quick search for information about the "Beaver" web server, and can't find anything other than the Register article. The Netcraft article isn't much more informative. So, what is it?

Given the information in the article, I would speculate that it's a drop-in replacement for IIS for running some types of existing applications in a manner which is advantageous to a hosting provider. It's hard to think of any other reason for a hosting provider to have this.

thames

Re: Usage Stats

I think the Dot Net or IIS sites that are on the Internet tend to be mainly older systems. It's pretty rare to hear of a new large scale application using either. Yes, there are a few existing successful ones, but they tend to be ones built back when IIS and Dot Net were considered serious competitors in the Internet market (I think remembering that makes me sound old).

The mainstay for Dot Net and IIS these days tends to be legacy business applications that run on internal company networks out of public view. Dot Net and IIS are the new COBOL and mainframe.

Your roadmap to the Google vs Oracle Java wars

thames

Re: Google had copied

@Richard Plinston - "In any case I am not sure why Google is not using as a defence:

Sun released the complete source code of the Java Class Library under the GPL on May 8, 2007, ..."

The problem is that Google didn't use the GPL version of the class libraries. They re-implemented their own because they wanted Apache or BSD licensed software. A license for software written by Sun doesn't apply to software written by Google. Google's long standing aversion to the GPL may have just bitten them on the arse.

thames

Re: OTGH

The problem here isn't the OS itself, it's the user applications written in the Java language. The apps written in other languages are not at issue. If you port the Java language apps to a new OS, you have the identical problem. If you replace the Java language apps on Android, you've gotten rid of the Oracle problem.

In other words, the problem isn't the Android OS, it's the apps written in the Java language (even though it's technically not "Java").

Replacing the Android OS itself would be silly, as there are loads of simple apps written in Javascript (which has no relation to Java) and loads of complex games written in C++ which are unaffected by this.

As for what Google's "plan B" might be in all this, the most straightforward one would be to replace the recommended programming language with one that doesn't duplicate any of the Java APIs. Since it's the Java library APIs and not the language itself that is in question, any language that runs on the JVM and calls the Java libraries could be equally problematic. That means that Kotlin, recently given official support from Google, could be in trouble until and unless the lawyers can give the standard library a good going over and assure everyone that there's nothing resembling Java APIs in there. Google Dart might be a safer alternative.

Once Google have picked a new recommended language, they need to figure out what to do about the existing apps written in the Java language. Perhaps they could release a program that mechanically translates Java to another language. For those developers who don't want to do that, they could let those developers negotiate their own license directly with Oracle. I suspect that given that alternative, most developers would either choose to port to the new language (using the automated tools provided by Google) or withdraw the app from the app store (for apps that were not selling well anyway).

For software developers in general, though, the message seems to be to avoid Java like the plague unless the customer insists on it, and in that case make sure the customer is the one bearing any and all risk (get that in writing).

The above applies to mobile phones. If Oracle are successful however, I doubt they would stop there. Traditional server systems would be at risk when Oracle casts their eyes upon them looking for excuses to squeeze licensing fees out of everyone using the JVM in some way. Alternative languages which run on the JVM and use the standard Java libraries are an obvious target if they implement the library interfaces in a similar fashion.

American software copyright law has something called "non-literal copying", whereby the judge decides that something looks sort of like something else, even if they're not actually identical. That means the fact that the function and parameter names and the syntax decoration aren't the same doesn't mean that it isn't a copy. That's why Google can't simply make the whole problem go away with a bunch of "sed" scripts.

Third party trolls will have a field day as these newly created IP rights become the functional equivalent of new software patents. The fact that you wrote the software entirely yourself won't save you from copyright infringement claims, including claims of non-literal copying.

In other words, if you're in the software business, be prepared to be royally screwed by Oracle if the case goes their way.

Bankrupt school ITT pleads 'don't let Microsoft wipe our cloud data!'

thames

Don't forget the fees.

The bankruptcy lawyers will probably bill the estate more in fees on this motion than the cost of the outstanding bill.

They've had since September to worry about this. According to the document, they didn't even look into preserving the data until last month, which is long after the expiration countdown had begun.

The concern of the lawyers by the way isn't for the students. They'll get screwed regardless. It's that there may be data somewhere in there which the lawyers can trawl through which will help them in pursuing the bankruptcy proceedings.

The situation is however instructive from the point of view of companies offering cloud services. They should all have some expeditious means by which they can dump out some disks or tapes with the client's data and hand them over to the trustee as quickly as possible so they can wash their hands of the whole affair instead of getting dragged repeatedly into very expensive legal proceedings.

It's something to think about.

Samsung was just Tizen – homegrown Linux again pitched at n00bs

thames
WTF?

Eh?

El Reg said: "most of the other participants in the LiMo Foundation more or less lost interest in the idea of Linux-for-mobile-devices."

Er, Android is Linux-for-mobile-devices and I don't think people have lost interest in that. All that has happened is that people have lost interest in Samsung's version of Linux as opposed to Google's version of Linux.

Samsung's idea for Tizen was to fill the perceived gap in the market between Android smart phones and feature phones with something which took fewer resources. The problem is that Android smart phones have gotten so cheap that the gap has pretty much disappeared, leaving Tizen as a solution looking for a problem.

Just so we're all clear on this: Russia hacked the French elections, US Republicans and Dems

thames
Coat

Clinton's embarrassing leaks are on Wikileaks. Trump's are on his Twitter account.

More seriously, we can speculate that the Russians may have compromising information about Trump. Since the man knows no shame though, I'm not sure how much good it would do them.

However, we know the Russians had compromising information about Clinton. I mean, everyone keeps telling us they hacked her servers to get it, right?

Might I make a modest suggestion? How about not advancing dodgy politicians to positions of power to begin with?

Mine's the one with the spy rifling through the pockets.

You only need 60 bytes to hose Linux's rpcbind

thames

Before panicking, check to see that it's even installed. I just checked my PC (Ubuntu 16.04 desktop), and it's not installed by default.

I've not had anything to do with "rpcbind" myself, but from what little I've seen about it, it appears to be mainly used with NFS servers.

According to the on-line man pages, it appears originally to have been a BSD program which was ported to Linux in the very early days. Given that, it's quite possible that this may affect any version of BSD or Apple's Mac OS X as well.

Post Unity 8 Ubuntu shock? Relax, Linux has been here before

thames

Ubuntu started out as just Debian with:

a) more polish,

b) a regular and predictable upgrade schedule,

c) more up to date packages and features without running a bleeding edge version,

d) the option of commercial support by the same organisation that released it.

It looks like they're going back to their roots. The nice thing about using a Debian based distro is that you can jump ship to another Debian based distro, or even directly to Debian itself, without changing the underpinnings.

There is only one major Linux distro that isn't joined at the hip to a commercial interest, and that's Debian. The majority of Linux distros are Debian derivatives (including Ubuntu). Stick to Debian or a Debian derivative for your desktop and you will always have fairly painless options to switch to if you decide you don't like the direction your current distro is headed in.

thames

@Robert Carnegie - "It seems unlikely to me that Ubuntu can persist without providing desktop applications - they may not be distinctively different from other Linux flavours, but isn't the desktop presence what Ubuntu is for?"

Ubuntu got its space in the Linux server market by offering a good desktop version. Originally, it was just a Debian base with stock Gnome 2 with some polish and a Ubuntu theme. I expect they will go back to that model, but with Gnome 3 (the Debian base hasn't changed). They will continue the desktop because getting developers to use their desktop is a good introduction to getting them to use their server products.

Red Hat made a massive mistake when they stopped offering free access to their desktop and pointed people at Fedora (free of charge, but not the same as RHEL and with a very short life cycle). That was the opening in the market that Ubuntu jumped into, and it helped them get their associated server distro established.

The desktop version of the distro is just the server version with a GUI added and some different default packages. It won't take a lot of resources to take stock Gnome 3 (which Shuttleworth said they will use) and add that to the base Ubuntu distro. The community versions using the alternative desktops already do that (including an already existing Gnome 3 version). Much of the work flow is automated through Launchpad.

A number of existing and former Canonical employees have said that if you took out the mobile oriented projects which Unity is intended to support, then Canonical is profitable. The mobile projects consumed a disproportionate amount of effort, with no obvious path to success past the dominance of Android.

I expect Ubuntu to focus on servers, cloud, and IoT (server and embedded), with a well polished desktop being offered as a way of attracting and keeping a community of developers who want to use a very polished and well supported Ubuntu desktop to develop their software. If people develop their software on Ubuntu, then the newest and most up to date versions will run on Ubuntu with little effort on the part of the distro maintainers. It's more effective than an advertising campaign, and much cheaper.

thames

Re: I wanted one...

Android has dominated the mobile phone market so thoroughly that there was probably very little chance of a third party finding a market niche. Blackberry got squeezed out, Microsoft failed after pouring money into it. Other third party vendors have made no headway.

I think that Shuttleworth faced the facts and pulled the plug when it became obvious that he wasn't going to succeed where Blackberry and Microsoft failed.

thames

Re: Unity usage?

@Jonathan 27 - "Unity never felt quite finished, the configuration options were never there, for one. You couldn't make the launcher autohide"

The Unity launcher was auto-hide right from the beginning. They then added an option to make it not auto-hide, because so many people didn't like auto-hide. They then made not auto-hide the default but left the option for it in place for those who still wanted it.

If you want auto-hide, then in 16.04 (it's probably the same in previous versions), click on "System Settings", then "Appearance", then "Behaviour". The first option is "Auto-hide the launcher". Set this to "on". When you've set it to "on", then the controls which adjust the auto-hide sensitivity are enabled and you can adjust several different factors relating to how and when the launcher appears.

I don't like auto-hide and don't use it, but despite that even I knew where that setting was.

The auto-hide option by the way is right above the options which control where the menus are displayed, so you would have had to go right past it when you changed the menu location options (which you said you did).

This criticism was typical of 99% of complaints that I've seen with respect to Unity. They've been written by people who don't use it and know little about it.

Net neutrality blowback: Cities say no. Court says whoa. Trumpster blames Canada for not going slow

thames

The best description that I've seen of this is that the ISPs want to turn the Internet into cable TV. That is, you pay 'x' for the "basic" package with access to a few websites, 'x + y' for the "silver" package with access to more, 'x + z' for the "platinum" package with access to the rest of the Internet (e.g. The Register), etc.

They wouldn't do anything so obvious as charging you for access to specific web sites. Rather, they would simply raise the cost of basic bandwidth to ridiculous levels and then "zero rate" sites for which they've negotiated a cut of the revenue.

I can remember when Compuserve and other proprietary communications services were still around. You got a basic walled garden, and access to anything outside (including basic e-mail to anywhere outside their network) cost an eye-watering extra. They charged third party companies money to host forums inside their system in order to reach their users. I can recall when Siemens hosted the user support forums for their industrial automation business inside Compuserve, and people in the automation business paid for Compuserve accounts just to get access to the Siemens content.

Once the actual real Internet became more widely available, people dropped Compuserve, MSN (the original incarnation of it), and the others like a rotten turd. You could get direct access to anyone anywhere, without having to worry about which network they were on. You also didn't get charged stupid premiums for going outside the network.

The biggest ISPs are looking to return to those days, where monopoly control of the user base is the basis for demanding a cut of the revenue associated with every byte that goes through their wires. No thanks, I think the CRTC made the right call on this one and the lobbyists can go stuff themselves.

Stanford Uni's intro to CompSci course adopts JavaScript, bins Java

thames

Re: Just teach them Python

I have to agree with the recommendation for Python when teaching the basics. Anyone planning a course has to deal with the reality that there are a very finite number of hours in which to teach, and the toughest decisions are which of the "essentials" to leave out of the course rather than what to include.

The design of Python allows the basics to be covered without requiring knowing the advanced features. Knowledge of syntax and features can be added to the course incrementally, instead of requiring the student to learn a large amount of what they would see as "mumbo-jumbo" up front.

The standard library is very comprehensive, which allows topics such as simple databases, regular expressions, string formatting, file and directory handling, socket I/O, servers and clients, graphics, simple GUIs, etc. to be covered without using third party libraries. The syntax of the language itself supports dictionaries, lists, sets, etc. Teaching all of these would be impossible in the time available, but it gives the instructor a good selection of topics to choose from without a lot of teaching overhead.

The on-line documentation for Python is superb, being some of the best that I've seen for any language.

Another big advantage of Python is that it is more or less self contained. It comes, as the saying goes, "batteries included". Installation is simple (if it is not already pre-installed). Again, this is very important from a teaching perspective, as there needs to be minimal overhead involved in the students installing it on their own.
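A quick illustration of the "batteries included" point: the script below uses only the standard library (an in-memory SQLite database plus a regular expression), with nothing extra for a student to install.

```python
# "Batteries included": a database and regular expressions straight from
# the standard library, with no third party packages to install.
import re
import sqlite3

con = sqlite3.connect(":memory:")          # throwaway in-memory database
con.execute("CREATE TABLE posts (author TEXT, title TEXT)")
con.execute("INSERT INTO posts VALUES ('thames', 'Re: Just teach them Python')")

title = con.execute("SELECT title FROM posts").fetchone()[0]
print(title)
print(bool(re.match(r"Re:\s", title)))     # does the title look like a reply?
```

Two classroom topics (databases and regular expressions) in a dozen lines, which is exactly the sort of low-overhead exercise an introductory course needs.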

No IDE is required, and none should be used in the course. There are many simple free text editors available which will do syntax highlighting and auto-indenting and require minimal time to learn. I have seen how introductory courses go when some students are still struggling to get their complex IDEs working three classes into the course, and as a result haven't completed the first assignment yet. Class assignments are short and simple, so you aren't dealing with searching across massive code bases. If you "need" an IDE in order to use a language effectively in a classroom, then it's the wrong language for this application.

The Python compiler rigidly enforces a set indenting style, which again is a big advantage in a classroom environment. This means the style of code as taught by the instructor will be close to that written by the students, so the instructor doesn't have to wade through a mess of unindented code ("I was going to clean it up after I got it working" is the typical excuse) when students come to him with problems.

Python is also very likely to be more useful to engineering and science students than Javascript is. One of the major areas of application for Python, and one of the reasons the language is so widely used in real life, is that there are large, high quality numerical analysis libraries which are intended for use with it. Many people learn Python specifically so they can use these libraries.
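A small taste of what draws engineering and science students in, assuming NumPy (the best known of those third party numerical libraries) is installed; the data here is invented purely for illustration:

```python
import numpy as np

# Whole-array arithmetic with no explicit loops: divide each
# force by the corresponding cross-sectional area to get stresses.
forces = np.array([10.0, 20.0, 30.0])
areas = np.array([2.0, 4.0, 5.0])
stresses = forces / areas  # element-wise division
print(stresses)            # [5. 5. 6.]
```

Being able to write the maths almost as it appears on paper is a large part of why these libraries are so popular.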

Overall, the reasons for using Python as a teaching language are quite compelling. For someone who is going on to take a degree in computer science, Python, or indeed any other single language, is not enough. I would expect them to learn a number of different languages, giving exposure to different concepts. 'C' is very important for a "serious" programmer to learn, but I don't think it's appropriate these days for an introductory course.

So far as this particular instance at this university goes however, the CS 106J course appears to have some very limited objectives. There are a couple of other courses (e.g. data visualisation) which require knowledge of Javascript as a prerequisite, and this course appears designed to fulfil that. Courses oriented to engineers and scientists require knowledge of Python.

Microsoft promises twice-yearly Windows 10, O365 updates – with just 18 months' support

thames

Timed Releases?

Twice yearly updates and 18 months between major "releases"? It sounds like Ubuntu, except Ubuntu also has 5 years support on the LTS releases.

Major software projects that have gone to a successful timed release system find that there is much less chance of pushing out dodgy code than there was with a feature based release system. With a feature based release, the code tends to go out whether it's ready or not because the next opportunity may not be for another 5 years. With a timed release system, if a feature is not ready it gets held back until the next release window, which is often no more than 6 months away.

If Microsoft are really going to a timed release system, they will need to overhaul their software development, management, and marketing processes in a very major way. This will not be an easy task for a bureaucracy as large and entrenched as Microsoft's. I won't be surprised to see them fail at it.

Microsoft raises pistol, pulls the trigger on Windows 7, 8 updates for new Intel, AMD chips

thames
Linux

Re: This should help Apple and Linux

@ITS Retired - "How can Microsoft do this legally?"

For large customers, this will be a contract issue. I suspect that Microsoft's license contract has this area well covered. For consumers, new hardware will come with Windows 10 pre-installed anyway.

@ITS Retired - "Or does anyone care anymore."

The Apple Mac users are whining about how Apple doesn't care about them any more and how Apple hardware is so far behind the curve.

I guess that leaves - Linux. If you care, switch. If you don't care, then get used to Windows 10. Those are all the choices their customers have available.

As for what I've already done personally - see the icon.

Ubuntu UNITY is GNOME-MORE: 'One Linux' dream of phone, slab, desktop UI axed

thames

Re: Qt?

KDE is based on Qt. Qt just gives you the tool kit to build a UI. It isn't a UI in itself.

KDE by the way was around before either Gnome or Unity. Gnome in fact was started by people who didn't like the idea that it was based on Qt and so started their own project based on the GUI widgets from the Gimp image editor (hence, GTK - Gimp Tool Kit).

thames

Re: History and all

@keithpeter - "I think that I remember a spat about notification libraries as being one of the reasons for the decision to build Unity as an alternative shell as well as concerns around basic usability."

The notifications library issue came about later, as part of Freedesktop.org discussions (an organisation which was dedicated to promoting standards and interoperability between Linux desktops). KDE and Unity (Ubuntu) worked on standards for notifications and associated widgets, and Ubuntu produced a library which followed that standard. Gnome meanwhile refused to take part and then said they wanted nothing to do with it because "they hadn't been consulted" (because they had refused to discuss it - some serious circular reasoning there).

The head of KDE was quite pissed off with the attitude displayed by Gnome, and gave his summary of the situation here: http://aseigo.blogspot.ca/2011/03/collaborations-demise.html Long story short - KDE felt that Gnome had a major problem with NIH syndrome and didn't play well with others. I'll speculate that some of that attitude may have been sparked by jealousy over KDE seeming to be the ones who came up with most of the good ideas. The KDE and Unity teams on the other hand focused on technical issues, avoided politics, and seemed to get along well enough.

@keithpeter - "It is worth mentioning that Canonical Design actually carried out and published the results of usability studies, albeit with basic tasks and subjects new to Ubuntu."

Yes, although I can't recall the name of the project. They went out and hired some professional UI usability consultants who produced a report which was then published and available to be used by anyone. Nobody else in the Linux world was spending that sort of money on usability by the average sort of person (as opposed to the sort who reads El Reg).

The project was called Ayatana. There are various wikis and Launchpad projects associated with it, but I don't know where the actual study is.

@keithpeter - "There appears to be very little usability research published for gnome shell that I can find;"

The Gnome developers wanted nothing to do with airy-fairy designers and people like that. Their design process consisted of someone writing some C code and asking one of their friends if they liked it. Conflicts were resolved based on who was friends with who at Red Hat.

The end result shows it. Start up Unity and there are familiar looking icons right there for you to click on so you can get started. Start up Gnome 3 and you have an almost completely blank screen and no clue on what to do next. It takes two or three times as many mouse clicks to do basic tasks in Gnome 3 as it does in Unity.

The most common tasks in Unity have keyboard short cuts (about three dozen of them) and they have a nice cheat sheet to explain them that pops up if you hold down the flag key. Gnome 3 is just starting to get around to implementing more than a handful, and if you want to know what they are you'll need some good Google skills to find the Gnome wiki.

Hover the mouse over something on Unity, and a tool tip will pop up telling you what it is. With Gnome 3 - well let's just hope you have a good memory for obscure icons.

I could go on like this for quite a while. Unity works on the principle of giving you obvious things to do up front, making the important stuff large enough for you to see what is going on, and having explanations right there. With Gnome 3 - let's hope you like microscopic grey on black icons, because you're going to have to memorise a lot of them.

thames

Re: Makes sense

@Long John Brass - "Yes; But not quite what I was getting at, The integration I was talking about was/is things like docks, info, message notifiers and inter app messaging, gnome used to use corba and kde dBus."

What you're looking for is Freedesktop.org. It was supposed to make the various desktops more inter-operable. It was started by KDE and for a while Gnome was working with them on it. However, later on Gnome lost interest in collaborating with other desktops (apparently seeing them as competitors).

Here's what happened from KDE's perspective. http://aseigo.blogspot.ca/2011/03/collaborations-demise.html

thames

Re: Makes sense

@Long John Brass - "The app doesn't need to know jack shit about it's desktop environment."

Yes it does. Trying to shoehorn a complex app written for a large desktop monitor onto a small touch screen is never going to work. Either everything will be too small to see or else windows will get cut off and controls won't be reachable. Look at the disasters on the early netbooks with applications that weren't written for small screens.

@Long John Brass - "I believe that there is now a way for gnome and KDE apps for example to co-exist on one desktop"

You always could. You just had to have the correct libraries installed, which any decent Linux package manager handled completely automatically when you installed the app. Usually the only way that you knew you were using an app from "the other" desktop was by the fact that the theme didn't match perfectly and that the names of the KDE ones usually started with a 'K'. You could have both Gnome and KDE desktops installed, and switch from one to the other and still have all the same apps automatically show up in both sets of menus and run just fine. Some distros (e.g. Mandrake) used to install both Gnome and KDE by default, and have even more as an option.

That's got nothing to do with desktop versus mobile though. The latter relates to a different physical environment and different available modes of interaction.

thames

Re: Good

@Mage - Unity wasn't a "single" GUI. There were two different GUIs - one for desktop and one for mobile. They had common visual elements and certain shared features, but they were two GUIs. The system had both and was intended to bring up the appropriate GUI depending upon the mode it was being used in. This was not part of the current shipping desktop Ubuntu, which still uses Unity 7 and X.

Application developers who wanted the same app to work in both environments were expected to write two GUIs, one for desktop and one for mobile, and the system would show the appropriate one, again depending upon current mode. The GUI tool kit was designed to support this dual mode.

In other words, it was the complete opposite to the approach that Microsoft took with Windows 8.

I suspect that Apple will eventually do with the iPhone and Mac what Ubuntu attempted to do with Unity, and their fans will ooh! and aah! at the latest Apple "innovation".

thames

The original version of Unity was for netbooks. However, the current version is definitely not, even though they re-used the name. They just took the name and the look of the launcher and came up with a new interface designed for desktops. The main idea that survives from the older Unity is having the launcher/dock on the left, as this fits modern form factor monitors better, and also using this to make the icons bigger and do double duty as dock and application switcher.

The current incarnation of Unity came about in reaction to the complete usability disaster that the early versions of Gnome 3 were. Ubuntu didn't want to continue to ship old versions of Gnome 2 like many other distros did, KDE was still in the throes of their own long running self-created version transition fiasco, and the other potential desktops had a look and feel from a decade before.

Hence, Ubuntu rummaged through the cupboards and cobbled up a new version of Unity. The early versions were rough, but Gnome 3 at the time was so bad that people widely predicted that Gnome was going to die off from lack of users. Gnome had to kill off support for Gnome 2 to get users to "upgrade" to 3.

Eventually, there was some change at the Gnome Foundation. They abandoned some of their more hallucinogen-inspired UI visions, borrowed ideas heavily from Unity, and came up with something that was at least somewhat usable.

Linux Mint came about as a reaction to Gnome 3, not as a reaction to Unity. The Linux Mint founders also thought the Gnome 3 developers were on drugs, and wanted an interface that looked more like Gnome 2. Originally it was just a set of extensions that re-skinned Gnome 3 radically to give it a different appearance and function (the end result looked nothing like Gnome 3).

However, changes to Gnome 3 kept breaking the Mint extensions, causing them to look elsewhere for a new desktop, hence what they currently have (which were also started in reaction to Gnome 3).

Linux Mint was based on Ubuntu for the same reason that a number of other distros are. Ubuntu offers a good compromise between solid but out of date Debian Stable and bleeding edge Debian Unstable.

The main reason that Unity was never used outside of Ubuntu was that Canonical never put the effort into separating the Ubuntu specific features from the generic features. If you pulled in Unity straight from Ubuntu, you also pulled in a lot of unrelated Ubuntu specific features and services. Nobody else wanted to do the work of maintaining a hard fork if Ubuntu weren't going to make that generic version the one they worked from, and Ubuntu weren't going to slow down and backtrack on their own development to make that possible. Hence, it wasn't practical to put Unity into Debian, and without that there wasn't any practical way for the secondary distros based off Debian to offer it.

If Ubuntu had separated Unity better from Ubuntu specific features early on, then it may have been able to expand into the vacuum that the early versions of Gnome 3 left. Instead MATE (available on 2 dozen distros, including Ubuntu) filled that role.

thames

Re: Mir -> Wayland then?

The whole point of Mir was that it was to be more focused on mobile than Wayland was. There's no point to Mir if Ubuntu Phone is being binned; they'll just use Wayland.

I can see the rationale for focusing on the server and desktop markets, given how Android utterly dominates the global phone market. Ubuntu Server is doing well, so they will focus efforts on that and on IoT.

However, I will miss the Unity desktop on my PC. It's modern without being as radical as Gnome 3. I hope they will do something to improve the usability of Gnome when they switch to it.

Ubuntu started off as Debian with more polish. It looks like they're going back to their roots.

Schneider Electric still shipping passwords in firmware

thames

Re: Do their buyers care?

No I don't think buyers would care. I've never seen anyone use the password feature on this category of hardware from any supplier.

Robots are killing jobs after all, apparently: One droid equals 5.6 workers

thames

Globalisation

I have some familiarity with industrial automation, especially in the automotive market. Robots have been working in factories in the auto industry for longer than many of the readers of this web site have been alive.

They're nothing special though, they're just another tool in an automation designer's toolbox. Most automation doesn't involve robots. The objective is to automate production - in other words, replace labour. There are many ways of doing this, often a custom built bit of tooling, motors, and cylinders does the job faster, and more cheaply and efficiently than a robot. There's not even global agreement on what constitutes a robot.

Robots and automation in general are not what is depressing wages in western countries. What is doing that is outsourcing to low wage third world countries where cheap labour replaces expensive robots.

I've seen this a number of times. A company will spend a decade automating to reduce costs, achieving an efficient 24/7 operation. They then spend the next decade moving production of new products to China or Mexico where they can reduce capital costs by not having to spend so much on automation. In other words, they do a complete reverse course and go from high tech to low tech on the basis of reducing capital expenditure and replace robots with third world labour. Very often the building and labour is provided by a contractor, so they technically have no head count other than the head office staff (and what's left of their product designers after they've taken the chainsaw to them as well).

The globalisation process has gone so far in many western countries that people who are panicking about "robots" being responsible for "killing jobs" are at least 15 years too late.

If you want to panic about something, panic about the "Ubers" of the world who plan on reducing everyone to a "race to the bottom" temp worker.

Ubuntu 17.04 inches closer to production

thames

Re: Swings and those other things

@Barry Rueger - I started with KDE on Mandrake, then switched to Gnome on Ubuntu when Mandrake/Mandriva started circling the drain, and then the Unity desktop on Ubuntu when they finally dropped Gnome 2.

I used KDE on Mandrake because that was the default desktop. When I had to switch away from Mandrake (important bugs weren't getting fixed) I tried to find another distro who did a decent KDE, but none of them were up to the quality of Mandriva (as Mandrake had become after the merger with Connectiva).

Gnome was the default on Ubuntu however, and I was struck by how very, very, polished the complete system was. Their KDE version (I can't recall if it was called Kubuntu at that time) however was a bit crap compared to Mandriva's. The same was true for Debian's KDE (the menu structure at the time was mind bogglingly large and complex).

When Ubuntu switched to Unity, I hung onto the last Gnome LTS version until they dropped support for it altogether. I thought I was going to hate Unity, but I downloaded it and tried it out so that I could write a blog post on how crap it was.

After working with it for about 20 minutes however, I decided I rather liked it. It had fixed a lot of the design problems with Gnome 2 while being less of a radical departure than Gnome 3 was. After a few days I decided there was no way I wanted to go back to Gnome 2.

The lesson that I draw from all this is that how polished a distro is matters a lot more than which actual desktop they use. The other is that size of the user base and the stability of the organisation behind the distro are very important if you just want to be able to use your PC on a daily basis without worrying about the care and feeding of it (as when Mandriva started circling the drain).

There are people who just want their computer to work so they can do stuff with it. The desktop for them is just something to launch and handle their applications. That's the category that I fall into. I stick with the defaults except to change the wallpaper and some of the launcher icons. I do like the Unity keyboard short cuts and probably at least 90% of the application launching I do on a daily basis is done that way.

Other people want to play with their OS, change stuff, try different things, customise it, etc. They're more interested in the computer and OS as an end in itself rather than simply as a means to get other things done. Some of them switch distros and desktops like they were changing their socks.

For people who are just starting out in Linux however, all I can suggest is to start with the mainstream and stick with something that is expected to work as is out of the box without any twiddling. If you want to experiment and try some esoteric things out, do that after you've got a good foundation in the boring and common ones.

thames

Re: Swings and those other things

I've never had problems with CUPS or printer drivers. I've always just plugged the printer in and hit print and it printed. No installing drivers, no selecting drivers, nothing like that. Somehow, it always just knew what the printer was and did everything needed without having to ask me.

With regards to how many people stick with the default Ubuntu desktop, from the data that I've seen, the overwhelming majority of them do. I used to collect user agent data from various public sources to analyse how usage of various Linux distros and their versions changed over time. There was virtually no correlation between actual usage and how often any distro or flavour was mentioned in various forum posts.

It is very likely that the average Linux user doesn't spend much time commenting on his or her distro in on-line forums. They just want to focus on their real interests, which isn't centred around Linux for its own sake. They would be the sort of person who would stick with the major distros and accept the defaults while they got on with doing what it was they got a computer for in the first place.

On the other hand, the sort of person who is interested in Linux for the sake of itself is also likely the sort who is motivated to try out various more obscure distros or flavours or desktops, spend the time to debug, configure, and learn about them, and then talk about them in on line forums.

I know several Linux users personally, and doubt that most of them are aware that other Linux desktops even exist. Those ones all use Ubuntu (Unity) by the way. They're just average people who got fed up with Windows Vista or Norton Anti-Virus and got a friend to install Ubuntu for them so they can browse the Internet and do their email without aggravation. Of course they're the sort of person who's never even heard of The Register, let alone comments here. I know of at least one person who I'm pretty sure is still convinced that she's got a pirate copy of Ubuntu because she didn't pay for it.

The few people who use Fedora or Suse or something like that at home are all in the IT business and use RHEL on servers at work. I've never met anyone who uses Mint, despite reading posts about it all the time.

Inside OpenSSL's battle to change its license: Coders' rights, tech giants, patents and more

thames

@Charlie Clark - "but incompatibility with the GPL is good thing™ in my opinion"

As noted by the AC above, the GPL version 3 is compatible with the Apache license version 2. Compatibility with Apache V2 was a major design goal when GPLv3 was drafted.

@Charlie Clark - "APL allows for embedding in devices without having to argue about shipping source code"

The overwhelming majority of smart phones and a huge host of other embedded devices are based on GPL software, so there don't appear any practical problems with a GPL license in that area.

I suspect that Theo De Raadt's objections to using the Apache License are simply that it isn't a BSD license. He is a relentless promoter of BSD licenses and slags off anything else. He is of course the founder of OpenBSD, whose defining feature is that it uses a BSD license.

The Apache license (version 2) is similar in effect to an MIT license with the addition of a patent clause. The FSF (founded by Richard Stallman) recommend the Apache V2 license for projects which for whatever reason don't use a GPL style ("copyleft") licence, so it's a pretty non-controversial and widely accepted license in the software industry.

There is a general push in the open source / free software side of software development to get rid of custom licenses such as OpenSSL uses and switch to a few widely accepted ones such as Apache, GPL, MPL, etc.

Amazing new WikiLeaks CIA bombshell: Agents can install software on Apple Macs, iPhones right in front of them

thames

Re: Why??

Wikileaks is just a publisher, they're not an international spy organisation. They don't break into offices and steal secrets. If you've got secret North Korean documents, by all means send them to Wikileaks.

Mostly what you're seeing though is that your own press is mainly interested in what is related to your own country. If you are relying upon what your country's popular press is reporting, you are very unlikely to be seeing a representative cross section of what appears in Wikileaks. Instead, you are seeing a very small cherry-picked sample of what your press thinks interests their readership.

thames

Airports

El Reg: "Sonic Screwdriver is cleared aimed at molesting seized machines, or during black bag operations, not at interfering with factory-fresh products in transit."

Do it at airports during a customs inspection. Take the computer out of the owner's sight and install whatever you want.

Or better still, come up with some sort of bogus excuse to force everyone to put their laptops in checked baggage, and then do it in the baggage handling process. That way the subject wouldn't know they had been targeted. Not that anyone would ever dream of doing something as disruptive as arbitrarily forcing people to check their laptops of course ...

Blinking cursor devours CPU cycles in Visual Studio Code editor

thames

Re: but but but

It's slow and not very useful. I've tried it, but I wouldn't consider switching from Geany.

Malware 'disguised as Siemens software drills into 10 industrial plants'

thames

Re: The ineritors of Stuxnet

El Reg appears to have the story backwards. It isn't firmware that is installed on the PLC. It's trojans hidden in Windows programs that are used to load files into various bits of industrial hardware.

In other words, we're talking about bog standard Windows PC trojans that just happen to be riding along inside software that is used by people on their laptops to service industrial control systems. It's no different from trojans hidden inside pirated copies of games or Photoshop. Presumably the perpetrators will make money off this the same way they make money from any of the other trojanned software. These laptops after all will be spending a lot of their time hooked up to the Internet while the user is doing all the routine office work everyone else has to do.

This is nothing new to people who actually work in the industrial field. I was seeing this in cracked copies of Siemens software at least 15 years ago. Everyone in the business back then knew you could get cracked copies of their very, very, expensive development software from servers in eastern Europe and places like that, but that various bits of malware were guaranteed to come along for the ride. Piracy of this sort of software is pretty widespread, so trojanned copies are as well.

What has happened here is that companies selling Windows security software have smelled money in all the concern about cyber warfare, and they are now addressing a market that was too niche for them to care about before. All they need to do is tune their existing Windows anti-virus software to look for the usual trojans in these packages.

Ubuntu splats TITSUP bug spread in update

thames

Re: Chekov here: my nyetwork

Zero problems here. I had an update and the update to the update, without any problems.

Intel swallows Tesla-hating self-driving car biz Mobileye for $15bn

thames

Re: Overpriced

For not much more than 3 times what they paid for Mobileye, they could have bought Ford or BMW lock, stock, and barrel at current market price.

It will be a long, tough road to genuine fully autonomous self driving cars, and it will take deep pockets to continue to finance the R&D. Once it does become both practical and economic however, the auto companies will be looking to drive the price down by using competing suppliers, just like with every other auto part. The auto companies are not going to settle for having a relationship with Intel similar to that which PC vendors have with Microsoft. They're not going to let one company cream off all the profits just for supplying one small part of an overall product.

I can't see Intel shareholders having the patience to keep pouring the R&D money into a project which won't pay off until the long run, and even then with fairly small profit margins by IT market standards. This sounds like yet another dead end venture which Intel has squandered money on to try to make up for having missed the boat on the mobile market.

Favored Swift hits the charts: Now in top 10 programming languages

thames

Re: C still at number 2

@JLV - I have a Python library written in C. Most of the libraries that are used in Python which are not themselves written in Python are written in C. C is very widely used for libraries in many applications because it is a lot easier to interface with than C++, Fortran, or just about anything else.

Interfacing with a Java or C# library (as opposed to the other way around) is nearly impossible once you are outside their self contained ecosystem. The result is that a large proportion of the libraries that are intended to be used by other applications are written in C, at least that's the case for Linux. C is the lingua franca of multi-language programming.
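The point about C's simple ABI can be seen in miniature with Python's ctypes module, which calls straight into the system C library with no wrapper code at all (this sketch assumes a Unix-like system where `find_library` can locate libc):

```python
import ctypes
import ctypes.util

# Load the system C library and call its abs() function directly.
# Declaring the argument and return types is all the "binding"
# that is needed - no generated glue code, no build step.
libc = ctypes.CDLL(ctypes.util.find_library("c"))
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int
print(libc.abs(-42))  # 42
```

Try doing the equivalent against a Java or C# library from outside their runtimes and the contrast becomes obvious.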

As an aside, some people have mentioned assembly language. If you want to directly use compiler intrinsics, you need to know assembly language even if you are writing in C. The above mentioned C library uses GCC compiler intrinsics to get performance of up to 7 times faster than just using standard C language syntax. That's not the sort of performance advantage that you can ignore.

thames

A bit of perspective.

Swift is still behind Delphi/Object Pascal, which should put some perspective on how widespread it really is. It's only marginally ahead of Perl.

I haven't seen any sign that Swift will see significant use outside of the Apple market. If Google decides to deprecate use of Java on Android (perhaps because Oracle is being such a pain), then they are more likely to go for something they control, such as Dart or a derivative of it.

Family of technician slain by factory robot sues everyone involved

thames

@kwhitefoot - "Surely the same rules apply today?"

Having rules and following rules are not necessarily the same thing.

I won't jump to any conclusions in this particular case, as I don't know what happened. I do know that the chances of getting a complete and balanced description of what happened out of one side's or the other's lawyer are pretty slim.

Instead I'll describe a case in an automotive plant in Michigan that happened a number of years ago. When you are talking about making something the size of a car, you are talking about machines that you can walk around in. A maintenance person was working in a large robot cell. When he was finished, he closed the safety gate, re-enabled power, and restarted the production line. However, he later discovered that he had left one of his tools inside. Rather than admitting his mistake, he got a ladder and climbed over the guard and jumped down inside to retrieve the tool. One of the robots then moved in a manner he didn't expect and crushed him against part of the machine.

In the case described in the story, I'll wait for the results of a safety investigation before I jump to any conclusions. Generally though, it is the responsibility of the person working in the machine to follow the appropriate procedures to render the machine safe before entering it. If the machine cannot be rendered safe to enter, it should not be entered. Companies such as Fanuc and Nachi have sold many thousands of robots over decades; this is not science fiction technology. Locking out robot 130 using the lockout on the front of the robot control cabinet would have rendered it safe. The real question is why this safety measure was not applied before the machine was entered.

thames

Re: It's confusing

@frank ly said: "I'm having difficulty undertstanding all that. 'Robot', 'unit', 'section', 'fixture', .... are robot/unit and section/fixture the same thing? "

The "robot" will be the big mechanical arm which moves about. In this case the manufacturer will be either Fanuc or Nachi. Those are two different robot companies, so it's hard to understand why both are involved unless of course both brands of robots were present.

Lincoln Electric makes welding equipment. The robots may have had Lincoln welders on the end of their arms, or they may have been loading parts into equipment that had Lincoln welders.

A "fixture" will be a unit of custom tooling that is machined to hold and clamp the parts being manufactured. The fixtures would be attached to the frame of the work cell, or to the end of a robot arm.

A "unit" or "section" would be a subdivision of the larger overall production line. Those are not necessarily the terms used by the manufacturing companies, they may be just names made up by the lawyer. The sections being labelled "130" and "140" doesn't imply that there are 140 or more "sections". The work cells may be divided up into smaller sub-divisions, and there may be an implied decimal such that it was the 13th or 14th "section", or even the 1st section. You would have to see the electrical and mechanical layout to understand how it was labelled.

The overall production line will have been built in sections at the integrator, run off and approved by the customer. Then the sections would have been unbolted and unplugged from one another, put on trucks and shipped to the customer where they would have been bolted and plugged back together, tested, etc.

Flex-N-Gate appears to be the company that owns the factory in question. They're a large automotive parts manufacturer with multiple plants, and this is one of them.

Prodomax is a large Canadian automation manufacturer located north of Toronto. Their role in this will have been to design and build the production line. They would have bought the robots and loads of other components and designed them into the production line, built the framework and tooling, done loads of wiring, written software, etc. Fanuc, Nachi, and Lincoln Electric would likely have had very peripheral involvement in the actual project.

I've done business with Prodomax as a customer (although not recently), and they were a well run very professional organisation. They've won awards for being a well managed company. The chances that they did anything dodgy or cut any corners on safety are pretty slim, although it is of course impossible to discount some sort of human error.

If their customer was in Ontario, the equipment would have had to go through a formal safety review by a certified safety engineer (usually an independent third party consultant). I don't know what Prodomax does, but I do know that other Canadian companies won't sell equipment to US customers without a similar review simply because of legal liability reasons. I know of at least one small automation company who stopped doing business in the US because the cost of liability insurance there was too high to make it worthwhile when they could pursue business opportunities elsewhere.

Normal safety practice in a case where you are entering a work cell such as this to do maintenance is to lock out the equipment power switch with a padlock for which only you have a key. If there are multiple people involved, there are special attachments which allow multiple individual padlocks to be attached. If you go home and forget to remove your padlock, then your supervisor has to talk to you on the phone and confirm your presence there before cutting the lock off with a set of bolt cutters.

In cases where you need the robotic equipment "live" to diagnose a problem, then you would insert a key into the machine somewhere to change the mode, and plug in a "three position switch". This would be a hand-held device, sometimes built into the robot teach pendant, which only allows the robot to move in very slow motion if the switch is in a middle position. If you let go of it or squeeze it too hard, the slow moving robot is disabled. This is a standard feature built into the robot itself, and is very reliable.

The reports say that the person in question worked in the maintenance department. As such, she should have been well trained in safety procedures. The equipment will have come with extensive manuals (not that anyone generally reads such things) which would have covered this. Whether she was authorised to enter the machine at that time is of course one of the questions which will likely be raised in the lawsuit.

The reports say that her duties involved adjusting the robot and tooling. Quite likely, she was the person whose job it was to tweak the robot to adjust for variations in the incoming component dimensions or material to deal with quality issues.

The reports are unclear, but it is quite possible that multiple different groups of people were working on different tasks in different parts of the work cell, and one group triggered an adjacent robot to cycle when she was in the machine working on another. Normally I would have expected her to lock out the adjacent robot before working on her own. Why that didn't take place is a very good question, and one that can probably be only answered by the supervisors working for her employer.

Official: America auto-scanned visitors' social media profiles. Also: It didn't work properly

thames

A few years ago a francophone Canadian sales manager from Montreal was going to a business conference in New York. On his way there he was exhorting his sales team to greater efforts through the usual bombastic proclamations, including a common French marketroid phrase that apparently translates to "we're going to explode the competition!"

The Americans were apparently already monitoring public social media messaging accounts at that time, and upon his arrival at the border that message popped up. Needless to say, he didn't make it to the business conference in New York, or indeed any further past the border, but rather spent a number of long hours sweating it out in custody wondering whether his next stop was going to be Guantanamo Bay.

I don't have a Facebook or Twitter account, and the way things are going I would be very unlikely to even consider having one now. Cardinal Richelieu supposedly once said "if you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him." Substitute "six social media postings" for "six lines" and you have today's situation in a nutshell.

thames

I'm going to be even more fascinated as to where they're going to get all the people who can read and understand all the various languages and dialects around the world that the multitude of social media accounts will be written in. Somehow I doubt that Google Translate is going to be up to the job.

Oh, I forgot, we're talking about Americans. The fact that most of the world doesn't speak English all the time hasn't occurred to them yet. After all, God wrote the Bible in English, so Adam and Eve were created speaking English 5,000 years ago. Anything written in funny looking squiggles is just terrorist code.

God help us.