In the control center photo, do they really run WinXP, or is that just the wallpaper?
CERN made headlines with the discovery by physicists in 2012 of the Higgs boson, paving the way to a breakthrough in our understanding of how fundamental particles interact. Central to this was the Large Hadron Collider – a 26km ring of interconnected magnets chilled to -271.3C straddling the Franco-Swiss border. The LHC is …
In the LHCb control room we used to have some Win7 machines mixed in with Scientific Linux machines, but now they all run CERN's own flavour of CentOS. At the CERN Control Centre (pictured) apparently the accelerator operator machines all run Linux now (presumably the same CentOS). There might be some Windows machines left for other purposes.
I wonder if they also manage the volunteer-run distributed computing system with OpenStack.
It does use VirtualBox and connects to their CernVM File System (CVMFS).
https://lhcathome.cern.ch/lhcathome/server_status.php
Check it out if you'd like to donate some spare CPU cycles to LHC simulations.
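If you want to script a quick look at those numbers: stock BOINC servers usually expose the same status page as XML via a ?xml=1 query parameter – an assumption here, since LHC@home may have that export disabled. A minimal Python sketch that just prints whatever numeric counters come back, without guessing at exact tag names:

    import urllib.request
    import xml.etree.ElementTree as ET

    # Assumption: LHC@home runs stock BOINC, whose status page also
    # serves XML when given "?xml=1". If that export is disabled,
    # this will raise instead of printing anything useful.
    URL = "https://lhcathome.cern.ch/lhcathome/server_status.php?xml=1"

    with urllib.request.urlopen(URL, timeout=30) as resp:
        root = ET.fromstring(resp.read())

    # Walk the tree and print any purely numeric fields (task counts,
    # user counts, etc.) without assuming the exact schema.
    for elem in root.iter():
        text = (elem.text or "").strip()
        if text.isdigit():
            print(f"{elem.tag}: {text}")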
""The way the physicists work is they build different analysis algorithms within a framework but one of the challenges is if the code's been written by a large number of people so that it optimises a large program – that requires breaking that down and finding the core algorithm and optimising them."
You can't just keep throwing more costly CPUs and IOPS at legacy code to make it faster – optimising the code to exploit the available compute and I/O will have just as big a return on investment as buying new hardware and 'containerising' legacy code to keep it running in a modern environment.
Coders typically assume that more hardware is the best fix for badly optimised code – it isn't. Being hardware-constrained actually forces code to be efficient with the available resources.
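To make that concrete, here's a toy Python sketch (not CERN code – a hypothetical per-particle invariant-mass pass, m² = E² − p²) of what "finding the core algorithm and optimising it" buys you: the same calculation as a naive interpreter loop, then vectorised with NumPy to exploit the hardware already in the box:

    import time
    import numpy as np

    def invariant_mass_naive(px, py, pz, e):
        # Element-by-element Python loop: interpreter overhead dominates.
        out = []
        for i in range(len(e)):
            m2 = e[i] ** 2 - (px[i] ** 2 + py[i] ** 2 + pz[i] ** 2)
            out.append(m2 ** 0.5 if m2 > 0 else 0.0)
        return out

    def invariant_mass_vectorised(px, py, pz, e):
        # Same arithmetic pushed down into NumPy's compiled loops.
        m2 = e ** 2 - (px ** 2 + py ** 2 + pz ** 2)
        return np.sqrt(np.clip(m2, 0.0, None))

    rng = np.random.default_rng(0)
    px, py, pz = rng.normal(size=(3, 1_000_000))
    e = np.abs(rng.normal(loc=10.0, size=1_000_000))

    t0 = time.perf_counter()
    invariant_mass_naive(px, py, pz, e)
    t1 = time.perf_counter()
    invariant_mass_vectorised(px, py, pz, e)
    t2 = time.perf_counter()
    print(f"naive: {t1 - t0:.2f}s  vectorised: {t2 - t1:.3f}s")

On an ordinary laptop the vectorised version runs one to two orders of magnitude faster – roughly the speedup you'd otherwise be shopping for as new hardware.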
> We generally don't, but often – especially given other demands for resources – it is the least cost-inefficient option.
Indeed, back in the early days of computers, computing power was more expensive than programmer time, so it made sense to have programmers spend a lot of time optimising their code to the limit to squeeze the most out of the machine. Hence you saw amazing stuff done with what we'd today consider an impossibly small amount of RAM and CPU power.
However, that has now been inverted: computing power is a lot cheaper than programmer time, so sometimes "just throw more hardware at the problem" is the right answer. In fact it seems to be the more cost-effective choice pretty much everywhere (except the embedded and aerospace industries, and to a lesser extent HFT in finance).
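A back-of-the-envelope sketch (every figure below is made up purely for illustration) shows where the break-even sits: a month of optimisation work only beats "buy more servers" when the fleet is large enough for the saving to cover the engineer:

    # All figures are assumptions for illustration, not real prices.
    dev_cost = 15_000.0        # one engineer-month, fully loaded, USD
    server_cost = 500.0        # one server per month, USD
    speedup = 2.0              # assume optimisation halves the fleet needed
    fleet = 40                 # current number of servers

    servers_saved = fleet - fleet / speedup          # 20 servers
    monthly_saving = servers_saved * server_cost     # $10,000 / month
    break_even_months = dev_cost / monthly_saving    # 1.5 months

    print(f"optimisation pays for itself after {break_even_months:.1f} months")

With 40 servers the work pays back in about six weeks; rerun it with a fleet of 4 and the same effort takes over a year to recoup – which is why "throw hardware at it" usually wins for small deployments, and why CERN-scale computing is one of the places it doesn't.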
Do they hide the roomful of 8086 chips, still in their original plastic packages in 20+ year-old antistatic bags, purchased from NASA at a very good NOS price?
Even CERN has to use old tech, due to the truly mind-boggling radiation (a lethal dose in seconds) at the four major detectors, i.e. ALICE, ATLAS, CMS and LHCb.
Some of those chips have to be decades old but are still working; incidentally, they're also used in physics packages, albeit in the ceramic-packaged variant.