"The Pentagon spokeswoman added that US military doesn't check off-the-shelf code it buys from vendors, trusting the manufacturer to get the security of its systems right"
Ha ha ha ha ha! Ha ha ha ha ha! <cough> Ha ha ha ha ha!
Hewlett Packard Enterprise handed over the source code for its ArcSight security platform to Russian investigators in exchange for being allowed to sell kit in the former Soviet Union. That's kinda awkward because the Pentagon is one of ArcSight's most high-profile customers. The US military uses the software, which is …
It's a fun day all around. "The code never left the building" is the 21st century's "The cheque is in the mail". Didn't we have a similar story just a few months ago? If anybody thinks that Russia doesn't have a person, nay several people, capable of memorizing a few tens of thousands of lines of code (or "just the important bits"), that person really should have his or her kindergarten diploma revoked.
Vacated.
"If anybody thinks that Russia doesn't have a person, nay several people, capable of memorizing a few tens of thousands of lines of code"
Even third world countries can buy hidden video recording equipment on eBay. I doubt they would rely on memory...
I don't think that quote really means what's implied by it. We know the US govt *does* require seeing code; what's not clear is who checks it. It'll be a NIST/NSA wombo-combo as a general rule, but they probably don't check certain specific products written under secrecy. Of course, ArcSight is generic off-the-shelf technology not written under a secret contract - if nobody is checking that specifically, they're rather naive about the potential for foreign security services infiltrating HPE, I'd think.
That said, it wouldn't be super difficult, given what it does, to isolate ArcSight to the point where, even if it were backdoored, it wouldn't matter.
Aww, cut them some slack, willya? With all the budget cuts in recent years the Pentagon barely has the money to fund outstandingly successful projects like the F35. And I hear all that computery stuff is about virtual zeroes and ones anyway, so what's the worst that could happen?
exactly...viz, http://www.visualcapitalist.com/putting-americas-defense-spending-into-perspective/
Putting America’s Defense Spending into Perspective
JEFF DESJARDINS on February 8, 2016
Wouldn’t it be a strange world to live in if 50% of military spending was paid for by just 5% of the population?...
and
http://www.military.com/daily-news/us-military-budget
US MILITARY BUDGET FOR FISCAL YEAR 2018
The White House in May requested a $4 trillion federal budget for fiscal 2018, which begins Oct. 1. The spending plan is notable because it marks President Donald Trump's first budget request and calls for boosting defense spending by cutting domestic programs...
The US military does not check source code, on purpose. What the fuckity fuck fuck; the mind boggles at the stupidity. Not only should the source code be provided, the compiler source code should be provided too, the checked source compiled with it, and the resulting program used on that basis.
I can't believe they would be so stupid as to use unchecked code in a critical location, but then again NASA did manage to crash a spacecraft by mixing up feet and metres. Only arrogant Americans so readily take pride in their ignorance.
The software used to detect tampering has been sold to a country that may well be doing the tampering. This software is off-the-shelf stuff that a private company can sell to anyone it wants to and the US govt. has no control over it. Nor do they vet this software, unlike the Russian spies who now have access to it.
But then, IBM happily sold technology to the Nazis, and Ford and Standard Oil sold their products, too. Plus ça change.
> But then, when it comes to security, open source isn't much better.
At least open source software does not have the problem that revealing the source code can cause a security compromise, because it is revealed all the time! This, in my opinion, makes it intrinsically the more secure option.
That presumes that qualified people are actually looking at the source code for reasons other than availability. Last century, Borland released a very popular piece of database software to the public. About six months later someone actually looked at the code and discovered hard-coded backdoor credentials. Stuff never changes.
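For what it's worth, that sounds like Borland's InterBase, open-sourced in 2000 and found shortly afterwards to contain a hard-coded account. Crude hunts for that sort of thing are easy to sketch; this toy regex and the sample "source" are illustrative only, and real secret scanners are far more thorough:

```python
import re

# Toy scan for hard-coded credentials: flag string literals assigned to
# suspicious variable names. Purely a sketch of the idea.
SUSPICIOUS = re.compile(
    r'(?i)\b(user(name)?|pass(word|wd)?|secret|api_?key)\b\s*=\s*"([^"]+)"'
)

# Invented snippet, echoing InterBase's infamous credential pair.
source = '''
user = "politically"
password = "correct"
'''

hits = [(m.group(1), m.group(4)) for m in SUSPICIOUS.finditer(source)]
print(hits)  # -> [('user', 'politically'), ('password', 'correct')]
```

Of course, a backdoor author who expects this kind of grep will just obfuscate the strings, which is exactly why it took a human reading the code to find it.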
I don't see the point of the examination at all. It could have been done by a competent reviewer, but there's nothing to say that this code is what makes the final executable. If the code never left the building, then they can't prove back-doors haven't been added after the review...
It's not hard to build a system that can give you sane assurances about this, all the way down to the hardware level. That's the joy of reproducible cross-compiles. You can take two completely independent systems - say, a KOMDIV-64 built in Russia on the MIPS architecture and a Core i7 - cross-compile the same source to x86_64 binaries on Linux on both, and compare the resulting binaries. If they match, nobody can credibly assert that either toolchain is compromised where compilation is concerned (beyond the fact that compromising both identically isn't really worth anybody's time). Debian's ongoing reproducible-builds effort could accidentally disprove (or indeed prove) the assertion that an architecture is compromised in this respect.
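The comparison step is trivial once builds really are reproducible; here's a minimal sketch, with made-up artifact paths and contents standing in for the two independently built binaries:

```python
import hashlib
import os
import tempfile

def digest(path):
    """SHA-256 of a build artifact, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Stand-ins for the same source cross-compiled to x86_64 on two wholly
# independent hosts (a Russian KOMDIV-64 box and a Core i7, say).
tmp = tempfile.mkdtemp()
for name in ("from-komdiv64", "from-core-i7"):
    with open(os.path.join(tmp, name), "wb") as f:
        f.write(b"\x7fELF-identical-reproducible-output")

same = digest(os.path.join(tmp, "from-komdiv64")) == \
       digest(os.path.join(tmp, "from-core-i7"))
print("bit-identical" if same else "mismatch: a toolchain meddled")
```

The hard part isn't the hashing, obviously; it's making the build deterministic (timestamps, build paths, link order) so a mismatch actually means something.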
The real issue (threat) is compromise of the source itself. Even large open source projects aren't super difficult to infiltrate if you're a competent developer; somebody who has done that could potentially drop something into a toolchain and maybe nobody spots it. That's the real sideways threat that Heartbleed alluded to. But yes, you can prove binary sanity throughout a system by using another system that's built completely independently - even better, one built in a country that's deeply paranoid about the one you're testing.
when it comes to security, open source isn't much better
Not entirely sure what you're trying to claim here, but you're missing key facts. First, Heartbleed was open source working the way it's supposed to. A security researcher discovers a flaw by analysing the code. The researcher notifies the developer; it's patched and fixes are pushed. When that's all sorted, a public information campaign takes place to alert sysadmins that there's a critical vulnerability and that people shouldn't screw around with it. If people don't patch their systems in that environment, that isn't a flaw of the open source model; it's a flaw of sysadminery.
You simply can't do a comparative analysis of open source versus closed source; there's no data to tell us how many new vulnerabilities would turn up in closed-source systems were a comparable number of researchers given access to their code. What we do know is that people have absolutely no problem finding critical flaws in, say, the Windows code, basically constantly.
These days "closed source" often includes a pile of "open source" libraries anyway, so you get the same vulns as open source, plus an extra delay as the vendor incorporates the updates and redistributes their software. FWIW there is a metric crap-ton of *very out of date* vulnerable Open Source software incorporated into every single vendor product I've looked at as part of my day job so far. It's a royal ballache. :(
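A crude version of that day-job check (grepping a vendor blob for embedded open-source version banners, the way software-composition-analysis tools do) might look like this sketch; the blob contents are invented, and OpenSSL 1.0.1f happens to be the Heartbleed-era release:

```python
import re

# Invented vendor binary contents: products routinely embed the version
# banners of the open-source libraries compiled into them.
blob = b"...\x00OpenSSL 1.0.1f 6 Jan 2014\x00...\x00zlib 1.2.3\x00..."

# Match a couple of well-known banner formats; real SCA tools carry
# large databases of these fingerprints.
found = re.findall(rb"(OpenSSL [0-9][0-9a-z.]*|zlib [0-9.]+)", blob)
print([v.decode() for v in found])  # -> ['OpenSSL 1.0.1f', 'zlib 1.2.3']
```

Cross-reference the versions you find against the CVE database and you usually have your vulnerability list without ever seeing the vendor's own code.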
Not entirely sure what you're trying to claim here, but you're missing key facts. First, Heartbleed was open source working the way it's supposed to. A security researcher discovers a flaw by analysing the code. The researcher notifies the developer; it's patched and fixes are pushed.
In each of the stories I linked (Heartbleed was just one of three different stories), the flaw had been deployed in the wild for several years before someone noticed it.
Or at least, before someone officially noticed and publicised it. Who knows how many bad actors had noticed it previously and kept it to themselves?
There's this myth that "many eyes" make open source more secure, and it's just not true. It turns out that about 99.8% of the people who use open source software never actually look at the source, so it might as well be closed as far as they're concerned. And even if you do review the code yourself, you're very likely to miss the flaws, because they're hard to spot; that's how they got there in the first place.
had been deployed in the wild for several years before someone noticed it.
Of course, but it's the wrong argument. The question is security of what's deployed. Who knows what monsters are in closed source code. The reason that HB lasted so long is that it's a difficult one to spot even when you have the source code in front of you.
You can't pick on Heartbleed, and nor can you pick on Microsoft, and say "these are reference examples useful for statistical analysis". I'm not saying open source code is more secure; I'm saying it's impossible to determine which is more secure without measuring something like who has more unpatched critical exploits against their systems, or who's making it hard for sysadmins to get those patches out to their servers. Neither of those is an actual indicator of underlying code quality, though. The key question is: if you have a closed-source system and a hostile actor sees the code, how deep is the doodoo you're swimming in? Just above your head, or not?
The software used to detect tampering has been sold to a country that may well be doing the tampering. This software is off-the-shelf stuff that a private company can sell to anyone it wants to and the US govt. has no control over it. Nor do they vet this software, unlike the Russian spies who now have access to it.
I just happen to be a dual national of both a close ally of the US and Russia. I also lived about half of my life in each place, so I believe, perhaps misguidedly, that I can understand and appreciate the thinking going on on both sides.
The really funny bit is that I can swap "the US" and "Russia" labels in the quoted statement above, and it still makes perfect sense if I also swap the side of the fence I am looking from. The Americans have legitimate security concerns, including serious, and justified worries about Russians hacking into critical systems. Guess what? Russians also have legitimate security concerns, including serious, and justified worries about the Americans hacking into critical systems.
Going by the past actions of both Russia and the US, neither side has the moral high ground here either - both countries have chosen to use their capabilities mostly to destabilize and cheat their competitors and allies alike. In the long run, both countries would be much better off if they at least tried to cooperate instead of incessantly bickering with each other while the rest of the world moves on to solve the real problems facing all of us.
No, you didn't understand this right.
The software used to detect tampering has been sold by a corporation based in a country that may well be doing the tampering.
The Russians quite reasonably insisted on being allowed to inspect it to find any little secret bits that the US government had insisted on putting in. Not that that would help them at all, since they can't be sure that the source code they check is the same as the source code that's used to create the binaries. Or the tools to create them.
The only long-term satisfactory solution will be for Russia, China, Iran, Venezuela, and all other nations that reasonably fear US tampering - oh wait a minute, that's all nations - to stop using US software altogether.
"Not that that would help them at all, since they can't be sure that the source code they check is the same as the source code that's used to create the binaries. Or the tools to create them."
Actually you can in any sane build system - if the binary matches your own build, it's the same code, UNLESS the compilers or libraries have a hidden trust issue (à la Ken Thompson).
And for the second point you can build open-source compiler tools independently using differing compilation tools, so unless someone managed to infect every available compiler in such a subtle manner, you can verify that side as well.
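The "build the compiler with unrelated tools and compare" idea is David A. Wheeler's Diverse Double-Compiling, the standard countermeasure to Thompson's trusting-trust attack. A toy model of the comparison (no real compilers involved, purely illustrative):

```python
# Toy model of Diverse Double-Compiling (DDC). Rebuild compiler A's
# source with an unrelated trusted compiler T, then use that result to
# compile A's source again. If the final binary is bit-identical to the
# shipped one, the shipped compiler faithfully matches its source.

def compile_with(compiler, source):
    """An honest toy 'compiler': its output depends only on the source,
    regardless of which (honest) compiler does the compiling."""
    return f"binary<{source}>"

shipped = compile_with("vendor-cc", "source-of-cc")   # binary you were given
stage1  = compile_with("trusted-cc", "source-of-cc")  # rebuilt with unrelated compiler
stage2  = compile_with(stage1, "source-of-cc")        # stage1 rebuilds the source

print("match" if stage2 == shipped else "MISMATCH: possible trap door")
```

A Thompson-style trap door in the vendor compiler would propagate itself into `shipped` but not into `stage2`, so the comparison catches it - provided the two compilers really are independent.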
One piece of advice: you could have tried a search for "IBM and the nazi"; Google gives you 422 000 results for that. That doesn't prove anything, but it will provide some proper links (try "Microsoft and the nazi" and you get 787 000 results).
Let's not mix the past with the present. I, for instance, recently drove a Ford despite the fact that Henry Ford sent gifts to Hitler on his birthday, and I never accused the guys at the garage of having Nazi sympathies because of him.
That's when Symantec (Norton) and other AV vendors gave China all of their virus collections in order to gain access to the Chinese markets. I think Symantec turned over some 2,000 viruses.
Ahh, for the good old days when an AV definition update fit on a single floppy disk.
We pay Russia to ferry our astronauts back and forth from the ISS. Why? Because we would rather add another 70 billion dollars to an already bloated military budget than spend what is necessary to do the transport ourselves.
But Russia is otherwise our enemy? Really? Why does our government need enemies to fight, except for profit? Don't get me started on the Middle East.
China makes too much of our stuff, so we don't want to upset them too much.
Good rant.
IRL it's $70 million, and it's per seat.
At least part of that's down to the very cautious nature of the NASA Aerospace Safety Advisory Panel (or ASAP, whose name is deeply ironic given what their "advice" usually does to schedules), who said "Nix" to powered landing for Dragon 2.
At some point it will become apparent that SpaceX (and even Boeing) will launch humans into LEO with capsules they mostly designed themselves, while Orion is still years away from its first crewed flight.
Let me get this straight: a private company responsible for national-security-level software actually let a foreign power (which one is irrelevant) view the source code for that software, and nobody is in jail for treason?!?
So the rules only apply for lone sysadmins, then ?
Sheesh.
So, here's my concern. Everyone is talking about vulnerabilities in the code of a log monitoring package.
I'd be as concerned about someone spotting a way to evade its monitoring during an attack, and that information falling into the hands of my attacker.
Would that be a vulnerability in HP management-speak? No, not really. Would the other side even need to cough up that they'd spotted it if they can implement workarounds locally?
Pretty sure the US gov would be analysing the software; they probably already know where it has flaws and are interestedly watching who else buys it, ready to add it to the list of toys their cyber-ops teams can use against that entity!
And then the microcode of the CPU ....
Unless you have complete control - and a way of verifying that what a design says is what you actually have - then there's a risk that something is compromised all the way down.
There are techniques to deal with them. But they cost money .....
THIS IS wholly in line with the Orange Dotard's policy of détente with Putin.
After all, The Dotard famously gave Israeli spy information to the Russian ambassador in THE OVAL OFFICE. And his erstwhile National Security Advisor FLYNN was busy visiting Russia as well as making calls.