Drupal is playing down estimates that more than 100,000 websites are still vulnerable to months-old critical security flaws in its content management system. The developer said Thursday that reports from earlier this week claiming tens of thousands of sites were not patched with version 7.58, and thus were vulnerable to an …
His testing methodology is truly flawed. By accessing a static text file he is NOT testing the version of the CMS; he isn't even confirming that the sites still run Drupal at all. By continuing to stand by those numbers after the flaws in his testing have been pointed out, he is undermining his own reputation.
We do web hosting for other companies. When I checked our servers to see whether we were hosting any vulnerable sites, I found a number of copies of the Drupal changelog on sites that weren't even running Drupal. Some were left over because the site ran Drupal at some point in the past and nobody did a good job of cleaning up after migration; others were static HTML created by the client scraping their old Drupal site. All of them would show up as false positives in his testing.
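The false-positive problem described above is easy to illustrate: a scanner that only fetches CHANGELOG.txt learns nothing about whether Drupal actually serves the site. A minimal sketch (function names are mine, not from any real scanner) that cross-checks the X-Generator response header Drupal 7 emits by default before trusting the changelog:

```python
import re

# Drupal 7's CHANGELOG.txt starts each release section with a line like:
#   Drupal 7.58, 2018-03-28
VERSION_RE = re.compile(r"^Drupal (7\.\d+),", re.MULTILINE)

def changelog_version(changelog_text):
    """Return the newest version listed in a CHANGELOG.txt body, or None.

    Releases are listed newest-first, so the first match is the version.
    """
    m = VERSION_RE.search(changelog_text)
    return m.group(1) if m else None

def probably_runs_drupal(headers):
    """Cross-check: Drupal 7 sends 'X-Generator: Drupal 7 (http://drupal.org)'
    by default. A leftover CHANGELOG.txt on a static or migrated site won't
    come with that header."""
    return "drupal" in headers.get("X-Generator", "").lower()

def classify(headers, changelog_text):
    """Return a version string only when both signals agree; otherwise None,
    meaning the site should not be counted as a confirmed Drupal install."""
    version = changelog_version(changelog_text)
    if version is None or not probably_runs_drupal(headers):
        return None
    return version
```

With this kind of cross-check, the migrated and scraped sites in the parent post would return None instead of being counted as vulnerable Drupal installs.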
So, are we saying the Drupliputians used the patch to introduce a new bug?
CHANGELOG.txt is the single file in a Drupal installation that accurately reports the version of the software being used. The system itself does not provide another means to determine what version is installed, which is critical information for disaster recovery, upgrades, and collaborative coding.
Considering the distributed development environments employed by most teams, it's easy to see how someone could check out a version of the code from git that does not have the patch applied and accidentally push it to production, with no way to confirm what's in place.
This is yet another really sloppy mistake. The Drupal team should be raising awareness of their oversight, not playing it down.
There's also version = "X.YY" in modules/system/system.info, but yeah.
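If those .info files happen to be web-readable (many server configurations block them), the same idea applies there. A rough sketch, assuming the packaged-release format where a line like version = "7.58" is stamped into modules/system/system.info:

```python
import re

def info_version(info_text):
    """Extract the version from a Drupal .info file body.

    Packaged Drupal releases stamp core .info files (e.g.
    modules/system/system.info) with a line like: version = "7.58"
    """
    m = re.search(r'^version\s*=\s*"([^"]+)"', info_text, re.MULTILINE)
    return m.group(1) if m else None
```

Note that a site built from a plain git checkout rather than a release tarball may lack the stamped version line entirely, in which case this returns None.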
Much fun was had here doing basically a find / to identify all the culprits. And yes, approx. 50% weren't even configured in the web server anymore, and stuff like that, but frankly, at that point that was the users' problem.
I don't upload the changelog file to my drupal site - the version can be obtained from the code itself, but of course not with a simple unauthenticated request.
Probably the 200K sites that couldn't be tested do the same. So the numbers suggest there are probably a large number of unpatched sites, but it's difficult to tell the real percentage.
If the version cannot be determined from the scan performed, it is most likely because the owners have taken measures to prevent access, which can reasonably be used as a proxy for their security posture as a whole. Those showing as current can be assumed to be current. Those that show otherwise, even if the measurement is indirect, can be assumed to have some flaws, based on poor housekeeping if nothing else. Those machines will serve as a good starting point for attacks based on flaws that have been known for months. I would stand by those results, too.
Perhaps a better way for Drupal to protect their reputation is to send messages to their customers letting them know the results of scans of their web sites and otherwise raise awareness of the need to patch rather than trying to deflect blame. Are they doing that? Probably not.
Although to be fair to WordPress, the core WordPress code, as horrible as it is, is pretty secure and updated regularly to ensure this continues. It's the thousands of really poorly implemented, barely supported/supportable plugins that are the most serious security issue with WordPress.
Of course, if updates are not applied...
Those with ops experience, help me out. I've always assumed that a cheap way to gain a bit of security would be to false-flag whatever platform is being used to build the website. Using Drupal? Make it pretend to be Rails. Using Rails? Make it pretend to be Drupal.
Or is that actually not so cheap?
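Whether it's cheap depends on how many signals you have to mask; a Drupal 7 site leaks its identity through several channels, not just one header. A sketch of the common giveaways (the list is illustrative, not exhaustive, and the helper is my own naming) showing what a convincing false flag would have to cover:

```python
# Signals a fingerprinting scanner can use to identify Drupal 7.
# A convincing disguise would need to alter or block every one of them.
DRUPAL_SIGNALS = [
    ("header", "X-Generator"),          # "Drupal 7 (http://drupal.org)" by default
    ("path", "/CHANGELOG.txt"),         # version history shipped in the tarball
    ("path", "/misc/drupal.js"),        # core JavaScript at a well-known path
    ("path", "/sites/default/"),        # default site directory layout
    ("cookie", "SESS"),                 # Drupal's session cookie name prefix
    ("html", 'name="Generator" content="Drupal'),  # meta tag in rendered pages
]

def uncovered_signals(masked):
    """Given the signals you've already masked, report what still gives you away."""
    masked_set = set(masked)
    return [s for s in DRUPAL_SIGNALS if s not in masked_set]
```

So hiding the X-Generator header alone is cheap, but it still leaves the well-known paths, cookie names, and markup to deal with; and actively impersonating a different platform on top of that is more work again.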