Chrome and uMatrix
Throw in Adblock and Privacy Badger and you're set.
Take out Chrome and use Lynx and you're really set.
Detectify security researcher Linus Särud has reported a weakness in popular Firefox security tool NoScript that allows attackers to have their malware whitelisted. The tool is used by some two million security-and-privacy-conscious folk who want to stop active content like JavaScript and Flash getting a foothold in their …
wget has a huge history of security vulnerabilities. Real programmers browse the web with telnet.
telnet www.cvedetails.com 80
GET /vulnerability-list/vendor_id-72/product_id-332/GNU-Wget.html HTTP/1.0
Host: www.cvedetails.com
<Enter>   <- press Enter again here: the blank line ends the request headers
For secure sites, instead of telnet use:
openssl s_client -connect <hostname>:443
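Putting the two halves together, a one-liner sketch (hostname and path are only examples, and -quiet just suppresses the handshake chatter):

```shell
# Same raw request as the telnet version, but over TLS.
# The trailing \r\n\r\n is the blank line that ends the headers.
printf 'GET / HTTP/1.0\r\nHost: www.cvedetails.com\r\n\r\n' |
  openssl s_client -quiet -connect www.cvedetails.com:443
```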
If I was shipping a new build of an app I wouldn't include some untested third party binaries. You can't guarantee the performance of your code if you include random untested bits.
So why do people link to Googleapis.com to get jquery or whatever instead of taking a known version and hosting it locally?
> Makes the page quicker to load. The Google copy is probably already in the cache from another site.
Technically you may be right, but the only reason I ever heard of googleapis in the first place was that it kept appearing in the 'waiting for...' bit of the browser status bar, just like every other site-slowing third party.
If having a script in the cache makes such a difference then the script is too big, especially if we now assume everyone has at least some passing semblance of broadband. Well over half a meg of script for a supposedly 'lean' page tells me the definition of 'lean' is not what it used to be.
Adding in untested updates saves maintenance hassle? I disagree. If you trust Google never to make a mistake then you're naive. More importantly, if the script actually has problems in areas you use, you'll have worked around them, and a fix to the underlying problem could break your workaround. And if the script has problems in areas you don't use, why do you need the updates? You won't use them.
So logically it's just for performance. There are much better ways of improving performance that don't involve trusting a third party with the core functionality of your app.
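For what it's worth, pinning a known version locally is a one-off job, and even if you do keep the CDN link, Subresource Integrity lets the browser reject a copy whose bytes differ from the ones you tested. A sketch (the version number and URL are only examples):

```shell
# Fetch one specific, tested release and host it yourself
# (jquery-3.7.1 is just an example version)
curl -sSO https://code.jquery.com/jquery-3.7.1.min.js

# If you must hot-link a CDN instead, compute an SRI hash of the
# tested file; the browser then refuses any copy that doesn't match.
openssl dgst -sha384 -binary jquery-3.7.1.min.js | openssl base64 -A
# The base64 output goes into the script tag's integrity="sha384-..." attribute
```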
You almost certainly do - unless you are writing directly to low level APIs you'll be using C runtime, MFC, C#, DirectX, ADO, ... all of which are third party binaries and all of which may be updated by an OS patch. Worse still, even if you do write directly to kernel/GDI APIs the entire OS may be upgraded!
OS X and Linux (not sure about BSD...) have ways to limit how libraries can change, so this is mostly a problem on Windows, but still...
Not really the same thing though, is it? Windows is the platform for rich client apps; a browser is the platform for a web app. The interface between me and the platform is strictly defined and controlled, and that's how I communicate with it. More to the point (about shipping code), I don't ship Windows with my rich client apps and I don't ship a browser with my web apps.
So what are you saying? Google will keep the scripts updated in line with browser updates that break compatibility with older browser versions? That would actually be useful, though not breaking compatibility would be more useful. I'm still going to have to retest and potentially reship my app. I'd just do that with the latest library scripts.
"you'll be using C runtime, MFC, C#, DirectX, ADO, ... all of which are third party binaries "
Nit-pick. These aren't third-party binaries. Simply by booting up a closed source OS, you've already handed over the keys to the kingdom to your OS vendor and these are from the same source.
The one packaged by Debian does not. It is not in the whitelist.
That particular list entry can be a pain in the a*** as that is the download domain for the Ajax libraries - most places pull from it instead of keeping a local copy. As a result a large percentage of websites break pretty badly. As there is no second-level whitelisting (allow only if pulled by this site), you end up whitelisting it (very grudgingly) anyway.
> Same here. When I install it, the first thing I do is clear the whitelist.
Ditto.
In fact, I seem to recall being surprised the first time I saw the whitelist prepopulated.
Which would seem to imply that:
i) there was once a time when it wasn't.
ii) I am, therefore, an old bastard.
I rechecked my whitelist options and no, nothing Google is in there anywhere.
Whenever I install NoScript, the first thing I do is remove the default whitelist. There is no such thing as security if you don't know what you're allowing.
NoScript is a tool, not a solution. Use it correctly and you're golden.
Especially if you use RequestPolicy/Continued.
7/10 times, allowing the cdn (and possibly the *static) is enough to see everything I need to - no need to enable /any/ scripts!
Chuck in Decentraleyes as well and you're sorted.
I'm taking some sort of adblocker and a cookie manager (like Cookie Monster and BetterPrivacy) for granted of course.
I guess they're okay, but I just add *.googleapis.com, google.com, etc. to my hosts file. Since I use other apps besides a browser, this way I don't have to worry so much about whitelists. To do this on Android, though, I think you need to root the system, but I don't use Android much, so I don't know for sure.
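For the curious, the hosts-file approach just null-routes the names. A sketch (these entries are only examples; note that hosts files don't support wildcards, so each subdomain needs its own line):

```
# /etc/hosts (or C:\Windows\System32\drivers\etc\hosts)
# 0.0.0.0 tends to fail faster than 127.0.0.1, since nothing is listening
0.0.0.0 ajax.googleapis.com
0.0.0.0 fonts.googleapis.com
0.0.0.0 www.google-analytics.com
```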
Of course I use startpage.com or duckduckgo for my searches, not google.com. I do occasionally (actually more than occasionally) run into sites that won't load - their loss (of revenue), not mine... I just move on to the next site.
"The researcher probed NoScript after fellow hacker Matthew Bryant (@IAmMandatory) found a host of disused default whitelisted domains and purchased one to successfully launch attacks that bypassed default installations."
The researcher found one disused domain, which had only been whitelisted on a user's recommendation and turned out to be a typo.
http://thehackerblog.com/the-noscript-misnomer-why-should-i-trust-vjs-zendcdn-net/
https://forums.informaction.com/viewtopic.php?f=10&t=17066