re: I think music studios can get royalties from legit. videos on every play
For a given minuscule value of "royalties".....
4138 publicly visible posts • joined 11 May 2007
A cookie belongs to a domain and is sent automatically with every request the browser makes to that domain.
localStorage also belongs to a domain but is never sent with requests. It is available to code served from the domain, which gives a site a way of accumulating state without bulking up its HTTP requests.
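A minimal sketch of the difference (the in-memory localStorage stand-in is mine so the snippet runs outside a browser; in a real page the browser supplies it per origin):

```javascript
// Illustrative stub: in a browser, localStorage is provided by the page's
// origin. This stand-in just lets the sketch run anywhere.
const localStorage = {
  _data: {},
  setItem(key, value) { this._data[key] = String(value); },
  getItem(key) { return key in this._data ? this._data[key] : null; },
};

// State accumulates on the client. Unlike a cookie, none of it is attached
// to subsequent HTTP requests back to the domain.
const visits = 1 + Number(localStorage.getItem('visits') || 0);
localStorage.setItem('visits', visits);
console.log(localStorage.getItem('visits')); // prints 1
```

A cookie set via document.cookie, by contrast, rides along in the Cookie header of every request to that domain, whether the page's scripts need it or not.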
You're comparing oranges with nothing.
If computing uses 10% of global electricity when everyone runs their own racks and 6% when we all share cloud resources, then cloud is more efficient. Without that comparison your point is pointless.
Not really. It stops them running code in your browser to track you, but it doesn't block all HTTP requests, so tracking is still a distinct possibility. Browser sniffing and other fingerprinting techniques allow servers to single out user agents without running scripts.
NoScript is awesome, but I suggest adding uBlock Origin or something similar to stop your browser from connecting to those endpoints in the first place.
I can put together a Node app using npm and I have to depend on other people's code, like I do when I work in any language. I have a tool that pulls the dependencies in and reports known vulnerabilities to me. It's my problem: I control the payload that's delivered to the customer and can audit as I see fit.
When I visit theregister.com, for example, it tries to pull in scripts from theregister.com, doubleclick.net, google-analytics.com and jwplayer.com. (I have a sneaking suspicion that if I allowed those I'd get some more domains listed, but the site works fine without them, so they don't get run on my machine.)
How does a site developer take responsibility for the scripts delivered by other domains? I can audit the code I pulled from npm but I have no control over what a third party domain serves. How could I?
So the much more fundamental issue with modern JS development isn't that we build using code from lots of people we can't trust, it's that we build services that pull code from domains we don't control. In that situation we can never audit the code and be confident it is secure.
npm automatically checks dependencies for known vulnerabilities when it's installing. If it finds vulnerable packages it prompts you to examine them (with "npm audit") and to try fixing them (with "npm audit fix").
The problem isn't that it's difficult to check for vulnerabilities, it's that it's easier not to.
There are two little ridges on the F and J keys. If you put the first finger of your left hand on F and the first finger of your right hand on J, you can reach all the keys without moving your hands. It's called touch typing. It's easy to get a computer to teach you; that's how I learnt (though I learnt on a dedicated word processor with an A4-sized green screen!)
The people responsible for ALL the buffer overflow vulnerabilities in the past decade? Oh, yes please!
None of you fuckers can do a good enough job. At least the Go programmers realise there's a problem that needs addressing.
Now is the time to look around and see who is naive enough to fall for the "Microsoft are still the problem" nonsense again.
And then you know exactly whose opinion to completely disregard in future.
See? There is a good side to all this. It is an entry-level "stuck in the past" test.
But they haven't: the old hub client is carrying on, so its behaviour doesn't change. The new one is GitHub specific, so if you use GitHub features you can interact with them from the command line.
The alternative was to change an existing, generic client into one that specifically worked with GitHub.
But it really doesn't matter what they do, does it? You can't let it go; even while Facebook, Twitter and Google are destroying functioning democracies around the world you have to keep carping on about MS.