Re: HOWTO: hack their voting machines
"It is usual to provide some link to credible evidence when making serious allegations."

Not in Trumpistan it isn't.
I doubt that detonation would really matter all that much; it's far too much trouble for the impact gained. It all comes down to fear and disruption, as in money (to clean up the mess). It would be much more effective, and easier, to deliberately sail a large transport ship at high speed* into docks and other vessels, damaging them and leaving a very large wreck (a navigational hazard) in the way at the same time.
* Relatively high speed, that is, but even a few knots for something with the mass of these ships is a lot of collision energy.
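As a back-of-the-envelope check (kinetic energy E = ½mv², with a tonnage and speed that are purely my own illustrative assumptions):

# Illustrative figures only: a large transport ship of ~100,000 tonnes at 5 knots.
mass_kg = 100_000 * 1_000
speed_mps = 5 * 0.5144             # knots to metres per second

energy_j = 0.5 * mass_kg * speed_mps ** 2
print(f"{energy_j / 1e6:.0f} MJ")  # ~330 MJ, roughly the energy of 80 kg of TNT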
I'm not a battery specialist, but my understanding of lead-acid batteries is that, compared to the alternatives, they are chemically quite stable, relatively safe by way of the chemicals used, and cope well with trickle charging. Another of the major differences between lead-acid and other batteries is that they can deliver a very high starting output, a trait that's useful for their automotive usage but also for being continually charged and discharged.
The continual charge and discharge factor is important because of the two key types of UPSes: online models, where the output runs through the batteries all the time, and standby models, where the battery is only switched in on a power event. The latter is cheaper, but there is always a momentary drop in power during a switch event, and the power passed through is not guaranteed to be clean, particularly during a power event. Running everything through the battery ensures clean power all the time, which can be an important factor for sensitive equipment.
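As a toy sketch of the difference (entirely my own illustration, with made-up timings), during a mains failure the standby topology's output shows a gap for the duration of its transfer time, while the online topology never exposes the event at all:

MAINS_FAILS_AT_MS = 10
TRANSFER_TIME_MS = 8   # a standby relay switchover typically takes a few milliseconds

def standby_output(t_ms: int) -> float:
    """Standby UPS: passes mains through until a failure, then gaps while switching."""
    if t_ms < MAINS_FAILS_AT_MS:
        return 230.0   # mains passed straight through (cleanliness not guaranteed)
    if t_ms < MAINS_FAILS_AT_MS + TRANSFER_TIME_MS:
        return 0.0     # the momentary drop during the switch event
    return 230.0       # inverter now carrying the load

def online_output(t_ms: int) -> float:
    """Online UPS: the inverter feeds the load all the time, so no gap."""
    return 230.0

for t in range(0, 25, 5):
    print(f"t={t}ms standby={standby_output(t)}V online={online_output(t)}V")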
It's still possible to hit the same problem when using non-American characters in passwords with various online services, such as Google's and Microsoft's: being written by Americans, often with little knowledge that the rest of the world exists, they treat even passwords as being in American, and therefore fail when they aren't.
For example, use the pound symbol (£) as part of your password and enjoy various random Microsoft and Google services being unable to authenticate you.
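A minimal sketch of the kind of bug that causes this (my own illustration, not any specific Microsoft or Google code path): hash the same password after a UTF-8 encode and after a legacy single-byte encode and the digests differ, so authentication fails for no visible reason:

import hashlib

password = "pa££word"

utf8_digest = hashlib.sha256(password.encode("utf-8")).hexdigest()
latin1_digest = hashlib.sha256(password.encode("latin-1")).hexdigest()

# False: '£' is the two bytes 0xC2 0xA3 in UTF-8 but the single byte 0xA3 in Latin-1
print(utf8_digest == latin1_digest)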
"Excel formulas are the world's most widely used programming language, yet one of the more basic principles in programming has been missing, and that is the ability to use the formula language to define your own re-usable functions,"Seriously? What would be much better is any form of decent inspection functionality, as in the capacity to inspect and debug the bloody formulas rather than treat them like a black box of doom.
It would also be nice if Microsoft fixed the damn bug that's existed for, I don't know, a couple of decades at least, where Excel randomly decides that a formula in a cell is just text rather than a formula, and it's a dead-chicken-waving moment to try to convince Excel that the contents of the cell are a formula and not text... without wiping out every bit of metadata about the cell.
Yes, it's possible to inflict some disastrous rubbish in VBA... but only if one doesn't want security or maintainability. It would be so much better if most of the junk in VBA were available as vaguely sane built-in functions, although that's just a fantasy given the way Excel functionality is implemented and expanded upon.
The best system I saw was one where, the first time a system/user tried to access potentially NSFW content, the user was informed that they were welcome to proceed, but that all such accesses were logged and reviewed, and inappropriate access would be dealt with.
It was a sensible, pragmatic approach.
I don't remember the dates, but while it's all too common to think of North America as having been dominated by (largely) nomadic tribes, there were also quite a few cities of good size and sophistication for the time. The demise of most of these tends to be put down to environmental causes, but some just disappeared almost overnight, which suggests violence was involved.
I had a junior programmer whose take on compiler warnings was to disable them. Needless to say, the quality and stability of his code were appalling.
It took me a while to get him to fix all of his code, listening to all his (stupid) complaints along the way, such as whether it really matters if a variable is initialised before use as long as the code (sort of) works.
Microsoft are the Data Controller because they are deciding what to do with the data and dictating what data is stored.
It's a standard Data Controller vs Data Processor issue: many people think that the Data Controller must be the originating entity, but they don't have to be.
The data controller determines the purposes for which and the means by which personal data is processed.
For example, if your organisation uses an external accounting organisation for payroll, then the external accounting organisation is the Data Controller. While this may seem contrary to how many people read the act (although in truth most people plainly haven't read it), the external accounting organisation is dictating the information that is required and how it will be processed; your organisation cannot pick and choose the information provided. Ownership of the data and responsibility for its accuracy are usually contractually deferred to your organisation, and while this is fair enough, it does not change the fact that both parties are legally responsible for the data. Many try to take the simplistic route that every client of such an accounting organisation is the Data Controller and the accounting organisation is the Data Processor, but that would require every client to draw up their own Data Controller agreement, and the accountants to check each one through in detail and agree to it. A relationship where the service provider is the Data Controller and the client is the Data Processor is in fact quite common, and while people get hung up on choice, the fact that the client chooses to use a particular accountancy firm does not make the client the Data Controller.
In many ways the hard distinction between Data Controller and Data Processor is quite unhelpful, as real interactions do not work that way. It would have been preferable to consider more strongly the origin of the data and the passing on of that data, rather than going through the semantic wrangling of Data Controller vs Data Processor, where if one is not careful almost everyone is a Data Controller. It's also worth noting that for any given dataset, a single organisation may be both the Data Controller and the Data Processor, often multiple times over.
Had a similar kind of issue with staff moving their desks around and a CRT monitor stopping working. Sensibly, they'd lifted the monitor holding onto the bottom of it (rather than just the stand, or pushing it) and in doing so had rolled the control wheel for the contrast or brightness all the way down. Cue a panicked call from the user that they'd broken it...
I suffered with one of those too. A new system was purchased for the CEO, a slightly upmarket model compared to the usual workstations. Not a problem, nothing too excessive. Until I got the embarrassed call that he couldn't work out how to turn the damn thing on... I turned up, sat in front of it, and struggled to find the power switch as well. It was black, with no signage at all, on a black case, designed to look like part of the front styling. Once we'd both laughed about the stupidity of the design, we found a sticker and stuck it on to make the switch obvious in future.
Definitely. It was fairly easy to trigger for a user who couldn't press the mouse button without moving the mouse at the same time. I think there were settings in win.ini or system.ini (can't remember which, doesn't matter now, could even be control.ini) that allowed one to change the amount of mouse movement tolerated during a click. Increase the value and the system became much less likely to move icons around.
The more thorough fix was to either write-protect the offending file or have a safe copy of it copied over every time the system (Windows) started up.
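From memory, so treat this as an assumption rather than gospel, the relevant entries were the click-tolerance values in the [windows] section of win.ini, along these lines:

[windows]
; larger values = more movement tolerated between button press and release
DoubleClickHeight=8
DoubleClickWidth=8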
I'm not sure why it's totally unexpected, given that pressure has been one of the ways of making diamonds for some time now. The effort put into making them has only been matched by the diamond industry's (De Beers) efforts to identify manufactured diamonds and to dissuade people from making them in the first place.
Unfortunately this is the response of a lot of developers and is why there are so many appallingly mismatched user interfaces out there. Quite understandable of course.
Microsoft used to have a very good, well-thought-out and reasoned user interface style guide for Windows applications. Unfortunately it got shredded, set on fire and thrown out around the time they vomited out Windows Vista, whose utterly useless GUI ran counter to pretty much every good UI design paradigm out there.
Naturally, some parts of Microsoft adhered to the guide more than others; Microsoft AutoRoute, for example, never adhered to anything at all. And with Microsoft insisting that every new iteration of Microsoft Office use the GUI of the latest operating system, rather than the look and feel of the operating system it was actually installed on, they recreated huge swathes of the OS UI within their applications. This was made more ridiculous because Windows XP "skins" were an appalling hack that overdrew previously drawn components. Around this time Microsoft developers also saw the "wasted space" of the title bar and decided to start pissing around adding custom GUI components to it, with other applications following suit because they couldn't be seen to be less efficient with space than Microsoft Office. Since then, GUI rendering has been a precarious thing, as anyone running multiple monitors tends to find, with random ghost parts of the top of the screen becoming unusable... not as frequent as it used to be, but it still happens regularly.
"Because of its asynchronism it forces you to think about your code in a non-linear way, which is how modern computers work these days, as does a system of any complexity."

Oh, if only this were true. The number of abject horrors I've come across where supposedly experienced developers don't demonstrate even the first hint of an appreciation of concurrency, let alone of reducing their code's exposure to timing issues. Most so-called web application developers still seem to think that a web page is a modal system application and can be abused as such.
Not to single out JavaScript, though: I've seen the same utter lack of comprehension of concurrent processes in C++ and C# applications.
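A minimal sketch of the kind of timing bug in question (my own illustration, nothing to do with any specific application): two threads doing an unprotected read-modify-write on a shared counter will lose updates:

import threading

counter = 0

def worker():
    global counter
    for _ in range(100_000):
        tmp = counter       # read...
        counter = tmp + 1   # ...write: another thread may have updated in between

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # usually well short of the 400,000 a linear reading suggests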
Similarly here... I started with Commodore Basic on a Commodore 64, but was quickly frustrated not so much by the speed as by the lack of features. I tried implementing things in Basic but had to switch to assembly code. By the time I'd finished teaching myself, I'd pretty much disassembled the Basic ROM and the support library chip to see what was there that was interesting and how it was implemented. The "how it was implemented" part taught me a lot, even down to the storage of the Basic statement tokenisations and jump tables.
At first I didn't even have an assembler. I just output the statements to the screen using Basic, then read them with the help of the utterly indispensable Advanced Programming Guide that Commodore sold, as this included the entire 6510 instruction set, right down to the decimal values for each instruction. I learnt to write 6510 assembly in decimal!
Later I progressed to a machine code "monitor" and then, finally, to a proper assembler. The assembler saved a lot of sanity.
"At login time, the secure enclave will not perform an authentication operation using the private key unless you can first convince it that you are you - which might be via a PIN, or a fingerprint, or facial recognition. Those things never leave the device, and are never stored on or verified by the remote service."

From the point of view of the remote service, the user is logging in using a private key only. From the point of view of the local user, they are logging in "using a fingerprint, face recognition, or a PIN" as Microsoft says. But in reality, they couldn't log in with *just* those things; they also need the device which contains the private key.
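A minimal sketch of that challenge/response flow (my own illustration using the Python "cryptography" package, not Microsoft's actual implementation; the enclave's PIN/biometric gating is not modelled):

import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Enrolment: the device generates a key pair; only the PUBLIC key is
# registered with the remote service.
device_key = Ed25519PrivateKey.generate()
registered_public_key = device_key.public_key()

# Login: the service sends a random challenge...
challenge = os.urandom(32)

# ...the device signs it with the private key (which never leaves the device)...
signature = device_key.sign(challenge)

# ...and the service verifies the signature with the stored public key.
registered_public_key.verify(signature, challenge)  # raises InvalidSignature on mismatch
print("verified: no shared secret ever crossed the wire")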
You're missing the point. In order to unlock the agreed "secret" for later use, the user is unlocking it with a non-secret authentication component. In other words, what was secure is no longer secure.
Consider this:
There is a steel door secured by a very secure lock (the kind that "the lockpicking lawyer" would take more than a minute or two to open). The key for this door is stored in a safe. The safe is secured by a permutation (often misnamed a combination) code. This code cannot be changed, and it is written down on a sheet of paper hung on the wall near the safe.
In effect, how secure is this steel door? The lock itself is still a solid and very secure lock; unfortunately the key is not, therefore the door is not secure.
"His answer: Microsoft Authenticator, a mobile app for Android and iOS that allows users to login using a fingerprint, face recognition, or a PIN in lieu of a password and with an OTP for accounts that support that standard."

Oh FFS, the idiocy never stops, and this from someone at Microsoft who pretends, through his job title, to have a clue. Fingerprint and face recognition are NOT replacements for a password. They can replace or enhance an identifier, such as a user name or ID, but never a password. It is not feasible to change your fingerprints or face, or to keep them secret from others, therefore they can never be a replacement for the secret component of any form of authentication.
A hybrid cloud, with an onsite server to fail over to, might help those who are in production and don't want to idle the floor if the external network is down, though.
This is the aspect that many consultant-lemmings fail to see... use the technology appropriately, not just because it's the current bullshit-bingo buzzword. By all means move low-priority processes and external endpoint-based services to be hosted on someone else's computers, as this can make a lot of sense, although there is usually little in the way of real cost savings. For other things, such as business-critical databases and local processes that need to be available locally, it's often nothing but ridiculous to push them off to be operated on someone else's computers, and if it weren't for vendor price gouging on non-hosted licences there would be no question that keeping most such systems local saves money.
For the majority of organisations, the majority of their processes are fairly standard and can be bundled into the "one size fits all" approach that the purveyors of forced obsolescence and subscriptions like.
Unfortunately, the majority of organisations also have the odd existing, and often critical, business process that is non-standard, or at best industry-specific. These are the sticking points for the blinkered "push it to the cloud and hope it works" approaches. The old 90/10 rule.
and...
6. What devices are going to connect to these amazing, world-beating flying overlord base station replacements? Oh, phones. Tens of thousands of them. Many from a distance of many kilometres, with low-power antennas never designed to transmit over such distances (see the rough numbers below).
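A rough sanity check on the distance complaint, using the standard free-space path loss formula FSPL(dB) = 20·log10(d_km) + 20·log10(f_MHz) + 32.44 (the distances and frequency below are my own illustrative assumptions):

from math import log10

def fspl_db(d_km: float, f_mhz: float) -> float:
    """Free-space path loss in dB for a distance in km and frequency in MHz."""
    return 20 * log10(d_km) + 20 * log10(f_mhz) + 32.44

print(f"{fspl_db(2, 1800):.0f} dB")   # ~104 dB from a typical ~2 km cell
print(f"{fspl_db(20, 1800):.0f} dB")  # ~124 dB from 20 km up: 20 dB (100x) more loss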
If Gartner are doubtful about this, then it's probably the first thing that Gartner have ever got right (other than fleecing manglement all over the world with paid-for "reports").
I went straight from a job to a gig that I'd booked months before, completely forgetting that my techie tool bag consisted of a range of screwdrivers, the odd hammer, a spanner and so on - but somehow no "sharps". While there was no official bag storage facility, the entry staff took pity on me and allowed me to stow it with (near) the roadies' kit.