Re: Instilling new timeframes of thought in a world beset by faster/shorter.
If it ticks once a year, then I'm going to assume it's measuring sidereal time, not UTC.
5761 publicly visible posts • joined 20 Jul 2010
From the Wikipedia article on the Richter Scale:
"3.0–3.9 ... III to IV ... Often felt by people, but very rarely causes damage. Shaking of indoor objects can be noticeable. "
I would be absolutely astounded if a clock designed to last 10 ka (10,000 years) isn't designed to withstand shaking equivalent to a large truck driving past outside, let alone significantly more. Given the amount of effort put into it, I reckon they might have considered this one...
C# is certainly better than VB.net, but you need the correct .net runtime. A nightmare.
Do you? AFAIK, the current iteration (I think it is 4.8) is backwards compatible with anything compiled against an earlier version. IIRC, the last time there were any significant changes that might cause backward-compatibility issues was when it went from 3.5 to 4.0, which was well over a decade ago. And guess what? You can have them all installed side-by-side without any issues anyway.
There seems to be a lot of hate for .NET on here today. I get that MS have been guilty of some pretty sharp practices in the past, but the .NET framework and its languages do appear to be something they have got right. Complaints of massive exe sizes, poor performance and dependency hell really don't stand up to scrutiny.
.NET executables are not interpreted. Both C# and VB.NET compile to the same bytecode (CIL), which is JIT-compiled to native code at runtime. There is some runtime linking and JIT overhead, but for most use cases, the performance hit is nowhere near the order of magnitude that people like to make out, and unless your application is CPU-bound, you're unlikely to get any noticeable benefit from compiling down to native code using a fully-compiled language. There are plenty of use cases for that sort of thing (real-time embedded code, for example), but by-and-large, desktop applications are not one of them. If an application is slow, it's most likely due to poor programming, not the choice of language it is written in.
Oh really?
I've just done a release build of a Windows console app, written in C#, which comes in at 347KB, plus about 5MB for the libraries it uses (one of which is Entity Framework, which isn't exactly feature light).
Windows services aren't much more, technically, than a console app, with some different entry points. If I didn't need to do the data access, and logging, that 347KB would be all it needed to be.
Where are you getting that extra 150KB from?
Looking at another release we've got, there's a complete windows service, which is 253KB plus the same 5MB odd for libraries, so a standalone service (that doesn't do data access or logging) written in C# is significantly smaller than 600KB.
Your only valid point remains the requirement for the .NET Framework as the runtime environment. It's the nature of the beast for bytecode-compiled languages. You'll need to qualify your reasons for calling it a "horrible fucking kludge" - I take it you object to Java for the exact same reasons?
To be fair, a docx file isn't executable, it's a zip file full of XML files (you can prove this by renaming it as .zip and opening it).
The fact that some contents may be interpreted as commands and executed by Word is the fault of Word, not of the file format per se, and I believe later versions of Office products won't run macros by default.
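You can see the "zip full of XML" structure for yourself without renaming anything. The sketch below uses Python's zipfile module, and builds a minimal stand-in archive in memory (since there's no real document to hand); a file saved by Word has the same outer layout, just with rather more parts.

```python
import io
import zipfile

# Build a minimal .docx-shaped archive in memory; a real file saved by
# Word has the same outer structure (a zip containing XML "parts").
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("[Content_Types].xml", "<Types/>")
    z.writestr("word/document.xml", "<w:document/>")

# Listing the contents shows it's just a zip of XML files.
with zipfile.ZipFile(buf) as z:
    names = z.namelist()
print(names)
```

Point the same two lines of reading code at an actual .docx and you'll get a much longer list of XML parts, which is the whole demonstration.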
The old "more bacteria than a toilet seat" thing ignores the fact that most toilet seats get a clean at some point, and the idea is that you do your business through them, not on them. Pretty much anything that never gets sterilised (or cleaned with bathroom cleaner) is going to have more bacteria on it.
The key thing here is which bacteria in any case, not how many, and if I were to find, for example, more E. coli on a phone than on a toilet bowl, I would start being concerned.
I know you're joking, but due to expansion, there is only so far we can see across the universe. Because space expands uniformly, the further away something is, the faster it recedes from us (Hubble's law). There is a distance at which that recession speed reaches the speed of light, and beyond that we can see no further. Also, due to red-shifting, the further away something is, the longer the wavelength of the light coming from it. Hence the cosmic microwave background (which is the oldest, hence the furthest, thing we can see) being in the microwave region of the EM spectrum. If you could see microwaves, the night sky would indeed be pretty bright, from the echoes of the big bang.
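As a back-of-the-envelope illustration of that distance limit (taking an assumed round value of H0 = 70 km/s/Mpc, and glossing over the subtleties of an accelerating expansion), the distance at which Hubble-flow recession reaches the speed of light works out to roughly 14 billion light years:

```python
# Rough Hubble-radius calculation: the distance d at which v = H0 * d
# equals the speed of light. H0 here is an assumed round number.
C_KM_S = 299_792.458     # speed of light, km/s
H0 = 70.0                # Hubble constant, km/s per megaparsec (assumed)
LY_PER_MPC = 3.262e6     # light years in one megaparsec

hubble_radius_mpc = C_KM_S / H0
hubble_radius_gly = hubble_radius_mpc * LY_PER_MPC / 1e9
print(f"{hubble_radius_gly:.1f} billion light years")  # roughly 14
```

The true observable-universe radius is larger (about 46 billion light years) once you account for the expansion that happened while the light was in flight, but the order of magnitude is the point here.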
I will happily rewrite that working software for you, from scratch.
You provide the requirements, and user stories, I'll put together the spec, and cost it up for you.
Specification time is chargeable in advance.
What do you mean, you want me to do it "for exposure"? What am I? A jobbing musician (who also won't work for free, but is used to this sort of thing)?
No, your vendor or supplier will not get the new sale, another vendor or supplier who sells gear that is patchable and has a reasonable length support lifetime will get the new sale, and you will get a lesson in why the cheapest option is not always the best.
If your hardware uses TLS 1.1 (or worse, TLS 1.0, or even SSLv3) and they haven't provided a firmware upgrade to fix it (and it is a fix, because those are vulnerabilities) then you should seriously be considering replacing that hardware, especially if it is in an enterprise environment. Attacks can, and do, come from within the corporate environment, and if you're using unsupported gear (such as switches, firewall devices, et al) then you have a vulnerability that should be fixed. End of.
Going from memory, the weakness in TLS 1.1 is currently theoretical, but could become a real threat in 5-10 years; the flaw in TLS 1.0 is exploitable by a determined attacker to e.g. hijack SSL sessions; and SSLv3 is practically exploitable (POODLE) with a Raspberry Pi.
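On the client side, modern TLS libraries let you refuse the legacy protocols outright. A minimal sketch using Python's standard ssl module (one library among many; the same knob exists in most TLS stacks):

```python
import ssl

# A client context that refuses SSLv3, TLS 1.0 and TLS 1.1 by raising
# the protocol floor to TLS 1.2 (Python 3.7+ API).
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
print(ctx.minimum_version)
```

Any kit that can only speak the older protocols will simply fail the handshake against a context like this, which is a quick way to find out how much of your estate is affected.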
You don't work for Union Carbide do you?
With girls it's even harder, because her surname could change to almost anything after she marries.
Or, you know, it might not, because these days, when a woman marries, she doesn't become her husband's chattel.
I know of at least one man who took his wife's surname, and when I married, my wife kept her name (which she had previously changed to her mother's maiden name for other reasons).
But hey, let people do whatever they want to do. However, the expectation that a woman should take her husband's name when she marries is a relic of a cultural attitude towards women that doesn't have a place in modern society. Well, shouldn't, anyway.
I got some cheapo "MPOW" ones for $seasonal_holiday; they seem to work reasonably well, once I managed to get them to pair with the PC (it turns out that if you buy a Bluetooth dongle, having forgotten about the one you plugged into a USB socket round the back a few years ago, Windows gets somewhat confused).
...and there's nothing to stop you wrapping a C library in a DLL and calling it from C# either. For some problems, that might even be the right solution. i.e. write the non-performance critical parts of your code in a high-level language with all its useful features like type safety, and write the critical parts in C / C++ and call them from the managed code.
If you're using a high-level language that doesn't have the ability to call external functions in low-level languages, then you're either using a language that isn't feature-complete, or one that deliberately doesn't allow it (probably for security reasons).
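The same idea can be sketched with Python's ctypes, which is that language's equivalent of C#'s [DllImport] mechanism: a managed, high-level language calling straight into a compiled C library (here libm; the fallback library name is an assumption about a glibc system):

```python
import ctypes
import ctypes.util

# Load the C maths library and call one of its functions directly.
# Declaring argtypes/restype is the ctypes analogue of a P/Invoke
# signature: it tells the marshaller how to convert the values.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))
```

In C# the corresponding declaration is an extern method marked with [DllImport("yourlib.dll")]; either way, the high-level runtime does the marshalling and the C code does the heavy lifting.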
It's also assuming a high frame rate for that security camera.
I don't know about you, but when I've seen security camera footage, it's often in greyscale, and at about 1 FPS. If you're monitoring a secure area to audit access, I'd say you'd be unlikely to spec anything better than that, simply due to the storage requirements. If your camera is recording at 1 FPS, then this technique has a hard limit of 1 bit per second (7.5 bytes a minute), and that's presupposing that this is the only thing causing ambient light levels to change.
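To put that rate in perspective, here's the arithmetic for a single captured file, assuming a perfectly reliable 1 bit/s channel with no framing or error-correction overhead (real channels would be slower still):

```python
# Time to exfiltrate a file over a 1 bit-per-second optical channel.
BITS_PER_SECOND = 1
file_bytes = 1_000_000          # a modest 1 MB of captured data

seconds = file_bytes * 8 / BITS_PER_SECOND
days = seconds / 86_400         # 86,400 seconds in a day
print(f"{days:.0f} days")
```

Three months of continuously modulated lighting to move one megabyte, which rather supports the point about practicality.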
I get that this is clever, and that it could conceivably be used to exfiltrate data, albeit at a very slow rate. I still think a more efficient approach is to use the same vector you used to get your malware onto the device in the first place (e.g. an infected pen drive) to get the data out again at a later date, once you've collected it.
The point I was alluding to was that if you can compromise the computer to get the malware on, why not use the same mechanism for getting the data off?
For instance, if the machine is air-gapped, you must be getting the malware on via an external storage device. If you can get it in once, undetected, to get the malware on, the odds are you can get it in a second time to get the data off again.
Similarly, if the malware gets on via something like a poisoned link in an email, that presupposes network access, in which case that sounds like a far more likely route out again. Ignoring, of course, the fact that the access to that dodgy link would probably be spotted right away by any half-competent network monitoring, and the machine scrubbed.
I don't want to downplay the cleverness of all the various side-channel data exfiltration techniques but I do question their usefulness and applicability, considering that the target is either going to be in a windowless basement somewhere, with the sort of security that carries rifles, in which case, good luck seeing the screen, or accessing anything that can see it, or the target isn't going to be secure in the first place.
This kind of limits the usefulness to situations where not only do you manage to physically access the air-gapped machine to compromise it, but you also somehow manage to compromise other security aspects around it (camera systems, etc.), in which case, as the number of required exploits rises, so does the possibility of discovery. Good old-fashioned rubber-hose cryptography becomes the easiest route over the James Bond stuff.
Their methods all seem to have one thing in common:
Step 1) Gain access to a protected system and install malware on it
Step 2) Come up with some wacky way to exfiltrate the data from the already compromised computer that is orders of magnitude less difficult than getting the malware onto the computer in the first place...
I would ask the follow-up question, "why has sudo been changing in the last five years?" Sounds like feeping creaturism to me. Such things are never good for software stability, and something security-critical like sudo should do the job it is supposed to do, with the minimum of code and fuss, and stop there.
I consider myself suitably chastened.
Of course, I would expect sudo is available in many if not all types of *nix, not just the penguin-flavoured ones. I will confess to not knowing much about the deep innards of the various OSs, but presumably the various *nix flavours bear enough similarity for sudo to operate in the same way in each, and for it to be compiled from the same source (potentially with OS-dependent libraries)?
It does highlight the chain-of-trust that is assumed with tools such as this, and the potential need for some forensic analysis of flaws in such...
Seems sensible. Presumably both of those distros are including a version of sudo that comes from the same verified source (if not the same version). Wikipedia tells me it is maintained by Todd Miller, but doesn't reveal whether it is a team effort, or entirely the fault (sorry, work) of one man...
I'd argue that if the flaw is in the source (even if not the default configuration) then the flaw is in the compiled code. Whether it's turned on or off is largely moot, the alarming thing is that such a flaw exists in sudo.
If by "configuration", what is being talked about is whether the code in question is compiled in or not (i.e. due to an #ifdef section), then it's slightly less alarming, although still alarming that such flawed code should make its way into the "official" source.
I am assuming here that such things are tightly controlled, and someone somewhere is responsible for maintaining an "official" source for the sudo command. Presumably it's distributed as a core userland package alongside the base system, rather than with the kernel itself (I've never dug into the innards of *nix enough to know, TBH; I don't fancy losing myself in that particular time sink). I would guess that various distros don't go around rolling their own forks of sudo for shits and giggles, because frankly, as a security practice, that would smell very bad.
...a program running as root that receives user commands...
This is also a potential security hole; you'd better make sure your program has no flaws, or buffer overflow exploits, and if it allows a user to run certain other commands with root privileges, then those programs themselves have no such flaws.
By doing this via a common well-recognised route (sudo), you reduce that attack surface to only the vulnerabilities in sudo itself. Admittedly, in cases such as this, if sudo has a buffer overflow flaw, then you've got a problem. However, a known flaw (that gets patched, hopefully very quickly) is better than a hundred unknown flaws that may be getting exploited for years without you knowing anything about them.
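As a concrete illustration of that reduced attack surface, the standard approach is a narrowly-scoped sudoers rule rather than a bespoke root daemon. A sketch (the user name, file name and service name here are all made up for the example):

```text
# /etc/sudoers.d/deploy -- edit with visudo to get syntax checking.
# Allow the 'deploy' user to restart one specific service as root,
# and nothing else: no wildcards, full path to the binary.
deploy ALL=(root) NOPASSWD: /usr/bin/systemctl restart myapp.service
```

Everything outside that one exact command still requires normal privileges, and the only privileged code path being exercised is sudo itself.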
Security is hard. Rolling your own is almost never the right solution. I am, frankly, astounded that something as basic and ubiquitous as sudo has such a flaw, not least because I would have expected it not to have changed substantially in many years. Whoever is responsible for doing that particular code-review needs a spanking, as does whoever is responsible for doing a security audit of the build. After all, a root escalation exploit in a tool used for controlled root escalation is an obvious place to look for well known flaws such as buffer overflows.
Well, that sounds dodgy AF. You've probably compromised your phone and your Wi-Fi network. It wouldn't surprise me if this was gear intended for a state-sponsored hack against a specific target and it has somehow made its way into the shops.
Did it require you to side-load the app onto your phone? If so, I'd suggest you assume your phone has been compromised, and do a factory reset. The "requirement" for a Wi-Fi password between 5 and 32 characters implies it's also trying to crack your Wi-Fi password. Change it.
Also worth remembering that in Asimov's books, humans never encounter any intelligent alien species, because the three laws mean that the robot pioneers that went first exterminated everything, in order to "protect" humanity. Maybe there should be a -1th law about not committing genocide on a galactic scale?
The problem with that approach is that your order is filled by a minimum-wage employee at the supermarket, who doesn't exactly have the incentive to pick the freshest produce, or the items with the longest shelf-life, so it's a bit of a lottery as to what you actually get delivered.
It's not quite that simple though, is it? If you've "bought" your new handset on a 24-month contract, you can't just go skipping off after 3 months without paying some pretty hefty termination fees. Most people seem to be on 18 or 24 month contracts these days, so the opportunity to get someone to switch networks comes up for probably 1 month every 2 years, if that customer can be bothered. Which most can't.
Well, yes and no. When police are investigating a crime, it's often useful to get information from a network operator about phone locations, using cell tower triangulation. This can be used to prove whether someone's phone was in a general area in a rough time-frame (about 30 mins IIRC depending on how often the phone communicates with the base station). It's not much use if they can't find out the SIM identity of the person they're looking for (be it perpetrator or victim) because they were using a throwaway PAYG SIM.
There are, of course, other issues of privacy around this, which is why the police should have to get a court order to access such information, but just remember - if you're the BOFH and you're going to go into the woods to dump a body, best leave your phone at home...