Re: Just wondering...
But less profitable.
12336 publicly visible posts • joined 21 Dec 2007
The original Spectre paper demonstrated in-browser attacks, as did the ZombieLoad paper.
Multi-tenant is at risk. Privilege boundaries are at risk if you have an RCE in an unprivileged process. Enclaves are at risk (though, seriously, fuck enclaves; do they have a real use case other than DRM and spyware?).
We haven't seen in-the-wild exploitation of these vulnerabilities because:
1. Disclosures have been embargoed until the most prominent targets could be remediated. That's what happened with the browser-based exploits for Spectre v1.
2. We have no shortage of easier-to-use exploits for untargeted attacks, and frequently for targeted attacks as well.
3. Microarchitectural exfiltration attacks are hard to detect, so they may well have been used in targeted attacks without anyone being the wiser. Just like we have no idea how many victims there were for Heartbleed.
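For concreteness, the Spectre v1 gadget is nothing more exotic than a bounds check the CPU mispredicts, followed by a dependent load whose cache footprint survives the squash. Here's a toy Python model of that flow - the "speculation" and the "cache" are simulated explicitly, so this sketches the logic of the attack without being able to exploit anything itself:

```python
# Toy model of the Spectre v1 bounds-check-bypass gadget:
#   if i < len(array1): y = array2[array1[i] * 512]
# Real attacks need speculative execution plus cache timing (Flush+Reload);
# here both are simulated so the data flow is visible.

SECRET = b"squeamish"
array1 = bytes([1, 2, 3, 4])      # the in-bounds data
memory = array1 + SECRET          # the secret sits just past array1
array2_lines = set()              # which "cache lines" of the probe array got touched

def victim(i: int) -> None:
    """Model misprediction: the body runs even when the bounds check
    fails, leaving a cache side effect the squash doesn't undo."""
    in_bounds = i < len(array1)
    value = memory[i]             # speculative out-of-bounds read
    array2_lines.add(value)       # cache footprint survives
    if not in_bounds:
        pass                      # architectural result is discarded

def leak_byte(i: int) -> int:
    array2_lines.clear()          # "Flush": evict the probe array
    victim(i)                     # trigger the mispredicted access
    (line,) = array2_lines        # "Reload": see which line is now hot
    return line

recovered = bytes(leak_byte(len(array1) + k) for k in range(len(SECRET)))
print(recovered)                  # b'squeamish'
```

The point of the sketch is that the leak is purely a side effect of the cache state, which is why these attacks are so hard to observe after the fact.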
Various sources claim all Atoms are vulnerable to ZombieLoad and perhaps to RIDL and other MDS attacks as well. The original Bonnell microarchitecture for Atom has Hyper-Threading, which suggests it would be vulnerable to type-1 ZombieLoad.
I admit that I pay little attention to Intel's twisty maze of CPU families, however, and much of the discussion of the MDS vulnerabilities isn't directly supported by anything I've seen in one of the actual research papers.
Damn. That's from 2002. Google was only 4 years old; Facebook didn't exist yet. Amazon's collaborative-filtering recommendation system had only been patented a year earlier (US6266649). While the general direction may have been clear to those who were paying attention, that's still a marvelously prescient story.
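For anyone curious what that patent actually covers, the item-to-item idea is simple enough to sketch in a few lines of Python. This is a toy with made-up data - cosine similarity over sets of buyers - whereas the patent describes the technique in more general terms:

```python
from collections import defaultdict
from math import sqrt

# Toy item-to-item collaborative filtering: two items are similar
# when the same users bought both; recommend the unowned item most
# similar to what the user already owns.
purchases = {
    "alice": {"book_a", "book_b", "book_c"},
    "bob":   {"book_a", "book_b"},
    "carol": {"book_b", "book_c"},
    "dave":  {"book_a", "book_d"},
}

# Invert to item -> set of buyers.
buyers = defaultdict(set)
for user, items in purchases.items():
    for item in items:
        buyers[item].add(user)

def similarity(a: str, b: str) -> float:
    """Cosine similarity between the items' buyer sets."""
    inter = len(buyers[a] & buyers[b])
    return inter / sqrt(len(buyers[a]) * len(buyers[b]))

def recommend(user: str) -> str:
    owned = purchases[user]
    scores = {
        item: sum(similarity(item, o) for o in owned)
        for item in buyers if item not in owned
    }
    return max(scores, key=scores.get)

print(recommend("bob"))   # book_c - bought alongside bob's books by others
```

Precomputing the item-item similarity table offline is what made this scale to Amazon's catalogue; the per-user work at request time is tiny.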
(Pretty funny too. I chuckled audibly at the Burger King line.)
Thanks for the link.
Grab a dataset (or photos) from the web without checking the law, copyright, etc
In this case, IBM only scraped photos which explicitly had a liberal CC license applied.
It seems to me BIPA and CC are in conflict here. I have no idea whether BIPA makes the right it establishes inalienable, or whether it can be waived by a license such as CC. (I haven't looked at the text of BIPA.) And I wouldn't want to even hazard a guess as to how courts would find. But I don't think this is a clear case of IBM deliberately violating the law, since a reasonable interpretation of some CC variants would allow what they did.
Valuable as RFC1984 is as a position statement, I don't think it does explain why "key escrow is fundamentally broken". It expresses a position which is fundamentally opposed to key escrow, but while I agree with that position for most applications,1 I don't see how it constitutes an argument that escrow is "broken". In cryptographic research, "broken" is a term of art that implies a rather stronger test than "no sir, I don't like it".
1There are a number of specialized applications where key escrow is a useful aspect of the protocol, under certain threat models that are reasonable for those applications. Filesystem encryption of organization-owned equipment, where keys are held in escrow by the organization's IT department, is one example. Private communication among private citizens is not one of those applications.
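The filesystem-escrow shape is simple: the volume key is wrapped once under the user's key and once under the IT department's escrow key, so either can unlock the disk. A toy Python sketch of that structure - the XOR "wrap" here stands in for real key wrapping purely to show the shape, and is emphatically not secure:

```python
import hashlib
import secrets

# Toy key-wrap: XOR the data key with a keystream derived from the KEK.
# Illustrative only - a real system would use AES key wrap or similar.
def wrap(kek: bytes, data_key: bytes) -> bytes:
    stream = hashlib.sha256(kek).digest()[: len(data_key)]
    return bytes(a ^ b for a, b in zip(data_key, stream))

unwrap = wrap  # XOR with the same keystream inverts itself

file_key = secrets.token_bytes(16)   # the key that actually encrypts the volume
user_kek = secrets.token_bytes(16)   # e.g. derived from the user's passphrase
it_kek = secrets.token_bytes(16)     # held in escrow by the IT department

header = {                           # what ships with the volume
    "wrapped_for_user": wrap(user_kek, file_key),
    "wrapped_for_escrow": wrap(it_kek, file_key),
}

# Either party recovers the same file key from its own slot.
assert unwrap(user_kek, header["wrapped_for_user"]) == file_key
assert unwrap(it_kek, header["wrapped_for_escrow"]) == file_key
```

Note the threat-model point: the escrow slot is a deliberate second door, which is exactly why the same construction is reasonable for organization-owned disks and unacceptable for private communications.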
That "weird" segment-and-offset memory addressing was because the 8088, like the 8086, had a 20-bit address space but 16-bit registers. It was influenced (according to the iAPX 86 designers and others) by a desire to ease porting from 8-bit CPUs such as the 8008, 8080, and Z80; but in reality 16-bit registers were probably as large as was economically feasible.
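The arithmetic is simple enough to show: the linear address is the 16-bit segment shifted left four bits plus the 16-bit offset, giving 20 bits. A quick Python sketch:

```python
# 8086/8088 real-mode address translation: segment * 16 + offset,
# both inputs 16-bit, producing a 20-bit physical address.
def linear(segment: int, offset: int) -> int:
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at 1 MiB on these parts

# 20-bit reach from 16-bit registers:
assert linear(0xFFFF, 0x000F) == 0xFFFFF        # top of the 1 MiB space
# The cost: addresses are non-unique. These two pairs alias the same byte:
assert linear(0x1234, 0x0010) == linear(0x1235, 0x0000) == 0x12350
```

That aliasing is where a lot of the "weirdness" programmers remember actually came from: normalizing pointers, huge-pointer arithmetic, and so on.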
I doubt it had anything to do with CP/M, aside from the observation that CP/M and applications for it would be numbered among the software packages Intel would expect people to port. They were primarily thinking at the instruction-set level, though; while they made some concessions for compiler developers, I doubt the designers spent much time considering specific applications.
It's worth noting that the 8086 was developed in part as a stopgap while Intel was working on the iAPX 432, development of which started in 1975. The 432 had a 32-bit address space but still used 16-bit registers because going all 32-bit at the microarchitectural level just wasn't feasible. Considering how ambitious the 432 was in other ways (a capability architecture implementing a stack machine with an extremely CISCy instruction set), that's pretty telling.
outside ARPANET (and possibly early versions of JANET) it wasn't really used
I don't believe that's true. While other protocol families such as DECnet, SNA, and XNS were also still common at the time, even in the first half of the '80s TCP/IP had a significant share of local networks and small-i internets in academia and some businesses. The CMU/IBM Andrew Project was always TCP/IP-based, for example, and it started in 1982. The same was true of MIT/DEC/IBM's Project Athena, which started in 1983. I think BSD 4.1a with TCP/IP came out in '82.
For the big-I Internet, besides ARPANET there was CSNET (routing TCP/IP over X.25, starting in 1981), NSFNET (1986), and others.
Wikipedia says JANET (an X.25 network) didn't start routing IP traffic until the 1990s. BITNET, a prominent US academic network, and IBM's VNET (the largest internet in the world until it was surpassed by the TCP/IP Internet sometime in the '80s) ran on RSCS, a pre-SNA IBM protocol. RSCS was a store-and-forward protocol somewhat like UUCP.
In 1987 IP was 7 years old: IEN 123, published December 1979. TCP was 12 years old: RFC 675, published December 1974.
As you noted, the TCP/IP Internet was 4 years old, since ARPANET Flag Day was 1 January 1983.
I don't know where "TCP/IP barely three years old" came from, but it's wrong any way you look at it.
The earliest TCP/IP implementations for MS-DOS seem to have been PC-IP (1984, though portions were available as early as 1982) and KA9Q (1985).
What about the interfaces in <unistd.h>?
Best I can tell (from ten minutes of research, and IANAL), that copyright still belongs to Micro Focus, via Attachmate Group, via Novell. And attempting to charge UNIX vendors copyright fees would probably not be in our best interest.
Frankly, I'm not so complacent about the medical industry's co-option of the term "doctor", either. Both etymologically and in other fields it means "scholar", and many medical doctors, while upstanding members of their profession, do no research and don't even have much time to follow current research in clinical practice. (That's why Cochrane metastudies exist: so that a team of experts can review research in an area and digest it down into clinical recommendations.)
And in the US, medical interns - who are not yet licensed medical doctors - are generally told to use the title "doctor" with the patients they see. They are scholars - they're still in school - but the medical profession wants to have it both ways: "doctor" meaning "student" and meaning "professional who has acquired some special credential".
Perhaps everyone read the article and understands the bug was in a third-party app?
I think Apple's security is overrated by many (most?) users, but they're in the clear on this one. Unless you think they should do more extensive vetting of everything in the App Store,1 which is a position one could argue, but doesn't seem economically feasible.
1Say, by requiring apps be submitted as source, which Apple would run through static analysis and then build and deploy to the store. That's technically feasible but probably not a viable business model, since it would be resource-intensive for Apple and would meet resistance from app developers.
Yes, there's no need for a conspiracy here. Everyone acting according to their inclinations explains the involvement of MBS and the National Enquirer just fine. I'm sure Trump would have approved, and they may have tipped him off that something along these lines was happening, but there was no reason to let him know the details.
I don't even think there was much of a plan here. MBS has a collection of hacking toys from NSO Group and Hacking Team, and decided to play with them by seeing if he could steal info from Bezos. He or a toady skimmed over it, found the embarrassing material, and forwarded it to someone (possibly David Pecker at AMI, possibly Dylan Howard at the Enquirer), who decided to try to pressure Bezos. But it turned out Bezos was running short of fucks to give that day.
It's also a mistake to assume the powerful know or care much about OPSEC. It's pretty common for people in power to trip themselves up by using personal devices. Even when they try to do it properly, they often achieve decent security in one area but screw it up in another (as with El Chapo, for example), or use a mechanism that fails under a different mode of investigation (as with Petraeus).
Was it not declared? The article says Co-Op claimed they were told the IG package was an "out-of-the-box solution". My impression is IBM told them pretty early on, possibly in the proposal, that they'd be using Insurer Suite. Since IBM does not have their own commercial package for this industry vertical, you'd think Co-Op would have wanted to know where the software was coming from at the start.
I'm not saying IBM didn't misrepresent Insurer Suite - they may well have; I don't know anything about the situation beyond what's in the article. But it doesn't sound to me like they represented it as something developed in-house.
They could move elsewhere
Many of them have, or were never located in high-rent areas in the first place. Any list of high-tech growth areas for IT in the US will show that there are plenty of other areas with high concentrations of IT and other high-tech firms.
the glamor isn't there
From surveys I've seen, this appears to be largely irrelevant to many firms and most of their employees.
Sure, Silicon Valley still has a cultural presence, and that draws some firms, the ones who hope to find success by basking in the glow of the FANG companies or want to be close to the venture capitalists so they can con them in person. But not everyone is so easily led.
you actually can't hear what people are saying if there's other loud stuff on the soundtrack, because the speakers are so bass and treble heavy and knacker the mid-tones
IME, this is at least as much due to stupid, ubiquitous Dolby 5.1, and sound engineers insisting on mixing the dialog on the center channel - which is largely lost if all you have is a pair of stereo speakers.
We've had to get (cheap) "soundbar" speaker sets for the TVs in our houses (just one each, fortunately) in order to reliably make out dialog without having background music and SFX too loud. My wife is deaf on one side, so audio separation is meaningless to her, and I don't care about it a whit; but we need at least three channels just to hear the programs.
TV sets used to come with an option to force sound to mono, which would be just fine with me, but I haven't seen that feature in years.
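For reference, the standard stereo fold-down does carry the centre channel into both speakers, attenuated 3 dB; the trouble comes when the mix assumes theatre playback balance or the fold-down is skipped and the centre is simply dropped. A quick Python sketch of the usual coefficients (these are the ITU-style values; what any particular TV actually applies varies):

```python
from math import sqrt

# Conventional 5.1 -> stereo fold-down: centre (dialogue) and each
# surround land in the nearest stereo speaker at -3 dB.
A = 1 / sqrt(2)  # -3 dB attenuation factor

def downmix(L, R, C, LFE, Ls, Rs):
    left = L + A * C + A * Ls
    right = R + A * C + A * Rs   # LFE is commonly discarded entirely
    return left, right

# A dialogue-only frame: full-scale centre, silence everywhere else.
left, right = downmix(0.0, 0.0, 1.0, 0.0, 0.0, 0.0)
print(round(left, 3), round(right, 3))  # 0.707 0.707 - dialogue survives at -3 dB
```

So a mix with loud music on L/R and dialogue only on the centre is already handicapped by 3 dB on stereo gear before the soundbar even enters the picture.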
A pox on all their houses.
Extracting impacted molars is better done by a proper dental surgeon, not a run-of-the-mill DDS. All credit to my dentist, whom I quite like; but that's a specialty procedure. Which is why she referred me to a dental surgeon to have my impacted supernumeraries1 removed.
And a good thing, too; one of them was bent in a full 180-degree arc. Took some work getting it out. The surgeon said it was one for the trophy case, but alas it turned out he was just kidding, and didn't have a trophy case.
1Extra molars behind the "wisdom" (third) set. I still have my third molars, but the fourths had to come out.
I think of "deep fry everything" as a cultural touchstone of Midwest and Plains state fairs, not the Northeast.
In NYC I'd expect a deep-fried Creme Egg to be rejected for being gauche and insufficiently "ethnic". In New England it would likely be considered sinful. In Indianapolis or Lincoln, though, you could probably sell them all day long.
(That said, I've been to the Nebraska State Fair, and deep-fried sugar would be an improvement. The county fair I go to in Michigan is bigger and more entertaining.)
This is not a loss for Free Software fundamentalism, it's a testament to the influence it's had on the world of software, transforming the landscape into a world that's now orders of magnitude more open.
I'm dubious about this narrative. Shipping source was the rule, not the exception, in the first couple of decades of commercial digital computing. The move to closed-source software was arguably driven by a couple of factors: the rise of commercial pure-software companies, and the IBM consent agreement which led to "unbundling" and the forced conversion of its mainframe software into a profit center.
Even prior to the rise of the free/open software movements, source code was still exchanged widely, at both small scale (the txtfile community, for example) and large (AT&T UNIX). When Stallman founded the FSF, I don't recall it being greeted as a surprising concept; the controversy was around the ideology, not the notion of open source, or even open-source commercial software.
Personally, I suspect we'd have a significant open-source presence even if the FSF and the free-software movement (and its variants) had never happened. Certainly the FSF and GPL had a tremendous effect on the evolution of FOSS and its current state, and almost certainly on the volume of FOSS and the success of FOSS-based commercial firms such as Red Hat. But I think it would have been significant even without them.
There are plenty of grammarians and linguists who would consider both "gonna" and "Wasm" contractions. There's no generally-agreed definition of contraction in English, either in general use or as a term of art, which would exclude either of those words.
If you want a more-specific description of what sort of elision "Wasm" represents, it could be considered a portmanteau, since it's an elided noun phrase used to describe a (notionally) new concept.
Indeed. I'm sure mileage varies, but for all my USB-charging devices1 I've always just had a couple of those kits that have a cable and a set of adapters for various USB sizes, and the assortment of USB chargers I've accumulated over the years. Phones, tablet, Kindle, that old MP3 player I sometimes dig out for when I'm working on the house - they all charge just fine with whatever cable and charger I use.
I keep a couple of chargers in my computer bag and have various ones scattered around the houses. (The latter stay plugged in; they dissipate negligible heat, which means they're using negligible power.) I've never had any reason to look to see what they're rated for.
1Refusing to use Apple products has solved the Lightning problem nicely.
I suppose some of his most gullible followers might believe that's a quote manufactured by the deep-state LIEberal elite MSM news, but not enough of them to get him re-elected.
As far as I can tell, for the vast majority of his supporters, it doesn't matter what he may have said. Positions don't matter; policies don't matter; performance doesn't matter.
I've read numerous interviews with Trump supporters over the past few months. These are middle-class and upper-middle-class people, likely voters, likely financial supporters. They're educated, articulate, not perceptibly stupid or ill-informed or insane. And what they say, consistently, is that they'll vote to re-elect the man. Sometimes they'll mention specific achievements (the tax-code changes he signed into law, the stock market, etc); Trump doesn't really deserve much credit for those, but they don't dwell on them anyway. They're excuses. Then they'll use descriptions like "the best president since Reagan" or in some cases "the best president ever".
The real warrant to their arguments, readily apparent, is that they support Trump because he's their team and they want to win the game. US politicians have successfully pushed most of the electorate into treating politics as a sport, and most voters will put up with pretty much anything to support their team. People mocked (and continue to mock) Trump and his supporters since he first entered the 2016 race; and those supporters will be damned before they back down.
And so will the rest of us.
Under a number of models, the European countries and Japan are both part of the continent of Eurasia. Regarding Europe as a continent unto itself has a long history but is by no means uncontroversial. Certainly, those who consider the Americas to be a single continent - a convention widely observed in a number of countries - don't have much justification for calling Europe a separate continent.
Yes, privacy is a major concern, as are attacks on group behavior. An argument can be made that crowd anonymity is a fundamental human right. I'm reminded of Poe's "The Man of the Crowd".1
1I've always wondered if this story is the origin of the term "gumshoe" for a detective. A quick search didn't turn up an earlier use. I'd check the OED but I'm feeling lazy.