Move fast and break things
Volunteers?
But, as with so many things, confusing the two is being used to push for changes that don't relate to people's perceived problem. "Don't you hate that websites keep asking you about cookies? Well, we're going to remove that consent red tape from GDPR to make it go away."
(And of course, yes, the correct response to being annoyed about websites asking for consent for tracking cookies all the time is to ask why they need to track you in the first place.)
"For example, here in Blighty, sanitary products are not zero rated for VAT and so are more expensive than they need to be. Essential products are not supposed to carry VAT."
I fully support things like free sanitary products for those who need them (and to avoid any misunderstanding, "those who need them" means those who judge themselves to need them), but I've always found this line of argument about zero-rating them either misinformed or dishonest. Essential products *do* carry VAT. Toothpaste has VAT, contact lenses and prescription glasses have VAT, toilet roll has VAT, non-prescription drugs like painkillers and antihistamines have VAT. Certain foods are VAT exempt and some are not, in a weird way that sort-of maps to "luxury foods are taxed" (but not quite, hence all the arguing over Jaffa Cakes), and people take that to mean "VAT is for luxuries", but that's not how it works.
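The arithmetic behind the argument is simple enough; a toy sketch (the net price is invented, the rates are the UK bands):

```python
# Illustrative only: the net price is hypothetical; the rates are the UK
# standard (20%), reduced (5%) and zero VAT bands.
def gross(net, rate):
    """Consumer price for a given net price and VAT rate."""
    return round(net * (1 + rate), 2)

net = 2.00  # hypothetical net price of a pack
print(gross(net, 0.20))  # standard-rated: 2.4
print(gross(net, 0.05))  # reduced-rated: 2.1
print(gross(net, 0.00))  # zero-rated: 2.0
```

Zero-rating shaves the tax off, but the same arithmetic applies to toothpaste and toilet roll, which is the point being made above.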
"Nor is this a mere fantasy scenario from sci-fi. Last year, financial challenges forced neural tech specialist Second Sight to abandon its Argus II artificial vision product line – regrettably but inevitably leaving a number of its customers with delicate surgical eye-to-brain implants that can no longer be maintained, repaired or (possibly) even removed. Like all kit, it will eventually stop working but there will be no more fixes, upgrades, spare parts, or even anybody with the proprietary Argus II tech skills to know how to fix them anyway."
There needs to be some kind of escrow where the rights and proprietary information for this kind of thing can be placed against this eventuality; it's only going to get worse. We inexplicably put up with patenting plants, but the line absolutely has to be drawn here. "We can't replace the battery in the device that is letting you see/keeping you alive because the company folded and now the rights are owned by some asset-stripping firm, sorry about that," is full-on Black Mirror dystopianism.
Nobody building it? Fine, if there's a gap in the market they want to try, they're welcome. The ridicule is for the fact that they think this could be patented, and that they might even get away with doing so.
Anyway, if you think anything new Apple creates is intended to save you money, I've got an iBridge to sell you.
No, patents aren't restrictive in that way: if you can use the device via the one port, it doesn't matter that it has others providing extra or duplicate functionality; this is why you can patent a device that's a refinement of an existing patent. So any device that can run from a USB-C port already fits, hence the broken-laptop example.
I present my colleague's laptop with broken screen as prior art. Folding input device with embedded computer and no display, wouldn't be the first time that's happened either.
In any case, patents are meant to be non-obvious: folding keyboards exist, computers embedded in keyboards without displays exist, computers that fold exist, computers with keyboards that fold exist, and as mentioned, my colleague's computer without a display and which folds exists. No Nobel prizes are awaiting the person who removes a display or adds a hinge in exactly the same context that many already have.
Any laptop too.
And their attempt to claim "maybe this has been done before, but we've added something new!" amounts to saying it can be more compact because of increasing miniaturisation, which is essentially the definition of obvious to someone familiar with the state of the art, or even just the general state of technology now. (And yes, the ZX81 was essentially just big enough for its keyboard; Sinclair's talent was stripping things down to the bare essentials.)
I've recently discovered a couple of very cool projects in WebAssembly:
1. Web MiniDisc: talk to a NetMD MiniDisc player from the browser over its USB interface, with transcoding for file transfers handled for you. The original software for this kit was long abandoned and only ever worked on Windows, and the tools the Web MiniDisc version builds on come from Linux-land, a bit of a hurdle for most people.
2. Lichess runs the Stockfish chess engine in the browser, which makes it easier for them to provide a free chess service since server-side computing isn't needed. Sadly it hasn't gone the whole hog: you still need to be connected to play against the computer.
In both cases it lets you do things with a wide range of devices so long as they support a modern browser. And while it may not be the most efficient approach, for these applications it's arguably a better experience than running some kind of dedicated container.
Betting the old car remained unsold because he refused to acknowledge he was asking too much.
A former landlord, while selling the flat I was living in, told me the market was "low-balling". No, that's just the market, whatever wonderful number the estate agents told you they'd get. It eventually sold a year after I moved out (and after he'd had to re-let it) for the valuation Rightmove had given all along, less than the estate agents had said, but still about twice what he'd paid for it ten years before.
perhaps research grant applications should include data lifecycle plans and salary for programmer(s) in these days of reproducible research?
This is in the process of changing. For a long time grants have provided equipment costs but not wanted to pay for overheads. But the data you generate during your research has a cost for storage (and possibly for processing) that can be high depending on what you're doing. The result was that it was easy to buy a new PC or a hard drive to put your data on, but not easy to contribute towards a departmental NAS or a cluster. It's starting to improve as funding institutions recognise that these are not just administrative overheads but essential services that support the research, and they're becoming a bit better at accepting costs towards those things (similarly for statistical support).
Yes, investigators are to a large degree independent and tend to compare getting a TB of backed-up NAS storage to buying a TB external drive. At the other end of the scale, there's a tendency for central infrastructure to want to drag people in with solutions that don't always work for them, which hollows out the pool of expertise for departments to help each other. Services like OneDrive and Google can actually be more convenient, but you have no control over any changes they make (our university pays for OneDrive and other MS services; at institute level, infrastructure has largely been reduced to desktop and network support, though some departments still have their own computing facilities, and we still have decent storage infrastructure because of what we do).
A certain British university used to have a multiplicity of weekly email newsletters: university level, faculty level, school level. What research is in the news, who has been made professor or awarded a prize, adverts for drop-in sessions and suchlike. One day an important announcement about a university-wide planned email outage snuck out in the middle of one of these. Almost everyone was caught unawares when the email was turned off for a day.
Follow-up: cutting the number of newsletters, promises to make important announcements separate.
Several years on, multiple newsletters each week...
"We take a step back to the glory days of Windows 98, Microsoft's follow-up to Windows 95, a penultimate hurrah for the Windows 9x family (unless one counts Windows 98 SE as its own entity)"
I know we all want to forget Windows Me, but not all of us have managed it. I ended up with Me on the first PC I bought myself, it was an interesting experience, if not quite as bad as many remember (certainly better than pre-SE 98, subtly worse than 98 SE). But all the cool kids rocked up to uni sporting Win 2k.
Turn up and demand it back. Had that happen to me last year, although on a shorter time scale. If they've had it for a year then they've no intention of fixing it and you should take it somewhere else. (In my case it turned out they *could* finish it that afternoon, after a week of "come back tomorrow". Not a great job though, and I've now got the tools to do that particular job myself.)
Had to replace a MediaTek wifi+bluetooth module in a new laptop as the Bluetooth LE mouse would get lost and require complete re-pairing whenever it was idle for a couple of minutes, or the machine went to sleep, or you restarted the computer. Both Windows and Linux, so likely a firmware or hardware issue that I haven't had with Broadcom or Intel. So maybe they should figure out how to handle devices going into standby on 12-year-old standards before trying to sell us on implementations of draft standards.
Anyway, the idea that 40Gbps will replace the 10GbE and vlans holding our server room together seems a little optimistic, not to mention nightmarish.
Still need to disable it if you want to run Nvidia drivers on Fedora and any other Linux taking an akmod approach, as the driver module can't be loaded otherwise. Fun. And then there's the begging that was required to get MS to allow Linux to be signed in the first place. Remind me again why MS is the arbiter of what can run on PC hardware?
Are there any patent rights? Unless there's a patent submitted prior to the Bitcoin white paper, one for Bitcoin cannot be granted now. I'm not aware of the details of the case (and The Register doesn't really outline them), but looking at the judgement: "The applications are made in the context of a claim for (amongst other things) a declaration that the defendant is not the author of, and is not the owner of the copyright in, a document which has been called the Bitcoin White Paper, published in October 2008 under the name (agreed to be a pseudonym) of Satoshi Nakamoto." So it seems that since Wright has claimed to be the author (and would therefore have copyright on the white paper, though that wouldn't give control of Bitcoin, only a chance to go after people reproducing the paper and the code in it), COPA are seeking a declaration that he isn't. I didn't realise this was a thing you could do: proactively seek to disprove authorship, rather than wait for them to try to enforce copyright against you.
Do wonder what he has to gain from trying to claim Bitcoin authorship though. If he was actually the author then he'd be extremely wealthy anyway, and I can't see how this makes him money; at least in SCO v IBM you could see there was a prize if they ever won. Here's a thought: maybe he really is the author, but like many early adopters he's lost the keys, and those million bitcoin are now forever beyond his reach.
I think the target is really their own laptops, if it works on anything else that's a bonus. Did briefly look at a System76 a while ago, but ordering from the US was enough to put me off (also a bit more expensive for the specs than alternatives, which I might be happy with for a linux vendor, but didn't want to have to deal with sending for a warranty repair for example).
Ah, I've confused the situation here: I wrote that opening suggestion (as a comparison) as an example of another way you could look at the situation (given everyone had gone straight to assuming it was for the stupidest possible application), before reading very far into the paper. I considered taking it out, and possibly should have. Though I think the point still stands: even if this were a process log, you could still probably find a use for such a tool.
In terms of the actual application, it is logistics, but they're not using it to try to fix parcel tracking. They're using it (to my understanding) to plug into models that measure how efficiently things are moving around ports and the like, and said models really don't like it if something disappears, even if it's obvious what happened next.
I'm familiar with the general process, although the description ignores that funders have increasingly specific ideas about what they want.
Interestingly, the paper quotes the exact same phrase you do. Error bars and confidence intervals exist with any data, purely from sampling if nothing else, and yes, statisticians will happily talk about imputation with caveats. One pretty basic trick, for example, if what you are interested in is a metric calculated from said logs, is to take a set, apply random dropouts, and see how that affects your metrics on the recovered copy. The authors in this paper do exactly that. Because of course "garbage in, garbage out" assumes that you're starting with *garbage* rather than merely a slightly degraded signal. You can still listen to music with a few scratches on the record, yuo cna mkae sesne of thys snetnce, you can extract meaningful data from flow logs with some missing entries. Particularly if your filling-in can access other data sources for the recovery. The only data I can't extract is how that is controversial.
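The dropout trick above takes a few lines to sketch; this is a toy illustration, not the paper's actual method, and the log values and metric are invented:

```python
import random
from statistics import mean

def dropout(log, p, seed=0):
    """Return a copy of the log with each entry independently dropped with probability p."""
    rng = random.Random(seed)
    return [e for e in log if rng.random() >= p]

# Toy "event log": inter-arrival times in minutes (invented numbers);
# the metric of interest here is just the mean.
log = [4, 6, 5, 7, 5, 6, 4, 8, 5, 6] * 10

full_metric = mean(log)
degraded_metric = mean(dropout(log, p=0.1))

# If the metric barely moves under random dropout (and after whatever
# recovery step you apply), it's robust to this kind of missingness.
print(full_metric, degraded_metric)
```

In the real version you'd run the recovery/imputation step on the degraded copy before recomputing the metric, and repeat over many random seeds to get a distribution rather than one number.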
Okay, so computer scientists have an obsession with imputation that borders on the unhealthy, and we all know you can't recreate information that's not present in your starting data.
But what about this? Compare your imputed log with the original. What's missing that's expected? What's different? That's the automated version of the exact ad-hoc process many people describe themselves using. If I ever have to go log-trawling I like to have a copy from a good state to compare to.
It's not actually the authors' aim though, as reading the paper reveals. They basically just want to repair incomplete logs so they can carry out process mining (read: do stats on them), not forensics (the word doesn't appear once in the paper). Imputation is a fairly common technique in stats, though you need to conduct a sensitivity analysis to check it hasn't altered the results; it's generally used to patch up a method that can't properly handle missing data.
Anyway, I always get suspicious when people start talking about "the real world" as if it's somewhere they inhabit that others don't. It usually just means you think your own experience is universal.
An example: the logs they are talking about are not computer system logs as most of the replies above appear to assume. They are logs for things like container handling processes. Which answers another question, why would anyone spend money on this? "This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2020R1A2C1102294) and research projects of ’Development of IoT Infrastucture Technology for Smart Port’ funded by the Ministry of Oceans and Fisheries, Korea."
Yeah, those ivory tower academics with no idea about the real world.
The Typhoon has a cannon, but hitting a slow-moving, small, hovering drone with it would be pretty tricky I imagine; the first number I've found for the stall speed is 110 knots, and you have to aim with the aircraft.
Anyway, while that was a take-home from Vietnam, air power has moved on. They still put cannons on aircraft, but since the 2000s the emphasis has shifted back to long-range missiles; the technology is much better than it was in the 60s, if also a lot more expensive.
Hey, if anonymous Red Hat supporters want to respond to any potential customers with abuse then good luck with that. It's long been the hallmark of the unbalanced one man software company, I'm sure it'll scale well.
Anyway, Red Hat's business model always rested on stuff that was free to them too; that's how the rebuilds existed and continue to exist. They post code, good. Some of the rest of us post code too, you know. Filing a bug is often pretty futile unless you are willing to take it upstream yourself and run the tests, and they'll certainly be happy to update the package once you and the actual authors have worked to fix the upstream code. So I suppose you're right, they don't care about you abandoning ship, since they'll still be taking advantage of that anyway. At least until they find that the people who are willing to pay for it when needed, because that's the environment and toolset they are used to working in, have all got used to a different environment.
Couldn't possibly be that they think an organisation with experience of education around the world is better placed to support thousands of students living in a foreign country for the first time than a minimum wage call centre worker employed by a company that routinely under-performs almost as its mission statement.
Nah.
Jupyter doesn't even need to be cloud-hosted; you can run and keep your notebooks locally. It runs a local server on your desktop that you interface with through your browser. Install it on Windows using Anaconda Navigator, which comes with most of the standard data-handling libraries.
A friend in university teaching is increasingly trying to teach students to use Python for data analysis (specifically Pandas with some matplotlib). Structured data, but separation of data and code, the ability to do arbitrary things, and the option to combine in other data sources if you need to. Maybe fewer 3D bar charts and such; that's not necessarily a bad thing. Excel is still a useful viewer for loading up the tables. Personally I'll use R for preference when it comes to data analysis, but if Python (or Perl, or Awk, or C, or, Lord help us, PHP) is more appropriate then that's what comes out of the box. Those are science students, but finance departments recruit graduates too, right? Surely it can't be long before they also catch on that they're meant to be teaching quantitative analysis and it's 2021+.
About the only thing Excel still gets used for here is calculating trip expenses with friends, and it's great for that, mainly because it makes the data input easier and who can be bothered setting up an actual database for it?
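For what it's worth, the expense-splitting logic itself is tiny; a minimal sketch with invented names and amounts (the spreadsheet still wins on data entry):

```python
# Minimal split-the-expenses sketch; names and amounts are invented.
paid = {"Alice": 90.0, "Bob": 30.0, "Carol": 0.0}

share = sum(paid.values()) / len(paid)  # everyone owes an equal share
balance = {name: round(amount - share, 2) for name, amount in paid.items()}

# Positive means the person is owed money; negative means they owe.
print(balance)  # {'Alice': 50.0, 'Bob': -10.0, 'Carol': -40.0}
```

Which rather proves the point: it's not the calculation that makes Excel the right tool here, it's the frictionless input grid.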
I agree it'll look better on the spec sheet. I recently watched quite a few laptop reviews and they generally all say "the webcam is only 720p", as if this is in contrast to a whole swathe of 1080 cams. Almost certainly wouldn't cost a lot more (I suspect a bit more, probably a supply difference). But it's not something that there's really competition on and very few people are going to choose one model over another on that when there are other factors in play, the ones who really care will be using separate cameras anyway. I guess that's why manufacturers haven't bothered.
(Actually, as mentioned in the review, there is an option for a 1080 camera on the T14s. Not sure how much extra it is as the custom options available in the UK only offer the FHD camera, not the 720 one. I guess this is up-selling...)
Card readers! You can still find them; the XPS 15 has one. Lenovo seem to have gone microSD for the T14 and T15 (none for the T14s), which is less useful as you can fit a microSD into an SD slot, but not vice versa.
While I see no reason to defend 720 over 1080, I also don't really see the killer requirement. Laptop cameras are only ever so-so: no autofocus, fixed location, microphones mounted on an entry level machine that's running fans full blast to keep Teams going. A webcam on a laptop is really useful, but it's always compromised. You're probably right about there being some chip that's used for all of them, though I guess it's more that there's still some fab churning them out cheap than everyone is holding massive stock.
Did you leave out the joke icon? It charges over USB-C, so this is entirely possible, though I don't think there's any charger that actually does it.
There's an interesting factor too: USB-C PD is currently limited to 100W, and if it's put through a hub (for ethernet) then the hub needs to downrate the output to account for its own power needs and losses. A charger or a dock with non-USB-C power can internally power its own logic and ports and also put out the full 100W to the laptop.
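Back-of-envelope, with the hub overhead figure assumed rather than taken from any spec:

```python
# Rough USB-C PD budget sketch. The 100 W ceiling was the PD limit at the
# time (before Extended Power Range); the hub overhead is an assumed figure.
def watts_to_laptop(supply_w, hub_overhead_w=15):
    """Power left for the laptop after a pass-through hub reserves its own budget."""
    return max(supply_w - hub_overhead_w, 0)

print(watts_to_laptop(100))     # 85 W through a hub from a 100 W supply
print(watts_to_laptop(100, 0))  # 100 W direct from the charger, no hub in the way
```

So a dock powered from its own supply sidesteps the problem entirely: it feeds its own electronics separately and can still offer the laptop the full PD budget.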
Apparently I'm in the minority, but why? Are you really PXE booting laptops that often? I can see that for somebody running updates by hand on lots of machines the speed difference could be a factor, but really, are you doing that? If you are, couldn't you plug in the dongle or dock for that instance? Even if you're not, it's not a loss of functionality. It's only a win with a fast mirror anyway.
Don't get me wrong, I like ports. A 3.5mm audio and mic jack is much more useful than a dongle. For the minute some form of HDMI or DisplayPort is still really useful until there are more USB-C monitors about (okay, you can get passive cables for this, so it's less of an issue, but it takes up your USB-C port if you're not using a dock/hub). Having 3 or 4 USB-A/C ports is pretty handy since it gives you flexibility about sticking in a thumb drive or something without having to worry about what else you need to unplug. USB-C PD is useful, but using it for power eats up a port, so either you now need a dock or monitor, or you sacrifice a connectivity port to power. And while you can fix all this with a mini dock, it's more expense and one more thing to carry for the sake of making the laptop a bit smaller; seems like a mixed blessing. Also, in PD pass-through that dock will knock up to 15W off the power to the machine, so a 100W PSU will only power your laptop at 85W. And of course an SD card slot is super useful compared to yet another tiny widget to get lost.

But ethernet? On balance most laptops don't need to connect to it that frequently. The only use case I can see where it's really an issue and a dongle doesn't help is PXE, and really it's a shame there isn't just an ethernet alt mode for USB-C that would let a BIOS support this (or even just BIOS support for network dongles; if my BIOS can recognise a USB mouse or boot from USB mass storage then a standard network adapter should be possible too, after all onboard ethernet also needs drivers).
Anyway, if you disagree then you disagree I suppose. It just seems like the least necessary port on a modern laptop.
This is what I thought, but the current T14 gen 2 (and T15) list soldered RAM for most models: https://www.lenovo.com/gb/en/laptops/thinkpad/t-series/T14-G2-Intel/p/22TPT14T4N2 The T15g doesn't specify soldered RAM, so it might have both slots available. I found this out when looking at new laptops recently, since I quite like ThinkPads. Mostly it seems to be one soldered slot, so you can upgrade the other, but that still limits how far you can upgrade, and I'm not sure it works for dual-channel memory (it didn't used to), which may also mean your onboard graphics no longer run as Iris and fall back to UHD.
Yep, size is the limiting factor. I've got here: 1600x900 14", 2560x1600 16", 1920x1080 23". Which of these can I usefully place the most windows on? You guessed it: the 23" screen. Yes, the QHD looks nicer, and I can see the pixels on the others if I put my face a little closer than normal or really look for them, but the 23" I can sit a comfortable distance from and still have it take up a useful area.
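The pixel densities for those three screens work out as follows (standard pixels-per-inch formula; the panel sizes are the ones quoted above):

```python
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display from its resolution and diagonal size."""
    return round(hypot(width_px, height_px) / diagonal_in, 1)

# The three screens from the comment above:
print(ppi(1600, 900, 14))   # ~131.1 ppi
print(ppi(2560, 1600, 16))  # ~188.7 ppi
print(ppi(1920, 1080, 23))  # ~95.8 ppi: coarsest pixels, most usable area
```

Which is the point: the 23" panel has by far the lowest density, yet at a comfortable viewing distance its physical area is what lets it hold the most windows.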
Thanks for the clarification.
One can of course patent an improvement on an existing patent; again I'm not sure whether Siemens could complain about that at all, and even if the filing was confidential, the patent, if accepted, can't be, so they'd find out anyway. If Theranos did patent something based on, for example, reverse engineering that they had a contract saying they must not do, then that would put them in pretty sticky territory. Could Siemens force them to hand such a patent over? Don't know. Asking investors to agree to non-disclosure might be normal while the patent is still being approved anyway (for example, if planning international patents afterward).
All of which is to say, I find the trade secret line pretty weak. But if they were worried about Siemens' lawyers then filing a patent, whether the filing is confidential or not, would have been pretty stupid, as it will be made public on approval; that's the trade-off.
Of course, there remains the possibility no-one involved at that stage actually knew what they were doing. But it seems simpler to ascribe all this to excuses to hide the information from their investors and pretend they had secret technology that actually worked, rather than admit, "We're using standard kit and diluting the samples, yes there's a reason the manufacturers don't suggest that."
No, they didn't patent the mods, this is the difference between a trade secret (where your "only" protection is contractual, but that can still be relatively strong because there is actually some legal protection for them which prevents competitors acquiring them through employees who have broken their non-disclosure for example), and patents, where the details are published and in exchange you get time-limited exclusivity. The downside to a trade secret is you don't have exclusivity if someone else reinvents the same process. From the description Theranos had not patented these modifications, because then trade secret wouldn't apply.
The situation described (modifying the Siemens machines) may not have any comeback for Siemens unless it broke terms in their contract of sale, such as a prohibition on reverse engineering. There might still have been legal issues about untested medical devices, but I'm not clear on US law in that area; medical devices are generally less regulated than pharmaceuticals. In any case, if prospective investors were required to sign NDAs (which I don't think is uncommon) then the modifications would still have been eligible for trade secret protection (similar situation if you are still applying for a patent, where an NDA means it is not ruled out by being public knowledge).