I do the same with my Revolut card, which has nice levels of security. All main online banking is done at home and secured by HBCI.
Contactless payments are generally for the benefit of the payment providers, not the users.
Both of those are just shiny baubles to me with the latter being a battery sucker
I never use NFC myself, but it is generally passive, otherwise how would contactless card payments happen? And a card is what I use for paying for stuff anyway: why should I get my phone out to make a payment when a far less fragile card will do?
Wireless charging is very convenient and becoming more widely available. But, particularly with USB-C, it's not as if plugging in a cable is beyond people.
Failures can be catastrophic and the consequences devastating and some images cannot be unseen, doubly so with the internet, so this is standard practice. You can always release any unseen footage later but more important for investigations will be the telemetry and how forthcoming the company is with the data.
Certainly, using the constellation for broadband provision would make sense (particularly for those UK households unable to get decent speeds). It is, after all, what it was designed to do.
Given how tiny the UK is, you could easily provide the same service with a few of Google's dirigibles. Satellites are designed for covering large areas of the earth with their signals.
Oh, hang on. Is that the 19:21 Gravy Train? I have to catch that one! If you find it, mine's the one with "A Short History of Barnard Castle" in the pocket, but you can keep that.
Well done, you! While it's easy to laugh at e-bikes as somehow a soft option (they are, but so what?), they do get people out in the fresh air and, as you have to pedal, there is always going to be some gentle cardiovascular exercise. And, of course, they make light weight of any shopping.
I remember when I first saw them and realised that it meant the grandparents could enjoy a day out with the grandchildren. What's not to like about that? I also met a 70-year-old out in the hills who was still able to cycle around 100 km with the help.
There are still problems with them, largely related to the separate motor and gearing systems: many people leave them in the highest gear with the highest degree of support, which means they don't half accelerate at junctions, which is also where most accidents with them occur. I suspect we'll start seeing combined control systems so that people don't have to think about the gears either.
Another interesting effect they're having, by driving up the average price of bikes, is that bikes are gaining status, which means people are more willing to spend money on keeping them maintained: so they go for regular inspections rather than being ridden until they start falling apart. Though I also doubt that you can get a really usable bike for less than £100.
Dear Def,
billions of clicks from your less enlightened brethren beg to differ…
Actually, targeting ads generally appeals to the same sort of companies and users of services like Groupon: they have a shitty product that they're desperate to get rid of.
Context is everything, so just wait for targeted product placement coming to a streaming service near you, with products being placed in real time on the breakfast tables, cars, etc. of the shows you love. I wish I was making this up, but AFAIK it's already been patented.
True. This is one of the reasons I didn't go with Withings.
But there are exceptions: my digital scales from Soehnle have an online storage option, but the default is storage on the device, with the option to sync to my phone. I think the same is true for their other devices, but I only really want to know how fat I am…
Why, one might ask, does the same reasoning not apply to the entire JavaScript engine? Should Mozilla just migrate to V8?
Well, regexes really count as DSLs (domain specific languages), which is why the two browsers have been using the same approach for years. The change presumably includes some kind of binding so that the V8 engine can be called directly and doesn't have to be ported.
The same does not necessarily apply to the JS runtime. But I do think we will start to move towards consolidation in other areas such as HTML, CSS and JS parsing, areas where Mozilla's use of Rust might have advantages.
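To illustrate the DSL point: a regex pattern is a little program that gets compiled once into an engine-internal form and then executed many times, which is why the engine behind the compile step can be swapped without touching the host language runtime. A minimal Python sketch (the pattern and function names are my own, purely for illustration):

```python
import re

# The pattern string is the "DSL source"; compiling it once produces
# an engine-internal program that can then be run against any input.
VERSION = re.compile(r"(\d+)\.(\d+)\.(\d+)")

def parse_version(s: str):
    """Execute the pre-compiled pattern against a string."""
    m = VERSION.fullmatch(s)
    if m is None:
        return None
    return tuple(int(g) for g in m.groups())

print(parse_version("1.2.3"))  # (1, 2, 3)
```

The host code only ever sees the compile/execute interface, so the engine behind `re.compile` (or, in a browser, behind the regex built-ins) can be replaced wholesale.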
I think the main reason cyclists have so few serious accidents in the Netherlands is that bikes are always taken into consideration when planning roads, thus minimising competing traffic streams. Add to this the fact that most Dutch people cycle, and that liability nearly always lies with the more powerful vehicle.
Only according to Nathan Barley.
These toys do not:
Oh, and helmets are pretty fucking useless for all the knee and hand injuries that people suffer with them.
Depends on your definition of "on time", considering it was originally due in April. And it only got approval about its handling of data privacy from the CCC after massive initial criticism. The € 20 m will certainly have helped pay for a few of the strategy boutique meetings.
More importantly, if the Entsendegesetz had been applied to the meat processing industry years ago, thousands of people in meat processing plants wouldn't have been put unnecessarily at risk or the good burghers of Gütersloh forced to stay at home. Still waiting for the technological solution to treating employees like shit that doesn't involve replacing them by robots.
I'm sure Apple knew that benchmarks would be released, so I wonder why they bothered banning developers from providing them; apart from the usual paranoia, that is. Presumably because it gives them plausible deniability over benchmarks for code that is almost certainly not optimised, while at the same time knowing that the leaked scores will keep people talking about the speed of the new chips.
Will we see improvements over time as the compiler and emulator get better? Will we see a baseline that allows Apple to gather wows™ when new hardware is released?
Interestingly, Geekbench notes that the chip running on the Transition Kits has four cores…
Not really, no point in shipping big.LITTLE silicon for what is a non-mobile developer workstation, and Apple will presumably be aiming to ship beefier kit in its own hardware later this year, otherwise it would probably be selling it already.
They certainly tend to stick around once they're in house. Got a similar problem with a client currently migrating from MS Office 2010 to Office 2019 and sticking with 32-bit because of the fecking add-ons, which are probably no longer even maintained.
Good that there are now other options with things like PyXLL.
Not really, a table at best. But at least Excel provides some kind of data typing, which is why it's become so ubiquitous. I've recently seen this confirmed in a project which relies on data passing through several hands before it can finally get into the database. Using Excel as the file format has led to far fewer errors than I would ever have imagined.
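The typing point is easy to demonstrate by contrast with CSV, where every value degrades to text on each pass through someone's hands, whereas an xlsx cell keeps a number a number. A quick sketch of the CSV round-trip problem:

```python
import csv
import io

# Write a row of typed values out as CSV, then read it back:
# everything comes back as a string, so each consumer in the chain
# has to re-guess the types, which is where the errors creep in.
buf = io.StringIO()
csv.writer(buf).writerow([42, 3.14, True])
row = next(csv.reader(io.StringIO(buf.getvalue())))
print(row)  # ['42', '3.14', 'True'] -- all strings again
```

An Excel cell, by contrast, stores the value together with a type and a format, which is a big part of why data survives multiple hand-overs better than it has any right to.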
which is now in the 2013 version rather than the 2007 version. This may cause problems for Word 2010 users, to whom the advice is to "upgrade to LibreOffice".
Unlikely: the transitional spec was really developed for MS Office 2007 and was incomplete, so that changes were made even after it shipped. Microsoft itself deprecated Office 2007 a while back so it's not uncommon to come across OOXML documents that Office 2007 cannot process completely but I've not seen such problems with Office 2010 which saw far greater adoption, largely on the promise of stable document formats.
Funny how people from Google never comment on how it always seems to be an iPhone that the FBI, or whoever, has difficulty decrypting. It never seems to be an Android phone. Funny that.
Nice bit of whataboutery which isn't true. iPhones are encrypted by default; with Android you normally have to enable it manually. But that doesn't mean Android isn't secure and, just like Apple, Google cannot provide the keys to decrypt an encrypted Android device.
Apple's decision is more about scope: how much functionality can a browser provide safely? For several years there have been many people championing the browser as an OS. In order to do so this requires replicating OS services, which works well in some situations: notifications, hardware acceleration and even location services can make a lot of sense on (mobile) devices. But in other situations it essentially means breaking open the browser sandbox.
We got it once a year from my grandma in Plymouth, back when it took 8 hours to drive there… A few years ago I found my local American-British store was stocking it here in Germany. And why not? It's damn fine stuff!
Oi! El Reg, where's the scone with jam and cream icon?
It was probably chosen for two reasons: tax and regulation. Wirecard AG didn't fall under the remit of BaFin (the FCA equivalent) but under the very parochial local government of Upper Bavaria. Mind you, it does look like BaFin also fucked up, but it has a history of this on big-money projects. German savers seem to be magically drawn to "too good to be true" schemes in other countries.
Yes, I've seen that behaviour in some companies as well. It used to be the argument against open source, until it turned out that commercial software was just as shit but you wouldn't have a chance of finding out until it was exploited.
Personally, I don't agree with either the current practice (all problems can be solved by an update) or the disclaimers in many licences. But the point is that I can't think of any court cases. Unlike, say, those that have upheld the GPL. No doubt there will be some case at some point, but IIRC in the US there are some wide-ranging exemptions. Otherwise Microsoft would probably have been bankrupted multiple times in class action suits over ActiveX, which wasn't just an oversight but a design goal waiting to be exploited.
I don't think the law is on your side. Currently, software developers are not subject to strict liability, which is why we live in a world of updates. Then there are the licences: most open source licences explicitly exclude liability, and I've not seen the clauses invalidated by any court yet. It's not as if commercial software is immune to such stuff either: both flaws in its own code and in its liberal use of open source libraries.
While it is possible to check for known vulnerabilities in libraries, there's basically no way around extensive pen testing for modern web-based applications. But getting customers to pay for these is another matter. As is paying for updates of the software stack as new vulns get discovered.
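The "known vulnerabilities" half of that is the mechanical part: compare pinned dependency versions against an advisory feed. A minimal sketch; the advisory data below is invented for illustration, while real tools pull from feeds such as the OSV or NVD databases:

```python
# Hypothetical advisory: "examplelib" is vulnerable in all versions
# below 1.4.2. Ranges are (introduced, fixed) version tuples.
ADVISORIES = {"examplelib": [((0, 0, 0), (1, 4, 2))]}

def parse(v: str):
    """Turn '1.3.0' into a comparable tuple (1, 3, 0)."""
    return tuple(int(x) for x in v.split("."))

def is_vulnerable(name: str, version: str) -> bool:
    """True if the pinned version falls inside any advisory range."""
    v = parse(version)
    return any(lo <= v < hi for lo, hi in ADVISORIES.get(name, []))

print(is_vulnerable("examplelib", "1.3.0"))  # True
print(is_vulnerable("examplelib", "1.4.2"))  # False
```

Pen testing is the other half precisely because this kind of check only catches what is already in the feed; logic flaws in your own web app never appear in anyone's database.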
For the next few years there will be x86 Macs sold, including new Intel Macs that haven't been announced yet.
Apple said they expect the transition to take about two years, i.e. two years from now they will not be selling x86 machines. x86 will continue to be supported by the OS for "years" (my guess would be 5-6, in line with current practice of deprecating hardware chez Apple), but it may soon move into "maintenance" mode if they can sell enough of the new ones.
We won't really know until the new devices appear towards the end of the year and we can see whether Apple silicon does have better TDP and memory performance than x86. For example, same battery life / performance as now but in machines < 1 kg. That would be serious bragging rights. But let's see what they come up with.
I've been running Windows in VMs on MacOS for over 10 years and never found it to be slow. As long as the VMs can use the hardware hypervisor there's no reason why it should be.
There will be a hit for Windows 10 x86 on MacOS ARM because of the emulation. Apple is clearly saying to Parallels and VMWare: if you want performance, it's up to you to deliver it, but, again, the hypervisor might help here for CPU stuff. Bigger problems will be with anything wanting to use x86 hardware acceleration (MMX, etc.), because Apple might not make any optimisations it's done for this available to other software, as is already the case on MacOS for codecs. But it also has a vested interest in Windows on MacOS not being completely unviable. Guess we'll soon see reports from people running QEMU on the new developer boxes.
Germany took a huge bath when it last subsidised chip factories: the factories were closed before the tax credits ran out. Though the pressure to repeat this when the coalfields in eastern Germany close will probably mount.
Chip manufacturing is very complicated and not all value is created in the fabs, which is one of the reasons why these are moving to lower (though not low) wage economies. There's still plenty of money to be made in chip design and in the machines that actually make the chips. IIRC the world-leader on chip lithography is a Dutch company. Countries are focussing on keeping those companies in European hands, because once that IP walks, we really will struggle.
You can report IP addresses to ISPs but nothing ever happens.
Such IP addresses are regularly blacklisted by other ISPs. It's just not effective, because the addresses can be recycled fairly quickly, so you can get assigned one after it has become blacklisted.
More important is setting up honeypots long enough for law enforcement bodies to gather data, not only about infected machines, but more importantly about the networks controlling them.
In general I agree but the boot sound on a Mac is useful if you're trying to get into recovery or single user mode: it tells you when you can let go of the keys.
Likewise, some animations can reinforce the UX by extending the metaphor: think of button presses, slides in and out, etc.
But I've never understood the fashion for transparent elements in the UI, which only seem to distract.
I've used Skype for projects where screensharing is important and it's been pretty good on the whole. Occasionally there are problems getting audio to work properly, so a restart of the call is necessary, but at the moment, as a multi-platform desktop conference app, I'd reckon it's the best: the Electron-based stuff just chews CPU cycles.
Probably, for the two who do it. It was fashionable when the Intel Macs came out for people who wanted the status of Apple hardware to run their Windows apps but virtualisation was generally good enough for most things - I was certainly using Windows XP on Parallels to remote control InDesign in 2008 without many problems.
Rosetta is something that the user will not even notice.
Bollocks. The approach will be very similar to that taken on Android, where after every update the system tries as much as possible to run apps through a JIT to get native code. Intel was able to make use of this for a lot of Android-for-Intel stuff. Except it didn't work for everything, and for some things it definitely was noticeable.
Most stuff using Apple's APIs should transpile pretty well but there will always be exceptions and anything making heavy use of x86 specific optimisations could be noticeably slower.
There's no reason to suggest that Apple's engineers will make chips that are any more "secure" (for users) than Intel's.
ARM designs are inherently more customisable, which means Apple can put more of the stuff it wants in silicon, whether that's video codecs, encryption algorithms or machine learning. This, in turn, should mean fewer demands on the CPU, which should be good for battery life and heat generation. Custom hardware also makes software even more Apple-specific, i.e. increasing lock-in for users.
That said, I'm looking forward to the first devices to see how they stack up.