445 posts • joined 22 Feb 2018
Ofcom asks networks, ISPs: Hey, wouldn't it be nice if you let customers know the best deal once their contract's up?
Re: What is this out of contract thingie?
I'd like to do that. Unfortunately, I have been trying, and I can give you the results: there are no results. I'm not looking at U.K. contracts, but I assume they're much the same.
Each provider I check has several different plans, all of which seem very similar. I'm a very simple customer; I already have my hardware and I don't need very much data. Each plan has the following variables:
High-speed data cap
Price difference for different number of lines
Little of this matters to me. Ideally, I'd like a plan with a specific price for each type of thing so I could just live with it, and I'm fine with caps. Whenever a plan that simple exists, its price is much higher than all the other plans where the details are less clear. Or sometimes there is extra complexity to make up for it, such as plans where there is also a charge for the SIM itself (I don't know why). I have found no resource that allows for easy comparisons, and I have had enough conversations with the people at the mobile providers to know that I dislike having conversations with the people at the mobile providers.
The bandwidth and storage costs are not difficult at all. Audio can be compressed a lot. Some codecs are optimized for voice and can retain most of the information needed for speech in tiny files. And that's only until they get better at speech recognition, into which they have been pouring millions; then you're just storing text. They're already doing something like this. If you use google's captcha and choose the audio version, it makes you transcribe a few words from some recording. Since you're usually blocked from whatever the thing is, it's probably better not to think about where that recording came from or what they're using your response to do.
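To put rough numbers on the compression claim, here is a back-of-envelope sketch. The bitrates are illustrative assumptions in the range speech-oriented codecs (the Opus/AMR class) typically use, not any provider's actual figures:

```python
# Back-of-envelope storage for compressed voice; bitrates are assumptions.
def storage_bytes(bitrate_kbps: float, seconds: float) -> float:
    """Bytes needed for `seconds` of audio at `bitrate_kbps` kilobits/second."""
    return bitrate_kbps * 1000 / 8 * seconds

MINUTE = 60
# Speech-oriented codecs can run somewhere around 8-16 kbps.
low = storage_bytes(8, MINUTE)    # 60,000 bytes, about 59 KiB
high = storage_bytes(16, MINUTE)  # 120,000 bytes, about 117 KiB
print(f"one minute of voice: {low / 1024:.0f}-{high / 1024:.0f} KiB")

# Once recognized as text: roughly 150 spoken words/minute, ~6 bytes/word.
print(f"one minute as plain text: about {150 * 6} bytes")
```

A minute of voice fits in roughly 60-120 KB at those assumed rates, and under a kilobyte once it's text, which is why the storage side of mass recording is the easy part.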
October? That's a while
So they became aware of this back in October, and sent notification in December? If they were able to find and fix the bug in ninety minutes, which seems entirely plausible and not at all some random low number that someone made up, surely they could have identified the people whose data was read in two hours and sent them and the relevant oversight offices notification in three. What were they doing in the intervening months? By the way, isn't there some new regulation that says notification should be sent within 72 hours? Something about the protection of data? No, I must be imagining things.
Re: Here's the thing...
I somewhat agree with your points, but I think broadband should be defined higher than you do, because dialup doesn't work anymore. If we took your connection down to dialup levels, browsing wouldn't work. Some sites still stick to small files and have done with it, but a dialup connection would take forever to load the image-, script-, and media-heavy sites out there. It is not feasible to use those services without the scripts and images, at least most of the time. So I think we should find a reasonable lower bound on the speed needed for standard browsing, and then define broadband as some level significantly higher than that. Otherwise, I think your ideas are good.
Re: What does "broadband" get you these days?
Broadband gets you a lot of things. It gets you el reg and sites like it that don't exactly have tiny text-only article files, and most other news sites, which go straight past medium-sized files into the tons-of-images articles that would load in minutes over a non-broadband connection. It gets you the ability to stream video, which may just be entertainment, but may also be something like online educational materials. It gets you the ability to videochat with your friends or coworkers. It gets you the ability to download a linux update in minutes, or a windows update in an hour, rather than hours and a week, respectively. It gets you the ability to check job boards, go online to reserve a flight, or, yes, spend some time enjoying the content that the internet provides.
We've seen slower everything. None of us needs things to go faster; we'd all survive even if using a computer from 1990 with its modem access. We choose to use faster things not because they are truly necessary, but because they are more functional and useful. And when we use those things, the slow internet becomes obsolete and gets dropped. Like it or not, the 1200 baud modem of yore doesn't work anymore. Even if you managed to connect it to the modern internet, you couldn't use it for anything. Even an SSH session uses more data now. The modem is obsolete. Dialup is obsolete. They should be replaced.
Re: It stinketh
Take a look at any ISP website and the large amount of space it devotes to its gigabit fiber system. That's always fun to read, right before you click the "is this available for me" button and get told no. I'm beginning to think they just have one fiber line connecting their headquarters to their off-site datacenter, but can advertise it anyway because they technically have the service; just not in your area.
No thanks. The benefit of a unidirectional hyperlink is that I don't have to control, or even know, the person who controls the other end of that link, and I can link to things that are relevant to whatever I'm saying without having to worry about whether what I'm saying is relevant to them. Worse still, I could see this causing an annoying amount of spam from people wanting to link to my page so I'll link back to them, which I'd have to wade through because somewhere in there would be one with useful content. Essentially, double-ended links would try to turn us all into search engines, and I'd rather not do that.
I somehow doubt that would work. You would need to be able to fake a signal from a real windows device, which I presume is encrypted so it's hard to forge, and you would also need to know the contents well enough to craft misleading data that doesn't get automatically rejected. If they caught you doing it, they'd just autodelete anything from your network connection and carry on. If you have a way around these things, I'm interested to hear it.
Re: Java f'in script !
Client-side scripting has its place, but to use some of your examples, I could do without intelligent forms. I don't need someone to store my address on my computer in a format that only works with their site; I can use one of the many browser add-ons that auto-fill forms should I get tired of entering my address. I also don't need overactive warnings about things I already know, telling me that my phone number is not a valid number because I haven't finished typing it yet, or that a person can't live at an address whose country says "please select". This kind of thing could be done with the HTML5 form attributes that let simple validation rules be sent to the browser, which requires much less client-side parsing and, incidentally, would annoy me less.
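For what it's worth, a minimal sketch of that declarative approach (the field names and endpoint are invented for illustration). The browser enforces these constraints itself, at submit time rather than mid-keystroke, with no script at all:

```html
<!-- Validation declared in markup; the browser checks it only on submit. -->
<form action="/order" method="post">
  <label>Phone
    <!-- pattern is a simple regex; no half-typed-number nagging -->
    <input type="tel" name="phone" pattern="[0-9 +().-]{7,}" required>
  </label>
  <label>Country
    <select name="country" required>
      <option value="">Please select</option>
      <option value="GB">United Kingdom</option>
      <option value="US">United States</option>
    </select>
  </label>
  <button>Submit</button>
</form>
```

Note the trick on the select: `required` plus an empty-valued placeholder option means "please select" fails validation, but only when the user actually tries to submit.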
So Dev Ops fixes everything, huh?
This article does a pretty good job of explaining how bugs can be a major problem. And then it comes along with the following line:
"In a carefully architected DevOps process for a web application, [...] the cost of fixing a bug found late may not be too bad."
Let's discuss this. A dev ops process has no good definition. I've read the dev ops articles here. They have essentially put the dev ops™ label on every known good coding, management, or systems concept under the sun. Unit testing? A primary precept of dev ops™. Ensuring security? Meetings with managers where the developers are listened to? Having firm documentation about policies for development and usage of the code? All dev ops™ concepts. This all assumes a utopian ideal of development and management style, anyway. The problem is that none of these things are actually connected; articles about good practice simply have dev ops plastered onto them. So there is no clear way to identify what, exactly, about dev ops makes these bugs so easy and painless to solve.
Or is there? Let's fill in that gap in the quote.
"where a code change can be made, tested automatically and deployed into production rapidly"
So that's a no. Dev ops articles frequently mention agile as a development style, and the quote above clearly describes a system that works similarly to agile, in the sense that code is supposed to change often and reach production quickly. That has no particular benefit where bugs are concerned. Bugs will still happen. When agile is done wrong, bugs happen *more* often, because managers think agile coders should always be moving on to new functionality rather than repairing things. Agile does at least mean that bugs should be patched more quickly, which has some logic behind it. However, it has no specific mechanism for making bugs less dangerous.
Let's talk about the "tested automatically" part, too. You can't test everything automatically. Unit tests are great; I expect competent devs to write them and to push every change through them. But unit tests do not catch every bug. There may be tests that nobody thought of, or that someone thought of but nobody wrote. Worse, there may be a bug that you either can't test for or won't notice until things are put together. Consider the heartbleed bug discussed in the article. On its own, the flaw doesn't really mean anything. Unit tests of invalid data could have caught it, true, but there are a lot of types of invalid data, only one of which triggers this. Only when combined with something like a webserver does this bug become so noticeable.
That's not something a unit test, written by one person and never looked at again because "the automatic system will handle testing", is going to notice for you. It's something you catch when devs write unit tests and manually run the test suite, look at the output, and think "I wonder what would happen in this case, but there is no test for that. Let's see.", and when people do larger real-world testing on larger components. An automatic system cannot possibly try all types of standard input to a large program and properly interpret the results, but a QA department can.
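As a concrete sketch of the "test nobody wrote" problem (`parse_record` is a made-up stand-in, not the actual OpenSSL code): a Heartbleed-shaped bug lives in the gap between a length field and reality, and the interesting test is the one where the framing is perfectly valid but the length lies.

```python
# Hypothetical parser in the Heartbleed mold: the first byte claims the
# payload length, the rest is the payload. Not the real OpenSSL code.
def parse_record(data: bytes) -> bytes:
    """Return the payload, verifying rather than trusting its claimed length."""
    claimed_len = data[0]
    payload = data[1:]
    if claimed_len > len(payload):  # the bounds check Heartbleed lacked
        raise ValueError("claimed length exceeds actual payload")
    return payload[:claimed_len]

# The obvious test everyone writes: well-formed input round-trips.
assert parse_record(b"\x03abcde") == b"abc"

# The test a human has to think of: valid framing, lying length field.
# Without the bounds check above, this asks to read past the buffer.
try:
    parse_record(b"\x40abc")  # claims 64 bytes, carries 3
    raise AssertionError("over-read went unnoticed")
except ValueError:
    pass
```

The first assertion is what an auto-generated happy-path suite covers; the second only exists because someone stared at the format and asked "what if the length field is wrong?"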
By making testing simply a speed bump on the road to production, rather than a required turn, you make it a lot easier for things to get through inadequately tested. "Write fast and fix things when you find them" doesn't work for rocket launches either.
Re: I cant believe that...
What would you use? As I said above, you can't register a domain to use for mail until you already have an email address to create the account with, which means getting one from somewhere else. I send most of my email through my own domain, but I have to keep the domain registration account on another address, because the last thing I need is for my domain to break and require me to log in to an account that uses that domain in order to fix it. I'd rather not use gmail for this, but I don't see a better choice.
So if we accept that a third-party email service is required for this, which should we choose? I'm not accepting the companies you pay for a mail account, because I've had enough of them fail. They change server settings, move your mail to a different handler, or disconnect things, and then you end up in their customer support maze. No thanks. I have put some accounts like this on protonmail, but I am a bit concerned about trusting something this important to a service that relies on donations and doesn't really have a business behind it. If protonmail didn't get enough donations, there would be nothing I could do to keep any of it alive. Say what you like about gmail, but at least I know that, if Google goes down, my domain problems are probably dwarfed by whatever took Google out. Google can't see my mail, because the domain mailserver handles that. They can't get malicious and log in as me, because I have 2FA enabled and they don't know the password (and there is little likelihood that they would try anyway). So the major security problems with these mail services do not apply to my situation.
If you have a better option, I'm open to trying it. So far, I have not found one.
Re: Using Yahoo! mail for something important?
That's the difficulty with domains. In order to have email through one, you have to register it. In order to do that, you need an email address. So the only convenient way to do that is to get an address from someone else and use it to get your domain and then your other addresses. To some extent, you could use your new addresses to reserve future domains, but that could still result in an all-eggs-in-one-basket situation if your primary domain breaks.
For example, I use my own domain for most email going to me, and that domain is backed up with a registrar account on a gmail address. I don't like that, and I'd rather have my own mailserver running that, but if something broke in my mailserver, registrar account, or domain, I'd be completely cut off. So gmail it is. If there is a way around this, I'm all ears.
And one requirement to use this is
Have they set up any system that informs people if they have been included in a breach or at least lets them check? I haven't read everything, so I suppose it's possible, but I would figure that if such a thing existed the company would have referred to it in their statement or the article would have mentioned it. If they indeed lack such a feature, is it because they don't know whose data was breached or they don't want to tell people? Of course, this makes it hard for a customer to know whether to do anything and therefore whether to ask the company for damages. So I'm assuming nobody will be informed.
And the next 7nm laptop processor will be designed by In, er, AM, um, Qualcomm: The 64-bit Arm Snapdragon 8CX
Re: Linux workstation?
So use a standard quiet workstation if you want to move everything to the server. Using a new ARM chip that isn't really that inventive won't change much, so the only direct benefit is the cellular connection, and you can use your phone's hotspot or a USB modem for that, so it's not a major thing. I'll grant that it might be good for battery life, but you said workstation and implied that it sits on the desk connected to the mains, and you can already get pretty good battery life with a minimal Linux on a low-power laptop. Depending on your requirements for local power, you can use the following, in increasing order of performance and price:
1. Raspberry pi: It will run debian just fine, provides all standard packages, and can be a perfectly good client for a server.
2. Those windows 10 mini-PCs with atom processors. They usually run Linux just fine, and provide a few more features than does the pi. The atom isn't great for local work, though.
3. Those cheap windows laptops, reinstalled and running as a Linux workstation. This gives you portability, a processor that will handle low-end tasks with ease, and the stability of a battery backup.
4. A NUC or other small computer (most manufacturers have one, and there is one from system76 that was designed specifically for Linux use). This can get you up to an I7, which should be totally fine for most local tasks, especially as you offload to a server for anything bigger than that.
This chip doesn't provide any useful features for your use case.
Re: What is "Fast Enough"?
I agree that it would not be good enough for me to buy one, but I'm afraid that it probably will cut it for the users that don't know what it is and the business users that think these features will make the devices they buy ultrasecure and thus better. Hopefully this doesn't start happening with the next generation of desktops and laptops, but it might.
The business may be legitimate, but that doesn't make it a good idea. In hostage situations, these consultants can actually mediate with the abductors. For ransomware, the business is essentially just paying the ransom, with the only additional service being converting money to bitcoin for the user. That's not worth this markup or any markup, and it doesn't fix the problem of people paying ransoms when they shouldn't. But I suppose the business could eventually make itself somewhat ethical.
Maybe, although this depends on the nation's laws. But that's all moot because he DOESN'T tell his customers that. He'll be very honest when telling the ransomware people that, but his customers are under the misapprehension that he is cracking keys in a more honest way, which is probably why they pay him.
It is, at least, false advertising. Given that this person does not actually decrypt anything, the service is useless against any version of ransomware where paying the ransom doesn't help (quite a few of them). In addition, it is unethical to find the actual cost and then multiply it by ten; that's not what an honest broker would do. A broker who was up front about being a broker, and about the fee, would at least be acting ethically, but the service would still be pointless and a very bad idea.
Re: This has been available for a while
I can only speak for the one WD drive I have, but it has withstood my regimen, which included a full format and then transfer and reading of about 2.5 TB of stuff. I haven't put it through much more, so I cannot speak for its reliability. That said, I find that any time a drive gets a physical workout, it will either die within days or survive for the rest of time.
Asia Pacific: usually limiting Asia to China and southwards plus India and eastwards, sometimes including Australia and New Zealand, but sometimes stopping at their north shores. Combined with such wonderful region identifiers as EMEA, which is Europe, the Middle East, and Africa, because those are essentially all the same (the Middle East may or may not include Iran), and North America (Mexico optional), they make up the typical three markets mentioned.
Tape vendors feel the cold, clammy hand of AWS on their shoulders. Behind them grins the Glacier Deep Archive
Hard drives have their downsides as backup media, but they are the most available and functional media for personal use. I do not have enough data to justify purchasing a tape drive, because I have not seen one that is sensibly priced for home use. So what other choice is there? I could use optical media, but that degrades as well, and even the largest capacity conventionally available, blu-ray, isn't very big compared to disk. I could use exclusively SSD storage, but that would be much more expensive, and SSDs can also fail. So my backup strategy uses disks. They are independent, such that I can sustain failures in some of them without losing anything, and I don't have a better option. If there is something I haven't thought of that would be feasible, I'd like to hear about it. For now, I have a system which is pretty reliable.
That's true, but glacier provides a lot that those hard drives don't. Among other things, I don't trust a two-bay RAID system. It looks like a really easy way to lose both disks: one will fail, and the other will probably fail too under the stress of rebuilding the mirror. It also doesn't protect against something that takes out the whole NAS, such as flooding, fire, a lightning strike on the power or data line, theft, or dropping it. I'm not necessarily advocating for cloud backup here; I tend to put a lot of backup data on hard drives which I store off site, but the two are not comparable.
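A rough way to put numbers on the two-bay worry. Every figure below is an assumption for illustration, not drive-vendor data, and the independence assumption is the generous one, since mirrored disks are often from the same batch and age under identical load:

```python
# Illustrative odds of losing a two-disk mirror in a year. All numbers are
# assumptions, and disk failures are treated as independent, which flatters
# the mirror: same-batch disks under identical load often fail together.
AFR = 0.05            # assumed annual failure rate of one disk (5%)
REBUILD_DAYS = 2      # assumed time to resilver onto a replacement
STRESS_FACTOR = 3     # assumed extra wear on the survivor during rebuild

p_first = AFR
p_second_during_rebuild = AFR * (REBUILD_DAYS / 365) * STRESS_FACTOR
p_lose_mirror = p_first * p_second_during_rebuild

print(f"chance per year of losing both disks: {p_lose_mirror:.5%}")
```

The independent-failure arithmetic comes out small, which is really the point above: fire, flood, lightning, and theft take both disks at once, and no independence assumption applies to those.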
Re: I would love to see their service engineering docs
Engineer: Well, we capped it at eleven nines before for an event that has a likelihood of occurring once in a hundred billion years, like the sun going red giant on us. That would mean that our geographically redundant system wouldn't help much.
Marketer: What would you need to add some nines to that number?
Engineer: Well, we already added one last time on the assumption that we'd have travel outside the solar system by that point, so a really cataclysmic thing would only happen one in ten times a star destroys all its planets. And because you wouldn't stop badgering us. So we can't really--
Marketer: [interrupting] We need something new to distinguish our new product from our old product.
Engineer: Well, it costs less. How about that?
Marketer: But it's also better, right?
Engineer: We already have tapes with redundant copies stored in libraries on six continents, in a total of eighty-nine datacenters, in order to let me sleep soundly with all the nines we have now. We did open a few more datacenters to store the tapes, but not really enough to make the number any longer. Also, adding more nines would be pretty pointless.
Marketer: I give up. We'll think it over and see what can be done.
Accountant: Just slap two more nines on the number; it's meaningless anyway. We haven't listened to the engineers for years. They still think we have eighty something datacenters all over the world.
Marketer: What do we actually have?
Accountant: A warehouse in South Dakota.
Marketer: So our real level of reliability is?
Accountant: It's a pretty good warehouse. Maybe 99.9% or so.
Marketer: What happens when people find out?
Accountant: I'm still paying that engineer, the one who's in charge of and thus responsible for our entire glacier system, aren't I? Don't worry, I've covered every contingency. Our employment and liability is 99.999999999999% secure.
Marketer: I should probably quit before something happens, right?
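For anyone who wants to check the sketch's arithmetic, availability nines convert to permitted downtime as below. (Glacier's famous eleven nines are actually quoted for durability, a different metric, but the scale comparison stands.)

```python
# Convert a "number of nines" of availability into downtime per year.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def downtime_per_year(nines: int) -> float:
    """Seconds of downtime per year permitted at the given count of nines."""
    unavailability = 10 ** -nines
    return unavailability * SECONDS_PER_YEAR

for n in (3, 5, 11):
    print(f"{n:>2} nines: {downtime_per_year(n):.6f} seconds/year")
# 3 nines (99.9%) allows roughly 8.8 hours a year: warehouse territory.
# 11 nines allows roughly a third of a millisecond a year, which no one
# could ever measure, let alone verify.
```

Which is the accountant's point: past a handful of nines the number stops being an engineering claim and becomes marketing copy.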
Re: The question to ask is whether self-driving trucks will ever be a thing.
I disagree. The reason I want one is for efficiency, yes, but my efficiency while riding in it. I can use the time spent in transport working on things, or reading, or something useful. Obviously, that's not safe to do yet, but there are real benefits to the users of the personal cars in addition to companies doing transport. Nobody thinks it is acceptable if a car will just crash into things, which is why I'm not expecting to get one of these for at least a decade, but there are many reasons to want one, even if you personally don't.
I have a suggestion that might help to reduce vulnerability to notPetya and similar malware, and it is to PATCH THE ETERNALBLUE VULNERABILITY ALREADY. The patch was released for every windows version in March of 2017, and the first time it became really obvious that the patch mattered was that May. It's now been eighteen months. What excuse is there for leaving EternalBlue open this long? Every basic malware release now uses it, because it evidently still works. Fix it.
I'm not arguing that it makes business sense to do any of this, nor that I actually expect Google to even consider it, nor that if Google did it that China would improve. However, it would be a way they could comply with Chinese law without abetting the human rights abuse. That was the original question: were we who view Google's Chinese search engine as a problem suggesting Google break Chinese law. I answered that question.
For the record, just because someone else will do a bad thing does not make it reasonable for you to do so. I also don't remember Germany blocking Google because it didn't censor, nor their asking for a special google that censors. China and other dictatorships seem to be the ones asking for that, so let's not make up a theoretical scenario which is most definitely not happening.
No, it's not. It is an argument for google to do the following:
Do not comply with China and make it easier for them to abuse human rights.
They can implement this entirely without breaking Chinese law by doing one or more of the following things:
A) Do not start a search engine in China, because China will use it to abuse human rights,
B) tell China that Google will not start a search engine under those limits, and that Google will not help other Chinese companies to do similar things, to put pressure on China to drop its policies,
C) use Google's power and large megaphone (a note on the google search home page might help) to raise awareness of China's abuse of human rights and suggest that democratic countries put even more pressure on China to make them stop their abuses.
They could do many things without breaking Chinese law, either to actually help the situation or at least to not be complete hypocrites on the issue.
Google's ethics code
Employees must always do their work ethically, without harming the world, Google's customers, and Google, in that order. We aren't the most important thing; human rights, physical safety, and the like all outweigh our corporate interests. Whenever you do something, consider this. Then do whatever we tell you to do because you have no idea exactly what we'll do to you if you don't. Those who are curious are welcome to go to the basement of the administration building and peruse the records located in room 001. That room has a double door system, just because we don't want the papers to get wet or something. Don't worry that they say "soundproof" on them.
Meanwhile, maybe they can make it work
While they're taking a look at this to see whether there are really any security problems that need fixing, perhaps they could make calls over WiFi functional? I frequently visit a building where mobile signal is terrible, and completely nonfunctional without leaning out a window (people literally do this). It does have comprehensive wired and WiFi network connections, though, so I figured I'd just enable WiFi calling and we'd be back to normal. Which we are in terms of quality when a call is established; it's usually very clear. The only tiny problems are that:
1. Calls drop randomly, requiring a reconnect,
2. After receiving and answering a call, I have to spend about five seconds waiting for the connection to happen so the person on the other end can hear me,
3. When placing a call, there's a fifty-fifty chance that it will either go through immediately with good clarity of sound or immediately drop, making the person think that I've just called them and instantly hung up,
4. When moving, and therefore changing from one AP to another, quality for the other person drops. At least they complain that they cannot hear me anymore, although I can always hear them fine.
No, I don't know why this is, but it really isn't helping. So maybe they could figure out why and fix it. If they find security problems on the way, fix as needed.
Re: The irony....
I believe you may have missed the point. The irony is that data about facebook was recently obtained, and facebook didn't want that data released. So facebook violates its users' privacy but wants to keep its own corporate data private.
The rest of your comment is good though.
Re: Ipads @ work
"I do dev work as well and:"
1) "I prefer to have a RAID system to ensure I can keep on working even when the drive fails, because reinstalling would take time, and VM are not the solution to every need"
I do this with full-disk backup images which I can deploy in minutes. The advantages of RAID do not outweigh the extra kilos for me. If the disk fails, I swap it, press restore, wait five minutes, and resume working. And so far, I haven't killed a disk and needed to.
2) "Good graphics cards are not for gamers only, today."
No, they're not. But I still don't need one. Whenever I need that type of processing, I typically offload that to a dedicated machine with a lot more processing than a desktop would have. I don't have a reason to have a graphics card in my dev machine.
3) "Storage is still needed if you happen to run VMs locally, or other storage intensive stuff."
Granted. I have two disks in mine for a total of 1.5 TB. I don't process a bunch of images or video to need more, at least on my work dev machine.
4) "I don't want to take all my code and documents with me when I go to a demo or whatever - that have to stay safe in the office. Sure I have repositories and backups, but I'm obviously worried about stolen data, not lost one."
Hence my disks being encrypted. And I might demo to my managers or their managers, rather than to clients. Or maybe I'd like to work near someone else on my team, or in another office, or from somewhere else because my company will let me. Then, I might want access to my code.
5) "Desktop have better cooling, and can work at higher speeds for longer, thus usually have more powerful processors."
Yes. All true. But my point is that I need a sufficiently powerful processor, and an I7 is perfectly adequate for my needs. I do not need a xeon, and it wouldn't really help. When more processing is required, I once again offload to the massive compute resources available to me.
6) "The docking station takes space on a desk, especially for larger, powerful laptops. I can put my "desktop" under it and free space (should I call it "deskbottom"?)"
I don't use my laptop screen when it's on the desk, so I put my docking station on the bottom. My desk is quite free.
Re: Ipads @ work
I'll have to agree on laptops over desktops. Not all the time, but at least for me. I do relatively intensive dev work, and I need a lot of processing power to do it. I can get that in a laptop as well as a desktop (I need a good CPU and quite a bit of memory, but I don't need a graphics card, a lot of storage, or tons of ports and drives). The laptop can sit on my desk in a docking station, connected to any hardware I want at work. It is useful, however, to be able to relocate my working machine. I can take it to demos or to someone else's area without copying stuff, and I can work remotely. My job allows me to work from most places, so I can do my work (my expected work in expected hours) from wherever is more convenient. For this reason, I prefer the laptop.
Re: Exam question.
To some extent, that's true. Still, there are situations where you can't get people to tell you why you're doing something. It's entirely logical, if told to sort a box of bolts, to do so. After all, if the person telling you to do that just wanted the biggest one, they could either have taken it out at the beginning or asked you to find one with the required specifications.
Sometimes, you want someone to just do what you ask of them rather than to believe they can do it better if only they know everything there is to know about what you're doing. For example, if I ask someone to find the most cost-efficient machine with a set of specifications, I only want them to look at the options, eliminate those that have worse specifications, and choose the one that has the lowest cost (perhaps taking into account other things that they can definitely ask me to elucidate). I do not need them to question me as to whether I want more power because they found one that's only a little more expensive, nor do I need them to suggest that we'd probably be fine if we bought machines with less memory. I set forth the specifications and gave them a task. If they're going to change the task instead of just doing it, perhaps they should be doing my job.
The same is true of software jobs. I cannot deal with every part of a project team thinking they can and should be designing a better system for every component. Their system for some of it may in fact be better than the one we're using, but if it doesn't integrate and won't without doing a lot of work first, it can still be worse. And if each team or team member comes up with their own version that doesn't interact, we spend forever getting things back together. That's why abstraction is so key; figure out the best way to do your job, not the best way for someone else to do theirs or even for you to do someone else's job. If you have some improvements to suggest, go ahead, but don't neglect what you're supposed to be doing just because you don't like someone else's work.
Re: Apples and oranges comparison
I think we can make a case for machine code as the most popular programming language. It can be written to be fully Turing complete, can store data, can do anything you like to that data, and is used if not written by every single programmer. It is also the most popular programming language used by people who don't write code professionally or at all, and is present on every running computer in the world. Every github project eventually turns into or relies on something else that is in the form of machine code, and all questions on stack overflow can be reduced to asking how to get machine code to produce a specific result. I hope you will agree with my analysis, which I performed after several months of tedious but, I'm sure you'll agree, tremendously vital research. I'm off to reserve a something.io address and send out my press release.
Re: I'll say it again and again
I'll vote no. One unified email system, controlled by not me, where I can't decide how it runs but my government can (they never do anything I dislike)? Such a system designed specifically to not work in the situations where normal email works? Cryptography that provides security against not much unless the government's planning on releasing it? Bad idea all around, methinks.
Re: How Can You Tell Without Opening it?
When the subjects are students, they typically have only one mail account: whatever account their university gave them. When at least one of the messages claims to come from the university's IT department, that is a logical address for it to be sent to. Until you open it and inspect the headers and/or the content, you have no reason to know it's phishing from just the subject line and the text in the sender column.
Re: C'mon .. it's 2018 - where do you find students with "no knowledge of phishing" today?
I wouldn't be at all surprised if most of those who bothered to take the survey were people who knew less about the topic. Among other things, I'd typically be wary of filling out a survey for people that just sent me phishing messages, researchers though they may be.
Re: @Richard 12 - Trust is easily lost and slowly gained.
IBM may have stopped being so successful, but I don't remember anyone protesting against IBM and specifically *not* buying their stuff. They just found a better option, and went with it. Therefore, I doubt that Microsoft is going to see a ton of their corporate customers suddenly jumping ship. A few, yes. Maybe eventually all of them if the products don't work after several years. But I see no evidence of a large group of corporates getting tired of windows and removing it entirely. I see many corporates cheerfully using modern windows systems, and others still using windows 7 but without plans to move off windows when 7 becomes unsupported. I do not expect Microsoft to crash and burn.
I don't know phones anymore
I no longer really keep up with the differences or specs of all the different phone models that come out, other than that most are far too expensive and come with some random skin over android. Does anyone know of a phone that meets the following extremely long list of requirements:
1. Costs less than $200 or so.
2. Runs a modern version of stock android or has an unlocked bootloader so I can put one on.
That's all I care about, really. I don't care about the camera (with the frequency I use it, it would be fine if it just didn't have one). I don't need a replaceable battery. A headphone socket or SD card support would be minor pluses, but not needed. I don't need any special extra hardware built in. Just a modern enough phone that does not cost as much as a fully specced-out desktop.
Re: Soooo ....
Of course not. The pound of feathers will form an untidy heap on the scale, and feathers will start to waft away almost immediately; worse if your scale is outdoors, indoors near an air vent, or indoors near anything that moves, humans included. So a pound of lead is heavier than a pound of feathers unless you place an item of sufficient weight (about one pound should do) on top to force the feathers to stay on the scale.
So what if we're on a different planet? The second does not need to change length. Whatever you change it to, the planets' days will still not line up, because you can't find a useful GCF (or perhaps any at all) of the rotation periods of every rock you decide to put something on. So your best option is to keep using the second, minute, and hour, because then at least you can speak of durations in the same way as people on the other planets.
Maybe for convenience, you could define a rotation unit for the planet you're on to speak of time of day when discussing with people on your planet. When dealing with anything not on your planet, you will need a standard calendar where absolute dates and times could be used. I don't see a date like "2345-06-07 08:09:10 Gregorian, local time 12.0000" as in any way problematic. It tells me the absolute date, allowing me to compare in nearly zero time whether this happened before or after some other event. Meanwhile, I know this occurred at midday, assuming they decide to stick with the concept of 24 sections of a day. If they don't want to do that, how about percentages for local time? That way, a planet with a long day will work perfectly well. 0% = midnight, 50% = noon.
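The scheme described above is simple enough to sketch in a few lines. This is purely illustrative: the sol length (borrowed from Mars, about 88,775 SI seconds) and the function name are my own assumptions, not anything from an existing standard.

```python
from datetime import datetime, timezone

# Hypothetical local day length -- Mars's sol (~88,775 s) used as an
# example; the scheme works the same for any rotation period.
SOL_SECONDS = 88_775

def local_fraction_to_seconds(fraction_pct: float) -> float:
    """Convert 'percentage of local day' (0% = midnight, 50% = noon)
    into SI seconds elapsed since local midnight."""
    return fraction_pct / 100.0 * SOL_SECONDS

# Absolute ordering across planets stays a plain Gregorian comparison:
a = datetime(2345, 6, 7, 8, 9, 10, tzinfo=timezone.utc)
b = datetime(2345, 6, 7, 9, 0, 0, tzinfo=timezone.utc)
print(a < b)                              # True: a happened first
print(local_fraction_to_seconds(50.0))   # half a sol, in SI seconds
```

The point of the split is that durations and ordering use the shared units, while the percentage only ever matters to people standing on the same rock.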
And the second is perfectly well defined using a seemingly random number of periods (9,192,631,770 of caesium-133's radiation, to be exact), because that makes it equal to the second we've been working with for a long time. Why redefine the second when almost nobody is actually using caesium to measure it? The people running atomic clocks can divide, while the rest of us continue using all the standard second-based things we've used for years. Meanwhile, we've already limited the definition to sea level, so we can't avoid being arbitrary. For now, convenience; for later, simple utility. One arbitrary thing that prevents inefficiency is superior to two arbitrary things that require us to switch between them. That's why we should stop changing our clocks twice a year.
Re: Sad case of science ignoring the evidence
They redefined the kilogram because we already knew what the kilogram is. We can reproduce it with things that can actually be measured in an ordinary science lab, like a specific quantity of a substance of known density under a known pressure. That isn't measurable to arbitrary precision, so it couldn't satisfy the SI people, but they didn't need to investigate why the kilogram chunks had drifted apart in mass, because they knew why, and they didn't need to find the one true kilogram, because they knew what a kilogram is. They just needed a mathematical definition that pins down the mass of a kilogram so everyone else in labs can keep measuring mass exactly the same way.
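For what it's worth, the "math problem" the SI settled on in 2019 fixes the Planck constant exactly, alongside the caesium frequency (which defines the second) and the speed of light (which defines the metre). A minimal sketch of a mass you can then compute from defined constants alone, with no metal cylinder involved: the mass equivalent of a single caesium hyperfine photon, via m = hν/c².

```python
# Exact defining constants of the 2019 SI revision (exact by definition)
DELTA_NU_CS = 9_192_631_770      # Hz: defines the second
C = 299_792_458                  # m/s: defines the metre
H = 6.626_070_15e-34             # J*s: defines the kilogram

# Mass equivalent of one caesium hyperfine-transition photon, m = h*nu/c^2.
# Every input is an exact defined constant, so the result is pure arithmetic.
photon_mass_kg = H * DELTA_NU_CS / C**2
print(photon_mass_kg)  # roughly 6.78e-41 kg
```

That's the sense in which the new kilogram is "a math problem": any lab that can realise these constants can realise the unit.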
Re: I would rather
By "put to sleep", what do you want to keep running while it sleeps? How quickly does it have to resume, and what triggers it to do so? For most available triggers, a WiFi signal for example, enough of the system has to stay awake to receive and decode the trigger that there is little benefit to putting the rest to sleep.
Super Micro chief bean counter: Bloomberg's 'unwarranted hardware hacking article' has slowed our server sales
Re: it all comes down to size
Actually, do discount something when evidence is lacking. Don't discount it all the way down to zero, but "because it is possible" is not enough reason to believe something. It is possible that I run a group of people who break into people's homes, yours earlier today, and plant malware on any computers found there. We are very good at getting through locks without leaving a trace, and our malware hides in firmware where you can't easily find it. Prove that didn't happen. That is not a good enough argument. I believe this story to be unlikely: the things alleged are feasible, but without substantiation from external sources or any released evidence, feasibility alone doesn't make them probable.
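To put that reasoning in (entirely made-up) numbers: under a simple Bayesian reading, "it is possible" only means the evidence isn't impossible under the claim. If the evidence we actually have is just as likely whether the claim is true or false, belief shouldn't move at all. A toy sketch, with hypothetical probabilities:

```python
def posterior(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Bayes' rule: how much a piece of evidence should move belief in H."""
    num = p_e_given_h * prior
    return num / (num + p_e_given_not_h * (1 - prior))

# If the available evidence is equally likely either way, a 1% prior
# stays a 1% posterior -- mere possibility changes nothing:
print(posterior(0.01, 0.5, 0.5))   # 0.01

# Evidence that's far likelier under the claim than otherwise is what
# actually shifts belief upward:
print(posterior(0.01, 0.9, 0.1) > 0.01)   # True
```

Which is exactly the complaint: the story's claims are possible, but nothing released so far is more probable if the story is true than if it isn't.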