He seems panicky, she seems sort of glad. Which files were those?
Apple's recently revised file system, APFS, may lose data under specific circumstances, a maker of macOS backup software is warning. In a blog post on Thursday, Mike Bombich, creator of Carbon Copy Cloner, says that APFS sparse disk images fail to accurately track available free space, thereby allowing storage operations to …
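The failure mode being reported - free space not being correctly re-tracked, so writes are accepted and data past the real capacity silently vanishes - can be sketched as a toy model. This is purely an illustration of the described behaviour, not Apple's actual code:

```python
# Toy model of the reported sparse-image bug: the image advertises free
# space from a stale value, accepts the write anyway, and anything beyond
# real capacity is silently lost. Illustration only, not APFS internals.

class SparseImage:
    def __init__(self, real_capacity):
        self.real_capacity = real_capacity   # bytes actually available
        self.reported_free = real_capacity   # what the FS *thinks* is free
        self.stored = b""

    def write(self, data):
        # Bug: the check uses the stale reported_free figure.
        if len(data) > self.reported_free:
            raise OSError("No space left on device")  # what *should* happen
        room = self.real_capacity - len(self.stored)
        self.stored += data[:room]   # anything past `room` vanishes
        # reported_free is never decremented, so the next write passes too.

img = SparseImage(real_capacity=10)
img.write(b"A" * 8)     # fits
img.write(b"B" * 8)     # "succeeds", but only 2 bytes actually land
print(len(img.stored))  # 10, not 16 -> 6 bytes silently lost
```

The second write should have raised ENOSPC; instead it reports success, which is exactly why a backup tool treats this as disqualifying.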
iOS is still an utter joke.
Does it even properly multitask yet? Nope. Does it even allow 3rd party apps to install services yet? Nope. Can I use NFC for anything other than payment? Nope. Does it have a filesystem yet? Nope.
I have a Sony A7 DSLR and it's a breeze to use on Android: display the pic on the camera, tap my phone on the NFC and it's transferred the high-resolution pic to my phone, which in turn uploads the RAW to my free 1 TB Flickr storage next time I am on WiFi.
On iOS, it's an absolute nightmare. Only built-in apps can have services, so I need to manually launch the Sony app; I can't use NFC for anything but Apple Pay, so I need to manually initiate the transfer from both the camera and the phone. iOS doesn't allow me to have watch folders and services for the Flickr uploader either.
This is of course one example, but there are hundreds. I ditched my iPhone 8 for a Pixel 2 and it's so much better in every respect. iOS seems to have been adding all sorts of useless bells and whistles and Jony Ive gimmicks, whilst stock Android has been building on a very solid foundation and adding true-value features (Android 8.1 has come a very long way from the last time I tried Android, v5).
Yes, it properly multitasks, and has from day one. By your definition of multitasking, possibly not. iOS 11 on iPad is probably more what you mean, though.
3rd party services? No, correct.
NFC for anything other than payment? Yes you can as of iOS 11, it is admittedly limited a bit at the moment.
Does it have a filesystem? Errr of course it does. Does it have a filesystem area that all apps can randomly write crap into? No. Each app is sensibly sandboxed, but files can be written / read on device to the Files app area.
> Does it have a filesystem area that all apps can randomly write crap into? No. Each app is sensibly sandboxed,
I've always been in two minds about this.
While I understand the sandboxing from a security perspective, it has always rubbed me the wrong way.
The way I see it as me, a user (as opposed to as an IT person thinking security), is that it is an app claiming a file. That is, it is the app's file, because only that app can open, or even see, that file (unless you go through some transferring stuff - migrating it to a computer, then sucking it back into another app's folder - and I hate that term too, it's not a folder, it is a directory!).
But that file, say resume.doc, or photo1.jpg, is my file, not the apps file. Therefore I choose which app, at what time, can access that file, not Apple, not iOS.
The reason I hate the term 'folder' when referring to a directory is that the term completely misconstrues what it is and what is going on. A folder is a container that holds things, like documents in a ring binder or manila folder. That is not what a directory is, or how it works.
A directory is what its name implies, a directory of files - that is, a list of what files there are and where to go and get them, like a telephone directory (it doesn't contain people, it contains information on how to contact them). When you select/open a file within a directory, the filesystem goes out and gets the file (the data) from places all over the underlying storage medium and presents it to you as a complete whole. That data is not inside the directory; the directory just tells the filesystem where to go to get it.
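That "directory as an index" view is easy to see from code: a directory entry is just a name plus an inode number telling the filesystem where to look; the data itself lives elsewhere. A small sketch (POSIX-ish behaviour assumed):

```python
import os
import tempfile

# A directory entry is just (name -> inode); the file's content lives in
# blocks elsewhere on the medium, located via that inode.
d = tempfile.mkdtemp()
path = os.path.join(d, "resume.doc")
with open(path, "wb") as f:
    f.write(b"hello" * 1000)  # 5000 bytes of "content"

with os.scandir(d) as entries:
    for entry in entries:
        st = entry.stat()
        # The directory knows the name and the inode number...
        print(entry.name, "-> inode", st.st_ino)
        # ...but the 5000 bytes are stored in blocks the filesystem
        # finds via that inode, not "inside" the directory itself.
        print("size:", st.st_size, "bytes in", st.st_blocks, "512-byte blocks")
```

The directory listing never touches the file's content; it only hands back names and metadata, which is the telephone-directory analogy in action.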
Technically true, but just as technically irrelevant to 99% of people using that system to store their files (who simply don't care what the OS chooses to call the "same" things where it keeps its files).
For them, a directory is a folder in which they put their files. How the content of those files is scattered across the underlying storage is (quite rightly) of no interest to them, and indeed the file system and OS work hard to make it unnecessary for them to be concerned about such things.
A "folder" is nothing but a term for a directory that has a specific user context overlaid upon it - remove the user context and in the system context it's still a directory. A rose by any other name and all that.
Now, if "folder" could mean something else entirely in the context of a file system thus giving rise to confusion and miscommunication then there might be a real cause for complaint. But otherwise it's just jargon fetishism/puritanical technoclature.
I'm all for using accurate language, but technical terminology is designed to aid communication and for utility the context is just as important as strict technical accuracy.
When discussing the file system in the absence of any user context, consistently referring to all such things as directories may make perfect sense. But when the context involves specific user experiences on (e.g.) macOS then it is just as sensible to use terminology appropriate in that context e.g. to differentiate in macOS between a directory of user documents (folder) and a directory of application contents (package), which are both "directories" at the file system level but very, very different in terms of the user experience.
- ICON: I'll get my outerwear
("coat" is a specific user context and I refuse to call it a coat when it might not be a coat but could be a jacket or a smock or an overall or an apron or .. )
Don't get me started on the abject idiocy that is "libraries" in Windows. A collective way to group disk locations and see/read them together is fine to a point, as long as it's an optional view of the system. However, when some idiot at Microsoft decided that it should also be possible to save to a "library" - and to hell with where the file is actually put - that's stupid taken far too far.
Luckily a couple of braincells seem to have randomly passed by close enough on this one and libraries are pretty much deprecated in Windows 10. Unfortunately the same lackwit who came up with libraries in the first place seems to have had a hand in the navigation tree that's shown when opening File (Windows) Explorer in Windows 10...
@eldakka: the "folder" terminology may be disconnected from the underlying reality, but the layperson analogy is consistent: a folder contains "documents", not files.
I just checked: fired up a Windows VM (something I don't do often - I am of directory/file persuasion myself) and Windows Explorer, went to the "Documents" directory... eh, sorry, folder, right-clicked and chose "New". I get a "Microsoft Word Document", a "journal Document", "Text Document", and all sorts of other stuff (including something called a "Briefcase" - no idea what this metaphor is about) but not a single "File".
A quick glance at Apple Support pages (no Mac here) seems to indicate that they use the term "Document", too.
Yes, I am with you: it's files and directories for me, and a directory (as well as everything else) is a file. ;-)
I think the technical difference is that a "directory" is always a filesystem object. MKDIR, RMDIR, and so on attest to the low lying structures on disc being, correctly, known as directories.
"Folders", however, are a user interface concept. While they generally map on to directories, they can appear in a completely different place - "My Documents" is a high level folder belying the obscure place the stuff is actually kept. However folders don't necessarily need to be filesystem objects at all - "My Computer" and "My Network Places" to give two examples, they're both there in the left hand side of Explorer as if part of some imaginary filesystem, but they aren't real, just an easy metaphor.
Android apps are also properly sandboxed, unless you grant the write-external-storage permission. However, Android also has a browsable filesystem. You can't browse applications' sandboxes, but you can open up a file explorer app and browse the downloads folder, browse all your music stored on your SD card, etc. etc.
It's the perfect halfway house, where application data is sandboxed and private by default for most types of apps, with a permission-based extension to make things work like a real device for more complex or open storage requirements.
> It's the perfect halfway house, where application data is sandboxed and private by default for most types of apps, with a permission-based extension to make things work like a real device for more complex or open storage requirements.
Ahh, but here's the thing. Files such as resume.doc, shopping list.txt or funny_photo.jpg are not application data - as in, belonging to a specific application - unlike, say, a data file that is proprietary and can only be read by a specific application.
They are my data. And if they are a standard, non-application-specific type of data - jpeg, mp4, mpeg, h.264, matroska, ogg, png, divx, txt, whatever, then they are 'general' data, not app-data.
If I put a video file onto the device, I should be able to open it with whatever video player I have installed, I might have many, I might have an old reliable one, or have a couple of "hey, that looks like a cool media player app"-type installations, any of which I should be able to use - switch between - at will to play the same media files.
"From your definition of multitasking possibly not."
I'm still on iOS 7 (no access to an Apple shop and ~5GB download with rural broadband is a joke); however could somebody please explain why downloading a file with Dolphin and then using "open in" to push it to PDF reader, VLC, etc takes forever and risks losing the file or pieces of it in a way you can't fix (no filesystem access) if you should dare to switch to another app while the copy is happening?
I know nothing about how iOS works internally, but it's certainly behaving like "read a byte, page in destination task, write the byte, page in source task, read next byte, etc." with a lazy "to hell with it" approach to termination, because its concept of multitasking seems to be little more than "the music player can run in the background and once in a while it'll check for emails". So, yeah, not what I'd call multitasking (hell, the music playback could just hang off a "buffer needs data" event or a regular ticker interrupt)...
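For what it's worth, the "one byte per round trip" pathology suspected above is exactly why copy routines work in large chunks. A minimal, generic sketch of the difference (nothing iOS-specific; in-memory streams stand in for real files):

```python
import io
import shutil

# In-memory stand-ins for a source file and a destination file.
src = io.BytesIO(b"x" * 100_000)
dst = io.BytesIO()

# Sane copy: large buffered chunks, so only a handful of read/write
# round trips between the "reader" and "writer" sides.
shutil.copyfileobj(src, dst, length=64 * 1024)

# Pathological copy: one byte per round trip. Same end result, but tens
# of thousands of times more calls - and if either side can be suspended
# mid-copy (as backgrounded mobile apps can), far more opportunities to
# lose the tail of the file.
src.seek(0)
slow_dst = io.BytesIO()
shutil.copyfileobj(src, slow_dst, length=1)

print(dst.getvalue() == slow_dst.getvalue())  # True: identical output
```

Both copies produce the same bytes; the difference is purely how many scheduler-visible steps it takes to get there, which is where an aggressive app-suspension model hurts.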
NFC is a real pain on iPhone.
We use NFC for configuring Wi-Fi... But we also have a printed sheet for Apple users.
Our MFCs auto configure with smartphones, just hold the phone over the marked corner of the device and it sets up... Unless it is an iPhone.
My Jabra Eclipse headset, just hold the case to the back of the phone and it automatically pairs with it,.. Unless it is an iPhone.
My Sony headphones, just hold it to the back of the phone and it automatically pairs with it... Unless it is an iPhone.
I use a Yubico Neo as second factor authentication for LastPass. Just hold it near the phone and it unlocks the password safe... Unless it is an iPhone, in which case, you can't open LastPass, unless I remove the second factor.
While I can understand NFC is a convenient way to set up WiFi and MFCs (I assume you mean MFDs), I've never understood why people even bother with it on headsets or headphones. How many times do you have to re-pair your devices? I use various Bluetooth devices (watches, headphones, tiles). I've needed to pair them *once*. I would argue the time saved isn't even worth the cost of installing the circuitry.
@Stuart Castle
I have just changed phones, so I had to set up all of the wireless kit again. The 2 headsets and wi-fi were done "automatically"; the radios, in-car entertainment system etc. each had to be done by hand - put phone into Bluetooth search mode, put device into pairing mode, wait for them to find each other, enter pairing code and confirm.
You are correct, you only need to do this once per device, per phone (the car radio is paired with 2 phones, for example), but when you have to re-pair a lot of devices, NFC takes a lot of the pain out of the process.
Unless something has changed recently, macOS doesn't handle media write failures. The kernel logs an error but it doesn't bubble up far enough to properly abort an operation. The first symptom is corrupted files or volumes once the unwritten data flushes from cache. Drobo NAS boxes don't handle media write failures either.
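The usual application-side defence is to force the flush yourself and check for errors at that point, rather than trusting a successful write(). A sketch of the pattern (POSIX-ish system assumed; whether the kernel actually propagates the media error is precisely the complaint above):

```python
import os
import tempfile

def careful_write(path, data):
    """Write data and force it to the medium, surfacing errors early.

    write() only hands data to the page cache; a media failure may not be
    reported until the cache flushes, long after write() "succeeded".
    fsync() forces that flush while we can still catch the OSError - *if*
    the OS bubbles the error up, which is the behaviour in question here.
    """
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
    try:
        os.write(fd, data)
        os.fsync(fd)   # an error from the flush (e.g. EIO) surfaces here
    finally:
        os.close(fd)

tmp = os.path.join(tempfile.mkdtemp(), "backup.bin")
careful_write(tmp, b"important data")
print(os.path.getsize(tmp))  # 14
```

Backup tools in particular tend to fsync and re-verify for exactly this reason: a write that "worked" into cache is worthless if the flush later fails silently.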
> Maybe this is Apple's Windows XP moment
What's wrong with Windows XP (in the context of when it was a current O/S)?
It took so long to replace it because it hit the spot perfectly, basically win2k with a reskinning. So it was hard for MS to come up with a good enough upgrade over it to make it worth replacing.
It's still in use today because, at least until MS stopped maintaining it, for the time, it was excellent (just like OS/2 is still being used in some niche areas because it was good for what it was used for at the time).
You must be thinking Windows Vista, and 8.0.
Windows XP was one of the worst versions of Windows that Microsoft released, only Windows ME probably had a worse reputation (I won't count Windows 1, because that was never really a mass market product).
XP was so insecure, it was downright dangerous. The programming quality was appalling. I'll leave the Fisher-Price look-and-feel aside; you could at least switch it to classic.
Microsoft soon realised that they had a complete dog on their hands and stopped development of Longhorn and concentrated on making XP into a product that they could actually sell with a straight face. They put together a huge QA effort to tighten up the (very) lax security in XP.
It was first with SP2 that XP became a stable and, for its time, secure operating system.
> XP was so insecure, it was downright dangerous.
That is downright incorrect when talking about the consumer space.
Windows XP was, literally, Windows NT 5.1 (where Windows 2000 was NT 5.0). In the consumer space it replaced Windows 95 and its descendants (98, ME, etc.).
It was head and shoulders above the previous consumer-oriented OSes available at the time.
Sure, there were other OSes that had better security, UNIXes, Linux, Mainframe OSes and so on, but in the consumer space it was a vast, vast improvement on what came before it.
The typical consumer O/S prior to XP didn't have any file security at all. All that different accounts on them did was allow different configuration values, backgrounds, menu items, and so on. Log in to an account on the earlier consumer Windows and you could access any other account's files.
Did XP have issues? Absolutely. Name any OS that doesn't.
But compared to what came before it in the consumer space, it was a huge improvement. Everything else in Windowsland since XP has been mostly just iterative improvements.
In the professional workstation market, was XP an improvement? No, because it was just Windows 2000 with a few extra visual bells and whistles to make it more consumer friendly. And there were - and are - more competitors in the workstation market, where people are willing to spend money on the OS. That was what Windows XP was: Microsoft merging their consumer and professional lines (ditching the Windows 3.x and 95 lineage) into one OS code base for them to maintain.
@eldakka
It might have been better than Windows 9x, but that isn't saying much. It had no firewall as standard - neither did previous versions, but this was coming into the time when computers were being attached to the Internet. A Chevrolet Matiz is better than a Trabant, but it is still no Camaro ZL1.
It still had filesharing turned on by default on Internet connections (C$ available), for example.
It was only with SP2 that all of the most serious (known) security problems had been properly addressed.
Microsoft had misread the market, or missed the Internet boat, to a certain extent and their lax security came back to bite them.
If that was not the case, why did Bill Gates stop all development on other Windows projects to concentrate on fixing the security and quality problems of XP?
It was derided and corporations didn't want to upgrade to it, until SP2 came along - which is where the saying came from, that you shouldn't buy a Microsoft product until the second service pack had been released.
Re: It was head and shoulders above the previous consumer-oriented OSes available at the time.
It was. But that doesn't mean it wasn't dangerous. XP was marketed as being based on the secure NT platform, when, although we didn't know it at the time, the NT security was half hearted at best. I would argue that this made it more dangerous, because people thought it was secure (even Microsoft didn't pretend the consumer editions of Windows were secure), so were less likely to take care when using the machine.
XP was relatively secure at the end, because Microsoft changed their whole approach to security before they released SP2.
> Windows XP was one of the worst versions of Windows that Microsoft released, only Windows ME probably had a worse reputation
What about Windows 95, as I think it was called?
I wouldn't possibly know, since by that time I was already on Linux, coming from a Unix background, but judging by the swearing from friends and colleagues it did not seem to be one of Redmond's most solid products. In contrast, their later stuff seemed a lot more stable.
"What about Windows 95, as I think it was called?"
From personal experience... XP SP3 is pretty good. My PC still runs that, but then I don't run the PC much these days so it's okay. ;) I did briefly try Vista but between complaining about everything and choking on drivers, I eventually gave in, formatted the drive, and put XP back on.
Win98SE wasn't bad for a Win32 era system (practically zero memory protection, process protection, or anything else to stop programs screwing each other up). It was pretty reliable for me, though I know some people had never ending BSODs with it? Maybe I was lucky and it liked my machine?
Win95. What a piece of shit. Okay, we could see where Windows was going and it was better than 3.11, but still, what a piece of shit. My two most memorable moments were it blue screening when trying to read from a CD-ROM that it had already ejected (slow handclap), and the time when I realised that everybody saying the standard way to fix Win95 was to format the harddisc and reinstall everything was completely serious and not taking the piss. During the W95 era I had two PCs. One was shiny and new and had W95, the other was a 486 box running WfWG. Guess which one I used? Hell, I'd even use Windows 3-something on the RiscPC's co-processor before I'd use W95...
"What's wrong with Windows XP (in the context of when it was a current O/S)?"
Ask Microsoft. It was so bad that they canned the Longhorn project and put all hands back on fixing XP. The result was XP Service Pack 2, which in its SP3 variant is probably what you are thinking of when you think of XP as a much-loved OS that everyone refused to give up for ten years.
It is the only time I can recall in Microsoft's history where they went back to an existing release and fixed the bugs in it rather than simply put out a different set of bugs under the banner of a new release.
The article says "a recent Bloomberg report suggests Apple is cutting back on features in its operating system releases this year to focus on stability and quality,"
Erm.
What the Bloomberg report says is that Apple's changing the way it handles new software development and releases. One thing Apple's doing is giving engineers the power to delay the release of new features until they're working properly. That's not cutting back on [new] features as such. What it's doing is putting quality and stability ahead of the rush to fill a feature list and beat rivals to whatever shiny new goal the marketing people think needs to be achieved.
It probably is an implied admission that Apple's been a bit slack lately with some aspects of software quality.
But you know what? I remember the 1990s when restarting a Mac because of a Type 1 or Type 11 error was unremarkable routine - maybe not daily, but generally more than once a week. The only time I restart my current Mac is when it's done an OS update. Software quality is better now; not only in Mac-land, but over the fence in Microsoft territory too.
I did sometimes use Unix machines in the 1990s (unofficial access to a university's gear due to me knowing the sysadmin please don't ask but it was mostly because it was nearer to home than the university I was attending at the time and he could be bribed with coffee and chocolate). They never needed rebooting except for OS updates, so it's not entirely impressive that Apple and MS OSes have caught up with the late 20th century...
macOS is a certified UNIX (see Wikipedia). I've been around *NIX stuff since the 70s, and a fair bit of what I have used since then works in its bash CLI (see ss64.com).
Standard way to get the best out of an IT department. Throw a 5kg bar of premium chocolate in through the door, slam it shut fast and barricade it, wait for the howls and thumping sounds to die down. Carefully open the door to make your request, be sure to ignore the general dishevellement of the staff and the broken furniture. Enjoy benefit of first class services for the entire rest of the week.
Personally I think development methodology has less impact than team quality, experience and management attitude. If you have a team who are all in their 20s and management concentrating on target features and release dates, chances are you are going to have lots of issues in the code. Get a good team with sufficient experience, and management who understand that just because someone is half the hourly wage doesn't make them twice the value and that shouting doesn't make problems go away - then you'll generally have fewer issues.
Design, Coding, Peer Review, Requirements Traceability, Testing, QA, and Documentation.
A partial list of the failure points for this to make it into the wild.
Oh, and Management too. Because it's their job to make sure that the others do theirs.
Agreed. I've seen bugs like this in sparse file handling in commercial Unix file systems several times over the last 25 years. You have to be imaginative about test cases to predict use cases although ENOSPC seems kinda obvious to me. Thinking Different shouldn't equal magical thinking.
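Sparse files aren't an exotic test case to cover, either - seek past the end and write, and the filesystem records a hole rather than allocating real blocks. A quick sketch (assumes a filesystem with sparse-file support; the hole-punching itself is filesystem-dependent):

```python
import os
import tempfile

# Create a sparse file: ~100 MiB of apparent size, one real byte written.
path = os.path.join(tempfile.mkdtemp(), "sparse.img")
with open(path, "wb") as f:
    f.seek(100 * 1024 * 1024 - 1)  # seek ~100 MiB past the start
    f.write(b"\0")                  # a single real byte at the end

st = os.stat(path)
print("apparent size:", st.st_size)        # 104857600 bytes
print("allocated:", st.st_blocks * 512)    # far less, if the FS made a hole
# Any code (backup tool, disk-image driver) that confuses apparent size
# with allocated space is exactly how free-space and ENOSPC bugs slip in.
```

The gap between `st_size` and `st_blocks * 512` is the whole point of sparse storage, and also the bookkeeping a sparse disk image has to get right.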
In the Blog post 'Mike B' says
"This week we reported to Apple a serious flaw in macOS that can lead to data loss when using an APFS-formatted disk image. Until Apple issues a macOS update that resolves this problem, we're dropping support for APFS-formatted disk images."
Then he goes on to say that it only affects 'Sparse' Images.
Then at the bottom he says (who reads down that far these days eh?)
"Until Apple resolves this disk images bug, we strongly recommend that people avoid using APFS-formatted sparse disk images for any purpose with any application."
He's got all the clicks and promotion of his product by the first paragraph.
He could have said "we're dropping support for APFS-formatted sparse disk images" (my emphasis), but the headline writers may well have passed over it and therefore he'd not get the interest in (a.k.a. clicks on) his product.
I would say that dropping support for APFS-formatted images, in general, is a good idea until the thing can be reviewed and tested properly. Who knows what other cans of worms are lurking under the surface.
It might be an isolated bug, but given this is backup software, I'd play on the safe side as well and not use APFS images until they have been fixed and thoroughly tested. Anything that is mission critical, like backups, should not rely on known buggy software. If it isn't 100% reliable, neither is your backup!
That is why you use tried and trusted technologies for backups, and you test them, you test them again, and you test them regularly once they have been implemented. You don't want to wait until you have a disaster, only to find that your backups are actually zero bytes long or corrupted!
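Verifying a backup mechanically is cheap: compare checksums and flag empty files. A minimal sketch (the paths and the `verify_backup` helper are illustrative, not from any particular backup product):

```python
import hashlib
import os
import shutil
import tempfile

def sha256(path):
    """Checksum a file in chunks so large backups don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(64 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(source, backup):
    """Return True only if the backup is non-empty and byte-identical."""
    if os.path.getsize(backup) == 0:
        return False  # the classic "zero bytes long" disaster
    return sha256(source) == sha256(backup)

# Demo with temporary files standing in for a real source and backup.
d = tempfile.mkdtemp()
src = os.path.join(d, "data.txt")
with open(src, "wb") as f:
    f.write(b"precious records\n")
bak = os.path.join(d, "data.txt.bak")
shutil.copyfile(src, bak)

print(verify_backup(src, bak))  # True
```

Run something like this after every backup cycle, not just at implementation time - which is exactly the "test them regularly" advice above.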
Decades ago I proposed an inexhaustible file system. It was based on the premise of (write, data) without any pesky (write, data, status) worries. The data always went into its repository with 100% reliability. Looks like the idea might have been stolen by Apple. I am contacting my lawyer.
...is when going from OSX to macOS, High Sierra has had me in shock repeatedly with all the bugs I've had to deal with. I've never been so livid over software issues on anything before.
At this point, I honestly don't think Apple can make software of even minimally acceptable quality. Having finally acquired another PC and using Linux Mint, the DIFFERENCE is STUNNING. The latest Linux feels like OSX used to. This was an act of desperation to get everything I need to do off my Mac. With the introduction of High Sierra, my faith in Apple software has been shattered.
Steven Sinofsky, former president of the Windows group at Microsoft, suggested people are just imagining things.
When you choose to quote someone, you should go for a person with credentials ... Steven's opinion on Windows is probably acceptable, however, what any [former] MS guy thinks of "software stability" is totally irrelevant, in any context I can think of ... I am not saying he is wrong when he compares macOS with Windows (if I understood him correctly), I am saying Steven's opinion here simply does not count.
@Steven, thanks for stating the glaring obvious... you should probably give Linux and/or FreeBSD a try, you'll be surprised, I am sure, to experience ultimate "software stability".
We run a mixed Linux / Mac environment, having (happily at the time) migrated from Windows to MacOS in 2003. For the last 15 months we've been storing all data that matters on ZFS filesystems - not just on the Linux servers but also on 2nd internal hard disks fitted to our MacBook Pros and Mac minis (purchased before Apple removed this facility and replaced screws with glue). APFS is very incomplete in comparison with ZFS, and ZFS is incredibly robust and portable.
We're now looking at migrating back from macOS to Windows 10 for some of our applications, primarily because Apple appears to have no concept of (or is that interest in) backward compatibility at the source-code or binary level, and we're fed up with so many things breaking at each major new version of macOS. I have commercial software for Windows that I purchased 15 years ago, that runs perfectly on Windows 10. On macOS I feel lucky if something works for 12 months, and we now defer upgrading to major versions of macOS for around 10 months after they come out, so as to avoid having to do unpaid alpha-testing for Cupertino. Plus Apple keeps trying to force me to change the way I work, by pulling the plug on features and facilities I've relied on for years, no longer offering a 17" laptop, and removing most of the ports that other devices I have would actually connect to, and deliberately slowing down older hardware running newer software.
I accept that if I use Google or Facebook, I'm the product, not the customer. But as I pay Apple considerable sums to purchase its hardware, I'd like to ask Tim Cook to focus on making what his customers want, rather than endeavouring to make his customers want whatever Apple has made.
It appears they have installed another layer in the file system starting around 10.10.
I have a FAT-formatted USB stick, and from the command line this often happens:
$ cd /Volumes/RED8/
$ ls
.
Oops, all the files are gone - except they aren't. Things like "ls -s | sort" sometimes won't work at all, yet "ls -s" sometimes does. There are plenty of online complaints about this, but most seem to think it is related to the shell, which it isn't.
I don't know what that extra layer does, but I expect it might help protect flash drives from being pulled out without being ejected first, or be related to their new file system - but that is pure speculation. Maybe Apple should have used some of its huge pile of cash to buy Larry a new boat and then roll out ZFS properly.