42" EDTV here
Yes, you know, the 480p sort. It was a hand-me-down gift. I probably sit about 3 or 4m away and can see the pixels on any high-contrast objects. But I don't really care.
2913 publicly visible posts • joined 18 Jun 2009
Based on Microsoft's denial and the port to ARM, I'm optimistic that they're deprecating some legacy stuff by relegating it to a Windows 7 compatibility mode. So they wouldn't be re-jigging the underlying architecture, just trying to push everyone more forcibly towards the re-jigged stuff.
Would an all .NET Windows with all or most of Win16/32/64 in a sandboxed, legacy support environment really be a bad thing?
Not quite true; if I recall correctly then a PowerPC version of Windows NT 4 shipped for the PReP platform in a few extremely obscure ThinkPads that could run Intel binaries through emulation. Or my memory might be fooling me, and the emulator may have been an add-on, though I'm pretty sure it worked at the system level, to emulate the binaries but forward the relevant system calls directly to the native NT implementations. Or I'm just very confused indeed.
I completely agree; binary compatibility for the Mac goes back a decade at the most and requires the installation of optional components to do so. Obviously you can argue some virtue in that from the perspective of bloat and support, but the benefits of full backward compatibility are so obvious as not to need arguing. Microsoft aren't always 100% on the nail, but it says a lot that I can remember the only two times I've had problems, and the first of those was running a Windows 2 version of PageMaker on 3.1...
Same thing here; I think part of it is that there's no gaps between the pixels, and they've selected fonts that are relatively aggressively fitted to pixel boundaries so there's limited need for anti-aliasing anyway. http://www.bit-101.com/blog/?p=2722 makes the point quite well, especially when you get to the 400x versus 375x zooms.
The bluetooth criticisms are valid, the video criticism isn't. Both the QuickTime container and H.264 video are industry standards, being written into the Blu-ray spec amongst other things. My Android can play them, most £20 DVD players with USB slots can play them, VLC can play them.
I think tablets may displace laptops for users that buy a laptop to use in their own home, primarily for the web and email, tending to keep it in the lounge or some other socially oriented room. A tablet usually betters a smartphone for the same reason that A4 is what most people put content on rather than on till receipts — reading a web page at approximately the size of a full piece of paper is just easier and more comfortable.
I'm aware they also act as media centres, but I don't see that being a big use. For music you want something that'll fit in your pocket and video content tends to want as big a screen as possible, with the average TV now being probably about 30".
I'll bet that within ten years you get a tablet with your broadband just like you currently get a wireless router. People will plug the router into the wall, put the tablet in the lounge and for 90% that's the home Internet sorted.
Apple shipped a major speed boost to the Javascript engine in Mobile Safari in iOS 4.3, though they got slammed — rightly or wrongly — for not applying it to the entire OS. It's not just "going to get faster on Androids, Blackberries, etc", it is getting faster on Androids, Blackberries, iPhones, etc.
The iPhone supports the latest HTML 5 database stuff, for local persistent store, as does Android. Blackberry doesn't.
Ever since the App Store launched and Apple stopped saying "HTML + Javascript is the API", they've failed to become more restrictive. Every added restriction on one side, such as the 70:30 rule for subscription content, is balanced by one eased on the other, such as the removal of the C/C++/Objective-C/Javascript rule for apps, which allows people to use Adobe's development tools if they wish.
To my knowledge, Jobs has spoken out publicly only against Flash. The arguments he made that it offers a very poor user experience on mobile have been backed up by every objective review I've seen of the Android client. I doubt I'm alone in having decided never to download the thing onto my Android phone. His conclusions — to ban Flash from his platform — betray his control tendencies but the initial observations were valid.
So, in response to your post: (i) Apple have been improving their Javascript implementation; (ii) Apple do support local storage for Javascript applications; (iii) Apple's maniacal level of control over the iPhone platform has remained more or less constant rather than being some sort of ever-growing all encroaching beast; and (iv) Jobs has publicly trashed technologies only when they deserve to be trashed (his actions following the trashing being the objectionable bit, not the trashing itself).
History has also shown Apple to be a beneficial contributor to the web ecosystem, being the driving force behind the vast majority of WebKit development (it was forked from KHTML when that project was four years old, which was nine years ago) and the originator of the canvas element and 2d/3d CSS transforms amongst others. As far as I can tell, they've done nothing at any time to hurt the development and propagation of web standards.
I'd therefore suggest that Javascript will remain just as viable on iPhones as everywhere else in the world, give or take that Apple's Javascript implementation tends to lag Google's in speed increases.
With Honeycomb being closed source, I guess this'll be a fork in many respects — so there's a sense in which it'll add fragmentation to the Android world, but I'd expect that Amazon won't market these as 'Android' devices and won't officially support anything other than their own store, so hopefully the issue will be moot. I expect they won't have done anything dramatic with the underlying APIs either, so falling back into line at some point shouldn't be hard.
Amazon are the only company I can think of that can largely match Apple's content collection (probably the same amount of music and movies, more books and magazines, fewer apps, but the latter feels easier to fix if the devices are successful), while having a good direct-to-the-customer relationship and a fantastic retail infrastructure. If you're looking for someone other than Apple to launch a single prominent consumer device, then Amazon is your best bet. If the news is true, my money's on tablets being a two horse race, not between Apple and the field of Android players but between Apple and Amazon.
... Amazon adverts are served only if you've accepted a subsidised device. And given the work Amazon will have had to do in isolation due to Honeycomb being temporarily closed source, I'd expect them to have cut out the Google ads even if it means sending all the normal Google apps with them.
While I agree with your point — which I take to be that Apple shouldn't be condemned for making a computer as an appliance given that there's a market for computers as appliances and lots of people want them only as appliances — I think possibly the offence here is that Apple are adding and removing features that some people want without any sort of notice and with no regard to that particular audience.
So it's characteristic of their control tendencies and it further evidences which segments they're actually interested in selling to. It's also a sign that they don't mind deviating from industry standards if they think something is to be gained for their target audience. So I think it plays both ways. I can see why it offends a lot of El Reg's readership but I don't agree with a lot of the motives that are assumed to be behind it, or that it imputes much upon Apple's customers.
That's not quite the issue here; Apple have built a temperature sensor and the firmware necessary to report back from the temperature sensor directly onto the drive, have created a proprietary connector to allow drive + sensor to talk back to the rest of the system and have set things up so that any failure by the drive to say that it's safe results in the fans spinning up to the maximum extent possible.
This hinders third party upgrades, so is a negative step in the eyes of a group of their potential customers. Any individual who would have swapped out the base build drive for something larger after purchase, and who now instead opts to pay for a build-to-order upgrade, will have to pay more than they did previously, per the industry-wide rule that build-to-order upgrades cost quite a lot more than buying the better part and doing the job yourself (often even if, hypothetically, you were then unable to realise any value from whatever part you removed).
Conversely, it possibly shrinks the total size of the sensor + the drive (or, more probably, the cost of the two together), and iMacs haven't been designed to contain user serviceable hard drives at any point that I remember. Most iMac purchasers already treat it as an appliance and attempts to upgrade are rare — the RAM is user serviceable and very trivial to access, everything else is hidden. Upgrading some non-user-serviceable parts of the iMac is easier in this model than the last (including the CPU, notably), but that doesn't make for an interesting story.
So: Apple have taken a step that upsets some of their customers, but not most. It's news but it's not really the end of the world and it doesn't say much about Apple beyond reinforcing whatever you already thought about them.
... some who use Doctor Who primarily as something to talk about in advance, with theories and leaked plot elements, seem to get many hours more enjoyment out of it than they would merely by watching the television programme. And they're probably still buying the merchandising, while those of us that prefer it just as a television programme still didn't know what was going to happen.
i.e. Motorola and Samsung versus the no-name, very low specification, resistive tablets that are threatening to give Android an unfair bad name. I also think that may be why, in 'Honeycomb', they've picked a codename that sounds good and is being pushed as part of the branding, and seem to be retaining it into the next minor version.
Either that, or it really is just that the code doesn't look very nice. Not everything is a conspiracy and companies do sometimes tell the truth.
I thought it was more that 10" screens are quite close to the size that both the international community and the Americans separately have settled on as being good for a piece of paper, so the thing ends up a natural size for web pages, PDFs, magazines, etc. The 7" screen is conveniently like a paperback, but less suited to the web. And that's before you throw in the media centre component.
I think you're right; with Clang now fully capable of C++ and Objective-C++, they've switched to a Clang/LLVM pair for Xcode 4, to power not just the compilation stage but the static analyser, the as-you-type error highlighting, and a bunch of other workspace things.
At present they're pushing all the way to a native binary, but it feels like it'd be a relatively trivial step from here to build LLVM into the OS and move towards LLVM bytecode generation being the default in the developer tools.
Even if this were an official announcement from Apple, it'd have a little of the 'me too' to it, given that Microsoft has already announced an ARM port of Windows 8. Obviously the difference is that if Apple decide they want ARM then you stop being able to buy an Intel Mac anywhere, but supposing Apple were to switch and to demonstrate gains in doing so then the door will be completely open for companies that ship Windows machines to introduce competing devices into their ranges.
So: Apple's move could start a trend, or at least have more of an impact than just on the tiny OS X audience. Though you'd have to buy into the version of events where Apple are highly influential in everything they do rather than just occasionally influential in some areas; assuming genuine benefits do appear from ARM laptops then I'd expect Windows manufacturers to offer devices anyway, and quite possibly sooner.
No, no good guy, but I'd rate Google as the better guy. They've done quite a lot to lead various markets in great improvements, be it in search, maps, email or whatever. They tend to be a bit lax on copyright in places like YouTube and Google Books but I doubt they'd predicate an entire avenue of their business on copyright theft.
Conversely, Oracle's actions with all of the Sun acquisitions are almost a study in trying to use legalistic means to quickly consume whatever value is left as a precursor to dumping the lot.
While the Mac App Store almost surprises by being quick and snappy (though it shouldn't; this is just comparing it to iTunes), it is as the name suggests Mac only. So the Windows people, for whom iTunes performs about a thousand times worse still, would be left out.
That said, the handset itself has a pretty good client — maybe leave iTunes to back up and possibly to organise, but otherwise keep itself out of the software side of any of Apple's network connected devices? If ever an application could do with having its functionality pruned, it's iTunes.
Your comment is disproved by trivial research. From the Apple-specific websites that have reported the rumour:
9to5mac, from where the story originates: "Apple has long used the proxy of iTunes to push updates to its iOS devices ... Smartphone competitors have long offered a different, more direct method for software updates that happens over-the-air."
Mac Rumors headline: "iOS 5 to Finally Deliver Over-The-Air Updates?"
MacWorld: "Other smartphone operating systems such as Android can be updated over-the-air,"
GigaOm: "Smaller, incremental updates like those served to Android might be the way to go, but that would require a significant change in the way Apple approaches updates."
Barnes & Noble operate in the US only, whereas Amazon operate internationally and (finally) launched the Kindle outside of the US last year. So, while it was an oversight not to mention the Nook, it was probably understandable from an article written in a territory where the thing isn't available.
If the new Kindle screen is literally like the Sony ones then it'll be a step backwards because the Sony screens have a noticeably lower contrast, which people tend to attribute to the way the touch stuff is bonded on top. If it's a tablet-style form factor with no buttons whatsoever then it'll be a step backward in usability since you'll no longer be able to hold it in one hand and turn pages, e.g. while using the other hand to grip some part of the tube/bus/train that you're commuting on but which has no free seats.
On the concept of a separate tablet, I see Amazon not just as the only company with a serious chance to challenge Apple at their own integrated market game, but quite possibly the only company that could displace Apple and become the dominant single player — by simultaneously having a ready-made broad retail audience, a strong content offering and, if Android based, being able to get the bloggers on board from day one.
You're assuming it wasn't a bug, even though it (i) was a 'feature' that explicitly wasn't used — the problematic data wasn't posted to anybody or harvested in any way; and (ii) turned up for the first time in a major OS version revision.
Fine, don't trust Apple, but the evidence that this was deliberate is relatively flimsy and the harm it will have done to others will have involved malicious third parties. I'd therefore rate it alongside any other security bug, such as the dozens that crop up in all of the major operating systems and typically lead to malicious code execution, privilege escalation, etc. So it's something that should have been caught, and it may affect your opinion of competence, but it was quite probably just an oversight.
It's what the companies that receive those locations across the network in real time (I think all three of Apple, Google and Microsoft?) are doing with the data that's more scary.
At least on Windows, it didn't perform very well even back when it just managed music but now that it also does movies, apps, photos, podcasts, etc, etc, etc, it really does feel a lot worse. I'm always optimistic that the problem is just the app's heritage (it was an OS 9 app originally) and that the encouraging signs Apple have made by rewriting QuickTime and fixing the Windows port of Safari (for performance and system integration; forget what it does versus your favourite browser) mean that the iTunes problem could be fixed, possibly even in the not-too-distant future. But there's no reason to believe it'll actually happen.
To quote LBJ:
"You do not wipe away the scars of centuries by saying: Now you are free to go where you want, and do as you desire, and choose the leaders you please. You do not take a person who, for years, has been hobbled by chains and liberate him, bring him up to the starting line of a race and then say, 'you are free to compete with all the others,' and still justly believe that you have been completely fair"
It's a question of measure and degree. The iPhone libraries meet probably 80% of the OpenStep spec, which was a pure API effort and was explicitly meant to be vendor neutral. So it's very closely OpenStep related. And OpenStep being explicitly for multiple-vendor implementation, provenance isn't relevant, provided you're willing to conflate OpenStep and NextStep. Which also nullifies your FIAT/Ferrari comment. OpenStep is a framework, not a company.
Android phones, like iPhones and others, implement probably 40% of SGI's OpenGL (since ES 1.0 cuts a very large amount of extraneous stuff and 2.0 culls almost the entire fixed pipeline) and almost none of the rest of the old SGI APIs, along with none of the design patterns.
If you look inside OpenStep source, you'll see NSArrays, NSDictionaries, NSNumbers, target/action patterns, delegation, key-value observing, the same protocols (in the NSCoding, NSObject sense, albeit largely informal), notification centres, a run loop, selectors and fully dynamic dispatch. If you look inside iPhone source, you'll see all of those same things. So it's the same fundamental base objects, the same fundamental design patterns and mostly the same higher up objects.
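As a purely illustrative aside, the two patterns that come up most there — target/action and delegation — look roughly like this when sketched in Python (every class and method name below is invented for illustration; the real APIs are Objective-C):

```python
class Button:
    """Target/action: the control records who to notify and which method to call."""
    def __init__(self):
        self.target, self.action = None, None

    def set_target_action(self, target, action):
        self.target, self.action = target, action

    def press(self):
        # On an event the control sends its action message to its target,
        # analogous to how NSControl/UIControl dispatch actions.
        if self.target is not None:
            getattr(self.target, self.action)(self)


class TextField:
    """Delegation: the object defers decisions to its delegate at defined points."""
    def __init__(self, delegate=None):
        self.delegate, self.text = delegate, ""

    def type_text(self, proposed):
        # Ask the delegate before accepting the change; if the delegate
        # doesn't implement the method, default to accepting.
        ok = getattr(self.delegate, "should_change_text", lambda f, t: True)(self, proposed)
        if ok:
            self.text = proposed


class Controller:
    """Plays both roles: action target for the button, delegate for the field."""
    def __init__(self):
        self.presses = 0

    def button_pressed(self, sender):
        self.presses += 1

    def should_change_text(self, field, proposed):
        return len(proposed) <= 10  # veto overly long input


c = Controller()
b = Button()
b.set_target_action(c, "button_pressed")
b.press()

f = TextField(delegate=c)
f.type_text("hello")
f.type_text("x" * 20)  # the delegate vetoes this one
print(c.presses, f.text)  # -> 1 hello
```

The point being that both patterns run on late binding — a method looked up by name at call time — which is exactly the selector/dynamic-dispatch machinery shared by OpenStep and the iPhone libraries.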
Summary: your "some parts" is a massive understatement; I don't consider it so incorrect to suggest a single lineage as to maintain the article's author was wrong. It's not just that the odd API has survived and it's nothing to do with the legal name of the company involved.
It has an updated version of the language runtime (per the move towards formal protocols and the addition of closures), all the old Foundation classes (with additions), much the same conventions and patterns (target/action, delegation), and the kernel is a much updated version of that which was part of NextStep. Only the user interface library is all new, per the new user interface paradigm — multiple touches and direct manipulation are in; at the C level Display PostScript is out due to licensing costs and a PDF-derived alternative (so, same primitives but no language) is in.
I'd say it is quite closely related to NextStep.
Suggest you do your research. Or, possibly, stop using 'shill' as a generic smear when you've nothing intellectual to say.
Me, on "Kindle beats Apple's closed book on choice": "I have a Kindle and an iPad, and tend to buy a few books a month on the Kindle but have never bought any at all on the iPad"
Me, on "The Microsoft mobile reboot needs rebooting": "Google would argue they're only temporarily closed source, but even if the Android source code is never published again, it'd still be the only one of the offerings from Microsoft, Apple and Google to allow anyone to install any application from any source."
Me, on "Android, Steve Jobs, and Apple's '90%' tablet share": "Android dominates the phone market ... by being on a lot of different handsets, relatively cheaply. It ticks all the boxes that a large proportion of the market care about, which is enough."
Me, on "Steve Jobs vindicated: Android is not open": "Whether an OS is open source is a completely unrelated issue to whether it has an open market in applications, Microsoft Windows being the obvious evidence — it has the most diverse market possible and not one jot of it is open."
Me on "Tesco heralds 2011 as YEAR OF ANDROID": "I was under the impression that 2010 had been the year of Android and that, across the whole market, Android phones were outselling iPhones already. So it's a bit surprising that Tesco have only just caught up."
Me on "Apple 'greed' tax spreads beyond music, movies, magazines", when in a particularly bad mood: "Apple have now made it impossible for others to defend them convincingly. The new charge on subscription services means that the costs of being in the ecosystem now outweigh the benefits for many major companies that you've heard of."
With both mucksie2676 and the other posters making the point that correlation is not causation and, even if it were, taking the one difference in abstract is massively overreaching.
From a completely subjective point of view, I live in London, in a typical London sized property. When I bought books, I tended to read them once or twice, maybe keep them on a shelf for a while but eventually give them away — either to a friend or via those handy Oxfam collection bins. I don't have the space to keep every book I've ever read and I don't really have the desire to. I have a bunch of reference books that I'll probably keep forever but they're in the minority.
Hence, I really don't care about portability of my e-book purchases. I have a Kindle and an iPad, and tend to buy a few books a month on the Kindle but have never bought any at all on the iPad despite downloading the iBooks app and trying the sample copy of Alice in Wonderland. I have the Kindle app for the iPad too, but doubt I've opened it more than about twice.
It's simply that the Kindle is by far the better reading device. It's a much more convenient size for public transport, you quickly forget that it's a screen you're looking at because the type looks a lot better (ie, you can't see the pixels), and the screen is visible in direct sunlight and doesn't attempt to blind me when I'm reading at night. So, it's more convenient: (i) on public transport; (ii) in the park; and (iii) in bed. Which is a clean sweep of my normal reading environments.
I guess it's nice to know that my purchases will someday be portable should a better device come along, but as I've yet to read any of them more than once I'm not really that bothered. It's actually much more bothersome that I can't pass them on having now finished with them.
The primary thing that dictates the best screen size is the size of an average human hand. The bigger-necessarily-equals-better crowd are primarily those that have decided they want to criticise and have worked backwards to find any distinguishing feature. I'd rate it about equal with "the Android user interface is worse because it doesn't have CoverFlow" on the scale of valid criticisms.
Of course, that doesn't mean Apple won't ship a bigger screen. At this point, distinction from the previous year's model seems to be getting ever more slender across the industry, so giving people anything on which they can try to rationalise an upgrade is advantageous.
1080p has a diagonal of almost 2203 pixels, so on a screen with a 50 inch (or 127cm) diagonal that's 44 PPI (or, the other way around, each pixel is about 0.058cm across).
So if you sit 2m away, each pixel subtends an angle of about 0.017 degrees at the eye, or very close to 1 arc minute. A human eye in optimal illumination conditions can distinguish two lines if they're 0.6 arc minutes apart, so technically a higher resolution could be beneficial. But I imagine not really in any of the scenes you're likely to see on TV, which tend to be moving scenes featuring large objects rather than perfectly static shots of typography or, ummm, rakes at a distance. And probably not at all if, like most of the people here, you've spent most of your life staring at a screen that's maybe 45cm from your face.
It's probably worth someone else checking my maths before you take this as a definitive answer...
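For anyone who does want to check it, the arithmetic above is easy to redo mechanically — a quick sketch, assuming a 1920×1080 panel, a 50-inch diagonal and a 2m viewing distance:

```python
import math

# 1080p panel: 1920 x 1080 pixels
diag_px = math.hypot(1920, 1080)   # pixels along the diagonal, ~2203
ppi = diag_px / 50                 # 50-inch diagonal -> ~44 pixels per inch
pixel_cm = (50 / diag_px) * 2.54   # pixel pitch in cm, ~0.058

# Angle one pixel subtends at a 2m viewing distance, in arc minutes
angle_rad = 2 * math.atan((pixel_cm / 100) / (2 * 2.0))
arc_minutes = math.degrees(angle_rad) * 60

print(round(diag_px), round(ppi), round(pixel_cm, 3), round(arc_minutes, 2))
# -> 2203 44 0.058 0.99
```

So roughly one arc minute per pixel, as claimed, which sits just above the 0.6 arc minute acuity limit quoted above.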
There's some sort of refund mechanism, but I've never been clear exactly how it works. There was a minor uproar amongst developers when it was first announced since Apple were taking 30% of the original sale, but then supplying 100% of the refund from the relevant developer's account. I suspect they may just have decided to make the feature a little obscure rather than deal with the matter properly.
What the iPhone had was a multitouch, direct manipulation API. So, in the browser you didn't use a little joystick to step down the page one notch at a time, and you didn't go to a little submenu somewhere to decide whether you wanted 50%, 75% or 100% zoom. And it shipped with industry standard fonts like Times and Helvetica, using Apple's preferred typographically-accurate rendering, on a screen with a then-high pixel density. It was also the first phone to ship with a GPU and with an OS built around the presumption of one.
When you bought an iPhone you also got an unlimited data plan — at the time almost unheard of — and no carrier interference.
Most competitors' phones then rarely shipped without being crippled by the carrier, required a BSc to operate fully and were proud to include exactly one font. I was at a Nokia presentation just shortly before the Microsoft announcement and one of the invited speakers opined that it was a big joke that anyone should care how text on screen looks, as long as it's there.
So, limited innovations if you're going to boil them down to numbers. But that's only true if you're the sort of person who genuinely thinks the correct way to compare devices is a simple present-or-absent feature checklist, and that direct manipulation and physical navigation metaphors count only as "a pretty UI".
The Asus Eee Pad Transformer doesn't have the same specs as the iPad. It has a slower CPU and GPU (see, e.g. http://www.glbenchmark.com/compare.jsp?benchmark=glpro20&showhide=true&certified_only=1&D1=Apple%20iPad%202&D2=Motorola%20Xoom&D3=Apple%20iPad&D4=Samsung%20GT-P1000%20Galaxy%20Tab), being based on an older chip, which it counterbalances with a better camera and slightly larger, slightly higher resolution screen. And it's just £20 cheaper, so all it really evidences is that other manufacturers are able to hit much the same price points for much the same technology.
Android dominates the phone market not by being on any innovative devices — in truth nobody has innovated for years — just by being on a lot of different handsets, relatively cheaply. It ticks all the boxes that a large proportion of the market care about, which is enough.
Google would argue they're only temporarily closed source, but even if the Android source code is never published again, it'd still be the only one of the offerings from Microsoft, Apple and Google to allow anyone to install any application from any source. They also score points against Microsoft for the breadth of the SDK - Java in the VM or C (or anything else GCC can do) directly on the processor. Microsoft are allowing .NET managed code only. Don't expect Angry Birds too quickly and probably don't expect a port of the Unreal Engine at all.
"It just boggles the mind that GPU architecture is still in the sub 1 Ghz, single core days of yore."
I'm not sure where you're getting your information from — to the extent that anybody cares about clock speeds, GPUs exceeded 1GHz a long time ago, and they're all in the hundreds of cores nowadays. The top-of-the-range workstation GPUs from NVIDIA (such as the Tesla C2070) have 448 cores, and those, like the ones on the consumer cards, are fully programmable in C-like languages such as CUDA C and OpenCL.
So I genuinely think I must have misunderstood your comment.
While much of the onerous stuff resulting from the judgment vanished almost immediately upon the change of US president, I tend to think that making it much harder to deny that Microsoft had damaged the industry set the stage for their decade of failure to succeed in new markets (the Xbox being the only exception). It's not just because of products like the Zune, out of date almost immediately upon release (as a regular iPod clone just before the iPod Touch came out), but also Windows Mobile - which was around a long time before the iPhone-inspired smartphone boom but failed to gain any traction. I think a contributor to that was an industry consciously resisting a manufacturer with an antitrust track record and consumers being similarly wary of a tarnished brand.
The main groups that have continuously shouted about Android being open and that being one of the reasons why it is better are (i) Google; and (ii) a particular, vocal segment of Android users. It's those that support Android that have conflated openness and a bunch of other issues, not those that seek to detract from it.
I agree that the conflation is unhelpful and often misguided. I strongly disagree that it is mainly the product of "those with an axe to grind".
... who is perfectly willing to believe that the presenters of Top Gear may have libelled Tesla (though, equally, they may not have done, I don't know) and willing to admit that he doesn't find the programme entertaining at all? I always find their little conversations to be laboriously over rehearsed. I mean, not on the Master Chef level — they're well above that sort of stuff — but so as to make the 'wit' very hard to enjoy.
Thus concludes my highly irrelevant and completely personal opinion.
The piece clearly isn't an op ed, and gives both sides of the story. All of the reasons you give for it being a spent story are within the article - there's no convenient ignorance of facts and no endorsement of either side of the argument. The reporter has even put allegations of bias to one of the main actors, Nimmer, and gone with a headline that suggests the story is no big deal.
It's not bad reporting to say "this person prominently says this, the facts are these" just because the facts are more subtle than the person contends.
@04:45: I'm not sure you've understood my point. Or, more probably, I haven't understood yours. I was trying to make it clear how little most people care whether Android is open source. Whether an OS is open source is a completely unrelated issue to whether it has an open market in applications, Microsoft Windows being the obvious evidence — it has the most diverse market possible and not one jot of it is open.
@13:22: your post has no basis in reality. It's a simple troll. Nobody is bickering about Android 3.0 not yet being open source, it's a simple fact. Quite a lot of people, like me, are pointing out that it doesn't matter in the slightest. You're also wrong to state that Apple's market share is shrinking, as it's still growing, and growing faster than the market as a whole. However, it's growing substantially less quickly than Android definitely did during 2010 and probably still is, and Android shipments were ahead of iOS shipments if you restrict numbers to phones only.
Again, all facts. But this is the Internet, so I'm sure you can find someone who will take your bait.
It's nothing like a poor troll. The point isn't that Android is worse than iOS because it's not open, it's that Google are wrong to say it's open because it's demonstrably not according to their own test. And promising it will be again in the future isn't the same.
To be honest, I think that anybody that relies on "it's open" as the cornerstone of their advocacy for a consumer-facing embedded operating system has already lost the argument. To advocate Android you should focus on the free market in applications, the price and the diversity of devices, none of which this article disputes and none of which are affected by the news it covers.
The story: a sea change in the OS offering from Apple, and all the user-facing changes it brought. Your allegation: it doesn't go into the internals, so obviously it's delusional. Have you ever heard of confirmation bias?
There are a bunch of different bits of underlying technology in OS X. Some of them are Apple originated, some of them aren't. None of them are particularly relevant.
It's just competition in the marketplace, and DLNA isn't actually supported by very many devices. Competition is how we run our capitalist societies. I vote: let Apple license their proprietary solution, let the market decide. If, as you seem to imply, it has already decided then there seems to be little damage that Apple can possibly do.