Retina iMac
Best ever computer. If you don't want one of these right now, you shouldn't be in IT. Or in life.
Confirming what had been widely expected, Apple has unveiled two new iPad models – small and large – and laid out plans for a third. There's also a new shiny 5K iMac. The big news, though, is that the "soft SIM" war has finally begun. At a launch event in Cupertino for its pet hacks, Apple said the new iPad Air 2 will be …
"The models will start at $2499 (£1999), so be prepared to shell out for the shiny thin model."
Yes, it's expensive for a computer in both absolute and relative terms. But it's actually very cheap relative to latest-gen high-resolution displays. For a certain class of user, this actually has clear additional value for once. Actually quite stunning.
The Dell monitor also rotates, so you can use it in portrait if you wish. Also, today a professional monitor is about more than just the pixel count. What is each monitor's color space coverage? Hardware calibration capabilities? Etc. etc.
Anyway, these are just the first wave of 4K/5K monitors; expect more to come. And professionals prefer a computer *outside* the monitor, for several reasons (components changing temperature can affect color rendition, for example).
I like the 5K stunner because I hope Intel's trickle down theory of display resolutions comes true. (1366x768 plz go)
I have a retina screen iPad 3 and a retina screen iPad mini that I use for work.
Both reside in protective cases, so the new thinner iPads really mean nothing to me.
Also, why would I need to upgrade, they both just work.
Nothing 'new' here that gives me a reason to.
Reminds me of the film The Man In The White Suit.
5120 by 2880 pixels, or 14.7 million pixels
Haven't Apple noticed that all their competitors are locked in a headlong rush to reduce the screen resolution of laptops? It's been decided by industry officials that users don't want pixels on their laptops any more. Adding more pixels has been decreed to be the wrong direction. Before about 2003 the market wanted to increase pixel count and eventually reached 1920x1200, then the marketing zeebs decreed that having too many pixels was a bad plan and put the industry into reverse. Zeebs across the world are waiting to see which manufacturer will be brave enough to introduce the fully retro VGA 640x480 display. Or whether they'll decide on a wide-screen format 640x360 display as their ultimate masturbatory fantasy.
Jeeeeez Apple are being just so 20th century here.
"Zeebs across the world are waiting to see which manufacturer will be brave enough to introduce the fully retro VGA 640x480 display."
Personally, I'm waiting for a computer to have just one LED. The computer should give you a choice of outputting text in either Morse or binary format. Pictures are out the window, but who needs them - three year olds? The rush to additional pixels has been an industry blind alley since the fifties.
>Best ever computer. If you don't want one of these right now, you shouldn't be in IT. Or in life.
Please accept my award for "dumbest ever statement".
As a frequent, but not exclusive, Apple user I find it annoying how the slightest positive opinion about their products elicits howls of "fanboi, fanboi". After all, I hardly notice much difference between my OSX bash shell and my VM's Ubuntu. Postgres, Python and Django run seamlessly on both. We are all people of the 'nix, why can't we just get along?
Then I see statements like yours to remind me why snobbery & herd-think is associated with Apple use ;-)
The screen is an awesome change, yes, but I mostly hope it will relaunch a pixel arms race in laptops and/or standalone screens. An iMac is entirely the wrong format for me. That screen should still be going strong long after its computer has become a paperweight. On the other hand it is not portable anywhere. Good for many, not for me.
Kudos to Apple for upping their game and I hope everybody copies high res offerings on computers. If anything it might actually bring down prices on >2560 screens.
STILL way way wayyyyyyyyyy ***WAYYYYYYYY*** too expensive flash capacity bumps.
Ridiculous profit margin on the base model I can handle. A disproportionately more insane, bonkers profit margin on something that has 112GB of extra flash but is otherwise absolutely identical I cannot.
£160 extra for approx £20 of flash (teardown.com's figure not mine). Ridiculous. I'm cashing in my fanboi card.
I'm still unclear on why MS can't start its Surface Pros at less than 800 bucks apiece – $399 is clearly the starting point of interest for any tablet-centric device.
I had a Kindle under warranty with BestBuy that had a dead battery. Took it in last week and they didn't have any new Kindles at the same price to trade out with me, so they gave me a Toshiba Encore Windows tablet that was listed at the same price - $269. Sweet little device, think I'll get a lot of use out of it. I don't really see what it's missing from the basic level Surface Pro, other than a little RAM. But it's still plenty zippy.
Learn something MS. Even Apple, the king of overpricing, knows that you can't start any kind of tablet device at an $800 price point.
Screen size is the same (10 inches), it has a mini-USB port for which I got a $5 cable on Amazon to run full USB devices, it has a display port, and I wouldn't use the pen anyway. The i5 processor would be nice, but the Atom in this one is plenty zippy for web browsing and office documents. If the only real difference is the processor and some RAM, then it's hard to see why the Surface Pro is an extra $500.
This thing is also - full Windows 8.1. It has MS Office full version pre-loaded. I'm running all the normal programs - VLC player, regular Chrome, Adobe Acrobat Pro. No games of course - I'm sure it would run some simple games, but it's more like a netbook in that regard.
With this 10 inch size, you can really see where the Metro apps work - the full screen IE is definitely the best tablet browser. I've heard other people say that, but using it on this thing it's extremely clear to see. Basically they've built it so that all the controls are within easy reach of one thumb or the other - very smart. And all those side swipes and "charms" menus and the split keyboard really make sense when you are using your thumbs for everything.
Enlighten us? There's no such thing as 5K content unless you are downscaling from some uber-expensive 8K recording equipment. There is 4K video. So the 5K will be used for stuff that scales to any resolution, like browser windows or the desktop with icons. Are you saying 3-4 GB of VRAM is needed for 4K video content or... for gaming purposes on the iMac?
It could be 3D rendering...
Anyway, a Nikon D810 frame is 7360 x 4912, while a Hasselblad H5D-50c is 8272 x 6200 – a monitor of this resolution can be useful for high-megapixel cameras, and the driving video card will be in the upper range even if its power won't be much used for 2D editing (though that memory could be used for GPU computation as well).
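Running the numbers on those frame sizes (a sketch; just multiplying out the dimensions quoted above, nothing measured):

```python
# Camera frame sizes vs. the 5K panel, using the figures quoted above.
d810 = 7360 * 4912      # Nikon D810: 36,152,320 pixels, ~36 MP
h5d = 8272 * 6200       # Hasselblad H5D-50c: 51,286,400 pixels, ~51 MP
imac_5k = 5120 * 2880   # 5K iMac panel: 14,745,600 pixels, ~14.7 MP

# Even at 5K, each camera frame is several screenfuls at 1:1 zoom.
print(d810 / imac_5k)   # ~2.45 screens per D810 frame
print(h5d / imac_5k)    # ~3.48 screens per H5D frame
```

So even this panel shows well under half of either frame at full zoom, which is the point: more monitor pixels directly help high-megapixel photo work.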
The screen is 5120x2880 pixels — just under 15 megapixels. At 32 bits per pixel, that requires a frame buffer of 56.25 megabytes. A 2GB GPU can therefore store more than 36 copies of the display before running out of space. So there's plentiful room for all of the things a GPU caches.
What do you imagine the issue to be?
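The frame-buffer arithmetic, spelled out (a sketch assuming 32-bit pixels and binary megabytes/gigabytes):

```python
# Frame-buffer size for the 5K panel, and how many copies fit in 2GB of VRAM.
WIDTH, HEIGHT = 5120, 2880
BYTES_PER_PIXEL = 4  # 32 bits per pixel

pixels = WIDTH * HEIGHT                                  # 14,745,600
framebuffer_mb = pixels * BYTES_PER_PIXEL / 2**20        # 56.25 MB
copies_in_2gb = (2 * 2**30) / (pixels * BYTES_PER_PIXEL) # ~36.4 copies

print(pixels, framebuffer_mb, round(copies_in_2gb, 1))
```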
The issue I imagine is not about the memory buffer but rather about the choice of GPU. The Radeon 290X runs hot as all fuck and this means your shiny new ultra-slim pretty iMac is either going to double up as a hotplate/fan heater or sound like a 767 taking off over your head.
No thanks.
That would be the issue raised by someone who titled their post "Radeon 290X to drive a 5k display?"
As to heat though, I'll bet Apple has just underclocked it. The advertised fill rate for that GPU is 64 gigapixels per second; enough to fill every pixel on screen more than 4,340 times a second. Let's suppose they underclocked it by 50%, which they won't have, that's naively enough to paint every pixel 36 times at 60 frames per second. Or more than enough to paint every pixel once even when you factor in that advertised fill rates aren't really achievable and blur translucency costs quite a few samples*. And OS X does not repaint every pixel for every frame, it just reacts to the changes (yes, with some redundancy because it's a broad-phase test).
* but not as many as you'd think; per empirical investigation, Apple uses a smart mix of mip-map samples to very closely approximate a Gaussian in four or five samples.
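The fill-rate arithmetic above, as a back-of-envelope check (a sketch; the 64 gigapixel/s figure is the advertised peak, and the 50% underclock is purely a hypothetical worst case):

```python
# How many times could the GPU fill the whole 5K screen per second/frame?
PIXELS = 5120 * 2880     # 14,745,600 pixels per full-screen paint
FILL_RATE = 64e9         # advertised pixels per second

fills_per_second = FILL_RATE / PIXELS               # ~4340 full fills/s
underclocked = FILL_RATE * 0.5                      # hypothetical 50% underclock
paints_per_frame = underclocked / 60 / PIXELS       # ~36 paints per 60Hz frame

print(round(fills_per_second), round(paints_per_frame))
```

Even the deliberately pessimistic figure leaves dozens of full-screen paints per frame, which is the argument being made: raw fill rate is not the bottleneck.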
Android launches occur all the time and Google's own hardware doesn't dominate the market. It's therefore less newsworthy.
As to your random accusation of bias? Let's check the historical record:
After launch, the newer Nexus 7 sold fewer than a million units a month: http://www.fool.com/investing/general/2013/11/11/why-isnt-googles-new-nexus-7-outselling-its-predec.aspx . The iPad Air sold more than five million units a month after its launch: http://www.wired.com/2014/04/apple-q2-earnings-2/
The BBC put both of the launches of the last two days on its front page (see https://web.archive.org/web/20141015182903/http://www.bbc.co.uk/news/ ).
The difference is that the launch that's likely to interest more than five times as many people got one of the picture boxes rather than just a single line, giving it much greater prominence.
I'd suggest this is because the BBC reports based on likely reader interest. It's pretty much a given that the iPads will massively outsell the tiny subset of new Android devices that was announced yesterday.
No, the BBC is (was?) so biased towards Apple that I stopped reading their tech section because everything there that mentioned Apple was as much an advertorial as a Stephen Fry Apple review.
But then the BBC has seriously dropped in my estimation over the last couple of years anyway, just for the way they started to dumb everything down to the point where the story was totally missing, and started doing the Fox News trick of making everything a horror story for the sensationalist value.
They are supposed to be a serious, unbiased news organisation for adults (unless you are looking at the Newsround / Newsbeat sections) and I just don't see that anymore.
Or more to the point, complain and stop reading it, which is in fact what I did (at least for the tech section). Of course it isn't going to make a difference, as the BBC is staffed by creative types for whom nothing but an Apple will do. (Being an Apple fan I can live with, but if you want to write fanboy pieces then write for MacWorld or something similar, not a serious news outfit that is supposed to be balanced.)
I was simply agreeing with the person who was accusing the BBC of bias towards Apple. Which they have.
As for the teenager comment. I wish! (Actually I don't, they were terrible years! :) )