* Posts by Andrew Garrard

151 publicly visible posts • joined 29 May 2007


Smarty Ring promises technology at your fingertips

Andrew Garrard

Re: Scott Adams - Visionary

Meh. I was thinking about it longer ago than that. Could do with a gyroscopically stabilized picoprojector, really. And I was kind of hoping for powering it by a similar mechanism to electronic self-winding watches.

I really must get a 3D printer and try to make all the things I "invent"...

Google: YouTube fights off HUGE ASCII PHALLUS MENACE

Andrew Garrard

8//=/\-, ?

Yes, I can see the susceptible hormone-drenched youth of today being corrupted by such images, and nothing to do with what's advertised at them every day.

If only we could return to the romance (or sexism, or rape and pillage, depending on your perspective) of the past.

@>-`-,----

Richard Stallman decides Emacs should go WYSIWYG

Andrew Garrard

XEmacs?

I'm not 100% sure what he's after, but XEmacs had quite a lot more integrated font and image management than Emacs did, and there was a big war about it, in part because of Stallman not wanting to take his hands off the Emacs source code (as I understood it). Things come around...

Anyway. I still occasionally use XEmacs, though it hasn't been updated much for a while. Emacs itself seems to be a bit more current, so I'm currently running that. (And yes, there's a good reason I still use these editors, and it's not just meta-x butterfly or meta-x dunnet.)

Smartwatch craze is all just ONE OFF THE WRIST

Andrew Garrard

So long as it's a good watch...

I had, and used, that Casio data bank watch. If the alternative for looking up a phone number was a FiloFax, actually it had merits. Also, I'd been known to use the calculator. I always wanted a Seiko RC watch, but never persuaded my parents to buy me one.

I've since been through a series of Timex DataLink watches, which could have done with a better interface but remained perfectly good digital watches. I occasionally programmed them to do something interesting, like telling me where to go next on a pub crawl and recording whether I was ahead of or behind schedule compared with previous years. I never got around to converting to devices that were more programmable but worse as watches, like the Fossil Palm watch or the Ruputer.

I do have (and am currently charging, for amusement value) an OLED watch that plays "MP4" video. It's awful - you have to press a button to get it to tell the time, it runs down in a day, it's huge, plastic, and fogs up if you get sweaty. It also came with the wrong driver CD (unless there's a Motorola modem in there as well). However, for its intended purpose - winding up a colleague on a circuit board design team who was complaining about fitting things into a phone-sized form factor - it worked perfectly.

For the last year or so I've been wearing a non-programmable watch, which cost me about £10 from Argos. It had a horrible user interface - most notably going into "tell me what angle the sun is" mode whenever I bent my left wrist and accidentally pressed a button, a mode which could only be exited by pressing a specific one of the other five buttons - so it was rarely actually telling the time. I'd have fixed that if I'd had a programmable option.

As of a week ago (because of the waiting list), what's on my wrist is a Pebble. It's a vastly better watch than the previous thing, and much easier to program than the Timex (no more 6809 assembler). I'd prefer not to have to charge it every few days, but at least it tells the time faultlessly and doesn't need charging more often than once a night. I've no intention of using it for phone notifications, but there are some cases for which having something programmable on the wrist is more useful than fishing in your pocket.

Would a Google Glass (or your choice of similar wearable headset) be even better for always-on access? Possibly, but a watch is a lot less intrusive, and everyone seems too concerned about Glass wearers looking stupid for that to be an option. Supposedly women, especially, are now carting around two phones, one dainty and pocket sized and one with a huge screen for web surfing (men may do this as well, but those with flabby thighs like mine have less problem fitting a large phone in a pocket). So, for now, watches are as good as any option for some scenarios. Esoteric ones, maybe, but don't make life harder for those of us who want to make life easier for ourselves. Otherwise we'll have to resort to Glass, and you will be assimilated.

Panasonic whips out MONSTER fondleslab for serious S&M sessions

Andrew Garrard

Could be worse...

Well, it's cheaper than the T221 was, and it's only taken twelve years (I think) to get a higher resolution display than the one sitting in my study. I second the Cintiq comment - as a graphics tablet, great idea. As a content consumption device with finger input? Not so much.

Give us a break: Next Android version to be called 'KitKat'

Andrew Garrard

Re: All that crap

Quite. I still prefer to avoid Nestle (although I'm not absolutely strict about it). Not something I'd want to associate with if I had "don't be evil" in the company motto, though. I have a vague hope that it's going to be a deliberate name-and-shame policy and they'll change back.

I kind of think they did this just to spite everyone who assumed Key Lime Pie. Which is a pain for people who actually have source code with that moniker in it, of course. The "people don't know what key lime pie tastes like" argument seems rubbish, at least (lime, duh).

Star Wars revival secret: This isn't the celluloid you're looking for

Andrew Garrard

Not that unusual, is it?

I thought there was a fair proportion of the Hollywood output still being shot on film, that being one reason that some film companies are still going. I'm sure Red et al. are making a big dent in it, but I happened to go and watch The Wolverine recently (for not very justifiable reasons) and, going off the bokeh, it was obviously shot with an anamorphic lens. While it's technically possible to stick one on a digital sensor, by far the most sensible reason would be to shoot film. So it's not dead yet, even if I have a fridge full of the stuff due for development.

Now, I need to get myself a 5x4 before Fuji discontinue any more stocks...

Apple's Siri 'hurls insults' at Google Glass

Andrew Garrard

Nope, not seeing the pettiness.

Having eventually navigated the links to find out what the article was talking about, I agree with the above users. Apple, like many companies, is perfectly capable of being petty when it wants to be, but I really don't see how any of the responses mentioned are insulting Glass. Even the "half full" one is hardly a snipe, and the rest are just joshing with a user for mixing interfaces. To be honest, given time, someone with both devices will genuinely and accidentally try to communicate with Siri as they do with Glass, just as it's common when underslept/drunk/having an aneurysm to try to open a door with the wrong keys or use the wrong remote for the TV (and I live in fear of being too tired when the laundry bin kept in my upstairs toilet has its lid open). I'd far rather get an attempt at providing an amusing response than gibberish - as Infocom could tell you.

There's plenty of pettiness on the internet. Let's pick on the real stuff rather than on engineers attempting to retain their sanity by a little expression of humour. I'm sure the responses had to be cleared by someone customer-facing anyway, and if they couldn't find a way to be offended, I'm sure all that's here is imagination.

Now, there are plenty of more commendable and genuinely snide comments out there. The Amiga's Kickstart 1.2 message about Commodore or the message on the CVAX die, for example...

Linux kernel dumps 386 chip support

Andrew Garrard

How not to build a 32-bit CPU

Ah, the 386. I remember when the first Compaq 386 machines came out (I believe at 20MHz, not 33MHz, though Wikipedia tells me that slower 386s were available), and my inner fanboy disliked the fact that it took the performance crown for desktops back from the 8MHz ARM2 Archimedes machines (probably as measured by dhrystone, although BASIC may have been involved). I'm sure my copy of Structured Computer Organization contains some comment about Intel "finally making a decent CPU", but unfortunately I'm a few thousand miles away and can't check - anyone got the red edition?

The 486 was a bit nicer as designs go, excluding the slight problem of getting everyone to optimize code in a way that was pessimal for Pentiums. I still wish IBM hadn't decided to use the chip from their printers and had gone with the 68000 series from the start (and that, if we were going to end up with thirty years of compatible machines foisted on us by Windows, we'd ended up with anything less crufty than x86), but at least it meant that near and far pointers weren't always obligatory...

Record €1.47 BEELLION EC fine for price-fixing display cartels

Andrew Garrard

I'm confused...

How did we get from Samsung and LG being fined 1.47 billion Euros to them having a total of about 345 million Euros, of which Samsung's was only 151 million (and smaller than the fines for Philips - who were also fined more than LG - and Panasonic)? Even collecting the four mentioned companies together, there's half a billion Euros unaccounted for. Or did I hopelessly mis-read it? Not that Samsung are unused to being fined a billion, but it'd be nice to know what's going on. (Disclaimer: I'm employed by Samsung; this may hit my salary...)

Author of '80s classic The Hobbit didn't know game was a hit

Andrew Garrard

Ope do

Ah, that takes me back. Not that I ever completed it as a kid. I really must go back to it. I still tend to think "ope do" when opening doors, which is a bit worrying now I come to think about it.

After this, the Lord of the Rings game was a big disappointment, mostly because it barely worked (maybe I had an iffy tape). Although that's how I got my copy of Fellowship of the Ring, so it turned out all right in the end.

Word wonks insist GIFs are really JIFs

Andrew Garrard

People who don't read up...

It's very publicly been known as having a soft G since its inception, at the request of its creators. You may have been pronouncing it with a hard G for all that time, and it's more recently become explicitly accepted that this is okay, but historically the hard G was plain wrong, if common. These days I don't bite anyone's head off over it (though I still twitch whenever my colleagues say it with a hard G), but I'm not going to accept people claiming that a soft G is wrong.

For the record, Linux historically came with an indication that Linus preferred it to be pronounced as he did (sort of Leenuss, as I recall), with a secondary preference that one should pronounce it as one would pronounce his name (in my case, natively, Lie-nuss). He didn't like people attempting to pronounce it like his name and getting it wrong, as in "linnux". I believe he has since changed his mind on this, not least because "Linnucks" is so common. I still say "Lie-nucks".

Oh, and Risk-Oh-Ess, for what it's worth.

Apple seeks cooling fan patent for iPhone, iPad

Andrew Garrard

Hawtness

"All this hardware oddness begs one question, of course: why would low-power ARM-based chips such as Apple's A series, fabricated in increasingly smaller and therefore increasingly less power-hungry manufacturing processes, need to resort to such a complex, space-wasting scheme as fan-based cooling?"

Well, my elderly HTC Sensation gets nice and toasty even playing Angry Birds. Run something with actual 3D requirements and there's no way to avoid heat being generated. ARMs and embedded graphics cores are very efficient, but we're still talking multiple >1GHz processor cores and a lot of graphics, usually in a fingernail-sized chip, and they're not magic. Unless Apple are getting their performance by fabbing with superconductors and Josephson junctions, of course.

Microsoft demos real-time English to Chinese translation

Andrew Garrard

Digital babelfish, how I miss you

Some who remember the earlier days of the internet may recall one of the first popular public digital translation services, babelfish.altavista.com, since moved to babelfish.yahoo.com, and now redirecting to Bing translate. Altavista, of course, started out as altavista.digital.com, set up by Digital/DEC, and some of us remember when it was the search engine of choice.

So, no - a digital babel fish is not closer. Microsoft have moved Babelfish several steps away from Digital.

That said, I'm often astonished that Google Translate manages to produce output that isn't even a coherent sentence, let alone a correct one. I always assumed that something in the implementation of these things must understand some rules of grammar, which ought to make that kind of failure tricky to produce.

Apple screen maker preps 'retina' iPad Mini panel, pants reporter

Andrew Garrard

Unprecedented?

In this sense?

http://www.theverge.com/2012/6/1/3056490/sharp-caac-igzo-498-ppi-display-prototype

Just how good is Nokia's PureView 41Mp camera tech?

Andrew Garrard

No magic

"Instead of capturing the output of each individual pixel separately – as sensors normally do – the trick is to combine the output of groups of individual pixels into a larger pixel."

Which is in fact exactly what happens on most cameras if you save a lower resolution image than the default provided by the sensor. And, indeed, it normally improves the noise handling. I've no idea why everyone is making a big deal about this, other than that Nokia can justifiably claim that the sensor resolution is not detrimental. (Now, arguing about whether Nikon should implement a "small raw" mode on their high-end DSLRs is another matter.)
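Something like the following toy sketch (my own illustration, not Nokia's or anyone else's actual pipeline) shows the idea and why it helps with noise:

```python
# Toy illustration of pixel binning: average non-overlapping 2x2 blocks of a
# sensor image into one larger "super-pixel". Averaging four independent
# samples roughly halves the per-pixel noise (1/sqrt(4)).
import numpy as np

def bin_2x2(raw):
    """Average non-overlapping 2x2 blocks of a (H, W) array."""
    h, w = raw.shape
    cropped = raw[:h - h % 2, :w - w % 2]
    return cropped.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

rng = np.random.default_rng(0)
scene = 128 + rng.normal(0, 8, size=(1024, 1024))  # flat grey scene, sigma = 8
print(scene.std(), bin_2x2(scene).std())            # roughly 8 vs roughly 4
```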

"This technology allows the 808’s sensor to capture as much light information as much larger pixels and sensors would."

The fact that the 808 has, for a phone (and most compacts), a chuffing enormous sensor and a fast lens means that it can capture a lot of light information. The amount hitting the sensor is the amount hitting the sensor. The amount per pixel is small, but DxO do their noise tests normalized by total image area, and exactly the same concept applies. So "[fewer] larger pixels", yes - "larger sensors", no. So much FUD about downsampling...

Anyway, nice bit of kit. I might pick one up as a compact camera once their price drops to clearance levels. Not so tempting as a phone, though.

Andrew Garrard
Facepalm

Re: Well, once I ascertained that Nokia was true to its word

Oh yes, they're completely lying. It's *only* 38MP. Which isn't interpolated (except in a Bayer sense), that's actual sensor sites.

Let's pick our fights?

Nokia apologizes for faking Lumia 920 ad

Andrew Garrard

Re: And still down-playing themselves?

Fair enough (and my bad). In that case, they're vastly over-blowing its abilities, simply by the laws of physics. Hence the "simulated", I expect.

Andrew Garrard

And still down-playing themselves?

So, assuming this is actually done with image processing (and yes, it's normal to capture a larger frame and calculate a shift within it), why have they decided to call it "optical" image stabilization? Optical solutions fix camera shake, but do nothing to handle subject movement, like the bouncing Nordic woman, which is why professional sports photographers still have large aperture lenses. In extreme cases, as here, keeping the subject static in the frame would result in the background bouncing around, due to the change in perspective from the moving camera position (something Canon have tried to fix in a stabilized macro lens, but not for riding a bike).

There *is* a lot of research into stabilizing/removing blur independently from separate bits of the image - some was presented at SIGGRAPH this year, and Adobe explicitly stated that their work on this was the reason that they'd not yet released their camera-shake-removal technology (demoed recently) in Photoshop. But "optical" it's not.

If it's image processed, you may as well own up to it. Of course, if there really is a stabilization element in there, I take it all back - but it's quite possibly not the best solution.

These things always look good in demos. I'll reserve judgement until a real world test, although I don't think my DSLR is going anywhere.

Hard-up Kodak selling consumer film biz

Andrew Garrard

Re: So they think that the market

I agree that it seems unlikely, at a time when everyone's moving to Red and other digital imagers and shooting stereo at 60fps. I do think that ditching the consumer film business is a mistake, in that they have a small but loyal base and, like Ilford, someone continuing to make Kodak film stock will always have a market, especially in the formats not supplanted by digital. But since Kodak have been mis-judging the film market since APS and Disc Film, I'm not sure that they have a concept of sticking to what works (even if, admittedly, the market has shrunk a lot).

In as much as anyone's printing anything at all these days (the vast majority of images stay in digital form), a lot of people are either using commercial print services or local shops - because they're plenty good enough and more convenient. Of those who print a lot at home, as far as I know the big names in photo ink jets are Epson, HP and Canon, with companies like Lexmark and Kyocera sniffing around. I'm vaguely aware that Kodak make printers, but I've never been under the impression that they had a significant market segment. If nobody's buying your printers, you can't make money on the ink; if you make budget printers rather than the market leader, it's more likely that your customers will buy cheap off-brand ink.

Kodak have never been a leader in (consumer) digital imaging - they're just not an electronics company, and they're not going to compete with Canon, Panasonic and Sony. They don't have the optical and ergonomic background of Nikon or Pentax that allowed those companies to get a foot-hold in the digital market - Kodak haven't been a halo brand name for cameras in a very long time.

They're a film company. They've been a film company for a very long time. That the market for film has drastically shrunk is unfortunate for them, but trying to reinvent themselves into other sectors where they've not been successful isn't going to make them great. From a customer's perspective, I'd like them to continue doing the things that only Kodak do (making some proprietary emulsions), cut their costs, and try to find some new area in which to invest. Throwing out their only unique products while attempting to become a profitable box shifter for consumer goods doesn't seem like a viable long-term strategy.

My only hope is that someone like Fujifilm decide to pick up the film plant and keep making the emulsions, but given that they, too, are discontinuing some films, I don't have much hope. Maybe the Impossible Project will pick it up.

Andrew Garrard

Film/film ambiguity

So they're keeping their "commercial film" (is this motion film?) but getting rid of the consumer stuff? (I assume they don't mean commercial vs consumer photographic film.) And they're keeping their ink jets; really?

I hope someone picks up the slack. Otherwise, I'll be off to stock up on Portra - to go with my stocking up on Velvia after Fuji discontinued that. They're not making it easy to buy into a 5x4 camera system...

Nikon D4 DSLR review

Andrew Garrard

And the rest?

Interesting review. To me, a lot of the sample images are focussed in odd places (not on the eyes, at least); I don't know whether that's the AF system or the photographer. I've mostly heard bad things about the video on the D4, apparently because the downsampling algorithm introduces unnecessary softness (the 1:1 crop mode is okay, but why use a D4 for that?) The D800, which probably just throws lines away, is apparently much sharper. Interesting that there's no mention of the new AF switch position, which is already annoying me on my D800E. I'm not sure that mapping the AE-lock to pressing the joystick is an improvement, either. I'm sure grid lines were available in the finder on the D3s - I've always used them on my D700 (and they help me keep my horizons straight).

This is obviously a camera with a purpose: it's a work horse for high speed, low-light shooting. Journalists and sports shooters will love it. It's not a consumer camera, or a megapixel monster, so comparing it to a 'blad (or a D800) is pointless, as is talking about the price - the right image can sell enough newspapers to justify it, and the price is still lower than many pro lenses (and almost exactly the same as the 200-400 f/4 shown in the review). Consumers shouldn't feel they're missing out if they can't afford one and have to settle for a D3200 instead; each tool to its own place. So no more "how much?" comments, please.

And film? It does have its place, but its place isn't in low-light, high frame-rate, fast turn-around shooting. I have, and use (mostly for flowers that look better on Velvia than digital), an F5, but I'd be mad to try to take on a D4 (or D3) with it for the tasks it was originally designed for. Pros do still use film - but more often in a cheap camera for portability and travel, in a Leica for subtlety (those who haven't spent D4-money on an M9), in medium format for quality or in large format for the ability to fill a wall with a sharp image. Even my Pentax 645 isn't going to get much use now I have a D800, though a Mamiya 7 or a 5x4 (on my shopping list) would be another matter; for what they're good at, they'll smoke a D4 - but then so will a D800. Photojournalist use of film died with the D1 (and especially the 1D), which is why the F6 is so different from the F5 and aimed at prosumers, and nobody's updated a film camera with a modern autofocus system. Shooting through rose-tinted spectacles results in poor images. Someone was using large format at the Olympics to get some interesting images, but I'd be astonished if any pros were using 135 film, at least in an SLR.

Review: Samsung Chromebox

Andrew Garrard

Re: As expected

Absolutely. I wonder why Microsoft decided to do it.

Andrew Garrard

Dragging to external devices

For what it's worth, I don't really know *what* to expect when it comes to dragging files around. I established in the 90s that Windows had different behaviour depending on the source and destination location, decided that this was one of Microsoft's usability nightmares (I have USB drives that don't behave like removable devices and systems that come up with their drives in a different order; worrying about the default behaviour is the last thing I want), and I've been right-mouse dragging files (and selecting copy or move from the menu) ever since. Oh, and occasionally I use Ctrl-X/Ctrl-C and Ctrl-V. But then I was brought up on Acorn systems, which let you choose whether you wanted a copy or move by which mouse button you were doing it with, so trying to second-guess based on the device type was never a problem.

Not that this makes the Chromebox any more usable from the sounds of it, but "not like Windows" doesn't mean "broken". That statement is really quite often true.

Sony SmartWatch Android remote

Andrew Garrard

Re: No chance. NO. CHANCE.

Ah, I wondered about the "press a button to tell the time" problem (I've owned an OLED watch before). Of course, in theory, with OLED you could make the screen mostly black and it *ought* to use very little power. I still think the Pebble eInk solution looks better, if only they'd get it shippable.

Andrew Garrard

I'm vaguely tempted

My latest version of a DataLink has got to the stage where I'm probably never going to program it again: the strap is mangled and the battery is dying. I was on for a Pebble (eInk appeals, although if the previous product were still made I might have been tempted by that too), but they're not out yet, and I'm struggling to find other smart watches that are actually being made - at least, any smarter than a DataLink. I'm less interested in syncing with the phone than in running my own apps.

I'm not quite clear from this whether users can write their own stuff for it (I need to do more reading). Knowing the resolution of the screen would be nice, too.

Transit of Venus, live-ish from Australia

Andrew Garrard

Good old British weather

I was up at 4:15 (after watching first contact on a web cam before going to bed - good work Hawaii, not so impressed by the guys from the continental US who were filming each other and not the actual sun), ready to go out and meet the sunrise. But I saw the weather and stayed in bed.

Then I got up around 4:45 just in case, and watched some more live streaming from Oz and from Hawaii. And it stayed cloudy, but with the occasional thinner bit.

About 5:45, after Hawaii had reported third contact and while Norway were showing parallax, I realised the sun might actually hit the house. So I ran upstairs, ignoring all the exciting telescope stuff I had with me, and pointed a (stopped down) lens right at the sun. Lo, the sun had a tiny bite out of it (I was slightly nearer third contact than fourth), and I have photographic evidence (and hopefully not a knackered camera) proving I was looking. It turns out that clouds work as an extremely dangerous alternative to a solar filter. Of course, if it had been sunny, I would have had more time to project an image onto paper.

I've heard the "refractors good. reflectors bad" argument before, but I'm very confused as to why. I'd have thought (enclosed) refractors are more likely to overheat than an open reflector (incidentally, *all* Dobsonians are reflectors, by definition). The only reasoning I can think of is that reflectors tend to be faster (shorter focal length per aperture), but that's not strictly a reflector vs refractor difference. Anyone care to educate me?

Strong ARM: The Acorn Archimedes is 25

Andrew Garrard

Re: Still have a mint-condition A5000 Alpha (33MHz!) in its box.

Yup, studied the textbooks, own "The Design of Everyday Things", did HCI as part of my CompSci course, was a member of SIGCHI for a while. The Mac/Amiga solution is better than trying to aim for the top of a window - as you say, there's a hard stop. However, it doesn't scale well to large or multiple monitors. The Acorn solution of popping up a menu in the same position relative to the mouse meant that muscle memory for menu access worked very well - compared with flinging the mouse at the top of the screen between each interaction, at least; you're incorrect in claiming the need for "greater aim", because the menu was already under the mouse when you started. It's true that context-specific menus (changing the mouse pointer, there's an idea for Microsoft...) needed aim, but no more than pressing a button.

There were plenty of keyboard shortcuts available for common operations on RISC OS, but they were much less necessary than on other systems - claiming power users weren't what the mouse was designed for doesn't mean that making the mouse interface as powerful and usable as possible was a bad thing. Sure, Impression Publisher (which had its own hot keys should you want them) isn't as powerful as InDesign (although it can occasionally give Word a run for its money), but InDesign is unusable if you're using one hand to hold the reference document that's the source of your layout. I'm a little confused as to which keyboard you've been using that has only "Alt" and not "Alt Gr", let alone separate Windows and Menu keys, but - much though I love Emacs - ctrl-alt-meta-cokebottle-x is not a user-friendly short-cut. Acorns had a "copy" key that, who knew, copied things.

I like Macs, but my HCI lecturer was a bit prone to claiming that their interface was perfect - notably "don't make nested menus too deep because you have to click every time to expand them" (not on RISC OS you don't). Acorn's interface guidelines would still do a lot of modern developers good - particularly "never write a large amount of text in a dialogue box and put OK and CANCEL at the bottom - name the buttons for what they do". Acorn never asked you to eject a floppy disk by dragging it to the recycle bin or popped up a "disk not recognized - format? [ok]" box, nor did it expect you to shut down the system by clicking "start" or decide whether it was going to copy or move a file according to where you were putting it. There were some really nice touches - expanding a window off the bottom/right causing the top/left to grow springs to mind.

Not that everything was perfect. It's nice to be able to resize windows from more than one corner (I had a plug-in, although I still think twm had the nicest solution). There was still the odd UI clanger ("Please insert RISC OS 3 ROMs and press any key to continue"), as a co-operative multitasking system it could still get locked up by a misbehaving app (although app killers helped), and it wasn't as dynamic or secure as a modern OS. But some stuff really was done right, and still isn't by almost anything else.

Andrew Garrard

Re: Lander in BASIC?

[And, while I'm eating my words, the reset button on the Archimedes was, of course, on the back of the keyboard, where it was useful if slightly prone to getting poked by the keyboard's coiled cable - although it meant the keyboard was nonstandard. The RiscPC's "normal" PS2 keyboard meant the reset button was, as I said, at the back of the machine, where you'd hit it plugging in headphones. Clearly I'm going senile.]

Andrew Garrard

Re: Lander in BASIC?

[Okay, I've found an article that claimed that Lander was in BASIC, although I suspect it was only the boot code. Unfortunately, because everyone had a copy, I'm having a little more trouble finding a binary to check. I'm prepared to eat my words, though.]

Andrew Garrard

Re: You flew polygon-built ship over a 3D landscape

Amateur. :-)

Andrew Garrard

Re: Lander in BASIC?

I'll believe you, it's just that this thread is the first I've heard of it. There can't have been much logic on the BASIC side, and even a CALL statement to thunk between the two had quite an overhead, so it just seemed unlikely to me. If there's a reference to this, I'll be interested. (Or I may be able to find my old disks and have a look.)

I'm prepared to believe that it might have used BASIC to set itself up, but that seems less likely than doing any BASIC when the program itself was running. I'll go and google this now, but I would have thought that I would have remembered...

Andrew Garrard

Re: Still have a mint-condition A5000 Alpha (33MHz!) in its box.

I absolutely agree about the mouse. Even after Microsoft eventually worked out that their mice had a second button, mousing on Windows still seems stupid compared with the Acorn approach. (Actually, menus at the edges of windows are the worst of all possible worlds - not near the mouse, not where you can get them quickly. I take the Amiga's scheme - like the Mac but invisible until you hold down a mouse button - as second best.)

Add in MouseAxess (we don't need no stinkin' window borders to move things...) and you got a system which was far more usable without a keyboard than most modern PCs.

Which brings me to the things you can do with the three button mouse. None of this "shift-click" to multiple select, that's what Adjust was for. Drag a window without bringing it to the front? Use Adjust. In the file manager, decide whether you want a new window for the directory you're entering or to re-use the current one? Select or Adjust again. I want to say the same thing about the difference between a copy and a move, but it's been too long for me to remember. And, of course, Acorn had the most sensible file save mechanism I've seen (why on earth does every application in Windows need its own way of viewing the file system?)

Ah, rose-tinted goggles. Shame about the lack of pre-emptive multitasking...

Andrew Garrard

Re: Arthur

I'm not sure about "problems" as such, although it wasn't until RISC OS that there was a proper multi-tasking interface. The draw module is an epic piece of useful coding, vector font rendering (as of later versions) was way ahead of its time (although so was the bitmapped antialiasing of Arthur), and in the newest versions the SpriteExtend module (dynamically expanding JPEGs to render them stretched and dynamically mapping the output to the screen palette...) makes me wonder why a lot of modern systems struggle so much.

Hands up if you remember the dark blue/light blue version of Arthur?

Andrew Garrard

Re: Lander in BASIC?

(Since I've been sad enough to look it up, Virus also didn't adaptively shade the spacecraft according to surface angle like Zarch did, possibly more because it didn't run in enough colours to do so rather than anything to do with calculations.)

Andrew Garrard

Database software

PipeDream, perchance? It was a database-cum-spreadsheet-cum-word processor. Later ported to the Z88.

Andrew Garrard

Lander in BASIC?

I'm not sure why people are saying that Lander was written in BASIC. To the best of my knowledge, it was pure ARM code, and I'd be astonished if it went anywhere near Acorn's triangle drawing routines. BBC BASIC on an Archimedes is fast (and the Arthur desktop, though not the RISC OS one, was written in it), but it's not *that* fast. Minotaur, one of the first commercial games launched for the Archimedes (alongside Zarch), *was* written in BASIC, however.

(Speaking of BASIC, I'm not sure about this "press reset twice and you get the program back" thing. Not in my memory you didn't. I also remember the reset switch being on the back, right next to the headphone socket, where it was easy to reset the device when plugging headphones in. It was still on the back on the RiscPC, even though the power switch was on the front. Never understood that...)

As for comparisons to Virus, I believe Virus had some more enemies - it got harder faster than Zarch - but it was also noticeably less pretty; for example, there was no depth cueing of the background (in Zarch, everything got darker towards the back of the screen). I'd assume that the Amiga version used the blitter, although since there was an ST port I can't guarantee that. I've never played the Spectrum version, but it's high on my "most preposterous port" list. Lander, of course, didn't have all the enemies, let you blow up on the launch pad, and didn't clip the front edge of the landscape properly - but it was very good at training people to play Zarch! (I still maintain that I ought to be able to fly a helicopter, should I ever need to, because of this game.)

Part of the exciting colour scheme of Zarch was that it could use the 256 colour mode, back when the best PCs had original VGA graphics. The 256 colour mode had 16 palette entries (that most people didn't touch) and the rest of the values derived from them; the default mapping was an effective perceptual HSV scheme, accessed from BASIC by 64 base colours (setting the top two bits of each channel) and four levels of "tint" (setting the bottom two bits of all three channels at once), giving you fine-grained luma control. It might not have had all the colours of HAM6, but it was pretty effective. Despite a brief foray into VU-3D on the Spectrum, it was probably Render Bender that taught me to think in 3D graphics (and now I work in graphics professionally).
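For anyone curious, here's a rough sketch of the base-colour-plus-tint scheme described above (my own illustration, not the exact VIDC palette wiring - the ordering of the channel bits within the base colour is an assumption for the sake of the example):

```python
# Illustrative sketch only: a 6-bit "base colour" supplies the top two bits of
# each of R, G and B, and a 2-bit "tint" supplies the bottom two bits of all
# three channels at once, giving 4-bit-per-channel output.
def base_plus_tint(base, tint):
    """base: 0-63 (two bits per channel), tint: 0-3 (shared low bits)."""
    r = (((base >> 0) & 0b11) << 2) | tint
    g = (((base >> 2) & 0b11) << 2) | tint
    b = (((base >> 4) & 0b11) << 2) | tint
    return tuple(c * 17 for c in (r, g, b))  # scale 4-bit channels to 0-255

print(base_plus_tint(0b111111, 3))  # full white: (255, 255, 255)
print(base_plus_tint(0b000001, 1))  # dim red with the lowest non-zero tint
```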

Andrew Garrard

Re: 1 colour?

Actually the high res mode (which needed a special monitor) was 1152x896, using one bit per pixel. Sometimes it's worth using Wikipedia for fact checking. All the Arabella-based systems could do 1, 2, 4 or 8 bits per pixel. There was also a bash at resolution independence - coordinates were downsampled according to the mode, so in 1152x896 one coordinate step mapped to one pixel, but in mode 12 (640x256, 16 colours) pixels were two coordinates apart horizontally and four coordinates apart vertically (the next pixel above and to the right of 0,0 was 2,4). Mode 13 (320x256, 256 colours) was downsampled by 2 in each direction.
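A quick sketch of that coordinate-to-pixel mapping, using only the step values quoted above (the function and mode names are just for illustration):

```python
# Coordinates per pixel, taken from the figures above: 1152x896 maps one
# coordinate step to one pixel; mode 12 puts pixels two coordinates apart
# horizontally and four apart vertically.
STEPS = {
    "1152x896": (1, 1),
    "mode 12": (2, 4),
}

def coord_to_pixel(x, y, mode):
    """Convert resolution-independent screen coordinates to a pixel address."""
    sx, sy = STEPS[mode]
    return x // sx, y // sy

print(coord_to_pixel(100, 100, "1152x896"))  # -> (100, 100)
print(coord_to_pixel(100, 100, "mode 12"))   # -> (50, 25)
```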

You *could* do 640x512 in 256 colours (mode 21) with a MultiSync monitor, but before the ARM3 turned up with a cache, it didn't leave much bandwidth for the CPU to do anything. Later systems like the A5000 with faster RAM added mode 31 (800x600, 256 colours) etc. And the VIDC20 in the RiscPC added 16- and 32-bit modes and a more programmable video clock.

Andrew Garrard

Re: Some factual errors

Jason: Thank you for picking up most of my ranting! Although I'd point out that Virus was not *exactly* the same as Zarch (and not just because it ran slower). I've clocked Zarch (I still have the disk), but I struggled with Virus on an Amiga.

This has reminded me that my wife made me get rid of my Archimedes (A310, upgraded to 4MB) a couple of years ago. I cried, even though I still have my Spectrum and my RiscPC is still in the family. I may get one from eBay and hide it somewhere, although it obviously won't be the same.

Txt-speak is a sign of humanity 4 U

Andrew Garrard

Re: Also other pre-internet sources.

From the Jargon File:

"Hackers, as a rule, love wordplay and are very conscious and inventive in their use of language. These traits seem to be common in young children, but the conformity-enforcing machine we are pleased to call an educational system bludgeons them out of most of us before adolescence."

Hence the profusion of this kind of stuff in early usenet (it's not like "ROTFLMAAOBPO" is quicker than typing "ha ha") and the tendency for each generation, as with sex, to think they invented it.

I've occasionally been known to resort to it in order to make a couple of characters' difference between different numbers of text messages (especially when texting abroad). It might have been an option on the phones I had with qwerty keyboards, but since my first phone had T9 and the most recent ones use Swype, entering anything other than real words is always more of a hindrance than a help.

Panasonic touts monster 8k by 4k 'flickerless' plasma

Andrew Garrard

Re: Makes sense for that resolution

Canon have recently announced some stuff that can shoot at 4K (although not 8K). I don't believe they're shipping, but it's relatively consumer-spec stuff.

Seems a little unwieldy - the last screen of this resolution that I used was 44" diagonal (plus some bezels) - but I'll take the pixels where I can get them.

What does the resolution have to do with plasmas flickering? Pretty much every plasma screen I've ever seen flickers visibly to me (which is why I don't own one). Kudos to Panasonic if they've got the flicker fixed, though.

Basic instinct: how we used to code

Andrew Garrard

Ah, memories

I, too, still have all my INPUT magazines. Their series on 3D graphics was what started me on 3D; my initial graphics programming happened on the Spectrum (also yay to the orange manual, which is where I learnt the "x is a cross, so wise-up" mnemonic). I remember an adventure game in INPUT that used a partial predictive matcher for compression - pretty good for the time.

The higher level constructs in BBC BASIC (especially on later machines) were a pretty good stepping stone to more powerful languages. I still list BASIC on my CV (so I don't have the hacker's test point for denying that I know it). I never really picked up 6502/Z80 assembler (although I could probably work them out in retrospect now), but I learnt ARM assembly using the inline assembler in BBC BASIC on the Archimedes - a bit of a step up from my spectrum.

I still have my Speccy. My wife made me get rid of the Archimedes. I cried.

I don't know that I'd be where I am now if I'd been starting out with a 1990s PC instead of a 1980s micro that let you write simple animated graphics in an afternoon. I remember writing BASIC to draw a car. With racing stripes. And speed "woosh" lines. And the text redraw was probably faster than the virtual remote machine I'm having to use at the moment.

Thanks to all the pioneers. Sent from a Khronos conference in Dublin.

Toshiba to demo vid streaming without any work by the CPU

Andrew Garrard

Re: I think you mean...

I'd assumed it was a typo for 40Mbit/sec video streams, 40Mbit/sec being the upper limit for Blu-Ray content. But in retrospect, at least one other version of this report says "64,000 streams at 40Gbit/sec" meaning *total*. So I buy your version.

Game of Thrones Blu-ray disc set

Andrew Garrard

Oh dear, spoilers

I'm only half way through the second book, but I second the concerns that things seem to be slowing down a bit - sad if it gets worse in later books. One of the things that hit me about the first book was that quite a lot happens in it; I'm less surprised that the TV adaptation worked than that they managed to fit most of it in. On the other hand, at least some of the characters come across without the "comedy dwarf" tweaks that Jackson felt the need to make to LoTR.

The thing that really put me off the first series was that some of the early acting - even by relatively experienced actors - seemed incredibly stilted. Some of it was very good, but Littlefinger's need to orate everything really grated; even Lena Headey (who's been good in plenty of other things) didn't seem very comfortable. It's possible it got better as the series progressed, or maybe I just got used to it.

I'm also not all that impressed with the need to age the entire cast (because it stops some of them acting their age and the shock of what they go through is lost a bit), but I guess they'd have had problems broadcasting it if there was really the requisite amount of child nudity.

Here's hoping that the pay cheque persuades the author to find a way to finish the story.

Samsung Galaxy S WiFi 5.0 Android media player

Andrew Garrard

Optical track pad

"With the same rounded corners, chrome vanity band and optical track pad you would be hard pressed to tell the two apart."

What optical track pad?

Otherwise... meh, SD resolution, not for me - but that doesn't mean it doesn't have a market.

Kodak heading to Chapter 11

Andrew Garrard

Just for clarity...

...and to make sure I'm bidding against more people the next time a DCS-14n appears on ebay:

Kodak deserve a bit of credit for getting a DSLR to 14MP sometime before most of the competition, but there's no doubt that it doesn't handle especially well and it's no low-light camera. My interest in it is only as a back-up to my D700, and since the current back-up is an F5 it's not actually going to be worse. The alternative is, obviously, to go crop sensor (or be able to afford another D700 derivative) but that means carrying more lenses to make sure I cover the field of view range I've decided I want on any given shoot.

Kodak obviously suffered from basing their DSLR strategy on adapted film bodies from (mostly) Nikon - the moment Nikon brought out the D1, limited though it was, there was always going to be a conflict of interest.

Andrew Garrard

Actually, a little harsh

I'm grateful that Kodak kept making film so long; I've still got some in my fridge, although I admit that my film shooting these days is more usually with Fuji or Ilford products; I may regret their troubles more when I eventually get a 5x4.

There's no doubt that the film market shrank radically the moment cameras went digital; despite my first paragraph, I do most of my shooting with a DSLR, and that's not going to change (except, though probably not any time soon, if mirror boxes go the way of the dodo). I actually started with DSLRs, unless a Polaroid camera I had in primary school counts, and added film to my repertoire.

I can't imagine Kodak not seeing it coming, but the question is what they could do about it. According to dpreview, Kodak have made 144 digital camera models, some of which are still current. For a company which had more to do with chemicals than optics, electronics or consumer goods (at least in recent years), that's not a bad effort - but it's not surprising that Canon is the company most visible in the desirable compact and DSLR space and that Sony's electronics combined with Nikon and Pentax's camera design is taking much of the rest of the spotlight.

With the premium products made by big names, Kodak - whose cameras have never exactly been the M3 or F-series of their generation - could only really try to compete at the cheap end, and I suspect they didn't have the manufacturing capacity to create small plastic boxes as cheaply as the bigger companies in China. Even if they did, that market must be feeling the squeeze now everyone owns camera phones. Other than "something else", I don't know what they should have done.

Still, maybe I should get my hands on the DCS-14n that I want as a back-up to my D700 before the collectors start putting the prices through the roof.

Oh, and to add to the history lesson in digital photography, Bryce Bayer (of the pattern) worked at Kodak. I wonder which name will live on longer?

Laptop display pixel counts to quadruple in 2012

Andrew Garrard

Managing, yes. Enjoying, no.

One of the biggest negatives about the first LCD monitors was that their resolution was awful. An LCD running 1280x1024 looks sharper than a CRT running 1280x1024, but my 19" CRT is extremely sharp at 1600x1200 and pretty good at 2048x1536. With CRTs, the user had the opportunity to trade off text size and sharpness. That went away with LCDs - and the market position that should have been occupied by successors to the T210 and T221 got filled with LCDs that were the same resolution, but different physical sizes. Given the appearance of the 17" SXGA LCD, why the "upgrade path" was a 19" SXGA LCD is lost on me (I, like several colleagues, kept our 17" panels and rejected the 19" ones when given a 1920x1080 upgrade to our second monitors). I had quite a rant when it became clear that 1920x1200 needed at least a 24" screen, and that (prior to Apple) 27-28" screens were still only 1920x1200. Thanks, but I need that space for the other monitor that I have to use because you won't sell me one with more pixels...

If you've spent a life being able to have multiple documents on the screen at once, or see a reasonable overview of a document while still examining detail, any small screen is seriously constrictive. If you've been managing at 1280x1024, maybe you don't know what you're missing (but try running at 1024x768 for a while and maybe you will).

Andrew Garrard

All I can guarantee is that *I* find it useful

To be honest, I don't scale my desktop to 200% when I use my T221 - I prefer the extra real estate, and move closer when I want to - but I was answering a question about the icons being too small.

A lot of software these days will scale. Looking at a PDF (or doing DTP) on a T221 is visibly improved over a "conventional" monitor. Using Photoshop is much better in showing detail (although since the T221 is ancient, the colours aren't quite up to modern standards). I deliberately got a 960x540 phone in order to make the experience of reading PDFs and web pages without scrolling more comfortable. And, of course, you can fit more code on the screen. I've gone from a five-monitor set-up in my previous job to a 1920x1080 + 1280x1024 combination with Windows running in a 1440x1050 remote desktop in my new place; it's unbelievable how much more constrained I feel.

Most GPUs these days have resources to spare - except possibly in the mobile space - although I admit that composited desktops get hit hard when the resolution quadruples. I don't claim that everyone should care, and that everyone should buy a monitor with a higher resolution - but it's been a source of frustration for years that very few high resolution panels have been available. Thank goodness for Apple, first with the Cinema Display and then the Retina Display, which dragged other companies along. Given Apple's history of "100ppi is perfect", I wouldn't have expected it of them.

As for portable devices, where you're actually constrained in the physical size of the screen and it's easier to get close than with a desktop, the more pixels the better (although 2880x1800 is a bit of a weird choice compared with, say, 2560x1600 or 3072x1920 - at least it's not 16:9). To me, for a tablet or netbook that's a content viewing rather than creation device and spends a lot of time displaying relatively static information, I'm completely behind the idea of a premium offering giving you more pixels even if it means that games can't run as fast at the full resolution. That doesn't stop the cheap end of the market getting pixels the size of bricks, but it's genuinely a useful differentiator. Although I'd prefer not to have to buy Apple to do it (if a 960x640 Android phone had been available in the UK when the iPhone4 came out, I'd not have suffered another year of Windows Mobile).

Andrew Garrard

How closely can you focus?

I'm curious what monitor has pixels you can't see. If you can't see the pixels with your nose to the screen, I suggest that might be because you're too close to focus. My getting-on-for-forty-year-old eyes are a bit mangled, so admittedly close is better for me than for most, but I can see the pixels on an iPhone4 and the PenTile grid on a Galaxy Nexus, let alone pixels on my T221 (204ppi, 3840x2400), although they're still all clearly better than the norm. The pixel grid on the 1920x1080 24" panel I've got at work is clearly visible even from a normal working distance.

You pay for the image quality in GPU requirements (although, for a plain desktop, this wasn't an issue in 2004 when I got my T221, and probably isn't now except in the mobile space) and of course it makes the panel more expensive, but getting above WUXGA makes a big difference to a lot of workloads. I've regretted that it's been nearly impossible to get a 15" laptop with a decent screen for most of the last decade.

I claimed for a long time that the CEOs of monitor companies have failing eyesight, which is why the panels get bigger without resolution increases. Facetiousness aside, I'll be very happy to see the trend reversed - I'd actively have preferred a 22" 1920x1080 panel to the 24" one I got, and I'd certainly have preferred WUXGA or higher. Here's hoping they finally start selling them and the prices can come down.
