* Posts by Frumious Bandersnatch

2662 publicly visible posts • joined 8 Nov 2007

Ancient revellers came to party... and build Stonehenge - boffins

Frumious Bandersnatch

Re: Bring a megalith party

Fine.. I'll leave menhir then.

MakerBot demos 3D object scanner that fits on your desk

Frumious Bandersnatch

http://insight3d.sourceforge.net/

It has been around for a while and will generate 3D models from regular photographs. Obviously, laser scanning is going to be much better for precision work and for cutting down on the amount of post-processing (less noise and higher resolution), but I doubt that photos can be totally replaced (within reasonable cost limits) when it comes to surface "texture" mapping (by "texture" I mean a colour map rather than an actual physical texture).

While it's nice to see this new project, I think it's unnecessarily restrictive. Sure, there are plenty of applications where you just want to scan in a 3D object, so having a controlled shot (such as a fixed camera and turntable, possibly with a set background for calibration) makes sense there. In fact, these kinds of object scanners have been around for many years. But they can't handle lots of real-world scanning tasks that would also be nice, eg, scanning room interiors and larger objects, like furniture and vehicles, that can't physically fit in the control frame. Being able to track location as you enter an object's interior would also be pretty useful (think of the opening tracking shot in, IIRC, Vertigo--the one where the camera tracks through a sign and into a building).

I think that latter kind of scanning (of larger and enclosing objects) is much more interesting from the point of view of developing new virtual reality and augmented reality applications. It's akin to the shift from still photography to film, with the ability to move around in space and time. Think of robots that can locate obstacles (or goal objects) in a 3D space, terrain/object mapping based on aerial video recordings, inferring an object's motion relative to other scene elements, or even just a quick and easy way to knock up scenes for first-person shooters (eg, the Runtfest map for Quake 3) or third-person interactive puzzle games (modern versions of the old Monkey Island style of game). Digitising small objects is all well and good, but it's really more of a time saver than a game changer, IMO.

Twenty classic arcade games

Frumious Bandersnatch

It really depends on age

Loved playing ... Asteroids (and later Thrust on C64), Moon Patrol (2 player version), Double Dragon (backwards elbow strike FTW), Bubble Bobble (relaxing, but great powerups), Ghosts and Goblins (hard!), Rampage (smashing and eating) and Outrun (bike racing games were great too).

In another really fun game that I came across in later years (probably in Thailand, or somewhere in East Asia at any rate) you had to control a flying balloon by cycling and steering with a set of handlebars. No idea what it was called.

Ubuntu 13.04 beta touts search privacy - before it hooks in eBay, IMDb etc

Frumious Bandersnatch

Re: Of the things I'd like to see in the next...

You speak like one previously burned :)

Unfortunately, yes. On multiple occasions, I'm sad to say. You'd have thought I'd learn me lesson after the first time. Alas, no.

Frumious Bandersnatch
Linux

Re: Pardon?

As a result I can't get it into its native mode of 1280 x 1024

Sounds like it's probably a problem with the monitor's EDID information being screwed up, or the monitor reporting invalid data, although there's always a chance that some xorg update broke something. The latter problem is a lot less common these days, and actually I think the developers deserve a lot of respect for the work they've put into auto-detection of graphics cards and monitors. These days 99% (or a high percentage, anyway) of users won't need to edit (or even create) the xorg.conf file. It's quite an advance from the early days, when you basically had to put modelines in by hand on pretty much any system you were working on, probably followed by a session with xvidtune to deal with overscan and image centring ...

Anyway, for your problem, you might want to look at the xrandr command to see what X thinks the available modes are, bypassing any of the layers of gunk that Unity/Compiz puts on top of things. It might not be the solution, but you never know... it might help. There should also be a command to dump the monitor's EDID information, though I'm not sure whether it's packaged for Ubuntu or whether you'd have to compile it from source yourself.

Frumious Bandersnatch

Re: Of the things I'd like to see in the next...

Or a combination of the ps and kill commands, if you're a command

The pidof command is quite useful too, if you know what you want to kill (or just want to check whether it's running); internally it's the same binary as killall5. Linux's killall, meanwhile, unlike the Solaris version, gives you command help when run without any arguments instead of killing every single process that it can...

MapR smashes MinuteSort benchmark on Google Compute

Frumious Bandersnatch

Re: some wrong figure there?

Hmm... I don't think that coffee makes servers go any faster. Maybe some Wash and Go? (with the emphasis on the "Go" part)

Frumious Bandersnatch

some wrong figure there?

Microsoft could chew through 1,401GB of data in the [Daytona mode] test in 29 seconds and in Indy mode it could do 1,470GB in 59.4 seconds.

I take it there's a mistake here somewhere if after tuning the performance drops by ~50%?

Storage glitch sends Curiosity into safe mode

Frumious Bandersnatch

Re: Three digital watches...

But do any of them display the right time?

Even a stopped clock tells the right time twice a day...

Frumious Bandersnatch

only 2 computers?

Personally I always wear 3 watches for true fault detection and recovery. And that's just for a walk down to the local chemist. A trip to Mars would be mind-bogglingly big compared to that. I don't think even my 3 digital watches would be a good enough idea for that.

WHY would survey-slingers give YOU a free $1,500 Google Glass?

Frumious Bandersnatch

Re: bah

that £498 phone bill iv just paid was in vein then ??

Only if you injected it. With the "iv" you talked about, no doubt. I've heard of people going to desperate lengths to get high (Zappa smoking a high school diploma being one), but you've officially taken the biscuit.

ARM servers: From li'l Acorns big data center disruptions grow

Frumious Bandersnatch

Re: Missing the point, I think...

10^9 core MPP systems are now viable. At that scale we can drop the Turing Machine model,

Not really. We're still stuck with the Turing model in an abstract sense and the von Neumann model in more practical terms. We just have to adapt them to be more aware of multi-core and multi-processor systems, and in fact we pretty much did that years ago, without any great paradigm shift.

and progress to an Object Machine, where every element of data is an active processing element

It sounds like you're talking about agent-based programming. Again, it hasn't caught on, except in writing botnets and perhaps back-ends for massively-multiplayer online games.

generating random sequences of logic and see if they do anything useful

And just how do you decide what's "useful"? Or as Robert Pirsig put it in Zen and The Art of Motorcycle Maintenance, "And what is good, Phaedrus, And what is not good—Need we ask anyone to tell us these things?" You'd probably enjoy reading that since it's really about philosophy, not hard computer science.

because all software will be written by software(*)

Of course. And the Singularity will arrive and bathing in Unicorn Milk will keep us young forever.

do not just binary digital processing but higher-base hardware processing ... feed a base-3 digital processor a compatible pair of base-2 & base-3 instructions

Hmm... Are you really amanfrommars in disguise? If so I claim my £5.

But seriously, do you even know what Turing-complete means? In particular, a Turing machine can be re-expressed in terms of Gödel numbers, which in turn can be mapped onto the set of natural numbers. Crucially, all practical number bases are isomorphic to each other, so binary, ternary or base 10 (or balanced ternary or whatever) all have the same expressive power, and there's no theoretical reason to favour one over the other. It only comes down to issues of practicality. For most purposes binary is good enough, and it's only if you want to represent certain numbers with a finite number of digits that you might want to consider other bases (the string representing 1/10 is infinitely long in binary, for example, while it's just "0.1" in decimal or binary coded decimal). And in case you're wondering, going from the natural numbers to the reals doesn't magically grant your computer new powers either: the naturals are perfectly sufficient for "universal" computation, so, eg, a phinary-based computer can't do anything more than a binary one can, except be a pain to build and program. Another book recommendation for you: you might like Gödel, Escher, Bach: An Eternal Golden Braid...
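
Just to put a number on that 1/10 example (a throwaway illustration rather than anything from the article), a few lines of Python show the repeating binary expansion and why the nearest binary float to 0.1 isn't exactly 0.1:

    from fractions import Fraction

    def binary_digits(frac, n=20):
        # Return the first n binary digits of a fraction in (0, 1).
        digits = []
        for _ in range(n):
            frac *= 2
            bit = int(frac)        # 1 if doubling pushed us past 1, else 0
            digits.append(str(bit))
            frac -= bit
        return "0." + "".join(digits)

    print(binary_digits(Fraction(1, 10)))  # 0.00011001100110011001 -- the 0011 block repeats forever
    print("%.20f" % 0.1)                   # 0.10000000000000000555... -- the closest binary float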

(*) Actually, there is one kind of "program that writes programs" that can benefit from having a massive number of cores to work with, though I mean "program" in the mathematical sense that Turing did, rather than the way you're thinking of it (eg, a word-processing package). I'm thinking of something like Turbo Codes, which are effectively bit-level programs that tell a receiving computer how to reconstruct some embedded data even if some of the bits are dropped or corrupted in transit.

Another, similar type of application is data compression, since you can treat the compressed data as a "program" that tells the decoder how to unpack the message. I think that that's the most interesting possible application in this realm: given enough computing power, we should be able to try out many different ways of compressing some given data and output a compressed string and a decompressor. Obviously, this still isn't going to be able to magically compress incompressible data and it's quite impractical as a replacement for general-purpose compression schemes like gzip, bzip and so on (since there is an infinite--or worse, transfinite--number of "languages" to consider, and the best compression ratio possible is sensitive to the choice of language) but it still could be quite useful for discovering good compression schemes for certain types of data. See Kolmogorov Complexity for background details.
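
For what it's worth, here's a purely hypothetical toy version of that "output a compressed string and a decompressor" idea in Python. It just tries a handful of stock codecs and keeps whichever "program" (compressed bytes plus the name of its decoder) comes out shortest; nothing like a real Kolmogorov search, which isn't computable anyway, but it shows the shape of the thing:

    import bz2, lzma, zlib

    # Each candidate "language" is just an off-the-shelf codec:
    # a (compress, decompress) pair keyed by name.
    CODECS = {
        "zlib": (zlib.compress, zlib.decompress),
        "bz2":  (bz2.compress,  bz2.decompress),
        "lzma": (lzma.compress, lzma.decompress),
    }

    def shortest_program(data):
        # Return (codec_name, compressed) for the smallest faithful encoding found.
        best = None
        for name, (compress, decompress) in CODECS.items():
            encoded = compress(data)
            assert decompress(encoded) == data   # the "program" must reproduce the data exactly
            if best is None or len(encoded) < len(best[1]):
                best = (name, encoded)
        return best

    name, encoded = shortest_program(b"abc" * 10000)
    print(name, len(encoded), "bytes to describe 30000 bytes of input")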

Frumious Bandersnatch

Re: Hmm...

I just don't get where you get the idea that you can't back up ARM servers, or even why it would be a stumbling block to deployment. If you want them, there are plenty of backup solutions you can compile from source, or you can use the venerable rsync if you don't have any special requirements like snapshotting a filesystem so that it's in a consistent state during the backup (though I understand that LVM can do this).

The second point is to consider whether you really need backups in the first place. I think you may be misunderstanding the use case of (most?) ARM server deployments. You're probably more used to thinking of having a variety of servers each doing different things, or running a number of VMs, perhaps? I see the use case of ARM servers more in terms of grid or cluster computing. Looked at in that way, there's probably nothing on any of the nodes that you'll actually want to back up explicitly. The system image (or a large chunk of it, anyway) will probably reside on an NFS server and will be shared among several nodes. If you're using them for "OLTP" type applications, then your database is definitely going to be distributed, with replication of data across several nodes. The upshot of both of these points is that if something goes wrong with one of the nodes, it's not important: you just replace it or reimage it. If your database is already distributed and replicated across nodes, it can survive some number of failures like this, so again, there should be no need to back up individual nodes. You will want to make sure that you've got some way of backing up your entire database, but that's a whole different kettle of fish, and nothing to do with what you say is the problem here.

Vint Cerf: 'The internet of things needs to be locked down'

Frumious Bandersnatch

shut off internet access for my toaster?

But then how am I supposed to read my morning emails?

Super-fast super-massive black hole spins at nearly light-speed

Frumious Bandersnatch

Re: That Ergosphere is gonna whip the wheels off Hawking's chair!

what exactly is "rotating at speed X" here?

The accretion disk. The article says that "the outer edges of the NGC 1365 black hole are spinning at 84 percent of light-speed or more."

Infalling matter follows the rules of relativity: in a relatively flat spacetime it can't travel at c or faster, by definition, while in a strongly curved spacetime (like the region around a black hole) it's red-shifted to such a degree that it will disappear from our relative view before it even appears to approach or exceed c (even before it hits the event horizon). It's just cosmic censorship in action.

Colombian boffins reconstruct flight path of Russian meteor

Frumious Bandersnatch

Re: Bah!

RE: Somehow I can easily imagine someone believes that, in spite of the fact that he missed and none of them died.

Yes, more of a "Small Meteor Hits Russia: Not Many^H^H Nobody Dead".

Frumious Bandersnatch

Re: A modest proposal...

re: I thought that we had a system for tracking and watching dangerous objects in space. Why didn't that system warn us?

Why not? Probably because their budget isn't big enough. I thought I read that it was $5.4 million, but I'm not 100% sure of that figure. It's certainly only a few million (3--6) spent on the problem. Less than what a typical Hollywood disaster movie costs, anyway.

Hey, software snobs: Hardware love can set your code free

Frumious Bandersnatch

Re: There is a reason for Software Smugness

RE: Remember your assertion: "Thinking about better algorithms is never a bad idea."

While you should probably "never say never", I'm siding more with the original poster. There's a balancing act in how much time you can spend finding a better solution, and Knuth's "premature optimisation is the root of all evil" quotation is worth bearing in mind, but there's still often a good case for looking for a fairly efficient algorithm right from the start.

I don't think that anyone is saying that we should go to excess in looking for the best solution, but for what we're talking about here (processing big data sets), we should really be aware of how expensive and time consuming each possible solution might be. It's the mindset that's important: do you just write the most basic SQL query, or do you take care to minimise expensive join operations or defer them to be operated over a reduced data set, for example? Also, experienced coders will of course realise that there's no point in blindly trying to optimise every single aspect of the code. They'll use a profiler (or equivalent) to identify where their efforts stand to reap the most benefit.
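
To make that concrete, here's a hypothetical little Python sketch of the workflow I mean: profile the naive version first, let the numbers point at the hot spot, then swap in the better algorithm (function names and data are made up purely for illustration):

    import cProfile
    import pstats

    def slow_lookup(items, keys):
        # O(n*m): rescans the whole list for every key
        return [x for k in keys for x in items if x == k]

    def fast_lookup(items, keys):
        # O(n+m): build a set once, then test membership
        index = set(items)
        return [k for k in keys if k in index]

    items = list(range(5000))
    keys = list(range(0, 5000, 7))

    profiler = cProfile.Profile()
    profiler.runcall(slow_lookup, items, keys)
    profiler.runcall(fast_lookup, items, keys)
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)  # slow_lookup dominates the report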

Speaking of programmer effort, I think that in many cases it can be a false economy to use inefficient algorithms. If your algorithm is bad enough, you can end up spending more time waiting for results when you're coding and testing the thing (on real data, as opposed to just a small amount of test data) than you would if you'd thought about the problem a bit more from the outset. Granted, you can multitask and do other stuff while you're waiting, but it's not ideal to have too many context switches or your productivity will suffer. Plus, what happens when you finally realise (or have to be told) that the solution isn't good enough? Most often, you have to go back to the drawing board and do what you should have done in the first place: implement a half-way decent algorithm.

New social network is for DEAD PEOPLE

Frumious Bandersnatch

Re: Oh oh!

Nah, more like "Things to do in Denver when you're dead". The film, I mean, not the Warren Zevon song.

Apple patents situational awareness for oblivious fanbois

Frumious Bandersnatch

situational awareness?

Hello? I'M ON THE TRAIN!

Microsoft brings Azure back online

Frumious Bandersnatch

Re: Cron job needed and, er, why didn't they renew the cert for longer?

re: Suppose the real reason is something else but the cert.

I suppose they should probably stop calling them "certs" then, if they aren't.

Bees use 'electrical sixth sense' to nail nectar-stuffed flowers

Frumious Bandersnatch

Re: @Gareth

re: ground...

Be that as it may, it doesn't stop us from building up a fair charge on carpets and the like. I'm guessing that the plant itself is acting as an insulator and that the bees rubbing against the pollen/stamens is what sets up the differential. Kind of like a mini Van de Graaff generator.

Frumious Bandersnatch

wireless signals and electrical pollution

Sigh. Static electrical charge is not the same thing as wireless (radio) propagation. The article is about the bees' use of the former.

Happy birthday, LP: Can you believe it's only 65?

Frumious Bandersnatch

Diamond Rio PMP300

First commercially successful flash-based MP3 player. Pre-dated the iPod by about 3 years and was widely regarded as the inspiration for the iPod in the first place. Just saying...

Survey: Bosses are DESPERATE and GAGGING for Linux skills

Frumious Bandersnatch

Re: Also: MS Office For Linux (kernel)

If they don't bring Office to Android, somebody else is going to eat their lunch on the tablets

It still doesn't seem likely. They may have got some of the way there with the cut-down version of Office for Surface RT, but it is exactly that: cut-down. I've read lots of unsubstantiated comments here that the Office code base is such a mess of x86 assembly (for things like macro support) that it's unlikely to be ported any time soon. Also, I read here on the Register that there seem to be political problems within Microsoft over whether they should even develop and release an ARM version (or a Surface RT version, to be precise) of Outlook. If that's to be believed, then there's probably a considerable faction within MS that would never accede to releasing a Linux (or Android) version of any of their desktop tools. It would completely go against the whole philosophy of maintaining customers by locking them into the Windows ecosystem. And even if they do go down that route, it may, as you say, already be too late.

On a slightly unrelated note, I think that there is definitely a niche there for a "good enough" (which incidentally is a phrase you used to hear at MS to describe their development/release philosophy) office suite. I've been thinking for quite a while now that a suite that had the 80% of features that most people actually need and use could easily capture a significant chunk of the market for "office"/productivity software. People are fed up with massive, bloated systems with tons of arcane features that they'll never use. By paring it down and providing good interoperability between components and across platforms, it should be "good enough" to satisfy all but the most hardcore/insane of users. In keeping with the 80% of functionality idea, I'd suggest calling it "Pareto" (if such a thing doesn't already exist). So long as developers were ruthless about not implementing features just for the sake of it, I think it could go a long way.

This is just my opinion, though. Personally, I've not used Word in many years and I have no need for it unless someone demands a document in that format. If I need something professional looking, I'll plump for LaTeX every time (edited in emacs, naturally :), or just use XML and CSS if I want to mess with layouts and fancy stuff. In either case, I prefer to concentrate on the content rather than formatting (which gets done at the end and is abstracted away from the actual content). This seems to be the opposite of the way that most Word users (and developers) work--style over substance, you might say.

Higgs data shows alternate reality will SWALLOW UNIVERSE

Frumious Bandersnatch

Re: The laws of physics will be different in the encroaching bubble.

Don't get me wrong... it was a fine attempt at making a joke, and I'm all for that, but the part of me that holds maths in such amazement(*) just flat out refuses to even consider Pi being some other value, even in an alternative universe. It literally just doesn't compute. A universe where e**(pi*i) isn't -1 (there's a quick numerical sanity check below) is as unimaginable as one in which effects precede their causes or don't have causes at all, or where entropy doesn't grind everything down. Besides, even "non-Euclidean" geometries (eg, geometries without the parallel postulate) still use and need Pi. If you take a plane journey through three points on the globe, the triangle you trace out has angles summing to more than 180 degrees, so it's non-Euclidean, but once you go up a dimension from the 2-D Cartesian representation to the Earth as a 3D sphere, everything still works and still revolves around Pi ...

(*) http://en.wikipedia.org/wiki/The_Unreasonable_Effectiveness_of_Mathematics_in_the_Natural_Sciences
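
The quick numerical sanity check mentioned above, for anyone who wants to see the machine agree with the maths (a one-liner in Python, nothing more):

    import cmath

    # Euler's identity, numerically: exp(i*pi) lands on -1, give or take
    # a stray 1e-16 in the imaginary part from float rounding.
    print(cmath.exp(1j * cmath.pi))   # (-1+1.2246467991473532e-16j)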

Frumious Bandersnatch

Re: Oh, well...

re: it could be that this is the adequate explaination for the thing with the moon

Been reading 1Q84 then?

Frumious Bandersnatch
Black Helicopters

Re: The laws of physics will be different in the encroaching bubble.

re: How quaintly Euclidean...

I see what you're trying to do, but all of these "alternative universe" theories are rooted in maths. Even if they exist, there will be no universe where 2+2 = 4.1, or where Pi isn't both a constant and an irrational number. The fundamental rules describing the geometry of alternative universes have to be the same as ours according to all the theories. The most likely scenario is that physical constants, like the ratios of the fundamental forces, the binding energies needed for chemical bonds, or decay rates, could be subtly different, though it's vaguely possible (in a mathematical sense) that if a particular string theory happens to describe the Whole Sort of General Mishmash that is the multiverse, and the alternative universe has slightly different parameters, then we might actually be able to see extra dimensions there on a macroscopic scale. That would probably be the weirdest possibility. Even so, the metric spaces of our universe would also apply there, so Euclidean distance would still apply on some scales while a Minkowski space metric (which still requires Pi!) would be more natural on others.

I see that a previous poster got a downvote for suggesting alternative lead-based lifeforms. You'd have to tweak the fundamental physical constants by a massive amount before that would even be a remote possibility. Before you'd even managed to get there, you'd find that the stars had gone out due to not being able to self-sustain their fusion reaction. Then we'd have a lot more to worry about than alien invaders. Something like Ice-9 would be a lot more plausible than Pb-based life.

Review: The ultimate Chromebook challenge

Frumious Bandersnatch
Thumb Up

Re: "but the Acer has Caps Lock"

As I understand it, you can change the keymap behaviour so that it acts like a caps-lock key. But yes, I SEE WHAT YOU DID THERE. Have an upvote.

Firm moves to trademark 'Python' name out from under the language

Frumious Bandersnatch

Kind of reminds me of ...

the previous Python-related copyright snafu involving Twisted and (val)grind. Guido and friends seem to be on similarly strong grounds here.

Reg readers scuffle over the ultimate cuppa

Frumious Bandersnatch
Boffin

re: It's not physics in this case - it's chemistry

I found a link to that effect. Apparently all the national papers ran with a similar story around the same time.

Frumious Bandersnatch

Hmm

Not sure what the outcome of this poll will be. Will it be some sort of frankentea, one bearing the hallmarks of being designed by a committea (sorry) or perhaps it'll just reflect tea à la mode (or is it "à la mean"? I can never remember which is which).

Kiwi cops to buy 6,500 iPhones, 3,900 iPads

Frumious Bandersnatch

based on the example

They're really touting the benefits of a national ID database rather than the conduit. We should totally get ourselves one of those.

/sarcasm

Tesla's Elon Musk v The New York Times, Round 2

Frumious Bandersnatch

salacious?

I think he's over-egging things for sure. I mean, there's no mention of shifting [gears], hugging curves, burning rubber, [lug] nuts popping, [cam] shaft action, driving stick, sucking [diesel], the point of no return, or even throbbing pistons. I'm sure that if "salacious" was the goal, he could have done a much better job.

Traceroute reveals Star Wars Episode IV 'crawl' text

Frumious Bandersnatch

Re: Decwars!

Was never much of a fan of [Dec] Wars. [Vax] Trek was more my thing.

Doped nanotubes boost lithium battery power three-fold

Frumious Bandersnatch

Mmmmm.

Donuts Carbon Nanotubes ... Is there anything they can't do?

Inside Microsoft's Surface Pro: A fiendishly difficult journey

Frumious Bandersnatch

Re: Who can fix the surface pro? No one, it SUCKS FROM TOP TO BOTTOM

Eadon has spoken the truth. You may downvote me now.

Anyone who speaks of himself in the third person deserves eternities of karmic hell (or at least lots of downvotes).

(ooh... see how I cleverly avoided that trap myself ^_^)

Micron glues DDR4 RAM to flash, animates the 256GB franken-DIMM

Frumious Bandersnatch

Re: Hmmm.

I assume its fast enough to write data to the non-volatile part before the power dies away completely.

That's not a good assumption. A power failure while writing to an SSD can trash even data that wasn't being written at the time, because wear-levelling algorithms can effectively move unrelated blocks around whenever you make a write. See "write amplification" on Wikipedia for a pretty good description.

Frumious Bandersnatch

Re: >Holding big databases in memory

You'd be caching the most-used data,

Alternatively/additionally, you'd probably find it useful to hold indexes in RAM, and implement some sort of ageing/caching algorithm that keeps new and frequently-used data in flash and the rest out on spinning disks. If you use a log-based structure for the flash storage and periodically rewrite out to disk (perhaps redundantly, depending on whether new indexing constraints are required) then you can optimise both reads and writes across all storage layers. Something like SILT or log-structured merge trees, but with spinning disks as the final storage layer, optimised to reduce fragmentation and extra seeks.
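
A very rough sketch of that ageing/promotion idea in Python (names and tiers entirely made up, with RAM/flash/disk reduced to plain in-memory dicts), just to show the flow I have in mind rather than anything resembling SILT or a real LSM tree:

    from collections import OrderedDict

    class TieredStore:
        # Hot values live in a small "flash" tier kept in most-recently-used order;
        # anything that falls off the end gets demoted to the "disk" tier.
        def __init__(self, flash_capacity=4):
            self.flash = OrderedDict()
            self.disk = {}
            self.flash_capacity = flash_capacity

        def put(self, key, value):
            self.flash[key] = value
            self.flash.move_to_end(key)
            self._demote()

        def get(self, key):
            if key in self.flash:
                self.flash.move_to_end(key)        # keep hot data hot
                return self.flash[key]
            value = self.disk.pop(key)             # cold hit: promote back into flash
            self.put(key, value)
            return value

        def _demote(self):
            while len(self.flash) > self.flash_capacity:
                old_key, old_value = self.flash.popitem(last=False)
                self.disk[old_key] = old_value     # the periodic "rewrite out to disk"

    store = TieredStore()
    for i in range(10):
        store.put(i, "value-%d" % i)
    print(sorted(store.flash), sorted(store.disk))  # newest keys stay hot, the rest get demoted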

Frumious Bandersnatch

Re: Thanks for the memories

New chip design would be needed anyway

I see lots of interesting comments here, your own being particularly interesting. So anyway, this is a response to quite a few of those posts...

I think that if we're going to see more of this sort of thing (storage that blurs the boundaries between RAM, flash and disk, as well as the ability to completely power off components when not in use) then we're going to need a fundamentally different architecture to take advantage of it. This goes beyond just new chip design (where even today cores can be started up and shut down at a whim) and into having some sort of "power arbitration" bus, with the entire system backed up by a small, finite battery. For the instant-on/instant-off scenarios using flash as hibernate/sleep storage, you need to be able to guarantee that the system can finish writing out the OS state in case of loss of mains power. For the scenario of, eg, keeping power routed to the GPU while it's doing some computation task but shutting down other non-essential stuff (while probably keeping, say, Ethernet alive to enable a kind of wake-on-LAN feature), you probably want to be able to budget how much you can do on internal battery power and also be able to suspend gracefully when you're approaching its limit. Not trivial stuff at all.

Of course, it's very unusual these days for us to have battery power built onto the motherboard (as opposed to an external UPS). If these devices/ideas become commonplace, though, we're sure to see many innovations in power management overall. I shudder to think of all the new failure cases when we stick a new device (be it faulty or malicious) into the machines of the future...

Tesla vs Media again as Model S craps out on journo - on the highway

Frumious Bandersnatch

Re: The difference...

Tesla - you do not win at PR by starting an argument with the media.

Or to paraphrase: "Never argue with someone who buys ink by the barrel". It's called Greener's Law, apparently, though I'd always thought it was a Mark Twain coinage...

NASA melts mysterious 'metal Martian flower' myth

Frumious Bandersnatch

Re: The chances of anything coming from mars.....

Yeah, but the chance of winning the lottery is significantly worse than 1 million to one (and still they played...), and yet every other week you hear about someone winning it! Time to panic!!!

New cunning linguist computer has got ancient tongues licked

Frumious Bandersnatch
Coffee/keyboard

Re: I just like to offer....

Damn! My pandigestory interlude just evacuated my nose. You owe me a new keyboard, sir!

Frumious Bandersnatch

Re: Cunning Linguist

The old ones are the good ones!

I'm so glad that the article wasn't about a really clever bunch of pygmies. Thank Heaven for small mercies, I say.

Public told to go to hell, name Pluto's two new moons

Frumious Bandersnatch

offenbach

The pair of them. Cos maybe now they'll reconfirm Pluto's off-again, on-again status on the list of planets. Well OK, Orpheus and Hades it is then...

Curiosity raises mighty robotic fist, punches hole in Mars

Frumious Bandersnatch

re: swarms of microbots

Interestingly, I read an article a while back about the US military working on building microbots that could be scattered over a battlefield to be used for gathering images and sussing the lay of the land. The software and radios that they had were capable of self-configuring into an ad-hoc mesh network, so that part of it should be easy to sort out, even if a significant fraction of the machines don't survive the landing or fail in some other way.

As Helena points out, though, these things aren't really of any use as roving devices. There's a limit to how small you can make remotely-controlled bots while still giving them useful locomotion, practical sensors and actuators.

Still, I think the microbot idea could still be pretty useful for future missions as a means of getting an initial idea of local terrain and even provide telemetry data for later, more fully-featured rover landings. The thought of sending an Internet to Mars is pretty cool too, especially if it can self-organise and do a kind of terrain "interferometry" (a fancy word for building a map from multiple viewpoints) locally instead of having to pipe everything back to Earth first. Think about it... Martian Internet! What's not to like about that?

Frumious Bandersnatch

"the unpredictable rocks on Mars"

I was a bit confused by this at first until I realised "unpredictable" was used in the sense of "No one could have predicted, in the first years of the twenty-first century, ..." Hooray for word-sense disambiguation!

Dead Steve Jobs 'made Tim Cook sue Samsung' from beyond the grave

Frumious Bandersnatch
Coat

Re: All empires eventuall fail...

Apple are on the down-slope. Samsung are on the up-slope

So you mean that it's plain sailing for Apple and that it's going to be tough going for Samsung? I'd have thought the opposite....