Re: iSuppli estimates
The same is true in any hardware company. The details of parts supplier deals are always deep secrets, because both the competitors and competing suppliers could take advantage of them.
Except the technology to get there exists now, so it isn't going to provide the boost it did in the past because there are no problems that need solving.
Technology for a very expensive moon picnic exists (or used to exist), but not for an extended stay, which would have its own set of unsolved problems: radiation shielding, dealing with moondust, recycling air and water for long periods, surviving the cold lunar night, and so on.
Plenty of challenges remain.
>This bug is the sort of mistake beginners make
And experienced people as well, occasionally! I have seen bugs of a similar level of stupidity made by long-timers (me included), sometimes in code that has been in use for years.
There is no room for any holier-than-thou attitudes in programming. Anyone can goof up, therefore processes must be in place to catch mistakes and limit the damage.
What I want to emphasize is that this starts with making sane specs that avoid unnecessary complexity (like the redundant length field).
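To illustrate why a redundant length field is a spec-level hazard, here is a minimal Python sketch of a hypothetical message layout (2-byte declared payload length followed by the payload; the format and the function name are my own invention, not from the article):

```python
import struct

def parse_message(buf: bytes) -> bytes:
    """Parse a hypothetical message: 2-byte big-endian declared length + payload.

    The declared length is redundant (the buffer already knows its own size),
    so a safe parser must cross-check it instead of trusting it.
    """
    if len(buf) < 2:
        raise ValueError("truncated header")
    (declared,) = struct.unpack(">H", buf[:2])
    payload = buf[2:]
    if declared != len(payload):  # the cross-check a careless parser omits
        raise ValueError(f"length field says {declared}, actual payload is {len(payload)}")
    return payload

# A parser that trusts `declared` and blindly reads that many bytes would
# happily run past the real payload whenever the field lies -- the classic bug.
```

If the spec had no redundant field, this whole class of mistake could not exist.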
I wonder how often remote management is on by default in these devices? The ADSL+WLAN router I bought several years ago had it disabled, and after some thought I left it that way, seeing no good reason to enable it, just lots of risks. But I can imagine some manufacturers having a different policy, in which case those devices are probably pwned by now.
>Second, doesn't UDF have a big memory overhead?
I haven't heard of it having any worse memory requirements than other non-ancient file systems. Certainly larger than FAT's, but that is also true of exFAT (which, apart from the name, does not have much to do with the old FAT file system). UDF comes in several variants to handle non-overwritable media, but flash cards and USB drives would only need one, the simplest read/write variant.
>There's nothing stopping anyone doing a decent ext4 file system driver for Windows,
It probably is doable (although not so simple: there is quite a bit of impedance mismatch between how Linux and Windows handle files, such as how protections are handled, and the case-sensitivity issue).
>and it could become something that everyone just knows they have to install.
This is where it falls down. Unless you can somehow automate it totally (like getting the driver quickly installed the first time an ext4-formatted SD card is plugged in), most users will not bother with cards requiring such tricks.
A little-known fact is that the UDF file system could in principle be used as a read-write file system on media other than DVDs, and many operating systems already support it, Windows included. So trying to get flash card vendors to use that would be a better plan.
You neatly described how I also see the Lego evolution. My son has around 1000 times the number of Lego parts I had at his age of eight (yes, we Finns have got richer...), but it is harder to use them for designs of your own, since most come from various disintegrated Star Wars, Space Police, Kingdoms etc. sets with too many special types of parts. Even if you find more than two of the same shape, they are different colours... The old limited palette reduced this problem.
I still have my box of assorted old-school Legos, intentionally left at my parents' house. It is delightful to see what happens when modern kids (my son and his cousins) get their hands on these mostly generic parts that carry no message about what you should build from them. The only specialty is the old Lego railway system with its discrete rail and tie parts, but even there the isolated rails can serve in other designs as rods etc. It is really sad that it has been discontinued.
Unfortunately, given how patent offices work (not just in the U.S.), the chances of this going through are good. For a software guy, if you know how to program X once, then making X execute in a loop, making X a parametrised subroutine, or making X work on a different hardware platform are all obvious. Yet patent offices have issued bad software patents for exactly these variations of a known operation.
>Basically, you wanna ban trade secrets.
Not at all. Just keep them on the other side of the programming interface. If the trade secret can be seen from the firmware code, it is no trade secret at all, because those with serious financial interest in getting the secrets can get the binary blob disassembled and analyzed. The only people hampered by the binary blob are those who honestly want to program the device.
>Oh really? And abandon, say, QA and other trivial things?
Of course not. This could work analogously to Red Hat's RHEL, with the companies using the firmware doing QA on their hardware.
Besides, the existing state of firmware QA isn't stellar either. E.g., I have a big-brand DVD recorder / DVB receiver combination that almost always locks up when it sees a disc it does not like (wrong format, or too scratched). The great corporate QA department obviously tested it only with perfect discs.
No, the hardware would just have to be documented, no hidden bits needing non-disclosure agreements to see. Of course getting rid of nonsensical variation would be useful, and might be a side-effect of more openness and the requirement to properly and publicly document things (gratuitous changes become more expensive to do).
>Firmware typically needs to be incredibly responsive and run on very cheap hardware.
That's precisely the code I would like to see written by enthusiasts, not corporate drones. You know the demo scene? One sub-genre is making really limited hardware like the C64 do things you would never have thought possible with it...
One of the NTs hung around Helsinki Malmi airport one summer around 10 years ago. It arrived from Germany, and the idea was to fly it to Japan, but Russia did not grant the required permits. So after a month or two it flew back, and was later ignominiously shipped to its buyer by sea.
I live near the small Malmi airport, and saw it arrive, and then sit there sadly grounded. Funny how after a while this exotic ship started looking like part of the landscape...
> hell it's not even Huffman! It's plain arithmetic encoding
Actually the JPEG standard allows both Huffman and arithmetic coding. Most JPEGs use Huffman, because of concerns about arithmetic-coding patents (another case of software patents hindering progress!), and also because arithmetic coding does not improve the compression very much and is slower.
>It's plain arithmetic encoding, and anyone who knows their compression codecs knows plain old' arithmetic encoding SUCKS.
I suspect you misremember the relative quality of Huffman and arithmetic coding. Arithmetic coding is considered superior to Huffman coding. See the Wikipedia article on JPEG, which says files using arithmetic coding are about 5–7% smaller.
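A rough way to see where that few-percent advantage comes from: Huffman must assign each symbol a whole number of bits, while arithmetic coding can approach the entropy of the source. A sketch with a made-up skewed distribution (the probabilities are illustrative, not from any real JPEG):

```python
import heapq
import math
from itertools import count

def huffman_lengths(freqs):
    """Return {symbol: code length in bits} for a Huffman code over freqs."""
    tiebreak = count()  # keeps heap tuples comparable when frequencies tie
    heap = [(f, next(tiebreak), {s: 0}) for s, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        fa, _, a = heapq.heappop(heap)
        fb, _, b = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**a, **b}.items()}  # one level deeper
        heapq.heappush(heap, (fa + fb, next(tiebreak), merged))
    return heap[0][2]

# Made-up skewed symbol distribution (probabilities sum to 1)
probs = {"a": 0.6, "b": 0.2, "c": 0.1, "d": 0.1}
lengths = huffman_lengths(probs)
huffman_bits = sum(probs[s] * lengths[s] for s in probs)   # expected bits/symbol
entropy = -sum(p * math.log2(p) for p in probs.values())   # arithmetic coding's limit
# Huffman spends 1.6 bits/symbol here; the entropy is about 1.571,
# so an ideal arithmetic coder saves a couple of percent -- same ballpark as JPEG.
```

The gap is small for distributions like this, which matches why arithmetic-coded JPEGs are only modestly smaller.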
Software patents are probably the biggest problem nevertheless. JPEG2000 (first introduced in 2000) might come into widespread use in 10 years or so, when it becomes obviously unencumbered, even by submarine patents, and in all jurisdictions. And that is only the basic barebones version of the system; later enhanced versions of the standard are encumbered by later patents.
See how successfully patents promote the progress of the useful arts!
> So in other words, a complete and utter non event :)
This pretty common attitude pisses me off.
Y2K was a non-event precisely because serious fixing was done. Sure, there was a lot of scare-mongering, but it helped in getting the resources to make it a non-event. No, civilization as we know it would not have ended without Y2K fixing, but there would have been a lot more inconvenience and confusion, probably also loss of life.
Very frustrating for the programmers and managers involved: had there been problems, they would have been blamed, and now that there were no problems, they are ridiculed.
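A lot of that fixing, where date fields could not be widened, used date windowing: interpreting a two-digit year relative to a pivot. A minimal sketch (the pivot year 50 is my own illustrative choice, not from any particular system):

```python
def expand_year(yy: int, pivot: int = 50) -> int:
    """Expand a two-digit year using a pivot window.

    Years below the pivot are taken as 20xx, the rest as 19xx.
    The pivot here (50) is illustrative; real systems chose their own.
    """
    if not 0 <= yy <= 99:
        raise ValueError("expected a two-digit year")
    return 2000 + yy if yy < pivot else 1900 + yy
```

Cheap and effective, but of course it only postpones the problem to the far edge of the window, which is part of why the work was genuinely nontrivial.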
Carbon-intensive? Are you serious? Hydrogen is as cheap as water compared to helium, which really is a scarce resource and can only be obtained by cryogenically distilling large amounts of natural gas from certain wells, a process that is at least as carbon-intensive.
Hydrogen is widely used in industry, so ways to store and manage it are well known.
The Germans ran completely hydrogen-filled airships for decades before any serious problems, using materials and technology far inferior to what is available now.
Wouldn't say creaking... The feature set is limited, but to my utter astonishment, WP 7.8 on my Lumia 710 has been remarkably stable. After the last major update settled down, it basically works reliably indefinitely without needing a reboot, unlike the Symbian I previously used, or every other Windows version I have seen. I guess it has reached the plateau of productivity. ("If it works, it is obsolete", as one character in "2010" grumbled about progress in computer technology.)
Special effects, he promised, “will be done on PC's, not super-computers”
Not a new concept... The sci-fi spoof "Star Wreck: In the Pirkinning" was rendered this way back in 2005, and looked pretty good despite the shoestring budget. See http://www.imdb.com/title/tt0472566/ (where the "trivia" section says the rendering took 5 years overall).
I wasn't cocky, just stating a fact about this particular IT segment. In some others (like desktops and laptops) Linux is almost nonexistent. Sure, Linux, Windows, Android, iOS etc. will get replaced eventually. So it goes. Personally I always try to write my code to be portable, and it has already paid off during my career.
An alternate way of saying that is that Microsoft put it there just to have ANY CHANCE AT ALL in the hypervisor market! All competitors supported Linux, and Linux is more or less what most cloud services run on, so only the 100% Windows data centres would have been interested in a Hyper-V that did not work well with Linux.
>I remember my auntie had a Nokia TV.
I still use one. Bought in 1994, works fine. Of course it now has to get its signal from a digi-TV box or DVD, since Finland no longer has any analogue PAL transmissions. Now and then I look at flat-screens in shops, but conclude that standard-definition TV looks worse on them than on the Nokia tube (the deinterlacing and other digital processing in flat-screen TVs makes everything look cartoonish), and there is not yet enough on-air HDTV material here.
Ah, Google apparently wants to be Windows Phone compatible. The mobile IE browser has always behaved like that, and yes, it sucks. Sadly, Opera is not available on WP (before getting a Lumia, I always used Opera Mini on Symbian, which performed text reflow cleverly).
My father got a TI-59 from work around 1978 or 1979, and it was the first device on which I tried any kind of programming. It felt like using technology from the future! Unfortunately the magnetic card reader was not too reliable, and eventually stopped working. There were also swappable ROM modules containing programs; the calculator came with one, and others were supposedly available for special tasks. There was also a cheaper version, the TI-58, which lacked the magnetic card feature but was otherwise similar.
"Intel only left the cord out so it could reduce the size of the packaging - it can’t have saved it much money."
Could another reason be that power sockets vary around the world? Perhaps they thought this was a clever way to side-step the issue, but it was obviously annoying to buyers, so they reverted it.
As I remember it, in the 1970s LCDs were still pitifully illegible. You had to look at them from just the correct angle to see anything, and even then it was murky. So using LEDs or plasma displays(*) was quite reasonable at that time. (*) Not sure if this is the correct term for the kind of flat glass vacuum-tube-like element with glowing number segments in it. Around 1980 I had a Casio scientific calculator with this kind of display.
So, as usual, if there is programmable logic involved, all bets are off. This is something my Asus 1225B mini laptop gets right: next to the built-in camera there is an all-mechanical slider, which puts an opaque shutter in front of the lens. The user-facing side of the shutter is light-colored, so I can immediately see its state. Hack that!
Apple owners can emulate this advanced security feature with a piece of duct tape.
"Middle Earth has always struck me as just a set of scenes spatchcocked together as background for the plot [...]"
I wouldn't be that harsh. It reflects a Medieval European world-view, where to the West you have an apparently endless ocean, and to the east endless steppes from which invaders occasionally emerge. North is a cold wasteland, and South is hot, with Oliphants and other exotics. This is just how a Medieval knight would have perceived the universe, and neither he nor Tolkien had climate modelling in mind...
I don't see how the Earthsea geography is better defined. It seems to say nothing about what lies outside the archipelago, except that one book, I think, hints you reach the land of the dead if you go too far in one direction. How would you model the climate of that? But I agree that the working of Earthsea magic is far better defined than in most fantasy books; it is almost a science.
Having co-operating computers (where "co-operating" can mean infected with compatible malware) communicate acoustically is not surprising at all. After all, old acoustic coupling modems used to do this.
But I just do not believe malware could infect a computer that way.
In theory, of course, the sound device driver could be so buggy that it overwrites buffers while receiving data from the A/D converter connected to the microphone, but such a driver would quickly crash the computer even without hearing any malignant sounds! It would also be impossible to control the malignant sound so precisely that the resulting digitized data would form a working program.
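The first point, cooperating machines moving bits over audio, is easy to demonstrate with simple frequency-shift keying, just like the old acoustic couplers did. A toy sketch (the sample rate, bit duration, and tone frequencies are arbitrary choices of mine, picked so each bit holds a whole number of cycles):

```python
import math

RATE = 8000          # samples per second
BIT_SAMPLES = 80     # 10 ms per bit
F0, F1 = 1000, 2000  # tone frequencies for bit 0 and bit 1 (whole cycles per bit)

def modulate(bits: str) -> list:
    """Encode a string of '0'/'1' as audio samples, one tone per bit."""
    samples = []
    for bit in bits:
        f = F1 if bit == "1" else F0
        samples.extend(math.sin(2 * math.pi * f * n / RATE)
                       for n in range(BIT_SAMPLES))
    return samples

def demodulate(samples: list) -> str:
    """Recover bits by correlating each chunk against the two candidate tones."""
    bits = []
    for i in range(0, len(samples), BIT_SAMPLES):
        chunk = samples[i:i + BIT_SAMPLES]
        def energy(f):
            return abs(sum(s * math.sin(2 * math.pi * f * n / RATE)
                           for n, s in enumerate(chunk)))
        bits.append("1" if energy(F1) > energy(F0) else "0")
    return "".join(bits)
```

A real implementation would need noise tolerance and synchronization, but the principle is decades old, which is exactly why this part of the story is unremarkable; it is the infection-by-sound part that strains belief.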