Re: Who cares?
Maybe. Judging purely from the hardware reviews, The Register is just not good with computers and stuff.
She wants to build toys not tools.
I'm sorry, but HP used to be a respectable company. They made the first scientific calculator watch, not because there was a market for it, but simply because it was a cool idea.
Now it seems they want to stake their reputation on worthless consumer products, cheaply produced in China, and thrown away before the warranty runs out.
Not really. The security model was already good in Windows NT; Vista just added some weird stuff on top of it... which everybody disabled to get their legacy software to run.
The operating system Microsoft did try to get security right in was Windows NT, which would have been great had it not been for the Win9x/DOS compatibility that had to be shoved in. Maybe Microsoft should have sold Windows NT for the same price as Win9x. Then developers would have targeted the common subset of both systems instead of simply developing for Windows 9x.
What Windows is now is just a pale image of the vision behind it way back when people still believed in Windows.
You essentially need to develop your own Linux distribution for such nodes, as there is no standardized way to do hardware enumeration on ARM. So you need to maintain your own kernel for each of those boards. That will be a nightmare for people used to rolling out updates with apt-get.
Plus an STM32F103 as a management controller might seem sensible at first: they are cheap and powerful, the largest having half a megabyte of flash and 64 KB of RAM, far more than is typically needed in such an application. However, the STM32 controllers have one big problem: unnecessary complexity. You'll spend much of your time fiddling about with clocks and setting your port pins to alternate modes. And no, despite them being ARM, you cannot run Linux on them.
Actually that's Agilent now... and some of that business has been outsourced to yet another company, Symmetricom.
That's a classic business strategy; Siemens did it too. For example, they spun off their integrated circuit business into a company called "Infineon" because that business is cyclical. Then Infineon spun off its memory business, which is even more cyclical, into another company, "Qimonda"... which then went bust.
That's a classic piece of textbook stupidity. Millions of business students have been indoctrinated with the idea that companies should focus on their core business and get rid of unprofitable side businesses. The consequence of that ideology is, of course, that companies no longer have the endurance to ride out a recession in one area.
I mean, they could for example bring out something in a small clamshell form factor, perhaps like the old Nokia Communicator? Perhaps a device with a pen? Try to do what the others don't.
Here's my take on the current situation. User interfaces are getting increasingly vague. In the past you did text entry on a keyboard and graphical entry with a digitizer, essentially a large "drawing board" with a crosshair puck that let you enter precise coordinates. Now you have touch screens so imprecise it's hard to hit anything smaller than 10% of the width of the screen. Gesture control systems are even worse.
Now there is a market niche for mobile devices that let you enter precise data on the go; the Galaxy Note has shown that. Expand on that market, perhaps build a computer that can work with sketches, maybe something like Sketchpad. And since you'd already have the software, extend it to larger devices. How about a "watercooler screen" where you can go over a design with some peers? The same technology is needed there, just with larger screens.
You are all assuming that Intel is talking about work devices, but they are talking about phones and slates, perhaps even Ultrabooks. The first two will probably just run Windows Phone 8 or an ARM emulator to run Android; the latter will be so expensive and fragile nobody even dares do work on them. The market Intel is aiming at is the "expensive toy" market.
If they wanted to target the professional user, they'd have ECC memory controllers, which can be used either to keep your data safer or to lower the power consumption in suspend-to-RAM. (ECC => more tolerable bit errors in RAM => fewer RAM refreshes needed.)
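To illustrate the ECC step in that chain: this is a toy Hamming(7,4) code in Python, not what a real memory controller implements (those use wider SECDED codes such as 72,64), but it shows the principle that lets a controller tolerate the occasional flipped bit from a longer refresh interval.

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1,d2,d3,d4] into 7 bits with 3 parity bits.
    Bit positions 1..7 are laid out as: p1 p2 d1 p3 d2 d3 d4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4   # covers positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4   # covers positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the bad bit, 0 = none
    if syndrome:
        c[syndrome - 1] ^= 1          # flip it back
    return [c[2], c[4], c[5], c[6]]
```

Flip any single bit of a codeword and the syndrome points straight at it, so the data survives; that error margin is exactly what you'd trade for fewer refreshes.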
The App-Bubble probably will burst in a year or so anyhow. LTE is around the corner and already being widely deployed. You can have decent browsers like Firefox on mobile phones now. There's less and less space for the classical "App". Eventually it will be absorbed by the web.
Particularly for CAD work. Currently you can barely fit more than an A4 page of content on a screen.
With such a screen you could fit a whole A3 page onto your display and still see every little detail, just like on paper. Gone would be the days of drawing through a peephole and then printing the result out just to look at it. You could finally actually work with the design on the screen.
I mean, if there were a decent stable hardware platform for mobile phones, you could simply take a boot medium of your favourite operating system, in the version _you_ want, and install it. Alternatively the phone could boot from SD card.
It's just not feasible to port your operating system to every hardware platform, and outsourcing that work to the hardware manufacturers has proven to be a bad idea: they have no interest in maintaining support for their older devices.
So at least do it like CP/M did: have a common "BIOS", a layer of software in ROM, which handles input and output for basic features, like setting the correct mode on the LCD, or accessing flash, SD, the GSM subsystem or USB. Then have something to enumerate the rest of the hardware.
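The shape of such a layer can be sketched in a few lines. This is Python as pseudocode, and every name in it (`PhoneBIOS`, `lcd_set_mode`, and so on) is made up for illustration: each board would ship a ROM implementing one fixed interface, and the OS above only ever talks to that interface.

```python
from abc import ABC, abstractmethod

class PhoneBIOS(ABC):
    """The fixed contract every handset ROM would have to implement."""

    @abstractmethod
    def lcd_set_mode(self, width: int, height: int) -> None: ...

    @abstractmethod
    def storage_read(self, device: str, block: int) -> bytes: ...

    @abstractmethod
    def enumerate_devices(self) -> list:
        """List whatever hardware exists beyond the basic set."""

class VendorXBIOS(PhoneBIOS):
    """One vendor's implementation; the OS never sees these details."""
    def lcd_set_mode(self, width, height):
        pass                      # would poke vendor-specific LCD registers
    def storage_read(self, device, block):
        return b"\x00" * 512      # stand-in for real flash/SD access
    def enumerate_devices(self):
        return ["lcd", "flash", "sd", "gsm", "usb"]

def boot(bios):
    """Generic OS-side code: runs unchanged on any conforming board."""
    bios.lcd_set_mode(480, 800)
    return bios.enumerate_devices()
```

The point of the design is that `boot()` never imports anything vendor-specific, which is exactly what CP/M's BIOS achieved for 8-bit machines.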
Well, the main reason you cannot use that for mobile telephony is that the networks use different frequency ranges, which is already quite hard to handle. To get an energy-efficient transmitter you slightly overdrive your power amplifier and then filter out the harmonics with passive filters. So the analogue frontend still needs to be there, tuned to your particular frequency bands.
Particularly since there have always been computer kits for consumers. Just think of the Altair or the Apple. Of course you cannot build your own integrated circuits, but soldering chips onto a board is doable.
http://blog.modernmechanix.com/housewife-builds-fm-stereo/
FM stereo receivers had a lot more parts inside them than your average computer, yet that untrained housewife was able to build one.
I mean, I'm not claiming to have built my own TV set just because I've connected a satellite receiver to a monitor, yet that's exactly what people "building" computers are doing.
...this is all meaningless. I mean there are new kinds of technology being developed, but most of the "Social Web" "Web2.0" stuff will probably be worthless in a few years.
Look around you, companies like AOL have (kinda) failed. Nobody would have thought that would be possible back in the late 1990s. Nobody knows if Google will still exist in 20 years. For all we know by 2016 Facebook could be sold off for a ridiculously low price.
Actually, there's a type of digital filter developed a long time ago (it was the basis for Musicam and PAL+; I can't remember the name) which lets you split a signal into two parts with a matched pair of low- and high-pass filters. You can subsample both and transmit them independently. So you could set up one low-resolution stream as well as a high-resolution one.
You could in fact start with a UHDTV image and scale it down several times until you end up with 256x144 or so. That would be your base stream; the more additional streams you take into account, the more resolution you get. The great thing about it is that the processing power needed scales as well, which is particularly useful for mobile devices.
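The scale-down-several-times idea can be sketched in a few lines of Python. This is a deliberate toy: a 1-D "image", naive pair averaging instead of the matched filter pair a real codec would use, but it shows the base-stream-plus-detail-streams structure, and that a receiver can simply stop after however many streams it wants.

```python
def downsample(signal):
    """Halve the resolution by averaging neighbouring pairs."""
    return [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]

def upsample(signal):
    """Double the resolution by repeating each sample."""
    return [s for s in signal for _ in range(2)]

def encode(signal, levels):
    """Return (base stream, detail streams from coarse to fine)."""
    details = []
    for _ in range(levels):
        coarse = downsample(signal)
        predicted = upsample(coarse)
        # The detail stream is whatever the coarse prediction missed.
        details.append([a - b for a, b in zip(signal, predicted)])
        signal = coarse
    return signal, details[::-1]

def decode(base, details):
    """Apply as many detail streams as were received: more streams, more resolution."""
    signal = base
    for detail in details:
        predicted = upsample(signal)
        signal = [a + b for a, b in zip(predicted, detail)]
    return signal
```

With all detail streams applied the reconstruction is exact; with fewer, you get a correct lower-resolution picture for proportionally less work.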
Seriously, haven't they learned from the past? As soon as you remove DRM, your sales will skyrocket. Just look at iTunes, where the first DRM-free songs quickly overtook a big chunk of the DRM titles. Anybody still designing DRM into their systems shows they have no idea about reality. As Wau Holland, founder of the Chaos Computer Club, once put it: "Es gibt keinen Kopierschutz, nur einen Kapierschutz" (there is no copy protection, only protection from understanding).
Other than that, there has always been something called "scalable codecs", which tried to do the same thing MPEG-DASH does. Essentially you have multiple streams: one base stream, and others allowing higher resolutions or more detail. It's technology from the 1990s.
1. Fingerprint scanners work by transmitting the secret (your fingerprint) from your finger to the computer. So from that perspective it's as secure as a password.
2. It's a lot harder to keep your fingerprints secret than your password. If you are not careful you will in fact end up with your fingerprints all over your computer.
3. Even if UPEK had wanted to encrypt the password securely, they couldn't have. They need to be able to decrypt it, and for that the key has to be stored somewhere. With a password, you can simply derive the key from the password itself. With fingerprints you have fairly fuzzy data; it's probably not possible to derive keys from that.
In a year or so, we will have decent browsers on mobile devices. Mobile Internet will also work decently.
Just as in the "multimedia CD" era of the 1990s, people will wonder why they should buy an app (CD-ROM) when the same functionality is also available as a website. Microsoft is simply too late to the game; they have missed it.
Besides, even in the classical app market there is no place for Microsoft. It's already divided up between Apple and Google, and Microsoft has no killer feature, nothing to improve on those.
50 years ago, people believed we'd now have Moon colonies, perhaps even Mars colonies.
50 years ago, if you wanted to make an overseas telephone call you had to actually make an appointment with the phone company for one of the times they had a line. Even then the quality was worse than CDMA or GSM half-rate cellphones today.
50 years ago, you were lucky to have access to a telex, so you could telex messages over to another country with virtually no delay.
Back then it was actually important to be able to go from place to place quickly.
Today we have ultra-cheap communications. I don't need to think twice about telephoning the US; it just doesn't cost anything. We can now easily e-mail anywhere in the world. If we want, we are free to videophone. Larger companies have sophisticated telepresence rooms for conducting international talks.
There is less and less need for fast travel. And since energy costs are rising, it is going to become more and more expensive. It may very well be that in 2050 we'll consider air travel just a toy for the few rich people who can afford it. We probably won't need it any more, just as we don't need Concorde any more.
There may even come a point when the world starts to localize again. What's the point of manufacturing a computer in China if the transport is either more expensive than the difference in production costs, or the low-cost (= low-energy = slow) transport takes so long that the product is outdated by the time it arrives here?
I can understand that they try to push through such "cabinets" since they solve a serious problem threatening the country. A bit of inconvenience can be expected.
But why don't they go a step further and nationalize the telecoms? I mean, currently you have ISPs which don't seem to want to upgrade their networks, ISPs with network congestion not only at peak hour but throughout the day. Yet Internet access is not really cheaper than in places where networks are properly designed.
So cut all the bloat and bureaucracy of a commercial company and nationalize it. Just declare fibre to every home as a national goal. Come on, you have been able to bail out banks. That was a massive undertaking. Re-wiring your country is cheap compared to that.
Well PAL+ was popular in Germany, at least with public broadcasters. They only switched it off this year.
I remember going to a local "trade show" put on by a local electronics store to watch one of the early regular PAL+ programmes... they simply didn't have a set that could decode it. A few years later I actually got to see a PAL+ LaserDisc from Sony.
Modern sets don't decode PAL+, but they can decode the WSS (wide screen signaling) signal.
PAL+ kinda was a stop-gap solution to still be able to do enhanced resolution (i.e. 16:9 with full line count, and improved colour decoding) on terrestrial and cable channels. Back then the long-term idea was to move to the "MAC" set of standards, semi-analogue standards broadcast over satellite: start with D2-MAC and then move to HD-MAC in the second half of the 1990s. What they didn't anticipate was the "MPEG revolution". Suddenly, thanks to MPEG-1 and MPEG-2, it was practical to squeeze a TV channel into a few megabits. The DVB set of standards was created and HDTV was postponed by about a decade.
So what's left of PAL+ is wide screen signaling as well as lots of 16:9 material from the 1990s.
I have to say this: the Raspberry Pi seems to be fairly simple. The only packages the company I work at is not familiar with are the BGA parts. Virtually any SMT mounter can place BGA parts, but few small companies work with them, as they are harder to hand-solder if they ever need replacing.
Other than that, since there are parts on the underside, you cannot wave-solder the through-hole parts. However, as an end customer I would prefer to mount those parts myself anyway.
I have my VDR set up with some search words. When a show containing such words comes up, it records it. Simple as that.
However, in the UK you still have decent live TV. You can turn on your TV set and have multiple decent TV shows, one right after another. TV in the UK is not just hurling crap at the viewer but providing a "habitat" for the programme.
At least in Germany you can get sub-60-Euro DVB-S2 receivers which will happily record BBC One HD for you in perfect quality onto any USB mass storage device you can think of. Surely such devices must be available in the UK; if not, you can always mail-order from other countries. (Keep away from HD+ or CI+; those are the few systems which actually encrypt their content on USB disks.)
So the average not-too-smart user could simply record it onto a micro-SD card, pop that into their mobile device, and off they go.
No matter what they call their DRM, no matter how nice they appear to be: as long as there's DRM, they will be in charge. And even if they don't want to be evil, a court could simply order them to make the files unplayable for you.
So please, music and movie industry, get rid of DRM. DRM is, for most of your potential customers, the main reason they pirate.
By now humanity has passed its mobility peak. Nobody cares how quickly you can cross the Atlantic, since the security theatre takes more time than the flight anyhow. If people cared, Concorde would have been a success.
If you want to talk to someone on the other side of the globe, use a telephone or rent a video conference booth.
Let's look a bit ahead. Imagine Windows 8 and Windows RT being pushed onto the market. People will find that Windows RT will not run conventional software without severe modification; in most cases it will require a complete rewrite.
Now add in the post-PC hype and you get a future where people fear they won't be getting devices with x86 CPUs any more.
Put both together and you have people fearing for Win32, thinking it won't last much longer. Those people will seriously consider a rewrite of the software they are currently running.
Of course some of those companies will switch to .NET, Silverlight, Metro or whatever technology is currently in fashion in the Microsoft universe. But many will simply switch to web-based UIs, and those people won't need Windows clients or servers any more.
Microsoft risks alienating large portions of its customer base. It's not a bright idea to abandon a strategy people have relied on since the 1970s.
https://www.youtube.com/watch?v=RPwDqFf4NJI
(Video of Microsoft announcing the Windows strategy of "starting anew" while keeping everything (important) the same, a strategy Microsoft followed until Vista.)
You can also go a step further and get an IBM ThinkPad X40 or similar, a 12-inch notebook with a 1024x768 display. That is much more backpack-friendly, though you will most likely need a new battery.
For 400 quid, you can actually even get a T61p with 1920x1200 resolution.
So buying new consumer notebooks is currently a bad idea, particularly since you always risk getting one with a glare display.
I have actually tried tipping them on their sides. It works, although the bezel in the middle is more distracting. However, for unrelated reasons I had to switch graphics cards, and I don't know whether the new one supports rotating the image.
One problem I've noticed is that at least my screens have a rather narrow vertical viewing angle. Usually that's not a problem, but tipping them on their sides turns it into a narrow horizontal viewing angle.
Maybe one day I can afford a huge room with a back-projection screen driven by a large bank of high-resolution projectors, each perfectly aligned so they form one giant, high-resolution picture.
Seriously, if I want a _wider_ screen, I'll simply put 2 or 3 screens next to each other. It's trivial to do this, and since the only serious use for it is having multiple windows next to each other, the gap won't matter.
What we need are taller screens. Stop making mailslots. Mailslots might be fine for portable devices (see the Nokia Communicator), but on the desktop you have enough horizontal space.
And unfortunately I'm sure that show will come to Germany, where it'll be produced by utter idiots. The contestants will be C- and D-list celebrities and it'll be terrible. But since the commercial stations have virtually no competition (public TV is mostly censored on cable, for example), it'll be "successful".
Anyhow, wasn't one of the ideas behind RFID to track every-a*se's movements?
Pardon me, how else do you interpret "Java is not the new Cobol"?
http://www.theregister.co.uk/2012/03/07/oracle_java_9_10_roadmap/
Nobody writes new Java applets for websites, as JavaScript can do the same now without the obvious disadvantages. And that "Java is not the new Cobol" statement probably held off quite a few new deployments in the server/backend area, where Java was considered a replacement for aging Cobol code.