Re: Poor research
Furthermore, I don't know much about R, so I'm currently fairly indifferent to it. I suspect there are even more people like me.
4850 publicly visible posts • joined 9 Mar 2007
For example it's impossible to receive proper television down there as all the relevant satellites are below the horizon.
BTW, the US is also searching for highly skilled workers, as this actual ad posted to the "career" wall at a technical university in Germany shows:
http://casandro.dyndns.org/tim/bad_job.png
I mean I recently got an X40; that's a nice compromise between size and usefulness. With Linux that device is nearly a full-blown workstation. Getting more storage or a bigger, higher-resolution screen unfortunately clashes with the idea of a portable computer.
It wouldn't be of more use if it were thinner; the thickness has never been a problem for me. If it had the form factor of an old Nokia Communicator, say, it would be a bit more useful, because then I could simply carry it around in a pocket. However, just making a laptop flat won't make it more useful.
CPU power is rarely a problem these days, nor is disk access time.
The web has always been designed to scale from tiny resolutions to large ones. That's the whole point of HTML; otherwise you could have gone for a simpler standard.
So web designers will have to learn how not to be idiots and specify sizes in something other than pixels, and images will simply be scaled. Once your output medium has a good resolution you won't care. (Just look at printers: you hardly ever print a picture at the printer's native resolution.)
The end result will be a more usable web, one that can be used on both your mobile phone and your retina display wall-sized desktop :)
Hmm, I doubt "people who just want a computer to be simple and easy to use, without spending their whole life learning how to use it" would still use Windows; those people moved to Linux years ago, since it's considerably simpler. There's no DLL hell, since libraries are installed by an easy-to-use package manager. There are no 2-3 decades of legacy ideas to learn before you can understand how a certain feature is supposed to work. There's even a help system that works.
Just because you and perhaps 99% of the users want multitasking, that's no reason for Microsoft to implement it. People will buy Windows 8 whether they like it or not, just like they did with Vista and Windows 7. So why should Microsoft go the extra mile to please them?
If you were actually serious about your operating system, you'd be trying out your legacy software on Wine or ReactOS instead of messing around with Metro.
I mean seriously, imagine spending money on some software, say 100k for some specialized program. Back then you had a PDP-11 running RSX. Now your hardware is slowly falling apart. How can you switch to something easier to maintain without having the source code?
The purpose of Flame is not to spy on users or infect many systems, but to give meaning to the ITU. The ITU fears becoming useless in a world dominated by lightweight patent-free Internet standards which can be implemented within a day.
This is why the ITU wants to re-brand itself as "cyber security experts". I wouldn't be surprised if the ITU sponsored the development of Flame.
Wau Holland, one of the founding fathers of the CCC in Germany once said that the advantage of censorship in schools is that it will teach children how to deal with it.
Of course he said it in German which makes it a pun.
To deal with it: "(damit) umgehen" (separable, stress on "um"; lit. to handle it)
To circumvent it: "umgehen" (inseparable, stress on "gehen"; lit. to go around it)
I wonder why more companies aren't integrating UPSes into the power supply. It's far easier to integrate one into that part.
One possible design would be a special 14-volt output on the PSU to charge the battery, plus a second primary winding and switcher to run off that battery.
Another design would be a dual stage power supply. You'd first have a simple coarsely regulated power supply giving you about 14 volts (or a multiple of that) for charging. The batteries are then connected in parallel to the first stage. Then from there on you'd have a second stage working on input voltages between 12 and 14 volts (or multiples of that) and giving you the voltages you actually want.
The first design would probably be more efficient; the latter would be simpler and wouldn't have any "switch" between the two modes. Once the power goes out, the voltage of the battery will drop from about 14 volts (required for charging) to about 12-13 volts.
Sidenote: Obviously you'd use lead-acid batteries, which can be charged with a constant voltage.
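A quick back-of-envelope check of the voltage windows in the dual-stage design. The per-cell figures are assumed typical lead-acid values (~2.30 V/cell on float charge, ~2.0-2.1 V/cell under discharge), not taken from any particular datasheet:

```python
# Rough sketch of the voltage ranges in the dual-stage PSU design above.
# Assumed figures for a 6-cell lead-acid battery: ~2.30 V/cell on float
# charge, ~2.0-2.1 V/cell while discharging.
CELLS = 6
FLOAT_V_PER_CELL = 2.30
DISCHARGE_V_PER_CELL = (2.0, 2.1)

charge_bus = CELLS * FLOAT_V_PER_CELL                    # first-stage output
discharge_bus = tuple(CELLS * v for v in DISCHARGE_V_PER_CELL)

print(f"charging bus: {charge_bus:.1f} V")               # ~13.8 V, close to the 14 V above
print(f"on battery:   {discharge_bus[0]:.1f}-{discharge_bus[1]:.1f} V")
# The second stage therefore has to accept roughly 12-14 V in,
# which is exactly the input window described above.
```

So the second stage sees at most about a 15% input swing, which a single switcher handles without any mode change.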
It's usually quite bad to be competent in a typical commercial company. The main problem is that many companies suffer from something I'd call the collective Dunning-Kruger effect. Essentially you have people incompetent enough not to understand that they are incompetent. If, in such a situation, you become a little bit more competent, you will recognize not only your own incompetence, but also how incompetent your peers are.
They have feature phones executing J2ME software. Compared to probably even Android and iOS, that is a fairly future-proof platform, as there is a lot of software around and you need nothing specialized for the phone. Too bad this isn't more heavily publicized.
You do realize that not only does US NTSC still have that weird 60000/1001 Hz field rate, but broadcast standards are actually diverging in the digital age. For example, DVB features a powerful EPG in its standard, which German broadcasters use to provide a 4-week EPG. In the UK this EPG is just used for "now and next", while the actual EPG is done in a proprietary and undocumented format.
Because you need to be able to convert to and from those old standards, you want whole-number ratios between the framerates for interpolation. Now if the US had 60 Hz, that would be simple. You could simply have 300 fps, which is 6*50 or 5*60, simples!?
No, not quite: the US _had_ 60 Hz back when it had monochrome television. When they switched to colour, they moved to a 60000/1001 Hz field rate. That weird rate is still used for US HDTV.
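The arithmetic above can be checked directly with Python's exact fractions. The `common_rate` helper (lcm of two rationals) is my own, not from any library:

```python
from fractions import Fraction
from math import gcd, lcm

def common_rate(a: Fraction, b: Fraction) -> Fraction:
    """Smallest rate that both a and b divide exactly: the lcm of two fractions."""
    return Fraction(lcm(a.numerator, b.numerator), gcd(a.denominator, b.denominator))

pal  = Fraction(50)              # European field rate
ntsc = Fraction(60000, 1001)     # US colour field rate, ~59.94 Hz

print(common_rate(pal, Fraction(60)))   # 300  -- the "simple" world
print(common_rate(pal, ntsc))           # 60000 -- the world we actually got
```

With a true 60 Hz rate, a 300 Hz master clock covers both standards; with 60000/1001 Hz, the smallest exact common rate balloons to 60000 Hz, which is why real converters interpolate instead.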
Because when you go to a manager and tell him that the moon has solid gold and platinum under the surface, with ample proof like an actual core sample, he will simply look at the latest issues of business magazines, see how much money has been earned by space exploration in the past, and decide that, since nobody has ever sold any actual moon gold, it must be utterly worthless. People don't want moon gold, otherwise they would have bought it...
If you want to excite the military-industrial complex, you need to excite the military part of it. Those are like a bunch of kids you can literally sell non working toys to. And since they negotiate contracts up front and have guaranteed payments, the industry will listen to them.
Well, I know the article is very fuzzy on this, but apparently those boxes also have 6 tuners for DOCSIS, which has to fit into the fairly narrow US channels. So you need quite a few of those channels and therefore several tuners. You might have 4 tuners for Internet and 2 for TV, which is reasonable.
However, in Europe there is a point in having that many tuners, and that's automatic YouTube gateways which record everything and upload it to YouTube automatically.
Meanwhile, we normal people just get a VDR, pop in as many tuners as we want, and simply stream via the streamdev plugin, which can stream into just about any format to just about any device.
Mine had 6 tuners until recently when I removed 2 because I needed them for other purposes.
I mean come on, you cannot even properly do e-mail on those devices, let alone any real work.
I mean on a real work day I might lay out part of a PCB or capture a schematic. This might work quite well on a pen-based device, but there is no software supporting it. And our software from 1998 will never support it.
Then I might do data analysis. While a portable device might really be useful there, I need gnuplot and awk and the like to do any actual work. At best the device could be a terminal.
And even if that worked, an overnight batch job would probably drain the battery flat.
So at best you get the use of a dumb terminal out of those in real-life situations. Of course there are people whose job it is to pose with the newest device.
Microsoft is actually taking a gamble. If Windows 8 doesn't succeed, they will be in trouble. However, they are doing some things partly right. They will sooner or later have to get rid of their old, complex Win32 and Win16 APIs, and that's what they are trying to do now. Had they acted earlier and provided a more abstract, generic API, the transition would have gone a lot more smoothly.
My guess is that Windows 8 will fail on mobile ARM devices, and Microsoft will buy a start-up making x86 emulators for ARM, including an automatic GUI refit to make old applications more usable on mobile devices: perhaps by rendering the original GUI into a hidden framebuffer, then either parsing the graphics or using API functions to determine the positions of the GUI elements, which would then be rearranged into a new touch-compatible GUI. Alternatively, the positions of the elements and/or the new GUI could be determined manually and stored.
First of all, don't make the requirements too strict. If you want your people to have at least 3 digits and 3 letters in their passwords, changed every month, you'll end up with passwords like June2012.
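To see why such a policy backfires, here is a minimal sketch of that hypothetical "3 letters, 3 digits" rule; the `meets_policy` function is made up for illustration:

```python
import re

def meets_policy(pw: str) -> bool:
    """Hypothetical corporate rule from above: at least 3 letters and 3 digits."""
    return (len(re.findall(r"[A-Za-z]", pw)) >= 3
            and len(re.findall(r"\d", pw)) >= 3)

# The password everyone converges on sails through...
assert meets_policy("June2012")
# ...while a much longer passphrase is rejected for lacking digits.
assert not meets_policy("correct horse battery staple")
```

The rule optimizes for a checkbox, not for entropy, so users converge on the weakest string that passes.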
In some cases it may be wise to write passwords down on a sheet of paper you carry around with you, or even in a plain-text password file. However, those instances are rare and need to be well thought out. Don't put such a file on a computer you can neither secure nor own (e.g. an iPhone).
However the main point is to use public key authentication whenever possible.
Microsoft's influence on the world is not about the money; it's about making the world a worse place. People become content with absolutely horrible software which doesn't even have usable file formats, and they work around this with internal scripting languages.
Just don't treat your customers like idiots. Give them a device which caters to their needs and integrates well with what they already have.
A few examples of what can be done better:
One button per source on your remote; no going through long menus to switch sources.
Usable network interfaces: make your set play anything it can reasonably handle, like MPEG-4 AVI files from an NFS share; have a simple web interface for control and automation; implement a VNC client, etc.
Make any "image improvement" technology optional and explain in the manual what exactly it does.
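Such a web interface would lend itself to scripting. A hypothetical sketch of what automation against it could look like; the host name `tv.local` and the `/api/...` endpoints are invented for illustration, not any real set's API:

```python
# Sketch: building control URLs for a hypothetical TV web interface.
BASE = "http://tv.local/api"

def command(action: str, **params: str) -> str:
    """Build the control URL for one action, e.g. switching inputs."""
    query = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
    return f"{BASE}/{action}" + (f"?{query}" if query else "")

print(command("source", input="hdmi1"))  # http://tv.local/api/source?input=hdmi1
print(command("power", state="off"))     # http://tv.local/api/power?state=off
```

A plain HTTP interface like this is trivially driven from cron jobs, home-automation systems, or a one-line curl call, which is the whole point of "control and automation".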
We have enough "devices for idiots"; it's time to make something proper. And no, that doesn't mean it will be unusable for the general public. You can make systems which are both user-friendly and powerful.
Wasn't that the guy who wanted to have an "Internet Passport" so we'll all be identified on the Internet... which would in his opinion somehow make the Internet safer?
I'm sorry, but by wanting to put his software onto iOS he contradicts himself, proving that he doesn't believe code signing is a security feature.
Go for the server versions. From a desktop point of view you have everything you need, without all the crappy stuff in the "consumer" versions, like simplified access-rights dialogues which let you lock yourself out of your own data.
Just face it: Windows is legacy software. Microsoft won't make something affordable to replace Windows 2000 or XP; what they want to do now is consumer stuff.
It doesn't matter who resells the computer, but whether it is fit for professional use. Apple's high-end Macs can certainly compete with good Linux workstations. It may be a niche market, but it's an existing one. Just don't expect an iBook to be of much use in a business, or plan for it spending a good fraction of its time in factory service.
So in a nutshell, carry business-ready computers, and leave the consumer stuff at home.
The main public channels mostly got rid of their children's programming years ago. It's now, in much worse form, on the combined channel "KIKA".
Actually, the main "analogue" channels are barely watchable; they spend a great deal of money on banal shows. The "digital" channels, however, actually seem to try to make good television. They are quite experimental, with some horribly bad stuff and a few true gems.
Have you ever seen this solar power plant near Vienna?
http://upload.wikimedia.org/wikipedia/commons/3/37/Wien_erdkabelstation_400kV_Nordeinspeisung.jpg
Just a tiny solar panel between the 2nd and 3rd white concrete columns (from the left), apparently producing enough energy to call for a 400 kV line going to it.
They were one of the companies who let their business products "bleed into" their consumer lines. I think they even had things like a TV set which was just a low-end professional monitor with an added tuner.
Betamax, Sony's most popular consumer format (which isn't saying a lot), was just a scaled-down U-matic. U-matic, in case you don't know, was aimed at the budget-conscious user who wanted to carry his VCR around.
You always got the idea that a Sony product was just a bit better than something cheap. This may have been true in the analogue age, but when they switched to digital, they failed to grasp what this would mean. A Sony DVD or Blu-ray player cannot have a better picture or sound than its cheap Chinese equivalent. Had Sony been smart, they would have targeted the technical user: added a networked media player that can access and auto-discover NFS and SMB shares, playing as many formats as humanly possible, and made the NetMD players support as many formats as possible within the constraints while being as open as possible.
Each one of those boxes needs a custom-prepared version of Linux. You can't just install Debian or Gentoo on any of them unless the distribution is modified for that particular box.
What the PC has done right in that respect is to be operating system neutral. Your BIOS includes a tiny piece of code loading the bootloader from disk, as well as routines to access and enumerate what hardware you have. This way you can have any operating system you want on your PC, and it'll simply work. If it doesn't have special drivers for your hardware, it can simply use the routines provided in the BIOS. If it doesn't know what hardware you have, it can simply look it up.
This is what's missing in the ARM world. A standard "firmware", perhaps based on OpenFirmware or something. Or perhaps something rather more minimalistic. A table of the hardware in ROM, and some primitive routines to access the most important hardware (network, display, input) in its most primitive way as well as a routine to load the bootloader from the mass storage device.
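The "table of the hardware in ROM" idea can be sketched in a few lines. The record layout below (device type, MMIO base, IRQ) is invented for illustration; real-world equivalents are flattened device trees and ACPI tables:

```python
import struct

# Hypothetical ROM hardware table: one fixed-size record per device.
# Layout (little-endian, packed): u32 device type, u64 MMIO base, u32 IRQ.
RECORD = struct.Struct("<IQI")

DEVICES = [
    (0x01, 0x1000_0000, 5),   # e.g. network controller
    (0x02, 0x2000_0000, 7),   # e.g. display controller
]

# The board vendor burns this blob into ROM at a well-known address...
rom_blob = b"".join(RECORD.pack(*dev) for dev in DEVICES)

# ...and an OS that knows nothing about the board can still enumerate it:
found = [RECORD.unpack_from(rom_blob, i * RECORD.size)
         for i in range(len(rom_blob) // RECORD.size)]
print(found)   # [(1, 268435456, 5), (2, 536870912, 7)]
```

That enumeration step is exactly what the PC BIOS provides and what each ARM board currently replaces with a hand-patched kernel.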
Well, there's profit _now_. However, if you look at long-term successful companies, you will find that they always invested in the _next_ technologies. Even bean-counters acknowledge this.
While IBM was having great success selling typewriters and tabulators, they invested in general-purpose computing, even though that business probably wasn't very profitable until the 1970s. If they hadn't, there would be no IBM today.
Capitalism has very little to do with how companies are run, or at least it didn't used to. Now capitalism is seen as the way modern MBAs run companies, where someone can seamlessly migrate from a company making bottled sugar water to a computer company.
Well, but that's short-sighted. Cars are ceasing to be the must-have status symbol they used to be; the young are already trying to avoid having to own a car. Meanwhile, China and India are producing cars of increasing quality at hard-to-beat prices.
So you are faced with a situation where people who used to buy your cars now either don't buy a car at all or, if they must, buy an Indian one.
The operating system came with all the GUI stuff built in, which made GUIs more or less consistent, particularly when it came to things like colour.
Windows is hardware-independent, so you can never be sure whether the user's hardware can even display "red" or "green". If it's CGA, it most likely cannot. So people just drew buttons and other controls and left the OS to deal with colours.