Oh b*gger - there is another debugger to slow down the potential of the GPU?
It Just Keeps...
I may have to port all the important stuff to my Commodore 64. But not my Amiga, since its "copper" chip was one of the earliest GPUs
Intel should jump on this
Since their GPUs are far too weak to run malware, let alone anything graphically intensive.
Are the people who give these criminals ideas in the first place!
No, they aren't.
Since people have been exploiting security vulnerabilities since long before the term "security researcher" entered common usage, I'd say you've got it wrong. The crooks get these ideas just fine without the help of white hats. Anybody with the time and the motivation to find some way to break into something will--especially if there's money in it.
Which would you rather have:
The bad guys figuring this out first and telling nobody but each other, as they've long done, or
some security researchers figuring this out first and publishing it so that it gets fixed soon?
Security through obscurity DOESN'T WORK. It just ensures that, when a crook discovers a good way to break into something, the only people who know how to fix the problem are the people who are exploiting it merrily. You might argue that the researchers should not publish this information openly, but often that's the only way to get this stuff fixed. If the people who distribute the afflicted software know, on the other hand, that this information is public knowledge, they have to either fix it or face the consequences--whereas, informed that only the "good guys" knew, they'd just as likely sit on their asses for three years before doing anything about it.
Why have secrets?
... is it not a good idea when the blinds become see-through?
But it could be useful to have transparent logging, for auditing of course.
I for one welcome our GPU-wielding overlords, as the current crop are shite.
Some problems are best solved by working from the middle, outwards.
I'm sure you are right,
however, seeing as the number of packers is limited, the problem is less than you make out, and seeing as we're only talking research here and not a real threat, perhaps a look here
might relieve the scare tactic of your story. And yes, it's RE, but real life and real time, not something that may or may not occur at some unspecified time in the future.
And I'm sure you think I'm wrong, but from a recently unpublished comment, I'm sure you know my feelings about Americans, and your opinion in general.
"The CPU then executes it."
On what braindamaged OS/architecture does the CPU just execute stuff you write to memory? Oh, wait... Never mind.
Any von Neumann CPU, actually.
Any CPU of the von Neumann school (which is not just Intel CPUs but many others) is by definition capable of turning data into code and vice versa. Indeed, Just-In-Time compilation relies on this principle. That's why JIT compilers fall flat on strict Harvard architectures, where code and data live in separate memories.
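To make the data-becomes-code point concrete, here is a minimal sketch of the trick a JIT (or a decryptor) relies on: write machine code bytes into an ordinary memory buffer, then jump to it. This assumes x86-64 Linux (the opcode bytes encode `mov eax, 42; ret`) and a system that permits read/write/execute mappings; nothing here is any real JIT's API.

```python
import ctypes
import mmap

# x86-64 machine code for: mov eax, 42 ; ret
CODE = b"\xb8\x2a\x00\x00\x00\xc3"

def jit_constant_42():
    """Write machine code into an executable buffer, then call it as a function.
    This is only possible because code and data share one address space."""
    buf = mmap.mmap(-1, mmap.PAGESIZE,
                    prot=mmap.PROT_READ | mmap.PROT_WRITE | mmap.PROT_EXEC)
    buf.write(CODE)
    # Take the buffer's address and treat it as a C function: int (*)(void)
    addr = ctypes.addressof(ctypes.c_char.from_buffer(buf))
    func = ctypes.CFUNCTYPE(ctypes.c_int)(addr)
    return func()
```

On a strict Harvard machine the equivalent of `buf` would sit in data memory, and there would simply be no way to point the instruction fetch at it.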
Am I safe.....?
...with my little Acer Aspire One netbook and the weird OS it uses? Surely there is little/no GPU to speak of, and perhaps an OS that won't allow this malware to run?
If so then that will be my web device and my Dual-Core-Windows-based-bigger-laptop with an ATI GPU becomes a non-web-enabled games machine!
It will make me laugh if malware is the first mainstream software to make use of GPU processing. It would mean malware makers are the greatest innovators for the PC. But I would not worry: if malware authors' uptake of GPU processing matches the rest of the industry's, we won't see any malware on GPUs for a while.
I was really excited by the idea of GPU processing when it was first announced. But it has been years and nothing much has happened, except its use in a few specialist applications. I understand that not much was going to happen while Nvidia's CUDA and ATI's Stream were competing. But now DirectCompute and OpenCL, which work with all DX10 GPUs, have been available for a while and still not much is happening.
Also, I would have thought that antivirus scans would be a great use for GPU processing, as scanning lots of files at the same time is highly parallel. Just what GPUs like.
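The parallelism claim is easy to illustrate even on a CPU: matching many independent files against a signature set is embarrassingly parallel, which is exactly the shape of work a GPU's many cores eat up. A minimal sketch of the idea follows; the signatures and the scan function are invented for illustration, not any real AV engine's API.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical byte signatures; a real scanner has millions, plus heuristics.
SIGNATURES = [b"EVIL_MARKER", b"\xde\xad\xbe\xef"]

def scan_blob(blob: bytes) -> bool:
    """Return True if any known signature occurs in the blob."""
    return any(sig in blob for sig in SIGNATURES)

def scan_many(blobs):
    """Scan independent blobs concurrently. Each blob is a separate work
    item with no shared state - the same data-parallel shape a GPU kernel
    (one thread per file or per chunk) would exploit."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(scan_blob, blobs))
```

The interesting engineering problem on a real GPU is not the matching itself but shipping file contents across the PCIe bus fast enough to keep the cores fed.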
Far too late.
Game makers have already latched onto CUDA for physics processing, and there's always Folding@home, one of the first mainstream efforts to prove that you CAN do something productive and non-graphical with a GPU.
What a bunch of nonsense
This is no different than when polymorphic viruses started using the instructions of the numeric processor (the FPU), or when they started using Windows API calls in their decryptors. It just means that the scanners' emulators have to be improved to handle the new set of instructions, too.
oh and btw Dan
if you really need to indulge drug-infested manipulative thoughts, and not by email (you're blocked), try some morphine
and even though that article is out of date, the method is still applied.
GPU processing changes nothing, not even the goalposts, the boundaries are still the same, the aims are still the same, your understanding likewise has not improved, and nor will it, until you actually do understand anything.
Oh no, now McCrappee will make your graphics card run like a dog too
We should have seen it coming. McCrappee has made everything else on your PC run like a dying fly in treacle; now along comes an excuse for their horrible infection, erm, anti-virus scanner, to intercept and slow down everything your graphics card does too.
Whoohoo, now your Nvidia 9900000ZZZXXX penis pump overclocker special can run at the same speed as a Trident TVGA 8900.
Some days I wonder why anyone bothers...