* Posts by Thorfkin

70 publicly visible posts • joined 23 Aug 2007


Fragged, fragged and thrice fragged! 20 years of id Software’s Doom

Thorfkin

I definitely enjoyed both Doom and Doom 2. But, like others mentioned, I didn't really find the game scary. I definitely enjoyed shooting anything that moved but it wasn't until Doom 3 that I really experienced fear from a Doom game. Setting aside the endless supply of hidden wall panels with demons inside, no other game got me so worked up that I stayed twitchy for hours after. Despite its issues, Doom 3 was the real masterpiece of the collection in my opinion.

Dell goes private: Stock ceases trading, now Big Mike's baby once again

Thorfkin

I agree this is an excellent move on Dell's part. How could Mike D possibly make the right long-term decisions for his company and customers while constantly facing down shareholder demands to meet quarterly profit goals? Going private will make the company better.

Universal's High Fidelity Pure Audio trickles onto Blighty’s Blu-Ray hi-fis

Thorfkin

The music industry is correct in that there is a limited market for high definition audio. Last year I bought a reasonably decent home theater setup with Polk Audio speakers. Thereafter I noticed just how bad my MP3s sounded when compared to the high-def audio in many blu-ray movies. So I re-ripped my entire collection to lossless FLAC. That helped a lot. Now my music collection sounds as good as it can but I really would like to have some of my albums in higher quality than what my CD source can provide. However I also think the author is correct in that these probably won't sell very well.

Comparatively few people go through the process of ripping their movies and storing them all on a shared NAS device. However that's not the case with music. Music customers expect, and demand, the ability to format shift their collection. I doubt that's going to be possible with these high definition blu-ray audio discs. The copy prevention system in the blu-ray spec is a lot more robust than what DVDs used and infinitely more effective than what CDs had. What are the chances that I'll be able to easily rip these high-def music discs to lossless FLAC? I'm not going to buy them if I have to put the disc in every time I want to listen. If I can't rip them to the lossless file format of my choice then it's just not worth the added cost to me over what I pay for a standard CD.

Sony Xperia Z Ultra: The quad-core 2.2GHz MEGA SCREEN PHONDLESLAB

Thorfkin

Thank you for this review. I've been trying to decide between the Galaxy Note 3 and the Xperia Z Ultra. Your review has helped :)

Congrats on MP3ing your music... but WHY bother? Time for my ripping yarn

Thorfkin

I recently purchased a 7.1 surround home theater setup with Polk Audio speakers. Afterwards I started to notice the difference in quality between the 256kbps .mp3 files I had originally created from my CD collection and the original lossless CDs. It was a subtle difference in the very high and very low frequency ranges that I could never hear on the cheap setup I was using before. So I re-ripped my entire collection to FLAC using EAC. It took me a few weeks but it was worth it :) I also love that I can convert the FLACs to other formats (for devices that don't support FLAC) without having to get the original CDs out again. Now all my CDs reside in storage =D
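For anyone wanting to script that convert-from-FLAC step, here's a minimal sketch. It assumes ffmpeg is installed and on your PATH, and the folder names are purely illustrative; it's one way to do it, not the only way:

```python
# Minimal sketch: batch-convert FLAC rips to MP3 for devices that
# don't support FLAC. Assumes ffmpeg is installed and on the PATH;
# the folder paths are hypothetical.
import subprocess
from pathlib import Path

SRC = Path("Music/flac")  # where the lossless rips live (illustrative)
DST = Path("Music/mp3")
DST.mkdir(parents=True, exist_ok=True)

for flac in sorted(SRC.rglob("*.flac")):
    out = DST / (flac.stem + ".mp3")
    # -qscale:a 0 selects LAME's highest VBR quality (roughly 245 kbps);
    # -n means never overwrite an existing output file
    subprocess.run(
        ["ffmpeg", "-n", "-i", str(flac), "-qscale:a", "0", str(out)],
        check=True,
    )
```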

WHY do phone cams turn me into a clumsy twat with dexterity of an elephant?

Thorfkin

Phone Camera

I have the same problem. I can take great shots with my Cybershot F828 but saddle me with a phone camera and I'm useless. My hands shake too much.

Headmaster calls cops, tries to dash pupil's uni dreams - over a BLOG

Thorfkin

Based on the information presented in this article, I believe the school's headmaster should be severely reprimanded at the very least. Possibly fired. Not because of his actions in relation to the school itself but specifically because he called the police. I realize that I have not read any of the source material and so my opinion is weighted heavily by my limited exposure to the facts of the matter. However, based on what I read in this article, I didn't get the impression Kinnan had posted any police-actionable material. He may have been extremely rude and may indeed have anarchist and individualist views, but none of those things is illegal in any way. Unless Kinnan threatened some form of violence, Jacques had no business involving the authorities and he should be made to publicly apologize for doing so. The school has every right to suspend or expel students for being a disruptive influence. However, it does not have the right to involve the authorities because its headmaster disagrees with one of his students' views.

Myst: 20 years of point-and-click adventuring

Thorfkin

Excellent game and book series!

Myst was one of my favorite games as well. I think it stands out for me as such a novel experience because it was one of the first games I purchased when I got my first CD-ROM drive. I played it on my 486-DX33 (33MHz, no that's not a joke =). It had a whopping 8MB of RAM =D These days I doubt I would have the patience to work my way through some of those nastier puzzles but I'm glad that I did when I was younger. There certainly was something indescribably appealing about the world presented by the Myst game and later the book series. I enjoyed both immensely.

Women in IT: ‘If you want to be taken seriously, dress like a man’

Thorfkin

I disagree with your assertion that IT is a "men's club". Whether you're male or female, other people's perception of your ability as a professional is determined entirely by your presentation of self.

I agree that you won't be taken seriously wearing a dress any more than a guy would be taken seriously wearing shorts and sandals. If you want to be taken seriously, get yourself a decent business suit (male or female). Believe it or not, men run into this same problem. How you present yourself matters.

You're right about the attitude though. From an employer's perspective, given an abundance of similarly qualified applicants, they will Always hire the one who stands out the most with an empowered attitude. That same attitude continues to affect your "brand" even after you're hired. In this business you HAVE to stand up for yourself. You MUST come across like you personally own the world if you want to get noticed. I personally think that's why most women find it difficult to break into an IT career. Statistically men tend to be more aggressive than women and so they tend to get noticed more. I don't know if this is a result of something social, genetic, or a matter of upbringing, but I do know that presentation is everything in any self-driven career.

AMD fools Wall Street, posts smaller loss than expected

Thorfkin

Re: AMD is doing better

I personally think it's likely to be quite a lot more than 5M APUs. The Xbox 360 and PS3 both sold around 75 million consoles over the 7 (or so) years since their release. Over the next 7 years I expect AMD will sell more than 150M custom APUs between the XBOne and PS4. That doesn't take into account the APU in the Wii-U, or the standard PC APUs they're making that are likely to find their way into notebooks and tablets. I even have one of their APUs in one of my desktop machines.

Acer Iconia W3: The first 8-inch Windows 8 Pro tablet

Thorfkin

I can't speak for other people but I know what's stopping me from buying a Windows tablet: the graphics chipset. I refuse to spend money on a tablet powered by an Intel graphics chipset. I want something with a little longevity and to me that means either a discrete GPU or an AMD APU. So far none of the tablets I've seen come with either. I would like a Windows 8 tablet between 12 and 15 inches running on a decent AMD APU so I can game on it as well as surf the internet. It's really surprised me how difficult it's been to find such a device.

IT design: You're not data, you're a human being

Thorfkin

When I think, I visualize everything in the form of abstract diagrams and imagery. It has always been a challenge for me to express linguistically what seems so clear in my mind's eye. When it comes to using computers, my experience has been similar to yours. I remember everything by association. When I want to run an app, I visualize in my mind the app itself, followed by the hierarchy of start menu folders that lead to running said app, like a map. I rarely visualize the name of the app. I remember an app's hierarchical position, shape, and color more easily than the name of the app itself. Because of the way my mind works, I NEVER use search as a means of navigation on my local machine. I find the whole search experience to be exasperatingly imprecise. I.e. if I have to reach for my keyboard to run an app, then the interface design is piss-poor as far as I'm concerned.

I think these things might be part of why I don't like Windows 8's new start screen much. Sure it looks cool but they've damaged an operating mechanic that has worked so well for me in the past. They've made searching an integral part of the start screen's usage scenario and I detest having to reach for my keyboard when all I want to do is run Calc or Defrag.

Windows 8.1: So it's, er, half-speed ahead for Microsoft's Plan A

Thorfkin

Search as a primary means of navigation

This concept irritates me as well. I use my QuickLaunch bar as my primary means of navigation. Even in Windows 7 I manually create a QuickLaunch folder and link it to my taskbar as the top row. Almost every program I use has an icon there, so almost all of my applications are literally one click away. This concept of searching for programs I want to run is a huge step back in terms of navigation efficiency. Microsoft still has a lot of work to do to make Windows 8 anywhere near as efficient to navigate as Windows 7.

I also firmly feel that when I am in desktop mode, every app I run, even if it's a TIFKAM app, should run in a desktop window. Not in full-screen TIFKAM mode. Windows 8.1 helps but it's not where it needs to be yet to get my money.

Sony, Microsoft and Nintendo: The big three slug it out at E3

Thorfkin

What The New Generation Brings To The Table

You mentioned that you weren't sure what the new generation of consoles have to offer over the current gen systems. I can answer some of that. I should start by pointing out the limitations of the Xbox 360 and the PS3.

Even though both of those consoles are able to display a native 1920x1080 image, the GPU (Graphics Processing Unit) in both has some (by today's standards) severe limits on polygon count, texture fill rate, and texture memory. As a result, most game studios have to turn down the render resolution in order to maintain a playable framerate. If you look at the back of the game case for most games, it will tell you the actual render resolution, which in many cases, especially for the larger AAA titles, was limited to around 1280x720 and then scaled up to 1920x1080. This process results in blurry images. The GPU in the new generation consoles is significantly more powerful. Enough so that it can render at native 1920x1080 and still have enough processing power to provide respectable environment detail at a playable framerate. I suspect that most, or even all, of the games for the new consoles will have a native 1080p render resolution, which will produce much sharper results in game than the older consoles can.
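The arithmetic behind that upscaling claim is easy to check; a couple of lines (illustrative only) show how much extra shading work native 1080p actually demands:

```python
# Per-frame pixel counts: native 1080p versus a 720p render target
# that merely gets upscaled. Just the arithmetic behind the claim.
native = 1920 * 1080    # 2,073,600 pixels
scaled = 1280 * 720     #   921,600 pixels
print(native / scaled)  # 2.25 -> native 1080p shades 2.25x the pixels
```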

The older consoles also had some rather severe memory limitations. If you've ever played any mod-able PC games that have fan-made replacement texture packs available (e.g. Oblivion, Fallout 3, etc.), you'll know what a difference a truly well-made set of surface textures can make to a game. The original consoles just didn't have enough memory to use really high quality textures on most surfaces. Some of the better texture packs can use 4GB of memory just by themselves. The PS3 and the Xbox 360 each had less than 1GB of memory. Both of the new systems have 8GB of memory so large detail textures with good bump maps should be possible on the new consoles.
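To put rough numbers on that (my own illustrative figures, not anything from the consoles' spec sheets), a single uncompressed high-resolution texture already eats a noticeable slice of a sub-1GB console:

```python
# Rough texture-memory arithmetic; all figures are illustrative.
width = height = 2048           # one "high-res" diffuse texture
bytes_per_pixel = 4             # uncompressed RGBA, 8 bits per channel
base = width * height * bytes_per_pixel   # 16 MiB
with_mips = base * 4 // 3       # a full mipmap chain adds roughly 1/3
print(base / 2**20, with_mips / 2**20)    # 16.0 MiB, ~21.3 MiB
# A few dozen such surfaces (plus normal/bump maps) would exhaust the
# PS3's or 360's memory on their own, hence the low-res textures.
```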

Between the native 1920x1080 render resolution and the ability to use detailed surfaces, the overall visual fidelity on the new consoles should be significantly better than on the previous generation. Naturally it's up to the game studio to take advantage of these additional resources and some probably won't. But some studios will and those games will be the ones that really showcase what the new hardware can provide.

PC makers REALLY need Windows 8.1 to walk on water - but guess what?

Thorfkin

Sales Downturn

I don't believe the economy is fully responsible for the sales downturn. It definitely played a small part but overall people have always been willing to pay for a product that provides good value and that's where I think the real problem resides. Windows 8's interface is crap for traditional desktop users. The fault lies with Microsoft. It's a shame they're trying so hard not to acknowledge that fact. Instead they should just do what they did when Vista wouldn't sell. Redesign the interface to give customers what they want and sell it as a new version. I'm not convinced the tweaks they're planning with Windows 8.1 will be enough to spark interest.

Reports: New Xbox could DOOM second-hand games market

Thorfkin

I think it more likely that the Xbox One will simply fail to catch on as well as its predecessors. Or at least I hope. If you want to make Microsoft understand how unacceptable you find their mistreatment of the used-game market, don't buy their console. Words mean nothing to a large company like that. If you want them to listen, you have to vote with your wallet.

Microsoft honcho pleads with media: 'Stop picking on us!'

Thorfkin

They got the same treatment with Vista. But with Vista it really wasn't Microsoft's fault so much as Intel and their crappy drivers and hardware. With Windows 8 it's every bit Microsoft's fault and they deserve the criticism. Their new interface made the OS easy to use on tablets but considerably harder to use on traditional desktop machines. They need to do better.

Master Beats: Why doesn't audio quality matter these days?

Thorfkin

PC Audio

The same thing has been happening in the PC market as well. You used to be able to expect that a high-end PC would come equipped with a Creative Labs, Turtle Beach, or other comparatively high-quality audio card, but these days you're hard pressed to even find them separately, let alone buy a machine with a decent audio card pre-installed. I remember I used to be able to find motherboards with Creative Labs audio chipsets onboard. Now all I see are Realtek chipsets. My home theater PC is poorer for it.

But I also agree that the source audio quality is a big problem these days. I have an old album called the Sierra Soundtrack Collection, recorded by Mark Siebert, that I bought back in the early 90s. The audio fidelity is absolutely amazing. It easily destroys newer stuff like my Disturbed albums in quality. Don't get me wrong, I love Disturbed's stuff but I do really miss the 90s focus on audio fidelity.

Microsoft Xbox gaffe reveals cloudy arrogance

Thorfkin

Cold Hard Reality

It's about time someone finally wrote in an article what I've been thinking for the last few years. I live 25 miles out of town in the middle of a desert. If my internet connection drops Only 5 times a day I consider it a very good day. I'm in exactly the situation you describe and there's not a damned thing I can do about it. I have exactly one internet provider here and I pay over $100 a month for a shitty 1.5mbps connection. This always-on bullshit is simply and purely unacceptable. If either next-gen console comes with an always-connected requirement, I simply will not buy that console. I consider this matter non-negotiable. If they want my money, I require that they drop their always-on connection requirement. I WILL NOT and CANNOT budge on this position.

I am NOT a PC repair man. I will NOT get your iPad working

Thorfkin

I experienced the same crap for the longest time until I started making my friends and family buy their computers from an official vendor like Dell or HP. When something goes wrong I tell them to call tech support. I'm much happier for it.

Tablets aren't killing ereaders, it's clog-popping wrinklies - analyst

Thorfkin

E-Ink Screen

It's my opinion that the average buyer doesn't understand the difference between an E-ink screen and a tablet screen. When I walk into a tech store like Best Buy I see the e-book readers and tablets displayed next to each other in the same place in the store, but I never see any displays or placards that explain the benefits of the E-ink screen. So naturally potential buyers see the tablets with their beautiful color screens and wonder why anyone would buy an e-book reader. I think retailers need to do a better job of making sure customers understand the benefit of only having to charge their e-reader every few months no matter how much they use it.

Have Brits fallen for Netflix, or do they still LoveFilm?

Thorfkin

Top Gear

I just wish they would update Top Gear on Netflix with the newer series. They stopped at series 17 and have not posted anything past that.

'Not even Santa could save Microsoft's Windows 8'

Thorfkin

Windows Dominance

@Matt

I think your assertion that Microsoft needs to "Think differently" could be expanded on. The problem, as I see it, is that Microsoft tried to create an interface that works for both desktop and tablet computers rather than creating two separate interfaces optimized for each task. They should have provided a way to actively switch between "Desktop Mode" and "Tablet Mode", with a clear desktop-oriented interface based on what has worked historically for desktop mode and the new Metro interface for tablet mode. This would have made Windows 8 into an OS that could easily slot into a wider set of usage scenarios. Instead, by trying to create this hybrid interface, they've done the opposite, narrowing the effective usage scenarios.

Desktop users want a Desktop OS not a tablet OS and this new interface suits tablets to the exclusion of effective desktop use.

Report: US telcos cashing in on data caps and poor competition

Thorfkin
Flame

Uncompetitive Pricing

Agreed. I still pay around $90 a month for 1.5mbps connectivity because they're the only high speed low-latency ISP that services my area and they're fully aware of that fact. I'm sick of being gouged by these greedy bastards. I really wish the FCC would do something about it.

It's official: Mac users are morally superior to Windows users

Thorfkin

Logic Fail

Your methodology is flawed. The study in question shows that Mac users donate, on average, more than Windows users do. It's your assumption that this makes Mac users more generous than Windows users that's flawed. It's the causal relationship that matters here. It's no surprise to anyone that Apple computers generally cost more than comparable products running Windows. There are notable exceptions but as a general rule this is true. Given that fact, people on a limited budget are significantly more likely to purchase PCs running Windows than they are a Mac. The natural result of this inference is that, percentage-wise, people who use Apple computers tend to have more disposable income than those who use PCs running Windows. That's not an indication that said people are more generous, only that they have more money they don't need.

I find myself reminded of the South Park episode about the clouds of Smug surrounding eco-friendly car owners =D

2012: an epoch-defining year for home entertainment

Thorfkin

"Smart" Televisions

I must admit, I simply do not understand how "Smart TV" could possibly be "The next big thing". It just seems to me to be an utterly pointless offering. All TVs sold today have HDMI input and it's literally second nature to just hook up a basic home theater PC that can easily deliver everything a "Smart TV" can deliver and a million things it can never hope to deliver. Why would anyone think that Smart TV was anything other than garbage with a pretty package wrapped around it? At least the 3D revolution had something tangible to offer. If you want a decent entertainment experience, get a decent 3D TV and hook up a decent gaming-ready HTPC. Don't waste any effort, or especially money, on Smart TVs with features that even the cheapest HTPC can easily outclass.

Windows 8 security is like a swiss cheese flak jacket - sez AV firm

Thorfkin

Headline Grabbing

One thing I couldn't help but notice is that you failed to mention how Windows 8's security compares to Windows 7 or Windows Vista. You say Windows 8 can be infected by 16% of the most popular malware when the OS's only protection is Windows Defender. I suspect that if you ran those same tests against Windows Vista or 7 you'd find similar results. My instinct is telling me that Windows 8 is likely as secure as its predecessors and that you're focusing on Windows 8 in an attempt to grab headlines.

When you look at this from the Glass-Half-Full perspective, that means a clean install of Windows 8 is resistant to 84% of malware designed specifically to infect Windows machines. That's pretty good as far as I'm concerned. As the saying goes "You're trying to make a mountain out of a mole hill".

Judge drops TV ad-block block: So how will anyone pay for TV now?

Thorfkin

Vote with your Wallet

I applaud the ruling. I'm of the opinion that the current state of television programming is vile and untenable. The way I see it, if I pay for television programming, then that programming better be 100% ad-free. I refuse to pay for programming that wastes as much as half of my time with useless repetitive twaddle. Since cable operators feel the need to charge for their service yet still shove endless amounts of advertising in my face, as far as I'm concerned, their service is utter shite and not worth even $1. I vote with my wallet on this and so should you. If they insist on flooding your peaceful house with disruptive crap then the service needs to be completely free.

I do understand the cable operators' perspective and business model. They charge for the service of bringing the available cable channels to your home but they don't actually do any of the programming itself. But that doesn't change the fact that cable is seriously expensive these days despite the programming being something like 99% ad-supported. I buy every one of the shows I watch on blu-ray for less than half of what six months of cable costs. Your average season of any particular show will run about $50, and that's for excellent quality, no commercials, and I can watch it on My schedule rather than trying to time-shift it.

I just don't see why cable companies are still in business. It doesn't make sense.

Windows 8: Never mind Office, it's for GAMING

Thorfkin

@Matt Asay

Wow, this article is littered with bad information.

>> Yes, that same Microsoft that blistered the existing gaming competition with the XBox,

What're you smoking?! Microsoft and Sony tied for second place this generation and Microsoft barely qualified as a competitor in the previous one. Nintendo wiped the floor with both of them this generation for both market penetration and profitability.

>> and subsequently set the standard for interactive gaming with Kinect.

All gaming is interactive! That's the whole point!! Microsoft did NOT set the standard or even come close to it. It was only a few days ago that I read a review of Kinect and how no serious gamer would even consider waving their arms around to try and play an RPG or ANY game that requires quick reflexes. Kinect was a knee-jerk reaction to Nintendo's success with the Wii, just as Sony's Move is. No serious gamer uses it.

>> Valve CEO Gabe Newell called Windows 8 a "catastrophe" for gaming, citing the Metro interface and Microsoft's closed app store.

I wouldn't call Microsoft's app store a catastrophe because it's closed. I would call it bad for end users. I understand what Microsoft is trying to do but from an end user perspective they don't seem to realize that what they're doing is fragmenting the DRM landscape. Having a central gateway in the form of Steam made it obnoxious but grudgingly acceptable to work with DRM laden software. But every additional gateway that gets added to the landscape makes DRM in general more obnoxious and less acceptable. Microsoft isn't helping here.

>> The former concern is overblown - you don't have to use the Metro interface if you don't want to

It's not overblown! Metro is shit for anyone running a desktop computer, and most hardcore PC gamers do so on a custom-built desktop machine. To my knowledge you don't have the option to turn Metro off without using third-party software to accomplish it. I could cope with the new start screen but every application's insistence on running in full-screen Metro mode is very much a catastrophe.

I like how fast and well optimized Windows 8 is but Microsoft really screwed up with the Metro interface. They made the OS completely unusable on most netbooks, and even on any machine running at 720p resolution (my HTPC), by artificially setting an unacceptable minimum resolution.

As usual you appear to have absolutely no idea what you're talking about Matt.

Memo to openSUSE 12.2: More polish, less angst

Thorfkin

Open Suse

My experience with OpenSuse in general was only what I would call "Okay". I liked the look of the interface but I just kept running into too many strange bugs for my taste. With 11.4, only around half the desktop effects worked correctly on my Intel chipset powered laptop. If I turned on the wrong desktop effect the whole interface would grind to a halt and crash. There were also a lot of widgets that would just inexplicably crash on me. With 12.1 I found that the interface was just slow as tar running down a wall. Even when I turned most of the desktop effects off it just felt like I was constantly having to wait for the OS to catch up. Not at all a good experience for me. But it was the strange glitches in the "start menu" that really pushed me away from OpenSuse. There seemed to be some weird memory leak that caused items to start vanishing from the menu, leaving blank spaces in their place until you moused over those entries. I hate to say it but even Windows 98 was more stable than that. I haven't seen such bad behavior from an OS interface since Windows ME.

I also gave Ubuntu a try. I liked how fast Ubuntu performed but I didn't like the Unity interface and I found the bugs in Gnome to be a nightmare. Any time I opened more than 7 or 8 browser windows the whole interface would flip out for no apparent reason.

Now I'm using the KDE release of LinuxMint and so far it's the first Linux distro that's behaved for me on my antiquated laptop.

Anyway, has anyone else run into similar problems with OpenSuse? If so, have they fixed those problems with 12.2?

Windows 7 passes XP, Mac OS X passes Vista

Thorfkin

Vista

I've always wondered what it is that turns previously intelligent individuals into raving copy-bots once they're employed in the field of IT journalism. If only I had a dollar for every instance of some oblivious journalist claiming Windows Vista is a disaster, I could buy Kim Dotcom's house. At this point, Windows Vista is actually better than Windows 7. On a clean install, Vista with Service Pack 2 is faster and takes a lot less interface customization to make it work well than Windows 7 does. I've got both Windows 7 Ultimate and Windows Vista Ultimate here in my house and I keep Vista on my personal machine because I actually like it. If you genuinely do prefer Windows 7's interface then more power to you. I personally hate it. Please stop repeating the baseless claim that 7 is better than Vista. They're virtually identical under the hood, with 7 having slightly poorer Superfetch optimization. Microsoft tried to shorten the boot-up time in Windows 7, by reducing the effectiveness of the Superfetch service, in response to user complaints over Vista's perceived boot time.

Ten movies inspired by video games

Thorfkin
Flame

@Caleb

I think you're letting your opinion / perception of what many of these movies should have been cloud your judgement concerning the movies' quality. Some of these movies are indeed turds but some of them are actually decent movies in and of themselves if you just put aside any preconceptions based on the game of the same name. Doom, Resident Evil, Silent Hill, and Tomb Raider, for example, were all decent flicks that I continue to enjoy to this day. If you put aside any opinions based on the game, those movies aren't actually bad. Most of them do tend to have at least one mediocre actor in them, granted, but what movie doesn't? Even Mario Brothers, despite having a script targeted at children, was an enjoyable movie. Street Fighter and Double Dragon were pretty much just stinkers though. I can barely get through either of them without falling asleep. For some perspective I recommend you sit down and watch Battlefield Earth or perhaps Manos: The Hands of Fate (without the awesome RiffTrax). Then you'll have an idea what a bad movie really looks like.

Ubisoft: 'Vast majority of PC gamers are PIRATES'

Thorfkin
Flame

Ubisoft appears to be run by a fair lot of incompetents. Their Always Online fiasco annoyed me so much I stopped buying their games. I won't spend money on another Ubisoft-published product until they publicly acknowledge that they screwed up and take steps to resolve my concerns. It's a shame really because I genuinely wanted to play through the Assassin's Creed games that followed the first one, but I refuse to give that shit company my money and I don't pirate my games. I haven't pirated a game since I was an idiot kid living in my parents' house and I have no intention of doing so ever again. I take offense at being treated like a software pirate after paying good money for a game I want to play.

Apple: Samsung was in 'crisis' over our iPhone awesomeness

Thorfkin
Mushroom

I think there are larger issues at stake here than just whether or not Samsung copied Apple's product design. I think the real issue this case will influence is whether or not Apple should have exclusivity over the tablet market. The concept of tablet computing in general is not an Apple invention. There have been numerous prior-art examples. The data pads everyone carried around in Star Trek: The Next Generation are a prominent example. The concept of portable tablet-based computing has been around for a long time. Many manufacturers have tried to produce devices that fit this segment, but I grudgingly admit that Apple was the first to market with a device that truly broke the geek-only barrier. Bringing existing touch screen tech and online app distribution together in a user-friendly way was the key in my opinion. But despite this I don't believe Apple should have exclusive rights to tablet computing in general. That's what this trial is really about. Apple is on a quest to eliminate Android so that they can exclusively control the tablet and smart phone markets. Samsung was just the most vulnerable enemy and represents an early step in that long-term goal. I don't believe Apple has genuinely earned the right to own exclusivity over tablet and phone computing. And I don't believe it's in the consumer's best interest for Apple to win this trial.

Woz: Cloud computing trend is 'horrendous'

Thorfkin
Thumb Up

Woz has the right idea

The buzz on the internet and in most office meeting rooms is "Cloud". It seems to have caught on as "The next big thing". But I agree it's nothing more than the "leet speak" of the business world. Reality, though, is the black to the cloud's white. The simple undeniable fact is that if you move your critical data into the "cloud" all you're doing is adding one more fail point. Now not only do you have to worry about the functional reliability of your service provider's storage hardware, you've also just added your own, and said service provider's, internet connections to the list of things that can go wrong. Moving your data to the cloud increases your downtime rather than decreasing it.
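The fail-point argument can be put in numbers. A tiny sketch with made-up availability figures shows why chaining more links can only hurt:

```python
# Independent components in series multiply their availabilities,
# so each extra dependency lowers the total. Figures are made up.
local_disk   = 0.999   # your own storage
cloud_store  = 0.999   # the provider's storage
your_isp     = 0.99    # your internet connection
provider_net = 0.999   # the provider's connectivity

local = local_disk
cloud = cloud_store * your_isp * provider_net
print(f"local: {local:.3%}  cloud: {cloud:.3%}")
# local: 99.900%  cloud: 98.802% -> more links, more downtime
```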

The cloud is fine for non-essential things like personal e-mail or for social sharing but for anything more important than that it's just snake oil.

Blizzard faces court battle for 'misleading Diablo III fans'

Thorfkin
Thumb Up

Wow. Someone forgot to take his happy-pills this morning. The game isn't that bad. Not what I would call a Great like Diablo 2 was but it's still fun.

I do agree that Blizzard screwed up on an epic scale in regards to pretty much everything concerning the game's launch and management but the game itself is decent enough. The fact that a fan died while playing is testament.

Darth Vader is a pansy

Thorfkin
Trollface

Technology Matte Black

You know, I was just thinking the exact opposite. I actually LIKE the matte black finish and I feel all tech should be available in basic black. I was looking through the phones the other day at Best Buy and I couldn't help but notice how many of them failed so spectacularly at visual aesthetics. It's like cell phone manufacturers have completely lost any sense of style or taste. Not one of them was completely matte black. Either they have a back cover that's a hideously different color than the rest, or they have this god-awful silver band around the outer edge (Apple style). I really can't stress enough just what an eyesore most Apple products are and it fizzles my brain to see so many phone manufacturers willfully copying Apple's lack of style.

I still think it should be a crime to use the color silver without an art degree or something XD

LCD to have killed all* other TV technologies by 2016

Thorfkin
Thumb Up

Another Plasma Fanboy

I'll start by saying I own a Panasonic 58 inch plasma screen so I am a bit biased. I did a ton of research before I purchased, so I knew what I was getting into.

I think the predicted death of plasma is premature. LCD is indeed cheaper but black levels are still a problem with LCD screens. Some of the more expensive LCD screens come respectably close to plasma in black-level detail, but by the time you get there you're actually paying more than you would for a plasma. OLED and Sony's Crystal-LED both produce excellent black levels that meet or exceed what plasma can do, but both are 10 times as expensive, so again, plasma is still the winner for people who want excellent quality at a reasonable price. It was only a few months ago I recall seeing Panasonic announce they had produced a flicker-free plasma display at 150+ inches, and as I am a fan of plasma I am definitely interested in seeing larger models come to market.

I love my current Panasonic screen. I use it for my home theater PC. The burn-in that people so often mention when discussing plasma has largely been corrected in the more current models. My set is 2 years old now and there's no visible burn-in despite the fact that I spend hours a day playing games with stationary on-screen interface elements. Just remember to regularly run the anti-image-retention utility that every current plasma set comes with.

AMD and Intel mainstream desktop CPUs

Thorfkin
Thumb Up

I noticed that a lot of you don't seem to see the point of this article. I agree that the article itself didn't touch very well on this. What the writer was trying to compare were processors that have integrated graphics and therefore do not require a discrete graphics card or motherboard-mounted graphics chip. Both Intel and AMD make what AMD calls an APU. The benefit of an APU is that the CPU and graphics chips don't have to communicate with each other over the PCIE bus and can thus exchange information faster using a lot less power. This is perfect for very small form factor machines like Mini-ITX and for laptops. Machines like this are ideal for home theater PC setups or for general office work where heat output and power usage are critical.

I think I saw several people claim that businesses have no interest in graphics. That may technically be true but businesses DO have a very keen interest in power usage. When you have an office building with 1000 employees all using computers, I can guarantee every watt matters. And that's where APUs really shine. They can deliver the performance of a mid-range discrete GPU at literally half the power usage of a comparable CPU/GPU combination.
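A quick back-of-the-envelope calculation shows what "every watt matters" means at that scale; every figure below is my own illustrative assumption, not a measured number:

```python
# Back-of-the-envelope fleet power savings; all figures are
# illustrative assumptions.
seats = 1000
watts_saved = 40            # APU vs. comparable CPU + discrete GPU
hours_per_year = 8 * 250    # 8-hour days, ~250 working days

kwh_saved = seats * watts_saved * hours_per_year / 1000
print(kwh_saved)            # 80,000 kWh per year across the fleet
print(kwh_saved * 0.12)     # ~$9,600/year at an assumed $0.12/kWh
```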

I confess that I myself built my current office machine around AMD's A8-3870. I used an Asus Mini-ITX motherboard with on-board wireless LAN and Bluetooth, and an Antec ISK-300 Mini-ITX case. What I specifically love about this machine is that it costs less to run than a 100 watt light bulb, and when I feel like playing a game I can even run things as current as Skyrim. Not at its maximum detail naturally, but anything older than 2 years will run just fine maxed out.

So here's my opinion on the whole Intel / AMD comparison. If you're going to build a low-power machine like this and you want the ability to run basic games on it, there's one thing to keep in mind: Intel, in its entire long history, has never managed to produce a graphics driver that didn't suck. Intel does make faster CPUs, but I wouldn't bet a dollar on their ability to make a GPU that will work correctly on every game you throw at it, due to their shoddy driver work.

Then there's longevity to consider. AMD uses a unified driver architecture so that only a certain minimum interface between the core driver and the hardware has to be created for each new graphics chip. Their core driver works on every OS from Windows XP to Windows 8 and will likely continue to do so for a long while yet. In addition to that, their driver package supports all their hardware going back at least 5 years. This means you will not only be able to fully utilize your hardware with the upcoming Windows 8, it's also likely you'll be able to do so in 3 or 4 years when Windows 9 comes out. You won't get that from Intel unless you have a discrete GPU in there as well, and then your power usage and heat output are right back up there.

If you want a dedicated gaming machine, go with an Intel CPU and a decent AMD or nVidia graphics card. But if you want a decent office machine that runs cool and quiet while idle but can still deliver some gaming oomph when it's needed, AMD's APU is the clear winner.

Breaking: Megaupload seizures illegal says NZ High Court

Thorfkin
Flame

British Arrogance

I couldn't help but notice how many commenters here seem to think that American authorities treated this case with a "Shoot first and ask questions later" approach. To any American, myself included, this illustrates just how little you know about how things work over here. That gung-ho attitude you seem to attribute to American behavior is a myth. I agree that the seizures were completely illegal, but if you think American authorities didn't fully know this before any action was taken then you've got your head buried up your arse. Money changed hands here. Plain and simple. And judging by the fact that they were willing to risk an international incident, I would say it was a LOT of money. Corruption in politics is something of which we have plenty.

No one watches TV, Nielsen, and you know it

Thorfkin
Flame

Value For Money

I'm of the opinion that television programming providers here in the US are actually on a downward slide that will, at some point in the future, take a sharp dive. Most of my family's older members spend a lot of time watching television, but it seems the younger viewers prefer to watch shows online. I'm rather in the middle on this trend.

I think television providers like Dish Network, Direct TV, Comcast, etc. have forgotten that viewers pay for Value, not for programming. As I've gotten older I've formed the unwavering opinion that the rampant commercialization of today's television content has completely drained the value from that content. So much so that I refuse to pay for cable television. There are channels available that are commercial-free, HBO / Showtime / Cinemax, but none of the content providers will sell you access to just a paid channel by itself without some form of basic or extended package also present on the account. This destroys the value completely. And so I refuse to give them any of my money, opting instead to wait until the show hits blu-ray (if I really like it) or Netflix, which is commercial-free. I'd love it if, say, Dish Network would sell me access to just one premium channel by itself, HBO for example. That would provide genuine value for my money.

I've found it to be an insurmountable challenge to make television providers understand that I am an All or Nothing type of viewer. Either I am paying for the content and it had better be 100% commercial-free, or it costs me nothing at all to view the content and I would expect it to be ad-supported. Under no circumstances will I watch a show that I have to pay for AND that has commercials in it. I've voted with my wallet by not purchasing cable television but my small contribution to this cause is like a fly trying to stop a freight train. I wish more people would stand up and make this opinion known to them.

Anyway, I think that older people are just so used to paying for commercial-infected content that they continue to pay blindly for material that should be totally free to them. I also think that as the older generations pass on and only the more tech-savvy generations are left, TV providers may suddenly find themselves without any customers.

openSUSE 12.2 release delayed, team calls for a rethink

Thorfkin
Thumb Up

It's good that they are re-evaluating their structure. About 4 months ago I finally gave up on OpenSuse and moved to LinuxMint. I changed because of the excessive number of strange bugs I kept running into. Each successive release would fix some bugs but always introduced even more new ones, so the problem just seemed to keep getting worse. The 12.1 release was so bad it was almost unusable on my netbook. LinuxMint isn't perfect but there are a lot fewer obvious defects than I saw in OpenSuse.

Windows 8 Release Preview open for download

Thorfkin
FAIL

Metro Limitations

Can anyone confirm for me either that Metro apps run from the start screen are able to run in classic desktop mode by default, or that Microsoft has enabled the ability to run them at resolutions below 1024x768? I want to use this on my netbook but the arbitrary limitations placed on the Metro interface made it impossible to use Windows 8 on my 1024x600 pixel screen.

Ultrabooks: objects of desire but just too darn expensive

Thorfkin
FAIL

Most people aren't going to want to spend an extra $500 for a machine that can do the same things as a cheap netbook but is a little thinner. The CPU in the ultrabook is faster than a netbook CPU and Intel claims the GPU is DirectX 11 capable, but Intel has been historically unable to produce drivers that can run even basic old games, let alone anything current. So the end result is still a machine that does exactly the same things as a cheap $400 laptop or even cheaper netbook but just weighs less. Worth an extra $500? Not likely!

Sony PlayStation 4 will not be download only

Thorfkin
Thumb Up

I'm glad Sony made this decision. I prefer physical media because where I live the fastest internet connection available is 1.5mbps. It takes days to download a 10GB file and PS4 games are likely to be at least that big and probably bigger.
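For a sense of scale, here's the ideal-case arithmetic; a connection that drops several times a day will do far worse than this:

```python
# Ideal-case download time for a 10 GB file over a 1.5 Mbps line.
size_bits = 10 * 10**9 * 8     # 10 GB expressed in bits
rate_bps = 1.5 * 10**6         # 1.5 Mbps
hours = size_bits / rate_bps / 3600
print(round(hours, 1))         # ~14.8 hours at full, uninterrupted speed
```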

As far as backwards compatibility goes, I would like the PS4 to be backwards compatible but the rumors suggest this is unlikely. The rumors are saying that Sony is switching to an architecture closer to x86 to make coding for the console easier. If this is true it's unlikely the new machine will be able to emulate the cell processor fast enough to make PS3 games work. It could probably emulate PS1 and PS2 games well enough though if they put enough effort into the emulation engine.

It was only the first two or three models of the PS3 that had backwards compatibility with PS2 games. The original 60GB model had hardware support for it and the subsequent 80GB model had software emulation.

Basic instinct: how we used to code

Thorfkin

Old School

I kind of feel the same way. I first learned to program on my C64, starting with BASIC (wrote a light cycles clone), then moved to inline assembly, then full assembly, then machine code. Later, when I got my first PC, I picked up C++ (I miss Borland's Turbo C++ compiler), COBOL, and Pascal. Not because I was any sort of whiz at computer languages so much as they were all close enough that once you learned one, the rest were all just a matter of syntax. I got into it because I wanted to learn to write games like the ones I spent countless hours playing on my C64. Sometimes I feel sad that today's youth won't have that same opportunity to get into computers while it's still a hobby small enough to be enjoyable. These days everything is so abstracted that just getting into programming seems like a daunting task.

Ten... eight-bit classic games

Thorfkin

Re: 8 Bit?

Okay I figured that had to be the case. None of those games displayed 8 bit graphics so I figured it must have been in reference to the CPU instead of the graphics.

Thorfkin

Re: 8 Bit?

Resistance is futile =D Please feel free to correct me if I missed something obvious and stupid =D

Thorfkin

8 Bit?

When you say 8 bit are you referring to the graphics or the CPU? Looking at the list of games, none of them appear to actually have 8-bit graphics so it must be a CPU reference.

Ant Attack: 5 colors counting the black background

Chuckie Egg: 4 colors

Elite: 6 colors

Hitchhiker's Guide: Seriously?

Hungry Horace: 6 colors

Jet Pac: 6 colors

Knights Lore: Cool looking but still only 5 colors

Manic Miner: 6 colors

Phantom Slayer: Definitely 4 color CGA - 2 bit graphics.

Tranz AM: 7 colors

Actual 8-bit graphics had the ability to display up to 256 colors. Meaning each pixel was represented by 8 consecutive bits able to store a number from 0 to 255. For this you had to use the 320x200 VGA resolution, "mode 13h", if I recall correctly. Though it's been forever since I did any computer programming so I can't remember for sure =D. 4-color graphics were CGA and were essentially 2-bit. 16-color graphics were EGA and were 4 bits per pixel. There were also some VGA modes that could only do 16-color graphics but that was due more to a memory limitation and the VESA spec than the actual bits involved in the graphics. With 16 bits per pixel you had a palette of up to 65,536 colors, otherwise known as High Color. This was most commonly found in your Super VGA graphics cards.
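The colors-per-bit-depth relationship is easy to sanity check; a few lines of Python (mine, purely for illustration) reproduce the numbers above:

```python
# Colors displayable at a given bit depth: 2 ** bits_per_pixel.
for bits, name in [(1, "monochrome"), (2, "CGA"), (4, "EGA"),
                   (8, "VGA mode 13h"), (16, "High Color")]:
    print(f"{bits:>2}-bit ({name}): {2 ** bits:,} colors")

# Mode 13h: 320 * 200 pixels at 1 byte each = 64,000 bytes, which is
# why the whole framebuffer fit in one 64 KB real-mode segment.
print(320 * 200)
```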

It just struck me as odd that none of the games you showed here are actually using 8 bit graphics as the title seems to suggest.

People refer to the NES and the Sega Master System as 8-bit but that's a reference to the CPU I suppose because the NES was never capable of more than 4-bit graphics output.
