
Where Will the PC Go? -- Part 4: SaaS

So, per the last couple of posts, I find it entirely possible that, as vendors develop tablets that double as PCs, those tablets may replace traditional desktop and laptop computers. For the common end user who just needs a web browser and (maybe) an office suite, I don't think that's going to be a tough sell.

But there are markets that rely heavily on more powerful computing hardware.

One is PC gamers. Others are the various types of media creators: people who make images, music, and movies.

I've already mentioned dumb terminals and software as a service (SaaS) as a major current trend, with programs like Google Docs running in a browser and working as an effective substitute for traditional locally-run programs like Microsoft Word.

Of course, a word processor is one thing; an enterprise-quality photo editor is another, and a game requiring split-second timing is something else again.

But developers are working on it.


Last year Adobe released a limited beta of a streaming version of Photoshop for ChromeOS. Photoshop itself doesn't run in the browser; the app is a Remote Desktop shell that interacts with an instance of the Windows version of Photoshop running on a remote server.

So, by definition, this is no replacement for the Windows version of Photoshop -- because it is the Windows version of Photoshop. But it demonstrates a potentially compelling alternative to buying expensive, high-end hardware just to run Photoshop: what if you could buy cheap hardware, and pay a subscription fee to run Photoshop on someone else's expensive hardware?

Reactions to the ChromeOS version of Photoshop seemed generally positive; I would expect it to have some latency issues, but I also bet it runs faster on a remote server than it did on the Core 2 I had to use at GoDaddy. (Hey, when I said the Core 2 Duo was the last chip most users ever needed, I said I wasn't including Photoshop.)

Adobe has already moved Photoshop's licensing to a subscription model instead of a purchase model. (A lot of people are very angry about this, but I haven't heard anything to suggest it's led to a drop in "sales"; that's the thing about monopolies.) It's not hard to envision a transition to a subscription model where you run the program remotely instead of locally. Hell, they could even charge more money to give you access to faster servers.

A/V Club

Other media development suites could, potentially, move to streaming services, but there are caveats. Uploading raw, uncompressed digital audio and video files takes a lot more time than uploading images. And what about storing your source files? My grandmother puts together home movies on her iMac, and she's got terabytes of data going back some 15 years. That's the kind of storage requirement an amateur filmmaker can rack up; now think of how much somebody who does it for a living might wind up with. If you're renting storage space on an external server, on a month-to-month basis, that could get pretty costly.

But it's technically feasible, at least, that audio and video editing could be performed on a remote server.

Recording audio is another story. Anything more complex than a simple, single-track voice recording is still going to require specialized mixing hardware. And transferring your recording to a remote server in real-time, without lossy compression? You'd better be sitting on fiber.

So I think we can put "recording studios" -- even the home-office variety, like mine -- into the category of Stuff That's Not Going Anywhere for Awhile.


Moving games to a streaming system is a challenge -- but I'm not sure it's as big a challenge as recording studios. It's more or less the same requirement as Photoshop: take simple inputs from a human interface device, send them to a server, have the server process them and respond accordingly, and stream the video output back to the client. The trick is managing to do that in real-time with minimal loss of audio and video quality. That's the challenge -- but engineers are working on it.

The OnLive streaming service was a failure, but Sony bought it out; it sees value there. nVidia's got its own streaming solution too, in GRID. One of these things is not like the other -- Sony sells consoles at a loss and would stand to benefit from selling cheaper hardware, while nVidia makes a ton of money selling expensive graphics cards to enthusiasts and surely doesn't want to cannibalize its own market -- but obviously there's more than one type of gamer, and the people who shell out over $300 for a graphics card are in the minority.

Now, as minorities go, high-end PC gamers are still a pretty sizable minority; it's still a multibillion-dollar industry. But it's a fraction of the console gaming business, and it's expected to be surpassed by mobile gaming by the end of this year. Like the PC industry as a whole, it's still big and it's still growing, but it's growing a lot slower than other sectors and could be facing a long-term threat from new platforms.

Switching to a streaming platform could have a lot of appeal to game publishers; it combines the simplicity of developing for consoles with the superior hardware capabilities of the PC. Think about the possibility of developing for the latest and greatest hardware, but only for a single specific hardware build.

It would also, at long last, produce a form of DRM that could actually work.

While the industry has tried many, many copy protection schemes over the years, all of them are, sooner or later (and usually sooner), crackable. And there's a simple, logical reason for this: no matter what you do to encrypt the data of your program, you have to give the computer the means to decrypt it, or it won't work. No matter where or how you hide the key, if you give it to your users sooner or later they're going to find it.

But that's only true if the software is running on their computer. If the binary data is never copied to their hard drive, never stored in their memory, if the program is actually stored and run on a remote server somewhere and all the client has access to is a program that takes inputs and streams audio and video? Well, then there's no way they can copy the game, unless they actually break into your servers.

(Which, given Sony's history with Internet security, might not actually be so hard.)

I am not saying this is a good thing; in fact, I consider it something of a nightmare scenario.

Consider every problem you've ever had with an online or digitally-distributed game. Now think of what it would look like if every game had those issues.

Not just latency, lag, server outages, and losing your progress every time your Internet connection goes out. Consider that if a game is no longer profitable, they'll pull the plug. If a developer loses a license, the game(s) associated with it will go away. (Was GoldenEye ever released on Virtual Console? I don't think it was.) If a game gets updated and you liked the old version better, too bad. And remember when Nintendo ended its partnership with GameSpy and killed all the online multiplayer features of every Wii and DS game ever made? Imagine an entire generation's worth of games not working at all anymore, online or otherwise. Even though you paid for them.

Now, there's recent evidence that a strategy like this would fail. The Xbox One is still reeling from customer backlash against early plans to restrict used-game sales and require an always-on Internet connection even for single-player games, even though those plans were never even implemented.

On the other hand, there's evidence that even a wildly unpopular strategy could still succeed. Have you ever heard anyone who doesn't work for EA praise the Origin distribution service (or whatever the fuck they're calling it now)? I know I haven't, but people still use it. Because if you want to play Mass Effect 3 or Dragon Age: Inquisition, your only choices are consoles, Origin, and piracy.

And then there are examples that could go either way: Ubisoft continued to use DRM that required an always-on Internet connection for about two years, from 2010 to 2012, before finally giving in to market backlash.

It's hard to say how existing high-end PC gamers would react if the major publishers tried to force a transition toward streaming games -- or whether high-end PC gamers will continue to be a big enough market for the major publishers to care what they think. But for the foreseeable future, I think PC gaming will continue on much the same as it has for the past 15 years. There could be major changes on the horizon, but I sure don't see them happening in the next 10 years.

Then again, five years ago I was saying there was no way that streaming video would outpace Blu-Ray because there was just no way to stream 1080p video over a home Internet connection. So keep that in mind before trusting any predictions I make.

For Future Reference

For the next time I get locked out of X after an nVidia upgrade:

The OpenSUSE package for nVidia drivers for a GTX570 is x11-video-nvidiaG03.

The OpenSUSE package for the nVidia kernel module for a GTX570 is nvidia-gfxg03-kmp-default.
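
For that matter, assuming the nVidia community repository is already set up (and that those package names haven't changed in whatever release I'm running by then), reinstalling should boil down to something like:

sudo zypper install x11-video-nvidiaG03 nvidia-gfxg03-kmp-default

followed by a reboot.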

nVidia BSoD Fix?

Well, after a year and a half, I think I've finally fixed the constant BSoDs I get when playing games with my nVidia GTX 570.

First, I bit the bullet and used MSI Afterburner to underclock it to 650 MHz. I may not need to keep it that low, but I still got lockups at 690 MHz.

I also added a registry key. Via Mike's Technology and Finance Blog, you can set a key at HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\GraphicsDrivers called TdrLevel.

One of the complaints with Windows (or really any other operating system) is that the screen freezes from time to time. If the screen freezes for more than a few seconds, users are likely to hard reset the machine that they are working on. This seems natural, but in this case the system is still responsive. The graphics processing unit (GPU) is busy processing something (possibly a game, 3D render, or even Windows Aero) and is not actively refreshing the screen.

In Windows Vista SP1 and Windows Server 2008 SP1 Microsoft introduced a feature to help catch and correct this behavior using a feature called "Timeout Detection and Recovery (TDR)." The TDR feature works to identify whether the graphics processor is hung (the default timeout is 2 seconds), and if it is, it prepares to reset the graphics processor and the relevant part of the graphics stack. During this process, it tells the driver not to access the hardware or memory and gives it a short time for currently running threads to leave the driver. If the threads do not leave within the timeout, then the system bug checks with 0x116 VIDEO_TDR_FAILURE. The system can also bug check with VIDEO_TDR_FAILURE if a number of TDR events occur in a short period of time (the default is 5 TDRs in 1 minute). If the TDR is successful, then the user may receive a bubble that says "Display driver stopped responding and has recovered."

TdrLevel should be a REG_DWORD. I set it to 0, to disable checking for TDR entirely.
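
For reference, the same value can be set from an elevated command prompt instead of clicking through regedit; something along these lines should do it (the usual back-up-your-registry disclaimer applies):

reg add "HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers" /v TdrLevel /t REG_DWORD /d 0 /f

Deleting the value again puts TDR back to its default behavior.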

I'm not sure if that helped or not; I think the underclocking was the more important step (as when I set TdrLevel to 0 but didn't underclock, I still got a lockup). But TDR certainly sounds like something that matches my symptoms, as the lockups usually occur with graphical and audio sputtering -- indeed, sometimes I don't get a blue screen at all, the game just sputters to the point of unusability and the system becomes unresponsive.

At any rate, I'm cautiously optimistic; it looks like I've finally got this thing under control and can actually play games under Windows without constant crashes. I didn't notice any performance hit, either, but then again it's not like I'm trying to run Crysis 2. Walking Dead works fine with its settings maxed out, but you don't need a GTX 570 for that.

Now if I could only get OSX running stably with a 64-bit kernel.

PC Gamer's Dilemma

Well, I finally got me an Xbox 360.

It was free. My fiancée got a new computer with one of those student "comes with a free Xbox" deals.

Here's the thing: I've got a pretty solid gaming rig. And another pretty solid media rig. So I haven't felt much need for Xboxin' up to this point.

The advantages and drawbacks of PC gaming are pretty well-documented. A PC can support crazy high-end hardware, but while the games are cheaper, the gear is more expensive and fiddly, and there's a whole lot that can go wrong.

Me, I'm something like a niche of a niche of a niche of a niche -- I run Linux on a Mac Pro as my primary OS and keep Windows around for gaming.

This is pretty cool when it works. But here's the thing: even a good Apple makes for a pretty crummy gaming system.

Last year I bought a pretty high-end Nvidia card. ATI has better Mac support, but I've had nothing but headaches trying to get ATI cards working with Linux. Nvidia's always run smoother for me -- galling considering their total lack of cooperation with Linux and the open-source community, but true.

But it's not an officially-supported card. It works under OSX (as of 10.7.3) but it's not entirely reliable under Windows -- when it gets taxed too heavily, I get a bluescreen.

It happened a few times when I played through Witcher 2, but, perversely, it's given me more trouble on Mass Effect 2 -- a game I had no trouble playing through with all the settings maxed out on a lower-end (but officially-Apple-supported) ATI card.

I thought it might be a heating problem but it occurs, consistently, even when I crank up all my system fans with third-party software.

The game worked fine up until Omega, and then started BSoDing randomly. I managed to recruit Garrus in-between crashes, but by the time it came around to Mordin's quest I couldn't get past loading the corridor.

I could just try some other missions, but seriously, you want me to put off getting Mordin? Hell no.

I've found, from searching, that this appears to be a fairly common problem with ME2, even among people not running eccentric hardware configurations such as mine. And I've found a few suggested fixes, but none have worked for me.

I've tried running the game under WINE on both OSX and Ubuntu. Under OSX it plods (I suspect my helper card may be to blame; maybe I'll try disabling it to make sure my higher-end card is the only one the system's putting a load on); under Ubuntu it runs fine up until the menu screen but then doesn't respond to mouse clicks or keystrokes (other than system stuff like Alt-Tab or Alt-F4). I haven't turned up any other reports of this same problem, so I can't find a fix -- maybe one of these days I'll try a full clean install and see if it still does it. Nuke my WINE settings too if I have to. (Or maybe I could set it up on my fiancée's new computer...)

Needless to say, I haven't tried Mass Effect 3 yet.

And that's before we get into all the DRM bullshit plaguing the PC platform.

Never played Batman: Arkham Asylum, largely because of the SecuROM/GFWL/Steamworks Katamari of Sucktitude. Similarly, I gave Dragon Age 2 a miss once I heard reports of people unable to authenticate their legally-purchased games because they'd been banned from BioWare's forums for saying mean things about EA. (Which obviously totally disproves that EA deserves to be called names.)

It's a great damn time to be a PC gamer for a lot of reasons -- a huge indie scene supported by the likes of Steam and the Humble Indie Bundle, with both pushing more gaming on OSX and even Linux -- but it's a lousy time for other reasons.

Anyway. Now I've got an Xbox. All else being equal, I still prefer to play games on the PC, but for cases where the Xbox has less restrictive DRM (like Arkham Asylum) or titles that aren't available on PC (like Red Dead Redemption) or just shit I can get for under five bucks (like a used copy of Gears of War I just picked up), well, it's kinda cool to have one.

Playing: Batman: Arkham Asylum.

Triple-Booting a Mac Pro

Updated 2007-10-14. Scroll down to where it says "Update 2007-10-14". I'd put a link here, but for some reason b2evolution will not let me use the "id" or "name" attributes; expect a presumably silly and useless "rant" on that subject very, very soon. (Update 2008-01-17: Switching to WordPress fixed the problem.)

So I got that Mac Pro I was talking about earlier. No, I still can't afford the thing, so if you notice me living a life of indentured servitude for the rest of my days, well...I'm Irish. We're used to it.

The bastard about being on the bleeding edge is that there aren't a whole lot of guides to walk you through your setup. For example, I found quite a number of guides on how to multiboot a MacBook Pro with 3 OS's on different partitions of the same drive, but approximately bupkis on how to do it on a Mac Pro with each OS on its own drive.

So, in case anyone winds up Googling for the same information I couldn't quite find, here's how I finally did it. Hopefully this'll make it easier for you than it was for me.

Installation and booting

I can't say for certain, but I think order of drives and order of installation are both important.

After some trial and error, I wound up laying my drives out like this:

Drive 1 is Kubuntu.
Drive 2 is OSX.
Drive 3 is Windows XP x64.

Leastways, that's how they're set up in hardware. For reasons I'm not altogether clear on, they show up in software as Kubuntu on sda, Windows on sdb, and OSX on sdc. Still more curiously, both the Kubuntu drive and the OSX drive are assigned SCSI ID 0,0,0. (Could be some holdover from the old master/slave days? Maybe the drives are on different controllers? Something to do with MBR vs. GPT? Is it because the Kubuntu drive is physically first but the Mac drive boots first? Don't know.)
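
(If you want to see how your drives actually enumerate rather than guessing, something like the following will show you; the output obviously varies from machine to machine. The second command is handy because it maps the sdX names back to actual drive models and serial numbers.)

sudo fdisk -l
ls -l /dev/disk/by-id/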

Order of installation seems to be important too. I say this because my first time through, I installed Kubuntu, it ran fine, and then I installed Windows and Kubuntu wouldn't boot anymore. I'd click on the Linux icon and it would boot the wrong OS. (Actually, it still does; more on that later.) So, as with most things in life, everything was going great until I installed Windows.

But after a day and a half of banging my head against the wall, I finally got all 3 OS's moving by rearranging the drives (see above) and installing Windows first and then Kubuntu. (OSX, of course, was preinstalled.)

Things to keep in mind: since we're talking 64-bit Windows, the Boot Camp assistant is useless; you can ignore it, though it might still come in handy for resizing your OSX partition. Meanwhile, Windows insists, for no reason whatsoever, on writing its boot files to the first drive. I say "for no reason whatsoever" because you can move those files -- boot.ini, ntdetect.com, and ntldr -- to the drive Windows is installed on and it'll run just fine. There's more info at x(perts)64; that guide is specifically for dual-booting XP and Vista, but I found it useful anyway.
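
For what it's worth, here's roughly what the boot.ini on the Windows drive ends up looking like. The rdisk() number and the description string are just examples; yours will depend on which BIOS drive Windows actually landed on.

[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(2)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(2)partition(1)\WINDOWS="Windows XP Professional x64 Edition" /fastdetect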

(Also, "the first drive" noted above is actually the second drive in my case, which caused a good deal of confusion; as I mentioned earlier, both the Kubuntu drive and the Mac drive show up as 0,0,0.)

It's also worth noting that the much-ballyhooed rEFIt doesn't work for me; I have to hold down Option at startup to get a working boot menu.

That menu gives me the following:
rEFIt, Windows, Windows, Windows
because EFI very helpfully assumes anything that's not Mac is Windows.

The first "Windows" is actually Kubuntu. The second gives me "Error loading operating system". I assume that the first "Windows" is the MBR of the drive and the second is the first partition, which is flagged bootable but doesn't have Grub on it.

The third "Windows" is actually Windows.

Now, rEFIt looks similar -- it offers "Boot Mac OS X from Mac", then "Boot Linux from HD", "Boot Legacy OS from HD", "Boot Windows from Partition 1", not always in that order -- but the last three all open the same OS, either Linux or Windows depending on which I booted more recently.

So I'm stuck with holding Option at boot and selecting the left Windows or the right Windows, but at least it works. I'm hoping future versions of rEFIt fix this problem.


Here's where you can find the necessary 64-bit drivers for Windows:

(Sources: Triple Boot thread on the Apple forums; Airport Driver thread on driverguide.com forums)


Boot issues aside, this is the single most painless Linux installation I have ever experienced. I know there's no dearth of people singing the praises of Ubuntu and how close it is to being ready for desktop use, but I'm afraid I'm going to have to add my own redundant voice to the chorus. It was almost painless.

I still had to install the nVidia drivers by hand -- either get us some free drivers that work or stop being so damn concerned about ideological purity, guys; I need support for my video card, and this would make life pretty rough for the average user. But by my standards as a Linux vet...I didn't even have to touch xorg.conf. Kubuntu, how I love you.

Setting up wireless was another concern, especially when I read there was no native support for the adapter and I'd have to use ndiswrapper. Let me explain something about ndiswrapper: it was a bastard to install under Gentoo, and is responsible for every single kernel panic I've experienced in the past year and a half.

Under Kubuntu, on the other hand, it was over in minutes. And I don't want to jinx it, but it hasn't panicked my kernel yet.

There's a HowTo at ubuntuforums.org. Steps 1-3 are outdated now; Feisty comes with a current version of ndiswrapper, so you won't need to update it. As for the bcmwl5.inf file, it's the same one in the Dell package I linked above.
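
If you'd rather skip the GUI tools, the command-line version is short. Something like the following should do it, run from whatever directory you unpacked the Dell driver into:

sudo ndiswrapper -i bcmwl5.inf                # wrap the Windows driver
ndiswrapper -l                                # should report "driver installed"
sudo modprobe ndiswrapper                     # load the module now
echo ndiswrapper | sudo tee -a /etc/modules   # and load it at every boot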

To get wireless to work immediately at boot, you'll also need to set your access point up. In Kubuntu, go to K → System Settings → Network Settings, click "Administrator Mode", enter your password, click wlan0, then Configure Interface, and enter the ESSID and WEP key. (DHCP and "Activate when computer starts" should already be set.)

I will note that on one of my reboots wireless didn't start up automatically and I had to run iwconfig myself. I think that's most likely due to signal interference in my apartment, but I can't say for sure at this point.
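
For the record, bringing it up by hand amounts to something like this (the interface name, ESSID, and key are placeholders, obviously):

sudo iwconfig wlan0 essid "MyNetwork" key s:mywepkey
sudo dhclient wlan0

The s: prefix tells iwconfig the WEP key is an ASCII string rather than hex digits.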

Sound support was the biggest problem I hit. The ALSA driver for Feisty doesn't support the Mac Pro's audio adapter.

After poking around for awhile, I decided that rather than bother with the individual package, I'd just go ahead and upgrade to Gutsy RC. After all, if you've even read this far, I'm guessing you're somebody who's not afraid of the letters "RC"; I'd advise you just to go with Gutsy from the start. (Course, by the time anybody actually reads this guide, I'm betting Gutsy final will be out.)

So far Gutsy's working just fine for me. (Update 2007-10-14: Except that I can't adjust volume from the keyboard. The bar goes between 0 and 11 but doesn't actually make any change in the volume. This appears to be a known bug in Gutsy at the moment.)

I'll edit this post if anything changes or if I find anything else out -- I have a Bluetooth keyboard and Mighty Mouse that I haven't bothered trying to set up in Kubuntu yet; I intended them more for my media center/emulation rig Mac Mini anyway. But if I get those, or anything else, set up, I'll make a note of it here.

Hoping this has been a help to somebody. I don't usually do this, but when I find myself running into problems that aren't well-documented, I figure I may as well document them myself in the hopes that I can make life a little easier for the next guy.

Good luck, next guy.

Reading: Cat's Cradle again, the first in my "My favorite recently-deceased science fiction authors" theme. I think A Wrinkle in Time is probably next.

Update 2007-10-14: Accessing the Mac drive from Kubuntu

It's easy enough to mount an HFS+ volume under Linux (FS type is just "hfsplus" in mount or fstab), but accessing your home directory or mounting with write permissions is a little trickier.
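
The basic mount is something like this (the device name and mount point are whatever applies on your system, of course):

sudo mkdir -p /media/mac
sudo mount -t hfsplus /dev/sdc2 /media/mac

or the equivalent line in /etc/fstab:

/dev/sdc2   /media/mac   hfsplus   defaults   0   0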

To access your home directory on the Mac volume from Kubuntu, your Mac user account and your Linux user account need to have the same UID. There are a number of ways to do this; the easiest involve simply creating a new user, but I changed the UID on my Mac login to 1000 with no real trouble.

Just go into Applications/Utilities and run NetInfo Manager, click Users, then your username, then scroll down to uid and gid and change them both to 1000 (or whatever your UID is under Linux -- 1000 is, of course, the default number for the first user account).

After that, you'll need to log out and back in, pull up a terminal, do a sudo chown -R <username>:<group> /Users/<username>, and then log out and back in again.

My source on all this is the Gentoo wiki (even though I'm using Kubuntu).

That should give you access to your home directory on the Mac drive from Linux. To get write access, though, you'll need to disable journaling; Linux will only mount a journaled HFS+ volume read-only.

It occurred to me that I'd like to keep journaling enabled in OSX and only disable it when I want to access the data from Kubuntu. I came up with a relatively simple solution: I wrote a script to enable journaling when OSX boots, and added a line to the shutdown script to disable it.

For the startup script, I created a directory called /Library/StartupItems/EnableJournaling containing two files, StartupParameters.plist and EnableJournaling, as follows:


StartupParameters.plist:

{
  Description     = "Enable Journaling";
  Provides        = ("Journaling");
  OrderPreference = "Late";
}



EnableJournaling:

#!/bin/sh

. /etc/rc.common

# Enables journaling on the Mac volume at startup
ConsoleMessage "Enabling journaling on /Volumes/Mac"
diskutil enableJournal /Volumes/Mac
exit 0

(Don't forget to make this file executable.)

(Source: Greg Neagle's blog)

And I modified /etc/rc.shutdown to the following:

#!/bin/sh

# Copyright 1997-2004 Apple Computer, Inc.

. /etc/rc.common

if [ -f /etc/rc.shutdown.local ]; then
        sh /etc/rc.shutdown.local
fi

SystemStarter stop

# Disable journaling on the Mac volume so Kubuntu can write to it
diskutil disableJournal /Volumes/Mac

kill -TERM 1

exit 0

Seems to work all right; I get journaling when I'm running OSX, and I get write access when I'm running Kubuntu. (Update 2007-11-05: It appears rc.shutdown is gone in Leopard. I'll update when I learn more.)

The bad news is that it doesn't work both ways. At present I have Kubuntu installed on a ReiserFS volume, which is unsupported by OSX. I could have made it an ext3 FS instead and installed the ext2 driver for OSX, but, well, if I wanted compatibility over performance, I probably wouldn't have gotten a Mac Pro.