In the general sense, the answer is a resounding 'NO'. A laptop may be the better option for most tasks (thanks to its portability, lower power consumption and compact size), but for gaming it falls short in several important ways.
The first and most important aspect is ergonomics. If you don't already know, 'ergonomics' refers to the practice of designing equipment to reduce discomfort and the risk of injury. Laptops aren't usually designed with ergonomics as a priority, because their main focus is portability. Gaming demands quick reflexes and continuous, rapid key presses and mouse movements. Most games are also too cumbersome to play with a laptop's touchpad (you can attach an external mouse, but that chips away at the portability). Even for non-gaming activities, heavy touchpad use can strain the wrist and fingers. Hence, using a laptop as your primary gaming device could cause serious discomfort, and possibly even injury, in the long run.
The next aspect to consider is the laptop's specifications. It is a well-known fact that if your computer can play the latest games satisfactorily, it can pretty much run circles around every other application. Most modern games require fairly powerful hardware to run, and you'd need a monster of a system to actually enjoy the latest games in all their glory, with most of the bells and whistles turned on. To keep power consumption and heat in check, laptops usually use less powerful components. For example, an Nvidia GeForce GTX 780 (a powerful graphics card) performs much better in a desktop than its laptop counterpart, the GTX 780M, which is designed for a much lower power budget. Laptops are also more expensive than desktops in the same performance bracket, owing to their compactness: a laptop with specifications similar to a desktop's will cost significantly more, yet is likely to perform slightly worse.
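To get a rough feel for the gap, here is a minimal back-of-envelope sketch in Python. The core counts and clock speeds are approximate published figures, and raw throughput is only a crude proxy for gaming performance, so treat the numbers as illustrative rather than definitive:

    # Rough single-precision throughput estimate: cores * 2 ops per clock * clock (GHz) = GFLOPS.
    def rough_gflops(cuda_cores, clock_ghz):
        return cuda_cores * 2 * clock_ghz

    desktop_gtx_780 = rough_gflops(2304, 0.86)   # roughly 3960 GFLOPS (approximate figures)
    laptop_gtx_780m = rough_gflops(1536, 0.82)   # roughly 2520 GFLOPS (approximate figures)

    print(laptop_gtx_780m / desktop_gtx_780)     # about 0.64, i.e. roughly two-thirds of the desktop card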
Now comes the part about battery life. Games are extremely resource intensive, so playing the latest titles (on a desktop or a laptop) pushes the hardware to its limits, which in turn consumes significantly more power. Playing games on a laptop that isn't plugged into a power source will therefore drain the battery very quickly. The only viable workarounds are to play only when there is access to a power outlet, or to carry fully charged spare batteries. Neither option is particularly desirable, and in some cases both are downright inconvenient.
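A quick calculation shows why. The battery capacity and power draws below are assumed, illustrative figures rather than measurements, but they are in the right ballpark for a mid-range laptop:

    battery_capacity_wh = 60    # assumed mid-range laptop battery (60 Wh)
    light_use_draw_w = 15       # assumed draw for browsing or word processing
    gaming_draw_w = 120         # assumed draw with CPU and GPU near full load

    print(battery_capacity_wh / light_use_draw_w)  # ~4 hours of light use
    print(battery_capacity_wh / gaming_draw_w)     # ~0.5 hours of gaming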
Here comes the final and most significant drawback of a laptop - the lack of upgradability. Under normal circumstances, only two components of a laptop can be upgraded: the hard disk and the RAM. This means the two components that matter most for gaming performance - the graphics card and the processor - cannot be upgraded at all. In a few exceptional cases it may be possible to upgrade the graphics card (but not the processor), but the process is tedious and time-consuming. Considering the rate at which hardware advances and the fact that game developers tend to push hardware to its limits, this is a really serious limitation of laptops.
Of course, desktops have a few other minor advantages over laptops for gaming, but there isn't much need to cover them here, since those benefits can be obtained on laptops too (with a bit of work). In any case, it should be clear by now that laptops simply can't match desktops in the aspects that matter for gaming - performance, comfort, upgradability and pretty much everything else.
The primary focus of this blog is PC games, but there will also be occasional posts about PC hardware, operating systems and game consoles whenever they relate to PC games in some way. The posts contain my own perspectives and experiences, along with more general observations.
Tuesday, February 4, 2014
PC gaming isn't for everyone - TOTALLY true
Not too long ago, I used to wonder why most people prefer gaming on consoles rather than PCs, even though the PC clearly offers several advantages over consoles. For example, PC games usually have much better graphics than their console counterparts, support mods which extend a game's life well beyond what the developer intended, offer the choice of keyboard-and-mouse or controller, allow you to upgrade only the components you need, and so on. Agreed, a gaming PC costs almost twice as much as a current-gen console, but it can still perform all the usual functions of a PC apart from gaming (which offsets the cost to some extent). It is quite common to hear PC gamers proclaim themselves the 'Master Race' and refer to console gamers as 'Console Peasants'. But the actual reason console gamers don't prefer to game on PCs is not that they can't afford a gaming PC. It's something else entirely. I'll provide a couple of examples (from my own experience) to illustrate this.
I came across a weird problem when I tried to launch a game called 'Dead Space 3' on Windows 8.1, although it used to run perfectly on Windows 7. It displayed an error which said 'Sorry, this game cannot be run in a virtual machine', even though I never had any virtual machine software installed on my PC. I tried various 'fixes' for this issue found on the internet, but none of them worked for me (although some of them had worked for other people). After around 90 minutes of trial and error, I finally got the game to run, but only after completely disabling hardware virtualization in my motherboard's BIOS (a solution that wasn't mentioned anywhere).
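If you ever want to check how Windows sees hardware virtualization before digging around in the BIOS, the built-in systeminfo command reports it. Below is a small, hypothetical Python helper (assuming Python is installed) that simply filters the relevant lines; running systeminfo directly in a command prompt works just as well:

    # Prints Windows' view of hardware virtualization (Windows 8.x).
    # If Hyper-V itself is installed, systeminfo reports "A hypervisor has been detected." instead
    # of the usual Hyper-V requirements list.
    import subprocess

    output = subprocess.run("systeminfo", capture_output=True, text=True).stdout
    for line in output.splitlines():
        if "Virtualization Enabled In Firmware" in line or "hypervisor has been detected" in line:
            print(line.strip())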
Then again, I was unable to launch a game called 'Clive Barker's Jericho' (it showed an error saying 'Clive Barker's Jericho has stopped working and needs to close' whenever I tried to launch it). After a bit of internet research, I found that this was supposedly caused by improper 'Nvidia PhysX' drivers on my PC (which turned out to be only partly accurate). The strange thing is that although hardware-accelerated PhysX is supported only on Nvidia video cards and I have an AMD card, the game still required the PhysX runtime to be installed. I removed the existing version, downloaded the latest one and installed it, but the game still refused to run (although now it was a different issue): it launched fine, played the intro videos and then crashed to the desktop. Again, the internet was my savior (although it took quite a bit of time to find the solution). Apparently, the game only works with an older version of PhysX and doesn't recognize the new one, so I needed to have TWO versions of the Nvidia PhysX software on my PC (a legacy version for this game, and the latest version for other games).
As you can probably see, solving such issues is almost always a hassle. I was probably able to solve them thanks to three things: I hold an engineering degree in Computer Science, I have a passion for computers, and I have been exclusively a PC gamer for more than 15 years. But for a console gamer who only uses the PC for day-to-day tasks (such as internet browsing, word processing and so on), it would be really difficult to troubleshoot these kinds of problems. This is compounded by the fact that even a fix which works for 99 people is not guaranteed to work for the 100th. Although the examples I mentioned are quite extreme and rare, many games do require some sort of tweaking to run satisfactorily. When gamers buy a game, they do so to play and enjoy it immediately, not to spend hours troubleshooting and tweaking it just to make it run properly (with no guarantee of actually resolving the issue). Even game developers can't be blamed entirely for this, because it really is a herculean task to make a game run well on all PCs (owing to the PC's heterogeneous nature).
Hence, the PC is a suitable gaming platform only for hardcore gamers who want the best and also possess moderate-to-high troubleshooting skills. So I guess it's high time we PC gamers stopped calling ourselves 'the master race' and realized that console gamers are just gamers who want to play games without hassles, even if that means giving up several PC-specific advantages.
Sunday, January 26, 2014
How Microsoft shot themselves in the foot with Windows 8
This post isn't directly related to PC gaming, but I reckon it will still be an interesting read. You will find references to gaming, though.
From a purely technical point of view, Windows 8 is just a faster, more efficient and improved version of Windows 7. Microsoft has a history of alternating between good and bad releases of Windows (Windows 98/2000 were good, but Windows ME was terrible; Windows XP was a resounding success, whereas Windows Vista was a broken OS that felt like it had been released while still in beta testing; and Windows 7 is considered the best version of Windows, while Windows 8 was widely panned). But the key difference is that the previous flops (like Windows ME and Windows Vista) had fundamental technical shortcomings, whereas there was technically nothing wrong with Windows 8. So what really went wrong? Read on to find out.
Microsoft had really ambitious plans for Windows 8. The idea was to provide a streamlined experience across all devices running the Windows OS and to create an ecosystem with complete cloud-based backup and synchronization across those devices (PCs, laptops, Windows Phones, Windows tablets and, to a certain extent, the Xbox consoles). Windows 8 even has slightly lower system requirements than Windows 7, and boots up much faster too. On paper, there was little reason for this initiative to fail, but Microsoft still managed to botch it through poor decision making and a hint of arrogance.
Their biggest blunder was forcing the touch-oriented Metro UI down the throats of PC and laptop users without touchscreens as the default interface, and removing the start menu. This UI was really cumbersome to use with a keyboard and mouse or touchpad, and the once-familiar desktop was relegated to just a 'legacy app' on Windows 8. Windows users who had been accustomed to the desktop-and-start-menu interface for over a decade found themselves struggling to perform tasks they could do easily before; in other words, they had to 'learn' Windows all over again. Switching between Metro apps and regular desktop apps was a chore, and at times people were simply confused and lost. To cut a long story short, Windows 8 was a disorganized mess of two entirely different interfaces. I'm sure everyone agrees that this is annoying, but in an office setting it translates into a serious loss of productivity. In Microsoft's defense, this change was necessary to achieve their primary goal, and they did achieve it to a certain extent (I owned a PC and a laptop with Windows 8 as well as a Windows Phone 8 smartphone, so I could appreciate the genuinely good parts). But if they had simply let the user choose the default interface (desktop or Metro) during installation and retained the start menu on the desktop, Windows 8 would have been a resounding success. Agreed, third-party apps like Start8 or Classic Shell can restore the interface of previous Windows versions while keeping the performance and efficiency of Windows 8, but most casual users aren't aware of these options, or may not want to install extra software (or pay a few dollars for it) after buying a brand-new OS at full price.
It was foolishness (and perhaps arrogance) on Microsoft's part to assume that users would lap up their offering despite their stubborn refusal to let them choose the default interface. The fact that third-party apps could restore a start menu and boot straight to the desktop showed that nothing fundamental prevented Microsoft from offering these options themselves. They justified their decisions in several ways and even posted encouraging sales figures for Windows 8. Still, it's an open secret that Windows 8 was a failure, one that made many users resent Microsoft and call for an end to its monopoly on the desktop and laptop market. Valve, whose Steam service is the major digital gaming platform on the PC, was openly critical of Windows 8 (although Steam itself ran on it without issues). In fact, the backlash was so huge that it contributed to the abrupt departure of Steven Sinofsky (president of the Windows division) and was one of the major factors behind Steve Ballmer's decision to retire within a year.
Microsoft still didn't want to give up on Windows 8 and accept defeat, so they began damage control. They announced Windows 8.1, which boasted a host of usability improvements over Windows 8 (although they were still adamant about not bringing back the start menu). People who actually bothered to try Windows 8.1 (myself included) found it a significantly improved experience compared to Windows 8. But they were a minority; the damage was already done, and most users had decided to avoid anything related to Windows 8 like the plague. Windows 8's failure may not have hurt Microsoft's long-term revenue much, but it dealt a massive blow to Microsoft's reputation as a brand.
Microsoft recently revealed plans to release Windows 9 during 2015, and indirectly hinted that they would rather distance themselves from Windows 8. If they manage to restore the usability of Windows 7 while retaining the best features of Windows 8/8.1, Windows 9 could turn out to be a winner. It would also continue Microsoft's tradition of releasing successful operating systems in alternation. In any case, Microsoft is an innovative company that does value customer feedback (the reversal of their initial Xbox One policies is proof of that). So hopefully the negative feedback on Windows 8 was just a wake-up call, and they'll be back with a bang with Windows 9. Fingers crossed!