Techradar
- 'The Daily' iPad 'paper delayed until 2011?
- In Depth: Best Linux backup software: 8 tools on test
- Review: Packard Bell Dot S
- Buying Guide: Best value CrossFire and SLI DX11 graphics cards
'The Daily' iPad 'paper delayed until 2011?

Posted: 27 Nov 2010 06:23 AM PST

The Daily, the planned iPad-only newspaper from Apple and News Corp, may be delayed until the new year. The collaboration between the tech and media giants will bring iPad owners a digital newspaper updated three times a day, and is set to cost just 60 pence per week. However, a delay in the release of Apple iOS 4.3, which would allow for recurring subscriptions in the App Store, could mean that the rumoured December 9th announcement is pushed back.

Jobs on board

Apple's Steve Jobs has reportedly been very much involved in The Daily project, which will have its own editorial team rather than aggregating content from current News Corp publications. Rupert Murdoch has apparently spent $30m on the project, which employs a staff of between 100 and 150 people and will need 800,000 subscribers to be economically viable.

The Daily will have competition from Richard Branson's Project magazine for the iPad, a business, culture and lifestyle publication expected to be announced on November 30th.
In Depth: Best Linux backup software: 8 tools on test

Posted: 27 Nov 2010 04:00 AM PST

Have you been burned before and lost important data? Or do you lose sleep because of the fear of one day joining the ranks of those who have? Fear not, worried and jaded souls – there's a range of Linux backup tools that can help.

Such tools avoid the pitfalls of common data storage strategies, particularly those that begin and end with burning files on to optical media to free up a single hard disk. Such a method won't protect you from random disk failures or accidents, or back up your important configuration and temporary files. Backup tools, however, enable you to identify important files and directories that are then constantly monitored and regularly backed up.

If you back up the same directories regularly, how do you prevent redundancy of data in backup files, though? Well, backup tools can perform incremental backups, which – after making a complete initial imprint of the directory – will then only make copies of new files or those that have changed since the last backup. So, for example, a backup of the Pictures directory on Tuesday will only contain files added or changed since last Saturday's backup.

Most backup tools nowadays also offer to compress your data so you can store it more efficiently. Then there are tools that will encrypt your data when making copies. There are also GUI and command line flavours – so use these pages as a guide to pick the best tool for your needs. So what's the best Linux backup tool? Let's find out.

Pybackpack

Available in most software repositories, Pybackpack is designed to be a friendly backup tool, and is notable for being easy to install manually thanks to its bundled Python installer script. Its dependency list includes Python, PyGlade and PyGTK and a few other readily available tools. Once installed, you'll find it listed under System > Administration as 'File Backup Manager'.

Using Pybackpack is simple too. The default Home tab enables you to back up your /home directory. Clicking Go will burn its contents to a disc. You'll need the nautilus-cd-burner package before you can do this, though.

Most backup tools refer to each backup as a set. You can easily create your own sets via the Backup tab. This features a handy Exclude From Set button too, enabling you to specify particular files to leave out of backups made of a full directory. It's useful for trimming the fat.

When the New Backup Set wizard exits, press the Backup button at the bottom-right of the main window. This will create your backup in the specified directory. Make sure to provide a unique destination path for each backup set. The backup creates two directories in the destination path – home and rdiff-backup-data. The former contains a copy of the files, while the latter holds data about incremental backups, error logs and so on.

Pybackpack remembers each set you create and the files it contains, backing up only new or modified files thereafter. When restoring a backup, you only need to specify the parent directory that contains the two directories. In case of incremental backups, you get the option to restore backups done on a specific date and time.

Verdict: Pybackpack – a lack of active development and compression options hold this back. Rating: 7/10

Fwbackups

With Fwbackups, you can either perform on-demand backups or create sets and task Cron with automatically backing up your data. All this is conveniently offered from a slick graphical interface.
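To give a concrete idea of what 'tasking Cron' with a backup boils down to, here's a minimal crontab sketch – the directory and schedule are invented for illustration, and Fwbackups handles this scheduling for you behind the scenes:

    # Edit the current user's crontab with 'crontab -e' and add a line like this.
    # Fields are: minute hour day-of-month month day-of-week command
    # This copies /home/user/Pictures to /backup/Pictures every Tuesday at 02:00.
    0 2 * * 2  rsync -a /home/user/Pictures/ /backup/Pictures/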
The buttons at the top of the interface enable you to choose which type of backup to use. Click One-Time Backup to create a backup of your data immediately. Note that Fwbackups will not treat this data as a set, so it can't be backed up incrementally. You can choose an archive type, though – be that a tarball, compressed archive, or just a basic copy of the files. All the options are well explained and you can optionally set a Nice value too. This value denotes the importance of a process, and you can use it to prioritise resource allocation. It's especially useful when you're backing up large volumes of data and using compression.

When creating a backup set, in addition to opting for incremental backups, you can also specify Cron settings. In the Configure Set dialog box, click the Times tab to specify backup times. Then Fwbackups will automatically back up any changes made to the files in the specified directory at the times you chose. You can save your backup to a local folder, a USB drive or to a remote server (using SSH). If you so command, Fwbackups will back up all subdirectories and hidden files in the backup path as well.

Back in the main interface, click the Backup Set Now button on the left to create the backup. Use the Restore Set button when you're ready to restore your backups. For incremental backups, you can also select the backup version to restore.

Verdict: Fwbackups – it's fast, with great options and documentation. Highly recommended. Rating: 8/10

Déjà Dup

Duplicity, the command line gem that offers such features as remote backups and encrypted incremental archives, is just too extensive to cover fully here. Still, we've managed to find the best graphical front-end to Duplicity around: the brilliant Déjà Dup. It isn't the only option, though. In fact, if you really insist on using a terminal, try Duply – an Ncurses-based Duplicity front-end.

There aren't many dependencies to worry about here, but you will need NcFTP, which is available in Ubuntu's software repositories. Déjà Dup is also the default Gnome backup tool of Fedora 13. Like all the other tools we've discussed so far, Déjà Dup enables you to store backups on the local filesystem or in a remote location using SSH.

When you launch Déjà Dup from the Applications > System Tools menu, don't let the simplistic interface throw you off. Use Edit > Preferences to fill in settings such as the backup location and the files you'd like to back up. In the Preferences window, check the Automatically Backup On A Regular Schedule box and select Daily, Weekly, Biweekly or Monthly from the How Often To Backup drop-down list.

It's worth noting that Déjà Dup doesn't give you the option to create backup sets, even though it does support incremental backups. You should also know that, depending on the Backup Location specified under Preferences, Déjà Dup only offers the respective backups to restore. For example, if you back up your pictures in a directory called pics and your videos in vids, when the Preferences dialog is pointed to pics, you'll only restore your pictures backup. That's cool, but where's the documentation explaining this, eh?

Verdict: Déjà Dup – a podium contender that offers encryption and incremental backups. Rating: 9/10

Backerupper

Although not available in the software repositories of any big-league distributions yet, Backerupper is still popular, having received extensive blogosphere coverage. The tarball contains an install.sh script if you wish to install Backerupper to disk, but it works just as well without installation.
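Before we get stuck into Backerupper proper, it's worth seeing roughly what a Duplicity front-end like Déjà Dup is driving under the hood. This is only a sketch – the host, paths and passphrase are invented for illustration, and Déjà Dup's exact invocation will differ:

    # The first run creates a full backup; subsequent runs are incremental.
    # Archives are encrypted with GnuPG using the passphrase below.
    export PASSPHRASE="choose-a-strong-passphrase"
    duplicity /home/user/Pictures sftp://user@backuphost//srv/backups/pictures

    # Restore the latest backup into a fresh directory.
    duplicity restore sftp://user@backuphost//srv/backups/pictures /home/user/Pictures-restored
    unset PASSPHRASE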
Simply double-click the backer executable file. The first step when working with Backerupper is to create a profile. To do this, click New. Provide a name and fill in the information required, such as the destination directory. Unfortunately, Backerupper won't back up individual files – it only works with directories. Another shortcoming is that it doesn't perform incremental backups. Still, it does offer to automatically back up a specified directory if you so wish.

When creating a profile, carefully choose the Max Backup Copies value. Instead of creating incremental backups, Backerupper creates a compressed tarball of the specified directory each time it creates a backup. So, with the Max Backup Copies option you get to specify how many older versions of the backup to retain. For example, with Max Backup Copies set to one (the default), a backup.tar.gz containing the Pictures folder would be replaced with backup-1.tar.gz the next time you back up the Pictures directory. If disk space isn't an issue, you may wish to keep at least two or three older backup copies.

With a backup in place, click the Restore tab at the top of the Backerupper window and then select a profile and, if you've set Max Backup Copies to a value of two or more, the associated backup you wish to restore.

Verdict: Backerupper – not actively developed and pales in comparison to others here. Rating: 5/10

Simple Backup Suite

The Simple Backup Suite, or Sbackup, is a set of Python scripts that provide two graphical interfaces: simple-backup-config and simple-restore-gnome. Don't panic if it isn't part of your distro's repository – with its tiny dependency list, it's easy to install, even from source.

The simple-backup-config tool is named somewhat inappropriately, since you use it to create backups as well as for configuration. Once installed, launch it via System > Administration or the terminal. By default, Sbackup is configured to back up your /home, /etc and a few other directories. If you'd rather define your own backup settings, click the Use Custom Backup Settings radio button in the General tab. The tabs at the top enable you to define the files and directories you wish to include or exclude from the backup. The Exclude tab offers you the option to exclude files based on regex matches, file size and file type. You can choose to save your backups to a remote location (SSH or FTP) or a local directory. If you'd like to automate the backups with Cron, click the Time tab and Sbackup will create incremental backups for you.

Unless you wish to create a one-off backup, click Save at the bottom of the window. The settings are saved in the /etc/sbackup.conf file. Please note that Sbackup doesn't create profiles, so /etc/sbackup.conf is overwritten each time you click Save. This means you can't schedule different Cron jobs for a range of backups – your Pictures directory on Tuesday and Videos on Wednesday, for example. The Simple Backup Restore tool under System > Administration identifies different backups by their timestamp. Handily, Sbackup will let you select individual files to restore too.

Verdict: Simple Backup Suite – this isn't designed to be a home solution but it's ideal for system data. Rating: 6/10

Back In Time

Originally intended as a replacement for the scp and rcp tools, rsync is now often used for performing backups. There are many graphical tools that use it, and Back In Time is just one. The project website has extensive installation instructions for Fedora, Ubuntu and Mandriva.
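If you're curious what an rsync-driven snapshot looks like from the command line, here's a minimal sketch of the hard-link technique such tools commonly build on – the paths and dates are invented for illustration, and Back In Time's own on-disk snapshot layout differs:

    # Yesterday's snapshot already exists at /backup/2010-11-26.
    # Unchanged files are hard-linked from it rather than copied again,
    # so the new snapshot only uses disk space for new or modified files.
    rsync -a --delete \
          --link-dest=/backup/2010-11-26 \
          /home/user/Pictures/ /backup/2010-11-27/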
Once installed, you can launch Back In Time from the Applications > System Tools menu on a Gnome desktop. Because Back In Time relies on rsync, it can't be used to back up single files, only directories. You can use Exclude to specify files you don't wish to back up, though.

Back In Time creates snapshots of the directories you want to back up. This means that it copies the entire directory contents into the backup, but only if the contents have changed. So, if you create a snapshot of a directory now, it won't create another 20 minutes later if you haven't made changes. You can and should add a name to each snapshot to enable easy comparison. The diff tool can then be used to compare the different snapshots. To do this, click the Snapshots button on the far-right of the interface. Now pick two snapshots from the Diff With drop-down list. Clicking the Diff With button now will display a comparison of the files in the snapshots.

If you like, you can restore individual files instead of the complete directory. Select a snapshot in the panel on the left, browse to the file you wish to copy in the right-hand panel, and you can either drag and drop or copy the file from here. Alternatively, click the Restore button. This will recreate the directory from the snapshot instantly.

Verdict: Back In Time – very fast. If you don't care for compression, this is a great tool. Rating: 7/10

LuckyBackup

LuckyBackup crams almost all the features of the tools we've covered so far into a single package, while trying to keep its interface clean and simple. Great tooltips and a comprehensive user manual help you to make sense of all that's on offer here. LuckyBackup is probably already available from your distro's software repositories, but the Repositories page on the project's website is the place to go to find out more. Also on offer are 32- and 64-bit packaged binaries for Debian, Ubuntu, Fedora, and others.

When you launch LuckyBackup for the first time, create a new profile. You can then store different backup sets within each profile. For every profile, you must create a task. LuckyBackup treats backing up and restoration as separate tasks. When you create a backup task, check the Also Create A Task For Restore Purposes box. This will reverse the source and destination directories for your backup task for use with the restoration task. LuckyBackup won't be able to restore your backups if you don't use this option.

One particularly great feature you'll find here is the option to do a dry run, which is handled by using the Simulator checkbox in the main LuckyBackup interface. After a test, you can scan the Information Window (bottom panel) and the command output to review your settings and ensure all the files will go where you want. Unfortunately, we don't have enough room to cover all the other features LuckyBackup puts at your fingertips here. You should, however, note that although the option to schedule a task isn't offered when creating a backup task, you can do this via the Schedule button on the task bar.

Verdict: LuckyBackup – poor performance and too many pitfalls hold this back from greatness. Rating: 7/10

Keep

Just like rsync, rdiff-backup is a command line utility to back up a directory to another location, even over a network. It's also similar to rsync in that it has inspired many graphical front-ends, and Keep is our weapon of choice for KDE. What makes rdiff-backup unique and a great backup tool is that, in addition to keeping incremental backups, it also stores the reverse diffs.
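Driven directly from the command line, that behaviour looks roughly like this – the paths and the seven-day restore window are invented for illustration:

    # Each run copies only what has changed and stores reverse diffs
    # in the rdiff-backup-data directory inside the destination.
    rdiff-backup /home/user/Pictures /mnt/backup/Pictures

    # List the available increments, then restore the directory as it
    # looked seven days ago into a separate location.
    rdiff-backup --list-increments /mnt/backup/Pictures
    rdiff-backup -r 7D /mnt/backup/Pictures /home/user/Pictures-7-days-ago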
Suppose you back up a directory that contains 11 files on Thursday. The backup will then contain all 11 files. However, if you delete three files and back up again on the following Thursday, the backup directory will only contain eight files, because the backup reflects the current directory. What if you now wish to recover the three files you deleted? Rdiff-backup stores the changes to a backup, whether incremental or reverse diffs, in the rdiff-backup-data directory, so you can effectively restore the three deleted files even though they aren't in the backup directory.

Click the Add Directory To Backup button to begin. If you wish to pick or leave out specific files, you can use the Inclusion/Exclusion list. Keep enables you to define a unique backup plan for individual directories. While it doesn't support profiles, click the Backup Now button in the main interface and it will present a list of all the configured directories, then ask you to select the ones you wish to back up. When adding a backup directory, click the Use Advanced Configuration checkbox and the Configure button if you wish to define settings such as compression, symbolic links and so on. Restoration is simple too.

Verdict: Keep – easy and fast. Offers compression and good documentation. Rating: 8/10

The best Linux backup tool: Déjà Dup - 9/10

There's no shortage of backup tools available for Linux, but restricting ourselves to those geared towards home users and not including too many graphical front-ends for the same commands brought us to our shortlist here. The stability of all the tools – even those that are yet to reach the big 1.0 milestone – came as a surprise to us. We think it's another factor that finally puts to rest the argument that creating Tar archives of directories, compressing them with Gzip and transferring them to a remote location with SSH or an FTP client is a decent enough backup strategy. While functional, this approach seems archaic when faced with the convenience of a robust program that integrates well with Cron and a compression tool such as Gzip, and often supports many different file storage features too.

After putting all the tools through our tests, we were half tempted to ignore ease of use as a deciding feature. That's because all the tools here, not just the top three, have very appealing and useful interfaces. Despite the barrage of features and options on offer, the tools present all the information and seek user input in a way that won't overwhelm you, no matter what level of expertise you possess.

Triumphant triumvirate

Which, however, is the best? Well, that mostly depends on your needs, but we feel that LuckyBackup, Pybackpack and Back In Time constitute the middleweights in this test. Each is just a feature or two away from being a title contender, and the aspect of all of them we found most disappointing is that they don't offer compression. LuckyBackup in particular is sitting on a virtual gold mine. With just a little love, it could become the all-time best. Its dry run feature is a great idea and we reckon all tools should offer it.

Our winner, though, stands apart from the second and third-placed Fwbackups and Keep, despite sharing quite a few features with both, because it offers encryption. Indeed, Déjà Dup is the only tool for home users here that offers to encrypt files. Sure, restoring encrypted backups with Déjà Dup can seem tricky, especially if you've created multiple encrypted backups.
Panic not, though – to handle them, you simply need to change the Backup Location in the Preferences dialog to whichever backup you wish to restore. For example, if you create a backup of the Pictures directory first and then back up the Videos directory, you'll need to switch the Backup Location back to whatever you specified for the Pictures directory before you attempt to restore it. Remember this rule, and the power of encrypted backups, along with a rich selection of other features, will be at your command.
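For comparison, the 'archaic' strategy the conclusion refers to – a Gzip-compressed Tar archive pushed to a remote machine over SSH – amounts to something like this, with the host and paths invented for illustration:

    # Archive and compress the home directory, streaming it straight
    # to a remote machine over SSH. There's no incremental behaviour,
    # scheduling or encryption here without extra plumbing.
    tar czf - /home/user | ssh user@backuphost "cat > /srv/backups/home-$(date +%F).tar.gz"

It works, but it copies everything on every run – exactly the sort of gap the tools above are designed to fill.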
Review: Packard Bell Dot S

Posted: 27 Nov 2010 02:00 AM PST

Since being bought by Acer in 2008, Packard Bell has gone from strength to strength, releasing a wide range of high-quality laptops and netbooks. The updated Dot S adds to this legacy, combining great usability and unique features at an equally pleasing price.

Acer's involvement can be seen in the design of this device, as it is built around the same chassis as the Aspire One D260. Available in a choice of black or champagne colour schemes, it has a patterned finish that you will either love or hate. Build quality is excellent and the resilient plastics will be more than tough enough to withstand daily family use. The slim chassis can easily be carried in a small bag, with the fantastic 502-minute battery life only beaten here by the Acer Aspire One D260.

The spacious keyboard has all the strengths of the Acer. It is a pleasure to use, with a near flawless typing action. The white keys of the champagne model are easily tarnished, however, so you may prefer to opt for the black model. The spacious touchpad is great to use, with its wide design matching the screen of the Dot S perfectly. A slim scrollbar on the right-hand side of the touchpad makes it easy to scroll vertically through documents and web pages.

The bright 10.1-inch screen shows photos and videos to great effect. The low resolution is common on netbooks and reduces sharpness somewhat, but not drastically so. The screen can also be folded back almost 180 degrees, making it easy to find a comfortable viewing angle.

A unique feature of recent Packard Bell netbooks and laptops is a dedicated social networks key. When pressed, a proprietary software application opens and provides instant access to Facebook, YouTube and Flickr. This is sure to be a hit with younger users.

Photo editing

Packard Bell also trumps the competition by including a full version of Adobe Photoshop Elements software as standard. This excellent entry-level photo editing package makes it easy for first-time buyers to tweak their photos.

The only real flaw of the Dot S is that it shares the same limited storage as the Acer. The 160GB hard drive will be capacious enough to store most collections of music, photos and videos, but is vastly bettered by the 250GB drives of the Asus Eee PC Seashell 1015PE and Dell Inspiron Mini 1018.

Combining a striking consumer design and all-day battery life with great usability and unique features, the Dot S succeeds on almost every level. While its somewhat limited storage may put some buyers off, this is still one of the best netbooks you can currently buy.
Buying Guide: Best value CrossFire and SLI DX11 graphics cards

Posted: 27 Nov 2010 02:00 AM PST

CrossFire and SLI have finally come of age. Just a couple of years ago, you were hard-pushed to squeeze an extra 50 per cent more gaming power out of your system by adding an identical card to the mix. And even then, the setup was often impossibly fiddly. It worked perfectly with some games, and just, well, not at all with others. You'd see diminishing returns on your frame rate investment to the point where you may as well plump for a big, fat, flagship card.

But ATI and Nvidia have both honed the tech to a point where we're now seeing major benefits to the polygon punt – in some cases, close to twice the power for two cards – and it's been a learning process for both outfits. Architectural innovation and API-leveraging aside, they've had to work their driver releases to really get two cards on-song. And just as crucially, they've had to invest more heavily than ever before in close working relationships with game developers. That sort of commitment doesn't come cheap, especially when you consider the tight milestones that hardware, driver and game-development teams all have to work to. But both Nvidia and ATI have grasped the nettle and built these relationships, their engineers jetting off to work on-site with game studios to ensure this stuff actually works. The result is a process of fine-tuning that enables dual-card setups, finally, to fly like they should with the games we all play. And yet, each company seeks to differentiate, offering a slightly different proposition.

Elegant

ATI looks to court the multimedia market with EyeFinity and power-efficiency, while the poke-focussed Nvidia, as ever, seeks to shift more pixels at a greater speed than the competition, and attempts to make 3D Vision a household technology. Our remit for the following shootout, as ever, is to ascertain which twin GPU setup offers the greater value proposition. Both companies have their mid-range DX11 cards on the table now, so let the games commence…

Life with SLI and CrossFire has never been easier. Provided you have a supporting motherboard and you're not limiting the capabilities of these fine DX11-capable GPUs with something more archaic than a decent Core 2 Duo, you'll see some serious gains for having a dual-card setup. And we really don't mean to sound glib when we say that CPUs prior to Core 2 Duo will hold back your graphics card. Not only will an older CPU limit the speed capabilities of a new graphics card, but there's DX11 to take into account. DX11 doesn't just set new standards for tessellation and stream-processor leverage. It aims to make your CPU work harder for the game engine, with multi-threaded rendering. The more cores, and the more threads per core in your CPU, the better gaming performance will be. A fancy Phenom or Core i7 will really take the load off your GPU, and let it do its thing unhindered.

But before we go anywhere near the benchmarks of our stable of cards, and what they mean in terms of a buying decision, let's pause to examine your motivations.

Why a dual-card setup?

In the great flow-chart of whether or not you want two graphics cards, you should first ask yourself why. There are two good reasons to plump for a pair of cards, which depend entirely on your financial situation. The first is the prospect of an inexpensive future upgrade, which requires that you have already bought a motherboard with dual-card capabilities, with future-proofing in mind, you clever thing.
If one card is good for the price, why not save up for a couple of months and add another card to your system for a major increase in frame rates? It's a compelling proposition, especially when retailers are constantly and incrementally driving down the prices of cards to keep up with the competition. And now that SLI and CrossFire offer really solid benefits for doubling up, it's become a more valid motivation than ever before.

The second reason to invest in two cards comes down to overall value in a single purchase. Can you really get more performance from a twin-card setup than from a more expensive single card? That's largely down to your budget, which we'll come to in a while. The purpose of our tests is to find the current best mid-range dual-card setup, so let's examine what the cards offer.

The cards on test

We're looking at paired sets of four distinct cards: ATI's HD 5750 and HD 5770, and Nvidia's GTS 450 and GTX 460. All four of these cards differ in price, performance and capabilities. In a sense, there's never been a richer crop of mid-range cards – and a more confusing series of choices to make. And by mid-range, we mean cards that drive mid-range resolutions – between 1,680 x 1,050 and 1,920 x 1,080 – 22-24-inch monitors, basically. Regardless of every other feature a graphics card may possess, we're primarily interested in gaming performance, and when that's your key metric, the native resolution of your monitor is a major deciding factor. Your price/performance sweet spot lies with the cards that run games happily at your native resolution.

The cheap seats

At the budget end of our scale are the HD 5750 and the GTS 450 – both pared-down versions of the GPUs boasted by their bigger DX11 stablemates. Traditionally, this means fewer cores/stream processors and a lower memory bandwidth than higher-end cards in the series, and this is true of these budget barnstormers as well. They just can't shift as much data as their freer-breathing bigger brothers, which means lower frame rates. The compensating factor, as ever, is price. 1GB examples of both the HD 5750 and the GTS 450 (these are the ones we tested, over their 768MB counterparts) can be found for less than a hundred quid.

The middle brothers to the HD 5750 and GTS 450 are the HD 5770 and GTX 460. Specs-wise, they have exactly what you'd expect: a bigger memory interface with considerably more memory bandwidth, and greater power consumption to boot. Here's where the prices really start to differ as well, with the 1GB version of the HD 5770 coming in around £60 cheaper, on average, than the similarly-equipped GTX 460. So while competition is as tight as can be at the budget end, the upper-midrangers start to pull away, and a different story begins to emerge. Just what is it that makes these graphics cards so different? We ought to look at some benchmarks. But first, a word on compatibility.

Driver issues

The games we used to run our benchmarks were DiRT 2, Far Cry 2, Just Cause 2 and the synthetic DX11 Heaven 2.0 benchmark. The only game in which we saw some curious results from a dual-card setup was DiRT 2 on CrossFire. The difference between ATI's and Nvidia's dual-card setups is that CrossFire often requires Application Profile packages to be installed alongside the driver (although it seems odd that the two can't be combined into a single package). The result, if you haven't downloaded and installed the latest Application Profiles from the AMD website, is a drop in performance in those games that require them.
Our benchmarks were performed using the latest Catalyst and Forceware drivers, and in all other tests with all our other games, everything was tickety-boo. Catalyst 10.9, it appears, doesn't yet have an Application Profile for DiRT 2, which meant we actually saw a dip in performance with two cards, as the game struggled to leverage the hardware. We take these results with a pinch of salt, however. The cards are perfectly capable of running the game in CrossFire mode, and the Catalyst 9.x packages had their own hotfix. One just hasn't been introduced for 10.9 yet, and when it is, we'll see that rise in performance. It's just a little irksome that we need discrete Application Profiles at all with CrossFire, seeing as SLI just works out of the box with commensurate performance gains over one card, and no requirement to download extra software.

So let's start with the humble HD 5750. Alone, the card runs games at a respectable rate at middling resolutions. But at nearly 40fps in Far Cry 2 at 1,920 x 1,080 with high AA and AF, and all the effects turned on? That's great DX10 performance for under a hundred pounds. The card is also whisper-quiet, and while the fan speed scales up to cool the card when the pressure is on, it never really becomes audible. Whack two together in CrossFire, however, and you'll see a 65-90% increase in frame rates across the board, depending on the game, and scores that outperform any single card on test here bar the GTX 460. That high-resolution Far Cry 2 test jumps to 54fps, just shy of the holy grail that is 60fps. And if you're running at 1,680 x 1,050, the setup nearly cracks the 100fps mark. Impressive indeed.

Price-wise, the 5750 sits in direct competition with Nvidia's GTS 450, and here's where the real battle lies. Alone, the 450 offers extremely competent performance at middling resolutions – close on 60fps in Far Cry 2 and 30fps in the considerably more demanding Just Cause 2. As you'd expect for a budget card, it runs out of puff at higher resolutions, but not as quickly as the 5750. Pair it with a twin, however, and the whole game changes. The performance of the 450 in SLI mode at 1,680 x 1,050 is nothing less than stellar: 70fps in DiRT 2, 125fps in Far Cry 2, 57fps in Just Cause 2 and 40fps in the massively demanding Heaven 2.0 benchmark. It not only pushes the HD 5750 CF setup into the shade, it outperforms a single GTX 460 by a very tangible margin. The setup is even capable of driving healthy performance at 1,920 x 1,200, with 98fps in Far Cry 2 and a still-playable 29fps in Just Cause 2. Like any lower memory-bandwidth budget card, it's always going to run out of juice before its bigger stablemates, especially as more demanding games appear. But in terms of mid-range poke, we're frankly astounded.

This means sad things for the HD 5770, which, in CrossFire mode, can barely keep up with the GTS 450 pairing at 1,680 x 1,050. The 5770 handles Just Cause 2's engine more adroitly than the Nvidia equivalent, but only just. What's more, a pair of 5770s will set you back around £50-£70 more than a pair of 450s, which clock in at around £198 and outperform a single 460 at 1,680 x 1,050. Do you feel a conclusion coming on? Well, stop right there. There's more to ATI's 5770 – and indeed its whole HD lineup – than just gaming performance. It's incredibly quiet, power-efficient and it allows you to use ATI EyeFinity for multi-screen setups.
The only downside to the latter is that, as far as gaming goes, you need a much fatter GPU to drive multiple panels at decent frame rates. For gaming on mid-range cards, then, EyeFinity is a bit of a pipe-dream. But if you want a quiet, efficient, cool system that's capable of wholly competent frame rates at mid-range resolutions, then an HD 5770 CF setup is still a serious consideration.

And what of the GTX 460, the most expensive single card on test? The Zotac models we benchmarked clocked in at just a shade cheaper than a pair of GTS 450s, and while a single 460's frame rates at all resolutions couldn't quite compete, it's pairing them up that makes the difference for high-resolution gaming. At £380, you get a properly high-end setup, with serious high-end results. So high, in fact, that a pair of 460s will comfortably outperform a single GTX 480, which will set you back the same price.

The price-performance ratio prize, however, has to go to the pair of GTS 450s in SLI mode. Great mid-range performance for under £200? We'll take two.