Friday, July 3, 2009

Nvidia Ion Platform: Atom gets GeForce

Intel's Atom CPU and the net-product phenomenon it has spawned over the last year have been the fresh talk of the industry in an otherwise pretty regular world. It's not often we see a whole new product segment created and explode - we've covered plenty of netbook and mini-ITX products based on the Atom, but while it's a fantastic little low-power, cost-effective CPU, these products are ultimately let down by its pairing with an old, hot northbridge and a feature-minimised southbridge.

Intel's Atom now comes in a dual-core variety and, even though it lacks the all-important out-of-order execution found in virtually every other modern CPU, it's very inexpensive as a platform and that has made it extremely popular for basic applications.

The 945GC northbridge and its IGP can be described as "basic" at best, the southbridge has only two SATA ports and few USB options, and the whole package doesn't even get a PCI-Express x16 slot - or much in the way of modern PCI-Express at all, to be honest. At $50 for the entire three-chip package, though, it has made for extremely cheap platforms in all sorts of official products - netbooks, nettops and mini-ITX boards - as well as more exotic home builds: Smoothwall firewalls, file servers, powerful routers, NAS boxes, game servers - basically anything SFF you can think of.

Nvidia is looking to capitalise on this popularity by offering its own Atom-capable "mGPU", which won't necessarily lower power consumption - even though it's a single chip rather than a pair - but should offer a whole wealth of extra features and graphics power; next to Intel's IGP, it's like comparing the Starship Enterprise to a canoe.

Nvidia has dubbed this new pairing the "Ion" in an attempt to stick with the chemically inclined "Atom" naming from Intel. We suppose "molecule" doesn't have quite the same ring to it, even though "Ion" does leave us chemists thinking of something static or incomplete.


The differences between the chipsets are pretty epic. The Nvidia GeForce 9400 mGPU has plenty of front side bus overhead for overclocking and tons of extra memory bandwidth for its graphics core, which adds more recent DirectX 10 support, four times the shader capacity and several times the clock speed - not to mention that the Nvidia part does hardware vertex shading, whereas the Intel GMA 950 does not.

While mobile and SFF parts are limited in what they offer, six SATA ports and a PCI-Express x16 slot on a mini-ITX board is not unheard of, and that opens up a bevy of new possibilities in the net-product range - not to mention for enthusiasts who want to fashion cheap and innovative home builds like those we mentioned above.

Nvidia has shown that it can squeeze seven USB 2.0 ports, two eSATA ports, 7.1-channel surround sound audio, Gigabit Ethernet, VGA, dual-link DVI and HDMI (with HDCP) into the tiny pico-ITX form factor (600mL capacity)!

Our concern lies in the fact that the Atom CPU is probably underpowered for such a solution, and outside of mini-ITX motherboards most manufacturers will probably only use a single 1GB DIMM in single channel - but at least it will be DDR2-800 rather than 533MHz.

This is not only to potentially keep its costs in line with the main Atom brand, but also because Microsoft will only license Windows XP to net-products that have a maximum of 1GB of memory included.

Nvidia wasn't able to give us a specific TDP for the GeForce 9400 chipset; we have read claims of a 12W TDP for the notebook part, although whether the same figure applies here we're as yet unsure. 12W would be roughly 10W less than what Intel currently offers from the Atom platform as a whole, and should potentially improve battery life in mobile products.

But will we ever see one?

Enthusiasts will jump at the chance to explore greater options on a really affordable platform, and companies in a saturated netbook/nettop market will also be glad of the extra breathing room to explore new products - the PCI-Express x16 slot, extra SATA and generally more of everything will see to that. But Intel's Atom is made for very inexpensive, light-usage products: email, internet and basic computing. Can the Atom CPU keep up with the demands of a more complex chipset? Or is this simply the closest Nvidia can get to an x86 mobile product, seeing as its ultra-mobile Tegra is ARM based?

Nvidia claims full HD decode and display for screens larger than 10in, but is that even "netbook" territory any more? Desktop Atom products, maybe, because they'll be paired with bigger monitors, but not mobile ones. Can it really decode a Blu-ray movie without dropping frames? Nvidia also highlights the advantages of CUDA - but again, do we think of net-products and want to do video encoding with them? Unlikely.


The biggest hurdle Nvidia faces is Intel and the way it controls its products. For starters, at the time of writing (and ever since launch), manufacturers cannot buy the Atom CPU on its own - it can only be purchased with the chipset as well. This is because Intel wants to protect its low-cost Celeron processors and low-end P43 and G31 chipsets, which afford a much greater feature set.

Like other companies, Intel also guarantees marketing funds for manufacturers that build products from a certain list, and "an Atom product" has been on that list since launch. Whether that changes to mean the Atom CPU only or the whole Atom platform remains to be seen, and we'll be sure to enquire in due course.

Without being able to purchase the CPU on its own, Nvidia's new MCP is basically a non-starter, because it can't make a competitive product in a very price-sensitive market. In many ways, Intel's current insistence on selling the whole platform could put it in an anti-competitive position: it could be argued that while Nvidia has a front side bus licence, Intel is locking it out of the market.

That could well change though - Intel might see it as a way to sell more Atom CPUs should AMD get in on the ultra-portable action (hint: it will), so it's not an unreasonable choice. We have contacted Intel to ask about its 2009 policy towards the Atom package and as soon as we get a reply, we'll let you know.

Final Thoughts

In conclusion, Nvidia has created some much-needed potential for a very restrictive, yet immensely popular platform. The GeForce 9400 MCP is a good part and we can't wait to test the actual product to check how viable the Ion platform is and how well it works with an Atom CPU.

Will Nvidia's Ion be price competitive, and will Intel offer its Atom CPUs on their own? These two factors will determine whether the Ion dream becomes reality, or whether we simply forget about it by breakfast.

Saitek Cyborg Gaming Mouse

Manufacturer: Saitek
UK Price (as reviewed): £31.99 (inc. VAT)
US Price (as reviewed): $59.99 (inc. Delivery)

As far as inputs go, it’s hard to get much more fundamental than the mouse and, as far as gaming devices go, it’s hard to get much more important.

Manufacturers proclaim the fact on the back of their boxes and in all the hype and press releases, and we all roll our eyes and pretend not to be so stupid as to fall for these marketing ploys… but it's true. A good mouse will last you for years and afford you a Zen-like level of comfort that allows you to take your gaming to the next level.

And a bad mouse? Well, using a bad mouse is like playing a Counterstrike clan match on a Commodore 64 with a brick for a mouse and a soggy cake for a keyboard. And no hands.

Just what constitutes a good gaming mouse is something that’s pretty hard to define, and there are loads of companies out there which have tried to perfect the formula and create The Ultimate Peripheral.


Some have come close to succeeding. Others have drowned under their own incompetence. Now, as Saitek takes another bash at making what it thinks the ultimate input should be, we take a look at the Saitek Cyborg Gaming Mouse to see if it can measure up to the task…

Cyborg

The Saitek Cyborg is, as you can probably tell, a mouse that doesn’t want to mess around. It wants to make a statement. It wants to grab your attention. It wants to grab you by the cojones and slap you in the face with them because it doesn’t care what you think.

Which, to be honest, is just as well because when you cut through all the outer layers of flesh and bore down to the cold robotic truth of it all, the Saitek Cyborg is actually quite ugly. Like, really.

It’s red and it’s black and it’s angular and so painfully pseudo-futuristic that just looking at it makes me think that this is the type of thing we’ll be seeing as a joke in the PC version of Fallout 3.


Even the name, Cyborg, is labouring under the impression that gamers love anything even remotely tech-sounding. True, it isn’t as silly sounding as something like the Boomslang or the Deathadder, but at least Razer has an on-going theme and mice that are actually pretty good despite the name. The Saitek Cyborg on the other hand still has to prove that it can handle itself in that regard.

In fact, when you really examine it, the Saitek Cyborg isn't just ugly looking - it's uncomfy looking. The thumb rest especially, with its jutting overhang and flat under-pad.

Still, putting all that aside for the moment, the Saitek Cyborg does have some things going for it, especially in the feature orientated department. Literally everything on the Cyborg can be controlled, from the resistance of the scroll wheel to the very size of the mouse itself.

So, this is a mouse with more buttons than style, right? A typical example of a feature list dominating the product design, to the detriment of the aesthetic, yeah? Well, maybe, but that isn't necessarily a bad thing - just because the mouse is a little lacking in the looks department doesn't mean that it's a bad mouse in the end.

HP Pavilion dv2-1030ea 12in Ultraportable

Manufacturer: HP
UK Price (as reviewed): £599.00 (inc. VAT)
US Price (as reviewed): $699.99 (ex. Tax)

Netbooks have been one of the big crazes (in computing at least) over the past 20 months or so and they were the cause of a massive spike in laptop sales last year. It's amazing to think how far we've come in such a short space of time when you look back at the first generation of these dinky little devices.

Oh yes, we're talking about the iconic Eee PC 701, which made many compromises but hit a very low price point. In the wake of its success, Asus continued to bang the netbook drum while many other manufacturers joined the party. Intel even developed a CPU specifically targeted at netbooks and Mobile Internet Devices, and it wasn't long before there were some particularly attractive netbooks on the market.

The Acer Aspire One was brilliant, as was the Samsung NC10, while Eee PCs such as the S101 certainly caught our eye. But they're all suffering from feature creep and, with Nvidia's Ion platform just around the corner, that feature creep is going to continue.


What this has proven is that what consumers really wanted was a cheap, portable and capable laptop all along. This is where AMD hopes its Yukon platform fits in.

AMD first talked about its new platform at the Consumer Electronics Show in January, and the chip maker made some pretty bold claims at the time. These were naturally treated with some scepticism on our part, and we came away feeling a little underwhelmed given that AMD itself had said the next generation of the platform wouldn't be all that far behind - but that's not here yet.

What we do have here though is HP's Pavilion dv2 ultra portable notebook – it's based on the Yukon platform which, in this instance, sports a single core Athlon Neo MV-40 processor (1.6GHz, 512KB L2 cache) and the AMD 690T/SB600 chipset, which features an ATI Mobility Radeon HD 3410 integrated graphics processor. There are a number of different processor options available, but the Athlon Neo MV-40 is HP's weapon of choice in the dv2.


According to AMD, the targets for Yukon are devices like the dv2, which are a cut above current netbooks, and it hopes Yukon will fit into form factors typically associated with high-end ultra-portable notebooks like the MacBook Air, ThinkPad X301 and Vaio TZ series. However, unlike those high-end machines, Yukon-based notebooks promise to shed one of the biggest turn-offs associated with ultra-portables - they'll be affordable.

Rather than just calling these lightweight, affordable laptops 'affordable ultra-portables' though, AMD felt the need to try and introduce a new class of device known as the 'ultra-thin notebook.' From what we can see, there are no hard and fast rules to differentiate this from the typically expensive ultra-portable laptops apart from price, so we really don't understand the need for a new segment in an already oversaturated market.

Specification Summary:

  • AMD Athlon Neo MV-40 processor (1.6GHz, single core, 512KB L2 cache)
  • AMD 690T/SB600 chipset with ATI Mobility Radeon HD 3410 graphics
  • 2GB 640MHz DDR2-SDRAM
  • Glossy 12.1-inch LED-backlit display (1,280 x 800 native resolution)
  • 320GB Western Digital Scorpio Blue 5,400rpm hard drive
  • Three USB 2.0 ports, two 3.5mm audio jacks (headphone and microphone), 10/100 Ethernet, HDMI and D-SUB
  • Integrated 802.11b/g wireless and Bluetooth 2.0
  • Integrated five-in-one media card reader (SD/MS/MS Pro/MMC/XD)
  • Stereo speakers, built-in webcam with microphone
  • Removable four-cell 2,900mAh Lithium-Ion battery
  • Windows Vista Home Premium Service Pack 1

Thursday, July 2, 2009

Radeon HD 4890 vs GeForce GTX 275

ATI Radeon HD 4890 1GB

Manufacturer: AMD
UK Price (as reviewed): Typical price £220 (inc. VAT)
US Price (as reviewed): Typical price $249.99 (ex. Tax)

Nvidia GeForce GTX 275 896MB

Manufacturer: Nvidia
UK Price (as reviewed): £220 (inc. VAT)
US Price (as reviewed): MSRP $249 (ex. Tax)

Introducing the new competition from ATI and Nvidia

It’s been a while since we’ve seen ATI and Nvidia scrap it out with two comparably priced cards launching on the same day, but this month they’re right back at it. The Radeon HD 4890 is best seen as an overclocked 1GB Radeon HD 4870 – the architecture is the same and it’s still built using a 55nm process, but ATI has slightly stretched the design to widen the internal copper interconnects and put a crucial few more atoms of silicon between the transistors. The stretched design reduces transistor power leakage and strengthens signal integrity, which in turn allows the chip to run at higher frequencies.

Stock clocked Radeon HD 4890s ship with a GPU clocked at 850MHz, which is 100MHz faster than the Radeon HD 4870's engine clock, and the Qimonda GDDR5 memory is now running at 975MHz (3.9GHz effective) rather than 900MHz (3.6GHz effective). ATI claims that the RV790 GPU can be easily overclocked to 1GHz and beyond (with watercooling), however we're dubious about how much the stock cooler can take.


With the new GeForce GTX 275, Nvidia is sticking to its strategy of modifying its 55nm GT200 GPU to suit whichever price point it fancies hitting. The GTX 275 therefore has 240 stream processors and a 448-bit memory interface. This means that the GTX 275 is very similar to the GPUs used in the GeForce GTX 295, differing only in clock speeds.

The twin GPUs in the GTX 295 operate at 576MHz, compared to the GTX 275, which is clocked at 633MHz; likewise, the stream processors of the GTX 295 run at 1,242MHz rather than the 1,404MHz of the GTX 275. Memory is still GDDR3, and the GTX 275 has 896MB of it running at 1,134MHz (2,268MHz effective) rather than the 999MHz (1,998MHz effective) of the GTX 295. Why is the GTX 275 GPU faster than those of the GTX 295? Because it has a dedicated dual-slot heatsink for just one GPU, two 6-pin power connectors all to itself and more PCB space for better power hardware underneath as well.
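For context, the peak memory bandwidth each configuration yields is easy to compute from those figures. Here's a minimal Python sketch using the effective clocks and the 448-bit bus width quoted above:

```python
# Peak memory bandwidth = effective memory clock x bus width in bytes.
def peak_bandwidth_gbs(effective_mhz: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    bytes_per_transfer = bus_width_bits / 8       # 448-bit bus -> 56 bytes
    return effective_mhz * 1e6 * bytes_per_transfer / 1e9

print(peak_bandwidth_gbs(2268, 448))  # GTX 275: ~127.0GB/s
print(peak_bandwidth_gbs(1998, 448))  # GTX 295, per GPU: ~111.9GB/s
```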

In terms of price, neither of these new cards will break the bank - both cost roughly £220, give or take a few pounds. However, Palit has produced an own-design version of the GTX 275 for £199.99 inc. VAT on launch day. As we were given a reference card to test, the following data doesn't apply to that Palit card, so any price/performance comparisons we make in this article are based on prices for reference-design GTX 275 cards.

To see how the two new GPUs stack up in the line-ups of the two graphics companies, we've knocked together the handy specs table above, and we've also got details of a few partner cards over the next two pages. Unfortunately, Nvidia was less "ready" than AMD at this launch - playing the role of the reactionary party, it could only supply us with a reference card in time. In comparison, no fewer than five AMD partners got us Radeon HD 4890 cards for launch day, although one had an issue with its artwork (no, not a "wardrobe malfunction").

Nvidia GeForce GTX 275

Core Clock: 633MHz
Shader Clock: 1,404MHz
Memory Clock: 2,268MHz (effective)
Memory: 896MB GDDR3

You'd be easily forgiven for mistaking Nvidia's GeForce GTX 275 for a GeForce GTX 285 without seeing a sticker. Both use two 6-pin power connectors, both are the same length and both are very black. The only differences are what appears to be an optimised (cost-reduced) PCB and two fewer DRAM chips, thanks to the narrower memory bus and smaller total memory size.


The cooler has been changed to more closely match that of the GeForce GTS 250 (the rebranded 9800 GTX): it's a single moulded piece of plastic, instead of having the metal grill on the outside edge. Like all current GT200 derivatives, sound over HDMI still requires an S/PDIF pass-through cable plugged in near the power connectors.

Early Look: Asus Maximus III Formula

Manufacturer: Asus

Asus' latest addition to its Republic of Gamers line of motherboards has popped its head above the clouds, in the form of the third revision of the Maximus. Having first launched on X38 with some dodgy mishmashed blue heatsinks, the revised Maximus II Formula on P45 was very much loved here at bit-tech.

The P55-based Maximus III Formula is the newest model in the Republic of Gamers line and follows on from the Maximus II Formula quite closely in design. As we can see from the pictures, the red and black heatsinks make a welcome return.

The southbridge gets a nice fat heatsink, even though the P55 needs next to no cooling (we've seen it running without any cooling at all). The MOSFET heatsinks surrounding the CPU socket are particularly low profile, with a fat, flat heatpipe circling two sides of the socket into a larger central heatsink where the northbridge would traditionally sit on other boards. In this case though, nothing is cooled underneath it - we expect some backlit bling to illuminate the RoG logo as usual.

The heatpipe doesn't cover the top heatsink, although both heatsinks can be unscrewed and replaced at the discretion of the end user. We'd probably expect to see the whole lot go if people do want to replace it to be honest - the central heatsink will do little but get in the way for more elaborate cooling setups.


The four DDR3 DIMM slots for dual-channel memory get three phases of power regulation, while the CPU has an odd count of 19, although we've yet to confirm whether either is a "real" phase count. Instead of a single fat Fujitsu capacitor like a few previous RoG boards (the Rampage Extreme, for example), Asus has gone for many, many smaller capacitors to smooth out the current flow.

This is especially important considering the load that PCI-Express (including multi-GPU setups), a memory controller and four CPU cores could potentially place on this area. We can only guess at which capacitors will be used, but we expect the usual 50,000-hour solid aluminium-capped Fujitsu parts, like previous RoG boards.

There's space for the Intel Braidwood NAND Flash socket under the memory slots - we expect the Maximus III Extreme based on P57 to have that, but its usefulness has yet to be determined. Since the two (P55 and P57) are socket compatible with only this feature differentiating them, it makes sense to have just one PCB design for each.


Of the board's three PCI-Express graphics slots, the two red ones run as either a single x16 or dual x8s at PCIe 2.0 bandwidth, and will be suitable for both SLI and CrossFire multi-GPU setups. With Intel continuing to be stingy with PCI-Express on the P55 - referred to as "ICH10.5R" by some motherboard manufacturers - the bottom slot is only an x4, and we expect it will disable the two PCI-Express x1 slots above it if used.

Of those PCI-Express x1 slots, the uppermost one is both the most useful, sitting above the primary graphics slot, and also the most useless, as it backs right into the central heatsink. In this respect, we can see the central heatsink being removed - sod the heatpipe. With that said, this slot will also be used by Asus' (supplied) add-in audio card, which we've no details on at all yet - we can only guess at similar support to the last: software X-Fi and an ADI chipset, perhaps. Finally, two PCI slots still get squeezed in as well, making an impressive total of seven expansion slots.

See the MemOK button above? Well, Asus can't test every conceivable memory kit out there, and sometimes you can try a new kit six months down the line and it just won't boot. The MemOK function is Asus' proprietary way of forcing the board to try all sorts of different memory configurations to achieve a successful POST. It'll sit and cycle through a set of iterative steps for a while, then finally kick in with compatible settings - it's a welcome fail-safe feature.

DDR3: Kingston and OCZ at 1333MHz

We had a first look at DDR3 performance back in May with an engineering sample of Corsair's DDR3 DHX memory. However, as some of the first DDR3 modules to roll off the production line, Corsair's modules were rated at CAS-9 latency, which is pretty high in comparison to the lower-latency CAS-7 modules we have on test today.

All right, let's not start whining about "latencies going up" or "how is seven low?", because it doesn't work quite like that. DDR prefetches two bits of data per clock cycle, DDR2 prefetches four and DDR3 now prefetches eight.

In addition to the change of topology making accesses take longer - the signal now passes through every bank - the extra latency in clocks is simply a consequence of more data being sent per clock and frequencies being scaled up.

The latency criticism is true to a certain extent, because 1,333MHz equates to 0.75ns per clock, which makes seven CAS clocks 5.25ns, whereas at 800MHz each clock is a longer 1.25ns but just three CAS clocks total 3.75ns. You have to drop the 1,333MHz CAS latency down to CAS-5 to reach the same 3.75ns, but you've also got nearly twice as much available bandwidth: 11,000MB/s versus 6,400MB/s.
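To make that arithmetic easy to check, here's a minimal Python sketch; it follows the article's convention of treating the quoted module speed as the clock used for the CAS maths:

```python
# First-word latency and peak bandwidth, following the article's convention
# of treating the quoted DDR rating (e.g. 1,333MHz) as the CAS reference clock.
def cas_latency_ns(rated_mhz: float, cas_clocks: int) -> float:
    clock_period_ns = 1000.0 / rated_mhz   # 1,333MHz -> ~0.75ns per clock
    return cas_clocks * clock_period_ns

def peak_bandwidth_mbs(rated_mhz: float) -> float:
    return rated_mhz * 8                   # 64-bit DIMM moves 8 bytes per transfer

print(cas_latency_ns(1333, 7))  # CAS-7 at 1,333MHz: ~5.25ns
print(cas_latency_ns(800, 3))   # CAS-3 at 800MHz:   ~3.75ns
print(cas_latency_ns(1333, 5))  # CAS-5 at 1,333MHz: ~3.75ns - parity restored
print(peak_bandwidth_mbs(1333)) # ~10,664MB/s (the "11,000" above)
print(peak_bandwidth_mbs(800))  # 6,400MB/s
```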

This means DDR3 should work better in high-throughput, longer-latency scenarios. It's somewhat ironic, then, that this kind of memory suited the long pipeline of the old Pentium 4, whereas in our experience the Core 2 architecture works better with short, sharp accesses in a low-latency memory environment rather than needing a ton of bandwidth. The Core 2 has a ton of cache and far better pre-fetchers than anything in the P4, which are meant to "hide" memory access latency - however, these pre-fetchers aren't perfect.

Today, we've got modules from Kingston and OCZ on the test bench and without further ado, let's have a look at the modules...

Kingston HyperX KHX11000D3LLK2/2G

Manufacturer: Kingston
UK Price (as reviewed): £302.68 (inc. VAT)
US Price (as reviewed): $452 (ex. Tax)

The string of letters and numbers in Kingston's model name translates to a PC3-11000 "low latency" 2GB DDR3 kit. It's strange, because even though the industry continues to use PC-bandwidth numbers to represent module speed, fewer and fewer consumers and businesses use them, as we're not bandwidth limited in this day and age. PC3-11000 translates to a memory clock of 1,375MHz, rather than the official JEDEC speed of 1,333MHz - Kingston either feels that 10,666 doesn't quite have the same ring to it, or it wanted a slight clock edge over its rivals.


Kingston does have an ultra-low latency (UL) pair of 1,375MHz modules rated at an eye-watering 5-7-5-15, instead of the 7-7-7-20 of the modules we have here - but what stops us just overvolting and overclocking these LL sticks and saving a ton of cash instead? In actual fact, during testing we managed to drop the timings down to 5-6-5-15-1T at 1.85V, fully stable, but only at 1,066MHz rather than 1,333MHz.

Kingston HyperX KHX11000D3LLK2/2G Details:

Kit: 2 x 240-pin DDR3 Double Sided DIMM
Module Size: 2GB Dual Channel Kit (2 x 1GB)
Module Code: Kingston HyperX KHX11000D3LLK2/2G
Rated Speed: 1,375MHz DDR3
Rated Timings: 7-7-7-20 (CAS-tRCD-tRP-tRAS)
Rated Voltage: 1.7V
Memory Chips: Elpida


Kingston uses the same style of heat-spreaders that it sells across the entire HyperX series, simply swapping the DDR2 branding for DDR3. There's a new, futuristic style of sleek heat-spreader shown on its website, but it's disappointingly not used on these modules.

Elpida DRAM chips are used on these modules, which seems a common choice given many other companies use either these or Qimonda chips for 1,066MHz and 1,333MHz modules. The DDR3 Micron D9s seem to be the preferred choice for enthusiast modules like the Corsair Dominator and DHX ranges, so we can only assume there might not be much headroom in these modules. However, low latency might be a more appropriate aim for Core 2 CPUs, like it is with DDR2 (since Intel is the only one supporting DDR3 until sometime next year).

Memory: Is more always better?

Have you ever wondered what all of the fuss is about when it comes to memory? Memory manufacturers are going out of their way to sell you what they consider to be the best memory on the market for gamers, while also trying to push you into spending more money on more memory because - according to the memory makers - 2GB of memory is now becoming the industry standard for gaming systems.

The question is whether doubling your current 1GB configuration to 2GB will benefit your gaming experience or not. There are several routes to take, too. Should you keep your current 512MB DIMMs and add another two for a total of 2GB, accepting the drawbacks (if there are any) of using the slower 2T command rate?

Or, should you attempt to sell your current modules and purchase a pair of 1GB DIMMs?

Alternatively, could you get away with just sticking with your existing setup?

There are so many options on the market today, and we're going to attempt to answer these questions over the next few pages, while deconstructing some of the confusion that surrounds memory timings.

What does all of the jargon mean?

In the current computer market, where comprehending the intricacies of the latest generation of video cards requires a great deal of stamina, it's easy to be deceived by memory and its many names - yet it's actually one of the simplest components in your system, and understanding what's best is easy.

Fred's driving along the road in his car when he spots something ahead and has to stop in a hurry - how long does it take him to stop? There are two important components to this problem. The first is the most obvious: assuming something about his brakes, tyres and road surface, his speed will determine how far he travels once he presses the brake pedal.

The second, more subtle component is his reaction time - how long it takes him to press that pedal. Actual experienced memory bandwidth is determined by two analogous factors: how quickly data can be transferred from the memory (memory speed), and how long it takes that transfer to start (memory latency).

Here's a quick primer in memory jargon:

Memory Speed:

The link between the CPU and memory is called the memory bus. Often it runs at the same speed as the Front Side Bus (FSB), which regulates communication between the CPU and lots of other system components. The newer Intel Pentium processors seem to run better with the memory on an asynchronous 4:5 memory divider, meaning that the memory runs faster than the front side bus. Bus speeds are measured in MHz, or millions of clock cycles per second.
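As an illustration, here's a small Python sketch of how an asynchronous divider works; the 4:5 ratio is the one mentioned above, and the 266MHz FSB figure is just an example:

```python
# With a 4:5 divider the memory bus completes five clocks for every
# four FSB clocks, so the memory runs faster than the FSB.
def memory_clock_mhz(fsb_mhz: float, fsb_ratio: int = 4, mem_ratio: int = 5) -> float:
    return fsb_mhz * mem_ratio / fsb_ratio

print(memory_clock_mhz(266))  # 266MHz FSB -> 332.5MHz memory clock
```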

Modern processors transfer eight bytes (64 bits) of data on every clock cycle, and all Athlon 64 and older Socket 478 Pentium 4 CPUs run with a 200MHz memory bus. Newer Intel Pentium CPUs that use the LGA775 socket use either a 266MHz or 333MHz memory bus, depending on whether they're an Extreme Edition or not - standard Pentium CPUs use a 266MHz memory bus, while Extreme Editions use a 333MHz bus.

Multiply 400 (200 times two, as Double Data Rate (DDR) memory transfers data at twice the clock speed) by eight and you get a theoretical maximum of 3,200MB/s - hence the PC3200 speed rating found on the label of most new sticks of DDR memory. With the newer Pentium CPUs, you will see modules labelled PC2-4200 (DDR2-533), PC2-5400 (DDR2-667) and up to PC2-8000 (DDR2-1000).
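The same label maths works for any 64-bit DIMM - a quick sketch (the comments note how the results get rounded for marketing):

```python
# PC speed rating: effective clock (in MHz) x 8 bytes per 64-bit transfer.
def pc_rating_mbs(effective_mhz: float) -> float:
    return effective_mhz * 8

print(pc_rating_mbs(400))   # DDR-400   -> 3,200MB/s, sold as PC3200
print(pc_rating_mbs(533))   # DDR2-533  -> ~4,264MB/s, rounded to PC2-4200
print(pc_rating_mbs(667))   # DDR2-667  -> ~5,336MB/s, rounded to PC2-5400
print(pc_rating_mbs(1000))  # DDR2-1000 -> 8,000MB/s, sold as PC2-8000
```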

Memory Latency:

Addressing memory is much like reading from a large, multiple page spreadsheet. It doesn't matter how quickly you can read, before you can start you have to find the page the data you want is on (this is known as tRAS), work your way to the row and column the data's stored on (tRCD), when you've found the cell you want it takes some time before you start reading (CAS) and when you get to the end of a row you have to switch to the next, which takes time (tRP).

tRAS is the time required between the bank active command and the precharge command - or in simpler terms, how long the module must wait before the next memory access can start. It doesn't have a great impact on performance, but it can affect system stability if set incorrectly. The optimal setting ultimately depends on your platform - the best thing to do is to run Memtest86 with various tRAS settings to find the fastest stable value for your system.

The tRCD timing is the number of clock cycles taken between the issuing of the active command and the read/write command. In this time, the internal row signal settles enough for the charge sensor to amplify it. The lower this is set, the better - the optimal setting is usually 2 or 3, depending on how capable your memory is. As with any other memory timing, setting it too low for your memory can cause system instability.

CAS latency is the delay, in clock cycles, between sending a READ command and the moment the first piece of data is available on the outputs. Achieving CAS 2.0 seems to be the holy grail for memory manufacturers, but the trade-off between tight timings and high memory bus speeds is an argument that we hope to settle over the course of this article.

The tRP timing is the number of clock cycles taken between the issuing of a precharge command and the next active command. It could also be described as the delay required between deactivating the current row and selecting the next one. In conjunction with tRCD - the time taken between the active command and the read/write command - it determines how long it takes to switch banks (or rows) and select the next cell for reading, writing or refreshing.

The Command Rate is another timing that affects maximum theoretical memory bandwidth. It's the time needed between a chip being selected and commands being issued to it. Typically this is either 1 or 2 clocks, depending on a number of factors, including the number of memory modules installed, the number of banks and the quality of the modules you've purchased. The majority of memory available today is claimed to run at the faster 1T setting.

Memory latencies are normally quoted in the format CAS-tRCD-tRP-tRAS plus Command Rate, an example being 3.0-4-4-8 1T, with the numbers corresponding to the individual latencies in clock cycles. Lower numbers are better, though as a rule of thumb tRAS should equal CAS plus tRCD plus 2.
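To tie the jargon together, here's a small Python sketch that converts a quoted timing set into nanoseconds (assuming a DDR400-class module for the example) and computes the rule-of-thumb tRAS:

```python
# Convert a timing set into nanoseconds at the module's rated speed and
# compute the rule-of-thumb tRAS (CAS + tRCD + 2) mentioned above.
def timings_in_ns(rated_mhz, cas, trcd, trp, tras):
    period = 1000.0 / rated_mhz  # ns per clock at the rated speed
    return {name: clocks * period
            for name, clocks in [("CAS", cas), ("tRCD", trcd),
                                 ("tRP", trp), ("tRAS", tras)]}

cas, trcd, trp, tras = 3.0, 4, 4, 8               # the 3.0-4-4-8 1T example
print(timings_in_ns(400, cas, trcd, trp, tras))   # e.g. CAS = 7.5ns at DDR400
print("rule-of-thumb tRAS:", cas + trcd + 2)      # suggests 9 clocks here
```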


The differences between the 2x1GB and 4x512MB, and essentially the differences between using a 1T command rate versus the slower 2T timing, are relatively small in a selection of today's most popular games. However, there were instances where we found that we were able to play games with less frame rate lag and hitching as a result of having 2GB of memory using the 1T command rate timing.

However, the bottom line is that you will not see the same 10% performance increase in real games that we saw in SiSoft Sandra.

Memory timings will make similarly subtle performance differences too, so it's a question of whether you can afford the faster modules that are capable of using tighter timings. The choice will ultimately depend on what memory you're currently using in your system and also how much you're willing to spend.

We'd recommend making the upgrade to 4x512MB over 2x512MB, even with the slight drawbacks we experienced in one of the four games we tested. That's because there are many other uses for your computer (aside from gaming) where you'll see the benefit of 2GB of memory. The general desktop experience is improved by 2GB of RAM, and I'm sure you'll never look back if you make the jump.

If you're willing to take a bit of a gamble (dependent on whether you'll be able to sell your current memory or not), we'd recommend swapping out your current memory for 2GB - it's just a case of whether you choose the cheaper modules with looser timings, or opt for memory capable of reasonably tight timings at DDR400. That'll ultimately come down to whether you're planning to overclock or not.

Sapphire ATI Radeon HD 4850 X2 2GB

Manufacturer: Sapphire
UK Price (as reviewed): £259.97 (inc. VAT)
US Price (as reviewed): $339.99 (ex. Tax)

Core Clock: 625MHz
Memory Clock: 1,986MHz
Memory: 2,048MB GDDR3
Warranty: Two years (parts and labour)

We had a brief look at Sapphire's ATI Radeon HD 4850 X2 in November when the card launched, but at the time there was no official driver support from AMD – the company simply provided us with a pre-release hotfix driver that wasn't available for download. This driver had some issues where performance dived through the floor and then we learned that official AMD support wasn't forthcoming until at least Catalyst 9.1.

With Sapphire eager to get ahead of the game though, it has recently released a version of the Catalyst 8.12 WHQL driver with a modified INF file that enables support for the 4850 X2. The driver is WHQL certified and available for download direct from the company's website – this isn't quite official support from AMD, but it's as close as we're going to get for now and it's therefore time to get down and dirty with Sapphire's rather interesting dual-GPU monster. Read on to find out how it gets on...

At the time of our initial preview of the card, a number of ATI board partners told us that they were planning to release their own Radeon HD 4850 X2s very soon – so far, none of those plans has come to fruition and Sapphire has maintained its exclusivity in the market. At the same time, the card's price has dropped a bit since November – it's now available for under £260, around 13 percent less than the original £300 asking price.



The card has the same number of stream processors as the Radeon HD 4870 X2, which means there are two 'full' RV770 GPUs under the heatsinks, each with 160 five-way shader units (making a total of 1,600 stream processors across the two GPUs). The clock speeds are exactly the same as a single Radeon HD 4850's as well – the core is clocked at 625MHz, while the GDDR3 memory runs at 1,986MHz (effective).
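The shader count is simple arithmetic - each five-way unit contributes five stream processors, so for the figures quoted above:

```python
# RV770 shader maths: 160 five-way (VLIW) units per GPU, two GPUs per card.
units_per_gpu, alus_per_unit, gpus = 160, 5, 2
print(units_per_gpu * alus_per_unit)         # 800 stream processors per GPU
print(units_per_gpu * alus_per_unit * gpus)  # 1,600 across the card
```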

Unlike the standard Radeon HD 4850, Sapphire has included 1GB of memory per GPU on this Radeon HD 4850 X2, rather than 512MB. There is another, cheaper version of the card with 1GB of GDDR3 in total (512MB per GPU) – that retails for about £240 to £250 though, so we'd definitely recommend spending that little bit more on the 2GB variant, based on the comparisons we've done between 512MB and 1GB Radeon HD 4850 (and 4870) cards so far.

There are a couple of 80mm fans attached to an aluminium shroud with Sapphire's branding emblazoned on it. The two RV770 cores are cooled by large aluminium heatsinks with a copper insert above each GPU, while a third heatsink sits atop the PCI-Express switch positioned between the two GPUs. In addition, there are another couple of heatsinks on the power circuitry (one on the front and one on the back) that make sure the PWMs are kept cool. Normally, we'd be a little concerned about a relatively tall heatsink placed on the back of the card covering additional power circuitry, but because of the card's length, it shouldn't cause any problems.


Sadly, the card cannot be classed as quiet – far from it, in fact – and we think that's partly down to the decision to use a fairly basic aluminium/copper hybrid heatsink design on the GPU cores. It would have been more efficient to use a heatpipe design, but sadly there doesn't look to be enough room for that, and some components would get in the way of a single heatsink covering all three chips.

Sapphire has chosen not to use the Radeon HD 4870 X2's reference design PCB – probably because of the different traces required from the GPU(s) to memory – and has instead opted for its own design. It is longer than the Radeon HD 4870 X2 reference design (283mm compared to 267mm), which means that it will not fit in every chassis out there – be sure to check that you have enough room for the card to fit in your case.

Unlike on the 4870 X2, the 4850 X2's memory resides only on the top side of the card – the chips are cooled by simple L-shaped aluminium plates secured with three push-pins. Moreover, the power circuitry is quite different to the Radeon HD 4870 X2's as well – most of the components are located down near the dual power connectors (one 8-pin and one 6-pin) in a seven-phase circuit covered mostly by the pair of aluminium heatsinks, one on the front and one on the back.

There are a few components nestled between the GPUs though (preventing the single heatsink design we just mentioned), comprising a pair of phases, each with a single choke and three MOSFETs. These appear to control power to the PLX PCI-Express switch, but it's not 100 percent clear from the traces visible on the surface of the PCB.


On the PCI bracket, Sapphire has done what Asus did with the Radeon HD 3870 X2 – it has implemented four dual-link DVI connectors that will certainly please multi-monitor lovers because you can not only run CrossFire, but also connect up to four high-resolution digital displays. The bracket is rounded off with an analogue TV-out connector – Sapphire has included composite and component adapters in the bundle.

And while we're talking about the bundle, Sapphire has included a pretty good selection of items in the box. In addition to the aforementioned cables, there is a DVI-to-HDMI dongle, a DVI-to-VGA connector, two supplementary power connectors, a CrossFire connector and a selection of software that includes Cyberlink PowerDVD, DVD Suite and 3DMark Vantage.

Warranty & Support

The Sapphire ATI Radeon HD 4850 X2 comes complete with a two-year warranty that includes cover for parts and labour. During the first year of the product’s life, your point of contact should be the retailer. However, if you’re having problems getting hold of the retailer (or the retailer goes out of business), you should contact Sapphire’s support team directly. During the second year of the warranty period, you should talk directly with Sapphire.


Overclocking

Considering how low the operating temperatures are on Sapphire's ATI Radeon HD 4850 X2, we expected the card to overclock fairly well and we weren't disappointed.

We managed to increase the core speed from its default 625MHz up to 696MHz - an eleven percent improvement - while the memory was quite happy running up at 2,300MHz (effective). This represented a 16 percent increase in memory bandwidth.
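For reference, those percentages are straightforward ratios of the clocks quoted above - a quick sketch:

```python
# Overclocking gain as a percentage of the stock clock.
def gain_pct(stock_mhz: float, oc_mhz: float) -> float:
    return (oc_mhz / stock_mhz - 1) * 100

print(round(gain_pct(625, 696), 1))    # core: ~11.4 percent
print(round(gain_pct(1986, 2300), 1))  # memory: ~15.8 percent
```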

What was disappointing was that we couldn't use our favourite RivaTuner software - the latest version wouldn't allow us to adjust clock frequencies - but we suspect this will be fixed before long, as it should just be a case of adding the card's device ID to the application. We instead used the OverDrive utility inside the Catalyst Control Center to achieve this overclock.

In terms of performance, we saw our Far Cry 2 frame rates increase by just under 10 percent at 1,920 x 1,200 with 4xAA and 16xAF. This means that our overclocked Radeon HD 4850 X2 was just 6.6 percent slower than a stock Radeon HD 4870 X2 at these settings - pretty impressive, all things considered.

Final Thoughts...

We’re very much in two minds about Sapphire’s ATI Radeon HD 4850 X2 2GB, because although the card is generally a better performer than Nvidia’s recently-released GeForce GTX 285, there are a number of issues that we can’t ignore. The first is the fact that this is a dual-GPU card and AMD’s support for it hasn’t been as forthcoming as we would have hoped.

Even today, a visit to AMD's driver page looking for updated Radeon HD 4850 X2 drivers ends in failure – that's where we originally downloaded our drivers from, and our attempts to install them didn't yield positive results. Instead, it resulted in frustration – there is still no official support for this card through the usual channel, even though it's plastered right the way through AMD's internal marketing slides. And this is almost three months after the card was originally released!

Of course, we have been able to get drivers through Sapphire’s own driver portal – but that’s not the first place anyone is going to look and it also relies on Sapphire keeping its own driver page updated with modified ATI drivers. If we were to poll readers on where they’d expect to get drivers for an ATI Radeon HD 4850 X2 from, the majority would no doubt say ATI or AMD – we don’t understand why, almost three months after the card was released, we still can’t do that.

To me, there’s something clearly wrong with AMD’s monthly driver strategy and this hasn’t been creeping up – it’s something I’ve pointed out for a long time, but usually in relation to multi-GPU profile support in new games and not getting the latest products working. Catalyst 9.1 hasn’t been released yet, but it’s the driver where official 4850 X2 support has been promised and we’re hoping that is still the case. If it’s not, then I am not sure what to think.

The other major disappointment for us with the Sapphire Radeon HD 4850 X2 is the noise it makes – it's far too loud for a modern graphics card, and we have moved away from the whiny fans of the early noughties. Sadly, the two fans on the Sapphire Radeon HD 4850 X2 just make us want to jam a couple of pencils in them to shut them up – they verge on offensively loud if we're honest and, to add insult to injury, there's no fan speed control available in any of the monitoring software we've tried so far.

Despite these flaws, the Sapphire Radeon HD 4850 X2 does deliver a generally better gaming experience than Nvidia’s more expensive GeForce GTX 285. The cheapest we have been able to find the 285 for is around £300, including VAT, which makes it £40 more expensive than the 4850 X2. That makes the decision difficult and while we’d typically recommend a single GPU solution almost every time (especially in light of what we’ve described above), the price difference makes it an awkward choice – the GTX 285 is 15 percent more expensive and it’s slower in four of the seven games we’ve tested here.

Frankly, the Sapphire Radeon HD 4850 X2 is too loud for a modern gaming system, driver support isn’t where we’d like it and the GeForce GTX 285 is too expensive. In light of this, we don’t believe there is a perfect choice in this segment of the market and it opens up a window of opportunity for the manufacturer who wants it the most. Right now, we’d avoid this part of the market altogether and either go higher or lower depending on your screen size and budget elasticity.

AMD Phenom II X2 550 Black Edition CPU

Manufacturer: AMD
UK Price (as reviewed): £71.86 (inc. VAT)
US Price (as reviewed): $102.99 (ex. Tax)


It's probably unnecessary to remind you that AMD has had very few successful "firsts" when it comes to its Phenom line. The latest Phenom II X2 550 Black Edition is, thankfully (or should that be "finally"?), the exception to the rule. It's the first dual-core Phenom II chip - and it's actually pretty damn good!

As a Phenom II, it's a dual-core chip of 45nm origins and very similar to its bigger triple-core (X3) and quad-core (X4) brothers, only with two cores disabled. This means that, with certain motherboards, there's the possibility of unlocking the extra cores, although there's no guarantee that they'll work.

The 45nm K10.5 design used by the X2 550 BE features a huge 6MB of shared L3 cache - more than enough to act as a snoop filter for just two cores - as well as both DDR3 and DDR2 support. Supported memory speeds are faster than the Athlon II's: 1,066MHz for DDR2 and 1,333-1,600MHz for DDR3. Another advantage is that this is a Black Edition product, so the multiplier is unlocked for easy overclocking.

It comes clocked at 3.1GHz and, with effectively a DDR3 memory channel per core (in unganged mode), there's not only oodles of cache but also tons of fast memory access too.
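Because the Black Edition's multiplier is unlocked, the headline clock is just the 200MHz reference clock times the multiplier, and overclocking can be done without touching the reference clock at all. A minimal sketch - the 17.5x figure is purely illustrative:

```python
# Core clock = reference clock x multiplier. On a Black Edition the
# multiplier is unlocked, so the 200MHz reference clock can stay at stock.
def core_clock_ghz(ref_mhz: float, multiplier: float) -> float:
    return ref_mhz * multiplier / 1000

print(core_clock_ghz(200, 15.5))  # stock: 3.1GHz
print(core_clock_ghz(200, 17.5))  # hypothetical overclock: 3.5GHz
```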

  • Clock Frequency: 3.1GHz, 200MHz x 15.5
  • Core Count: Two physical
  • L1 Cache: 64KB data and 64KB instruction per core (256KB in total)
  • L2 Cache: 512KB exclusive data cache per core (1MB in total)
  • L3 Cache: 6MB shared cache
  • Fabrication Process: 45nm DSL SOI (silicon-on-insulator) technology
  • Packaging: Socket AM3 (Socket 938)
  • Memory support: DDR2/DDR3
  • Thermal Design Power (TDP): 85W
  • Transistors: ~758 million
  • Die Size: 258mm²


All this will set you back just over £70; in terms of Intel competition, that situates it between the Core 2 Duo E6300 at £64 and the pricier E7400 at £85. It's not just CPU price that you need to look at when making a buying decision though: the cost of the whole platform (i.e. motherboard, CPU and memory) is crucial. In this respect, the X2 550 BE does rather well. Pair it with an MSI 770-CD45 motherboard and 4GB of low-latency DDR3-1333 and it costs just £189. However, for an Intel Core 2 Duo E7400, a Gigabyte GA-EP45-UD3R and 4GB of DDR2-1066 (just £2 more than 800MHz memory), you'll need to shell out £224 - a £35 difference.

Even if you choose the E6300 instead, the total system is still £15 more expensive. It doesn't sound like much, but it's halfway to a good CPU cooler such as the Titan Fenrir. The AMD setup should have more legs too - AM3 is in its infancy, with upgrades to triple- or quad-core chips available, whereas LGA775 is at the end of its life. That said, cheap Q6600s could be flooding the market soon as upgrade cycles come around. All this is worth keeping in mind as we take a look at how well the X2 550 BE performs.



Power Consumption

For all of the performance tests on the previous few pages, we disable all power saving technology in order to give us a consistent set of results, and also to obtain best-case performance numbers - technologies such as Intel's SpeedStep might only take microseconds to kick in, but that can make all the difference in some tests. However, for the power consumption tests we re-enable everything but Intel's Turbo Boost in order to get a real-world power draw.

Idle Power Consumption

For this test, we leave the PC doing nothing but displaying the Windows Vista desktop (with Aero enabled) for a few minutes and record the wattage drawn from the wall via a power meter.

Final Thoughts:

The Phenom II X2 550 Black Edition is what the Athlon II X2 250 should have been. In fact, there's very little reason to buy an old K8, or even K10-cored, CPU now that the Athlon II X2 250 and Phenom II X2 550 are available (unless, that is, you have a specific need for an ultra-low-power 45W 5050e).

The Phenom II X2 offers plenty of performance and potential for simple and advanced overclocking for enthusiasts to have fun with. It's a solid chip for gaming and is excellent for multitasking. It suffers in comparison to the Intel Core 2 Duo E7400 when both are overclocked and you're throwing video encoding or image editing tasks their way, but that's not to say the X2 550 BE doesn't give the Intel CPU a good run for its money - if only it could roll over the 4GHz barrier!

If you're not into overclocking, but fancy a fast machine for family or friends with the possibility of future upgrades, the 3.1GHz core clock affords solid performance and the AM3 socket has a bit of a future too. It may not excel in very heavy workloads - that's what quad-cores are for - but as a basic all-rounder, the Phenom II X2 550 Black Edition is certainly a good buy, and it's helped by the fact that it's cheaper than comparable Intel chips, especially when you factor in platform costs.