Colour Me Impressed
Topic by: Cyrris
Posted: Jan 10, 13 - 10:19 PM
Last Reply: Jan 28, 13 - 8:54 PM
Posts: 13
Page: 1

He Leg
Posts: 527
With this: http://www.anandtech.com/show/6600/inte ... ce-gt-650m

That's Intel's next-generation CPU+GPU (that is, both on the one chip). It's running against a machine with a discrete Nvidia mobile graphics card - and not a bad one at that. My own laptop is very new and only has a GeForce GT 640M, which is slower than the one used in that test (a 650M) - and this is a fairly high-end (though not gamer/enthusiast) laptop.

So when this new Intel series is released in the next few months, laptops without discrete graphics cards could be as good for games as mine, which has a discrete card and was brand new only six months ago. That's crazy.

Intel is making the same sorts of leaps and bounds with its on-chip graphics that Nvidia and ATI were making with their desktop cards, back in the days when each new generation was 100% faster than the last. They slowed down eventually; these days their new generations tend to be only 25-40% faster. It will be interesting to see how long Intel can keep it up.
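
Just to put rough numbers on that (the rates below are taken from the figures above, not from benchmarks; the loop is purely illustrative):

# Compound a hypothetical per-generation speed-up over five generations to show
# how quickly the old 100%-per-gen pace pulls away from today's 25-40%.
for gain in (1.00, 0.40, 0.25):
    print(f"{gain:.0%} per generation ->",
          [round((1 + gain) ** g, 2) for g in range(5)])

# 100% per generation -> [1.0, 2.0, 4.0, 8.0, 16.0]
# 40% per generation -> [1.0, 1.4, 1.96, 2.74, 3.84]
# 25% per generation -> [1.0, 1.25, 1.56, 1.95, 2.44]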

_________________
The Man, The Myth



Just Beat Coyote (off)
Posts: 229
In Reply To #1

I believe an Aussie research team only recently discovered a new material which is apparently purrrfect(er) for letting electrons run wild. They claim the material, whose molecular structure is very similar to graphite's, will boost chip speeds and shrink component sizes.

Too lazy to find a link.

Brilliant!

I think within the next 2-3 years we're going to see another sudden jump in raw powah.

_________________
Will someday beat Strike Suit Zero...



He Leg
Posts: 527
It's all about the power, both how much more we're seeing and how little (in terms of Watts) is getting used.

I was looking at Razer's new gaming tablet idea from CES, and while I don't really think it will be a commercial success, the fact that we now have that sort of power on hand-held devices is quite remarkable. They're no longer an order of magnitude behind the desktop PC.

That Razer tablet has a discrete GPU... how long before it just uses an integrated one from Intel, and adds more hours to its battery life? Yeah. Probably not long.

_________________
The Man, The Myth



Trainee
Posts: 34
In Reply To #3

Not sure I agree. There's no reason why Intel should be able to offer more performance per watt (at least not at the same level of quality); they're on the same technology node. The only difference could come from the power it takes to drive a PCIe bus off-chip.
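
Back-of-envelope only, with made-up numbers (neither figure is measured), to show how small that off-chip overhead might be relative to a mobile GPU's budget:

# Assumed values purely for illustration: a ~45 W discrete mobile GPU budget and
# ~5 W spent driving the PCIe link and the card's own board circuitry. Moving
# on-die would then only save a modest slice of the total power.
gpu_board_power_w = 45.0          # assumed, not a datasheet figure
link_and_board_overhead_w = 5.0   # assumed, not a datasheet figure

saving = link_and_board_overhead_w / gpu_board_power_w
print(f"Roughly {saving:.0%} of the power budget saved by staying on-chip")  # ~11%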

_________________
To the optimist, the glass is half full. To the pessimist, the glass is half empty. To the engineer, the glass is twice as big as it needs to be.



He Leg
Posts: 527
In Reply To #4

The main thing I've failed to mention is that Nvidia is also making huge strides of its own with Tegra, in terms of big performance and small power draw. I just tend not to pay as much attention to it because I'm far more interested in x86 at the moment. I expect that if ARM keeps holding off Intel's encroachment on its mobile market share, the prevalence of universal binaries will eventually mean it won't matter anymore as things continue to converge.

_________________
The Man, The Myth



Why Did I Do That?
Posts: 206
Let's just hope Intel actually sticks with it this time in entering the graphics hardware market. They got us all worked up over Larrabee, got pretty far into development and...then said 'fuck it, let's go make some toast.'

_________________
I have a lot of great ideas, trouble is most of them suck. -George Carlin



He Leg
Posts: 527
In Reply To #6

Yeah. That bugged me a lot. The current discrete GPU duopoly has me seriously bored. I was bummed when XGI's Volari launch turned out to be a disaster. And looking now... holy crap. That was almost a decade ago.

This time, though, the GPU is integrated with Intel's next CPU line-up, so it's just a logical progression of their current technology. There's no new path being struck out like they attempted with Larrabee. Should just be business as usual.

_________________
The Man, The Myth



Trainee
Posts: 34
They are not entering the business. They have been there all along, with a larger market share than the other two combined...

_________________
To the optimist, the glass is half full. To the pessimist, the glass is half empty. To the engineer, the glass is twice as big as it needs to be.



Why Did I Do That?
Posts: 206
In Reply To #8

'Technically' yes, but in the same way that McDonald's is the largest toy manufacturer/distributor thanks to all the Happy Meals. Intel's video chipsets never emphasized the kind of hardware architecture needed for fast rasterizing of vector graphics and customizable processing of vertex, geometry and fragment/pixel data. Their hardware was better suited to handling popular video codecs for streaming video from websites.

Larrabee was their first serious attempt at it, and it emphasized raytracing over rasterizing. Raytracing definitely produces high-quality images, even at lower resolutions compared to rasterization, and once you reach a certain point it actually becomes cheaper for rendering more and more complex scenes. However, the hardware needed to reach that point in real time costs far too much for the average consumer, and even for PC enthusiasts at this stage. Down the road it may become a viable option; before then we'll probably see a mix of both techniques.
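
A toy cost model makes that crossover concrete (the constants are invented, purely to illustrate the argument, not to predict real hardware):

import math

# Rasterization touches every triangle each frame (~linear in scene complexity),
# while raytracing through a BVH does roughly one log-depth traversal per pixel,
# so past some scene size the raytracer's cost curve dips below the rasterizer's.
PIXELS = 1920 * 1080
RASTER_COST_PER_TRI = 1.0        # invented relative cost unit
RAY_COST_PER_PIXEL_STEP = 20.0   # invented relative cost unit

def raster_cost(tris):
    return RASTER_COST_PER_TRI * tris

def raytrace_cost(tris):
    return RAY_COST_PER_PIXEL_STEP * PIXELS * math.log2(tris)

for tris in (10**6, 10**8, 10**10):
    winner = "raytracing" if raytrace_cost(tris) < raster_cost(tris) else "rasterizing"
    print(f"{tris:.0e} triangles -> {winner} is cheaper in this toy model")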

_________________
I have a lot of great ideas, trouble is most of them suck. -George Carlin



Trainee
Posts: 42
In Reply To #1

Were you not aware the AMD A series is already doing this?



He Leg
Posts: 527
In Reply To #10

Yeah, I was quite impressed when Llano came out. Since then however, their performance improvements have been incremental at best while Intel's solutions look to be making larger strides (of course, they have more ground to make up). Meanwhile Nvidia's mobile discrete options are still keeping them comfortably ahead as well, which is why - I imagine - Razer's gaming tablet is an Intel/Nvidia powered system, and not an AMD integrated one.

I'll admit to some bias here. AMD did well with their APU development at first, but they've got problems when it comes to... lots of things. Their discrete GPUs are not keeping up with Nvidia's. They can get the framerates equalised only at the cost of substantially more heat and noise. Their Enduro graphics-switching solution on notebooks is only just becoming stable now, while Nvidia's Optimus has been solid in that space for a few years. Their recent microstutter problem was yet another example - their engineering teams just don't have it together right now. Add in the performance problems on the CPU side, particularly single-threaded (which is very important in some of the games I play), and it's just not a winning formula.

It makes me a bit sad. 10 years ago I was sitting pretty with an Athlon64 and a Radeon 9600, comfortable in the knowledge that Intel and Nvidia were failing all over the place and I'd spent my money wisely. It's been a rocky decade for them since, and with their current performance improvement goals (15% year on year for CPUs, 25% for GPUs) I don't see much changing.
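
Compounding those stated targets over a few years (plain arithmetic on the figures quoted above, nothing more):

for label, rate in (("CPU", 0.15), ("GPU", 0.25)):
    print(label, [round((1 + rate) ** year, 2) for year in range(1, 5)])

# CPU [1.15, 1.32, 1.52, 1.75]
# GPU [1.25, 1.56, 1.95, 2.44]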

_________________
The Man, The Myth



Trainee
Posts: 42
In Reply To #11

Have you actually used these new A series combos? They will knock your sox off at a price Intel can't touch. Check em out.



He Leg
Posts: 527
In Reply To #12

Used? No. My usual reads are things like Anandtech, reviewing Trinity here. Is there something newer already? The gaming benchmarks are alright, but my socks are most certainly still on. The discrete Nvidia solutions (like the GT 640M, which is what I have) pull well ahead in most games. If Haswell in fact matches the 640M, then that's quite a leapfrog over AMD.

For me price isn't such a big deal. I never buy best-of-breed equipment but I usually go for higher end stuff. The current mid-range GPU I have is a result of me refusing to have an overly heavy notebook, or one that is too embarrassing to have around other people (Alienware, I am looking at you).

In any case, all we saw from Intel's Haswell demo was one game. So if they just chose the best one for their hardware then it might not tell us that much.

_________________
The Man, The Myth



© 2012-2023 GeekSpy Forums.