Graphics Software

NVidia Announces Mobile GeForce 2 Chip 116

AFCArchvile writes: "NVidia might be giving ATI, currently the dominant player in the laptop graphics chip sector, a run for its money. This Yahoo article describes how the chip was announced in Las Vegas, and PlanetHardware has a preview of the chip (a low-power derivative of the GeForce 2 MX), with some technical specs as well. The GeForce2 Go, as it has been labeled, performs over half as well as a GeForce 2 GTS (572 Mtexel/s) while consuming much less power (0.8 watts typical, 2.4 watts maximum)."
This discussion has been archived. No new comments can be posted.

  • Half the performance is nothing. The power draw isn't any better than most of the competitors in this price range. Hardware transform is a white elephant: at half speed, you'll find your processor's free time is more than adequate to outperform it. Maybe it does give the CPU more time to spend on other tasks, but games are written as if there were no other tasks.
  • Don't forget that little 15-pin connector on the back of your laptop. Many people connect their laptops to external displays for presentations (large-screen monitors & projectors). Those displays can be much higher quality and much faster than the internal display. For an engineer showing the latest 3D animation of her work or a scientist showing molecular structures of a protein, fast graphics hardware does matter.
  • A partly biological chip, it promises brighter colours and finer textures than ever before.
    Those damn biological chips also promise a hellish future where humans are hunted like rats by their own creations....
  • My recently purchased Dell Inspiron 4000 has an ATI Mobility 3/4 chip (8 megs of VRAM) that flies. I was really surprised by how well it plays Quake and UT. It easily reaches 30-40 fps at reasonable resolutions.

    I bought the Inspiron to be more of a "mobile word processor/internet station", but it's proven to be a solid gaming beast. Nvidia may introduce some decent chips for laptop gaming, but as it stands right now, the ATI chip performs more than admirably.

  • Anyone out there got a Toshiba Tecra 8100 running W2K? I still can't figure out how to get Q3A to work. The laptop advertises a 3D-accelerated MobileSavage chip, but frankly I see no acceleration. Some DirectX games also don't translate well; 4X4 Evolution behaves more like Drivin' Miss Daisy. I hear the MobileSavage framerates under Win98 are at least respectable, but I don't have the luxury of changing OSes (I'm at the mercy of the IT department).
  • Intel has their SpeedStep dual-power CPUs, so why not a dual-power video card too? I for one think it'd be great to run it at X volts when unplugged and XX volts when docked. I know heat may be an issue, but there's got to be some creative heat ducting. I'd pay a few bucks extra for something like that, and be one more step closer to ditching my hulking tower.

    ----------------------------------
  • Are the LCD panels hooked up digitally or through the DB15? My understanding is that you bypass a lot of analog crud by using the dedicated digital out (if your card has it), like the laptop would.
  • I currently own a Toshiba 4340. For all intents and purposes it is a leading-edge laptop. It has a 650MHz PIII with 128MB RAM and a 6X DVD player. Its only problem is that it has an S3 Virge 8MB video card (64-bit). I looked for a laptop with a more powerful video card, but unfortunately there are none to be found. 8MB is the maximum amount of memory you can get in a mobile video card. Some feature 128-bit instead of the 64-bit on my card, but that is basically the most you can get for a mobile system. Depending on what you're viewing (AVIs, MPEGs, DVDs), it occasionally freezes for 1/3 to 1/2 of a second. So far I have not had a problem with games, but I hope that they can put more RAM into a mobile system. (And then find some way to upgrade mine :) )
  • See...there might be a minor issue here.

    The lack of suitable Power Supplies.

    Currently, AC-DC power supplies anywhere from 200 to 400W are common. If the power consumption of components drops significantly, the extra power could be dissipated as heat (somebody correct me here if I'm wrong), making the power supply hotter...

    Still, I look forward to the day when we can have ultra-powerful components that consume so little power that we can all go back and use our old 80 W power supplies...
  • I think you mean the size of 3dfx graphics chips is getting bigger and bigger lately (look at the V5).

    My nVidia GF2 MX is smaller than the TNT2 I replaced it with, which in turn was smaller than the original TNT it replaced.

  • If you hadn't noticed, NVIDIA's cards (GeForce2 GTS, GeForce2 MX) are LESS expensive than the comparable ATI models (Radeon DDR 64MB, Radeon SDR 32MB). Not only that, but performance is mucho better.
  • For a while, this is the reason I've given for not having a laptop:

    "If I get onto a plane with my laptop and a hub and a couple friends, I want to be able to play CS/Q3A/TDR2k at a decent resolution and framerate. When this is possible, I will purchase a laptop."

    It looks like I'm going to have to buy a laptop sometime soon.
  • Er, the TNT I replaced WITH IT...stupid brain...
  • TNT2 even... OK this is getting lame, I think I'll go hide my face in a paper bag now. :(
  • Actually, the refresh rate of an Active Matrix screen is around 100Hz (I think; at least it should be capable of that).
  • Good god. The average PC chews up about a high-power lightbulb or two's worth of power (my room has a 200W halogen bulb). If you can afford a PC, you can afford the electricity to run it (around $20 a year).
  • 3 years ago, the fastest chip on the market was a PII300. At a $1200 pricetag no less.
  • Where are you buying power from so cheap?

    0.2 kW * 24 * 365 = 1752 kWh per year

    I'd have to check, but I think I'm paying around 8 cents a kWh. That's $140 a year.

    And that's only 200 watts. My 20" and 17" monitors on my main system suck around 250 watts just by themselves. Then the main case adds more. And in the summer, I have to pay to run my air conditioner to get rid of that heat (winter is easier: leave the window open).

    Although, I'm now thinking maybe not every Slashdotter will leave their computer on 24/7 =)
  • So, how did they come up with that name then? Oh, those clever marketroids! First we had the GeForce; then came the GeForce 2. Ooh, now we have a mobile version, let's call it...GeForce 2 Go!

    But hey, smartypants, what's the next version gonna be called? GeForce 3 Go? GeForce 2 Go 2? Doh!

    --
    Barry de la Rosa,
    public[at]bpdlr.org

  • You know, given how well most laptops could handle 3d games, but for the non-availability of gaming chipsets, I sense a market for upgrades.

    I mean, every laptop made in the last 2-3 years has had CardBus (32-bit, 33MHz PCMCIA) and ZoomedVideo (direct write from PC Card to video RAM) support - so what's stopping the production of a PC Card 3D accelerator for PC laptops? (I understand there was one for PowerBooks a few years back.)

    I'd drop $300 on such an accelerator for my laptop.

    Just one data point,
    -Isaac
  • We've got some cheapo ones that have a standard VGA connector, but the better ones use some weird digital plug...
  • I have an ultra-fast config for Q3. My PII 266 with a 4MB ATI RAGE LT Pro (laptop) gets a huge 29 fps!! Even higher if a server is used.
  • Very, very impressive stats. It's just that the 286 Megapixels sounds a bit familiar somehow. Couldn't they have publicized it as 285 or 287 somehow? This way Joe Schmoe will think he's getting a 10MHz videocard.

    If I ever meet you, I'll Ctrl-Alt-Delete you.
  • Why in god's name would you turn APM off?
  • Here's some more info on NVidia vs ATI from a financial viewpoint. http://www.fool.com/news/breakfast/2000/breakfast001110.htm [fool.com]
  • I don't think you want APM turned on, on a server :P
  • My main interaction with LCD screens (apart from my 266MHz laptop) has been on PIII-800 desktops with £2000 LCD panels and TNT2s.

    I wouldn't even bother trying to play anything with more animation than solitaire on it....

    *shrug*

    I'd like to be proven wrong... but until I see a £300 LCD screen running at 1600x1200 and playing a good game of Q3, I won't be impressed ;)
  • I'm a huge gamer, but I never seem to ever do any of it on my laptop. Is it just me or is there a market for this?
  • I never claimed whizbang superiority over anything. I see overkill all the time, but that doesn't make it less so.
  • Battery life.

    How about a video chipset that consumes less than one watt of power? Wouldn't *that* make more sense?

    Out of all the world's laptops, how many are *really* being used primarily for 3D video games?

    Out of all the world's laptops, how many are *really* being used for wordprocessing, spreadsheets, e-mail and other simple data processing?

    Right. So why, oh why, do the dumb knobs keep focusing on stupid things like clockspeed, 3d video, dvd players and shit like that?

    How about a nice 200MHz ultra-low-power CPU with a nice fast-refresh, accelerated 2D video card with rock-solid drivers; a nice, low-power hard drive; a good 128MB of low-power memory; and an ultra-hi-res screen (one of IBM's 200dpi ones!) with a super-reflective backplane that reduces the need for backlighting?


    --

  • You need to go back and check out recent LCDs because they are much better now.

    Most recent ones I've seen don't "ghost" like you mention. I would still reserve judgement until I see this chip on a laptop.
  • When I was in Viet Nam, I inserted more than my "video card" into the Vietnamese. I even used SCSI
  • With mobile versions of Geforce and Radeon coming and Transmeta doing similar things for the CPU, I wonder why the mobile market is the one driving this.

    Just because my machine is plugged in doesn't mean I shouldn't care about power consumption. I still pay for the electricity I use. I don't want to burn more coal so my GPU and CPU can chew up energy while my machines are idle. Desktop users won't sacrifice performance for efficiency, but why should any CPU/GPU design use more power than it needs at any given time? This shouldn't be a requirement only for mobile designs.
  • Read the article...

    It says that Toshiba will sell A machine with this chip (the Satellite models)

    Available in Q1 2001
  • Do this sometime: go into Q3DM12 (The Dredwerkz) and invite no bots (go alone). Jump onto the rocket platform with the door. Enter the hallway with the two doors, then stop and turn to look at the rocket launcher just as it closes. When you're there, with the door closed, type this into the console:

    cg_drawFPS 1

    Look at the top right corner; there should be a number with the current FPS reading. Note it with the door closed. Now walk up to the door so that it opens. The FPS rate should drop significantly. On my P3-500 with 320MB RAM and a GeForce 2, the reading is 90/45. My dual Voodoo2 on a Celery 466 with half as much RAM got around 45/23.
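
    (For a repeatable number instead of a one-off doorway reading, Quake 3 also has a built-in benchmark: set "timedemo 1" in the console and play back a recorded demo with the demo command, and it reports the average FPS over the whole run. Worst-case spots like that door are still worth checking, though, since averages hide the dips.)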

  • Being able to release my anger with a couple of "headshots" right after work, when I'm wasting time on the train anyway... yep, it seems there's a market for this. Not to mention being able to take your laptop to a friend's, hook it up to a spare monitor and game the night away...

    If I ever meet you, I'll Ctrl-Alt-Delete you.
  • I know that this chip consumes too much power to be considered a candidate for Handheld 3D use...

    That out of the way, I've had a few discussions regarding this and the general consensus is that it will most likely happen...let's take a look at what the requirements would be...

    320x320 max resolution (this is 2x the resolution of the current Palm...this allows for advances in LCD technology)

    Low Framerate requirements (LCD screens have low framerates compared to monitors)

    Integrated Chip...Just like all of the other handheld chips, this chip needs to provide the function of display, sound, and CPU.

    Low Power Consumption...probably the most difficult requirement to meet...you need to keep power consumption low enough to allow for 8 hours of continuous use...

    Now, why do I say that there is a need for this in the market???

    3D games - Now, this is not the only reason for 3D on a handheld device, but this is simply an evolution of the technology.
    3D Graphics Artists - This is a market that has not been tapped yet. Imagine being able to do 3D modeling on the Palm...even the ability to create 3D presentations using the Palm...which brings us to...
    Business/Science Applications - this is a market that might seem small on the desktop...but I believe 3D apps would have a huge market in the handheld industry...Data Analysis (graphs, charts, scientific results), Astronomy, Medical Reference, Mathematical Analysis, Maps

    I'm sure there are many other uses that I've not thought about, does anyone else agree with me or am I just FOS???

    Just my $.02 worth.
  • I can get a Windows-based laptop with graphical performance similar to a PowerBook Pismo (which I have been looking into, for both performance and reliability, and I am a die-hard Windows person).... Mmmm..... About time that ATI had some competition in that department too.
  • It will be nice to (hopefully) have a choice when I buy my next laptop. After my experiences with three previous ATI cards (Xpert XL, Xpert 98, Rage Pro) in my desktop, plus the Rage Mobility chipset in my laptop (Dell Inspiron 3800), I will NOT buy another computer with an ATI product in it ever again. Their drivers are horrible, and consistently horrible to boot.

    Bring on the new chips, my credit card is waiting...
  • Tell me about it. I lugged my tower + monitor + keyboard + mouse + power cords + extension cord + surge strip + chair + card table.

    With a laptop, all I would have needed was the laptop and power cable. The couch would have been comfy for playing CS.

  • Quake during psych lecture sounds good to me... now where was that crossover cable?
  • Maybe because some of us "dumb knobs" could actually use some of the stuff you knock so casually. I do highly graphic intensive 3D simulations of manufacturing systems for a living. I will max out any graphics card you put in front of me and be begging for more. Same with RAM and same with CPU. I'm an engineer, not some management dweeb with more machine than I can use.

    What's more, I have to travel, since we have several hundred plants in my company spread across the globe. It is highly inconvenient to tote a full desktop unit on the road, and a laptop with good 3D graphics could help immensely. Right now even the fastest laptops out there just can't work fast enough for me to use them for anything but the smallest models. Try modeling a 150,000-square-foot plant in full 3D: without heavy LOD you'll choke any PC on the market today. Never mind doing any numerical analysis on it.

    There are people who can use fast laptops. Don't criticize just because you aren't one of them.

  • Why is it that every piece of hardware that comes out seems to come with a million comments saying how it isn't 'this' or 'that' and they need to face up to the truth and do something impossible?

    The fact of the matter is, you have nothing to base your argument on besides how 'typical' tends to be defined. If you take it in the true sense of the word, it means the chip will also use less than 0.8 watts at certain times. There is no data for either argument though, so the point is moot.

    My guess is that the 2.4 watts comes into play when running 3D apps. That's not bad at all, IMO. Just try to find a 3D card out there that uses less.


    _______________
    you may quote me
  • Now I don't have to waste time going to the article. Short and sweet.
  • Build your own system using PC/104 components and you could easily come in under 80W.
  • For those who don't mind wearing a few extra battery packs this could be very interesting. Kind of extravagant, but way swank all the same.
  • I still love my atari...now that is a solid product! hehehe
  • Laptops already get pretty hot. I assume that along with the lower power consumption they also solved the heat issues with their chipset, as there's little to no room for a heatsink, much less a fan.
  • Since Tom's Hardware is slashdotted, you might choose to while away the hours gazing earnestly at Nvidia's press release [nvidia.com]. A good sample quote:

    "GeForce2 Go allows business users, artists, and gaming enthusiasts to create, present and entertain anywhere, anytime."

    Remind you of a certain software giant's claim of "anytime, anywhere, and on any device"?
  • ...I would also guess that this chip may not be suitable for all laptop systems, but more probably for the big, power-hungry versions that people tend to use as transportable desktops rather than true portables.

    However, this is the type of system I'd be interested in; being able to plug my laptop into someone's office network and still kick ass at Quake is a Good Thing (TM)!

    Anyone who knows a laptop with one of these babies should let me know...
  • And now for a mobile broadband connection and a cheaper version of that IBM 22" screen.

    Fragging on the move, the ultimate stress relief for burnt out travellers! ;-)

  • Finally we'll have quality 3D acceleration on a laptop running linux. Combined with the nvidia drivers (ok, so they're not open source, but closed source is better than no source), one can finally have a decently fast setup.

    The ATI cards seem to work fine, but have always been lacking in the performance area.
  • The point is that LCDs are getting better all the time, and that blur is as much the fault of the cheapass vid cards they put in laptops as of the LCD itself. So this will help, and it will be very cool.
  • Don't believe the figures. 17.2 million triangles means unshaded, untextured triangles, and probably counts half of them being culled. Texel rate comes down to memory speed: 128-bit memory is not a lot of use when you're fetching single 32-bit texels at arbitrary locations, and reading 4 at a time is not a lot of use unless you have a flat poly the same size as the texture. Adding a decent Z-buffer depth hogs yet more memory bandwidth.
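
    To put rough numbers on that (my own arithmetic, using the quoted specs): at the peak 572 Mtexels/s, fetching just one 32-bit texel each is already 572M * 4 bytes = ~2.3GB/s, which is essentially the entire 2.6GB/s memory bus before a single pixel or Z value has been written. So the quoted texel rate is a cache-friendly best case, precisely because it's down to memory speed.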
  • Does anyone else fail to see the point of this? I mean, on most LCD displays, even slowly dragging a window around the screen results in a hideous blur... What's the difference between a 10fps blur and a 50fps blur?

    I'm not going to even bother going into the power demand and heat problem...
  • I've got a Dell Latitude CPx (PIII 450) which comes with the Rage Mobility chipset + 8MB. This has been able to handle all I need to do and then some. Work-related tools run fine, and I don't really use the laptop for gaming (well, maybe to test video performance). Of the few games I have installed, most everything (Dungeon Keeper II, NBA Live, Re-Volt) ran well except Q3 and UT, which looked like crap, but at least loaded.

    Although this may be the 'bees knees' for some folks out there, I just can't see the business minded folks really needing geforce2-like performance sitting in their collective laps.

    The only consumer base I can see really drooling over this is maybe students looking to buy a well-rounded (yet small enough to take to class) system. Well, maybe a few folks out there really go for laptops that rival desktop machines, but for me personally this is overkill.

  • Oh man. This made my day. ROFL!
  • Plus or minus a percentage point. Most power consumed is immediately released as heat, since little is radiated as light and even less is converted into another stored chemical/physical form.

    Their 0.8W figure should be sufficient.
  • The GeForce2 GTS is very much worth it, especially for anyone who would like to toy around with some 3D games on Linux. A bit tricky getting the drivers going, but worth the trouble. For the first time, I actually find myself content with overall video performance of X. Video quality has truly come a long way for Linux.
  • The drip tray, recipe guide and endorsement by George Foreman, however, are included.
  • Half the performance of a real G2 still gets you the register combiner extensions (for hardware spotlights etc.), on-card geometry (even at half the T&L speed, it still cuts down on DMA transfers), accelerated render-to-texture, full 32bpp, an 8-bit stencil buffer (shadow/mirror effects), and the best drivers (in terms of raw speed, features and cross-platform support) on the market.

    And I think T&L is more than twice as fast as a common CPU doing the same tasks, especially considering the typical CPU in a laptop, and also considering that T&L covers more with every new release (more light sources, vertex blending, vertex programming, etc.). So you still win.
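
    To make the T&L point concrete, here's a minimal fixed-function OpenGL 1.x sketch (my own illustration in C, nothing NVidia-specific). With hardware T&L the app just sets transform/lighting state and hands over raw vertices; without it, the CPU grinds through the same multiply for every vertex, every frame:

    #include <GL/gl.h>

    /* Hardware T&L path: set state once and submit raw vertices;
       the chip does the per-vertex matrix and lighting math.
       n is the vertex count (a multiple of 3 for triangles). */
    void draw_lit(const float *verts, const float *norms, int n)
    {
        int i;
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glTranslatef(0.0f, 0.0f, -5.0f);  /* arbitrary example transform */
        glEnable(GL_LIGHTING);
        glEnable(GL_LIGHT0);
        glBegin(GL_TRIANGLES);
        for (i = 0; i < n; i++) {
            glNormal3fv(norms + 3 * i);
            glVertex3fv(verts + 3 * i);   /* transformed and lit on-chip */
        }
        glEnd();
    }

    /* Software path: with no T&L, the CPU must apply the modelview
       matrix (column-major, as OpenGL stores it; assumed affine here)
       to every vertex itself before submission. */
    void transform_point(const float m[16], const float in[3], float out[3])
    {
        out[0] = m[0] * in[0] + m[4] * in[1] + m[8]  * in[2] + m[12];
        out[1] = m[1] * in[0] + m[5] * in[1] + m[9]  * in[2] + m[13];
        out[2] = m[2] * in[0] + m[6] * in[1] + m[10] * in[2] + m[14];
    }

    At millions of triangles per second, that second function is exactly the kind of load you don't want competing with game logic on a laptop CPU.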

  • by Anonymous Coward
    NYidia announces Mobile GeFilte 2 Fish!
  • I love playing Unreal Tournament with an ATI Rage LT Pro chipset on a laptop. Actually, I did try it, but it worked much better in software mode. This actually sounds like a good thing.

  • like 3DFX does. I love my 3DFX cards, but the new cards just didn't deliver.
  • by AFCArchvile ( 221494 ) on Monday November 13, 2000 @09:23AM (#626831)
    Back in my senior year of High School (okay, back in April 2000), I took my tower to school for an Astronomy demonstration. I wanted to demonstrate the environments of Venus and Mars. So, I fired up Q3 and went to Q3DM14 for Venus and Q3DM10 for Mars. I hooked up my computer to a projector which projected the image onto the 25-foot tall hemispheric planetarium dome. Everyone laughed when I dove into the Fog of Death.

    Later that day, I hooked up the tower to a monitor and proceeded to play Q3 and Unreal Tournament during my study hall and PC Practicum class. My friend even tried out Q3 and a little Q2.

    Now, if I had a laptop with the GF2Go in it, I could've carried 20 pounds less equipment (I lugged around the tower, keyboard, and mouse; I found a monitor wherever I could). For E3/COMDEX reps, this means the difference between a good impression and a questionable one.

  • How about a video chipset that consumes less than one watt of power? Wouldn't *that* make more sense?

    0.8watts typical power consumption, 2.4watts maximum power consumption

    Looks like you didn't read the article.

    Out of all the world's laptops, how many are *really* being used primarily for 3D video games?

    How about those coders (like myself) who code in 3D while on the plane, or in bed, or wherever? I hate the fact that I can't see my results immediately while I'm programming. I don't want to transfer it to my desktop.

    And, again, like the other poster said, laptops aren't used for 3D typically because they can't be used for 3D typically.


    _______________
    you may quote me
  • I mean, on most LCD displays, even slowly dragging a window around the screen results in a hideous blur...

    That's exactly the point! Do you have any idea how much calculating power it takes to produce that kind of transitional effect?!?

    If I ever meet you, I'll Ctrl-Alt-Delete you.
  • At laptop resolutions, the MX holds its own against its bigger, badder DDR brothers, so anyone wont to complain that they can't have a mobile GTS or Ultra can just STFU.

    OTOH, even the biggest, baddest, hottest overclocked laptop on the planet would suck for gaming. Fast-moving objects look a lot blurrier than they're supposed to on LCD screens, so a CRT output monitor would be an absolute necessity.
  • Your system was overkill to the point of insanity as little as three years ago. This is just the normal progression of the market.
  • I've always wondered why this isn't done for other products as well. Why don't they sell cheap, but *solid*, desktops? With the number of products available for a nice Socket 7 solution, you can easily pick the most reliable ones and make a very good machine for little money. The only cheap machines you can get now are substandard crap. I want a cheap machine that's cheap because its tech is old, not because it's bad... And yes, dammit. I *love* my P90! And it was cheap! ... I guess. I mean, it's really a Frankenstein('s monster... to be correct ;) of a machine... but I digress. Give me cheap, old, *solid* tech!
  • Many people have laptops as their only machine. They don't have a desktop to put a 3D card in to play games on. These machines also spend 90%+ of their life running plugged in. If it saves having to buy a second machine for doing 3D and all the headaches associated with dealing with 2 machines it's well worth it.

    I want to be able to play UT on my laptop!

    This is a fast, complete 3D chip with plenty of power. There is just such a difference when you have a real 3D chip. My laptop has some crappy NeoMagic garbage in it and it can't do squat for 3D.

  • >>0.8watts typical power consumption, 2.4watts maximum power consumption

    > Looks like you didn't read the article.

    The fact of the matter is, 'typical' power requirements tend to sit just above the minimum, and that means very little in terms of large screen refreshes (i.e. it describes sitting in one app and working on it for a long time, then finishing and moving on). All the alt-tabbing between applications will push it much closer to that 2.4. And while any chipset is going to spike during that, his point is that they should face that fact and try to get their high end closer to 1 watt, not just their average. I know I personally switch desktops in X a lot on my laptop, a little too much, and that causes fullscreen refreshes up the ass.
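
    For scale (my own numbers, assuming a typical ~45Wh laptop battery of the day): even pinned at its 2.4W maximum, the graphics chip alone would need almost 19 hours to drain the battery, while the CPU, screen and drives at 20-odd watts remain the real load. The typical-vs-max gap matters, but less than it looks.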
  • I assume you capitalized the A implying that only one measly machine will use this product. Toshiba Satellites are pretty popular. When I sold those things several years ago, they were one of our best selling laptop lines.

    -B
  • Uh, I don't mean to flame, but I am uncertain whether you are kidding or not.

    I play Quake at above 1024x768 at around 80 fps.

    I have also been doing some early work with .Net as far as evaluating the security of the runtime, so I feel I can speak somewhat on it.

    In a larger sense, with graphics and things like .Net, you have two options. One, you can send screenshots to the client, or two, you can send code to the client and have them generate their own images.

    As far as sending screenshots of 1024x768 at 80fps goes, the bandwidth to do that just isn't going to be around for your average Joe for a while; I'm pretty sure it would bog down a 100Mbps Ethernet line (rough numbers below). If you run the code locally, you still need the same 3D acceleration.

    Also, there are mad latency issues, etc., with streaming screenshots.

    This 3d accelerator is a _good_ thing.
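
    A quick sanity check on the bandwidth claim (my arithmetic, assuming uncompressed 24-bit frames): 1024 * 768 * 3 bytes = ~2.3MB per frame, and at 80 fps that's ~189MB/s, or roughly 1.5 gigabits per second. That's about fifteen times what 100Mbps Ethernet can carry even in theory, so yes, streaming screenshots is out.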
  • an IBM T series ThinkPad. I will name my children after you.
  • Ingredients:
    1. Laptops with these badboys!
    2. AirPort with HUB
    3. First Class seats (or HUGE battery life:)
    4. It's on......

  • Of course, if your computer is less than 4 years old, it's sucking up 10W for 2/3 of the day (unless you use it more than 8h/day). So immediately, your figure of $140 drops to ~$50. Then factor in that you probably don't use it every day, and you've got ~$40 or so. So I lowballed it; the actual figure is more like $80-something under your usage load. For me, however (maybe 3-4 hours at most), with a lesser set of monitors, the cost should be around $30 or so.
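
    Spelled out under those assumptions (8h/day at 200W, 16h/day at ~10W standby): 0.2 kW * 8 * 365 + 0.01 kW * 16 * 365 = 584 + 58 = ~642 kWh a year, or about $51 at 8 cents/kWh. That's where the ~$50 figure comes from.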
  • Ya know what this means?
    "Desktop Replacement" laptops just became a lot nicer looking. ;-)
  • The Radeon can also do this. The .18 micron process is great (I assume NVidia uses the same). I just wish 3dfx would upgrade from their existing .25 micron fabs to .18. An accelerated Voodoo (moved to .18 micron) with lower power consumption would probably compete favorably against ATI and NVidia.

    ATI hasn't had any real competition in the mobile market. I don't think NVidia will steal too much from ATI with the mobile Radeon a month or two away. They will, however, help speed up the previously stagnant mobile accelerator market. Competition is a very good thing!!

    Willy
  • That would be fine if I were talking about laptops, but I was referring to handhelds...and handhelds don't use Active Matrix...Handheld LCDs are pretty much a step up from your watch (same technology)...one of the reasons for the slow refresh rate on handhelds is power consumption...higher refresh rates require more power...
  • Ars Technica did a review of that Apple 22" display. The conclusion was that it was almost perfect. The problems were price (as expected) and the fact that there was blur.

    Now, that Apple 22" is about as good as it gets for flat panel displays; much better than most smaller (cheaper) displays as far as blur goes (or so the article states). However, in the end Ars recommends that gamers stick with a CRT. The flat panels (regardless of brand or style) don't cut it for gaming. Great for office work, though!!

    Willy
  • by hattig ( 47930 ) on Monday November 13, 2000 @09:00AM (#626848) Journal

    I don't know about most people here, but that mobile GeForce chip is still more powerful than most of the desktop graphics chips being sold today. Now I for one wouldn't mind having it in my machine, without requiring any fan or heatsink on the graphics card, which has been the problem recently.

    Still, this will make one powerful laptop, although destined more for the power laptop (mostly plugged into an external power supply) rather than the mobile laptop (low power requirements from battery). Don't expect to see one of these in a Transmeta powered laptop soon. Maybe a die reduced ATI mobile chip will make it there, where power consumption is the priority and killer 3D graphics are less important.

    In a year or so these chips will be amazing. 0.13 micron

  • You obviously haven't used a laptop manufactured in the last 4 years. Things have gotten much better. My nice 15 inch TFT display has zero blur.

    And yes, I would love to be able to play some of the latest 3D games on my laptop. As it is right now, Quake (the original) is just playable on my crappy, supposedly 3D-accelerated ATI Rage 'mobile' chipset.

    -josh
  • I really don't see the Pro version of the Ultra as a big need right now.

    Why bring something out that really isn't needed if it's basically just a faster version of what's there already? A Quadro is fine for most professional users. Pro users couldn't give a flip if they can now render at 90fps instead of the 75 they could with the Quadro.

    Basically, I see the Ultra as something they did to satisfy the masses. Before it came out, most gamers were saying the GF2 is good, but not much better than a GF1, and the Voodoo 5 6000 will probably beat it a bit. All NVidia did was provide the super high-end card for the 15 people who will actually buy one. Just like the Voodoo 5 6000.
  • The external heatsink can be bought as an accessory later on...

    If I ever meet you, I'll Ctrl-Alt-Delete you.
  • The GeForce 2 MX. It comes in PCI and AGP, with SDR and DDR memory. But seriously, the GF2GTS is worth it.
  • True. Assuming we have the power saving turned on. :P

    First thing I do when I get a computer to play with, at work or home, is shut the APM off.

    And yes, for normal users, normal usage, your figures are quite right.
  • by iceT ( 68610 ) on Monday November 13, 2000 @09:01AM (#626854)
    What's the difference between a 10fps blur and a 50fps blur?

    40 fps. Is that a trick question?

  • If LCDs lag graphics chips, then there's room and incentive for LCD manufacturers to raise the bar and produce better LCDs. It's the same vice-versa: if graphics chips lag LCDs, then there's room and incentive for chip manufacturers to raise the bar and produce better chips.

    You've already seen this same interplay on the desktop with cpu speed and software complexity. Welcome to laptop land.
  • Just what I have been looking for...A real videocard in a laptop!
    Anyone interested in a Dell Inspiron 7500??? :-)
  • How about a desktop Transmeta as well!
  • Out of all the world's laptops, how many are *really* being used primarily for 3D video games?

    Out of all the world's laptops, how many are *really* being used for wordprocessing, spreadsheets, e-mail and other simple data processing?

    Right. So why, oh why, do the dumb knobs keep focusing on stupid things like clockspeed, 3d video, dvd players and shit like that?

    Well, you obviously have a point that a laptop oriented to the business world will generate more revenue, but that's simply because the business world has more cash to spend on such things. The reason that no laptops are primarily used for 3D is that there are no laptops suitable for gaming. Just ask anyone who has ever tried to get reasonably high-tech 3D gaming going on a laptop. Seeing as the games industry just keeps growing (it eclipsed the movie industry quite a while ago), I can see a laptop with decent 3D support finding its market with both developers and gamers..

  • "Very, very impressive stats. It's just that the 286 Megapixels sounds a bit familiar somehow"

    Actually, the Voodoo3 2000 had a fillrate of 286 megatexels. That's probably why it rings a bell. I don't mean to steal your thunder, but it's true.

  • NVidia claims to have two lines, a "gamer" line (the GeForce) and a "pro" line (the Quadro), with slightly different features. Actually, they're the same chip, and GeForce boards can be converted to Quadro boards by changing a jumper resistor. [geocities.com]

    With the release of the GeForce 2 Ultra, NVidia's fastest "gamer" board is now faster than their fastest "pro" board. There's no Quadro product corresponding to the GeForce 2 Ultra. It's not clear if the Ultra is crippled, like the older models, to maintain the gamer/pro distinction. Does anybody know for sure? (Asking NVidia and ELSA (the last remaining Quadro board maker, now essentially a unit of NVidia) produced no useful response.)

    In any case, it's very clear that the gamer/pro distinction has very little life left in it. The low-end chips are now better than high end stuff of two years ago. The high end guys don't get enough sales volume to pay for the IC design needed to keep up. So most of the pro-only graphics board companies have dropped out.

  • Low power is great. But ATI didn't win because they had the lowest power consumption of any of the options. They won because the chips they sold to motherboard makers were dirt cheap while being (barely) 3D capable and not a total off-brand.
  • by mbell ( 80219 ) on Monday November 13, 2000 @09:03AM (#626872) Homepage Journal
    This is the first really exciting product announcement of Fall Comdex '00 thus far, though Comdex doesn't officially kick off till tomorrow morning.

    The actual figures that pertain to the GeForce2 Go chipset are not that impressive off the bat, especially when compared with Nvidia's GeForce2 and GeForce2 Ultra products. However, one must keep in mind that this chipset is aimed at the mobile market, and its performance is truly geared in that direction. For instance, Nvidia has tried to reduce the power consumption of their chip, making for increased battery life of the overall mobile system.

    Here's a feature list:
    ---------------------
    Built on a .18 micron manufacturing process
    Based on the GeForce2 core
    143/166MHz (core/memory) speed
    17.2 million triangles per second
    4 Texels/clock
    286Mpixel/s, 572 Mtexel/s, 2.6GB/s memory bandwidth
    0.8watts typical power consumption, 2.4watts maximum power consumption
    AGP 4x support, with FastWrites
    HD Video Processor/DVD decode
    Nvidia Shading Rasterizer (NSR)
    TwinView & Digital Vibrance Control
    32/64/128-bit SDR/DDR configurations
    8 - 32MB of memory
    Integrated dual-channel LVDS
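
    A side note on how those headline figures fall out of the clocks (my own derivation, assuming the usual GeForce2 MX layout of 2 pixel pipelines with 2 texture units each, which the list doesn't state): 143MHz * 2 pixels/clock = 286 Mpixel/s; 143MHz * 4 texels/clock = 572 Mtexel/s; and 166MHz * 128 bits / 8 = ~2.6GB/s of memory bandwidth for the widest configuration.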

  • I love my 'desktop replacement' laptop to death. Never mind the fact that it scorches my lap on long plane flights and cuts off circulation at my knees - I just can't go without the power of a full desktop. My one niggling complaint has always been 3D graphics performance, and it is good to see that a high-end card manufacturer is finally taking this market segment seriously.

    I don't game much, but it is a shame that you can buy the highest end laptop, pack it with a 30 Gig hard drive, 15 inch screen, 512 MB of RAM, and you still can't play Quake II at reasonable frame rates.

    -josh
  • by Bert Peers ( 120166 ) on Monday November 13, 2000 @09:07AM (#626879) Homepage
    No, it's not great because there can now be a laptop with a Linux supported accelerator.
    Yes, it is indeed totally pointless for business applications.
    And yes, relatively simple 3D games already run fine on the current chips.

    But, have you ever tried to take your cutting-edge 3D game, development tool, or engine to a tradeshow like ECTS or E3 to give a demonstration? Currently, you can choose between taking your full tower (right), a laptop with crappy 3D support, or a couple of demo CDs -- hoping for the best concerning the publisher's hardware and driver up-to-dateness. A laptop with a cutting-edge 3D chip and proper driver support would rock, which is exactly what NVidia has been delivering, save the "laptop" part.
    Granted, it won't generate the revenue of a business model (well, maybe when VRML kicks off or something), but there are many (would-be) game developers waiting for this thing..

    And about the screen part; when giving a demonstration a decent screen or even projector is usually available. It's the hardware+drivers that are the risk.
