
GeForce GTX 680 Giveaway, ENDED! Congrats to Zaz!


Bromm83

Recommended Posts

I have decided to get the new GeForce GTX 970, so I have a spare graphics card! Reply here and tell me if it is something you need. 

I will pick one person, probably at random, that will get my old GTX 680 for free! I will select the lucky winner in a couple of days. 

 

Click here to see info about card

 

I will only select from replies here, not from personal messages.

 

I also have a 650W PSU if needed to run the graphics card.


Gee... I've got a 580 right now, and it works okay, so I'd feel like kind of an asshole wanting a new card so quickly.

How I got the 580 is kind of a bizarre story... I'll just copy and paste an email I sent to a friend about it:

 

My video card died. Violently. I'm using an old one now. 
It was more like self destruction, though... I figured out what happened to it:
Apparently, although Starcraft 2 does have a framerate cap when in the game, it has no framerate cap when sitting on the menu. I left it sitting for a couple of hours and the video card expired. I honestly don't know how long it lasted before it did expire, since I left the room and didn't come back for about four hours.
It took a while to figure out what the problem was. I finally managed to get hold of a slightly older model video card, slapped it in, and tested out a LOT of the stuff I was doing previously, and only found out about Starcraft 2 yesterday.
With my "new" GTX 580, framerate sitting on the menu: 647 FPS. That's six hundred frames per second. The GPU temperature shot up from about fifty degrees to 84 within a few minutes (the card is rated up to 80 degrees).
Poor old video card. He did his best, he just couldn't handle the load. I'd guess the first thing that went out was the on-card cooling fan, followed very shortly by the rest of the card.
The rest of the components are fine, mostly - the motherboard was untouched, nothing nearby was hit except for a little tarnishing on the heat sink's cooling fan and a black mark and slight melting on the plastic window of the tower case. The adjacent wireless card turned mostly black on one side, but since I keep my computer plugged in and never use the wireless, it's not an issue. Oddly enough, the computer itself didn't even shut down when it happened, just kept running, and there was almost no smell of burnt electronics or plastic.
Let that be a lesson to anyone who plays Starcraft 2 - always, ALWAYS, enable VSYNC. Or don't leave it sitting on the menu screen, EVER.
BTW - Those temperature monitors that take up drive bay slots are crap. As I write this, MSI Afterburner says my GPU temp is 43 degrees, and the display on my case says the GPU temperature is 27 degrees.
The weird thing is that I was certain that the video card was supposed to shut down before it overheated like that. Or at least SOME system should have done. And I thought the 680 I had was rated up to 100 degrees, which even at ~650 FPS it should not have reached, maybe?
That was about a month and a half ago, and at this point I'm not entirely certain it was Starcraft's fault. Might have been some physical problem with the card itself, for all I know.
 
p.s. This is not a sob story. I'm quite happy with my 580. I wish I had my old 680 back, but my current card is working fine. :)
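The "always cap your framerate" lesson above boils down to one missing sleep in the render loop. A minimal sketch of a frame limiter (my own illustration, not from the post or from any actual game code):

```python
import time

def run_loop(target_fps: int, frames: int) -> float:
    """Render `frames` dummy frames, sleeping so the loop never exceeds
    target_fps. Returns the measured average FPS."""
    frame_time = 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        # ... render the menu / scene here ...
        elapsed = time.perf_counter() - t0
        if elapsed < frame_time:
            # Without this sleep, an empty menu scene renders as fast as
            # the GPU allows (600+ FPS), pinning it at full load.
            time.sleep(frame_time - elapsed)
    total = time.perf_counter() - start
    return frames / total
```

VSYNC achieves the same effect by tying the swap to the monitor's refresh, which is why enabling it also fixes the uncapped-menu problem.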


I would love this card. I'm running off a sub-par built-in graphics chip, so I have to run my game on low spec. I only tend to raise the settings if I'm just going to be doing a simple screenshot, and with 5 young kids my outlook for upgrading mine is far-fetched, lol.


Lol, if it's spare I could use a card :P. I'm still using a 9800GT since my lappy crashed last November. But I'm still saving up for a new laptop for Christmas, hopefully.

 

Cheers

 

That being said, I wonder if my crappy dual-core would support such a card.


Pity my spare GeForce GT 610... I've been using it since my main card died, and now its fan has tanked. You should see the dual-fan setup I have going in my case to keep heat off of it (paper towel rolls work wonders for holding fans up against a heatsink). (Yes, I know it isn't for gaming. It was for work, but it's better than the integrated :( )

 

As for why my gaming rig has gotten to this pitiful state? You don't wanna hear that story x3

Guest endgameaddiction

Zaz, I think the minimum for a card like that would be 500W, but that's me guesstimating. Dual core? Yikes! I hope now, after some BS last month and this month, I can start saving for the card I've been wanting to buy.

 

Good luck to whoever wins it and gets to put it to good use (don't include me).



 

550W is the minimum requirement for it, but good guess nonetheless.

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-680/specifications

 


Gonna throw my hat in on this one. GT 610 currently, and it's having to work really hard.

 

 

Now I shoot myself in the foot: I agree with the Zaz suggestion. Incidentally, I have been looking at upgrade options recently and took the time to read up on PCIe format compatibility. Any PCIe card will go in any PCIe slot, whether version 1, 2, or 3; the difference is the max bandwidth the pairing will achieve.
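To put rough numbers on that compatibility point (my own summary of the published per-lane rates, not from the post): v1 and v2 use 8b/10b encoding at 2.5 and 5.0 GT/s, giving about 250 and 500 MB/s per lane, while v3 uses 128b/130b at 8 GT/s for roughly 985 MB/s per lane.

```python
# Approximate usable one-direction bandwidth per lane, in MB/s,
# for each PCIe generation (encoding overhead already accounted for).
PER_LANE_MB_S = {1: 250, 2: 500, 3: 985}

def link_bandwidth(version: int, lanes: int) -> int:
    """Max one-direction bandwidth (MB/s) for a negotiated link."""
    return PER_LANE_MB_S[version] * lanes

# The link runs at the lower generation of the two ends, so a v3 card
# in a v1 x16 slot negotiates down to v1 speeds:
card_gen, slot_gen = 3, 1
print(link_bandwidth(min(card_gen, slot_gen), 16))  # 4000 MB/s (4 GB/s)
```

So an old x16 slot still works, it just tops out at 4 GB/s instead of the ~15.8 GB/s a v3 x16 link can manage.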


Archived

This topic is now archived and is closed to further replies.
