• Announcements

    • Ashal

      SITE MOVED - IN READ ONLY MODE   12/08/2015

      Please use http://www.loverslab.com moving forward. Site has been restored to a previous version, and this one placed into a read-only mode. This is available for a limited time so users may reference/copy content that has been lost in the transition. This will no longer be accessible by December 22nd, 2015.

Archived

This topic is now archived and is closed to further replies.

D_ManXX2

Nvidia does not assign all memory ??

39 posts in this topic

But have you personally? That's the thing. You say this and point to a Google search. But have you actually had ANY experience owning an ATI card and actually playing games? That's what I'm calling you out on.



See where I said I have 16 years of experience?

That's personal experience. The search was to show you that it's a widespread problem.



However, if you look closely, most of the problems stem from the HD 5000 series and below. I can accept that not all ATI cards/drivers are perfect, but then again, you should also accept that nVidia doesn't have a stellar record either.

Let's just agree with what we know and leave it.



Why? 70 is not that high; these cards can sustain 90 to 120 degrees under constant load.

70 is insanely high, honestly. 55-60, depending on the card, is the point where the card pretty much starts to force a crash in order to prevent damage. If you're running higher than that, don't expect much life out of your card.


Yes, I always thought that was somewhat high, since my CPU and motherboard never go above 45-55 degrees. Any idea how to apply thermal paste to a video card? I never touched it since I was worried about voiding my warranty.



If you're worried about your warranty, I'd take it to a certified repair center, or, as I said in a previous post, look into getting a liquid cooling system.



nVidia says the max temp for the GTX 660 is 97°C, although keeping it under 80°C is recommended (this doesn't mean you should try to freeze the thing).
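If you want to sanity-check your own readings against numbers like these, here's a minimal sketch (Python; the thresholds are hypothetical, taken from the GTX 660 figures above, so adjust them for your own card's spec sheet):

```python
# Hypothetical thresholds based on the GTX 660 numbers quoted above:
# ~80 C recommended ceiling, 97 C vendor maximum. Adjust per your card's specs.
RECOMMENDED_MAX_C = 80
VENDOR_MAX_C = 97

def classify_gpu_temp(temp_c: float) -> str:
    """Rough bucket for a reported GPU core temperature."""
    if temp_c >= VENDOR_MAX_C:
        return "critical"   # at/above vendor max; throttling or shutdown likely
    if temp_c >= RECOMMENDED_MAX_C:
        return "hot"        # above the recommended ceiling; check cooling
    if temp_c >= 60:
        return "warm"       # normal under heavy load for many cards
    return "ok"             # idle/light-load range

print(classify_gpu_temp(35))   # ok
print(classify_gpu_temp(70))   # warm
print(classify_gpu_temp(97))   # critical
```

In practice you'd feed this from whatever your monitoring tool reports rather than hard-coded values.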


Problem is, OpenGL IS third party. Who do you think makes it? Nvidia? No. Mesa3D is an open-source implementation, and OpenGL itself is an open standard, which is why Nvidia uses it: it's free.


And the fact that you've been building PCs for 15-plus years matters zero. You are trolling: every time someone mentions ATI/AMD you go batshit nuts and start in with your OpenGL and "third party" nonsense. Does it really matter? No. Most games default to D3D when OpenGL is not present or the game doesn't support it, which most do not, so the whole OpenGL issue is pretty much moot. It's a toss-up over who can pay the most to get their renderer supported in a game, Microsoft or Nvidia, and I guess we know who wins that battle, don't we. Yes, OpenGL is a damn good renderer, but it sure as hell isn't worth all this. If it gets used, well and good; if it doesn't, just as well and good. It really doesn't matter.
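The fallback behavior described above (use the game's preferred renderer, drop to D3D when it isn't available) can be sketched roughly like this; the renderer names and the availability set are purely illustrative, not any real engine's API:

```python
# Illustrative sketch of renderer-selection fallback, as described above.
# Renderer names and the "available" set are hypothetical, not a real API.
def pick_renderer(preferred: str, available: set) -> str:
    """Return the preferred renderer if usable, otherwise fall back to D3D."""
    fallback = "d3d"
    if preferred in available:
        return preferred
    if fallback in available:
        return fallback
    raise RuntimeError("no supported renderer found")

# A game that supports OpenGL uses it when the driver exposes it...
print(pick_renderer("opengl", {"opengl", "d3d"}))  # opengl
# ...and quietly drops to D3D when it doesn't.
print(pick_renderer("opengl", {"d3d"}))            # d3d
```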



Who's trolling? I dropped the topic a few posts ago. Don't bring it up yourself if you're not willing to do some actual research into what I'm saying, especially after I did all the work for you.

It deleted half my post -.-

Wth?

Anyway, temp tolerance depends on the card. Higher-value cards are going to have higher tolerances; that's why the more money you spend, the bigger the fan: they expect overclocking. Some cards even come with something to overclock them with. However, I've never had good results from a card that was regularly going 70+, and having 70 as an idle temp is a very bad thing.

 


nVidia says the max temp for the GTX 660 is 97°C, although keeping it under 80°C is recommended (this doesn't mean you should try to freeze the thing).


On idle my card is 35 to 40; under heavy load it can reach 70, depending on how warm my room is. If it's cold, it will stay around 60-65 degrees under heavy load, especially in Skyrim. In other games it will be even lower.



On idle my card is 35 to 40; under heavy load it can reach 70, depending on how warm my room is. If it's cold, it will stay around 60-65 degrees under heavy load, especially in Skyrim. In other games it will be even lower.

 

My idle temp varies.  If I have my browser open to some of the browser games I play, it's around 25-30.  Fresh boot, no browser, it's about 19.  The peak this card has seen is *checks logs* 56.  


I will try cleaning the fans of my video card, since I have not done it in some time. Maybe that will force the temperature down.



Temps on cards do vary wildly, as LJ and others have stated. You'd need to check the manufacturer to see if you are in tolerances. I had an Nvidia 8800 that idled at 65C on a fresh boot. No shit. Under heavy load it climbed to around 90C. After I did a bunch of research, I discovered that the damn thing just ran hot. It ran this way for the 2 years I used it, and it's still running that way now for the buddy I gave it to when I upgraded.

The current card I have is a GTX 650 Ti and it idles around 28C. That is less than half of the 8800! Point being that it varies per card and per generation.

It is an excellent idea to clean out your case at a minimum of twice a year, more if you are in a dusty environment. It will help keep temps down and prolong the life of your components.



Why? 70 is not that high; these cards can sustain 90 to 120 degrees under constant load.

Go to BIOS and disable your onboard video. I don't see why it would do that unless you don't have enough RAM for the video card to use. I use the same card as you (560 Ti 2GB) and I have all my 7.99/8GB of memory available. Would you mind posting the exact model you have? Sounds really odd. I'm running XP though, so it could be the OS you have withholding resources.

 

My onboard video card is already off. My OS is Windows 7 64-bit.

The model is a GeForce GTX 560 Ti 2GB Palit 3D Graphics Engine.

Is this what you mean?

This is really strange. You should find the firmware version on the sticker of the card and compare it to the ones provided online. I haven't heard of memory sharing since AGP cards were out. I know sometimes, when there's a mismatch in the nVidia firmware, some things are unusable. 1024M/2048M, yeah?
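One way to check whether the full 2048 MB is actually visible to the OS is to compare the total the driver reports against the advertised size. A minimal sketch (Python; the hard-coded sample line stands in for real `nvidia-smi --query-gpu=memory.total --format=csv,noheader` output, so treat the exact invocation and its format as assumptions):

```python
# Sketch: compare driver-reported VRAM total against the advertised size.
# The sample line mimics typical nvidia-smi CSV output ("2048 MiB"); in
# practice you would capture it with subprocess instead of hard-coding it.
def parse_vram_mib(line: str) -> int:
    """Parse a 'NNNN MiB' line into an integer MiB count."""
    value, unit = line.strip().split()
    if unit != "MiB":
        raise ValueError("unexpected unit: " + unit)
    return int(value)

ADVERTISED_MIB = 2048          # GTX 560 Ti 2GB, per the post above

reported = parse_vram_mib("2048 MiB")
if reported < ADVERTISED_MIB:
    print("only", reported, "MiB visible; check for a firmware/driver mismatch")
else:
    print("full VRAM reported")
```

If the reported total comes back short (e.g. 1024 instead of 2048), that would line up with the firmware-mismatch theory above.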
