Pagafyr Posted March 12

Thx Werne, I only delved into the exciting news and thought about buying one of the two Star Wars video cards back then. The Jedi one is what I had on my birthday list that year.
Werne Posted March 12

Ah, you mean the Titan Xp Collector's Edition? They looked amazing but were waaay overpriced, one of those cards cost more than my last PC and current one combined. The Titan Xp was already too expensive for what it was (essentially a 1080 Ti with a different name) and the Collector's Edition Titans had a premium on top of that. Still didn't stop me from wanting the Sith one.

By the way, it's kinda sad that 8 years ago $1000+ Titans were seen as an overpriced luxury product and now we have high-end gaming cards in the $1200-1500 range.
Pagafyr Posted March 13

Only a year ago Nvidia produced a 4090 for gamers. I about went through the ceiling when I saw the price tag. $4000.00. I still have the Jedi Titan Xp with the AI enhancement. That was one birthday present I didn't expect I would get. That Nvidia Jedi Titan Xp works for games that are reported as needing at least an Nvidia 3070 to get the best performance.

Edit: I'm not absolutely sure, but I think the reason it is as good as it is, is because of the way the AI handles it.
Werne Posted March 14

I think you might be a tad confused there, the Titan Xp is Pascal architecture (it's what the p stands for in the Xp) and Pascal doesn't have tensor cores, nor does it support any of the AI stuff. It can't even do hardware ray tracing and has to do it through software, the way older AMD cards did. It was Volta with the Titan V that introduced tensor cores and started the whole AI thing, and the RTX 20 series that brought it to gaming cards.

In terms of performance your Titan should be similar to my RX 6650 XT, so yeah, it does still perform well. High-end cards in general can go for quite a long time before they become the bare minimum spec, they usually get killed off by lack of driver support. If you play at 1080p it'll be perfectly fine for a few more years, especially with frame generation tech appearing in new games, that stuff is just pure magic.

And now I win.
Pagafyr Posted March 14

Thx! I keep advertising. I should probably sell SAMS. I could sell a thousand electronic blueprints. Would you like a SAMS for the next generation of Nvidia video cards that should be available legally in the 1st Qtr of 2025?

Did you study the Nvidia 4090? Would you buy one?

I WIN!
Werne Posted March 15

Buy a 4090? Never. It's way too expensive, with ridiculous power consumption and heat output, that stupid 12VHPWR connector with the dongles that love to melt, huge size, sagging issues and some AIB PCBs cracking. And then there are other things that put me off, like GeForce Experience requiring a login, a proprietary software/hardware ecosystem, my poor personal experience with Nvidia drivers and their crap customer support, Nvidia's shady business practices and, worst of all, overclocking on the RTX 4000 series being more frustration than fun. The GTX 1060 6GB (RIP, it will be missed) is the last Nvidia card I've owned and it will stay the last one until they start making hardware that comes with fewer cons than pros and start treating their customers and AIBs with some respect.

Anyway, Starfield got really, really buggy with the new updates. I haven't played it in about a month and while the performance has improved since then, the game itself got buggier. Ah well, it is a Bethesda game after all.
Pagafyr Posted March 15 Share Posted March 15 (edited) Starfield had a promotion to buy a video card to meet its requirements that was kind of pricey. I turned my NMS Star ship around and headed for the Space Anomaly instead so I was Out of Atlas'es sight through those sentinels eyes while I watched as the others posted their wondrous discoveries and compared everything the Adverts hyped with reality. What do you have to share about the MoBo built in video cards? I just turn the MoBo built in on any new replacement computer Off the minute I am signed in. Seemed some people think they support rather than interfere with RAM and VRAM when swapping? Might be messing the GPUs up too instead of supporting? Without getting Indiana Jones out of retirement to see if he can find my work room filled with all the stuff for electronic examining like a dual Oscilliscope, a monster multi-meter, and such buried under a pyramid of dustmites holding hands protecting their dustmite eggs, maybe some termits too; I'm not going that deep into the pile of dust covering everything. Edited March 15 by Pagafyr Under the spell of Atlas, I am. Link to comment Share on other sites More sharing options...
Werne Posted March 16

For me the game was free thanks to the AMD promotion, got two codes with my hardware. The performance did improve quite a bit in the past month but the bugs are standard Bethesda stuff, physics spazzing out and random crashes when opening/closing a menu, I can live with those. The gameplay is meh but the basic framework to make a decent game is there, it just needs a lot of work. It also needs a fix to the inventory system cause holy crap it's bad. Luckily, it is a Bethesda game so mods are available, if the devs won't make it good the modders likely will, provided they stick around in the first place.

As far as integrated graphics go, I can only think of three reasons to use them - low power video transcoding (slower than a real GPU though), driving large multi-monitor setups and diagnosing a GPU failure, otherwise they just take up RAM for no good reason. Some mobos actually allow you to set the iGPU as disabled unless there's no graphics card present, which is kinda the best option in my opinion, it's there when you need it but gone when you don't. The iGPU doesn't support RAM/VRAM or the dedicated card in any way. That said, using an Intel iGPU for transcoding streams was once a viable option when streaming games cause it lightened the load on the GPU/CPU, which can kinda be called supporting a GPU I guess. Kinda useless nowadays since pretty much all cards have dedicated hardware encoders.

Anyway, I win.
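A minimal sketch of the kind of iGPU-offloaded transcode described above, assuming an ffmpeg build with Intel Quick Sync (QSV) support; the helper name and file names are placeholders for illustration, not anything from the thread:

```python
# Sketch: re-encode a recording on the Intel iGPU via ffmpeg's Quick Sync (QSV)
# path, leaving the CPU and the discrete GPU free for the game.
# Assumes ffmpeg was built with QSV support and the iGPU is enabled.
import subprocess

def transcode_on_igpu(src: str, dst: str, bitrate: str = "6M") -> None:
    """Re-encode src to H.264 using the iGPU's hardware encoder."""
    cmd = [
        "ffmpeg",
        "-hwaccel", "qsv",      # decode on the iGPU where possible
        "-i", src,
        "-c:v", "h264_qsv",     # encode with Quick Sync instead of x264 on the CPU
        "-b:v", bitrate,
        "-c:a", "copy",         # pass the audio through untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # Placeholder file names.
    transcode_on_igpu("gameplay_raw.mkv", "gameplay_qsv.mp4")
```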
Pagafyr Posted March 16

I just finished watching the last TV show linked to The Sandman collection. After watching the last show, about an author who was having trouble writing his next book, I feel exhausted.

Thx for the info. Sort of lucked out then by choosing to turn the MoBo's built-in card off. Now I need a nap. Be well.

I WIN!
Werne Posted March 23

Dragon's Dogma 2 appears to be quite a mess, I was thinking of getting it so that kinda sucks.

Ah well, I win.