this post was submitted on 11 Jun 2023
57 points (100.0% liked)

"Intel’s Arc A770 and A750 were decent at launch, but over the past few months, they’ve started to look like some of the best graphics cards you can buy if you’re on a budget. Disappointing generational improvements from AMD and Nvidia, combined with high prices, have made it hard to find a decent GPU around $200 to $300 — and Intel’s GPUs have silently filled that gap.

They don’t deliver flagship performance, and in some cases, they’re just straight-up worse than the competition at the same price. But Intel has clearly been improving the Arc A770 and A750, and although small driver improvements don’t always make a splash, they’re starting to add up."

top 17 comments
[–] TendieMaster69@sh.itjust.works 11 points 1 year ago (1 children)

Glad to see more competition in the GPU market, but the Arc cards are still not cheap enough to really take away much market share.

UserBenchmark still has the 3060 Ti +16% over the A770, and on the market right now the 3060 Ti is cheaper than the A770.

https://gpu.userbenchmark.com/Compare/Intel-Arc-A770-vs-Nvidia-RTX-3060-Ti/m1850973vs4090

[–] AllHailTheSheep@sh.itjust.works 19 points 1 year ago (1 children)

I don't know much about the A770's performance, so I can't say whether that's right, but I wouldn't trust UserBenchmark at all. They favor Nvidia massively and their ratings are super inaccurate. Here's an article on some of it: https://www.gizmosphere.org/stop-using-userbenchmark/

[–] TendieMaster69@sh.itjust.works 6 points 1 year ago* (last edited 1 year ago) (1 children)

I don't trust anyone saying UserBenchmark is biased without their own data to back up the claim. Using Reddit drama as an excuse not to use a tool is weak.

The article you posted claims this:

"However, consider this: UserBenchmark mentions the NVIDIA GeForce GTX 1660 Super in their GPU section as faster than the Radeon RX 5600 XT."

This claim in the article is factually incorrect at the current time.

https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1660S-Super-vs-AMD-RX-5600-XT/4056vs4062

[–] mantaba@sh.itjust.works 10 points 1 year ago* (last edited 1 year ago) (1 children)

Sounds unfair to criticize an article from 2021 for not being up to date with UserBenchmark's ever-changing metrics.

The point stands. UserBenchmark has announced and made changes to its own metric calculations because Ryzen CPUs were getting better scores than Intel's. It has a very clear anti-AMD stance, evident in its written reviews and in linked videos like the one in your screenshot.

Just from this comparison, I have no clue how UserBenchmark arrived at the 8% given the values it posts right below. 8% is also reasonably below what other reviewers measured at the time: https://www.techpowerup.com/review/sapphire-radeon-rx-5600-xt-pulse/27.html https://www.techspot.com/review/1974-amd-radeon-rx-5600-xt/

All this is to say that they are pretty clearly biased and do not shy away from altering their metrics to favour one product over another.

[–] TendieMaster69@sh.itjust.works 2 points 1 year ago (1 children)

> Sounds unfair to criticize an article from 2021 for not being up to date

An article from 2021 that hasn't been updated is by definition not up to date. Neither of the articles you posted even has the 3060 Ti on its list. Nothing you wrote actually debunks my refutation.

[–] mantaba@sh.itjust.works 12 points 1 year ago (1 children)

You attacked the credibility of the article just for having some out-of-date information. Just because something it references is out of date does not mean the article is less relevant. Those things still happened, and they should weigh on how you view UserBenchmark as a source of information.

I am replying to your UserBenchmark defense. Of course the articles I posted have nothing to do with the 3060 Ti; they were meant to source my claim that even the 8% in the screenshot you posted doesn't seem like an accurate evaluation of these two GPUs.

> Just because something it references is out of date does not mean the article is less relevant.

Yes it does.

@nanoUFO Honestly, I am considering picking one up for Starfield, depending on how they perform there. I'm sure their Linux support isn't incredible, but Nvidia also has a lot of issues on Linux and I've been running my 1060 for years now.

[–] Disaster@sh.itjust.works 7 points 1 year ago (1 children)

oneAPI also looks quite juicy... coming from someone currently suffering under ROCm and refusing to give the green goblin any business.
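
For anyone unfamiliar: the core programming model behind oneAPI is SYCL, which targets Intel GPUs (and, through other backends, AMD and Nvidia hardware too). As a rough taste of it, here is a minimal vector-add sketch; it assumes a working oneAPI/DPC++ toolchain, and the sizes and values are purely illustrative:

```cpp
// Minimal SYCL vector add -- compile with, e.g., icpx -fsycl vadd.cpp
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
    const size_t n = 1024;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

    sycl::queue q; // default device: an Arc GPU if one is available

    {
        // Buffers wrap the host vectors for the duration of this scope.
        sycl::buffer<float> bufA(a), bufB(b), bufC(c);

        q.submit([&](sycl::handler& h) {
            sycl::accessor A(bufA, h, sycl::read_only);
            sycl::accessor B(bufB, h, sycl::read_only);
            sycl::accessor C(bufC, h, sycl::write_only);
            h.parallel_for(sycl::range<1>(n), [=](sycl::id<1> i) {
                C[i] = A[i] + B[i];
            });
        });
    } // buffer destructors wait for the kernel and copy results back

    std::cout << "c[0] = " << c[0] << " (expected 3)\n";
    return 0;
}
```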

[–] nottheengineer@feddit.de 2 points 1 year ago (2 children)

If you can bear the terrible drivers, consider a used Nvidia card. They can be decent deals for gaming as well.

[–] xontinuity@sh.itjust.works 2 points 1 year ago

This is so the way. Using a used Tesla P40 in a Linux server for AI stuff. Card goes hard.

[–] planish@sh.itjust.works 2 points 1 year ago (1 children)

What happened to the drivers for the old cards to make them bad?

[–] nottheengineer@feddit.de 4 points 1 year ago

Crashes, broken adaptive sync, general display problems and, most importantly, stutter. I'm running a version from about a year ago on my 1070 Ti because every time I try to update, some game starts to stutter and I get to use DDU (Display Driver Uninstaller) and try multiple versions until I find one that doesn't have that problem.

About 2-3 weeks ago, an update also worsened LLM performance by a lot on 30- and 40-series cards. There were a lot of reports on Reddit; not sure if they've fixed it yet.

My default advice for any issue on r/techsupport that could be Nvidia driver related has been to DDU and install a version from 3-6 months ago, and that has worked shockingly well.

That reminds me, have the r/techsupport mods migrated to lemmy yet? Their explanation of the whole reddit issue was great, so I don't think they'll want to stay on there.

Anyways, back to the topic. Since OP also mentioned ROCm, I'm assuming he uses Linux for that. The Nvidia drivers on Linux are pretty much unusable because of all the glitches and instabilities they cause; Nvidia is a giant meme in the Linux community because of it.

[–] MrScottyTay@sh.itjust.works 6 points 1 year ago (1 children)

I really want an energy-efficient variant next time around. I currently have a 1050 Ti, and when I upgrade I sort of want something that's noticeably better but with less wattage if possible.

[–] TendieMaster69@sh.itjust.works 4 points 1 year ago (1 children)

This article has a graph of GPU total power (watts): https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html

It would be interesting to see a ratio of frames per watt for power efficiency.
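
For illustration, that ratio is trivial to compute from any review's numbers; the cards and figures in this sketch are made up, not taken from the Tom's Hardware chart:

```cpp
// Frames-per-watt sketch with placeholder numbers (not measured data).
#include <cstdio>

struct GpuSample {
    const char* name;
    double avg_fps;       // average FPS in some fixed benchmark
    double board_power_w; // typical board power draw in watts
};

int main() {
    const GpuSample samples[] = {
        {"Card A", 90.0, 225.0}, // hypothetical values
        {"Card B", 75.0, 150.0},
    };
    for (const auto& s : samples) {
        // FPS divided by watts is frames per joule: higher is more efficient.
        std::printf("%s: %.1f FPS / %.1f W = %.3f frames per watt\n",
                    s.name, s.avg_fps, s.board_power_w,
                    s.avg_fps / s.board_power_w);
    }
    return 0;
}
```

Plugging in the average FPS and board power figures from any review would give exactly that ranking.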

[–] MrScottyTay@sh.itjust.works 3 points 1 year ago

Thanks for that, that's good info. It's good to know the Arc GPUs are trending towards the bottom half of the graph. It would be very cool to see a frames-per-watt graph.

[–] BirdLaw@sh.itjust.works 2 points 1 year ago

Can you use it for CUDA applications?
