This is an archive of the mabination.com forums, which were active from 2010 to 2018. You cannot register, post, or otherwise interact with the site other than browsing the content for historical purposes. The content is provided as-is, from the last backup of the database, taken in 2019. Image and video embeds are disabled on purpose and represented textually, since most of those links are dead.
To view other archive projects go to
https://archives.mabination.com
-
Evaris wrote on 2013-08-07 07:31
Quote from RebeccaBlack;1132785:
I'd say that yeah, for that long of a range. 2020 is quite a ways away though ;__;
I think he'll like it.
As stated, 4-6 years with minimal upgrades is what I make recommendations for. So basically it needs to hold up from now until just before 2020, if possible. I do my best to recommend that within the stated budget; that's just my personal policy unless the OP states a different intended time frame of usage.
-
ikamiru wrote on 2013-08-07 09:41
I still prefer gaming laptops for LAN-Parties. JS.
-
MareneCorp wrote on 2013-08-09 02:56
Quote from RebeccaBlack;1132785:
I'd say that yeah, for that long of a range. 2020 is quite a ways away though ;__;
I think he'll like it.
^.^
Quote from Evaris;1132793:
As stated, 4-6 years with minimal upgrades is what I make recommendations for. So basically it needs to hold up from now until just before 2020, if possible. I do my best to recommend that within the stated budget; that's just my personal policy unless the OP states a different intended time frame of usage.
I'm hoping for about that time range, yes. Anything above 2-3 years is really good. I also wouldn't want to risk blowing up my computer every time I needed to swap something out after a failure.
Quote from ikamiru;1132873:
I still prefer gaming laptops for LAN-Parties. JS.
I don't do LAN parties. I like having power in one place and not something I'd drag around everywhere I go.
-
Yoorah wrote on 2013-08-09 03:56
IMO it is flawed to go the AMD route on the basis of future proofing. Especially if you use consoles as an example. Key point being that consoles are using low-power CPUs meant for tablets. PC games will likely not be optimized for AMD's "octo cores" for a long time to come, and by the time they are, the per-core performance will be inadequate anyway.
There's also future computing trends to keep in mind; the shift to mobile computing. It's more likely that games will get optimized for the more energy efficient CPUs you have in laptops, with no more than 4 cores.
And for the cost conscious it should also be noted that an AMD CPU uses a lot more power than a more powerful Intel CPU, especially at full load. Depending on how much you use your PC every day, over a span of 5 years you could have a significant added cost on the AMD side.
-
Evaris wrote on 2013-08-09 04:39
Quote from Yoorah;1133700:
IMO it is flawed to go the AMD route on the basis of future proofing. Especially if you use consoles as an example. Key point being that consoles are using low-power CPUs meant for tablets. PC games will likely not be optimized for AMD's "octo cores" for a long time to come, and by the time they are, the per-core performance will be inadequate anyway.
There's also future computing trends to keep in mind; the shift to mobile computing. It's more likely that games will get optimized for the more energy efficient CPUs you have in laptops, with no more than 4 cores.
And for the cost conscious it should also be noted that an AMD CPU uses a lot more power than a more powerful Intel CPU, especially at full load. Depending on how much you use your PC every day, over a span of 5 years you could have a significant added cost on the AMD side.
I would argue differently. There is no reason they would reduce the thread optimization from the consoles to PCs; that would just be a step backwards and make no sense when the instruction set commands are -exactly- the same and require nothing to be reprogrammed for the CPU. Also, that is just what we are seeing -now-; there is no reason we might not see more cores in mobile. As far as power consumption goes, by US averages, comparing full-load power consumption at stock clocks between the FX-8320 and a Haswell i5, you're looking at the FX chip spending about $15 more on electricity over a year, if the PC runs at full power usage for 12 hours out of the day, so... not really a big deal there, given average use will be about half that.
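As a rough sanity check on that figure, here's a minimal sketch of the arithmetic in Python. The ~30 W average full-load difference and the ~$0.12/kWh US residential rate are assumptions for illustration, not measurements from the thread:

# Rough annual electricity-cost difference between two CPUs.
# The wattage delta and the price per kWh are assumed values,
# not measured figures from this thread.
def annual_cost_difference(watts_delta, hours_per_day, usd_per_kwh):
    kwh_per_year = watts_delta / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# ~30 W higher average draw, 12 h/day at full load, ~$0.12/kWh:
print(round(annual_cost_difference(30, 12, 0.12), 2))  # ~15.77
# Average use closer to half that duty cycle roughly halves it:
print(round(annual_cost_difference(30, 6, 0.12), 2))   # ~7.88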
-
RebeccaBlack wrote on 2013-08-09 07:28
Quote from Yoorah;1133700:
IMO it is flawed to go the AMD route on the basis of future proofing. Especially if you use consoles as an example. Key point being that consoles are using low-power CPUs meant for tablets. PC games will likely not be optimized for AMD's "octo cores" for a long time to come, and by the time they are, the per-core performance will be inadequate anyway.
There's also future computing trends to keep in mind; the shift to mobile computing. It's more likely that games will get optimized for the more energy efficient CPUs you have in laptops, with no more than 4 cores.
And for the cost conscious it should also be noted that an AMD CPU uses a lot more power than a more powerful Intel CPU, especially at full load. Depending on how much you use your PC every day, over a span of 5 years you could have a significant added cost on the AMD side.
Part of this was what I was arguing, except I didn't consider mobiles so much. And the extra power draw really does not matter. It's pennies. At the end of the year you'll have maybe paid a few dollars more which is like nothing. Power draw is also overrated, IMO, at least in places with reasonable electricity bills? I dunno, it's bound to be different everywhere so it might not be in your location.
Quote from Evaris;1133711:
if the PC runs at full power usage for 12 hours out of the day...
And this is never even close to happening unless someone is rendering something all day long every single day forever. I'd bet people wouldn't even average 30% in a 12 hour day unless they're really pushing it for something in particular.
-
MareneCorp wrote on 2013-08-13 22:10
Quote from RebeccaBlack;1133754:
Part of this was what I was arguing, except I didn't consider mobiles so much. And the extra power draw really does not matter. It's pennies. At the end of the year you'll have maybe paid a few dollars more which is like nothing. Power draw is also overrated, IMO, at least in places with reasonable electricity bills? I dunno, it's bound to be different everywhere so it might not be in your location.
And this is never even close to happening unless someone is rendering something all day long every single day forever. I'd bet people wouldn't even average 30% in a 12 hour day unless they're really pushing it for something in particular.
I doubt I'm going to be pushing anything to the brink of the computer's limit and electric costs aren't a problem, so you would suggest AMD over Intel?
Quote from Compass;1132707:
Do you want this to be you?
Also answer my question ;___;
Wow I'm blind. What question? :P
-
RebeccaBlack wrote on 2013-08-14 18:35
AMD and Intel both make good products.
It's really easy to use a good CPU or GPU. Just get a game and turn up the settings. "Using" a CPU/GPU isn't that hard, because most games (such as Vindictus) are so poorly optimized that they'll max out what they can get from the CPU despite not actually using 100% of it. And better CPUs do make these things perform better, just not as much as they should. Same goes for GPUs. So like, how much you'll actually use can depend as much on the software as on the hardware. This is primarily what we're talking about when we talk about how it's going to be utilized. If it isn't being fully utilized (and hardware almost never is, regardless of how good or bad it is), then some of it is wasted. The thing is, we can't really predict how things are going to be in the distant future.
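To see that "maxing out what it can get without using 100%" in practice, here's a minimal sketch that samples per-core load while a game is running. It relies on the third-party psutil package, which is an assumption of mine, not something mentioned in the thread:

# Samples overall and per-core CPU load for ten seconds.
# Requires the third-party psutil package (pip install psutil).
import psutil

for _ in range(10):
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    total = sum(per_core) / len(per_core)
    # A poorly threaded game tends to peg one or two cores while
    # the overall figure stays well below 100%.
    print(f"total {total:5.1f}%  per-core {per_core}")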
-
Evaris wrote on 2013-08-14 20:16
Quote from RebeccaBlack;1135676:
AMD and Intel both make good products.
It's really easy to use a good CPU or GPU. Just get a game and turn up the settings. "Using" a CPU/GPU isn't that hard, because most games (such as Vindictus) are so poorly optimized that they'll max out what they can get from the CPU despite not actually using 100% of it. And better CPUs do make these things perform better, just not as much as they should. Same goes for GPUs. So like, how much you'll actually use can depend as much on the software as on the hardware. This is primarily what we're talking about when we talk about how it's going to be utilized. If it isn't being fully utilized (and hardware almost never is, regardless of how good or bad it is), then some of it is wasted. The thing is, we can't really predict how things are going to be in the distant future.
I would argue the prediction point. The past generation of PC games was largely based on console ports, so we can look to the next generation of consoles as the baseline, since the next generation of games will be built on the same CPU architecture either way. With that in mind, we can look directly at core count (eight in both the M$ and Sony consoles). Current high-end games are already being optimized for 6-8 threads (Battlefield 3, Crysis 2, Crysis 3, MechWarrior: Online, as some examples), and over the last four years "mainstream" games have increased their thread optimization from 2 to 4 threads (Skyrim remaining the main exception, still only optimized for two cores).
So to say that mainstream games reaching 6-8 thread optimization over the average lifespan of a desktop PC (4-6 years) could not be predicted seems a little odd to me.
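As a concrete illustration of what "optimized for 6-8 threads" means, here's a minimal sketch that splits a made-up CPU-bound workload across a fixed number of workers; the workload, numbers, and function names are invented for illustration, not taken from any engine:

# Toy example of spreading CPU-bound per-frame work across N workers.
# Everything here is invented for illustration.
from concurrent.futures import ProcessPoolExecutor

def simulate_chunk(entities):
    # Stand-in for real per-frame work: physics, AI, animation, etc.
    return sum(i * i for i in range(entities))

def simulate_frame(total_entities, workers):
    chunk = total_entities // workers
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(simulate_chunk, [chunk] * workers))

if __name__ == "__main__":
    # A title "optimized for 8 threads" keeps 8 such workers busy;
    # one optimized for 2 leaves the remaining cores mostly idle.
    print(simulate_frame(8_000_000, workers=8))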
-
RebeccaBlack wrote on 2013-08-14 20:35
The thing is, a lot of our games don't even properly use 4 right now. Hell, some of them don't even properly use one!
I get what you mean, but eh. The thing is, PCs drastically overpower consoles. I'd like to think that no matter what, we're going to end up with one of two things:
- A poorly optimized console port, regardless of cores, which doesn't need much CPU power anyway, or
- A game optimized for PCs, which may eventually use more than 4 cores but likely won't, seeing as how we're doing a mediocre job at using 4 so far
I'm just saying! Usually a game ends up being either a shit port with no reqs (that still runs terribly) or something that needs all the power it can get, in any form it can get it. It's definitely going to be more convenient to port over now, but it's still going to require no CPU power because consoles require no CPU power.
-
Compass wrote on 2013-08-15 04:23
I can't believe there are people in this thread who think PC games won't require better hardware in a few years.
Welp, I now know whom I shouldn't listen to for tech-related things; not that it matters, since I only listen to Evaris anyways.
-
RebeccaBlack wrote on 2013-08-15 14:29
Do you actually have anything useful to say, ever, or are you just going to complain about everyone else without adding anything? Seriously, what's with your attitude lately? You went off on Snowie too, for absolutely no reason, over something he wasn't even around for, and he's the nicest fucking person on here other than maybe Phunkie or Iljimae or something.
I'm sorry, but I'm just so tired of you being like this. No one cares about your fucking feelings about the matter; say something in return or stop pointing at people to incite responses like this. Eventually, yeah, you'll get them, because you keep provoking people without actually making any arguments at all.
I'll gladly take the infraction.
-
Compass wrote on 2013-08-15 15:11
Uhh what?
What do you want me to say? Evaris has probably said everything that needed to be said.
Games are taking advantage of more cores, and as Evaris said, games like Far Cry 3, for example, run better on AMD CPUs than on Intel's. Next-gen consoles will be running AMD hardware with x86 architecture, so we should be seeing games that run better on AMD CPUs as opposed to Intel ones, with better ports and core optimization.
A poorly optimized console port, regardless of cores, which doesn't need much CPU power anyway or
And why do you think it's poorly optimized? Are current consoles using x86 CPUs? No? Welp.
A game optimized for PCs, which may eventually use more than 4 cores but likely won't, seeing as how we're doing a mediocre job at using 4 so far
What am I hearing?
As quad+ core computers become more common, so will the optimization.
By your logic nothing would have advanced, because when things start off they're done terribly.
Remember when games first used 3D graphics and looked like crap? Yeah, seems like we should have stuck with 2D games because they're so terrible now, right?
-
RebeccaBlack wrote on 2013-08-15 16:05
I never said quad core optimization wouldn't improve. Of course it will. But in a PC market where people are still largely using quad cores, and will be for the foreseeable future (AMD isn't exactly dominating in CPU market share among gamers), PC games have no reason to use 8 cores unless they're console game ports. And console games, well, they're quite underpowered, really. They tend to get some graphical enhancements when coming to the PC, but not enough to really make them substantially more demanding. Consoles are running at such a low frequency that any CPU that's anywhere in the ballpark of decent would have enough power, 8 cores or not. We always have to assume the PC is an afterthought and the consoles are the primary target. Meanwhile, Intel packs a ton of power into their CPUs with hyperthreading, even if they have fewer cores, putting them in a position where they effectively operate like they have more cores than they do. It's far more likely that intensive (read: non-console port) games will receive hyperthreading support before they'll use 8 cores.
-
Evaris wrote on 2013-08-15 16:18
Quote from RebeccaBlack;1136148:
I never said quad core optimization wouldn't improve. Of course it will. But in a PC market where people are still largely using quad cores, and will be for the foreseeable future (AMD isn't exactly dominating in CPU market share among gamers), PC games have no reason to use 8 cores unless they're console game ports. And console games, well, they're quite underpowered, really. They tend to get some graphical enhancements when coming to the PC, but not enough to really make them substantially more demanding. Consoles are running at such a low frequency that any CPU that's anywhere in the ballpark of decent would have enough power, 8 cores or not. We always have to assume the PC is an afterthought and the consoles are the primary target. Meanwhile, Intel packs a ton of power into their CPUs with hyperthreading, even if they have fewer cores, putting them in a position where they effectively operate like they have more cores than they do. It's far more likely that intensive (read: non-console port) games will receive hyperthreading support before they'll use 8 cores.
What you don't seem to understand is that all hyperthreading does is create "virtual" cores, which the OS and games see as the same thing as a physical core in most respects. Thus, 8-thread/core optimization should apply the same either way on this point.
And again I ask: why would games have their thread optimization reduced from consoles to PC, given that most "AAA"-level titles will be cross-platform?
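To make the "virtual cores" point concrete, here's a minimal sketch of how the OS reports logical versus physical cores. It uses the third-party psutil package, which is my assumption for illustration, not something referenced in the thread:

# Physical vs logical ("virtual") core counts as the OS reports them.
# psutil is a third-party package: pip install psutil
import os
import psutil

physical = psutil.cpu_count(logical=False)
logical = os.cpu_count()  # includes hyperthreaded logical cores

print(f"physical cores: {physical}")
print(f"logical cores:  {logical}")
# On a hyperthreaded Intel quad core this prints 4 and 8; software
# asking "how many cores?" typically sees the logical count.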