
Recommended Posts

Posted

+1 on contrast ratio.

not too worried about color accuracy though, as the baseline on current models is already pretty good. viewing angles are also decent across the board.

instead i would look at screen reflectivity, particularly if you can't control the ambient lighting and light sources in the room. a lot of screens still opt for a glossy surface, which enhances brightness but is pretty damn annoying when room lights to the side or behind you get reflected on the screen, especially when watching dark scenes.

the other thing i check is the evenness of the LED/LCD backlighting. either put up a black test screen or just turn on the TV without any source inputs. the screen should be completely black except maybe for a couple of dialog boxes, and you can then check for light leaks, particularly at the corners and edges, as well as for patches that are noticeably brighter (i.e. dark grey instead of completely black).

you can also do the same test to check for stuck/bright pixels. usually the delivery guys and their tech support will use very bright scenes to show you that the set is fine; unfortunately, it's only when you watch your first dark scene that you notice stuck pixels. mine had a single pixel stuck on red, unfortunately on my biggest tv, but fortunately close to the edge where it's less bothersome. it still annoys me occasionally because i know it's there, and i would've much preferred a dead pixel to a stuck one.
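
if you'd rather bring your own patterns than rely on the TV's no-signal screen, a minimal sketch like this (python with the Pillow library, assuming a 1080p panel; the filenames are just made up) will spit out solid-color slides you can load onto a USB stick:

```python
# Minimal sketch: generate solid-color test images for checking backlight
# uniformity (black) and stuck/dead pixels (primaries and white).
# Assumes a 1080p panel and the Pillow library (pip install Pillow).
from PIL import Image

PATTERNS = {
    "black": (0, 0, 0),        # backlight uniformity / light-leak check
    "red":   (255, 0, 0),      # a stuck subpixel stands out on the solid primaries
    "green": (0, 255, 0),
    "blue":  (0, 0, 255),
    "white": (255, 255, 255),  # dead (always-off) pixel check
}

for name, rgb in PATTERNS.items():
    Image.new("RGB", (1920, 1080), rgb).save(f"test_{name}.png")
```

play the black one full screen for the uniformity check, and cycle through the solid colors for stuck or dead pixels.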

Posted

120hz/240hz/600hz has nothing to do with decreasing gaming input lag. a plain 60hz tv can have just as little lag. that's all marketing speak.

in fact, a 120hz-240hz set can increase input lag if the motion-smoothing feature those panels were made for is engaged.

what 120hz refresh rate is for:

in terms of sports, the issue the tv marketers claim to have solved is motion judder. 120hz-240hz sets can engage a "smooth motion" post-processing gimmick that doubles the refresh rate and creates the illusion of getting rid of judder in slow panning scenes and motion blur in fast scenes, which is marketed at sports watchers. it's really garbage and makes the picture look unnatural. 99% of videophiles do not use it, including the editors at cnet. (it's ok for sports, but you don't need it)

120hz is useless unless you plan to use the artificial smooth motion gimmick (the soap opera effect, which also increases lag).

90% of people don't use it.

marketers will also try to sell you on the idea that a higher refresh rate eliminates motion blur. most modern tv's don't have to worry about ghosting (slow pixel response on lcd's), as that was an issue on lcd hdtv's in their infancy.

the truth is the motion judder/blur we all see from slow/fast camera pans in sports is mostly from the original broadcast source material and not an actual limitation of your tv. but marketers lie about this to sell tv's.

600hz is a term used for plasmas, not LCD's, and it means practically nothing (it refers to the plasma's sub-field drive, not an actual refresh rate). marketers came up with it to combat lcd advertising.

i'll try to break it down to simplify and cut through the BS tv makers try to sell you:

LCD/LED

60hz = the standard refresh rate tv is broadcast in. this is perfectly fine for 95% of viewing material.

120hz = lcd makers doubled the refresh rate to enable the artificial smooth motion gimmick. (it artificially adds interpolated frames to give an illusion of smooth playback on slow camera pans, masking judder that actually comes from the broadcast source; see the toy sketch after this breakdown)

240hz = what lcd makers needed to do 3d with smooth motion (120hz split in two, one half per eye); otherwise there's little or no benefit in getting 240hz

Plasma

600hz = has nothing to do with any of this. the term was created to combat the 120hz-240hz craze
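
just to picture what "artificially adding frames" means, here's a toy sketch. it's a crude 50/50 blend, nowhere near the motion-vector interpolation real sets do, but it shows synthetic frames being slotted in between the real ones:

```python
# Toy illustration (not how real motion interpolation engines work): doubling
# the frame rate by inserting a blended in-between frame after each real one.
import numpy as np

def naive_double_rate(frames):
    """frames: list of HxW (or HxWxC) numpy arrays at the source rate."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(((a.astype(np.float32) + b) / 2).astype(a.dtype))  # synthetic frame
    out.append(frames[-1])
    return out

# 4 source "frames" become 7 after interpolation
src = [np.full((2, 2), v, dtype=np.uint8) for v in (0, 60, 120, 180)]
print(len(naive_double_rate(src)))  # 7
```

the made-up in-between frames are what give you the soap opera effect, and computing them is where the extra input lag comes from.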

why is there judder in films?

movies are filmed at 24fps.

the tv broadcast standard (NTSC) is 60hz.

so films have to be converted to fit 24 frames into 60 by repeating frames in a 3:2 pattern (3:2 pulldown), which causes an uneven cadence between frames. so marketers came out with gimmicks to "eliminate" those issues.
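
here's what that 3:2 cadence looks like in a minimal sketch (treating the 60hz output as 60 fields per second):

```python
# Minimal sketch of 3:2 pulldown: mapping 24fps film frames onto a 60Hz
# (60 fields/sec) signal by holding each film frame for 3 fields, then 2,
# alternating. The uneven 3-2-3-2 hold times are the judder you see on pans.
def three_two_pulldown(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2   # alternate 3 and 2
        fields.extend([frame] * repeats)
    return fields

film = ["A", "B", "C", "D"]      # 4 film frames = 1/6 second at 24fps
print(three_two_pulldown(film))  # ['A','A','A','B','B','C','C','C','D','D'] = 10 fields = 1/6 s at 60Hz
```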

Plasmas have the smooth motion gimmicks as well.

but their main feature is playing 24fps film content natively instead of through the 60hz conversion. some LCD's do this as well. (this is the option the ps3 offers for tv's capable of accepting 24fps output.) it works for some films, but on some older films it flickers, which bothers people.

48hz doubles 24hz (but creates flicker)

96hz (hdtv makers doubled 48hz again to reduce flicker)

gaming lag is a separate issue. disengaging all the gimmicks reduces lag (which makes the point of having a 120hz or 96hz hdtv for gaming moot).

cnet has an article on disengaging all the post-processing features (or using game mode) to bypass them, which helps gaming lag.

most hardcore 360 fps players use an old 360 w/ vga port into a low-ms pc gaming monitor.

real hardcore fps players will find an old tube crt for zero input lag.

personally i would get an hdtv for the best picture quality and worry about gaming second, as i don't think it's worth the trade-off to get a tv for gaming if it sacrifices in the picture quality dept.

imo, i would look for the best contrast ratio (best black levels) first, then color accuracy, and gaming lag third, in that order.

To begin with, I want to thank you guys for setting me straight on what to look for in a t.v. Additionally, I've learned that the single most important performance metric of a newer t.v. is contrast ratio. However, I read this about contrast ratio: "As mentioned, all manufacturers manufacture their numbers with little basis on reality, so spec sheets are out..." In other words, there's no set standard when it comes to published CR numbers, so I've gleaned that the best thing I can do is go off of in-store demos (which are set up to fail), reviews, and/or recommendations. There's quite a bit to read up on regarding contrast ratios, so I'll see what else I can learn and hopefully come up with a decision sometime this freaking century...

Guest davidwhangchoi
Posted (edited)

In other words, there's no set standard when it comes to published CR numbers, so I've gleaned that the best thing I can do is go off of in-store demos (which are set up to fail), reviews, and/or recommendations. There's quite a bit to read up on regarding contrast ratios, so I'll see what else I can learn and hopefully come up with a decision sometime this freaking century...

agree, definitely demo them... but before going to a store, definitely research them instead of going in blind. after reading reviews and recommendations of particular models in your price range, jot down three or four of the models, and

then go in and see if you can get the remote from the clerk to at least take the picture off of "torch mode" (the mode manufacturers ship sets in to attract customers; it just puts the tv on full-blast brightness so that when people look at it they go wow!, but at home it's unrealistic and will cause headaches). once you take it off that mode, you'll get a more realistic idea of what the picture will look like.

i'm a bit biased towards plasma bc i own a Pioneer Kuro Pro-151fd, which to videophiles is the "Holy Grail" of tv's. cnet's review of it:

http://reviews.cnet.com/flat-panel-tvs/pioneer-elite-kuro-pro/4505-6482_7-33002556.html

the only tv that comes close is the panasonic VT-60 plasma (which is on sale right now).

but even their entry-level ST series, which is in your budget range, is very, very capable.

but you have to first decide whether you want an LCD or plasma knowing the trade offs.

Plasma daytime viewing is still ok, but not as good as an lcd. on the other hand, an LCD in the under-$1000 range cannot touch the black levels offered by a panny at the same budget, hands down, no contest.

there are some good lcd's out there but you have to pay a whole lot more just to get even close to the performance of a panny.

my recommendation would be to go on cnet and look at all the reviews of plasma and lcd

David Katzmaier of cnet is very accurate with his reviews and is also a certified calibrator of hdtv's himself. here's a link to what he thinks are the best tv's

http://reviews.cnet.com/best-tvs/

once you get a specific model or a few in mind,

go over to the AVS (audio visual science) forums and look on the owners thread for the particular models you are interested in.

David Katzmaier is also a member and goes over there for info himself. it's very overwhelming, but if you read through it you will find every flaw for that particular model, and it's a 100% valuable resource. very similar to the MW threads on macross toys: these guys will find any and every flaw for the tv you're interested in, as well as the best picture settings for it.

they will also reveal the true contrast ratio for the tv you have in mind.

they also have their own reviews for tvs

here's an example (panny ST owners thread)

http://www.avsforum.com/t/1450484/official-panasonic-tc-pxxst60-series-thread

the last resource i look at is:

every year the top hdtv calibrators go to a shootout where the best of the best hdtv's compete in every category to see which is the best hdtv overall for that year. every top tv manufacturer (sony, samsung, panasonic, LG, etc) brings their flagship LCD/LED and plasma hdtv's, and the top hdtv gurus test them. (the top guru right now is DNICE from the AVS forums; he's the authority on judging hdtv performance.)

here's a link to the results of this year's:

the guy on the right in the yellow shirt is DNICE

http://www.soundandvision.com/content/value-electronics-hdtv-shootout-and-then-there-were-three

they always have the pioneer kuro (KRP-500M is the same as the 1x1fd series without a tv tuner) as a reference each year to see if any current gen tv can finally dethrone it from being king of the hill.

i'm fairly certain you'll get a better idea of what you want after looking at these resources.

once you demo it and know what you want, then looking for a deal is your last step. if you find a particular model you set your sights on, just pm me and i'll help you find the best deal possible for that particular model.

good luck!

Edited by davidwhangchoi
Posted (edited)

In looking at my GPU choices for my build, I've been getting different opinions on nvidia vs. AMD. I'm a fan of DCS and flight sims in general, and a lot of folks from that community have told me to steer towards nvidia over amd, while folks playing BF3 and 4 seem to like AMD. Right now, I'm caught looking at the GTX 660 and the R9 270X (which is just a rebranded 7870 from what I've read); both priced around the same. Are there certain types of games nvidia chips perform better with and vice versa? Sorry to ask such a silly question.

Edited by Shadow
Posted

In looking at my GPU choices for my build, I've been getting different opinions on nvidia vs. AMD. I'm a fan of DCS and flight sims in general, and a lot of folks from that community have told me to steer towards nvidia over amd, while folks playing BF3 and 4 seem to like AMD. Right now, I'm caught looking at the GTX 660 and the R9 270X (which is just a rebranded 7870 from what I've read); both priced around the same. Are there certain types of games nvidia chips perform better with and vice versa?

DICE optimized BF4 for AMD chips thanks to EA. However, I haven't heard too much from the Nvidia crowd. Benchmarks for AMD vs. Nvidia on BF4 have only shown about a 7% improvement at most on AMD's side, but Nvidia's 780ti blows away AMD's cards. I'm playing BF4 on a GTX 770 with mostly high settings and it's fine for me. My beef is with the awful netcode, chat lag, and crashing.

Posted

In looking at my GPU choices for my build, I've been getting different opinions on nvidia vs. AMD. I'm a fan of DCS and flight sims in general, and a lot of folks from that community have told me to steer towards nvidia over amd, while folks playing BF3 and 4 seem to like AMD. Right now, I'm caught looking at the GTX 660 and the R9 270X (which is just a rebranded 7870 from what I've read); both priced around the same. Are there certain types of games nvidia chips perform better with and vice versa? Sorry to ask such a silly question.

What's your budget? I was looking into this for a friend awhile back, and we pretty much concluded that AMD is better under $200, Nvidia is better over $200.

Posted

Wow, you really DO have the holy grail of t.v.'s; that thing's been the top of the chain since '08. Anyway thanks again for all of your input and resources; I've learned more from your posts than I have from anywhere else. I'm liking the 'Pan ST-60 even though it's 50% more than I'd like to spend. However, it has very high praise from its owners and CNET, which more than makes it worth the extra expense, IMO.

Honestly, I don't really game on t.v.'s; that's what I have my Toshiba Qosmio X505-898 laptop for. I just "liked" the idea of having a t.v. capable of twitch gaming. Actually, I'm a fan of 2D fighters so having a t.v. capable of supporting my consoles IS important, but honestly I can't remember the last time I played Capcom vs SNK 2, so it may be a moot point. At this point I may stop looking at input lag and just focus on picture quality, which changes the dynamic of my research completely. Question: if I were to link my laptop (that has no issues with lag of any kind) to any t.v. would the "input lag" issue be null and void? I'm assuming that the t.v. would just become a display and wouldn't have to process the actual game signal?

Just get a Panasonic plasma.

I would prefer a plasma, but I can't really control the lighting levels in my living room which might make for a disappointing viewing experience.

Posted

Hello Archer,

I bought a Surface 2 about a month ago and have been enjoying it ever since. I too didn't have a tablet, and I already have several laptops for work and personal use. I did, however, want the ability to do word processing and have the features of a tablet at the same time. It was a great compromise for me because I travel a lot for work, and pulling out a laptop on planes, even in first class, is a royal pain. I use it mainly as a laptop and constantly have it docked with the detachable keyboard. The app store is limited, as you're already aware, but I've been able to find comparable apps for most things I'm into. Hope this helps. Windows 8 wasn't a huge learning curve for me as I already had it on my cell phone.

Posted

Question: if I were to link my laptop (that has no issues with lag of any kind) to any t.v. would the "input lag" issue be null and void?

No. There would still be lag. Every display panel has lag. How much is another question. Just focus on picture quality.

Posted (edited)

i've had no issue with lag connecting my laptop to my LCD TV via HDMI (or more accurately, laptop to AV receiver to TV). as azrael said, there will always be lag, but IMHO it's a non-factor these days. i have yet to meet a gamer who can legitimately claim they frag more because their display was faster. the average human response time is 150-300ms, most displays are under 100ms, and good ones are sub-20ms. seriously, this would make for a good episode of Mythbusters.

what you do have to think about is whether your laptop can output 1080p at decent framerates at detail settings you can live with. if your laptop puts out 1080p then the TV just displays it as is, but if you send the signal at other resolutions, say 720p or any of the other reduced-resolution game settings, then the TV's upconverter will kick in. i haven't noticed any lag with the upconversion, but i mostly play Starcraft and RTS games rather than shooters. the quality of the upconversion, though, will depend on the processing engine onboard the TV and varies from model to model, even within the same brand. poor upconversion means some of the hi-res details will be mushed/softened, but honestly, once you're engrossed in a game, you would hardly notice.

the Pioneer Kuro is truly legendary. my buddy had it in his mancave and it was really a pleasure to watch. i wouldn't buy one myself though as it is overkill for most applications and its advantages diminish in the typical bright living room setting.

edit:

just for fun http://www.humanbenchmark.com/tests/reactiontime/

i suppose sub-100ms is Jedi territory
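
and if you want a quick-and-dirty version without a browser, something like this works (a rough sketch; the terminal and keyboard add their own latency, so don't treat the number as gospel):

```python
# Rough terminal reaction-time tester, just for fun. Terminal I/O adds its own
# delay on top of your reflexes, so this only gives a ballpark figure.
import random
import time

input("press Enter to start, then hit Enter again as soon as you see GO! ")
time.sleep(random.uniform(2, 5))   # random delay so you can't anticipate it
print("GO!")
start = time.perf_counter()
input()
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"reaction time: {elapsed_ms:.0f} ms")
```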

Edited by Major Focker
Guest davidwhangchoi
Posted (edited)

the Pioneer Kuro is truly legendary. my buddy had it in his mancave and it was really a pleasure to watch. i wouldn't buy one myself though as it is overkill for most applications and its advantages diminish in the typical bright living room setting.

That is true for most plasmas in general. However, there are two exceptions: the current samsung F8500

http://reviews.cnet.com/flat-panel-tvs/samsung-pn60f8500/4505-6482_7-35566923.html

which can hit 83 footlamberts (lcd's are normally around 40 fL), which is exceedingly bright for daylit rooms. check one out at a local bestbuy.

and the kuro pro-151fd. it's well known that this is the rare plasma that is bright enough for daytime viewing (it can go beyond 40 fL; the highest i got was about 50 fL without losing blacks) and keeps its deepest black levels without washing out in daylight. (actually it's the only tv that has the best blacks, is bright enough for daytime viewing while retaining them, and is the best at night. aside from OLED, which is about 3-4 years off, this is the best all-around tv money can buy.)
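
for reference, those footlambert figures convert to nits (what most spec sheets quote these days) at roughly 3.43 nits per fL:

```python
# Quick conversion sketch: footlamberts (fL) to nits (cd/m^2).
# 1 fL = 1/pi candela per square foot, which works out to about 3.426 cd/m^2.
FL_TO_NITS = 3.426

for fl in (40, 50, 83):
    print(f"{fl} fL ~= {fl * FL_TO_NITS:.0f} nits")
# 40 fL ~= 137 nits, 50 fL ~= 171 nits, 83 fL ~= 284 nits
```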

At this point I may stop looking at input lag and just focus on picture quality, which changes the dynamic of my research completely.

I would prefer a plasma, but I can't really control the lighting levels in my living room which might make for a disappointing viewing experience.

if you want to go LCD, try checking out

the Sharp LE650,

if you want LED try looking at:

vizio M551d-A2R

or vizio E550i

word of warning, when shopping for an LED

make sure it is "Local Dimming"

there are edge-lit LED tv's, which is another completely bogus piece of marketing that does nothing. it's pretty much an LCD disguised by using LED's for decoration (edge-lit led's; marketers do this so they can claim it's an LED tv, but it really is not)

the true LED HDTV's are the Local Dimming ones (a full array of LED's behind the panel).
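
conceptually, local dimming splits the backlight into a grid of zones and dims each zone down to about the brightest thing it has to show, so dark parts of the picture get a darker backlight; edge-lit sets can only dim whole strips along the edge, which is why they can't pull off the same trick. a toy sketch of the idea (real sets use far more zones plus tricks to hide haloing):

```python
# Toy sketch of full-array local dimming: split the backlight into zones and
# set each zone to roughly the brightest pixel it has to display.
import numpy as np

def zone_backlight(frame, zones=(4, 6)):
    """frame: 2-D array of pixel luminance in [0, 1]; returns per-zone levels."""
    h, w = frame.shape
    zh, zw = h // zones[0], w // zones[1]
    levels = np.zeros(zones)
    for r in range(zones[0]):
        for c in range(zones[1]):
            levels[r, c] = frame[r * zh:(r + 1) * zh, c * zw:(c + 1) * zw].max()
    return levels

# mostly-black frame with one bright patch: only that zone's backlight stays high
frame = np.zeros((120, 180))
frame[10:20, 10:20] = 1.0
print(zone_backlight(frame))
```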

btw, i posted the wrong link to the avs forum for the st60 thread, i updated the link in my post. sorry, was posting links really quickly

here's the real thread:

http://www.avsforum.com/t/1450484/official-panasonic-tc-pxxst60-series-thread

if you're set on an LCD/LED, i would get an LED that has local dimming technology. but if you're still on the fence between plasma and lcd/led, i would try to see if you can find a walmart for cheaper tv's (their stores are usually bright inside) or a bestbuy (for higher-end tv's) and A/B all the ones you want to consider.

in terms of deals, you can get the ST60 for under 1k if you wait for it and aren't in a hurry; otherwise, a local-dimming LED is bright enough for daytime and has deep enough blacks.

ok, good luck on the search.

Edited by davidwhangchoi
Posted

Some of the newer edge-lit ones claim local dimming---but they're really just turning off part of the edge. :p

Finding enough info to determine that it's truly "back-lit by an array of LED's" can be pretty tough. Usually need very in-depth reviews.

Posted

What's your budget? I was looking into this for a friend awhile back, and we pretty much concluded that AMD is better under $200, Nvidia is better over $200.

It's funny cause I'm trying to stay in that $200 area for a GPU.

Posted

It's funny cause I'm trying to stay in that $200 area for a GPU.

In that price range, I'd probably say a Radeon R9 270x would be your best bet.

If you're willing to go to $250, though, then get a GeForce GTX 760.

For what it's worth, I've been an Nvidia guy for a long time, and the GTX 760 is what I'd buy if I were in the market for a card today (except that I'm not, because I don't think it's a big enough upgrade over the GTX 660ti that I already have). I've been able to play most games at or near the highest settings and stay above 30FPS, with about 3 years in between upgrades.

Also, despite being "optimized for AMD", BF4 Central even suggested that the GTX 760 is probably the best bang-for-your-buck card for BF4.

Hope that helps!

Posted

As a complete PC noob, if a GTX 760 can ultra all games at 1080p, 60 FPS, what's the point of better cards like the 780 or Titan?

Also, can anyone fill me in on the difference between a GTX 690 and Titan (since everywhere I look, they seem to be getting the same performance, if not the 690 doing better). I thought the Titan was Nvidia's best card?

Posted (edited)

As a complete PC noob, if a GTX 760 can ultra all games at 1080p, 60 FPS, what's the point of better cards like the 780 or Titan?

I didn't say it can ultra all games at 1080p and 60fps. I said it can ultra or nearly ultra most games at at least 30fps.

So why spend money on a better card? To guarantee ultra, to guarantee 60fps (or more, especially if you're running a 120hz display), to go higher than 1080p (which is more common on monitors than TVs), or to run games on multiple 1080p+ displays.

Also, can anyone fill me in on the difference between a GTX 690 and Titan (since everywhere I look, they seem to be getting the same performance, if not the 690 doing better). I thought the Titan was Nvidia's best card?

Certainly.

Put simply, Titan is one impressively powerful GPU, while the 690 is actually a dual GPU in a single housing. Titan uses less power, has more memory and a higher clock speed, and is based on the newer 700-series architecture. Because it's a single GPU, it's physically smaller than a 690. However, the 690 is actually more powerful.

Edited by mikeszekely
Posted


Ah, so is the 690 similar to SLI graphics, but saves the hassle by sticking both together in the same card? Sorry, can't get much more technical than that!

Can you SLI 690s then?

Last question. With the 800 series stuff coming out next year, would it make more sense to just wait it out, or splash out on a card right now. I'm currently debating with myself about a PC build, seeing a lot of games run better than PS4 already (and I can imagine that gap will only widen in the coming years).

Posted (edited)

Ah, so is the 690 similar to SLI graphics, but saves the hassle by sticking both together in the same card? Sorry, can't get much more technical than that!

Yes. Kind of. In fact, IIRC, the 690 is really like two 680 GPUs packed into one card. But there are advantages to a single GPU, like reduced power consumption and more efficient cooling. Hence, Titan.

Can you SLI 690s then?

Yes. But you can only have two. If you go with Titan, you can have three.

Last question. With the 800 series stuff coming out next year, would it make more sense to just wait it out, or splash out on a card right now. I'm currently debating with myself about a PC build, seeing a lot of games run better than PS4 already (and I can imagine that gap will only widen in the coming years).

That's a super tough call. Because, let's say you wait for the 800 series. By the time you settle on the card that's right for you, you'll be reading about the 900 series. When it comes to computers, there is always something better just around the corner.

Me personally, I'll probably buy an 800-series card... but that's because I've already got a 600-series. Which in and of itself was a replacement for a 400-series card, and I skipped the 500-series, just like I'm doing with the 700-series.

Of course, you could always buy a 700-series card now, skip the 800s, and upgrade when the 900s come out.

But, that's one of the nice things about PCs. You can decide what to upgrade and when. I actually built a whole new computer when I went from the 400 to the 600, but I'm really satisfied with the speed of my computer. So instead of building a new one in a year or two, I'll probably just upgrade the graphics card.

Edited by mikeszekely
Posted

Ok, I lied, not last question. So, once you've got yourself a build, is the GPU all people really upgrade when they want to boost performance? I realize that GPU > CPU for gaming in recent years, but how many generations of GPU can a CPU last before it starts to bottleneck things? (Assuming power requirements and memory requirements stay consistent).

Posted

Ok, I lied, not last question. So, once you've got yourself a build, is the GPU all people really upgrade when they want to boost performance? I realize that GPU > CPU for gaming in recent years, but how many generations of GPU can a CPU last before it starts to bottleneck things? (Assuming power requirements and memory requirements stay consistent).

that's a tough question, and I'm sure even experts disagree. It's further complicated by publishers listing things like "quad core" as a requirement for games like BF4 that run just fine on dual core i3s.

I personally built three computers in a five-year span, but I think it was a combination of factors that led me to do so more than pointing to any one part and thinking it needed upgrading. Truthfully, my previous box was a first-gen i7, and I probably would have been fine just upgrading the GPU. But I had some extra money, the third-gen i7s had come out, and for some reason I never determined, that box was always slow to boot.

Suffice to say that, if you invest in good parts to start, you can probably go 5-6 years on GPU upgrades only. If you get into it, though, and build instead of buy, you'll probably be interested in upgrading more or more often.

Posted

Ok, I lied, not last question. So, once you've got yourself a build, is the GPU all people really upgrade when they want to boost performance? I realize that GPU > CPU for gaming in recent years, but how many generations of GPU can a CPU last before it starts to bottleneck things? (Assuming power requirements and memory requirements stay consistent).

There are other bottlenecks like memory bandwidth, hard drive speeds, SSD read speeds.

But yes, good parts selection will last you a while and you can keep going for quite some time. My current system will be hitting 5 years next year and I can play BF4 with no problems (BF4's problems are with BF4 itself...). While I've updated monitors, video cards, hard drives, enclosures, optical drives, and PSUs on a routine cycle, my box is still running on the same CPU, motherboard, and RAM from 4 years ago. And yes, I'm looking at upgrades, but since I used good parts and have taken care of my box, I'm not pushing myself to upgrade immediately since I can still get by on my current setup. Right now, the only thing holding me back is my use of a SSHD instead of a SSD. When I got my hybrid drive, SSDs were still pricey for my tastes, and hybrid drives offered a performance bump over traditional HDDs without the high cost/GB of a SSD. So quality parts, time, and care can produce a setup where you're not spending money on a full system upgrade the next year, but on smaller items like GPUs, storage, and such.

Posted (edited)

I bought a GTX 780 and haven't looked back. /grumblesomethingabouttheTIcomingout20daysafterheboughtit

Still pissed about my POS motherboard but Christmas will be rectifying that soon. Yay Asus ROG.

Was [] close to buying a 4930k but it sold out before I could. It would have been super overkill but for the price it was, it was certainly warranted.

Kind of adding to the tv thing, I have a no frills Samsung plasma and it plays everything beautifully. I have some issues with light, but closing the blinds instantly fixes it. I've never noticed a thing playing my 360 on it. I even tried out PC BF3 set to ultra on it and had more issue with 60 inches just being too much to take in all at once. Nothing with ghosting.

Edited by Chewie
Posted

Need some advice. i have a Synology 2-bay NAS and was wondering which is the better option for transporting it:

1) remove drives and pack in my handcarry bag, put the NAS with the checked-in luggage

2) keep drives in the NAS, put NAS back in original box, then inside checked-in luggage

Posted

Need some advice. i have a Synology 2-bay NAS and was wondering which is the better option for transporting it:

1) remove drives and pack in my handcarry bag, put the NAS with the checked-in luggage

2) keep drives in the NAS, put NAS back in original box, then inside checked-in luggage

I'm assuming spinning platters, right? Personally, I'd remove the drives and keep them in your carry-on. The fewer people handling them (especially the way luggage gets handled at airports) the better.
Posted

Need some advice. i have a Synology 2-bay NAS and was wondering which is the better option for transporting it:

1) remove drives and pack in my handcarry bag, put the NAS with the checked-in luggage

2) keep drives in the NAS, put NAS back in original box, then inside checked-in luggage

Yeah, I'd go with option 1. Less chance of your stuff being tossed around by the crew or the luggage conveyor belts. Just be sure they don't take your hand-carry bag away at boarding because too many people brought too much carry-on stuff.

  • 2 weeks later...
Posted

:rolleyes:

So I got a few pop-ups and emails from Cox communications stating that they'd upgraded their cable internet and that I needed to replace my Motorola Surfboard 5101 cable modem to take advantage of their wonderful new upgraded cable service. Coincidentally, my downloading speeds have been TEH SUCK, as of late. The messages from Cox conveniently allowed me to select a new rental or a purchase plan that would include a "DOCSIS 3.0" cable modem.

Are cable internet providers truly upgrading their service speeds? Is my current Motorola 5101 in need of replacement (38Mbps maximum download speed)? Should I buy one of "their" DOCSIS 3.0 all-in-one modems or should I go on my own and buy something like the Motorola SB6141 that is just a dedicated cable modem? Should I buy an all-in-one cable modem/router? Will I need to buy a new router to take advantage of my wonderfully new and fast cable internet service, DOCSIS 3.0 modem and lighter wallet?

I normally couldn't be bothered with this sort of thing since I stopped online gaming, but when I want to see Holly Michaels getting pile-driven by a bunch of well-hung construction workers I want to see it on my screen NOW... :angry: Thanks in advance, guys....

Posted

Are cable internet providers truly upgrading their service speeds? Is my current Motorola 5101 in need of replacement (38Mbps maximum download speed)? Should I buy one of "their" DOCSIS 3.0 all-in-one modems or should I go on my own and buy something like the Motorola SB6141 that is just a dedicated cable modem? Should I buy an all-in-one cable modem/router? Will I need to buy a new router to take advantage of my wonderfully new and fast cable internet service, DOCSIS 3.0 modem and lighter wallet?

Yes, they are upgrading their speeds to attract customers. And with other companies trying to offer higher speeds, this will be a continuous cycle. Since home plans are only just hitting speeds of >100 Mbps (unless you're getting FiOS and its 1Gbps tiers), I doubt you will need a new router. Most routers have 10/100 Mbps switches, and the wireless side is only just reaching gigabit-class speeds. You can get gigabit routers, but those are usually gigabit wired plus 802.11n/ac and normally on the pricey side (~$150-$200). If the speed you plan on getting is less than 100 Mbps, there's little incentive to get a new router because you can't take advantage of the extra speed anyway. If you are getting a higher tier, then it would probably be worth upgrading.

As for an all-in-one vs. modem + router, I'm old school and prefer the separate modem + router. The router controls my local LAN, and if I upgrade my plan I don't need a new unit; I can just swap the modem and my intranet will still be functional. That being said, if you're tight on space or don't want to deal with the mess of wires two units can become, then I would just use the all-in-one unit the company will give you.

Posted


What Az said.

Put more simply, you probably don't need a to buy a new modem. You can buy one if you want (I did), since DOCSIS 3.0 is the new standard, but I don't recommend buying anything directly from Cox, and certainly not renting (I figured out how long I'd been with Comcast when I bought my modem, and realized that if I was rented I'd have paid over $800 for my modem by now). I do heartily recommend the Motorola SB6141.

Like Az, I recommend buying them separately. I went through two routers and three generations of 802.11 Wi-Fi in the time I had my old modem. Router's improve a lot faster than cable modems. Likewise, suppose you switch ISP, like if FIOS rolled into your area. Then you'd need a new modem, but you wouldn't have to change your router or Wi-Fi settings.

But perhaps the best argument for going separate is that Motorola is still the king of cable modems, but the router field is a bit more messy. I had a Netgear wireless B router for years that was pretty solid, until it totally died. I replaced it with an N300 router from Netgear that started dropping the internet connection and randomly dropping devices from the network after only a year or two. Linksys, who's WRT54G router was almost legendary, started releasing some pretty shoddy routers, followed by some good routers, before being bought up by Belkin, who (in my opinion) always made pretty lousy routers. When I decided to upgrade my network, I had my modem picked out without any issue, but I agonized over my router. After reading a ton of professional and personal reviews, I eventually settled on D-Link AC1750 router that was pretty easy to configure and set up, and has been solid so far if not spectacular.

Posted (edited)

Ok 'Az and Mike, thanks for the clarification. Cox has a dual-band unit I can rent for $10 a month for the next few years, or I can buy one from them for $150. I've seen the SB6141 go for $80 online, so if I don't need an all-in-one modem (based on my reading, I don't think that's what I want) then I'd rather buy my own single-purpose modem. Honestly, I still find it hard to believe that changing modems will increase my internet performance though:

[speed test result screenshot]

My current modem is rated at a maximum of 30 Mbps, so do I even need another one? I think Mike is right about not even needing to buy one; there has to be another issue at hand here...

Edited by myk
Posted


Hard to say. It doesn't matter what your modem is rated for... you connect at the highest DOCSIS version supported on both ends. So if your modem is only DOCSIS 2, and they're pushing you to upgrade to DOCSIS 3.0, they could be throttling DOCSIS 2 connections (for the record, I thought my cable speeds with Comcast's cheapest internet were pretty crappy, but I'm getting something like 10Mb/s more than you, and I've seen faster DSL). DOCSIS 3.0 does support more throughput than DOCSIS 2 anyway, so it's not like upgrading will hurt anything.

I'm not sure what all Cox offers, but I know that the cheapest modem-only rental from Comcast is $8/month. When I pointed out that a good modem like the SB6141 pays for itself in under a year, I convinced a friend of mine to finally buy his own modem. And then, like I said, if you like your router, you don't have to change it out, so you don't have to mess with your Wi-Fi settings or anything like that.
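
To put rough numbers on "pays for itself," using the figures mentioned in this thread (~$80 for an SB6141 online vs. about $8/month to rent):

```python
# Back-of-the-envelope payback math with numbers from this thread:
# ~$80 to buy an SB6141 outright vs. roughly $8/month to rent from the ISP.
modem_price = 80.0        # one-time purchase
rental_per_month = 8.0    # monthly rental fee

months_to_break_even = modem_price / rental_per_month
print(f"breaks even after ~{months_to_break_even:.0f} months")     # ~10 months
print(f"rental cost over 5 years: ${rental_per_month * 60:.0f}")   # $480
```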

Posted (edited)


Why would Cox throttle DOCSIS 2.0 users like me? Lemme guess-so we'll buy their stupid modem and upgrade our plan? Well if that's the case then screw them-I'll go get that SB 6141 on my own. Should I try the speed test with the computer connected directly to the modem instead of the router? How do I know if my router is "good enough?"

Edited by myk
Guest davidwhangchoi
Posted

Why would Cox throttle DOCSIS 2.0 users like me? Lemme guess-so we'll buy their stupid modem and upgrade our plan? Well if that's the case then screw them-I'll go get that SB 6141 on my own. Should I try the speed test with the computer connected directly to the modem instead of the router? How do I know if my router is "good enough?"

what azrael and mike said,

i would wholeheartedly recommend getting your own modem, and a sb6141 at that! i'm not as knowledgeable on routers as mike (i'm more into tv's), but that's a real accurate account of the routers he's used. that WRT54G was rock solid. i currently use an apple airport extreme 5th gen that i got for under 79 bucks on a slickdeal, and it's been rock solid coupled with a sb6141. haven't had to reset my router since summer.

Posted

Why would Cox throttle DOCSIS 2.0 users like me? Lemme guess-so we'll buy their stupid modem and upgrade our plan?

Basically. It's also a way to "encourage" users to get on a new standard, reducing the need to support the old ones.

Well if that's the case then screw them-I'll go get that SB 6141 on my own.

As well you should.

Should I try the speed test with the computer connected directly to the modem instead of the router?

It's worth a shot, to make sure that you're not having issues with the router.
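
A rough way to compare: time a large download once wired through the router and once wired straight to the modem, and see if the numbers differ. A minimal sketch (the URL is just a placeholder; point it at any big test file, and don't treat it as a substitute for a proper speed test site):

```python
# Rough throughput check: time an HTTP download and report megabits per second.
# Run it once connected through the router and once straight off the modem.
import time
import urllib.request

TEST_URL = "http://example.com/100MB.bin"   # placeholder; use any large test file

start = time.perf_counter()
with urllib.request.urlopen(TEST_URL) as resp:
    data = resp.read()
elapsed = time.perf_counter() - start

mbits = len(data) * 8 / 1_000_000
print(f"downloaded {mbits:.1f} Mb in {elapsed:.1f} s -> {mbits / elapsed:.1f} Mbps")
```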

How do I know if my router is "good enough?"

Technically, even 802.11b should support internet that slow. But older routers can't handle as many devices. Back when I got my b router, I had two wired desktops and one wireless laptop. Today, I've got four wired computers, a wired printer, a wired PS3 and Xbox One, plus a pile of wireless devices: consoles (Wii U, 360, and a second PS3), four laptops, two iPads, two smartphones, a Vita, a 3DS, a Shield, two Chromecasts, a Nexus 7, etc.

So... what kind of router do you have, and how many devices are connected to it?

Posted (edited)


I was JUST about to type that up; I had to dig through a pile of G Taste and Alternator's boxes to get to it. The router is a Netgear WGR61454/802.11G. Wow-this thing is 11 years old. Anyway my PS3, PS4, Toshiba and Dell laptops, and the occasional i-whatever connects to this router; I don't THINK I have a lot of load on this unit, as no more than two or three devices are on simultaneously...

Edited by myk
