Everything posted by JB0
-
Apparently NASA pulled a fast one while I wasn't looking... http://www.nasa.gov/exploration/systems/mpcv/ They took their work on the cancelled Constellation and slapped a new name on it. We DO have a plan to go into space that doesn't rely on hitch-hiking in Soyuz modules!
-
Ooooh, fun toys. Something to do once our 32FS100 gives up. Not the best TV EVER, but it's pretty good. And it's raw NTSC, so no issues with fancy HD circuits "helping."
-
SNES can do RGB (as long as it's not the redesigned SNES2, which can't do s-video either). Odds that a VGA input will handle the relatively low scan rates it generates are low, though, so it doesn't really help matters much. Though I guess an RGB-to-component transcoder would work.

But, but... the video chip used in the SMS can't GENERATE an interlaced signal, and almost certainly doesn't generate an exact 59.94 Hz vertical refresh, especially after a quarter-century of aging on the components. And the LCD shutters are dead simple: the SMS switches one lens off at VBlank, then switches back at the next VBlank. It's literally the EXACT same implementation, aside from the new ones running twice as fast and being commanded wirelessly. The TV CAN'T be displaying at any frame rate other than the one the SMS is generating, or it would get out of sync with the game (either running ahead of the available frame data and flickering, or running further and further behind until the buffer filled and the TV choked to death on its own crappiness). Remember, it's one-way communication. The TV can't tell the SMS or any other analog input device what IT wants, so it has to adapt to what THEY want.

I'm not really surprised the computer LCD choked, though I AM surprised the HD CRT did. Curious what's going on in the electronics to muck it up. What kind of failure mode did you get, anyway? Simple double-imaging, inverted left and right, or something more complex/interesting?

Meh. I need some SMS shutter glasses to experiment with now.

They abandoned s-video before they abandoned composite. It's always been the bastard child of video connectors and, along with component video, exists solely to prove that consumer electronics manufacturers hate people. ... Especially component. Three RCA connectors, one sharing a color code with the paired audio RCAs? WHO thought this was a good idea? There's not even that much of a resolution gain over the one-plug s-video connector, since it's still chroma/luma. And don't EVEN get me started on the RGB lie! Why couldn't we just get a pretty SCART connector like the Europeans? Best thing to ever come out of France, right there. Hell, the Japanese RGB-only SCART look-alike would've been nice too, if a tad less versatile.
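To put numbers on that sync argument, here's a toy sketch (Python; the 60.05 Hz source rate is an assumed 240p-ish figure for illustration, not a measured SMS value):

```python
# Toy model of a one-way video link: the console generates frames at its own
# rate, and the display either keeps up or accumulates drift. Rates here are
# illustrative assumptions: ~60.05 Hz for a 240p-style source, 59.94 Hz for
# a display insisting on standard NTSC timing.

def drift_after(seconds, source_hz=60.05, display_hz=59.94):
    """Frames of backlog after `seconds` if the display consumes frames
    at a different rate than the source produces them."""
    return (source_hz - display_hz) * seconds

for t in (1, 10, 60, 600):
    print(f"after {t:4d} s: {drift_after(t):+7.2f} frames of drift")
# After ten minutes the display is 66 frames behind: it either drops frames
# (the left/right eyes swap and the 3D effect breaks) or buffers forever
# (and chokes). The sync HAS to come from the source.
```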
-
As awesome as LaserDisc is, it's time to get 'em on Blu-ray. Which actually STILL isn't high enough res to see The Last Starfighter as intended (the CG scenes were rendered at 3000 x 5000 in 36-bit color). But it's far better than DVD (especially if you have the old DVD release and not the remastered 25th anniversary one). Your TV CAN offer a significant fraction of the original film resolution. EMBRACE HIGH-DEFINITION. RESISTANCE IS FUTILE. YOU WILL BE ASSIMILATED.
-
Actually, Trek gradually imploded under the weight of apathetic and ignorant executive producers until Paramount fired their asses after Enterprise. No one can explain why they weren't fired after (or DURING) Voyager.
-
Not true. The SMS LCD shutter glasses are the exact same tech used in modern 3D TVs, just at 60 Hz instead of 120 Hz. They don't rely on interlacing or electron beam scan, and the shutter signals come from the SMS, not the TV. As long as the TV displays the frames when it's supposed to, or lags by an even number of frames, you're good.

In fact, the SMS doesn't even output an interlaced video signal. Like many game consoles, it abuses the timing signals to create a 60 Hz progressive-scan display, a practice informally referred to as 240p nowadays.

You'd probably be right if we were talking about light guns, though. But I still have an SD tube either way, so I'm actually concerned about the future of my game collection when these things start crapping out. A lot of modern digital displays don't play well with the funky signals output by most game machines. And composite video is being phased out (never thought I'd see the day). What's a guy supposed to do when there's no composite video input? And the first console era ALREADY requires mods to interface with new displays. They were all RF-only, and there's no longer a TV available with an NTSC tuner.
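For the curious, the timing arithmetic behind "240p" (standard NTSC constants; the 262-line count is the usual trick, though exact line counts vary from console to console):

```python
# Standard NTSC timing constants.
F_SC = 3_579_545             # color subcarrier frequency, Hz
LINE_RATE = 2 * F_SC / 455   # horizontal line rate, ~15734.264 Hz

# Interlaced video: two 262.5-line fields per 525-line frame.
field_rate_480i = LINE_RATE / 262.5   # 59.940 Hz
# "240p": a whole number of lines every field, so the fields stack
# instead of interlacing -- and the refresh rate runs slightly fast.
frame_rate_240p = LINE_RATE / 262     # ~60.054 Hz

print(f"480i field rate: {field_rate_480i:.3f} Hz")
print(f"240p frame rate: {frame_rate_240p:.3f} Hz")
# A display that assumes exactly 59.94 Hz interlaced input has to cope with
# a slightly-fast signal that never interlaces -- and plenty of them don't.
```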
-
What he said. Though mine has been visited by the Crimson Pacman, free repairs put it back in operation. I still say MS is being a cheap bunch of crapheads by not upgrading them to HDMI at the same time, especially after they designed a new motherboard exclusively for refurbs that uses the new more reliable chips, but doesn't have HDMI. They spent EXTRA MONEY to ensure they didn't have to upgrade anyone's feature set. But hey, one of your early adopters that made the system successful might not buy a replacement console just to get HDMI if you did that, and you'd be out a sale! And hey, it's not like your shoddy untested cooling design inconvenienced anyone when it failed miserably! You know what? I've got a VGA cable, and I'm not afraid to use it.
-
I knew the PS3 supported it, actually. When did they add 3D support to the 360? Also: It will only work on SOME 360s, since the 3DTV standard only works with HDMI.
-
Hell yes. This is where technology SHOULD be heading. Not doing more of the same, but doing more of MORE. I just hope it's available in red and black, so I can get one to match my Virtual Boy. Then I can get some SMS and FamiCom LCD glasses, and a Vectrex 3D Imager (fat chance), and I'll have every 3D game system that ever existed. You can take your PlayBoxes and your XStations and stuff them in the closet where the antiquated junk BELONGS!
-
Man, people... it's all about The Last Starfighter. And my review of 2010 can be summed up in the words of the same IBM ad that's summed up every year since it aired in 2000. Where are the flying cars? I was promised flying cars. I don't see any flying cars.
-
Inserting randomness is actually a highly valid approach in many circumstances; modeling the real world is one of the more obvious. It's why microprocessors HAVE random number generators.

Yup. Power and space are major obstacles. But when you look at what a supercomputer twenty years ago could do and what a laptop now can do... you start wondering how fast those barriers will come down.

How would that defeat the purpose? If the problem is "generating human-like behavior," then your algorithms SHOULD do exactly that. It can't be picking the "wrong" solution if it does a dumb thing that a real human with a similar background would have done.

Consider a weather forecast program. There's a lot of data that CAN'T be input into such a program, because it can't be measured with the limited sensing equipment and computer capacity available. And some of it just looks like random data with no visible cause. Some of it will have known patterns, but some of it will appear as white noise to us, because we simply don't understand the system. I'm not a meteorologist, so this is admittedly speculation, but I would assume there's actually a good bit of randomization in the computerized weather models. And as paradoxical as this sounds, this can actually IMPROVE accuracy.

Arguably, the human brain IS a computer running algorithms. They're just very complex ones with large quantities of input, some of it highly randomized. And, well, it's clearly NOT a classical binary von Neumann machine.

There's a lot of iffiness in the Deep Blue VS Kasparov rematch. And you grossly oversimplify how much reprogramming was done between rounds. That's also over a decade out of date. The best chess computers still have great difficulty beating the best human players; my quick and dirty research shows that most such matches end in draws. But they do it without being reprogrammed between rounds, and without forcing the opponent to come in effectively blind. I do take your point that computers play a very different style of chess than humans. I maintain that a computer could be coded to play more like a human if the problem was understood and a computer existed that could run the math in any reasonable amount of time.

Well, my point was that there IS a deterministic process at work in the brain. It's just not one that works very well by computing standards. Ultimately, it comes down to physics. And physics is math. If the brain was understood, it could be modeled in a computer (as opposed to the more abstract neural networks that exist currently, which only model neuron connections and not the whole assemblage). If it was understood well enough, it could be modeled accurately. And if that accurate model was run... you'd have an artificial brain.

And legacy code can only be rewritten if you have the source code and the hardware necessary to read and write the media it's stored upon. This is a bigger problem than you might think. And you still need someone who can actually make heads or tails of that code. Obviously, we have none of that for the brain.

Evolution isn't as rapid or all-powerful as you make it out. It's stuck choosing between different sets of humans; it can't just design an arbitrary "man 2.0" model and implant that on a new person. It's also more concerned with the short term. Who's making the most babies? Which babies are surviving long enough to have their own babies? That's what evolution cares about. Besides, modern medicine has largely short-circuited evolution. Survival of the fittest no longer applies in a world of corrective surgeries and antibiotics. If someone has a defective heart, we fix it. If we can't fix it, we replace it. They live and pass those genes on.

Anyways, computers do math. Given enough computer, you can solve any possible math. And like I said, the brain operates according to the laws of physics, and physics is math. Once you know the initial conditions and system constraints, you can program those in and simulate it accurately. And my point is that if you know how it works, you can simulate it. Everything in the universe is math. Some things are just more math than others.

Actually, you're comparing apples and oranges. Quantum computing is a whole new hardware field. Neural networking is a SOFTWARE field. NNs run on deterministic binary math computers. Yes, some even run on ye olde IBM PC clones. Not incredibly advanced networks in comparison to organic life, but... it's PROVEN that modern computers CAN simulate small collections of neurons. It's even possible to simulate small collections of biologically-accurate neurons, to the extent that we know how they work.

I'd say the issue is one of raw power and understanding of the problem. More so the latter. But regarding the former... as a rough apples-to-oranges comparison, there are over a hundred billion neurons in an adult human brain. An Intel Core 2 Quad has some 500 million transistors, each of which is capable of doing far less than a neuron. Biology can be simulated once it is understood. Whether we'll still be running binary transistors once we understand it well enough to simulate is a less clear issue.

... Though some claim they can do it now. http://bluebrain.epfl.ch/

"With the present simulation facility, the technical feasibility to model a piece of neural tissue has been demonstrated."

Also: "In the cortex, neurons are organized into basic functional units, cylindrical volumes 0.5 mm wide by 2 mm high, each containing about 10,000 neurons that are connected in an intricate but consistent way. These units operate much like microcircuits in a computer. This microcircuit, known as the neocortical column (NCC), is repeated millions of times across the cortex. ... This structure lends itself to a systematic modeling approach."

And the reality check is here: "Our Blue Gene is only just enough to launch this project. It is enough to simulate about 50'000 fully complex neurons close to real-time. Much more power will be needed to go beyond this. We can also simulate about 100 million simple neurons with the current power. In short, the computing power and not the neurophysiological data is the limiting factor."

On one of the fastest supercomputers in the world, they can simulate 50,000 neurons out of 100,000,000,000.

And here's the largely unrelated, but incredibly fascinating part... "Will consciousness emerge? We really do not know. If consciousness arises because of some critical mass of interactions, then it may be possible. But we really do not understand what consciousness actually is, so it is difficult to say."

If they get enough virtual neurons firing and the computer becomes conscious, it'll be just incredibly amazing (though obviously they need a major upgrade to their Blue Gene to even try). Either way, it's just incredible.

That actually wasn't meant to be a RUDE "wrong on all counts." And may the joy of Hanukwanzmasgiving be visited upon you as well.
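As a toy illustration of what "simulating a small collection of neurons" can mean computationally, here's a minimal leaky integrate-and-fire sketch (all parameters are made-up placeholders; this has nothing to do with Blue Brain's biologically detailed models):

```python
import random

class Neuron:
    """Leaky integrate-and-fire neuron, about the simplest model there is."""
    def __init__(self):
        self.v = 0.0                        # membrane potential

    def step(self, current):
        self.v += current - 0.1 * self.v    # integrate input, leak toward rest
        if self.v >= 1.0:                   # threshold crossed: fire
            self.v = 0.0                    # reset after the spike
            return True
        return False

neurons = [Neuron() for _ in range(1000)]
for t in range(100):
    # Noisy drive -- randomness standing in for all the real-world detail
    # the model doesn't capture.
    spikes = sum(n.step(random.uniform(0.0, 0.25)) for n in neurons)

print(f"{spikes} of {len(neurons)} neurons fired on the final step")
```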
-
While I agree with the sentiment (and extend it to cross-console games that never see a general-purpose computer), I have to dispute several of your points.

Since when is "because that's how it's always been done" a good reason to keep doing it? The easy way is rarely the BEST way. I HATE the post-Doom quick-save attitude. It inspired a LOT of bad level design and cheap deaths because "you can just reload anyways," and ultimately made death meaningless. If you want TRADITIONAL, I can point to a wide swath of computer games from over a decade of PC gaming that only allowed between-level saves. But I always thought the best thing about PC gaming was that it wasn't BOUND by tradition. Used to be that most developers wouldn't let copying someone else's bullet list get in the way of making the game THEY wanted.

See, I think this, more than anything, is what's HURTING PC gaming: that desire to make games that bog down the latest and greatest hardware, then immediately introducing new hardware designed to run those latest and greatest games. In the old days, when PC gaming was strong, games were made to fit the hardware. Hardware wasn't made to fit the games. If your game couldn't run on the current hardware, you scaled the game back. You didn't tell everyone to buy a new video card or three. There should not BE a use for three graphics accelerators in one machine.

I guess what I'm saying is... I vastly prefer the PC game landscape as it existed in the 80s and 90s. You know, when it was healthy. I can find more of value in a single bajillion-shareware-games disc from the early 90s than an entire software aisle today. Especially the 80s. I think the final dominance of the IBM clone running Windows has been bad for PCs as a whole. But that's really another topic entirely.

True, but... keyboard+mouse is also probably the single worst standard controller ever. I will stand by this stance until the day I die (I also don't even own a mouse. Trackball 4 lyfe!).

Aaaactually... the first model of the Sega Genesis is slower than all later revisions. Sega used a knockoff of the 68000 for the first model that performed far worse than the official Motorola parts. And the TurboGrafx had a steady stream of RAM upgrades for the CD unit. But really, a stable target is a GOOD thing. Aside from letting developers know what they're working with and make the most of it... can you imagine trying to explain why SOME Super Nintendos or PlayStations can run later games and others can't? When you're ready for a hardware upgrade, you release a whole new system. Upgrade the entire thing in one shot.

And console gamers buy a new system every few years, often for the price of a single PC graphics card. What's your point? If it's backwards compatibility, I suggest you dig a few games from 2005 out and try them in your new machine. I've found that a lot of them choke and die... or just glitch in REALLLLLLLLLLY interesting ways.

This console generation is actually lasting unusually long, likely due in large part to MS and Sony both over-reaching with their initial hardware designs (as seen in their current systems' unusually high launch prices). Nintendo, though... I can't explain why there's no refresh there. I guess they just don't have to care until the Wii stops laying golden eggs and starts laying silver ones.

And it's a situation the PC games market has brought upon itself. Spiraling hardware costs and incredibly short life cycles, combined with a lack of diversity and generally shoddy development*, have, if not killed outright, seriously crippled a once-vital market.

*I have great technical respect for Doom 3, despite it being pretty un-fun to me. Largely because there was actual effort spent optimizing the game, and as a result it looked pretty good on several generations of hardware, instead of looking best on the latest and greatest and varying levels of crappy on anything older.

So yeah. I loved PC gaming once, and I hate what it has become all the more for it.
-
It's a lovely collection of large words, but... it's also a complete load of crap?

1. You can write fairly complex algorithms with a wide variety of input variables and output solutions. And you could always add a random input to churn the waters a bit. Or throw in a few bugs, because no code much more complex than PRINT "HELLO WORLD" is perfect. And making routines that choose blatantly wrong solutions to a complex situation is actually rather easy, much to the chagrin of video game designers the world over. Just because it's computerized doesn't mean it's inherently BETTER at making choices. (See the sketch after this list.)

2. There's also a strong possibility that many of the dumb, irrational choices we make are driven by ancient caveman (or even PRE-caveman) thought patterns that simply aren't applicable to the modern world, but are still a large part of how the brain expects things to work. Effectively analogous to glitchy legacy code in a computer environment. It's old, outdated, and kludgy, but it still works... most of the time.

3. It wouldn't be at all like a dog or cat, because dogs and cats have organic brains that process information in manners similar to our own, albeit with different hard-wired priorities and much less horsepower.

4. Neural networks are designed to create organic-style "thinking" from computers, and they're rather good at it. They're also INCREDIBLY inefficient, so they take LOTS of power and space. But those are cheap these days, and getting cheaper almost by the minute.

5. GitS interfaced organic brains to technology. It didn't replace brains WITH technology. There are certainly problems with that (and Masamune Shirow actually calls some of them out himself in author's notes), but it completely sidesteps the issues with simulating the human brain (which isn't a system we understand well enough to simulate anyways). It also means going "total cyborg" isn't a cure for Alzheimer's or Parkinson's in the GitS world.

So... ummm... you're pretty much wrong on all counts? Bring on the robo-bodies!
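To make point 1 concrete, here's a hypothetical game-AI sketch using temperature-based (softmax) action selection, one standard way to get an agent to pick plausible-but-suboptimal choices; the actions and scores are invented for illustration:

```python
import math
import random

def choose(actions, scores, temperature=1.0):
    """Softmax selection: low temperature -> near-optimal play;
    high temperature -> erratic, more 'human' play."""
    weights = [math.exp(s / temperature) for s in scores]
    r = random.uniform(0, sum(weights))
    for action, w in zip(actions, weights):
        r -= w
        if r <= 0:
            return action
    return actions[-1]

actions = ["take cover", "flank", "charge straight in"]
scores = [2.0, 1.5, 0.2]   # "charge straight in" is clearly the worst option

for temp in (0.1, 1.0, 5.0):
    picks = [choose(actions, scores, temp) for _ in range(1000)]
    print(temp, {a: picks.count(a) for a in actions})
```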
-
A little late, but... Apollo 17 launched on December 7, 1972. The lunar module Challenger touched down on the moon on the 11th, then left and returned to the orbiting America on the 14th, and America splashed down on the 19th. Eugene Cernan became the last man to walk on the moon. He, Harrison Schmitt, and command module pilot Ronald Evans were the last men to ever leave Earth orbit. Schmitt remains the only scientist to have ever left.

In four years we put twelve men on the moon. We've not returned once since then. It's been thirty-eight years since man last left the Earth.

No, I don't consider mucking about in low Earth orbit to be "leaving" the Earth. You're still in the Earth's atmosphere. The International Space Station is the most common destination these days, and that's well inside the thermosphere. The soon-to-be-retired space shuttle can only BARELY reach the exosphere. To this day the moon remains the only frontier we've ever reached, then packed our bags and gone home. Not space, as calling the moon "space" is a lot like calling some of the dust in your floorboard a car. We've never even scratched the surface of THAT frontier.

But on a brighter note, as of December 13, Voyager 1 is no longer seeing an outward motion of solar wind. It is on the threshold of interstellar space, and expected to cross that border in the next four years.
-
Hell yeah. I wanna see some decent replacement parts come out before my OEM parts start dying!
-
You can't tell me with a straight face that you don't want to smear poop all over a nice big Robotech logo.
-
Hey, my absolute world domination wish is BASICALLY the same thing!
-
Ewww, ear buds. I guarantee 10-buck ear buds DON'T play music just as well as 800-dollar reference gear. At least you aren't using the iPod's pack-in buds. Could always be worse. Or, of course, I could just envy your less discerning ears. It certainly makes life less expensive.

As far as the HDMI goes... that's a pretty variable thing, and you could write a book about it. Depending on how long the run is and how much data you're pushing (a 720p signal can work through a worse cable than a 1080p one, because it's a far lower bitrate), a crappy cable can be just as good (because digital shows no damage until bits degrade beyond legibility) or horrible (because the bits degraded beyond legibility). Of course, HDMI cables are also one of the most absurdly marked-up products on store shelves today, and a lot of places get 30 and 40 bucks for the same damn 5-dollar cable. Soooo... get the cheap cable anyways. If it works, and there's no sparklies or dropped frames or desynced audio, then that rocks!

In an analog environment, the cheapest cables WILL show degradation. Almost invariably. If you were pushing HD video signals over component, you'd probably notice a huge difference between the five-buck cables and the 20-buck cables. There's a lot less markup on the classic RCA-connector cables in general, so it's much more a case of "you get what you pay for," barring a few notable exceptions. But with a digital feed that's not hitting the limits of the cable... what model the TV is and how well it's calibrated means a lot more.
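Rough numbers on the 720p-vs-1080p point (standard HDMI/TMDS math: three data channels, 10 bits per pixel per channel after 8b/10b encoding):

```python
# Bit rate behind "720p survives a worse cable than 1080p".
# Standard pixel clocks: 74.25 MHz for 720p60, 148.5 MHz for 1080p60.
def tmds_gbps(pixel_clock_mhz, channels=3, bits_per_pixel=10):
    return pixel_clock_mhz * 1e6 * channels * bits_per_pixel / 1e9

print(f"720p60 : {tmds_gbps(74.25):.2f} Gb/s")   # ~2.23
print(f"1080p60: {tmds_gbps(148.5):.2f} Gb/s")   # ~4.46
# Twice the bit rate means less margin per bit: a marginal cable that
# passes 720p cleanly can sparkle or drop out entirely at 1080p.
```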
-
Yeah. Depending on gear and situation, cable quality can be more or less of an issue. It's not just down to noise reduction, though. Lower-quality cables have less overall bandwidth, so if you try to send too complex a signal through, it gets garbled. Amusingly, it's actually more obvious in the digital realm, because rather than "smearing," you get clipping and dropouts. And HDMI was never intended for long runs, so it's very vulnerable to degradation. Analog is pretty durable in that respect. It degrades gracefully.

And Monster generally makes overpriced products with greatly exaggerated benefits, even ignoring their incredibly unethical business practices and ridiculous hair-trigger lawsuits. My personal favorite example was a PS2 controller extension cable they used to make that they claimed would actually get signals from your controller to the PS2 faster, so your game would be more responsive... which is completely and utterly impossible on so many levels that I'm not really sure where to start. They're one of those companies that spends a lot more on advertising (and lawyers) than they do on any actual product development, and they have a reputation that's bought, not earned.

And this is a hot-button for me, so let me drag out my soapbox and climb up for a minute or three. I find the difference between 128 kb/s MP3s and even just 192 kb/s MP3s is audible with almost everything. Maybe not through pack-in earbuds (which I avoid for comfort reasons as much as audio quality), but... even a pair of junk PC speakers will fail to mask how awful 128 kb/s is (I DO use junk PC speakers for my computer). 128 kb/s needs to just die. It was chosen as a tradeoff between size and quality when dialup was the only game in town, and the concerns that prompted the tradeoff to land where it did are long gone. There's simply no good reason for it to still exist.

I'm not saying everyone needs to start using FLAC exclusively or, god forbid, raw wave, but... variable bitrate, averaging 192 kb/s, should be the new minimum standard. It sounds MUCH nicer, you aren't wasting bits by encoding dead silence at 192 kb/s, and it's not like VBR is rare and unsupported at this point. VBR actually makes a LOT of sense, since the low-complexity parts of the track use a lower bitrate than the more complex parts. So a burst of dead silence can be near-0 kb/s, a full-orchestra crash can surge up to 320, and the file is about the size of a 192 kb/s constant-bitrate file.

Now, I DO have some tracks where 128 doesn't really hurt them that much. The complexity of the audio WILL affect how badly it gets mangled. But with most of the stuff... it shows (or sounds, I should say). And the more complex the music, the worse it will show. If it's a piano solo, that lone piano gets the full 128 kb/s, but if it's part of a full orchestra, it's sort of like the bitrate is divided among all the instruments, and the quality plummets.

In the interests of full disclosure, I go for lossless files on my PC, and convert to high-quality VBR for use on my MP3 player (average bitrates landing between 192 and 256 kb/s).

I'd recommend getting some nice cheap headphones. I used to use a set of Koss KSC-75s; they're pretty well-known for hitting a cheap-but-good window. Of course, my current headphones are pretty much junk, and while they don't sound particularly GOOD, I can still tell the difference between a low-bitrate and a high-bitrate file through them.

Definitely. But mid-range gear will benefit if you replace those flimsy cables that came with your DVD player with something a little better. I'm not saying go all-out and get some MONSTER Electroblaster Cables with Superconductive Toroidal Insulation and Patented Turbine-Cut Connectors for Maximal Look-Cool Factor and Optimum Wallet Drainage, in fact I am very much NOT saying that, but... something thicker than a sheet of paper is nice.
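For scale, the constant-bitrate size arithmetic (simple math on a hypothetical four-minute track):

```python
# File sizes for a hypothetical 4-minute track at common constant bitrates.
# A VBR file averaging ~192 kb/s lands near the 192 CBR figure while
# spending its bits where the music actually needs them.
TRACK_SECONDS = 4 * 60

for kbps in (128, 192, 256, 320):
    megabytes = kbps * 1000 / 8 * TRACK_SECONDS / 1e6
    print(f"{kbps:3d} kb/s CBR: {megabytes:4.2f} MB")
# 128 -> 3.84 MB, 192 -> 5.76 MB, 256 -> 7.68 MB, 320 -> 9.60 MB
```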
-
What, you mean you didn't collect Wire Transport Flywheel media too? Yeah, I collect a little bit of info about everything, it seems. Nice to find a use for it sometimes.
-
To be fair, records DO have more range than CDs, from a purely technical standpoint. Extracting that extra range without the physical issues of reading the record causing artifacts is another story. And at the end of the day, most of the difference boils down to the different masters used. A lot of CDs are mastered horribly incompetently. Tubes, though... that's indefensible.

As far as old hardware VS new hardware... audio doesn't really change a lot. A good piece of gear from 1987 is still a good piece of gear, as long as it's in good shape. If your CD/DVD/WTF player has good DACs and you aren't using shitty cables, there's not really an advantage to using a digital run instead of an analog one. And to the consternation of many and the defiance of logic... analog video works BETTER over long runs than HDMI (because HDMI is not a particularly GOOD digital video standard).
-
Any theory that ends with "I blame Disney" is okay in my book. They have a lot to answer for.
-
Yes, but look at it from their viewpoint. A refreshed Macross release would take manpower, shelf space, and eyeballs that could better be spent on the latest installment/whoring out/slanderous abuse of the Gundam cash cow. Moo. Not MUCH manpower, shelf space, and eyeball time, but... Bandai is not exactly a neutral party here. They know which side their bread is buttered on. There's logic here, it's just not very consumer-friendly logic.
-
Really?! I may get him on that virtue alone. Had a whole slew of Generations stuff hit the shelf this week. Everything but Megatron and Blurr, I think.

When you DO find Thunderwing... his two missile launchers can peg together into a single larger gun. And he has JUST BARELY ENOUGH articulation to hold it with a peg in each hand, which makes for some fairly impressive firepower on a bot that size (but still less impressive than Classics Bumblebee/Cliffjumper with Powermaster Prime's guns...). Sadly, the guns will NOT peg into the "detachable recon drone," which is what I thought those tabs on top of the guns were for initially. The spacing's all wrong. But to be fair, the guns are larger than the drone anyways.

The wings can be folded in on the robot so he looks less super-robot-y and more Transformer-y. I'm tempted to tie the wing sweep to the guns, so they're only folded out when he's got the big two-handed gun. Sort of a "super mode" effect. They can also be flipped OUT in jet mode, giving him a forward-swept-wing look, albeit not a very good one. But then, he's not a very good jet... though at least his face and upper legs are hidden, and his lower legs look like they belong there, which is better than SOME jets. In exchange, there's a big hole in the chestplate where the part that hides the face folds up from. And there's just no hiding the arms, though twisting the fists around 90 degrees helps.
-
The best ellipsis use is in Super Robot Wars Alpha, when Heero and Rei have a conversation consisting ENTIRELY of ...s. Ah, yes. How DID we forget Sinistar, the greatest philosopher of our age? http://onastick.net/drew/sinistar/ Though for raw quotability, I prefer Berzerk. "Chicken, fight like a robot!"