Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - BlackGriffen

Pages: [1] 2 3 ... 7
1
TalkBack / RE:Virtual Console Mondays: August 13, 2007
« on: August 20, 2007, 11:48:32 PM »
I must say that I find the grinding and cheap shots in Ninja Gaiden more frustrating than anything in the old Metroid. At least in Metroid I'll usually survive (the occasional inescapable pit notwithstanding) and can refill my life. Yes, grinding for life is boring, but it develops a certain insane level of patience that comes in handy in life.

For those who are getting lost, GameFAQs has several maps, one of which consists of screenshots so you can see where you are.

Ah, those were the days - I still remember getting called over when my sister became the first of us to reach Mother Brain, though I was the first to actually beat it. One of my sisters once even managed to work her way out of that bottomless pit in Brinstar without the wave beam. I don't recall exactly how she did it, but it involved letting the blocks solidify around Samus somehow.

Now, GET OFF MY LAWN IF YA CAN'T APPRECIATE IT!

2
TalkBack / RE:Revolution Controller Still Has Secrets
« on: December 02, 2005, 08:19:05 PM »
That's brilliant! The controller has a built-in analog thing-mover! Just put the controller against anything you want to move, push, and it moves! The controller is so well designed that it provides exactly the same amount of force feedback as if you were trying to push the object yourself!*

BlackGriffen

*Nintendo is not liable for damage to either the Nintendo Revolution Controller or whatever Thing you are using the controller to move.

3
TalkBack / RE:Revolution Controller Still Has Secrets
« on: December 02, 2005, 04:17:39 PM »
The controller either transforms into a giant robot that rampages through Tokyo

-or-

it's something like the click on the GC controller - an interesting idea with some potential that few if any developers will use.

BG

4
Nintendo Gaming / RE:Controller and online implications
« on: September 16, 2005, 10:47:15 PM »
Hmmm, PGC seems to have fed that old thread to the bit bucket.

Here is the Wired article that started it all. The short of it: successful online businesses take full advantage of what the net and computers offer to make money where brick-and-mortar businesses cannot, thanks to effectively infinite "shelf space," nearly free reproduction and distribution, and search technology driven by user recommendations and other user data that helps customers sort through it all. If Nintendo really leverages its past library and encourages indie devs to use an online distribution channel too, they should be able to take advantage of "the long tail" and dominate the world!

And thanks, zakkiel. I doubt that I would have had the stand-up-ness to post an about-face like you have.

BG

5
TalkBack / RE:Peter Molyneux Comments on Revolution
« on: August 21, 2005, 01:32:17 PM »
Nintendo hasn't been around for more than a hundred years because they're stupid.

I'm getting really excited about the Rev because it's going to be Nintendo's first home console that Yamauchi didn't have a hand in. Not to disparage the man too much, but I think that Nintendo's hyper-conservative business tactics were primarily his. I still expect Nintendo to be conservative, but not to the crippling extent it used to be.

Show us your stuff, Iwata-san!

BlackGriffen

6
Nintendo Gaming / RE:Fact Only Thread: Rev Tech Specs
« on: August 06, 2005, 10:18:31 AM »
Thanks for the compliment, but after consulting with some people who actually know about this stuff (as opposed to just reading the forum posts of people who know about it, as I have for the last 5 years or so), I would put the chances of a Gekko + other chip at slim to none. The simple reason is that a Gekko would be massive overkill as a boot processor for an EIB. The part Apple uses is called an HC08, a dirt-cheap 8-bit processor that went for a buck or two at retail a few years ago. Add in the work necessary to actually make a Gekko available for use while booted in Rev mode, and the cost would be substantial.

Not to mention that if Nintendo wanted to tweak the main CPU in any other way, and it seems likely that they do, judging from their own statements (custom CPU Broadway, etc.), then the additional cost of making it support the Gekko's custom instructions is comparatively small.

So, there's a remote possibility it could happen, but it's down there in likelihood with Microsoft buying Nintendo.

BlackGriffen

7
Nintendo Gaming / RE:Fact Only Thread: Rev Tech Specs
« on: August 02, 2005, 11:54:03 AM »
A thought recently occurred to me. The Elastic Interface Bus (EIB) that the PPC970 uses is rather complicated and sensitive. For a detailed description of the process of engineering a board for it go here. Long story short - a PPC970 actually needs another CPU in order to help it start its bus and run. My understanding is that in Macs this is a PPC440. I can't think of any reason, however, why Nintendo couldn't use a Gekko to do it. Doing so would certainly simplify backward compatibility - no longer any need to tweak the 970 design to run the Gekko's special instructions (an expensive process).

Now, whether Nintendo could use this extra Gekko for other things, I don't know. I'm not sure if they'd want to, to be honest.

So, I must amend my prediction. I think Nintendo will go with one of the following:

- single core 970 + Gekko
- dual core 970 + Gekko

If Nintendo can actually use the Gekko for running Rev software, it would make sense to run it at a higher speed than the GC's roughly 500 MHz. Currently available G3s suggest an upper limit of around 1 GHz. If they do go that route, then I would expect a Rev with even a single 970 core in it to perform comparably to the competition - worse in some areas, better in others, but well enough to compete regardless.

For those who think this is ridiculous, remember how Sony got backward compatibility in the PS2: they used the PS1 CPU as a sound chip. Since an EIB requires a "wake-up" CPU anyway, I can't think of any reason for Nintendo not to use a Gekko of some form in this role. That is, assuming Nintendo uses a chip with an EIB (a PPE-based chip could also use it, I think), which is not a given, but plausible if Nintendo doesn't want to spend money tweaking.

BlackGriffen

8
Nintendo Gaming / RE:Fact Only Thread: Rev Tech Specs
« on: July 21, 2005, 12:11:10 PM »
Quote

Originally posted by: jasonditz
Since backwards compatibility and design simplicity seem to be the orders of the day, a G4-based CPU would make a lot of sense. Think Gekko with a higher clockspeed and an Altivec engine.

Except that a G4 is not just a G3 + AltiVec. You remember how I said that the G3 had 1.5 integer units? Well, that 1.5 was meant to mean "1 full integer unit + 1 simple integer unit". The difference between the two is that a full integer unit can do anything, whilst the simple one can only do a few operations like add and subtract. The reason it's important to distinguish the two is that a G4 has 1 full unit + 3 simple integer units.

The G4 also has a longer pipeline. If you imagine a CPU as being like an assembly line, then dividing the work up into more stages means that each stage takes less time. Thus the processor can be run at a higher frequency. I'm vague on exactly how many stages are involved, but I believe the G3 has 4, the G4 started with 7 and may have gone up to 14, and the G5/PPC970 has more than 20.
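
To make the assembly line analogy concrete, here's a toy back-of-the-envelope sketch in Python. The 4 ns of total logic and the resulting frequencies are purely illustrative numbers, not real chip data:

Code:

# Toy model: a fixed amount of logic work per instruction, split across
# N pipeline stages. Shorter stages allow a faster clock. Real designs
# pay latch overhead per stage, so actual gains are smaller than this.
TOTAL_LOGIC_NS = 4.0  # pretend total delay through the logic, in ns

for stages in (4, 7, 14, 20):
    stage_ns = TOTAL_LOGIC_NS / stages  # time per stage
    freq_ghz = 1.0 / stage_ns           # 1 / (ns per cycle) = GHz
    print(f"{stages:2d} stages: {stage_ns:.2f} ns/stage -> {freq_ghz:.1f} GHz")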

Another big difference is how the CPU communicates with the rest of the system, called the front side bus or FSB. The G4 FSB is plain better than the G3 one (more bandwidth and lower latency). Sad thing is, the G4 FSB still sucks by today's standards. That's why Freescale (formerly part of Motorola) is going the route of AMD's Opteron and adding a memory controller to the next G4s. Think of it as cutting out the chip that normally sits between the CPU and RAM.

The biggest problem with a G4 theory is that the G4 is Freescale's baby and nobody else can make it without permission.

Re: PII speed. Oops. I didn't do a lot of in-depth research, as you can tell, and just went by the top speed Intel makes available now. At 450 MHz, the PII is probably close enough to being as fast as the Gekko to make it plausible.

Re: KDR. Very interesting, if true.

Also, there are now more details available on the new single core PPC970s. This PDF has a table of maximum power usage at various temperatures. Important data follows (power in watts @ 85 C, power in watts @ 105 C):
  • 1.0 GHz: 12 W, 13 W
  • 1.2 GHz: 15 W, 16 W
  • 1.4 GHz: 18 W, 19 W
  • 1.6 GHz: 20 W, 21 W
  • 2.0 GHz: 47 W, 50 W
  • 2.2 GHz: 56 W, 60 W

For the non-power optimized parts:
  • 1.6 GHz: 27 W, 29 W
  • 1.8 GHz: 35 W, 37 W
  • 2.0 GHz: 56 W, 60 W
  • 2.2 GHz: 71 W, 76 W

Just judging by what I understand to be acceptable in a laptop computer (ballpark max 30 W), Nintendo could go with a 1.6 GHz single core (1.8 GHz if they push it) or, just guessing that the dual-core MP part draws roughly twice the power, a 1.2 GHz dual core (1.4 GHz if they push it).
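
If you want to check that arithmetic, here's a quick sketch (the table values are the 85 C power-optimized figures from above; the 2x dual-core scaling is the same rough guess as in the paragraph, not a measured number):

Code:

# Power-optimized 970FX figures from the table above (watts at 85 C).
power_by_ghz = {1.0: 12, 1.2: 15, 1.4: 18, 1.6: 20, 2.0: 47, 2.2: 56}
BUDGET_W = 30  # ballpark laptop-class power budget

# Fastest single core that fits, and fastest dual core assuming an MP
# part draws roughly twice the single-core power.
best_single = max(g for g, w in power_by_ghz.items() if w <= BUDGET_W)
best_dual = max(g for g, w in power_by_ghz.items() if 2 * w <= BUDGET_W)

print(f"single core fits up to {best_single} GHz")  # 1.6 GHz
print(f"dual core fits up to {best_dual} GHz")      # 1.2 GHz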

Also, a caveat from the data sheet:
Quote

Important: The data in this table are based on the best available simulation data at the time of publication and may be revised after hardware characterization. Because they have not yet been correlated with production-qualified hardware, these estimates should only be used as guidelines for design.

In other words, YMMV.

Something interesting occurred to me, though. If you look just at the execution resources of the PPE, it looks vaguely like a Gekko + AltiVec + SMT - simple integer unit. The PPE's big advantage is that its floating point unit is better than the Gekko's. In other words, we're more likely to see a PPE single or dual core chip than a Gekko evolution.

BlackGriffen

9
Nintendo Gaming / RE:Fact Only Thread: Rev Tech Specs
« on: July 20, 2005, 12:23:40 PM »
KDR already addressed most of your points, ABfA, but I felt that one was worth expanding on. Your argument is that a PII could handle GC games, and I'd bet you're right. The reason you're probably right, though, has nothing to do with architectural prowess and everything to do with the fact that it appears Intel never made them at a speed faster than 333 MHz, while the Gekko is near 500 MHz. If you got a PII that ran at about the same speed and was tweaked with Gekko's special fp instructions, I would be flabbergasted if it couldn't stand in for the Gekko. Sure, the game would have to be recompiled for the new ISA and any assembly replaced, but that's about it.

Look, I'm not saying that the Gekko was outright bad for the GC. It has done its job admirably. It was fine for this gen, where it was competing with a Pentium III and the Emotion Engine, but it doesn't have the legs needed to perform well next gen. It has architectural features that were necessary at the time but that are severe shortcomings on the performance front now. It's just like how the Opteron outperforms Pentiums because of architectural features - not some supposed strength of its instruction set architecture (ISA). Specifically, the integrated memory controller and the more abundant floating point execution resources help on this front.

Also, re: ease of programming of the Cell. I'll believe it when I see it. The PS2 has three processor cores: the Emotion Engine general purpose processor and two vector units, V0 and V1. The Cell has the PPE (general purpose processor) and 7 SPEs (mainly vector units that look to be beefier than V0 and V1). If I recall correctly, Sony originally planned to use the Cell by itself, the way the PS2 got by with just the EE + V0 and V1. The Cell worked so well and was so easy to program for that they brought Nvidia on board to provide a graphics chip. In fact, that live demo at E3 was just running on the PPE and the graphics chip - the vector units weren't even involved. In general, it just plain isn't easy to use multiple cores in parallel to do a job faster than one beefier core.

So, like I said, I'll believe that they can automagically make the Cell easier to program for when I see it.

BG

10
Nintendo Gaming / RE: Fact Only Thread: Rev Tech Specs
« on: July 19, 2005, 05:53:05 PM »
Also, re: APIs remaining the same. All that means is that the machine will be able to handle OpenGL (a standard feature of all graphics cards) and have a PowerPC processor that has been tweaked to handle Gekko's special instructions.

It also means a bunch of other minor stuff like the ability to handle the GC audio and any specialized bits from the Flipper. These things are not a big deal - the audio stuff will, I imagine, be incorporated into the NEC part and ATI will naturally be able to include any of the other special sauce from Flipper into Hollywood.

BG

11
Nintendo Gaming / RE:Fact Only Thread: Rev Tech Specs
« on: July 19, 2005, 05:47:03 PM »
I, for one, will be stunned if Nintendo goes with a Gekko/G3/PPC750 lineage processor. Why? Sorry to burst your bubble, folks, but the thing is weak as hell!

Let me count the ways:
* No G3 has been clocked faster than 1.1 GHz, ever. The pipeline is simply too short to clock higher.
* The execution resources on the G3 are anemic: 1.5 integer units, 1 floating point unit, 1 load/store unit, and zero vector units. The couple of extra special floating point instructions in the Gekko aside, the thing is comparable to a Pentium II in both age and oomph!
* The front side bus (the thing that hooks the CPU to the rest of the system) on G3s (likely also Gekko, but that's information I don't have) is likewise too slow.
* It has zero ability to execute instructions out of order.
* It does not have the hardware necessary to play well with other CPUs - i.e. no multi-CPU or multi-core support.

Changing any of these things would require enough of an investment on Nintendo's part that it would be cheaper to just go with one of the other solutions IBM provides. Hell, Nintendo would be better off with a Motorola/Freescale G4 than with a G3, hands down.

We know that they went with IBM, though, so that leaves two realistic options: PPC970/G5 and PPE/Cell. They both have their upsides and downsides. A review:

PPC970:
+ Beefy execution resources: 2 integer, 2 floating point, 2 load/store, and 1 AltiVec
+ Out of order execution (OOOE) - makes it easier to program for (Cell and Xenon, by comparison, will have steep learning curves) and also significantly improves performance of code with branches (think AI, game control, etc.).
- May only be able to squeeze one into the tiny Rev.
- Can only run one software thread at a time

PPE:
+ Will be able to run at higher frequency.
+ SMT = can run two threads at once
+ Competitors will be using it, making ports easier
+ Guaranteed dual core at least.
- Anemic execution resources: 1 integer, 1 floating point, 1 load/store, and 1 AltiVec
- Zero OOOE - rumors have developers complaining that the slowdown for branchy code is as much as a factor of 10 compared with standard PC CPUs of similar speed! Expect it to get better as coders learn some tricks, but the gap will never completely close; the PPE is optimized for cheap multi-thread throughput (which is why each core runs two threads despite its slim execution resources), not single thread speed.

That's it. If Nintendo goes with a PPE, don't expect their processor to be any better than the XBox360's. Indeed, going this route Nintendo would likely opt for two cores instead of three, IMHO, with a clock speed of 2.5 to 3 GHz. If Nintendo goes the PPC970 route, expect numbers in line with what's at the top of the post. That is, unless IBM comes up with a miracle - something they haven't been known for these days.

Nintendo could, hypothetically, go asymmetric like Sony (Cell = 1 PPE + 7 SPEs; asymmetric because not all the cores are the same) and have one PPC970 and one PPE in the chip. I don't think it's likely, but I can't think of an explicit reason why it couldn't be done. Just several reasons why it wouldn't be done: harder to program for, IBM would probably ask for a money hat to crowbar the two together, etc. Still an intriguing idea...

BlackGriffen

12
Nintendo Gaming / Fact Only Thread: Rev Tech Specs
« on: July 17, 2005, 12:52:29 PM »
CPU: made by IBM
GPU: made by ATI
LSI (bridge chip, I believe): made by NEC

Barring yet another custom variant of the PPE used in the PS3's Cell and the XBox360, it looks likely that Nintendo will be using some variant of the PPC970 (aka G5).

Given the historical rate of improvement at IBM, I expect that any 970 Ninty might use would have to be on this list:
  • 970MP (dual core, 1.4-2.5 GHz, 1 MB L2 cache per core [2 MB total], can optionally turn off one core for low power usage)
  • 970FX low power (uses 13 W* at 1.4 GHz, 16 W* at 1.6 GHz, runs up to 2.7 GHz, 512 KB L2 cache)
*Power usage figures are typical, not maximum - hardware design needs to handle maximum.

Those are Nintendo's 970 options. It is not safe to speculate about the power consumption of the MP model based on the FX - the sizes of the caches are different, for starters. Suffice it to say, given the small size of the Rev, I would be surprised to see an MP at anything faster than 1.4 GHz, if we see one at all. If Nintendo goes single core, I would guess that 1.8 GHz would be the top speed they could use. Maybe 2.0 GHz if they push it.

Remember, however, that Nintendo may not take this route. They may opt for something based on the same much simpler cores that MS and Sony are using.

BlackGriffen

13
Nintendo Gaming / RE:On Emulation on the Rev
« on: June 24, 2005, 07:18:40 PM »
Quote

Originally posted by: stevey
oh! so there just putting snes sprites in nes games. I hope they put lttp sprite in loz .

I'm pretty sure that they just put screen shots of the original SMB and the Super Mario All-Stars version side by side.

BG

14
Nintendo Gaming / RE: On Emulation on the Rev
« on: June 22, 2005, 06:36:03 PM »
Here's relevant news

Looks like it's settled: it will happen. It's just a question of extent.

BG  

15
Quote

Originally posted by: Savior
Theres no Saturn Emulator made yet. Which is a shame. Id kill for some Nights, or some Panzeer Dragoon.

Master System, Genesis, and Maybe 32X would be fine though. Love to get me some Zillion, and Alex Kidd. Oh and Some SHinobi? Wow.

Use the Google, Savior.

At least, there are several links that claim to go to Saturn emulators.

BlackGriffen

16
Quote

Originally posted by: TVman
Couldn't Sega put the emulators on a game disk so that you could just play the downladed games?

Even easier - make the emulators freely downloadable 'games'.

BG

17
Funny you should mention porting SEGA games, because there's a Sonic the Hedgehog ROM floating around the net that is supposed to work with SNES emulators. I haven't tried it myself, because if I want to play Sonic I'll fire up my Genesis or GameCube (I have the game for both). I don't know what the origin of the ROM is, but I would suspect one of the following: a leaked port SEGA made to hedge its bets, a fan-made port (changing the Genesis ROM in whatever ways necessary), or a fan-made rip-off (built up from scratch using only the sprites from the Genesis version). Of those, honestly, the first would be the biggest surprise to me.

So it is possible, but certainly more trouble than it's worth, as Ian noted. Especially since multiple searches on Google indicate that there are emulators for every system SEGA ever made, including the Game Gear, Sega CD, and the 32X (Genesis add-on). And all of them were made without the detailed technical knowledge that engineers at SEGA would have at their disposal. Many of them are also open source, so SEGA could even piggyback on their effort.

That said, even though it would be nice for SEGA to do this, they would likely also bring their emulators and such to the other consoles to maximize their own profits. Since all the next gen consoles will be PowerPC based, it should take minimal effort on SEGA's part. I wouldn't, therefore, count on it becoming a Rev advantage, if it's done at all.

BlackGriffen

18
I have to agree with the "I hope not" camp. Absolutely zero tactile feedback = bad bad bad.

BG

19
Quote

Originally posted by: bmfrosty
1080i is the same pixel bandwith 1080p(30).  
They are both 30 FRAMES per second.  
On typical (correct) HDTV displays the display refreshes 60 times per second.
In 1080p(30), the image changes every other refresh.  
In 1080i, the image changes every refresh, but only the even or odd scanlines.

I could go on, but I won't.  BlackGriffen will return with more wrong assumptions about the standard I hate and am all too intimate with.  I won't correct him anymore.  It's not worth my time.

Ok, genius. If 1080i and 1080p are exactly the same pixel bandwidth, then why doesn't the original XBox support both? After all, same pixel bandwidth = same processing requirements.

On a final note, you too seem to be confusing the broadcast standard with what the TVs are capable of doing.

Here are some more people you need to go correct, oh infallible sourpuss:
From this thread
Quote

Actually I don't know of any HDTVs that can accept 1080p at 30 fps. Since 60 Hz is the refresh rate of the display if it can accept 1080p is usually only accepts it at 60 fps. There are only two displays that I know of for under $10K that can currently display and accept 1080p at 60 fps and that is the 37" Westinghouse for $2500 and the 45" Sharp for $9000. The 37" Benq might be able to but there has not yet been any confirmation on that.

Communications Engineering & Design Magazine
Quote

[...]
That trend was in evidence at the recent National Association of Broadcasters (NAB) convention, according to Chuck Pagano, ESPN’s senior vice president of technology engineering and operations.
[...]
Even if such equipment does come to market, probably the biggest issue standing between 1080p and commercial service reality is transport bandwidth. By combining the higher resolution of 1080 with the greater refresh rate of progressive, 1080p is even more data-dense and could soak up even greater bandwidth.

That starts with the video produced from 1080p cameras. At present, 720p and 1080i cameras output video at about 1.5 Gigabits per second, but 1080p would roughly double that to 3 Gbps, Pagano says. To convert that into a standard 19.4 Megabit per second channel for transmission across a cable network, “there’s a whole set of other technologies that have got to be accomplished in between there.”
[...]

In other words, 1080p/60 isn't in the present broadcast standards because of bandwidth requirements broadcasters cannot meet, but that does not preclude TVs from receiving such a signal in situations where bandwidth is not limited (i.e. from a console, computer, etc.). Equally important, it does not preclude it from being adopted in the broadcast spec at a later date.

Now, who should I believe? I'm genuinely torn between the engineering magazine reporter with ESPN's senior VP of technology and engineering on her side, and the infallible one here before me in the PGC forums. Whatever shall I do?

BlackGriffen

20
Quote

Originally posted by: bmfrosty
Quote

Originally posted by: BlackGriffen

Progressive scan runs at 60 full frames per second.

BlackGriffen


No it doesn't. see the bottom of the following article:

Article

or Page 31 of the standard itself:

The Standard

or Page 24 of the Guide to the Use of the ATSC Digital Television Standard.

The Guide

And just so you don't forget.  I maintain that the ATSC standard sucks.

Congratulations, you caught me unaware of the technicality that the 1080 formats don't support 60 fps. Whooptie-freakin'-do. That wasn't the point I was responding to, although it would affect my comparative pixel bandwidth assessments from earlier. Except, that is, for the one you mentioned. If you had paid closer attention to the link you provided, you'd have seen that both 1080 formats are limited to 30 fps. If 1080p were limited to 30 and 1080i were permitted to go at 60, you'd have been right, but because they are the same frame rate you were still wrong.

So, ranked by pixel bandwidth under those frame rate limitations, it looks like it goes: 1080i, 720p, 1080p. Because of that, I would prefer 720p overall because it has the best refresh rate. 1080i becomes the acceptable minimum.

BlackGriffen

21
Quote

Originally posted by: anubis6789
I don't mean to be a know it all , but BlackGriffen 720i is not a standard ATSC (or any other digital television standard that I know of) resolution, 720p on the otherhand is. Just wanted to let you know.

I beg to differ. Perhaps you're confusing what the TV can receive with what MPEG-2 can encode? Regardless, your point is moot. Even if the TV doesn't accept a 720i signal, Nintendo can produce an effective 720i by only updating the even or odd lines in the frame buffer.

Thanks for saying something, though.

Quote

Originally posted by: bmfrosty
1080p and 1080i have the same pixel bandwidth. I prefer 720p as it's progressive and 60fps. And oh yeah. The ATSC standard sucks ass.

I believe you're incorrect, and here's why. Progressive scan runs at 60 full frames per second. Interlaced runs at 30 full frames per second, rendered as 60 fields per second, where each field is half the size of a full frame. Thus, unless the card is rendering the full frame and then discarding half of it on every render pass (a technological decision that I doubt is necessary; otherwise the original XBox would support 1080p as well as 1080i), the graphics chip only has to handle half the pixels per second to render interlaced frames.
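
If it helps, here's that arithmetic spelled out, under my assumption above that the chip renders only the 540 even or odd lines per field rather than a full frame:

Code:

# Pixels per second under the assumptions above: progressive = 60 full
# frames/s; interlaced = 60 fields/s, each field holding half the lines.
frame = 1920 * 1080  # one full 1080-line frame

progressive = frame * 60        # 1080p: 60 full frames per second
interlaced = (frame // 2) * 60  # 1080i: 60 half-size fields per second

print(f"1080p60: {progressive:,} pixels/s")  # 124,416,000
print(f"1080i:   {interlaced:,} pixels/s")   # 62,208,000, half the work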

BlackGriffen

22
Thanks for the advice, Ian. I wasn't really trying to be convincing, though. Mostly I was simply being sincere. I have no illusions that the content of any one email is going to have any effect whatsoever. If this works, it will be because of the sheer mass of emails sent after the story hit IGN. I seriously doubt that any more information will reach the decision makers beyond "X pro HD, Y con" (and you can bet there will be at least a few con-HD emails), if that.

The only reason I wrote what I did is that I meant it. If Nintendo doesn't support at least 720i, I'm not going to buy a Rev until a year or two into its life, if then. They can make it optional, available in the American market only, etc., but they should at least do that. Without even the option of token support, I see it as tremendously damaging to the Rev's image in the U.S. The problem is twofold: the technogeeks who already have 16:9 screens, and anyone who wants their console to be at least somewhat future-proof because they want to buy a 16:9 screen at some point in the future.

I honestly find it unusual that this information came out at all. This is the kind of thing you let slip under the cover of lots of positive news. That is, unless Nintendo is testing whether they can get away with it by judging the reaction. If that's the case, more information may get back to the decision makers than just "pro" and "con." Not a whole lot more, mind you - something like "make or break pro," "pro," "indifferent," "con." In which case, stick me in the make-or-break-pro column, because there is no technical reason whatsoever that the Rev can't perform well on a 720i screen, which has about 1/3 the number of pixels of the benchmarked system in my previous post. After all, the three main factors in graphics performance seem to be "how complex is what you're drawing," "how sophisticated are the drawing algorithms," and "how many pixels do you have to push?" Given that 720i is demonstrably not that many pixels for a GPU to push (a little more than half of a low res computer monitor), I don't see it significantly degrading game performance.

BlackGriffen

23
This was the email I sent to NOA on the matter:
Quote

I will not buy a console that is going to be a dead end, period. It's bad enough that the Revolution is planned to not support the impending broadcast standard in the United States. What's worse is the bad publicity it will generate. It will kill the Revolution before it launches.

Every relevant argument given in favor of excluding HD output only applies to not making HD mandatory for developers. Not making HD mandatory is acceptable. Excluding HD entirely is not.

The GameCube supports 480p. At bare minimum, the Revolution should offer support of at least one 16:9 format. It's 50% more pixels per second to push than 480p, but it would look better than any 4:3 format on an HD television. Even better would be if the Rev supported 1080i like the XBox. Best would be support for 1080p, but I won't hold my breath.

Considering that Nintendo is working with ATI, a PC graphics chip manufacturer, and graphics chips are designed to run computer displays at resolutions in excess of 1280 by 1024, a resolution with 26% more pixels than 1080i, I would expect no less from Nintendo's next console. Consider this graphics card:

http://www.zipzoomfly.com/jsp/ProductDetail.jsp?ProductCode=320776

by ATI. It has 32MB of video RAM and is based on 3 year old technology. In the manufacturer's specifications, it states:

" -Crisp and clear 32-bit 3D resolutions up to 1900 x 1200"

Given that the Rev should have more video RAM than that, it should be able to perform even better. Now, I understand that there are issues of frame rate and effects to consider, but 1080i, which is less than half the pixels, is perfectly reasonable.

My Real Name
Proud owner of every Nintendo console ever made, but having serious doubts about the Revolution and Nintendo's future.

This was the canned response:
Quote

Message(#6851-000443-3918\4433918)

Hi!

Thanks for letting us know how you feel.  We appreciate you giving us your feedback and we will be forwarding it on to the appropriate people for review.

There will be more details released about the Revolution in the future so stay tuned to www.nintendo.com for more information.  We are confident that gamers and non-gamers alike will support our focus on fun, innovation, and affordability.  Once you have a chance to play games on the Revolution, we think you will!  

Sincerely,

Nintendo of America Inc.

Nintendo's home page: http://www.nintendo.com/
Power Line (Automated Product Info): (425) 885-7529

-----
ORIGINAL MESSAGE:
[...]

For those not initiated, here is how to calculate the number of pixels a card has to push per frame:
4:3 aspect ratio - square the height (in pixels) then multiply by 4/3
16:9 (all HD) - square the height then multiply by 16/9
all others: multiply height by width
if interlaced - divide by 2

For comparison's sake, I'm going to list some common resolutions and their total pixels, along with the percentage increase over the previous size (the short script after the list reproduces these numbers).
480i: 153600 (N/A)
480p: 307200 (100%)
720i: 460800 (50%)
1024 X 768: 786432 (70.7%) <- Very common, not high res at all, computer monitor resolution (I don't know if they make monitors with a lower native resolution than this any more)
720p: 921600 (17.2%)
1080i: 1036800 (12.5%)
1280 X 1024: 1310720 (26.4%) <- Another common monitor resolution
1080p: 2073600 (58.2%)
1900 X 1200: 2280000 (10.0%) <- Resolution mentioned in the specs for a Radeon 7000 GPU with 32 MB of VRAM (ie dirt cheap and old as the hills).
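
And for anyone who'd rather not do the multiplication by hand, here's a little script that reproduces the list above (the helper function and its name are mine):

Code:

# Pixels a card has to push per frame, per the recipe above.
def pixels_per_frame(height, aspect, interlaced=False):
    px = round(height * height * aspect)  # width = height * aspect
    return px // 2 if interlaced else px

modes = [
    ("480i", pixels_per_frame(480, 4 / 3, interlaced=True)),
    ("480p", pixels_per_frame(480, 4 / 3)),
    ("720i", pixels_per_frame(720, 16 / 9, interlaced=True)),
    ("1024x768", 1024 * 768),
    ("720p", pixels_per_frame(720, 16 / 9)),
    ("1080i", pixels_per_frame(1080, 16 / 9, interlaced=True)),
    ("1280x1024", 1280 * 1024),
    ("1080p", pixels_per_frame(1080, 16 / 9)),
    ("1900x1200", 1900 * 1200),
]

prev = None
for name, px in modes:
    pct = "" if prev is None else f" (+{100 * (px / prev - 1):.1f}%)"
    print(f"{name}: {px}{pct}")
    prev = px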

Now, consider the ATI Mobility Radeon X800 XT benchmarks at the link. Remember that this is a Mobility product, so they should be able to squeeze it into the Rev's small form factor.

To summarize the results, I'll list the games that make the cut (about 60 fps or more) at each setting, to give you some idea of the graphics quality you could expect on a screen this size at full frame rate.

Game list: Doom 3, Far Cry, Half Life 2, Splinter Cell: CT, UT 2004
1280 x 1024, full effects: Half Life 2, Far Cry
ditto, no effects: Doom 3, Far Cry, Half Life 2, Splinter Cell
1680 x 1050 (17.6% more pixels to get 1080p), full effects: Half Life 2
ditto, no effects: Half Life 2, Far Cry, Doom 3 (almost)

So, if Nintendo doesn't support more than 1080i, it should be able to do just fine.

This also isn't about developers, because then Nintendo would only have to make HD support optional. There are only two reasons I can come up with to make this decision:

1 - save a few cents on every console manufactured, assuming Nintendo saves more than the profits lost from console and game sales.
2 - Nintendo doesn't want to look bad for not supporting HD with its in-house software.

Bottom line: if Nintendo supports 1080i I'll be happy, if Nintendo supports 720i I'll grumble but accept it, but if Nintendo doesn't support HD at all I say screw Nintendo.

BlackGriffen

24
Nintendo Gaming / Rev has no Eth Port a Mistake?
« on: May 25, 2005, 07:30:16 PM »
According to IGN's tech specs, the Rev doesn't have an ethernet jack. Specifically:
Quote

No Ethernet jack; Revolution connects to the Internet using 802.11b and 802.11g Wi-Fi wireless

At first I was in disbelief when I read this. Don't get me wrong, wireless is great. Wireless only, however, is a mistake. Getting wireless running takes a US$50+ investment, which is significant; a Cat-5 (ethernet) cable is US$20 or less (depending on length). The hardware necessary for a simple 10/100 ethernet connection is minimal and cheap, so I can't think of a reason not to include it. If Nintendo still chooses not to, though, they should at least support the use of USB ethernet adaptors.

If Nintendo refuses to do both then there is no other way to describe the decision but colossally stupid. It will hurt adoption of their online strategy, for starters.

So, in sum: Nintendo would have to be stupid to exclude a basic ethernet adaptor, and colossally stupid to disallow the use of USB ethernet adaptors as well.

BlackGriffen

25
Nintendo Gaming / RE:On Emulation on the Rev
« on: May 21, 2005, 04:01:46 PM »
Quote

Originally posted by: jasonditz
the amount of design effort a lot of that would take would be prohibitively expensive. Its like putting a team of translators on some old SNES RPGs that never made it stateside.

If they can offer a cookie-cutter visual upgrade without breaking the compatibility (the way Sony's PS2 backward compatibility broke a lot of games), then that's probably worthwhile.

Of course. Everything I've described is literally just an image filter, applied after the emulator renders a frame but before it's displayed.

BG  
