Well, I can't imagine SQL Server handling your requirements, and I find it unlikely a mission-critical system like this would use MySQL or Postgres (primarily because enterprises like to pay a lot for that 'security' of support), so I'd guess Sybase if it's not Oracle?
The reason I ask is that I'm a fairly highly skilled Oracle guy, and some of the transactions-per-second numbers I've seen quoted for financial systems seem enormous. Maybe I could get Oracle to hit them, but the zero-downtime requirement scares me somewhat: until very recently, keeping Oracle up continuously, even through application upgrades, was fairly tricky!
Actually, the DB we use had similar problems. Schema changes require at least _some_ downtime, if only to lock the tables. The trouble is less the raw number of transactions than the linkage between each transaction and other transactions' liabilities in the system. In effect, serious DBs do in-memory updates; some of our bigger customers even run with no logical-log write to disk on commit (really!), on 128-core Sun boxes (principal and failover). You can get through a lot of tps that way, and the saving in dev cost and maintenance from this monolithic simplicity is great.
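To make the trade-off concrete, here is a toy sketch (not any vendor's actual implementation) of why skipping the log flush on commit buys so much throughput: a durable commit must `fsync` the log record to disk before acknowledging, while the "no logical-log write on commit" mode just leaves the record in OS buffers and trusts the failover box.

```python
import os
import tempfile
import time

def commit(log_file, record, fsync_on_commit):
    """Append a log record; optionally force it to disk before acknowledging."""
    log_file.write(record)
    log_file.flush()
    if fsync_on_commit:
        # Durable: the commit is on stable storage before we return,
        # but every transaction pays the cost of a disk sync.
        os.fsync(log_file.fileno())

def run_workload(n_txns, fsync_on_commit):
    """Time n_txns tiny commits with or without per-commit fsync."""
    with tempfile.NamedTemporaryFile("wb") as log_file:
        start = time.perf_counter()
        for i in range(n_txns):
            commit(log_file, b"txn %d\n" % i, fsync_on_commit)
        return time.perf_counter() - start

if __name__ == "__main__":
    durable = run_workload(200, fsync_on_commit=True)
    relaxed = run_workload(200, fsync_on_commit=False)
    print(f"fsync per commit: {durable:.4f}s; no fsync: {relaxed:.4f}s")
```

On a spinning disk the gap is typically orders of magnitude, which is exactly why shops that can tolerate losing the tail of the log after a crash (because a hot standby has the state) turn the per-commit flush off.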
Makes sense. Thinking about it, most of the DB2 usage I've heard of is behind ATM and core accounting systems in retail banking, rather than trading or low-latency applications...