Current backup solutions for Drupal, like the backup module, tend to assume the site is kept inactive during backup to avoid inconsistency problems; they then export the database and files and archive them, so that the site can be restored in the event of a complete data loss.
While this is fine for smaller and less active sites, a better mechanism for bigger sites should provide:
- backup of active sites, possibly by introducing a specific system mode that allows site access but queues updates instead of applying them
- appropriate behaviour on multi-site installations from a single backup mechanism
- ability to perform Drupal-level consistent exports, which DBMS-level replication doesn't allow
- ability to create incremental backups
- ability to perform partial restores
- ability to restore from a basic system, ideally without even Drupal core installed
This raises all sorts of issues, since the global data organization in Drupal does not currently support such a "queuing updates" mode, nor any significant transaction mechanism.
This "queuing updates" mechanism could possibly be extended later on to support server replication without DBMS-level replication.
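As a rough illustration of the "queuing updates" idea, here is a minimal sketch in Python using SQLite. All names and the schema (`backup_queue`, `do_write`, the `backup_in_progress` flag) are invented for the example; a real Drupal implementation would hook into the database abstraction layer instead.

```python
import json
import sqlite3

# Hypothetical sketch of a "queuing updates" backup mode: while a
# backup is in progress, writes are recorded in a queue table instead
# of being applied, then replayed once the backup completes.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE node (nid INTEGER PRIMARY KEY, title TEXT)")
conn.execute("""CREATE TABLE backup_queue (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    sql_text TEXT, params TEXT)""")

backup_in_progress = True  # would be a site-wide flag in Drupal

def do_write(sql, params):
    """Apply a write immediately, or queue it while a backup runs."""
    if backup_in_progress:
        conn.execute(
            "INSERT INTO backup_queue (sql_text, params) VALUES (?, ?)",
            (sql, json.dumps(params)))
    else:
        conn.execute(sql, params)

def replay_queue():
    """Apply queued writes in order once the backup has finished."""
    rows = conn.execute(
        "SELECT sql_text, params FROM backup_queue ORDER BY id").fetchall()
    for sql, params in rows:
        conn.execute(sql, json.loads(params))
    conn.execute("DELETE FROM backup_queue")

# A write arriving mid-backup is queued, not applied...
do_write("INSERT INTO node (title) VALUES (?)", ["created during backup"])
queued = conn.execute("SELECT COUNT(*) FROM backup_queue").fetchone()[0]

# ...and applied once the backup is over.
backup_in_progress = False
replay_queue()
nodes = conn.execute("SELECT COUNT(*) FROM node").fetchone()[0]
```

During the backup the `node` table never changes, so the export sees a frozen, consistent state; afterwards the queued writes are replayed in arrival order.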
OSInet would be glad to provide mentorship to any students who end up undertaking this task.

Comments
It can be achieved with
It can be achieved with triggers that execute "delayed" INSERT, UPDATE and DELETE statements into another table which acts as a backup.
The term "delayed" here does not refer to MySQL's INSERT DELAYED operation.
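The trigger idea can be sketched as follows, using SQLite from Python for illustration (MySQL trigger syntax differs slightly, and the `node`/`node_backup` table names are invented for the example): every write on a table is mirrored into a shadow table that acts as a running change log.

```python
import sqlite3

# Mirror every INSERT/UPDATE/DELETE on `node` into a shadow table
# `node_backup` via triggers, producing a backup/change log.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE node (nid INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE node_backup (op TEXT, nid INTEGER, title TEXT);

CREATE TRIGGER node_ins AFTER INSERT ON node BEGIN
    INSERT INTO node_backup VALUES ('INSERT', NEW.nid, NEW.title);
END;
CREATE TRIGGER node_upd AFTER UPDATE ON node BEGIN
    INSERT INTO node_backup VALUES ('UPDATE', NEW.nid, NEW.title);
END;
CREATE TRIGGER node_del AFTER DELETE ON node BEGIN
    INSERT INTO node_backup VALUES ('DELETE', OLD.nid, OLD.title);
END;
""")

conn.execute("INSERT INTO node (title) VALUES ('hello')")
conn.execute("UPDATE node SET title = 'hello world' WHERE nid = 1")
ops = [row[0] for row in conn.execute("SELECT op FROM node_backup")]
```

One caveat, relevant to the consistency discussion below: such triggers capture every statement, but they cannot tell where one application-level operation ends and the next begins.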
We usually use MySQL replication instead as a backup scheme for high-traffic sites; combined with automatic fail-over, this gives us HA as well.
Agreed
Replicate (you can even replicate to the same server with a second MySQL instance) and use the mysqlhotcopy feature to back up. You can then dump the hotcopy or whatever.
Keeping sync
Indeed it "can be achieved". But this requires some very specific setup and knowledge/mastery, and does not take into account keeping the multiple Drupal instances of a multisite synchronized with the files and code. A backup system like the one I envision would tie together all the disparate mechanisms needed for these three parts of a backup, in a single, easily integrable module that all sites could use.
Also, and maybe most importantly, having such a mechanism would allow Drupal-level consistent snapshots to be taken. When replicating at the DBMS level, the current state is replicated whether or not it is consistent at the application level, meaning that unless the server is first taken down, there is no synchronization point at which every table is in a state where a reimport would result in a consistent system.
Consider the case of a user submitting an edit on a complex node: the save requires several inserts, which are performed as a non-transactional sequence in the DB. If a snapshot is taken at that moment, say on a replica, there is currently no guarantee that it will include only complete updates, because at the DB level every insert is in its own transaction, if any. So one may find oneself restoring inconsistent backups, with no clear way of restoring consistency afterwards.
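The problem can be demonstrated concretely. The sketch below (SQLite via Python; the `node`/`node_revisions` schema is a simplified stand-in for Drupal's tables) simulates an interrupted multi-insert node save, first without and then with an explicit transaction:

```python
import sqlite3

conn = sqlite3.connect(":memory:", isolation_level=None)  # autocommit mode
conn.executescript("""
CREATE TABLE node (nid INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE node_revisions (nid INTEGER, body TEXT);
""")

def save_node(title, with_transaction):
    """Simulate a node save that is interrupted between its inserts."""
    if with_transaction:
        conn.execute("BEGIN")
    try:
        conn.execute("INSERT INTO node (title) VALUES (?)", (title,))
        raise RuntimeError("interrupted mid-save")  # simulated crash/snapshot
        # conn.execute("INSERT INTO node_revisions VALUES (...)")  # never reached
    except RuntimeError:
        if with_transaction:
            conn.execute("ROLLBACK")

# Without a transaction: the first insert is already committed when the
# interruption happens, leaving an orphan node with no revision.
save_node("partial", with_transaction=False)
orphans = conn.execute("""SELECT COUNT(*) FROM node
    WHERE nid NOT IN (SELECT nid FROM node_revisions)""").fetchone()[0]

# With a transaction: the same interruption rolls back everything,
# so no half-saved node survives.
save_node("atomic", with_transaction=True)
total = conn.execute("SELECT COUNT(*) FROM node").fetchone()[0]
```

A DBMS-level snapshot taken between the two inserts is in exactly the orphaned state the first case produces, which is why an application-level synchronization point is needed.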
Higher levels of back-up competency
Higher levels of back-up competency are usually achieved at the DB level, the OS level, and the hardware level (the best, as well as the most expensive, way).
Doing back-up at the software level is the least reliable way to achieve business recovery and continuity, but it is also the cheapest, and Drupal is based on an interpreted language (PHP) rather than being compiled and executed natively like MySQL. Even Java has these problems (yeah, I do Java too -- don't hurt me :D)
I guess the probability of avoiding data loss on back-up is the best we can do at this level -- I'm no expert, but I would say 95% based on experience, which is surprisingly good enough for e-commerce standards.
Well... unless PHP or MySQL starts providing methods to access the OS- and hardware-level back-up routines, in which case we could do a lot more.
+1
Although this could indeed be done at the DB level, many users do not have the skills needed for such backup tasks. Adding this feature to Drupal would certainly improve usability and power.
+1
This touches on a SoC proposal I almost made...
Since the advent of CCK I frequently find myself working on a new feature release for a site that's already deployed. I need to make changes to the content types on the site and perform other administrative DB-based functions, but I also need to avoid losing content that users are contributing to the live site. Simply swapping the dev database for the live one when it's time to deploy is not an option for this reason. Instead the DB work needs to be done on dev, tested, then manually redone on the live DB when it's time to deploy. The various serialized export options scattered about make this easier, but it still stinks.
It would be ideal to have a queued update system similar to what's being proposed. Even if the types of data transferred are relatively basic -- e.g. insert/update/delete for users, nodes, comments -- it would go a long way toward keeping a dev db synched and able to be swapped to live later on.
Of course, all sorts of issues would come up. You'd need a lookup table to keep sequences synched. You'd need to catch a billion references to nid (e.g. nodereference fields), making it only possible to support some modules. It's a big undertaking, but would dovetail with what's being proposed here and would make development much easier.
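The lookup-table idea above can be sketched briefly. This is a hypothetical illustration in Python/SQLite (the `nid_map` table and helper names are invented): when a node is deployed from dev to live, the live site assigns it a new serial, and every later reference to the dev-side nid (e.g. in nodereference fields) must be rewritten through the mapping.

```python
import sqlite3

# Map dev-side nids to the nids the live site assigned on deployment,
# so nid references can be rewritten when content is pushed live.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE nid_map (
    dev_nid INTEGER PRIMARY KEY, live_nid INTEGER)""")

def record_mapping(dev_nid, live_nid):
    """Remember which live nid a deployed dev node received."""
    conn.execute("INSERT INTO nid_map VALUES (?, ?)", (dev_nid, live_nid))

def translate(dev_nid):
    """Rewrite a dev-side nid reference to its live counterpart."""
    row = conn.execute("SELECT live_nid FROM nid_map WHERE dev_nid = ?",
                       (dev_nid,)).fetchone()
    if row is None:
        raise KeyError("nid %d was never deployed to live" % dev_nid)
    return row[0]

# Deploying dev node 7: the live site happens to assign it nid 42.
record_mapping(7, 42)
translated = translate(7)
```

The hard part, as noted, is finding every place a nid is stored, which is why only modules with known schemas could be supported.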
Deployment module
I have been working on a module that is targeted at exactly this problem:
http://drupal.org/project/deploy
Still very early in development, but it does deploy content types, views, and some system settings from server to server (not dealing with nodes just yet, which as you say is more complicated.) I'd welcome any comments you might have.
I had a couple SoC projects I was going to propose tied to this, but I want to release before August :/