I'm looking at a potential issue we might soon face on one of our game backend servers, where over a million or so users hit the server daily, buying assets in our game and doing other game-related things.
Right now, when a user buys an asset it gets inserted as a node with a CCK content type. We were discussing the other day that a signed int(11) only goes up to about 2.1 billion, and while that's plenty for most sites, at our insert rate we could easily hit it in a month or so, and sooner as our customer base grows.
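For a rough sense of the headroom, here's a back-of-the-envelope sketch in plain PHP. The insert rate is a made-up placeholder, not a figure from this thread; substitute your real numbers:

```php
<?php
// Signed MySQL INT maximum. Note the "(11)" in int(11) is only a display
// width; it does not change the storage range.
$int_max    = 2147483647;          // 2^31 - 1
$bigint_max = 9223372036854775807; // 2^63 - 1

// Hypothetical purchase rate -- plug in your own measured value.
$rows_per_day = 70000000;

echo "Days until INT overflow:    " . floor($int_max / $rows_per_day) . "\n";
echo "Days until BIGINT overflow: " . floor($bigint_max / $rows_per_day) . "\n";
```

At tens of millions of rows a day, INT runs out in roughly a month, while BIGINT lasts for hundreds of millions of years, which is why the bigint conversion question matters here.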
I was looking into converting the int to bigint for all the nid columns we use, but I'm not sure that's a good route. I have never worked with a NoSQL DB like Mongo or CouchDB, but I was curious whether I should move the table of purchased game assets into one of those instead.
I guess I'm just starting this thread to hear what could be done in the Drupal world with such a high node load, or whether to offload it to its own DB.
Comments
Do it in code
I would use hook_schema_alter() to change the DB schema in code as well as changing it in the database; you need to change nid and vid anywhere you come across them, since other tables join to the node table. It looks like serial big is supported (http://drupal.org/node/159605), so patching CCK so that the tables it creates use bigint would be the hardest part. This change will make the database slightly slower. You also need to look through other contrib projects to see what joins to the node table via nid or vid.
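As a rough sketch of that approach (a hypothetical "mymodule", assuming the Drupal 6-era Schema API): hook_schema_alter() only updates the schema description in code, so the live tables still need a matching alter, e.g. via hook_update_N(). The node/node_revisions tables shown here are just the starting point; every contrib table storing nid/vid needs the same treatment.

```php
<?php
// Keep the in-code schema in sync: widen node IDs to 64-bit.
function mymodule_schema_alter(&$schema) {
  $schema['node']['fields']['nid']['size'] = 'big';
  $schema['node']['fields']['vid']['size'] = 'big';
  $schema['node_revisions']['fields']['nid']['size'] = 'big';
  $schema['node_revisions']['fields']['vid']['size'] = 'big';
}

// Companion update to alter the live tables (Drupal 6 style signature).
function mymodule_update_6001() {
  $ret = array();
  db_change_field($ret, 'node', 'nid', 'nid', array(
    'type' => 'serial',
    'size' => 'big',
    'unsigned' => TRUE,
    'not null' => TRUE,
  ));
  return $ret;
}
```

Changing a serial primary key in place has extra caveats (keys may need to be dropped and re-added around the alter), so treat this as an outline rather than a drop-in update.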
Mikey, thanks for the reply..
Mikey, thanks for the reply. I have been thinking about this, and since this is our game server I don't necessarily need to save each asset purchase as a node. I'm starting to lean towards a small custom module where I can create my own schema using serial big. I will have to create some indexes for searching, but I am a bit afraid of running a query like select a,b,c from gameassets where uid = 2 on a table that has over a billion records.
I'm also looking at potentially pushing this to Mongo, but damn, so many directions to go, not enough clear data out there, and I can't afford to do trial and error.
Patching nid/vid to bigint sounds like too much of a pain to maintain.
I'm curious what others do that have huge amounts of data in Drupal. I'm not talking just a million, but several million nodes or records being created on a daily basis.
I'm also entertaining the thought of a lightweight node.js --> MongoDB stack, but that's territory I'm unfamiliar with.