I want to rethink our current production database setup. We use a straightforward, old-school approach with a single MaxDB database.
We don't store huge amounts of data: the biggest database is around 100 GB, with table row counts of about 4-6 million rows.
So nothing special so far. The problem starts at the point where the data is created.
We have lots of 'users', in this case hardware modules that send their data periodically to the server, where it is processed and stored in various tables.
We currently have two scenarios: one where the modules send their data at a specific time of day, so the data of, for example, 5,000-10,000 modules has to be processed at once, which takes a very long time; and one where the modules send their data constantly, so we get a lot of incoming queries per second.
Is PostgreSQL suitable for this kind of workload?
Does Postgres provide any features to balance this load across several data nodes, or do I have to use Postgres-XL for such use cases?
We assume we will face an increasing number of modules sending data to our database, so we need a solution that decreases the time each block of data takes to be written to the database.
For example, we currently need 45 minutes to process the data of nearly 3,500 modules at once. This is not acceptable.
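To make the batch scenario concrete, here is a minimal sketch of the direction I am considering: collecting the readings of many modules into a single payload for PostgreSQL's COPY command instead of issuing one INSERT per row. The table name `module_readings`, the column layout, and the psycopg2 call in the comment are assumptions for illustration, not our actual schema.

```python
import io

def build_copy_buffer(readings):
    """Format (module_id, ts, value) tuples as tab-separated text,
    the default input format for COPY ... FROM STDIN."""
    buf = io.StringIO()
    for module_id, ts, value in readings:
        buf.write(f"{module_id}\t{ts}\t{value}\n")
    buf.seek(0)
    return buf

# One batch covering several modules (hypothetical sample data):
readings = [
    (1, "2016-03-02T10:00:00", 42.5),
    (2, "2016-03-02T10:00:00", 17.0),
]
buf = build_copy_buffer(readings)

# With a live psycopg2 cursor, the whole batch would then be loaded
# in a single round trip rather than thousands of individual INSERTs:
# cur.copy_expert(
#     "COPY module_readings (module_id, ts, value) FROM STDIN", buf)
```

Bulk-loading like this is typically far faster than row-by-row inserts, which is why I am asking whether PostgreSQL is the right platform to pursue this on.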
Is PostgreSQL the right choice to do my research on?
Copyright notice: content author 「Jan S.」, reproduced under the CC 4.0 BY-SA license with a link to the original source and this disclaimer.
Link to original article: https://stackoverflow.com/questions/35747470/postgresql-a-good-choise-for-load-balancing-needs