Hello, there is this online shop we are working on; however, when big discounts come, the server cannot handle all of the requests. I was thinking of moving to a multi-server setup, using a reverse proxy or DNS settings to load-balance the traffic, but having all of the servers connect to a single external database, so that all requests see the same data and product quantities are refreshed in real time.
Is this the correct way to do such a thing?
We don’t know enough about your setup to give you specific advice on what your architecture or options should be, so I will make a few assumptions about your use case and walk through the options.
If your DB supports read replicas, then serving product data in near real time is easy: you add additional read replicas and balance reads across them. Pgpool-II and Patroni (if my memory serves me right) can help you achieve that on Postgres; on MariaDB and MySQL you can look at Galera Cluster, which supports multiple active masters; MongoDB has replica sets built in. Keep in mind that replicas only scale reads; writes (such as stock decrements) still have to go to the primary, unless you run a multi-master setup like Galera. A sketch of what this looks like at the application level follows below.
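For example, here is a minimal sketch of application-level read/write splitting on Postgres, assuming psycopg2 and one primary plus one replica; the hostnames, table, and column names are all illustrative:

```python
# Sketch: reads go to a replica, writes go to the primary.
# Connection strings, table, and column names are assumptions.
import psycopg2

primary = psycopg2.connect("host=db-primary.internal dbname=shop user=app")
replica = psycopg2.connect("host=db-replica.internal dbname=shop user=app")

def get_stock(product_id: int) -> int:
    # Reads can be served by any replica
    with replica.cursor() as cur:
        cur.execute("SELECT stock FROM products WHERE id = %s", (product_id,))
        row = cur.fetchone()
        return row[0] if row else 0

def buy_one(product_id: int) -> bool:
    # Writes must hit the primary; the conditional UPDATE is atomic,
    # so concurrent buyers during a flash sale cannot oversell stock
    with primary.cursor() as cur:
        cur.execute(
            "UPDATE products SET stock = stock - 1 "
            "WHERE id = %s AND stock > 0",
            (product_id,),
        )
        primary.commit()
        return cur.rowcount == 1
```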
How do you balance these? HAProxy is a great free solution that, with a bit of tinkering, can balance your queries accordingly; this requires some work to route traffic to the correct database endpoint, and enough bandwidth to carry the extra hops. Another option is to balance at the DNS level by geographic distance with a solution like Technitium, which routes users to the nearest webapp server, though that does not remove the need for HAProxy in front of the databases.
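As a rough illustration, an haproxy.cfg that spreads read-only Postgres traffic across two replicas could look like the following; the addresses and ports are assumptions, and a production setup would use smarter health checks (Patroni, for instance, exposes an HTTP health endpoint for exactly this):

```
# Sketch: load-balance read-only Postgres connections across two
# replicas. The application points its read connections at port 5433.
defaults
    mode tcp
    timeout connect 5s
    timeout client  30m
    timeout server  30m

listen postgres_reads
    bind *:5433
    balance leastconn
    option tcp-check          # plain TCP connect check; illustrative only
    server replica1 10.0.0.11:5432 check
    server replica2 10.0.0.12:5432 check
```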
For ease of use, you may want to look at hosting your database with a cloud provider (or migrating it to one) and scaling out your web servers there; some work is still needed either way to configure your autoscaling rules and instance configuration.
Over the long run, you’d probably want to redesign your webapp so it can scale horizontally (meaning you can just add more web servers) while sharing the same backend databases, or add some automation there. The usual first step is making the web tier stateless, as in the sketch below.
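A minimal sketch of what "stateless web tier" means in practice: per-user state (here, a shopping cart) lives in a shared store instead of a single server's memory, so any server behind the load balancer can handle any request. This assumes redis-py; the hostname and key names are illustrative:

```python
# Sketch: externalize per-user state to a shared Redis instance so
# web servers are interchangeable. Host and key scheme are assumptions.
import redis

r = redis.Redis(host="shared-redis.internal", port=6379,
                decode_responses=True)

def add_to_cart(user_id: str, product_id: str, qty: int = 1) -> None:
    # HINCRBY is atomic, so concurrent requests landing on different
    # web servers cannot clobber each other's cart updates
    r.hincrby(f"cart:{user_id}", product_id, qty)

def get_cart(user_id: str) -> dict:
    return r.hgetall(f"cart:{user_id}")
```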