I believe you will find that the answer to this question is "it depends". It depends on the computer being used (e.g. a 1 GB database running on a Raspberry Pi could be considered large, but a 1 GB database running on a 64-core x64 box with 1 TB of memory could be considered small). It also depends on the schema of the database and the workload of queries (inserts, updates, deletes, and selects).
answered 01 Dec '15, 15:38
500 GB with 1,000 active users is often regarded as "large" in the SQL Anywhere community.
Other measurements include statements like...
If it's hard to schedule a full backup then you have a large database.
If you are terrified of doing a restore then you have a large database.
If month-end reporting crushes online update performance then you have a large database.
If it takes days, weeks, months to make a simple schema change then you have a large database.
...where "large database" means "large for you" ( which is another way to say "it depends" :)
answered 01 Dec '15, 16:06
It is not uncommon for our customers to have databases around 300 GB in size on SQL Anywhere 12 and 16, and still growing. Some of them even run on virtual servers.
And to add to Breck's list: it is large if a sequential scan really hurts performance.
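A quick way to check whether a query would resort to a sequential scan is SQL Anywhere's PLAN function, which returns the short-form access plan as a string (a sketch; the table and column names here are hypothetical):

```sql
-- Short-form plan text; a table accessed with "seq" in the
-- output indicates a sequential (full table) scan
SELECT PLAN( 'SELECT * FROM orders WHERE customer_id = 42' );
```

If that scan is over a table measured in hundreds of gigabytes, you will feel it.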
answered 04 Dec '15, 08:33