I have some tables in my server's database that are too big and produce slow queries, even with the correct indexes in place.
My application needs all of the data because we run an authentication process for RADIUS users (AAA protocol) to decide whether a user may browse the Internet or not, depending on the total time of all his sessions.
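For reference, the kind of query involved looks roughly like the sketch below. The table and column names (`radacct`, `acctsessiontime`, `username`) are only illustrative assumptions, not my real schema; the point is that authentication has to aggregate over every session row a user has.

```sql
-- Illustrative sketch only: hypothetical accounting table and column names.
-- At authentication time we need the total session time accumulated by one
-- user, which means scanning/aggregating all of that user's rows in the
-- large (~8 GB) accounting table.
SELECT username,
       SUM(acctsessiontime) AS total_session_seconds
FROM   radacct
WHERE  username = 'some_user'
GROUP  BY username;
```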
So, with 8 GB of data in one table, what would you advise? Partitioning and sharding are discarded because we already work with disk arrays, so they don't apply. Another option would be to delete rows, but in this case I can't. On the other hand, maybe the only possible solution is to increase the resources (RAM).
Any suggestions?