Any database knowledgeable folks wanna help me improve our query times?


Jun 1, 2025 1:50 am
As many of you have likely noticed, GP has been running a bit poorly as of late. I'm not totally sure why, but I do know a number of the database calls are sub-optimal. Unfortunately, this is a space where I'm less knowledgeable. I plan on learning what I need to, but if there are folks out there who wouldn't mind giving some feedback/advice, I'd appreciate it.

If you have experience with database optimization, namely with MySQL, I'd love your help! I know which queries are problematic, but not necessarily how to solve them. I'm working through the EXPLAINs, but don't always know how to interpret the output. If nothing else, bouncing ideas/asking questions could help us speed up the queries that are keeping the site slow. If you're willing to help out, let me know, here or on Discord!
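For anyone who wants context, here's roughly what I'm working from (table and column names below are made up for illustration, not our actual schema), plus my understanding so far of which columns matter in the output:

-- Illustrative only; real names differ.
EXPLAIN
SELECT t.title, t.lastPostAt
FROM threads t
WHERE t.forumId = 42
ORDER BY t.lastPostAt DESC
LIMIT 25;

-- Columns worth reading in the output:
--   type  : 'ALL' means a full table scan; 'ref'/'range' means an index is being used
--   key   : which index (if any) MySQL chose
--   rows  : a rough estimate of how many rows it expects to examine
--   Extra : 'Using filesort' or 'Using temporary' often point at an index
--           that doesn't match the ORDER BY / GROUP BY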
Jun 1, 2025 4:02 am
I have expert knowledge of T-SQL and a working knowledge of MySQL. There is a lot of overlap.

I'll take a look and give you my thoughts, but I have never worked with queries through a website, just so you know.
Last edited June 1, 2025 5:11 am
Jun 1, 2025 7:11 am
Some questions for query optimising

Are there indexes on the tables you are querying?
Indexes take extra space in the database, but they are an ordered reference on the column in question. If you are joining two tables on their key columns, having indexes on those key columns will improve your query performance a lot.
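As a rough sketch (table and column names are just placeholders, not the real GP schema), checking for and adding an index looks like this:

-- Placeholder names; adjust to the real schema.
SHOW INDEX FROM posts;    -- see which indexes already exist on the table

-- If posts are usually fetched by thread, an index on that column helps the join:
CREATE INDEX idx_posts_threadId ON posts (threadId);

-- Re-run EXPLAIN afterwards and check that the 'key' column now shows the index:
EXPLAIN
SELECT p.*
FROM threads t
JOIN posts p ON p.threadId = t.id
WHERE t.forumId = 42;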

Is there room for improvement in the queries themselves?
Feed them into ChatGPT and see if it has some optimising suggestions. A classic example is when your joins and WHERE statements are working against each other:
SELECT * FROM tableA a
LEFT JOIN tableB b ON a.key = b.key
WHERE b.column IS NOT NULL

In this example the WHERE clause already throws away every row where tableB had no match, so the LEFT JOIN is doing the work of an inner join. Writing it as an INNER JOIN will give you the same result (assuming b.column is never NULL in tableB itself) and is clearer and usually faster for the optimizer, as shown below.
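To spell out the rewrite (same placeholder names as above):

-- Equivalent INNER JOIN version; the same rows come back, because the
-- WHERE clause above was already discarding the unmatched rows.
SELECT *
FROM tableA a
INNER JOIN tableB b ON a.key = b.key;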

Are your tables getting too big?
Without knowing too much about the GP database setup, could there be a performance gain by moving old inactive games to a separate table? You would have a "hot" table for active games and forum conversations, and after 30(?) days of inactivity, move them to a "cold" table.

Some databases have a built-in feature to handle such things, but I'm just spitballing right now.
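As a very rough sketch of the manual approach (placeholder table and column names, untested, and in practice you would batch it or wrap it in a transaction):

-- Copy games with no activity in the last 30 days into an archive table...
INSERT INTO games_archive
SELECT *
FROM games
WHERE lastActivityAt < NOW() - INTERVAL 30 DAY;

-- ...then remove them from the "hot" table.
DELETE FROM games
WHERE lastActivityAt < NOW() - INTERVAL 30 DAY;

(MySQL's table partitioning is the sort of built-in feature I was thinking of, though it comes with its own restrictions.)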
Last edited June 1, 2025 7:12 am
Jun 1, 2025 7:34 am
runekyndig says:
... but I have never worked with queries through a website, ...
The 'website' aspect just means it's PHP code performing the queries; you can mostly see the SQL statements right there in the code, so it shouldn't matter that the presentation layer is the web. You can take a look at the code on GitHub [link] for queries; maybe you can spot some silly usage we can fix. I'm sure we can find you some juicy examples to look at if you don't spot any right away.
runekyndig says:
... This example, an inner join, will give you the same result, and is faster/better ...
Those are going to be very valuable insights.
runekyndig says:
... could there be a performance gain by moving old inactive games to a separate table? ...
I suspect so, and have been thinking about suggesting this, mainly to take the pressure off the 'home page' load.
runekyndig says:
... and after 30(?) days of inactivity, move them to a "cold" table ...
I suspect we might want something more like 'after a year', but we will need to look at the shape of the data, the access patterns, and the penalty for hitting the cold table.
Jun 1, 2025 9:20 am
Would love to chat with you further on a lot of those details, rune. Yes, we're indexing, but the question becomes whether we're doing it correctly for the queries we're making. I also suspect some of the queries are slow because of their pattern, such as some that may be better as two queries instead of one.
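To illustrate the two-query pattern I mean (entirely made-up names, not our actual schema):

-- One query doing everything, which can force MySQL to build and sort a
-- big intermediate result before it applies the LIMIT:
SELECT t.id, t.title, u.username
FROM threads t
JOIN users u ON u.id = t.lastPosterId
WHERE t.forumId = 42
ORDER BY t.lastPostAt DESC
LIMIT 25;

-- ...versus grabbing the small, indexed slice first:
SELECT id, title, lastPosterId
FROM threads
WHERE forumId = 42
ORDER BY lastPostAt DESC
LIMIT 25;

-- ...and then doing a second cheap lookup for the related rows:
SELECT id, username
FROM users
WHERE id IN (101, 105, 112);  -- the lastPosterId values from the first query (made-up ids)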

As for table sizes and cold data, I don't think that will have a significant impact. For example, our most problematic query involves forums and threads. Neither is a particularly large table, and the EXPLAIN says the largest part of the query deals with 33k rows, which is not a big number at the scale of databases. I'd be happy to send you an example of the current query I'm trying to debug, perhaps over Discord?
Jun 1, 2025 11:14 am
I would love to have a chat on Discord with you, @Keleth. I'll be home in about 4-5 hours, and then I'll look at the GitHub and the SQL there and see if I can spot some low-hanging fruit.
Jun 1, 2025 11:42 am
runekyndig says:
I would love to have a chat on Discord with you, @Keleth. I'll be home in about 4-5 hours, and then I'll look at the GitHub and the SQL there and see if I can spot some low-hanging fruit.
I ran our slow query logs through an analyzer, so I have some idea of the problem points; I just don't know what to change for efficiency. For example, I ran the query I was working on last night through ChatGPT as you suggested, and other than formatting, it didn't have much to say (it did mention swapping to CTEs over subqueries, but best I can tell, that's not going to be a game changer, as it'll still go through the same query optimizer).
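For reference, this is roughly the shape of that suggestion (made-up names again, not our actual query); on MySQL 8 a non-recursive CTE goes through the same optimizer as the derived-table version, so it reads better but shouldn't change the plan much:

-- Derived-table/subquery version, roughly how it's written today:
SELECT f.id, f.title, x.threadCount
FROM forums f
JOIN (
    SELECT forumId, COUNT(*) AS threadCount
    FROM threads
    GROUP BY forumId
) x ON x.forumId = f.id;

-- CTE version ChatGPT suggested:
WITH threadCounts AS (
    SELECT forumId, COUNT(*) AS threadCount
    FROM threads
    GROUP BY forumId
)
SELECT f.id, f.title, tc.threadCount
FROM forums f
JOIN threadCounts tc ON tc.forumId = f.id;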
