Faster, more efficient searches for large quantities of data
I have been given a MySQL database which contains 110 databases within it. Each database within it is built from two tables:
Table A: Contains the database name and the field names or field labels.
Table B: Contains the records, but each record is stored in a single field, separated by "|", and linked back to Table A by DBid.
There are just over 200,000 records at the moment. I have two options: either create 110 individual tables, one for each database, or continue as it is. If I create 110 tables, users would need to search for the word "smith," for example, in all 110. Currently, it's not too bad and results appear quickly, because the search runs against the one table for all the records. However, it is inconvenient for the user. Would a search take a lot longer if I had 110 individual tables to search through with full text search? I have implemented a full text search, but was wondering how much quicker it would be if I had 110 individual tables. I have also been looking into the Soundex function. Would that make the search a lot slower when querying 110 tables?
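To make the layout in the question concrete, here is a minimal sketch of the two-table scheme it describes; the labels and values are hypothetical, and Python dictionaries stand in for the MySQL tables:

```python
# Table A holds each database's field labels, keyed by DBid (names hypothetical).
table_a = {1: ["first_name", "surname", "city"]}

# Table B stores each whole record as one pipe-separated string, linked by DBid.
table_b = [(1, "john|smith|london"), (1, "mary|smith|york")]

# Reconstructing a record means splitting on "|" and pairing with the labels.
records = [dict(zip(table_a[dbid], raw.split("|"))) for dbid, raw in table_b]
print(records[0])  # {'first_name': 'john', 'surname': 'smith', 'city': 'london'}
```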
While I cannot speak to the design choices made here, I can say that, in general, you are better off performing a full text search against a single table rather than against a large group of tables. In this situation, I would expect the overhead of running 110 separate queries to perform worse than a single-table search. Full text search also tends to work better on a larger data set, where the index can eliminate duplicate words.
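The single-table point can be illustrated with a small sketch. This uses SQLite in place of MySQL and a plain LIKE match rather than a FULLTEXT index, and the table and column names are made up, but the shape is the same: because every logical database shares one table tagged by DBid, one query covers all of them at once:

```python
import sqlite3

# Hypothetical single-table layout: rows from all 110 logical databases
# live in one table, distinguished only by their DBid.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (dbid INTEGER, content TEXT)")
conn.executemany(
    "INSERT INTO records VALUES (?, ?)",
    [(1, "john|smith|london"), (2, "ann|jones|leeds"), (3, "mary|smith|york")],
)

# One search touches every logical database in a single pass; with 110
# separate tables this would instead be 110 queries (or a 110-way UNION).
rows = conn.execute(
    "SELECT dbid, content FROM records WHERE content LIKE ? ORDER BY dbid",
    ("%smith%",),
).fetchall()
print(rows)  # matches found in DBid 1 and DBid 3 with one query
```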
As for the Soundex function, it is intended for finding words by sound when the spelling of a word cannot be relied upon. This makes it useful for things such as name searching, where the user may not spell the name correctly. It may be of limited use depending on context.
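To show why Soundex helps with misspelled names, here is a sketch of the classic American Soundex algorithm in Python (MySQL's built-in SOUNDEX() differs slightly in some edge cases, so treat this as an illustration rather than a drop-in equivalent):

```python
def soundex(name: str) -> str:
    """Classic 4-character American Soundex code for a name."""
    # Map consonants to their Soundex digits.
    codes = {}
    for letters, digit in [("bfpv", "1"), ("cgjkqsxz", "2"), ("dt", "3"),
                           ("l", "4"), ("mn", "5"), ("r", "6")]:
        for ch in letters:
            codes[ch] = digit

    name = name.lower()
    result = name[0].upper()          # keep the first letter as-is
    prev = codes.get(name[0], "")
    for ch in name[1:]:
        if ch in "hw":                # h and w do not break a run of like codes
            continue
        digit = codes.get(ch, "")     # vowels map to "" and reset the run
        if digit and digit != prev:
            result += digit
        prev = digit
    return (result + "000")[:4]       # pad or truncate to 4 characters

# "Smith" and "Smyth" collapse to the same code, so a Soundex search
# finds the name even when the user spells it differently.
print(soundex("Smith"), soundex("Smyth"))  # S530 S530
```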
This was first published in April 2006