I have a table with 3 columns and more than 17,000,000 rows at the moment, and I will be adding more.
I have attached the PHP script that runs the queries against this table. The script is requested by my desktop app. The problem is that the app works with multiple threads, so the PHP script receives many requests at the same time; the server then returns timed-out errors and some requests don't get the right answer.
I need a way to make my PHP code more efficient so that it can handle many simultaneous requests and run queries against this big table.
I'm open to recommendations to migrate to another programming language or another database, such as NoSQL.
I'm using CentOS Linux.
Before sending me your proposal, please check the attached PHP script to understand how it works. The script is simple and commented.
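For context, a minimal sketch of the kind of change bidders might propose. This is not the attached script (which is not shown here): the table name `mytable`, the columns `col1`/`col2`/`col3`, and the connection details are all placeholders. It assumes an index exists on the searched column (e.g. `CREATE INDEX idx_col1 ON mytable (col1);`), which is usually the first fix for slow lookups on a 17M-row table.

```php
<?php
// Hypothetical sketch -- table, column, and credential names are placeholders,
// since the actual attached PHP script is not included in this post.

// A persistent PDO connection lets concurrent requests reuse existing
// database connections instead of opening a new one each time.
$pdo = new PDO(
    'mysql:host=localhost;dbname=mydb;charset=utf8mb4',
    'user',
    'password',
    [
        PDO::ATTR_PERSISTENT => true,               // reuse connections across requests
        PDO::ATTR_ERRMODE    => PDO::ERRMODE_EXCEPTION,
        PDO::ATTR_TIMEOUT    => 5,                  // fail fast instead of piling up
    ]
);

// A prepared statement with a bound parameter is both safer (no SQL
// injection) and cheaper to re-execute. LIMIT caps the result size so a
// single request cannot scan and ship millions of rows.
$stmt = $pdo->prepare('SELECT col2, col3 FROM mytable WHERE col1 = :key LIMIT 100');
$stmt->execute([':key' => $_GET['key'] ?? '']);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```

Whether this helps depends on what the real script does; without an index on the filtered column, every query still scans the whole table regardless of how the PHP is written.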
23 freelancers are bidding on average $120 for this job
Hello sir, I am available to discuss this project further. We have finished 14 Node.js projects. Examples: [login to view URL], [login to view URL], gnetentertainment.com. Thanks, Michael
Friend, I think you should normalize the table by splitting the information across more tables. Send me a data sample and we could find a way to reduce the number of rows.