Which container is best suited for holding 10 million records (String, String)?
-
@ksranjith786
As for container classes there is quite handy overview of them here. Actual choice depends on many factors, how the data would be accessed and managed etc. however it is your choice.
As for "deployment of SQL": you actually do not have to deploy anything besides your app's running environment; SQLite is well integrated.
As for the actual storage of that amount of data... You can keep it all in memory using container classes (assuming you have enough memory, of course), you can put it in a memory-based SQLite db, or you can use SQLite with a regular file-based db and write yourself a model that handles the data the way you need it.
A well-written model would be able to cache some of the data in memory, making it available instantly, while performing cache optimization in the background (read ahead / write when idle).
I do not know your use case though, so this is just speculation.
-
This post is deleted!
@ksranjith786
Hi
You should really test with SQLite if your use case is to select among 10 million lines and display a subset. -
This post is deleted!
@ksranjith786 said in Which Container is best suited for holding 10 millions of records (String, String) ?:
Around 5 msec for lookup
You probably need a proper database (not just SQLite; maybe PostgreSQL), index it properly, and maybe even give it standalone resources (i.e. run it on a dedicated machine) to achieve that performance.
-
@VRonin said in Which Container is best suited for holding 10 millions of records (String, String) ?:
You probably need a proper database (not just SQLite; maybe PostgreSQL), index it properly, and maybe even give it standalone resources (i.e. run it on a dedicated machine) to achieve that performance.
Even this may not be viable. On a typical 10/100 network you'd get about 1 ms of latency just from the TCP/IP round trip, which shrinks that 5 ms window considerably.
@ksranjith786
How are you going to use that dataset?
-
Our use case is that our application needs to fetch the offers associated with an item during an item scan.
-
Elaborate on that; break it down step by step for us, and do say what "offers" and "items" are in this context, and most importantly what an "item scan" is.
-
My MySQL db with 3000 records takes at least 1.5 seconds to respond.
-
That's simply too long. You should inspect your database and how you use it.
@VRonin
It just occurred to me that this problem is a prime candidate for using and testing your big hash lib. :)
-
@kshegunov Lol, thanks, but 5 msec is not achievable even in my wildest dreams. I also suspect that 10 million QStrings as keys (the keys are not dumped to the hard drive, only the values are) are enough to blow most machines' memory.