I have a CSV file with 1000 rows and 15 columns. I'm planning to store each row as a simple JSON object in a Redis value, with the row number of the CSV file as the Redis key. Once I start uploading multiple CSV files to Redis, what's the best way to distinguish one CSV file from another in the DB? Isn't Redis just key-values, with no larger structures for grouping keys other than different Redis DB instances themselves? I know Redis is in-memory and all, but isn't it a bit inefficient to go search for a desired key if I have millions of records? I don't get it. If I uploaded 1 million CSV files, and each had 1000 rows, that's 1 billion records to search through, which seems like a lot for an in-memory DB. There should be a better way.
I'm looking for a way to represent each of the CSV files in Redis in an efficient and sensible manner, where each CSV file row has its own unique key, and the Redis value holds the column headers and data.
How can I accomplish this?
One solution would be for each key to represent an entire CSV file, but I'm looking to see if there's a better option.
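For example, a rough sketch of that fallback (using the node redis client; `csvname` and `rows` are just placeholder names here):

// one key per entire CSV file; `rows` is the parsed array of row objects
await client.set('csvfile:' + csvname, JSON.stringify(rows));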
You can get creative here.
You don't specify whether it's important to keep each of the CSVs separate, or if all of the data can be lumped into one "group"-ish type of logical structure (like rows in a table in an RDBMS).
Operating on the assumption that all of the data can go together, you can keep one key that increments a global counter for the ids of each "row":
// written with async/await for easy writing/reading
// `csv` is the parsed array of JSON objects
let rowid = parseInt(await client.get('csv row counter'), 10) || 1;
for (let i = 0; i < csv.length; i++) {
  await client.set('csv-' + (rowid + i), JSON.stringify(csv[i]));
}
await client.set('csv row counter', rowid + csv.length);
If you need to use this method, make sure rowid is scoped correctly so that simultaneous uploads increment the counter appropriately.
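One way to sidestep that race entirely is to let Redis reserve a block of ids atomically with INCRBY before writing the rows. A minimal sketch, assuming node-redis v4 and its camelCase command names:

// INCRBY is atomic, so concurrent uploads each receive a distinct id block.
// incrBy returns the counter's new value after adding csv.length.
const end = await client.incrBy('csv row counter', csv.length);
const start = end - csv.length + 1; // first id in this upload's reserved block
for (let i = 0; i < csv.length; i++) {
  await client.set('csv-' + (start + i), JSON.stringify(csv[i]));
}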
If, instead, you need to keep track of each CSV separately, you'll need to name each row key with a reference to its CSV:
// `csvname` has been initialized to the chosen name for this CSV
for (let i = 0; i < csv.length; i++) {
  await client.set(csvname + '-' + i, JSON.stringify(csv[i]));
}
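As for the efficiency worry: fetching a key in Redis is an O(1) hash lookup, not a scan, so millions of keys aren't a problem. Reading a row back under the same assumptions as above:

// fetching row 42 of a given CSV is a single O(1) GET,
// even with millions of keys in the database
const raw = await client.get(csvname + '-' + 42);
const row = raw ? JSON.parse(raw) : null;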