mysql - Insert/Update Scalability (searching an increasingly longer list for an update match is untenable)
I haven't found a clear answer to this on here.
I'm performing a basic "insert/update" -- funneling new data into a MySQL database of "tickets."
For example:
ticket_id: 154 status: open messages: 2
That ticket is in the db.
An incoming record is either inserted or updated based on ticket_id: if the ticket_id is new, the record is inserted; if it's looked up and found, it's updated. To possibly simplify things further, ticket_ids are incremented sequentially in increasing order -- ticket_id 1 was the first ticket ever, etc.
Here's the problem. Right now I'm insert/updating against 100,000 ticket_ids in the db. Each insert/update write (unlike a pure insert) has to look up each incoming id against those 100,000 ids to determine whether there's a match to update. Each month that grows by another 60,000 tickets, until there are on the order of 1,000,000 ticket_ids being "looked up" during each daily insert/update. That's not scalable. In fact, this seems like an extremely common issue with a regular insert/update into a large MySQL database.
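For concreteness, a minimal sketch of the per-record pattern I'm describing, using the example ticket above (table and column names are just what that example implies):

-- look up the incoming ticket_id first
select ticket_id from tickets where ticket_id = 154;

-- if a row came back, update it ...
update tickets set status = 'open', messages = 2 where ticket_id = 154;

-- ... otherwise insert a new ticket
insert into tickets (ticket_id, status, messages) values (154, 'open', 2);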
Here are some potentially relevant things:
- ticket_ids are unique and increase sequentially
- tickets become status: closed after 30 days of inactivity, which means they are never updated again. That seems key here. I'm just not sure how to technically "ignore" these tickets during the insert/update without "looking them up" every day. One method would be, either daily or monthly, to transfer "closed" tickets to a separate db table, and then use UNION queries when I need to query across both. Thoughts on this? I'm no db admin by any means.
Is that the answer? Two tables, and ticket archiving?
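Roughly what I have in mind for the archiving idea (closed_tickets and last_activity are placeholder names I'm making up here, since I'm not sure how this would actually be structured):

-- periodically move inactive, closed tickets out of the main table
insert into closed_tickets
  select * from tickets
  where status = 'closed' and last_activity < now() - interval 30 day;

delete from tickets
  where status = 'closed' and last_activity < now() - interval 30 day;

-- when I need to report across everything, stitch the two tables together
select ticket_id, status, messages from tickets
union all
select ticket_id, status, messages from closed_tickets;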
And... is there any benefit to indexing ticket_id? I've heard it increases write time but decreases read time.
My problem right now, I think, is the write time of the insert/update, not the SELECT statements. One guy told me an insert/update has to do a select/lookup either way.
The first thing you should do is review the indices you already have:
show create table my_table_name\g
If the upserts are getting slower, adding an index on ticket_id is the place to start. I suggest making it a unique index:
create unique index my_index_name on my_table_name (ticket_id);
Adding indices does slow down inserts, but with a database taking in 60,000 new records per month and holding around 1,000,000 records in total, you won't notice it.