Hi, I'm making batch changes to several of our FoxPro databases using C# and ADO.NET. Basically, I am consolidating many lookups into the same key across multiple databases.
After moving them to the proper ID and deleting the unused IDs, I query all of the databases for the lookups they have, compile them into a unique master list, and then make sure that every database has each of those rows. My problem is that once the consolidation happens and the unused rows are deleted, I try to add back rows that used to be there, because they're part of my master list and I need them even though they aren't currently being used.
That's when I get a uniqueness violation error, even though the record has already been deleted.
Example:

1. ID 9060 was not being used, so I first go through and delete every code that isn't referenced by the related tables (to clear out all of the junk, and unfortunately, in some cases, codes I need to keep around as well).
2. 9060 is deleted.
3. I consolidate all of the codes.
4. I go back to my master list and try to insert 9060 again with its proper description (the one that matches the master list), and I get the uniqueness error (a rough code sketch of these steps is below).
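Here is a stripped-down sketch of what my code is doing, assuming the VFP OLE DB provider. The connection string, the table name (`lookupcodes`), and the column names (`id`, `descrip`) are placeholders for my real schema:

```csharp
using System;
using System.Data.OleDb;

class Repro
{
    static void Main()
    {
        // Placeholder connection string/names; the real code loops over several databases.
        string connStr = @"Provider=VFPOLEDB.1;Data Source=C:\Data\SomeDb\;";

        using (var conn = new OleDbConnection(connStr))
        {
            conn.Open();

            // Steps 1-2: delete the unused code (9060).
            using (var del = new OleDbCommand("DELETE FROM lookupcodes WHERE id = ?", conn))
            {
                del.Parameters.AddWithValue("id", 9060);
                del.ExecuteNonQuery();
            }

            // (Step 3: consolidation of the remaining codes happens here.)

            // Step 4: re-add 9060 from the master list.
            using (var ins = new OleDbCommand(
                "INSERT INTO lookupcodes (id, descrip) VALUES (?, ?)", conn))
            {
                ins.Parameters.AddWithValue("id", 9060);
                ins.Parameters.AddWithValue("descrip", "Description from master list");
                ins.ExecuteNonQuery();   // throws OleDbException: uniqueness violation
            }
        }
    }
}
```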
Now, I could go back and rewrite the whole architecture of my software to work around this 'bug', but I'm hoping someone knows a way to rebuild the indexes programmatically (from ADO.NET) so I don't have to.
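For what it's worth, something along these lines is the kind of thing I'm hoping is possible. I don't know whether the VFP OLE DB provider actually allows running PACK/REINDEX like this from a C# connection, so treat the ExecScript call and the names here as my guess, not working code:

```csharp
using System;
using System.Data;
using System.Data.OleDb;

class ReindexSketch
{
    static void Main()
    {
        // Hypothetical: use the provider's ExecScript to run native FoxPro commands.
        // I'm not certain PACK is allowed from a C# connection like this.
        string connStr = @"Provider=VFPOLEDB.1;Data Source=C:\Data\SomeDb\;";

        using (var conn = new OleDbConnection(connStr))
        {
            conn.Open();

            using (var cmd = new OleDbCommand("ExecScript", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("script",
                    "USE lookupcodes EXCLUSIVE\r\n" +
                    "PACK\r\n" +      // physically removes deleted rows and rebuilds the indexes
                    "USE");
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```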
Thanks!