I am using BINARY_CHECKSUM to verify data integrity between two servers, but the algorithm apparently is not strong enough and returns false positives (rows whose contents differ can still produce the same checksum). Are there any alternative algorithms that can be used instead of, or combined with, BINARY_CHECKSUM? I am running on SQL Server 2000.
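For illustration, this is roughly the kind of comparison I am running; the linked-server, table, and column names below are simplified stand-ins for the real ones:

-- Compare per-row checksums between the local table and the copy on
-- the second server (reached here through a linked server).
SELECT l.ID
FROM dbo.MyTable AS l
JOIN REMOTESRV.MyDb.dbo.MyTable AS r ON r.ID = l.ID
WHERE BINARY_CHECKSUM(l.Col1, l.Col2) <> BINARY_CHECKSUM(r.Col1, r.Col2)

-- The problem: rows where the column values differ but the checksums
-- happen to collide are not reported, so real differences slip through.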
Rgds
Bertrand
BINARY_CHECKSUM doesn't guarantee uniqueness. If you find matches, you still have to compare the individual columns to eliminate/find the duplicates. There are no other built-in options in SQL Server 2000. In SQL Server 2005 you can use the new HASHBYTES function to compute an MD5 hash, which comes much closer to providing a unique value for your input. Since you are on SQL Server 2000, you should probably compute the MD5 hash of each row outside the database and store it in the table. You can then perform quick comparisons to find the duplicates. And when the hashes do match, you should still check the individual columns.
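For reference, this is roughly what the SQL Server 2005 version looks like; the table and column names are only placeholders, and note that HASHBYTES returns NULL if its input is NULL, so nullable columns need a COALESCE:

-- SQL Server 2005 and later only. dbo.Account and its columns are
-- placeholder names.
SELECT AccountID,
       -- The '|' separator prevents different column values from
       -- concatenating to the same string (e.g. 'ab'+'c' vs 'a'+'bc').
       HASHBYTES('MD5', Name + '|' + CONVERT(varchar(20), Balance)) AS RowHash
FROM dbo.Account

On SQL Server 2000 the equivalent is to compute the MD5 client-side, store it in a varbinary(16) column, and compare the stored hashes between the two servers.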
|||Thank you, you have a point there.
I would still like to experiment with a user-defined function for calculating alternative checksums. I have different checksum algorithms available, but I need to find out how to create a user-defined function that behaves like the BINARY_CHECKSUM function. E.g. it should take * as an argument and compute over the entire record contents. Do you have any suggestions on how I could proceed with this? Can it be done in T-SQL?
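To make the idea concrete, here is the kind of routine I would want to wrap, sketched as a simple Adler-32-style checksum over a varbinary input (untested, and the function name is made up). As far as I can tell a UDF cannot take * the way BINARY_CHECKSUM does, so I would cast and concatenate the columns myself when calling it:

-- Adler-32-style checksum over a varbinary input (SQL Server 2000
-- syntax; dbo.fn_Adler32 is a made-up name).
CREATE FUNCTION dbo.fn_Adler32 (@data varbinary(8000))
RETURNS bigint
AS
BEGIN
    DECLARE @a int, @b int, @i int, @len int
    SELECT @a = 1, @b = 0, @i = 1, @len = DATALENGTH(@data)
    WHILE @i <= @len
    BEGIN
        -- Running sums over the input bytes, modulo 65521 (the largest
        -- prime below 2^16), as in Adler-32.
        SET @a = (@a + CAST(SUBSTRING(@data, @i, 1) AS int)) % 65521
        SET @b = (@b + @a) % 65521
        SET @i = @i + 1
    END
    RETURN CAST(@b AS bigint) * 65536 + @a
END

-- Example call; the caller concatenates the columns explicitly:
-- SELECT dbo.fn_Adler32(CONVERT(varbinary(4000), Col1)
--                       + CONVERT(varbinary(4000), Col2))
-- FROM dbo.MyTable

Of course this is still only a 32-bit checksum, so like BINARY_CHECKSUM it cannot guarantee uniqueness.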
Rgds
Bertrand
|||A "checksum" will never guarantee a unique value. The best way to track changes it to use a change field and a trigger to update the changed field. Either a datetime or a Y/N or something. Then reset the Y/N or do a > last update date on the refresh.