The syntax for using it is:
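A minimal sketch of the call: `fn_dblog` takes a starting and ending LSN, and passing `NULL` for both returns every record in the active portion of the log.

```sql
-- Read the entire active transaction log of the current database.
-- fn_dblog is an undocumented function; its behavior may change between versions.
SELECT * FROM fn_dblog(NULL, NULL);
```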
That is easy enough if your database is in the SIMPLE recovery model and you are just doing some testing or research, but what if you are looking at a production database with a huge log file?
Here are 4 ways to filter the results to make research and troubleshooting more manageable:
Solution #1. Copy the log file content into a tempdb table or into another database.
In this sample code I copy the content into a temp table, a permanent table in tempdb, and a permanent table in TestDB. If you do this, be aware of the following:
1. Never copy the Transaction Log data into the same database you are researching. Doing so generates new log records and changes the very log you are trying to read.
2. Be aware of the size of the data. It might take a long while to extract the whole log, and the space required for that table can exceed the space used by the log file itself.
If you go this route, you can easily select the data you need with a WHERE clause.
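The copy-and-filter approach above can be sketched roughly as follows (the table and database names here are illustrative, not from the original post):

```sql
-- Snapshot the log into a temp table so repeated queries
-- hit the copy, not the live log.
SELECT * INTO #TranLog
FROM fn_dblog(NULL, NULL);

-- Alternatively, into a permanent staging table in another database
-- (never in the database under investigation):
-- SELECT * INTO tempdb.dbo.TranLog FROM fn_dblog(NULL, NULL);

-- Now filter the copy at leisure with a WHERE clause:
SELECT [Current LSN], Operation, Context, [Transaction ID]
FROM #TranLog
WHERE Operation = N'LOP_MODIFY_ROW';
```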
Solution #2. Extract only the columns you need. The function "fn_dblog" returns about 130 columns, and it is obvious you do not need all of them.
You can limit the output by selecting only certain columns:
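A sketch of such a narrowed query; the column list here is just one plausible choice for troubleshooting work:

```sql
-- Select only the columns relevant to the investigation
-- instead of all ~130 returned by fn_dblog.
SELECT [Current LSN],
       Operation,
       Context,
       [Transaction ID],
       [Transaction Name],
       [Begin Time]
FROM fn_dblog(NULL, NULL);
```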
However, if you have millions and millions of records in the log file, that still might not be an appropriate solution.
Solution #3. Search for a particular operation. For instance, if you want to find any changes made in your database, you can run the following script:
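A hedged sketch of such a search. It relies on the fact that the transaction name (e.g. `INSERT`, `UPDATE`, `DELETE`) is recorded on the `LOP_BEGIN_XACT` log record; the exact filter values are assumptions, not taken from the original post:

```sql
-- Find the start of every data-changing transaction in the log.
SELECT [Current LSN], [Transaction ID], [Transaction Name], [Begin Time]
FROM fn_dblog(NULL, NULL)
WHERE Operation = N'LOP_BEGIN_XACT'
  AND [Transaction Name] IN (N'INSERT', N'UPDATE', N'DELETE');
```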
In my test DB I got the following:
Here is my last script, which extracts the LSN for only the transaction we need:
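A sketch of what such a script might look like, assuming we are after a single `UPDATE` transaction (the transaction name filter is illustrative):

```sql
-- Capture the starting LSN of the one transaction of interest,
-- so it can be fed back into fn_dblog as a range in the next step.
SELECT [Current LSN], [Transaction ID], [Transaction Name], [Begin Time]
FROM fn_dblog(NULL, NULL)
WHERE Operation = N'LOP_BEGIN_XACT'
  AND [Transaction Name] = N'UPDATE';
```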
Solution #4. Extract only the LSNs you need. Say we need to extract the information associated with an "UPDATE" whose LSN chain starts at "0000004f:00000087:0001". You can simply pass the starting and ending LSNs as "fn_dblog" parameters:
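A sketch of the range call. In my experience fn_dblog expects the LSN string with a "0x" prefix, which the [Current LSN] output column does not include, so the prefix is added here; treat that detail as an assumption to verify on your build:

```sql
-- Read only the log records for one LSN range instead of the whole log.
SELECT [Current LSN], Operation, Context, [Transaction ID]
FROM fn_dblog('0x0000004f:00000087:0001', '0x0000004f:00000087:0001');
```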
That means you do not have to read the entire log into a temp or staging table, and you do not have to scan the entire log again and again to do your research.