
Squidwork’s garage is launching another neat plugin that fetches browsing history from the 4 major browsers found on Windows systems. SurfLog collects browser history for Chrome, Internet Explorer, Firefox, and Safari and stores that data in the LabTech database. Version 1.0 of the plugin reads the last 30 days of logs and displays them on the computer console, lets you launch a new scan, and refreshes your current view. You can also use the Clear All Logs button to remove all current logs for the computer you’re viewing, or export the data to Excel.

 

[Screenshot: client stats]

 

[Screenshot: client browsing history]

 

Each column is sortable, so you can view the data by Browser Type, URL, Site Title, User Profile, or the number of visits to a URL. The plugin comes with one script that collects the data; it should be scheduled against Windows client groups to run once a day.

The SurfLog script only grabs the last 24 hours of logs, to keep from getting more data than LT scripting can import correctly into the LT database. For this reason you should schedule the script to run every 24 hours or less. It will not duplicate any entries in the database, so you can run it every hour if you like and only new items will be added. The SurfLog script also manages the retention policy and will clear out logs based on the policy you set. You can set the policy on the Info tab of the Client console, under the [SurfLog] sub-tab.
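The no-duplicates import described above can be sketched as follows. This is a minimal stand-in using SQLite, not the plugin's actual code: the real script writes to the LabTech MySQL database, and the table and column names here are hypothetical.

```python
import sqlite3

def import_history(conn, rows):
    """Insert history rows, skipping any (computer, url, visit_time)
    combination already present, so repeated runs add only new items."""
    conn.execute("""
        CREATE TABLE IF NOT EXISTS surflog (
            computer_id INTEGER,
            url TEXT,
            visit_time TEXT,
            PRIMARY KEY (computer_id, url, visit_time)
        )""")
    # INSERT OR IGNORE makes the import idempotent: running the
    # collector every hour only inserts rows not seen before.
    conn.executemany("INSERT OR IGNORE INTO surflog VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
rows = [(1, "http://example.com", "2014-01-01 10:00:00"),
        (1, "http://example.com", "2014-01-01 10:00:00"),  # duplicate entry
        (1, "http://example.org", "2014-01-01 11:00:00")]
import_history(conn, rows)
import_history(conn, rows)  # a second run adds nothing new
count = conn.execute("SELECT COUNT(*) FROM surflog").fetchone()[0]
print(count)  # 2
```

On MySQL the same effect would come from a unique key plus `INSERT IGNORE`; the idea is identical.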

Enable SurfLog on a per client basis

Client Console ->Info Tab -> SurfLog Tab -> Enable Surflog Collection

You can enable SurfLog collection on a per-client basis by enabling each client that you would like the collection process to run on. Schedule the script to run against all Windows systems in LabTech, then use the enable feature to allow or block the scanner for each client. The script checks this setting to decide whether to run.

SurfLog History Retention

Client Console ->Info Tab -> SurfLog Tab -> Retention Policy

You can set the retention policy for each client to between 1 and 90 days. As the collection script runs, it looks at the client's policy and clears out history older than the retention period.
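The retention pass might look roughly like this sketch; the 1-90 day clamp comes from the text above, while the field names and in-memory row format are illustrative assumptions:

```python
from datetime import datetime, timedelta

def purge_old_history(rows, retention_days, now=None):
    """Keep only rows inside the retention window; the plugin allows
    a policy between 1 and 90 days, so clamp to that range."""
    days = max(1, min(90, retention_days))
    cutoff = (now or datetime.now()) - timedelta(days=days)
    return [r for r in rows if r["visit_time"] >= cutoff]

now = datetime(2014, 6, 1)
rows = [{"url": "http://example.com", "visit_time": datetime(2014, 5, 30)},
        {"url": "http://example.org", "visit_time": datetime(2014, 1, 1)}]
kept = purge_old_history(rows, retention_days=30, now=now)
print([r["url"] for r in kept])  # ['http://example.com']
```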

 

A client must be enabled for scans to run on any system under that client, even if a scan is directly scheduled on that system!

SurfLog Key Word Highlight

Client Console ->Info Tab -> SurfLog Tab -> Key Words List

If you place a comma-separated list of words in this field, the log view for each system under this client will highlight rows where those words appear in the URL, Title, or Visit From columns.
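The highlight check can be sketched like this; the field names are hypothetical, but the comma-splitting and substring match follow the behavior described above:

```python
def highlight_row(row, keywords):
    """True if any comma-separated keyword appears (case-insensitively)
    in the row's URL, Title, or Visit From fields."""
    words = [w.strip().lower() for w in keywords.split(",") if w.strip()]
    haystack = " ".join(
        (row.get(f) or "") for f in ("url", "title", "visit_from")).lower()
    return any(w in haystack for w in words)

row = {"url": "http://www.microsoft.com/downloads",
       "title": "Download Center",
       "visit_from": ""}
print(highlight_row(row, "microsoft, banking"))  # True
print(highlight_row(row, "facebook"))            # False
```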

 

[Screenshot: SurfLog EDFs]

 

 

 

 

[Screenshot: Excel export]

We added the ability to surf the links from the history, and you can now export the browser history to Excel.

 

 

 

 

 

New in version 1.1.0: cloud-hosted LT servers are now fully supported. Just edit the collector script to enable or disable cloud support. See the [Grab Surf Logs] script notes for more information.

Get SurfLog 1.1.1

 

download

Feel free to donate to our cause if you find this software useful, and help keep our software free.

——— New in Version 1.0.5 ———
Added a new Client tab with a client-level view of all browser histories.

 

——— New in Version 1.0.6 ———
Added threading to Excel exports.
Fixed display of the Keyword box.
Added UK time support to the PowerShell collector scripts.

——— New in Version 1.1.0 ———
Rewrote the Export function; it's now really fast and works!
New look and feel
All-new graphs and pie charts
Fixed several display issues

 

——— New in Version 1.1.0.25 ———
Added support for LT 10.5
Plugin now loads scripts and EDFs automatically; no script imports needed
Added an auto plugin updater so you will not need to update the plugin manually again

 

We give a shout-out to www.nirsoft.net for their BrowserHistoryView application, which provides the CSV creator. Thanks for a great tool.

 

Cubert 😎

 

 


96 Responses to “LabTech – SurfLog Plugin stores browser history for IE, Chrome, Firefox and Safari”

  1. Elias Leslie says:

    Works great! We were already using this tool but in a more haphazard way 🙂

  2. David says:

    I have installed this plugin and enabled the client, and the script says it has executed successfully and updated the database. However, when I click on the SurfLog tab the log is blank.

    Any suggestions? I have LabTech 2013.1.

  3. Ok, 2 possibilities:

    #1 There are no logs on the system for the last 24 hours. (Is this system actively being used, and is the user surfing the internet?)

    #2 Script failure. Uncomment line 49 in the Get SurfLog script. This will print out the SQL query that is updating the database. What is in it, if anything?

    You can also right-click and disable the 2 delete-file commands on lines 52 and 53. This will leave the content files in c:\windows\ltsvc\surflog\ so you can review them. Do they exist, and what is inside them?

    This info will help determine (1) if the script runs, (2) if data is returned, and (3) if SQL is updating.

    Cubert

  4. David says:

    Hi Cubert,

    Done as you said: #2 gives me a printout of the SQL query. Also disabled lines 52 & 53, and I can see information in both the csv and sql files.

    But still nothing in the SurfLog tab. Where can I check on my server to see if the information is there? Have you any other suggestions? Your help is much appreciated.

  5. cubert says:

    Ok, send me the entire SQL log entry in the script logs. I have a feeling that there is a special char in the SQL code that is blowing up the insert cmd.

    Send it to sanderson@ this domain

  6. We have released version 1.0.1, Links are now browse-able and system exports logs to Excel.

  7. Michael says:

    I don’t seem to be getting the Surflog Tab on the Client Info page.
    The script has shown up in the scripts area though.

  8. Cubert says:

    Surf logs only show on the Computer console, not the client consoles.

  9. Cubert says:

    Use the client console to enable/disable and to set retention only; then view the surf logs on the Computer console for each Windows system.

  10. Michael says:

    Thanks, I was confusing myself between client / location / computer. I see it all now. Thanks!

  11. Oliver H says:

    Hi Cubert,

    Even with the new version I have the same problem; I still can't see anything. Did the logs tell you something?

  12. Cubert says:

    Oliver, I sent you a response email.

    Looked like your Visit Time is coming out in the wrong format.

    What OS is this running on and how many of them are doing this?

    Go to the LTSVC dir and find the SurfLog dir. Run BHV.exe manually by double-clicking it. See what format the Visit Time comes out in. I am wondering if the version of the OS and/or browser is putting the time out in a non-SQL format, causing SQL to put in 00/00/0000 as the date and time.
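    One way to normalize locale-dependent Visit Time strings before the SQL insert, sketched here purely as an illustration (the format list and function are assumptions, not the plugin's actual fix):

```python
from datetime import datetime

# Formats reported in this thread: ISO, German dotted dates, and
# day-first vs month-first slashed dates. Order matters: the less
# ambiguous formats are tried first.
FORMATS = ("%Y-%m-%d %H:%M:%S", "%d.%m.%Y %H:%M:%S",
           "%d/%m/%Y %H:%M:%S", "%m/%d/%Y %H:%M:%S")

def to_sql_datetime(text):
    """Normalize a Visit Time string to the YYYY-MM-DD HH:MM:SS form
    MySQL accepts; return None when no known format matches."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(text, fmt).strftime("%Y-%m-%d %H:%M:%S")
        except ValueError:
            continue
    return None

print(to_sql_datetime("25.12.2013 14:30:00"))  # 2013-12-25 14:30:00
print(to_sql_datetime("not a date"))           # None
```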

  13. Michael says:

    I’m actually having similar issues. I’m testing this out at one of our clients who has about 80 computers. It’s working on about 25% of the computers.

    I’m trying to figure out why. As I do I’ll let you know.

    On the computers it’s not working on, BHV does seem to work on its own. Date format is DD\MM\YYYY TIME

  14. Michael says:

    I can’t see any substantive difference between ones that are working and ones that are not. I’ve emailed you the .sql files in order to try to help you.

    I have checked the table in the labtech database and only the “successful” ones are in there, so it is an import issue.

    Is there any way to capture a “success” or “failure” and reason?

  15. Oliver says:

    Hi Cubert,

    we are in Germany, so we have dd.mm.yyyy

  16. Everyone,

    We just released 1.0.3 today!!

    We fixed some major issues with time that were causing all sorts of problems. It's in beta, so if you have any issues with the new script, send me the format of the date column as it is seen in the CSV file.
    Germany, I have you covered in the new script.
    Next, we added a new feature that will highlight rows of data in SurfLog based on keywords. So if you want to see all Microsoft or all porn, just add those to the keywords list.

  17. Oliver says:

    Hi Cubert,

    Thanks for the update, but now I have a problem with the dll; it says something about there being no row at position “0”.

    Thx Oliver

  18. Oliver says:

    Hi Cubert,

    With the dll from the 2nd of September it is working; anyway, I can't see any data.

  19. Oliver says:

    Hi Cubert,

    It seems to work now, I don't know why. I can see only one day in the logs; I will test more and keep you informed.

    Thx Oliver

  20. Oliver says:

    Hi Cubert,

    I changed the script so that the log file is not deleted, so I can get it via the LabTech File Explorer. Later it seemed to work, but only one time. I changed the scripts back to the originals; still nothing.

    I'm going nuts!

    Oliver

  21. Cubert says:

    There should be 2 files, a csv and a sql file. When you run the script, make sure it does not delete these files. Then let's take a look at what's inside them.

    If you want, you can send them to me at sanderson at this domain (without the www, of course) and I will give them the once-over to see what is not right.

  22. darrell says:

    Is there a way to make it grab more than 24 hours of data, for a one-off data collection?

    Thanks!

  23. Bryan says:

    Great looking plugin! I installed and tested it on a few of my clients and they all report back on this command:
    Parameters: Utilities\BrowsingHistoryView-64.exe|c:\windows\ltsvc\SurfLog\BHV.exe

    Output: ERR Could not Download file /Transfer/Utilities\BrowsingHistoryView-64.exe

    Plugin says it’s active and good to go, am I missing something? Thanks.

  24. Bryan says:

    Disregard, I realized it didn’t put the nirsoft utilities in the share.

  25. Matt says:

    Hello!

    Another GREAT plugin, guys!

    btw – perhaps you can add a line for downloading the x86 / x64 bit versions of BHV and saving / naming them in the appropriate spots (for dummies like me who tripped over this step).

    Looking forward to the next one!

  26. We just released version 1.0.5, adding a Client tab to SurfLog with Export. Export can take a really long time if you toss 10K records or more at it, so you've been warned.

    Enjoy

    Cubert.

  27. Ben says:

    Hi, I'm new to LT and came across some of these add-ins. When I installed, I got the SurfLog tab on the Client / Location and Computer tabs. I do not have the Enable SurfLog options on the Info tab, so I'm not able to enable anything.

    Any ideas? thx (I tried hitting Scan Now on a client and nothing happened, which makes sense if the client is not enabled per these docs)

  28. Ben says:

    Disregard, I didn't realize there were other folders in the zip file. I extracted, hit plugin manager, saw 1 file (dll) and installed it, but it didn't show me there were also text and xml files 🙂 seems to be working!

  29. We just updated to version 1.0.6; it has UK time fixes and threaded exports to Excel, along with several small fixes.

  30. Darrell,

    The problem with that is the limit LT has on file uploads to variables inside the scripting engine: we are limited to 2 MB per file. At the moment, if we scan more than a day or 2 on any heavy users, we will exceed that limit and you will get no data, think the plugin is broken, and ditch it as an utter failure. (smiles) Thus we have hard-coded 24 hours into the script. You can modify the script where we set that limit and make it go as far as you like, but be forewarned of the limits and watch for them to be exceeded.

    With that said, I think we can overcome this by using PowerShell to cut the files up into 2 MB chunks and cycle through them, loading them up until all are consumed, but that's a lot of scripting magic that needs to be worked out. I see updates in the future…
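    The chunking idea in the reply above can be sketched like this; the 2 MB default comes from the LT limit mentioned, while the function itself is a hypothetical illustration (the real work would be PowerShell):

```python
def split_lines(lines, max_bytes=2 * 1024 * 1024):
    """Split CSV lines into chunks that each stay under max_bytes,
    breaking only on line boundaries so no record is cut in half."""
    chunks, current, size = [], [], 0
    for line in lines:
        line_size = len(line.encode("utf-8")) + 1  # +1 for the newline
        if current and size + line_size > max_bytes:
            chunks.append(current)
            current, size = [], 0
        current.append(line)
        size += line_size
    if current:
        chunks.append(current)
    return chunks

lines = ["url,title,time"] * 100       # 15 bytes per line with newline
chunks = split_lines(lines, max_bytes=200)
print(len(chunks))  # 8 chunks of at most 13 lines each
```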

  31. Cliff says:

    I installed this a few days ago and I’m loving this plugin.

    But I found a bug, I think.

    I went to my Dashboard -> Config -> Additional field defaults -> Clients -> SurfLog and decided to enable SurfLog collection and set my retention and keywords list.

    The issue is that SurfLog is enabled on my clients, but the keywords and retention time are not there; the retention is -9999 and the keywords are present in the tab, but not on the main screen unless I edit them and go back in.

    Any ideas?

  32. Cubert says:

    You must go to each client, and in the client console select Info tab -> SurfLog.

    Select enable there, and set retention and any keywords at the client level, not at the dashboard level using the EDFs at that level.

  33. Cliff says:

    Bummer! But a solid answer. I’ll take it.

    But I just encountered an issue with a customer with about 200 agents and a few terminal servers. One of the terminal servers really got hung up on the data collection and maxed out the CPU.

    Is there a way to disable it for just one system? It seems right now it is on for all agents at a client, or off for all.

  34. Cubert says:

    Now, I will be glad to put that in as a feature request.

    You want one single config that will set for all clients? You also want to one-off set PCs to be excluded from scheduled scans?

  35. Cliff says:

    I’m ok with having to enable it at the client level manually.

    But for some systems, for performance or even privacy reasons, it would be great to have an EDF to “exclude” the computer from the SurfLog scan.

  36. Dale says:

    Love the plugin. Had a client looking for this specifically. The only problem I’m having is that when I try to export the log file, it kills the LT Control Center.

  37. Dale, are you on the latest version? We thread the export now so that the plugin tab does not “hang” during the export. Also, the Excel export is a really slow process.

  38. Dale says:

    I’m on Labtech 2013 60.276. I’m using 1.0.6 beta of Surflog. I am able to pull the logs and view them within the LT Control Center, but whenever I try to export the logs I get the warning about it taking a long time, click OK and POOF away goes the Control Center. It just closes out the second I hit OK. I’ve tried on the LT server itself and on 2 other systems running the control center. Really appreciate any help you can provide.

  39. David Hunter says:

    I have installed 1.0.6 but I do not have the SurfLog tab under the client Info tab. What have I done wrong?

  41. aaronjames says:

    love the tool. worked without issues! 🙂

    Just one question, can it only grab the last two days of history? Is there an option to say grab the last 7 days?

  42. Jonny says:

    At the client level, when I export I get the error “Excel Export went south, Sorry!” Any ideas?

    Thanks,
    Jonny

  43. Jason says:

    Hello,

    Thanks for the plugin. I have gone through the install, and the script, plugin, and files in the transfer folder are all there. The problem is that every time I go to one of the Surf Log tabs at the company or agent level, I just get a box that says “Ops we hit an error”.
    Any ideas?
    Thanks very much,
    Jason

  44. Did you restart the LT DB agent on your LT server? This should be done whenever a new plugin is first installed.

  45. Yea, large amounts of data are breaking the export. That is flagged for a fix in the next release.

  46. JMG says:

    Very cool plugin. Wondering if you could put a column in the history display that shows which keyword in the list caused the row to be highlighted? It might also be cool to highlight the keywords on the stats page/keyword lists that were found. I have entered lots of keywords in the list, and I'm seeing a few sites get flagged; it's very difficult to determine which of my keywords are causing the hit.

    Thanks again for sharing the plugin it is really cool!

  47. JMG says:

    I’m really digging this plugin and am trying it out on a few more clients with lots of agents spread out across several locations.
    I just got them configured on the client\info\surflog tab about an hour ago, so I have just started collecting data for them.
    It took forever for the SurfLog tab to load at the client level, I’m sure due to the amount of data being collected. Then when I closed the client window and tried to open something else, the LabTech console crashed on me.
    Is there any way to add the SurfLog tab at the location level as well? So it can be viewed at the client\location\agent levels, in case the number of agents causes the client or location windows to hang?

  48. JMG,

    We sure can, those are both great ideas for improving SurfLog.

    I will see about putting that on the feature list.

  49. HillTech says:

    I posted on LabTechGeek as well. I keep getting failures on the download of the file, but the file is in the appropriate location. Both files are in the exact location shown in the zip file. What is the simple step I am missing? I have a request from a client for this information.

  50. Jordan says:

    Having the same issue as Dale above. Running 1.0.6 and LabTech 2013 v60.276. Every time I export, I get the warning about the export possibly taking a long time, and the Control Center just closes.

    Note: it’s a very small amount of data in the logs when this happens.

    Otherwise, loving the plugin
