• Creator
  • #2150615

    Student Web Surfing


    by lord_orochi ·

    This is the first time I’ve asked a question on here. Basically, the college I work at needs to track the viewing history of 50+ lab computers. The catch is that it must be free (IT is seen as a liability here, because we don’t bring in any money). We have a Squid proxy that all of the labs go through, but only two of our IT staff know its operation. So… we need a simple way to see “this IP looked at these sites.” Thanks!

All Answers

  • Author
    • #2914128


      by lord_orochi ·

      In reply to Student Web Surfing


    • #2914104

      Go here for more info…..

      by Anonymous ·

      In reply to Student Web Surfing
      Some are freeware, some are not.

      Please post back if you have anymore problems or questions.

      • #2914090

        No luck

        by lord_orochi ·

        In reply to Go here for more info…..

        I looked around the site you suggested and tried some of the programs. However, they either didn’t work or didn’t do what we need. Thanks for your suggestion though. To be a bit more specific about what I need: a program that will gather the URLs from the lab machines so we can view them easily from the IT dept.

    • #2926388

      If you are using IE try this

      by rob miners ·

      In reply to Student Web Surfing

      IEHistoryView v1.36 – View Visited Web Sites of Internet Explorer

    • #2925773

      Squid access.log

      by churdoo ·

      In reply to Student Web Surfing

      Seems to me, if you’re using Squid, that you already have a viable tool and you just need to learn how to use it. Further, if you have two on staff who know how to use it, you’re already better off than bringing in a brand new product that nobody knows.

      Maybe this link will help?
      Looks like the access.log has what you’re looking for, so parsing the log shouldn’t be too big of a deal.
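      As a rough illustration of the parsing described above, here is a minimal Python sketch that groups visited URLs by client IP from a Squid access.log in its native format (where field 3 is the client address and field 7 is the request URL). The log path and sample data are assumptions; point it at your own log.

      ```python
      # Minimal sketch: group visited URLs by client IP from a Squid
      # access.log in native format (field 3 = client IP, field 7 = URL).
      from collections import defaultdict

      def urls_by_client(log_lines):
          """Return {client_ip: [url, ...]} from native-format Squid log lines."""
          visits = defaultdict(list)
          for line in log_lines:
              fields = line.split()
              if len(fields) < 7:
                  continue  # skip blank or malformed lines
              client_ip, url = fields[2], fields[6]
              visits[client_ip].append(url)
          return visits

      if __name__ == "__main__":
          # Hypothetical sample line, just to show the shape of the output;
          # in practice you'd open /var/log/squid/access.log (path varies).
          sample = [
              "1181112345.123 250 10.0.0.5 TCP_MISS/200 1024 "
              "GET http://example.com/ - DIRECT/93.184.216.34 text/html",
          ]
          for ip, urls in urls_by_client(sample).items():
              print(ip, "->", urls)
      ```

      From there, a nightly run that writes one report per lab IP would give the easy-to-read “this IP looked at these sites” view the original post asked for.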

      • #2924918

        Always a Catch…

        by lord_orochi ·

        In reply to Squid access.log

        It would be an option… but our Squid cache and the lab machines are separated by a router, so all requests show the router’s IP instead of the individual lab machines’. I think that I will try to implement one of the tools from above to export a text file with the index.dat data to the main lab server. Trying to decide whether to have it run as a logoff script or just have it run every so often. I’m still open to suggestions. ^_^
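        For what it’s worth, that plan could be sketched roughly like this in Python: call NirSoft’s IEHistoryView (the tool suggested above) with its /stext switch, which saves the machine’s history as plain text, then copy the dump, named after the host so 50+ machines don’t clobber each other, to a central share. The executable path and share name below are assumptions, not anything from this thread.

        ```python
        # Hedged sketch: export IE history via IEHistoryView's /stext switch
        # and push the dump to a central share. Paths/share are hypothetical.
        import os
        import shutil
        import socket
        import subprocess

        IEHV_EXE = r"C:\Tools\iehv.exe"   # assumed install location of IEHistoryView
        SHARE = r"\\labserver\history"    # hypothetical collection share

        def dump_path(hostname, temp_dir):
            """Per-machine dump file name, so machines don't overwrite each other."""
            return os.path.join(temp_dir, hostname + "_history.txt")

        def export_history():
            dump = dump_path(socket.gethostname(),
                             os.environ.get("TEMP", r"C:\Temp"))
            # /stext <file>: IEHistoryView's save-as-plain-text switch
            subprocess.run([IEHV_EXE, "/stext", dump], check=True)
            shutil.copy(dump, SHARE)
        ```

        Assigned as a logoff script via Group Policy (or run from a scheduled task), something along these lines would land one history file per machine on the lab server.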
