Malspider is a web spidering framework that inspects websites for characteristics of compromise. Malspider has built-in detection for characteristics of compromise such as hidden iframes, reconnaissance frameworks, VBScript injection, email address disclosure, etc.
> As we find stuff we will continue to add classifications to this tool and we hope you will do the same. Malspider will be a much better tool if CSIRT teams and security practitioners around the world contribute to the project. — ciscocsirt
Please make sure these technologies are installed before continuing:
Note: If your server already has other versions of these components installed, you can use a virtualenv to create an isolated Python environment.
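A minimal sketch of creating such an isolated environment (the environment name `malspider-env` is an arbitrary example; on older Python 2 hosts, `virtualenv malspider-env` works the same way as the built-in venv module shown here):

```shell
# Create and activate an isolated Python environment so Malspider's
# dependencies do not conflict with system-wide packages.
python3 -m venv malspider-env
. malspider-env/bin/activate
```

Any packages installed with pip while the environment is active stay inside `malspider-env/` and leave the system Python untouched.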
Tested and working on minimal installations of:
Start the installation process by running `./quick_install` from the command line. Please read the prompts carefully!
Malspider comes with a quick_install script found in the root directory. This script attempts to make the installation process as painless as possible by completing the following steps:
*Note: The quick_install script uses scripts found under the install/ directory. If any of the above steps fail, you can attempt to complete them manually using those scripts.*
Start Malspider by running `./quick_start` from the command line. Malspider comes with a quick_start script found in the root directory; this script attempts to start the daemon and the web application. Once started, Malspider can be accessed from your browser on port 8080 at http://0.0.0.0:8080
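To confirm the web application actually came up, you can probe the port after starting it. This is a generic check, not a Malspider-specific command; the sleep duration is an arbitrary grace period:

```shell
# Start the daemon and web application from the repo root.
./quick_start

# Give the web app a moment to bind the port, then probe the dashboard.
sleep 5
curl -s -o /dev/null -w '%{http_code}\n' http://0.0.0.0:8080/
```

An HTTP status of 200 indicates the dashboard is reachable; a connection error usually means the web application failed to start.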
Interaction with Malspider happens via an easy-to-use dashboard accessible through your web browser. The dashboard enables you to view alerts, inspect injected code, add websites to monitor, and tune false positives. You can add websites you want to crawl by navigating to the administrative panel at http://0.0.0.0:8080/admin (or by clicking on the admin link from the dashboard). Click on "Organizations" and add a new Organization. You'll be prompted for the:
By default, Malspider crawls 20 pages per domain. This limit is configurable: you can crawl as many pages per domain as you like, or crawl only the homepage of each site.
Malspider randomly selects a user agent string from the list found at malspider/resources/useragents.txt. If you would like to add more user agents, simply edit that text file.

Malspider also has built-in capabilities for taking screenshots of every page it crawls. Screenshots can be useful in a variety of situations, but they can cause a drastic increase in server space utilization. For that reason, screenshots are turned off by default. Email address detection is also off by default.

Malspider crawls websites and stores information about those sites in a database. The data in the database is post-processed and analyzed for potentially malicious characteristics. You can view results from the analyzer by viewing the dashboard and clicking on "View Alerts". Your database can grow very large very quickly, so for performance reasons it is recommended that you delete data from the 'pages' table and the 'elements' table once per month.
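A hypothetical monthly cleanup job could look like the following. Only the table names ('pages' and 'elements') come from the documentation above; the database name, user, timestamp column, and one-month retention window are assumptions to adapt to your own schema and schedule:

```shell
# Hypothetical monthly prune of crawl data (run e.g. from cron).
# Adjust database name, credentials, and the date column to match your install.
mysql -u malspider -p malspider -e "
  DELETE FROM pages    WHERE timestamp < DATE_SUB(NOW(), INTERVAL 1 MONTH);
  DELETE FROM elements WHERE timestamp < DATE_SUB(NOW(), INTERVAL 1 MONTH);"
```

If you do not need any historical crawl data at all, truncating both tables outright is a simpler (but more destructive) alternative.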