
Commit 808d1ba: XSSer v1.8.1 - 'The Hive' release
1 parent: 9f196b7


61 files changed (+7179 / -4562 lines)

xsser/Makefile → Makefile
+1 -1

@@ -4,7 +4,7 @@ PYTHON=`which python`
 DESTDIR=/
 BUILDIR=$(CURDIR)/debian/xsser
 PROJECT=xsser
-VERSION=0.7.0
+VERSION=1.8.1
 
 all:
 	@echo "make source - Create source package"

README.md
+49 -25

@@ -1,53 +1,77 @@
-![XSSer](https://xsser.03c8.net/xsser/zika1.png "XSSerBanner")
+![XSSer](https://xsser.03c8.net/xsser/thehive1.png "XSSer")
 
-===================================================================
+----------
 
-Cross Site "Scripter" (aka XSSer) is an automatic -framework- to detect, exploit and report XSS vulnerabilities.
+ + Web: https://xsser.03c8.net
 
 ----------
 
-XSSer is released under the GPLv3. You can find the full license text
-in the [COPYING](./xsser/doc/COPYING) file.
+Cross Site "Scripter" (aka XSSer) is an automatic -framework- to detect, exploit and report XSS vulnerabilities in web-based applications.
 
-----------
+It provides several options to try to bypass certain filters and various special techniques for code injection.
 
- + Web: https://xsser.03c8.net
+XSSer has pre-installed [ > 1300 XSS ] attacking vectors and can bypass-exploit code on several browsers/WAFs:
 
-----------
+ [PHPIDS]: PHP-IDS
+ [Imperva]: Imperva Incapsula WAF
+ [WebKnight]: WebKnight WAF
+ [F5]: F5 Big IP WAF
+ [Barracuda]: Barracuda WAF
+ [ModSec]: Mod-Security
+ [QuickDF]: QuickDefense
+ [Chrome]: Google Chrome
+ [IE]: Internet Explorer
+ [FF]: Mozilla's Gecko rendering engine, used by Firefox/Iceweasel
+ [NS-IE]: Netscape in IE rendering engine mode
+ [NS-G]: Netscape in the Gecko rendering engine mode
+ [Opera]: Opera
 
-![XSSer](https://xsser.03c8.net/xsser/zika2.png "XSSerManifesto")
+![XSSer](https://xsser.03c8.net/xsser/url_generation.png "XSSer URL Generation Schema")
+
+----------
 
 #### Installing:
 
-XSSer runs on many platforms. It requires Python and the following libraries:
+ XSSer runs on many platforms. It requires Python and the following libraries:
 
-    - python-pycurl - Python bindings to libcurl
-    - python-xmlbuilder - create xml/(x)html files - Python 2.x
-    - python-beautifulsoup - error-tolerant HTML parser for Python
-    - python-geoip - Python bindings for the GeoIP IP-to-country resolver library
+    python-pycurl - Python bindings to libcurl
+    python-xmlbuilder - create xml/(x)html files - Python 2.x
+    python-beautifulsoup - error-tolerant HTML parser for Python
+    python-geoip - Python bindings for the GeoIP IP-to-country resolver library
 
-On Debian-based systems (ex: Ubuntu), run:
+ On Debian-based systems (ex: Ubuntu), run:
 
-    sudo apt-get install python-pycurl python-xmlbuilder python-beautifulsoup python-geoip
+    sudo apt-get install python-pycurl python-xmlbuilder python-beautifulsoup python-geoip
 
-On other systems such as: Kali, Ubuntu, ArchLinux, ParrotSec, Fedora, etc... also run:
+ On other systems such as: Kali, Ubuntu, ArchLinux, ParrotSec, Fedora, etc... also run:
 
-    pip install geoip
+    pip install geoip
 
 #### Source libs:
 
-* Python: https://www.python.org/downloads/
-* PyCurl: http://pycurl.sourceforge.net/
-* PyBeautifulSoup: https://pypi.python.org/pypi/BeautifulSoup
-* PyGeoIP: https://pypi.python.org/pypi/GeoIP
+ * Python: https://www.python.org/downloads/
+ * PyCurl: http://pycurl.sourceforge.net/
+ * PyBeautifulSoup: https://pypi.python.org/pypi/BeautifulSoup
+ * PyGeoIP: https://pypi.python.org/pypi/GeoIP
+
+----------
+
+#### License:
+
+XSSer is released under the GPLv3. You can find the full license text
+in the [LICENSE](./docs/LICENSE) file.
 
 ----------
 
 #### Screenshots:
 
-![XSSer](https://xsser.03c8.net/xsser/url_generation.png "XSSerSchema")
+![XSSer](https://xsser.03c8.net/xsser/thehive2.png "XSSer Shell")
+
+![XSSer](https://xsser.03c8.net/xsser/thehive3.png "XSSer Manifesto")
+
+![XSSer](https://xsser.03c8.net/xsser/thehive4.png "XSSer Configuration")
 
-![XSSer](https://xsser.03c8.net/xsser/zika3.png "XSSerAdvanced")
+![XSSer](https://xsser.03c8.net/xsser/thehive5.png "XSSer Bypassers")
 
-![XSSer](https://xsser.03c8.net/xsser/zika4.png "XSSerGeoMap")
+![XSSer](https://xsser.03c8.net/xsser/zika4.png "XSSer GeoMap")
xsser/core/__init__.py → core/__init__.py
+2 -4

@@ -1,9 +1,7 @@
 """
-$Id$
+This file is part of the XSSer project, https://xsser.03c8.net
 
-This file is part of the xsser project, http://xsser.03c8.net
-
-Copyright (c) 2011/2016 psy <[email protected]>
+Copyright (c) 2010/2019 | psy <[email protected]>
 
 xsser is free software; you can redistribute it and/or modify it under
 the terms of the GNU General Public License as published by the Free

xsser/core/crawler.py → core/crawler.py
+46 -49
@@ -2,11 +2,9 @@
 # -*- coding: utf-8 -*-"
 # vim: set expandtab tabstop=4 shiftwidth=4:
 """
-$Id$
+This file is part of the XSSer project, https://xsser.03c8.net
 
-This file is part of the xsser project, http://xsser.03c8.net
-
-Copyright (c) 2011/2016 psy <[email protected]>
+Copyright (c) 2010/2019 | psy <[email protected]>
 
 xsser is free software; you can redistribute it and/or modify it under
 the terms of the GNU General Public License as published by the Free
@@ -40,14 +38,10 @@ class EmergencyLanding(Exception):
 class Crawler(object):
     """
     Crawler class.
-
-    Crawls a webpage looking for url arguments.
-    Dont call from several threads! You should create a new one
-    for every thread.
     """
     def __init__(self, parent, curlwrapper=None, crawled=None, pool=None):
         # verbose: 0-no printing, 1-prints dots, 2-prints full output
-        self.verbose = 1
+        self.verbose = 0
         self._parent = parent
         self._to_crawl = []
         self._parse_external = True
@@ -81,7 +75,10 @@ def _find_args(self, url):
         find parameters in given url.
         """
         parsed = urllib2.urlparse.urlparse(url)
-        qs = urlparse.parse_qs(parsed.query)
+        if "C=" in parsed.query and "O=" in parsed.query:
+            qs = ""
+        else:
+            qs = urlparse.parse_qs(parsed.query)
         if parsed.scheme:
             path = parsed.scheme + "://" + parsed.netloc + parsed.path
         else:
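The new query-string guard skips Apache-style directory-index links (e.g. `?C=N;O=D`), whose `C=`/`O=` parameters are sort controls rather than injectable application inputs. A minimal Python 3 sketch of the same check (the project code itself targets Python 2's `urlparse`; `find_query_args` is a hypothetical helper name):

```python
from urllib.parse import urlparse, parse_qs

def find_query_args(url):
    # Skip Apache directory-index sort links (e.g. ?C=N;O=D):
    # they contain C= and O= but are not real application inputs.
    parsed = urlparse(url)
    if "C=" in parsed.query and "O=" in parsed.query:
        return {}
    return parse_qs(parsed.query)

print(find_query_args("http://target/dir/?C=M;O=A"))   # -> {}
print(find_query_args("http://target/page?id=1&q=x"))  # -> {'id': ['1'], 'q': ['x']}
```

Returning an empty mapping (rather than parsing) keeps directory listings from flooding the crawler with useless sort-parameter targets.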
@@ -92,6 +89,14 @@
             if not zipped or not path in zipped[0]:
                 self._found_args[key].append([path, url])
                 self.generate_result(arg_name, path, url)
+        if not qs:
+            parsed = urllib2.urlparse.urlparse(url)
+            if path.endswith("/"):
+                attack_url = path + "XSS"
+            else:
+                attack_url = path + "/XSS"
+            if not attack_url in self._parent.crawled_urls:
+                self._parent.crawled_urls.append(attack_url)
         ncurrent = sum(map(lambda s: len(s), self._found_args.values()))
         if ncurrent >= self._max:
             self._armed = False
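For URLs that carry no query string at all, the crawler now probes the path itself by appending an `XSS` marker segment. A hedged Python 3 sketch of that logic (`path_probe_url` and the `crawled` list are illustrative stand-ins for the crawler's internal state):

```python
def path_probe_url(url, crawled):
    # For a URL without query parameters, probe the path itself by
    # appending an "XSS" marker segment; avoid queueing duplicates.
    probe = url + "XSS" if url.endswith("/") else url + "/XSS"
    if probe not in crawled:
        crawled.append(probe)
    return probe

crawled = []
print(path_probe_url("http://target/app/", crawled))  # -> http://target/app/XSS
path_probe_url("http://target/app", crawled)          # same probe, not re-queued
print(len(crawled))                                   # -> 1
```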
@@ -121,6 +126,7 @@ def crawl(self, path, depth=3, width=0, local_only=True):
         attack_urls = []
         if not self._parent._landing and self._armed:
             self._crawl(basepath, path, depth, width)
+        # now parse all found items
         if self._ownpool:
             self.pool.dismissWorkers(len(self.pool.workers))
             self.pool.joinAllDismissedWorkers()
@@ -138,7 +144,7 @@ def generate_result(self, arg_name, path, url):
         for key, val in qs.iteritems():
             qs_joint[key] = val[0]
         attack_qs = dict(qs_joint)
-        attack_qs[arg_name] = "VECTOR"
+        attack_qs[arg_name] = "XSS"
         attack_url = path + '?' + urllib.urlencode(attack_qs)
         if not attack_url in self._parent.crawled_urls:
             self._parent.crawled_urls.append(attack_url)
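`generate_result` rebuilds each discovered URL with the target parameter replaced by the injection marker, now `XSS` instead of `VECTOR`. A Python 3 sketch of the same flatten-replace-reencode step (`attack_url` here is a hypothetical standalone helper, not the project's method):

```python
from urllib.parse import parse_qs, urlencode

def attack_url(path, query, arg_name):
    # parse_qs returns lists of values; flatten to the first value,
    # then swap the target parameter for the "XSS" marker and re-encode.
    qs = {k: v[0] for k, v in parse_qs(query).items()}
    qs[arg_name] = "XSS"
    return path + "?" + urlencode(qs)

print(attack_url("http://target/s", "q=hello&page=2", "q"))
# -> http://target/s?q=XSS&page=2
```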
@@ -178,37 +184,35 @@ def _curl_main(self, pars):
         self._get_done(basepath, depth, width, path, res, c_info)
 
     def _get_error(self, request, error):
-        try:
-            path, depth, width, basepath = request.args[0]
-            e_type, e_value, e_tb = error
-            if e_type == pycurl.error:
-                errno, message = e_value.args
-                if errno == 28:
-                    print("requests pyerror -1")
-                    self.enqueue_jobs()
-                    self._requests.remove(path)
-                    return # timeout
-                else:
-                    self.report('crawler curl error: '+message+' ('+str(errno)+')')
-            elif e_type == EmergencyLanding:
-                pass
+        path, depth, width, basepath = request.args[0]
+        e_type, e_value, e_tb = error
+        if e_type == pycurl.error:
+            errno, message = e_value.args
+            if errno == 28:
+                print("requests pyerror -1")
+                self.enqueue_jobs()
+                self._requests.remove(path)
+                return # timeout
             else:
-                traceback.print_tb(e_tb)
-                self.report('crawler error: '+str(e_value)+' '+path)
-            if not e_type == EmergencyLanding:
-                for reporter in self._parent._reporters:
-                    reporter.mosquito_crashed(path, str(e_value))
-            self.enqueue_jobs()
-            self._requests.remove(path)
-        except:
-            return
+                self.report('crawler curl error: '+message+' ('+str(errno)+')')
+        elif e_type == EmergencyLanding:
+            pass
+        else:
+            traceback.print_tb(e_tb)
+            self.report('crawler error: '+str(e_value)+' '+path)
+        if not e_type == EmergencyLanding:
+            for reporter in self._parent._reporters:
+                reporter.mosquito_crashed(path, str(e_value))
+        self.enqueue_jobs()
+        self._requests.remove(path)
 
     def _emergency_parse(self, html_data, start=0):
         links = set()
         pos = 0
-        if not html_data:
-            return
-        data_len = len(html_data)
+        try:
+            data_len = len(html_data)
+        except:
+            data_len = html_data
         while pos < data_len:
             if len(links)+start > self._max:
                 break
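`_emergency_parse` now duck-types its input: instead of bailing out on falsy data, it tries `len()` and falls back to using the value itself as the length. A small Python 3 illustration of that guard (names are illustrative, not the project's):

```python
def safe_data_len(html_data):
    # Tolerate non-sized payloads: if len() fails (e.g. html_data is
    # an int sentinel), use the value itself as the loop bound.
    try:
        return len(html_data)
    except TypeError:
        return html_data

print(safe_data_len("<html></html>"))  # -> 13
print(safe_data_len(0))                # -> 0
```

Note the diff's bare `except:` would also swallow unrelated errors; the sketch narrows it to `TypeError`, which is the case the fallback actually handles.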
@@ -236,35 +240,31 @@ def enqueue_jobs(self):
             next_job = self._to_crawl.pop()
             self._crawl(*next_job)
 
-    def _get_done(self, basepath, depth, width, path, html_data, content_type): # request, result):
+    def _get_done(self, basepath, depth, width, path, html_data, content_type):
         if not self._armed or len(self._parent.crawled_urls) >= self._max:
             raise EmergencyLanding
         try:
             encoding = content_type.split(";")[1].split("=")[1].strip()
         except:
             encoding = None
         try:
-            soup = BeautifulSoup(html_data, from_encoding=encoding)
+            soup = BeautifulSoup(html_data, fromEncoding=encoding)
             links = None
         except:
             soup = None
             links = self._emergency_parse(html_data)
-
         for reporter in self._parent._reporters:
             reporter.start_crawl(path)
-
         if not links and soup:
-            links = soup.find_all('a')
-            forms = soup.find_all('form')
-
+            links = soup.findAll('a')
+            forms = soup.findAll('form')
         for form in forms:
             pars = {}
             if form.has_key("action"):
                 action_path = urlparse.urljoin(path, form["action"])
             else:
                 action_path = path
-            for input_par in form.find_all('input'):
-
+            for input_par in form.findAll('input'):
                 if not input_par.has_key("name"):
                     continue
                 value = "foo"
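`_get_done` reverts to the BeautifulSoup 3 spellings (`findAll`, `fromEncoding`, `has_key`) that match the `python-beautifulsoup` package listed in the README for Python 2. For reference, the same link/form extraction can be sketched with only the standard library's `html.parser` (a simplified stand-in, not the project's parser):

```python
from html.parser import HTMLParser

class LinkFormParser(HTMLParser):
    # Stdlib sketch of what the crawler asks BeautifulSoup for:
    # collect <a href> values and each <form>'s action + input names.
    def __init__(self):
        super().__init__()
        self.links, self.forms = [], []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "form":
            self.forms.append({"action": attrs.get("action", ""), "inputs": []})
        elif tag == "input" and self.forms and "name" in attrs:
            self.forms[-1]["inputs"].append(attrs["name"])

p = LinkFormParser()
p.feed('<a href="/x">x</a><form action="/s"><input name="q"></form>')
print(p.links)   # -> ['/x']
print(p.forms)   # -> [{'action': '/s', 'inputs': ['q']}]
```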
@@ -284,8 +284,6 @@ def _get_done(self, basepath, depth, width, path, html_data, content_type): # request, result):
         elif self.verbose:
             sys.stdout.write(".")
             sys.stdout.flush()
-        if not links:
-            return
         if len(links) > self._max:
             links = links[:self._max]
         for a in links:
@@ -323,7 +321,6 @@ def _check_url(self, basepath, path, href, depth, width):
             self._find_args(href)
             for reporter in self._parent._reporters:
                 reporter.add_link(path, href)
-            self.report("\n[Info] Spidering: " + str(href))
         if self._armed and depth>0:
             if len(self._to_crawl) < self._max:
                 self._to_crawl.append([basepath, href, depth-1, width])

xsser/core/curlcontrol.py → core/curlcontrol.py
+21 -24
@@ -2,11 +2,9 @@
 # -*- coding: utf-8 -*-"
 # vim: set expandtab tabstop=4 shiftwidth=4:
 """
-$Id$
+This file is part of the XSSer project, https://xsser.03c8.net
 
-This file is part of the xsser project, http://xsser.03c8.net
-
-Copyright (c) 2011/2018 psy <[email protected]>
+Copyright (c) 2010/2019 | psy <[email protected]>
 
 xsser is free software; you can redistribute it and/or modify it under
 the terms of the GNU General Public License as published by the Free
@@ -469,38 +467,37 @@ def print_options(cls):
         """
         Print selected options.
         """
-        print "\n[-]Verbose: active"
-        print "[-]Cookie:", cls.cookie
-        print "[-]HTTP User Agent:", cls.agent
-        print "[-]HTTP Referer:", cls.referer
-        print "[-]Extra HTTP Headers:", cls.headers
+        print "\nCookie:", cls.cookie
+        print "User Agent:", cls.agent
+        print "Referer:", cls.referer
+        print "Extra Headers:", cls.headers
         if cls.xforw == True:
-            print "[-]X-Forwarded-For:", "Random IP"
+            print "X-Forwarded-For:", "Random IP"
         else:
-            print "[-]X-Forwarded-For:", cls.xforw
+            print "X-Forwarded-For:", cls.xforw
         if cls.xclient == True:
-            print "[-]X-Client-IP:", "Random IP"
+            print "X-Client-IP:", "Random IP"
         else:
-            print "[-]X-Client-IP:", cls.xclient
-        print "[-]Authentication Type:", cls.atype
-        print "[-]Authentication Credentials:", cls.acred
+            print "X-Client-IP:", cls.xclient
+        print "Authentication Type:", cls.atype
+        print "Authentication Credentials:", cls.acred
         if cls.ignoreproxy == True:
-            print "[-]Proxy:", "Ignoring system default HTTP proxy"
+            print "Proxy:", "Ignoring system default HTTP proxy"
         else:
-            print "[-]Proxy:", cls.proxy
-        print "[-]Timeout:", cls.timeout
+            print "Proxy:", cls.proxy
+        print "Timeout:", cls.timeout
         if cls.tcp_nodelay == True:
-            print "[-]Delaying:", "TCP_NODELAY activate"
+            print "Delaying:", "TCP_NODELAY activate"
         else:
-            print "[-]Delaying:", cls.delay, "seconds"
+            print "Delaying:", cls.delay, "seconds"
         if cls.followred == True:
-            print "[-]Follow 302 code:", "active"
+            print "Follow 302 code:", "active"
             if cls.fli:
-                print"[-]Limit to follow:", cls.fli
+                print"Limit to follow:", cls.fli
         else:
-            print "[-]Delaying:", cls.delay, "seconds"
+            print "Delaying:", cls.delay, "seconds"
 
-        print "[-]Retries:", cls.retries, "\n"
+        print "Retries:", cls.retries, "\n"
 
     def answered(self, check):
         """
